11

Quasi-objective Nonlinear Principal Component Analysis and applications to the atmosphere

Lu, Beiwei 05 1900 (has links)
NonLinear Principal Component Analysis (NLPCA) using three-hidden-layer feed-forward neural networks can produce solutions that overfit the data and are non-unique. These problems have so far been dealt with by subjective methods during network training. This study shows that these problems are intrinsic to the three-hidden-layer architecture. A simplified two-hidden-layer feed-forward neural network, with no encoding layer and no biases at the bottleneck and output layers, is proposed. This new, compact NLPCA model alleviates these problems without resorting to subjective methods and is therefore called quasi-objective. The compact NLPCA is applied to the zonal winds observed at seven pressure levels between 10 and 70 hPa in the equatorial stratosphere to represent the Quasi-Biennial Oscillation (QBO) and investigate its variability and structure. The two nonlinear principal components of the dataset offer a clear picture of the QBO. In particular, their structure shows that the QBO phase consists of a predominant 28.4-month cycle that is modulated by an 11-year cycle and a longer-period cycle. The significant difference in wind variability between cold and warm seasons and the tendency for a seasonal synchronization of the QBO phases are well captured. The one-dimensional NLPCA approximation of the dataset provides a better representation of the QBO than classical principal component analysis, and a better description of the asymmetry of the QBO between westerly and easterly shear zones and between their transitions. The compact NLPCA is then applied to the Arctic Oscillation (AO) index together with the aforementioned zonal winds to investigate the relationship of the AO with the QBO. The NLPCA of the combined AO-index and zonal-wind dataset shows clearly that the two oscillations covary: the phase defined by the two nonlinear principal components progresses with a predominant 28.4-month periodicity, modulated by the 11-year and longer-period cycles. Large positive values of the AO index occur when westerlies prevail near the middle and upper levels of the equatorial stratosphere; large negative values arise when easterlies occupy more than half of the equatorial stratospheric layer. / Faculty of Science / Department of Earth, Ocean and Atmospheric Sciences / Graduate
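The compact architecture lends itself to a short illustration. The following is a minimal sketch, assuming a PyTorch implementation: the input maps linearly to the bottleneck (no encoding hidden layer, no bottleneck or output biases), and only the decoding side is nonlinear. Layer sizes, the optimizer, and the training loop are illustrative assumptions, not taken from the thesis.

```python
# Minimal sketch of a "compact" two-hidden-layer NLPCA: linear encoding to the
# bottleneck, nonlinear decoding, no biases at the bottleneck or output layers.
import torch
import torch.nn as nn

class CompactNLPCA(nn.Module):
    def __init__(self, n_inputs=7, n_components=2, n_hidden=6):
        super().__init__()
        self.bottleneck = nn.Linear(n_inputs, n_components, bias=False)  # direct linear encoding
        self.decode_hidden = nn.Linear(n_components, n_hidden)           # nonlinear decoding layer
        self.output = nn.Linear(n_hidden, n_inputs, bias=False)          # no output bias

    def forward(self, x):
        u = self.bottleneck(x)                                           # nonlinear principal components
        return self.output(torch.tanh(self.decode_hidden(u))), u

# Fit by minimizing reconstruction error, as in ordinary autoassociative NLPCA.
model = CompactNLPCA()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(500, 7)                                                  # stand-in for zonal-wind anomalies
for _ in range(200):
    optimizer.zero_grad()
    recon, _ = model(x)
    loss = nn.functional.mse_loss(recon, x)
    loss.backward()
    optimizer.step()
```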
12

Contrasting Environments Associated with Storm Prediction Center Tornado Outbreak Forecasts using Synoptic-Scale Composite Analysis

Bates, Alyssa Victoria 17 May 2014 (has links)
Tornado outbreaks have significant human impact, so it is imperative that forecasts of these phenomena be accurate. Because the synoptic setup lays the foundation for a forecast, synoptic-scale aspects of Storm Prediction Center (SPC) outbreak forecasts of varying accuracy were assessed. The percentage of each outbreak's tornadoes falling within SPC 10% tornado probability polygons was calculated, with false alarm events considered separately. The outbreaks were separated into quartiles using a point-in-polygon algorithm. Statistical composite fields were created to represent the synoptic conditions of these groups and facilitate comparison. Overall, temperature advection showed the greatest differences between the groups. There were also significant differences in jet streak strength and the amount of vertical wind shear. The events forecast with low accuracy had the weakest synoptic-scale setups. These results suggest that events with weak synoptic setups should be regarded as areas of concern by tornado outbreak forecasters.
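To make the quartile-compositing procedure concrete, here is a small, self-contained sketch using synthetic stand-ins: score each event by the fraction of tornado reports inside its probability polygon, split the events into quartiles, and average a gridded field within each group. The polygon test, thresholds, and fields are illustrative assumptions, not the study's data or code.

```python
# Sketch of point-in-polygon scoring and quartile composites with synthetic data.
import numpy as np
from matplotlib.path import Path

rng = np.random.default_rng(0)

def fraction_captured(polygon_xy, report_xy):
    """Fraction of tornado reports falling inside the forecast polygon."""
    return Path(polygon_xy).contains_points(report_xy).mean()

# Synthetic stand-ins: each event has a 10% probability polygon, report locations,
# and a gridded field (e.g., low-level temperature advection).
events = []
for _ in range(20):
    poly = np.array([[0, 0], [5, 0], [5, 5], [0, 5]]) + rng.normal(0, 0.5, (4, 2))
    reports = rng.uniform(-1, 6, (30, 2))
    field = rng.normal(size=(10, 10))
    events.append((poly, reports, field))

scores = np.array([fraction_captured(p, r) for p, r, _ in events])
quartile = np.digitize(scores, np.percentile(scores, [25, 50, 75]))  # 0 = least accurate group

# Composite (mean) field for each accuracy quartile, mimicking the study's comparison.
composites = {q: np.mean([f for (_, _, f), g in zip(events, quartile) if g == q], axis=0)
              for q in np.unique(quartile)}
```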
13

WASP: An Algorithm for Ranking College Football Teams

Earl, Jonathan January 2016 (has links)
Arrow's Impossibility Theorem outlines the flaws that affect any voting system that attempts to order a set of objects. For its entire history, American college football has determined its champion using a voting system. Much of the literature deals with why that voting system is problematic, but there does not appear to be a large body of work on creating a better, mathematical process. More generally, the inadequacies of ranking in football are a manifestation of the general problem of ranking a set of objects. Herein, principal component analysis is used as a tool to provide a solution to this problem in the context of American college football. To show its value, rankings based on principal component analysis are compared against the rankings used in American college football. / Thesis / Master of Science (MSc) / The problem of ranking is ubiquitous, appearing everywhere from Google to ballot boxes. One of the more notable areas where it arises is in awarding the championship in American college football. This paper explains why the problem exists in American college football and presents a bias-free mathematical solution that is compared against how American college football awards its championship.
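As a rough illustration of how PCA can rank teams (not the thesis's exact WASP algorithm), one can build a team-by-team point-differential matrix and order the teams by their scores on the first principal component. The matrix here is synthetic and the construction is an assumption for illustration only.

```python
# Illustrative PCA-based ranking from a synthetic margin-of-victory matrix.
import numpy as np

rng = np.random.default_rng(1)
n_teams = 8
strength = rng.normal(size=n_teams)                       # hidden "true" strengths

# M[i, j]: average scoring margin of team i over team j (noisy, antisymmetric-ish).
M = strength[:, None] - strength[None, :] + rng.normal(0, 3, (n_teams, n_teams))
np.fill_diagonal(M, 0.0)

# Project each team's row onto the first principal component of the centered matrix.
Mc = M - M.mean(axis=0)
_, _, vt = np.linalg.svd(Mc, full_matrices=False)
scores = Mc @ vt[0]
if np.corrcoef(scores, M.mean(axis=1))[0, 1] < 0:
    scores = -scores                                       # fix the arbitrary sign of the component

ranking = np.argsort(scores)[::-1]
print("Ranking (best team first):", ranking)
```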
14

A Quantitative Analysis of Pansharpened Images

Vijayaraj, Veeraraghavan 07 August 2004 (has links)
There has been an exponential increase in the availability of satellite image data. Image data are now collected with different spatial, spectral, and temporal resolutions. Image fusion techniques are used extensively to combine images with complementary information into a single composite. The fused image contains richer information that improves the performance of image analysis algorithms. Pansharpening is a pixel-level fusion technique that increases the spatial resolution of a multispectral image using spatial information from the high-resolution panchromatic image while preserving the spectral information of the multispectral image. Resolution merge, image integration, and multisensor data fusion are some of the equivalent terms used for pansharpening. Pansharpening techniques are applied to enhance features not visible in either data set alone, to detect change using temporal data sets, to improve geometric correction, and to enhance classification. Various pansharpening algorithms are available in the literature, and some have been incorporated into commercial remote sensing software packages such as ERDAS Imagine® and ENVI®. The performance of these algorithms varies both spectrally and spatially, so evaluation of the spectral and spatial quality of pansharpened images using objective quality metrics is necessary. In this thesis, quantitative metrics for evaluating the quality of pansharpened images are developed. For this study, Intensity-Hue-Saturation (IHS) based sharpening, Brovey sharpening, Principal Component Analysis (PCA) based sharpening, and a wavelet-based sharpening method are used.
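Of the methods listed, the Brovey transform is the simplest to sketch: each multispectral band is scaled by the ratio of the panchromatic band to the sum of the multispectral bands. The arrays below are random placeholders, and the correlation check is only one example of a spectral quality metric, not the thesis's specific metric suite.

```python
# Minimal Brovey pansharpening sketch with synthetic imagery.
import numpy as np

rng = np.random.default_rng(2)
ms = rng.uniform(0.1, 1.0, size=(3, 64, 64))       # multispectral bands, resampled to the pan grid
pan = rng.uniform(0.1, 1.0, size=(64, 64))         # high-resolution panchromatic band

intensity = ms.sum(axis=0)
sharpened = ms * (pan / (intensity + 1e-12))        # Brovey transform, band by band

# One simple spectral-quality check: per-band correlation with the original bands.
corr = [np.corrcoef(ms[b].ravel(), sharpened[b].ravel())[0, 1] for b in range(3)]
print("Per-band correlation with the original multispectral image:", corr)
```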
15

Large Scale Matrix Completion and Recommender Systems

Amadeo, Lily 04 September 2015 (has links)
"The goal of this thesis is to extend the theory and practice of matrix completion algorithms, and how they can be utilized, improved, and scaled up to handle large data sets. Matrix completion involves predicting missing entries in real-world data matrices using the modeling assumption that the fully observed matrix is low-rank. Low-rank matrices appear across a broad selection of domains, and such a modeling assumption is similar in spirit to Principal Component Analysis. Our focus is on large scale problems, where the matrices have millions of rows and columns. In this thesis we provide new analysis for the convergence rates of matrix completion techniques using convex nuclear norm relaxation. In addition, we validate these results on both synthetic data and data from two real-world domains (recommender systems and Internet tomography). The results we obtain show that with an empirical, data-inspired understanding of various parameters in the algorithm, this matrix completion problem can be solved more efficiently than some previous theory suggests, and therefore can be extended to much larger problems with greater ease. "
16

Classification of Genotype and Age of Eyes Using RPE Cell Size and Shape

Yu, Jie 18 December 2012 (has links)
Retinal pigment epithelium (RPE) is a principal site of pathogenesis in age-related macular degeneration (AMD). AMD is a leading cause of vision loss, and even blindness, in the elderly, and there is currently no effective treatment. Our aim is to describe the relationship between the morphology of RPE cells and the age and genotype of the eyes. We use principal component analysis (PCA) or functional principal component analysis (FPCA), support vector machine (SVM), and random forest (RF) methods to analyze the morphological data of RPE cells in mouse eyes and classify their age and genotype. Our analyses show that, among all morphometric measures of RPE cells, cell shape measurements (eccentricity and solidity) are good for classification, but the combination of cell shape and size (perimeter) provides the best classification.
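The pipeline described (dimension reduction followed by SVM and random forest classifiers) can be sketched briefly with scikit-learn. The feature matrix and labels below are synthetic placeholders for the cell morphometrics, and the component counts and hyperparameters are illustrative, not the study's settings.

```python
# Sketch of PCA-based dimension reduction followed by SVM and random forest classification.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
X = rng.normal(size=(120, 6))        # e.g., cell area, perimeter, eccentricity, solidity, ...
y = rng.integers(0, 2, size=120)     # genotype (or age-group) label

svm = make_pipeline(StandardScaler(), PCA(n_components=3), SVC(kernel="rbf"))
rf = make_pipeline(PCA(n_components=3), RandomForestClassifier(n_estimators=200, random_state=0))

print("SVM accuracy:", cross_val_score(svm, X, y, cv=5).mean())
print("RF accuracy:", cross_val_score(rf, X, y, cv=5).mean())
```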
17

Comparison of Classification Effects of Principal Component and Sparse Principal Component Analysis for Cardiology Ultrasound in Left Ventricle

Yang, Hsiao-ying 05 July 2012 (has links)
Because heart diseases are associated with the patterns of diastole and systole in the left ventricle, we analyze and classify cardiology ultrasound images gathered from Kaohsiung Veterans General Hospital. We use the differences between the gray-scale values of diastole and systole in the left ventricle to evaluate heart function. Following Chen (2011) and Kao (2011), we modify the reduction and alignment of the image data, and we add more subjects to the study. The images are processed in two ways, retaining the regions of interest. Since each ultrasound image, once converted to numeric form, is a high-dimensional matrix, principal component analysis is applied to retain the important factors and reduce the dimensionality. In this work, we compare the loadings calculated by ordinary principal component analysis and by sparse principal component analysis; the factor scores are then used for discriminant analysis, and the classification accuracy is discussed. With these methods, the accuracy, sensitivity, and specificity of the original classifications exceed 80%, and the cross-validated values exceed 60%.
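The PCA versus sparse PCA comparison, with factor scores feeding a discriminant classifier, can be sketched as follows. The data are synthetic stand-ins for the gray-scale difference features, and the component counts, sparsity penalty, and cross-validation setup are illustrative assumptions.

```python
# Sketch: PCA vs. sparse PCA as feature extractors before linear discriminant analysis.
import numpy as np
from sklearn.decomposition import PCA, SparsePCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(5)
X = rng.normal(size=(80, 200))      # flattened gray-scale differences (diastole vs. systole)
y = rng.integers(0, 2, size=80)     # normal vs. abnormal heart function

for name, reducer in [("PCA", PCA(n_components=10)),
                      ("Sparse PCA", SparsePCA(n_components=10, alpha=1.0, random_state=0))]:
    clf = make_pipeline(reducer, LinearDiscriminantAnalysis())
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: cross-validated accuracy = {acc:.2f}")
```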
18

Examining the Relationship Between Hydroclimatological Variables and High Flow Events

Fliehman, Ryan Mark January 2012 (has links)
In this study we identify dominant hydroclimatic variables and large-scale patterns that lead to high streamflow events in the Santa Cruz, Salt, and Verde Rivers in Arizona for the period 1979-2009 using Principal Component Analysis (PCA). We use winter (November-March) data from the USGS daily streamflow database and 11 variables from the North American Regional Reanalysis (NARR) database, in addition to weather maps from the Hydrometeorological Prediction Center (HPC). Using the streamflow data, we identify the precipitation events that led to daily streamflow at or above the 98th percentile and find the dominant hydroclimatic variables associated with these events. We find that upper-level winds and moisture fluxes are the dominant variables characterizing the events. The dominant mode for all three basins is associated with frontal systems, while the second mode is associated with cut-off upper-level low pressure systems. Our goal is to provide forecasting agencies with tools to improve flood forecasting practices.
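The PCA step itself amounts to standardizing the hydroclimatic variables over the high-flow events and extracting the leading modes. The sketch below uses random placeholders for the 11 NARR variables; the event count and variable meanings are assumptions for illustration.

```python
# Sketch of the PCA/EOF step on standardized hydroclimatic variables for high-flow events.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
# rows = high-flow events, columns = hydroclimatic variables
# (e.g., upper-level wind, precipitable water, moisture flux, ...)
X = rng.normal(size=(60, 11))

Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=2).fit(Z)

print("Variance explained by the two leading modes:", pca.explained_variance_ratio_)
print("Loadings of mode 1 (dominant hydroclimatic pattern):", pca.components_[0])
```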
19

Resilient Average and Distortion Detection in Sensor Networks

Aguirre Jurado, Ricardo 15 May 2009 (has links)
In this paper a resilient sensor network is built to lessen the effects of a small portion of corrupted sensors when an aggregate result such as the average must be obtained. By examining the variance in sensor readings, a change in the pattern can be spotted and minimized in order to maintain a stable aggregated reading. Offsets in sensor readings are also analyzed and compensated for to reduce bias in the average. These two analytical techniques are then combined in a Kalman filter to produce a smooth and resilient average from the readings of the individual sensors. In addition, principal component analysis is used to detect variations in the sensor network. Experiments are conducted with real MICAz sensors, which gather light measurements in a small area and display the average light level for that area.
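The resilient-averaging idea can be sketched briefly: flag sensors whose offset from the group departs markedly, and track the average of the remaining sensors with a scalar Kalman filter. The data, offset threshold, and noise levels below are illustrative assumptions, not the paper's parameters.

```python
# Sketch: offset-based rejection of corrupted sensors plus a scalar Kalman filter average.
import numpy as np

rng = np.random.default_rng(7)
n_sensors, n_steps = 10, 200
truth = 100 + np.cumsum(rng.normal(0, 0.2, n_steps))            # true light level over time
readings = truth[None, :] + rng.normal(0, 1.0, (n_sensors, n_steps))
readings[0] += 15.0                                              # one corrupted (biased) sensor

# Each sensor's typical offset from the network median; drop gross outliers.
offset = np.median(readings - np.median(readings, axis=0), axis=1)
good = np.abs(offset) < 3.0                                      # illustrative threshold

# Scalar Kalman filter (random-walk model) on the mean of the trusted sensors.
x, P, Q, R = readings[good, 0].mean(), 1.0, 0.05, 1.0 / good.sum()
estimates = []
for t in range(n_steps):
    P += Q                                    # predict
    z = readings[good, t].mean()              # measurement: average of trusted sensors
    K = P / (P + R)                           # Kalman gain
    x += K * (z - x)                          # update
    P *= (1 - K)
    estimates.append(x)

print("Final estimate vs. truth:", estimates[-1], truth[-1])
```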
20

Extração de características de imagens de faces humanas através de wavelets, PCA e IMPCA / Feature extraction from human face images using wavelets, PCA and IMPCA

Bianchi, Marcelo Franceschi de 10 April 2006 (has links)
Image pattern recognition is an area of great interest in the scientific world. Feature extraction methods extract characteristics from images and reduce the dimensionality of the data, generating the so-called feature vector. Given a query image, the goal of a human face recognition system is to search an image database and return the image most similar to the query according to a given criterion. This research addresses the generation of feature vectors for such a recognition system over human face databases. A feature vector is an n-dimensional numeric representation of an image, or part of it, describing its most representative details; this compact representation benefits the recognition process by reducing the dimensionality of the data and allows fast retrieval. An alternative way to characterize images for a face recognition system is a domain transform, whose main advantage is its effective characterization of local image properties. Wavelets differ from traditional Fourier techniques in the way they localize information in the time-frequency plane; in particular, they can trade one type of resolution for another, which makes them especially suitable for analyzing non-stationary signals, representing a signal in different frequency bands, each with a resolution matching its scale. Wavelets have been successfully applied to image compression, enhancement, analysis, classification, characterization, and retrieval, and one area where these properties are particularly relevant is computer vision, through the representation and description of images. This work describes an approach to human face image recognition in which features are extracted by multiresolution wavelet decomposition using the Haar, Daubechies, Biorthogonal, Reverse Biorthogonal, Symlet, and Coiflet filters. The PCA (Principal Component Analysis) and IMPCA (Image Principal Component Analysis) techniques were then tested on these features, with the best results obtained using the Biorthogonal wavelet together with IMPCA.
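The wavelet-plus-PCA stage of such a pipeline can be sketched with PyWavelets and scikit-learn: decompose each face image, keep the low-frequency approximation coefficients as features, and reduce them with PCA. The images below are random placeholders, and the wavelet name, decomposition depth, and component count are illustrative assumptions (IMPCA is not shown).

```python
# Sketch: 2-D wavelet decomposition of face images followed by PCA on the
# approximation coefficients.
import numpy as np
import pywt
from sklearn.decomposition import PCA

rng = np.random.default_rng(8)
faces = rng.uniform(size=(40, 64, 64))              # stand-in for a face database

def wavelet_features(image, wavelet="bior2.2"):
    cA, (cH, cV, cD) = pywt.dwt2(image, wavelet)    # one decomposition level
    return cA.ravel()                               # keep the low-frequency approximation

X = np.array([wavelet_features(f) for f in faces])
features = PCA(n_components=20).fit_transform(X)    # final feature vectors

# Recognition could then proceed by nearest-neighbour search over `features`.
print("Feature matrix shape:", features.shape)
```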
