71

Macroeconomic Forecasting: Statistically Adequate, Temporal Principal Components

Dorazio, Brian Arthur 05 June 2023 (has links)
The main goal of this dissertation is to expand upon the use of Principal Component Analysis (PCA) in macroeconomic forecasting, particularly in cases where traditional principal components fail to account for all of the systematic information making up common macroeconomic and financial indicators. At the outset, PCA is viewed as a statistical model derived from the reparameterization of the Multivariate Normal model in Spanos (1986). To motivate a PCA forecasting framework that prioritizes sound model assumptions, it is demonstrated through simulation experiments that model mis-specification erodes the reliability of inferences. The Vector Autoregressive (VAR) model at the center of these simulations allows for the Markov (temporal) dependence inherent in macroeconomic data and serves as the basis for extending conventional PCA. Stemming from the relationship between PCA and the VAR model, an operational out-of-sample forecasting methodology is prescribed, incorporating statistically adequate, temporal principal components, i.e., principal components that capture not only Markov dependence but all of the other relevant information in the original series. The macroeconomic forecasts produced by applying this framework to several common macroeconomic indicators are shown to outperform standard benchmarks in terms of predictive accuracy over longer forecasting horizons. / Doctor of Philosophy / The landscape of macroeconomic forecasting and nowcasting has shifted drastically with the advent of big data. Armed with significant growth in computational power and data-collection resources, economists have augmented their arsenal of statistical tools to include those that produce reliable results in big-data environments. At the forefront of such tools is Principal Component Analysis (PCA), a method that reduces a set of predictors to a few factors containing the majority of the variation in the original data series. This dissertation expands upon the use of PCA in forecasting key macroeconomic indicators, particularly in instances where traditional principal components fail to account for all of the systematic information comprising the data. Ultimately, a forecasting methodology is established that incorporates temporal principal components, ones capable of capturing both time dependence and the other relevant information in the original series. In the final analysis, the methodology is applied to several common macroeconomic and financial indicators. The forecasts produced using this framework are shown to outperform standard benchmarks in terms of predictive accuracy over longer forecasting horizons.
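A minimal sketch of the idea behind such a framework, not the dissertation's own procedure: static principal components are augmented with their lags so the factors carry the Markov dependence in the data. The DataFrame `macro`, the horizon `h`, and the target series `x0` are hypothetical stand-ins.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Hypothetical panel of stationary macro indicators (T observations x N series).
rng = np.random.default_rng(0)
macro = pd.DataFrame(rng.standard_normal((200, 10)),
                     columns=[f"x{i}" for i in range(10)])

h = 4  # forecast horizon
k = 3  # number of principal components retained

# Static PCA ignores the temporal dependence in the series ...
factors = PCA(n_components=k).fit_transform(macro.values)

# ... so augment each component with its own lag, a crude stand-in for the
# "temporal" principal components described in the abstract.
F = pd.DataFrame(factors, columns=[f"f{i}" for i in range(k)])
X = pd.concat([F, F.shift(1).add_suffix("_lag")], axis=1).dropna()
y = macro["x0"].shift(-h).reindex(X.index).dropna()
X = X.loc[y.index]

model = LinearRegression().fit(X, y)
print("h-step-ahead forecast:", model.predict(X.tail(1)))
```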
72

Facilitating Self-As-Context: A Treatment Component Study

Williams, Neville Farley 31 July 2015 (has links)
A crucial step in assessing the scientific basis of a psychotherapeutic intervention is examining the individual components of the treatment to determine whether they are additive or important to treatment outcomes. The construct of self-as-context (S-A-C), a central process in the acceptance and commitment therapy (ACT) approach, has not yet been studied in a component analysis. A previous dismantling trial, however, has shown that this process has an additive effect as part of an ACT package (Williams, 2006). The current study is a preliminary trial of feasibility and efficacy to determine a) the practicality of assessing S-A-C in isolation in a laboratory setting, and b) the impact of manipulating S-A-C on theoretically related variables, including theorized mechanisms of change in various clinical approaches. Sixty-eight participants (55 female, 13 male) were randomly assigned to receive either a brief S-A-C intervention employing a common therapeutic metaphor (the chessboard metaphor) or the control condition, which involved discussing a mildly positive topic with the researcher. Results from the main analyses showed no group-by-time interaction on measures of the construct's immediate impact, previously validated therapeutic mediation measures, or symptom measures. Several possible explanations for the failure to identify significant findings are discussed, including limitations of construct measurement. When the analyses were repeated using only those participants whose scores were in the mild range or higher for stress, anxiety, or depression, time-by-condition interactions were significant for stress and approached significance for depression, with participants in the S-A-C group doing better than those in the control group, offering tentative support for the utility of this process among individuals with clinical difficulties. Implications for future studies are reported. / Ph. D.
73

A Statistical Examination of the Climatic Human Expert System, The Sunset Garden Zones for California

Logan, Ben 11 January 2008 (has links)
Twentieth-century climatology was dominated by two great figures: Wladimir Köppen and C. Warren Thornthwaite. The first carefully developed climatic parameters to match the larger world vegetation communities. The second developed complex formulas of "Moisture Factors" that provided an efficient understanding of how evapotranspiration influences plant growth and health, for both native and non-native communities. In the latter half of the twentieth century, the Sunset Magazine Corporation developed a purely empirical set of Garden Zones, first for California, then for the thirteen states of the West, and now for the entire nation in the National Garden Maps. The Sunset Garden Zones are well recognized and respected in the Western states for illustrating the several factors of climate that distinguish zones. But the Sunset Garden Zones have never before been digitized and examined statistically to validate their demarcations. This thesis examines the digitized zones with reference to PRISM climate data. Variable coverages resembling those described by Sunset are extracted from the PRISM data and collected for two buffered areas, one in northern California and one in southern California. The coverages are exported from ArcGIS 9.1 to SAS®, where they are processed first through a principal component analysis; the first five principal components are then entered into a Ward's hierarchical cluster analysis. The resulting clusters, which correspond to climatic regions, are translated back into ArcGIS as a raster coverage. This process is quite amenable to further examination of other regions of California. / Master of Science
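A hedged sketch of the PCA-then-Ward's pipeline the thesis describes, with synthetic stand-ins for the PRISM climate coverages (the actual work is done in ArcGIS and SAS; scikit-learn and SciPy are used here purely for illustration):

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic stand-in for PRISM climate variables sampled on a grid
# (rows = grid cells, columns = climate variables).
rng = np.random.default_rng(1)
cells = rng.standard_normal((2000, 12))

# Standardize, then keep the first five principal components,
# mirroring the pipeline described in the abstract.
z = (cells - cells.mean(axis=0)) / cells.std(axis=0)
pcs = PCA(n_components=5).fit_transform(z)

# Ward's hierarchical clustering on the component scores;
# each resulting cluster plays the role of a climatic region.
tree = linkage(pcs, method="ward")
regions = fcluster(tree, t=8, criterion="maxclust")
print(np.bincount(regions)[1:])  # cells per climatic region
```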
74

Iterative issues of ICA, quality of separation and number of sources: a study for biosignal applications

Naik, Ganesh Ramachandra, ganesh.naik@rmit.edu.au January 2009 (has links)
This thesis has evaluated the use of Independent Component Analysis (ICA) on surface electromyography (sEMG), focusing on biosignal applications. The research identifies and addresses the following four issues related to the use of ICA for biosignals:
• The iterative nature of ICA
• The order and magnitude ambiguity problems of ICA
• Estimation of the number of sources based on the dependency and independency of the signals
• Source separation for non-quadratic ICA (undercomplete and overcomplete)
The research first establishes the applicability of ICA for sEMG and identifies the shortcomings related to order and magnitude ambiguity. It then develops a mitigation strategy for these issues using a single unmixing matrix and a neural-network weight matrix corresponding to the specific user. The research reports experimental verification of the technique and an investigation of the impact of inter-subject and inter-experimental variations. The results demonstrate that while sEMG without separation gives only 60% accuracy and sEMG separated using traditional ICA gives 65%, this approach gives an accuracy of 99% for the same experimental data. Besides the marked improvement in accuracy, the other advantages of such a system are that it is suitable for real-time operation and is easy for a lay user to train. The second part of this thesis reports research conducted to evaluate the use of ICA for the separation of bioelectric signals when the number of active sources may not be known. The work proposes using the value of the determinant of the global matrix, generated using sparse sub-band ICA, to identify the number of active sources. The results indicate that the technique is successful in identifying the number of active muscles for complex hand gestures, supporting applications such as human-computer interfaces. This thesis has also developed a method of determining the number of independent sources in a given mixture and has demonstrated that, using this information, it is possible to separate the signals in an undercomplete situation and reduce the redundancy in the data using standard ICA methods. The experimental verification has demonstrated that the quality of separation using this method is better than that of other techniques such as Principal Component Analysis (PCA) and selective PCA. This has a number of applications, such as audio separation and sensor networks.
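A toy sketch of the global-matrix diagnostic mentioned above, using scikit-learn's FastICA on synthetic mixtures rather than real sEMG (and plain ICA rather than the thesis's sparse sub-band variant):

```python
import numpy as np
from sklearn.decomposition import FastICA

# Four synthetic source signals standing in for muscle activity.
rng = np.random.default_rng(2)
T = 2000
S = np.vstack([np.sign(np.sin(np.linspace(0, 40, T))),   # square wave
               np.sin(np.linspace(0, 25, T)),            # sinusoid
               rng.laplace(size=T),                      # sparse noise
               rng.uniform(-1, 1, size=T)])              # uniform noise
A = rng.standard_normal((4, 4))                          # mixing matrix
X = S.T @ A.T                                            # observed mixtures

ica = FastICA(n_components=4, random_state=0)
ica.fit(X)
W = ica.components_          # estimated unmixing matrix

# The global matrix G = W A is close to a scaled permutation when
# separation succeeds; the thesis tracks the determinant of this matrix
# (computed from sparse sub-band ICA) to estimate the number of
# genuinely active sources.
G = W @ A
print(np.round(G, 2))
print("det(G) =", np.linalg.det(G))
```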
75

The application of multivariate statistical analysis and batch process control in industrial processes

Lin, Haisheng January 2010 (has links)
To manufacture safe, effective and affordable medicines with greater efficiency, process analytical technology (PAT) has been introduced by the Food and Drug Administration to encourage the pharmaceutical industry to develop and design well-understood processes. PAT requires chemical imaging techniques to be used to collect process variables for real-time process analysis. Multivariate statistical analysis tools and process control tools are important for implementing PAT in the development and manufacture of pharmaceuticals, as they enable information to be extracted from the PAT measurements. Multivariate statistical analysis methods such as principal component analysis (PCA) and independent component analysis (ICA) are applied in this thesis to extract information regarding a pharmaceutical tablet. ICA was found to outperform PCA and was able to identify the presence of five different materials and their spatial distribution around the tablet. Another important area for PAT is in improving the control of processes. In the pharmaceutical industry, many processes operate in a batch strategy, which introduces difficult control challenges. Near-infrared (NIR) spectroscopy is a non-destructive analytical technique that has been used extensively to extract chemical and physical information from a product sample based on the scattering effect of light. In this thesis, NIR measurements were incorporated as feedback information into several control strategies. Although these controllers performed reasonably well, they could only regulate the NIR spectrum at a number of wavenumbers, rather than over the full spectrum. In an attempt to regulate the entire NIR spectrum, a novel control algorithm was developed. This controller was found to be superior to the only comparable controller able to regulate the NIR spectrum in a similar fashion. The benefits of the proposed controller were demonstrated using a benchmark simulation of a batch reactor.
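A hedged sketch of the tablet-imaging analysis: a hypothetical hyperspectral cube is unfolded to a pixels-by-wavelengths matrix, PCA and ICA are fitted, and each independent component is folded back into a spatial distribution map. The five synthetic material signatures are assumptions, not the thesis's data.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

# Hypothetical chemical image of a tablet: H x W pixels, B wavelengths.
H, W, B = 64, 64, 120
rng = np.random.default_rng(3)
signatures = rng.random((5, B))                 # five material spectra (stand-ins)
abundance = rng.dirichlet(np.ones(5), H * W)    # per-pixel mixing fractions
X = abundance @ signatures + 0.01 * rng.standard_normal((H * W, B))

# PCA summarizes variance; ICA instead seeks statistically independent
# signals, which the thesis found better at resolving the five materials.
pca_scores = PCA(n_components=5).fit_transform(X)
ica_scores = FastICA(n_components=5, random_state=0).fit_transform(X)

# Fold each component back into an image: a per-material distribution map.
maps = ica_scores.reshape(H, W, 5)
print(maps.shape)   # (64, 64, 5) spatial distributions
```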
76

Detection And Classification Of Buried Radioactive Materials

Wei, Wei 09 December 2011 (has links)
This dissertation develops new approaches for the detection and classification of buried radioactive materials. Different spectral transformation methods are proposed to effectively suppress noise and to better distinguish signal features in the transformed space. The contributions of this dissertation are as follows.
1) Propose an unsupervised method for buried radioactive material detection. In the experiments, the original Reed-Xiaoli (RX) algorithm performs similarly to the gross count (GC) method; however, the constrained energy minimization (CEM) method performs better when using feature vectors selected from the RX output. Thus, an unsupervised method is developed by combining the RX and CEM methods, which can efficiently suppress the background noise when applied to dimensionality-reduced data from principal component analysis (PCA).
2) Propose an approach for buried target detection and classification that applies spectral transformation followed by noise-adjusted PCA (NAPCA). To meet the requirements of practical survey mapping, we focus on the circumstance in which sensor dwell time is very short. The results show that spectral transformation can alleviate the effects of noisy spectral variation and background clutter, while NAPCA, a better choice than PCA, can extract key features for the subsequent detection and classification.
3) Propose a particle swarm optimization (PSO)-based system to automatically determine the optimal partition for spectral transformation. Two PSOs are incorporated in the system, with the outer one responsible for selecting the optimal number of bins and the inner one for the optimal bin widths. The experimental results demonstrate that using variable bin widths is better than a fixed bin width, and that PSO can provide better results than the traditional Powell's method.
4) Develop parallel implementation schemes for the PSO-based spectral partition algorithm. Both cluster and graphics processing unit (GPU) implementations are designed, greatly reducing the computational burden of the serial version. The experimental results also show that the GPU algorithm achieves a speedup similar to that of the cluster-based algorithm.
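A rough NumPy sketch of the RX-plus-CEM chain from the first contribution, on synthetic gamma-ray spectra; the injected target signature and the 99th-percentile threshold are illustrative assumptions:

```python
import numpy as np

def rx(X):
    """Reed-Xiaoli anomaly score: Mahalanobis distance to the background."""
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d = X - mu
    return np.einsum("ij,jk,ik->i", d, cov_inv, d)

def cem(X, t):
    """Constrained energy minimization filter for target signature t."""
    R_inv = np.linalg.inv((X.T @ X) / len(X))   # sample correlation matrix
    w = R_inv @ t / (t @ R_inv @ t)
    return X @ w

# Synthetic gamma-ray spectra: mostly background, a few "buried source" rows.
rng = np.random.default_rng(4)
X = rng.poisson(50, size=(500, 64)).astype(float)
target = np.zeros(64)
target[30:34] = 1.0
X[::100] += 40 * target                          # inject targets

# Unsupervised chain sketched from the abstract: RX flags candidate pixels,
# whose mean spectrum then serves as the CEM target signature.
scores = rx(X)
candidates = X[scores > np.percentile(scores, 99)]
detection = cem(X, candidates.mean(axis=0))
print(np.argsort(detection)[-5:])   # indices with strongest response
```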
77

Feature Extraction using Dimensionality Reduction Techniques: Capturing the Human Perspective

Coleman, Ashley B. January 2015 (has links)
No description available.
78

Utilização de análise de componentes principais em séries temporais / Use of principal component analysis in time series

Teixeira, Sérgio Coichev 12 April 2013 (has links)
One of the main objectives of principal component analysis (PCA) is to reduce a set of observed variables to a smaller set of uncorrelated variables, called principal components, giving the researcher the means to understand the variability and correlation structure of the observed data with fewer variables. The technique is very simple and widely used in studies across many fields. In its construction, the linear relationship between the observed variables is measured through the covariance or correlation matrix. However, these matrices can fail to capture important information when the data are sequentially correlated in time (autocorrelated), wasting an important part of the data when interpreting the components. In this work, we study a version of principal component analysis that makes it possible to interpret and analyze the autocorrelation structure of the observed data. To this end, we explore principal component analysis in the frequency domain, which for autocorrelated data provides more specific and detailed results than the classical technique. In the SSA (Singular Spectrum Analysis) and MSSA (Multichannel Singular Spectrum Analysis) methods, principal component analysis is based on correlation both over time and across the different observed variables. These techniques are widely used with atmospheric data to identify patterns such as trend and periodicity.
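A compact sketch of basic single-channel SSA as described: embed the series in a trajectory matrix, take the SVD, and reconstruct a smoothed series from the leading components by diagonal averaging. The window length `L` and rank `r` are illustrative choices.

```python
import numpy as np

def ssa(series, L, r):
    """Basic SSA: embed, decompose, reconstruct from the r leading components."""
    N = len(series)
    K = N - L + 1
    # Trajectory (Hankel) matrix: lagged copies of the series as columns.
    T = np.column_stack([series[i:i + L] for i in range(K)])
    U, s, Vt = np.linalg.svd(T, full_matrices=False)
    T_r = (U[:, :r] * s[:r]) @ Vt[:r]
    # Diagonal averaging maps the rank-r matrix back to a series.
    rec = np.zeros(N)
    counts = np.zeros(N)
    for i in range(L):
        for j in range(K):
            rec[i + j] += T_r[i, j]
            counts[i + j] += 1
    return rec / counts

# Trend + seasonal cycle + noise: the kind of pattern SSA is used to extract.
t = np.arange(300)
x = (0.02 * t + np.sin(2 * np.pi * t / 12)
     + 0.3 * np.random.default_rng(5).standard_normal(300))
smooth = ssa(x, L=60, r=3)
print(np.round(smooth[:5], 3))
```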
80

Automatic Target Recognition In Infrared Imagery

Bayik, Tuba Makbule 01 September 2004 (has links)
The task of automatically recognizing targets in IR imagery has a history of approximately 25 years of research and development. ATR is an application of pattern recognition and scene analysis in the defense industry, and it remains one of the challenging problems in the field. This thesis may be viewed as an exploratory study of the ATR problem in which promising recognition algorithms from the area are implemented. The examined algorithms are among the solutions to the ATR problem reported to have good performance in the literature. Throughout the study, PCA, subspace LDA, ICA, the nearest mean classifier, the K-nearest-neighbors classifier, the nearest neighbor classifier, and the LVQ classifier are implemented, and their performances are compared in terms of recognition rate. According to the simulation results, the system that uses ICA as the feature extractor and LVQ as the classifier performs best. The good performance of this system is due to the use of higher-order statistics of the data and the success of LVQ in modifying the decision boundaries.
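A hedged sketch of the winning ICA-plus-LVQ combination on synthetic stand-ins for flattened IR target chips; scikit-learn has no LVQ, so a minimal LVQ1 training loop is written out by hand:

```python
import numpy as np
from sklearn.decomposition import FastICA

def lvq1_fit(X, y, n_proto=3, lr=0.05, epochs=30, seed=0):
    """LVQ1: pull the winning prototype toward same-class samples, push otherwise."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    protos = np.vstack([X[y == c][rng.choice((y == c).sum(), n_proto)]
                        for c in classes])
    labels = np.repeat(classes, n_proto)
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            j = np.argmin(((protos - X[i]) ** 2).sum(axis=1))
            sign = 1.0 if labels[j] == y[i] else -1.0
            protos[j] += sign * lr * (X[i] - protos[j])
    return protos, labels

def lvq1_predict(X, protos, labels):
    d = ((X[:, None, :] - protos[None]) ** 2).sum(axis=2)
    return labels[d.argmin(axis=1)]

# Synthetic stand-in for flattened IR target chips (rows = images, 3 classes).
rng = np.random.default_rng(6)
X = np.vstack([rng.normal(m, 1.0, size=(100, 64)) for m in (0.0, 1.0, 2.0)])
y = np.repeat([0, 1, 2], 100)

feats = FastICA(n_components=8, random_state=0).fit_transform(X)  # ICA features
protos, labels = lvq1_fit(feats, y)
acc = (lvq1_predict(feats, protos, labels) == y).mean()
print(f"training accuracy: {acc:.2f}")
```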
