71

Quantile Forecasting of Commodity Futures' Returns: Are Implied Volatility Factors Informative?

Dorta, Miguel, May 2012
This study develops a multi-period log-return quantile forecasting procedure to evaluate the performance of eleven nearby commodity futures contracts (NCFCs) using a sample of 897 daily price observations and at-the-money (ATM) put and call implied volatilities of the corresponding prices for the period from 1/16/2008 to 7/29/2011. The statistical approach employs dynamic log-return quantile regression models to forecast price densities using implied volatilities (IVs) and factors estimated through principal component analysis (PCA) from the IVs, pooled IVs, and lagged returns. Extensive in-sample and out-of-sample analyses are conducted, including assessment of excess trading returns and evaluations of several combinations of quantiles, model specifications, and NCFCs. The results suggest that the IV-PCA factors, particularly pooled return-IV-PCA factors, improve quantile forecasting power relative to models using only individual IV information. The ratio of the put IV to the call IV is also found to improve the quantile forecasting performance of log-returns. Improvements in quantile forecasting performance are greater in the tails of the distribution than in the center. Trading based on the quantile forecasts from these models generated significant excess returns. Finally, the fact that the single-IV forecasts were outperformed by their quantile regression (QR) counterparts suggests that the conditional distribution of the log-returns is not normal.
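As a rough illustration of the kind of model this abstract describes, the sketch below fits a quantile regression of next-day log-returns on PCA factors extracted from a pooled panel of put/call implied volatilities and lagged returns. The data are synthetic and all names, dimensions, and factor counts are assumptions for the example, not the author's specification.

# Minimal sketch, assuming synthetic data: PCA factors from pooled
# implied volatilities plus a return lag, then a 5th-percentile
# quantile regression of next-day log-returns on those factors.
import numpy as np
from sklearn.decomposition import PCA
import statsmodels.api as sm

rng = np.random.default_rng(0)
T, n_iv = 897, 22                      # days; put+call ATM IVs for 11 contracts
iv_panel = 0.2 + 0.05 * rng.standard_normal((T, n_iv))   # pooled IV panel
log_ret = 0.01 * rng.standard_normal(T)                  # daily log-returns

# Factors from the pooled IV panel augmented with one lag of returns
X_raw = np.column_stack([iv_panel[:-1], log_ret[:-1]])
factors = PCA(n_components=3).fit_transform(X_raw)

# Quantile regression for the 5% tail of next-day log-returns
y = log_ret[1:]
X = sm.add_constant(factors)
fit = sm.QuantReg(y, X).fit(q=0.05)
print(fit.params)   # coefficients of the conditional 5th-percentile forecast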
72

The Construction and Application of Hybrid Factor Model

Tao, Yun-jhen, 28 July 2010
A multifactor model is used to explain asset return and risk, and its explanatory power depends on the common factors the model uses. Researchers strive to find reasonable factors to enhance a multifactor model's efficiency; however, there are still unknown factors to be discovered. Miller (2006) presents a general concept and structure for a hybrid factor model. This study follows Miller (2006) and aims to build a complete workflow for constructing a hybrid factor model based on a fundamental factor model and a statistical factor model, and applies the hybrid factor model to the Taiwan stock market. We assume that a fundamental factor model is already developed, so this study focuses on building the second stage, the statistical factor model. Principal component analysis is used to form the statistical factors, and spectral decomposition is used to prepare the data for principal component analysis. These methods are applied to stocks on the Taiwan Stock Exchange over the period from January 1, 2000 to December 31, 2009. This study presents a complete construction flow for hybrid factor models and further confirms that a hybrid factor model is able to find missing factors in a developing market such as Taiwan's stock market. The study also discovers that the missing factors might be a market factor and an extensive electronics-industry factor.
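A hedged sketch of the second stage described above: extracting statistical factors via spectral decomposition of the covariance matrix of residuals left over from an (assumed, already-fitted) fundamental factor model. The residual matrix here is synthetic and the factor count is an assumption, not the thesis's choice.

# Minimal sketch, assuming synthetic stage-one residuals: statistical
# factors from the eigendecomposition of the residual covariance matrix.
import numpy as np

rng = np.random.default_rng(1)
T, N = 2400, 50                          # ~10 years of daily data, 50 stocks
residuals = rng.standard_normal((T, N))  # residual returns from stage one

# Spectral decomposition of the residual covariance matrix
cov = np.cov(residuals, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # returned in ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = 3                                    # statistical factors retained
loadings = eigvecs[:, :k]                # stocks x factors
factor_returns = residuals @ loadings    # candidate "missing factor" series
explained = eigvals[:k].sum() / eigvals.sum()
print(f"variance explained by {k} statistical factors: {explained:.1%}")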
73

Principal component analysis with multiresolution

Brennan, Victor L., January 2001 (PDF)
Thesis (Ph.D.), University of Florida, 2001. Title from first page of PDF file. Document formatted into pages; contains xi, 124 p.; also contains graphics. Vita. Includes bibliographical references (p. 120-123).
74

Multivariate statistical monitoring and diagnosis with applications in semiconductor processes

Yue, Hongyu, January 2000
Thesis (Ph.D.), University of Texas at Austin, 2000. Vita. Includes bibliographical references (leaves 187-201). Also available in a digital version from Dissertation Abstracts.
75

Understanding the variations in fluorescence spectra of gynecologic tissue

Chang, Sung Keun, 28 August 2008
Abstract not available.
76

Methods for improving the reliability of semiconductor fault detection and diagnosis with principal component analysis

Cherry, Gregory Allan, 28 August 2008
Abstract not available.
77

A principal components analysis of anatomical fat patterning in South African children

Goon, Daniel Ter., January 2011
D. Tech. Clinical Technology. Examines anatomical fat patterning in South African children (black and white) using principal components analysis, and provides normative data on fat patterning for South African children. This statistical method has rarely been used to determine fat patterning in South African children.
78

Using Geochemical Tracers to Determine Aquifer Connectivity, Flow Paths, and Base-Flow Sources: Middle Verde River Watershed, Central Arizona

Zlatos, Caitlan McEwen, January 2008
Combining geochemical data with physical data produces a powerful method for understanding the sources and fluxes of water to river systems. This study demonstrates the approach for river systems in regions of complex hydrogeology through the identification and quantification of base-flow sources to the Verde River and its tributaries within the middle Verde River watershed. Specifically, geochemical tracers (major solutes, stable and radioactive isotopes) characterize the principal aquifers (C, Redwall-Muav, and Verde Formation) and provide a conceptual understanding of the hydrologic connection between them. For the surface-water system, principal component analysis (PCA) is used to identify potential base-flow sources to the Verde River on a several-kilometer scale. Solute mixing diagrams then provide the relative inputs of these sources and, when combined with stream discharge, allow the water sources to be quantified. The results of this study provide an improved conceptual model that reveals the complexity of groundwater-surface water exchanges in this river basin.
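For a sense of the quantification step, here is a minimal two-end-member mixing calculation of the kind that solute mixing diagrams support. All concentrations, the choice of tracer, and the discharge value are invented for illustration; they are not results from the study.

# Minimal sketch, assuming invented tracer concentrations: solve a
# conservative-tracer mass balance for the fraction each source
# contributes to a stream sample, then scale by measured discharge.
c_stream = 2.4    # tracer in river sample (e.g. Cl-, meq/L; assumed)
c_src_a = 1.0     # end-member A concentration (assumed)
c_src_b = 4.0     # end-member B concentration (assumed)

# Mass balance: f*c_src_a + (1 - f)*c_src_b = c_stream
f_a = (c_stream - c_src_b) / (c_src_a - c_src_b)
print(f"source A fraction: {f_a:.2f}, source B fraction: {1 - f_a:.2f}")

# Multiplying by stream discharge quantifies each base-flow contribution
discharge = 5.0   # m^3/s at the sampled reach (assumed)
print(f"A: {f_a * discharge:.2f} m^3/s, B: {(1 - f_a) * discharge:.2f} m^3/s")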
79

A Multitemporal Analysis of Georgia's Coastal Vegetation, 1990-2005

Breeden, Charles F., III, 17 April 2008
Land and vegetation changes are part of the continuous and dynamic cycle of earth-system variation. This research examines vegetation changes in the 21-county eco-region along coastal Georgia. Advanced Very High Resolution Radiometer (AVHRR) Normalized Difference Vegetation Index (NDVI) data are used in tandem with a principal component analysis (PCA) and climatic variables to determine where, and to what extent, vegetation and land-cover change is occurring. The research is designed around a 16-year time series from 1990 to 2005. The findings were that mean NDVI values were either steady or slightly improved, and that PC1 (healthiness) and PC2 (time-change) explained nearly 99 percent of the total mean variance. Declines in healthiness are primarily the result of expanding urban districts and decreased soil moisture, while increases result from restoration and increased soil moisture. This analysis is intended to serve as an assessment of land changes and a conduit for future environmental research.
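An illustrative sketch, not the study's code, of PCA applied to a multitemporal NDVI stack as described above: pixels are treated as observations and yearly composites as variables, so the leading components separate a stable pattern (healthiness) from a change-over-time pattern. The data are synthetic and the grid size is an assumption.

# Minimal sketch, assuming a synthetic 16-year annual NDVI stack:
# PCA across years, with per-pixel component scores as output.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
n_pixels, n_years = 10_000, 16          # 1990-2005 annual NDVI composites
ndvi = np.clip(0.6 + 0.1 * rng.standard_normal((n_pixels, n_years)), -1, 1)

pca = PCA(n_components=2)
scores = pca.fit_transform(ndvi)        # per-pixel PC1 and PC2 scores
print("variance explained:", pca.explained_variance_ratio_)
# Reshaping the score columns back onto the image grid yields component
# maps, interpreted e.g. as overall greenness (PC1) and change (PC2).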
80

Eigenimage Processing of Frontal Chest Radiographs

Butler, Anthony Philip Howard, January 2007
The goal of this research was to improve the speed and accuracy of reporting by clinical radiologists. Applying a technique known as eigenimage processing to chest radiographs enhanced abnormal findings, and a classification scheme was developed. Results confirm that the method is feasible for clinical use. Eigenimage processing is a popular face recognition routine that has only recently been applied to medical images, but it has not previously been applied to full-size radiographs. Chest radiographs were chosen for this research because they are clinically important and are challenging to process due to their large data content. It is hoped that the success with these images will enable future work on other medical images such as those from CT and MRI. Eigenimage processing is based on a multivariate statistical method that identifies patterns of variance within a training set of images. Specifically, it involves the application of a statistical technique called principal components analysis to a training set. For this research, the training set was a collection of 77 normal radiographs. This processing produced a set of basis images, known as eigenimages, that best describe the variance within the training set of normal images. For chest radiographs the basis images may also be referred to as 'eigenchests'. Images to be tested were described in terms of eigenimages, which identified the patterns of variance likely to be normal. A new image, referred to as the remainder image, was derived by removing the patterns of normal variance, thus making abnormal patterns of variance more conspicuous. The remainder image could either be presented to clinicians or used as part of a computer-aided diagnosis system. For the image sets used, the discriminatory power of the classification scheme approached 90%. While the processing of the training set required significant computation time, each test image to be classified or enhanced required only a few seconds to process. Thus the system could be integrated into a clinical radiology department.
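A hedged sketch, not the thesis code, of the eigenimage pipeline described above: learn a basis of 'eigenchests' from normal training radiographs, describe a test image in terms of that basis, and form the remainder image in which abnormal variance is more conspicuous. The images are synthetic and the image size and component count are assumptions.

# Minimal sketch, assuming synthetic images: eigenimage basis from 77
# normal radiographs, projection of a test image, and remainder image.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
n_train, h, w = 77, 64, 64                 # 77 normal images, downsampled
train = rng.standard_normal((n_train, h * w))

pca = PCA(n_components=20)                 # basis of eigenimages
pca.fit(train)

test = rng.standard_normal(h * w)          # image to be screened
coords = pca.transform(test[None, :])      # description in eigenimage terms
reconstruction = pca.inverse_transform(coords)[0]

# Remainder image: the test image minus its normal-variance reconstruction
remainder = (test - reconstruction).reshape(h, w)
score = np.linalg.norm(remainder)          # a large norm flags abnormality
print(f"remainder norm: {score:.1f}")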
