  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
591

STATISTICAL METHODS FOR SPECTRAL ANALYSIS OF NONSTATIONARY TIME SERIES

Bruce, Scott Alan January 2018 (has links)
This thesis proposes novel methods to address specific challenges in analyzing the frequency- and time-domain properties of nonstationary time series data motivated by the study of electrophysiological signals. A new method is proposed for the simultaneous and automatic analysis of the association between the time-varying power spectrum and covariates. The procedure adaptively partitions the grid of time and covariate values into an unknown number of approximately stationary blocks and nonparametrically estimates local spectra within blocks through penalized splines. The approach is formulated in a fully Bayesian framework, in which the number and locations of partition points are random, and fit using reversible jump Markov chain Monte Carlo techniques. Estimation and inference averaged over the distribution of partitions allows for the accurate analysis of spectra with both smooth and abrupt changes. The new methodology is used to analyze the association between the time-varying spectrum of heart rate variability and self-reported sleep quality in a study of older adults serving as the primary caregiver for their ill spouse. Another method proposed in this dissertation develops a unique framework for automatically identifying bands of frequencies exhibiting similar nonstationary behavior. This proposal provides a standardized, unifying approach to constructing customized frequency bands for different signals under study across different settings. A frequency-domain, iterative cumulative sum procedure is formulated to identify frequency bands that exhibit similar nonstationary patterns in the power spectrum through time. A formal hypothesis testing procedure is also developed to test which, if any, frequency bands remain stationary. This method is shown to consistently estimate the number of frequency bands and the location of the upper and lower bounds defining each frequency band. 
This method is used to estimate frequency bands useful in summarizing the nonstationary behavior of full-night heart rate variability data. / Statistics
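The core idea of partitioning time into approximately stationary blocks and estimating local spectra can be sketched in miniature. The sketch below uses a fixed two-block partition and raw periodograms in place of the thesis's random, MCMC-sampled partitions and penalized-spline smoothers; `local_periodograms` and the toy data are illustrative only.

```python
import numpy as np

def local_periodograms(x, n_blocks):
    """Split a series into contiguous time blocks and estimate a local
    power spectrum (raw periodogram) within each block."""
    blocks = np.array_split(np.asarray(x, dtype=float), n_blocks)
    spectra = []
    for b in blocks:
        b = b - b.mean()                              # demean within the block
        pgram = np.abs(np.fft.rfft(b)) ** 2 / len(b)  # raw periodogram
        spectra.append(pgram)
    return spectra

# toy nonstationary series: white noise whose variance quadruples halfway
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1.0, 512), rng.normal(0, 2.0, 512)])
s1, s2 = local_periodograms(x, 2)
# average spectral power in the second block should be roughly 4x the first
print(s2.mean() / s1.mean())
```

A fully Bayesian treatment would additionally place a prior on the number and locations of the block boundaries and average the local estimates over sampled partitions.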
592

Understanding school productivity study through time-related policy analysis

Williams, John M. 06 June 2008 (has links)
This study used time series analysis of 21 years (1970-1990) of school productivity data from Virginia to demonstrate the usefulness of time series models in describing variations in school input (primarily expenditures) and output (primarily student attainment and achievement) variables. In the study, a series of trend-removed, ARIMA(1,0,0) autoregressive time series models for school input variables were developed to describe long-term trends in school expenditures, instructional salaries, and pupil/teacher ratios and to account for year-to-year variation in levels of school inputs. Residuals from these models for school inputs were correlated with student attainment scores and achievement score residuals with student ability removed to identify those school productivity inputs having the strongest association with school outputs. The scores of input variables having strong associations with school outputs were then plotted over the 1970-90 time period and descriptively related to historical records of legislative and administrative policy decisions thought to have had statewide effects on school productivity in Virginia. The association of school productivity relationship changes with actual policy events was then described. All school input variables could be described with time series accounting for 90+% of the year-to-year variance in inputs. Time series residuals from expenditures, instructional salaries, and pupil/teacher ratio inputs were moderately to strongly associated with two output measures: 1) the percent of Virginia school graduates attending college; and 2) the percent of dropouts, in most Virginia school districts (30 < N < 100). These inputs shared 20 to 40% of their variance in common with school attainment outputs. School input residuals for local expenditures and pupil/teacher ratio were also strongly associated with reading, math, and language arts achievement residuals in a small number (N = 2-31) of Virginia school districts.
Stronger relationships between inputs and achievement scores in greater numbers of Virginia school districts may be revealed when more years of data are available for future analysis. Plots of significant school input variables concurrently with school outputs and historical policy change events suggested that at least three policy change events may have had positive long-term effects on school productivity in Virginia from 1970-90. Legislative commitment to a reduction in pupil/teacher ratio in the early 1980s seems to be associated with a long-term decrease in dropout rates and increases in college attendance among students in most Virginia school districts. Commitment to higher teacher salaries in the same time period also seems to be associated with positive changes in college attendance and reductions in dropout rates. Finally, the long-term expansion of total educational expenditures in Virginia, primarily through adoption of special education, health education, and dropout prevention curriculum initiatives, seems to be associated with rising levels of student promotion rates, percent of ninth-grade students graduating, and percent of students attending college from 1970-90. / Ph. D.
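The trend-removed AR(1) modeling step described above can be sketched with a least-squares fit standing in for full ARIMA estimation; the simulated "expenditure" series and the name `ar1_residuals` are illustrative, not the Virginia data.

```python
import numpy as np

def ar1_residuals(y):
    """Remove a linear trend, fit an AR(1) model to the detrended series
    by least squares, and return the one-step innovations."""
    y = np.asarray(y, dtype=float)
    t = np.arange(len(y))
    z = y - np.polyval(np.polyfit(t, y, 1), t)            # trend removal
    phi = np.dot(z[:-1], z[1:]) / np.dot(z[:-1], z[:-1])  # AR(1) coefficient
    return z[1:] - phi * z[:-1]

# toy "expenditure" series: linear trend plus AR(1) noise
rng = np.random.default_rng(1)
e = np.zeros(200)
for t in range(1, 200):
    e[t] = 0.7 * e[t - 1] + rng.normal()
inputs = 100.0 + 3.0 * np.arange(200) + e
resid = ar1_residuals(inputs)
# residuals should be approximately white noise
r1 = np.dot(resid[:-1], resid[1:]) / np.dot(resid, resid)
print(round(r1, 3))
```

In the study's design, residual series like `resid` would then be correlated with output measures such as attainment scores.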
593

The use of neural networks in the combining of time series forecasts with differential penalty costs

Kohers, Gerald 21 October 2005 (has links)
The need for accurate forecasting and its potential benefits are well established in the literature. Virtually all individuals and organizations have at one time or another made decisions based on forecasts of future events. This widespread need for accurate predictions has resulted in considerable growth in the science of forecasting. To a large degree, practitioners are heavily dependent on academicians for generating new and improved forecasting techniques. In response to an increasingly dynamic environment, diverse and complex forecasting methods have been proposed to more accurately predict future events. These methods, which focus on the different characteristics of historical data, have ranged in complexity from simplistic to very sophisticated mathematical computations requiring a high level of expertise. By combining individual techniques to form composite forecasts in order to improve on the forecasting accuracy, researchers have taken advantage of the various strengths of these techniques. A number of combining methods have proven to yield better forecasts than individual methods, with the complexity of the various combining methods ranging from a simple average to quite complex weighting schemes. The focus of this study is to examine the usefulness of neural networks in composite forecasting. Emphasis is placed on the effectiveness of two neural networks (i.e., a backpropagation neural network and a modular neural network) relative to three traditional composite models (i.e., a simple average, a constrained mathematical programming model, and an unconstrained mathematical programming model) in the presence of four penalty cost functions for forecasting errors. Specifically, the overall objective of this study is to compare the short-term predictive ability of each of the five composite forecasting techniques on various first-order autoregressive models, taking into account penalty cost functions representing four different situations.
The results of this research suggest that in the vast majority of scenarios examined in this study, the neural network model clearly outperformed the other composite models. / Ph. D.
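At its simplest, a composite forecast is a weighted average of the individual forecasts. The sketch below shows the simple-average combiner, one of the traditional models named above, alongside an arbitrary fixed weighting; the neural-network combiners are too involved for a short example, and `combine` is an illustrative name.

```python
import numpy as np

def combine(forecasts, weights=None):
    """Composite forecast: a weighted average of individual forecasts.
    With no weights given, this is the simple-average combining method."""
    f = np.asarray(forecasts, dtype=float)
    if weights is None:
        weights = np.full(f.shape[0], 1.0 / f.shape[0])
    return float(np.dot(weights, f))

# toy example: two forecasters predicting a true value of 10.0
f = [9.0, 11.5]
simple = combine(f)                        # simple average
tilted = combine(f, weights=[0.7, 0.3])    # fixed unequal weights
print(simple, tilted)                      # ≈ 10.25 and 9.75
```

The mathematical programming and neural-network approaches studied in the dissertation effectively learn the weights (or a nonlinear combining function) from past forecast errors rather than fixing them in advance.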
594

A New State Transition Model for Forecasting-Aided State Estimation for the Grid of the Future

Hassanzadeh, Mohammadtaghi 09 July 2014 (has links)
The grid of the future will be more decentralized due to the significant increase in distributed generation and microgrids. In addition, due to the proliferation of large-scale intermittent wind power, the randomness in power system state will increase to unprecedented levels. This dissertation proposes a new state transition model for power system forecasting-aided state estimation, which aims at capturing the increasing stochastic nature in the states of the grid of the future. The proposed state forecasting model is based on time-series modeling of filtered system states and it takes spatial correlation among the states into account. Once the states with high spatial correlation are identified, the time-series models are developed to capture the dependency of voltages and angles in time and among each other. The temporal correlation in power system states (i.e. voltage angles and magnitudes) is modeled by using autoregression, while the spatial correlation among the system states (i.e. voltage angles) is modeled using vector autoregression. Simulation results show significant improvement in power system state forecasting accuracy, especially in the presence of distributed generation and microgrids. / Ph. D.
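The vector-autoregression idea can be sketched as a least-squares VAR(1) fit to simulated two-state histories; `fit_var1`, the toy coefficient matrix, and the noise level are illustrative assumptions, not the dissertation's model.

```python
import numpy as np

def fit_var1(X):
    """Least-squares fit of a VAR(1) model x_t = A x_{t-1} + e_t to a
    (T, k) array of state histories (e.g. bus voltage angles)."""
    X = np.asarray(X, dtype=float)
    Y, Z = X[1:], X[:-1]
    A_T, *_ = np.linalg.lstsq(Z, Y, rcond=None)   # solves Z @ A_T ≈ Y
    return A_T.T

def forecast(A, x_last):
    """One-step-ahead state forecast."""
    return A @ x_last

# toy: two spatially correlated state series driven by a known matrix
rng = np.random.default_rng(2)
A_true = np.array([[0.8, 0.1],
                   [0.2, 0.7]])
X = np.zeros((500, 2))
for t in range(1, 500):
    X[t] = A_true @ X[t - 1] + rng.normal(0, 0.1, 2)
A_hat = fit_var1(X)
print(np.round(A_hat, 2))          # close to A_true
x_next = forecast(A_hat, X[-1])    # one-step-ahead forecast
```

In forecasting-aided state estimation, the histories in `X` would be the filtered state estimates rather than a simulation, and the forecast would prime the next estimation cycle.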
595

The effect of segment averaging on the quality of the Burg spectral estimator

Rahman, Md. Anisur January 1984 (has links)
The Burg spectral estimator (BSE) exhibits better peak resolution than conventional linear spectral estimators, particularly for short data records. Based on this property, the quality of the BSE is investigated with the available data record segmented and the relevant parameters or functions associated with each segment averaged. Averaging of autoregressive coefficients, reflection coefficients, or spectral density functions is used with the BSE and the corresponding performances are studied. Approximate expressions for the mean and variance of these modified Burg spectral estimators are derived. Lower bounds for the mean and variance of reflection coefficients are also deduced. Finally, the variance of the estimation errors associated with the modified power spectral density estimators is compared against the theoretical Cramér-Rao lower bound. / M.S.
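One of the averaging schemes studied, averaging reflection coefficients across segments, can be sketched as follows. The `burg_reflection` function below is the standard lattice form of Burg's recursion; the segment count and the AR(1) test signal are illustrative assumptions.

```python
import numpy as np

def burg_reflection(x, order):
    """Reflection coefficients of an AR(order) model via Burg's lattice
    recursion (forward/backward prediction error minimization)."""
    f = np.asarray(x, dtype=float).copy()   # forward prediction errors
    b = f.copy()                            # backward prediction errors
    ks = []
    for _ in range(order):
        ef, eb = f[1:], b[:-1]
        k = -2.0 * np.dot(ef, eb) / (np.dot(ef, ef) + np.dot(eb, eb))
        f, b = ef + k * eb, eb + k * ef     # lattice error update
        ks.append(k)
    return np.array(ks)

def segment_averaged_reflection(x, order, n_segments):
    """Segment the record and average the per-segment reflection
    coefficients (one of the averaging schemes under study)."""
    segs = np.array_split(np.asarray(x, dtype=float), n_segments)
    return np.mean([burg_reflection(s, order) for s in segs], axis=0)

# toy AR(1) record with phi = 0.9: first reflection coefficient ≈ -0.9
rng = np.random.default_rng(3)
x = np.zeros(2048)
for t in range(1, 2048):
    x[t] = 0.9 * x[t - 1] + rng.normal()
k_avg = segment_averaged_reflection(x, order=1, n_segments=4)
print(np.round(k_avg, 2))
```

The averaged reflection coefficients would then be converted to AR coefficients via the Levinson recursion to form the averaged spectral estimate.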
596

Labor Market Dynamics in West Virginia and the Appalachian Region

Beverly, Joshua Paul 11 January 2023 (has links)
This dissertation consists of three manuscripts analyzing labor market dynamics in West Virginia and the Appalachian Region. The first manuscript examines the dynamic effects of national, regional, and local labor market shocks on labor force participation rates (LFPRs) in Appalachia. A dynamic factor model with time-varying loading parameters and stochastic volatility is used to explore the synchronicity and divergence between state labor force participation rates within and outside the Appalachian region. We find that the relative importance of synchronization in explaining observed variation in LFPR changes depends strongly on the time period and state considered. Our findings can help better target labor policy by taking advantage of the sensitivity exhibited by each state to various labor market conditions. The second manuscript examines the dynamic effects of state, metro/non-metro, and county labor market shocks on labor force participation rates in West Virginia. In the first stage, using a dynamic factor model, we find that non-metropolitan and county-specific components are dominant contributors to the observed variations in the change in West Virginia LFPRs. In the second stage, using a fixed effects panel model, we find that county demographics, education levels, income, access to interstate highways, and industry composition are useful covariates for explaining the variance contributions of the state, metro/non-metro, and county factors. The third manuscript uses cointegration analysis in the presence of structural breaks to determine whether the Unemployment Invariance Hypothesis holds in West Virginia. Using monthly labor force data from 1976-2022, we find mixed support for the unemployment invariance, added worker effect, and discouraged worker effect hypotheses over multiple sub-sample periods. These results suggest that labor markets are temporally dynamic, and a one-size-fits-all approach could prove disadvantageous to growth.
/ Doctor of Philosophy / This dissertation focuses on labor market dynamics in West Virginia and the Appalachian Region. In the first of three manuscripts, we investigate how much U.S. state labor force participation rates move together nationally and within the Appalachian Region. We find that how much labor force participation rates move together across the U.S. and within the Appalachian Region depends on the choice of time and state. In the second manuscript, we examine how much West Virginia county labor force participation rates move together across the state and within the metropolitan and non-metropolitan regions. We also study how county characteristics such as industry composition and education levels influence the variation in how much labor force participation rates move together. We find that non-metropolitan county labor force participation rates exhibit similar dynamic behavior and that education, personal income, access to highways, and industry composition of the counties influence how much the rates move together at the different levels. In the third manuscript, we investigate whether changes in the unemployment rate in West Virginia influence the state's labor force participation rate in the long run. We find evidence of such a long-run relationship, albeit one that changes over time. We posit that the relationship dynamics are largely explained by the prevailing labor market and economic conditions. By extension, labor market policies and interventions should be timely and flexible.
597

Deforestation, degradation, and natural disturbance in the Amazon: using a new monitoring approach to estimate area and carbon loss

Bullock, Eric L. 10 February 2020 (has links)
Forest degradation causes environmental damage and carbon emissions, but its extent and magnitude are not well understood. New methods for monitoring forest degradation and deforestation show that more disturbance has occurred in the Amazon in recent decades than previously realized, indicating an unaccounted-for source of carbon emissions and damage to Amazon ecosystems. Forest degradation and natural disturbance change a landscape, but the visible damage apparent in satellite images may be temporary and difficult to differentiate from undisturbed forests. Time series analysis of Landsat data, used in a spectral mixture analysis, improves monitoring of forest degradation and natural disturbance. In addition, the use of statistical inference accounts for classification bias and provides an estimate of uncertainty. Application of the methodology developed in this dissertation to the Amazon Ecoregion found that forest degradation and natural disturbance were more prevalent than deforestation from 1995 to 2017. Consequently, the total area of forest in the Amazon that has been recently disturbed is greater than previously known. Overall, deforestation affected 327,900 km2 (±15,500) of previously undisturbed forest in the Amazon, while degradation and natural disturbance affected 434,500 km2 (±22,100). Forest degradation and natural disturbance occur more frequently during drought years, which have increased in frequency and severity in recent years. Deforestation has largely decreased since 2004, while forest degradation and natural disturbance have remained consistent. Previously disturbed forests are lower in biomass than undisturbed forests, yet regeneration after disturbance gradually sequesters carbon. A carbon flux model shows that gross aboveground carbon loss from forest degradation and natural disturbance and from deforestation from 1996 to 2017 in the Amazon was 2.2-2.8 Pg C and 3.3-4.3 Pg C, respectively.
Since 2008, however, carbon loss from degradation and natural disturbance has been approximately the same as from deforestation. The methodologies developed in this dissertation are useful for monitoring deforestation and degradation throughout the world's forest ecosystems. By leveraging dense time series of data, statistical inference, and carbon modeling, it is possible to quantify areas of deforestation and forest degradation in addition to the resulting carbon emissions. The results of this dissertation stress the importance of degradation and natural disturbance in the global carbon cycle and provide information valuable for climate science and conservation initiatives.
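The spectral mixture analysis step can be illustrated with linear unmixing of a single pixel by least squares. The endmember spectra, band count, and fractions below are invented for illustration (not Landsat values), and a full implementation would typically add sum-to-one and non-negativity constraints and run over every date in the time series.

```python
import numpy as np

def unmix(pixel, endmembers):
    """Linear spectral mixture analysis: estimate endmember fractions for
    one pixel by least squares (no constraints applied in this sketch)."""
    fracs, *_ = np.linalg.lstsq(endmembers.T, pixel, rcond=None)
    return fracs

# invented 4-band reflectance spectra for three endmembers
E = np.array([[0.05, 0.08, 0.40, 0.30],   # vegetation
              [0.20, 0.25, 0.30, 0.35],   # soil
              [0.01, 0.01, 0.02, 0.02]])  # shade
true_fracs = np.array([0.6, 0.3, 0.1])
pixel = true_fracs @ E                    # noiseless mixed pixel
f_hat = unmix(pixel, E)
print(np.round(f_hat, 2))                 # recovers [0.6, 0.3, 0.1]
```

Degradation monitoring then looks for temporal shifts in these fraction series (e.g. rising soil or shade fractions) rather than wholesale land-cover conversion.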
598

Application of Time Series Analysis in Video Background Subtraction

Cai, Yicheng January 2024 (has links)
This thesis presents statistical methods for video background subtraction. I introduce the problem and analyze it with several statistical methods, including histogram statistics and Gaussian mixture models. Going further, I develop a time series approach to video background subtraction based on the Kalman filter and present its predictions and evaluations.
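A minimal per-pixel Kalman-filter background model in the spirit of the approach described might look like the sketch below. All parameter values are arbitrary assumptions, and a more robust version would skip the background update for pixels flagged as foreground.

```python
import numpy as np

def kalman_background(frames, q=1e-3, r=1e-1, thresh=3.0):
    """Per-pixel scalar Kalman filter for background modeling. The state
    is the background intensity; a pixel is flagged as foreground when
    its innovation exceeds `thresh` standard deviations."""
    frames = np.asarray(frames, dtype=float)
    x = frames[0].copy()          # background estimate per pixel
    p = np.ones_like(x)           # estimate variance per pixel
    masks = []
    for z in frames[1:]:
        p = p + q                 # predict step (random-walk background)
        innov = z - x             # innovation
        s = p + r                 # innovation variance
        masks.append(np.abs(innov) > thresh * np.sqrt(s))
        k = p / s                 # Kalman gain
        x = x + k * innov         # update background estimate
        p = (1.0 - k) * p
    return x, masks

# toy: a static 8x8 background with a bright object in one frame
rng = np.random.default_rng(4)
frames = [50.0 + rng.normal(0, 0.3, (8, 8)) for _ in range(20)]
frames[10][2, 3] += 40.0          # object appears at pixel (2, 3)
_, masks = kalman_background(frames)
print(masks[9][2, 3])             # True: the object pixel is flagged
```

Histogram and Gaussian-mixture methods replace the single Gaussian state per pixel with a richer per-pixel intensity distribution.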
599

Overtourism in Dichotomies: Uncovering Dynamic and Non-Dynamic Costs and Benefits in Three Tourism Destinations

Baktash, Aarash 01 January 2023 (has links) (PDF)
The phenomenon of overtourism, characterized by its multifaceted impacts on destinations, has emerged as a major concern in the tourism industry. This dissertation aims to explore the dynamics of overtourism, emphasizing the dual impacts of main-source tourism markets on destinations in terms of their economic, social, and environmental consequences. Unlike existing literature, which focuses primarily on the negative aspects of overtourism, the present study illustrates the nuanced interaction between tourism markets by highlighting both their potential benefits and disadvantages. This study offers an in-depth analysis of cost and benefit factors based on a priori and a posteriori segmentation methodologies, combined with time-series analysis and limited information maximum likelihood (LIML) methods. Based on three case studies—Hong Kong, Malta, and Barbados—from 1980 to 2021, this study demonstrates the heterogeneous nature of the impacts across destinations and the complexities of market aggregation and interaction. The study identifies gaps in the conventional narrative of overtourism and introduces an interdisciplinary approach to the investigation. Based on the symbiotic framework, coupled with the Portfolio Theory, market aggregations and interactions can be classified into mutualism, commensalism, and parasitism. Additionally, the 'limits of acceptable change' (LAC) and the 'level of analysis problem' (LAP) frameworks have been utilized to further examine dominant and non-dominant markets' aggregation effects and interaction dynamics, resulting in a more comprehensive understanding of overtourism's complexity. Key findings suggest tailoring strategies to address overtourism, emphasizing the balance between minimizing costs and optimizing benefits. Based on the findings of this study, policymakers and stakeholders must develop strategies that respond to the challenges associated with overtourism by integrating empirical measures with theoretical frameworks.
600

Forecasting Highly-Aggregate Internet Time Series Using Wavelet Techniques

Edwards, Samuel Zachary 28 August 2006 (has links)
The U.S. Coast Guard maintains a network structure to connect its nationwide assets. This paper analyzes and models four highly aggregate traces of the traffic to/from the Coast Guard Data Network ship-shore nodes, so that the models may be used to predict future system demand. These internet traces (polled at 5' to 40' intervals) are shown to adhere to a Gaussian distribution upon detrending, which imposes limits to the exponential distribution of higher time-resolution traces. Wavelet estimation of the Hurst parameter is shown to outperform estimation by another common method (sample variances). The First Differences method of detrending proved problematic to this analysis and is shown to decorrelate AR(1) processes where 0.65 < phi1 < 1.35 and correlate AR(1) processes with phi1 < -0.25. The Hannan-Rissanen method for estimating (phi, theta) is employed to analyze this series and a one-step-ahead forecast is generated. / Master of Science
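The first-differencing result quoted above can be checked numerically: for a stationary AR(1) process, the lag-1 autocorrelation of the differenced series is -(1 - phi1)/2, which is near zero when phi1 is near 1 and strongly negative when phi1 is negative. A small illustrative simulation (not the Coast Guard traces):

```python
import numpy as np

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

def ar1(phi, n, seed=0):
    """Simulate a stationary AR(1) process x_t = phi * x_{t-1} + e_t."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

# theory: lag-1 autocorrelation of the first differences is -(1 - phi)/2,
# so differencing decorrelates near-unit-root series (phi near 1) but
# induces strong negative correlation when phi is negative
for phi in (0.9, -0.5):
    d = np.diff(ar1(phi, 50_000))
    print(phi, round(lag1_autocorr(d), 2))
```

For phi = 0.9 the differenced series is nearly white (theoretical value -0.05), while for phi = -0.5 differencing induces a strong negative correlation (theoretical value -0.75), consistent with the caution about the First Differences method.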
