  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
151

Essays on Time Series Analysis : With Applications to Financial Econometrics

Preve, Daniel January 2008
This doctoral thesis comprises four papers, all relating to the subject of time series analysis.

The first paper considers point estimation in a nonnegative, hence non-Gaussian, AR(1) model. The parameter estimation is carried out using a type of extreme value estimator (EVE). A novel estimation method based on the EVEs is presented. The theoretical analysis is complemented with Monte Carlo simulation results, and the paper concludes with an empirical example.

The second paper extends the model of the first paper and considers semiparametric, robust point estimation in a nonlinear nonnegative autoregression. The nonnegative AR(1) model of the first paper is extended in three important ways: first, we allow the errors to be serially correlated; second, we allow for heteroskedasticity of unknown form; third, we allow for a multi-variable mapping of previous observations. Once more, the EVEs used for parameter estimation are shown to be strongly consistent under very general conditions. The theoretical analysis is complemented with extensive Monte Carlo simulation studies that illustrate the asymptotic theory and indicate reasonable small-sample properties of the proposed estimators.

In the third paper we construct a simple nonnegative time series model for realized volatility, use the results of the second paper to estimate the proposed model on S&P 500 monthly realized volatilities, and then use the estimated model to make one-month-ahead forecasts. The out-of-sample performance of the proposed model is evaluated against a number of standard models, using various tests and accuracy measures. Forecasts from the nonnegative model are found to perform exceptionally well under the mean absolute error and mean absolute percentage error accuracy measures.

In the fourth and last paper we construct a multivariate extension of the popular Diebold-Mariano test. Under the null hypothesis of equal predictive accuracy of three or more forecasting models, the proposed test statistic has an asymptotic chi-squared distribution. To explore whether the behavior of the test in moderate-sized samples can be improved, we also provide a finite-sample correction. A small-scale Monte Carlo study indicates that the proposed test has reasonable size properties in large samples and that it benefits noticeably from the finite-sample correction, even in quite large samples. The paper concludes with an empirical example illustrating the practical use of the two tests.
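The minimum-ratio extreme value estimator for a nonnegative AR(1) can be sketched in a few lines. This is a generic illustration from the literature on nonnegative autoregressions, not necessarily the thesis's exact estimator: because the errors are nonnegative, every ratio X_t/X_{t-1} lies at or above the autoregressive parameter, so the minimum ratio estimates it from above. The exponential errors and parameter values below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_nonneg_ar1(alpha, n, rng):
    # nonnegative AR(1): X_t = alpha * X_{t-1} + e_t, with e_t >= 0
    # (exponential errors chosen here for illustration)
    x = np.empty(n)
    x[0] = rng.exponential()
    for t in range(1, n):
        x[t] = alpha * x[t - 1] + rng.exponential()
    return x

def eve_min_ratio(x):
    # since e_t >= 0, every ratio X_t / X_{t-1} = alpha + e_t / X_{t-1}
    # is at least alpha; the minimum ratio converges down to alpha
    return np.min(x[1:] / x[:-1])

x = simulate_nonneg_ar1(0.5, 5000, rng)
alpha_hat = eve_min_ratio(x)
```

With a long enough sample, the estimate sits just above the true parameter, never below it, which is the characteristic one-sided behaviour of this estimator family.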
152

Robust spatio-temporal latent variable models

Christmas, Jacqueline January 2011
Principal Component Analysis (PCA) and Canonical Correlation Analysis (CCA) are widely used mathematical models for decomposing multivariate data. They capture spatial relationships between variables, but ignore any temporal relationships that might exist between observations. Probabilistic PCA (PPCA) and Probabilistic CCA (ProbCCA) are versions of these two models that explain the statistical properties of the observed variables as linear mixtures of an alternative, hypothetical set of hidden, or latent, variables, and explicitly model noise. Both the noise and the latent variables are assumed to be Gaussian distributed. This thesis introduces two new models, named PPCA-AR and ProbCCA-AR, that augment PPCA and ProbCCA respectively with autoregressive processes over the latent variables to additionally capture temporal relationships between the observations. To make PPCA-AR and ProbCCA-AR robust to outliers and able to model leptokurtic data, the Gaussian assumptions are replaced with infinite scale mixtures of Gaussians, using the Student-t distribution. Bayesian inference calculates posterior probability distributions for each of the parameter variables, from which we obtain a measure of confidence in the inference. It avoids the pitfalls associated with the maximum likelihood method: integrating over all possible values of the parameter variables guards against overfitting. For these new models the integrals required for exact Bayesian inference are intractable; instead a method of approximation, the variational Bayesian approach, is used. This enables the use of automatic relevance determination to estimate the model orders. PPCA-AR and ProbCCA-AR can be viewed as linear dynamical systems, so the forward-backward algorithm (the posterior-inference recursion at the core of the Baum-Welch procedure) is used as an efficient method for inferring the posterior distributions of the latent variables.
The exact algorithm is tractable because Gaussian assumptions are made regarding the distribution of the latent variables. This thesis introduces a variational Bayesian forward-backward algorithm based on Student-t assumptions. The new models are demonstrated on synthetic datasets and on real remote sensing and EEG data.
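The key robustness device above, representing the Student-t as an infinite scale mixture of Gaussians, is easy to demonstrate numerically: if the precision is Gamma-distributed with shape and rate both nu/2, the resulting Gaussian mixture is exactly a Student-t with nu degrees of freedom, and its tails are visibly heavier than a standard normal's. The parameter values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
nu = 5.0        # degrees of freedom (illustrative choice)
n = 200_000

# scale-mixture construction: tau ~ Gamma(nu/2, rate nu/2),
# then x | tau ~ N(0, 1/tau) gives x ~ Student-t with nu d.o.f.
tau = rng.gamma(shape=nu / 2, scale=2 / nu, size=n)
x = rng.normal(0.0, 1.0 / np.sqrt(tau))

# heavy tails: for a standard normal, P(|Z| > 3) is only about 0.0027
tail_frac = np.mean(np.abs(x) > 3)
```

The sample variance should be close to the theoretical nu/(nu-2), and the tail fraction an order of magnitude above the Gaussian benchmark, which is exactly why this mixture makes the latent variable models robust to outliers.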
153

Dopad nekonvenční měnové politiky ECB na střední a východní Evropu: Analýza panelovým VAR modelem / The Impact of Unconventional Monetary Policy of ECB to Central and Eastern European Countries: A Panel VAR Analysis

Hálová, Klára January 2015
In this thesis we examine the macroeconomic effects of the unconventional monetary policy introduced by the European Central Bank during the crisis by estimating a panel vector autoregression. We study the impact of such policies using monthly data from 13 Central and Eastern European countries over the seven-year period from 2008 to 2014. We find positive reactions of output and prices to an expansionary unconventional monetary policy shock. Our results provide evidence that a decrease in the ECB's shadow policy rate leads to a rise in output as well as a temporary rise in inflation; however, the effect on inflation is weaker and less persistent. We also find that unconventional monetary policy positively influences market uncertainty, but we do not find any significant effect on exchange rates. Individual country estimates suggest that the reaction of exchange rates to a non-standard monetary policy shock varies significantly across countries.
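The core VAR machinery behind such an analysis, OLS estimation of the lag matrix and impulse responses as its powers, can be sketched on a bivariate toy system. This is a generic reduced-form VAR(1) with made-up coefficients, not the panel VAR or identification scheme of the thesis.

```python
import numpy as np

rng = np.random.default_rng(3)

# true VAR(1): y_t = A @ y_{t-1} + e_t (illustrative coefficients)
A = np.array([[0.5, 0.1],
              [0.0, 0.4]])
n = 2000
y = np.zeros((n, 2))
for t in range(1, n):
    y[t] = A @ y[t - 1] + rng.normal(size=2)

# OLS: regress y_t on y_{t-1} to recover the lag matrix
Y0, Y1 = y[:-1], y[1:]
A_hat = np.linalg.solve(Y0.T @ Y0, Y0.T @ Y1).T

# impulse responses to a unit shock in variable 0: IRF(h) = A_hat^h @ e0
shock = np.array([1.0, 0.0])
irf = [np.linalg.matrix_power(A_hat, h) @ shock for h in range(6)]
```

In the stable case the impulse responses decay geometrically toward zero, which is the pattern behind statements such as "a temporary rise in inflation" in the abstract above.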
154

Understanding transcriptional regulation through computational analysis of single-cell transcriptomics

Lim, Chee Yee January 2017
Gene expression is tightly regulated by complex transcriptional regulatory mechanisms to achieve specific expression patterns, which are essential to facilitate important biological processes such as embryonic development. Dysregulation of gene expression can lead to diseases such as cancer. A better understanding of transcriptional regulation will therefore not only advance the understanding of fundamental biological processes, but also provide mechanistic insights into disease. Earlier high-throughput expression profiling techniques were limited to measuring average gene expression across large pools of cells. In contrast, recent technological improvements have made it possible to perform expression profiling in single cells. Single-cell expression profiling is able to capture heterogeneity among single cells, which is not possible in conventional bulk expression profiling. In my PhD, I focus on developing new algorithms, as well as benchmarking and utilising existing algorithms, to study the transcriptomes of various biological systems using single-cell expression data. I have developed two single-cell-specific network inference algorithms, BTR and SPVAR, based on two different formalisms, Boolean and autoregression frameworks respectively. BTR was shown to be useful for improving existing Boolean models with single-cell expression data, while SPVAR was shown to be a conservative predictor of gene interactions using pseudotime-ordered single-cell expression data. In addition, I have obtained novel biological insights by analysing single-cell RNA-seq data from epiblast stem cell reprogramming and leukaemia systems. Three different driver genes, namely Esrrb, Klf2 and GY118F, were shown to drive reprogramming of epiblast stem cells via different reprogramming routes. As for the leukaemia system, FLT3-ITD and IDH1-R132H mutations were shown to interact with each other and potentially predispose some cells to developing acute myeloid leukaemia.
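The Boolean formalism underlying network models such as BTR can be illustrated with a toy synchronous Boolean network. The genes and update rules below are entirely hypothetical; this shows the formalism, not the BTR algorithm itself.

```python
# toy Boolean network: each gene's next state is a logic function
# of the current state of the network (hypothetical genes and rules)
rules = {
    "A": lambda s: s["A"],                 # constitutively active
    "B": lambda s: s["A"] and not s["C"],  # activated by A, repressed by C
    "C": lambda s: s["B"],                 # activated by B
}

def step(state):
    # synchronous update: every gene reads the same previous state
    return {gene: int(rule(state)) for gene, rule in rules.items()}

state = {"A": 1, "B": 0, "C": 0}
trajectory = [state]
for _ in range(4):
    state = step(state)
    trajectory.append(state)
```

Starting from (A, B, C) = (1, 0, 0), the negative feedback through C produces a limit cycle: after four synchronous updates the network returns to its initial state. Attractors like this are the Boolean analogue of stable expression states.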
155

Draining the Pathogenic Reservoir of Guilt? : A study of the relationship between Guilt and Self-Compassion in Intensive Short-Term Dynamic Psychotherapy

Nygren, Tomas, Johansson, Claes January 2015
Objective: One of the main theoretical proposals of Intensive Short-Term Dynamic Psychotherapy (ISTDP; Davanloo, 1990) is that experiencing previously unconscious guilt over aggressive impulses associated with attachment trauma leads to an increase in self-compassion. The present study aimed to test this assumption. Method: Videotaped sessions from five therapies from a randomized controlled trial of 20 sessions of time-limited ISTDP for treatment-refractory depression were rated with the Achievement of Therapeutic Objectives Scale (ATOS; McCullough, Larsen, Schanche, Andrews & Kuhn, 2003b). Degree of patient guilt arousal and self-compassion were rated for all available sessions. Data were analyzed using a replicated single-subject time-series approach. Results: Guilt arousal was not shown to positively predict self-compassion for any of the five patients. For one patient, guilt arousal negatively predicted self-compassion two sessions ahead in time. Conclusion: The current study yields no support for the proposal that the experience of guilt over aggressive feelings and impulses leads to increases in self-compassion. On the contrary, the finding that guilt negatively predicted self-compassion for one patient must be considered an indication that this treatment process might negatively impact self-compassion for some patients in some contexts. However, there are several methodological limitations to the current study, in the light of which the results should be regarded as tentative.
156

Modelling macroeconomic time series with smooth transition autoregressions

Skalin, Joakim January 1998
Among the parametric nonlinear time series model families, the smooth transition regression (STR) model has recently received attention in the literature. The considerations in this dissertation focus on the univariate special case of this model, the smooth transition autoregression (STAR) model, although large parts of the discussion can easily be generalised to the more general STR case. Many nonlinear univariate time series models can be described as consisting of a number of regimes, each one corresponding to a linear autoregressive parametrisation, between which the process switches. In the STAR models, as opposed to certain other popular models involving multiple regimes, the transition between the extreme regimes is smooth and assumed to be characterised by a bounded continuous function of a transition variable. The transition variable, in turn, may be a lagged value of the variable in the model, or another stochastic or deterministic observable variable. A number of other commonly discussed nonlinear autoregressive models can be viewed as special or limiting cases of the STAR model. The applications presented in the first two chapters of this dissertation (Chapter I, "Another look at Swedish Business Cycles, 1861-1988", and Chapter II, "Modelling asymmetries and moving equilibria in unemployment rates") make use of STAR models. In these two studies, STAR models are used to provide insight into dynamic properties of the time series which cannot be properly characterised by linear time series models, and which may thereby be obscured by estimating only a linear model in cases where linearity would be rejected if tested.
The applications being of interest in their own right, an important common objective of these two chapters is also to develop, suggest, and give examples of various methods that may be of use in discussing the dynamic properties of estimated STAR models in general. Chapter III, "Testing linearity against smooth transition autoregression using a parametric bootstrap", reports the results of a small simulation study of a new test of linearity against STAR based on bootstrap methodology. Diss. Stockholm: Handelshögskolan, 1999.
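The smooth transition mechanism can be made concrete with a minimal logistic STAR(1) sketch. The transition function and all parameter values below are generic textbook choices, not the specifications estimated in the dissertation: the AR coefficient moves smoothly between two regimes as a logistic function of the lagged level.

```python
import numpy as np

def logistic_transition(s, gamma, c):
    # bounded continuous transition function: G -> 0 in one extreme
    # regime and G -> 1 in the other; gamma sets the transition speed,
    # c the location of the switch
    return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

def lstar1(n, phi_low, phi_high, gamma, c, rng):
    # LSTAR(1): the effective AR coefficient is a smooth blend of the
    # two regime coefficients, weighted by G(y_{t-1})
    y = np.zeros(n)
    for t in range(1, n):
        g = logistic_transition(y[t - 1], gamma, c)
        phi = (1 - g) * phi_low + g * phi_high
        y[t] = phi * y[t - 1] + rng.normal(scale=0.5)
    return y

rng = np.random.default_rng(4)
y = lstar1(500, phi_low=0.9, phi_high=0.3, gamma=5.0, c=0.0, rng=rng)
```

As gamma grows, the logistic function approaches a step and the model tends to a threshold AR, one of the limiting cases mentioned in the abstract.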
158

Modeling financial volatility : A functional approach with applications to Swedish limit order book data

Elezovic, Suad January 2009
This thesis is designed to offer an approach to modeling volatility in the Swedish limit order market. Realized quadratic variation is used as an estimator of the integrated variance, which is a measure of the variability of a stochastic process in continuous time. Moreover, a functional time series model for the realized quadratic variation is introduced. A two-step estimation procedure for such a model is then proposed. Some properties of the proposed two-step estimator are discussed and illustrated through an application to high-frequency financial data and simulated experiments. In Paper I, the concept of realized quadratic variation, obtained from the bid and ask curves, is presented. In particular, an application to the Swedish limit order book data is performed using signature plots to determine an optimal sampling frequency for the computations. The paper is the first study that introduces realized quadratic variation in a functional context. Paper II introduces functional time series models and applies them to the modeling of volatility in the Swedish limit order book. More precisely, a functional approach to the estimation of volatility dynamics of the spreads (differences between the bid and ask prices) is presented through a case study. For that purpose, a two-step procedure for the estimation of functional linear models is adapted to the estimation of a functional dynamic time series model. Paper III studies a two-step estimation procedure for the functional models introduced in Paper II.
For that purpose, data is simulated using the Heston stochastic volatility model, thereby obtaining time series of realized quadratic variations as functions of relative quantities of shares. In the first step, a dynamic time series model is fitted to each time series. This results in a set of inefficient raw estimates of the coefficient functions. In the second step, the raw estimates are smoothed. The second step improves on the first step since it yields both smooth and more efficient estimates. In this simulation, the smooth estimates are shown to perform better in terms of mean squared error. Paper IV introduces an alternative to the two-step estimation procedure mentioned above. This is achieved by taking into account the correlation structure of the error terms obtained in the first step. The proposed estimator is based on seemingly unrelated regression representation. Then, a multivariate generalized least squares estimator is used in a first step and its smooth version in a second step. Some of the asymptotic properties of the resulting two-step procedure are discussed. The new procedure is illustrated with functional high-frequency financial data.
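The basic estimator in this thesis, realized quadratic variation as an estimate of integrated variance, can be illustrated on simulated data. The sketch below uses a constant-volatility Brownian log-price with made-up parameter values, far simpler than the Heston dynamics used in Paper III, but it shows the same principle: the sum of squared high-frequency returns approximates sigma^2 * T.

```python
import numpy as np

rng = np.random.default_rng(5)

# intraday log-price as Brownian motion with constant volatility
# (illustrative parameters; integrated variance is sigma^2 * T)
sigma, T, n = 0.2, 1.0, 10_000
dt = T / n

# high-frequency log-returns over n intervals of length dt
returns = rng.normal(0.0, sigma * np.sqrt(dt), size=n)

# realized quadratic variation: sum of squared returns
rv = np.sum(returns ** 2)
```

Here the target value is sigma^2 * T = 0.04, and with 10,000 intraday observations the realized measure lands very close to it; in real limit order book data, microstructure noise is what makes the choice of sampling frequency (the signature-plot question in Paper I) nontrivial.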
159

Using Temporal Evidence and Fusion of Time-Frequency Features for Brain-Computer Interfacing

Dharwarkar, Gireesh January 2005
Brain-computer interfacing (BCI) is a new method of human-machine interaction. It involves the extraction of information from the electroencephalogram (EEG) through signal processing and pattern recognition. The technology has far-reaching implications for those with severe physical disabilities and has the potential to enhance machine interaction for the rest of the population. In this work we investigate time-frequency analysis in motor-imagery BCI. We consider two methods for signal analysis: adaptive autoregressive (AAR) models and the wavelet transform (WAV). There are three major contributions of this research to single-trial analysis in motor-imagery BCI. First, we improve classification of AAR features over a conventional method by applying a temporal evidence accumulation (TEA) framework. Second, we compare the performance of AAR and WAV under the TEA framework for three subjects and find that WAV outperforms AAR for two subjects. The subject for whom AAR outperforms WAV has the lowest overall signal-to-noise ratio in their BCI output, an indication that the AAR model is more robust than WAV for noisier signals. Lastly, we find empirical evidence of complementary information between AAR and WAV and propose a fusion scheme that increases the mutual information between the BCI output and the classes.
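An adaptive autoregressive model tracks AR coefficients sample by sample rather than fitting them once. The sketch below uses a scalar recursive least squares (RLS) update with a forgetting factor on a synthetic AR(1) signal; this is a generic adaptive estimator, not the specific AAR implementation used in the thesis, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# synthetic stationary AR(1) signal: x_t = 0.8 * x_{t-1} + e_t
n, alpha = 6000, 0.8
x = np.zeros(n)
for t in range(1, n):
    x[t] = alpha * x[t - 1] + rng.normal()

# scalar RLS with forgetting factor lam: tracks a (possibly
# time-varying) AR(1) coefficient one sample at a time
lam, P, w = 0.995, 1000.0, 0.0
history = []
for t in range(1, n):
    u = x[t - 1]                    # regressor: previous sample
    k = P * u / (lam + u * P * u)   # adaptation gain
    w = w + k * (x[t] - w * u)      # coefficient update from prediction error
    P = (P - k * u * P) / lam       # inverse-correlation update
    history.append(w)

# time-averaged estimate over the final stretch of the run
w_avg = float(np.mean(history[-2000:]))
```

On stationary data the running estimate hovers around the true coefficient; the forgetting factor is what lets the same recursion follow nonstationary EEG, where the coefficient trajectory itself becomes the feature fed to the classifier.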
