About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world.
11

High-frequency sensing of Clear Creek water quality: mechanisms of dissolved oxygen and turbidity dynamics, and nutrient transport

Loperfido, John Vincent 01 July 2009
The runoff of suspended solids and nutrients from land into the nation's lakes and rivers can have severe impacts on the health of these systems and their uses. High-frequency environmental data from sensors can provide insight into fundamental biogeochemical processes that dictate water quality and provide regulators with valuable knowledge on how to manage critical resources. The goal of this research was to utilize sensor technology, telemetry hardware, cyberinfrastructure, and water quality models to create a sensing system that allows the investigation of the fate and transport of dissolved oxygen, suspended solids, nutrients, and other water quality parameters throughout a watershed dominated by agricultural activity. Deploying these sensors at multiple locations along the stream enabled the investigation of these processes from the fine scale to the larger watershed scale. Results from this research addressed both fundamental science and resource management issues regarding water quality. Using high-frequency data, a dramatic diel cycle in dissolved oxygen with nonlinear dynamics was observed and successfully modeled mathematically, and excursions beyond water quality criteria were observed. In addition, a diel pattern in turbidity was discovered, with higher levels at night likely caused by bioturbation (i.e., nocturnal activity of bottom-feeding fishes), which resulted in higher suspended solids loadings during nighttime. Furthermore, the QUAL2K model was successfully calibrated for water quality using sensor measurements and grab samples from volunteer IOWATER data. Nutrient loading rates (nitrate-N, orthophosphate, and total dissolved solids) were estimated along the entire creek and were similar to those of other Iowa streams. Volunteer environmental data were found to be helpful in model calibration for some parameters (e.g., TSS and nitrate).
The construction and operation of a sensing system in Clear Creek contributed to water quality science and engineering. Findings from the configuration and field testing of sensing station components such as water quality sensors, power systems and communication hardware will aid the design of future sensing systems and environmental observatories. Integrating the methodology of this research with future observing systems will further our understanding of water quality processes and help maintain the health and value of our nation's water environment.
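The loading-rate estimates described above amount to integrating concentration times discharge over the sensor record. A minimal sketch of that bookkeeping (the function name and units are illustrative assumptions, not the study's code):

```python
def nutrient_load_kg(conc_mg_per_l, discharge_m3_per_s, dt_s):
    """Cumulative nutrient load in kg from paired concentration (mg/L) and
    discharge (m^3/s) series at a fixed sampling interval of dt_s seconds.
    mg/L * m^3/s = g/s, so summing C * Q * dt gives grams; divide by 1000."""
    grams = sum(c * q * dt_s for c, q in zip(conc_mg_per_l, discharge_m3_per_s))
    return grams / 1000.0
```

A 15-minute sensor interval corresponds to dt_s = 900.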
12

Shining light on the storm: Using high-frequency optical water quality sensors to characterize and interpret storm nutrient and carbon dynamics among contrasting land uses

Vaughan, Matthew CH 01 January 2019
Elevated nutrient concentrations present significant challenges to surface water quality management globally, and dissolved organic matter mediates several key biogeochemical processes. Storm events often dominate riverine loads of nitrate, phosphorus, and dissolved organic matter, and are expected to increase in frequency and intensity in many regions due to climate change. The recent development of in situ optical sensors has revolutionized water quality monitoring and has highlighted the important role storms play in water quality. This dissertation focuses on improving the application of in situ optical water quality sensors and interpreting the high-frequency data they produce to better understand biogeochemical and watershed processes that are critical for resource management. We deployed in situ sensors to monitor water quality in three watersheds with contrasting land use / land cover, including agricultural, urban, and forested landscapes. The sensors measured absorbance of ultraviolet-visible light through the water column at 2.5 nanometer wavelength increments at 15-minute intervals for three years. These deployments provided a testbed to evaluate the sensors and improve models to predict concentrations of nitrate, three phosphorus fractions, and dissolved organic carbon using absorbance spectra and laboratory analyses through multivariate statistical techniques. In addition, an improved hysteresis calculation method was used to determine short-timescale storm dynamics for several parameters during 220 storm events. 
Goals of each dissertation chapter were to: (1) examine the influences of seasonality, storm size, and dominant land use / land cover on storm dissolved organic carbon and nitrate hysteresis and loads; (2) evaluate the utility of the sensors to determine total, dissolved, and soluble reactive phosphorus concentrations in streams draining different land use / land covers, and perform the first statistically robust validation technique applied to optical water quality sensor calibration models; and (3) analyze storm event dissolved organic matter quantity and character dynamics by calculating hysteresis indices for DOC concentration and spectral slope ratio, and develop a novel analytical framework that leverages these high frequency measurements to infer biogeochemical and watershed processes. Each chapter includes key lessons and future recommendations for using in situ optical sensors to monitor water quality.
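The hysteresis indices used to characterize short-timescale storm dynamics are typically computed by comparing rising- and falling-limb concentrations at matched discharge levels. A sketch of one common normalized formulation (the dissertation's improved calculation method may differ in detail):

```python
import numpy as np

def hysteresis_index(q, c, levels=None):
    """Mean rising-minus-falling limb difference of min-max normalized
    concentration, evaluated at fixed fractions of normalized discharge.
    Positive values indicate a clockwise concentration-discharge loop."""
    if levels is None:
        levels = np.linspace(0.1, 0.9, 9)
    q = np.asarray(q, dtype=float)
    c = np.asarray(c, dtype=float)
    qn = (q - q.min()) / (q.max() - q.min())
    cn = (c - c.min()) / (c.max() - c.min())
    peak = int(np.argmax(qn))
    # np.interp needs increasing x, so reverse the falling limb.
    c_rise = np.interp(levels, qn[: peak + 1], cn[: peak + 1])
    c_fall = np.interp(levels, qn[peak:][::-1], cn[peak:][::-1])
    return float(np.mean(c_rise - c_fall))
```

Real storm limbs are rarely monotonic in discharge, so production code would smooth or bin each limb before interpolating.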
13

Measuring mussel behavior and analyzing high frequency nitrate data to explore new phenomena in dynamic nutrient cycling

Bril, Jeremy 01 May 2010
Labeled by the National Academy of Engineering (NAE) as one of fourteen Grand Challenges for Engineering, the management of the nitrogen cycle has become an increasingly difficult obstacle for sustainable development. In an effort to help overcome this challenge, the goal of our study is to expand the limited scientific understanding of how the nitrogen cycle within aquatic environments may be affected by increasing human- and climate-induced changes. To this end, we are using freshwater mussels as a sentinel species to better understand the impacts of ecosystem perturbation on nitrogen processing in large river systems. We accomplished this by examining the physical, biological, and chemical characteristics of a mussel habitat in the Mississippi River, evaluating the impact of the 2008 floods on the habitat and the ecosystem's nutrient processing, establishing a well-equipped mussel laboratory habitat to investigate mussel behavioral responses, and analyzing highly time-resolved data to examine the mussels' contribution to daily nitrate fluxes.
14

Large-Scale Portfolio Allocation Under Transaction Costs and Model Uncertainty

Hautsch, Nikolaus, Voigt, Stefan 09 1900
We theoretically and empirically study portfolio optimization under transaction costs and establish a link between turnover penalization and covariance shrinkage with the penalization governed by transaction costs. We show how the ex ante incorporation of transaction costs shifts optimal portfolios towards regularized versions of efficient allocations. The regulatory effect of transaction costs is studied in an econometric setting incorporating parameter uncertainty and optimally combining predictive distributions resulting from high-frequency and low-frequency data. In an extensive empirical study, we illustrate that turnover penalization is more effective than commonly employed shrinkage methods and is crucial in order to construct empirically well-performing portfolios.
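The link between turnover penalization and covariance shrinkage is easiest to see in the quadratic-penalty case, where the penalized minimum-variance problem has a closed form and the penalty enters exactly as a ridge term on the covariance matrix. A sketch under that simplifying assumption (the paper's setting, with transaction costs and parameter uncertainty, is more general):

```python
import numpy as np

def min_var_with_turnover(sigma, w_prev, lam):
    """Minimize w' Sigma w + lam * ||w - w_prev||^2 subject to sum(w) = 1.
    First-order conditions give (Sigma + lam*I) w = lam*w_prev + nu*1,
    i.e. the turnover penalty shrinks Sigma toward the identity."""
    n = sigma.shape[0]
    ones = np.ones(n)
    a_inv = np.linalg.inv(sigma + lam * np.eye(n))
    # Lagrange multiplier chosen so the weights sum to one.
    nu = (1.0 - lam * ones @ a_inv @ w_prev) / (ones @ a_inv @ ones)
    return a_inv @ (lam * w_prev + nu * ones)
```

As lam grows the solution stays at w_prev (no trading); at lam = 0 it collapses to the standard minimum-variance portfolio.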
15

Nonlinearities and regime shifts in financial time series

Åsbrink, Stefan E. January 1997
This volume contains four essays on various topics in the field of financial econometrics. All four discuss the properties of high-frequency financial data and their implications for the model choice when an estimate of capital asset return volatility is in focus. The interest lies both in characterizing "stylized facts" in such series with time series models and in predicting volatility. The first essay, entitled A Survey of Recent Papers Considering the Standard & Poor 500 Composite Stock Index, presents recent empirical findings and stylized facts in the financial market from 1987 to 1996 and gives a brief introduction to the research field of capital asset return volatility models and properties of high-frequency financial data. As the title indicates, the survey is restricted to research on the well-known Standard & Poor 500 index. The second essay, with the title Stylized Facts of Daily Return Series and the Hidden Markov Model, investigates the properties of the hidden Markov model, HMM, and its capability of reproducing stylized facts of financial high-frequency data. The third essay, Modelling the Conditional Mean and Conditional Variance: A Combined Smooth Transition and Hidden Markov Approach with an Application to High Frequency Series, investigates the consequences of combining a nonlinear parameterized conditional mean with an HMM for the conditional variance when characterization of stylized facts is considered. Finally, the fourth essay, entitled Volatility Forecasting for Option Pricing on Exchange Rates and Stock Prices, investigates the volatility forecasting performance of some of the most frequently used capital asset return volatility models, such as the GARCH with normal and t-distributed errors, the EGARCH, and the HMM. The prediction error minimization approach is also investigated. Each essay is self-contained and could, in principle, be read in any order chosen by the reader.
This, however, requires a working knowledge of the properties of the HMM. For readers less familiar with the research field, the first essay may serve as a helpful introduction to the following three essays. Diss. Stockholm: Handelshögsk.
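The GARCH-family models compared in the fourth essay share the same one-step variance recursion; a minimal GARCH(1,1) filter is enough to show its shape (parameter values in the usage note are illustrative):

```python
def garch11_variance(returns, omega, alpha, beta, sigma2_0):
    """Conditional variance path of a GARCH(1,1):
    sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1].
    Returns the path including the initial value, one step past the data."""
    sigma2 = [sigma2_0]
    for r in returns:
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2
```

With omega = 0.1, alpha = 0.1, beta = 0.8, the unconditional variance omega / (1 - alpha - beta) equals 1, so a path started there with unit squared returns stays flat.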
16

Studies on the Estimation of Integrated Volatility for High Frequency Data

Lin, Liang-ching 26 July 2007
Estimating the integrated volatility of high-frequency realized prices is an important issue in the microstructure literature. Bandi and Russell (2006) derived the optimal sampling frequency, and Zhang et al. (2005) proposed a "two-scales estimator" to solve the problem. In this study, we propose a new estimator based on a signal-to-noise ratio statistic with a convergence rate of O_p(n^(-1/4)). The method is applicable to both constant and stochastic volatility models and improves upon the O_p(n^(-1/6)) convergence rate of Zhang et al. (2005). The proposed estimator is shown to be as asymptotically efficient as the maximum likelihood estimate in the constant volatility case. Furthermore, unbiased estimators of two components, the variance of the microstructure noise and the fourth moment of the realized log returns, are also proposed to facilitate the estimation of integrated volatility. The asymptotic properties and effectiveness of the proposed estimators are investigated both theoretically and via simulation study.
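For reference, the two-scales estimator of Zhang et al. (2005) that this study benchmarks against combines a subsampled, averaged realized variance with a bias correction from the full-grid realized variance; a compact sketch (this is the standard construction, not the new signal-to-noise estimator proposed here):

```python
import numpy as np

def tsrv(log_prices, K):
    """Two-scales realized volatility: the slow scale averages K-lag squared
    increments (K subgrids), and the fast scale, dominated by microstructure
    noise, supplies the bias correction."""
    y = np.asarray(log_prices, dtype=float)
    n = len(y) - 1
    rv_all = np.sum(np.diff(y) ** 2)             # fast scale, noise-dominated
    rv_slow = np.sum((y[K:] - y[:-K]) ** 2) / K  # averaged slow scale
    n_bar = (n - K + 1) / K                      # average subgrid size
    return rv_slow - (n_bar / n) * rv_all
```

On noisy tick data the plain realized variance grows with the sampling frequency, while the two-scales estimate stays near the integrated variance.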
17

FORECASTING FOREIGN EXCHANGE VOLATILITY FOR VALUE AT RISK : CAN REALIZED VOLATILITY OUTPERFORM GARCH PREDICTIONS?

Fallman, David, Wirf, Jens January 2011
In this paper we use model-free estimates of daily exchange rate volatilities employing high-frequency intraday data, known as Realized Volatility, which are then forecasted with ARMA models and used to produce one-day-ahead Value-at-Risk predictions. The forecasting accuracy of the method is contrasted against the more widely used ARCH models based on daily squared returns. Our results indicate that the ARCH models tend to underestimate the Value-at-Risk in foreign exchange markets compared to models using Realized Volatility.
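Once an ARMA forecast of realized volatility is in hand, the pipeline described above reduces to two small steps; a hedged sketch assuming zero-mean normally distributed daily returns (the ARMA forecasting stage itself is omitted):

```python
import math

def realized_volatility(intraday_returns):
    """Model-free daily volatility estimate: square root of the sum of
    squared intraday returns."""
    return math.sqrt(sum(r * r for r in intraday_returns))

def var_one_day(sigma_forecast, alpha=0.05):
    """One-day-ahead parametric Value-at-Risk for a zero-mean normal return,
    reported as a positive loss fraction of portfolio value."""
    z = {0.01: 2.3263, 0.05: 1.6449}[alpha]  # standard normal quantiles
    return z * sigma_forecast
```

An underestimated sigma_forecast translates one-for-one into an underestimated VaR, which is the failure mode reported here for the ARCH models.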
18

A Study on The Random and Discrete Sampling Effect of Continuous-time Diffusion Model

Tsai, Yi-Po 04 August 2010
High-frequency financial data are not only discretely sampled in time, but the time separating successive observations is often random. We review the paper of Aït-Sahalia and Mykland (2003), which measures the effects of discrete sampling and of ignoring the randomness of the sampling when estimating the MLE of a continuous-time diffusion model. In that article, three different assumptions (with one restriction) are made on the sampling intervals, and the corresponding likelihood function, asymptotic normality, and covariance matrix are obtained. It is concluded that the effects of discrete sampling are smaller than the effect of simply ignoring the sampling randomness. This study focuses on rechecking the results in Aït-Sahalia and Mykland (2003), including theory, simulation, and application. We derive a likelihood function expression different from Aït-Sahalia and Mykland (2003)'s result. However, the asymptotic covariances are consistent for both approaches in the O-U process. Furthermore, we conduct an empirical study on high-frequency transaction time data using non-homogeneous Poisson processes.
19

Essays in Financial Econometrics

De Lira Salvatierra, Irving January 2015
The main goal of this work is to explore the effects of time-varying extreme jump tail dependencies in asset markets. Consequently, much attention has been devoted to understanding the extremal tail dependencies between assets. As pointed out by Hansen (2013), the estimation of tail risk dependence is a challenging task, and its implications for several sectors of the economy are of great importance. One of the principal challenges is to provide a measure of systemic risk that is, in principle, statistically tractable and has an economic meaning. Therefore, there is a need for standardized dependence measures, or at least for a methodology that can capture the complexity behind global distress in the economy. These measures should be able to explain not only the dynamics of the most recent financial crisis but also prior events of distress in the world economy, which is the motivation of this work. To explore the tail dependencies, I exploit the information embedded in option prices and intra-daily high-frequency data.
The first chapter, a co-authored work with Andrew Patton, proposes a new class of dynamic copula models for daily asset returns that exploits information from high-frequency (intra-daily) data. We augment the generalized autoregressive score (GAS) model of Creal et al. (2013) with high-frequency measures such as realized correlation to obtain a "GRAS" model. We find that the inclusion of realized measures significantly improves the in-sample fit of dynamic copula models across a range of U.S. equity returns. Moreover, we find that out-of-sample density forecasts from our GRAS models are superior to those from simpler models. Finally, we consider a simple portfolio choice problem to illustrate the economic gains from exploiting high-frequency data for modeling dynamic dependence.
In the second chapter, using information from option prices, I construct two new measures of dependence between assets and industries: the Jump Tail Implied Correlation and the Tail Correlation Risk Premia. The main contribution of this chapter is the construction of a systemic risk factor from daily financial measures using a quantile-regression-based methodology. In this direction, I fill the existing gap between downturns in the financial sector and the real economy. I find that this new index performs well in forecasting in-sample and out-of-sample quarterly macroeconomic shocks. In addition, I analyze whether the tail risk of the correlation may be priced. I find that for the S&P 500 and its sectors there is an ex ante premium to hedge against systemic risks and changes in the aggregate market correlation. Moreover, I provide evidence that the tails of the implied correlation have remarkable predictive power for future stock market returns.
20

Cointegration and exchange market efficiency. An analysis of high frequency data.

Trapletti, Adrian, Geyer, Alois, Leisch, Friedrich January 1999
A cointegration analysis on a triangle of high-frequency exchange rates is presented. Market efficiency requires the triangle to be cointegrated and the cointegration term to be a martingale difference sequence. We find empirical evidence against market efficiency for very short time horizons: the cointegration term does not behave like a martingale difference sequence. In an out-of-sample forecasting study, the cointegrated vector autoregressive (VAR) model is found to be superior to the naive martingale. Finally, a simple trading strategy shows that the VAR also has significant forecast value in economic terms, even after accounting for transaction costs.
Series: Working Papers SFB "Adaptive Information Systems and Modelling in Economics and Management Science"
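The no-arbitrage restriction behind the triangle can be written directly: the three log exchange rates must net to zero around the triangle, and the residual is the candidate cointegration term. The currency pairs below are illustrative, since the abstract does not name the triangle used:

```python
import math

def triangle_residual(usd_eur, eur_gbp, usd_gbp):
    """Log residual of an exchange-rate triangle. No-arbitrage implies
    log(USD/GBP) = log(USD/EUR) + log(EUR/GBP); market efficiency further
    requires this residual to be a martingale difference sequence."""
    return math.log(usd_eur) + math.log(eur_gbp) - math.log(usd_gbp)
```

A persistent, predictable nonzero residual is exactly the kind of short-horizon inefficiency the VAR forecasting study exploits.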
