131

Diagnostic checking and intra-daily effects in time series models

Koopman, Siem Jan, January 1992
A variety of topics on the statistical analysis of time series are addressed in this thesis. The main emphasis is on the state space methodology and, in particular, on structural time series (STS) models. There are now many applications of STS models in the literature and they have proved to be very successful. The keywords of this thesis range from Kalman filtering, smoothing and diagnostic checking to time-varying cubic splines and intra-daily effects. Five separate studies are carried out for this research project and they are reflected in chapters 2 to 6. All studies concern time series models which are placed in the state space form (SSF) so that the Kalman filter (KF) can be applied for estimation. The SSF and the KF play a central role in time series analysis that can be compared with the important role of the regression model and the method of least squares estimation in econometrics. Chapter 2 gives an overview of the latest developments in the state space methodology, including diffuse likelihood evaluation, stable calculations, etc. Smoothing algorithms evaluate the full sample estimates of unobserved components in time series models. New smoothing algorithms are developed for the state and the disturbance vector of the SSF which are computationally efficient and outperform existing methods. Chapter 3 discusses the existing and the new smoothing algorithms with an emphasis on theory, algorithms and practical implications. The new smoothing results pave the way to using auxiliary residuals, that is, full sample estimates of the disturbances, for diagnostic checking of unobserved components time series models. Chapter 4 develops test statistics for auxiliary residuals and presents applications showing how they can be used to detect and distinguish between outliers and structural change. A cubic spline is a piecewise polynomial function of degree three which is regularly used for interpolation and curve fitting. It has also been applied to piecewise regressions, density approximations, etc. Chapter 5 develops the cubic spline further by allowing it to vary over time and by introducing it into time series models. These time-varying cubic splines are an efficient way of handling slowly changing periodic movements in time series. This method for modelling a changing periodic pattern is applied in a structural time series model used to forecast hourly electricity load demand, with the periodic movements being intra-daily or intra-weekly. The full model contains other components, including a temperature response which is also modelled using cubic splines. A statistical computer package (SHELF) is developed to produce, at any time, hourly load forecasts three days ahead.
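
As a rough illustration of the Kalman filter recursions that underpin the state space methodology described above, the sketch below filters a local level (random-walk-plus-noise) model in plain Python/NumPy; the variance parameters, initialisation and simulated data are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def kalman_local_level(y, sigma2_eps, sigma2_eta, a1=0.0, p1=1e7):
    """Kalman filter for the local level model
    y_t = mu_t + eps_t,  mu_{t+1} = mu_t + eta_t.
    Returns filtered state means plus one-step-ahead state means and variances."""
    n = len(y)
    a = np.empty(n + 1)   # predicted state mean a_t
    p = np.empty(n + 1)   # predicted state variance P_t
    filt = np.empty(n)    # filtered state mean after observing y_t
    a[0], p[0] = a1, p1   # large initial variance as a crude diffuse initialisation
    for t in range(n):
        v = y[t] - a[t]               # one-step prediction error
        f = p[t] + sigma2_eps         # prediction error variance
        k = p[t] / f                  # Kalman gain
        filt[t] = a[t] + k * v        # filtered estimate of mu_t
        a[t + 1] = a[t] + k * v       # state prediction for t+1
        p[t + 1] = p[t] * (1 - k) + sigma2_eta
    return filt, a[1:], p[1:]

# Illustrative use on a simulated random-walk-plus-noise series.
rng = np.random.default_rng(0)
mu = np.cumsum(rng.normal(0, 0.5, 200))
y = mu + rng.normal(0, 1.0, 200)
filtered, _, _ = kalman_local_level(y, sigma2_eps=1.0, sigma2_eta=0.25)
print(filtered[-5:])
```

In a full STS model the state vector would also carry trend, seasonal and spline components, with the same prediction and update steps applied to the larger system matrices.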
132

Statistical analysis of freshwater parameters monitored at different temporal resolutions

Mohamad Hamzah, Firdaus, January 2012
It is of great importance in ecological and environmental studies to investigate prominent features of environmental determinants using appropriate statistical approaches. The initial motivation for this work came from limnologists, biologists and statisticians interested in exploring and investigating features of time series of freshwater environmental parameters recorded at different temporal resolutions. This thesis introduces a variety of statistical techniques which are used to provide sufficient information on the features of interest in the environmental variables in freshwater. Chapter 1 gives the background to the work, describes the locations of the case studies, presents several statistical and ecological issues and outlines the aims and objectives of the thesis. Chapter 2 provides a review of some commonly used statistical modelling approaches for trend and seasonality. All the modelling approaches are then applied to low temporal resolution (monthly) temperature and chlorophyll measurements from 1987 to 2005 for the north and south basins of Loch Lomond, Scotland. An investigation into the influence of temperature and nutrients on the variability of log chlorophyll is also carried out. Chapter 3 extends the temperature modelling of Chapter 2 with a mixed-effects model with different error structures for temperature data at a moderate temporal resolution (1- and 3-hourly data) in the north, mid and south basins. Three approaches are proposed to estimate the position of a sharp change in the temperature gradient (the thermocline) in the deeper basins, using the maximum relative rate of change, changepoint regression and derivatives of a smooth curve. Chapter 4 investigates several features in semi-continuous environmental variables (15- and 30-minute data). The temporal patterns of temperature, pH, conductivity and barometric pressure, and the evidence for similarity between the pH and conductivity signals, are examined using wavelets. The time taken for pH and conductivity to return to 'baseline levels' (the recovery period) following extreme discharge is determined for different thresholds of 'extreme discharge' for the River Charr and Drumtee Burn, Scotland, and models for the recovery period are proposed and fitted. Model validation is carried out for the River Charr and the occurrence of clusters of extreme discharge in both rivers is investigated using the extremal index. Chapter 5 summarises the main findings of the thesis and suggests several potential areas for future work.
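
One of the thermocline approaches mentioned above, locating a sharp change in gradient via the derivative of a smoothed curve, can be sketched very simply; the temperature-depth profile below is entirely hypothetical and the smoothing is a basic moving average rather than the methods used in the thesis.

```python
import numpy as np

# Hypothetical summer temperature-depth profile for a deep, stratified basin.
depth = np.linspace(0, 60, 121)                               # metres
temp = 6 + 9 / (1 + np.exp((depth - 18) / 2.5))               # degrees C, sigmoid-like profile
temp += np.random.default_rng(1).normal(0, 0.05, depth.size)  # measurement noise

# Light smoothing before differentiating, in the spirit of taking
# derivatives of a smooth curve; "valid" mode avoids edge artefacts.
w = 5
kernel = np.ones(w) / w
temp_smooth = np.convolve(temp, kernel, mode="valid")
depth_mid = depth[w // 2: -(w // 2)]          # depths aligned with the smoothed values

# The thermocline is taken as the depth of the steepest (most negative) gradient.
gradient = np.gradient(temp_smooth, depth_mid)
thermocline_depth = depth_mid[np.argmin(gradient)]
print(f"estimated thermocline depth: {thermocline_depth:.1f} m")
```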
133

Linking biodiversity with environmental drivers and pressures in Great Britain

Proctor, Iain, January 2015
This thesis describes the original and significant development of a hierarchical statistical framework for realigning fine-scale spatial covariate data. An example of the use of this framework is given in the context of biodiversity modelling. Biodiversity is of utmost importance to the correct functioning of ecosystems and the provision of services vital to humanity. Understanding the impacts of environmental drivers and pressures on biodiversity can help appropriate responses to be taken to mitigate, halt or reverse damage to habitats. Linking biodiversity measures with explanatory covariates in statistical models can therefore help in understanding these relationships and the extent to which certain drivers and pressures are responsible for environmental change. When modelling biodiversity, the scale at which the variables are measured should be considered. Where data are measured at different scales, a situation of misalignment arises. Misaligned data may be subject to measurement error, which can influence the resultant model if the data are not realigned. In order to realign covariate data, two transformation approaches can be implemented. The first method is to aggregate the response data to the level of the explanatory covariates. The second method is to downscale the covariate data to the response locations; this realignment process is more complex than aggregation of the response, since it requires estimation of the uncertainty of the downscaled covariate predictions. The developed framework has possible further applications in fine-scale uncertainty estimation of model covariates, where the scale at which the covariates are given is coarser than that at which the response data are available. Chapter 1 provides an introduction to the main issues and challenges in the thesis: biodiversity, data measurement, modelling techniques, scale and data realignment. The three case studies used in the development of the hierarchical framework are also introduced. Data from Loch Leven on underwater plants are analysed in chapter 2. Carabid data from ten rural locations are considered in chapter 3. In the final case study, in chapter 4, coverage abundance data from sites in the Countryside Survey across Great Britain are modelled. In chapter 5 the data from chapter 4 provide the impetus: a hierarchical framework for realigning covariate data is developed and a simulation study is carried out to assess its performance relative to the non-realigned model. Chapter 6 provides a summary of the case studies, discussion of the main issues and proposals for further development.
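
To make the first realignment option above concrete, here is a small, hypothetical sketch of aggregating point-level response data up to the coarse cells on which a covariate is available; the grid, sample size and response variable are invented purely for illustration and do not reflect the thesis's case studies.

```python
import numpy as np

rng = np.random.default_rng(2)

# Fine-scale response observations at point locations (x, y) on a unit square.
n = 500
xy = rng.uniform(0, 1, size=(n, 2))
response = 2.0 + 3.0 * xy[:, 0] + rng.normal(0, 0.3, n)   # illustrative richness index

# A coarse covariate would only be available on a 5 x 5 grid of cells.
cells = 5
cell_idx = np.floor(xy * cells).astype(int).clip(0, cells - 1)
flat_idx = cell_idx[:, 0] * cells + cell_idx[:, 1]

# Approach 1 from the abstract: aggregate the response up to the covariate cells
# by taking cell means, so that response and covariate share a common support.
cell_mean = np.full(cells * cells, np.nan)
for c in range(cells * cells):
    in_cell = flat_idx == c
    if in_cell.any():
        cell_mean[c] = response[in_cell].mean()
print(cell_mean.reshape(cells, cells).round(2))
```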
134

On the provision, reliability, and use of hurricane forecasts on various timescales

Jarman, Alexander S., January 2014
Probabilistic forecasting plays a pivotal role both in the application and in the advancement of geophysical modelling. Operational techniques and modelling methodologies are examined critically in this thesis and suggestions for improvement are made; potential improvements are illustrated in low-dimensional chaotic systems of nonlinear equations. Atlantic basin hurricane forecasting and forecast evaluation methodologies on daily to multi-annual timescales provide the primary focus of application and real-world illustration. Atlantic basin hurricanes have attracted much attention from the scientific and private-sector communities as well as from the general public, owing to their potential for devastation to life and property and to speculation about increasing trends in hurricane activity. Current approaches to modelling, prediction and forecast evaluation employed in operational hurricane forecasting are critiqued, followed by recommendations for best-practice techniques. The applicability of these insights extends far beyond the forecasting of hurricanes. Hurricane data analysis and forecast output are based on small-number count data drawn from a small-sample historical archive; such analysis benefits from specialised statistical methods adapted to this particular problem. The challenges and opportunities posed in hurricane statistical analysis and forecasting by small-number, small-sample and, in particular, serially dependent data are clarified, giving analysts and forecasters alike access to more appropriate statistical methodologies. Novel statistical forecasting techniques are introduced for seasonal hurricane prediction. In addition, a range of linear and non-linear techniques for the analysis of hurricane count data are applied for the first time, along with an innovative algorithmic approach to the statistical inference of regression model coefficients. A real-time outlook for the 2013 hurricane season is presented, along with a methodology to support a running (re)analysis of National Hurricane Center 48-hour forecasts in 2013; the focus here is on whether, and if so how, forecast effectiveness can be improved by “recalibrating” the raw forecasts in real time. In this case, it is revealed that recalibration does not improve forecast performance and that, across years, it can be detrimental. In short, a new statistical framework is proposed for evaluating and interpreting forecast reliability, forecast skill and forecast value, to provide a sound basis for constructing and utilising operational event predictions. This novel framework is then illustrated in the specific context of hurricane prediction. Proposed methods of forecast recalibration, in the context of both a low-dimensional dynamical system and operational hurricane forecasting, are employed to illustrate methods for improving resource allocation, distinguishing, for example, scenarios where forecast recalibration is effective from those where resources would be better dedicated to improving forecast techniques. A novel approach to the robust statistical identification of the weakest links in the complex chain leading to probabilistic prediction of nonlinear systems is presented, and its application demonstrated in both numerical studies and operational systems.
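
As a hedged illustration of the kind of small-number count modelling involved in seasonal hurricane prediction, the sketch below fits a Poisson regression of annual counts on a single hypothetical seasonal predictor using iteratively reweighted least squares; the data, the predictor and the fitted coefficients are simulated assumptions, not results from the thesis.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical seasonal predictor (e.g. an SST or ENSO index) and hurricane counts.
n_years = 40
x = rng.normal(0, 1, n_years)
counts = rng.poisson(np.exp(1.5 + 0.4 * x))

# Poisson regression log E[count] = b0 + b1 * x, fitted by IRLS.
X = np.column_stack([np.ones(n_years), x])
beta = np.zeros(2)
for _ in range(25):
    mu = np.exp(X @ beta)                 # fitted means
    W = mu                                # Poisson working weights
    z = X @ beta + (counts - mu) / mu     # working response
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
print("intercept, slope:", beta.round(3))

# Point "outlook" for a season with index x_new, as a predictive mean count.
x_new = 0.8
print("expected count:", float(np.exp(beta @ np.array([1.0, x_new]))))
```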
135

The investigation of alternative weighting approaches to adjust for non-response in longitudinal surveys

Sadig, Husam Eldin Sadig Ahmed, January 2015
To reduce bias in survey estimates, most longitudinal survey organisations nowadays prepare and include sets of weights in public-use data files for use by analysts. Aside from correcting for non-coverage, the weights are usually designed to reflect the sample design as well as to correct for non-response error, by combining design weights and non-response weight adjustments. With regard to non-response weights, many longitudinal surveys implement similar strategies (referred to in this thesis as the standard weighting approach) to create them. This approach is based upon a weighting model in which response is defined as responding at all conducted waves, all sample members whose eligibility is unknown are assumed to be eligible, and the model is estimated using generic weighting variables and all sample members for whom data are available on the weighting variables. However, several issues in longitudinal surveys raise concerns about this approach to weighting, and this thesis is concerned with three of them. First, non-monotonic response patterns result in a large number of combinations of waves at which sample members could respond, so weights from an approach that defines response as responding at all conducted waves may not be appropriate for analysing data from a wave combination that does not include all waves. Second, unknown eligibility over time leads to a proportion of ineligible units being included in the weights' calculation (if, as in the standard approach, they are assumed to be eligible), which may result in biased estimates unless the actual ineligible units among those of unknown eligibility are excluded. Third, the choice of the best covariates for the weighting model may differ considerably across subgroups of respondents in the same sample: in the standard approach only generic weighting variables are used, as all sample members are used in the estimation, yet some variables that are not significant in predicting response for the whole sample could be important in predicting response in some subgroups. In this thesis, I provide three alternative approaches to non-response weighting, each dealing with one of these issues. I investigate each of the proposed approaches by incorporating the relevant weight adjustments, as well as weights from the standard weighting approach, in a longitudinal multivariate analysis. I test the impact of weights from each alternative approach on estimates by comparing the resultant estimates with those from the standard approach. I use data from the British Household Panel Survey (BHPS) to carry out the investigation. The findings suggest that the standard and alternative approaches all help similarly in reducing non-response error. However, the standard approach may fail to tackle the effect of non-response in some estimates, as it does not take into account the three issues raised in the weighting of longitudinal data. In contrast, since they deal with the three issues under investigation (separately), the alternative approaches seem to handle non-response even in estimates that are not affected by the standard weighting approach.
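
As a hedged sketch of how a non-response weight adjustment of the kind discussed above is typically constructed, the code below fits a logistic response-propensity model and uses the inverse of the fitted probabilities as adjustments for responding cases; the covariates, the response mechanism and the flat design weights are illustrative assumptions, not BHPS variables.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical wave-1 covariates for a longitudinal sample, and an indicator of
# whether each member responded at the later wave(s) used in the analysis.
n = 2000
age = rng.integers(18, 80, n)
urban = rng.integers(0, 2, n)
p_true = 1 / (1 + np.exp(-(0.5 + 0.02 * (age - 45) - 0.4 * urban)))
responded = rng.binomial(1, p_true)

# Response propensity model: logistic regression fitted by Newton-Raphson.
X = np.column_stack([np.ones(n), (age - age.mean()) / age.std(), urban])
beta = np.zeros(X.shape[1])
for _ in range(25):
    p = 1 / (1 + np.exp(-(X @ beta)))
    grad = X.T @ (responded - p)
    hess = X.T @ ((p * (1 - p))[:, None] * X)
    beta += np.linalg.solve(hess, grad)

# Non-response adjustment = inverse of the estimated response probability,
# applied to responding members and combined with the design weight.
p_hat = 1 / (1 + np.exp(-(X @ beta)))
design_weight = np.ones(n)                       # placeholder design weights
final_weight = np.where(responded == 1, design_weight / p_hat, 0.0)
print("mean weight among respondents:",
      round(float(final_weight[responded == 1].mean()), 3))
```

The alternative approaches described in the abstract would change what counts as "response", which cases enter the estimation, or which covariates are allowed into this propensity model, rather than the basic inverse-probability construction itself.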
136

Evidence synthesis for prognosis and prediction: application, methodology and use of individual participant data

Ensor, Joie, January 2017
Prognosis research summarises, explains and predicts future outcomes in patients with a particular condition. This thesis investigates the application and development of evidence synthesis methods for prognosis research, with particular attention given to improving individualised predictions from prognostic models developed and/or validated using meta-analysis techniques. A review of existing prognostic models for recurrence of venous thromboembolism highlighted several methodological and reporting issues. This motivated the development of a new model to address previous shortcomings, in particular by explicitly modelling and reporting the baseline hazard to enable individualised risk predictions over time. The new model was developed using individual participant data from several studies, using a novel internal-external cross-validation approach. This highlighted the potential for between-study heterogeneity in model performance, and motivated the investigation of recalibration methods to substantially improve the consistency of model performance across populations. Finally, a new multiple imputation method was developed to investigate the impact of missing threshold information in meta-analysis of prognostic test accuracy. Computer code was developed to implement the method, and applied examples indicated that missing thresholds could have a potentially large impact on conclusions. A simulation study indicated that the new method generally improves on the current standard in terms of bias, precision and coverage.
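
A minimal, hypothetical sketch of the internal-external cross-validation idea mentioned above is given below: each study in a pooled individual participant dataset is left out in turn, the model is developed on the remaining studies, and a simple calibration check is computed in the omitted study. The simulated data, the logistic model and the calibration-in-the-large statistic are assumptions for illustration, not the thesis's actual model or performance measures.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical individual participant data pooled from 5 studies:
# one predictor, a binary outcome, and a study label.
study = np.repeat(np.arange(5), 200)
x = rng.normal(0, 1, 1000) + 0.2 * study        # mild between-study differences
y = rng.binomial(1, 1 / (1 + np.exp(-(-1.0 + 0.8 * x))))

def fit_logistic(X, y, iters=25):
    """Logistic regression by Newton-Raphson."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-(X @ beta)))
        beta += np.linalg.solve(X.T @ ((p * (1 - p))[:, None] * X),
                                X.T @ (y - p))
    return beta

# Internal-external cross-validation: develop on all-but-one study, then check
# calibration-in-the-large (observed minus mean predicted risk) in the omitted study.
for s in range(5):
    train, test = study != s, study == s
    Xtr = np.column_stack([np.ones(train.sum()), x[train]])
    Xte = np.column_stack([np.ones(test.sum()), x[test]])
    beta = fit_logistic(Xtr, y[train])
    p_test = 1 / (1 + np.exp(-(Xte @ beta)))
    print(f"study {s}: observed - expected = {y[test].mean() - p_test.mean():+.3f}")
```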
137

Capability as an outcome measure in randomised controlled trials

Keeley, Thomas James Hier, January 2014
‘The capability approach is a broad, normative framework for the evaluation of well-being’ (p. 94) [1], and it has attracted growing interest in health and health economics research. A broader measure of well-being may capture the effects of some interventions more accurately than traditional health-related quality of life measures. The ICECAP-A and ICECAP-O are two measures of a person’s well-being, with a theoretical grounding in the capability approach, designed for use in health and social care research. This thesis reports qualitative and quantitative investigations into the validity and responsiveness of the ICECAP measures. A methodological review of existing validation studies was completed. Seventeen semi-structured interviews with health research professionals were carried out, and an iterative, constant-comparative thematic analysis was completed to assess the content validity of the ICECAP-A. The construct validity and responsiveness of the measures were assessed using two randomised controlled trials: the BEEP trial (ISRCTN 93634563) and the Past BP trial (ISRCTN 29062286). Qualitative and quantitative results provide positive indications of validity. The qualitative work showed that research professionals viewed the ICECAP-A as a relevant and feasible measure for use in health research. The quantitative results confirmed the majority of a priori hypotheses in the validity analyses, while longitudinal data provided evidence that the measures are responsive to self-reported changes in health status. In conclusion, this thesis reports the first assessment of the validity of these measures in a randomised controlled trial setting and the first analysis of their responsiveness. While further testing of the ICECAP measures is required, the results indicate that they are appropriate for use in health research.
138

Analysing the effects of fiscal policy and assessing its sustainability

Jeong, Kwang Jo, January 2014
This thesis presents three empirical analyses of the macroeconomic effects and sustainability of fiscal policy. Three key issues are examined: the transmission mechanism for fiscal policy shocks in Korea, the sustainability of government debt in three selected countries (Korea, the UK and the US), and the effects of fiscal consolidation on macroeconomic activity. The main findings are as follows. First, government spending has a positive effect on the economy, and capital spending is likely to boost the economy more effectively than current spending. Second, there is a cointegrating relationship between the fiscal variables in Korea and the US, but not in the UK, which implies that fiscal policy in Korea and the US is sustainable while fiscal policy in the UK is not. Third, fiscal consolidation is not likely to be expansionary in terms of GDP growth. The results also show that fiscal consolidation undertaken in times of high debt-to-GDP ratios, based on spending cuts, or under high sovereign risk has smaller negative effects on economic growth than fiscal consolidation undertaken in times of low debt-to-GDP ratios, based on tax increases, or under low sovereign risk. The economic growth rate, spending-based fiscal consolidation, low long-term interest rates and higher sovereign risk all have significant effects in reducing the debt-to-GDP ratio.
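
The sustainability check described above typically rests on whether the fiscal series are cointegrated. Below is a hedged sketch of an Engle-Granger style two-step check on simulated revenue and spending series; the data are invented, and in practice the residual test statistic would be compared against the appropriate Engle-Granger critical values rather than a standard normal.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical government revenue and expenditure series (shares of GDP)
# that drift together via a common stochastic trend.
n = 200
common = np.cumsum(rng.normal(0, 1, n))
revenue = 0.5 * common + rng.normal(0, 0.5, n)
spending = 0.5 * common + rng.normal(0, 0.5, n)

# Step 1: cointegrating regression spending_t = a + b * revenue_t + e_t.
X = np.column_stack([np.ones(n), revenue])
a, b = np.linalg.lstsq(X, spending, rcond=None)[0]
resid = spending - (a + b * revenue)

# Step 2: Dickey-Fuller style regression on the residuals,
# delta e_t = rho * e_{t-1} + u_t; a strongly negative t-statistic on rho
# (judged against Engle-Granger critical values, not computed here)
# points towards cointegration and hence a sustainable relationship.
de = np.diff(resid)
e_lag = resid[:-1]
rho = (e_lag @ de) / (e_lag @ e_lag)
se = np.sqrt(np.sum((de - rho * e_lag) ** 2) / (len(de) - 1) / (e_lag @ e_lag))
print("t-statistic on rho:", round(rho / se, 2))
```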
139

Cross-country heterogeneity and time variation in the Euro-area economies: investigation using VAR methods

Bagzibagli, Kemal, January 2013
This thesis investigates the monetary transmission mechanism in the Euro area, for countries taken individually and as an aggregate. The focus of the thesis is on the effects of monetary policy shocks on the area as a whole, across countries and over time during the period of single monetary policy by the Eurosystem. Using the most recent empirical techniques, such as factor-augmented vector autoregression (FAVAR), Bayesian Gibbs sampling, rolling windows, data pre-screening and panel VAR, the thesis investigates a novel (large) data set for the economies of the Euro area. According to our empirical analyses utilising these techniques, the thesis reaches the following main conclusions. First, time variation in the impulse responses of area-wide consumer prices and monetary aggregates to monetary policy shocks is stronger than that of other key macroeconomic indicators, and the contractionary impact of a monetary tightening on real activity is strongest when it hits the economy during the global financial crisis period (Chapter 1). Second, although the effects of the policy shocks on national real activities and price levels are homogeneous across countries, the transmission mechanism displays important cross-country heterogeneity, with the national monetary aggregates responding most heterogeneously to common monetary policy shocks (Chapter 2). Finally, despite the Eurosystem's responses to the global financial crisis with unconventional monetary measures, country-specific factors such as default risks and bailouts played a significant role in disrupting the transmission of the policy actions to individual economic activities (Chapter 3).
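
As a hedged sketch of the first step of a factor-augmented VAR, the code below extracts principal-component factors from a simulated large panel of standardised indicators; in a FAVAR these estimated factors would then be stacked with the policy instrument and modelled jointly. The panel dimensions and the data-generating process are illustrative assumptions, not the thesis's data set.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical large macro panel: T months by N standardised indicators
# driven by k common factors plus idiosyncratic noise.
T, N, k = 240, 120, 3
factors_true = np.cumsum(rng.normal(0, 1, (T, k)), axis=0)
loadings = rng.normal(0, 1, (N, k))
panel = factors_true @ loadings.T + rng.normal(0, 1, (T, N))

# Standardise each series, then extract the first k principal components,
# which act as the estimated factors for the factor-augmented VAR.
Z = (panel - panel.mean(0)) / panel.std(0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
factors_hat = U[:, :k] * S[:k]          # T x k estimated factors
print("share of panel variance explained:",
      round(float((S[:k] ** 2).sum() / (S ** 2).sum()), 3))
```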
140

Analysis of fluvial dissolved organic carbon using high resolution UV-visible spectroscopy and Raman spectroscopy

Coleman, Martin, January 2017
This dissertation focusses on advances in methodology for measuring and analysing dissolved organic carbon (DOC): first, analysing data from a high-resolution sensor generating DOC concentrations, [DOC]; and secondly, the use of Raman spectroscopy to analyse the composition of DOC. Recent advances in sensor technology have enabled the collection of DOC data with greater frequency and over longer time periods than was previously possible through manual collection of water samples. In this research a 2.5-year time series of 30-minute [DOC] data from Drumtee Water, a peaty catchment in Scotland, was generated and analysed using a Spectro::lyser™ from S::Can™, with a customised algorithm for calculating [DOC]. The time series revealed details of events and strong seasonal variation in [DOC], with a range of 8.0 mg/l to 55.7 mg/l. Measurements made over the same period using manual sampling of river water were very similar, ranging from 10.2 mg/l to 81.1 mg/l (with the second largest value at 64.1 mg/l). Similar DOC export budgets were calculated from the Spectro::lyser™ measurements and from the laboratory-analysed samples for both the 2012/13 hydrological year (HY 2012/13) and the 2013/14 hydrological year (HY 2013/14). For HY 2012/13 the DOC budgets from the field-collected and laboratory-analysed data were 16.6 g C m-2 yr-1 and 19.8 g C m-2 yr-1 respectively; for HY 2013/14 they were 18.1 g C m-2 yr-1 and 19.5 g C m-2 yr-1 respectively. The similarity between the budgets calculated from the high-resolution [DOC] sensor and from the laboratory-measured [DOC] samples indicated that seasonal variation had a greater influence on export budgets than short-term events. Generalised additive models (GAMs) were used to model the high-resolution [DOC] data, giving an R2 of 0.75 and a p-value < 2.2 x 10-16. It was also identified statistically that there were regular [DOC] dilutions during events and that these dilutions tended to coincide with the period when discharge was increasing most rapidly. To identify relationships and periodicities in the high-resolution [DOC] time series that would otherwise be challenging to detect, three forms of wavelet analysis were used: continuous wavelet transforms (CWTs), maximal overlap discrete wavelet transforms (MODWTs) and wavelet coherence transforms (WTCs). Using the WTCs, short-term correlations were found between [DOC] and pH between 25 June 2013 and 17 July 2013, between [DOC] and SC between 7 August 2013 and 7 October 2013, and between [DOC] and water temperature between 19 June 2013 and 30 June 2013. Although the relationship between [DOC] and temperature is strong over a full year, over these shorter time periods it was the weakest of the three relationships established. Identifying this coherence was not possible using bivariate analysis, and the long periods of no coherence obscured these responses when the data were examined on scatter plots. Although wavelet analysis has been used in other applications, this is one of the first instances in which the technique has been applied to [DOC] time series.
Raman spectroscopy, conducted using a 785 nm laser, was explored as an analytical tool that could enable a better understanding of DOC composition, as an alternative to fluorescence spectroscopy. Tests were conducted using both Stokes and anti-Stokes Raman spectroscopy measurements, with the best results obtained using anti-Stokes measurements. Solid-phase measurements were made of glucose, fructose, sucrose, glycine, tyrosine, tryptophan and phenylalanine, but of these substances only glucose produced a measurable spectrum. Measurements (powders and solutions) were made of humic and fulvic acids, and these produced spectra that were measurably different from the background signals. The limit of detection was approximately 500 mg/l for both humic acid and fulvic acid. It was identified that comparing the sections of the measured spectra between wavenumbers -1100 cm-1 to -1400 cm-1 and -1800 cm-1 to -2000 cm-1 could be used to differentiate between humic and fulvic acids. In summary, this research has focussed on the use of high-resolution sensor technology to generate and then analyse a long time series in a fluvial system with a particularly high [DOC], and has made advances in modelling [DOC] using a GAM despite the complex relationship measured between discharge and [DOC]. Additionally, wavelet analysis has been applied to a [DOC] data set to identify trends in the time series that would otherwise be hard to detect; wavelet analysis has been applied to other geophysical time series, such as those in atmospheric research, but this appears to be the first time it has been applied to [DOC]. Finally, the use of the anti-Stokes region of the Raman spectrum has allowed identification of humic and fulvic acids and established a limit of detection, and an absorbance ratio was identified that can be used to determine whether a solution of humic substances is dominated primarily by humic acid or fulvic acid. This research appears to be the first study to explore this.
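
As a hedged illustration of how an annual DOC export budget of the kind quoted above can be assembled from high-frequency records, the sketch below integrates simulated 30-minute concentration and discharge series over a year and normalises by a hypothetical catchment area; all numbers (concentrations, flows, area) are invented, so the output is not comparable with the thesis values.

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical 30-minute records for one hydrological year: DOC concentration
# (mg/l) and discharge (m^3/s), with a broad seasonal cycle plus noise.
n = 365 * 48
t = np.arange(n)
doc_mg_l = 25 + 15 * np.sin(2 * np.pi * t / n) + rng.normal(0, 2, n)
discharge_m3_s = np.exp(rng.normal(-1.0, 0.6, n))        # right-skewed flows

# Instantaneous flux (mg/s) = concentration (mg/l) * discharge (l/s);
# integrate over each 30-minute step and express as g C per m^2 per year
# for a hypothetical catchment area.
catchment_area_m2 = 5.0e6
flux_mg_s = doc_mg_l * discharge_m3_s * 1000.0           # m^3/s -> l/s
export_g = np.sum(flux_mg_s * 1800.0) / 1000.0           # mg -> g, 1800 s per step
budget = export_g / catchment_area_m2
print(f"annual DOC export: {budget:.1f} g C m-2 yr-1")
```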
