11

Location techniques for pico- and femto-satellites, with applications for space weather monitoring

Griffiths, Ian Michael January 2017
Space weather phenomena have a significant impact on satellite communications but are not well understood. In-situ measurements of the ionospheric environment would significantly improve the understanding of the origins and progressions of these phenomena. Whilst previous scientific satellites have measured the ionospheric plasma, they only provide a limited view due to their small number. It has previously been suggested that a swarm of femto-satellites (PCBsats) could be used to collect high-quality temporal and spatial measurements while remaining cost-effective. To give the measurements any scientific value, the location and time of each measurement need to be accurately recorded. The PCBsat prototype used a solution that, due to export requirements and fundamental limitations of the device, would not be capable of working in space. Several location and timing solutions have been investigated, with none matching the precision, accuracy, power consumption and physical size of a GNSS receiver (i.e. a receiver of GPS, GLONASS, Galileo and similar signals). To further reduce the power consumption, a novel distributed GNSS receiver has been designed and built, in which the largest computational burden (calculating the receiver's position) is offloaded to a relaying node. This use of distributed computing has been shown to reduce the power consumption of the receiver by between 5.6% and 13.3%, which is equivalent to between 2 and 5 times the power consumption of the PCBsat's main processor. The approach has the additional benefit of supporting a hybrid scheme, in which the information required to calculate a receiver's position is stored so that it can later be combined with higher-precision ephemerides that are publicly available but delayed by up to three weeks. This has many applications, as it can increase the utility of collected data at reduced cost. As the intended femto-satellite application relies on a link to relaying satellites, the dynamics, and in particular the dispersion, of the intended constellation need to be known. These have been modelled using a novel orbit simulator, the first of its kind to model multidimensional free molecular drag and so simulate the effects of the low-density atmosphere on a satellite. This allows the dispersion of a constellation of satellites to be investigated, and maximum separations for the PCBsat are presented.
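The abstract does not spell out the offloaded computation; as a rough illustration, the heaviest step of a GNSS fix is the iterative least-squares solve of the pseudorange equations, sketched below in Python (the satellite geometry and measurements would be placeholders, not values from the thesis):

    # Sketch of the position solve a relaying node might perform on behalf of
    # a femto-satellite: Gauss-Newton least squares on GNSS pseudoranges.
    import numpy as np

    def solve_position(sat_pos, pseudoranges, iterations=8):
        """Estimate receiver position (m) and clock bias (m) from pseudoranges."""
        x = np.zeros(4)  # start at Earth's centre with zero clock bias
        for _ in range(iterations):
            ranges = np.linalg.norm(sat_pos - x[:3], axis=1)
            residuals = pseudoranges - (ranges + x[3])
            # Jacobian: negative unit line-of-sight vectors, plus 1 for clock bias
            H = np.hstack([-(sat_pos - x[:3]) / ranges[:, None],
                           np.ones((len(pseudoranges), 1))])
            dx, *_ = np.linalg.lstsq(H, residuals, rcond=None)
            x += dx
        return x[:3], x[3]

Offloading this loop, rather than running it on the femto-satellite itself, is consistent with the power savings the abstract reports.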
12

Quantifying uncertainty in radar rainfall estimates using an X-band dual polarisation weather radar

Dufton, David Richard Lloyd January 2016
Weather radars have been used to quantitatively estimate precipitation since their development in the 1940s, yet these estimates are still prone to large uncertainties, which dissuade the hydrological community in the UK from adopting them as a primary rainfall data source. Recently, dual polarisation radars have become more common, with the national networks in the USA, UK and across Europe being upgraded, and the benefits of dual polarisation are beginning to be realised for improving quantitative precipitation estimates (QPE). The National Centre for Atmospheric Science (NCAS) mobile Doppler X-band dual polarisation weather radar is the first of its kind in the UK, and since its acquisition in 2012 it has been deployed on several field campaigns both in the UK and abroad. The first of these was the Convective Precipitation Experiment (COPE), during which the radar was deployed in Cornwall (UK) through the summer of 2013. This thesis uses the data acquired during the COPE field campaign to develop a processing chain for the X-band radar that leverages its dual polarisation capabilities. The processing chain includes the removal of spurious echoes, including second trip, ground clutter and insects, through the use of dual polarisation texture fields, logical decision thresholds and fuzzy logic classification. The radar data are then corrected for the effects of attenuation and partial beam blockage (PBB) by using the differential phase shift to constrain the total path-integrated attenuation and to calibrate the radar azimuthally. A new smoothing technique accounts for backscatter differential phase in the smoothing of differential phase shift, combining a long and a short averaging window with smoothing weighted by the copolar correlation coefficient. During the correction process it is shown that the calculation of PBB is insensitive to variation in the ratio between specific attenuation and specific differential phase shift, provided a consistent value is used. It is also shown that the uncertainty in attenuation correction is lower when using a constrained correction, such as the ZPHI approach, rather than a direct linear correction using differential phase shift, making the constrained approach the preferred method where possible. Finally, the quality-controlled, corrected radar moments are used to develop rainfall estimates for the COPE field campaign. Results show that the quality control and correction process increases the agreement between rain gauges and radar rainfall estimates based on horizontal reflectivity, raising the regression correlation from -0.01 to 0.34 and reducing the mean absolute percentage difference (MAPD) from 86% to 31%. Using dual polarisation moments to estimate rainfall directly shows that estimates based on the theoretical conversion of specific attenuation to reflectivity agree most closely with the rain gauges for the campaign, with a MAPD of 24%. Finally, it is demonstrated that merging multiple dual polarisation rainfall estimates improves performance in high-intensity rainfall events while maintaining the overall accuracy of the estimates when compared to rain gauges.
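The campaign-specific estimators are not reproduced in the abstract; as a hedged sketch of the final step of such a chain, the classic Marshall-Palmer power law converts corrected horizontal reflectivity to a rain rate (the coefficients below are the textbook values, not those derived in the thesis):

    # Invert Z = a * R**b to estimate rain rate from corrected reflectivity.
    import numpy as np

    def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
        """dBZ -> rain rate in mm/h via the Marshall-Palmer Z-R relation."""
        z_linear = 10.0 ** (np.asarray(dbz) / 10.0)  # Z in mm^6 m^-3
        return (z_linear / a) ** (1.0 / b)

    print(rain_rate_from_dbz([20.0, 35.0, 50.0]))  # light, moderate, heavy rain

The thesis's preferred estimator instead converts specific attenuation to reflectivity before this step, which is what yields the 24% MAPD quoted above.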
13

Application of chaos theory to short term prediction of high resolution rainfall

Murray, Lisa Jane January 2010
No description available.
14

Methods of targeting observations for the improvement of weather forecast skill

Frame, Thomas H. A. January 2006
This thesis is a contribution to the subjects of midlatitude atmospheric dynamics and targeting observations for the improvement of weather forecasts. For the first time, the full spectrum of singular vectors of the Eady model is considered. The importance and implications of the unshielding and modal unmasking mechanisms for the computed singular vectors are discussed. The computed singular vectors are used to analyse the vertical structure, in a vertical cross-section, of the singular vector targeting function commonly used in observation targeting. Through comparison of this vertical cross-section with the dynamics of singular vectors, inferences are made about the scale and qualitative behaviour of the perturbations to which particular regions are 'sensitive'.
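As a generic illustration (not the Eady-model calculation itself), singular vectors of a tangent-linear propagator M over the optimisation interval are obtained from its singular value decomposition; under the Euclidean norm the leading right singular vector is the fastest-growing initial perturbation:

    # Singular vectors of a linear propagator M; M is a random stand-in here.
    import numpy as np

    rng = np.random.default_rng(0)
    M = rng.standard_normal((50, 50))   # tangent-linear propagator (stand-in)
    U, s, Vt = np.linalg.svd(M)

    initial_sv = Vt[0]                  # perturbation with maximum growth
    evolved_sv = U[:, 0]                # its structure at the end of the interval
    print("growth factor of leading singular vector:", s[0])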
15

Limitations to seasonal weather prediction and crop forecasting due to nonlinearity and model inadequacy

Higgins, Sarah January 2015
This thesis examines the main issues surrounding crop modelling through detailed studies of (i) multi-model ensemble forecasting using a simple dynamical system as a proxy for seasonal weather forecasting, (ii) probabilistic forecasts for crop models and (iii) an analysis of changes in US yield. The ability to forecast crop yield accurately on a seasonal time frame would be hugely beneficial to society, in particular to farmers, governments and the insurance industry. In addition, advance warning of severe weather patterns that could devastate large areas of crops would allow contingency plans to be put in place before the onset of a widespread famine, potentially averting a humanitarian disaster. There is little experience in the experimental design of ensembles for seasonal weather forecasting, and exploring the stability of the results while varying, for example, the sample size aids understanding. To this end, a series of numerical experiments is conducted in an idealised world designed to replicate the multi-model ensemble forecasting methods used in seasonal weather forecasting; given the complexity of physical weather systems, the experiments are instead conducted on the Moran-Ricker map [56, 70]. Additional experiments examine whether including climatology as a separate model, or blending with climatology, can increase skill. A method to create probabilistic forecasts from a crop model, the Crop Environment Resource Synthesis Maize model (CERES-Maize) [19, 37], is proposed. New empirical models are created using historical US maize yield, and the skill gained from equally weighting the crop model with a simple empirical model is investigated. Background reviews of weather and yield data are presented in new ways for the largest maize-growing state, Iowa. Finally, a new method for separating the impacts of favourable weather from technology increases in a crop-yield time series is explored.
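The Moran-Ricker map makes the idealised world cheap to simulate; a minimal sketch of the ensemble set-up follows (parameter values are illustrative, not those used in the thesis):

    # Iterate the Moran-Ricker map and form a forecast ensemble by perturbing
    # the initial condition, mimicking the seasonal-forecasting test bed.
    import numpy as np

    def ricker(x, r=2.9):
        return x * np.exp(r * (1.0 - x))

    rng = np.random.default_rng(1)
    truth = 0.8
    ensemble = truth + 0.01 * rng.standard_normal(64)  # perturbed initial states

    for _ in range(8):                                 # an 8-step "season"
        truth = ricker(truth)
        ensemble = ricker(ensemble)

    print("truth:", truth)
    print("ensemble mean/spread:", ensemble.mean(), ensemble.std())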
16

Applied probabilistic forecasting

Binter, Roman January 2012
In any actual forecast, the future evolution of the system is uncertain and the forecasting model is mathematically imperfect. Both ontic uncertainty in the future (due to true stochasticity) and epistemic uncertainty of the model (reflecting structural imperfections) complicate the construction and evaluation of probabilistic forecasts. In almost all nonlinear forecast models, the evolution of uncertainty in time is not tractable analytically, and Monte Carlo approaches ("ensemble forecasting") are widely used. This thesis advances our understanding of the construction of forecast densities from ensembles, the evolution of the resulting probability forecasts and methods of establishing skill (benchmarks). A novel method of partially correcting the model error is introduced and shown to outperform a competitive approach. The properties of kernel dressing, a method of transforming ensembles into probability density functions, are investigated and the convergence of the approach is illustrated. A connection between forecasting and information theory is examined by demonstrating that kernel dressing via minimization of Ignorance implicitly leads to minimization of the Kullback-Leibler divergence. The Ignorance score is critically examined in the context of other information-theoretic measures. The method of Dynamic Climatology is introduced as a new approach to establishing skill (benchmarking). Dynamic Climatology is a new, relatively simple, nearest-neighbor-based model shown to be of value in benchmarking the global circulation models of the ENSEMBLES project, a project funded by the European Union that brings together the major European weather forecasting institutions to develop and test state-of-the-art seasonal weather forecasting models. By benchmarking the seasonal forecasts of the ENSEMBLES models, we demonstrate that Dynamic Climatology can help us better understand the value and forecasting performance of large-scale circulation models. Lastly, a new approach to correcting (improving) an imperfect model is presented, inspired by [63]. The main idea is a two-stage procedure in which a second-stage 'corrective' model iteratively corrects systematic parts of the forecasting errors produced by a first-stage 'core' model. The corrector is iterative in nature: at a given time t the core model forecast is corrected and then used as input to the next iteration of the core model to generate a time t + 1 forecast. Using two nonlinear systems, we demonstrate that the iterative corrector is superior to alternative approaches based on direct (non-iterative) forecasts. While the choice of corrector model class is flexible, we use radial basis functions, which are frequently used in statistical learning and surface approximation and involve a number of computational aspects that we discuss in some detail.
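As a hedged sketch of kernel dressing and the Ignorance score (the offset and bandwidth below are placeholders; the thesis fits such parameters by minimizing Ignorance on training data):

    # Dress each ensemble member with a Gaussian kernel, then score the
    # resulting forecast density at the verifying observation.
    import numpy as np

    def dressed_density(y, ensemble, offset=0.0, sigma=0.3):
        z = (y - (np.asarray(ensemble) + offset)) / sigma
        return np.mean(np.exp(-0.5 * z**2) / (sigma * np.sqrt(2.0 * np.pi)))

    def ignorance(y, ensemble, **kw):
        return -np.log2(dressed_density(y, ensemble, **kw))  # bits; lower is better

    print(ignorance(1.0, [1.1, 0.9, 1.4, 0.7, 1.2]))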
17

On the provision, reliability, and use of hurricane forecasts on various timescales

Jarman, Alexander S. January 2014
Probabilistic forecasting plays a pivotal role both in the application and in the advancement of geophysical modelling. Operational techniques and modelling methodologies are examined critically in this thesis and suggestions for improvement are made; potential improvements are illustrated in low-dimensional chaotic systems of nonlinear equations. Atlantic basin hurricane forecasting, and forecast evaluation methodologies on daily to multi-annual timescales, provide the primary focus of application and real-world illustration. Atlantic basin hurricanes have attracted much attention from the scientific and private-sector communities as well as from the general public, owing to their potential for devastation to life and property and to speculation about increasing trends in hurricane activity. Current approaches to modelling, prediction and forecast evaluation employed in operational hurricane forecasting are critiqued, followed by recommendations for best-practice techniques; the applicability of these insights extends far beyond the forecasting of hurricanes. Hurricane data analysis and forecast output are based on small-number count data sourced from a small-sample historical archive, and the analysis benefits from specialised statistical methods adapted to this particular problem. The challenges and opportunities posed in hurricane statistical analysis and forecasting by small-number, small-sample and, in particular, serially dependent data are clarified, allowing analysts and forecasters alike access to more appropriate statistical methodologies. Novel statistical forecasting techniques are introduced for seasonal hurricane prediction. In addition, a range of linear and nonlinear techniques for the analysis of hurricane count data are applied for the first time, along with an innovative algorithmic approach to the statistical inference of regression model coefficients. A real-time outlook for the 2013 hurricane season is presented, along with a methodology to support a running (re)analysis of National Hurricane Center 48-hour forecasts in 2013; the focus here is on whether, and if so how, forecast effectiveness can be improved by 'recalibrating' the raw forecasts in real time. In this case, it is revealed that recalibration does not improve forecast performance and that, across years, it can be detrimental. In short, a new statistical framework is proposed for evaluating and interpreting forecast reliability, forecast skill and forecast value, to provide a sound basis for constructing and utilising operational event predictions; this framework is then illustrated in the specific context of hurricane prediction. Proposed methods of forecast recalibration, in the context of both a low-dimensional dynamical system and operational hurricane forecasting, are employed to illustrate methods for improving resource allocation, distinguishing, for example, scenarios where forecast recalibration is effective from those where resources would be better dedicated to improving forecast techniques. Finally, a novel approach to the robust statistical identification of the weakest links in the complex chain leading to probabilistic prediction of nonlinear systems is presented, and its application demonstrated in both numerical studies and operational systems.
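As an illustration of the small-count setting (the counts below are made up, not the historical record analysed in the thesis), the simplest probabilistic benchmark treats seasonal hurricane numbers as Poisson-distributed with the climatological mean rate:

    # Climatological Poisson benchmark for seasonal hurricane counts.
    import numpy as np
    from scipy.stats import poisson

    counts = np.array([4, 7, 2, 8, 5, 6, 3, 9, 4, 5])  # illustrative past seasons
    rate = counts.mean()                                # MLE of the Poisson rate

    for k in range(11):                                 # outlook for next season
        print(k, "hurricanes:", round(poisson.pmf(k, rate), 3))

Any proposed seasonal forecasting scheme then has to beat this benchmark under a proper score before it can claim skill.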
18

Evaluated developments in the WRF-Chem Model: comparison with observations and evaluation of impacts

Archer-Nicholls, Scott January 2014
The Weather Research and Forecasting with Chemistry (WRF-Chem) Model is an "online" regional-scale prediction system designed to simulate many detailed meteorological, gas-phase chemical and aerosol processes, with full coupling between the different components. The impacts of aerosol particles are complex and spatially heterogeneous, varying greatly at the regional scale, and modelling their properties and impacts in a systematic manner requires the coupling between chemical phases, meteorology and physical parameterisations that a model such as WRF-Chem offers. This manuscript documents several developments made to the WRF-Chem model to improve its representation of detailed gas-phase chemical and aerosol processes, together with their evaluation. The first study gives an overview of developments made for modelling the North-West European region, including the addition of a new semi-explicit chemical mechanism, N2O5 heterogeneous chemistry and modifications to the sea-spray emissions routine to include fine-mode organic material. The broad impacts of these developments were assessed in the study, while a follow-up paper (included in the supplementary material) investigated the impacts of the N2O5 heterogeneous chemistry more deeply. The second study discusses modifications to WRF-Chem and to emission products to improve the modelled representation of biomass burning aerosol particles over Brazil. Model results were compared with aircraft measurements and found to represent aerosol particle size distributions and cloud condensation nuclei concentrations reasonably well, but the model transported too much biomass burning aerosol up to high altitudes (4-8 km). In the third study, nested simulations over Brazil (at higher resolutions than those used in the second study) were used to evaluate the impact of aerosol particles on the local radiative balance, by comparing results from simulations with and without aerosol-radiative feedbacks. The instantaneous clear-sky aerosol-radiation forcing was found to give a net cooling of −5.0 W m⁻² at the top of the atmosphere. Issues with resolving aerosol-cloud interactions, arising from the convective parameterisation and from differences in model setup across scales, made evaluating the semi-direct and indirect effects impossible.
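The abstract does not give the parameterisation details; a common first-order form for N2O5 heterogeneous loss on aerosol, of the general kind added to such models, is k = γ·c̄·Sa/4 (the uptake coefficient and surface area below are illustrative):

    # First-order N2O5 loss rate on aerosol surfaces: k = gamma * c_bar * Sa / 4.
    import numpy as np

    R = 8.314         # gas constant, J mol^-1 K^-1
    M_N2O5 = 0.108    # molar mass of N2O5, kg mol^-1

    def n2o5_het_rate(gamma, surface_area, temperature=288.0):
        """Loss rate in s^-1; surface_area in m^2 of aerosol per m^3 of air."""
        c_bar = np.sqrt(8.0 * R * temperature / (np.pi * M_N2O5))  # mean molecular speed, m/s
        return 0.25 * gamma * c_bar * surface_area

    print(n2o5_het_rate(gamma=0.02, surface_area=2e-4))  # ~2.4e-4 s^-1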
19

Optimising ground penetrating radar (GPR) to assess pavements

Evans, Robert D. January 2010
Ground penetrating radar (GPR) technology has existed for many decades, but only in the last 20 to 30 years has it undergone substantial development for use in near-surface ground investigations. The early 1980s saw the first major developments in the application of GPR to pavements (i.e. engineered structures designed to carry traffic loads), and it is now an established investigation technique, with generic information included in several national standard guidance documents. Analysis of GPR data can provide information on layer depths, material condition, moisture, voiding, reinforcement and the location of other features. Assessing the condition of pavements, in order to plan subsequent maintenance, is essential for the efficient long-term functioning of the structure, and GPR has enhanced and improved the range and certainty of information that can be obtained from pavement investigations. Despite the recent establishment of the technique in pavement investigation, GPR is currently used routinely for pavement projects in only a minority of countries, and its specialist nature and the sometimes variable results that are obtained can mean there is both a lack of appreciation and a lack of awareness of the information GPR can potentially provide. Because GPR is still a developing technique, and many aspects of its use are specialised, several technical aspects of GPR pavement investigations have not been fully researched, and knowledge of the response of GPR to some material conditions has not been fully established. The overall aim of this EngD research project was to provide improved pavement investigation capabilities by enhancing the methodologies and procedures used to obtain information from GPR. Several discrete research topics were addressed through a range of research methods, including a literature review, fieldwork investigations, experimental laboratory investigations and a review of previously collected data. The findings of the research allowed conclusions and recommendations to be made regarding improved fieldwork methodologies; enhancing information and determining material condition from previously collected GPR data; assessing the effect of pavement temperature and moisture condition on GPR data; and managing errors and uncertainty in GPR data. During the EngD project, a number of documents and presentations publicised the findings both within the EngD sponsoring company (Jacobs) and externally, and an in-house GPR capability has been established within Jacobs as a direct result of the project.
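As a minimal illustration of the layer-depth information GPR yields (the permittivity value is a typical figure for asphalt, not a result from the thesis), reflector depth follows from the two-way travel time and the material's relative permittivity:

    # Depth of a reflector from GPR two-way travel time.
    C = 0.3  # speed of light, m/ns

    def layer_depth(two_way_time_ns, rel_permittivity):
        velocity = C / rel_permittivity ** 0.5    # wave speed in the layer, m/ns
        return velocity * two_way_time_ns / 2.0   # one-way distance, m

    print(layer_depth(two_way_time_ns=3.0, rel_permittivity=6.0))  # ~0.18 m

Much of the uncertainty the thesis addresses enters through the permittivity, which varies with moisture and temperature.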
20

Exploiting weather forecast data for cloud detection

Mackie, Shona January 2009
Accurate, fast detection of clouds in satellite imagery has many applications, for example in Numerical Weather Prediction (NWP) and in climate studies of both the atmosphere and the Earth's surface temperature. Most operational techniques for cloud detection rely on the differences between observations of cloud and of clear sky being more or less constant in space and in time. In reality this is not the case: different clouds have different spectral properties, and different cloud types are more or less likely in different places and at different times, depending on atmospheric conditions and on the Earth's surface properties. Observations of clear sky also vary in space and time, depending on atmospheric and surface conditions and on the presence or absence of aerosol particles. The Bayesian approach adopted in this project allows pixel-specific physical information (for example from NWP) to be used to predict pixel-specific observations of clear sky. A physically based, spatially and temporally specific probability that each pixel contains a cloud observation is then calculated. An advantage of this approach is that identifying ambiguously classified pixels from a probabilistic result is straightforward, in contrast to the binary result generally produced by operational techniques. This project has developed and validated the Bayesian approach to cloud detection and has extended the range of applications for which it is suitable, achieving skill scores that match or exceed those of operational methods in every case. High temperature gradients can make observations of clear sky around ocean fronts, particularly at thermal wavelengths, appear similar to cloud observations. To address this potential source of ambiguous cloud detection results, a region of imagery acquired by the AATSR sensor, noted to contain some ocean fronts, was selected. Pixels in the region were clustered according to their spectral properties with the aim of separating pixels that correspond to different thermal regimes of the ocean. The mean spectral properties of the pixels in each cluster were then processed using the Bayesian cloud detection technique, and the resulting posterior probability of clear sky was assigned to the individual pixels. Several clustering methods were investigated, and the most appropriate, which allowed pixels to be associated with multiple clusters via a normalized vector of 'membership strengths', was used to conduct a case study. The distribution of the final calculated probabilities of clear sky became markedly more bimodal when clustering was included, indicating fewer ambiguous classifications, but at the cost of some single-pixel clouds being missed. While further investigation could provide a solution, the computational expense of the clustering method made it impractical to include in the work of this project. This new Bayesian approach to cloud detection has been developed by this project to the point where it has been released under a public license. Initially designed as a tool to aid retrieval of sea surface temperature from night-time imagery, the project has extended the Bayesian technique to imagery acquired over land as well as sea, and to daytime as well as night-time imagery. This was achieved using the land surface emissivity and surface reflectance parameter products available from the MODIS sensor.
This project added a visible Radiative Transfer Model (RTM), developed at the University of Edinburgh, and a kernel-based surface reflectance model, adapted here from that used by the MODIS sensor, to the cloud detection algorithm. In addition, the cloud detection algorithm was made more flexible, making its implementation for data from the SEVIRI sensor straightforward. A database of 'difficult' cloud and clear targets, representing a wide range of both spatial and temporal locations, was provided by Météo-France and used in this work to validate the extensions made to the cloud detection scheme and to compare the skill of the Bayesian approach with that of operational approaches. For night-time land and sea imagery, the Bayesian technique, with the improvements and extensions developed by this project, achieved skill scores 10% and 13% higher than Météo-France respectively. For daytime sea imagery, the skill scores of the two approaches were within 1% of each other, while for daytime land imagery the Bayesian method achieved a 2% higher skill score. The main strength of the Bayesian technique is the physical basis of the differentiation between clear and cloud observations. Using NWP information to predict pixel-specific observations of clear sky is relatively straightforward, but making such predictions for cloud observations is more complicated; the technique therefore relies on an empirical distribution, rather than a pixel-specific prediction, for cloud observations. To address this, the project developed a means of predicting cloudy observations through fast forward-modelling of pixel-specific NWP information. All cloud fields in the pixel-specific NWP data were set to 0, and clouds were added to the profile at discrete intervals through the atmosphere, with cloud water path and cloud ice path (cwp, cip) also set to values spaced exponentially at discrete intervals up to saturation, and with cloud pixel fraction set to 25%, 50%, 75% and 100%. Only single-level, single-phase clouds were modelled, with the justification that the resulting distribution of predicted observations, once smoothed through consideration of uncertainties, is likely to include observations that would correspond to multi-phase and multi-level clouds. A fast RTM was run on the profile information for each of these individual clouds, and relationships, specific to cloud altitude, cloud pixel fraction and channel, between cwp (and similarly cip) and the predicted observations were calculated from the RTM results. These relationships were used to infer predicted observations for clouds with cwp/cip values other than those explicitly forward-modelled. The parameters defining the relationships were interpolated to give relationships for predicted cloud observations at 10 m vertical intervals through the atmosphere, with pixel coverage ranging from 25% to 100% in increments of 1%. A distribution of predicted cloud observations is thus achieved without explicit forward-modelling of an impractical number of atmospheric states. Weights are applied to the representation of individual clouds within the final Probability Density Function (PDF) to make the distribution of predicted observations realistic, according to the pixel-specific NWP data and to distributions seen in a global reference dataset of NWP profiles from the European Centre for Medium-Range Weather Forecasts (ECMWF).
The distribution is then convolved with the uncertainties in the forward-modelling and in the NWP data, and with sensor noise, to create the final PDF in observation space, from which the conditional probability that the pixel observation corresponds to a cloud observation can be read. Although a relatively fast computational implementation of the technique was achieved, the results are disappointingly poor for the SEVIRI-acquired dataset, provided by Météo-France, against which validation was carried out. This is thought to be because the uncertainties in the NWP data, and the dependence of the forward-modelling on those uncertainties, are poorly understood and treated too optimistically in the algorithm. Including more error terms in the convolution introduces the problem of quantifying those errors (a non-trivial task) and would increase the processing time, making implementation impractical. In addition, if the uncertainties considered are too high, a PDF flatter than the empirical distribution currently used would be produced, making the technique less useful.
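A minimal sketch of the pixel-wise Bayesian step underlying both approaches (all numbers are placeholders; the thesis builds the likelihoods from NWP forward-modelling and empirical distributions):

    # Posterior probability that a pixel is clear, from clear and cloud
    # likelihoods and a prior probability of clear sky.
    import numpy as np

    def gaussian(x, mean, sigma):
        return np.exp(-0.5 * ((x - mean) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    def posterior_clear(obs, clear_mean, clear_sigma, cloud_pdf, prior_clear=0.7):
        p_clear = gaussian(obs, clear_mean, clear_sigma) * prior_clear
        p_cloud = cloud_pdf(obs) * (1.0 - prior_clear)
        return p_clear / (p_clear + p_cloud)

    # e.g. a broad empirical cloud distribution in brightness temperature (K)
    cloud_pdf = lambda t: gaussian(t, 255.0, 25.0)
    print(posterior_clear(obs=288.0, clear_mean=290.0, clear_sigma=1.5,
                          cloud_pdf=cloud_pdf))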
