1

Valid estimation and prediction inference in analysis of a computer model

Nagy, Béla
Computer models or simulators are becoming increasingly common in many fields of science and engineering, powered by the phenomenal growth in computer hardware over the past decades. Many of these simulators implement a particular mathematical model as a deterministic computer code, meaning that running the simulator again with the same input gives the same output. Running the code often involves computationally expensive tasks, such as numerically solving complex systems of partial differential equations, and when runs take too long, the simulator's usefulness is limited. To overcome time or budget constraints by making the most of limited computational resources, a statistical methodology known as the "Design and Analysis of Computer Experiments" has been proposed. The main idea is to run the expensive simulator at only a relatively small number of carefully chosen design points in the input space and, based on the outputs, to construct an emulator (statistical model) that can emulate (predict) the output at new, untried locations at a fraction of the cost. This approach is useful provided that we can measure how much the predictions of the cheap emulator deviate from the real response surface of the original computer model. One way to quantify emulator error is to construct pointwise prediction bands designed to envelop the response surface, with the assertion that the true response (simulator output) is enclosed by these envelopes with a certain probability. To make such probabilistic statements, one needs to introduce some kind of randomness; the common strategy used here is to model the computer code as a random function, also known as a Gaussian stochastic process. We concern ourselves with smooth response surfaces and use the Gaussian covariance function, which is ideal when the response function is infinitely differentiable. In this thesis, we propose Fast Bayesian Inference (FBI), which is both computationally efficient and can be implemented as a black box. Simulation results show that it achieves remarkably accurate prediction uncertainty assessments, in the sense that the empirical coverage probabilities of the prediction bands match their nominal levels, and that the associated reparameterizations can also improve parameter uncertainty assessments. / Faculty of Science / Department of Statistics / Graduate
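As a rough, self-contained illustration of the emulation idea summarized in this abstract (a sketch only, not the thesis's FBI procedure), the Python snippet below conditions a Gaussian process with a Gaussian (squared-exponential) covariance on a few runs of a toy simulator and forms pointwise ~95% prediction bands. The toy function `f`, the design, and the fixed covariance hyperparameters are all invented for illustration.

```python
import numpy as np

def gauss_cov(x1, x2, length=0.3, amp=1.0):
    """Gaussian (squared-exponential) covariance, suited to smooth surfaces."""
    d = x1[:, None] - x2[None, :]
    return amp * np.exp(-0.5 * (d / length) ** 2)

# Toy "expensive simulator" (hypothetical, for illustration only).
f = lambda x: np.sin(2 * np.pi * x) + 0.5 * x

# A few carefully chosen design points and their deterministic outputs.
X = np.linspace(0.0, 1.0, 7)
y = f(X)

# GP conditioning: posterior mean and variance at untried locations.
Xnew = np.linspace(0.0, 1.0, 200)
K = gauss_cov(X, X) + 1e-8 * np.eye(len(X))   # jitter for numerical stability
Ks = gauss_cov(Xnew, X)
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
mean = Ks @ alpha
v = np.linalg.solve(L, Ks.T)
pvar = np.diag(gauss_cov(Xnew, Xnew)) - np.sum(v ** 2, axis=0)

# Pointwise ~95% prediction bands intended to envelop the true surface.
sd = np.sqrt(np.maximum(pvar, 0.0))
lower, upper = mean - 1.96 * sd, mean + 1.96 * sd

# Empirical coverage of the bands on this toy example.
coverage = np.mean((f(Xnew) >= lower) & (f(Xnew) <= upper))
print(f"empirical pointwise coverage: {coverage:.2%}")
```

In practice the covariance hyperparameters would be estimated rather than fixed, and accounting for that estimation uncertainty is precisely where the coverage assessments discussed in the abstract become delicate.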
2

Improving Runoff Estimation at Ungauged Catchments

Zelelew, Mulugeta, January 2012
Water infrastructures have been implemented to support the vital activities of human society. At the same time, these developments have altered natural catchment response characteristics, challenging society to implement effective water resources planning and management strategies. The Telemark area in southern Norway has seen a large number of water infrastructure developments, particularly hydropower, over more than a century. Recent developments in decision support tools for flood control and reservoir operation have raised the need to compute inflows from local catchments, most of which are regulated or have no observed data. This motivated this PhD thesis work, whose aim is to improve runoff estimation at ungauged catchments; the research results are presented in four scientific papers.

Inverse distance weighting, inverse distance squared weighting, ordinary kriging, universal kriging, and kriging with external drift were applied to analyse precipitation variability and estimate daily precipitation in the study area. Geostatistics-based univariate and multivariate map-correlation concepts were applied to analyse, and physically understand, regional hydrological response patterns. The Sobol variance-based sensitivity analysis (VBSA) method was used to investigate how significantly the HBV hydrological model parameterization affects the model response and to evaluate the model's reliability as a prediction tool. The transferability of the HBV model space to ungauged catchments was also studied.

The analyses showed that the inverse distance weighting variants are the preferred spatial interpolation methods in areas with a relatively dense precipitation station network, whereas in mountainous areas and where the station network is relatively sparse, the kriging variants are preferred. The regional hydrological response correlation analyses suggested that geographic proximity alone cannot explain all of the hydrological response correlations in the study area. Moreover, the multivariate map-correlation analysis identified two distinct regional hydrological response patterns, radial and elliptical. These patterns influenced the location of the reference streamgauge best correlated with an ungauged catchment: where the radial-type pattern dominates, the nearest streamgauge was the best correlated; where the elliptical-type pattern dominates, the nearest reference streamgauge was not necessarily the best correlated. The VBSA verified that varying a minimum of four to six influential HBV model parameters is sufficient to simulate the catchments' response characteristics when the emphasis is on fitting high flows, while a minimum of six influential parameters must be varied to maintain model performance when the emphasis is on fitting low flows; varying more than nine of the fifteen HBV model parameters, however, produces no significant change in model performance. The model space transfer study indicated that transferring model parameter sets from a single donor catchment cannot guarantee representative runoff estimates at ungauged catchments. On the other hand, an ensemble-based model space transfer approach that draws model parameter sets from multiple donor catchments improved model performance at the ungauged catchments, and the results suggested that high performance can be achieved by integrating parameter sets from two to six donors. Objectively minimizing the HBV model's parametric dimensionality and sampling only the sensitive parameters maintained model performance and limited prediction uncertainty.
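As a minimal sketch of the simplest interpolation methods compared in this abstract, the Python snippet below implements inverse distance weighting; setting the power parameter to 2 gives the inverse distance squared variant. The station coordinates and precipitation values are invented and are not data from the Telemark study.

```python
import numpy as np

def idw(stations, values, target, power=1.0, eps=1e-12):
    """Inverse distance weighting: power=1 is plain IDW,
    power=2 is inverse distance squared weighting."""
    d = np.linalg.norm(stations - target, axis=1)
    if np.any(d < eps):                 # target coincides with a station
        return values[np.argmin(d)]
    w = 1.0 / d ** power                # closer stations weigh more
    return np.sum(w * values) / np.sum(w)

# Hypothetical station coordinates (km) and daily precipitation (mm).
stations = np.array([[0.0, 0.0], [10.0, 2.0], [4.0, 8.0], [7.0, 5.0]])
precip = np.array([12.0, 5.0, 8.5, 7.0])

# Estimate precipitation at an ungauged location.
target = np.array([5.0, 4.0])
print("IDW estimate: ", idw(stations, precip, target, power=1.0))
print("IDW2 estimate:", idw(stations, precip, target, power=2.0))
```

The kriging variants mentioned above differ in that the weights come from a fitted spatial covariance (variogram) model rather than from distance alone, which is what makes them preferable in sparse or mountainous networks.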
3

Performance Evaluation of Frequency Response from Wind Turbines on a System-Wide Scale: Application to an Isolated Power System Case

Wang, Ye, 20 November 2012
The increasing penetration of wind power that does not participate in frequency control creates new difficulties in the management of electrical power systems, and these problems are all the more significant when the grid is weak. This thesis evaluates the performance and reliability of frequency response from wind turbines on a system-wide scale; the studies are applied to an isolated power grid.

First, the impact of high levels of wind penetration on primary reserve allocation and on the grid's dynamic behaviour is characterized. It is shown that, beyond a certain penetration rate, the participation of wind turbines in frequency regulation is technically indispensable for maintaining power system security.

Two solutions allowing wind turbines to contribute to frequency control are then studied through dynamic simulations. The performance of emulated inertia is characterized while accounting for the impact of the turbines' initial operating point and of the controller parameters. The contribution of a wind power reserve to improving the system's dynamic performance is also identified.

To assess the potential and reliability of wind primary reserve, the last part of this work is devoted to statistical analyses that take into account the variability and forecast uncertainty of wind generation. Two reserve-allocation strategies are proposed and compared, and the impact of forecast errors on the potential of the wind power reserve is highlighted. Finally, the power-frequency characteristic of a wind farm and the adjustment range of the wind droop are characterized.
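To make the two wind-side frequency-support mechanisms discussed in this abstract concrete, the sketch below computes an active-power adjustment from a droop characteristic and from an emulated-inertia (ROCOF-proportional) term, using standard textbook forms; the droop value, inertia constant, and event data are hypothetical, and this is not the exact controller studied in the thesis.

```python
# Hypothetical parameters; textbook forms of the two controls.
P_RATED = 2.0e6    # wind turbine rated power (W)
F_NOM = 50.0       # nominal grid frequency (Hz)
DROOP = 0.04       # 4% droop: full power swing for a 4% frequency deviation
H_EMU = 4.0        # emulated inertia constant (s)

def droop_response(f_meas):
    """Primary-reserve contribution: power change proportional to the
    per-unit frequency deviation, scaled by 1/droop."""
    df_pu = (f_meas - F_NOM) / F_NOM
    return -(1.0 / DROOP) * df_pu * P_RATED

def emulated_inertia_response(rocof):
    """Synthetic-inertia contribution: power proportional to the rate of
    change of frequency (ROCOF, Hz/s), as a synchronous machine provides."""
    return -2.0 * H_EMU * (rocof / F_NOM) * P_RATED

# Example: an under-frequency event at 49.8 Hz, falling at 0.2 Hz/s.
dp = droop_response(49.8) + emulated_inertia_response(-0.2)
print(f"extra active power injected: {dp / 1e6:.3f} MW")
```

Note that sustaining the droop contribution requires the turbine to hold a power reserve, for example by operating deloaded, which is exactly the reserve-allocation question the statistical part of the abstract addresses.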
