71

Underwater acoustic localization and tracking of Pacific walruses in the northeastern Chukchi Sea

Rideout, Brendan Pearce 10 January 2012 (has links)
This thesis develops and demonstrates an approach for estimating the three-dimensional (3D) location of a vocalizing underwater marine mammal using acoustic arrival time measurements at three spatially separated receivers while providing rigorous location uncertainties. To properly account for uncertainty in the measurements of receiver parameters (e.g., 3D receiver locations and synchronization times) and environmental parameters (water depth and sound speed correction), these quantities are treated as unknowns constrained with prior estimates and prior uncertainties. While previous localization algorithms have solved for an unknown scaling factor on the prior uncertainties as part of the inversion, in this work unknown scaling factors on both the prior and arrival time uncertainties are estimated. Maximum a posteriori estimates for sound source locations and times, receiver parameters, and environmental parameters are calculated simultaneously. Posterior uncertainties for all unknowns are calculated and incorporate both arrival time and prior uncertainties. Simulation results demonstrated that, for the case considered here, linearization errors are generally small and that the lack of an accurate sound speed profile does not necessarily cause large uncertainties or biases in the estimated positions. The primary motivation for this work was to develop an algorithm for locating underwater Pacific walruses in the coastal waters around Alaska. In 2009, an array of approximately 40 underwater acoustic receivers was deployed in the northeastern Chukchi Sea (northwest of Alaska) from August to October to record the vocalizations of marine mammals including Pacific walruses and bowhead whales. Three of these receivers were placed in a triangular arrangement approximately 400 m apart near the Hanna Shoal (northwest of Wainwright, Alaska). A sequence of walrus knock vocalizations from this data set was processed using the localization algorithm developed in this thesis, yielding a track whose estimated swim speed is consistent with current knowledge of normal walrus swim speed. An examination of absolute and relative walrus location uncertainties demonstrated the usefulness of considering relative uncertainties for applications where the precise location of the mammal is not important (e.g., estimating swim speed).
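As a rough illustration of the maximum a posteriori localization described above, the sketch below inverts arrival times at three receivers for a source position and emission time, with priors on all unknowns. The receiver geometry, sound speed, arrival times and uncertainty values are invented, not taken from the thesis, and fixing the uncertainty scaling omits the thesis's estimation of scaling factors.

```python
# Minimal MAP localization sketch: weighted data misfit plus weighted
# prior misfit, solved by nonlinear least squares. All values assumed.
import numpy as np
from scipy.optimize import least_squares

c = 1450.0                               # assumed sound speed (m/s)
rx = np.array([[0.0, 0.0, 40.0],         # three receivers ~400 m apart
               [400.0, 0.0, 42.0],
               [200.0, 350.0, 41.0]])
t_obs = np.array([0.310, 0.290, 0.335])  # measured arrival times (s)
sigma_t = 1e-3                           # arrival-time uncertainty (s)
prior = np.array([150.0, 120.0, 20.0, 0.0])    # prior [x, y, z, t0]
sigma_p = np.array([200.0, 200.0, 15.0, 0.5])  # prior uncertainties

def residuals(m):
    x, y, z, t0 = m
    d = np.linalg.norm(rx - np.array([x, y, z]), axis=1)
    r_data = (t0 + d / c - t_obs) / sigma_t    # whitened data misfit
    r_prior = (m - prior) / sigma_p            # whitened prior misfit
    return np.concatenate([r_data, r_prior])

fit = least_squares(residuals, prior)
J = fit.jac
cov = np.linalg.inv(J.T @ J)     # linearized posterior covariance
print("MAP location:", fit.x[:3], "+/-", np.sqrt(np.diag(cov))[:3])
```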
72

Direct sensitivity techniques in regional air quality models: development and application

Zhang, Wenxian 12 January 2015 (has links)
Sensitivity analysis based on a chemical transport model (CTM) serves as an important approach to better understanding the relationship between trace contaminant levels in the atmosphere and emissions, chemical processes, and physical processes. Previous studies on ozone control identified the high-order Decoupled Direct Method (HDDM) as an efficient tool for sensitivity analysis. Given the growing recognition of the adverse health effects of fine particulate matter (i.e., particles with an aerodynamic diameter less than 2.5 micrometers (PM2.5)), this dissertation presents the development of an HDDM sensitivity technique for particulate matter and its implementation in a widely used CTM, CMAQ. Compared to previous studies, two new features of the implementation are 1) the inclusion of sensitivities of aerosol water content and activity coefficients, and 2) the tracking of the chemical regimes of the embedded thermodynamic model. The new features provide more accurate sensitivities, especially for nitrate and ammonium. Results compare well with brute-force sensitivities and are shown to be more stable and computationally efficient. Next, this dissertation explores applications of HDDM. Source apportionment analysis for the Houston region in September 2006 indicates that nonlinear responses accounted for 3.5% to 33.7% of daily average PM2.5, and that PM2.5 formed rapidly at night, especially in the presence of abundant ozone and under stagnant conditions. Uncertainty analysis based on the HDDM found that, on average, uncertainties in the emission rates led to 36% uncertainty in simulated daily average PM2.5 and could explain much, but not all, of the difference between simulated and observed PM2.5 concentrations at two observation sites. HDDM is then applied to assess the impact of flare VOC emissions with temporally variable combustion efficiency. Detailed study of flare emissions using the 2006 Texas special inventory indicates that daily maximum 8-hour ozone at a monitoring site can increase by 2.9 ppb when combustion efficiency is significantly decreased. The last application in this dissertation integrates the reduced-form model into an electricity generation planning model, enabling the optimization to represent the geospatial dependence of air quality-related health costs when seeking the least-cost plan for power generation. The integrated model can provide useful advice on selecting fuel types and locations for power plants.
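To make the decoupled direct method concrete, here is a minimal sketch on a one-species toy box model: the sensitivity equation is integrated alongside the model and checked against a brute-force finite difference. This shows only the basic first-order DDM idea, not the high-order HDDM or its CMAQ implementation, and all rate values are made up.

```python
# Toy DDM illustration: augment the model state with the sensitivity
# s = dC/dE and integrate both together; compare with brute force.
from scipy.integrate import solve_ivp

k, E = 0.2, 5.0          # assumed loss rate (1/h), emission rate (ppb/h)

def rhs(t, y):
    C, s = y             # concentration and its sensitivity dC/dE
    return [E - k * C,   # model equation
            1.0 - k * s] # sensitivity equation (d/dE of the model RHS)

sol = solve_ivp(rhs, [0, 24], [0.0, 0.0])
C24, s_ddm = sol.y[:, -1]

# Brute-force check: perturb E by 1% and difference the two runs.
dE = 0.01 * E
pert = solve_ivp(lambda t, y: [E + dE - k * y[0]], [0, 24], [0.0])
s_bf = (pert.y[0, -1] - C24) / dE

print(f"DDM sensitivity: {s_ddm:.4f}, brute force: {s_bf:.4f}")
```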
73

Efficient Methods for Predicting Soil Hydraulic Properties

Minasny, Budiman January 2000 (has links)
Both empirical and process-simulation models are useful for evaluating the effects of management practices on environmental quality and crop yield. The use of these models is limited, however, because they need many soil property values as input. The first step towards modelling is the collection of input data. Soil properties can be highly variable spatially and temporally, and measuring them is time-consuming and expensive. Efficient methods for estimating soil hydraulic properties, which consider the uncertainty and cost of measurements, form the main thrust of this study. Hydraulic properties are affected by other soil physical and chemical properties, so it is possible to develop empirical relations to predict them. Such a quantified relation is called a pedotransfer function. Such functions may be global or restricted to a country or region. The different classification of particle-size fractions used in Australia compared with other countries presents a problem for the immediate adoption of exotic pedotransfer functions. A database of Australian soil hydraulic properties has been compiled. Pedotransfer functions for estimating water retention and saturated hydraulic conductivity from particle size and bulk density for Australian soil are presented. Different approaches for deriving hydraulic transfer functions are presented and compared. Published pedotransfer functions were also evaluated; generally they provide a satisfactory estimation of water retention and saturated hydraulic conductivity, depending on the spatial scale and accuracy of prediction. Several pedotransfer functions were developed in this study to predict water retention and hydraulic conductivity. The pedotransfer functions developed here may predict adequately over large areas, but for site-specific applications local calibration is needed. There is much uncertainty in the input data, and consequently the transfer functions can produce varied outputs, so uncertainty analysis is needed. A general approach to quantifying uncertainty is to use Monte Carlo methods. By sampling repeatedly from the assumed probability distributions of the input variables and evaluating the response of the model, the statistical distribution of the outputs can be estimated. A modified Latin hypercube method is presented for sampling joint multivariate probability distributions. This method is applied to quantify the uncertainties in pedotransfer functions of soil hydraulic properties. Hydraulic properties predicted using pedotransfer functions developed in this study are also used in a field soil-water model to analyze the uncertainties in the prediction of dynamic soil-water regimes. The use of the disc permeameter in the field conventionally requires the placement of a layer of sand to provide good contact between the soil surface and the disc supply membrane. The effect of sand on water infiltration into the soil and on the estimate of sorptivity was investigated through a numerical study and a field experiment on heavy clay. Placement of sand significantly increased the cumulative infiltration but made only small differences to the infiltration rate. Estimation of sorptivity based on Philip's two-term algebraic model using different methods was also examined. The field experiment revealed that the error in infiltration measurement was proportional to the cumulative infiltration. Infiltration without placement of sand was considerably smaller because of the poor contact between the disc and the soil surface.
An inverse method for predicting soil hydraulic parameters from disc permeameter data has been developed. A numerical study showed that the inverse method is quite robust in identifying the hydraulic parameters. However, application to field data showed that the estimated water retention curve is generally smaller than the one obtained from laboratory measurements. Nevertheless, the estimated near-saturated hydraulic conductivity matched the analytical solution quite well. The author believes that the inverse method can give a reasonable estimate of soil hydraulic parameters. Some experimental and theoretical problems were identified and discussed. A formal analysis was carried out to evaluate the efficiency of the different methods in predicting water retention and hydraulic conductivity. The analysis identified the contribution of individual sources of measurement error to the overall uncertainty. For single measurements, the inverse disc-permeameter analysis is economically more efficient than using pedotransfer functions or measuring hydraulic properties in the laboratory. However, given the large spatial variation of soil hydraulic properties, it is perhaps not surprising that many cheap and imprecise measurements, e.g. by hand texturing, are more efficient than a few expensive precise ones.
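A minimal sketch of the Monte Carlo approach described above, using Latin hypercube sampling to propagate input uncertainty through a pedotransfer function. The pedotransfer function and all distribution parameters here are invented placeholders, and the sketch samples independent marginals, omitting the joint multivariate handling of the thesis's modified method.

```python
# LHS-based uncertainty propagation: stratified uniform samples are
# mapped through inverse CDFs, then through a (made-up) pedotransfer
# function, and the output distribution is summarized.
import numpy as np
from scipy.stats import qmc, norm

def ptf_theta_fc(clay, bulk_density):
    # hypothetical pedotransfer function: water content at field capacity
    return 0.12 + 0.004 * clay - 0.05 * bulk_density

sampler = qmc.LatinHypercube(d=2, seed=0)
u = sampler.random(n=1000)                      # uniform [0,1)^2 LHS
clay = norm(loc=30.0, scale=5.0).ppf(u[:, 0])   # clay content (%)
bd = norm(loc=1.4, scale=0.1).ppf(u[:, 1])      # bulk density (g/cm^3)

theta = ptf_theta_fc(clay, bd)
lo, hi = np.percentile(theta, [5, 95])
print(f"theta_fc: mean {theta.mean():.3f}, 90% interval [{lo:.3f}, {hi:.3f}]")
```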
74

Osäkerhet vid översvämningskartering av vattendrag : En kunskapsöversikt och tillämpning på MIKE 11 / Uncertainty in flood inundation modeling of watercourses : A research overview and application to MIKE 11

Björkman, Elin January 2014 (has links)
På grund av osäkerheter i indata, parametrar och modellstruktur kan det finnas stora osäkerheter i översvämningskarteringar. Trots detta sker oftast ingen osäkerhetsanalys vid översvämningskarteringar i praktiken vilket gör att beslutsfattare och andra användare kan uppfatta resultaten som mer korrekta än vad de egentligen är. En orsak till att osäkerhetsanalys ännu inte blivit en vedertagen del i översvämningskarteringar kan vara att modellerare på konsultbyråer och myndigheter inte har tillräcklig kunskap om ämnet. Att tillgången på data kan vara begränsad underlättar inte heller vid osäkerhetsanalyser. Dessutom saknas exempel på hur osäkerheter kan analyseras i MIKE 11, vilket är en av de vanligaste modellerna som används vid översvämningskarteringar på konsultbyråer. Syftet med examensarbetet var tvåfaldigt. Det första var att ge en generell kunskapsöverblick över aktuell forskning om osäkerheter och osäkerhetsanalys vid översvämningskarteringar för att öka kunskapen hos konsulter och beslutsfattare. Det andra syftet var att med ett exempel visa hur osäkerheter kan uppskattas i en översvämningskartering skapad i MIKE 11 då det finns begränsad tillgång på data. En litteraturstudie visade att det ofta finns stora osäkerheter i flödesberäkningar och den geometriska beskrivningen och att det finns väldigt många sätt att analysera dessa på. Några av metoderna som används är Monte Carlo simuleringar, Oskarpa mängder, Scenarioanalys, Bayesiansk kalibrering och Generalized Likelihood Uncertainty Estimation, GLUE. En fallstudie gjordes där en hydraulisk modell av Kungsbackaån skapades med MIKE 11. Den metod som var praktiskt genomförbar att använda för att uppskatta osäkerheterna i detta arbete var scenarioanalys. Totalt utfördes 36 olika modellsimuleringar där kalibreringsflöde, Mannings tal och scenarioflöde varierades. Scenarioanalys ger inte någon exakt beräkning av osäkerheterna utan endast en subjektiv uppskattning. Resultatet av scenarioanalysen visade att då havsnivån i Kungsbackafjorden var 0,92 m skiljde de simulerade vattennivåerna som mest med 1,3 m för 100-årsflödet och med 0,41 m för beräknat högsta flöde, BHF. Även osäkerheterna i utbredningen för de två flödena undersöktes och visade sig vara som störst i flacka områden trots att osäkerheten i vattennivåerna var mindre där. / Due to uncertainty in data, parameters and model structure, there may be large uncertainties in flood inundation models. Despite this, uncertainty analysis is still rarely used by practitioners when creating flood maps. One reason why uncertainty analysis has not yet become customary in flood inundation modeling may be a lack of knowledge. Low availability of data can also make an uncertainty analysis more difficult. Moreover, no examples exist of how uncertainties can be analyzed in MIKE 11, which is one of the most common models used in flood mapping at consultant agencies. The aim of this study was twofold: firstly, to provide a general overview of current research on uncertainty and uncertainty analysis for flood inundation modeling, in order to increase knowledge among consultants and decision makers; and secondly, to give an example of how uncertainties can be estimated in a flood inundation model created in MIKE 11 when there is limited access to data. The research overview showed that there is often considerable uncertainty in the discharge calculations and geometrical description in hydraulic models, and that there are many different ways to analyze the uncertainties. Some methods that are often used are Monte Carlo simulations, fuzzy sets, scenario analysis, Bayesian calibration and Generalized Likelihood Uncertainty Estimation (GLUE). A case study was performed in which a hydraulic model of the River Kungsbackaån was built in MIKE 11. A scenario analysis was carried out to show the uncertainties in the hydraulic model. Overall, 36 different model runs were made, in which the calibration discharge, Manning's number and design flow were varied. Scenario analysis cannot provide a precise estimate of the uncertainty; it gives only a subjective estimate. The results of the scenario analysis showed that when the sea level in Kungsbackafjorden was 0.92 m, the simulated water levels differed by at most 1.3 m for the 100-year discharge and by 0.41 m for the calculated maximum flow. The flood extents of the two discharges were also investigated; the greatest uncertainty in extent was found in flat areas, even though the uncertainty in water levels was smaller there.
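The sketch below illustrates the scenario-analysis idea: run the model over a grid of plausible calibration discharges, Manning's numbers and design flows, and report the spread of simulated water levels. The 3 x 4 x 3 grid mirrors the 36 runs mentioned above, but the simulator is a crude placeholder, not MIKE 11, and all numbers are invented.

```python
# Scenario analysis as a full-factorial grid of model runs.
from itertools import product

calibration_flows = [55.0, 65.0, 75.0]    # m3/s (3 options)
mannings_M = [20.0, 25.0, 30.0, 35.0]     # Manning's M (4 options)
design_flows = [310.0, 390.0, 480.0]      # m3/s (3 options) -> 36 runs

def simulate_water_level(q_cal, M, q_design):
    # stand-in for a MIKE 11 run; crude stage-discharge-like relation
    return 0.3 * (q_design / (M * 2.0)) ** 0.6 + 0.001 * q_cal

levels = [simulate_water_level(qc, M, qd)
          for qc, M, qd in product(calibration_flows, mannings_M, design_flows)]
print(f"{len(levels)} runs; simulated water level range "
      f"{min(levels):.2f}-{max(levels):.2f} m")
```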
75

Exploração de espaços de parâmetros de modelos biológicos sob diferentes paradigmas estatísticos / Parameter space exploration of biological models under different statistical paradigms

Andre Chalom Machado de Oliveira 02 September 2015 (has links)
A formulação e o uso de modelos matemáticos complexos têm recebido grande atenção no estudo da ecologia nos últimos anos. Questões relacionadas à exploração de espaços de parâmetros destes modelos - executada de forma eficiente, sistemática e à prova de erros - são de grande importância para melhor compreender, avaliar a confiabilidade e interpretar o resultado destes modelos. Neste trabalho, apresentamos uma investigação de métodos existentes para responder as questões relevantes da área, com ênfase na técnica conhecida como Hipercubo Latino e com foco na análise quantitativa dos resultados, e realizamos a comparação entre resultados analíticos de incerteza e sensibilidade e resultados obtidos do Hipercubo. Ainda, examinamos a proposta de uma metodologia paralela baseada no paradigma estatístico da verossimilhança. O capítulo 1 introduz uma investigação a respeito dos conceitos históricos sobre a natureza da probabilidade, situando o conceito da verossimilhança como componente central da inferência estatística. O capítulo 2 (em inglês) traz uma revisão bibliográfica sobre o estado da arte em análises de incerteza e sensibilidade, apresentando dois exemplos de aplicação das técnicas descritas a problemas de crescimento populacional estruturado. O capítulo 3 examina a proposta de uma metodologia baseada na verossimilhança dos dados como uma abordagem integrativa entre a estimação de parâmetros e a análise de incerteza, apresentando resultados preliminares. Durante o progresso do presente trabalho, um pacote de funções na linguagem R foi desenvolvido para facilitar o emprego na prática das ferramentas teóricas expostas acima. Os apêndices deste texto trazem um tutorial e exemplos de uso deste pacote, pensado para ser ao mesmo tempo conveniente e de fácil extensão, e disponível livremente na internet, no endereço http://cran.r-project.org/web/packages/pse. / There is a growing trend in the use of mathematical modeling tools in the study of many areas of the biological sciences. The use of computer models in science is increasing, especially in fields where laboratory experiments are too complex or too costly, like ecology. Questions of efficient, systematic and error-proof exploration of parameter spaces are of great importance for better understanding these models, estimating confidence in them and making use of their output. We present a survey of the proposed methods to answer these questions, with emphasis on Latin hypercube sampling and a focus on quantitative analysis of the results. We also compare analytical results for sensitivity and uncertainty, where relevant, to LHS results. Finally, we examine the proposal of a methodology based on the likelihood statistical paradigm. Chapter 1 presents a brief investigation of historical views on the nature of probability, in order to situate the concept of likelihood as a central component of statistical inference. Chapter 2 (in English) reviews the state of the art in uncertainty and sensitivity analyses, with a practical example of applying the described techniques to two models of structured population growth. Chapter 3 examines the proposal of a likelihood-based approach as an integrative procedure between parameter value estimation and uncertainty analysis, with preliminary results. During the progress of this work, a package of R functions was developed to facilitate the practical use of the above theoretical tools. The appendices of this text provide a tutorial and examples of use of this package, which is freely available on the Internet at http://cran.r-project.org/web/packages/pse.
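A minimal sketch of the likelihood-based exploration discussed in Chapter 3: evaluate the log-likelihood of a simple model over a grid of parameter values and locate its maximum. The logistic model, the synthetic data and the Gaussian error assumption are all invented for illustration, and the sketch is plain Python rather than the pse R package.

```python
# Grid exploration of a log-likelihood surface for a toy logistic model.
import numpy as np

t = np.arange(10)
rng = np.random.default_rng(1)
obs = 50 / (1 + np.exp(-0.8 * (t - 5))) + rng.normal(0, 2, 10)  # fake data

def loglik(K, r, sigma=2.0):
    pred = K / (1 + np.exp(-r * (t - 5)))
    # Gaussian log-likelihood, up to an additive constant
    return -0.5 * np.sum(((obs - pred) / sigma) ** 2)

Ks = np.linspace(30, 70, 81)
rs = np.linspace(0.2, 1.6, 71)
surface = np.array([[loglik(K, r) for r in rs] for K in Ks])
i, j = np.unravel_index(surface.argmax(), surface.shape)
print(f"maximum log-likelihood at K={Ks[i]:.1f}, r={rs[j]:.2f}")
```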
76

Integração de análise de incertezas e ajuste de histórico de produção / Integration of uncertainty analysis and history matching process

Moura Filho, Marcos Antonio Bezerra de 12 August 2018 (has links)
Advisors: Denis Jose Schiozer, Celio Maschio / Master's dissertation - Universidade Estadual de Campinas, Faculdade de Engenharia Mecanica e Instituto de Geociencias / Resumo: O processo de ajuste de histórico tradicional normalmente resulta em um único modelo determinístico que é utilizado para representar o reservatório, o que pode não ser suficiente para garantir previsões de produção confiáveis, principalmente para campos em início de produção. Este trabalho apresenta uma análise quantitativa das incertezas dos atributos de reservatório integrada com o processo de ajuste de histórico. Ao invés de ser utilizada uma abordagem determinística, aborda-se uma análise probabilística dos modelos de reservatório resultando em faixas de incerteza de previsão de produção e possibilitando uma melhor visão do comportamento futuro de reservatórios. Na metodologia utilizada neste trabalho, dados de simulação são comparados com dados de produção observados e, de acordo com os afastamentos em relação ao histórico de produção, há uma mudança das probabilidades de ocorrência dos cenários. Em alguns procedimentos propostos, há alterações também nos valores dos atributos incertos, diminuindo sua faixa de incerteza. O maior desafio deste trabalho consiste em determinar uma maneira consistente e confiável para promover a integração da análise de incertezas e ajuste de histórico, aumentando a confiabilidade na previsão de comportamento de reservatórios de petróleo e que seja possível de ser automatizada, facilitando o trabalho e acelerando o processo. Foram testados vários critérios até se alcançar a validação da metodologia proposta. Após a análise dos resultados obtidos, sugere-se uma seqüência de aplicação dos métodos de redução de incerteza propostos na metodologia. A principal contribuição desta metodologia é aumentar a confiabilidade na previsão de comportamento de reservatórios através de simulação numérica e mostrar a necessidade de incorporar incertezas ao processo de ajuste de histórico de produção. Uma outra contribuição deste trabalho é iniciar essa linha de pesquisa propondo e validando alguns métodos para integrar os processos de ajuste e análise de incertezas. / Abstract: The traditional history matching process usually results in a single deterministic model that is used to represent the reservoir, which may not be enough to guarantee reliable production forecasts, mainly for fields in early production stages. This work presents a quantitative uncertainty analysis of reservoir attributes integrated with the history matching process. Instead of a deterministic approach, a probabilistic analysis of the reservoir models is used, resulting in uncertainty ranges for the production forecast and allowing a better prediction of reservoir performance. In the methodology used in this work, simulation data are compared to observed production data and, according to the difference between those data, the probabilities of the scenarios are changed. In some procedures, the probability distributions of the reservoir attributes also change, diminishing their uncertainty ranges.
The main challenges of this work are: (1) the determination of a consistent and reliable procedure for integrating the uncertainty analysis and the history matching process, increasing the reliability of the reservoir performance forecast; and (2) the development of an automatic procedure, making the work easier and speeding up the process. The main contribution of this work is to increase the reliability of production predictions made with reservoir simulation models and to show the necessity of incorporating uncertainties in the history matching. Another contribution is to start this line of research by proposing and validating some methods to integrate the history matching process and the uncertainty analysis.
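The core of the integration described above can be sketched as a Bayesian-style reweighting: each reservoir scenario's probability is updated according to its misfit against the observed production history. The scenarios, rates and error level below are invented, and the Gaussian likelihood is just one simple choice among possible misfit criteria.

```python
# Update scenario probabilities from misfit to observed production.
import numpy as np

observed = np.array([980.0, 940.0, 900.0, 870.0])   # observed rates
scenarios = {                                       # simulated rates
    "low_perm":  np.array([900.0, 850.0, 800.0, 760.0]),
    "base":      np.array([975.0, 945.0, 905.0, 865.0]),
    "high_perm": np.array([1050.0, 1020.0, 990.0, 950.0]),
}
prior = {name: 1.0 / len(scenarios) for name in scenarios}
sigma = 30.0                                        # assumed data error

posterior = {}
for name, sim in scenarios.items():
    misfit = np.sum(((sim - observed) / sigma) ** 2)
    posterior[name] = prior[name] * np.exp(-0.5 * misfit)  # Gaussian likelihood
total = sum(posterior.values())
for name in posterior:
    posterior[name] /= total                        # normalize to sum to 1
    print(f"{name}: {posterior[name]:.3f}")
```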
77

Improving Seasonal Rainfall and Streamflow Forecasting in the Sahel Region via Better Predictor Selection, Uncertainty Quantification and Forecast Economic Value Assessment

Sittichok, Ketvara January 2016 (has links)
The Sahel region, located in Western Africa, is well known for its high rainfall variability. Severe and recurring droughts plagued the region during the last three decades of the 20th century, while heavy precipitation events (with return periods of up to 1,200 years) were reported between 2007 and 2014. Vulnerability to extreme events is partly due to the fact that people are not prepared to cope with them. It would be of great benefit to farmers if information about the magnitudes of precipitation and streamflow in the upcoming rainy season were available a few months in advance; they could then switch to better-adapted crops and farm management systems if required. Such information would also be useful for other sectors of the economy, such as hydropower production, domestic/industrial water consumption, fishing and navigation. A logical solution to the above problem is seasonal rainfall and streamflow forecasting, which generates knowledge about the upcoming rainy season from information available before its beginning. The research in this thesis sought to improve seasonal rainfall and streamflow forecasting in the Sahel by developing statistical rainfall and streamflow seasonal forecasting models. Sea surface temperatures (SSTs) were used as the pool of predictors. The developed method allowed a systematic search for the best period over which to compute each predictor before it was used to predict average rainfall or streamflow over the upcoming rainy season. Eight statistical models, built on various statistical methods including linear and polynomial regression, were developed in this study. Two main approaches for seasonal streamflow forecasting were developed: 1) a two-step streamflow forecasting approach (called the indirect method), which first linked the average SST over a period prior to the date of forecast to the average rainfall amount in the upcoming rainy season using the eight statistical models, and then linked the rainfall amount to streamflow using a rainfall-runoff model (the Soil and Water Assessment Tool, SWAT). In this approach, the forecasted rainfall was disaggregated to a daily time step using a simple approach (the fragment method) before being fed into SWAT. 2) a one-step streamflow forecasting approach (called the direct method), which linked the average SST over a period prior to the date of forecast to the average streamflow in the upcoming rainy season using the eight statistical models. To decrease the uncertainty due to model selection, Bayesian Model Averaging (BMA) was also applied. This method can combine all available potential predictors (instead of selecting one based on an arbitrary criterion). BMA also produces the probability density of the forecast, which allows end-users to visualize the density of expected values and assess the level of uncertainty of the generated forecast. Finally, the economic value of the forecast system was estimated using a simple economic approach (the cost/loss ratio method). Each developed method was evaluated using three well-known model efficiency criteria: the Nash-Sutcliffe coefficient (Ef), the coefficient of determination (R2) and the Hit score (H). The proposed models showed equivalent or better rainfall forecasting skill than most research conducted in the Sahel region. The linear model driven by the Pacific SST produced the best rainfall forecasts (Ef = 0.82, R2 = 0.83, and H = 82%) at a lead time of up to 12 months.
The rainfall forecasting model based on polynomial regression and forced by the Atlantic Ocean SST can be used with a lead time of up to 5 months and had slightly lower performance (Ef = 0.80, R2 = 0.81, and H = 82%). Despite the fact that the natural relationship between rainfall and SST is nonlinear, this study found that good results can be achieved using linear models. For streamflow forecasting, the direct method using polynomial regression performed slightly better than the indirect method (Ef = 0.74, R2 = 0.76, and H = 84% for the direct method; Ef = 0.70, R2 = 0.69, and H = 77% for the indirect method). The direct method was driven by the Pacific SST and had a five-month lead time; the indirect method was driven by the Atlantic SST and had a six-month lead time. No significant difference in performance was found between BMA and the linear regression models based on a single predictor for streamflow forecasting. However, BMA was able to provide a probabilistic forecast that accounts for model selection uncertainty, while the linear regression model had a longer lead time. The economic value of forecasts developed using the direct and indirect methods was estimated using the cost/loss ratio method. It was found that the direct method had a better value than the indirect method, and the value of the forecast declined with higher return periods for all methods. Results also showed that, for the particular watershed under investigation, the direct method provided better information for flood protection. This research has demonstrated the possibility of decent seasonal streamflow forecasting in the Sirba watershed, using the tropical Pacific and Atlantic SSTs as predictors. The findings of this study can be used to improve the performance of seasonal streamflow forecasting in the Sahel. A package implementing the statistical models developed in this study was produced so that end users can apply them for seasonal rainfall or streamflow forecasting in any region of interest, using any predictor they may want to try.
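The cost/loss framework mentioned above can be sketched in a few lines: compare the expected expense of acting on the forecast with the expenses of acting on climatology alone and on a perfect forecast. The contingency counts, cost, loss and event frequency below are all hypothetical, not values from the thesis.

```python
# Relative economic value of a forecast under the cost/loss model.
C, L = 1.0, 10.0                 # protection cost vs. loss if unprotected
p_event = 0.2                    # climatological event frequency
hits, misses, false_alarms, n = 16, 4, 10, 100   # hypothetical counts

# Expense when acting on the forecast: pay C whenever protection is
# triggered, pay L for every missed event.
expense_forecast = (hits + false_alarms) / n * C + misses / n * L
expense_climate = min(C, p_event * L)  # always/never protect, cheaper option
expense_perfect = p_event * C          # protect only when the event occurs

value = (expense_climate - expense_forecast) / (expense_climate - expense_perfect)
print(f"relative economic value: {value:.2f}")  # 1 = perfect, <= 0 = worthless
```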
78

An efficient analysis of Pareto optimal solutions in multidisciplinary design

Erfani, Tohid January 2011 (has links)
Optimisation is one of the most important and challenging parts of any engineering design. In real-world design problems one faces multiobjective optimisation under constraints. The optimal solution in these cases is not unique because the objectives can contradict each other. In such cases, a set of optimal solutions, which forms a Pareto frontier in the objective space, is considered. There are many algorithms to generate the Pareto frontier. However, only a few of them are capable of providing an evenly distributed set of solutions. Such a property is especially important in real-life design because a decision maker is usually able to analyse only a very limited number of solutions. This thesis consists of two main parts. First, it develops and gives a detailed description of two different algorithms that are able to generate an evenly distributed Pareto set in a general formulation. One, the Directed Search Domain (DSD) method, is a classical approach; the other, the cylindrical constraint evolutionary algorithm (CCEA), is a hybrid population-based method. The efficiency of the algorithms is demonstrated by a number of challenging test cases and by comparison with the results of other existing methods. It is shown that the proposed methods succeed in generating Pareto solutions even when some existing methods fail. In real-world design problems, deterministic approaches cannot provide a reliable solution because, in the event of uncertainty, the deterministic optimal solution would be infeasible in many instances. Therefore a solution less sensitive to problem perturbation is desirable. This leads to the robust solution, which is the focus of the second part of the thesis. In the literature, there are some techniques tailored for robust optimisation. However, most of them are either computationally expensive or do not systematically articulate the designer's preferences into a robust solution. In this thesis, by introducing a measure of robustness in the multiobjective context, a tunable robust function (TRF) is presented. It is demonstrated that, by including the TRF in the problem formulation, the desired robust solution based on designer preferences can be obtained. This not only provides the robust solution but also gives control over the robustness level. The method is efficient as it increases the dimension of the problem by only one, irrespective of the dimension of the original problem.
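For contrast with the methods above, here is a minimal sketch of the classical epsilon-constraint way of tracing a Pareto frontier on a toy biobjective problem: minimise one objective subject to a swept bound on the other. This baseline tends not to give an evenly distributed front, which is precisely the shortcoming the DSD and CCEA methods address; the problem and numbers are invented.

```python
# Epsilon-constraint Pareto front generation on a toy biobjective problem.
import numpy as np
from scipy.optimize import minimize

f1 = lambda x: x[0] ** 2 + x[1] ** 2                    # objective 1
f2 = lambda x: (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2    # objective 2

front = []
for eps in np.linspace(0.1, 4.9, 15):   # sweep the bound on f1
    res = minimize(f2, x0=[1.0, 0.5],
                   constraints=[{"type": "ineq",
                                 "fun": lambda x, e=eps: e - f1(x)}])
    if res.success:
        front.append((f1(res.x), f2(res.x)))

for a, b in front:
    print(f"f1={a:.2f}  f2={b:.2f}")
```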
79

Parameterschätzung und Modellevaluation für komplexe Systeme / Parameter estimation and model evaluation for complex systems

Schumann-Bischoff, Jan 06 April 2016 (has links)
No description available.
80

Water and Carbon Balance Modeling: Methods of Uncertainty Analysis

Juston, John January 2010 (has links)
How do additional data of the same and/or different type contribute to reducing model parameter and predictive uncertainties? This was the question addressed with two models – the HBV hydrological water balance model and the ICBM soil carbon balance model – that were used to investigate the usefulness of the Generalized Likelihood Uncertainty Estimation (GLUE) method for calibration and uncertainty analysis. The GLUE method is based on threshold screening of Monte Carlo simulations using so-called informal likelihood measures and a subjective acceptance criterion. This method is highly appropriate for model calibrations when errors are dominated by epistemic rather than stochastic uncertainties. The informative value of data for model calibrations was investigated with numerous calibrations aimed at conditioning posterior parameter distributions and boundaries on model predictions. The key results demonstrated examples of: 1) redundant information in daily time series of hydrological data; 2) diminishing returns in the value of continued time series data collection of the same type; 3) the potential value of additional data of a different type; 4) a means to effectively incorporate fuzzy information in model calibrations; and 5) the robustness of estimated parameter uncertainty for portability of a soil carbon model between temperate and tropical climate zones. The key to obtaining these insights lay in the methods of uncertainty analysis used to produce them. A paradigm for selecting between formal and informal likelihood measures in uncertainty analysis is presented and discussed for future use within the context of climate-related environmental modeling.
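The GLUE procedure summarized above fits in a short sketch: draw Monte Carlo parameter samples, score each run with an informal likelihood, keep the runs above a subjective behavioural threshold, and form likelihood-weighted prediction bounds. The recession model, observations and threshold below are invented for illustration.

```python
# GLUE sketch: Monte Carlo sampling, informal likelihood (Nash-Sutcliffe),
# threshold screening, and weighted predictive bounds.
import numpy as np

rng = np.random.default_rng(42)
obs = np.array([3.1, 2.4, 1.9, 1.5, 1.2])          # observed recession flows

def model(k, q0=4.0, n=5):
    return q0 * np.exp(-k * np.arange(1, n + 1))   # linear-reservoir recession

k_samples = rng.uniform(0.05, 1.0, 5000)           # Monte Carlo parameter draws
sims = np.array([model(k) for k in k_samples])
ns_eff = 1 - np.sum((sims - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)

behavioural = ns_eff > 0.7                         # subjective acceptance threshold
weights = ns_eff[behavioural] / ns_eff[behavioural].sum()  # informal likelihood weights

# Weighted 5-95% bounds on the prediction at the last time step
order = np.argsort(sims[behavioural, -1])
cdf = np.cumsum(weights[order])
lo = sims[behavioural, -1][order][np.searchsorted(cdf, 0.05)]
hi = sims[behavioural, -1][order][np.searchsorted(cdf, 0.95)]
print(f"{behavioural.sum()} behavioural runs; 90% bounds [{lo:.2f}, {hi:.2f}]")
```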
