391

THE EFFECTS OF UNCERTAINTY ON ACTIVITY AND MONETARY POLICY IN BRAZIL

RICARDO DE MENEZES BARBOZA 02 March 2018 (has links)
This work has a dual purpose. First, we investigate the effect of uncertainty on economic activity in Brazil. To do so, we construct several proxies that seek to capture the level of uncertainty prevailing in Brazil (domestic uncertainty) and in several of its major trading partners (external uncertainty). We then estimate structural vector autoregressive (SVAR) models, in line with Baker, Bloom and Davis (2016). The results suggest that uncertainty has relevant contractionary effects on the Brazilian economy. Second, we study the effect of uncertainty on the effectiveness of monetary policy in Brazil. To that end, we build several interacted vector autoregressive (IVAR) models, as proposed by Aastveit, Natvik and Sola (2013), estimated by Adaptive LASSO. Our estimates do not corroborate the hypothesis that under high uncertainty the effects of monetary policy on activity are weaker than under low uncertainty. This result, however, is not robust.
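As a hedged illustration only (not the author's actual specification), the sketch below fits a reduced-form VAR on a synthetic uncertainty proxy and an activity series with statsmodels and traces the response of activity to an uncertainty shock; the variable names, lag order, and recursive (Cholesky) ordering are assumptions.

```python
# Minimal sketch of a VAR-based uncertainty analysis (illustrative only).
# Assumptions: synthetic data, lag length chosen by AIC, Cholesky ordering with uncertainty first.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n = 200
uncertainty = rng.normal(size=n).cumsum() * 0.1                   # stand-in uncertainty proxy
activity = -0.3 * np.roll(uncertainty, 1) + rng.normal(size=n)    # activity reacts with a lag
data = pd.DataFrame({"uncertainty": uncertainty, "activity": activity})

model = VAR(data)
results = model.fit(maxlags=4, ic="aic")   # reduced-form VAR
irf = results.irf(12)                      # impulse responses, 12 periods ahead

# Response of activity to a one-s.d. uncertainty shock (orthogonalized, order as listed)
resp = data.columns.get_loc("activity")
shock = data.columns.get_loc("uncertainty")
print(irf.orth_irfs[:, resp, shock])
```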
392

Development of an Uncertainty Quantification methodology for Multi-Physics Best Estimate analysis and application to the Rod Ejection Accident in a Pressurized Water Reactor

Delipei, Gregory 04 October 2019 (has links)
The computational advancements of the last decades have led to the development of numerical codes for simulating reactor physics with increasing predictivity, allowing the behavior of a nuclear reactor to be modeled under both normal and accidental conditions. An uncertainty analysis framework consistent with Best Estimate (BE) codes was developed in order to take into account the different sources of uncertainty. This framework is called Best Estimate Plus Uncertainties (BEPU) and is currently a field of growing research internationally. In this thesis we study multi-physics uncertainty quantification for the Rod Ejection Accident (REA) in Pressurized Water Reactors (PWR). The BE modeling available at CEA couples APOLLO3 (neutronics) and FLICA4 (thermal-hydraulics and fuel thermics) within the SALOME/CORPUS environment. In the first part of the thesis, we explore different statistical tools available in the scientific literature, including dimension reduction, global sensitivity analysis, surrogate modeling and design of experiments, and use them to develop an uncertainty quantification methodology.
In the second part of the thesis, we improve the BE modeling in terms of its uncertainty representation. A Best Effort coupling scheme for REA analysis is available at CEA; it includes the ALCYONE V1.4 code for detailed modeling of fuel thermomechanical behavior. However, such modeling significantly increases the computational cost of a REA transient, making uncertainty analysis prohibitive. For this purpose, we develop a methodology for calibrating a simplified analytic gap heat transfer model using decoupled ALCYONE V1.4 REA calculations. The calibrated model is finally used to improve the previous BE modeling. Both methodologies are tested initially on a small-scale core representative of a PWR and then applied to a full-scale PWR core.
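As a loose illustration of the kind of surrogate-based uncertainty quantification workflow mentioned above (not the thesis methodology itself), the sketch below draws a Latin hypercube design over two uncertain inputs, fits a Gaussian-process surrogate to a placeholder model, and propagates a larger Monte Carlo sample through the surrogate; the input ranges, placeholder model, and kernel are assumptions.

```python
# Hedged sketch: design of experiments + surrogate + Monte Carlo propagation.
# The "model" below is a cheap placeholder, not a reactor-physics code.
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def model(x):
    # Placeholder for an expensive multi-physics calculation.
    return np.sin(3.0 * x[:, 0]) + 0.5 * x[:, 1] ** 2

# 1. Design of experiments: Latin hypercube over the assumed input ranges.
sampler = qmc.LatinHypercube(d=2, seed=1)
x_train = qmc.scale(sampler.random(n=40), l_bounds=[0.0, -1.0], u_bounds=[1.0, 1.0])
y_train = model(x_train)

# 2. Surrogate fitted to the few "expensive" runs.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
gp.fit(x_train, y_train)

# 3. Cheap Monte Carlo propagation through the surrogate.
rng = np.random.default_rng(2)
x_mc = np.column_stack([rng.uniform(0.0, 1.0, 10_000), rng.uniform(-1.0, 1.0, 10_000)])
y_mc = gp.predict(x_mc)
print("mean:", y_mc.mean(), "95th percentile:", np.percentile(y_mc, 95))
```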
393

Expressing uncertainty of efficiency value

Kňava, Miroslav January 2010 (has links)
The aim of this work is to analyze the methodology for examining steam generator heat transfer tubes by the eddy current method at EDU, to create a model for estimating the combined standard uncertainty of the efficiency value, to quantify the individual uncertainty components, to analyse the influence of the different components on the expressed uncertainty, and to produce a final estimate of the combined standard and expanded uncertainty of the efficiency value. A minor objective is to evaluate the suitability of this method.
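For context only, the following sketch shows a standard GUM-style combination of uncertainty components into a combined standard uncertainty and an expanded uncertainty; the component values and sensitivity coefficients are made-up placeholders, not values from the thesis.

```python
# Hedged sketch of combined and expanded uncertainty (GUM-style propagation).
# Component uncertainties and sensitivity coefficients below are placeholders.
import math

components = [
    # (standard uncertainty u_i, sensitivity coefficient c_i)
    (0.012, 1.0),   # e.g. repeatability (Type A)
    (0.008, 0.7),   # e.g. probe positioning (Type B)
    (0.005, 1.3),   # e.g. calibration standard (Type B)
]

# Combined standard uncertainty: u_c = sqrt(sum_i (c_i * u_i)^2)
u_c = math.sqrt(sum((c * u) ** 2 for u, c in components))

# Expanded uncertainty with coverage factor k = 2 (approx. 95 % coverage)
k = 2.0
U = k * u_c
print(f"u_c = {u_c:.4f}, U (k=2) = {U:.4f}")
```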
394

The differentiation between variability uncertainty and knowledge uncertainty in life cycle assessment: A product carbon footprint of bath powder “Blaue Traube”

Budzinski, Maik January 2012 (has links)
The following thesis deals with methods to increase the reliability of results in life cycle assessment (LCA). The paper is divided into two parts. The first part points out the typologies and sources of uncertainty in LCA and summarises the existing methods for dealing with them; the methods are critically discussed and their pros and cons contrasted. In the second part a case study is carried out, calculating the carbon footprint of a cosmetic product of Li-iL GmbH. The whole life cycle of the powder bath Blaue Traube is analysed. To increase the reliability of the result, a procedure derived from the first part is applied. Recommendations to enhance the product's sustainability are then given to the decision-makers of the company. Finally, the applied procedure for dealing with uncertainty in LCAs is evaluated. The aims of the thesis are to contribute to the understanding of uncertainty in life cycle assessment and to deal with it in a more consistent manner. The carbon footprint of the powder bath is also to be based on appropriate assumptions and to consider the uncertainties that occur. Based on the problems discussed, a method is introduced to avoid the problematic merging of variability uncertainty and data uncertainty when generating probability distributions. The introduced uncertainty importance analysis allows a consistent differentiation of these types of uncertainty and, furthermore, an assessment of the data used in LCA studies. The method is applied in a PCF study of the bath powder Blaue Traube of Li-iL GmbH, with the analysis carried out over the whole life cycle (cradle-to-grave) as well as cradle-to-gate. The study gives the company a practical example of determining the carbon footprint of products and, in addition, meets the requirements of the ISO guidelines for publishing the study and comparing it with other products. Within the PCF study the introduced method allows a differentiation of variability uncertainty and knowledge uncertainty. The included uncertainty importance analysis supports the assessment of each aggregated unit process within the analysed product system and can finally provide a basis for collecting additional, more reliable data for critical or uncertain processes.
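The thesis's uncertainty importance analysis is not reproduced here; as a hedged, generic sketch of the underlying idea, the code below ranks the contribution of each unit process to the variance of a carbon footprint by Monte Carlo sampling, carrying the per-process uncertainty type along as a label. Process names, distributions and values are invented.

```python
# Hedged sketch: variance-based contribution of unit processes to a carbon footprint.
# Process names, emission factors and uncertainty types are invented placeholders.
import numpy as np

rng = np.random.default_rng(42)
n = 50_000

# (name, mean kg CO2e, relative std, uncertainty type)
processes = [
    ("raw materials", 0.80, 0.15, "variability"),
    ("packaging",     0.30, 0.25, "knowledge"),
    ("transport",     0.20, 0.10, "variability"),
    ("use phase",     1.10, 0.30, "knowledge"),
]

samples = {name: rng.normal(mu, mu * rel, n) for name, mu, rel, _ in processes}
total = sum(samples.values())

print(f"carbon footprint: {total.mean():.2f} +/- {total.std():.2f} kg CO2e")
for name, _, _, utype in processes:
    share = samples[name].var() / total.var()   # variance share, independence assumed
    print(f"{name:15s} {utype:12s} variance share: {share:5.1%}")
```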
395

Evaluation of Uncertainties of River Flow Measurements with a Propeller Current-Meter

Niemiec, Łukasz January 2013 (has links)
The development of hydrology and the need for accurate data increase the demands on hydrometric measurement. Measurement of point velocities and derivation of river flow are frequently performed by staff of the Český hydrometeorologický ústav (Czech Hydrometeorological Institute) and other organisations. The limited precision of the instruments, the imperfection of the methods and the influence of the human observer mean that the exact value of a quantity cannot be obtained; we can only come close to the true value. The uncertainty of measurement defines the interval, assigned to each measurement, that contains the true value of the measured quantity. Twenty immediately repeated measurements were made in twelve measuring profiles of the Dyje River and in one profile of the Morava River using OTT (OTT Hydromet, 2013) current meters of type C2 and type C31. We investigated how Type A and Type B uncertainties depend on different profile parameters. The measurements were made using a suspension bar carrying 2 to 4 propeller current-meters; in deeper water, a hydrometric car on the bridge was used. The results were analysed and generalised with respect to the uncertainties. A further aim of the research was to find out how the frequency of the current meter depends on water temperature. For this purpose a dedicated channel was built at the Faculty of Civil Engineering in Brno (Ústav vodního hospodářství krajiny) and measurements were made over a temperature range from 1 °C to 24 °C; the repeated measurements were statistically evaluated with respect to their uncertainties. Finally, the thesis proposes how the uncertainties determined from both types of measurement can be incorporated into current methods for determining the uncertainty of flows derived from point-velocity measurements with propeller current-meters in river measuring profiles.
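As a generic, hedged illustration of the Type A and Type B evaluation mentioned above (not the thesis's actual data or procedure), the sketch below computes the Type A uncertainty as the standard deviation of the mean of repeated velocity readings and a Type B component from an assumed instrument resolution, then combines them.

```python
# Hedged sketch: Type A and Type B uncertainty of repeated velocity measurements.
# The readings and the instrument resolution below are invented placeholders.
import numpy as np

readings = np.array([0.512, 0.507, 0.515, 0.509, 0.511,
                     0.508, 0.514, 0.510, 0.506, 0.513])  # point velocities in m/s

# Type A: standard uncertainty of the mean from repeated observations
u_a = readings.std(ddof=1) / np.sqrt(readings.size)

# Type B: assumed instrument resolution of 0.005 m/s, rectangular distribution
resolution = 0.005
u_b = (resolution / 2) / np.sqrt(3)

u_c = np.sqrt(u_a ** 2 + u_b ** 2)     # combined standard uncertainty
U = 2 * u_c                            # expanded uncertainty, coverage factor k = 2
print(f"mean = {readings.mean():.4f} m/s, u_A = {u_a:.4f}, u_B = {u_b:.4f}, U = {U:.4f} m/s")
```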
396

Uncertainty visualization of ensemble simulations

Sanyal, Jibonananda 09 December 2011 (has links)
Ensemble simulation is a commonly used technique in operational forecasting of weather and floods. Multi-member ensemble output is usually large, multivariate, and challenging to interpret interactively. Forecast meteorologists and hydrologists are interested in understanding the uncertainties associated with the simulation, specifically the variability between the ensemble members. The visualization of ensemble members is currently accomplished through spaghetti plots or hydrographs. To improve visualization techniques and tools for forecasters, we conducted a user study to evaluate the effectiveness of existing uncertainty visualization techniques on 1D and 2D synthetic datasets. We designed an uncertainty evaluation framework to enable easier design of such studies for scientific visualization. The techniques evaluated are error bars, scaled size of glyphs, color-mapping on glyphs, and color-mapping of uncertainty on the data surface. Although we did not find a consistent order among the four techniques for all tasks, we found that the efficiency of the techniques depended highly on the tasks being performed. Error bars consistently underperformed throughout the experiment. Scaling the size of glyphs and color-mapping of the surface performed reasonably well. With results from the user study, we iteratively developed a tool named ‘Noodles’ to interactively explore the ensemble uncertainty in weather simulations. Uncertainty was quantified using standard deviation, inter-quartile range, width of the 95% confidence interval, and by bootstrapping the data. A coordinated view of ribbon and glyph-based uncertainty visualization, spaghetti plots, and data transect plots was provided to two meteorologists for expert evaluation. They found it useful in assessing uncertainty in the data, especially in finding outliers and avoiding the parametrizations leading to these outliers. Additionally, they could identify spatial regions with high uncertainty, thereby determining poorly simulated storm environments and deriving a physical interpretation of these model issues. We also describe uncertainty visualization capabilities developed for a tool named ‘FloodViz’ for visualization and analysis of flood simulation ensembles. Simple member and trend plots and composited inundation maps with uncertainty are described along with different types of glyph-based uncertainty representations. We also provide feedback from a hydrologist using various features of the tool from an operational perspective.
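The abstract lists the uncertainty measures used (standard deviation, inter-quartile range, width of the 95% confidence interval, bootstrapping). As a hedged sketch with placeholder data rather than real model output, the code below computes these per grid point for a synthetic ensemble.

```python
# Hedged sketch: per-grid-point ensemble uncertainty metrics of the kind listed above.
# The "ensemble" is random placeholder data, not actual weather model output.
import numpy as np

rng = np.random.default_rng(7)
members, ny, nx = 20, 50, 50
ensemble = rng.normal(loc=280.0, scale=2.0, size=(members, ny, nx))  # e.g. a temperature field

std = ensemble.std(axis=0, ddof=1)                                   # standard deviation
iqr = np.percentile(ensemble, 75, axis=0) - np.percentile(ensemble, 25, axis=0)

# Width of the 95% confidence interval of the ensemble mean (normal approximation)
ci_width = 2 * 1.96 * std / np.sqrt(members)

# Bootstrap estimate of the spread of the ensemble mean at each grid point
boot = np.empty((1000, ny, nx))
for b in range(1000):
    idx = rng.integers(0, members, members)        # resample members with replacement
    boot[b] = ensemble[idx].mean(axis=0)
boot_spread = boot.std(axis=0)

print(std.mean(), iqr.mean(), ci_width.mean(), boot_spread.mean())
```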
397

Quantification of the Effects of Soil Uncertainties on Nonlinear Site Response Analysis: Brute Force Monte Carlo Approach

Eshun, Kow Okyere 28 May 2013 (has links)
No description available.
398

A Statistical Framework for Distinguishing Between Aleatory and Epistemic Uncertainties in the Best-Estimate Plus Uncertainty (BEPU) Nuclear Safety Analyses

Pun-Quach, Dan 11 1900 (has links)
In 1988, the US Nuclear Regulatory Commission approved an amendment that allowed the use of best-estimate methods. This led to increased development and application of Best Estimate Plus Uncertainty (BEPU) safety analyses. However, a greater burden was placed on the licensee to justify all uncertainty estimates. A review of the current state of BEPU methods indicates a number of significant criticisms, which keep BEPU methods from reaching their full potential as a comprehensive licensing basis. The most significant criticism relates to the lack of a formal framework for distinguishing between aleatory and epistemic uncertainties. This has led to a prevalent belief that such separation of uncertainties is a matter of convenience rather than of necessity. In this thesis, we address the above concerns by developing a statistically rigorous framework to characterize the different uncertainty types. This framework is grounded on the philosophical concepts of knowledge. Considering the Plato problem, we explore the use of probability as a means to gain knowledge, which allows us to relate the inherent distinctness in knowledge to the different uncertainty types for any complex physical system. The framework is demonstrated using nuclear analysis problems, and we show through the use of structural models that the separation of these uncertainties leads to more accurate tolerance limits relative to existing BEPU methods. In existing BEPU methods, where such a distinction is not applied, the total uncertainty is essentially treated as aleatory uncertainty, so the resulting estimated percentile is much larger than the actual (true) percentile of the system's response. Our results support the premise that the separation of these two distinct uncertainty types is necessary and leads to more accurate estimates of the reactor safety margins. / Thesis / Doctor of Philosophy (PhD)
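To make the aleatory/epistemic distinction concrete, here is a hedged toy sketch (not the thesis's statistical framework): a two-loop Monte Carlo in which an epistemic parameter is fixed in the outer loop and only aleatory variability is sampled in the inner loop, compared against lumping both together into a single loop. The toy response function and distributions are assumptions.

```python
# Hedged toy sketch: separating epistemic and aleatory uncertainty vs. lumping them.
# The response function and the distributions are invented for illustration.
import numpy as np

rng = np.random.default_rng(3)

def response(theta, x):
    # Toy system response; theta is an epistemic parameter, x an aleatory variable.
    return theta * x + 0.1 * x ** 2

# Lumped (single loop): both uncertainties sampled together, one overall 95th percentile.
theta_l = rng.uniform(0.8, 1.2, 100_000)        # epistemic, treated as if aleatory
x_l = rng.normal(1.0, 0.2, 100_000)             # aleatory
lumped_p95 = np.percentile(response(theta_l, x_l), 95)

# Separated (two loops): per epistemic sample, the 95th percentile over aleatory variability.
p95_per_theta = []
for theta in rng.uniform(0.8, 1.2, 200):        # outer, epistemic loop
    x = rng.normal(1.0, 0.2, 5_000)             # inner, aleatory loop
    p95_per_theta.append(np.percentile(response(theta, x), 95))

print("lumped 95th percentile:        ", round(lumped_p95, 3))
print("separated: range of 95th pct.: ",
      round(min(p95_per_theta), 3), "to", round(max(p95_per_theta), 3))
```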
399

Application of Modular Uncertainty Techniques to Engineering Systems

Long, William C 04 May 2018 (has links)
Uncertainty analysis is crucial to any thorough analysis of an engineering system. Traditional uncertainty analysis can be a tedious task involving numerous steps that are error prone if conducted by hand; if conducted with the aid of a computer, the same tasks can be computationally expensive. In either case, the process is quite rigid: if a parameter of the system is modified or the system configuration is changed, the entire uncertainty analysis must be conducted again, giving more opportunities for calculation errors or additional computation time. Modular uncertainty analysis provides a method to overcome these obstacles of traditional uncertainty analysis. The modular technique is well suited for computation by a computer, which makes the process largely automatic after the initial setup and reduces computation errors. The technique implements matrix operations to conduct the analysis, which in turn makes the process more efficient than traditional methods because computers are well suited for matrix operations. Since the modular technique is built on matrix operations, it also adapts readily to modifications of system parameters or configuration, and it lends itself to quickly calculating other uncertainty analysis quantities such as the uncertainty magnification factor and the uncertainty percent contribution. This dissertation focuses on the modular technique, its extension in the form of the uncertainty magnification factor and uncertainty percent contribution, and its application to different types of energy systems. The modular technique is applied to an internal combustion engine with a bottoming organic Rankine cycle system, a combined heat and power system, and a heating, ventilation, and air conditioning system. The results show that the modular technique is well suited to evaluating complex engineering systems and performs well when system parameters or configurations are modified.
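The abstract refers to matrix-based uncertainty propagation and to the uncertainty magnification factor (UMF) and uncertainty percent contribution (UPC). The sketch below is not the dissertation's formulation, only a hedged illustration of the general idea: propagate input uncertainties through a numerically estimated sensitivity (Jacobian) row and report UMF and UPC for a made-up two-input result.

```python
# Hedged sketch: Jacobian-based uncertainty propagation with UMF and UPC.
# The result function, nominal values and input uncertainties are placeholders.
import numpy as np

def result(x):
    # Placeholder data-reduction equation, e.g. r = x0 * x1**1.5
    return x[0] * x[1] ** 1.5

x0 = np.array([2.0, 3.0])          # nominal input values
u_x = np.array([0.05, 0.10])       # standard uncertainties of the inputs

# Numerical sensitivities (Jacobian row) by central differences
eps = 1e-6
J = np.array([(result(x0 + eps * e) - result(x0 - eps * e)) / (2 * eps)
              for e in np.eye(len(x0))])

u_r = np.sqrt(J @ np.diag(u_x ** 2) @ J)     # combined uncertainty of the result
r = result(x0)

umf = (x0 / r) * J                           # uncertainty magnification factors
upc = (J * u_x) ** 2 / u_r ** 2 * 100        # uncertainty percent contributions
print(f"r = {r:.3f} +/- {u_r:.3f}")
print("UMF:", np.round(umf, 3), " UPC (%):", np.round(upc, 1))
```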
400

Probabilistic and Statistical Learning Models for Error Modeling and Uncertainty Quantification

Zavar Moosavi, Azam Sadat 13 March 2018 (has links)
Simulations and modeling of large-scale systems are vital to understanding real-world phenomena. However, even advanced numerical models can only approximate the true physics. The discrepancy between model results and nature can be attributed to different sources of uncertainty, including the parameters of the model, the input data, or missing physics that is not included in the model due to a lack of knowledge or high computational costs. Uncertainty reduction approaches seek to improve model accuracy by decreasing the overall uncertainties in models. Aiming to contribute to this area, this study explores uncertainty quantification and reduction approaches for complex physical problems. It proposes several novel probabilistic and statistical approaches for identifying the sources of uncertainty, modeling the errors, and reducing uncertainty to improve model predictions for large-scale simulations. We explore different computational models. The first class of models studied herein are inherently stochastic, and their numerical approximations suffer from stability and accuracy issues. The second class are partial differential equations, which capture the laws of mathematical physics but only approximate a more complex reality and carry uncertainty due to missing dynamics not captured by the models. The third class are low-fidelity models, which are fast approximations of very expensive high-fidelity models; these reduced-order models have uncertainty due to the loss of information in the dimension-reduction process. We also consider uncertainty analysis in the data assimilation framework, specifically for ensemble-based methods where the effect of sampling errors is alleviated by localization. Finally, we study the uncertainty in numerical weather prediction models coming from approximate descriptions of physical processes. / Ph. D. / Computational models are used to understand the behavior of natural phenomena: they approximate the evolution of the true phenomenon, or reality, in time. We obtain a more accurate forecast of the future by combining the model approximation with observations of reality. Weather forecasting, oceanography and geoscience are examples of such forecasting applications. However, a model can only approximate reality to some extent, and its approximation is imperfect because of several sources of error or uncertainty: noise in measurements or observations from nature, uncertainty in some model components, missing components in the model, and the interaction between different components of the model all cause the model forecast to differ from reality. The aim of this study is to explore techniques and approaches for modeling the error and uncertainty of computational models, to provide solutions and remedies that reduce the error of the model forecast, and ultimately to improve the forecast. Taking the discrepancy, or error, between model forecast and reality over time and mining that error provides valuable information about the origin of uncertainty in models as well as the hidden dynamics not considered in the model. Statistical and machine-learning-based solutions are proposed in this study to identify the source of uncertainty, capture it, and use that information to reduce the error and enhance the model forecast.
We studied error modeling, uncertainty quantification and uncertainty reduction techniques in several frameworks, from chemical models to weather forecast models. In each of the models, we tried to provide a proper solution to detect the origin of uncertainty, model the error and reduce the uncertainty to improve the model forecast.
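The lay summary describes learning the discrepancy between model forecasts and observations and using it to correct future forecasts. As a hedged sketch only (not the study's actual models), the code below trains a random forest on past forecast errors as a function of the model state and applies the learned correction to new forecasts; the data and feature choices are invented.

```python
# Hedged sketch: learning a model-error correction from past forecast/observation pairs.
# Synthetic data stands in for real model output and observations.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(11)
n = 2_000
state = rng.uniform(-2.0, 2.0, size=(n, 3))             # model state features (placeholder)
truth = state[:, 0] + 0.5 * np.sin(2.0 * state[:, 1])   # "reality" (unknown in practice)
forecast = state[:, 0]                                  # imperfect model misses the sine term
error = truth - forecast                                # discrepancy to be learned

# Learn the structural error as a function of the model state.
error_model = RandomForestRegressor(n_estimators=200, random_state=0)
error_model.fit(state[:1500], error[:1500])

# Correct new forecasts with the predicted error.
corrected = forecast[1500:] + error_model.predict(state[1500:])
print("RMSE before:", np.sqrt(np.mean((forecast[1500:] - truth[1500:]) ** 2)))
print("RMSE after: ", np.sqrt(np.mean((corrected - truth[1500:]) ** 2)))
```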
