531 |
The invasion of tomorrow : A theoretical study of the development of the risk concept. Oyarzun Norborg, Marcus, January 2011 (has links)
The use of the term risk has increased during recent decades, not least in the media, and so has the interest in risk as a topic of research. It seems as if the nature and social perception of risk have gone through important changes during the last centuries. The purpose of this study has been to investigate how and why the risk concept has changed over time. Starting from the influential writings of Ulrich Beck and Anthony Giddens, the ambition has been to make a clear and critical survey of the subject. To Beck and Giddens, risk is a modern concept. While in traditional societies threats were experienced as uncontrollable divine strokes of fate, as dangers, in modernity many threats come to be conceived as controllable, as risks. This development is closely linked to the expansion of insurance systems (i.e., the welfare state), which, through predictions based on probability calculations and the promise of retrospective monetary compensation for damage, make the future seem controllable. During this first stage of risk the "perfect control society" seems attainable. But as rational knowledge continues to expand and humans intrude further into their environment, we realize that many risks believed to originate in nature or in circumstances fixed by tradition are the result of human decisions. Moreover, the inherently changeable character of (expert) knowledge is revealed. In a world where individuals are forced to take responsibility for their decisions, risk moves into their everyday lives. The social origin, sometimes catastrophic potential, and indefinable character of these so-called "manufactured risks", together with an experienced increase in the openness and complexity of the world, create problems for the individual as well as for the insurance system. In this so-called "risk society", our sense of control staggers. Beck's and Giddens' accounts of this development are very similar, yet Beck is somewhat more pessimistic about our present state of control.
This is also a main point of criticism of his risk society thesis. Moreover, Beck's lack of consistency and clarity, and his alleged over-generalization and insufficient empirical backing, are discussed. Further research of a more empirical kind is suggested in order to test the merits of Beck's and Giddens' theories.
|
532 |
Algorithmic Framework for Improving Heuristics in Stochastic, Stage-Wise Optimization Problems. Choi, Jaein, 24 November 2004 (has links)
172 Pages
Directed by Dr. Jay H. Lee and Dr. Matthew J. Realff
The goal of this thesis is the development of a computationally tractable solution method for stochastic, stage-wise optimization problems. In order to achieve this goal, we have developed a novel algorithmic framework, based on Dynamic Programming (DP), for improving heuristics. The proposed method represents a systematic way to take a family of solutions and patch them together into an improved solution. However, patching is accomplished in state space rather than in solution space. Since the proposed approach uses simulation with heuristics to circumvent the curse of dimensionality of DP, it is named Dynamic Programming in Heuristically Restricted State Space. The proposed algorithmic framework is applied to stochastic Resource Constrained Project Scheduling problems, a real-world optimization problem with a high-dimensional state space and significant uncertainty, equivalent to billions of scenarios. The real-time decision-making policy obtained by the proposed approach outperforms the best heuristic applied in the simulation stage to form the policy. The proposed approach is extended with the idea of the Q-Learning technique, which enables us to build empirical state-transition rules through simulation, for stochastic optimization problems with complicated state-transition rules. Furthermore, the proposed framework is applied to a stochastic supply chain management problem, which has a high-dimensional action space as well as a high-dimensional state space, using a novel concept of an implicit sub-action space that efficiently restricts the action space for each state in the restricted state space. The resulting real-time policy responds to time-varying demand for products by stitching together decisions made by the heuristics, improving the overall performance of the supply chain. The proposed approach can be applied to any problem formulated as a stochastic DP, provided that reasonable heuristics are available for simulation.
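As a toy illustration of the idea (a sketch, not the thesis's actual scheduling formulation), the snippet below simulates two simple order-up-to heuristics on a small stochastic inventory problem, collects the states they visit, and then runs backward DP confined to that heuristically restricted state space, choosing at each state the best of the heuristics' own actions. All costs, demands, and policies here are invented for the example.

```python
import random

random.seed(0)
T = 5                        # number of stages
DEMANDS = [0, 1, 2]          # possible per-stage demands, sampled uniformly (assumed)
HOLD, SHORT = 1.0, 4.0       # holding and shortage costs (assumed)

def heuristics(state):
    # two simple order-up-to rules acting as the base heuristics
    return [max(0, 2 - state), max(0, 3 - state)]

def step(state, order, demand):
    inv = state + order - demand
    cost = HOLD * max(inv, 0) + SHORT * max(-inv, 0)
    return max(inv, 0), cost

# 1) simulate the heuristics to collect the restricted state space per stage
restricted = [set() for _ in range(T + 1)]
restricted[0].add(0)
for _ in range(500):
    s = 0
    for t in range(T):
        a = random.choice(heuristics(s))
        s, _ = step(s, a, random.choice(DEMANDS))
        restricted[t + 1].add(s)

# 2) backward DP confined to the visited states, patching heuristic actions together
V = [dict() for _ in range(T + 1)]
for s in restricted[T]:
    V[T][s] = 0.0
for t in reversed(range(T)):
    for s in restricted[t]:
        best = float("inf")
        for a in heuristics(s):              # only actions the heuristics propose
            exp_cost, covered = 0.0, True
            for d in DEMANDS:
                ns, c = step(s, a, d)
                if ns not in V[t + 1]:       # successor never visited: skip action
                    covered = False
                    break
                exp_cost += (c + V[t + 1][ns]) / len(DEMANDS)
            if covered:
                best = min(best, exp_cost)
        V[t][s] = best

print(round(V[0][0], 3))  # expected cost of the patched-together policy
```

The patched policy is at least as good, in expectation, as always following either single heuristic, because at every visited state it takes the better of the two proposed actions.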
|
533 |
Spatial variability in soils: stiffness and strength. Kim, Hyunki, 19 July 2005 (has links)
Geotechnical properties vary in space. Statistical parameters such as the mean, deviation, and correlation length are characteristic of each sediment and its formation history. The effects of spatial variability on the macro-scale mechanical properties of soils are investigated using Monte Carlo non-linear finite element simulations. Boundary conditions include 1) isotropic loading, 2) zero-lateral-strain loading, 3) drained and undrained deviatoric loading, and 4) small-strain wave propagation. Emphasis is placed on identifying the effects of spatial variability on the stiffness and strength of soils, recognizing emergent phenomena, and creating the background for new geotechnical design methods that take spatial variability into consideration.
The arithmetic mean of soil properties cannot be used to estimate the stiffness or strength of heterogeneous soils. Greater deviation and longer relative correlation length in the spatial distribution of soil properties yield a softer and weaker mechanical response. Load transfer concentrates along stiffer zones, leading to stress focusing and lower K0 values. Drained loading promotes internal homogenization, while undrained deviatoric loading can cause percolation of internal weakness and shear-strain localization. Spatial heterogeneity adds complexity to elastic wave propagation. Heterogeneous soil mixtures can be engineered to attain unique macro-scale behavior.
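As a minimal numerical illustration of the first claim (a sketch, not the thesis's finite element procedure), the snippet below draws lognormal stiffness profiles for a layered column loaded in series, where the effective stiffness is the harmonic mean of the layer values. The mean stiffness and coefficient of variation are assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
n_layers, n_sims = 50, 1000
mean_E, cov = 100.0, 0.4   # assumed mean stiffness (MPa) and coefficient of variation

sigma = np.sqrt(np.log(1 + cov**2))   # lognormal shape parameter matching the CoV
eff = []
for _ in range(n_sims):
    # one realization of a spatially variable stiffness profile (iid layers here)
    E = mean_E * np.exp(sigma * rng.normal(size=n_layers) - 0.5 * sigma**2)
    eff.append(1.0 / np.mean(1.0 / E))  # layers in series: harmonic mean governs
eff = np.array(eff)

print(round(float(eff.mean()), 1))  # noticeably below the arithmetic mean of 100
```

The simulated effective stiffness falls well below the arithmetic mean, which is why averaging point measurements overestimates the stiffness of a heterogeneous deposit.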
|
534 |
Spatially Adaptive Augmented Reality. Coelho, Enylton Machado, 28 November 2005 (has links)
One of the most important problems in real-time, mobile augmented reality is *registration error* -- the misalignment between the computer-generated graphics and the physical world the application is trying to augment. Such misalignment may either make the information presented by the application misleading to the user or render the augmentation meaningless.
In this work, we question the implied assumption that registration error must be eliminated for AR to be useful. Instead, we take the position that registration error will never be eliminated and that application developers can build useful AR applications if they have an estimate of registration error. We present a novel approach to AR application design: *Spatially Adaptive Augmented Reality* (i.e., applications that change their displays based on the quality of the alignment between the physical and virtual world). The computations used to change the display are based on real-time estimates of the registration error. The application developer uses these estimates to build applications that function under a variety of conditions independent of specific tracking technologies.
In addition to introducing Spatially Adaptive AR, this research establishes a theoretical model for AR. These theoretical contributions are manifested in a toolkit that supports the design of Spatially Adaptive AR applications: OSGAR.
This work describes OSGAR in detail and presents examples that demonstrate how to use this novel approach to create adaptable augmentations as well as how to support user interaction in the presence of uncertainty.
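The adaptation idea can be sketched in a few lines. The thresholds, display styles, and error-estimate structure below are hypothetical stand-ins for illustration, not OSGAR's actual API.

```python
from dataclasses import dataclass

@dataclass
class ErrorEstimate:
    pixels: float  # estimated on-screen registration error (hypothetical units)

def choose_augmentation(err: ErrorEstimate) -> str:
    # adapt the display to the current registration-error estimate:
    # precise overlays when alignment is good, coarser cues otherwise
    if err.pixels < 5:
        return "outline"           # tight silhouette drawn on the object
    elif err.pixels < 30:
        return "highlight-region"  # enlarged region covering the error bound
    else:
        return "labeled-arrow"     # detached label pointing toward the object

print([choose_augmentation(ErrorEstimate(p)) for p in (2, 12, 80)])
# ['outline', 'highlight-region', 'labeled-arrow']
```

The point of the design is that the application degrades gracefully as tracking quality worsens, instead of silently showing a misregistered overlay.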
|
535 |
Information Uncertainty and Momentum Strategy. Yen, Jiun-huey, 18 July 2010 (has links)
The profitability and the sources of the returns of momentum strategies have long been a popular subject of study in finance. Nevertheless, significant discrepancies exist between the conclusions of previous papers, owing to differences in sample periods and an emphasis on average results. The purpose of this paper is therefore to investigate momentum strategies under information uncertainty in the Taiwan stock market over the period 1990-2009, following the research method of Jegadeesh and Titman (1993).
The results show that, on average, momentum strategies do not obtain significantly positive returns in the long run in the Taiwan stock market; moreover, they exhibit a pronounced short-term reversal. Conditioning on the business cycle does not change this conclusion: there is still no significant momentum return, and implementing a momentum strategy during recessions yields significantly negative returns.
From the perspective of investor psychology, greater information uncertainty should produce greater psychological biases, so stronger stock-price continuation may be expected among high-information-uncertainty stocks. However, the effect of information uncertainty is pronounced only among loser portfolios.
In summary, compared with the profitability and stability of the contrarian strategy, the findings indicate that there is no significant momentum phenomenon in the Taiwan stock market.
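A minimal sketch of the Jegadeesh-Titman formation/holding procedure on synthetic data follows; the portfolio parameters and return distribution are assumed for illustration, and with independent returns the winner-minus-loser spread should hover near zero, as momentum requires genuine return continuation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_stocks, n_months = 50, 120
J, K = 6, 6   # 6-month formation / 6-month holding (one common J-T variant)
# synthetic iid monthly returns: no continuation built in, so no momentum expected
rets = rng.normal(0.01, 0.08, size=(n_months, n_stocks))

spreads = []
for t in range(J, n_months - K):
    past = (1 + rets[t - J:t]).prod(axis=0) - 1        # formation-period return
    order = past.argsort()
    losers = order[:n_stocks // 10]                    # bottom decile
    winners = order[-(n_stocks // 10):]                # top decile
    hold = (1 + rets[t:t + K]).prod(axis=0) - 1        # holding-period return
    spreads.append(hold[winners].mean() - hold[losers].mean())

print(round(float(np.mean(spreads)), 4))  # average winner-minus-loser spread
```

Replacing the synthetic matrix with actual monthly returns, and sorting stocks further by an information-uncertainty proxy, reproduces the double-sort design the paper describes.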
|
536 |
Analyzing risk and uncertainty for improving water distribution system security from malevolent water supply contamination events. Torres, Jacob Manuel, 15 May 2009 (has links)
Previous efforts to apply risk analysis to water distribution systems (WDS) have not typically included explicit hydraulic simulations in their methodologies. Here, a risk classification scheme is employed for identifying vulnerable WDS components subject to an intentional water contamination event. A Monte Carlo simulation is conducted that includes uncertain stochastic diurnal demand patterns, seasonal demand, initial storage-tank levels, time of day of contamination initiation, duration of the contamination event, and contaminant quantity.
An investigation is conducted of exposure sensitivities to the stochastic inputs and of mitigation measures for reducing contaminant exposure. Mitigation measures include topological modifications to the existing pipe network, valve installation, and an emergency purging system. Findings show that reasonable uncertainties in model inputs produce high variability in exposure levels. Exposure-level distributions are also noticeably sensitive to population clusters within the contaminant-spread area. The significant uncertainty in exposure patterns means that greater resources are needed for effective mitigation.
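The sampling step can be sketched as follows; the input distributions and the scalar exposure model are invented placeholders standing in for the study's hydraulic simulation of a real network.

```python
import math
import random

random.seed(42)

def simulate_exposure():
    # sample the uncertain inputs named in the abstract (all distributions assumed)
    start_hour = random.uniform(0, 24)       # time of day of contamination initiation
    duration = random.uniform(0.5, 6.0)      # event duration, hours
    mass = random.uniform(0.1, 5.0)          # contaminant quantity, kg
    season = random.gauss(1.0, 0.2)          # seasonal demand multiplier
    tank = random.uniform(0.3, 0.9)          # initial storage-tank fraction
    diurnal = 1.0 + 0.3 * math.sin(2 * math.pi * start_hour / 24)  # diurnal demand
    # placeholder exposure model standing in for the hydraulic simulation
    return mass * duration * max(season, 0.1) * diurnal * (1.0 - 0.5 * tank)

exposures = [simulate_exposure() for _ in range(10_000)]
mean = sum(exposures) / len(exposures)
cv = (sum((x - mean) ** 2 for x in exposures) / len(exposures)) ** 0.5 / mean
print(round(cv, 2))  # coefficient of variation of exposure across realizations
```

Even with these modest input uncertainties, the coefficient of variation of the exposure is large, mirroring the finding that reasonable input uncertainties produce high variability in exposure levels.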
|
537 |
Evaluating and developing parameter optimization and uncertainty analysis methods for a computationally intensive distributed hydrological model. Zhang, Xuesong, 15 May 2009 (has links)
This study focuses on developing and evaluating efficient and effective parameter calibration and uncertainty analysis methods for hydrologic modeling. Five single-objective optimization algorithms and six multi-objective optimization algorithms were tested for automatic parameter calibration of the SWAT model. A new multi-objective optimization method (Multi-objective Particle Swarm Optimization & Genetic Algorithms), which combines the strengths of different optimization algorithms, was proposed. Based on an evaluation of the performance of the different algorithms on three test cases, the new method consistently performed better than, or close to, the other algorithms.
In order to save the effort of running the computationally intensive SWAT model, a support vector machine (SVM) was used as a surrogate to approximate the behavior of SWAT. It was shown that combining the SVM with Particle Swarm Optimization can reduce the effort of parameter calibration of SWAT. Further, the SVM was used as a surrogate to implement parameter uncertainty analysis for SWAT. The results show that the SVM helped save more than 50% of the runs of the computationally intensive SWAT model.
The effect of model structure on the uncertainty estimation of streamflow simulation was examined by applying SWAT and Neural Network models. The 95% uncertainty intervals estimated by SWAT include only 20% of the observed data, while those of the Neural Networks include more than 70%. This indicates that model structure is an important source of uncertainty in hydrologic modeling and needs to be evaluated carefully. The effect of different treatments of model-structure uncertainty on hydrologic modeling was explored further by applying four types of Bayesian Neural Networks. By considering the uncertainty associated with model structure, the Bayesian Neural Networks can provide a more reasonable quantification of the uncertainty of streamflow simulation. This study stresses the need for improved understanding and quantification of the different uncertainty sources for effective estimation of the uncertainty of hydrologic simulation.
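The surrogate idea can be sketched as follows. To keep the example self-contained, a kernel ridge (RBF) surrogate is used as a lightweight stand-in for the SVM, and a cheap analytic objective stands in for a SWAT run; all parameter ranges and constants are assumed.

```python
import numpy as np

rng = np.random.default_rng(3)

def expensive_model(x):
    # stand-in for a costly SWAT run: a smooth objective over two parameters
    return np.sin(3 * x[0]) + (x[1] - 0.5) ** 2

def rbf(A, B, gamma=10.0):
    # Gaussian (RBF) kernel matrix between two point sets
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# a small budget of "expensive" runs used to train the surrogate
X = rng.uniform(0, 1, size=(40, 2))
y = np.array([expensive_model(x) for x in X])
alpha = np.linalg.solve(rbf(X, X) + 1e-6 * np.eye(len(X)), y)

# screen many candidates cheaply on the surrogate; verify only the best one
cand = rng.uniform(0, 1, size=(2000, 2))
pred = rbf(cand, X) @ alpha
best = cand[pred.argmin()]
print(round(float(expensive_model(best)), 3))  # true objective at surrogate's pick
```

Only 40 expensive evaluations were spent, yet the surrogate lets 2000 candidates be screened, which is the mechanism behind the reported savings of more than half of the SWAT runs.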
|
538 |
Uncertainty evaluation of delayed neutron decay parameters. Wang, Jinkai, 15 May 2009 (has links)
In a nuclear reactor, delayed neutrons play a critical role in sustaining a controllable chain reaction. Delayed neutrons' relative yields and decay constants are very important for modeling reactivity control and have been studied for decades. Researchers have tried different experimental and numerical methods to assess these delayed neutron parameters. The reported parameter values vary widely, much more than the small statistical errors reported with them would suggest. Interestingly, the reported parameters fit their individual measurement data well in spite of these differences.
This dissertation focuses on evaluating the errors in, and methods for extracting, the delayed neutron relative yields and decay constants for thermal fission of U-235. Various numerical methods used to extract the delayed neutron parameters from measured data, including the Matrix Inverse, Levenberg-Marquardt, and Quasi-Newton methods, were studied extensively using simulated delayed neutron data. The simulated data were Poisson distributed around Keepin's theoretical data. The extraction methods produced very different results for the same data set, and some of the numerical methods could not even find solutions for some data sets. Further investigation found that ill-conditioned matrices in the objective function were the reason for the inconsistent results. To find a reasonable solution with small variation, a regularization parameter was introduced using a numerical method called Ridge Regression. The results from the Ridge Regression method, in terms of goodness of fit to the data, were good and often better than those of the other methods. Because of the regularization term in the algorithm, the fitted result contains a small additional bias, but the method guarantees convergence no matter how large the condition number of the coefficient matrix. Both saturation and pulse modes were simulated to focus on different groups. Some of the factors that affect the stability of the solution were investigated, including the initial count rate, sample flight time, and initial guess values.
Finally, because comparing reported delayed neutron parameters among different experiments cannot by itself determine whether the underlying data actually differ, methods are proposed for comparing the delayed neutron data sets directly.
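The stabilizing effect of the regularization can be sketched on a miniature version of the problem: fitting the amplitudes of exponentially decaying groups whose decay constants are nearly degenerate, which makes the normal-equation matrix ill-conditioned. The decay constants, amplitudes, and noise level below are illustrative, not Keepin's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# decay-curve design matrix; the first two decay constants are nearly degenerate,
# mimicking the ill-conditioned systems that arise in multi-group delayed-neutron fits
t = np.linspace(0, 10, 60)
lambdas = [1.0, 1.05, 3.0]                 # illustrative decay constants (1/s)
A = np.exp(-np.outer(t, lambdas))
true = np.array([2.0, 1.0, 0.5])           # illustrative group amplitudes
y = A @ true + rng.normal(0, 0.01, size=len(t))

ls = np.linalg.lstsq(A, y, rcond=None)[0]                       # ordinary least squares
ridge = np.linalg.solve(A.T @ A + 0.01 * np.eye(3), A.T @ y)    # ridge (Tikhonov)

print(float(np.linalg.cond(A.T @ A)))               # very large condition number
print(float(np.linalg.norm(ls - true)),
      float(np.linalg.norm(ridge - true)))          # parameter errors of each fit
```

The ridge solution is always bounded (its norm never exceeds the least-squares solution's), which is exactly the guaranteed-convergence property the dissertation exploits, at the price of a small regularization bias.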
|
539 |
Uncertainty of microwave radiative transfer computations in rain. Hong, Sung Wook, 02 June 2009 (has links)
Currently, the effect of vertical resolution on the brightness temperature (BT) has not been examined in depth. The uncertainty of the freezing level (FL) retrieved using two different satellites' data is large, and various radiative transfer (RT) codes yield different BTs in strong scattering conditions.
The purposes of this research were: 1) to understand, numerically and analytically, the uncertainty of the BT contributed by the vertical resolution; 2) to reduce the uncertainty of the FL retrieval using new thermodynamic observations; and 3) to investigate the characteristics of four different RT codes.
First, a plane-parallel RT model (RTM) of n layers in light rainfall was used for the analytical and computational derivation of the effect of vertical resolution on the BT. Second, a new temperature profile based on observations was incorporated into the Texas A&M University (TAMU) algorithm, and Precipitation Radar (PR) and Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI) data were utilized for the improved FL retrieval. Third, the TAMU, Eddington approximation (EDD), Discrete Ordinate, and backward Monte Carlo codes were compared under various view angles, rain rates, FLs, frequencies, and surface properties.
The uncertainty of the BT decreased as the number of layers increased; it was due to the optical thickness rather than to the relative humidity, pressure distribution, water vapor, or temperature profile. The mean TMI FL showed good agreement with the mean bright-band height, and the new temperature profile reduced the uncertainty of the TMI FL by about 10%. The differences in the BTs among the four RT codes were within 1 K at the current sensor view angle over the entire dynamic rain-rate range at 10-37 GHz, and the differences between the TAMU and EDD solutions were less than 0.5 K for a specular surface.
In conclusion, this research suggests that the vertical resolution should be considered as a parameter in the forward model. The new temperature profile improved the TMI FL in the tropics, but uncertainty remains for low FLs. Generally, the four RT codes agreed with each other, except at nadir, near the limb, or in heavy rainfall; the TAMU and EDD codes agreed more closely than the other RT codes.
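The layer-resolution effect can be sketched with a toy non-scattering plane-parallel emission model; the linear temperature profile, total optical depth, and unit surface emissivity are assumptions for illustration, not the dissertation's RTM.

```python
import math

def brightness_temp(n_layers, tau_total=2.0):
    # absorption-only plane-parallel emission at nadir, with an assumed linear
    # temperature profile from 200 K (top) to 290 K (surface)
    dtau = tau_total / n_layers
    bt, trans = 0.0, 1.0
    for i in range(n_layers):            # integrate downward from the top
        frac = (i + 0.5) / n_layers      # layer midpoint (0 = top of atmosphere)
        T = 200.0 + 90.0 * frac
        bt += T * trans * (1.0 - math.exp(-dtau))  # layer emission seen at the top
        trans *= math.exp(-dtau)
    bt += 290.0 * trans                  # surface term (unit emissivity assumed)
    return bt

# the resolution-induced BT difference shrinks as the number of layers grows
print(round(abs(brightness_temp(2) - brightness_temp(200)), 3))
```

Because the per-layer error scales with the layer optical thickness, the coarse-grid BT converges toward the fine-grid value as layers are added, which is the resolution effect the dissertation quantifies.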
|
540 |
A Preliminary Study to Assess Model Uncertainties in Fluid Flows. Delchini, Marc Olivier, May 2010 (has links)
In this study, the impact of various flow models is assessed under free and forced convection: compressible versus incompressible models for a Pressurized Water Reactor, and Darcy's law versus the full momentum equation for a High Temperature Gas Reactor. The Euler equations with friction forces and momentum and energy sources/sinks are used. The geometric model consists of a one-dimensional rectangular loop: the fluid is heated up and cooled down along the vertical legs, and a pressurizer and a pump are included along the horizontal legs. The compressible model is taken to be the most accurate model in this study.
Simulations show that under forced convection the compressible and incompressible models yield the same transient and steady state. Under free convection, the compressible and incompressible models exhibit different transients but reach the same final steady state. When Darcy's law is used, the steady-state pressure and velocity profiles differ somewhat from those of the compressible model under both free and forced convection, and some differences are also noted in the transient.
|