151

Uncertainty Analysis in Modelling for CANDU and Pressurized Water Reactors

Tucker, Michael January 2023
This thesis documents significant contributions to the quantification of input and modelling uncertainties in the simulation of nuclear power plants. The work is intended to support the simulations performed to demonstrate the safety of nuclear power plants in general, and of CANDU reactors specifically. It extends the internationally established methodologies for uncertainty propagation to CANDU plants and pioneers the integration of these tools with important CANDU plant features, such as online fuelling. The thesis documents a series of simulation studies performed to quantify the impact of uncertainties (primarily nuclear data uncertainties) on simulations of CANDU stations and light water reactors (LWRs). The novel part of this work quantifies the role of operational feedbacks, such as online refuelling and reactor control systems, and of important modelling uncertainties in CANDU simulations. To achieve this objective, the thesis examines four important areas, as documented in journal papers.

To demonstrate understanding of the tools developed for the UAM-LWR benchmark and to support the ongoing international effort, select exercises from the UAM-LWR benchmark study were performed and published in the first journal paper. Time-dependent PWR neutronics exercises, considering both nuclear data and manufacturing uncertainties, were completed. This work found that the relative importance of nuclear data and manufacturing uncertainties depended on whether the parameter of interest was "local", such as pin power factors, or "global", such as homogenized assembly properties. The second publication documents the adaptation of the tools from the first paper to CANDU-specific features, such as spatial control systems and online refuelling. This paper demonstrated the significant effect that consistent feedback from fuelling operations has on reducing the total uncertainty in core-level simulations of CANDU plants. The tools developed for this work were used to support downstream studies by generating extensive sets of realistic initial conditions for many different possible nuclear datasets.

The subsequent publications use the tools developed above and extend the methods to include, for the first time, operational aspects of CANDUs in the assessments. In the third paper these methods were used to demonstrate the tools' capability to simulate an operational transient (a power maneuver from 100% to 59% full power) in a CANDU station, and the resulting predictions and uncertainties were compared to measured plant responses. A further study, on the role of nuclear data and initial burnup distribution uncertainty in a CANDU plant's response to perturbations of liquid zone controller levels, examined the effect of the "superposition principle" commonly used in industry to make safety analysis of a CANDU's various fuelling states more tractable. In both cases the role of nuclear data uncertainties was generally found to be similar in magnitude to that of uncertainty in the core initial conditions.

The results of this work support the continued safe operation of CANDU nuclear generating stations in Canada by quantifying the role of select uncertainties on safety simulation outputs, informing future BEPU (best estimate plus uncertainty) analyses for CANDU plants, and demonstrating the exceptional flexibility of the CANDU reactor design. This is reflected in one of the major conclusions of these works: the natural feedbacks in CANDU operation help to minimize the effect of uncertainties on the outcome of many safety analyses. / Thesis / Candidate in Philosophy
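The sampling-based propagation pattern used in studies of this kind lends itself to a compact illustration. The sketch below is not from the thesis: the stand-in core model, the choice of two scaling factors, and all covariance numbers are hypothetical. It shows the generic loop of drawing perturbed nuclear data multipliers from an assumed covariance, running each set through a model, and reading output statistics:

```python
import numpy as np

rng = np.random.default_rng(42)

def core_model(scale):
    """Stand-in for a core simulator: maps sampled cross-section scaling
    factors (capture, fission) to one output, e.g. a peak channel power."""
    capture, fission = scale
    return 7.3 * fission / (0.9 * capture + 0.1 * fission)

# Assumed 1-sigma relative nuclear data uncertainties (illustrative only)
mean = np.array([1.0, 1.0])           # nominal multipliers
cov = np.diag([0.02, 0.015]) ** 2     # 2% capture, 1.5% fission

samples = rng.multivariate_normal(mean, cov, size=500)
outputs = np.array([core_model(s) for s in samples])
print(f"mean = {outputs.mean():.3f}, relative std = {outputs.std() / outputs.mean():.2%}")
```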
152

Coherent states with minimum Gini uncertainty for finite quantum systems

Lei, Ci, Vourdas, Apostolos 28 November 2022
153

Ambiguity in Public Organizations - Is it always Negative and Difficult to Manage or does Theory Assume Too Much?: A Case Study Review of Customs and Border Protection's Container Security Initiative

Mullins, Dustin Ashley 26 September 2011
This research study provides a conceptual framework for understanding how public managers strategically engage ambiguity and translate the complexity associated with it into manageable objectives that control complex work within federal programs. A central assumption of this study is that ambiguity is an organizational reality due to the social nature of administrative systems, influencing how managers approach and understand problem sets. This research study demonstrates the impact of management strategies in combating organizational ambiguity at a strategic level, as well as in mitigating and reducing uncertainty at more tactical levels of an organization. Theoretically, this study engages the current divide between organizational theory and public management scholarship by providing an empirical perspective on the management and execution of a key national security program. Through examination of the Container Security Initiative (CSI) program within the Department of Homeland Security (DHS) Customs and Border Protection (CBP), this research explores how ambiguity and uncertainty within bureaucratic settings are managed on an ongoing basis in the pursuit of defined goals and objectives. Examining how public managers strategically engage ambiguity and implementation pressures, which manifest as a result of systemic external and internal pressures, and translate the associated complexity into manageable program objectives provides valuable insight into the impact of managerial processes within public organizations. Through this managerial process, and by setting priorities and objectives, public managers decompose and translate ambiguity and complexity in order to more actively and effectively deploy strategies and resources in support of those defined objectives. Central to the translation process is managing the interface between the strategic and tactical dimensions of programs, through goal setting and priority definition, enabling the execution of key program activities and operations. This study's findings build upon existing research that examines the role of management within public organizations, and challenge several assumptions within the extant literature regarding the influence and consequences of ambiguity within public organizations. Addressing the need to empirically demonstrate how management matters, this research emphasizes the role public managers play in actively engaging and managing organizational and program complexity in order to accomplish the objectives of public bureaucracies. / Ph. D.
154

Assessment of Model Validation, Calibration, and Prediction Approaches in the Presence of Uncertainty

Whiting, Nolan Wagner 19 July 2019
Model validation is the process of determining the degree to which a model is an accurate representation of the true value in the real world. The results of a model validation study can be used either to quantify the model form uncertainty or to improve/calibrate the model. However, the model validation process can become complicated if there is uncertainty in the simulation and/or experimental outcomes. These uncertainties can be in the form of aleatory uncertainties due to randomness or epistemic uncertainties due to lack of knowledge. Four different approaches are used for addressing model validation and calibration: 1) the area validation metric (AVM), 2) a modified area validation metric (MAVM) with confidence intervals, 3) the standard validation uncertainty from ASME V&V 20, and 4) Bayesian updating of a model discrepancy term. Details are given for the application of the MAVM to account for small experimental sample sizes. To provide an unambiguous assessment of these different approaches, synthetic experimental values were generated from computational fluid dynamics simulations of a multi-element airfoil. A simplified model was then developed using thin airfoil theory and assessed against the synthetic experimental data. The quantities examined include the two-dimensional lift and moment coefficients for the airfoil with varying angles of attack and flap deflection angles. Each validation/calibration approach was assessed for its ability to tightly encapsulate the true value in nature, both at locations where experimental results are provided and at prediction locations where no experimental data are available. Generally, the MAVM performed best where data were sparse and/or large extrapolations were required, while Bayesian calibration outperformed the others where extensive experimental data covered the application domain. / Master of Science / Uncertainty often exists when conducting physical experiments, and whether it arises from input uncertainty, uncertainty in the environmental conditions in which the experiment takes place, or numerical uncertainty in the model, it can be difficult to validate and compare the results of a model with those of an experiment. Model validation is the process of determining the degree to which a model is an accurate representation of the true value in the real world. The results of a model validation study can be used either to quantify the uncertainty that exists within the model or to improve/calibrate the model. However, the model validation process can become complicated if there is uncertainty in the simulation (model) and/or experimental outcomes. These uncertainties can be aleatory (randomness, described by a probability distribution over the values drawn) or epistemic (lack of knowledge, with inputs known only to lie within an interval). Four different approaches are used for addressing model validation and calibration: 1) the area validation metric (AVM), 2) a modified area validation metric (MAVM) with confidence intervals, 3) the standard validation uncertainty from ASME V&V 20, and 4) Bayesian updating of a model discrepancy term. Details are given for the application of the MAVM to account for small experimental sample sizes.
To provide an unambiguous assessment of these different approaches, synthetic experimental values were generated from computational fluid dynamics (CFD) simulations of a multi-element airfoil. A simplified model was then developed using thin airfoil theory and assessed against the synthetic experimental data. The quantities examined include the two-dimensional lift and moment coefficients for the airfoil with varying angles of attack and flap deflection angles. Each validation/calibration approach was assessed for its ability to tightly encapsulate the true value in nature, both at locations where experimental results are provided and at prediction locations where no experimental data are available. Also of interest was how well each method could predict the uncertainty in the simulation outside the region in which experimental observations were made and model form uncertainty could be observed.
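The area validation metric named above is compact enough to sketch. The following illustration (not code from the thesis; the sample data are invented) computes the area between the simulation empirical CDF and the experimental empirical CDF, which is the standard definition of the AVM:

```python
import numpy as np

def area_validation_metric(sim, exp):
    """Area between the simulation and experimental empirical CDFs,
    integrated over the merged sample support (units of the quantity)."""
    sim, exp = np.sort(sim), np.sort(exp)
    grid = np.sort(np.concatenate([sim, exp]))
    # Right-continuous empirical CDFs evaluated on the merged grid
    F_sim = np.searchsorted(sim, grid, side="right") / len(sim)
    F_exp = np.searchsorted(exp, grid, side="right") / len(exp)
    # Both CDFs are constant between grid points, so integrate stepwise
    return np.sum(np.abs(F_sim - F_exp)[:-1] * np.diff(grid))

sim = np.random.default_rng(0).normal(1.02, 0.05, 1000)  # many model outcomes
exp = np.random.default_rng(1).normal(1.00, 0.04, 12)    # few experiments
print(f"AVM (model form uncertainty estimate): {area_validation_metric(sim, exp):.4f}")
```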
155

Managing commitments and flexibility by real options

Roemer, Ellen, Rese, M. January 2004
In the past, transaction cost economics (TCE) literature has largely stressed the benefits of contractual commitments in relationships. TCE traditionally recommends safeguarding specific assets against hold-up, thus reducing behavioral uncertainty. In contrast, the reverse side of the coin has long been disregarded: firms may lose some of their flexibility, and thus the opportunity to benefit from future emerging business activities, due to prior contractual commitments. Flexibility becomes especially important in dynamic environments with rapid technological change. To manage the emergent trade-off between contractual commitments and flexibility in dynamic markets, the authors propose a real options approach. The value of a firm's flexibility to switch to a new trading partner is represented by a real switching option, and it can be shown how contractual commitments directly affect the value of that option. To clarify, the authors numerically analyze the trade-off and derive implications for an optimal choice of contractual commitments.
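As a rough illustration of this trade-off (a hypothetical one-period sketch, not the authors' model; all figures are invented), a switching option can be valued as the discounted expected payoff of moving to a new partner, where a contractual commitment imposes an exit penalty that directly erodes the option's value:

```python
# One-period binomial valuation of a real switching option (illustrative).
# A firm may switch to an alternative trading partner next period; a
# contractual commitment of size `penalty` is paid on exit.

def switching_option_value(v_up, v_down, p_up, penalty, discount=0.95):
    """Discounted expected value of the option to switch, net of the
    exit penalty created by the contractual commitment."""
    payoff_up = max(v_up - penalty, 0.0)    # switch only if worthwhile
    payoff_down = max(v_down - penalty, 0.0)
    return discount * (p_up * payoff_up + (1 - p_up) * payoff_down)

# Hypothetical numbers: switching is worth 30 (good state, prob. 0.4)
# or 5 (bad state) relative to staying with the current partner.
for penalty in (0, 10, 20):
    print(penalty, round(switching_option_value(30, 5, 0.4, penalty), 2))
```

Running this shows the option value falling from 14.25 with no commitment to 3.8 with a heavy one, which is the commitment-versus-flexibility tension the paper formalizes.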
156

Development and optimisation of fast energy yield calculations (FEnYCs) of photovoltaic modules

Roy, Jyotirmoy January 2014
Development and optimisation of a robust energy yield prediction methodology is the ultimate aim of this research. Outdoor performance of a PV module is determined by a variety of interlinked factors related to the environment and device technology. Any energy yield prediction model requires two basic measurement data sets: characterisation of the specific PV module technology under different operating conditions, and site-specific meteorological data. A calculation procedure based on these two data sets then produces the location-specific energy yield estimate. This research established a matrix of multi-dimensional measurement set points for module characterisation that is independent of PV technology. This novel approach was established by demonstrating an extended correlation of different environmental factors (irradiance, temperature and spectral irradiance) and their influences on commercial PV device technologies. Utilisation of site-specific meteorological data is the common approach applied in this yield prediction method. A series of modelling approaches, including a tri-linear interpolation method, is then applied for the energy yield calculation. A novel Monte Carlo simulation is demonstrated for uncertainty analysis of the irradiance (pyranometer CM 11) and temperature (PT 1000) measurements and ultimately of the yield prediction of c-Si and CIGS modules. The irradiance uncertainty varies from ±2% to ±6.2% depending on the level of monthly irradiation, and the temperature measurement uncertainty is calculated in the range of ±0.18 °C to ±0.46 °C across the months of the year. The calculated uncertainties of the energy yield predictions for the c-Si and CIGS modules are ±2.78% and ±15.45%, respectively. This research also validated different horizontal-to-in-plane irradiance translation models to identify the best-matched model for the UK climate. Ultimately, the validation results of the proposed Fast Energy Yield Calculation (FEnYCs) show good agreement with measured values: deviations of 5.48%, 6.97% and 3.1% for the c-Si, a-Si and CIGS modules, respectively.
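The tri-linear interpolation step can be sketched as follows, assuming a hypothetical characterisation matrix over irradiance, module temperature and a spectral factor (the grid and the power surface are illustrative stand-ins, not measured data):

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical characterisation matrix: module power (W) on a grid of
# irradiance (W/m^2), module temperature (degC) and a spectral factor.
G = np.array([200, 400, 600, 800, 1000])
T = np.array([15, 25, 40, 55])
S = np.array([0.95, 1.00, 1.05])
# Toy power surface standing in for measured characterisation data
power = (G[:, None, None] / 1000) * 250 \
        * (1 - 0.004 * (T[None, :, None] - 25)) * S[None, None, :]

interp = RegularGridInterpolator((G, T, S), power)  # tri-linear by default

# Energy yield: sum interpolated power over hourly meteorological records
hours = np.array([[650, 31, 1.01], [820, 38, 0.99], [310, 22, 1.02]])
energy_wh = interp(hours).sum()  # 1-hour timesteps -> watt-hours
print(f"Yield over the sample hours: {energy_wh:.1f} Wh")
```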
157

Analysis of Transient Overpower Scenarios in Sodium Fast Reactors

Grabaskas, David 20 August 2010
No description available.
158

Quantifying Coordinate Uncertainty Fields in Coupled Spatial Measurement Systems

Calkins, Joseph Matthew 06 August 2002
Spatial coordinate measurement systems play an important role in manufacturing and certification processes. There are many types of coordinate measurement systems, including electronic theodolite networks, total station systems, video photogrammetry systems, laser tracking systems, laser scanning systems, and coordinate measuring machines. Each of these systems produces coordinate measurements containing some degree of uncertainty. Often, the results from several different types of measurement systems must be combined to provide useful measurement results. When these measurements are combined, the resulting coordinate data set contains uncertainties that are a function of the base data sets and of complex interactions between the measurement sets. ISO standards, ANSI standards, and others require that estimates of uncertainty accompany all measurement data. This research presents methods for quantifying the uncertainty fields associated with coupled spatial measurement systems. The significant new developments and refinements presented in this dissertation are summarized as follows:
1) A geometrical representation of coordinate uncertainty fields.
2) An experimental method for characterizing instrument component uncertainty.
3) Coordinate uncertainty field computation for individual measurement systems.
4) Measurement system combination methods based on the relative uncertainty of each measurement's individual components.
5) Combined uncertainty field computation resulting from the interdependence of the measurements for coupled measurement systems.
6) Uncertainty statements for measurement analyses such as best-fit geometrical shapes and hidden-point measurement.
7) The implementation of these methods into commercial measurement software.
8) Case studies demonstrating the practical applications of this research.
The specific focus of this research is portable measurement systems; it is with these systems that uncertainty field combination issues are most prevalent. The results of this research are, however, general and therefore applicable to any instrument capable of measuring spatial coordinates. / Ph. D.
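Although the dissertation's combination methods are richer than this, a standard inverse-covariance weighting conveys the core idea in item 4 above, combining measurements according to their relative uncertainty (a sketch with hypothetical instrument covariances):

```python
import numpy as np

def fuse_measurements(points, covariances):
    """Combine coordinate measurements of one point from several instruments,
    weighting each by the inverse of its 3x3 uncertainty covariance."""
    info = [np.linalg.inv(c) for c in covariances]  # information matrices
    fused_cov = np.linalg.inv(sum(info))            # combined uncertainty
    fused = fused_cov @ sum(w @ p for w, p in zip(info, points))
    return fused, fused_cov

# Hypothetical example (metres): a laser tracker with tight radial but looser
# angular uncertainty, and a theodolite network measuring the same point.
p1 = np.array([1.0002, 2.0001, 0.4999])
c1 = np.diag([0.00005, 0.0002, 0.0002]) ** 2
p2 = np.array([0.9998, 1.9996, 0.5004])
c2 = np.diag([0.0003, 0.0001, 0.0004]) ** 2

fused, cov = fuse_measurements([p1, p2], [c1, c2])
print(fused, np.sqrt(np.diag(cov)))  # fused point and its 1-sigma field
```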
159

Optimization Under Uncertainty and Total Predictive Uncertainty for a Tractor-Trailer Base-Drag Reduction Device

Freeman, Jacob Andrew 07 September 2012
One key outcome of this research is the design for a 3-D tractor-trailer base-drag reduction device that predicts a 41% reduction in wind-averaged drag coefficient at 57 mph (92 km/h) and that is relatively insensitive to uncertain wind speed and direction and to uncertain deflection angles due to mounting accuracy and static aeroelastic loading; the best commercial device of non-optimized design achieves a 12% reduction at 65 mph. Another important outcome is the process by which the optimized design is obtained. That process includes verification and validation of the flow solver, a less complex but much broader 2-D pathfinder study, and the culminating 3-D aerodynamic shape optimization under uncertainty (OUU) study.

To gain confidence in the accuracy and precision of a computational fluid dynamics (CFD) flow solver and its Reynolds-averaged Navier-Stokes (RANS) turbulence models, it is necessary to conduct code verification, solution verification, and model validation. These activities are accomplished using two commercial CFD solvers, Cobalt and RavenCFD, with four turbulence models: Spalart-Allmaras (S-A), S-A with rotation and curvature, Menter shear-stress transport (SST), and Wilcox 1998 k-ω. Model performance is evaluated for three low-subsonic 2-D applications: turbulent flat plate, planar jet, and NACA 0012 airfoil at α = 0°. The S-A turbulence model is selected for the 2-D OUU study.

In the 2-D study, a tractor-trailer base flap model is developed that includes six design variables with generous constraints; 400 design candidates are evaluated. The design optimization loop includes the effect of uncertain wind speed and direction, and post-processing addresses several other uncertain effects on drag prediction. The study compares the efficiency and accuracy of two optimization algorithms, an evolutionary algorithm (EA) and dividing rectangles (DIRECT), along with twelve surrogate models, six sampling methods, and surrogate-based global optimization (SBGO) methods. The DAKOTA optimization and uncertainty quantification framework is used to interface the RANS flow solver, grid generator, and optimization algorithm. The EA is determined to be more efficient in obtaining a design with significantly reduced drag (as opposed to more efficient in finding the true drag minimum), and total predictive uncertainty is estimated as ±11%. While the SBGO methods are more efficient than a traditional optimization algorithm, they are computationally inefficient due to their serial nature, as implemented in DAKOTA.

Because the S-A model does well in 2-D but not in 3-D under these conditions, the SST turbulence model is selected for the 3-D OUU study, which includes five design variables and evaluates a total of 130 design candidates. Again using the EA, the study propagates aleatory (wind speed and direction) and epistemic (perturbations in flap deflection angle) uncertainty within the optimization loop and post-processes several other uncertain effects. For the best 3-D design, total predictive uncertainty is +15/-42%, due largely to the use of a relatively coarse (six-million-cell) grid; that is, the best design's drag coefficient estimate is within +15% and -42% of the true value. Its improvement relative to the no-flaps baseline, however, is accurate to within 3-9% uncertainty. / Ph. D.
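A wind-averaged drag coefficient of the kind reported here can be estimated by propagating the aleatory wind inputs through the drag model. The sketch below uses an invented drag-versus-yaw curve and assumed wind distributions, not the study's CFD data or its averaging standard:

```python
import numpy as np

rng = np.random.default_rng(7)

def drag_coefficient(yaw_deg):
    """Hypothetical stand-in for the CFD-derived drag curve of the
    trailer-plus-flap geometry as a function of effective yaw angle."""
    return 0.55 + 0.004 * np.abs(yaw_deg) + 0.0002 * yaw_deg**2

def wind_averaged_cd(v_vehicle, n=20_000):
    # Aleatory inputs: wind speed (assumed Weibull) and direction (uniform)
    v_wind = rng.weibull(2.0, n) * 3.4           # scale gives ~3 m/s mean
    theta = rng.uniform(0.0, 2.0 * np.pi, n)
    # Effective yaw angle seen by the moving vehicle
    cross = v_wind * np.sin(theta)
    along = v_vehicle + v_wind * np.cos(theta)
    yaw = np.degrees(np.arctan2(cross, along))
    return drag_coefficient(yaw).mean()

print(f"Wind-averaged Cd at 25.5 m/s (57 mph): {wind_averaged_cd(25.5):.4f}")
```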
160

Encapsulation and abstraction for modeling and visualizing information uncertainty

Streit, Alexander January 2008
Information uncertainty is inherent in many real-world problems and adds a layer of complexity to modeling and visualization tasks. This often causes users to ignore uncertainty, especially when it comes to visualization, thereby discarding valuable knowledge. A coherent framework for the modeling and visualization of information uncertainty is needed to address this issue. In this work, we have identified four major barriers to the uptake of uncertainty modeling and visualization. Firstly, there are numerous uncertainty modeling techniques, and users are required to anticipate their uncertainty needs before building their data model. Secondly, parameters of uncertainty tend to be treated at the same level as variables, making it easy to introduce avoidable errors; this causes the uncertainty technique to dictate the structure of the data model. Thirdly, propagation of uncertainty information must be manually managed, which requires user expertise, is error prone, and can be tedious. Finally, uncertainty visualization techniques tend to be developed for particular uncertainty types, making them largely incompatible with other forms of uncertainty information; this narrows the choice of visualization techniques and results in a tendency for ad hoc uncertainty visualization. The aim of this thesis is to present an integrated information uncertainty modeling and visualization environment with the following main features: information and its uncertainty are encapsulated into atomic variables, the propagation of uncertainty is automated, and visual mappings are abstracted from the uncertainty information data type. Spreadsheets have previously been shown to be well suited as an approach to visualization. In this thesis, we devise a new paradigm extending the traditional spreadsheet to intrinsically support information uncertainty. Our approach is to design a framework that integrates uncertainty modeling techniques into a hierarchical order based on levels of detail. The uncertainty information is encapsulated and treated as a unit, allowing users to think of their data model in terms of the variables instead of the uncertainty details. The system is intrinsically aware of the encapsulated uncertainty and is therefore able to automatically select appropriate uncertainty propagation methods. A user-objectives-based approach to uncertainty visualization is developed to guide the visual mapping of abstracted uncertainty information. Two main abstractions of uncertainty information are explored for the purpose of visual mapping: the Unified Uncertainty Model and the Dual Uncertainty Model. The Unified Uncertainty Model provides a single view of uncertainty for visual mapping, whereas the Dual Uncertainty Model distinguishes between possibilistic and probabilistic views. Such abstractions provide a buffer between the visual mappings and the uncertainty type of the underlying data, enabling the user to change the uncertainty detail without causing the visualization to fail. Two main case studies are presented: the first covers exploratory and forecasting tasks in a business planning context, and the second investigates sensitivity analysis for financial decision support. Two minor case studies are also included: one investigating the relevancy visualization objective applied to business process specifications, and one exploring the extensibility of the system through General Purpose Graphics Processing Unit (GPGPU) use.
A quantitative analysis compares our approach to traditional analytical and numerical spreadsheet-based approaches, and two surveys were conducted to gain feedback from potential users. The significance of this work is that we reduce barriers to uncertainty modeling and visualization in three ways: users do not need a mathematical understanding of the uncertainty modeling technique to use it; uncertainty information is easily added, changed, or removed at any stage of the process; and uncertainty visualizations can be built independently of the uncertainty modeling technique.
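The encapsulation-plus-automatic-propagation idea can be sketched with operator overloading over shared Monte Carlo samples (a minimal illustration of the concept, not Streit's system; the spreadsheet-style formula and figures are invented):

```python
import numpy as np

N = 10_000  # shared Monte Carlo sample size

class Uncertain:
    """Encapsulates a value and its uncertainty as one atomic variable;
    arithmetic propagates the uncertainty automatically via samples."""
    def __init__(self, samples):
        self.samples = np.asarray(samples, dtype=float)

    @classmethod
    def normal(cls, mean, sd, rng=np.random.default_rng(0)):
        # The shared default rng yields independent draws on each call
        return cls(rng.normal(mean, sd, N))

    def __add__(self, other):
        return Uncertain(self.samples + self._s(other))

    def __mul__(self, other):
        return Uncertain(self.samples * self._s(other))

    @staticmethod
    def _s(x):
        return x.samples if isinstance(x, Uncertain) else x

    def __repr__(self):
        return f"{self.samples.mean():.2f} ± {self.samples.std():.2f}"

# Spreadsheet-style formula: profit = units * price - fixed cost
units = Uncertain.normal(1000, 150)
price = Uncertain.normal(2.5, 0.2)
profit = units * price + (-1800.0)
print(profit)  # uncertainty propagated without user intervention
```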
