561

Predicting the migration of CO₂ plume in saline aquifers using probabilistic history matching approaches

Bhowmik, Sayantan 20 August 2012 (has links)
During the operation of a geological carbon storage project, verifying that the CO₂ plume remains within the permitted zone is of particular interest both to regulators and to operators. However, the cost of many monitoring technologies, such as time-lapse seismic, limits their application. For adequate predictions of plume migration, proper representation of heterogeneous permeability fields is imperative. Previous work has shown that injection data (pressures, rates) from wells might provide a means of characterizing complex permeability fields in saline aquifers. Thus, because injection data are readily available, they might provide an inexpensive alternative for monitoring; combined with a flow model like the one developed in this work, these data could even be used for predicting plume migration. These predictions of plume migration pathways can then be compared to field observations, like time-lapse seismic or satellite measurements of surface deformation, to ensure the containment of the injected CO₂ within the storage area. In this work, two novel methods for creating heterogeneous permeability fields constrained by injection data are demonstrated. The first method is an implementation of a probabilistic history matching algorithm to create models of the aquifer for predicting the movement of the CO₂ plume. The geologic property of interest, for example hydraulic conductivity, is updated conditioned on geological information and injection pressures. The resultant aquifer model, which is geologically consistent, can be used to reliably predict the movement of the CO₂ plume in the subsurface. The second method is a model selection algorithm that refines an initial suite of subsurface models representing the prior uncertainty to create a posterior set of subsurface models that reflect injection performance consistent with that observed. Such posterior models can be used to represent uncertainty in the future migration of the CO₂ plume. The applicability of both methods is demonstrated using a field data set from central Algeria.
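To illustrate the flavor of the second approach, the sketch below weights and resamples a prior ensemble of permeability fields according to how well a stand-in forward model reproduces observed injection pressures. The `simulate_pressure` proxy, the rates, the "observations", and the noise level are all assumptions for demonstration; the thesis's flow model and probabilistic history matching algorithm are not reproduced here.

```python
import numpy as np

# Toy sketch of ensemble-based model selection: prior permeability models are
# weighted by how well a placeholder forward model reproduces observed
# injection pressures, then resampled into a posterior set. All numbers and
# the forward model are illustrative assumptions, not the thesis's field data.

rng = np.random.default_rng(0)

def simulate_pressure(perm_field, rates):
    """Placeholder forward model: injection pressure scales inversely with
    the mean permeability (a real flow simulator would go here)."""
    return rates / np.mean(perm_field)

rates = np.array([100.0, 120.0, 110.0])       # injection rates (toy units)
obs_pressure = np.array([0.90, 1.08, 0.99])   # "observed" pressures (toy units)
sigma = 0.05                                  # assumed measurement noise

# Prior ensemble of heterogeneous permeability fields (log-normal, toy scale)
prior_models = [np.exp(rng.normal(4.6, 0.5, size=(50, 50))) for _ in range(500)]

# Gaussian likelihood of the pressure mismatch for each prior model
weights = np.array([
    np.exp(-0.5 * np.sum(((simulate_pressure(m, rates) - obs_pressure) / sigma) ** 2))
    for m in prior_models
])
weights /= weights.sum()

# Resample to obtain posterior models for forecasting plume migration
idx = rng.choice(len(prior_models), size=100, p=weights, replace=True)
posterior_models = [prior_models[i] for i in idx]
print(f"Effective sample size: {1.0 / np.sum(weights ** 2):.1f}")
```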
562

Reliability methods in dynamic system analysis

Munoz, Brad Ernest 26 April 2013 (has links)
Standard techniques used to analyze a system's response with uncertain system parameters or inputs are generally importance sampling methods. Sampling methods require a large number of simulation runs before the system output statistics can be analyzed. As model fidelity increases, sampling techniques become computationally infeasible, and reliability methods have gained popularity as an analysis approach that requires significantly fewer simulation runs. Reliability analysis is an analytic technique that finds a particular point in the design space that can accurately be related to the probability of system failure. However, applications to dynamic systems have remained limited. In the following thesis a First Order Reliability Method (FORM) is used to determine the failure probability of a dynamic system due to system/input uncertainties. A pendulum-cart system is used as a case study to demonstrate the FORM on a dynamic system. Three failure modes are discussed, corresponding to the maximum pendulum angle, the maximum system velocity, and a combined requirement that neither the maximum pendulum angle nor the maximum system velocity is exceeded. An explicit formulation is generated from the implicit formulation using a response surface methodology, and the FORM is performed using the explicit estimate. Although the analysis converges with minimal simulation computations, attempts to verify the FORM results illuminate current limitations of the methodology. The results of this initial study conclude that, currently, sampling techniques are necessary to verify the FORM results, which restricts the potential applications of the FORM methodology. Suggested future work focuses on result verification without the use of importance sampling, which would allow reliability methods to have widespread applicability.
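As a point of reference for how a FORM calculation proceeds, the sketch below finds the most probable point (MPP) of an assumed analytic limit-state function in standard-normal space and converts the resulting reliability index into a first-order failure probability. The limit state `g` is illustrative only and is not the pendulum-cart model from the thesis.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Minimal FORM (Hasofer-Lind) sketch for an assumed analytic limit state;
# the thesis's pendulum-cart response surface would replace g() below.

def g(u):
    """Limit-state function in standard-normal space; failure when g <= 0.
    This quadratic form is purely illustrative."""
    return 3.0 - u[0] - 0.5 * u[1] ** 2

# Most probable point (MPP): the point on g(u) = 0 closest to the origin.
res = minimize(
    lambda u: np.dot(u, u),              # minimize squared distance to origin
    x0=np.array([1.0, 1.0]),
    constraints={"type": "eq", "fun": g},
)
beta = np.linalg.norm(res.x)             # reliability index
pf = norm.cdf(-beta)                     # first-order failure probability
print(f"MPP = {res.x}, beta = {beta:.3f}, Pf ~ {pf:.2e}")
```

For this toy limit state the MPP lands near (1, 2), giving a reliability index of about 2.24; the appeal of the method is that only the optimization iterations require evaluations of the (possibly expensive) limit state, rather than the thousands of runs a sampling study would need.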
563

Generation of high fidelity covariance data sets for the natural molybdenum isotopes including a series of molybdenum sensitive critical experiment designs

Van der Hoeven, Christopher Ainslie 15 October 2013 (has links)
Quantification of uncertainty in computational models of nuclear systems is required for assessing margins of safety for both design and operation of those systems. The largest source of uncertainty in computational models of nuclear systems derives from the nuclear cross section data used for modeling. There are two parts to cross section uncertainty data: the relative uncertainty in the cross section at a particular energy, and how that uncertainty is correlated with the uncertainty at all other energies. This cross section uncertainty and uncertainty correlation is compiled as covariance data. High fidelity covariance data exists for a few key isotopes; however, the covariance data available for many structural materials is considered low fidelity and is derived primarily from integral measurements, with little meaningful correlation between energy regions. Low fidelity covariance data is acceptable for materials to which the operating characteristics of the modeled nuclear system are insensitive. However, in some cases, nuclear systems can be sensitive to isotopes with only low fidelity covariance data. Such is the case for the new U(19.5%)-10Moly foil fuel form to be produced at the Y-12 National Security Complex for use in research and test reactors. This fuel is ten weight percent molybdenum, the isotopes of which have only low fidelity covariance data. Improvements to the molybdenum isotope covariance data would benefit the modeling of systems using the new fuel form. This dissertation provides a framework for deriving high fidelity molybdenum isotope covariance data from a set of elemental molybdenum experimental cross section results. Additionally, a series of critical experiments featuring the new Y-12 fuel form was designed to address deficiencies in the critical experiment library with respect to the molybdenum isotopes. Along with existing molybdenum-sensitive critical experiments, these proposed experiments were used as a basis to compare the performance of the new high fidelity molybdenum covariance data set with the existing low fidelity covariance data set using the nuclear modeling code SCALE. The use of the high fidelity covariance data was found to result in reduced overall bias, reduced bias due to the molybdenum isotopes, and improved goodness-of-fit of computational results to experimental results.
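For readers unfamiliar with covariance data, the sketch below assembles a group-wise covariance matrix from relative uncertainties and an assumed correlation model, then propagates it through assumed sensitivities with the standard sandwich rule. The group structure, 10% uncertainties, exponential correlation in lethargy, and sensitivity values are all illustrative assumptions, not the dissertation's evaluation procedure.

```python
import numpy as np

# Sketch of assembling a cross-section covariance matrix from group-wise
# relative uncertainties and an assumed correlation model, then propagating it
# to an integral quantity. The numbers and the exponential correlation are
# illustrative assumptions only.

energy_bounds = np.logspace(-5, 7, 45)      # 44-group structure (eV), toy values
sigma = np.full(44, 1.0)                    # group cross sections (barns), toy values
rel_unc = np.full(44, 0.10)                 # 10% relative uncertainty per group

# Correlation between groups decays with their separation in lethargy (assumed).
lethargy = np.log(energy_bounds[1:] / energy_bounds[0])
du = np.abs(lethargy[:, None] - lethargy[None, :])
corr = np.exp(-du / 3.0)                    # assumed correlation length

# Absolute covariance: cov_ij = corr_ij * (rel_i * sigma_i) * (rel_j * sigma_j)
std = rel_unc * sigma
cov = corr * np.outer(std, std)

# Sandwich rule with assumed sensitivities S (relative change in k per dσ/σ)
S = np.full(44, 0.01)
rel_var_k = S @ (corr * np.outer(rel_unc, rel_unc)) @ S
print(f"Relative uncertainty in k: {np.sqrt(rel_var_k):.4%}")
```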
564

How do disclosures of tax uncertainty to tax authorities affect reporting decisions? : evidence from Schedule UTP

Towery, Erin Marie 30 October 2013 (has links)
This study exploits the recently issued Uncertain Tax Position Statement (Schedule UTP) to examine the effect of mandatory disclosures of tax uncertainty to tax authorities on firms' reporting decisions. Schedule UTP requires firms to disclose to the Internal Revenue Service federal income tax positions that have been classified as 'uncertain' for financial reporting purposes. In showing how Schedule UTP disclosure requirements affect private and public reporting decisions, I provide insights into the usefulness of these disclosures. Using confidential tax return data and public financial statement data, I find that after the imposition of Schedule UTP reporting requirements, firms report lower financial reporting reserves for uncertain income tax positions but do not claim fewer income tax benefits on their federal tax returns. These findings suggest that some firms changed their financial reporting for uncertain tax positions to avoid Schedule UTP reporting requirements without changing the underlying positions. The effect is concentrated among firms with greater business complexity, whose business operations facilitate tax planning strategies that are more difficult for the IRS to identify. More broadly, my results imply that private disclosures of tax uncertainty can affect the informativeness of public disclosures of tax uncertainty.
565

“In this day and age, you just don’t know” : an examination of how people in romantic relationships use communication to manage financial uncertainty

Romo, Lynsey Kluever 11 November 2013 (has links)
While finances are known to be a source of uncertainty for couples (Knobloch, 2008), the specific sources of financial ambiguity and the ways in which they are appraised and negotiated have not been explored. Framed by the lens of Uncertainty Management Theory (UMT; Brashers, 2001), the current study used face-to-face, semi-structured interviews with 40 diverse participants in married or cohabiting relationships to provide new insight into uncertainty management. The investigation uncovered the types of uncertainty experienced by participants (economic, personal, family, communication, and chronic), the ways in which people managed uncertainty (reducing, maintaining, and adapting to it through a variety of practical strategies), and barriers to uncertainty management (information, time management, sociocultural, and communication obstacles), shedding light on why people are (not) successful in managing their finances. Consistent with the tenets of UMT (Brashers, 2001), communication (or lack thereof) was critical to the process of uncertainty management, particularly with respect to reducing and maintaining uncertainty. However, this study uniquely found that collective negotiation of financial uncertainty was particularly salient. In many ways, financial uncertainty management can be conceptualized as a joint enterprise. Just as individuals negotiate uncertainty by seeking information through computer-mediated communication (e.g., the internet), mass media (e.g., magazines), and external interpersonal sources (e.g., financial advisors), this investigation found that people frequently negotiated their uncertainty with their romantic partner through communal coping. This study provides important insight into the ways in which financial uncertainty can influence people's communication, behavior, and relationships and proposes extending the theory to take into account the role that dyads, culture, and individual factors can play in shaping uncertainty management.
566

Evaluating hydrodynamic uncertainty in oil spill modeling

Hou, Xianlong 02 December 2013 (has links)
A new method is presented to provide automatic sequencing of multiple hydrodynamic models and automated analysis of model forecast uncertainty. A Hydrodynamic and oil spill model Python (HyosPy) wrapper was developed to run the hydrodynamic model, link it with the oil spill model, and visualize the results. The HyosPy wrapper completes the following steps automatically: (1) downloads wind and tide data (nowcast, forecast, and historical); (2) converts the data to hydrodynamic model input; (3) initializes a sequence of hydrodynamic models starting at pre-defined intervals on a multi-processor workstation. Each model starts from the latest observed data, so that the multiple models provide a range of forecast hydrodynamics with different initial and boundary conditions reflecting different forecast horizons. As a simple testbed for integration strategies and visualization on Google Earth, a Runge-Kutta 4th order (RK4) particle transport tracer routine is developed for oil spill transport. The model forecast uncertainty is estimated from the differences between forecasts in the sequenced model runs and quantified using statistical measures. The integrated HyosPy system, with wind and tide forcing, is demonstrated by introducing a hypothetical oil spill in Corpus Christi Bay. The results show that challenges in operational oil spill modeling can be met by leveraging existing models and web-visualization methods to provide tools for emergency managers.
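The sketch below shows what an RK4 particle-advection step of this kind looks like for a cloud of spill particles in a two-dimensional velocity field. The rotating velocity field, release location, and time stepping are assumptions for illustration; HyosPy's actual tracer routine and hydrodynamic input are not reproduced here.

```python
import numpy as np

# Illustrative RK4 particle-advection step for spill transport in a 2D velocity
# field. `velocity(t, x)` is a stand-in for interpolated hydrodynamic output;
# the solid-body rotation used here is an assumption for demonstration only.

def velocity(t, x):
    """Toy velocity field (m/s): solid-body rotation about the origin."""
    return np.array([-x[1], x[0]]) * 1e-4

def rk4_step(t, x, dt):
    """Advance one particle position x by dt using classical RK4."""
    k1 = velocity(t, x)
    k2 = velocity(t + 0.5 * dt, x + 0.5 * dt * k1)
    k3 = velocity(t + 0.5 * dt, x + 0.5 * dt * k2)
    k4 = velocity(t + dt, x + dt * k3)
    return x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# Track a small cloud of particles released at a hypothetical spill site.
rng = np.random.default_rng(1)
particles = rng.normal(loc=[1000.0, 0.0], scale=50.0, size=(200, 2))
dt, t = 600.0, 0.0                      # 10-minute steps
for _ in range(144):                    # 24 hours of transport
    particles = np.array([rk4_step(t, p, dt) for p in particles])
    t += dt
print("Plume centroid after 24 h:", particles.mean(axis=0))
```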
567

Parametric uncertainty and sensitivity methods for reacting flows

Braman, Kalen Elvin 09 July 2014 (has links)
A Bayesian framework for quantification of uncertainties has been used to quantify the uncertainty introduced by chemistry models. This framework adopts a probabilistic view to describe the state of knowledge of the chemistry model parameters and simulation results. Given experimental data, this method updates the model parameters' values and uncertainties and propagates that parametric uncertainty into simulations. This study focuses on syngas, a mixture of H2 and CO in various ratios, which is the product of coal gasification. Coal gasification promises to reduce emissions by replacing the burning of coal with the less polluting burning of syngas. Despite their simplicity, syngas chemistry models nonetheless fail to accurately predict burning rates at high pressure. Three syngas models have been calibrated using laminar flame speed measurements. After calibration, the resulting uncertainty in the parameters is propagated forward into the simulation of laminar flame speeds. The model evidence is then used to compare candidate models. Sensitivity studies, in addition to Bayesian methods, can be used to assess chemistry models. Sensitivity studies provide a measure of how responsive target quantities of interest (QoIs) are to changes in the parameters. The adjoint equations have been derived for laminar, incompressible, variable-density reacting flow and applied to hydrogen flame simulations. From the adjoint solution, the sensitivity of the QoI to the chemistry model parameters has been calculated. The results identify the parameters to which flame tip temperature and NOx emission are most sensitive. Such information can be used in the development of new experiments by pointing out the critical chemistry model parameters. Finally, a broader goal for chemistry model development is set through the adjoint methodology. A new quantity, termed field sensitivity, is introduced to guide chemistry model development. Field sensitivity describes how information about perturbations in flowfields propagates to specified QoIs. The field sensitivity, shown mathematically to be equivalent to finding the adjoint of the primal governing equations, is obtained for laminar hydrogen flame simulations using three different chemistry models. Results show that even when the primal solution is sufficiently close for the three mechanisms, the field sensitivity can vary.
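The sketch below illustrates the calibration step in miniature: a random-walk Metropolis sampler updates a single rate-like parameter of a toy flame-speed surrogate against synthetic laminar flame speed "data". The power-law surrogate, the prior, and the noise level are assumptions and bear no relation to the syngas mechanisms studied in the dissertation.

```python
import numpy as np

# Minimal sketch of Bayesian calibration of one chemistry-like parameter
# against laminar flame-speed "data" via random-walk Metropolis. The surrogate
# model, prior, and noise level are illustrative assumptions.

rng = np.random.default_rng(2)

def flame_speed(pressure_atm, log_A):
    """Toy surrogate: flame speed (cm/s) vs. pressure and a rate parameter."""
    return np.exp(log_A) * pressure_atm ** -0.4

pressures = np.array([1.0, 5.0, 10.0, 20.0])
data = flame_speed(pressures, log_A=3.5) * (1 + 0.05 * rng.normal(size=4))
noise = 0.05 * data                                   # assumed 5% measurement error

def log_post(log_A):
    resid = (flame_speed(pressures, log_A) - data) / noise
    log_like = -0.5 * np.sum(resid ** 2)
    log_prior = -0.5 * ((log_A - 3.0) / 1.0) ** 2     # assumed N(3, 1) prior
    return log_like + log_prior

# Random-walk Metropolis over log_A
chain, current = [], 3.0
for _ in range(20000):
    proposal = current + 0.05 * rng.normal()
    if np.log(rng.uniform()) < log_post(proposal) - log_post(current):
        current = proposal
    chain.append(current)
posterior = np.array(chain[5000:])                    # discard burn-in
print(f"Posterior log_A: {posterior.mean():.3f} +/- {posterior.std():.3f}")
```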
568

Mining uncertain data with probabilistic guarantees

Sun, Liwen, 孙理文 January 2010 (has links)
Master of Philosophy, Computer Science
569

Voronoi-based nearest neighbor search for multi-dimensional uncertain databases

Zhang, Peiwu., 张培武. January 2012 (has links)
In Voronoi-based nearest neighbor search, the Voronoi cell of every point p in a database can be used to check whether p is the closest to some query point q. We extend the notion of Voronoi cells to support uncertain objects, whose attribute values are inexact. In particular, we propose the Possible Voronoi cell (or PV-cell). A PV-cell of a multi-dimensional uncertain object o is a region R such that for any point p ∈ R, o may be the nearest neighbor of p. If the PV-cells of all objects in a database S are known, they can be used to identify objects that have a chance of being the nearest neighbor of q. However, there is no efficient algorithm for computing an exact PV-cell. We hence study how to derive an axis-parallel hyper-rectangle (called the Uncertain Bounding Rectangle, or UBR) that tightly contains a PV-cell. We further develop the PV-index, a structure that stores UBRs, to evaluate probabilistic nearest neighbor queries over uncertain data. An advantage of the PV-index is that, upon updates to S, it can be incrementally updated. Extensive experiments on both synthetic and real datasets are carried out to validate the performance of the PV-index.
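The sketch below conveys the filtering role of UBRs: a query point only needs to consider objects whose UBR contains it, after which a refinement step would compute the actual qualification probabilities. The `UBR` class and the hard-coded rectangles are made up for illustration; deriving tight UBRs from the uncertain objects is the substance of the thesis and is not shown.

```python
import numpy as np

# Sketch of the filtering idea behind a PV-index: each uncertain object keeps
# an axis-parallel rectangle (UBR) that bounds its possible-Voronoi cell, and a
# query point q only needs to consider objects whose UBR contains q.

class UBR:
    def __init__(self, obj_id, lo, hi):
        self.obj_id = obj_id
        self.lo = np.asarray(lo, dtype=float)   # lower corner per dimension
        self.hi = np.asarray(hi, dtype=float)   # upper corner per dimension

    def contains(self, q):
        q = np.asarray(q, dtype=float)
        return bool(np.all(self.lo <= q) and np.all(q <= self.hi))

def candidate_nearest_neighbors(index, q):
    """Objects that may be the nearest neighbor of q (UBR-level filter only);
    a refinement step would then compute qualification probabilities."""
    return [r.obj_id for r in index if r.contains(q)]

# Toy 2D index with three uncertain objects
index = [
    UBR("o1", lo=(0, 0), hi=(5, 5)),
    UBR("o2", lo=(3, 3), hi=(9, 9)),
    UBR("o3", lo=(8, 0), hi=(12, 4)),
]
print(candidate_nearest_neighbors(index, q=(4, 4)))   # -> ['o1', 'o2']
```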
570

Issues related to site property variability and shear strength in site response analysis

Griffiths, Shawn Curtis 18 September 2015 (has links)
Nonlinear site response analyses are generally preferred over equivalent linear analyses for soft soil sites subjected to high-intensity input ground motions. However, both nonlinear and equivalent linear analyses often result in large induced shear strains (3-10%) at soft sites, and these large strains may generate unusual characteristics in the predicted surface ground motions. One source of the overestimated shear strains may be the unrealistically low shear strengths implied by commonly used modulus reduction curves. Therefore, modulus reduction and damping curves can be modified at shear strains greater than 0.1% to provide a more realistic soil model for site response. However, even after these modifications, nonlinear and equivalent linear site response analyses may still generate unusual surface acceleration time histories and Fourier amplitude spectra at soft soil sites subjected to high-intensity input ground motions. As part of this work, equivalent linear and nonlinear 1D site response analyses for the well-known Treasure Island site demonstrate the challenges associated with accurately modeling large shear strains, and the subsequent surface response, at soft soil sites. Accounting for the uncertainties associated with the shear wave velocity profile is an important part of a properly executed site response analysis. Surface wave data from Grenoble, France, and Mirandola, Italy, have been used to determine shear wave velocity (Vs) profiles through inversion. These inverted Vs profiles have in turn been used to determine boundary, median, and statistically based randomly generated profiles. The theoretical dispersion curves from the inversion analyses, as well as from the boundary, median, and randomly generated Vs profiles, are compared with the experimentally measured surface wave data. It is found that the median theoretical dispersion curve provides a satisfactory fit to the experimental data, but the boundary-type theoretical dispersion curves do not. Randomly generated profiles result in some theoretical dispersion curves that fit the experimental data and many that do not. Site response analyses revealed that greater variability in the response spectra and amplification factors resulted from the randomly generated Vs profiles than from the inversion or boundary Vs profiles.
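As an illustration of how Vs uncertainty can be represented, the sketch below generates randomized Vs profiles about a median profile using a lognormal perturbation with first-order autoregressive correlation between layers. The median profile, standard deviation, and correlation coefficient are assumed values for demonstration, not those adopted in this dissertation.

```python
import numpy as np

# Sketch of generating randomized shear wave velocity (Vs) profiles about a
# median profile, to represent Vs uncertainty in site response analyses. The
# lognormal standard deviation and interlayer correlation are assumed values.

rng = np.random.default_rng(3)

median_vs = np.array([150.0, 180.0, 220.0, 300.0, 450.0])  # m/s per layer (toy)
sigma_ln = 0.25          # assumed lognormal standard deviation of Vs
rho = 0.7                # assumed correlation between adjacent layers

def random_vs_profile():
    """One realization: correlated standard-normal deviates scale the median."""
    z = np.empty(len(median_vs))
    z[0] = rng.normal()
    for i in range(1, len(median_vs)):
        # First-order autoregressive correlation with the layer above
        z[i] = rho * z[i - 1] + np.sqrt(1 - rho ** 2) * rng.normal()
    return median_vs * np.exp(sigma_ln * z)

profiles = np.array([random_vs_profile() for _ in range(1000)])
print("Median of realizations (m/s):", np.round(np.median(profiles, axis=0), 1))
print("Logarithmic std per layer:   ", np.round(np.std(np.log(profiles), axis=0), 3))
```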
