361

Positional Uncertainty Analysis Using Data Uncertainty Engine: A Case Study on Agricultural Land Parcels

Urganci, Ilksen 01 December 2009
Most spatial data extraction and updating procedures require the digitization of geographical entities from satellite imagery. During digitization, errors are introduced by factors such as instrument deficiencies and user error. In this study, the positional uncertainty of geographical objects digitized from high-resolution Quickbird satellite imagery is assessed using the Data Uncertainty Engine (DUE), a software tool for assessing uncertainty in environmental data and generating realisations of uncertain data for use in uncertainty propagation analyses. A case study area in Kocaeli, Turkey, consisting mostly of agricultural land parcels, is selected in order to evaluate positional uncertainty and obtain uncertainty boundaries for manually digitized fields. A geostatistical evaluation of the discrepancies between reference data and digitized polygons is undertaken to analyse the auto- and cross-correlation structures of the errors. This analysis is used to estimate error-model parameters, which are then employed to define an uncertainty model within DUE. Error-model parameters obtained from training data are used to generate simulations for test data. Realisations derived via Monte Carlo simulation in DUE are evaluated to generate uncertainty boundaries for each object, guiding the user in further analyses with predefined information on the accuracy of the spatial entities. The study also aims to assess area uncertainties arising from the positional uncertainty of spatial entities. Across all correlation structures and object models, the weighted average positional error in this study lies between 2.66 and 2.91 metres. In the final uncertainty analysis, the deformable object model, which accounts for cross-correlation, produced the smallest uncertainty bandwidth.
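As a rough illustration of the Monte Carlo workflow described above (not the actual DUE implementation), the following Python sketch perturbs the vertices of a digitized parcel with spatially autocorrelated positional errors and summarizes the resulting area uncertainty. The parcel coordinates, the 2.8 m error level, and the exponential correlation model are hypothetical stand-ins rather than values taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical digitized parcel (vertex coordinates in metres, counter-clockwise).
parcel = np.array([[0.0, 0.0], [120.0, 5.0], [115.0, 80.0], [-5.0, 75.0]])

def polygon_area(xy):
    """Shoelace formula."""
    x, y = xy[:, 0], xy[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

def correlated_errors(xy, sigma=2.8, corr_range=50.0):
    """One realisation of positional error with exponential auto-correlation between
    vertices (a stand-in for the geostatistical error model fitted within DUE)."""
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    cov = sigma**2 * np.exp(-d / corr_range)
    ex = rng.multivariate_normal(np.zeros(len(xy)), cov)  # easting errors
    ey = rng.multivariate_normal(np.zeros(len(xy)), cov)  # northing errors
    return np.column_stack([ex, ey])

# Monte Carlo realisations of the digitized parcel and the resulting area uncertainty.
areas = np.array([polygon_area(parcel + correlated_errors(parcel)) for _ in range(2000)])
print(f"nominal area: {polygon_area(parcel):.0f} m^2, "
      f"95% interval under the assumed error model: "
      f"[{np.percentile(areas, 2.5):.0f}, {np.percentile(areas, 97.5):.0f}] m^2")
```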
362

Operational, supply-side uncertainty in transportation networks: causes, effects, and mitigation strategies

Boyles, Stephen David 15 October 2009
This dissertation is concerned with travel time uncertainty in transportation networks due to ephemeral phenomena such as incidents or poor weather. Such events play a major role in nonrecurring congestion, which is estimated to comprise between one-third and one-half of all delay on freeways. Although past research has considered many individual aspects of this problem, this dissertation is unique in taking a comprehensive approach, beginning with a study of its causes, moving to a discussion of its effects on traveler behavior, and then demonstrating how these models can be applied to mitigate the effects of this uncertainty. In particular, two distinctive effects of uncertainty are incorporated into all aspects of these models: nonlinear traveler behavior, encompassing risk aversion, schedule delay, on-time arrival, and other user objectives that explicitly recognize travel time uncertainty; and information and adaptive routing, where travelers can adjust their routes through the network as they acquire information on its condition. In order to accurately represent uncertain events in a mathematical model, some quantitative description of these events and their impacts must be available. On freeways, a large amount of travel data is collected through intelligent transportation systems (ITS), although coverage is far from universal, and very little data is collected on arterial streets. This dissertation develops a statistical procedure for estimating probability distributions on speed, capacity, and other operational metrics by applying regression to locations where such data is available. On arterials, queueing theory is used to develop novel expressions for expected delay conditional on the signal indication. The effects of this uncertainty are considered next, both at the individual (route choice) and collective (equilibrium) levels. For individuals, the optimal strategy is no longer a path, but an adaptive policy which allows for flexible re-routing as information is acquired. Dynamic programming provides an efficient solution to this problem. Issues related to cycling in optimal policies are examined in some depth. While primarily a technical concern, the presence of cycling can be discomforting and needs to be addressed. When considering collective behavior, the simultaneous choices of many self-optimizing users (who need not share the same behavioral objective) can be expressed as the solution to a variational inequality problem, leading to existence and uniqueness results under certain regularity conditions. An improved policy loading algorithm is also provided for the case of linear traveler behavior. Finally, three network improvement strategies are considered: locating information-providing devices; adaptive congestion pricing; and network design. Each of these demonstrates how the routing and equilibrium models can be applied, using small networks as testbed locations. In particular, the information provision and adaptive congestion pricing strategies are extremely difficult to represent without an adaptive equilibrium model such as the one provided in this dissertation.
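The adaptive-policy idea can be made concrete with a toy example. The sketch below (not an algorithm from the dissertation) computes, by backward induction over a small invented network, the expected travel time of a traveler who observes the realised times of the outgoing links at each node and then picks the link minimising observed time plus expected cost-to-go.

```python
import itertools
import math

dest = 3
# Outgoing links: node -> list of (successor, possible travel times, probabilities).
links = {
    0: [(1, [5, 15], [0.8, 0.2]), (2, [8], [1.0])],
    1: [(3, [4, 20], [0.7, 0.3])],
    2: [(3, [6, 10], [0.5, 0.5])],
    3: [],
}

# Backward induction over a topological order (3, 2, 1, 0 for this acyclic network).
V = {dest: 0.0}  # expected cost-to-go under the optimal adaptive policy
for node in [2, 1, 0]:
    outgoing = links[node]
    supports = [list(zip(t, p)) for _, t, p in outgoing]
    expected = 0.0
    # Enumerate joint realisations of the outgoing link times, revealed on arrival at the node.
    for combo in itertools.product(*supports):
        prob = math.prod(p for _, p in combo)
        best = min(t + V[succ] for (succ, _, _), (t, _) in zip(outgoing, combo))
        expected += prob * best
    V[node] = expected

print({node: round(v, 2) for node, v in V.items()})  # V[0] is the origin's expected travel time
```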
363

Reservoir system management under uncertainty

Kistenmacher, Martin 13 May 2012
Reservoir systems are subject to several uncertainties that are the result of imperfect knowledge about system behavior and inputs. A major source of uncertainty arises from the inability to predict future inflows. Fortunately, it is often possible to generate probabilistic forecasts of inflow volumes in the form of probability density functions or ensembles. These inflow forecasts can be coupled with stochastic management models to determine reservoir release policies and provide stakeholders with meaningful information of upcoming system responses such as reservoir levels, releases, flood damage risks, hydropower production, water supply withdrawals, water quality conditions, navigation opportunities, and environmental flows, among others. This information on anticipated system responses is also expressed in the form of forecasts that must reliably represent the actual system behavior when it eventually occurs. The first part of this study presents an assessment methodology that can be used to determine the consistency of ensemble forecasts through the use of relative frequency histograms and minimum spanning trees (MST). This methodology is then used to assess a management model's ability to produce reliable ensemble forecasts. It was found that neglecting to account for hydrologic state variables and improperly modeling the finite management horizon decrease ensemble consistency. Several extensions to the existing management model are also developed and evaluated. The second portion of this study involves the management of the uncertainties in reservoir systems. Traditional management models only find management policies that optimize the expected values of system benefits or costs, thereby not allowing operators and stakeholders to explicitly explore issues related to uncertainty and risk management. A technique that can be used to derive management policies that produce desired probabilistic distributions of reservoir system outputs reflecting stakeholder preferences is developed. This technique can be embedded in a user-interactive framework that can be employed to evaluate the trade-offs and build consensus in multi-objective and multi-stakeholder systems. The methods developed in this dissertation are illustrated in case studies of real reservoir systems, including a seven-reservoir, multi-objective system in California's Central Valley.
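The consistency assessment combines relative frequency (rank) histograms with minimum spanning trees; the sketch below illustrates only the rank-histogram part, on synthetic data. If the verifying observation is statistically indistinguishable from the ensemble members, its rank within the ensemble should be uniform over many forecasts, which can be checked with a simple chi-square statistic.

```python
import numpy as np

rng = np.random.default_rng(1)
n_forecasts, n_members = 500, 20

ranks = []
for _ in range(n_forecasts):
    ensemble = rng.normal(loc=100.0, scale=15.0, size=n_members)  # forecast ensemble members
    obs = rng.normal(loc=100.0, scale=15.0)                       # verifying observation
    ranks.append(int(np.sum(ensemble < obs)))                     # rank of the observation

hist = np.bincount(ranks, minlength=n_members + 1)
expected = n_forecasts / (n_members + 1)
chi2 = np.sum((hist - expected) ** 2 / expected)  # a flat histogram gives a small statistic
print("rank counts:", hist)
print("chi-square versus a uniform histogram:", round(float(chi2), 1))
```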
364

Fearing the Uncertain: A Causal Exploration of Self-Esteem, Self-Uncertainty, and Mortality Salience

Hohman, Zachary P. 01 January 2012
Social identity theory (Tajfel & Turner, 1979) is one of the most influential social psychological theories of group behavior and intergroup relations. Early social identity research focused on many different group processes; however, the motivation behind group identification was not fully explored. Researchers have proposed a variety of accounts for why people join and identify with groups. This dissertation unravels the relationship between, on the one hand, mortality salience, self-related uncertainty, and self-esteem, and, on the other, group identification and ingroup defense. The general hypothesis derived from uncertainty-identity theory (Hogg, 2010) is that uncertainty, and not fear of death or pursuit of self-esteem, motivates people to identify with and defend their groups, and that identification mediates the relationship between uncertainty and defense of the group. Experiment 1 (N = 112) tested the effects of uncertainty and self-esteem on defense of the ingroup, with an additional test of whether identification with the group mediates the relationship between uncertainty and ingroup defense. Results showed that uncertainty, and not self-esteem, motivates people to identify with and defend their group, and that group defense is mediated by identification. Experiment 2 (N = 112) provided a replication of the typical terror management theory (TMT) study, which suggests that self-esteem buffers the effects of mortality salience on ingroup defense, with an additional test of whether identification mediates the relationship between mortality salience and defense of one's group. As predicted, mortality salience increased identification and defense of the group only when self-esteem was not enhanced; in addition, the interactive effect of mortality salience and self-esteem on defense was mediated by identification. Experiment 3 (N = 294) combined Experiments 1 and 2 and tested the hypothesis that uncertainty moderates the effects of self-esteem and mortality salience on group identification and ingroup defense. Exactly as predicted, the typical TMT results emerged only under high uncertainty. Results across these three experiments demonstrate that self-uncertainty plays a significant role in reactions to mortality salience, and support uncertainty-identity theory's analysis of the role of self-uncertainty in ideological conviction and group behavior.
365

Nuclear data uncertainty quantification and data assimilation for a lead-cooled fast reactor: Using integral experiments for improved accuracy

Alhassan, Erwin January 2015
For the successful deployment of advanced nuclear systems and the optimization of current reactor designs, high-quality nuclear data are required. Before nuclear data can be used in applications they must first be evaluated, tested and validated against a set of integral experiments, and then converted into formats usable for applications. In the past, the evaluation process was usually based on differential experimental data complemented with nuclear model calculations. This trend is fast changing due to the increase in computational power and tremendous improvements in nuclear reaction models over the last decade. Since these models have uncertain inputs, they are normally calibrated using experimental data. However, these experiments are themselves not exact. Therefore, the calculated quantities of model codes, such as cross sections and angular distributions, contain uncertainties. Since nuclear data are used in reactor transport codes as input for simulations, the output of transport codes contains uncertainties due to these data as well. Quantifying these uncertainties is important for setting safety margins, for providing confidence in the interpretation of results, and for deciding where additional efforts are needed to reduce these uncertainties. Also, regulatory bodies are now moving away from conservative evaluations to best-estimate calculations that are accompanied by uncertainty evaluations. In this work, the Total Monte Carlo (TMC) method was applied to study the impact of nuclear data uncertainties, from basic physics to macroscopic reactor parameters, for the European Lead Cooled Training Reactor (ELECTRA). As part of the work, nuclear data uncertainties of the actinides in the fuel, the lead isotopes in the coolant, and some structural materials have been investigated. In the case of the lead coolant it was observed that the uncertainties in keff and the coolant void worth (except in the case of 204Pb) were large, with the most significant contribution coming from 208Pb. New 208Pb and 206Pb random nuclear data libraries with realistic central values have been produced as part of this work. In addition, a correlation-based sensitivity method was used to determine correlations between reactor parameters and cross sections for different isotopes and energy groups. Furthermore, an accept/reject method and a method of assigning file weights based on the likelihood function are proposed for uncertainty reduction using criticality benchmark experiments within the TMC method. It was observed that a significant reduction in nuclear data uncertainty was obtained for some isotopes for ELECTRA after incorporating integral benchmark information. As a further objective of this thesis, a method for selecting benchmarks for code validation for specific reactor applications was developed and applied to the ELECTRA reactor. Finally, a method for combining differential experiments and integral benchmark data for nuclear data adjustment is proposed and applied to the adjustment of neutron-induced 208Pb nuclear data in the fast energy region.
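The likelihood-based file-weighting idea can be sketched with made-up numbers: each random nuclear-data file gives a calculated keff for a criticality benchmark, each file is weighted by the likelihood of the benchmark value given that calculation, and the prior and posterior spreads of a correlated reactor parameter then show the uncertainty reduction. All quantities below are invented for illustration and are not results from the thesis.

```python
import numpy as np

rng = np.random.default_rng(2)
n_files = 1000

# Hypothetical TMC output per random file: a nuclear-data "bias", the benchmark keff it
# produces, and a correlated reactor parameter (a stand-in for the ELECTRA void worth).
bias = rng.normal(0.0, 300e-5, n_files)
keff_calc = 1.0000 + bias + rng.normal(0.0, 50e-5, n_files)
void_worth = 2.0 + 400.0 * bias

keff_bench, sigma_bench = 1.0000, 100e-5  # benchmark value and its (assumed) uncertainty

# Likelihood-based file weights (Gaussian likelihood, normalised over files).
w = np.exp(-0.5 * ((keff_calc - keff_bench) / sigma_bench) ** 2)
w /= w.sum()

prior_std = void_worth.std()
post_mean = np.sum(w * void_worth)
post_std = np.sqrt(np.sum(w * (void_worth - post_mean) ** 2))
print(f"void-worth spread: prior {prior_std:.3f} -> posterior {post_std:.3f}")
```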
366

Visualisation interactive de grands volumes de données incertaines : pour une approche perceptive / Interactive scientific visualization of high-volume, uncertain data : for a perceptual approach

Coninx, Alexandre 22 May 2012
Current scientific and engineering work makes increasingly frequent use of numerical simulation techniques to study complex physical phenomena. Visualizing the results of these simulations on their geometric structure is often necessary in order to understand and analyze the simulated system. Such a visualization requires specific software tools to achieve a comprehensive and accurate depiction of the information present in the dataset, including the available information about dataset quality and data uncertainty. The goal of this thesis is to improve visualization techniques for scalar data fields so as to integrate uncertainty information into the result. Our work follows a perceptual approach, using knowledge and experimental methods from visual perception research to propose, study and finally implement new visualization techniques. A review of the state of the art on uncertainty visualization leads us to consider animated procedural noise as a visual primitive for representing uncertainty. A psychophysics experiment allows us to evaluate contrast sensitivity thresholds for luminance stimuli generated with Perlin's noise algorithm, and thereby to determine under which conditions such noise patterns can be perceived. These results are validated and extended using a computational model of contrast sensitivity, which we reimplemented and ran on our stimuli. The resulting information allows us to put forward a technique for visualizing uncertain scalar data that uses animated procedural noise together with color maps, and that is intuitive and efficient even for datasets with a complex three-dimensional geometry. We apply this technique to two industrial datasets and demonstrate it to expert users. Their feedback confirms the usability and efficiency of our technique, suggests several improvements, and points to directions for future work.
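A rough sketch of the final visual mapping is given below: a colour-mapped scalar field is overlaid with procedural noise whose contrast is modulated by the local uncertainty. A cheap smoothed random field stands in for Perlin noise, a single static frame stands in for the animation, and both the data and the uncertainty fields are synthetic; a real implementation would apply a colour map to the composite for display.

```python
import numpy as np

rng = np.random.default_rng(4)
ny, nx = 128, 128
y, x = np.mgrid[0:ny, 0:nx]

data = np.sin(x / 12.0) + np.cos(y / 17.0)            # synthetic scalar field to visualise
uncertainty = np.clip((x + y) / (nx + ny), 0.0, 1.0)  # synthetic uncertainty, growing towards one corner

def smooth_noise(shape, passes=8):
    """Cheap band-limited noise (a stand-in for Perlin noise) via repeated box blurring."""
    n = rng.standard_normal(shape)
    for _ in range(passes):
        n = 0.25 * (np.roll(n, 1, 0) + np.roll(n, -1, 0) + np.roll(n, 1, 1) + np.roll(n, -1, 1))
    return n / np.abs(n).max()

# Modulate the noise contrast by the uncertainty and add it to the normalised data layer.
noise = smooth_noise((ny, nx))
composite = (data - data.min()) / np.ptp(data) + 0.3 * uncertainty * noise
print(f"composite value range: {composite.min():.2f} to {composite.max():.2f}")
```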
367

Models and Computational Strategies for Multistage Stochastic Programming under Endogenous and Exogenous Uncertainties

Apap, Robert M. 01 July 2017
This dissertation addresses the modeling and solution of mixed-integer linear multistage stochastic programming problems involving both endogenous and exogenous uncertain parameters. We propose a composite scenario tree that captures both types of uncertainty, and we exploit its unique structure to derive new theoretical properties that can drastically reduce the number of non-anticipativity constraints (NACs). Since the reduced model is often still intractable, we discuss two special solution approaches. The first is a sequential scenario decomposition heuristic in which we sequentially solve endogenous MILP subproblems to determine the binary investment decisions, fix these decisions to satisfy the first-period and exogenous NACs, and then solve the resulting model to obtain a feasible solution. The second approach is Lagrangean decomposition. We present numerical results for a process network planning problem and an oilfield development planning problem. The results clearly demonstrate the efficiency of the special solution methods over solving the reduced model directly. To further generalize this work, we also propose a graph-theory algorithm for non-anticipativity constraint reduction in problems with arbitrary scenario sets. Finally, in a break from the rest of the thesis, we present the basics of stochastic programming for non-expert users.
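The non-anticipativity idea for the exogenous part can be illustrated with a toy scenario set: two scenarios must share the stage-t decision whenever their uncertain outcomes agree in everything revealed before stage t. The sketch below simply enumerates the full pairwise constraint set for an invented problem; the composite scenario tree, the endogenous (decision-dependent) case, and the reduction properties in the dissertation go well beyond this.

```python
from itertools import combinations, product

outcomes = {"demand": ["low", "high"], "price": ["low", "mid", "high"]}
scenarios = list(product(*outcomes.values()))  # 6 exogenous scenarios (demand, price)
stages = 3                                     # decide, observe demand, decide, observe price, decide

def revealed(scenario, t):
    """Outcomes observable before the stage-t decision: demand after stage 1, price after stage 2."""
    return scenario[:max(t - 1, 0)]

nacs = []
for t in range(1, stages + 1):
    for a, b in combinations(range(len(scenarios)), 2):
        if revealed(scenarios[a], t) == revealed(scenarios[b], t):
            nacs.append((t, a, b))  # scenarios a and b must take the same stage-t decision

print(f"{len(scenarios)} scenarios, {len(nacs)} pairwise non-anticipativity constraints")
```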
368

The Role of Constitutive Model in Traumatic Brain Injury Prediction

Kacker, Shubhra 28 October 2019
No description available.
369

Ökologische Bewertung von zentralen und dezentralen Abwasserentsorgungssystemen / Ecological assessment of central and decentralized wastewater disposal systems

Schubert, Rebecca January 2011
Central sewage systems, given their long working life and dependence on user numbers, are inflexible with respect to changing conditions such as demographic change, climate change, and declining drinking-water consumption. Decentralized small wastewater treatment plants are therefore often discussed as an alternative, especially for newly developed, remotely located plots, where connecting to the central treatment plant may require an expensive new sewer network. So far, the choice between the two options has been supported mainly by economic instruments, while ecological assessment methods have not yet been used as a decision tool. This thesis carries out a life cycle assessment (LCA) as an example of such a tool; an LCA can only support decision making if its results are robust and reliable, so possible uncertainties in the assessment must be addressed directly. The central research question is therefore the ecological assessment of decentralized wastewater treatment systems compared with a central system, as a function of the length of the sewer network and the number of connected inhabitants, with particular attention to data uncertainty in the LCA. The LCA follows DIN EN ISO 14040 and DIN EN ISO 14044. Because the literature review showed that very little data on small wastewater treatment plants is available for an LCA, a questionnaire was sent to manufacturers of such plants. Based on the collected data, an LCA is carried out for a sequencing batch reactor (SBR) plant and a rotating biological contactor plant, with scenario analysis used to capture the spread of the data. The data for the central plant come from the EcoInvent database for wastewater treatment (DOKA, G. (2007)), from which further scenarios are constructed. The functional unit is four population equivalents. After compiling the life cycle inventory, the "CML 2 baseline 2000" method implemented in SimaPro 7.1 is used for the impact assessment. A break-even analysis examines the dependence on the length of the sewer network and the number of connected inhabitants, and data uncertainty is incorporated by means of a sensitivity analysis and a Monte Carlo analysis. The analyses show that installing a small wastewater treatment plant becomes more worthwhile the farther the municipal treatment plant is from the household and the smaller the number of connected inhabitants; an SBR plant is preferable to a rotating biological contactor plant. The methods used for handling uncertainty show that LCAs are suitable as a decision instrument, but methods for avoiding or integrating uncertainty need to be applied much more widely in LCAs of wastewater systems.
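A stylised break-even calculation in the spirit of the comparison above is sketched below: the impact score of the central option grows with the required sewer length, while the score of the small treatment plant is roughly independent of it. All impact coefficients are invented for illustration; in the thesis they follow from the inventory data and the "CML 2 baseline 2000" characterisation.

```python
import numpy as np

inhabitants = 4  # functional unit: four population equivalents

def central_impact(sewer_length_m, per_metre=0.8, per_inhabitant=12.0):
    """Hypothetical impact score: sewer construction plus treatment at the municipal plant."""
    return per_metre * sewer_length_m + per_inhabitant * inhabitants

def decentral_impact(per_inhabitant=55.0):
    """Hypothetical impact score of a small (e.g. SBR) plant serving the same household."""
    return per_inhabitant * inhabitants

lengths = np.linspace(0.0, 500.0, 501)
difference = central_impact(lengths) - decentral_impact()
break_even = lengths[np.argmax(difference > 0)] if np.any(difference > 0) else None
print(f"break-even sewer length under these assumed coefficients: {break_even} m")
```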
370

Efficient Uncertainty Quantification with High Dimensionality

Jianhua Yin (12456819) 25 April 2022
Uncertainty exists everywhere in scientific and engineering applications. To avoid potential risk, it is critical to understand the impact of uncertainty on a system by performing uncertainty quantification (UQ) and reliability analysis (RA). However, the computational cost may be unaffordable with current UQ methods when the input is high-dimensional, and current UQ methods are not applicable when numerical data and image data coexist.

To decrease the computational cost to an affordable level and enable UQ with special high-dimensional data (e.g., images), this dissertation develops three UQ methodologies for high-dimensional input spaces. The first two methods focus on high-dimensional numerical input. The core strategy of Methodology 1 is to fix the unimportant variables at their first-step most probable point (MPP) so that the dimensionality is reduced; an accurate RA method is then used in the reduced space, and the final reliability is obtained by accounting for the contributions of both the important and the unimportant variables. Methodology 2 addresses the case where the dimensionality cannot be reduced because most of the variables are important or the variables contribute almost equally to the system. It develops an efficient surrogate modeling method for high-dimensional UQ using generalized sliced inverse regression (GSIR), Gaussian process (GP)-based active learning, and importance sampling: a cost-efficient GP model is built in the latent space after dimension reduction by GSIR, and the failure boundary is identified through active learning that adds optimal training points iteratively. In Methodology 3, a convolutional neural network (CNN)-based surrogate model (CNN-GP) is constructed for mixed numerical and image data. The numerical data are first converted into images, which are merged with the existing image data, and the merged images are fed to the CNN for training. The latent variables of the CNN model are then used to integrate the CNN with a GP, which quantifies the model error as epistemic uncertainty. Both epistemic and aleatory uncertainty are considered in uncertainty propagation.

The simulation results indicate that the first two methodologies not only improve efficiency but also maintain adequate accuracy for problems with high-dimensional numerical input. GSIR with active learning can handle situations in which the dimensionality cannot be reduced because most of the variables are important or contribute almost equally; the two methodologies can also be combined as a two-stage dimension reduction for high-dimensional numerical input. The third method, CNN-GP, can deal with special high-dimensional input, namely mixed numerical and image data, with satisfactory regression accuracy while providing an estimate of the model error, and uncertainty propagation considering both epistemic and aleatory uncertainty gives better accuracy. The proposed methods could potentially be applied to engineering design and decision making.
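A caricature of the dimension-reduction idea behind Methodology 1 is sketched below: when the limit state is dominated by a few variables, fixing the remaining ones at representative values (plain means in this toy, rather than a first-step MPP) barely changes the failure-probability estimate while shrinking the sampling space. The limit-state function and the 20-dimensional standard-normal input are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n_dim, n_samples = 20, 200_000

def limit_state(x):
    """Failure when g < 0; only x[0] and x[1] matter much, the rest contribute weakly."""
    return 3.0 - x[:, 0] - 0.8 * x[:, 1] - 0.01 * x[:, 2:].sum(axis=1)

# Full-dimensional Monte Carlo.
x_full = rng.standard_normal((n_samples, n_dim))
pf_full = np.mean(limit_state(x_full) < 0)

# Reduced space: sample only the two important variables and fix the others at their
# means (standing in for the first-step MPP values of the unimportant variables).
x_reduced = np.zeros((n_samples, n_dim))
x_reduced[:, :2] = rng.standard_normal((n_samples, 2))
pf_reduced = np.mean(limit_state(x_reduced) < 0)

print(f"failure probability: full space {pf_full:.4f}, reduced space {pf_reduced:.4f}")
```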
