About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
1

Nuclear data uncertainty quantification and data assimilation for a lead-cooled fast reactor : Using integral experiments for improved accuracy

Alhassan, Erwin, January 2015
For the successful deployment of advanced nuclear systems and the optimization of current reactor designs, high-quality nuclear data are required. Before nuclear data can be used in applications, they must first be evaluated, tested and validated against a set of integral experiments, and then converted into formats usable for applications. In the past, the evaluation process was usually based on differential experimental data complemented with nuclear model calculations. This trend is changing fast due to the increase in computational power and tremendous improvements in nuclear reaction models over the last decade. Since these models have uncertain inputs, they are normally calibrated using experimental data. However, these experiments are themselves not exact, so calculated quantities such as cross sections and angular distributions contain uncertainties. Since nuclear data are used as input to reactor transport codes, the output of these codes contains uncertainties due to the data as well. Quantifying these uncertainties is important for setting safety margins, for providing confidence in the interpretation of results, and for deciding where additional effort is needed to reduce them. Also, regulatory bodies are now moving away from conservative evaluations toward best-estimate calculations accompanied by uncertainty evaluations. In this work, the Total Monte Carlo (TMC) method was applied to study the impact of nuclear data uncertainties, from basic physics to macroscopic reactor parameters, for the European Lead Cooled Training Reactor (ELECTRA). As part of the work, nuclear data uncertainties of the actinides in the fuel, the lead isotopes in the coolant, and some structural materials were investigated. In the case of the lead coolant, the uncertainties in keff and the coolant void worth (except for 204Pb) were found to be large, with the most significant contribution coming from 208Pb. New 208Pb and 206Pb random nuclear data libraries with realistic central values were produced as part of this work. A correlation-based sensitivity method was also used to determine correlations between reactor parameters and cross sections for different isotopes and energy groups. Furthermore, an accept/reject method and a method of assigning file weights based on the likelihood function are proposed for uncertainty reduction using criticality benchmark experiments within the TMC method. The study showed that incorporating integral benchmark information yields a significant reduction in nuclear data uncertainty for some isotopes for ELECTRA. As a further objective of this thesis, a method for selecting benchmarks for code validation for specific reactor applications was developed and applied to the ELECTRA reactor. Finally, a method for combining differential experiments and integral benchmark data for nuclear data adjustment is proposed and applied to the adjustment of neutron-induced 208Pb nuclear data in the fast energy region.
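As a rough illustration of the file-weighting idea this abstract describes, the sketch below combines hypothetical TMC output (one keff value per random nuclear data file) with a single criticality benchmark through a Gaussian likelihood. All numbers, and the Gaussian likelihood form itself, are illustrative assumptions, not values or choices taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical TMC output: one k_eff per random nuclear data file.
# In a real study, each value comes from a full transport calculation.
keff_files = rng.normal(1.000, 0.004, size=500)   # 500 random files (illustrative)

# Hypothetical integral benchmark: measured k_eff and its experimental sigma.
k_benchmark, sigma_benchmark = 1.0002, 0.0010      # assumed values

# File weights from a Gaussian likelihood against the benchmark
# (the likelihood-based weighting is the thesis idea; the Gaussian form is one choice).
chi2 = ((keff_files - k_benchmark) / sigma_benchmark) ** 2
weights = np.exp(-0.5 * chi2)
weights /= weights.sum()

# Prior (unweighted) vs. posterior (weighted) nuclear data uncertainty on k_eff.
prior_std = keff_files.std(ddof=1)
post_mean = np.sum(weights * keff_files)
post_std = np.sqrt(np.sum(weights * (keff_files - post_mean) ** 2))

print(f"prior uncertainty:     {prior_std:.5f}")
print(f"posterior uncertainty: {post_std:.5f}")
```

Files whose keff lies far from the benchmark receive little weight, so the weighted spread shrinks relative to the unweighted one, which is the uncertainty-reduction effect the abstract reports.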
2

A Statistical Framework for Distinguishing Between Aleatory and Epistemic Uncertainties in the Best-Estimate Plus Uncertainty (BEPU) Nuclear Safety Analyses

Pun-Quach, Dan
In 1988, the US Nuclear Regulatory Commission approved an amendment that allowed the use of best-estimate methods. This led to increased development and application of Best Estimate Plus Uncertainty (BEPU) safety analyses. However, a greater burden was placed on the licensee to justify all uncertainty estimates. A review of the current state of BEPU methods indicates a number of significant criticisms, which limit the BEPU methods from reaching their full potential as a comprehensive licensing basis. The most significant criticism relates to the lack of a formal framework for distinguishing between aleatory and epistemic uncertainties. This has led to a prevalent belief that such a separation of uncertainties is made for convenience rather than out of necessity. In this thesis, we address these concerns by developing a statistically rigorous framework to characterize the different uncertainty types. This framework is grounded in the philosophical concepts of knowledge. Considering the Plato problem, we explore the use of probability as a means to gain knowledge, which allows us to relate the inherent distinctions in knowledge to the different uncertainty types for any complex physical system. The framework is demonstrated using nuclear analysis problems, and we show through the use of structural models that the separation of these uncertainties leads to more accurate tolerance limits relative to existing BEPU methods. In existing BEPU methods, where such a distinction is not applied, the total uncertainty is essentially treated as aleatory uncertainty, so the resulting estimated percentile is much larger than the actual (true) percentile of the system's response. Our results support the premise that the separation of these two distinct uncertainty types is necessary and leads to more accurate estimates of the reactor safety margins.
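The aleatory/epistemic separation argued for above can be illustrated with a toy two-loop Monte Carlo: the outer loop samples an epistemic model parameter, the inner loop samples aleatory variability, and the separated tolerance bound is compared with a lumped treatment that pools all uncertainty as if it were aleatory. The response model and every distribution below are made-up placeholders, not anything from the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

def response(theta, x):
    # Toy system response: epistemic model parameter theta, aleatory input x.
    return theta * x + 0.1 * x**2

N_EPISTEMIC, N_ALEATORY = 200, 1000

# Epistemic uncertainty: imperfect knowledge of the model parameter (assumed prior).
thetas = rng.normal(2.0, 0.1, size=N_EPISTEMIC)

# Separated (two-loop) treatment: one aleatory 95th percentile per epistemic
# sample, then a 95% confidence bound over the epistemic spread of percentiles.
percentiles = []
for theta in thetas:
    x = rng.normal(10.0, 1.0, size=N_ALEATORY)   # aleatory variability
    percentiles.append(np.percentile(response(theta, x), 95))
separated_95_95 = np.percentile(percentiles, 95)

# Lumped treatment: pool both sources and treat the total spread as aleatory.
n = N_EPISTEMIC * N_ALEATORY
theta_pool = rng.choice(thetas, size=n)
x_pool = rng.normal(10.0, 1.0, size=n)
lumped_95 = np.percentile(response(theta_pool, x_pool), 95)

print(f"separated 95/95 tolerance bound: {separated_95_95:.2f}")
print(f"lumped 95th percentile:          {lumped_95:.2f}")
```

In the lumped treatment the epistemic spread inflates the sampled response distribution itself, whereas in the separated treatment it acts on the confidence attached to the percentile estimate, which is the mechanism behind the over-estimated percentiles the abstract describes.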
3

Affinement de relevés laser mobiles issus de LIDARs multi-couches / Refinement of mobile laser scans from multi-beam LIDARs

Nouira, Houssem, 20 April 2017
LIDAR-based Mobile Mapping Systems make it possible to obtain 3D maps of the environment, which are georeferenced with the help of other sensors embedded on the vehicle: GPS, inertial measurement unit and odometer are such sensors, and they localize the vehicle during the acquisition campaign. However, these maps lack precision, and a refinement of the maps is essential in many applications where high accuracy is required of the 3D maps, such as classification. When creating georeferenced 3D maps, the data are first acquired by the LIDAR sensor and referenced in the Cartesian frame of the laser using an intrinsic calibration of the acquisition sensor. Then, an extrinsic calibration characterizes the transformation between the sensor and the vehicle, and expresses the data in the "body" reference frame linked to the acquisition vehicle. Finally, with the vehicle trajectory obtained by fusing the data from the GPS, the inertial measurement unit and the odometer, the laser data can be georeferenced. In this thesis, we propose to refine laser scans acquired with a mobile mapping vehicle by optimizing several parameters involved in the georeferencing of the data. We first consider the refinement of the point clouds by optimizing the extrinsic calibration parameters, then by optimizing the intrinsic calibration parameters, and finally by optimizing the translation parameters of the mobile vehicle trajectory.
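A minimal sketch of the georeferencing chain the abstract walks through: sensor frame, then body frame via the extrinsic calibration, then world frame via the vehicle pose. The mounting angle, lever arm and pose values are invented placeholders, not calibration results from the thesis.

```python
import numpy as np

def rot_z(yaw):
    """Rotation about the z-axis (yaw only, for brevity; a real system uses full 3D attitude)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Point measured by the LIDAR, already in the sensor's Cartesian frame
# (i.e. after intrinsic calibration of ranges and beam angles).
p_sensor = np.array([12.3, -4.1, 1.8])

# Extrinsic calibration: fixed sensor-to-body transformation (assumed values).
R_body_sensor = rot_z(np.deg2rad(1.5))       # small mounting misalignment
t_body_sensor = np.array([0.8, 0.0, 1.6])    # lever arm on the vehicle
p_body = R_body_sensor @ p_sensor + t_body_sensor

# Vehicle pose at the measurement time, from the fused GPS/IMU/odometer trajectory.
R_world_body = rot_z(np.deg2rad(37.0))       # vehicle heading (assumed)
t_world_body = np.array([1500.2, 320.7, 45.0])
p_world = R_world_body @ p_body + t_world_body

print("georeferenced point:", p_world)
# The refinement the thesis proposes amounts to optimizing R_body_sensor and
# t_body_sensor (extrinsic), the intrinsic parameters, or the trajectory
# translations t_world_body so that the resulting point clouds are self-consistent.
```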
