471 |
Efficient Uncertainty Characterization Framework in Neutronics Core Simulation with Application to Thermal-Spectrum Reactor Systems
Dongli Huang (7473860), 16 April 2020
This dissertation is devoted to developing a first-of-a-kind uncertainty characterization framework (UCF) providing comprehensive, efficient, and scientifically defensible methodologies for uncertainty characterization (UC) in best-estimate (BE) reactor physics simulations. The UCF is designed with primary application to CANDU neutronics calculations, but could also be applied to other thermal-spectrum reactor systems. The overarching goal of the UCF is to propagate and prioritize all sources of uncertainty, including those originating from nuclear data uncertainties, modeling assumptions, and other approximations, in order to reliably use the results of BE simulations in the various aspects of reactor design, operation, and safety. The scope of this UCF is to propagate nuclear data uncertainties from the multi-group format, representing the input to lattice physics calculations, to the few-group format, representing the input to nodal diffusion-based core simulators, and to quantify the uncertainties in reactor core attributes.

The main contribution of this dissertation addresses two major challenges in current uncertainty analysis approaches. The first is the feasibility of the UCF, given the complex nature of nuclear reactor simulation and the computational burden of conventional uncertainty quantification (UQ) methods. The second is to assess the impact of other sources of uncertainty that are typically ignored when propagating nuclear data uncertainties, such as various modeling assumptions and approximations.

To address the first challenge, this thesis proposes an integrated UC process employing a number of approaches and algorithms, including the physics-guided coverage mapping (PCM) method in support of model validation, and reduced order modeling (ROM) techniques together with sensitivity analysis (SA) on uncertainty sources, to reduce the dimensionality of the uncertainty space at each interface of the neutronics calculations. In addition to these efficient techniques for reducing computational cost, the UCF aims to accomplish four primary functions in the uncertainty analysis of neutronics simulations. The first function is to identify all sources of uncertainty, including nuclear data uncertainties, modeling assumptions, numerical approximations, and technological parameter uncertainties. Second, the proposed UC process propagates the identified uncertainties to the responses of interest in core simulation and provides UQ analysis for these core attributes. Third, the propagated uncertainties are mapped to a wide range of reactor core operating conditions. Finally, the fourth function is to prioritize the identified uncertainty sources, i.e., to generate a priority identification and ranking table (PIRT) which sorts the major sources of uncertainty according to their impact on the core attributes' uncertainties. In the proposed implementation, the nuclear data uncertainties are first propagated from the multi-group level through the lattice physics calculation to generate few-group parameter uncertainties, described using a vector of mean values and a covariance matrix.
Employing an ROM-based compression of the covariance matrix, the few-group uncertainties are then propagated through the downstream core simulation in a computationally efficient manner.

To explore the impact of uncertainty sources other than nuclear data on the UC process, a number of approximations and assumptions are investigated in this thesis: modeling assumptions such as the resonance treatment and the energy group structure, and assumptions associated with the uncertainty analysis itself, e.g., the linearity assumption and the level of ROM reduction with its associated number of degrees of freedom. These approximations and assumptions have been employed in the neutronics uncertainty analysis literature without formal verification. The major argument here is that these assumptions may introduce another source of uncertainty whose magnitude needs to be quantified in tandem with the nuclear data uncertainties. In order to assess whether modeling uncertainties have an impact on parameter uncertainties, this dissertation proposes a process to evaluate the influence of various modeling assumptions and approximations and to investigate the interactions between the two major uncertainty sources. To this end, the impact of a number of modeling assumptions on core attribute uncertainties is quantified.

The proposed UC process is first applied to a BWR problem, in order to test the uncertainty propagation and prioritization process with the ROM implementation over a wide range of core conditions. Finally, a comprehensive uncertainty library for CANDU uncertainty analysis, with NESTLE-C as the core simulator, is generated from the compressed uncertainty sources of the proposed UCF. The modeling uncertainties, as well as their impact on the parameter uncertainty propagation process, are investigated for the CANDU application using this uncertainty library.
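As an illustration of the compression-and-propagation step, the following is a minimal sketch assuming the ROM-style compression is a truncated eigendecomposition of the few-group covariance matrix; the dissertation's actual ROM algorithms may differ, and `core_model` is a hypothetical stand-in for a core simulator such as NESTLE-C.

```python
import numpy as np

def compress_covariance(cov, tol=0.99):
    """Keep the fewest eigenmodes of a covariance matrix that capture
    a fraction `tol` of the total variance (clipping tiny negative
    eigenvalues that arise from numerical noise)."""
    eigvals, eigvecs = np.linalg.eigh(cov)                  # ascending
    eigvals = np.clip(eigvals[::-1], 0.0, None)             # descending
    eigvecs = eigvecs[:, ::-1]
    frac = np.cumsum(eigvals) / eigvals.sum()
    r = int(np.searchsorted(frac, tol)) + 1                 # retained modes
    # L maps r independent standard normals to correlated perturbations.
    return eigvecs[:, :r] * np.sqrt(eigvals[:r])

def propagate(mu, L, core_model, n_samples=100, seed=0):
    """Sample few-group parameters from N(mu, L @ L.T) and push each
    sample through the (expensive) core simulator."""
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal((n_samples, L.shape[1]))
    responses = np.array([core_model(s) for s in (mu + xi @ L.T)])
    return responses.mean(axis=0), responses.std(axis=0, ddof=1)
```

Because the number of retained modes r is typically far smaller than the dimension of the few-group parameter space, the sampling burden placed on the downstream core simulation drops accordingly, which is the point of the compression.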
|
472 |
POLYNOMIAL CHAOS EXPANSION IN BIO- AND STRUCTURAL MECHANICS / MISE EN OEUVRE DU CHAOS POLYNOMIAL EN BIOMECANIQUE ET EN MECANIQUE DES STRUCTURES
Szepietowska, Katarzyna, 12 October 2018
This thesis presents a probabilistic approach to modelling the mechanics of materials and structures in which the modelled performance is influenced by uncertainty in the input parameters. The work is interdisciplinary and the methods described are applied to problems in biomechanics and civil engineering. The motivation for this work was the need for mechanics-based approaches in the modelling and simulation of implants used in the repair of ventral hernias. Many uncertainties appear in the modelling of the implant-abdominal wall system. The probabilistic approach proposed in this thesis enables these uncertainties to be propagated to the output of the model and their respective influences to be investigated. The regression-based polynomial chaos expansion method is used here. The accuracy of such non-intrusive methods depends, however, on the number and location of the sampling points. Finding a universal method that achieves a good balance between accuracy and computational cost is still an open question, so different approaches are investigated in this thesis in order to choose one that is efficient and suited to the case at hand. Global sensitivity analysis is used to investigate the respective influences of the input uncertainties on the variation of the outputs of the different models. The uncertainties are propagated to the implant-abdominal wall models, allowing conclusions of importance for surgical practice to be drawn. Using the expertise acquired from these biomechanical models, the developed methodology is then applied to the modelling of historic timber joints and the simulation of their mechanical behaviour. Such an investigation supports the efficient planning of repairs and renovation of buildings of historical value.
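As an illustration of the regression-based approach, here is a minimal sketch assuming uniform inputs scaled to [-1, 1], a total-degree Legendre basis, and first-order Sobol indices read directly from the PCE coefficients; the choice of the number and location of the sampling points, the open question discussed above, is not addressed here.

```python
import numpy as np
from numpy.polynomial.legendre import Legendre
from itertools import product

def legendre_basis(X, degree):
    """Evaluate a total-degree, orthonormal, tensorized Legendre basis at
    X (shape (n, dim)); inputs are assumed uniform on [-1, 1]."""
    dim = X.shape[1]
    alphas = [a for a in product(range(degree + 1), repeat=dim)
              if sum(a) <= degree]
    cols = []
    for alpha in alphas:
        col = np.ones(X.shape[0])
        for d, a in enumerate(alpha):
            # sqrt(2a+1) normalizes P_a to unit variance under U(-1, 1).
            col *= Legendre.basis(a)(X[:, d]) * np.sqrt(2 * a + 1)
        cols.append(col)
    return np.column_stack(cols), alphas

def fit_pce(X, y, degree):
    """Least-squares regression for the PCE coefficients."""
    Psi, alphas = legendre_basis(X, degree)
    coeffs, *_ = np.linalg.lstsq(Psi, y, rcond=None)
    return coeffs, alphas

def sobol_first_order(coeffs, alphas, dim):
    """First-order Sobol indices follow directly from the coefficients:
    the output variance splits over the orthonormal basis terms."""
    var_total = np.sum(coeffs[1:] ** 2)        # alphas[0] is the mean term
    S = np.zeros(dim)
    for c, a in zip(coeffs[1:], alphas[1:]):
        active = [d for d, ai in enumerate(a) if ai > 0]
        if len(active) == 1:
            S[active[0]] += c ** 2
    return S / var_total

# Toy usage: y = x1^2 + 0.5*x2 gives indices of roughly 0.52 and 0.48.
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = X[:, 0] ** 2 + 0.5 * X[:, 1]
coeffs, alphas = fit_pce(X, y, degree=3)
print(sobol_first_order(coeffs, alphas, dim=2))
```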
|
473 |
Formes modulaires et courbes modulaires : quelques contributions à leur rôle en physique mathématique / Modular forms and modular curves: some contributions to their role in mathematical physics
Dostert, Mike, 13 November 2009
The goal of this thesis is to analyze and develop the mathematical objects that appeared in the article "New classical limits of quantum theories" by S. G. Rajeev, especially the (loc. cit.) "neoclassical limits", in the context of the theory of modular forms. To see which objects are at play in Rajeev's study, a first step was to construct certain toy models in which calculations similar to those of the article could be carried out, while comparing them with rigorous mathematical objects and theories, notably Kähler quantization, arithmetic algebraic geometry, and the trace formula for Hecke operators. In a second step, a rigorous mathematical framework was developed in which the objects of Rajeev's study naturally live. This framework should subsequently allow the "neoclassical limit" calculations to be carried out rigorously in this context. The objects developed should thus help mathematicians better understand the physicists' ideas, and help physicists push the perturbation calculations further.
|
474 |
Investigating Brain Tissue Microstructure using Quantitative Magnetic Resonance Imaging
Metere, Riccardo, 14 May 2018
In recent years there has been a considerable research effort in improving the specificity of magnetic resonance imaging (MRI) techniques by employing quantitative methods.
These methods offer greater reproducibility than traditional acquisitions, and hold the potential for obtaining improved information at the microstructural level.
However, they typically require a longer duration for the experiments as the quantitative information is often obtained from multiple acquisitions.
Here, a multi-echo extension of the MP2RAGE pulse sequence for the simultaneous mapping of T1, T2* (and magnetic susceptibility) is introduced, optimized and validated.
This acquisition technique can be faster than the separate acquisition of T1 and T2*, and has the advantage of producing intrinsically co-localized maps.
This is helpful in reducing the preprocessing complexity, but most importantly it removes the need for image alignment (registration) which is shown to introduce significant bias in quantitative MRI maps.
One of the reasons why knowledge of T1 and T2* is relevant in neuroscience is that their reciprocals, R1 and R2*, have been shown to quantitatively predict myelin and iron content in ex vivo experiments using a linear model of relaxation.
However, the post-mortem results cannot be applied directly to the in vivo case.
Therefore, an adaptation of the linear relaxation model to the in vivo case is proposed.
This is capable of predicting (with some limitations) the myelin and iron contents of the brain under in vivo conditions, by using prior knowledge from the literature to calibrate the linear coefficients.
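To make the in vivo model concrete, the sketch below inverts a two-equation linear relaxation model for myelin and iron per voxel; the coefficient values are placeholders for illustration only, not the literature-calibrated values used in the thesis.

```python
import numpy as np

def invert_relaxation(R1, R2s, A, offsets):
    """Solve the linear relaxation model per voxel:
        R1  = offsets[0] + A[0,0]*myelin + A[0,1]*iron
        R2* = offsets[1] + A[1,0]*myelin + A[1,1]*iron
    R1 and R2s are arrays of relaxation rates (1/T1, 1/T2*) of any shape."""
    rhs = np.stack([R1 - offsets[0], R2s - offsets[1]], axis=-1)
    sol = rhs @ np.linalg.inv(A).T      # broadcasts over all voxels
    return sol[..., 0], sol[..., 1]     # myelin, iron

# Placeholder coefficients (NOT the calibrated values from the thesis).
A = np.array([[1.5, 0.4],
              [10.0, 30.0]])
offsets = (0.25, 10.0)
T1, T2s = 1.2, 0.035                    # s; plausible white-matter values
myelin, iron = invert_relaxation(1.0 / T1, 1.0 / T2s, A, offsets)
```

The same two lines of algebra run unchanged on whole T1 and T2* maps, which is where the intrinsically co-localized maps from the multi-echo MP2RAGE acquisition pay off: no registration step is needed before the voxelwise inversion.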
The dependence of the relaxation parameters on the biochemical composition of brain tissue is further explored with ex vivo experiments.
In particular, the hyaluronan component of the extracellular matrix is investigated.
The contribution to T1 and T2* is measured with sophisticated experiments that allow for greater control over the experimental conditions than a typical MRI experiment.
The results indicate a small but appreciable contribution of hyaluronan to the relaxation parameters.
In conclusion, this work develops a method for measuring T1 and T2* maps simultaneously.
These are then used to quantify myelin and iron under in vivo conditions using a linear model of relaxation.
In parallel, the hyaluronan-based extracellular matrix was shown to be a marginal but measurable component in T1 and T2* relaxation maps in ex vivo experiments.
|
475 |
Immunoassays or LC-MS/MS? A Comparison Revealing the Properties of Modern Methods for Insulin, Pro-insulin, C-peptide and Glucagon Quantification
Upite, Ruta; Wärmegård, Susanna; Tiger, Casper; Ivert Nordén, Anna; Martinez, Temis; Umenius, Viktor, January 2019
The purpose of this report is to compare seven different methods for biomarker detection and quantification, based on previously published papers. The methods investigated are ELISA, LC-MS/MS, UPLC-MS/MS, LC-IM/MS, IA-LC-MS/MS, MSIA-HR/AM, HTRF and AlphaLISA®. The focus lies on biomarkers relevant for diabetes, obesity and cardiovascular diseases, namely insulin, proinsulin, glucagon and C-peptide. Particular significance is assigned to the comparison of the currently most widely used method, ELISA, with various types of LC-MS/MS. The report concludes that ELISA is superior to LC-MS/MS methods in terms of recovery and precision, while LC-MS/MS is superior in accuracy, multiplexing, specificity, throughput and sample cost. This suggests that the different types of LC-MS/MS have the potential to gain momentum in the field of biomarker quantification if they become more available.
|
476 |
Essais sur l'adoption des technologies de quantification de soi : une approche critique / Essays on the adoption of self-quantification technologies: a critical approach
De Moya, Jean-François, 19 March 2019
This thesis by articles explores the adoption of self-quantification technologies and practices from a critical stance. The aim is to better understand users' experiences with self-quantification technologies, such as connected wristbands, and to question the real contribution of these technologies to the well-being and health of individuals. The first essay presents a systematic literature review on the quantified self and a research agenda for management researchers. The second essay is a qualitative study that reveals the power relationships users maintain with the technology. The third essay focuses on the underlying mechanisms that guide the user's decision, in order to identify the factors driving the adoption of a self-quantification technology.
|
477 |
Quantification of User Privacy Loss
Pinnaka, Chaitanya, January 2012
With the advent of the communication age, newer, faster and arguably better ways of moving information are at our disposal. People felt the need to stay connected, which led to the evolution of smart gadgets like cell phones, tablets and laptops. The next generation of automobiles is keen to extend this connectivity to the vehicle user by arming itself with radio interfaces. This move will enable the formation of vehicular networks in which each car (mobile node) is an instance of a mobile ad hoc network, popularly referred to as Vehicular Ad Hoc Networks (VANETs). These networks will provide the infrastructure for applications that can help improve the safety and efficiency of road traffic, as well as provide useful services for the mobile nodes (cars). The specific nature of VANETs brings up the need to address the security and privacy issues necessary for integration into the social world. Thus, the open field of secure inter-vehicular communication promises an interesting research area. This thesis aims to quantify how much of a user's trajectory an adversary can identify while monitoring non-safety applications in VANETs. Different types of adversaries, their attacks and possible non-safety applications are also discussed.
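The abstract does not spell out the adversary model, but one toy version of such a quantification can be sketched as follows: the adversary observes position samples tagged with pseudonyms and links two consecutive trajectory segments whenever the spatial gap across a pseudonym change is small. The metric and all names here are hypothetical illustrations, not the thesis's actual method.

```python
import numpy as np

def linked_fraction(trajectory, pseudonym_change_idx, max_gap):
    """Toy privacy-loss measure: split the trajectory at pseudonym
    changes, greedily link consecutive segments whose endpoint gap is
    below `max_gap`, and return the largest linked portion."""
    segments = np.split(np.asarray(trajectory), pseudonym_change_idx)
    run_len = best = len(segments[0])
    for prev, nxt in zip(segments, segments[1:]):
        gap = np.linalg.norm(nxt[0] - prev[-1])
        run_len = run_len + len(nxt) if gap < max_gap else len(nxt)
        best = max(best, run_len)
    return best / len(trajectory)

# 100 positions 10 m apart, pseudonym changed every 25 samples: the gaps
# across changes are small, so the whole trajectory is linked (-> 1.0).
traj = np.column_stack([np.linspace(0.0, 990.0, 100), np.zeros(100)])
print(linked_fraction(traj, [25, 50, 75], max_gap=15.0))
```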
|
478 |
Prototyp för identifiering av teknisk skuld inom Product Lifecycle Management [Prototype for identification of technical debt within Product Lifecycle Management]
Gauffin, Christian; Jonsson, Marcus, January 2016
Technical debt is a well-known term within software development, but has not yet been applied outside of it, so it is not known whether doing so is practically possible. This thesis investigates how technical debt can be extended to, and identified within, a software system for Product Lifecycle Management. The purpose of the thesis is to present a prototype, called the ITS-prototype, which shows that it is possible to identify technical debt within Product Lifecycle Management. The work is qualitative in character and was conducted as a case study. To verify that the implementation is correct, two evaluation criteria were established. The first, measuring the degree of coverage, requires that the ITS-prototype identify 100% of the technical debt defined by each rule. The second consists of an interview with a technical expert on Product Lifecycle Management, in which the prototype's underlying method is evaluated. The ITS-prototype, together with the results of the evaluation, shows that technical debt can be implemented and identified in a software system for Product Lifecycle Management. The rule-driven implementation used has proven effective, and the authors suggest that development of the ITS-prototype continue in order to better exploit the facilities that exist within a Product Lifecycle Management system.
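A minimal sketch of the rule-driven identification the abstract describes is given below; the `PlmItem` and `DebtRule` types and the example rules are hypothetical, since the ITS-prototype's actual data model is not described here.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class PlmItem:
    """A hypothetical PLM object (part, document, BOM entry, ...)."""
    item_id: str
    attributes: dict = field(default_factory=dict)

@dataclass
class DebtRule:
    """A named predicate: True means the item carries this kind of debt."""
    name: str
    predicate: Callable[[PlmItem], bool]

def identify_technical_debt(items, rules):
    """Apply every rule to every item; the 100%-coverage criterion means
    a rule must flag all items that its definition covers."""
    return [(rule.name, item.item_id)
            for rule in rules for item in items if rule.predicate(item)]

rules = [
    DebtRule("missing-revision-link",
             lambda it: "predecessor" not in it.attributes),
    DebtRule("obsolete-part-in-released-bom",
             lambda it: it.attributes.get("status") == "obsolete"
                        and it.attributes.get("in_released_bom", False)),
]
items = [PlmItem("P-001", {"status": "obsolete", "in_released_bom": True}),
         PlmItem("P-002", {"predecessor": "P-001", "status": "released"})]
print(identify_technical_debt(items, rules))
# [('missing-revision-link', 'P-001'), ('obsolete-part-in-released-bom', 'P-001')]
```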
|
479 |
Cannabidiol in Gummies: Determination of Effective Solvent-Based Extraction Methods
Clark, Abigail R., 18 May 2022
No description available.
|
480 |
Production and analysis of polyhydroxybutyrate in Halomonas boliviensis / Produktion och analys av polyhydroxybutyrat i Halomonas boliviensis
Gnanasekhar, Joshua Dhivyan, January 2012
No description available.
|