501 |
[en] CARING ABOUT WOMEN AND THE ROLE OF QUANTIFICATION IN FAMILY PLANNING: A CRITICAL ANALYSIS OF THE FP2020 PROGRAMME / [pt] CUIDADO COM AS MULHERES E O PAPEL DA QUANTIFICAÇÃO NO PLANEJAMENTO FAMILIAR: UMA ANÁLISE CRÍTICA DO PROGRAMA FP2020
BEATRIZ NAZARETH DE SOUZA TEIXEIRA, 27 April 2023 (has links)
[pt] A interseção entre população, conflito, mudanças climáticas e migração é o ponto central
dos debates contemporâneos sobre controle populacional e reacende iniciativas cujo alvo
são os corpos de mulheres e meninas, majoritariamente não-brancas e do Sul Global. À
medida que uma população crescente passa a ser enquadrada como um problema global,
a fertilidade dessas mulheres se torna uma ameaça e os seus corpos um espaço para
intervenções de organizações internacionais, agências de desenvolvimento e governos. O
propósito do trabalho é investigar uma dessas intervenções, a maior em termos de
recursos financeiros e de metas estabelecidas – Family Planning 2020. Mais
especificamente, será focado no papel desempenhado por dados quantitativos para
fundamentar intervenções na saúde reprodutiva de milhões de mulheres. A hipótese da
dissertação é que esses dados desempenham um papel fundamental no posicionamento
de políticas de planejamento familiar como objetivas e baseadas em evidências, e não
como fundadas em fatores raciais. Ademais, que a adesão a uma ética do cuidado pode
orientar tais políticas de maneira que se evite instrumentalizar a fertilidade e a saúde de
mulheres para enfrentar questões atribuídas à superpopulação. / [en] The intersection between population, conflict, climate change and migration
is the cornerstone of contemporary population control debates and rekindles
initiatives that target the bodies of women and girls, largely of colour and from the
Global South. As a growing population becomes framed as a global problem, the
fertility of those women turns into a threat and their bodies into a space for
interventions by international organizations, development agencies, and
governments. The purpose of this work is to investigate one of said interventions,
the largest in terms of targets and financial investment: Family Planning 2020. More specifically, the project will look to FP2020 to analyse the
role played by quantified data in substantiating interventions in the reproductive
health of millions of women. The working hypothesis of this dissertation is that data
is a crucial element in the rebranding of family planning policies as not racially
oriented, but rather as objective and evidence-based, and that, by adhering to an
ethics of care, one might avoid the pitfalls of instrumentalizing women's fertility
and overall health in order to address other pressing issues.
|
502 |
Simulation and Calibration of Uncertain Space Fractional Diffusion Equations
Alzahrani, Hasnaa H., 10 January 2023 (has links)
Fractional diffusion equations have played an increasingly important role in explaining long-range interactions, nonlocal dynamics and anomalous diffusion, providing effective means of describing the memory and hereditary properties of such processes. This dissertation explores the uncertainty propagation in space fractional diffusion equations in one and multiple dimensions with variable diffusivity and order parameters. This is achieved by: (i) deploying accurate numerical schemes of the forward problem, and (ii) employing uncertainty quantification tools that accelerate the inverse problem. We begin by focusing on parameter calibration of a variable-diffusivity fractional diffusion model. A random, spatially-varying diffusivity field is considered together with an uncertain but spatially homogeneous fractional operator order. Polynomial chaos (PC) techniques are used to express the dependence of the stochastic solution on these random variables. A non-intrusive methodology is used, and a deterministic finite-difference solver of the fractional diffusion model is utilized for this purpose. The surrogates are first used to assess the sensitivity of quantities of interest (QoIs) to uncertain inputs and to examine their statistics. In particular, the analysis indicates that the fractional order has a dominant effect on the variance of the QoIs considered. The PC surrogates are further exploited to calibrate the uncertain parameters using a Bayesian methodology. In the broad range of parameters addressed, the analysis shows that the uncertain parameters having a significant impact on the variance of the solution can be reliably inferred, even from limited observations.
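The surrogate-accelerated Bayesian calibration described above follows a standard non-intrusive pattern: once a PC surrogate stands in for the forward solver, MCMC sampling becomes affordable. The sketch below illustrates only that pattern; the `surrogate` function, priors, noise level, and observations are invented placeholders, not the dissertation's actual model or data.

```python
# Minimal sketch of surrogate-based Bayesian calibration with a random-walk
# Metropolis sampler. The "surrogate" is a cheap polynomial stand-in for a
# PC surrogate of two QoIs; coefficients, priors, and noise are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def surrogate(alpha, log_d):
    # Placeholder surrogates of two QoIs (e.g. the solution at two sensor
    # locations) as functions of the fractional order alpha and a
    # log-diffusivity parameter. Purely illustrative coefficients.
    return np.array([
        1.0 + 0.8 * (alpha - 1.5) - 0.3 * log_d,
        0.5 - 0.2 * (alpha - 1.5) + 0.6 * log_d,
    ])

# Synthetic "observations" from assumed true parameters plus noise.
alpha_true, log_d_true, sigma = 1.7, -0.2, 0.02
obs = surrogate(alpha_true, log_d_true) + sigma * rng.standard_normal(2)

def log_post(theta):
    alpha, log_d = theta
    if not (1.0 < alpha < 2.0 and -1.0 < log_d < 1.0):   # uniform priors
        return -np.inf
    resid = obs - surrogate(alpha, log_d)
    return -0.5 * np.sum(resid**2) / sigma**2            # Gaussian likelihood

# Random-walk Metropolis: each step only evaluates the cheap surrogate,
# never the expensive finite-difference solver.
theta = np.array([1.5, 0.0])
lp = log_post(theta)
samples = []
for _ in range(20000):
    prop = theta + 0.05 * rng.standard_normal(2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
samples = np.array(samples[5000:])                        # discard burn-in
print("posterior mean (alpha, log_d):", samples.mean(axis=0))
```

In practice the placeholder polynomial would be replaced by PC surrogates fitted to runs of the finite-difference fractional diffusion solver.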
Next, we address the numerical challenges that arise when multidimensional space-fractional
diffusion equations have spatially varying diffusivity and fractional order. Significant computational challenges arise due to the kernel singularity in the fractional integral operator as well as the resulting dense discretized operators. Hence, we present a singularity-aware discretization scheme that regularizes the singular integrals through a singularity subtraction technique adapted to the spatial variability of diffusivity and fractional order. This regularization strategy is conveniently formulated as a sparse matrix correction that is added to the dense operator, and is applicable to different formulations of fractional diffusion equations. Numerical results show that the singularity treatment is robust, substantially reduces discretization errors, and attains the first-order convergence rate allowed by the regularity of the solutions.
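For readers unfamiliar with the regularization mentioned above, the textbook singularity-subtraction identity for a weakly singular kernel K(x, y) reads as follows (a generic sketch; the variable-order fractional kernel and the sparse-correction formulation in the dissertation are more involved):

```latex
\[
\int_{\Omega} K(x,y)\,u(y)\,\mathrm{d}y
  \;=\; \int_{\Omega} K(x,y)\,\bigl[u(y)-u(x)\bigr]\,\mathrm{d}y
  \;+\; u(x)\int_{\Omega} K(x,y)\,\mathrm{d}y .
\]
```

The first term has a milder singularity because the bracket vanishes as y approaches x, so standard quadrature applies, while the second term is evaluated analytically or semi-analytically and added back; since it only involves u at the evaluation point, this is consistent with formulating the correction as a sparse matrix added to the dense operator.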
In the last part, we explore the application of a Bayesian formalism to detect an anomaly in a fractional medium. Specifically, a computational method is presented for inferring the location and properties of an inclusion inside a two-dimensional domain. The anomaly is assumed to have known shape, but unknown diffusivity and fractional order parameters, and is assumed to be embedded in a fractional medium of known fractional properties. To detect the presence of the anomaly, the medium is forced using a collection of localized sources, and its response is measured at the source locations. To this end, the singularity-aware finite-difference scheme is applied. A non-intrusive regression approach is used to explore the dependence of the computed signals on the properties of the anomaly, and the resulting surrogates are first exploited to characterize the variability of the response, and then used to accelerate the Bayesian inference of the anomaly. In the regime of parameters considered, the computational results indicate that robust estimates of the location and fractional properties of the anomaly can be obtained, and that these estimates become sharper when high contrast ratios prevail between the anomaly and the surrounding matrix.
|
503 |
Robust Design Framework for Automating Multi-component DNA Origami Structures with Experimental and MD Coarse-grained Model Validation
Huang, Chao-Min, January 2020 (has links)
No description available.
|
504 |
Prioritising Safe Trips Over Safe Policy Processes - A Pilot Study on the Appropriateness of the Participatory Value Evaluation Method for Encompassing (Feelings of) Safety (and Security) in Public Transport / Att prioritera trygga resor över trygga beslutsprocesser - En pilotstudie om Participatory Value Evaluation-metodens förmåga att behandla trygghet
Farshchi, Shahin; Rizk, Roy, January 2020 (has links)
It is becoming more important that transport agendas acknowledge complex social sustainability matters like feelings of safety and security. For developing knowledge in this context today, participatory descriptive assessments hold a strong position. However, the quantitative methods behind popular transport appraisals have a clearer impact focus but struggle to encompass social sustainability matters. Filling the methodological gap between the participatory descriptive tradition and the quantitative impact tradition can be vital in moving methods closer to public and policy demands and norms. This work uses independent interdisciplinary collaboration and assesses the appropriateness of the Participatory Value Evaluation (PVE) method for encompassing feelings of safety and security in public transport in the context of Stockholm, Sweden. The PVE method evaluates projects through participation and quantifies results without relying on monetary valuation, while allowing norms to be detected. By using in-depth descriptive information as PVE input, this study aims to provide a methodological contribution by analysing the PVE method, with modifications, in a new complex setting. Descriptive results from the PVE method are regarded as still advancing knowledge on feelings of safety and security, while improving the impact focus of appraisals by evaluating projects. The method can benefit from a focus on transparency, attractive participation and quality in results, and the amendment here called 're-categorisation' was found necessary for PVE appraisals. Incentives to keep flawed processes can be found in political and policy realms, and with no actor controlling the intersectoral (and interdisciplinary) issue of feelings of safety and security, traditions can have a strong impact. However, this study shows the feasibility of improving appraisals given contemporary public and policy standards.
|
505 |
Evaluating the impact of the structure for common mesoporous aminosilica materials on the catalytic activity for the aldol reaction and condensation
Brizes, Michael Christopher, 06 September 2022 (has links)
No description available.
|
506 |
Development and Use of a Spatially Accurate Polynomial Chaos Method for Aerospace Applications
Schaefer, John Anthony, 24 January 2023 (has links)
Uncertainty is prevalent throughout the design, analysis, and optimization of aerospace products. When scientific computing is used to support these tasks, sources of uncertainty may include the freestream flight conditions of a vehicle, physical modeling parameters, geometric fidelity, numerical error, and model-form uncertainty, among others. Moreover, while some uncertainties may be treated as probabilistic, aleatory sources, other uncertainties are non-probabilistic and epistemic due to a lack of knowledge, and cannot be rigorously treated using classical statistics or Bayesian approaches. An additional complication for propagating uncertainty is that many aerospace scientific computing tools may be computationally expensive; for example, a single high-fidelity computational fluid dynamics solution may require several days or even weeks to complete. It is therefore necessary to employ uncertainty propagation strategies that require as few solutions as possible.
The Non-Intrusive Polynomial Chaos (NIPC) method has grown in popularity in recent decades due to its ability to propagate both aleatory and epistemic parametric sources of uncertainty in a computationally efficient manner. While traditional Monte Carlo methods might require thousands to millions of function evaluations to achieve statistical convergence, NIPC typically requires tens to hundreds for problems with similar numbers of uncertain dimensions. Despite this efficiency, NIPC is limited in one important aspect: it can only propagate uncertainty at a particular point in a design space or flight envelope. For optimization or aerodynamic database problems that require uncertainty estimates at many more than one point, the use of NIPC quickly becomes computationally intractable.
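As a concrete illustration of the NIPC regression idea described above (not the SAPC method itself, which extends it across a flight envelope), the following sketch fits a one-dimensional Legendre chaos expansion to a handful of runs of a placeholder model and reads the mean and variance off the coefficients; the model, polynomial degree, sample count, and input distribution are assumptions chosen only for illustration.

```python
# Minimal non-intrusive polynomial chaos (NIPC) by least-squares regression
# in one uncertain dimension, at a single fixed flight condition.
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(1)

def expensive_model(xi):
    # Placeholder QoI (e.g., a lift coefficient) as a function of one
    # standardized uncertain input xi in [-1, 1]; stands in for a CFD run.
    return 0.4 + 0.1 * xi + 0.05 * np.sin(3.0 * xi)

degree, n_samples = 4, 20             # tens of runs, not thousands
xi = rng.uniform(-1.0, 1.0, n_samples)
y = expensive_model(xi)

# Vandermonde-type matrix of Legendre polynomials P_0..P_degree at the samples.
Psi = legendre.legvander(xi, degree)
coeffs, *_ = np.linalg.lstsq(Psi, y, rcond=None)

# For a uniform input on [-1, 1], E[P_k] = 0 for k >= 1 and
# E[P_k^2] = 1 / (2k + 1), so moments follow directly from the coefficients.
mean = coeffs[0]
variance = sum(c**2 / (2 * k + 1) for k, c in enumerate(coeffs) if k >= 1)
print(f"PCE mean ~ {mean:.4f}, variance ~ {variance:.2e}")
```

The limitation noted in the preceding paragraph is visible here: the expansion is tied to one operating point, so covering a full flight envelope would require repeating this fit many times.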
This dissertation introduces a new method entitled Spatially Accurate Polynomial Chaos (SAPC) that extends the original NIPC approach for the spatial regression of aleatory and epistemic parametric sources of uncertainty. Throughout the dissertation, the SAPC method is applied to various aerospace problems of interest. These include the regression of aerodynamic force and moment uncertainties throughout the flight envelope of a commercial aircraft, the design under uncertainty of a two-stream propulsive mixer device, and the robust design of a low-boom supersonic demonstrator aircraft. Collectively the results suggest that SAPC may be useful for a large variety of engineering applications. / Doctor of Philosophy / Uncertainty is prevalent throughout the design, analysis, and optimization of aerospace products. When scientific computer simulations are used to support these tasks, sources of uncertainty may include the speed of an aerospace vehicle, the direction of the wind, physical modeling constants or assumptions, and the vehicle shape, among others. As a result of these sources of uncertainty, assessments of vehicle performance are also uncertain. For example, if the speed of a vehicle is not known precisely, then computer simulations will predict a lift force which is also imprecisely known.
A challenge when assessing the uncertainty in aerospace vehicle performance is that the computer simulations which predict performance may take a long time to run, even on state-of-the-art super computers. Traditional statistical methods may require thousands or millions of simulations for the prediction of uncertainty, which does not fit within the computational budget of most aerospace analyses. A newer method called Non-Intrusive Polynomial Chaos (NIPC) is more efficient, typically requiring only tens to hundreds of simulations; however, NIPC only provides uncertainty estimates at a single point in an aircraft flight envelope or design condition.
In this dissertation, a new method called Spatially Accurate Polynomial Chaos (SAPC) is developed. The SAPC method combines desirable features of NIPC with regression methods for an efficient estimation of uncertainty throughout a vehicle flight envelope or design space. Throughout the dissertation, the SAPC method is applied to various aerospace problems of interest. These include the regression of aerodynamic force and moment uncertainties throughout the flight envelope of a commercial aircraft, the design under uncertainty of a two-stream propulsive mixer device, and the robust design of a low-boom supersonic demonstrator aircraft. Collectively the results suggest that SAPC may be useful for a large variety of engineering applications.
|
507 |
UNCERTAINTY QUANTIFICATION OF LASER POWDER BED FUSION COMPUTATIONAL MODELS
Scott M Wells (14228129), 09 December 2022 (has links)
Laser powder bed fusion (L-PBF) is a relatively young metallurgical processing method that has many advantages over traditional casting- and wrought-based methods. Alloy systems suitable for this additive manufacturing (AM) process include Ti-6Al-4V, 316 stainless steel, and Inconel 718 and 625, making it attractive for automotive, aerospace, and biomedical applications. Despite the potential, L-PBF is plagued by defects and inconsistent build qualities that make certification of critical components onerous. Additionally, experimental studies are difficult due to the cost of laser systems and feedstock material. Many researchers have turned to computational modeling, as this allows for rigorous examination and isolation of the underlying physics to better understand where problems may arise and where improvements can be made. However, models often fail to consider the role of systematic and statistical uncertainty while also relying heavily on assumptions and simplifications for computational efficiency. As such, there is no quantifiable metric for how reliable these models are. This work applies an uncertainty quantification (UQ) framework to computational models for L-PBF to understand the role of uncertainty and assumptions in model reliability, as this provides insight into their limitations and potential areas for improvement.
First, the UQ framework is applied to a finite volume melt pool transport model to evaluate the effect of uncertainty and model assumptions on melt pool shapes and solidification dynamics. This includes the role of simulating the powder bed thermophysical properties, surface-tension-driven Marangoni convection, and the thermodynamic relation dictating latent heat release. The transport model is then weakly coupled to a cellular automata (CA) grain evolution model to propagate and quantify the uncertainty in the as-built microstructure, including crystallographic texture formation. Further propagation of melt pool and microstructure uncertainty to the resulting mechanical properties, to close the process-microstructure-property relations, is discussed. Lastly, recommendations for future model development and research are presented.
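The forward half of such a UQ framework can be illustrated with a few lines of sampling code. The sketch below propagates assumed input uncertainties through a deliberately simple, hypothetical melt pool depth scaling; in the thesis, the finite-volume melt pool model and the CA grain model (typically accessed through surrogates) would take its place, and the distributions shown are not the thesis values.

```python
# Minimal forward uncertainty propagation for a melt pool quantity of interest.
# melt_pool_depth() is a purely illustrative placeholder scaling, NOT the
# finite-volume transport model; the input distributions are assumed values.
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Assumed uncertain process/model inputs.
power = rng.normal(200.0, 5.0, n)          # laser power [W]
speed = rng.normal(1.0, 0.05, n)           # scan speed [m/s]
absorptivity = rng.uniform(0.3, 0.5, n)    # effective absorptivity [-]

def melt_pool_depth(p, v, a, c=2.5e-7):
    # Hypothetical energy-density scaling: depth ~ c * sqrt(a * P / v).
    return c * np.sqrt(a * p / v)

depth = melt_pool_depth(power, speed, absorptivity)
print(f"mean depth  : {depth.mean():.3e} m")
print(f"std  depth  : {depth.std():.3e} m")
print(f"95% interval: [{np.percentile(depth, 2.5):.3e}, "
      f"{np.percentile(depth, 97.5):.3e}] m")
```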
|
508 |
FATTY ACID PROFILES OF NUTRITION SOURCES AND PLASMA IN PRETERM INFANTS
Choi, A-Rum, January 2016 (has links)
Preterm infants are not able to tolerate full enteral feeding of breast milk due to gut immaturity. To fill this nutritional gap, parenteral nutrition is introduced while enteral feeding is gradually increased. Within parenteral nutrition, lipid emulsions supply energy and essential and polyunsaturated fatty acids, which significantly affect short- and long-term health outcomes of growth and visual- and neuro-development for preterm infants. However, elevated plasma triglyceride (TG) levels have been observed in preterm infants receiving lipid emulsions despite lower lipid intake compared to breast-milk-fed infants. We hypothesized that unbalanced fatty acid profiles in lipid emulsions were one factor causing high plasma TGs in preterm infants. In this multi-center, observational, prospective study, the following samples were analyzed using GC-MS: (1) lipid emulsions (n=5) and breast milk (n=112), and (2) plasma (n=294), including normal TG (n=116) and high TG (n=88). Lipoproteins in normal TG (n=18) and high TG (n=24) plasma were measured using gel electrophoresis. Fatty acid profiles in lipid emulsions differed from those in breast milk. Plasma fatty acid profiles were related to dietary fatty acid intake. Accumulation of all fatty acids except C20:5n3 and relatively high (LDL+VLDL) levels (p<0.001) resulted in high TG plasma compared to normal TG plasma. Overall, this study supports the hypothesis that lipid emulsion nutrition was associated with high TG, along with differences in fatty acid uptake, but it is still unclear whether the imbalance of fatty acids directly causes high TG. More research is necessary to investigate other factors such as enzyme activity, lipid clearance rate, or different rates of fatty acid metabolism. / Thesis / Master of Science (MSc)
|
509 |
Robustness of the Hedgehog morphogen gradient towards variations of tissue morphology in Drosophila
Pierini, Giulia, 16 November 2023 (has links)
Gradients of morphogens, secreted signaling molecules, are crucial for providing cells with positional information during animal development. While the processes of formation and interpretation of these gradients have been extensively studied, the impact of morphogenetic events on patterning through morphogen gradients remains largely unexplored. This thesis aims to understand the interplay and feedback mechanisms between tissue shape and morphogen gradient formation. To address this, we developed an analysis pipeline using MATLAB to accurately measure morphogen gradients in curved epithelia. By computationally deforming confocal images of curved tissues, we quantified the levels of a protein of interest at a specified distance from a reference point along the apico-basal axis. Applying our pipeline to the Hedgehog morphogen gradient in the Drosophila eye and wing imaginal discs, which serve as model systems for folded and flat epithelial tissues, respectively, we made an intriguing discovery. Despite the distinct morphologies of these tissues, the decay rate of the Hedgehog gradient remained comparable. This led us to investigate the robustness of Hedgehog gradient formation by manipulating the morphology of the wing and eye discs. We induced ectopic fold formation at the boundary between the source and receiver tissue of Hedgehog in the wing disc. We found that the decay rate of the Hedgehog gradient remained unchanged even in the wing disc with perturbed morphology, supporting the notion that the Hedgehog gradient is robust towards variability in tissue shapes. Additionally, we locally flattened the eye disc by introducing a mutation that inhibited depolymerization of F-actin. This resulted in the inability of cells to form the morphogenetic furrow and in an expansion of the Hedgehog range compared to the wild-type. However, according to our quantification, the expansion in the Hedgehog range is to be attributed to a shift in its source rather than a change in decay rate of the gradient. Overall, by developing quantitative methods to analyze the distribution of signaling proteins in curved tissues, we contribute to the understanding of the interplay between tissue morphology and pattern formation through morphogen gradients. Our findings highlight the robustness of Hedgehog gradient formation towards diverse tissue morphologies. This observation leads us to hypothesize that this property of robustness could extend to other morphogens that employ transport mechanisms similar to Hedgehog.
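The decay rate compared throughout the abstract is the parameter of an exponential fit to the measured gradient. The sketch below shows the typical extraction of a decay length lambda (whose inverse is the decay rate) from an intensity profile; it is an illustrative stand-in, not the MATLAB pipeline developed in the thesis, and the synthetic profile values are assumed.

```python
# Fit C(x) = C0 * exp(-x / lambda) + offset to a morphogen intensity profile
# measured as a function of distance x from the source boundary.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(7)

def gradient(x, c0, lam, offset):
    return c0 * np.exp(-x / lam) + offset

# Synthetic profile: assumed 20 um decay length plus measurement noise.
x = np.linspace(0.0, 60.0, 40)                    # distance from source [um]
intensity = gradient(x, 100.0, 20.0, 5.0) + rng.normal(0.0, 2.0, x.size)

popt, pcov = curve_fit(gradient, x, intensity, p0=(intensity.max(), 10.0, 0.0))
c0_fit, lam_fit, offset_fit = popt
lam_err = np.sqrt(pcov[1, 1])
print(f"decay length lambda = {lam_fit:.1f} +/- {lam_err:.1f} um")
```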
Contents
Summary
1 Introduction
1.1 Basic principles of animal development: an intricate story
1.2 Epithelial folds: a fundamental building block for morphogenesis
1.3 Patterning via morphogen gradients
1.4 Hedgehog gradient in Drosophila imaginal discs as a model system
2 Aims of the Thesis
2.1 Developing an analysis pipeline to quantify morphogen gradients in curved epithelia
2.2 Assessing the robustness of the Hedgehog morphogen gradient in naturally folded and flat tissues: the eye and wing imaginal discs
2.3 Testing the robustness of the Hedgehog gradient by perturbing the morphology of the wing and eye discs
3 Materials and methods
3.1 Fly stocks
3.2 Immunohistochemistry
3.3 Imaging
3.4 Data analysis
4 Results
4.1 Analysis pipeline to computationally flatten curved epithelial tissues: limitations in applicability and comparison to other methodologies
4.2 The Hedgehog gradient is comparable between wing and eye disc in Drosophila
4.3 The extracellular basal gradient of Hedgehog has a decay rate comparable to the one of the internalized morphogen
4.4 Folds in the wing do not affect the Hedgehog gradient
4.5 Downregulation of ci leads to lower levels of the Hedgehog receptor Ptc, which in turn results in a longer Hedgehog gradient
4.6 Local flattening of the morphogenetic furrow expands the source of Hedgehog but does not affect the decay rate of the gradient
5 Discussion
5.1 Developing quantitative methods to analyze morphogen gradients in curved epithelia opens new possibilities to study the interplay between morphogen gradients and morphogenesis
5.2 A methodological consideration: the decay rate as a relevant parameter for assessing the robustness of the Hedgehog morphogen gradient
5.3 The decay rate of the Hedgehog gradient is comparable between the wing and the eye disc
5.4 The transport mechanism underlying the formation of the Hedgehog gradient in the wing disc is robust towards deformations of the apical side of the tissue
5.5 The capt mutation in the eye disc affects the signaling for differentiation without affecting the decay rate of the Hedgehog gradient
5.6 Active transport and binding to heparan sulfate proteoglycans allow the Hedgehog morphogen gradient formation to be robust towards variation in tissue morphology
5.7 Tissue morphology: obstacle or aid to patterning via morphogens
6 Conclusion
7 Acknowledgments
8 References
9 Declaration according to §5.5 of the doctorate regulations
|
510 |
Bayesian Uncertainty Quantification while Leveraging Multiple Computer Model Runs
Walsh, Stephen A., 22 June 2023 (has links)
In the face of spatially correlated data, Gaussian process regression is a very common modeling approach. Given observational data, kriging equations will provide the best linear unbiased predictor for the mean at unobserved locations. However, when a computer model provides a complete grid of forecasted values, kriging will not apply. To develop an approach to quantify uncertainty of computer model output in this setting, we leverage information from a collection of computer model runs (e.g., historical forecast and observation pairs for tropical cyclone precipitation totals) through a Bayesian hierarchical framework. This framework allows us to combine information and account for the spatial correlation within and across computer model output. Maximum likelihood estimates and the corresponding Hessian matrices for the Gaussian process parameters are input to a Gibbs sampler, which provides posterior distributions for the parameters of interest. These samples are used to generate predictions which provide uncertainty quantification for a given computer model run (e.g., a tropical cyclone precipitation forecast). We then extend this framework using deep Gaussian processes to allow for nonstationary covariance structure, applied to multiple computer model runs from a cosmology application. We also perform sensitivity analyses to understand which parameter inputs most greatly impact cosmological computer model output. / Doctor of Philosophy / A crucial theme when analyzing spatial data is that locations that are closer together are more likely to have similar output values (for example, daily precipitation totals). For a particular event, a common modeling approach for spatial data is to observe data at numerous locations and make predictions for locations that were unobserved. In this work, we extend this within-event modeling approach by additionally learning about the uncertainty across different events. Through this extension, we are able to quantify uncertainty for a particular computer model (which may be modeling tropical cyclone precipitation, for example) that does not provide any uncertainty on its own. This framework can be utilized to quantify uncertainty across a vast array of computer model outputs where more than one event or model run has been obtained. We also study how inputting different values into a computer model can influence the values it produces.
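The kriging predictor the abstract starts from is the basic building block of this framework. The sketch below computes the Gaussian process (simple kriging) posterior mean and variance at unobserved locations with a squared-exponential covariance and fixed hyperparameters; it illustrates only that building block, not the dissertation's hierarchical model across multiple runs, and all data and parameter values are assumed.

```python
# Gaussian process regression / simple kriging with a squared-exponential
# covariance, zero prior mean, and fixed (assumed) hyperparameters.
import numpy as np

def sq_exp_cov(x1, x2, variance=1.0, length=0.5):
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / length**2)

rng = np.random.default_rng(3)
x_obs = rng.uniform(0.0, 1.0, 15)                      # observed locations
y_obs = np.sin(2 * np.pi * x_obs) + 0.1 * rng.standard_normal(15)
x_new = np.linspace(0.0, 1.0, 5)                       # prediction locations

noise = 0.1**2
K = sq_exp_cov(x_obs, x_obs) + noise * np.eye(x_obs.size)
K_star = sq_exp_cov(x_new, x_obs)
K_ss = sq_exp_cov(x_new, x_new)

# Posterior mean and variance via a Cholesky factorization of K.
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_obs))
mean = K_star @ alpha                                  # kriging predictor
v = np.linalg.solve(L, K_star.T)
var = np.diag(K_ss) - np.sum(v**2, axis=0)             # predictive variance

for xi, m, s2 in zip(x_new, mean, var):
    print(f"x={xi:.2f}  mean={m:+.3f}  sd={np.sqrt(s2):.3f}")
```

In the hierarchical setting described above, the covariance hyperparameters would themselves carry posterior uncertainty (via the Gibbs sampler) rather than being fixed as they are here.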
|