11

Galaxies as Clocks and the Universal Expansion

Ahlström Kjerrgren, Anders January 2021
The Hubble parameter H(z) is a measure of the expansion rate of the universe at redshift z. One method to determine it relies on inferring the slope of the redshift with respect to cosmic time, where galaxy ages can be used as a proxy for the latter. This method is used by Simon et al. in [1], where they present 8 determinations of the Hubble parameter. The results are surprisingly precise given the precision of their data set. Therefore, we reanalyze their data using three methods: chi-square minimization, Monte Carlo sampling, and Gaussian processes. The first two methods show that obtaining 8 independent values of the Hubble parameter yields significantly larger uncertainties than those presented by Simon et al. The last method yields a continuous inference of H(z) with lower uncertainties. However, this comes at the cost of strong correlations, meaning that inferences across a wide range of redshifts provide essentially the same information. Furthermore, we demonstrate that obtaining 8 independent values of the Hubble parameter with the same precision as in [1] requires either significantly increasing the size of the data set or significantly decreasing the uncertainty in the data. We conclude that their resulting Hubble parameter values cannot be derived from the employed data. [1] J. Simon, L. Verde and R. Jimenez, Constraints on the redshift dependence of the dark energy potential, Physical Review D 71, 123001 (2005).
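As a rough illustration of the "cosmic chronometer" method being reanalyzed, the sketch below fits a line to synthetic age-redshift data by weighted chi-square minimization and converts the slope dt/dz into H(z) = -1/(1+z) dz/dt. All numbers (the redshift bin, the toy age-redshift relation, the 0.5 Gyr age errors) are hypothetical assumptions, not the data of Simon et al.; the propagated uncertainty merely illustrates how noisy galaxy ages translate into a weakly constrained H(z).

```python
import numpy as np

# Minimal sketch: H(z) = -1/(1+z) * dz/dt, with galaxy ages as a proxy
# for cosmic time t. All data below are synthetic, for illustration only.
rng = np.random.default_rng(0)

# Hypothetical age-redshift pairs (Gyr) in a narrow redshift bin.
z = np.linspace(0.10, 0.20, 10)
t_true = 13.0 - 10.0 * z                             # toy linear age-redshift relation
t_obs = t_true + rng.normal(0.0, 0.5, size=z.size)   # 0.5 Gyr age errors (assumed)
sigma_t = np.full(z.size, 0.5)

# Chi-square (weighted least-squares) fit of t(z) = a + b*z;
# the slope b estimates dt/dz within the bin.
w = 1.0 / sigma_t**2
A = np.vstack([np.ones_like(z), z]).T
cov = np.linalg.inv(A.T @ (w[:, None] * A))          # parameter covariance
a, b = cov @ A.T @ (w * t_obs)

z_mid = z.mean()
# 1/Gyr -> km/s/Mpc conversion factor is 977.8.
H = -1.0 / (1.0 + z_mid) / b * 977.8
sigma_H = abs(H / b) * np.sqrt(cov[1, 1])            # propagate slope uncertainty

print(f"H(z={z_mid:.2f}) ~ {H:.0f} +/- {sigma_H:.0f} km/s/Mpc")
```

With these toy numbers the relative uncertainty on H(z) is large, which is exactly the kind of behavior the reanalysis above points to: precise H(z) values are hard to extract from a small set of noisy ages.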
12

Dynamics of a Metabolic Network with a Constraint-Based Model: An Approach by Sampling Solution Trajectories

Duigou, Thomas 13 May 2015
In this thesis, I propose an approach based on the formalism of constraint-based models to study the dynamics of a metabolic system. By combining sampling of the solution space with a "feasibility" constraint between the considered time periods, this approach makes it possible to model the dynamics of a metabolic system while taking into account the variability of experimental measurements. The feasibility constraint between time periods ensures that each "solution trajectory" corresponds to a succession of flux maps leading to concentration kinetics consistent with the experimental measurements. The generated populations of solution trajectories support several kinds of analysis. On the one hand, the predicted flux distributions can be used to estimate the most plausible flux distributions within the studied network. On the other hand, the distribution of predicted concentrations makes it possible to assess the model used to study the metabolic network. Because this approach is based on the constraint-based modeling formalism, it can, under the steady-state assumption, handle metabolic networks of relatively large size and use experimental data that are easy to measure, such as biomass and extracellular metabolite concentrations. This "solution trajectories" approach was used to study the dynamics of the metabolism of Corynebacterium glutamicum grown under biotin-limited conditions. The results both attest that the method works and suggest several hypotheses about the biological phenomena that take place under this particular growth condition.
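As a loose sketch of the constraint-based formalism described above (not the thesis's trajectory-sampling algorithm itself), the example below samples steady-state flux maps satisfying S v = 0 for a toy four-reaction network by solving linear programs with random objective directions. The network, the bounds standing in for measured uptake, and the random-objective sampling scheme are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: A_ext -> A -> B -> biomass, with a branch A -> C_ext.
#   reactions: v0 uptake, v1 A->B, v2 B->biomass, v3 A->C_ext
# Rows of S are internal metabolites A and B (steady state: S @ v = 0).
S = np.array([
    [1, -1,  0, -1],   # metabolite A
    [0,  1, -1,  0],   # metabolite B
])

n = S.shape[1]
lb = np.zeros(n)       # irreversible reactions (assumption)
ub = np.full(n, 10.0)
ub[0] = 5.0            # hypothetical measured uptake bound

rng = np.random.default_rng(1)
samples = []
for _ in range(500):
    c = rng.normal(size=n)  # random objective direction -> a vertex of the polytope
    res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]),
                  bounds=list(zip(lb, ub)), method="highs")
    if res.success:
        samples.append(res.x)

samples = np.array(samples)
print("mean flux map:", samples.mean(axis=0).round(2))
```

Note that random-objective vertex enumeration is only a crude stand-in for proper solution-space sampling (e.g., hit-and-run), and the thesis additionally chains such samples across time periods via the feasibility constraint.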
13

Improving the Depiction of Uncertainty in Simulation Models by Exploiting the Potential of Gaussian Quadratures

Stepanyan, Davit 12 March 2021
Simulation models are an established tool for assessing the impacts of exogenous shocks in complex systems. Recent increases in available computational power and speed have led to simulation models with greater levels of detail and complexity. This trend, however, has raised concerns regarding the uncertainty of such model results and has therefore motivated many users of simulation models to account for uncertainty in their simulations. One way to do so systematically is to integrate stochastic elements into the model equations, turning the model into a problem of (multiple) numerical integration. As such problems rarely have analytical solutions, numerical approximation methods are applied. The uncertainty quantification techniques currently used in simulation models are either computationally expensive (Monte Carlo [MC]-based methods) or produce results of varying quality (Gaussian quadratures [GQs]). Given the importance of efficient uncertainty quantification methods in the era of big data, this thesis aims to develop methods that decrease the approximation errors of GQs and to make these methods accessible to the wider research community. For this purpose, two novel uncertainty quantification methods are developed and integrated into four different large-scale partial and general equilibrium models addressing agro-environmental issues. This thesis provides methodological developments and is of high relevance for applied simulation modelers who struggle with computationally burdensome stochastic modeling methods. Although the methods are developed and tested in large-scale simulation models addressing agricultural issues, they are not restricted to a particular model type or field of application.
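A minimal sketch of the MC-versus-GQ trade-off described above, assuming a one-dimensional normally distributed input and a cheap stand-in for an expensive equilibrium model: a 5-node Gauss-Hermite quadrature approximates the same expectation that Monte Carlo estimates with 100,000 model evaluations. The response function f and the input distribution are illustrative assumptions, not anything from the thesis.

```python
import numpy as np

def f(x):
    # Hypothetical model response; stands in for an expensive simulation run.
    return np.exp(0.1 * x) / (1.0 + x**2)

mu, sigma = 1.0, 0.4   # uncertain input X ~ N(mu, sigma^2), assumed

# Gauss-Hermite quadrature with change of variables for a normal input:
#   E[f(X)] ~ (1/sqrt(pi)) * sum_i w_i * f(mu + sqrt(2)*sigma*x_i)
nodes, weights = np.polynomial.hermite.hermgauss(5)   # only 5 model evaluations
gq = np.sum(weights * f(mu + np.sqrt(2.0) * sigma * nodes)) / np.sqrt(np.pi)

# Monte Carlo reference: accurate, but needs many model evaluations.
rng = np.random.default_rng(0)
mc = f(rng.normal(mu, sigma, size=100_000)).mean()

print(f"Gauss-Hermite (5 nodes):  {gq:.5f}")
print(f"Monte Carlo (100k draws): {mc:.5f}")
```

The appeal of GQs in this setting is exactly this evaluation count: a handful of deterministic runs instead of thousands of stochastic ones, with the approximation error depending on how smooth the model response is.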
14

Efficient Approaches to the Treatment of Uncertainty in Satisfying Regulatory Limits

Grabaskas, David 30 August 2012
No description available.
15

Newsvendor Models With Monte Carlo Sampling

Ekwegh, Ijeoma W 01 August 2016
The newsvendor model is used to solve inventory problems in which demand is random. This thesis focuses on a method that uses Monte Carlo sampling to estimate the order quantity that either maximizes revenue or minimizes cost when demand is uncertain. Given data, the Monte Carlo approach is used to sample demand over scenarios and to estimate the probability density function. A bootstrapping process then yields an empirical distribution for the order quantity that maximizes the expected profit. Finally, the method is applied to a newsvendor example to show that it works in maximizing profit.
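A minimal sketch of this approach, with an assumed sell price, unit cost, and demand sample (none of which come from the thesis): Monte Carlo estimation of expected profit over a grid of candidate order quantities, wrapped in a bootstrap to obtain an empirical distribution for the profit-maximizing quantity q*.

```python
import numpy as np

rng = np.random.default_rng(0)
price, cost = 5.0, 3.0                       # assumed sell price and unit cost

demand_data = rng.normal(100, 20, size=200)  # hypothetical observed demand sample

def best_q(demand, grid=np.arange(50, 151)):
    # Expected profit for each candidate q: revenue on min(D, q) minus cost of q.
    profit = price * np.minimum(demand[:, None], grid) - cost * grid
    return grid[profit.mean(axis=0).argmax()]

# Bootstrap: resample the demand data, re-estimate q*, and collect the
# empirical distribution of the profit-maximizing order quantity.
q_stars = np.array([
    best_q(rng.choice(demand_data, size=demand_data.size, replace=True))
    for _ in range(1000)
])

print(f"q* median: {np.median(q_stars):.0f}, "
      f"90% interval: [{np.percentile(q_stars, 5):.0f}, "
      f"{np.percentile(q_stars, 95):.0f}]")
```

For this toy normal demand the result should cluster near the classical critical-fractile solution, q* = F^-1((price - cost)/price); the bootstrap spread shows how much the estimate moves with the sampled data.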
