251 |
Empirical Modeling of Planetary Boundary Layer Dynamics Under Multiple Precipitation Scenarios Using a Two-Layer Soil Moisture Approach: An Example From a Semiarid Shrubland. Sanchez-Mejia, Zulia Mayari; Papuga, Shirley A.
In semiarid regions, where water resources are limited and precipitation dynamics are changing, understanding the land surface-atmosphere interactions that regulate the coupled soil moisture-precipitation system is key for resource management and planning. We present a modeling approach to study soil moisture and albedo controls on planetary boundary layer height (PBLh). We used Santa Rita Creosote Ameriflux and Tucson Airport atmospheric sounding data to generate empirical relationships between soil moisture, albedo, and PBLh. These empirical relationships showed that approximately 50% of the variation in PBLh can be explained by soil moisture and albedo, with additional knowledge gained by dividing the soil profile into two layers. We therefore coupled these empirical relationships with soil moisture estimated using a two-layer bucket approach to model PBLh under six precipitation scenarios. Overall, we observed that decreases in precipitation tend to limit the recovery of the PBL at the end of the wet season. However, increases in winter precipitation, despite decreases in summer precipitation, may provide opportunities for positive feedbacks that generate further winter precipitation. Our results highlight that the response of soil moisture, albedo, and PBLh will depend not only on changes in annual precipitation but also on the frequency and intensity of those changes. We argue that because albedo and soil moisture data are readily available at multiple temporal and spatial scales, developing empirical relationships for use in land surface-atmosphere applications has great potential for exploring the consequences of climate change.
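The two-layer bucket approach and the empirical PBLh relation described above can be sketched as follows. This is a minimal illustration only: all parameter values (layer capacities, loss rates, regression coefficients, the synthetic precipitation series) are invented assumptions, not the authors' fitted values.

```python
import numpy as np

def two_layer_bucket(precip, cap_shallow=20.0, cap_deep=80.0,
                     et_shallow=0.15, drain=0.05):
    """Track shallow and deep soil moisture (mm), day by day.

    Rain fills the shallow layer; overflow percolates to the deep layer;
    each layer loses a fixed fraction per day (ET on top, drainage below).
    All rates are illustrative assumptions.
    """
    s, d = 0.0, 0.0
    shallow, deep = [], []
    for p in precip:
        s += p
        overflow = max(s - cap_shallow, 0.0)   # excess percolates downward
        s -= overflow
        d = min(d + overflow, cap_deep)
        s -= et_shallow * s                    # evaporative loss, top layer
        d -= drain * d                         # slow drainage, deep layer
        shallow.append(s)
        deep.append(d)
    return np.array(shallow), np.array(deep)

def pbl_height(s, d, albedo):
    # Illustrative linear empirical relation in shallow moisture, deep
    # moisture, and albedo; the coefficients are invented, not the paper's.
    return 2500.0 - 15.0 * s - 8.0 * d - 2000.0 * (albedo - 0.2)

rng = np.random.default_rng(0)
precip = rng.gamma(0.5, 4.0, size=120)         # a synthetic wet season (mm/day)
s, d = two_layer_bucket(precip)
albedo = 0.25 - 0.002 * s                      # wetter soil -> darker surface
print(pbl_height(s, d, albedo).mean())
```

Splitting the profile into two layers lets a brief storm raise shallow moisture (and lower albedo) without implying deep-layer recharge, which is the extra information the two-layer version adds.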
|
252 |
Nuclear data uncertainty propagation and uncertainty quantification in nuclear codes. Fiorito, Luca (03 October 2016)
Uncertainties in nuclear model responses must be quantified to define safety limits, minimize costs, and define operational conditions in design. Response uncertainties can also be used to provide feedback on the quality and reliability of parameter evaluations, such as nuclear data. The uncertainties of predictive model responses stem from several sources, e.g. nuclear data, model approximations, numerical solvers, and the influence of random variables. It has been shown that the largest quantifiable sources of uncertainty in nuclear models, such as neutronics and burnup calculations, are the nuclear data, which are provided as evaluated best estimates and uncertainties/covariances in data libraries. Nuclear data uncertainties and/or covariances must be propagated to the model responses with dedicated uncertainty propagation tools. However, most nuclear codes for neutronics and burnup models lack these capabilities and produce best-estimate results without uncertainties. In this work, nuclear data uncertainty propagation focused on the SCK•CEN burnup code ALEPH-2 and the Monte Carlo N-Particle code MCNP. Two sensitivity analysis procedures, FSAP and ASAP, based on linear perturbation theory, were implemented in ALEPH-2. These routines can propagate nuclear data uncertainties in pure decay models. ASAP and ALEPH-2 were tested and validated against decay heat and its uncertainty for several fission pulses and for the MYRRHA subcritical system. The decay heat uncertainty is needed to assess the reliability of the decay heat removal systems and to prevent overheating and mechanical failure of reactor components. It was shown that the propagation of independent fission yield and decay data uncertainties can also be carried out with ASAP in neutron irradiation models. Because of the ASAP limitations, the Monte Carlo sampling solver NUDUNA was used to propagate cross section covariances.
The applicability constraints of ASAP drove our studies towards the development of a tool that could propagate the uncertainty of any nuclear datum. In addition, the uncertainty propagation tool had to operate with multiple nuclear codes and systems, including non-linear models. For this purpose, the Monte Carlo sampling code SANDY was developed. SANDY is independent of the predictive model, as it interacts only with the nuclear data in input. Nuclear data are sampled from multivariate probability density functions and propagated through the model according to Monte Carlo sampling theory. Not only can SANDY propagate nuclear data uncertainties and covariances to the model responses, it can also identify the impact of each uncertainty contributor by decomposing the response variance. SANDY was extensively tested against integral parameters and was used to quantify the neutron multiplication factor uncertainty of the VENUS-F reactor. Further uncertainty propagation studies were carried out for the burnup models of light water reactor benchmarks. Our studies identified fission yields as the largest source of uncertainty for the nuclide density evolution curves of several fission products. However, the current data libraries provide evaluated fission yields and uncertainties devoid of covariance matrices. The lack of fission yield covariance information does not comply with the conservation equations that apply to a fission model and generates inconsistency in the nuclear data. In this work, we generated fission yield covariance matrices using a generalised least-squares method and a set of physical constraints. These covariance matrices resolve the inconsistency in the nuclear data libraries and reduce the role of fission yields in the uncertainty quantification of burnup model responses. / Doctorat en Sciences de l'ingénieur et technologie (Doctorate in Engineering Sciences and Technology)
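The Monte Carlo sampling scheme described above, including the variance decomposition by uncertainty contributor, can be sketched on a toy model. This is an illustration of the general technique only: the three-parameter "model", its best estimates, and its covariance matrix are all invented, and the sketch does not use the actual SANDY code.

```python
import numpy as np

rng = np.random.default_rng(42)

mean = np.array([1.0, 0.5, 2.0])      # toy "nuclear data" best estimates
cov = np.diag([0.01, 0.0025, 0.04])   # toy covariance matrix (uncorrelated here)

def model(d):
    # A toy non-linear response standing in for a neutronics/burnup code.
    return d[0] * np.exp(-d[1]) + d[2] ** 0.5

# Sample the data from a multivariate normal and run the model per sample.
n = 5000
samples = rng.multivariate_normal(mean, cov, size=n)
responses = np.array([model(d) for d in samples])
total_var = responses.var()

# One-at-a-time decomposition: perturb only datum i, hold the rest at
# their best estimates, and compare the partial variance to the total.
for i in range(3):
    s = np.tile(mean, (n, 1))
    s[:, i] = samples[:, i]
    var_i = np.array([model(d) for d in s]).var()
    print(f"datum {i}: {100 * var_i / total_var:.1f}% of total variance")
```

Because the sampler only perturbs the model's input data, the same loop works unchanged whether the "model" is a one-line function or an external code invoked per sample, which is the sense in which such a tool is model-independent.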
|
253 |
Metody výpočtu VaR pro tržní a kreditní rizika / Methods of calculating Value at Risk for market and credit risks. Štolc, Zdeněk (January 2008)
This thesis provides a theoretical explanation of the basic methods for calculating Value at Risk for market and credit risk. For market risk, the variance-covariance method, historical simulation, and Monte Carlo simulation are developed in detail, above all for nonlinear portfolios. For all methods, the assumptions underlying their application are highlighted, and the methods are compared. For credit risk, a theoretical description of the CreditMetrics, CreditRisk+, and KMV models is given. The analytical part quantifies Value at Risk for two portfolios. On a nonlinear currency portfolio, the particular assumptions of the variance-covariance method and Monte Carlo simulation are tested, and Value at Risk is then calculated by these methods. The calculation of Credit Value at Risk is performed on a portfolio of US corporate bonds with the help of the CreditMetrics model.
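For a linear portfolio, the variance-covariance method discussed above reduces to a closed-form quantile of a normal P&L distribution. A minimal sketch, with invented positions and an invented daily return covariance matrix:

```python
import numpy as np
from scipy.stats import norm

# Illustrative two-asset portfolio: currency exposures and a daily
# return covariance matrix (both values are assumptions for the sketch).
positions = np.array([1_000_000.0, 500_000.0])        # exposure per asset
cov_daily = np.array([[1.0e-4, 0.3e-4],
                      [0.3e-4, 2.25e-4]])             # daily return covariance

# Portfolio P&L standard deviation in currency units: sqrt(w' * Sigma * w).
sigma_p = np.sqrt(positions @ cov_daily @ positions)

# 1-day 99% VaR under the normality assumption: z-quantile times sigma.
var_99 = norm.ppf(0.99) * sigma_p
print(round(var_99, 2))
```

The normality and linearity assumptions are exactly what break down for the nonlinear (option-bearing or currency-derivative) portfolios the thesis emphasizes, which is why the historical and Monte Carlo methods are treated alongside it.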
|
254 |
Value at Risk: Historická simulace, variančně kovarianční metoda a Monte Carlo simulace / Value at Risk: Historical simulation, variance-covariance method and Monte Carlo simulation. Felcman, Adam (January 2012)
The diploma thesis "Value at Risk: Historical simulation, variance covariance method and Monte Carlo" aims to quantify the risk borne by a real bond portfolio. The thesis is divided into two major parts, theoretical and practical. The first discusses the theory of VaR and conditional VaR, including their advantages and disadvantages. It also describes the three basic methods for calculating VaR and CVaR, with adjustments to each method intended to increase the reliability of the results. The last chapter presents the results of the VaR and CVaR computations. Graphs, tables, and images accompany the results section to make the outputs clearer and better organized.
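The VaR/CVaR pair mentioned above is easiest to see in the historical-simulation setting: VaR is a loss quantile of the observed P&L history, and CVaR (expected shortfall) is the mean loss beyond that quantile. A minimal sketch on a synthetic P&L series (the real thesis uses an actual bond portfolio's history):

```python
import numpy as np

rng = np.random.default_rng(7)
pnl = rng.normal(0.0, 10_000.0, size=1000)    # synthetic: 1000 days of P&L

losses = -pnl                                 # work in loss units
alpha = 0.99
var = np.quantile(losses, alpha)              # 99% VaR: the loss quantile
cvar = losses[losses >= var].mean()           # CVaR: mean loss beyond VaR
print(var, cvar)
```

By construction CVaR is at least as large as VaR at the same confidence level, which is one reason it is presented as a complement: it says how bad the tail is, not just where it starts.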
|
255 |
Inferences about Parameters of Trivariate Normal Distribution with Missing Data. Wang, Xing (05 July 2013)
The multivariate normal distribution is commonly encountered in many fields, and missing values are a frequent issue in practice. The purpose of this research was to estimate the parameters of the trivariate normal distribution with permutation-symmetric covariance, for complete data and for all possible patterns of incomplete data. In this study, maximum likelihood estimators (MLEs) with missing data were derived, and the properties of the MLEs as well as their sampling distributions were obtained. A Monte Carlo simulation study was used to evaluate the performance of the considered estimators both when ρ was known and when it was unknown. All results indicated that, compared to estimators obtained by omitting observations with missing data, the estimators derived in this work performed better. Furthermore, when ρ was unknown, using an estimate of ρ led to the same conclusion.
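The core comparison above, using incomplete observations versus discarding them, can be illustrated with a small simulation. This sketch only contrasts complete-case mean estimation with an estimator that uses every observed value; it is not the thesis's actual MLE, and the sample size, ρ, and missingness rate are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
rho, n = 0.5, 500

# Permutation-symmetric covariance: unit variances, all correlations = rho.
cov = np.full((3, 3), rho) + np.diag([1.0 - rho] * 3)
x = rng.multivariate_normal([0.0, 0.0, 0.0], cov, size=n)

mask = rng.random(x.shape) < 0.3                      # ~30% values missing
x_missing = np.where(mask, np.nan, x)

# Complete-case: keep only rows with no missing value (discards most rows).
complete = x_missing[~np.isnan(x_missing).any(axis=1)]
mean_complete_case = complete.mean(axis=0)

# Available-case: use every observed value per coordinate.
mean_available = np.nanmean(x_missing, axis=0)

print(np.abs(mean_complete_case).max(), np.abs(mean_available).max())
```

The available-case estimator is built from roughly twice as many values per coordinate here, which is the intuition behind the thesis's finding that estimators exploiting incomplete observations outperform those that omit them.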
|
256 |
Geodesics of ruled surfaces. Ramirez, Steven John (01 January 2001)
The focus of this thesis is on the investigation of the geodesics of ruled surfaces.
|
257 |
Quantização da partícula não relativística em espaços curvos como superfícies do Rn / Quantization of the non-relativistic particle in curved spaces as surfaces of Rn. Resende, Maria Fernanda Araujo de (16 November 2011)
In this work we study the construction of a quantum theory for a particle moving non-relativistically in a curved space, treated as a submanifold of a Euclidean space, with particular emphasis on the geometric aspects of this approach, since other works on the same subject do not address them. Besides showing that the use of a theory of constrained systems does not help to remove the ambiguities of the quantum formulation, which are directly related to operator ordering, we also present, through a specific quantization carried out under Dirac's prescription, elements that allow the construction of a quantum formalism that is not only covariant but also free of any quantum correction. In addition, we make some general comments on other possible classical approaches to the same problem, with a view to constructing quantum theories associated with the system under consideration.
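As a sketch of the standard setting (not necessarily the exact operator constructed in this thesis), covariant quantization of a free non-relativistic particle on a curved configuration space with induced metric $g_{ij}$ typically yields a kinetic term built from the Laplace-Beltrami operator:

```latex
\hat{H} \;=\; -\frac{\hbar^{2}}{2m}\,\Delta_{\mathrm{LB}}
        \;=\; -\frac{\hbar^{2}}{2m}\,\frac{1}{\sqrt{g}}\,
              \partial_{i}\!\left(\sqrt{g}\,g^{ij}\,\partial_{j}\right),
\qquad g = \det\!\left(g_{ij}\right).
```

In this language, a formalism "free of any quantum correction" can be read as one without an additional scalar-curvature term $\alpha\hbar^{2}R$, whose undetermined coefficient $\alpha$ is a standard face of the operator-ordering ambiguity mentioned above.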
|
258 |
Estimation of Bivariate Spatial Data. Onnen, Nathaniel J. (01 October 2021)
No description available.
|
259 |
An investigation into the methodologies of value-at-risk and a simulation process of a portfolio of financial instruments. Ballam, Gamal Abdel Hussein (January 2004)
Magister Scientiae - MSc / Financial companies such as investment and commercial banks, insurance companies, and mutual and pension funds hold assets in the form of financial instruments in portfolios. Financial instruments have proliferated into many forms, namely derivatives, common stock, corporate and government bonds, foreign exchange, and contracts. With so many financial instruments, companies can hold very large and diversified portfolios for which they must quantify the risk. With the high-profile calamities that have rocked the financial world lately, the need for better risk management has never been greater. Value-at-Risk (VaR) is the latest addition to the investor's toolkit as far as measures of risk are concerned, and it complements the existing risk measures well. Unfortunately, VaR is not unanimously accepted and has attracted many critics over the years. The purpose of this thesis is threefold: to introduce the reader to the VaR concept; to discuss the different methods that exist to calculate VaR; and, finally, to simulate the VaR of a portfolio of government bonds. The first part of this research introduces the reader to the general forms of risk and their management, the role that existing risk measures have played so far, and the emergence of the new technique, VaR. The pros and cons that accompany a new technique are discussed, as well as the history of VaR. The second part covers the different methods that exist to compute the VaR of a portfolio. VaR methodologies usually fall into three categories, namely parametric, historical, and Monte Carlo. In this research, the advantages and disadvantages of these three methods are discussed, together with a step-by-step procedure for calculating the VaR of a portfolio using any of the three methods.
The practical side of this thesis deals with the VaR simulation of a portfolio of financial instruments. The chosen financial instruments are four South African government bonds with different characteristics. The VaR for this particular portfolio is then simulated by the three main methods. Eleven different simulations are run and compared against a control simulation (benchmark portfolio) to see how the factors influencing the VaR measure behave under different conditions. The main idea is to examine how VaR measures change under different portfolio characteristics and to interpret these changes.
|
260 |
CO2-emissions from rivers and streams: Seasonal variation of pCO2-levels and CO2-fluxes. Vandeburie, Emile (January 2020)
Since the industrial revolution, CO2 levels have been increasing at a rate unprecedented in the recent history of the Earth. To mitigate and adapt to ongoing climate change, it is important to understand the global carbon cycle and each component that plays a role in it. Some studies suggest that the influence of inland waters on the total carbon budget has been underestimated. To address this issue, eddy covariance measurements have been carried out on the boreal Indalsälven River in front of the Kattstrupeforsen water dam. In this study, continuous data were collected, including the air-river CO2 flux, pCO2 values in the air and the water, and additional meteorological parameters such as wind speed, relative humidity, and air and water temperature. The aim of this study is to examine the seasonal variation in pCO2 levels and CO2 fluxes on the Indalsälven River. The data indicate that the CO2 fluxes were mainly positive from January to July (average flux = 0.2 μmol m-2 s-1) and mainly negative from September to November (average flux = -0.59 μmol m-2 s-1), with an average flux of 0.212 μmol m-2 s-1 during 2019. The main range of monthly CO2 fluxes lies between -2 and 2 μmol m-2 s-1, with the exception of March and December, which show a wider range. The pCO2 levels in the water mainly range between 400 and 1000 ppm, with the exception of two periods: one at the end of May, with a peak above 4000 ppm that can be explained by the spring flood, and one at the end of July and beginning of August, with a peak around 3000 ppm.
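The eddy covariance flux underlying the measurements above is, at its core, the time-averaged product of the fluctuations of vertical wind speed (w') and CO2 concentration (c') over an averaging period. A minimal sketch on synthetic high-frequency series (real processing adds coordinate rotation, despiking, density corrections, and quality control, none of which are shown here):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 36_000                                    # e.g. 30 min of data at 20 Hz

# Synthetic series: vertical wind (m s-1) and CO2 density (umol m-3),
# with a deliberate correlation so the flux comes out positive (upward).
w = rng.normal(0.0, 0.3, size=n)
c = 15.0 + 0.5 * w + rng.normal(0.0, 1.0, size=n)

# Reynolds decomposition: subtract the period means to get fluctuations.
w_prime = w - w.mean()
c_prime = c - c.mean()

flux = (w_prime * c_prime).mean()             # CO2 flux, umol m-2 s-1
print(round(flux, 3))
```

The sign convention matches the abstract: a positive covariance means net upward transport (the river outgassing CO2), a negative one means net uptake.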
|