31

Monte Carlo simulations of D-mesons with extended targets in the PANDA detector

Gustafsson, Mattias January 2016
Within the PANDA experiment, proton antiproton collisions will be studied in order to gain knowledge about the strong interaction. One interesting aspect is the production and decay of charmed hadrons. The charm quark is three orders of magnitude heavier than the light up and down quarks which constitute the matter we consist of. The detection of charmed particles is a challenge since they are rare and often hidden in a large background. Three different targets will be used at the experiment: the cluster-jet, the untracked pellet and the tracked pellet. All three targets meet the experimental requirement of high luminosity, but they have different properties concerning the effect on beam quality and the determination of the interaction point. In this thesis, simulations and reconstruction of charmed D-mesons using the three different targets have been carried out. Data quality, such as momentum resolution and vertex resolution, has been studied, and the effect of the different targets on the D-meson reconstruction efficiency has been analysed. The results are consistent with those of a similar study from 2006, but provide additional information since the detector response is taken into account. Furthermore, a new target distribution has been implemented in the software package; these are the first results obtained with a cylindrical target distribution. The results are very important for PANDA since they show the limitations of the different target types regarding the possibilities to reduce background. Simulations with the new target distribution have been made for all three targets and the results are presented.
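Sampling interaction vertices from such a cylindrical target volume is straightforward to illustrate. A minimal sketch (illustrative radius and length, not the PANDA software implementation):

```python
import numpy as np

def sample_cylindrical_vertices(n, radius_mm=1.0, length_mm=10.0, rng=None):
    """Sample n interaction vertices uniformly from a cylinder centred
    on the beam axis (z). Dimensions are illustrative placeholders."""
    rng = np.random.default_rng() if rng is None else rng
    # r = R*sqrt(u) gives a uniform density over the disc, not a pile-up at r = 0
    r = radius_mm * np.sqrt(rng.random(n))
    phi = 2.0 * np.pi * rng.random(n)
    z = length_mm * (rng.random(n) - 0.5)
    return np.column_stack((r * np.cos(phi), r * np.sin(phi), z))

vertices = sample_cylindrical_vertices(100_000)
print(vertices.mean(axis=0), vertices.std(axis=0))
```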
32

Monte Carlo simulation and experimental studies of the production of neutron-rich medical isotopes using a particle accelerator.

Rosencranz, Daniela Necsoiu 05 1900
The development of nuclear medicine has led to an increasing demand for the production of radioisotopes with suitable nuclear and chemical properties. Furthermore, it is evident from the literature that producing radioisotopes with charged-particle accelerators instead of nuclear reactors is gaining popularity. The main advantages of producing medical isotopes with accelerators are carrier-free radionuclides of short-lived isotopes, improved handling, reduced radioactive waste, and lower cost of isotope fabrication. Proton-rich isotopes are the result of nuclear interactions between enriched stable isotopes and energetic protons. An interesting observation is that during the production of proton-rich isotopes, fast and intermediately fast neutrons are also produced as a by-product of nuclear reactions such as (p,xn). This observation suggests that it may be possible to use these neutrons to activate secondary targets for the production of neutron-rich isotopes. The study of secondary radioisotope production with fast neutrons from (p,xn) reactions using a particle accelerator is the main goal of the research in this thesis.
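The feasibility of such secondary activation is governed by the standard activation equation, A(t) = NσΦ(1 − e^(−λt)). A back-of-the-envelope sketch, where every number is an illustrative placeholder rather than a value from this thesis:

```python
import numpy as np

def induced_activity(n_atoms, sigma_cm2, flux, half_life_s, t_irr_s):
    """Activity (Bq) of a secondary target after irradiation time t_irr,
    from the standard activation equation A = N*sigma*flux*(1 - exp(-lambda*t))."""
    lam = np.log(2.0) / half_life_s
    return n_atoms * sigma_cm2 * flux * (1.0 - np.exp(-lam * t_irr_s))

# Illustrative numbers only: 1e22 target atoms, 1 barn capture cross-section,
# 1e8 n/cm^2/s secondary-neutron flux, 15 h half-life, 2 h irradiation.
A = induced_activity(1e22, 1e-24, 1e8, 15 * 3600, 2 * 3600)
print(f"induced activity ≈ {A:.3e} Bq")
```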
33

Analysis Of Sequential Barycenter Random Probability Measures via Discrete Constructions

Valdes, LeRoy I. 12 1900
Hill and Monticino (1998) introduced a constructive method for generating random probability measures with a prescribed mean or distribution on the mean. The method involves sequentially generating an array of barycenters that uniquely defines a probability measure. This work analyzes statistical properties of the measures generated by sequential barycenter array (SBA) constructions. Specifically, it addresses how changing the base measures of the construction affects the statistics of the measures generated. A relationship between statistics associated with a finite-level version of the SBA construction and the full construction is developed. Monte Carlo experiments are used to simulate the effect that changing the base measures has on the statistics associated with the finite-level construction.
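The flavor of a finite-level barycenter construction can be sketched as follows. This is one plausible realization (uniform base measures, binary splitting at the barycenter), not necessarily Hill and Monticino's exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(9)

def sba_measure(a, b, m, depth, mass=1.0):
    """Recursively split [a, b] (carrying `mass`, barycenter m) at m.
    Sub-barycenters l, r are drawn from the base measure (uniform here);
    the mass split p*l + (1-p)*r = m keeps the parent barycenter fixed.
    Returns the atoms (location, weight) of a finite-level random measure."""
    if depth == 0:
        return [(m, mass)]
    l = rng.uniform(a, m)             # base measure: uniform on each subinterval
    r = rng.uniform(m, b)
    p = (r - m) / (r - l)             # left mass so the barycenter stays at m
    return (sba_measure(a, m, l, depth - 1, mass * p)
            + sba_measure(m, b, r, depth - 1, mass * (1 - p)))

atoms = sba_measure(0.0, 1.0, 0.3, depth=8)   # prescribed mean 0.3
x = np.array([loc for loc, _ in atoms])
w = np.array([wt for _, wt in atoms])
print(f"total mass {w.sum():.6f}, mean {np.dot(x, w):.6f}")
```

Because each split preserves the mass-weighted barycenter, the generated measure has mean exactly 0.3 at every level, which is the prescribed-mean property such constructions are designed for.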
34

Numerical solutions to a class of stochastic partial differential equations arising in finance

Bujok, Karolina Edyta January 2013
We propose two alternative approaches to the numerical valuation of basket credit derivatives in an N-name structural model where the number of entities, N, is large, and where the names are independent and identically distributed random variables conditional on common random factors. In the first framework, we treat an N-name model as a set of N Bernoulli random variables indicating a default or a survival. We show that certain expected functionals of the proportion L_N of variables in a given state converge at rate 1/N as N → ∞. Based on these results, we propose a multilevel simulation algorithm using a family of sequences with increasing length, to obtain estimators for these expected functionals with a mean-square error of ε² and computational complexity of order ε⁻², independent of N. In particular, this optimal complexity order also holds for the infinite-dimensional limit. Numerical examples are presented for tranche spreads of basket credit derivatives. In the second framework, we extend the approximation of Bush et al. [13] to a structural jump-diffusion model with discretely monitored defaults. Under this approach, an N-name model is represented as a system of particles with an absorbing boundary that is active on a discrete time set, and the loss of the portfolio is given as a function of the empirical measure of the system. We show that, for the infinite system, the empirical measure has a density with respect to Lebesgue measure that satisfies a stochastic partial differential equation. We then develop an algorithm to efficiently estimate CDO index and tranche spreads consistent with the underlying credit default swaps, using a finite difference simulation for the resulting SPDE. We verify the validity of this approximation numerically by comparison with results obtained by direct Monte Carlo simulation of the basket constituents. A calibration exercise assesses the flexibility of the model and its extensions to match CDO spreads from pre-crisis and crisis periods.
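The multilevel idea in the first framework can be illustrated under simplifying assumptions. The sketch below uses a one-factor Gaussian copula in place of the thesis's structural model and estimates E[f(L_N)] via the telescoping sum E[f(L_{N₀})] + Σ_l E[f(L_{N_l}) − f(L_{N_{l−1}})], coupling each level through shared factor and idiosyncratic draws. Path counts are fixed per level here; allocating them level by level is what delivers the ε⁻² complexity in the thesis.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
rho, p, K = 0.3, 0.05, 0.1            # factor loading, default prob, tranche point
c = np.sqrt(1.0 - rho**2)
thresh = norm.ppf(p)

def f(L):                              # tranche-style payoff of the loss proportion
    return np.maximum(L - K, 0.0)

def level(n_paths, n_fine):
    """Coupled fine/coarse payoff samples: same common factor and the same
    idiosyncratic draws; the coarse portfolio is the first half of the fine one."""
    Z = rng.standard_normal((n_paths, 1))
    eps = rng.standard_normal((n_paths, n_fine))
    defaults = rho * Z + c * eps < thresh
    Lf = defaults.mean(axis=1)
    Lc = defaults[:, : n_fine // 2].mean(axis=1)
    return f(Lf), f(Lc)

N0, levels, paths = 16, 6, 20_000
est = level(paths, N0)[0].mean()       # coarsest level: plain MC on N0 names
for l in range(1, levels + 1):
    fl, cl = level(paths, N0 * 2**l)   # correction term E[f(L_{2N}) - f(L_N)]
    est += (fl - cl).mean()
print(f"MLMC estimate of E[f(L_N)] for N = {N0 * 2**levels}: {est:.5f}")
```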
35

Comparing Welch's ANOVA, a Kruskal-Wallis test and traditional ANOVA in case of Heterogeneity of Variance

Liu, Hangcheng 01 January 2015
Analysis of variance (ANOVA) is robust against violations of the normality assumption, but it may be inappropriate when the assumption of homogeneity of variance is violated. Welch's ANOVA and the Kruskal-Wallis test (a non-parametric method) are applicable in that case. In this study we compare the three methods with respect to empirical type I error rate and power when heterogeneity of variance occurs, and determine which method is most suitable in which cases: balanced or unbalanced designs, small or large sample sizes, and normal or non-normal distributions.
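Such a comparison is easy to prototype. The sketch below implements Welch's F* from its textbook formula (scipy itself offers no Welch one-way ANOVA) and estimates empirical type I error under one illustrative unbalanced, heteroscedastic configuration with H₀ true:

```python
import numpy as np
from scipy import stats

def welch_anova(groups):
    """Welch's one-way ANOVA; returns the F* statistic and p-value."""
    k = len(groups)
    n = np.array([len(g) for g in groups], float)
    m = np.array([np.mean(g) for g in groups])
    v = np.array([np.var(g, ddof=1) for g in groups])
    w = n / v
    mw = np.sum(w * m) / np.sum(w)
    A = np.sum(w * (m - mw) ** 2) / (k - 1)
    tmp = np.sum((1 - w / np.sum(w)) ** 2 / (n - 1))
    B = 1 + 2 * (k - 2) / (k**2 - 1) * tmp
    df2 = (k**2 - 1) / (3 * tmp)
    return A / B, stats.f.sf(A / B, k - 1, df2)

rng = np.random.default_rng(7)
sizes, sds = [10, 20, 40], [1.0, 2.0, 4.0]   # unbalanced, heteroscedastic, H0 true
reps, alpha = 5000, 0.05
hits = {"welch": 0, "anova": 0, "kruskal": 0}
for _ in range(reps):
    g = [rng.normal(0.0, s, n) for n, s in zip(sizes, sds)]
    hits["welch"] += welch_anova(g)[1] < alpha
    hits["anova"] += stats.f_oneway(*g).pvalue < alpha
    hits["kruskal"] += stats.kruskal(*g).pvalue < alpha
for name, count in hits.items():
    print(f"{name:8s} empirical type I error: {count / reps:.3f}")
```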
36

Studies of Light Hyperon Decay Parameters

Heikkilä, Annele January 2019
A basic assumption in fundamental physics is that equal amounts of matter and antimatter were created after the Big Bang. When particles and antiparticles collide, they annihilate, i.e. disappear and produce photons. Nevertheless, the universe today consists mainly of matter. To explain why all matter did not disappear, violation of CP symmetry beyond the Standard Model is required. CP symmetry means that the laws of physics are the same if particles are interchanged with antiparticles and the spatial coordinates of all particles are mirrored. CP symmetry is relatively poorly tested in baryon decays. A new method to study CP symmetry in hyperon-antihyperon pairs has been developed at Uppsala University. Hyperons are baryons with one or more strange quarks. The method allows the decay asymmetry parameters of the hyperon and the antihyperon to be determined separately if the hyperon-antihyperon pair is polarized; if a significant difference between the magnitudes of these parameters is found, the process is CP violating. The particle physics experiment BESIII in China is well suited to such measurements because it is a high-precision experiment and has collected large data samples of hyperon-antihyperon pairs. The goal of this project was to study the statistical precision of the physics parameters that can be obtained with the new method for J/ψ decays into ΛΛ̄, Σ+Σ̄− and Σ0Σ̄0. High statistical precision is required to detect CP violation, because CP-violating processes, if they exist, are expected to be rare. The main focus was the process e+e− → J/ψ → Σ0Σ̄0 → ΛγΛ̄γ → pπ−γp̄π+γ. In this process, CP symmetry can be tested in two decays: the electromagnetic decay Σ0 → Λγ and the weak decay Λ → pπ−. Only the asymmetry parameter of Λ → pπ− was studied. The study served as a validity check of the new method and of ongoing analyses at BESIII. The statistical precision was studied by simulation: Monte Carlo data samples were created and a maximum log-likelihood fit was applied to them. An important component in determining the asymmetry parameters turned out to be the relative phase ∆φJ/ψ, one of the parameters used for determining the relation between the electric and magnetic form factors; ∆φJ/ψ is also related to the polarization of the hyperon-antihyperon pair. The study showed that the value of ∆φJ/ψ has a large impact on the uncertainties of the hyperon and antihyperon asymmetry parameters: a low value of ∆φJ/ψ resulted in large uncertainties and strong correlations between the asymmetry parameters. The formalism differs between processes, which affects the uncertainties as well. The formalism used for the Σ0Σ̄0 process gives poorer precision for the asymmetry parameter of the Λ → pπ− decay than the formalism used for the ΛΛ̄ process; the ΛΛ̄ process is therefore much more suitable for CP studies of the Λ → pπ− decay.
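In its simplest one-parameter form, the simulation-and-fit machinery described above amounts to generating decay angles from the polarized distribution I(cosθ) ∝ 1 + αP cosθ for Λ → pπ− and fitting α back by maximum likelihood. The sketch below does exactly that with an assumed fixed polarization P; the full BESIII formalism, with ∆φJ/ψ and the joint hyperon-antihyperon angles, is considerably richer.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)
alpha_true, P = 0.75, 0.4          # illustrative asymmetry parameter and polarization

def generate(n):
    """Draw cos(theta) from I(c) = (1 + alpha*P*c)/2 on [-1, 1] by accept-reject."""
    out = np.empty(0)
    while out.size < n:
        c = rng.uniform(-1, 1, 2 * n)
        keep = rng.uniform(0, 1 + abs(alpha_true * P), c.size) < 1 + alpha_true * P * c
        out = np.concatenate([out, c[keep]])
    return out[:n]

def nll(alpha, c):
    """Negative log-likelihood of the normalized pdf (1 + alpha*P*c)/2."""
    return -np.sum(np.log(0.5 * (1 + alpha * P * c)))

for n in (10_000, 100_000):
    c = generate(n)
    fit = minimize_scalar(nll, args=(c,),
                          bounds=(-1/P + 1e-6, 1/P - 1e-6), method="bounded")
    # Crude 1-sigma from the curvature of the log-likelihood at the minimum
    h = 1e-4
    curv = (nll(fit.x + h, c) - 2 * fit.fun + nll(fit.x - h, c)) / h**2
    print(f"n={n:7d}: alpha_hat = {fit.x:.4f} ± {1/np.sqrt(curv):.4f}")
```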
37

The MaRiQ model: A quantitative approach to risk management

Carlsson, Elin, Mattsson, Moa January 2019
In recent years, cyber attacks and data fraud have become major issues for companies, businesses and nation states alike. The need for more accurate and reliable risk management models is therefore substantial. Today, cybersecurity risk management is often carried out on a qualitative basis, where risks are assigned to a predefined set of categories such as low, medium or high. This thesis aims to challenge that practice by presenting a model that assesses risks quantitatively, hence its name: MaRiQ (Manage Risks Quantitatively). MaRiQ was developed from collected requirements and contemporary literature on quantitative risk management. The model consists of a clearly defined flowchart and a supporting tool created in Excel. To generate statistically sound results, MaRiQ makes use of a number of statistical techniques and mathematical functions, such as Monte Carlo simulations and probability distributions. To evaluate whether the developed model really was an improvement over current qualitative processes, we conducted a workshop at the end of the project. The organization that tested MaRiQ found the model useful and felt that it fulfilled most of their needs. Our results indicate that risk management within cybersecurity can and should be performed using more quantitative approaches than is praxis today. Even though several potential improvements remain, MaRiQ demonstrates the possible advantages of transitioning from qualitative to quantitative risk management processes.
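The quantitative core of such a tool can be sketched in a few lines: each risk is a Bernoulli occurrence multiplied by an impact drawn from a probability distribution, and Monte Carlo aggregation yields a full loss distribution rather than a low/medium/high label. The risks and parameters below are invented placeholders; this is not the MaRiQ Excel tool itself.

```python
import numpy as np

rng = np.random.default_rng(42)

# Each risk: (name, annual occurrence probability, lognormal impact mu/sigma).
risks = [
    ("phishing breach",   0.30, 11.0, 1.0),   # median impact ~ exp(11) ≈ 60k
    ("ransomware",        0.05, 13.0, 1.2),
    ("insider data leak", 0.10, 12.0, 0.8),
]

n_sims = 100_000
total = np.zeros(n_sims)
for _, p, mu, sigma in risks:
    occurs = rng.random(n_sims) < p            # does the risk materialize this year?
    total += occurs * rng.lognormal(mu, sigma, n_sims)

print(f"expected annual loss: {total.mean():,.0f}")
print(f"95th percentile loss: {np.percentile(total, 95):,.0f}")
# Loss-exceedance view: P(total loss > x) for a few thresholds
for x in (100_000, 500_000, 1_000_000):
    print(f"P(loss > {x:>9,}): {(total > x).mean():.3f}")
```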
38

Historical risk assessment of a balanced portfolio using Value-at-Risk

Malfas, Gregory P. 30 April 2004
Calculation of the Value at Risk (VaR) measure of a portfolio can be done using Monte Carlo simulations of that portfolio's potential losses over a specified period of time. Regulators, such as the US Securities and Exchange Commission, and exchanges, such as the New York Stock Exchange, establish regulatory capital requirements for firms. These regulations set the amount of capital that firms are required to have on hand to safeguard against market losses that can occur. VaR gives us this specific monetary value set by regulators and exchanges: the capital on hand must be sufficient that, at a given confidence level, a portfolio's losses over a certain period of time will likely be no greater than the capital the firm has on hand. The scenario used is that of a risk manager who has inherited a portfolio set up for a client beginning in April 1992. The portfolio has to meet certain parameters; its initial value is $61,543,328.00. The risk manager is responsible for calculating the Value at Risk measure, at five percent, with a confidence level of 95% and 20 days out, for each of the 24 business quarters of a six-year period starting in 1992 and ending in 1996.
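A minimal Monte Carlo VaR calculation along these lines might look as follows, using a single geometric-Brownian-motion proxy for the whole portfolio. Only the initial value is taken from the abstract; the drift and volatility are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

V0 = 61_543_328.00          # initial portfolio value from the abstract
mu, sigma = 0.08, 0.15      # illustrative annualized drift and volatility
horizon = 20 / 252          # 20 trading days, in years
n_paths = 200_000

# Simulate terminal values under a geometric Brownian motion proxy
z = rng.standard_normal(n_paths)
VT = V0 * np.exp((mu - 0.5 * sigma**2) * horizon + sigma * np.sqrt(horizon) * z)
pnl = VT - V0

# 95% VaR: the loss exceeded on only 5% of simulated paths
var_95 = -np.percentile(pnl, 5)
print(f"20-day 95% VaR ≈ ${var_95:,.0f}")
```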
39

Development of new approximations for ab initio simulations

Pedroza, Luana Sucupira 14 December 2010
Computer simulations are essential tools for a microscopic understanding of many processes that occur in nature. In particular, ab initio simulations, i.e. first-principles simulations, can predict new properties and support the interpretation of experimental results, without the need for empirical potentials fitted to a specific configuration of the system. However, even ab initio simulations require approximations, both in the electronic structure calculations and in the description of the nuclear motion. In this thesis, new approximations for the exchange-correlation energy functional of Density Functional Theory (DFT) are proposed and tested in electronic structure calculations for atoms, molecules and solids. The nuclear motion is described using the Monte Carlo technique, with the total energies obtained from DFT. We also propose a new methodology that allows the description of intramolecular motions whose vibrational frequencies cannot be treated classically. As applications we study water clusters and liquid water, showing the relevance of the new methodology for the description of the structural, vibrational and dipole-moment properties of these systems.
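The nuclear-sampling half of such a scheme is, at its core, a Metropolis Monte Carlo loop in which every trial energy would come from a DFT calculation. The sketch below stubs that call with a toy harmonic bond, since the point here is the sampling logic, not the electronic-structure step; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
kB = 3.1668e-6      # Boltzmann constant in Hartree/K
T = 300.0           # temperature (K)
beta = 1.0 / (kB * T)

def energy(x):
    """Placeholder for the DFT total energy (Hartree). In the real scheme this
    would be a self-consistent electronic-structure calculation; here, a toy
    harmonic bond with equilibrium length 1.8 bohr and k = 0.3 Ha/bohr^2."""
    return 0.5 * 0.3 * (x - 1.8) ** 2

x, e = 1.8, energy(1.8)
step, samples, accepted = 0.1, [], 0
n_steps = 50_000
for _ in range(n_steps):
    x_new = x + rng.uniform(-step, step)
    e_new = energy(x_new)
    if rng.random() < np.exp(-beta * (e_new - e)):   # Metropolis acceptance
        x, e, accepted = x_new, e_new, accepted + 1
    samples.append(x)

samples = np.array(samples[5000:])                   # discard burn-in
print(f"acceptance: {accepted / n_steps:.2f}, <x> = {samples.mean():.3f} bohr, "
      f"std = {samples.std():.3f} bohr")
```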
40

Pricing Caps in the Heath, Jarrow and Morton Framework Using Monte Carlo Simulations in a Java Applet

Kalavrezos, Michail January 2007
In this paper the Heath, Jarrow and Morton (HJM) framework is applied in the programming language Java for the estimation of the future spot rate. The subcase of an exponential model for the diffusion coefficient (volatility) is used for the pricing of interest rate derivatives (caps).
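A compact sketch of that scheme (in Python rather than the paper's Java applet): one-factor HJM forward-curve dynamics under the risk-neutral measure with σ(t,T) = σ₀e^(−λ(T−t)), the no-arbitrage drift σ(t,T)∫ₜᵀσ(t,u)du, and a Monte Carlo price for a single caplet. A cap is then a sum of such caplets over the tenor structure; the flat initial curve and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(11)

sigma0, lam = 0.015, 0.5         # sigma(t,T) = sigma0 * exp(-lam*(T-t))
f0 = 0.05                        # flat initial forward curve (illustrative)
dt = 0.01
T1, T2, K = 1.0, 1.5, 0.05       # caplet reset date, payment date, strike
tau = T2 - T1
grid = np.arange(0.0, T2 + dt / 2, dt)      # absolute maturity grid
i1, i2 = round(T1 / dt), round(T2 / dt)
n_paths = 20_000

f = np.full((n_paths, grid.size), f0)       # forward curves, one row per path
bank = np.zeros(n_paths)                    # integral of the short rate f(t,t)

for j in range(i1):                         # evolve the curve up to the reset T1
    t = j * dt
    bank += f[:, j] * dt                    # left-point rule for the bank account
    vol = sigma0 * np.exp(-lam * (grid - t))                     # sigma(t, T)
    mu = vol * sigma0 * (1 - np.exp(-lam * (grid - t))) / lam    # HJM drift
    dW = np.sqrt(dt) * rng.standard_normal((n_paths, 1))
    f = f + mu * dt + vol * dW              # Euler step (entries with T < t unused)

# At t = T1: simple forward rate over [T1, T2] and the discounted caplet payoff
int_f = dt * f[:, i1:i2].sum(axis=1)        # ~ integral of f(T1, u) over [T1, T2]
L = (np.exp(int_f) - 1.0) / tau             # simple rate fixed at T1, paid at T2
payoff = tau * np.maximum(L - K, 0.0)
disc = np.exp(-bank) * np.exp(-int_f)       # bank account to T1, then P(T1, T2)
vals = disc * payoff
print(f"MC caplet price per unit notional: {vals.mean():.6f} "
      f"± {vals.std() / np.sqrt(n_paths):.6f}")
```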
