191 |
Risk Analysis of Tilapia Recirculating Aquaculture Systems: A Monte Carlo Simulation Approach. Kodra, Bledar. 12 June 2007 (has links)
The purpose of this study is to modify an existing static analytical model developed for re-circulating aquaculture systems by incorporating risk considerations, in order to evaluate the economic viability of the system. A further objective is to provide a well-documented, risk-based analytical system that individuals (investors/lenders) can tailor to their own investment decisions: collecting the input data, running the model, and interpreting the results. The Aquaculture Economic Cost Model (AECM) was developed by Dr. Charles Coale, Jr. and others in the Department of Agricultural and Applied Economics at Virginia Tech. The AECM is a spreadsheet model designed to help re-circulating aquaculture producers make strategic business decisions. Potential producers interested in investing in re-circulating aquaculture can use it to develop a financial analysis that, in turn, helps them obtain funding for the enterprise. The model is also useful for current producers who want to isolate inefficient aspects of their operation. The AECM consists of three major sections: Data Entry, Calculations, and Analysis. The first section requires the producer to conduct background research about the operation to ensure accurate calculation and analysis. The calculation section provides a great deal of information about the operation's finances, while the analysis section provides information about its financial stability. While the AECM is a powerful model, it is based on single, usually mean, values for prices, costs, and input and output quantities. However, market, financial, and production uncertainties result in fluctuating prices, costs, and yields. An individual making management decisions for a re-circulating aquaculture system will face some or all of these uncertainties.
By adding simulation to the AECM to account for these uncertainties, individuals will be able to make better management decisions. Information on the varying likelihoods, or probabilities, of achieving profits will be of crucial interest to individuals who plan to enter aquaculture or to modify an existing system. Risks associated with six variables were examined in this paper: feed cost, feed conversion, mortality rate, capital interest rate, final weight, and output price. Data for the interest rate and output price were obtained from the Federal Reserve System and the NMFS website, respectively; expert opinion was the source of data for the other variables. After probability distributions were applied to the random variables to account for the uncertainty, the model was simulated for ten thousand iterations to obtain expected returns for each of the three years for which the model calculates an income statement. In addition, sensitivity analyses were carried out to show the producer which factors contribute most to the profitability of the operation, giving the producer a better idea of which aspects of the operation to monitor closely and consider modifying. The analysis shows that the mean income for all three years is negative, so the business would be losing money. The simulated mean net incomes were -$216,905, -$53,689, and -$53,111 for years 1 through 3, respectively. Sensitivity analysis confirmed that output price is by far the most significant input driving fluctuation of the overall bottom line; it topped the list for all three years analyzed in this study. Feed cost and feed conversion were the next most significant inputs. The remaining inputs were also significant in explaining the fluctuation of the bottom line, but both their regression and correlation coefficients were small. / Master of Science
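To illustrate the kind of simulation layered onto a static spreadsheet model like the AECM, the sketch below runs a toy Monte Carlo of one year's net income with a few stochastic inputs and reports the probability of a loss. All distributions, parameter values, and the cost structure are invented placeholders, not the thesis data or the AECM's actual equations.

```python
import random
import statistics

def simulate_net_income(n_iter=10_000, seed=42):
    """Toy Monte Carlo of one year's net income for a fish-production
    operation. Every distribution, parameter value, and cost item below
    is an illustrative placeholder, not the AECM's equations or the
    thesis data."""
    rng = random.Random(seed)
    incomes = []
    for _ in range(n_iter):
        output_price = rng.normalvariate(2.20, 0.30)     # $/kg sold, assumed
        feed_cost = rng.triangular(0.30, 0.50, 0.38)     # $/kg feed, assumed
        feed_conversion = rng.triangular(1.4, 2.0, 1.6)  # kg feed per kg fish
        mortality = rng.uniform(0.02, 0.10)              # annual fraction lost
        production = 100_000 * (1.0 - mortality)         # kg sold, assumed scale
        revenue = production * output_price
        feed_bill = production * feed_conversion * feed_cost
        fixed_cost = 150_000.0                           # $/year, assumed
        incomes.append(revenue - feed_bill - fixed_cost)
    return incomes

incomes = simulate_net_income()
mean_income = statistics.mean(incomes)
prob_loss = sum(1 for x in incomes if x < 0) / len(incomes)
```

Storing each iteration's input draws alongside its simulated income would allow the regression- and correlation-based sensitivity ranking described above.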
|
192 |
Monte Carlo analysis of methods for extracting risk-neutral densities with affine jump diffusions. Lu, Shan. 31 July 2019 (has links)
This paper compares several widely used and recently developed methods to extract
risk-neutral densities (RND) from option prices in terms of estimation accuracy. It
shows that the positive convolution approximation method consistently yields the most
accurate RND estimates, and is insensitive to the discreteness of option prices. RND
methods are less likely to produce accurate RND estimates when the underlying process
incorporates jumps and when estimations are performed on sparse data, especially for
short time-to-maturities, though sensitivity to the discreteness of the data differs across
different methods.
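The textbook starting point for such estimators is the Breeden-Litzenberger relation, under which the risk-neutral density is the discounted second strike derivative of the call price, f(K) = e^{rT} d²C/dK². The sketch below applies that relation with central finite differences; it is a generic illustration, not an implementation of the positive convolution approximation or the other methods compared in the paper.

```python
import math

def rnd_from_calls(strikes, call_prices, r, T):
    """Breeden-Litzenberger: f(K) = exp(r*T) * d^2C/dK^2, approximated
    with central finite differences. Assumes equally spaced strikes and
    noise-free call prices; returns interior strikes and densities."""
    dK = strikes[1] - strikes[0]
    inner_strikes, densities = [], []
    for i in range(1, len(strikes) - 1):
        d2c = (call_prices[i - 1] - 2.0 * call_prices[i]
               + call_prices[i + 1]) / (dK * dK)
        inner_strikes.append(strikes[i])
        densities.append(math.exp(r * T) * d2c)
    return inner_strikes, densities
```

In practice the discreteness and noise of observed option prices make this naive second difference unstable, which is exactly why the smoothing and approximation methods compared in the paper exist.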
|
193 |
Digitale Positronen-Annihilation-Lebensdauer-Spektroskopie: Geant4-Monte-Carlo-Simulation und Puls-Generator-Kopplung für einen realitätsgetreuen digitalen Zwilling / Digital Positron-Annihilation-Lifetime-Spectroscopy: Geant4-Monte-Carlo-Simulation and Pulse-Generator-Coupling for a Realistic Digital Twin. Boras, Dominik. January 2024 (has links) (PDF)
Diese Dissertation führt einen umfassend modular aufgebauten digitalen Zwilling der Positronen-Annihilation-Lebensdauerspektroskopie (PALS) ein, der das Ziel verfolgt, ein tiefergehendes Verständnis der Messmethode zu ermöglichen und aufzuzeigen, wie die Konfiguration des experimentellen Setups sowie die Eigenschaften der verwendeten radioaktiven Positronenquelle zu Verzerrungen im Lebensdauerspektrum führen können. Aufgrund der mathematischen Komplexität realer Lebensdauerspektren, die eine Zerlegung in ihre Komponenten erschwert, und der Herausforderung, die Instrumentauflösungsfunktion / engl.: Instrument Response Function (IRF) genau zu bestimmen, bietet der digitale Zwilling eine innovative Lösung, um jegliche Fragen zur Hardware der PALS mittels einer realitätsgetreuen digitalen Nachbildung des gewünschten Setups zu beantworten.
Der entwickelte digitale Zwilling setzt sich aus drei Kernmodulen zusammen, die schrittweise tiefergehende Einblicke in das simulierte Setup und schließlich in das resultierende Lebensdauerspektrum bieten. Im ersten Modul wird die Open-Source-Softwareplattform Geant4 genutzt, um die physische Welt in eine digitale Umgebung zu überführen. Hierbei steht die Modellierung von Teilcheninteraktionen mit Materie im Fokus, die durch die flexible Architektur von Geant4 ermöglicht wird. In diesem Kontext wurde eine spezifische Simulation implementiert, die eine differenzierte Betrachtung der Gamma-Quanten-Energien erlaubt und somit ein präziseres Verständnis der PALS-Methodik ermöglicht.
Im zweiten Modul erfolgt die Kopplung der aus Geant4 gewonnenen Daten-Streams mit dem DLTPulseGenerator über eine speziell entwickelte Schnittstelle. Diese Schnittstelle bietet neben der sequenziellen Verarbeitung der Daten-Streams die Einbindung physikalischer Prozesse wie der Positronen-Lebensdauer, sowie die Quellstärke des als Positronenstrahler verwendeten radioaktiven Isotops und der zeitlichen Unschärfe des Photo-Multipliers / engl.: Photo-Multiplier-Tube (PMT), wodurch eine umfassende Untersuchung verschiedener Effekte mit einem einzigen Datensatz möglich wird. Dies führt zu einer hohen Vergleichbarkeit der Ergebnisse. Darüber hinaus ermöglicht die Schnittstelle eine detaillierte Klassifizierung möglicher Ereignisse innerhalb der PALS-Methode, was es ermöglicht, quantitative Effekte spezifischer Ereignisse zu untersuchen. Dabei gibt die Schnittstelle sowohl wichtige Informationen aus der Geant4-Simulation als auch Informationen aus ihren eigenen Funktionen an den DLTPulseGenerator weiter, welcher digitalisierte PMT Output-Pulse je nach gewählter Konfiguration des gewählten Digitizers erzeugt.
Das dritte Modul nutzt die DDRS4PALS Software zur Analyse des aus dem zweiten Modul stammenden Daten-Streams, um Informationen über das Lebensdauerspektrum zu extrahieren. Hierbei wird erstmals die Validierung physikalischer Filter am Gesamtspektrum und seiner Anteile gewährt, da durch die Klassifizierung im zweiten Modul des digitalen Zwillings ermöglicht wird, lediglich die unerwünschten Anteile im Lebensdauerspektrum zu betrachten und somit die tatsächliche Wirkung der physikalischen Filter auf diese zu untersuchen.
Der modular und flexibel gestaltete digitale Zwilling der PALS ermöglicht eine einfache Anpassung an veränderte experimentelle Setups und kann auch für ähnliche Messmethoden (z.B. Fluoreszenz-Lebensdauer-Spektroskopie) adaptiert werden. Dadurch markiert der digitale Zwilling einen signifikanten Fortschritt in Richtung einer digitalen Ära der Forschung und trägt zu einem verbesserten Verständnis der Messmethode sowie zu effizienteren und kostengünstigeren Optimierungsprozessen bei. / This dissertation introduces a comprehensively modular digital twin of Positron Annihilation Lifetime Spectroscopy (PALS), aimed at enabling a deeper understanding of the measurement method and demonstrating how the configuration of the experimental setup and the properties of the used radioactive positron source can lead to distortions in the lifetime spectrum. Due to the mathematical complexity of real lifetime spectra, which complicates their decomposition into components, and the challenge of accurately determining the Instrument Response Function (IRF), the digital twin offers an innovative solution to address any hardware-related questions of PALS by using a realistic digital replication of the desired setup.
The developed digital twin comprises three core modules that provide progressively deeper insights into the simulated setup and ultimately the resulting lifetime spectrum. The first module utilizes the open-source software platform Geant4 to transform the physical world into a digital environment. Here, the focus is on modelling particle interactions with matter across a broad energy spectrum, including electromagnetic processes, facilitated by Geant4's flexible architecture. Within this context, a specific simulation was implemented, allowing a detailed examination of gamma quantum energies, thus enabling a more precise understanding of the PALS methodology.
The second module involves coupling the data streams generated by the Geant4-Simulation with the DLTPulseGenerator via a specially developed interface. This interface, in addition to sequential processing of data streams, incorporates physical processes such as the positron lifetime, the source strength of the radioactive isotope used as a positron emitter, and the temporal uncertainty of Photo-Multiplier-Tubes (PMTs), thereby enabling a comprehensive examination of various effects with a single dataset. This leads to high comparability of results. Moreover, the interface facilitates a detailed classification of potential events within the PALS method, allowing for quantitative investigations of specific event effects. The interface conveys both critical information from the Geant4-Simulation and its functions to the DLTPulseGenerator, which generates digitized PMT output pulses based on the chosen configuration of the digitizer.
The third module utilizes the DDRS4PALS software to analyse the data stream generated by the second module, extracting information about the lifetime spectrum. This approach allows for the first-time validation of physical filters on the overall spectrum and its components. By classifying data in the second module of the digital twin, it becomes possible to focus solely on the undesired components within the lifetime spectrum. This enables a precise investigation of the actual impact of physical filters on these components.
The modular and flexible design of the digital twin allows for easy adaptation to altered experimental setups and can also be adjusted for similar measurement methods (e.g. Fluorescence Lifetime Spectroscopy). Thus, the digital twin signifies a significant step towards a digital era of research, contributing to an enhanced understanding of the measurement method and more efficient and cost-effective optimization processes.
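In idealized form, a PALS lifetime spectrum is a sum of exponential decay components convolved with a Gaussian instrument response function. The event-by-event toy below mimics that structure; the component lifetimes, intensities, and IRF width are invented for illustration and are not the dissertation's simulated setup.

```python
import random

def simulate_lifetime_spectrum(n_events=50_000, seed=1):
    """Toy PALS spectrum: each event draws a positron lifetime component
    from a two-component exponential mixture, samples the annihilation
    time, and smears it with a Gaussian instrument response function
    (IRF). Lifetimes, intensities, and the IRF width are invented for
    illustration, not the dissertation's setup."""
    rng = random.Random(seed)
    lifetimes = [0.160, 0.400]       # ns, assumed components
    intensities = [0.9, 0.1]         # assumed relative intensities
    irf_sigma = 0.230 / 2.3548       # 230 ps FWHM -> standard deviation
    times = []
    for _ in range(n_events):
        tau, = rng.choices(lifetimes, weights=intensities)
        t = rng.expovariate(1.0 / tau)               # annihilation time
        times.append(t + rng.gauss(0.0, irf_sigma))  # detector timing jitter
    return times
```

Histogramming the returned times yields a toy lifetime spectrum whose decomposition into components illustrates, on a much smaller scale, the fitting problem the digital twin is built to investigate.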
|
194 |
Investigation of Structural Behaviors of Methyl Methacrylate Oligomers within Confinement Space by Coarse-grained Configurational-bias Monte Carlo Simulation. Chang, Chun-Yi. 16 August 2010 (has links)
Coarse-grained configurational-bias Monte Carlo (CG-CBMC) simulation was employed to study the structural behaviors of methyl methacrylate (MMA) oligomers adsorbed on a grooved substrate, because molecular dynamics (MD) simulation is prone to becoming trapped in local energy minima and is difficult to run long enough to allow relaxation of chain motion in a large polymeric system. In this study, three types of chains are classified according to their positions relative to the groove: Type 1, Type 2, and Type 3 represent MMA oligomers entirely within the groove, partially within the groove, and outside the groove, respectively. The orientational order parameters of Type 1 and Type 2 oligomers decrease with increasing groove width, while the orientational order parameter of Type 3 oligomers remains approximately 0.1. The orientational order parameters of Type 1 oligomers interacting with the grooved substrate at different interaction strengths likewise decrease with increasing groove width. Furthermore, the orientational order parameters of Type 1 oligomers within the narrowest (20 Å) and the widest (35 Å) grooves were determined for different groove depths. For the narrowest groove, the arrangement of Type 1 oligomers is influenced by the groove depth; in the case of the widest groove, the orientational order parameter of Type 1 oligomers remains approximately 0.2. This study can help engineers clarify the characteristics and phenomena of physical adsorption of molecules and contributes to related technological applications.
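The orientational order parameter referred to throughout this abstract is conventionally the second-rank quantity S = (3⟨cos²θ⟩ − 1)/2 for bond vectors relative to a director. A minimal sketch of that calculation (the choice of director and the bond vectors are hypothetical):

```python
import math

def order_parameter(bond_vectors, director=(0.0, 0.0, 1.0)):
    """Second-rank orientational order parameter
    S = (3 <cos^2 theta> - 1) / 2 of bond vectors relative to a fixed
    director: S = 1 for perfect alignment with the director, 0 for an
    isotropic distribution, -0.5 for perpendicular alignment."""
    dx, dy, dz = director
    d_norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    cos2 = []
    for bx, by, bz in bond_vectors:
        b_norm = math.sqrt(bx * bx + by * by + bz * bz)
        cos_theta = (bx * dx + by * dy + bz * dz) / (b_norm * d_norm)
        cos2.append(cos_theta * cos_theta)
    return 0.5 * (3.0 * sum(cos2) / len(cos2) - 1.0)
```

For the groove geometry above, the director would typically be taken along the groove axis, and S averaged separately over Type 1, Type 2, and Type 3 chains.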
|
195 |
Monte-Carlo simulation with FLUKA for liquid and solid targets. Infantino, A., Oehlke, E., Trinczek, M., Mostacci, D., Schaffer, P., Hoehr, C. January 2015 (has links)
Introduction
Monte-Carlo simulations can be used to assess isotope production on small medical cyclotrons. These simulations calculate the particle interactions with electric and magnetic fields, as well as the nuclear reactions. The results can be used to predict both yields and isotopic contaminations and can aid in the optimal design of target material and target geometry [1,2]. FLUKA is a general-purpose tool widely used in many applications, from accelerator shielding to target design, calorimetry, activation, dosimetry, detector design, neutrino physics, and radiotherapy [3,4]. In this work, we applied the Monte-Carlo code FLUKA to determine how accurately it predicts the yields of various isotopes as compared to experimental yields.
Material and Methods
The proton beam collimation system, as well as the liquid and solid targets of the TR13 cyclotron at TRIUMF, was modeled in FLUKA. The proton beam parameters were initially taken from the cyclotron design specifications and were optimized against experimental measurements from the TR13. Data from irradiations of different targets at different beam currents were collected in order to account for average behavior, see FIG. 1. Yields were calculated for a pencil proton beam as well as for a beam with a spread in direction and energy, and were compared to experimental results obtained with the TR13.
Results and Conclusion
The reactions listed in TABLE 1 were assessed. For most reactions, good agreement was found between experimental and simulated saturation yields. TABLE 1 shows only the yields simulated with a proton beam with a spread in both direction and energy. In most cases the simulated yield is slightly larger than, or comparable to, the experimental one. Only the calculated yield for 55Co was significantly lower, by a factor of 4.2. This is still reasonable agreement considering that FLUKA was originally a high-energy physics code; it may indicate that FLUKA's internal cross-section calculation for this isotope production needs some optimization. In summary, we conclude that FLUKA can be used as a tool for predicting isotope production as well as for target design.
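As background to yield comparisons of this kind, the thick-target production rate per incident proton can be written as Y = (N_A/A) ∫ σ(E)/S(E) dE over the proton's energy loss in the target, with S the mass stopping power. The sketch below integrates that expression numerically; the cross section, stopping power, and target values are invented placeholders, not FLUKA output or evaluated nuclear data.

```python
N_A = 6.022e23  # Avogadro's number, atoms/mol

def thick_target_yield(e_in, e_out, sigma, stopping_power,
                       molar_mass, n_steps=1000):
    """Product nuclei per incident proton,
    Y = (N_A / molar_mass) * integral of sigma(E) / S(E) dE,
    where sigma(E) is in cm^2 and S(E) is the mass stopping power in
    MeV cm^2 / g; multiplied by the proton rate, this gives the
    saturation production rate. Midpoint-rule integration."""
    de = (e_in - e_out) / n_steps
    total = 0.0
    for i in range(n_steps):
        e = e_out + (i + 0.5) * de
        total += sigma(e) / stopping_power(e) * de
    return N_A / molar_mass * total

# Hypothetical flat 100 mb cross section and constant stopping power
# for a proton slowing from 13 to 5 MeV; all numbers invented.
y = thick_target_yield(13.0, 5.0,
                       sigma=lambda e: 100e-27,        # cm^2 (100 mb)
                       stopping_power=lambda e: 20.0,  # MeV cm^2 / g
                       molar_mass=16.0)                # g/mol
```

A code like FLUKA effectively replaces the hand-supplied σ(E) and S(E) with full particle transport, which is what makes the comparison against measured saturation yields meaningful.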
|
196 |
Simulation Based Methods for Credit Risk Management in Payment Service Provider Portfolios / Simuleringsbaserade metoder för kreditriskhantering i betaltjänstleverantörsportföljer. Dahlström, Knut; Forssbeck, Carl. January 2023 (has links)
Payment service providers have unique credit portfolios with different characteristics than many other credit providers. It is therefore important to study whether common credit risk estimation methods are applicable to their setting. By comparing simulation-based methods for credit risk estimation, it was found that combining Monte Carlo simulation with importance sampling and the asymptotic single risk factor model is the most suitable model amongst those analyzed. It allows for a combination of variance reduction, scenario analysis and correlation checks, which all are important for estimating credit risk in a payment service provider portfolio. / Betaltjänstleverantörer har unika kreditportföljer med andra egenskaper än många andra kreditgivare. Det är därför viktigt att undersöka om vanliga metoder för uppskattning av kreditrisk går att tillämpa på betaltjänstleverantörer. Genom att jämföra olika simulationsbaserade metoder för uppskattning av kreditrisk fann man att kombinationen av Monte Carlo-simulering med Importance Sampling och en ASRF-modell är den mest lämpliga bland de analyserade metoderna. Det möjliggör en kombination av variansminskning, scenarioanalys och korrelationskontroller som alla är viktiga för att uppskatta kreditrisk i en betaltjänstleverantörsportfölj.
|
197 |
Low Energy Ion Beam Synthesis of Si Nanocrystals for Nonvolatile Memories - Modeling and Process Simulations / Niederenergie-Ionenstrahlsynthese von Si Nanokristallen für nichtflüchtige Speicher - Modellierung und Prozesssimulationen. Müller, Torsten. 16 November 2005 (has links) (PDF)
Metal-oxide-silicon field-effect transistors with a layer of electrically isolated Si nanocrystals (NCs) embedded in the gate oxide are known to improve conventional floating-gate flash memories. Data retention, program and erase speeds, as well as memory operation voltages can be substantially improved due to the discrete charge storage in the isolated Si NCs. Using ion beam synthesis, Si NCs can be fabricated along with standard CMOS processing. The optimization of the location and size of ion beam synthesized Si NCs requires a deeper understanding of the mechanisms that determine (i) the build-up of Si supersaturation by high-fluence ion implantation and (ii) NC formation by phase separation. To that end, process simulations have been conducted that address both aspects on a fundamental level and, on the other hand, avoid tedious experiments. The build-up of a Si supersaturation by high-fluence ion implantation was studied using dynamic binary collision calculations with TRIDYN, which led to predictions of Si excess depth profiles in thin gate oxides of remarkable quality. These simulations naturally include high-fluence implantation effects such as target erosion by sputtering, target swelling, and ion beam mixing. The second stage of ion beam synthesis is modeled with a tailored kinetic Monte Carlo code that combines a detailed kinetic description of phase separation at the atomic level with the degree of abstraction necessary to span the timescales involved. Large ensembles of Si NCs were simulated into the late stages of NC formation and dissolution, at simulation sizes that allowed a direct comparison with experimental studies, e.g. electron-energy-loss-resolved TEM investigations. These comparisons reveal a good degree of agreement, e.g. between predicted and observed precipitate morphologies for different ion fluences.
However, they also point clearly to the impact of additional external influences, such as the oxidation of implanted Si by absorbed humidity, which was identified with the help of these process simulations. Moreover, the simulations serve as a general tool to identify optimum processing regimes for tailored Si NC formation for NC memories. It is shown that key properties for NC memories, such as the tunneling distance from the transistor channel to the Si NCs and the NC morphology, size, and density, can be adjusted accurately despite the degree of self-organization involved. Furthermore, possible lateral electron tunneling between neighboring Si NCs is evaluated on the basis of the kinetic Monte Carlo simulations.
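As a much-simplified stand-in for the tailored kinetic Monte Carlo code described above, the toy below evolves a 2D lattice gas of solute atoms with attractive nearest-neighbour bonds via Metropolis-accepted atom-vacancy exchanges, so that an initially random solid solution coarsens into precipitate-like clusters. Lattice size, bond energy, and temperature are arbitrary illustration values, and the dynamics are Metropolis rather than a rate-catalog kinetic scheme.

```python
import math
import random

def simulate_phase_separation(size=32, fill=0.2, bond_energy=1.0,
                              kT=0.4, n_steps=200_000, seed=3):
    """Toy 2D lattice-gas Monte Carlo of precipitation: solute atoms with
    attractive nearest-neighbour bonds (energy -bond_energy per occupied
    pair) swap with adjacent vacancies under the Metropolis criterion,
    so clusters form over time. Particle number is conserved."""
    rng = random.Random(seed)
    occ = [[1 if rng.random() < fill else 0 for _ in range(size)]
           for _ in range(size)]

    def neighbours(x, y):
        return [((x + 1) % size, y), ((x - 1) % size, y),
                (x, (y + 1) % size), (x, (y - 1) % size)]

    def bonds(x, y):
        return sum(occ[nx][ny] for nx, ny in neighbours(x, y))

    for _ in range(n_steps):
        x, y = rng.randrange(size), rng.randrange(size)
        nx, ny = rng.choice(neighbours(x, y))
        if occ[x][y] == occ[nx][ny]:
            continue  # atom-atom or vacancy-vacancy: nothing to exchange
        # energy change of moving the atom into the vacant site
        if occ[x][y]:
            de = bond_energy * (bonds(x, y) - (bonds(nx, ny) - 1))
        else:
            de = bond_energy * (bonds(nx, ny) - (bonds(x, y) - 1))
        if de <= 0 or rng.random() < math.exp(-de / kT):
            occ[x][y], occ[nx][ny] = occ[nx][ny], occ[x][y]
    return occ
```

The real code must additionally handle the implanted depth profile, Ostwald ripening over experimental timescales, and the oxide matrix, which is exactly what the abstraction discussed above provides.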
|
199 |
The differentiation between variability uncertainty and knowledge uncertainty in life cycle assessment. Budzinski, Maik. 08 May 2014 (links) (PDF)
The following thesis deals with methods to increase the reliability of results in life cycle assessment (LCA). The thesis is divided into two parts. The first part points out the typologies and sources of uncertainty in LCA and summarises the existing methods for dealing with them; the methods are critically discussed and their pros and cons contrasted. In the second part a case study is carried out, calculating the carbon footprint of a cosmetic product of Li-iL GmbH: the whole life cycle of the powder bath Blaue Traube is analysed. To increase the reliability of the result, a procedure derived from the first part is applied. Recommendations to enhance the product's sustainability are then given to the decision-makers of the company. Finally, the applied procedure for dealing with uncertainty in LCAs is evaluated.
The aims of the thesis are to contribute to the understanding of uncertainty in life cycle assessment and to deal with it in a more consistent manner. In addition, the carbon footprint of the powder bath shall be based on appropriate assumptions and shall take the uncertainties that occur into account.
Based on the problems discussed, a method is introduced that avoids the problematic merging of variability uncertainty and data uncertainty when generating probability distributions. The introduced uncertainty importance analysis allows a consistent differentiation of these types of uncertainty and, furthermore, makes it possible to assess the data used in LCA studies.
The method is applied in a product carbon footprint (PCF) study of the bath powder Blaue Traube of Li-iL GmbH. The analysis is carried out over the whole life cycle (cradle-to-grave) as well as cradle-to-gate. The study gives the company a practical example of determining the carbon footprint of its products. In addition, it meets the requirements of the ISO guidelines for publishing the study and comparing it with other products.
Within the PCF study, the introduced method allows a differentiation of variability uncertainty and knowledge uncertainty. The included uncertainty importance analysis supports the assessment of each aggregated unit process within the analysed product system. Finally, this analysis can provide a basis for collecting additional, more reliable, or less uncertain data for critical processes.
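One common way to keep the two uncertainty types separate is a nested ("two-loop") Monte Carlo: the outer loop samples epistemically uncertain parameters, the inner loop samples genuine variability, and the spread of the outer-loop means isolates knowledge uncertainty. The sketch below applies this to a toy footprint model; the distributions and numbers are invented, not the Blaue Traube study data or the thesis's exact procedure.

```python
import random
import statistics

def two_loop_footprint(n_outer=200, n_inner=500, seed=11):
    """Nested Monte Carlo separating knowledge (epistemic) uncertainty
    from variability (aleatory) uncertainty in a toy carbon footprint:
    the outer loop samples an imperfectly known emission factor, the
    inner loop samples batch-to-batch variability of energy demand.
    All numbers are invented, not the Blaue Traube study data."""
    rng = random.Random(seed)
    mean_footprints = []
    for _ in range(n_outer):
        emission_factor = rng.uniform(0.4, 0.6)    # kg CO2e/kWh, assumed
        batch = [rng.normalvariate(1.2, 0.15) * emission_factor  # kWh/unit
                 for _ in range(n_inner)]
        mean_footprints.append(statistics.mean(batch))
    return mean_footprints

results = two_loop_footprint()
# the spread of the outer-loop means reflects knowledge uncertainty alone
knowledge_spread = statistics.stdev(results)
```

Collapsing both loops into a single sampling pass would merge the two uncertainty types into one distribution, which is precisely the practice the thesis argues against.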
|
200 |
Binary Mixtures and Fluids in the presence of Quenched Disorder / Binäre Mischungen und Fluide in inhomogenen Medien. Fischer, Timo Daniel. 18 January 2012 (links)
No description available.
|