281

Development and applications of a computer code for Monte Carlo simulation of electron-photon showers

Sempau Roma, Josep 29 February 1996 (has links)
This thesis presents work carried out on the PENELOPE subroutine package. The code performs Monte Carlo simulation of photon and electron transport in matter with complex geometries. The aspects addressed are: (a) improvement of the scattering algorithm for the primary radiation and of the algorithms that account for secondary particles; (b) simplification of the mixed scattering algorithm for electrons used previously; (c) incorporation of differential cross sections; (d) development of a geometry subroutine package, pengeom, which supports combinatorial geometry with quadric surfaces; (e) presentation of a theoretical framework for applying variance-reduction techniques; and (f) comparison with experimental results and presentation of four real applications that use pengeom and variance reduction. In its current state, PENELOPE allows non-specialist external users to tackle problems in radiation engineering, medical physics, and related fields.
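The quadric-surface classification underlying combinatorial geometry packages such as pengeom can be illustrated with a minimal sketch: a point is placed on one side or the other of an implicit quadric F(r) = rᵀAr + b·r + c by the sign of F. This is our own illustration of the idea in Python, not pengeom's actual interface, which is far more elaborate:

```python
import numpy as np

def quadric_side(point, A, b, c):
    """Classify a point against the implicit quadric F(r) = r.A.r + b.r + c:
    returns -1 on the negative side, +1 on the positive side, 0 on the surface."""
    r = np.asarray(point, dtype=float)
    f = r @ A @ r + b @ r + c
    return int(np.sign(f))

# Unit sphere expressed as a quadric: A = I, b = 0, c = -1
A = np.eye(3)
b = np.zeros(3)
```

Bodies are then defined combinatorially as intersections and unions of the half-spaces these sign tests delimit.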
282

Efficient Modelling and Performance Analysis of Wideband Communication Receivers

Eriksson, Andreas January 2011 (has links)
This thesis deals with symbol error rate (SER) simulation of wireless communications and its application to throughput analysis of Ultra-Wideband (UWB) systems. The SERs are simulated in C++ using the Monte Carlo method; once some have been calculated, the rest are estimated using a novel extrapolation method. These SER values are very accurate and in this thesis go as low as 1.0e-14. Reaching such low values would otherwise be impossible with the traditional Monte Carlo method because of the very large computation time required, whereas the novel extrapolation method can simulate a SER curve in less than 30 seconds. It is assumed that the noise belongs to the generalized Gaussian distribution family; among these, noise from the normal distribution (Gaussian noise) gives the best results. Gaussian noise is also the most commonly used in digital communication simulations. Although the program is used for throughput analysis of UWB, it could easily be adapted to various signals. In this thesis, throughput analysis means a plot of symbol rate versus distance. For any given symbols the user can, with a desired minimum SER, generate an extrapolated SER curve and see what symbol rate the system can achieve while obeying the power constraints on signals imposed by international regulations. The developed program is tested against published theoretical results for QAM and PSK cases, but can easily be extended to UWB systems.
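As a point of reference for the direct-counting approach that the extrapolation method accelerates, here is a minimal Monte Carlo SER estimator for binary antipodal signalling over AWGN. This is our own Python sketch, not the thesis's C++ implementation, which covers QAM/PSK and generalized Gaussian noise:

```python
import numpy as np

def monte_carlo_ser(snr_db, n_symbols=200_000, seed=0):
    """Direct Monte Carlo SER estimate for binary antipodal signalling
    over AWGN: count decision errors over n_symbols trials."""
    rng = np.random.default_rng(seed)
    snr = 10.0 ** (snr_db / 10.0)
    symbols = rng.choice([-1.0, 1.0], size=n_symbols)
    noise = rng.normal(0.0, np.sqrt(1.0 / (2.0 * snr)), size=n_symbols)
    decisions = np.sign(symbols + noise)       # threshold detector at zero
    return np.mean(decisions != symbols)

# At 6 dB the true SER is about 2.4e-3 and direct counting works; at
# the SERs targeted by the extrapolation method (1e-14) it is hopeless.
ser = monte_carlo_ser(6.0)
```

The error count shrinks with the SER itself, which is why simulating down to 1.0e-14 requires extrapolating from the directly measurable region of the curve.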
283

Pricing American Options using Simulation

Larsson, Karl January 2007 (has links)
American options are financial contracts that allow exercise at any time until expiration. While the pricing of standard American option contracts has been well researched, with a few exceptions no analytical solutions exist. Valuation of more involved American option contracts, which include multiple underlying assets or path-dependent payoffs, is still to a high degree an uncharted area. Most numerical methods work badly for such options, as their time complexity scales exponentially with the number of dimensions. In this Master's thesis we study valuation methods based on Monte Carlo simulations. Monte Carlo methods do not suffer from exponential time complexity, but have been known to be difficult to use for American option pricing due to the forward nature of simulations and the backward nature of American option valuation. The studied methods are: parametrization of the exercise rule, Random Tree, Stochastic Mesh, and a regression-based method with a dual approach. These methods are evaluated and compared for the standard American put option and for the American maximum call option. Where applicable, the values are compared with those from deterministic reference methods. The strengths and weaknesses of each method are discussed. The regression-based method essentially reduces the problem to one of selecting suitable basis functions. This choice is empirically evaluated for the following American option contracts: standard put, maximum call, basket call, Asian call, and Asian call on a basket. The set of basis functions considered includes polynomials in the underlying assets, the payoff, the price of the corresponding European contract, and certain analytic approximations of the latter. Results from the empirical studies show that the regression-based method is the best choice when pricing exotic American options.
Furthermore, using available analytical approximations for the corresponding European option values as a basis function seems to improve the performance of the method in most cases.
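The regression-based approach can be sketched as least-squares Monte Carlo: simulate paths forward, then step backwards in time, regressing continuation values on basis functions of the asset price and exercising where the immediate payoff exceeds the fitted continuation value. The sketch below prices a standard American put with a simple polynomial basis; all parameter values are illustrative, not taken from the thesis:

```python
import numpy as np

def american_put_lsm(s0=36.0, k=40.0, r=0.06, sigma=0.2, t=1.0,
                     steps=50, paths=20_000, seed=1):
    """Least-squares Monte Carlo price of a standard American put,
    using a quadratic polynomial in the asset price as the basis."""
    rng = np.random.default_rng(seed)
    dt = t / steps
    disc = np.exp(-r * dt)
    # Simulate geometric Brownian motion paths
    z = rng.standard_normal((paths, steps))
    s = s0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt
                              + sigma * np.sqrt(dt) * z, axis=1))
    payoff = np.maximum(k - s[:, -1], 0.0)      # value if held to expiry
    for i in range(steps - 2, -1, -1):
        payoff *= disc                          # discount one step back
        itm = k - s[:, i] > 0                   # regress on in-the-money paths only
        if itm.sum() > 3:
            x = s[itm, i]
            coeffs = np.polyfit(x, payoff[itm], 2)
            continuation = np.polyval(coeffs, x)
            exercise = k - x
            payoff[itm] = np.where(exercise > continuation, exercise, payoff[itm])
    return disc * payoff.mean()

price = american_put_lsm()
```

Swapping the polynomial basis for the European option price, or an analytic approximation of it, is exactly the basis-function choice evaluated empirically in the thesis.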
284

Monte Carlo modeling of the sensitivity of x-ray photoconductors

Yunus, Mohammad 13 May 2005 (has links)
The sensitivity reduction, or ghosting, mechanism of x-ray photoconductors is studied using Monte Carlo simulation techniques. We have calculated the sensitivity reduction for different detector operating conditions (applied electric field, x-ray spectrum, and photoconductor thickness) and for different levels of carrier trapping, and have analyzed the effect of photoconductor biasing (positive or negative) on ghosting. The following effects are taken into account in modeling the ghosting phenomena: (i) recombination between trapped and oppositely charged drifting carriers, (ii) trap filling, (iii) nonuniform electric field, (iv) detrapping of trapped holes, and (v) x-ray-induced trap generation. Our calculations show that not only recombination between trapped and oppositely charged drifting carriers but also x-ray-induced trap generation is responsible for ghosting in photoconductor-based x-ray image detectors. Moreover, not all the trapped carriers take part in recombination; only a fraction of them are involved. The electric field also plays an important role in ghosting via the electron-hole pair generation mechanism, and trap filling has nontrivial effects as well. The simulation results show that the amount of ghosting strongly depends on the applied electric field: ghosting increases with decreasing applied electric field and vice versa. Ghosting is higher at high carrier trapping levels than at low trapping levels, and is more pronounced in chest radiographic detectors than in mammographic detectors; in chest radiographic detectors carrier trapping is high due to the greater thickness, so recombination and electric field effects are prominent. Bias-dependent ghosting depends on the carrier mobility-lifetime product: for positively biased detectors ghosting is lower if the mobility-lifetime product of holes exceeds that of electrons, and vice versa for negatively biased detectors. It also appears that using recombination alone to calculate ghosting, as some of the literature holds it to be the primary source, will lead to significant error.
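The role of carrier trapping in detector sensitivity can be illustrated with a toy Monte Carlo: generate carriers at random depths in the photoconductor layer and let each survive its drift to the collecting electrode with a probability that decays with drift distance. This is a drastically simplified stand-in for the full model (no recombination, trap filling, detrapping, or field nonuniformity) intended only to show the simulation style; all numbers are invented:

```python
import numpy as np

def collection_efficiency(thickness_um=200.0, schubweg_um=500.0,
                          n_carriers=50_000, seed=0):
    """Toy Monte Carlo of charge collection: each carrier, generated
    uniformly in depth, survives a drift of length d to the electrode
    with probability exp(-d / schubweg)."""
    rng = np.random.default_rng(seed)
    depth = rng.uniform(0.0, thickness_um, n_carriers)   # generation depth
    drift = thickness_um - depth                          # distance to electrode
    survived = rng.random(n_carriers) < np.exp(-drift / schubweg_um)
    return survived.mean()

eff = collection_efficiency()
```

A ghosting calculation in this style would rerun the drift with the trap population modified by a previous exposure and compare the two collection efficiencies.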
285

Uncertainty Analysis of the NONROAD Emissions Model for the State of Georgia

Chi, Tien-Ru Rosa 23 August 2004 (has links)
Understanding uncertainty in emissions inventories is critical for evaluating both air quality modeling results and the impacts of emissions reduction strategies. This study focused on quantifying the uncertainty due to nonroad emissions specifically for the state of Georgia using the EPA NONROAD emissions model. Nonroad engines contribute significantly to anthropogenic emissions inventories, with national estimates for various criteria pollutants ranging from 14% to 22%. The NONROAD model is designed to estimate emissions for any area in the United States based on population, activity, and emissions data. Information used in the model comes from a variety of sources collected over many years. A sensitivity analysis of the model determined the input variables that have significant effects on emissions. Results showed that model-estimated emissions are significantly sensitive to increases in equipment population, activity, load factor, and emission factor. Increases in ambient temperature, fuel RVP, fuel sulfur (except for SO2), and average useful life have smaller effects. Emissions and activity data used in the NONROAD model were analyzed using statistical techniques to quantify uncertainty in the input parameters. Expert elicitation was also used to estimate uncertainties in emission factors, equipment population, activity, load factors, and the geographic allocation of emissions to the county level. A Monte Carlo approach using the derived parameter uncertainties and different input probability distributions was used to estimate the overall uncertainty of emissions from the NONROAD model for the state of Georgia. The uncertainties resulting from this analysis were significant, with 95% confidence intervals about the mean ranging from ?? to +61% for THC, -46% to +68% for NOx, -43% to +75% for CO, and ?? to +75% for PM. The sensitivity of ozone and CO in different regions of Georgia to NONROAD emissions was also estimated. The analysis suggests that uncertainties in ozone and CO simulations due to NONROAD emissions uncertainties, averaged over the regions of interest, are not large, with resulting maximum coefficients of variation of 1% and 10%, respectively.
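The propagation step can be sketched as follows: draw each input (equipment population, activity, load factor, emission factor) from an assumed distribution, multiply through the emissions equation, and read confidence intervals off the resulting sample. The distributions and magnitudes below are placeholders, not the study's elicited values:

```python
import numpy as np

def emissions_ci(n=20_000, seed=0):
    """Monte Carlo uncertainty propagation through the product-form
    emissions equation; returns the 2.5th/50th/97.5th percentiles."""
    rng = np.random.default_rng(seed)
    pop      = rng.lognormal(mean=np.log(1000), sigma=0.2, size=n)  # units
    activity = rng.lognormal(mean=np.log(400),  sigma=0.3, size=n)  # hr/yr
    load     = rng.lognormal(mean=np.log(0.5),  sigma=0.1, size=n)  # dimensionless
    ef       = rng.lognormal(mean=np.log(10),   sigma=0.3, size=n)  # g/hr at full load
    emissions = pop * activity * load * ef / 1e6                    # tonnes/yr
    lo, med, hi = np.percentile(emissions, [2.5, 50.0, 97.5])
    return lo, med, hi

lo, med, hi = emissions_ci()
```

With multiplicative inputs the resulting interval is asymmetric about the mean, which is consistent with the skewed confidence intervals reported above.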
286

Pricing Path-Dependent Derivative Securities Using Monte Carlo Simulation and Intra-Market Statistical Trading Model

Lee, Sungjoo 09 December 2004 (has links)
This thesis is composed of two parts. The first part deals with a technique for pricing American-style contingent claims; the second details a statistical arbitrage model using statistical process control approaches. We propose a novel simulation approach for pricing American-style contingent claims: an adaptive policy search algorithm for obtaining the optimal policy for exercising an American-style option. The option price is obtained by first estimating the optimal exercise policy and then evaluating the option under the estimated policy through simulation. Both high-biased and low-biased estimators of the option price are obtained. We show that the proposed algorithm converges to the true optimal policy with probability one. This policy search algorithm requires little knowledge about the structure of the optimal policy and can be naturally implemented using parallel computing methods. As illustrative examples, computational results on pricing regular American options and American-Asian options are reported; they indicate that our algorithm is faster than certain alternative American option pricing algorithms reported in the literature. Secondly, we investigate arbitrage opportunities arising from continuous monitoring of the price difference of highly correlated assets. By differencing the two assets, we can separate the common macroeconomic factors that influence the asset price movements from an idiosyncratic component that can be monitored very closely on its own. Since price movements follow macroeconomic conditions such as interest rates and economic cycles, departures from normal behavior in the price changes become easy to detect. We apply a statistical process control approach for monitoring time series with serially correlated data, and use various variance estimators to establish trading-strategy thresholds.
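The monitoring idea in the second part can be sketched as a simple control chart on the price difference of two assets: points outside mean ± k·sigma limits estimated over a trailing window are flagged as out-of-control, i.e. candidate trading signals. The thesis examines several variance estimators suited to serially correlated data; this sketch uses the plain sample standard deviation for illustration:

```python
import numpy as np

def spread_signals(prices_a, prices_b, window=20, k=3.0):
    """Flag indices where the spread a - b leaves the mean +/- k*sigma
    control limits estimated from the preceding `window` observations."""
    spread = np.asarray(prices_a, dtype=float) - np.asarray(prices_b, dtype=float)
    signals = []
    for i in range(window, len(spread)):
        hist = spread[i - window:i]
        mu, sd = hist.mean(), hist.std(ddof=1)
        if abs(spread[i] - mu) > k * sd:
            signals.append(i)   # out-of-control point: candidate trade entry
    return signals
```

Replacing `hist.std` with a variance estimator that accounts for serial correlation is precisely where the thesis's comparison of estimators enters.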
287

Applying Value-at-Risk to Financial Risk Evaluation in BOT Projects

Sung, Chao-Hsien 28 May 2010 (has links)
There is a growing trend toward using public-private partnerships (PPP) for infrastructure project delivery, due to budget shortfalls and the inefficiency of the public sector. The most popular PPP option is a concession-based type such as build-operate-transfer (BOT). However, construction delays, cost overruns, and disastrous financial performance in the early operation phase are not uncommon in large BOT projects; the case of Taiwan High Speed Rail (THSR) is evidence of this. The problem lies in over-optimism in financial feasibility analysis and under-estimation of risk exposure. Based on information for the Case Project, Kaohsiung Intercontinental Terminal (KIT), which started its Phase One Plan in 2007 at a cost of about NT$42.89 billion for land procurement, peripheral public infrastructure, and the construction and facilities of the terminal, I apply traditional capital investment methodology to evaluate its financial feasibility. This is done by calculating key financial indexes from the total-investment point of view and the equity point of view, and determining by conventional criteria whether the project is acceptable from the position of the three main participants: the government agency, the financial institutions, and the private investors. Traditional methods, however, cannot reveal the risk exposure of the Case Project. Therefore, the ideas of Value-at-Risk (VAR), commonly used to evaluate the risk exposure of financial assets, are brought in. The VAR concepts are applied to four financial indexes regarded as critical in the decisions of the government agency, private investors, and financial institutions: the self-liquidation ratio (SLR), net present value (NPV), debt coverage ratio (DCR), and times interest earned (TIE). This is done with Monte Carlo simulation involving 1,000 iterations of sampling based on the parameter settings of the risk factors and their correlations. The volatility of key risk factors is analyzed to give a fuller picture of risk exposure in terms of the VARs of the financial indexes. The evidence shows that, while the parameter settings of the risk factors are critical to the simulation results, accounting for correlations among risk factors also shifts the results away from those obtained when correlations are ignored. In addition, sensitivity analysis in terms of the volatility of key risk factors presents the full-scale financial risk exposure, which is helpful in reaching a final decision. Of the three participants in the Case Project, private investors face the greatest risk exposure due to the high financial leverage employed, while financial institutions confront relatively low risk in terms of loan repayment. From the government agency's view, the probability that the Case Project is fully self-liquidating with no subsidy is 90%.
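The simulation step can be sketched as follows: draw stochastic risk factors, roll the concession cash flows forward, discount to an NPV, and take a lower quantile of the simulated NPVs as the VAR-style risk measure. In the sketch below every figure except the NT$42.89 billion cost from the text is an invented assumption (discount rate, revenue base, growth and noise distributions):

```python
import numpy as np

def npv_var(n=10_000, seed=0, alpha=0.05):
    """Monte Carlo NPV-at-risk for a stylized concession project:
    returns (mean NPV, alpha-quantile NPV), both in NT$ billion."""
    rng = np.random.default_rng(seed)
    r = 0.05                                   # discount rate (assumed)
    capex = 42.89                              # NT$ billion, from the text
    years = np.arange(1, 31)                   # 30-year concession (assumed)
    npvs = np.empty(n)
    for i in range(n):
        growth = rng.normal(0.03, 0.02)        # persistent revenue-growth draw
        revenue = 3.0 * (1 + growth) ** years  # NT$ billion/yr, assumed base
        cash = revenue * rng.normal(1.0, 0.1, size=years.size)
        npvs[i] = np.sum(cash / (1 + r) ** years) - capex
    return npvs.mean(), np.percentile(npvs, 100 * alpha)

mean_npv, var_5 = npv_var()
```

The other indexes (SLR, DCR, TIE) would be computed from the same simulated cash-flow paths, with correlations imposed on the risk-factor draws.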
288

A Theoretical Approach for the Determination and Mechanistic Interpretation of Radiation D10-value

Ekpanyaskun, Nont May 2009 (has links)
In the design of the food irradiation process, knowledge of the radiation resistance of the target organism in a specific food commodity is required. The D10-value, the radiation dose needed to inactivate 90% of the microbial load in the food medium, is used to relate the amount of absorbed energy to the surviving bacterial population. Numerous experimental studies have been performed to determine the D10-values of several food-borne microorganisms irradiated under various conditions. Nevertheless, accurate predictions of D10-values for pathogens in food products that have not been empirically examined cannot be made, owing to insufficient understanding of the biological response to radiation exposure. A theoretical model for the derivation of the D10-value is proposed in this study to mechanistically assess the production of radiation-induced DNA damage by energetic electrons. The step-by-step Monte Carlo simulation technique, which follows the detailed histories of the ionizing particles and the radiolytic species, was utilized. The effects of selected parameters, including the genomic sequence, the type of DNA double-strand break, the DNA damaging agents, the radical scavengers, the degree of dispersion of DNA molecules, and the number of genome equivalents, were investigated hypothetically. The computational methodology developed, as well as the results presented, can be used as an analytical tool to evaluate the impact of ionizing radiation on cell survival.
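The endpoint of such a model is the D10-value itself. Under the classical log-linear survival model log10(S) = −D/D10, the D10 is minus the reciprocal of the slope of log-survival versus dose, so it can be extracted from either measured or simulated survival data by a one-line fit. A sketch with synthetic data (the dose points and the true D10 of 0.3 kGy are invented, not from the thesis):

```python
import numpy as np

def d10_from_survival(doses, surviving_fraction):
    """Fit log10(S) = -D/D10 by least squares and return D10 = -1/slope."""
    slope, _ = np.polyfit(doses, np.log10(surviving_fraction), 1)
    return -1.0 / slope

doses = np.array([0.0, 0.2, 0.4, 0.6, 0.8])   # kGy (synthetic)
surv = 10.0 ** (-doses / 0.3)                 # exact log-linear data, D10 = 0.3 kGy
d10 = d10_from_survival(doses, surv)
```

In the mechanistic approach, the surviving fractions fed to such a fit would come from the simulated yield of lethal DNA double-strand breaks rather than from plate counts.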
289

A Study on the Impacts of Grid Connection Wind Power Generations

Kuo, Zhi-Yuan 01 July 2004 (has links)
Wind power generation has a growing impact on electric utility power systems. When wind power is placed into service in an electric system it becomes a functioning part of that system, which may require other design changes and special practices to integrate it. The presence of wind power generation units directly affects the voltage profile along a feeder by changing the direction and magnitude of active/reactive power flows. A number of coordination issues, including safety, protection, and voltage and frequency control, presently require study in order to understand the technical limits to the penetration of wind power or distributed generation on a given system. The aim of this thesis is to investigate the impacts of wind generators connected to a distribution system. To account for load uncertainty and for wind power generation uncertainty due to wind speed variation, the Monte Carlo simulation technique is used. A number of cases are tested to assess the impacts of wind power generation in various scenarios for the studied network. Test results show that connecting wind power generators to the distribution network not only reduces the probability of undervoltage but also decreases the feeder losses. The analytical models proposed in this thesis can provide the utility with useful information for placing wind power generators.
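The sampling scheme can be sketched as follows: draw wind speeds from a Weibull distribution, map them through a simplified turbine power curve, draw the load from a normal distribution, and approximate the feeder voltage with a linearized drop. Every parameter here is an illustrative assumption, not taken from the studied network:

```python
import numpy as np

def undervoltage_probability(n=50_000, seed=0, with_wind=True):
    """Toy Monte Carlo: probability that the bus voltage drops below
    0.95 p.u. under random wind output and random load."""
    rng = np.random.default_rng(seed)
    v_wind = 8.0 * rng.weibull(2.0, n)                    # m/s: shape 2, scale 8
    frac = np.clip((v_wind - 3.0) / (12.0 - 3.0), 0.0, 1.0)  # cut-in 3, rated 12 m/s
    frac[v_wind > 25.0] = 0.0                             # cut-out speed
    p_wind = 2.0 * frac if with_wind else np.zeros(n)     # MW, 2 MW rated
    p_load = rng.normal(3.0, 0.5, n)                      # MW
    v_pu = 1.0 - 0.02 * (p_load - p_wind)                 # linearised voltage drop
    return np.mean(v_pu < 0.95)
```

Comparing the estimate with and without the wind injection reproduces, in miniature, the thesis's finding that wind generation lowers the probability of undervoltage.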
290

An Analysis on the Blade Design Parameters of Turbo Molecular Pumps

Tsai, Hong-Zhi 27 July 2000 (has links)
Turbomolecular pumps, abbreviated as TMPs, can create a high-vacuum environment for certain industries, especially the semiconductor and IC industries. The blade design is one of the main technologies that affect the performance of a TMP. The objective of this study is to investigate which blade design parameters, e.g. blade angle, blade spacing, blade chord, and blade velocity, affect TMP performance. It is hoped that an analysis methodology for these parameters can be set up from the viewpoint of the pumping-speed curve. The results of this study will be useful for the design of TMPs.
