461.
A Three-dimensional Direct Simulation Monte Carlo Methodology on Unstructured Delaunay Grids with Applications to Micro and Nanoflows. Chamberlin, Ryan Earl, 29 March 2007.
The focus of this work is to present in detail the implementation of a three-dimensional direct simulation Monte Carlo methodology on unstructured Delaunay meshes (U-DSMC). The validation and verification of the implementation are shown using a series of fundamental flow cases. The numerical error associated with the implementation is also studied using a fundamental flow configuration. Gas expansion from microtubes is studied using the U-DSMC code for tube diameters ranging from 100 μm down to 100 nm. Simulations are carried out for a range of inlet Knudsen numbers, and the effects of aspect ratio and inlet Reynolds number on the plume structure are investigated. The effect of scaling the geometry is also examined. Gas expansion from a conical nozzle is studied using the U-DSMC code for throat diameters ranging from 250 μm down to 250 nm. Simulations are carried out for a range of inlet Knudsen numbers, and the effects of inlet speed ratio and inlet Reynolds number on the plume structure are investigated. The effect of scaling the geometry is examined. Results of a numerical study using the U-DSMC code are employed to guide the design of a micropitot probe intended for use in analyzing rarefied gaseous microjet flow. The flow conditions considered correspond to anticipated experimental test cases for a probe that is currently under development. The expansion of nitrogen from an orifice with a diameter of 100 μm is modeled using U-DSMC. From these results, local 'free stream' conditions are obtained for use in U-DSMC simulations of the flow in the vicinity of the micropitot probe. Predictions of the pressure within the probe are made for a number of locations in the orifice plume. The predictions from the U-DSMC simulations are used to evaluate the geometrical design of the probe and to aid in pressure sensor selection. The effect of scale on the statistical fluctuation of the U-DSMC data is studied using Poiseuille flow. The error in the predicted velocity profile is calculated with respect to both first- and second-order slip formulations. Simulations are carried out for a range of channel heights, and the error between the U-DSMC predictions and theory is calculated for each case. From this error, a functional dependence between the scale-induced statistical fluctuations and decreasing channel height is shown.
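As a minimal sketch of the slip-flow comparison described in this abstract: the Python snippet below evaluates nondimensional first- and second-order slip solutions for plane Poiseuille flow and computes an RMS error against a mock velocity profile. The slip coefficients (C1 = C2 = 1) and the synthetic noisy profile are illustrative assumptions, not data or coefficients from the dissertation.

```python
import numpy as np

# Nondimensional plane Poiseuille velocity with slip boundary conditions:
#   u*(y*) = y*(1 - y*) + C1*Kn + 2*C2*Kn**2,   y* = y/H
# C2 = 0 recovers the first-order formulation; C1 and C2 are model-dependent
# slip coefficients, assumed here to be 1.
def slip_profile(y, kn, c1=1.0, c2=0.0):
    return y * (1.0 - y) + c1 * kn + 2.0 * c2 * kn**2

rng = np.random.default_rng(4)
y = np.linspace(0.0, 1.0, 21)
kn = 0.05  # Knudsen number based on channel height

# stand-in for a noisy U-DSMC velocity profile (not actual dissertation data)
u_mock = slip_profile(y, kn, c2=1.0) + rng.normal(0.0, 2e-3, y.size)

for c2, label in [(0.0, "first-order"), (1.0, "second-order")]:
    rms = np.sqrt(np.mean((u_mock - slip_profile(y, kn, c2=c2))**2))
    print(f"{label} slip RMS error: {rms:.2e}")
```

The second-order term adds a 2*C2*Kn**2 correction to the wall slip, which is why the two formulations diverge as Kn grows and why the error analysis distinguishes between them.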
462.
Development of a Monte Carlo Simulation Method for use in investigating CT (Computed Tomography) Mammography. Thacker, Samta C., 30 April 2004.
The development of new digital mammography techniques such as dual-energy imaging, tomosynthesis, and CT mammography will require investigation of optimal camera design parameters and optimal image acquisition parameters. One tool that is useful for this purpose is Monte Carlo simulation. This study presents a methodology for generating simulated images from a CsI-based, flat-panel imager model and for estimating the normalized glandular dose to the uncompressed breast in CT mammography. The simulation uses the GEANT 3 Monte Carlo code to model x-ray transport and absorption within the CsI scintillator, and the DETECT-II code to track optical photon spread within a columnar model of the CsI scintillator. The Monte Carlo modeling of x-ray transport and absorption within the CsI was validated by comparison to previously published values for the probability of a K-shell interaction, the fluorescent yield, the probability of a K-fluorescent emission, and the escape fraction describing the probability of a K x-ray escaping the scintillator. To validate the combined (GEANT 3 coupled with DETECT-II) Monte Carlo approach for forming simulated images, modulation transfer functions (MTFs) and system sensitivity (electrons/mR/pixel) obtained from simulations were compared to empirical measurements obtained with different x-ray spectra and imagers of varying CsI thicknesses. By varying the absorption and reflective properties of the columnar CsI used in the DETECT-II code, good agreement between simulated and empirically measured MTFs and system sensitivity was observed. The Monte Carlo software was also validated for dosimetry by comparing the linear attenuation coefficient values and the normalized glandular dose (DgN) values of the compressed breast to those reported in the literature. The normalized glandular dose was then estimated for three different sizes of the uncompressed breast with a homogeneous composition of adipose and glandular tissue. Further, fit equations for the normalized glandular dose curves were generated using MATLAB. These equations can be used to replicate the dose for the three breast sizes and three compositions of adipose and glandular tissue. In addition, images displaying energy deposition maps are presented to better understand the spatial distribution of dose in CT mammography.
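A hedged sketch of one validation step mentioned above: computing a presampled MTF from a line spread function via Fourier transform. The Gaussian LSF and the pixel pitch here are stand-ins, not outputs of the GEANT 3/DETECT-II chain.

```python
import numpy as np

pitch = 0.1                                  # assumed pixel pitch, mm
x = (np.arange(-64, 64) + 0.5) * pitch       # sample positions, mm
lsf = np.exp(-0.5 * (x / 0.15)**2)           # stand-in line spread function
lsf /= lsf.sum()                             # normalize to unit area

mtf = np.abs(np.fft.rfft(lsf))               # modulus of the Fourier transform
mtf /= mtf[0]                                # MTF(0) = 1 by convention
freq = np.fft.rfftfreq(lsf.size, d=pitch)    # spatial frequency, cycles/mm

for f, m in zip(freq[:6], mtf[:6]):
    print(f"{f:5.2f} cycles/mm -> MTF = {m:.3f}")
```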
463.
Historical risk assessment of a balanced portfolio using Value-at-Risk. Malfas, Gregory P., 30 April 2004.
Calculation of the Value at Risk (VaR) measure of a portfolio can be done using Monte Carlo simulations of that portfolio's potential losses over a specified period of time. Regulators, such as the US Securities and Exchange Commission, and exchanges, such as the New York Stock Exchange, establish regulatory capital requirements for firms. These regulations set the amount of capital that firms are required to have on hand to safeguard against market losses that can occur. VaR gives us this specific monetary value set by regulators and exchanges. The capital on hand must be such that, for a given confidence level, a portfolio's losses over a certain period of time will likely be no greater than the capital the firm is required to hold. The scenario used is that of a risk manager who has inherited a portfolio, subject to certain parameters, that was set up for a client beginning in April 1992. The initial portfolio is worth $61,543,328.00. The risk manager is responsible for calculating the Value at Risk measure, at five percent, with a confidence level of 95% and 20 days out from each of the 24 business quarters, over a six-year period starting in 1992 and ending in 1996.
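A minimal Python sketch of the Monte Carlo VaR calculation described above; only the initial portfolio value, the 20-day horizon, and the 95% confidence level come from the abstract, while the lognormal return model and its drift and volatility are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
portfolio_value = 61_543_328.00        # initial value from the abstract
n_sims, horizon = 100_000, 20          # 20 trading days out
mu, sigma = 0.0004, 0.012              # assumed daily mean/vol of portfolio returns

# simulate cumulative 20-day log returns and the resulting P&L
log_ret = rng.normal(mu, sigma, size=(n_sims, horizon)).sum(axis=1)
pnl = portfolio_value * (np.exp(log_ret) - 1.0)

# 95% VaR: the loss not exceeded with 95% confidence (5th percentile of P&L)
var_95 = -np.percentile(pnl, 5)
print(f"20-day 95% VaR: ${var_95:,.0f}")
```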
464.
Computational and instrumental developments in quantitative Auger electron analysis. Jackson, Andrew Robert, January 1999.
No description available.
465.
Aquametria por micro-ondas [Microwave Aquametry]. Severo, Sergio Luiz Schubert, January 2016.
Measuring the moisture content of materials yields economic and environmental gains by enabling precise control of the drying of agricultural products and of curing and storage processes, and by reducing material losses and energy consumption. Such measurement can be performed through the interaction of electromagnetic waves at microwave frequencies with the material. This dissertation presents the historical development and the state of the art of this area of metrology, known as microwave aquametry. It details the behavior of electromagnetic waves in dielectrics, the scattering model at sample interfaces, the relationship between complex permittivity and moisture, methods for determining permittivity, and the instruments, systems, and metrological aspects of the measurement chain for dielectric spectroscopy. Two new methods, with original explicit equations for determining permittivity, are derived; the new equations yield lower uncertainty than the classical methods. A new methodology, based on the Monte Carlo method, is employed to evaluate their uncertainty. The general conclusion is that the advancement of aquametry necessarily depends on improving the instruments and methods for determining the permittivity of materials.
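Because the dissertation's explicit permittivity equations are not reproduced in this abstract, the sketch below illustrates only the Monte Carlo uncertainty evaluation itself (in the spirit of GUM Supplement 1), applied to a hypothetical transmission-phase model of permittivity; the model and every numerical value are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000
c = 299_792_458.0                      # speed of light, m/s

# measured inputs with assumed standard uncertainties (illustrative values)
f   = rng.normal(2.45e9, 1e5, N)       # frequency, Hz
L   = rng.normal(0.020, 20e-6, N)      # sample length, m
phi = rng.normal(4.80, 0.02, N)        # measured phase delay, rad

# hypothetical explicit model: phi = 2*pi*f*L*sqrt(eps_r)/c
# solved for the real relative permittivity eps_r
eps_r = (phi * c / (2 * np.pi * f * L))**2
print(f"eps_r = {eps_r.mean():.3f} +/- {eps_r.std(ddof=1):.3f} (k=1)")
```

Sampling the inputs and pushing each draw through the explicit equation gives the output distribution directly, with no linearization, which is why the Monte Carlo method is attractive for comparing the uncertainty of competing permittivity equations.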
466.
The estimation of natural frequencies and damping ratios of offshore structures. Campbell, Robert Bradlee, January 1980.
Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Ocean Engineering, 1980. Microfiche copy available in Archives and Engineering. Includes bibliographical references.
467.
Probabilistic evaluation of tankship damage in collision events. Crake, Kurtis Wayne, January 1998.
Thesis (Nav.E.)--Massachusetts Institute of Technology, Dept. of Ocean Engineering; and (M.Eng.)--Joint Program in Marine Environmental Systems, Massachusetts Institute of Technology and Woods Hole Oceanographic Institution, 1998. Includes bibliographical references (p. 85-86).
468.
Monte Carlo calculation of rarefied hypersonic gas flow past a circular disc. Kuwano, Yoshiaki, January 1981.
Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Aeronautics and Astronautics, 1981. Microfiche copy available in Archives and Aero. Includes bibliographical references.
469.
Asymptotic Theory and Applications of Random Functions. Li, Xiaoou, January 2016.
Random functions are a central component of many statistical and probabilistic problems. This dissertation presents theoretical analysis and computation for random functions and their applications in statistics. It consists of two parts. The first part is on the topic of classic continuous random fields, presenting asymptotic analysis and computation for three non-linear functionals of random fields. In Chapter 1, we propose an efficient Monte Carlo algorithm for computing P{sup_{t∈T} f(t) > b} when b is large and f is a Gaussian random field living on a compact subset T. For each pre-specified relative error ɛ, the proposed algorithm runs in constant time for arbitrarily large b and computes the probability with relative error ɛ. In Chapter 2, we present the asymptotic analysis for the tail probability of ∫_T e^{σf(t)+μ(t)}dt in the asymptotic regime where σ tends to zero. In Chapter 3, we consider partial differential equations (PDEs) with random coefficients, and we develop an unbiased Monte Carlo estimator with finite variance for computing expectations of the solution to random PDEs. Moreover, the expected computational cost of generating one such estimator is finite. In this analysis, we employ a quadratic approximation to solve random PDEs and perform precise error analysis of this numerical solver. The second part of this dissertation focuses on topics in statistics. The random functions of interest are likelihood functions, whose maxima play a key role in statistical inference. We present asymptotic analysis for likelihood-based hypothesis tests and sequential analysis. In Chapter 4, we derive an analytical form for the exponential decay rate of the error probabilities of the generalized likelihood ratio test for testing two general families of hypotheses. In Chapter 5, we study asymptotic properties of the generalized sequential probability ratio test, whose stopping rule is the first boundary crossing time of the generalized likelihood ratio statistic. We show that this sequential test is asymptotically optimal in the sense that it achieves asymptotically the shortest expected sample size as the maximal type I and type II error probabilities tend to zero. These results have important theoretical implications in hypothesis testing, model selection, and other areas where maximum likelihood is employed.
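For contrast with the constant-time algorithm of Chapter 1, a crude Monte Carlo baseline for P{sup_T f(t) > b} on a discretized Gaussian field is sketched below; its relative error deteriorates as b grows, which is precisely the regime the proposed importance-sampling algorithm addresses. The kernel, domain, and threshold are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)

# discretize T = [0, 1] and build a squared-exponential covariance
t = np.linspace(0.0, 1.0, 101)
cov = np.exp(-((t[:, None] - t[None, :]) / 0.1)**2)
L = np.linalg.cholesky(cov + 1e-6 * np.eye(t.size))  # jitter for stability

b, n_sims = 2.5, 50_000
f = L @ rng.standard_normal((t.size, n_sims))        # zero-mean field samples
exceed = f.max(axis=0) > b
p_hat = exceed.mean()
se = exceed.std(ddof=1) / np.sqrt(n_sims)
print(f"P(sup f > {b}) ~ {p_hat:.3e} (std. err. {se:.1e})")
```

For a fixed sample budget the standard error stays near sqrt(p/n), so the relative error grows without bound as p shrinks with increasing b; achieving a fixed relative error in constant time requires a change of measure of the kind the dissertation develops.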
470.
A Systematic Framework to Optimize and Control Monoclonal Antibody Manufacturing Process. Li, Ying Fei, January 2018.
Since the approval of the first therapeutic monoclonal antibody in 1986, monoclonal antibodies have become an important class of drugs within the biopharmaceutical industry, with indications and superior efficacy across multiple therapeutic areas, such as oncology and immunology. Although there have been great advances in this field, there are still challenges that hinder or delay the development and approval of new antibodies.

For example, we have seen issues in manufacturing, such as quality problems, process inconsistency, and high manufacturing cost, which can lead to production failures, delays in approval, and drug shortages. Recently, the development of new technologies, such as Process Analytical Technology (PAT), and the use of statistical tools, such as quality by design (QbD), Design of Experiments (DoE), and Statistical Process Control (SPC), have enabled us to identify critical process parameters and attributes and to monitor manufacturing performance.

However, these methods might not be reliable or comprehensive enough to accurately describe the relationship between critical process parameters and attributes, and they still lack the ability to forecast manufacturing performance. In this work, by utilizing multiple modeling approaches, we have developed a systematic framework to optimize and control the monoclonal antibody manufacturing process.
In our first study, we leverage a DoE-PCA approach to unambiguously identify critical process parameters that improve process yield and cost of goods, followed by the use of Monte Carlo simulation to validate the impact of these parameters on those attributes. In our second study, we use a Bayesian approach to predict product quality for future manufacturing batches, so that mitigation strategies can be put in place if the data suggest a potential deviation. Finally, we use a neural network model to accurately characterize the impurity reduction of each purification step, and ultimately use this model to develop acceptance criteria for the feed based on the predetermined process specifications. Overall, the work in this thesis demonstrates that the framework is powerful and more reliable than existing methods for process optimization, monitoring, and control.
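A hedged sketch of the Monte Carlo validation step from the first study: assumed operating distributions of hypothetical process parameters are propagated through a hypothetical DoE-fitted response surface for step yield. All names, distributions, and coefficients are illustrative, not the thesis's fitted model.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 50_000

# hypothetical process parameters with assumed operating distributions
ph   = rng.normal(7.0, 0.05, N)     # buffer pH
temp = rng.normal(36.5, 0.3, N)     # culture temperature, deg C
feed = rng.normal(3.0, 0.1, N)      # feed rate, mL/min

# hypothetical response-surface model for step yield (%) fitted from a DoE
yield_pct = 85 + 4.0*(ph - 7.0) - 2.5*(temp - 36.5)**2 + 1.8*(feed - 3.0)

spec = 80.0
print(f"mean yield {yield_pct.mean():.1f}%, "
      f"P(yield < {spec:.0f}%) = {(yield_pct < spec).mean():.3f}")
```

Repeating the simulation with each parameter's variance tightened in turn indicates which parameter's control most reduces the probability of missing the specification, which is the sense in which Monte Carlo validates the criticality rankings from the DoE-PCA step.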