451
Simulation of X-ray imaging systems for luggage inspection. Xie, Wei. 23 December 2009.
This thesis describes XL, an x-ray imaging simulator for luggage inspection. This software system runs on a workstation and models x-ray sources, x-ray detectors and objects between them. A simple graphical interface permits the user to specify simulation parameters and inputs. XL then uses Monte Carlo methods to simulate x-ray interaction with matter, including the photoelectric effect, coherent scattering, and incoherent scattering. Finally, XL can produce x-ray images which agree closely with experimental data obtained from a commercial luggage scanner. The simulator will be a valuable tool in the development of future x-ray scanners, particularly those designed to detect explosives in luggage. / Master of Science
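As a rough illustration of the kind of Monte Carlo step such a simulator performs (not XL's actual implementation, and with placeholder attenuation coefficients rather than real cross-section data), the following Python sketch samples a photon's free path from an exponential attenuation law and chooses the interaction type in proportion to assumed partial coefficients.

import numpy as np

rng = np.random.default_rng(0)

# Placeholder attenuation coefficients (1/cm) for one material at one energy.
# A real simulator would interpolate these from tabulated cross-section data.
MU = {"photoelectric": 0.02, "coherent": 0.01, "incoherent": 0.18}
MU_TOTAL = sum(MU.values())

def step_photon():
    """Sample one interaction: free path length and interaction type."""
    # Free path from the exponential (Beer-Lambert) attenuation law.
    path_cm = rng.exponential(1.0 / MU_TOTAL)
    # Interaction type chosen in proportion to the partial coefficients.
    kinds = list(MU)
    probs = np.array([MU[k] for k in kinds]) / MU_TOTAL
    return path_cm, rng.choice(kinds, p=probs)

samples = [step_photon() for _ in range(100_000)]
paths = np.array([p for p, _ in samples])
kinds = [k for _, k in samples]
print("mean free path (cm):", paths.mean())          # ~ 1 / MU_TOTAL
print("fraction absorbed:", kinds.count("photoelectric") / len(kinds))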
452
Maintenance Data Augmentation using Markov Chain Monte Carlo Simulation (Hamiltonian MCMC using NUTS). Roohani, Muhammad Ammar. January 2024.
Reliable and efficient utilization and operation of any engineering asset requires carefully designed maintenance planning, and maintenance-related data in the form of failure times, repair times, mean time between failures (MTBF), and condition data play a pivotal role in maintenance decision support. With the advancement of data analytics and industrial artificial intelligence, maintenance-related data are being used for maintenance prognostics modeling to predict future maintenance requirements, which form the basis of maintenance design and planning in any maintenance-conscious industry such as railways. The lack of such data creates a number of problems in data-driven prognostics modelling, and researchers have employed several methods to counter them. The proposed methodology uses a data augmentation technique based on Markov Chain Monte Carlo (MCMC) simulation to enhance maintenance data for use in maintenance prognostics modeling, which can serve as a basis for better maintenance decision support and planning.
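The thesis uses Hamiltonian MCMC with the NUTS sampler; the Python sketch below illustrates the same augmentation idea with a deliberately simpler random-walk Metropolis sampler and an exponential failure-time model. The failure times, prior, and step size are invented for illustration only.

import numpy as np

rng = np.random.default_rng(1)

# A small set of observed failure times (hours); illustrative values only.
failures = np.array([120.0, 340.0, 95.0, 410.0, 230.0])

# Exponential failure-time model with rate lam and a Gamma(2, 0.01) prior.
def log_post(lam):
    if lam <= 0:
        return -np.inf
    log_prior = (2 - 1) * np.log(lam) - 0.01 * lam
    log_lik = failures.size * np.log(lam) - lam * failures.sum()
    return log_prior + log_lik

# Random-walk Metropolis (stand-in for the Hamiltonian/NUTS sampler).
def metropolis(n_iter=20_000, step=0.002):
    lam = 1.0 / failures.mean()
    lp = log_post(lam)
    draws = []
    for _ in range(n_iter):
        prop = lam + rng.normal(scale=step)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            lam, lp = prop, lp_prop
        draws.append(lam)
    return np.array(draws[n_iter // 2:])        # discard burn-in

lam_draws = metropolis()
# Data augmentation: one synthetic failure time per retained posterior draw.
synthetic = rng.exponential(1.0 / lam_draws)
print("posterior mean rate (1/h):", lam_draws.mean())
print("synthetic MTBF estimate (h):", synthetic.mean())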
453
Prepayment Modeling in Mortgage Backed Securities: Independent and Strategic Approaches to Prepayment Timing. Andersson, Johanna. January 2024.
Mortgage Backed Securities (MBS) are a type of security backed by mortgages as the underlying asset. This is achieved through a process called securitization, in which specific mortgages are grouped together, separated from the bank's other assets, and sold to investors. One of the risks for investors in MBS is mortgage prepayment by the borrowers of the underlying mortgages, which creates uncertainty in the expected cash flows to be distributed among the investors. There is a correlation between falling market interest rates and an increase in prepayments: when market rates fall, borrowers have an incentive to refinance their mortgages at lower rates, leading to higher prepayment rates. The Public Securities Association (PSA) model is recognized as a standard benchmark for estimating prepayment rates in MBS. In this paper, we introduce models that generate the time points of prepayments and compare how well these models match the PSA model. Some of the models determine the timing of each prepayment event from a Poisson process with exponentially distributed waiting times, while one model employs the Gamma distribution. Additionally, we introduce a strategy in which prepayment is triggered when the market rate falls below the contract rate, and we investigate when it is most beneficial to prepay. The results show that, among the models using randomly generated prepayment events, the Gamma distribution aligns best with the PSA rule. For the strategic approach, our findings suggest that it is most advantageous to make prepayments early in the mortgage term, which also corresponds to the most rational behavior.
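A hedged sketch of the first class of models described above: prepayment times drawn from a Poisson process with exponentially distributed waiting times, compared month by month against the 100% PSA benchmark (annual CPR ramping by 0.2 percentage points per month up to 6% at month 30). The pool size and intensity are illustrative, not values from the paper.

import numpy as np

rng = np.random.default_rng(2)
N_LOANS = 10_000

def psa_cpr(month, multiplier=1.0):
    """Annual CPR under the PSA benchmark: ramps to 6% at month 30."""
    return multiplier * 0.06 * min(month, 30) / 30

# Candidate model: each loan prepays at the first event of a Poisson process
# with constant annual intensity, i.e. an exponential waiting time.
annual_rate = 0.06
prepay_month = np.ceil(rng.exponential(1.0 / annual_rate, N_LOANS) * 12)

# Empirical single-month mortality (SMM) versus the PSA-implied SMM.
for month in (6, 18, 30, 60):
    alive = np.sum(prepay_month >= month)
    smm_model = np.sum(prepay_month == month) / alive
    smm_psa = 1 - (1 - psa_cpr(month)) ** (1 / 12)
    print(f"month {month:3d}: model SMM {smm_model:.4f}, PSA SMM {smm_psa:.4f}")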
454
Generalized Principal Component Analysis. Solat, Karo. 05 June 2018.
The primary objective of this dissertation is to extend the classical Principal Components Analysis (PCA), aiming to reduce the dimensionality of a large number of Normal interrelated variables, in two directions. The first is to go beyond the static (contemporaneous or synchronous) covariance matrix among these interrelated variables to include certain forms of temporal (over time) dependence. The second direction takes the form of extending the PCA model beyond the Normal multivariate distribution to the Elliptically Symmetric family of distributions, which includes the Normal, the Student's t, the Laplace and the Pearson type II distributions as special cases. The result of these extensions is called the Generalized principal component analysis (GPCA).
The GPCA is illustrated using both Monte Carlo simulations and an empirical study, in an attempt to demonstrate the enhanced reliability of these more general factor models in the context of out-of-sample forecasting. The empirical study examines the predictive capacity of the GPCA method in the context of Exchange Rate Forecasting, showing how the GPCA method dominates forecasts based on existing standard methods, including random walk models, with or without macroeconomic fundamentals. / Ph. D. / Factor models are employed to capture the hidden factors behind the movement among a set of variables. They use the variation and co-variation between these variables to construct a smaller number of latent variables that can explain the variation in the data at hand. Principal component analysis (PCA) is the most popular of these factor models.
I have developed new factor models that reduce the dimensionality of a large set of data by extracting a small number of independent latent factors which represent a large proportion of the variability in the particular data set. These factor models, called generalized principal component analysis (GPCA), are extensions of the classical principal component analysis (PCA) that can account for both contemporaneous and temporal dependence based on non-Gaussian multivariate distributions.
Using Monte Carlo simulations along with an empirical study, I demonstrate the enhanced reliability of my methodology in the context of out-of-sample forecasting. In the empirical study, I examine the predictive power of the GPCA method in the context of "Exchange Rate Forecasting". I find that the GPCA method dominates forecasts based on existing standard methods as well as random walk models, with or without macroeconomic fundamentals.
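For orientation, the sketch below shows only the classical PCA baseline that GPCA extends: an eigendecomposition of the contemporaneous sample covariance of simulated Gaussian data. The temporal-dependence and elliptical-distribution extensions that define GPCA are not reproduced here, and all dimensions are illustrative.

import numpy as np

rng = np.random.default_rng(3)

# Simulate 500 observations of 10 correlated (here Gaussian) variables.
n, p, k = 500, 10, 2
loadings = rng.normal(size=(p, k))
X = rng.normal(size=(n, k)) @ loadings.T + 0.3 * rng.normal(size=(n, p))

# Classical PCA: eigendecomposition of the contemporaneous covariance matrix.
# GPCA would additionally model temporal dependence and elliptically
# symmetric (non-Gaussian) distributions; that extension is not shown here.
S = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(S)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()
scores = (X - X.mean(axis=0)) @ eigvecs[:, :k]    # first k principal components
print("variance explained by 2 PCs:", explained[:2].sum())
print("score matrix shape:", scores.shape)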
455
Measurement Invariance and Sensitivity of Delta Fit Indexes in Non-Normal Data: A Monte Carlo Simulation Study. Yu, Meixi. 01 January 2024.
The concept of measurement invariance is essential in ensuring that psychological and educational tests are interpreted consistently across diverse groups. This dissertation investigated the practical challenges associated with measurement invariance, specifically how measurement invariance delta fit indexes are affected by non-normal data. Non-normal data distributions are common in real-world scenarios, yet many statistical methods and measurement invariance delta fit indexes are based on the assumption of normally distributed data, which raises concerns about the accuracy and reliability of conclusions drawn from such analyses. The primary objective of this research is to examine how commonly used delta fit indexes of measurement invariance respond under conditions of non-normality. The present research builds upon Cao and Liang's (2022a) study to test the sensitivities of a series of delta fit indexes, and further scrutinizes the role of non-normal data distributions. A series of simulation studies was conducted in which data sets with varying degrees of skewness and kurtosis were generated. These data sets were then examined by multi-group confirmatory factor analysis (MGCFA) using the Satorra-Bentler scaled chi-square difference test, a method specifically designed to adjust for non-normality. The performance of delta fit indexes such as the Delta Comparative Fit Index (∆CFI), Delta Standardized Root Mean Square Residual (∆SRMR), and Delta Root Mean Square Error of Approximation (∆RMSEA) was assessed. The findings have significant implications for professionals and scholars in psychology and education: they provide constructive information on key measurement-related aspects of research and practice in these fields, contributing to the broader discussion on measurement invariance by highlighting challenges and offering solutions for assessing model fit in non-normal data scenarios.
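The sketch below is not the study's MGCFA/Satorra-Bentler analysis; it only illustrates one way a simulation condition of this kind might generate one-factor indicator data with skewed, heavy-tailed errors and check the achieved skewness and kurtosis. The sample size, loading, and chi-square error distribution are assumptions.

import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(4)
n, n_items, loading = 500, 6, 0.7

# One-factor model; errors drawn from a standardized chi-square to induce
# positive skew and excess kurtosis (one possible non-normality condition).
factor = rng.normal(size=(n, 1))
errors = (rng.chisquare(df=3, size=(n, n_items)) - 3) / np.sqrt(2 * 3)
X = loading * factor + np.sqrt(1 - loading**2) * errors

print("item skewness:", np.round(skew(X, axis=0), 2))
print("item excess kurtosis:", np.round(kurtosis(X, axis=0), 2))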
456
Probabilistic Post-Liquefaction Residual Shear Strength Analyses of Cohesionless Soil Deposits: Application to the Kocaeli (1999) and Duzce (1999) Earthquakes. Lumbantoruan, Partahi Mamora Halomoan. 31 October 2005.
Liquefaction of granular soil deposits can have extremely detrimental effects on the stability of embankment dams, natural soil slopes, and mine tailings. The residual, or liquefied, shear strength of the liquefiable soils is a very important parameter when evaluating the stability and deformation of level and sloping ground. Current procedures for estimating the liquefied shear strength are based on extensive laboratory testing programs or on the back-analysis of failures where liquefaction was involved and in-situ testing data were available. All available procedures use deterministic methods for estimating and selecting the liquefied shear strength. Over the past decade, there has been an increasing trend toward analyzing geotechnical problems using probability and reliability. This study presents procedures for assessing the liquefied shear strength of cohesionless soil deposits within a risk-based framework. Probabilistic slope stability procedures using reliability methods and Monte Carlo simulations are developed to incorporate uncertainties associated with geometrical and material parameters. The probabilistic methods are applied to flow liquefaction case histories from the 1999 Kocaeli and Duzce, Turkey earthquakes, where extensive liquefaction was observed. The methods presented should aid in making better decisions about the design and rehabilitation of structures constructed of or atop liquefiable soil deposits. / Master of Science
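A minimal sketch of the risk-based idea, using an infinite-slope style factor of safety with an assumed lognormal residual strength and a normal driving stress; the model and distributions are illustrative assumptions, not the study's calibrated inputs.

import numpy as np

rng = np.random.default_rng(5)
N = 200_000

# Illustrative random variables: residual (liquefied) shear strength (kPa),
# lognormal; driving shear stress (kPa), normal.
strength = rng.lognormal(mean=np.log(20.0), sigma=0.4, size=N)
stress = rng.normal(loc=18.0, scale=3.0, size=N)

fs = strength / stress                        # simple factor of safety
p_failure = np.mean(fs < 1.0)
beta = (fs.mean() - 1.0) / fs.std()           # crude reliability index

print(f"P(FS < 1) = {p_failure:.3f}, reliability index ~ {beta:.2f}")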
457
Scaling of Steady States in a Simple Driven Three-State Lattice Gas. Thies, Michael. 15 September 1998.
Phase segregated states in a simple three-state stochastic lattice gas are investigated. A two-dimensional finite lattice with periodic boundary conditions is filled with one hole and two oppositely "charged" species of particles, subject to an excluded volume constraint. Starting from a completely disordered initial configuration, a sufficiently large external "electric" field E induces phase segregation by separating the charges into two strips and "trapping" the hole at an interface between them. Focusing on the steady state, the scaling properties of an appropriate order parameter, depending on drive and system size, are investigated by mean-field theory and Monte Carlo methods. Density profiles of the two interfaces in the ordered system are studied with the help of Monte Carlo simulations and are found to scale in the field-dependent variable Ε = 2 tanh(E/2), for E ≲ 0.8. For larger values of E, independent approximations of the interfacial profiles, obtained within the framework of mean-field theory, exhibit significant deviations from the Monte Carlo data. Interestingly, the deviations can be reduced significantly by a slight modification of the mean-field theory. / Master of Science
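A schematic Monte Carlo move for a driven lattice gas of this general type is sketched below: the single hole exchanges with a randomly chosen neighbour, and the exchange is accepted with a Metropolis rate biased by the work the field does on the displaced charge. The lattice size, drive, update rule, and run length are simplifications, not the thesis's actual dynamics or parameters.

import numpy as np

rng = np.random.default_rng(6)
L, E, SWEEPS = 20, 0.5, 1000            # illustrative lattice size and drive

# Three states per site: +1 and -1 charges and a single hole (0).
lattice = rng.permutation([0] + [1] * (L * L // 2) + [-1] * (L * L // 2 - 1))
lattice = lattice.reshape(L, L)
hole = tuple(np.argwhere(lattice == 0)[0])

def mc_step(hole):
    """Exchange the hole with a random neighbour, biased by the drive E."""
    dy, dx = [(1, 0), (-1, 0), (0, 1), (0, -1)][rng.integers(4)]
    nbr = ((hole[0] + dy) % L, (hole[1] + dx) % L)
    q = lattice[nbr]
    # Work done by the field (along the row direction) when the charge q
    # moves by -dy into the hole.
    work = q * E * (-dy)
    if np.log(rng.uniform()) < min(0.0, work):      # Metropolis acceptance
        lattice[hole], lattice[nbr] = q, 0
        return nbr
    return hole

for _ in range(SWEEPS * L * L):
    hole = mc_step(hole)

# Net charge per row along the field direction; in the segregated regime the
# two species should concentrate in oppositely charged strips.
print(np.round(lattice.sum(axis=1) / L, 2))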
458
A probabilistic method for the operation of three-phase unbalanced active distribution networks. Mokryani, Geev; Majumdar, A.; Pal, B.C. 25 January 2016.
This paper proposes a probabilistic multi-objective optimization method for the operation of three-phase distribution networks incorporating active network management (ANM) schemes, including coordinated voltage control and adaptive power factor control. The proposed probabilistic method incorporates detailed modelling of three-phase distribution network components and considers different operational objectives. The method simultaneously minimizes the total energy losses of the lines, from the point of view of distribution network operators (DNOs), and maximizes the energy generated by photovoltaic (PV) cells, considering ANM schemes and network constraints. Uncertainties related to the intermittent generation of PVs and to load demands are modelled by probability density functions (PDFs). A Monte Carlo simulation method is employed to sample from the generated PDFs. The problem is solved using an ɛ-constraint approach, and a fuzzy satisfying method is used to select the best solution from the Pareto optimal set. The effectiveness of the proposed probabilistic method is demonstrated on the IEEE 13- and 34-bus test feeders.
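The following sketch shows only the uncertainty-modelling step described above, with assumed PDFs (a Beta distribution for PV output and a normal distribution for load) propagated through a toy feeder-loss expression by Monte Carlo sampling; it is not the paper's three-phase optimization, and all parameters are invented.

import numpy as np

rng = np.random.default_rng(7)
N = 50_000

# Assumed PDFs (illustrative parameters, not fitted values from the paper).
pv_capacity_kw = 400.0
pv_output = pv_capacity_kw * rng.beta(a=2.0, b=2.5, size=N)     # kW
load = rng.normal(loc=600.0, scale=80.0, size=N)                # kW

# Toy single-feeder loss proxy: losses grow with the square of the net power
# imported through the feeder head (I^2 R behaviour).
net_import = np.clip(load - pv_output, 0.0, None)
losses = 1e-4 * net_import**2

print("expected losses (kW):", round(float(losses.mean()), 2))
print("95th percentile of losses (kW):", round(float(np.percentile(losses, 95)), 2))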
459
[en] PROBABILISTIC LOAD FLOW VIA MONTE CARLO SIMULATION AND CROSS-ENTROPY METHOD / [pt] FLUXO DE POTÊNCIA PROBABILÍSTICO VIA SIMULAÇÃO MONTE CARLO E MÉTODO DA ENTROPIA CRUZADA. ANDRE MILHORANCE DE CASTRO. 12 February 2019.
[pt] Em planejamento e operação de sistemas de energia elétrica, é necessário realizar diversas avaliações utilizando o algoritmo de fluxo de potência, para obter e monitorar o ponto de operação da rede em estudo. Em sua utilização determinística, devem ser especificados valores de geração e níveis de carga por barra, bem como considerar uma configuração especifica da rede elétrica. Existe, porém, uma restrição evidente em se trabalhar com algoritmo de fluxo de potência determinístico: não há qualquer percepção do impacto gerado por incertezas nas variáveis de entrada que o algoritmo utiliza. O algoritmo de fluxo de potência probabilístico (FPP) visa extrapolar as limitações impostas pelo uso da ferramenta convencional determinística, permitindo a consideração das incertezas de entrada. Obtém-se maior sensibilidade na avaliação dos resultados, visto que possíveis regiões de operação são mais claramente examinadas. Consequentemente, estima-se o risco do sistema funcionar fora de suas condições operativas nominais. Essa dissertação propõe uma metodologia baseada na simulação Monte Carlo (SMC) utilizando técnicas de amostragem por importância via o método de entropia cruzada. Índices de risco para eventos selecionados (e.g., sobrecargas em equipamentos de transmissão) são avaliados, mantendo-se a precisão e flexibilidade permitidas pela SMC convencional, porém em tempo computacional muito reduzido. Ao contrário das técnicas analíticas concebidas para solução do FPP, que visam primordialmente à elaboração de curvas de densidade de probabilidade para as variáveis de saída (fluxos, etc.) e sempre necessitam ter a precisão obtida comparada à SMC, o método proposto avalia somente as áreas das caudas dessas densidades, obtendo resultados com maior exatidão nas regiões de interesse do ponto de vista do risco operativo. O método proposto é aplicado nos sistemas IEEE 14 barras, IEEE RTS e IEEE 118 barras, sendo os resultados obtidos amplamente discutidos. Em todos os casos, há claros ganhos de desempenho
computacional, mantendo-se a precisão, quando comparados à SMC convencional. As possíveis aplicações do método e suas derivações futuras também fazem parte da dissertação. / [en] In planning and operation of electric energy systems, it is necessary to perform several evaluations using the power flow algorithm to obtain and monitor the operating point of the network under study. In its deterministic use, generation values and load levels per bus must be specified, as well as a specific configuration of the power network. There is, however, an obvious constraint in running a deterministic power flow tool: there is no perception of the impact produced by uncertainties in the input variables used by the algorithm. The probabilistic load flow (PLF) algorithm aims to overcome the limitations imposed by the deterministic conventional tool, allowing input uncertainties to be considered. Greater sensitivity is obtained in the evaluation of results, as possible regions of operation are more clearly examined. Consequently, the risk of the system operating outside its nominal conditions is duly estimated. This dissertation proposes a methodology based on Monte Carlo simulation (MCS) using importance sampling techniques via the cross-entropy method. Risk indices for selected events (e.g., overloads on transmission equipment) are evaluated, keeping the accuracy and flexibility afforded by conventional MCS, but in much less computational time. Unlike the analytical techniques devised for the PLF solution, which primarily aim at constructing probability density curves for the output variables (flows, etc.) and always need their accuracy compared against MCS, the proposed method evaluates only the tail areas of these densities, obtaining more accurate results in the regions of interest from the operational risk point of view. The proposed method is applied to the IEEE 14-bus, IEEE RTS, and IEEE 118-bus systems, and the results are discussed extensively. In all cases, there are clear gains in computational performance while maintaining accuracy compared with conventional MCS. Possible applications of the method and future developments are also part of the dissertation.
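To illustrate the cross-entropy idea on a deliberately simplified problem (not the dissertation's network models or risk indices), the sketch below estimates the small probability that a linearized line flow of three Gaussian bus injections exceeds its limit: cross-entropy iterations tilt the sampling means toward the overload region, and an importance-sampling estimate is then formed with likelihood ratios. All coefficients and limits are invented.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(8)

# Simplified problem: the line flow is a fixed linear combination of three
# Gaussian bus injections (PTDF-like coefficients); overload if flow > LIMIT.
coef = np.array([0.6, 0.3, 0.5])
mu0 = np.array([100.0, 80.0, 60.0])
sigma = np.array([15.0, 12.0, 10.0])
LIMIT = 190.0

def flow(x):
    return x @ coef

def log_ratio(x, mu):
    """log f(x; mu0) - log f(x; mu) for independent Gaussians, same sigma."""
    return (norm.logpdf(x, mu0, sigma) - norm.logpdf(x, mu, sigma)).sum(axis=1)

# Cross-entropy iterations: tilt the sampling means toward the overload set.
mu, rho, n = mu0.copy(), 0.1, 5_000
for _ in range(10):
    x = rng.normal(mu, sigma, size=(n, 3))
    s = flow(x)
    gamma = min(LIMIT, np.quantile(s, 1 - rho))         # elite threshold
    elite = s >= gamma
    w = np.exp(log_ratio(x[elite], mu))                 # likelihood ratios
    mu = (w[:, None] * x[elite]).sum(axis=0) / w.sum()  # weighted mean update
    if gamma >= LIMIT:
        break

# Final importance-sampling estimate of the overload probability.
x = rng.normal(mu, sigma, size=(200_000, 3))
hit = flow(x) > LIMIT
p_hat = np.exp(log_ratio(x[hit], mu)).sum() / x.shape[0]
print("estimated overload probability:", p_hat)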
460
Nanoscale pattern formation on ion-sputtered surfaces / Musterbildung auf der Nanometerskala an ion-gesputterten Oberflächen. Yasseri, Taha. 21 January 2010.
No description available.