1

Development of a fast Monte Carlo code for dose calculation in treatment planning and feasibility study of high contrast portal imaging

Jabbari, Keivan, January 1900 (has links)
Thesis (Ph.D.). / Written for the Dept. of Physics. Title from title page of PDF (viewed 2009/11/06). Includes bibliographical references.
2

Optimisation-based verification process of obstacle avoidance systems for unmanned vehicles

Thedchanamoorthy, Sivaranjini January 2014 (has links)
This thesis deals with safety verification analysis of collision avoidance systems for unmanned vehicles. The safety of the vehicle depends on the collision avoidance algorithms and the associated control laws, and it must be proven that both function correctly in all nominal conditions, under various failure conditions, and in the presence of possible variations in the vehicle and its operational environment. The exhaustive-search-based approaches in wide current use are not suitable for safety analysis of autonomous vehicles because of the large number of possible variations and the complexity of the algorithms and systems. To address this, a new optimisation-based verification method is developed for collision avoidance systems. The proposed method formulates the worst-case analysis problem arising in the verification of collision avoidance systems as an optimisation problem and employs optimisation algorithms to search automatically for the worst cases. The minimum distance to the obstacle during the collision avoidance manoeuvre is taken as the objective function, and a realistic simulation comprising the detailed vehicle dynamics, the operational environment, the collision avoidance algorithm and the low-level control laws is embedded in the optimisation process. This enables the verification process to take into account parameter variations in the vehicle, changes in the environment, uncertainties in sensors and, in particular, the mismatch between the model used to develop the collision avoidance algorithms and the real vehicle. The resulting simulation-based optimisation problem is shown to be non-convex, with potentially many local optima. To illustrate and investigate the proposed verification process, the potential field method and a decision-making collision avoidance method are chosen as candidate obstacle avoidance techniques for the verification study. Five benchmark case studies are investigated in this thesis: static obstacle avoidance for a simple unicycle robot, moving obstacle avoidance for a Pioneer 3DX robot, and a six-degrees-of-freedom fixed-wing Unmanned Aerial Vehicle with static and moving collision avoidance algorithms. It is shown that although a local method for nonlinear optimisation is quite efficient, it is unable to find the most dangerous situation. The results show that, among all the global optimisation methods investigated, the DIviding RECTangles (DIRECT) method provides the most promising performance for verification of collision avoidance functions, in terms of its guaranteed capability to find worst-case scenarios.
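The worst-case search described above can be sketched compactly. The following is a hypothetical toy, not the thesis code: a unicycle robot with an invented potential-field steering law is simulated inside the objective function, and SciPy's DIRECT implementation (`scipy.optimize.direct`, available in SciPy 1.9+) searches a box of uncertain parameters for the run that minimises the closest approach to the obstacle. All dynamics, gains and bounds are assumptions for illustration.

```python
# Hypothetical sketch: worst-case search for a collision-avoidance law.
# A toy unicycle steers away from a static obstacle with a potential-field
# heading command; DIRECT searches the uncertainty box for the parameter
# combination that minimises the closest approach to the obstacle.
import numpy as np
from scipy.optimize import direct  # SciPy >= 1.9

def min_separation(p):
    heading0, obs_dx, sensor_bias = p        # uncertain parameters
    dt, x, y, th, v = 0.05, 0.0, 0.0, heading0, 1.0
    ox, oy = 5.0 + obs_dx, 0.0               # perturbed obstacle position
    d_min = np.inf
    for _ in range(400):                     # 20 s embedded simulation
        dx, dy = ox - x, oy - y
        d = np.hypot(dx, dy)
        d_min = min(d_min, d)
        # potential-field steering: repelled from obstacle, drawn to +x
        repulse = np.arctan2(dy, dx) + np.pi  # direction away from obstacle
        attract = 0.0                         # goal direction along +x axis
        w = 1.0 / max(d - 0.5, 0.1)           # repulsion grows near obstacle
        th_cmd = (w * repulse + attract) / (w + 1.0) + sensor_bias
        th += np.clip(th_cmd - th, -0.5, 0.5) * dt * 10  # rate-limited turn
        x += v * np.cos(th) * dt
        y += v * np.sin(th) * dt
    return d_min                              # objective to be minimised

bounds = [(-0.3, 0.3), (-1.0, 1.0), (-0.1, 0.1)]
res = direct(min_separation, bounds, maxfun=2000)
print(f"worst case at {res.x}, closest approach {res.fun:.3f} m")
```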
3

Metody výpočtu VaR pro tržní a kreditní rizika / Methods of the calculation of Value at Risk for the market and credit risks

Štolc, Zdeněk January 2008 (has links)
This thesis focuses on a theoretical explication of the basic methods for calculating Value at Risk for market and credit risk. For market risk, the variance-covariance method, historical simulation and Monte Carlo simulation are developed in detail, above all for nonlinear portfolios. For each method the assumptions underlying its application are highlighted, and the methods are compared. For credit risk, a theoretical description of the CreditMetrics, CreditRisk+ and KMV models is given. The analytical part concerns the quantification of Value at Risk for two portfolios. On a nonlinear currency portfolio, the particular assumptions of the variance-covariance method and Monte Carlo simulation are tested, and Value at Risk is then calculated by these methods. Credit Value at Risk is calculated for a portfolio of US corporate bonds with the help of the CreditMetrics model.
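For readers unfamiliar with the two market-risk methods compared above, the sketch below contrasts them on an invented two-currency linear portfolio; the positions, volatilities and correlation are assumptions, not data from the thesis.

```python
# Illustrative sketch (not from the thesis): 1-day 99% VaR for a small
# linear currency portfolio by the variance-covariance method and by
# Monte Carlo simulation.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
w = np.array([1.0e6, -0.5e6])                 # EUR and GBP exposures (USD)
vol = np.array([0.006, 0.007])                # daily return volatilities
corr = np.array([[1.0, 0.6], [0.6, 1.0]])
cov = np.outer(vol, vol) * corr

# variance-covariance: portfolio P&L is normal with std sqrt(w' C w)
sigma_p = np.sqrt(w @ cov @ w)
var_parametric = norm.ppf(0.99) * sigma_p

# Monte Carlo: simulate joint returns, take the 1% loss quantile
returns = rng.multivariate_normal(np.zeros(2), cov, size=100_000)
pnl = returns @ w
var_mc = -np.quantile(pnl, 0.01)

print(f"parametric VaR: {var_parametric:,.0f}  MC VaR: {var_mc:,.0f}")
```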
4

Self-assembly of two-dimensional convex and nonconvex colloidal platelets

Pakalidou, Nikoletta January 2017 (has links)
One of the most promising routes to create advanced materials is self-assembly. Self-assembly refers to the self-organisation of building blocks to form ordered structures. As self-assembled materials inherit the properties of their basic building blocks, it is possible to engineer the properties of the materials by tailoring the properties of the building blocks. In order to create mesoscale materials, the self-assembly of molecular building blocks of different sizes and interactions is important. Mesoscopic materials can be obtained by using larger building blocks such as nano- and colloidal particles. Colloidal particles are particularly attractive as building blocks because it is possible to design interparticle interactions by controlling both the chemistry of the particles' surface and the properties of the solvent in which the particles are immersed. The self-assembly of spherical colloidal particles has been widely reported in the literature. However, advances in experimental techniques to produce particles with different shapes and sizes have opened new opportunities to create more complex structures that cannot be formed using spherical particles. Indeed, the particles' shape and the effective interactions between them dictate the spatial arrangement and micro-structure of the system, which can be engineered to produce functional materials for a wide range of applications. The driving forces determining the self-assembly of colloidal particles can be modified by external influences such as geometrical confinement and electromagnetic forces. Geometrical confinement, for example, has been used to design quasi-two-dimensional materials such as multi-layered structures of spheres, dimers, rods and spherical caps, and monolayers of platelets with various geometries and symmetries. In this dissertation, we present three computer simulation studies, using Monte Carlo and Molecular Dynamics simulations, of the self-assembly of monolayers of colloidal platelets with different shapes confined in two dimensions. These particles were selected because of recent experiments on colloidal particles with similar shapes. All the particle models are represented by planar polygons, and three different effects on their self-assembly are analysed: (a) the curvature of the particles' vertices; (b) the curvature of the particles' edges; and finally (c) the addition of functional groups on the particles' surface. These studies aim to demonstrate that subtle changes in particle shape can be used to engineer complex patterns for the fabrication of advanced materials. Monte Carlo simulations are performed to study the self-assembly of colloidal platelets with rounded corners and 4-, 5-, and 6-fold symmetries. Square platelets display a rich phase behaviour that spans disorder-order and order-order phase transitions. Surprisingly, the disk-like shape of the pentagons and hexagons prevents the complete crystallisation of these systems, even at high pressure. For square platelets, analysis of compression and expansion runs reveals a hysteresis gap, and the thermodynamic technique known as the direct coexistence method is used to determine the order-order transition point accurately. Further, unexpected results are obtained from Molecular Dynamics simulations of platelets with 3-, 4-, 5-, and 6-fold symmetries when all the sides of each polygon are curved.
Macroscopic chiral symmetry breaking is observed for platelets with 4- and 6-fold symmetries, and for the first time a rule is proposed to explain when these chiral structures can form driven only by packing effects. This rule is also verified for platelets with the same curved sides when functional chains are tethered to either the vertices or the sides. Indeed, square platelets with curved sides confined in two dimensions can form chiral structures at medium densities when flexible chains are tethered to either vertices or sides. Triangular platelets with curved sides can form chiral structures only when the chains are tethered to the corners, since the chains undergo a one-handed rotation to sterically protect one side. When the chains are symmetrically tethered to the sides, local chiral symmetry breaking is observed, as both the left-hand and right-hand sides of each vertex are sterically protected, giving equal probability of rotation in the clockwise and anticlockwise directions.
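A minimal sketch of the kind of simulation the first study rests on: NVT Metropolis Monte Carlo for hard squares in a periodic box, with a separating-axis overlap test for convex polygons. The rounded corners and curved edges that drive the thesis results are deliberately omitted, and all parameters are invented.

```python
# Minimal sketch, not the thesis code: NVT Monte Carlo for hard squares in
# a periodic box. Trial translations/rotations are accepted only if the
# platelet overlaps no neighbour (separating-axis test).
import numpy as np

rng = np.random.default_rng(1)
L, N, half = 12.0, 36, 0.5                    # box, particle count, half-side

def vertices(pos, ang):
    c, s = np.cos(ang), np.sin(ang)
    base = np.array([[1, 1], [-1, 1], [-1, -1], [1, -1]]) * half
    return pos + base @ np.array([[c, s], [-s, c]])

def overlap(va, vb):
    # separating axis theorem for two convex polygons
    for poly in (va, vb):
        edges = np.roll(poly, -1, axis=0) - poly
        normals = np.stack([-edges[:, 1], edges[:, 0]], axis=1)
        for n in normals:
            pa, pb = va @ n, vb @ n
            if pa.max() < pb.min() or pb.max() < pa.min():
                return False                  # found a separating axis
    return True

# start from a dilute square lattice (no initial overlaps)
side = int(np.ceil(np.sqrt(N)))
pos = np.array([[i, j] for i in range(side) for j in range(side)][:N],
               float) * (L / side) + 0.5
ang = np.zeros(N)

accepted = 0
for sweep in range(200):
    for i in rng.permutation(N):
        p_new = (pos[i] + rng.uniform(-0.15, 0.15, 2)) % L
        a_new = ang[i] + rng.uniform(-0.1, 0.1)
        vi = vertices(p_new, a_new)
        ok = True
        for j in range(N):
            if j == i:
                continue
            # nearest-image displacement for the periodic box
            d = (pos[j] - p_new + L / 2) % L - L / 2
            if np.hypot(*d) < 2 * np.sqrt(2) * half and \
               overlap(vi, vertices(p_new + d, ang[j])):
                ok = False
                break
        if ok:
            pos[i], ang[i] = p_new, a_new
            accepted += 1
print(f"acceptance ratio: {accepted / (200 * N):.2f}")
```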
5

Aplikace metod rizikového managementu při rozhodování o vstupu společnosti ComAp na trh hybridních systémů / Application of Risk Management Methods on Decision Making Process regarding company ComAp entering hybrid systems market

Král, Lukáš January 2012 (has links)
Decision making is one of the primary management activities, sometimes even referred to as the core of management. In order to survive in today's turbulent and competitive environment, a company places ever greater demands on the managers who steer it through their decisions. Practice shows that the risk involved in these decisions is often underestimated, in many cases even ignored, threatening the prosperity of the business. This unfavourable situation was the main reason the author of this work decided to apply risk management methods to a decision problem in practice. For this purpose, the author cooperated with a company called ComAp, which is currently considering expanding its focus and eventually entering new markets. The aim of this study is to assess the possibility of ComAp entering the hybrid systems market and, using risk management methods, to recommend a suitable alternative. Another objective of this work is to evaluate the contribution of the Monte Carlo method to solving the problem and the possible use of this tool at ComAp in the future.
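The thesis itself is not reproduced here, but the flavour of a Monte Carlo risk analysis of a market-entry decision can be sketched as follows; every figure (investment, market size, share, margin) is invented for illustration and has nothing to do with ComAp's actual case.

```python
# Hedged illustration with invented numbers: Monte Carlo risk analysis of
# a market-entry decision. NPV is simulated under uncertain market size,
# market share and margin; the decision criterion weighs expected NPV
# against the probability of loss.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
invest = 2.0e6                                  # up-front investment
market = rng.lognormal(mean=np.log(10e6), sigma=0.4, size=n)  # market size
share = rng.triangular(0.02, 0.05, 0.10, size=n)              # market share
margin = rng.normal(0.25, 0.05, size=n).clip(0.05, 0.45)      # profit margin

cash = market * share * margin                  # annual cash flow
years, rate = 5, 0.10
annuity = (1 - (1 + rate) ** -years) / rate     # 5-year discount factor
npv = cash * annuity - invest

print(f"E[NPV] = {npv.mean():,.0f}")
print(f"P(NPV < 0) = {(npv < 0).mean():.1%}")
print(f"5% worst-case NPV = {np.quantile(npv, 0.05):,.0f}")
```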
6

MCAC - Monte Carlo Ant Colony: um novo algoritmo estocástico de agrupamento de dados / MCAC - Monte Carlo Ant Colony: a new stochastic data clustering algorithm

AGUIAR, José Domingos Albuquerque 29 February 2008 (has links)
In this work we present a new data clustering algorithm based on the social behaviour of ants, which applies Monte Carlo simulation to select the maximum path length of the ants. We compare the performance of the new method with the popular k-means and with another algorithm also inspired by social ant behaviour. For the comparative study we employed three real-world data sets, three deterministic artificial data sets and two randomly generated data sets, eight data sets in total. We find that the new algorithm outperforms the others in all studied cases but one. We also address the question of the right number of groups in a particular data set; our results show that the proposed algorithm yields a good estimate of the number of groups present in the data set.
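The MCAC algorithm itself is not given in the abstract, so the sketch below shows the classic Lumer-Faieta ant-clustering rule on which ant-based clustering methods build, with a geometrically sampled hop length standing in, as an assumption, for MCAC's Monte Carlo selection of the ants' maximum path length. All constants are illustrative.

```python
# Sketch of a classic ant-based clustering rule (Lumer-Faieta); this is
# NOT the thesis algorithm. Ants walk a toroidal grid, picking up items
# that sit among dissimilar neighbours and dropping them among similar
# ones; the hop length of each move is Monte Carlo-sampled (assumption).
import numpy as np

rng = np.random.default_rng(3)
G, n_items, n_ants, alpha, k1, k2 = 30, 120, 10, 0.5, 0.1, 0.15

# two Gaussian blobs as toy data
data = np.vstack([rng.normal(0, 0.3, (60, 2)), rng.normal(2, 0.3, (60, 2))])

grid = {}                                      # cell -> item id
for i in range(n_items):
    while True:
        c = tuple(rng.integers(0, G, 2))
        if c not in grid:
            grid[c] = i
            break
ants = [{'xy': rng.integers(0, G, 2), 'load': None} for _ in range(n_ants)]

def local_density(i, cell):
    """Average similarity of item i to items in its 3x3 neighbourhood."""
    acc = 0.0
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            j = grid.get(((cell[0] + dx) % G, (cell[1] + dy) % G))
            if j is not None and j != i:
                acc += 1 - np.linalg.norm(data[i] - data[j]) / alpha
    return max(acc / 8.0, 0.0)

for step in range(200_000):
    ant = ants[rng.integers(n_ants)]
    hop = rng.geometric(0.5)                   # MC-sampled step length
    ant['xy'] = (ant['xy'] + rng.integers(-hop, hop + 1, 2)) % G
    cell = tuple(ant['xy'])
    if ant['load'] is None and cell in grid:
        i = grid[cell]
        if rng.random() < (k1 / (k1 + local_density(i, cell))) ** 2:
            ant['load'] = i                    # pick up a misplaced item
            del grid[cell]
    elif ant['load'] is not None and cell not in grid:
        f = local_density(ant['load'], cell)
        if rng.random() < (f / (k2 + f)) ** 2:
            grid[cell] = ant['load']           # drop among similar items
            ant['load'] = None

carried = sum(a['load'] is not None for a in ants)
print(f"{len(grid)} items on grid, {carried} still carried")
```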
7

On The Non-linear Vibration And Mistuning Identification Of Bladed Disks

Yumer, Mehmet Ersin 01 January 2010 (has links) (PDF)
Forced response analysis of bladed disk assemblies plays a vital role in rotor blade design and has been drawing a great deal of attention from both the research community and the engine industry for more than half a century. However, because of the phenomenon called 'mistuning', which destroys the cyclic symmetry of a rotor, several difficulties attend forced response analysis, two of which are addressed in this thesis: efficient non-linear forced response analysis of mistuned bladed disks, and mistuning identification. On the non-linear analysis side, a new solution approach is proposed for studying the combined effect of non-linearity and mistuning, a combination that is relatively recent in this research area and is generally handled with methods whose convergence and accuracy depend strongly on the number of degrees of freedom to which non-linear elements are attached. The proposed approach predicts the non-linear forced response of mistuned bladed disk assemblies for any type of non-linearity. In this thesis, special attention is given to friction contact modelling, the most common type of non-linearity found in bladed disk assemblies; a friction element that allows normal load variation and separation of the contact interface in three-dimensional space is utilised. Moreover, the analysis is carried out in the modal domain, where the differential equations of motion are converted into a set of non-linear algebraic equations using the harmonic balance method and the modal superposition technique. Thus, the number of non-linear equations to be solved is independent of the number of non-linear elements used. On the mistuning identification side, a new method is presented that uses neural networks to assess the unknown mistuning parameters of a given bladed disk assembly from its assembly modes, making it suitable for integrally bladed disks. The method assumes that a tuned mathematical model of the rotor under consideration is readily available, which is always the case for today's realistic bladed disk assemblies. A data set of selected mode shapes and natural frequencies is created from a number of simulations in which the tuned mathematical model is randomly mistuned; a neural network, sized according to the number of modes considered, is then trained with this data set and used to identify the mistuning of the rotor from measured data. On top of these, a new adaptive algorithm is developed for the harmonic balance method, several intentional mistuning patterns are investigated via extensive Monte Carlo simulations, and a new approach to locate, classify and parametrically identify structural non-linearities is introduced.
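The differential-to-algebraic conversion at the heart of the harmonic balance method can be seen on a one-degree-of-freedom example. The sketch below applies a single-harmonic ansatz to a Duffing oscillator; it is not the thesis's modal-domain friction model, and all parameters are invented.

```python
# Minimal harmonic-balance sketch: the one-harmonic ansatz
#   x(t) = A cos(wt) + B sin(wt)
# turns the Duffing equation
#   x'' + c x' + k x + eps x^3 = F cos(wt)
# into two nonlinear algebraic equations in (A, B) -- the same kind of
# conversion the thesis performs, there in the modal domain.
import numpy as np
from scipy.optimize import fsolve

c, k, eps, F = 0.05, 1.0, 0.5, 0.2

def hb_residual(ab, w):
    A, B = ab
    r2 = A * A + B * B
    # the fundamental harmonic of x^3 is (3/4) r^2 (A cos + B sin)
    return [(k - w * w) * A + c * w * B + 0.75 * eps * r2 * A - F,
            (k - w * w) * B - c * w * A + 0.75 * eps * r2 * B]

# continuation in frequency: reuse the last solution as the next guess
ab, resp = np.array([0.1, 0.0]), []
for w in np.linspace(0.5, 2.0, 151):
    ab = fsolve(hb_residual, ab, args=(w,))
    resp.append((w, np.hypot(*ab)))

w_pk, r_pk = max(resp, key=lambda t: t[1])
print(f"peak response {r_pk:.3f} near w = {w_pk:.3f}")
```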
8

ESTIMAÇÃO PROBABILÍSTICA DO NÍVEL DE DISTORÇÃO HARMÔNICA TOTAL DE TENSÃO EM REDES DE DISTRIBUIÇÃO SECUNDÁRIAS COM GERAÇÃO DISTRIBUÍDA FOTOVOLTAICA / Probabilistic estimation of the total harmonic voltage distortion level in secondary distribution networks with photovoltaic distributed generation

SILVA, Elson Natanael Moreira 10 February 2017 (has links)
Harmonic distortion is a power quality problem that constantly affects consumers on the distribution network. Harmonic distortions arise from the presence of so-called harmonic sources: nonlinear equipment, i.e. equipment in which the voltage waveform differs from the current waveform. Such equipment injects harmonic currents into the network, generating distortions in the voltage waveform, and its presence in the electrical network has increased considerably, making systems more vulnerable and prone to quality problems in the supply of electricity to consumers. In addition, it is important to note that, in the current scenario, the generation of electricity from renewable sources connected to the secondary distribution network is increasing rapidly, driven mainly by the shortage and high cost of fossil fuels. In this context, Photovoltaic Distributed Generation (PVDG), which uses the sun as the primary source for electricity generation, is the main renewable generation technology installed in distribution networks. However, PVDG is a potential source of harmonics, because its interface with the AC network is a DC/AC inverter, a highly nonlinear device. Thus, the power quality problems associated with harmonic distortion in distribution networks tend to increase and become very frequent. One of the main indicators of harmonic distortion is the total harmonic distortion of voltage (THD), used by distribution utilities to limit the levels of harmonic distortion present in the electrical network. The literature offers several deterministic techniques to estimate THD. These techniques have the disadvantage of not considering the uncertainties present in the electric network, such as changes in the network configuration, load variation, and the intermittency of the power injected by renewable distributed generation. Therefore, in order to provide a more accurate assessment of harmonic distortion, the main objective of this dissertation is to develop a probabilistic methodology to estimate the THD level in secondary distribution networks, considering the uncertainties present in the network and the PVDG connected along it. The proposed methodology is based on the combination of the following techniques: three-phase harmonic power flow in phase coordinates via the admittance summation method, the point estimate method, and the Gram-Charlier series expansion. The methodology was validated using Monte Carlo simulation and tested on a European secondary distribution network with 906 nodes at 416 V. Results were obtained for two case studies, without PVDG and with PVDG connected, and the following statistics of the nodal THD were estimated: mean value, standard deviation and the 95th percentile. The results show that the probabilistic estimation of THD is more complete, since it captures the variation of THD due to the uncertainties associated with the harmonic sources and the electric network. They also show that connecting PVDG significantly affects the THD levels of the electric network.
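The probabilistic idea can be illustrated with a deliberately crude Monte Carlo stand-in for the thesis's point-estimate/Gram-Charlier machinery. With THD_V = sqrt(sum of V_h^2 for h >= 2) / V_1, random harmonic current injections through an assumed network impedance yield the mean, standard deviation and 95th percentile of the nodal THD. The network data and injection statistics below are invented, not the 906-node feeder.

```python
# Hedged sketch with invented numbers: Monte Carlo estimation of voltage
# THD statistics at one node. Harmonic voltages come from random harmonic
# current injections through a crude fixed harmonic impedance.
import numpy as np

rng = np.random.default_rng(11)
n, V1 = 50_000, 240.0                       # trials, fundamental voltage (V)
harmonics = np.array([3, 5, 7, 9, 11])
Z_h = 0.4 * harmonics                       # assumed inductive impedance (ohm)
I_mean = 8.0 / harmonics                    # mean injection falls with order
I_h = rng.lognormal(np.log(I_mean), 0.35, size=(n, harmonics.size))
V_h = I_h * Z_h                             # harmonic voltage magnitudes

thd = np.sqrt((V_h ** 2).sum(axis=1)) / V1 * 100
print(f"mean THD = {thd.mean():.2f} %")
print(f"std THD  = {thd.std():.2f} %")
print(f"95th pct = {np.percentile(thd, 95):.2f} %")
```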
9

Essays on multivariate generalized Birnbaum-Saunders methods

MARCHANT FUENTES, Carolina Ivonne 31 October 2016 (has links)
In recent decades, univariate Birnbaum-Saunders models have received considerable attention in the literature. These models have been widely studied and applied to fatigue, but they have also been applied to other areas of knowledge. In such areas, it is often necessary to model several variables simultaneously, and if these variables are correlated, individual analyses for each variable can lead to erroneous results. Multivariate regression models are a useful tool of multivariate analysis that takes the correlation between variables into account. In addition, diagnostic analysis is an important aspect to be considered in statistical modeling. Furthermore, multivariate quality control charts are powerful and simple visual tools for determining whether a multivariate process is in or out of control; such a chart shows how several variables jointly affect a process. First, we propose, derive and characterize multivariate generalized logarithmic Birnbaum-Saunders distributions, and we propose new multivariate generalized Birnbaum-Saunders regression models. We estimate their parameters by maximum likelihood through the expectation-maximization (EM) algorithm and carry out a Monte Carlo simulation study to evaluate the performance of the corresponding estimators. We validate the proposed models with a regression analysis of real-world multivariate fatigue data. Second, we conduct a diagnostic analysis for multivariate generalized Birnbaum-Saunders regression models. We consider the Mahalanobis distance as a global influence measure for detecting multivariate outliers and use it to evaluate the adequacy of the distributional assumption; we also consider the local influence method and study how a perturbation may affect the estimation of the model parameters. We implement the obtained results in the R software and illustrate them with real-world multivariate biomaterials data. Third and finally, we develop a robust methodology based on multivariate quality control charts for generalized Birnbaum-Saunders distributions with the Hotelling statistic, using the parametric bootstrap method to obtain the distribution of this statistic. A Monte Carlo simulation study evaluates the proposed methodology and reports its performance in providing early alerts of out-of-control conditions. An illustration with real-world air quality data from Santiago, Chile shows that the proposed methodology can be useful for alerting on episodes of extreme air pollution.
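The chart logic of the third contribution can be sketched generically: a Hotelling T^2 statistic whose control limit is taken from a parametric bootstrap of the fitted in-control model rather than from the usual F-distribution. In the sketch below a lognormal stands in for the generalized Birnbaum-Saunders law, the coordinates are treated as independent in the bootstrap, and all numbers are invented.

```python
# Illustrative sketch, not the thesis methodology: a Hotelling T^2 chart
# with a parametric-bootstrap control limit -- useful when the statistic's
# null distribution is not known in closed form.
import numpy as np

rng = np.random.default_rng(5)
p, m, B, alpha = 3, 100, 4000, 0.0027         # dims, phase-1 size, bootstrap

phase1 = rng.lognormal(0.0, 0.25, size=(m, p))    # in-control reference data
mu, S = phase1.mean(axis=0), np.cov(phase1, rowvar=False)
S_inv = np.linalg.inv(S)

def t2(x):
    d = x - mu
    return float(d @ S_inv @ d)

# parametric bootstrap of the fitted in-control model -> control limit
# (coordinates fitted as independent lognormals; a simplifying assumption)
mu_log = np.log(phase1).mean(axis=0)
sd_log = np.log(phase1).std(axis=0, ddof=1)
boot = np.array([t2(np.exp(rng.normal(mu_log, sd_log))) for _ in range(B)])
ucl = np.quantile(boot, 1 - alpha)

new_obs = rng.lognormal(0.3, 0.25, size=p)        # a shifted observation
print(f"UCL = {ucl:.2f}, T2(new) = {t2(new_obs):.2f}, "
      f"signal: {t2(new_obs) > ucl}")
```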
10

Numerical analysis and multi-precision computational methods applied to the extant problems of Asian option pricing and simulating stable distributions and unit root densities

Cao, Liang January 2014 (has links)
This thesis considers new methods that exploit recent developments in computer technology to address three extant problems in the area of Finance and Econometrics. The problem of Asian option pricing has endured for the last two decades in spite of many attempts to find a robust solution across all parameter values. All recently proposed methods are shown to fail when computations are conducted using standard machine precision because as more and more accuracy is forced upon the problem, round-off error begins to propagate. Using recent methods from numerical analysis based on multi-precision arithmetic, we show using the Mathematica platform that all extant methods have efficacy when computations use sufficient arithmetic precision. This creates the proper framework to compare and contrast the methods based on criteria such as computational speed for a given accuracy. Numerical methods based on a deformation of the Bromwich contour in the Geman-Yor Laplace transform are found to perform best provided the normalized strike price is above a given threshold; otherwise methods based on Euler approximation are preferred. The same methods are applied in two other contexts: the simulation of stable distributions and the computation of unit root densities in Econometrics. The stable densities are all nested in a general function called a Fox H function. The same computational difficulties as above apply when using only double-precision arithmetic but are again solved using higher arithmetic precision. We also consider simulating the densities of infinitely divisible distributions associated with hyperbolic functions. Finally, our methods are applied to unit root densities. Focusing on the two fundamental densities, we show our methods perform favorably against the extant methods of Monte Carlo simulation, the Imhof algorithm and some analytical expressions derived principally by Abadir. Using Mathematica, the main two-dimensional Laplace transform in this context is reduced to a one-dimensional problem.
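The multi-precision remedy the thesis applies (there on the Mathematica platform) can be demonstrated with the mpmath Python library, assuming its `invertlaplace` routine: a transform with a known inverse replaces the Geman-Yor transform, and the working precision is raised to show the effect of arithmetic precision on the inversion.

```python
# Sketch of the multi-precision idea (not the thesis code): numerical
# Laplace-transform inversion under arbitrary-precision arithmetic.
# Contour-based inversion suffers catastrophic cancellation in double
# precision; raising mp.dps restores accuracy. F(s) = 1/(s+1), with known
# inverse exp(-t), stands in for the Geman-Yor transform as a check.
from mpmath import mp, mpf, exp, invertlaplace

F = lambda s: 1 / (s + 1)                 # transform with a known inverse

for dps in (15, 50):                      # ~double precision, then 50 digits
    mp.dps = dps
    t = mpf(2)
    approx = invertlaplace(F, t, method='talbot')
    err = abs(approx - exp(-t))
    print(f"dps={dps:3d}  f(2) ~ {approx}  abs error = {err}")
```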
