1

Incorporating discontinuities in value-at-risk via the Poisson jump diffusion model and variance gamma model

Lee, Brendan Chee-Seng, Banking & Finance, Australian School of Business, UNSW January 2007 (has links)
We utilise several asset pricing models that allow for discontinuities in the returns and volatility time series in order to obtain estimates of Value-at-Risk (VaR). The first class of model mixes a continuous diffusion process with discrete jumps at random points in time (the Poisson Jump Diffusion Model). We also apply a purely discontinuous model whose underlying distribution contains no continuous component at all (the Variance Gamma Model). These models have been shown to have some success in capturing characteristics of return distributions such as leptokurtosis and skewness. Calibrating these models to the returns of an index of Australian stocks (the All Ordinaries Index), we then use the resulting parameters to obtain daily estimates of VaR. To obtain the VaR estimates for the Poisson Jump Diffusion Model and the Variance Gamma Model, we introduce an innovation from option pricing techniques that works with the more tractable characteristic functions of the models. Having obtained a series of VaR estimates, we apply a variety of criteria to assess how each model performs, and also evaluate these models against traditional approaches to calculating VaR, such as that suggested by J.P. Morgan's RiskMetrics. Our results show that whilst the Poisson Jump Diffusion Model proved the most accurate at the 95% VaR level, neither the Poisson Jump Diffusion nor the Variance Gamma model was dominant in the other performance criteria examined. Overall, no model was clearly superior according to all the performance criteria analysed, and it seems that the extra computational time required to calibrate the Poisson Jump Diffusion and Variance Gamma models for VaR estimation does not provide sufficient reward over the simpler approach currently employed by RiskMetrics.
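For readers unfamiliar with the characteristic-function route to VaR, the sketch below shows one standard way of doing it: Gil-Pelaez inversion of the Variance Gamma characteristic function to recover the return CDF, followed by a root search for the 5% quantile. Whether this matches the exact inversion used in the thesis is an assumption, and the parameter values are placeholders rather than the calibrated All Ordinaries estimates.

```python
# Sketch: one-day 95% VaR from the Variance Gamma characteristic function via
# Gil-Pelaez inversion.  Parameter values below are illustrative placeholders,
# not the calibrated All Ordinaries values reported in the thesis.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

mu, sigma, nu, theta = 0.0005, 0.01, 0.2, -0.002   # assumed daily VG parameters

def vg_cf(u):
    """Characteristic function of a VG(sigma, nu, theta) return plus drift mu."""
    return np.exp(1j * u * mu) * (1 - 1j * u * theta * nu
                                  + 0.5 * sigma**2 * nu * u**2) ** (-1.0 / nu)

def return_cdf(x):
    """Gil-Pelaez: F(x) = 1/2 - (1/pi) * int_0^inf Im[exp(-iux) phi(u)] / u du."""
    integrand = lambda u: np.imag(np.exp(-1j * u * x) * vg_cf(u)) / u
    integral, _ = quad(integrand, 1e-8, 2000, limit=400)
    return 0.5 - integral / np.pi

# 95% VaR is (minus) the return level that is undershot with 5% probability.
q05 = brentq(lambda x: return_cdf(x) - 0.05, -0.2, 0.2)
print(f"one-day 95% VaR ~ {-q05:.4%} of portfolio value")
```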
2

Ultra-fast line protection relay algorithm based on a Gamma model of line

Hoxha, Neriton 19 October 2020 (has links) (PDF)
Protective relays for transmission lines play a fundamental role in electrical power systems: they ensure the security and reliability of electricity transmission from the generators to the final consumers. The objective of a protective relay is to provide a corrective action as quickly as possible when an abnormal condition of the power system is detected. A fast response limits the stress on the equipment of the power system and on the consumers, ensures the safety of people, improves power quality and helps maintain the stability of the power system. Protective relaying systems have evolved considerably since their first implementations in the 1900s. However, electrical power systems are in constant evolution, and keeping protective relaying systems reliable becomes more and more challenging. Their three main characteristics, security, dependability and speed, must be continuously improved to meet these objectives. Most relay protections implemented today are based on frequency-domain methods, which are intrinsically limited in speed by the phasor estimation of the voltage and current signals. More recent methods based on incremental quantities break this limitation by working directly in the time domain; despite their speed, their dependability is usually limited in order to preserve security. In this work, we propose a time-domain, ultra-fast, non-pilot distance protection based on a Gamma model of the line that improves security, dependability and speed, even for long lines and weak power systems. The protection is composed of a loop selection element, a directional element and a distance element. The target tripping time is 4 ms or less. / Doctorate in Engineering Sciences and Technology / info:eu-repo/semantics/nonPublished
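To make the idea of a time-domain distance element concrete, the sketch below estimates the per-unit fault distance by least squares on the series R-L line equation over a window of voltage and current samples. This is the generic R-L differential-equation algorithm, not the Gamma-model algorithm developed in the thesis; all line constants and sampling choices are illustrative assumptions, and the lumped shunt branch of the Gamma model is omitted.

```python
# Sketch of a time-domain distance element: estimate the per-unit fault
# distance d from sampled relay voltage v and current i by least squares on
# v[k] ~= d * (R_line * i[k] + L_line * di/dt[k]).  This is the generic R-L
# differential-equation algorithm; the lumped shunt branch of the Gamma line
# model used in the thesis is deliberately ignored here.
import numpy as np

def fault_distance(v, i, dt, R_line, L_line):
    """Closed-form one-parameter least-squares estimate of the fault distance (p.u.)."""
    di_dt = np.gradient(i, dt)                  # numerical derivative of the current
    x = R_line * i + L_line * di_dt             # voltage drop over the full line length
    return float(np.dot(x, v) / np.dot(x, x))

# Illustrative use with synthetic 1 kHz samples and arbitrary line constants.
dt = 1e-3
t = np.arange(0.0, 0.02, dt)
i = 2000.0 * np.sin(2 * np.pi * 50 * t)         # 50 Hz fault current, A
R_line, L_line = 2.0, 0.05                      # total line resistance (ohm), inductance (H)
d_true = 0.4
v = d_true * (R_line * i + L_line * np.gradient(i, dt))
print(f"estimated fault distance: {fault_distance(v, i, dt, R_line, L_line):.2f} p.u.")
```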
3

Bacteria Growth Modeling using Long-Short-Term-Memory Networks

Shojaee, Ali, B.S. 29 September 2021 (has links)
No description available.
4

Dynamic prediction of repair costs in heavy-duty trucks

Saigiridharan, Lakshidaa January 2020 (has links)
Pricing of repair and maintenance (R&M) contracts is one of the most important processes carried out at Scania. Predictions of repair costs at Scania are currently made with experience-based methods that compute average repair costs for contracts terminated in the recent past, without statistical modelling. This method is difficult to apply to a reference population of rigid Scania trucks. Hence, the purpose of this study is to build suitable statistical models to predict repair costs for four variants of rigid Scania trucks. The study gathers repair data from multiple sources and performs feature selection using the Akaike Information Criterion (AIC) to extract the most significant features that influence repair costs for each truck variant. The study shows that including operational features as predictors could further influence the pricing of contracts. The hurdle Gamma model, which is widely used to handle zero inflation in Generalized Linear Models (GLMs), is fitted to the data, which consist of numerous zero and non-zero values. Due to the inherent hierarchical structure within the data, expressed by individual chassis, a hierarchical hurdle Gamma model is also implemented. These two statistical models are found to perform much better than the experience-based prediction method, as evaluated with the mean absolute error (MAE) and root mean square error (RMSE). A final model comparison is conducted using the AIC to draw conclusions based on the goodness of fit and predictive performance of the two statistical models. On assessing the models with these statistics, the hierarchical hurdle Gamma model was found to produce the best predictions.
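A minimal sketch of the (non-hierarchical) hurdle Gamma idea described above, assuming a design matrix X and a repair-cost vector y prepared elsewhere (names, library choice and toy data are all illustrative, not Scania's pipeline): a logistic model for whether any cost occurs, a log-link Gamma GLM for the positive costs, and the expected cost as their product.

```python
# Minimal hurdle Gamma sketch (not Scania's actual pipeline): a logit model for
# P(cost > 0) combined with a Gamma GLM (log link) fitted to the positive costs.
# The design matrix X and the cost vector y are hypothetical names assumed to be
# prepared elsewhere.
import numpy as np
import statsmodels.api as sm

def fit_hurdle_gamma(X, y):
    positive = y > 0
    logit = sm.Logit(positive.astype(float), X).fit(disp=0)           # hurdle part
    gamma = sm.GLM(y[positive], X[positive],
                   family=sm.families.Gamma(link=sm.families.links.Log())).fit()
    return logit, gamma

def predict_expected_cost(logit, gamma, X_new):
    # E[cost] = P(cost > 0) * E[cost | cost > 0]
    return logit.predict(X_new) * gamma.predict(X_new)

# Toy example with simulated data, just to show the calling convention.
rng = np.random.default_rng(0)
X = sm.add_constant(rng.normal(size=(500, 2)))
y = np.where(rng.random(500) < 0.3, 0.0,
             rng.gamma(shape=2.0, scale=300.0, size=500))
logit, gamma = fit_hurdle_gamma(X, y)
print(predict_expected_cost(logit, gamma, X[:5]))
```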
5

Bayesian Inference in Structural Second-Price Auctions

Wegmann, Bertil January 2011 (has links)
The aim of this thesis is to develop efficient and practically useful Bayesian methods for statistical inference in structural second-price auctions. The models are applied to a carefully collected coin auction dataset with bids and auction-specific characteristics from one thousand Internet auctions on eBay. Bidders are assumed to be risk-neutral and symmetric, and to compete for a single object using the same game-theoretic strategy. A key contribution of the thesis is the derivation of very accurate approximations of the otherwise intractable equilibrium bid functions under different model assumptions. These easily computed and numerically stable approximations are shown to be crucial for statistical inference, where the inverse bid functions typically need to be evaluated several million times. In the first paper, the approximate bid is a linear function of a bidder's signal and a Gaussian common value model is estimated. We find that the publicly available book value and the condition of the auctioned object are important determinants of bidders' valuations, while eBay's detailed seller information is essentially ignored by the bidders. In the second paper, the Gaussian model of the first paper is contrasted with a Gamma model that allows intrinsically non-negative common values. The Gaussian model performs slightly better than the Gamma model on the eBay data, which we attribute to an almost normal, or at least symmetrical, distribution of valuations. The third paper compares the model of the first paper to a directly comparable model for private values. We find many interesting empirical regularities between the models, but no strong and consistent evidence in favor of one model over the other. In the last paper, we consider auctions with both private-value and common-value bidders. The equilibrium bid function is given as the solution to an ordinary differential equation, from which we derive an approximate inverse bid as an explicit function of a given bid. The paper proposes an elaborate model where the probability of being a common-value bidder is a function of covariates at the auction level. The model is estimated by a Metropolis-within-Gibbs algorithm and the results point strongly to an active influx of both private-value and common-value bidders. / At the time of the doctoral defense, the following papers were unpublished and had a status as follows: Paper 1: Epub ahead of print. Paper 2: Manuscript. Paper 3: Manuscript. Paper 4: Manuscript.
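The Metropolis-within-Gibbs scheme mentioned for the last paper can be sketched generically as below; the log posterior, the parameter blocks and the step sizes are placeholders, not the auction model of the thesis, so this only illustrates the update pattern of cycling a random-walk Metropolis step over one block at a time.

```python
# Generic Metropolis-within-Gibbs skeleton (illustrative only; the auction
# likelihood, priors and parameter blocks of the thesis are replaced by a
# placeholder log posterior).  Each coordinate is updated in turn with a
# random-walk Metropolis step, conditional on the current values of the rest.
import numpy as np

def metropolis_within_gibbs(log_post, theta0, step_sizes, n_iter=5000, seed=0):
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float).copy()
    current_lp = log_post(theta)
    draws = np.empty((n_iter, theta.size))
    for it in range(n_iter):
        for j in range(theta.size):                       # one block at a time
            proposal = theta.copy()
            proposal[j] += step_sizes[j] * rng.normal()   # random-walk proposal
            lp_prop = log_post(proposal)
            if np.log(rng.random()) < lp_prop - current_lp:
                theta, current_lp = proposal, lp_prop     # accept, else keep current
        draws[it] = theta
    return draws

# Stand-in posterior (two independent standard normals) to show the interface.
draws = metropolis_within_gibbs(lambda th: -0.5 * np.sum(th**2), [1.0, -1.0], [0.5, 0.5])
print(draws.mean(axis=0))
```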
6

Modelos não-lineares de regressão: alguns aspectos de teoria assintótica / Non-linear regression models: some aspects of asymptotic theory

PRUDENTE, Andréa Andrade 18 March 2009 (has links)
Conselho Nacional de Pesquisa e Desenvolvimento Científico e Tecnológico - CNPq / The main objective of this dissertation is to derive expressions for the second-order biases of the maximum likelihood estimators of the parameters of the Weibull generalized linear model (WGLM), which are useful for defining corrected estimators. In order to reduce the bias of these estimators in finite samples, the bias-correction method introduced by Cox and Snell (1968) is used. The new model adopts a link function which relates the vector of scale parameters of the Weibull distribution to a linear predictor. As a secondary objective, a review of normal non-linear models is also presented, including the method of least squares for estimating the parameters, some asymptotic results, measures of non-linearity and diagnostic techniques, because, in contrast to linear models, the quality and especially the validity of their fits are evaluated not only by means of regression diagnostics but also by the extent of the non-linear behaviour. Finally, a brief description of generalized linear models (GLMs) is given, together with the applicability of the gamma model. Real data sets are analysed to demonstrate the applicability of the proposed models; these analyses were conducted in the R environment for programming, data analysis, and graphics.
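In general terms (a schematic statement only; the dissertation derives the specific bias expressions for the WGLM, which are not reproduced here), the Cox-Snell correction removes the first-order term of the bias of the maximum likelihood estimator:

```latex
% Schematic form of the Cox--Snell (1968) correction; the dissertation derives
% the specific bias function b(.) for the WGLM, which is not reproduced here.
\[
  \mathbb{E}\bigl(\hat{\theta}\bigr) = \theta + \frac{b(\theta)}{n} + O\!\left(n^{-2}\right),
  \qquad
  \tilde{\theta} = \hat{\theta} - \frac{b(\hat{\theta})}{n},
  \qquad
  \mathbb{E}\bigl(\tilde{\theta}\bigr) = \theta + O\!\left(n^{-2}\right).
\]
```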
7

Modélisation statistique et probabiliste du temps inter-véhiculaire aux différents niveaux de trafic / Statistic and probabilistic modeling of time headway variable in different traffic levels

Ha, Duy Hung 11 May 2011 (has links)
Time Headway (TH) is a microscopic variable in traffic flow theory that has been studied since the 1930s. The distribution of this fundamental variable describes the arrival pattern of vehicles in traffic flow, so probabilistic modeling is the main approach used to study TH and to represent driving behaviour. Applications of the variable in traffic engineering are varied and include capacity calculation, microscopic simulation and traffic safety analysis. This dissertation aims at modeling the TH distribution in different contexts. Firstly, short-time and long-time sampling methods are applied to obtain TH samples from two databases (the RN118 national roadway and the A6 motorway). Then, three types of probabilistic TH model are analyzed and classified. An exhaustive comparison between the existing models, and between the corresponding estimation methods, leads to the conclusion that the gamma-GQM is the best TH model in the literature in terms of statistical performance and computational efficiency. An estimation process is also proposed in order to obtain good and stable estimates of the parameters. The TH probabilistic modeling is then extended with six new models; except for two that perform worse, the four other models are statistically equivalent to or better than the gamma-GQM. For practical reasons, the Double Gamma model is selected, alongside the gamma-GQM, as a comparison model to calibrate all TH samples. Three traffic levels are considered: macroscopic, mesoscopic and microscopic. The effects of exogenous factors are also examined; considering these factors at each macroscopic variable level leads to distinguishing two factor types: impeding factors and propulsive factors. Finally, different approaches for TH validation are tested; the proposed "envelope of distributions" approach seems promising for future applications.
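As an illustration of the kind of headway model compared above, the sketch below fits a two-component gamma mixture to headway data by maximum likelihood. Reading the "Double Gamma" model as a mixture of a constrained (following) component and a free-flowing component is an assumption, as are the parameter names and starting values.

```python
# Sketch of a two-component gamma mixture fitted to time headways by maximum
# likelihood.  Reading the "Double Gamma" model as a mixture of a constrained
# (following) component and a free-flowing component is an assumption, as are
# the starting values below.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gamma

def neg_log_lik(params, headways):
    w, k1, s1, k2, s2 = params
    pdf = (w * gamma.pdf(headways, k1, scale=s1)
           + (1.0 - w) * gamma.pdf(headways, k2, scale=s2))
    return -np.sum(np.log(pdf + 1e-300))          # guard against log(0)

def fit_double_gamma(headways):
    x0 = [0.6, 4.0, 0.5, 1.5, 4.0]                # assumed starting values
    bounds = [(0.01, 0.99)] + [(0.1, 50.0), (0.01, 50.0)] * 2
    res = minimize(neg_log_lik, x0, args=(headways,), bounds=bounds, method="L-BFGS-B")
    return res.x                                  # (weight, k1, scale1, k2, scale2)

# Toy use on simulated headways (seconds): short following headways + long free ones.
rng = np.random.default_rng(1)
sample = np.concatenate([rng.gamma(4.0, 0.5, 600), rng.gamma(1.5, 4.0, 400)])
print(fit_double_gamma(sample))
```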
