651 |
How useful are intraday data in Risk Management? : An application of high frequency stock returns of three Nordic Banks to the VaR and ES calculation. Somnicki, Emil; Ostrowski, Krzysztof. January 2010.
The work is focused on the Value at Risk and the Expected Shortfall calculation. We assume the returns to be based on two pillars: the white noise and the stochastic volatility. We assume that the white noise follows the NIG distribution and that the volatility is modeled using the nGARCH, NIG-GARCH, tGARCH and non-parametric methods. We apply the models to the stocks of three banks on the Nordic market. We consider daily and intraday returns at frequencies of 5, 10, 20 and 30 minutes. We calculate the one-step-ahead VaR and ES for the daily and the intraday data. We use the Kupiec test and the Markov test to assess the correctness of the models. We also propose a new concept for improving the daily VaR calculation by using high frequency returns. The results show that intraday data can be used for the one-step-ahead VaR and ES calculation. Comparing the VaR for the end of the following trading day calculated from daily returns with the one computed from high frequency returns shows that using intraday data can improve the VaR outcomes.
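A minimal backtesting sketch of the one-step-ahead VaR idea above, assuming a plain normal-innovation GARCH(1,1) filter rather than the NIG-based specifications of the thesis; the parameter values and simulated return series are illustrative placeholders, not estimates from the Nordic bank data.

import numpy as np
from scipy.stats import norm, chi2

def garch_one_step_var(returns, omega, alpha, beta, p=0.01):
    """One-step-ahead VaR (as a positive number) from a GARCH(1,1) variance
    filter with standard normal innovations and given parameters."""
    sigma2 = np.empty(len(returns) + 1)
    sigma2[0] = returns.var()                    # initialize at sample variance
    for t, r in enumerate(returns):
        sigma2[t + 1] = omega + alpha * r**2 + beta * sigma2[t]
    return -norm.ppf(p) * np.sqrt(sigma2[1:])    # VaR for times 1..len(returns)

def kupiec_pof(returns, var_forecasts, p=0.01):
    """Kupiec proportion-of-failures likelihood-ratio test."""
    exceed = returns < -var_forecasts            # VaR violations
    T, x = len(returns), int(exceed.sum())
    pi = x / T
    lr = -2 * ((T - x) * np.log(1 - p) + x * np.log(p)
               - (T - x) * np.log(1 - pi) - x * np.log(pi))
    return x, 1 - chi2.cdf(lr, df=1)             # violation count and p-value

rng = np.random.default_rng(0)
r = 0.01 * rng.standard_normal(1000)             # placeholder return series
var99 = garch_one_step_var(r[:-1], omega=5e-6, alpha=0.05, beta=0.90)
x, pval = kupiec_pof(r[1:], var99)
print(f"violations: {x}, Kupiec p-value: {pval:.3f}")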
|
652 |
A new approach to pricing real options on swaps : a new solution technique and extension to the non-a.s. finite stopping realm. Chu, Uran. 07 June 2012.
This thesis consists of extensions of results on a perpetual American swaption problem. Companies routinely plan to swap uncertain benefits for uncertain costs in the future for their own benefit. Our work explores the choice of timing policies associated with the swap in the form of an optimal stopping problem. In this thesis, we have shown that the condition given by Hu and Oksendal (1998) to guarantee that the optimal stopping time is a.s. finite is in fact both necessary and sufficient. We have extended the solution to the problem from a region in the parameter space where optimal stopping times are a.s. finite to a region where optimal stopping times are non-a.s. finite, and have calculated the probability of never stopping in this latter region. We have identified the joint distribution of stopping times and stopping locations in both the a.s. and non-a.s. finite stopping cases. We have also derived an integral formula for the inner product of a generalized hyperbolic distribution with the Cauchy distribution. Finally, we have applied our results to a back-end forestry harvesting model in which stochastic costs are assumed to grow exponentially to infinity through time. / Graduation date: 2013
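A hedged illustration of the non-a.s. finite stopping regime in the simplest possible setting, a fixed-threshold rule for an arithmetic Brownian motion rather than the swaption model of the thesis: for X_t = mu*t + sigma*W_t with mu < 0, the probability of never reaching a level b > 0 is 1 - exp(2*mu*b/sigma^2), and a Monte Carlo check reproduces the closed form up to discretization bias.

import numpy as np

# Stop when X_t = mu*t + sigma*W_t first reaches b > 0; with mu < 0 this
# stopping time is non-a.s. finite and P(never stop) = 1 - exp(2*mu*b/sigma^2).
mu, sigma, b = -0.5, 1.0, 1.0
p_never = 1.0 - np.exp(2.0 * mu * b / sigma**2)

rng = np.random.default_rng(1)
n_paths, n_steps, T = 20_000, 4_000, 40.0    # long horizon approximates "never"
dt = T / n_steps
x = np.zeros(n_paths)
hit = np.zeros(n_paths, dtype=bool)
for _ in range(n_steps):                     # simulate all paths step by step
    x += mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
    hit |= x >= b

print(f"closed form  P(never stop): {p_never:.4f}")
print(f"Monte Carlo  P(never stop): {1.0 - hit.mean():.4f}")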
|
653 |
Numerical analysis for random processes and fields and related design problems. Abramowicz, Konrad. January 2011.
In this thesis, we study numerical analysis for random processes and fields. We investigate the behavior of the approximation accuracy for specific linear methods based on a finite number of observations. Furthermore, we propose techniques for optimizing performance of the methods for particular classes of random functions. The thesis consists of an introductory survey of the subject and related theory and four papers (A-D). In Paper A, we study a Hermite spline approximation of quadratic mean continuous and differentiable random processes with an isolated point singularity. We consider a piecewise polynomial approximation combining two different Hermite interpolation splines for the interval adjacent to the singularity point and for the remaining part. For locally stationary random processes, sequences of sampling designs eliminating asymptotically the effect of the singularity are constructed. In Paper B, we focus on approximation of quadratic mean continuous real-valued random fields by a multivariate piecewise linear interpolator based on a finite number of observations placed on a hyperrectangular grid. We extend the concept of local stationarity to random fields, and for fields from this class we provide exact asymptotics for the approximation accuracy. Some asymptotic optimization results are also provided. In Paper C, we investigate numerical approximation of integrals (quadrature) of random functions over the unit hypercube. We study the asymptotics of a stratified Monte Carlo quadrature based on a finite number of randomly chosen observations in strata generated by a hyperrectangular grid. For the locally stationary random fields (introduced in Paper B), we derive exact asymptotic results together with some optimization methods. Moreover, for a certain class of random functions with an isolated singularity, we construct a sequence of designs eliminating the effect of the singularity. In Paper D, we consider a Monte Carlo pricing method for arithmetic Asian options. An estimator is constructed using a piecewise constant approximation of an underlying asset price process. For a wide class of Lévy market models, we provide upper bounds for the discretization error and the variance of the estimator. We construct an algorithm for accurate simulations with controlled discretization and Monte Carlo errors, and obtain estimates of the option price with a predetermined accuracy at a given confidence level. Additionally, for the Black-Scholes model, we optimize the performance of the estimator by using a suitable variance reduction technique.
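As a self-contained illustration of the stratified Monte Carlo quadrature studied in Paper C: one uniform point per cell of a hyperrectangular grid over the unit hypercube, compared against crude Monte Carlo with the same budget. The integrand and grid size are arbitrary demonstration choices; the paper's optimized designs for locally stationary fields go well beyond this.

import numpy as np

def crude_mc(f, d, n, rng):
    """Crude Monte Carlo estimate of the integral of f over [0,1]^d."""
    return f(rng.random((n, d))).mean()

def stratified_mc(f, d, m, rng):
    """Stratified MC: one uniform point in each cell of an m^d grid."""
    grid = np.stack(np.meshgrid(*[np.arange(m)] * d, indexing="ij"), axis=-1)
    cells = grid.reshape(-1, d)                  # lower corners of the m^d cells
    return f((cells + rng.random(cells.shape)) / m).mean()

f = lambda x: np.sin(x.sum(axis=1))              # a smooth test integrand
d, m = 2, 20                                     # 400 strata, 400 points total
rng = np.random.default_rng(2)
est_c = [crude_mc(f, d, m**d, rng) for _ in range(200)]
est_s = [stratified_mc(f, d, m, rng) for _ in range(200)]
print(f"crude MC   std: {np.std(est_c):.2e}")
print(f"stratified std: {np.std(est_s):.2e}")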
|
655 |
Revision Moment for the Retail Decision-Making System. Juszczuk, Agnieszka Beata; Tkacheva, Evgeniya. January 2010.
In this work we address the problems of loan origination decision-making systems. In accordance with the basic principles of the loan origination process, we consider the main rules for estimating a client's parameters, a change-point problem for given data, and a disorder-moment detection problem for real-time observations. In the first part of the work the main principles of the parameter estimation are given. The change-point problem is also considered for a given sample in discrete and continuous time using the maximum likelihood method. In the second part of the work the disorder-moment detection problem for real-time observations is treated as a disorder problem for a non-homogeneous Poisson process. The corresponding optimal stopping problem is reduced to a free-boundary problem with a complete analytical solution for the case when the intensity of defaults increases. Thereafter a scheme for real-time detection of a disorder moment is given.
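A hedged sketch of the given-sample change-point step, assuming the simplest discrete-time case of a single shift in the mean of i.i.d. Gaussian observations with known variance; the continuous-time and Poisson-disorder parts of the work are not covered by this example.

import numpy as np

def ml_change_point(x):
    """Maximum likelihood estimate of a single change point in the mean of
    Gaussian data with known common variance: maximizing the likelihood is
    equivalent to maximizing the between-segment sum of squares."""
    n = len(x)
    best_k, best_stat = None, -np.inf
    for k in range(1, n):                        # candidate split after index k-1
        m1, m2 = x[:k].mean(), x[k:].mean()
        stat = k * m1**2 + (n - k) * m2**2       # profiled log-likelihood criterion
        if stat > best_stat:
            best_k, best_stat = k, stat
    return best_k

rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(0.0, 1.0, 150),   # pre-change regime
                    rng.normal(0.8, 1.0, 100)])  # post-change regime
print("estimated change point:", ml_change_point(x))  # true value: 150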
|
656 |
Pricing of exotic options under the Kou model by using the Laplace transform. Dzharayan, Gayk; Voronova, Elena. January 2011.
In this thesis we present the Laplace transform method of option pricing and its implementation, and compare it with other methods. We consider vanilla and exotic options, paying particular attention to two-asset correlation options. As the model of the underlying stock price development we chose one of the modifications of the Black-Scholes model, the Kou jump-diffusion model with double exponentially distributed jumps. The computations were done by the Laplace transform and its inversion by the Euler method. We present in detail the proof of the Laplace transforms of put and call two-asset correlation options, the calculation of the moment generating function of the jump-diffusion via the Lévy-Khintchine formula in the cases without jumps and with independent jumps, and the direct calculation of the risk-neutral expectation by solving a double integral. Our work also contains the program code for two-asset correlation call and put options. We demonstrate our program on real data, showing how the model performs on the NASDAQ OMX Stockholm Market by considering two-asset correlation options in three cases based on the stock prices of Handelsbanken and Ericsson and the index OMXS30.
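The thesis prices via the Laplace transform and its Euler inversion; as a hedged, independent cross-check, the Kou double-exponential jump-diffusion can also be simulated directly. The sketch below prices a plain European call (not a two-asset correlation option) with illustrative parameters, not values calibrated to the Handelsbanken, Ericsson or OMXS30 data.

import numpy as np

def kou_call_mc(S0, K, T, r, sigma, lam, p, eta1, eta2, n_paths, rng):
    """European call under the Kou jump-diffusion, priced by simulating
    log(S_T). Log-jump sizes Y have density p*eta1*exp(-eta1*y) for y > 0
    and (1-p)*eta2*exp(eta2*y) for y < 0; eta1 > 1 keeps E[exp(Y)] finite."""
    zeta = p * eta1 / (eta1 - 1) + (1 - p) * eta2 / (eta2 + 1) - 1  # E[e^Y] - 1
    n_jumps = rng.poisson(lam * T, n_paths)
    jump_sum = np.zeros(n_paths)
    for i, n in enumerate(n_jumps):
        if n:
            up = rng.random(n) < p
            y = np.where(up, rng.exponential(1 / eta1, n),
                             -rng.exponential(1 / eta2, n))
            jump_sum[i] = y.sum()
    log_st = (np.log(S0) + (r - 0.5 * sigma**2 - lam * zeta) * T
              + sigma * np.sqrt(T) * rng.standard_normal(n_paths) + jump_sum)
    payoff = np.maximum(np.exp(log_st) - K, 0.0)
    return np.exp(-r * T) * payoff.mean()

rng = np.random.default_rng(4)
price = kou_call_mc(S0=100, K=100, T=1.0, r=0.05, sigma=0.16,
                    lam=1.0, p=0.4, eta1=10.0, eta2=5.0,
                    n_paths=100_000, rng=rng)
print(f"Kou Monte Carlo call price: {price:.3f}")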
|
657 |
Detection of the Change Point and Optimal Stopping Time by Using Control Charts on Energy Derivatives. AL, Cihan; Koroglu, Kubra. January 2011.
No description available.
|
658 |
Contributions to quality improvement methodologies and computer experiments. Tan, Matthias H. Y. 16 September 2013.
This dissertation presents novel methodologies for five problem areas in modern quality improvement and computer experiments, i.e., selective assembly, robust design with computer experiments, multivariate quality control, model selection for split plot experiments, and construction of minimax designs.
Selective assembly has traditionally been used to achieve tight specifications on the clearance of two mating parts. Chapter 1 proposes generalizations of the selective assembly method to assemblies with any number of components and any assembly response function, called generalized selective assembly (GSA). Two variants of GSA are considered: direct selective assembly (DSA) and fixed bin selective assembly (FBSA). In DSA and FBSA, the problem of matching a batch of N components of each type to give N assemblies that minimize quality cost is formulated as axial multi-index assignment and transportation problems respectively. Realistic examples are given to show that GSA can significantly improve the quality of assemblies.
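A hedged sketch of the two-component special case of DSA: matching a batch of holes and shafts to a target clearance is a linear assignment problem, solvable with the Hungarian algorithm. The general GSA formulation of the chapter is an axial multi-index assignment, which this two-index example does not capture; all dimensions below are invented for illustration.

import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(5)
N, target = 50, 0.10                         # batch size, target clearance
hole = rng.normal(10.05, 0.02, N)            # illustrative hole diameters
shaft = rng.normal(9.95, 0.02, N)            # illustrative shaft diameters

# Quadratic quality cost of pairing hole i with shaft j
cost = (hole[:, None] - shaft[None, :] - target) ** 2
rows, cols = linear_sum_assignment(cost)     # optimal one-to-one matching

print(f"random pairing  mean cost: {np.mean((hole - shaft - target)**2):.2e}")
print(f"optimal pairing mean cost: {cost[rows, cols].mean():.2e}")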
Chapter 2 proposes methods for robust design optimization with time consuming computer simulations. Gaussian process models are widely employed for modeling responses as a function of control and noise factors in computer experiments. In these experiments, robust design optimization is often based on average quadratic loss computed as if the posterior mean were the true response function, which can give misleading results. We propose optimization criteria derived by taking expectation of the average quadratic loss with respect to the posterior predictive process, and methods based on the Lugannani-Rice saddlepoint approximation for constructing accurate credible intervals for the average loss. These quantities allow response surface uncertainty to be taken into account in the optimization process.
Chapter 3 proposes a Bayesian method for identifying mean shifts in multivariate normally distributed quality characteristics. Multivariate quality characteristics are often monitored using a few summary statistics. However, to determine the causes of an out-of-control signal, information about which means shifted and the directions of the shifts is often needed. We propose a Bayesian approach that gives this information. For each mean, an indicator variable that indicates whether the mean shifted upwards, shifted downwards, or remained unchanged is introduced. Default prior distributions are proposed. Mean shift identification is based on the modes of the posterior distributions of the indicators, which are determined via Gibbs sampling.
Chapter 4 proposes a Bayesian method for model selection in fractionated split plot experiments. We employ a Bayesian hierarchical model that takes into account the split plot error structure. Expressions for computing the posterior model probability and other important posterior quantities that require evaluation of at most two uni-dimensional integrals are derived. A novel algorithm called combined global and local search is proposed to find models with high posterior probabilities and to estimate posterior model probabilities. The proposed method is illustrated with the analysis of three real robust design experiments. Simulation studies demonstrate that the method has good performance.
The problem of choosing a design that is representative of a finite candidate set is an important problem in computer experiments. The minimax criterion measures the degree of representativeness because it is the maximum distance of a candidate point to the design. Chapter 5 proposes algorithms for finding minimax designs for finite design regions. We establish the relationship between minimax designs and the classical set covering location problem in operations research, which is a binary linear program. We prove that the set of minimax distances is the set of discontinuities of the function that maps the covering radius to the optimal objective function value, and optimal solutions at the discontinuities are minimax designs. These results are employed to design efficient procedures for finding globally optimal minimax and near-minimax designs.
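A hedged sketch of the minimax criterion on a finite candidate set, with farthest-point clustering (a classical 2-approximation for the equivalent k-center problem) as a naive baseline; the set-covering-based procedures of the chapter find globally optimal designs, which this greedy heuristic does not.

import numpy as np
from scipy.spatial.distance import cdist

def minimax_criterion(design, candidates):
    """Maximum distance from any candidate point to its nearest design point."""
    return cdist(candidates, design).min(axis=1).max()

def greedy_minimax(candidates, n, rng):
    """Grow the design by repeatedly adding the candidate farthest from the
    current design (farthest-point clustering)."""
    idx = [int(rng.integers(len(candidates)))]
    for _ in range(n - 1):
        d = cdist(candidates, candidates[idx]).min(axis=1)
        idx.append(int(d.argmax()))
    return candidates[idx]

rng = np.random.default_rng(6)
cand = rng.random((2000, 2))                 # finite candidate set in [0,1]^2
design = greedy_minimax(cand, n=10, rng=rng)
print(f"minimax distance of the greedy 10-point design: "
      f"{minimax_criterion(design, cand):.3f}")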
|
659 |
On Incentives affecting Risk and Asset Management of Power Distribution. Wallnerström, Carl Johan. January 2011.
The introduction of performance-based tariff regulation, together with increased media and political pressure, has increased the need for well-performed risk and asset management in electric power distribution systems (DS), an infrastructure considered a natural monopoly. Compared to other technical systems, DS have special characteristics which are important to consider. The Swedish regulation of DS tariffs between 1996 and 2012 is described together with complementary laws such as customer compensation for long outages. The regulator's role is to provide incentives for cost-efficient operation with acceptable reliability and reasonable tariff levels. Another difficult task for the regulator is to settle the complexity, i.e. the balance between considering many details and remaining manageable. Two studies of the former regulatory model, included in this thesis, were part of the criticism that led to its fall. Furthermore, based on results from a project included here, initiated by the regulator to review a model for judging effectible costs, the regulator changed some initial plans concerning the upcoming regulation. A classification of risk management into separate categories is proposed, partly based on a study investigating investment planning and risk management at a distribution system operator (DSO). A vulnerability analysis method using quantitative reliability analyses is introduced, aimed at indicating how available resources could be better utilized and at evaluating whether additional security should be deployed for certain forecasted events. To evaluate the method, an application study has been performed based on hourly weather measurements and detailed failure reports over eight years for two DS. Months, weekdays and hours have been compared, and the vulnerability to several weather phenomena has been evaluated. Of the weather phenomena studied, heavy snowfall and strong winds significantly affect the reliability, while frost, rain and snow depth have low or no impact. The main conclusion is that there is a need to implement new, more advanced analysis methods. The thesis also provides a statistical validation method and introduces a new category of reliability indices, RT. / Electricity distribution is to be regarded as a natural monopoly and is very likely the most important infrastructure of modern society; its importance is predicted to grow further as technology intended to reduce humanity's climate impact is deployed. In Sweden there are more than 150 distribution network companies of varying size and with entirely different ownership structures. Electricity trading was previously integrated in the network companies' operations, but was deregulated in 1996: the transmission infrastructure was separated from production and trading. The introduction of quality regulation of network tariffs in the early 2000s, stricter laws on, among other things, customer outage compensation, and political and media pressure have created incentives for cost efficiency with maintained delivery quality. An important aspect is that, compared with other infrastructures, electricity distribution has several special characteristics that must be taken into account; these are described in the first part of the thesis, together with an introduction to risk and reliability theory as well as economic theory.
Two studies that may have contributed to the fall of the previous regulation are included, as is a study whose results changed the regulator's initial idea of a model for computing effectible costs in the ex-ante regulation starting in 2012. A state-appointed authority supervises that customers are offered grid connection, that the service meets quality requirements, and that tariff levels are reasonable and non-discriminatory. Traditionally, network companies have more or less been allowed revenues covering all their costs plus a reasonable profit, so-called cost-based pricing. In the late 1990s, however, the responsible authority began working towards a revenue regulation that also considers cost efficiency and customer quality. Designing such a regulation requires difficult trade-offs: for example, the objective conditions of the network companies, such as terrain and customer base, should be taken into account while the model remains manageable and consistent. The authority considered no existing regulatory model suitable for adaptation to Swedish conditions, so a new model was developed: the Network Performance Assessment Model (Nätnyttomodellen, NNM). It was applied to the 2003 tariffs, and decisions demanding repayments to affected network customers were made and subsequently appealed. A protracted legal process began in which the industry criticized the model heavily on several points; two studies included in this thesis supported the critical arguments against the NNM. No first-instance decision (county administrative court) had been reached by 2008, when the parties settled for the years 2003-2007. An EU directive requires Sweden to switch to ex-ante regulation, and instead of modifying the NNM and continuing to defend it in court, it was decided to develop an entirely new model. Simplified, a network company's allowed revenue frame will be based on its capital costs and operating costs; in addition, the frame can be adjusted according to how efficiently and with what quality the company has operated. A systematic description of a network company's current risk management and investment strategies for different voltage levels is provided, with the aim of supporting network companies in developing their risk management and of providing academic reference material based on industry experience. The thesis proposes a classification of risk management into categories, a vulnerability analysis method, and a new category of reliability indices (RT), partly based on the performed study. The overall idea of the vulnerability analysis is to identify and evaluate possible system states using quantitative reliability analyses. The goal is a tool for using available resources more efficiently, e.g. for preventive maintenance and holiday scheduling, and for judging whether preventive measures based on weather forecasts would be appropriate. RT is a flexible category of measures of the probability of customer outages of at least T hours, useful for example for analyzing the impact of customer outage compensation laws such as those introduced in Sweden and the UK during the 2000s. A statistical validation method for reliability indices has been developed to estimate the statistical uncertainty as a function of the amount of data on which a reliability index value is based. To evaluate the introduced vulnerability analysis method, a study was performed based on hourly weather data and detailed outage statistics covering eight years for two distribution networks in Sweden.
Months, weekdays and hours have been compared; the results can be used, for example, to allocate resources more efficiently over time. Vulnerability to different weather phenomena has been evaluated. Of the phenomena studied, only heavy snowfall and strong winds, especially in combination, significantly affect the reliability of distribution systems. Other studies have also shown vulnerability to lightning strikes, which was not a parameter in the study included in the thesis. Temperature (e.g. the effect of frost), rain and snow depth thus have negligible impact. Correlation studies have been performed, showing among other things a nearly linear relationship in Sweden between temperature and electricity consumption, which indirectly indicates that electricity consumption also has negligible impact on delivery quality. Finally, an analysis framework is proposed of which the introduced vulnerability analysis would be one part. The overall idea is presented mainly to inspire future work; it should be noted, however, that the introduced vulnerability analysis method is an independent, complete method regardless of whether the proposed ideas are implemented. / QC 20110815
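A hedged sketch of the RT idea on synthetic data (the thesis defines RT within a full reliability-index framework and validates indices with its own statistical method): estimate the probability that an interruption lasts at least T hours and attach a normal-approximation binomial confidence interval.

import numpy as np
from scipy.stats import norm

def rt_index(durations_h, T, conf=0.95):
    """Estimate RT = P(interruption duration >= T hours) from outage records,
    with a normal-approximation binomial confidence interval."""
    d = np.asarray(durations_h)
    p_hat = np.mean(d >= T)
    half = norm.ppf(0.5 + conf / 2) * np.sqrt(p_hat * (1 - p_hat) / len(d))
    return p_hat, (max(p_hat - half, 0.0), min(p_hat + half, 1.0))

rng = np.random.default_rng(7)
durations = rng.exponential(2.0, 500)        # illustrative outage durations (hours)
p, ci = rt_index(durations, T=4)
print(f"R4 estimate: {p:.4f}, 95% CI: ({ci[0]:.4f}, {ci[1]:.4f})")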
|
660 |
Šetření vybraných parametrů výjezdů Hasičského záchranného sboru k dopravním nehodám v Jihočeském kraji / Statistical Review of Parameters of Fire Rescue Service's Interventions in Traffic Accidents in South Bohemia. PETRŮV, Josef. January 2016.
The Fire Rescue Service is territorialized by region: it is present in each region of the Czech Republic as a separate accounting entity. Each accounting entity is obliged to prepare its own budget estimate for the next year, which serves to cover the costs related to interventions, so it is useful to have an overview of the number of interventions expected in the following year; this prediction should be based as far as possible on real data from the last period. Fire Rescue Service units' interventions in traffic accidents, like their other activities, are covered from the state budget. The thesis presents a statistical survey of the Fire Rescue Service's interventions in traffic accidents, with three goals. G1) A survey of the development over time of the number of traffic accidents attended by the South Bohemia Fire Rescue Service, with one month as the unit of time, over the last 5 years; this is the survey of the first selected parameter of the interventions in the South Bohemia region. G2) A survey of the development over time of the costs associated with the South Bohemia Fire Rescue Service's interventions in traffic accidents, with one month as the unit of time, over the last 5 years; this is the survey of the second selected parameter. G3) A comparison of the number of interventions and the costs, and an investigation of whether their distributions match an appropriate theoretical distribution; this is the survey of the dependence between the selected parameters. Given the relatively large amount of primary data (records of about 8,000 traffic accidents between 7/2010 and 6/2015 in the South Bohemia region at which at least one unit of the South Bohemia Fire Rescue Service was present), tools of mathematical statistics were needed to survey the selected parameters of the data set. Determining the class intervals for individual statistical features is to some extent done "by feeling"; a sensitivity analysis was therefore performed to investigate the influence of minor changes in the number of intervals on the distribution of percentage occurrences of the monitored statistical features (the intervals were chosen "by feeling" and following recommendations in the technical literature). Descriptive statistics as well as methods of statistical inference were used to investigate the statistical features. The information drawn from the technical literature and the interpretation of partial results and conclusions concerning individual sections of the thesis are provided in the Discussion section. The thesis demonstrated that the empirical distributions of the number of traffic accidents and of the costs of the South Bohemia Fire Rescue Service's interventions can be considered normal; it is therefore correct to apply theoretical results for the normal distribution when working with these data. The thesis also demonstrated that the number of traffic accidents and the costs associated with the South Bohemia Fire Rescue Service's interventions not only do not decrease but, on the contrary, increase.
It was demonstrated that there is a very high positive correlation between the number of traffic accidents and the costs of the South Bohemia Fire Rescue Service's interventions in the individual months of the monitored period. This leads to the conclusion that the increase in intervention costs is caused mainly by the increase in the number of traffic accidents attended by Fire Rescue Service units.
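A hedged sketch of the two statistical steps described above, run on synthetic monthly data in place of the thesis's accident records: a normality check (Shapiro-Wilk here, as a compact stand-in for the interval-based goodness-of-fit testing used in the thesis) and the Pearson correlation between monthly accident counts and intervention costs.

import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
months = 60                                              # five years of monthly data
accidents = rng.normal(130, 15, months).round()          # synthetic accident counts
costs = 2000 * accidents + rng.normal(0, 5000, months)   # synthetic costs (CZK)

# Normality check for each monitored feature
for name, data in (("accidents", accidents), ("costs", costs)):
    stat, pval = stats.shapiro(data)
    print(f"{name}: Shapiro-Wilk p-value = {pval:.3f}")

# Dependence between the two features
r, pval = stats.pearsonr(accidents, costs)
print(f"Pearson correlation between counts and costs: r = {r:.3f} (p = {pval:.1e})")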
|