11

International stock market liquidity

Stahel, Christof W. 30 September 2004 (has links)
No description available.
12

Dynamic modeling and feedback control with mode-shifting of a two-mode electrically variable transmission

Katariya, Ashish Santosh 31 August 2012 (has links)
This thesis develops dynamic models for the two-mode FWD EVT and, based on those models, a control system capable of meeting driver torque demands and performing synchronous mode shifts between EVT modes while also accommodating preferred engine operating points. The two-input two-output transmission controller proposed herein incorporates motor-generator dynamics, is based on a general state-space integral control structure, and has feedback gains determined using linear quadratic regulator (LQR) optimization. Dynamic modeling of the vehicle is divided into the mechanical and electrical subsystems: the mechanical subsystem consists of the planetary gear sets, the transmission, and the engine, whereas the electrical subsystem consists of the motor-generator units and the battery pack. Load torque is also discussed as part of the mechanical subsystem. With the help of these derived dynamic models, a distinction is made between dynamic output torque and steady-state output torque. The overall control system is designed around multiple subsystems, including the human driver, power management unit (PMU), friction brakes, combustion engine, transmission control unit (TCU), and motor-generator units. The logic for synchronous mode shifts between different EVT modes is also detailed as part of the control system design. Finally, the thesis presents results for responses in individual operating modes, EVT mode shifting, and a full UDDS drive cycle simulation.
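For a rough sense of the LQR step described above, here is a minimal sketch (Python, with a made-up two-state, two-input linearized plant standing in for the thesis's derived transmission model) of computing state-feedback gains via the continuous algebraic Riccati equation:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical linearized plant: x = speed-tracking errors, u = [MG1, MG2] torques.
# These matrices are placeholders, not the thesis's transmission dynamics.
A = np.array([[-0.5,  0.2],
              [ 0.1, -0.8]])
B = np.eye(2)

Q = np.diag([10.0, 10.0])   # penalize tracking error
R = np.diag([1.0, 1.0])     # penalize actuator effort

# Solve the continuous-time algebraic Riccati equation, then K = R^{-1} B^T P
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

# State feedback u = -K x; an integral-control variant, as in the thesis's
# state-space integral structure, would augment x with the integral of the
# tracking error before solving for K.
print("LQR gain K =\n", K)
```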
13

Stress testing and financial risks / Stress Tests et Risques financiers

Koliai, Lyes 27 October 2014 (has links)
This thesis sets out a comprehensive framework for assessing the relevance of financial stress tests and identifies their main drawbacks. Three robust and flexible modelling frameworks are proposed to improve current practice at each stage of a test: (i) a semi-parametric EVT–pair-copulas model for financial risk factors, with a specific focus on extreme values; (ii) a valuation model to assess the impact of the risk factors on a financial system through direct and indirect effects and contagion channels, taking private and public response functions into account; and (iii) a Bayesian approach for the systematic selection of stress scenarios for nonlinear portfolios. The risk model is shown to outperform commonly used specifications, which increases the test's credibility. Estimated for the French banking system, the valuation model reveals the system's risk profile and main vulnerabilities; public responses turn out to be of vital importance. Finally, the Bayesian approach makes it possible to replace the traditional subjective scenarios and to include stress-test results in quantitative risk management alongside other conventional tools.
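As a toy illustration of the EVT component for a single risk factor (simulated returns stand in for market data, and the threshold choice is an assumption; the thesis's semi-parametric EVT–pair-copulas model is far richer), one can fit a generalized Pareto distribution to exceedances over a high threshold and read off tail probabilities:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
# Placeholder heavy-tailed daily returns; real data would be market risk factors
returns = rng.standard_t(df=4, size=5000) * 0.01

# Semi-parametric marginal: empirical body, GPD upper tail beyond threshold u
u = np.quantile(returns, 0.95)                # tail threshold (assumption)
exceedances = returns[returns > u] - u
xi, _, beta = genpareto.fit(exceedances, floc=0.0)

p_u = np.mean(returns > u)                    # empirical exceedance probability

def tail_prob(x):
    # EVT approximation of P(X > x) for x above the threshold
    return p_u * genpareto.sf(x - u, xi, loc=0.0, scale=beta)

# Example: tail probability of a return beyond 5%, usable as a stress scenario
print("P(return > 5%):", tail_prob(0.05))
```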
14

Genetické programování - Java implementace / Genetic programming - Java implementation

Tomaštík, Marek January 2013 (has links)
This Master's thesis implements a computer program in Java for automatic model generation, especially for symbolic regression problems. The thesis includes a short description of genetic programming (GP) and the author's own implementation with advanced GP operators (non-destructive operations, elitism, expression reduction). A mathematical model is generated by symbolic regression for a chosen data set. Test tasks are used to verify correct functioning, and optimal settings are found for the chosen GP parameters.
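A toy mutation-only GP sketch for symbolic regression (Python rather than the thesis's Java; crossover and expression reduction omitted, elitism and tournament selection included), recovering f(x) = x^2 + x from samples:

```python
import math
import operator
import random

OPS = {'+': operator.add, '-': operator.sub, '*': operator.mul}
TERMINALS = ['x', 1.0]

def rand_tree(depth=3):
    # Grow a random expression tree as nested tuples (op, left, right)
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    return (random.choice(list(OPS)), rand_tree(depth - 1), rand_tree(depth - 1))

def evaluate(tree, x):
    if tree == 'x':
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

DATA = [(x / 10.0, (x / 10.0) ** 2 + x / 10.0) for x in range(-20, 21)]

def fitness(tree):
    # Sum of squared errors; overflow or NaN counts as worst possible
    try:
        err = sum((evaluate(tree, x) - y) ** 2 for x, y in DATA)
        return err if math.isfinite(err) else math.inf
    except OverflowError:
        return math.inf

def mutate(tree):
    # Replace a random subtree with a fresh random one
    if not isinstance(tree, tuple) or random.random() < 0.3:
        return rand_tree(2)
    op, left, right = tree
    if random.random() < 0.5:
        return (op, mutate(left), right)
    return (op, left, mutate(right))

def evolve(pop_size=200, generations=40, elite=5):
    pop = [rand_tree() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        nxt = pop[:elite]                                 # elitism
        while len(nxt) < pop_size:
            parent = min(random.sample(pop, 5), key=fitness)  # tournament
            nxt.append(mutate(parent))
        pop = nxt
    return min(pop, key=fitness)

best = evolve()
print("best tree:", best, "error:", fitness(best))
```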
15

Analyzing value at risk and expected shortfall methods: the use of parametric, non-parametric, and semi-parametric models

Huang, Xinxin 25 August 2014 (has links)
Value at Risk (VaR) and Expected Shortfall (ES) are methods often used to measure market risk. Inaccurate and unreliable VaR and ES models can lead to underestimation of the market risk that a firm or financial institution is exposed to, and may therefore jeopardize its well-being or survival during adverse markets. The objective of this study is to examine various VaR and ES models, including fatter-tail models, in order to analyze their accuracy and reliability. Thirteen VaR and ES models under three main approaches (parametric, non-parametric, and semi-parametric) are examined. The results show that the proposed model (ARMA(1,1)-GJR-GARCH(1,1)-SGED) gives the most balanced VaR results, and that the semi-parametric model (Extreme Value Theory, EVT) is the most accurate VaR model in this study for the S&P 500. / October 2014
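To make the two risk measures concrete, a minimal non-parametric (historical-simulation) sketch on simulated returns — not the thesis's thirteen models or its data:

```python
import numpy as np

rng = np.random.default_rng(42)
# Placeholder heavy-tailed daily returns; the study would use S&P 500 returns
returns = rng.standard_t(df=5, size=2500) * 0.01

alpha = 0.99
losses = -returns                       # work in loss space

# Historical-simulation VaR: empirical quantile of the loss distribution
var = np.quantile(losses, alpha)

# Expected Shortfall: average loss in the tail beyond VaR
es = losses[losses >= var].mean()

print(f"99% VaR: {var:.4f}, 99% ES: {es:.4f}")
```

Parametric and semi-parametric variants would replace the empirical quantile with one from a fitted distribution (e.g. a GARCH-filtered SGED or an EVT tail).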
16

Modeling Extreme Values / Modelování extrémních hodnot

Shykhmanter, Dmytro January 2013 (has links)
Modeling extreme events is a challenging statistical task: there is always a limited number of observations, and consequently little experience against which to back-test the results. One way of estimating higher quantiles is to fit a theoretical distribution to the data and extrapolate into the tail. The shortcoming of this approach is that the tail estimate is driven by the observations in the center of the distribution. An alternative approach is to split the data into two sub-populations and model the body of the distribution separately from the tail. This methodology is applied to non-life insurance losses, where extremes are particularly important for risk management. Nevertheless, even this approach is not a conclusive solution to heavy-tail modeling: in either case, the estimated 99.5% percentiles have such high standard errors that their reliability is very low. On the other hand, the approach is theoretically valid and deserves to be considered as one of the possible methods of extreme value analysis.
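A minimal sketch of the body/tail split on simulated losses: empirical body, GPD tail above a threshold, and the 99.5% percentile read from the spliced model (the data and the threshold choice are assumptions, not the thesis's):

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
# Placeholder non-life losses: lognormal body contaminated with a heavier tail
losses = np.concatenate([rng.lognormal(0.0, 1.0, 4800),
                         rng.pareto(1.5, 200) + 5.0])

u = np.quantile(losses, 0.90)                 # body/tail split point (assumption)
p_u = np.mean(losses > u)                     # probability mass in the tail

# Fit GPD to exceedances; the body is left to the empirical distribution
xi, _, beta = genpareto.fit(losses[losses > u] - u, floc=0.0)

# 99.5% quantile: since 0.995 > 1 - p_u, it lies in the GPD tail:
# P(X > x) = p_u * SF_GPD(x - u)  =>  x = u + GPD.ppf(1 - (1 - q) / p_u)
q = 0.995
x_q = u + genpareto.ppf(1 - (1 - q) / p_u, xi, loc=0.0, scale=beta)
print("99.5% quantile estimate:", x_q)
```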
17

Théorie des options et fonctions d'utilité : stratégies de couverture en présence des fluctuations non gaussiennes / Options theory and utility functions : hedging strategies in the presence of non-gaussian fluctuations

Hamdi, Haykel 04 March 2011 (has links)
The traditional approach to derivatives consists, under certain well-defined hypotheses, in constructing hedging strategies with strictly zero risk. In the general case, however, such "perfect" hedging strategies do not exist, and the theory must instead rest on the idea of risk minimization; the optimal hedge then depends on the measure of risk to be minimized. In the context of options, this work considers a new risk measure via the expected-utility approach that accounts both for the fourth moment, which is more sensitive to large fluctuations than the variance, and for the option writer's risk aversion. Compared with delta hedging, variance minimization, and fourth-moment optimization, the expected-utility hedging strategy reduces the sensitivity of the hedge to the price of the underlying, which should in turn reduce the associated transaction costs.
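A toy numerical sketch of the underlying idea — choosing a static hedge ratio by minimizing a variance-plus-fourth-moment penalty over simulated terminal P&L. The market model and the weight lam are invented for illustration; this is not the thesis's expected-utility criterion:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(7)
# Toy one-period market: heavy-tailed underlying moves; writer is short one call
S0, K = 100.0, 100.0
ST = S0 * np.exp(rng.standard_t(df=4, size=100_000) * 0.02)
payoff = np.maximum(ST - K, 0.0)              # short-call liability at expiry

def risk(phi, lam=5.0):
    # P&L of the short call hedged with phi shares; the fourth-moment term
    # reacts to large fluctuations more strongly than the variance alone
    pnl = phi * (ST - S0) - payoff
    pnl = pnl - pnl.mean()
    return np.mean(pnl ** 2) + lam * np.mean(pnl ** 4)

res = minimize_scalar(risk, bounds=(0.0, 1.0), method='bounded')
print("optimal hedge ratio:", res.x)
```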
18

Outliers detection in mixtures of dissymmetric distributions for data sets with spatial constraints / Détection de valeurs aberrantes dans des mélanges de distributions dissymétriques pour des ensembles de données avec contraintes spatiales

Planchon, Viviane 29 May 2007 (has links)
In the case of soil chemical analyses, the frequency distributions of some elements are markedly dissymmetric, with a pronounced spread to the right or to the left. A high frequency of extreme values is also observed, and a mixture of several distributions may be encountered within a single geographical unit owing to the presence of various soil types. For outlier detection and the establishment of detection limits, an original procedure has therefore been developed: it estimates extreme quantiles above and below which observations are considered outliers. These detection limits are estimated separately from the right and left tails of the distribution. A first estimation is carried out for each elementary geographical unit to determine an appropriate truncation level; a spatial classification then groups contiguous, homogeneous geographical units so that robust limit values can be estimated from an optimal number of observations.
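A simple sketch of quantile-based detection limits on simulated right-skewed measurements (the thesis's extreme-quantile estimator and spatial grouping are more elaborate; the data and cutoffs here are assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
# Placeholder soil-chemistry values: right-skewed body plus a few gross errors
values = np.concatenate([rng.lognormal(2.0, 0.6, 980),
                         rng.uniform(100.0, 300.0, 20)])

# Extreme quantiles as detection limits, one per tail
lower, upper = np.quantile(values, [0.005, 0.995])
outliers = values[(values < lower) | (values > upper)]

print(f"limits: [{lower:.2f}, {upper:.2f}], flagged: {outliers.size} observations")
```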
19

Stochastic Modelling of Daily Peak Electricity Demand Using Extreme Value Theory

Boano - Danquah, Jerry 21 September 2018 (has links)
MSc (Statistics) / Department of Statistics / Daily peak electricity demand data from ESKOM, the South African power utility, for the period January 1997 to December 2013 (6209 observations) were used in this dissertation. Since 1994, increased electricity demand has led to sustainability issues in South Africa, and demand continues to rise every day owing to a variety of driving factors. If the generating capacity in South Africa does not keep pace with the country's demand in the coming years, the national grid may be forced to operate in a risky and vulnerable state, leading to disturbances such as the load shedding experienced during the past few years. It is therefore of great interest to have sufficient information about the extremes of the stochastic load process, for proper planning and for designing the generation, distribution, and storage systems, as this ensures the efficient use of electrical energy and keeps the grid stable. Electricity is an important commodity, used mainly as a source of energy in the industrial, residential, and commercial sectors, and effective monitoring of demand is essential because demand that exceeds the maximum power generated leads to power outages and load shedding. In this light, the study assesses the frequency of occurrence of extreme peak electricity demand in order to arrive at a full demand distribution capable of managing uncertainties in the grid system. To achieve stationarity in the daily peak electricity demand (DPED), a penalized-regression cubic smoothing spline is applied so that the data are non-linearly detrended. The R package "evmix" is used to estimate thresholds from the boundary-corrected kernel density plot. The detrended data were divided into summer, spring, winter, and autumn according to the calendar dates in the Southern Hemisphere for frequency analysis. The data are declustered using Ferro and Segers' automatic declustering method, and the cluster maxima are extracted using the R package "evd". A Poisson GPD and a stationary point process are fitted to the cluster maxima, and the intensity function of the point process, which measures the yearly frequency of occurrence of extreme daily peak electricity demand, is calculated for each dataset. Formal goodness-of-fit tests based on the Cramér-von Mises and Anderson-Darling statistics supported the null hypothesis that each dataset follows a Poisson GPD(σ, ξ) at the 5 percent level of significance. The modelling framework, which is easily extensible to other peak-load parameters, is based on the assumption that peak power follows a Poisson process. The parameters of the developed models were estimated by maximum likelihood, and the usual asymptotic properties underlying the Poisson GPD were satisfied by the model. / NRF
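A hedged sketch of the peaks-over-threshold pipeline in Python (scipy in place of the R packages "evmix" and "evd", a simple runs rule in place of Ferro and Segers' automatic declustering, and simulated residuals in place of the ESKOM data):

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(11)
# Placeholder detrended daily peak demand residuals (17 years of daily data)
resid = rng.gumbel(0.0, 1.0, 6209)

u = np.quantile(resid, 0.95)                  # threshold (assumption)
exceed_idx = np.where(resid > u)[0]

# Runs declustering: exceedances more than `gap` days apart start a new cluster
gap = 3
clusters, current = [], [exceed_idx[0]]
for i in exceed_idx[1:]:
    if i - current[-1] <= gap:
        current.append(i)
    else:
        clusters.append(current)
        current = [i]
clusters.append(current)
maxima = np.array([resid[c].max() for c in clusters])

# Poisson GPD: GPD for cluster-maximum sizes, Poisson rate for occurrences
xi, _, sigma = genpareto.fit(maxima - u, floc=0.0)
rate_per_year = len(clusters) / (6209 / 365.25)

print(f"GPD shape={xi:.3f}, scale={sigma:.3f}, "
      f"~{rate_per_year:.1f} extreme clusters/year")
```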
