181 |
Solvency Capital Requirement (SCR) for Market Risks : A quantitative assessment of the Standard formula and its adequacy for a Swedish insurance company / Kapitalbaskrav för marknadsrisker under Solvens II : En kvantitativ utvärdering av Standardformeln och dess lämplighet för ett svenskt försäkringsbolag
Widing, Björn January 2016 (has links)
The purpose of this project is to validate the adequacy of the Standard formula, used to calculate the Solvency Capital Requirement (SCR), with respect to a Swedish insurance company. The sub-modules evaluated are Equity risk (type 1) and Interest rate risk. The validation uses a quantitative assessment and the concept of Value at Risk (VaR). Additionally, investment strategies for risk-free assets are evaluated through a scenario-based analysis. The findings support that the Equity shock of 39%, as prescribed in the Standard formula, is appropriate for a diversified portfolio of global equities, and, to some extent, also sufficient for a diversified global portfolio with an overweight of Swedish equities. The findings also show that the Standard formula for Interest rate risk occasionally underestimates the true Interest rate risk. Finally, it is shown that there are advantages to selecting an investment strategy that stabilizes the Own funds of an insurance company rather than one that minimizes the SCR.
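For illustration, here is a minimal sketch of the kind of VaR comparison described above, assuming normally distributed hypothetical yearly returns; the data, parameter values, and function names are assumptions, not the thesis's actual inputs, though the one-year 99.5% confidence level matches the Solvency II calibration.

```python
import numpy as np

def empirical_var(returns, confidence=0.995):
    """Empirical Value at Risk: the loss quantile at the given confidence level."""
    losses = -np.asarray(returns)          # losses are negated returns
    return float(np.quantile(losses, confidence))

# Hypothetical one-year log returns for a diversified global equity portfolio.
rng = np.random.default_rng(seed=1)
yearly_returns = rng.normal(loc=0.06, scale=0.17, size=10_000)

var_995 = empirical_var(yearly_returns)
print(f"99.5% one-year VaR: {var_995:.1%}")
# Compare against the Standard formula's prescribed 39% equity shock.
print(f"Within the 39% Standard formula equity shock: {var_995 <= 0.39}")
```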
182 |
Infinite dimensional Markovian lifts of non-Markovian dynamics / Continuum seed-bank and price impact models
Jiao, Likai 07 January 2025 (has links)
This thesis applies an infinite-dimensional Markovian lifting method to non-Markovian dynamics, proposing the continuum seed-bank model and an infinite-dimensional price impact model. We generalize the seed-bank model from [BGCKWB16] to accommodate more general dormancy time distributions. Inspired by [GdHO22], we introduce the Wright-Fisher diffusion and coalescent with a continuum of seed-banks. By formulating an infinite-dimensional stochastic differential equation, we prove the existence of a unique strong solution: the continuum seed-bank diffusion. We then show that this diffusion process is the scaling limit of allele frequency processes in a sequence of discrete-time Wright-Fisher type models. Furthermore, we establish a duality relation between the continuum seed-bank diffusion and the continuum seed-bank coalescent, and discuss some basic properties of this coalescent process. In the financial domain, akin to the continuum seed-bank model, we develop an infinite-dimensional transient price impact process. This process is a Markovian lift of a non-Markovian one-dimensional price impact process. In an additive price impact scenario, following the methods in [AKU22] and [BB24], we simplify the optimal liquidation problem to a linear-quadratic optimal control problem. Penalty terms are introduced into the expected cost functional to ensure unique solvability. Finally, in scenarios such as multiplicative price impacts, we establish the Skorokhod M1 continuity of the cost in the infinite-dimensional setting.
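For orientation, here is a standard textbook example of an infinite-dimensional Markovian lift (a classical construction for moving-average Gaussian processes; not necessarily the lift used in this thesis):

```latex
% A moving-average process X_t = \int_0^t K(t-s)\,\mathrm{d}W_s is in general
% non-Markovian. If the kernel admits the Laplace representation
%   K(t) = \int_0^\infty e^{-\theta t}\,\mu(\mathrm{d}\theta),
% then the measure-indexed family
\[
  Y_t(\theta) \;=\; \int_0^t e^{-\theta (t-s)}\,\mathrm{d}W_s,
  \qquad \theta \in \operatorname{supp}\mu,
\]
% is an infinite-dimensional Markov process, and the original process is
% recovered as X_t = \int_0^\infty Y_t(\theta)\,\mu(\mathrm{d}\theta).
```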
183 |
Nonparametric Combination Methodology : A Better Way to Handle Composite Endpoints?
Baurne, Yvette January 2015 (has links)
Composite endpoints are widely used in clinical trials. The outcome of a clinical trial can affect many individuals, and it is therefore important that the methods used are as effective and correct as possible. Improvements to the standard method of testing composite endpoints have been proposed, and in this thesis the alternative method using nonparametric combination methodology is compared to the standard method. In a simulation study, the power of each of three combining functions (Fisher, Tippett and Logistic) is compared to the power of the standard method. The performances of the four methods are evaluated for different compositions of treatment effects, as well as for independent and dependent components. The results show that using the nonparametric combination methodology leads to higher power in both the dependent and independent cases. The combining functions are suitable for different compositions of treatment effects, with the Fisher combining function being the most versatile. The thesis was written with support from Statisticon AB.
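For concreteness, the three combining functions named above can be sketched as follows. The closed-form null distributions shown assume independent p-values, whereas the nonparametric combination methodology itself calibrates the combined statistic by permutation; the p-values are hypothetical.

```python
import numpy as np
from scipy import stats

def fisher_stat(p):
    return -2.0 * np.sum(np.log(p))        # reject for large values

def tippett_stat(p):
    return np.min(p)                        # reject for small values

def logistic_stat(p):
    return -np.sum(np.log(p / (1.0 - p)))   # reject for large values

p_values = np.array([0.03, 0.20, 0.07])     # hypothetical per-endpoint p-values
k = len(p_values)

# Closed forms valid only under independence; NPC would use permutations instead.
fisher_p = stats.chi2.sf(fisher_stat(p_values), df=2 * k)   # chi-squared with 2k df
tippett_p = stats.beta.cdf(tippett_stat(p_values), 1, k)    # min(p) ~ Beta(1, k)

print(f"Fisher combined p = {fisher_p:.4f}")
print(f"Tippett combined p = {tippett_p:.4f}")
```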
184 |
The optimal control of a Lévy process
DiTanna, Anthony Santino 23 October 2009 (has links)
In this thesis we study the optimal stochastic control of the drift of a Lévy process. We show that, for a broad class of Lévy processes, the partial integro-differential Hamilton-Jacobi-Bellman equation for the value function admits classical solutions and that control policies exist in feedback form. We then explore the class of Lévy processes that satisfy the requirements of the theorem, and find connections between the uniform integrability requirement and the notions of the score function and Fisher information from information theory. Finally, we present three different numerical implementations of the control problem: a traditional dynamic programming approach, and two iterative approaches, one based on a finite difference scheme and the other on the Fourier transform.
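As an illustration of the kind of equation involved, here is a generic discounted-control form of a partial integro-differential HJB equation for a drift-controlled Lévy process; the running cost f, control set U, and discount rate ρ are assumptions, not the thesis's exact formulation.

```latex
% Generic HJB equation for dX_t = u_t\,dt + dL_t, with L a Lévy process having
% diffusion coefficient sigma and Lévy measure nu (assumed form; the thesis's
% exact statement may differ):
\[
  \rho\, v(x) \;=\; \sup_{u \in U} \bigl\{ u\, v'(x) + f(x,u) \bigr\}
  \;+\; \tfrac{\sigma^{2}}{2}\, v''(x)
  \;+\; \int_{\mathbb{R}} \bigl( v(x+z) - v(x)
        - v'(x)\, z\,\mathbf{1}_{\{|z|<1\}} \bigr)\, \nu(\mathrm{d}z).
\]
```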
185 |
Mouvement brownien branchant avec sélection / Branching Brownian motion with selection
Maillard, Pascal 11 October 2012 (has links) (PDF)
In this thesis, branching Brownian motion (BBM) is a random particle system in which particles diffuse on the real line according to Brownian motions and branch at a constant rate into a random number of particles with expectation greater than 1. We study two models of BBM with selection: BBM with absorption at a space-time line, and the N-BBM, where, as soon as the number of particles exceeds a given number N, only the N right-most particles are kept while the others are removed from the system. For the first model, we study the law of the number of absorbed particles in the case where the process becomes extinct almost surely, using a link between the Fisher-Kolmogorov-Petrovskii-Piskounov (FKPP) and Briot-Bouquet equations. For the second model, whose study makes up the largest part of this thesis, we give precise asymptotics for the position of the particle cloud when N is large. More precisely, we show that on the time scale log³ N it converges to a Lévy process plus a linear drift, both explicit, confirming predictions of Brunet, Derrida, Mueller and Munier. This study contributes to the understanding of FKPP-type fronts under the influence of noise. Finally, a third part shows the link between BBM and stable point processes.
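For reference, a sketch of the FKPP equation cited above in its standard normalization (the thesis may use different conventions):

```latex
% Standard FKPP reaction-diffusion equation for a front u(t,x) in [0,1]
% (normalization assumed; conventions vary):
\[
  \partial_t u \;=\; \tfrac{1}{2}\,\partial_{xx} u \;+\; u(1 - u).
\]
% Under this normalization, fronts started from suitable initial data propagate
% at the critical speed c = \sqrt{2}, matching the speed of the right-most
% particle in binary branching Brownian motion.
```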
186 |
Applied Adaptive Optimal Design and Novel Optimization Algorithms for Practical Use
Strömberg, Eric January 2016 (has links)
The costs of developing new pharmaceuticals have increased dramatically during the past decades. Contributing to these increased expenses are the increasingly extensive and more complex clinical trials required to generate sufficient evidence regarding the safety and efficacy of the drugs. It is therefore of great importance to improve the effectiveness of the clinical phases by increasing the information gained throughout the process, so that the correct decision may be made as early as possible. Optimal Design (OD) methodology using the Fisher Information Matrix (FIM) based on Nonlinear Mixed Effect Models (NLMEM) has proven to be a useful tool for making more informed decisions throughout the clinical investigation. The calculation of the FIM for NLMEM lacks an analytic solution, however, and is commonly approximated by linearization of the NLMEM. Furthermore, two structural assumptions of the FIM are available: a full FIM, and a block-diagonal FIM which assumes that the fixed effects are independent of the random effects in the NLMEM. Once the FIM has been derived, it can be transformed into a scalar optimality criterion for comparing designs. The optimality criterion may be considered local, if the criterion is based on single point values of the parameters, or global (robust), where the criterion is formed over a prior distribution of the parameters. Regardless of design criterion, FIM approximation or structural assumption, the design will be based on the prior information regarding the model and parameters, and is thus sensitive to misspecification in the design stage. Model-based adaptive optimal design (MBAOD), however, has been shown to be less sensitive to misspecification in the design stage. The aim of this thesis is to further the understanding and practicality of standard and model-based adaptive optimal design. This is to be achieved by: (i) investigating how two common FIM approximations and the structural assumptions may affect the optimized design, (ii) reducing the runtimes of complex design optimizations by implementing a low-level parallelization of the FIM calculation, (iii) further developing and demonstrating a framework for performing MBAOD, and (iv) investigating the potential advantages of using a global optimality criterion in the already robust MBAOD.
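To make the FIM-to-criterion pipeline concrete, the following toy sketch scores two candidate sampling schedules by the D-optimality criterion det(FIM) for a simple exponential model. The model, parameter values, and candidate designs are illustrative assumptions, and a first-order fixed-effects Jacobian stands in for the NLMEM linearization discussed above.

```python
import numpy as np

def fisher_information(times, A=1.0, k=0.1, sigma=0.05):
    """FIM for y = A*exp(-k*t) with i.i.d. Gaussian error, via the Jacobian of
    the mean response (a fixed-effects-only simplification of the NLMEM case)."""
    t = np.asarray(times, dtype=float)
    dA = np.exp(-k * t)                # partial derivative w.r.t. A
    dk = -A * t * np.exp(-k * t)       # partial derivative w.r.t. k
    J = np.column_stack([dA, dk])
    return J.T @ J / sigma**2

def d_criterion(times):
    return np.linalg.det(fisher_information(times))

sparse_design = [1.0, 2.0, 3.0, 4.0]       # hypothetical candidate schedules
spread_design = [0.5, 5.0, 15.0, 30.0]
print(f"det(FIM), sparse design: {d_criterion(sparse_design):.3e}")
print(f"det(FIM), spread design: {d_criterion(spread_design):.3e}")  # larger is better
```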
187 |
Modeling the Distribution of Bobcats and Areas of Reintroduction for Fisher in the Southern Washington Cascades
Halsey, Shiloh Michael 16 August 2013 (links)
The fisher (Martes pennanti) is a medium-sized member of the mustelid family that once roamed the forests of Washington and whose historic range in the western United States spread throughout the northern Rocky Mountains, the Cascade and Coast Ranges, and the Sierra Nevada (Carroll, Zielinski, and Noss 1999; Powell 1993; Spencer et al. 2011). Due to pressures from trapping and habitat fragmentation, the abundance of the species in the western United States has decreased dramatically, and it is thought to be limited to several small, isolated populations. In 2008, fishers were reintroduced to the Olympic Peninsula; however, bobcat (Lynx rufus) predation in the first years is thought to have killed off a significant portion of the released fishers, hindering their ability to establish a self-sustaining population (Lewis et al. 2011). Other studies in the western United States have shown that bobcats can exert dramatic pressure on small or isolated fisher populations.
The coniferous forest of the southern Washington Cascades is a possible site for a release of the currently extirpated fisher. My research examines the distribution of bobcats in the region and explores the implications this, together with the area's habitat variables, has for a future reintroduction of fishers. The research followed a stepwise process: 1) surveying forested areas in the southern Washington Cascades for the presence and absence of bobcats, and acquiring previously completed survey data; 2) using a classification tree to model the correlation of bobcat presence or absence with forest variables; and 3) applying these relationships in a spatial analysis to create maps showing areas of high-ranking fisher habitat.
The classification tree modeled the correlation between the forest variables and the results of the surveys, which included 145 bobcat absence observations and 39 presence observations. The model highlighted a 95% probability of absence above 1,303 m in elevation; a 73% probability of absence in areas under 1,303 m in elevation with a tree diameter value under 43.45 cm; a 57% probability of absence in areas between 1,070 m and 1,303 m in elevation with a tree diameter value above 43.45 cm; and an 89% probability of bobcat presence in areas under 1,070 m in elevation with a tree diameter value above 43.45 cm. I applied an upper elevation limit of 1,676 meters as a threshold for suitable habitat and only considered habitat suitable in cells with a tree diameter above 29 cm. Three locations stood out as the most suitable areas for reintroduction, combining large amounts of the highest-ranking habitat with the largest aggregations of suitable habitat cells: the area around the William O. Douglas Wilderness, which straddles the border of the Gifford Pinchot National Forest (GPNF) and the Wenatchee National Forest; the Norse Peak Wilderness northeast of Mount Rainier; and Indian Heaven Wilderness in the southern portion of the GPNF.
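The published thresholds translate directly into code; the sketch below transcribes the four leaves of the classification tree described above (boundary handling at the exact threshold values is an assumption).

```python
def bobcat_presence_probability(elevation_m: float, tree_diameter_cm: float) -> float:
    """Decision rules transcribed from the classification tree above. Returns
    the probability of bobcat PRESENCE; the first three leaves are reported as
    absence probabilities in the thesis and converted here as 1 - p."""
    if elevation_m > 1303:
        return 1 - 0.95        # 95% absence above 1,303 m
    if tree_diameter_cm < 43.45:
        return 1 - 0.73        # 73% absence below 1,303 m with small trees
    if elevation_m >= 1070:
        return 1 - 0.57        # 57% absence at 1,070-1,303 m with large trees
    return 0.89                # 89% presence below 1,070 m with large trees

# Example: a low-elevation stand of large trees implies high bobcat presence.
print(bobcat_presence_probability(900, 50.0))   # 0.89
```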
188 |
Video content analysis for intelligent forensics
Fraz, Muhammad January 2014 (has links)
The networks of surveillance cameras installed in public places and private territories continuously record video data with the aim of detecting and preventing unlawful activities. This enhances the importance of video content analysis applications, whether for real-time (i.e. analytic) or post-event (i.e. forensic) analysis. In this thesis, the primary focus is on four key aspects of video content analysis, namely: 1) moving object detection and recognition; 2) correction of colours in video frames and recognition of the colours of moving objects; 3) make and model recognition of vehicles and identification of their type; and 4) detection and recognition of text information in outdoor scenes. To address the first issue, a framework is presented in the first part of the thesis that efficiently detects and recognizes moving objects in videos. The framework targets the problem of object detection in the presence of complex background. The object detection part of the framework relies on a background modelling technique and a novel post-processing step in which the contours of the foreground regions (i.e. moving objects) are refined by classifying edge segments as belonging either to the background or to the foreground region. Further, a novel feature descriptor is devised for the classification of moving objects into humans, vehicles and background. The proposed feature descriptor captures the texture information present in the silhouette of foreground objects. To address the second issue, a framework for the correction and recognition of the true colours of objects in videos is presented, with novel noise reduction, colour enhancement and colour recognition stages. The colour recognition stage makes use of temporal information to reliably recognize the true colours of moving objects across multiple frames. The proposed framework is specifically designed to perform robustly on videos of poor quality caused by surrounding illumination, camera sensor imperfections and artefacts due to high compression. In the third part of the thesis, a framework for vehicle make and model recognition and type identification is presented. As part of this work, a novel feature representation technique for the distinctive representation of vehicle images was developed. The technique uses dense feature description and a mid-level feature encoding scheme to capture the texture in the frontal view of vehicles, and is insensitive to minor in-plane rotation and skew within the image. The proposed framework can be extended to any number of vehicle classes without re-training. Another important contribution of this work is the publication of a comprehensive, up-to-date dataset of vehicle images to support future research in this domain. The problem of text detection and recognition in images is addressed in the last part of the thesis. A novel technique is proposed that exploits the colour information in the image to identify text regions. Apart from detection, the colour information is also used to segment characters from words. The recognition of identified characters is performed using shape features and supervised learning. Finally, a lexicon-based alignment procedure is adopted to finalize the recognition of strings present in word images. Extensive experiments have been conducted on benchmark datasets to analyse the performance of the proposed algorithms.
The results show that the proposed moving object detection and recognition technique outperformed well-known baseline techniques. The proposed framework for the correction and recognition of object colours in video frames achieved all the aforementioned goals. The performance analysis of the vehicle make and model recognition framework on multiple datasets demonstrated the strength and reliability of the technique across various scenarios. Finally, the experimental results for the text detection and recognition framework on benchmark datasets revealed the potential of the proposed scheme for accurate detection and recognition of text in the wild.
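As a generic illustration of the background-modelling approach underlying the object detection framework (not the author's algorithm, which adds an edge-segment refinement step), a minimal OpenCV background-subtraction loop might look like this; the input filename and area threshold are assumptions.

```python
import cv2

# Illustrative background-modelling foreground detection on an assumed input file.
cap = cv2.VideoCapture("surveillance.mp4")
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                       # raw foreground mask
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,        # suppress speckle noise
                            cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5)))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    moving = [c for c in contours if cv2.contourArea(c) > 200]  # object candidates
cap.release()
```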
189 |
Teaching natural philosophy and mathematics at Oxford and Cambridge 1500-1570
Hannam, James January 2008 (has links)
The syllabus in natural philosophy and mathematics was radically changed in the course of the sixteenth century, with new subjects, textbooks and methods introduced. Education became more practical and less dependent on medieval antecedents. Printing technology improved textbooks and made it possible to replace them with newer versions. Following sweeping syllabus reform around 1500, the Cambridge Master of Arts course was heavily slanted towards humanism. The old scholastic textbooks were rejected and replaced with modern authors. The purpose of natural philosophy was explicitly to illuminate the providential work of the creator, especially through natural history (a newly developing subject in the sixteenth century thanks to newly translated and promulgated Greek texts), where examples of God's work were there for all to see. Oxford remained wedded to scholastic texts, although the trivium was reformed along humanistic lines. Cromwell's visitors in 1535 outlawed scholasticism by decree but gave little indication of the alternative (their white list stipulating only Aristotle). The solution adopted by the Oxford masters was to import the Cambridge syllabus and textbooks wholesale. When the evangelical regime of Edward VI reformed the universities in 1549, the humanist natural philosophy syllabus was adjudged appropriate, especially those parts promoted by Philip Melanchthon at the University of Wittenberg. However, the visitors' background at court meant they valued ethics and politics more highly. The Reformation itself left natural philosophy largely unaffected, although the barrier preventing Catholics from entering clerical careers after 1558 appears to have encouraged some to remain philosophers. In mathematics, the 1549 visitation was highly significant. Cambridge University's initiative in 1500 of employing a university lecturer in the subject was in danger of stagnating due to inappropriate appointments. However, John Cheke's statutes in 1549 promoted the use of modern textbooks of practical arithmetic, finance and surveying useful to the centralised Tudor state. He also introduced the new subject of geography as a result of his contacts at court with merchants and explorers. The thesis concludes that during the second half of the sixteenth century, English students could expect a mathematical and philosophical education comparable to that of their Italian peers. This was sufficient to provide graduates with the knowledge they needed to carry these subjects forward in the seventeenth century.
190 |
Ronald Fisher e a eugenia: estatística, evolução e genética na busca da civilização permanente / Ronald Fisher and eugenics: statistics, evolution and genetics in the quest for permanent civilization
Cruz, Rodrigo Andrade da 12 September 2016 (links)
In the beginning of the 20th century, England was the stage for an intense debate on the mechanisms of inheritance and their relationship with evolution. To the biologists known as gradualists, it was impossible to reconcile Mendelian genetics with Darwinian thought. In turn, advocates of saltationism admitted Mendelian inheritance but put the foundations of evolution theory into question. This opposition defines the context for the work of Ronald Fisher, who through the use of statistics was able to bridge the gap between the contending positions and thus contributed to the formulation of the so-called evolutionary synthesis, or Neodarwinism. However, the literature often passes over the fact that the background for the debate and the formulation of the evolutionary synthesis was provided by eugenics, namely, the science that sought to improve humankind through the control of reproduction. Indeed, eugenics caught the attention of Fisher from his youth, and in time he became one of its main developers and advocates. A large part of the theoretical-conceptual grounds of contemporary statistics and population genetics was formulated as a function of Fisher's concerns with the phenomenon he called differential fertility. According to him, the higher reproduction rates of 'inadequate' people by comparison to the 'best stocks' were the main cause of the problems humankind had to deal with all along history and also in his time.