291

Genetic Transformation of Switchgrass (Panicum Virgatum L.) with Endoglucanase Gene and Characterization of Plants with Endoglucanase Transgene

Dere, Madhavi Suresh 24 August 2012 (has links)
As a warm-season grass native to North America, switchgrass is considered one of the most promising biofuel crops in the USA. It is a C4 plant, which makes it energy efficient, and its deep root system allows it to grow on marginal land with low water and nutrient input. Switchgrass has been used as a forage crop, so its use for biofuel will not affect food security. Biofuels are more environmentally friendly than fossil fuels because they produce no net greenhouse gas emissions. However, the high cost of production per unit of biofuel has to be overcome if biofuels are to replace fossil fuels. One of the major factors behind this high cost is the expensive cellulase enzymes used in the pretreatment of feedstock. Endoglucanase is the key enzyme for breaking down cellulose before fermentation. Currently, endoglucanase is produced from engineered E. coli or yeast strains, and enzyme production and purification at industrial scale remain expensive. Expression of endoglucanase in plants has been reported previously; however, there are no reports of transgenic switchgrass producing a cellulase enzyme. In this study, the catalytic domain of a beta-endoglucanase gene was codon-optimized and synthesized based on the cDNA cloned from Hypocrea jecorina. The rice RuBisCO small-subunit targeting signal peptide was fused to the N-terminus of the beta-endoglucanase gene, which was expected to target the fusion protein to the chloroplast. This subcellular targeting could minimize negative effects on cell function and plant development. The endoglucanase gene was cloned under the maize ubiquitin promoter in a modified binary vector, pCambia 1305-2, and transformed into switchgrass genotype HR8 using Agrobacterium tumefaciens. I generated five independent transgenic switchgrass lines, and the presence of the hygromycin and endoglucanase genes was confirmed by growth on the selection agent hygromycin, GUS assay, PCR amplification, and Southern blot hybridization. However, based on RT-PCR analysis, only two transgenic lines were confirmed to produce mRNA of the endoglucanase gene. These two lines were further characterized for their agronomic traits and chlorophyll content. Our results suggested that expression of endoglucanase in switchgrass could reduce chlorophyll content and affect plant development. Nevertheless, this study demonstrated that a fungal endoglucanase gene can be expressed in transgenic switchgrass plants, though the expression level and subcellular localization need to be carefully regulated to minimize the toxic effect of endoglucanase on plant cells. / Master of Science
292

Living Structure for Understanding Human Activity Patterns Using Multi-Source Geospatial Big Data

Ren, Zheng January 2023 (has links)
Geographic space is not neutral or lifeless, but an intricate living structure composed of numerous small features and a few large ones across all scales. The living structure is crucial for comprehending how geographic space shapes human activities. With emerging geospatial big data, researchers now have unprecedented opportunities to study the relationship between geographic space and human behaviour at a finer spatial resolution. This thesis leverages multi-source geospatial big data, including Twitter check-in locations, street networks from OpenStreetMap, building footprints, and night-time light images, to explore the fundamental mechanisms underlying human activities in geographic space. To overcome the limitations of conventional analytics in this era of big data, we propose a topological representation and the living structure based on Christopher Alexander's conception of space. We utilize scaling and topological analyses to reveal the underlying living structure of geographic space with various big datasets. Our results demonstrate that tweet locations or human activities at different scales can be accurately predicted by the underlying living structure of street nodes. We also capture and characterize human activities using big data and find that building footprints and tweets show similar scaling patterns in terms of the sizes of their spatial clusters. We also propose an improved spatial clustering method to increase the processing speed of geospatial big data. Finally, we adopt the topological representation to identify urban centres by fusing multi-source geospatial big data. The living structure, together with its topological representation, can help us better understand human activity patterns in geographic space at both city and country levels.
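As a hedged illustration of the scaling analysis described in this abstract (not code from the thesis), the sketch below applies head/tail breaks — a classification scheme commonly used in living-structure analysis — to a set of cluster sizes; the data and the head-ratio threshold are assumptions made for the example.

```python
# A minimal head/tail breaks sketch (assumed illustration, not the thesis code).
# It recursively splits values at the mean as long as the "head" (values above
# the mean) remains a small minority, the signature of heavy-tailed scaling
# ("far more small things than large ones").

def head_tail_breaks(values, head_ratio_limit=0.4):
    """Return the mean thresholds that partition a heavy-tailed distribution."""
    breaks = []
    current = list(values)
    while len(current) > 1:
        mean = sum(current) / len(current)
        head = [v for v in current if v > mean]
        if not head or len(head) / len(current) > head_ratio_limit:
            break  # distribution no longer heavy-tailed at this level
        breaks.append(mean)
        current = head
    return breaks

if __name__ == "__main__":
    # Hypothetical cluster sizes (e.g., tweets per street-node cluster).
    sizes = [1] * 500 + [5] * 120 + [20] * 30 + [90] * 8 + [400] * 2
    print(head_tail_breaks(sizes))  # each break marks one level of the hierarchy
```

Each returned break marks one hierarchical level, so the number of breaks is a simple proxy for how "living" (deeply hierarchical) the structure is.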
293

Poly(A) Tail Regulation in the Nucleus

Alles, Jonathan 19 May 2022 (has links)
Der Ribonukleinsäure (RNS) Stoffwechsel umfasst verschiedene Schritte, beginnend mit der Transkription der RNS über die Translation bis zum RNS Abbau. Poly(A) Schwänze befinden sich am Ende der meisten Boten-RNS, schützen die RNS vor Abbau und stimulieren die Translation. Die Deadenylierung von Poly(A) Schwänzen limitiert den Abbau von RNS. Bisher wurde RNS Abbau meist im Kontext von cytoplasmatischen Prozessen untersucht; ob und wie RNS Deadenylierung und Abbau im Nukleus erfolgen, ist bisher unklar. Es wurde daher eine neue Methode zur genomweiten Bestimmung von Poly(A) Schwanzlängen entwickelt, welche FLAM-Seq genannt wurde. FLAM-Seq wurde verwendet, um Zelllinien, Organoide und C. elegans RNS zu analysieren, und es wurde eine signifikante Korrelation zwischen 3'-UTR und Poly(A) Länge gefunden, sowie für viele Gene ein Zusammenhang von alternativen 3'-UTR Isoformen und Poly(A) Länge. Die Untersuchung von Poly(A) Schwänzen von nicht-gespleißten RNS Molekülen zeigte, dass deren Poly(A) Schwänze eine Länge von mehr als 200 nt hatten. Die Analyse wurde durch eine Inhibition des Spleiß-Prozesses validiert. Die Verwendung von Methoden zur Markierung von RNS, welche die zeitliche Auflösung der RNS Prozessierung ermöglichen, deutete auf eine Deadenylierung der Poly(A) Schwänze schon wenige Minuten nach deren Synthese hin. Die Analyse von subzellulären Fraktionen zeigte, dass diese initiale Deadenylierung ein Prozess im Nukleus ist. Dieser Prozess ist gen-spezifisch, und Poly(A) Schwänze von bestimmten Typen von Transkripten, wie nuklearen langen nicht-kodierenden RNS Molekülen, waren nicht deadenyliert. Um Enzyme zu identifizieren, welche die Deadenylierung im Zellkern katalysieren, wurden verschiedene Methoden wie RNS-abbauende Cas Systeme, siRNAs oder shRNA Zelllinien verwendet. Trotz einer effizienten Reduktion der RNS Expression entsprechender Enzymkomplexe konnten keine molekularen Phänotypen identifiziert werden, welche die Poly(A) Länge im Zellkern beeinflussen. / The RNA metabolism involves different steps from transcription to translation and decay of messenger RNAs (mRNAs). Most mRNAs have a poly(A) tail attached to their 3'-end, which protects them from degradation and stimulates translation. Removal of the poly(A) tail is the rate-limiting step in RNA decay, controlling stability and translation. It is yet unclear if and to what extent RNA deadenylation occurs in the mammalian nucleus. A novel method for genome-wide determination of poly(A) tail length, termed FLAM-Seq, was developed to overcome current challenges in sequencing mRNAs, enabling genome-wide analysis of complete RNAs, including their poly(A) tail sequence. FLAM-Seq analysis of different model systems uncovered a strong correlation between poly(A) tail and 3'-UTR length or alternative polyadenylation. Cytosine nucleotides were furthermore significantly enriched in poly(A) tails. Analyzing poly(A) tails of unspliced RNAs from FLAM-Seq data revealed the genome-wide synthesis of poly(A) tails with a length of more than 200 nt. This could be validated by splicing inhibition experiments, which uncovered potential links between the completion of splicing and poly(A) tail shortening. Measuring RNA deadenylation kinetics using metabolic labeling experiments hinted at a rapid shortening of tails within minutes. The analysis of subcellular fractions obtained from HeLa cells and mouse brain showed that initial deadenylation is a nuclear process. Nuclear deadenylation is gene-specific, and poly(A) tails of lncRNAs retained in the nucleus were not shortened. To identify enzymes responsible for nuclear deadenylation, RNA-targeting Cas systems, siRNAs, and shRNA cell lines were used to deplete different deadenylase complexes. Despite efficient mRNA knockdown, subcellular analysis of poly(A) tail length did not reveal molecular phenotypes of altered nuclear poly(A) tail length.
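As an illustrative aside (not part of the thesis), the basic quantity measured here — poly(A) tail length — can be read off the 3' end of a full-length cDNA read; the toy function below only counts the uninterrupted terminal A-stretch and ignores the alignment and error modelling a real FLAM-Seq pipeline would need.

```python
# Toy poly(A) tail measurement from the 3' end of a read (assumed illustration).
# Real FLAM-Seq analysis involves full-length sequencing and error-aware tail
# calling; this sketch only counts the uninterrupted terminal A-stretch.

def polya_tail_length(read):
    """Length of the uninterrupted 3'-terminal A-stretch of a read."""
    length = 0
    for base in reversed(read.upper()):
        if base != "A":
            break
        length += 1
    return length

if __name__ == "__main__":
    read = "GATTACAGGCTTAG" + "A" * 150  # hypothetical read body plus a 150-nt tail
    print(polya_tail_length(read))      # -> 150
```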
294

Expansion av e-handelsverksamhet i detaljhandeln : En studie om utmaningar och framgångsfaktorer / Growing e-commerce operations within the retail industry : A study of challenges and success factors

Steen Lagerstam, Nathalie January 2016 (has links)
I vårt allt mer digitaliserade samhälle har internet fått en central roll, och har förändrat många aspekter i vårt vardagsliv. Ett av de områden som påverkats starkt av denna utveckling, är den svenska detaljhandelns e-handelsmarknad. De senaste 10 åren har dess omsättning sexdubblats, och prognoser tyder på att denna expansiva trend kommer att hålla i sig. Detta innebär helt nya förutsättningar för de svenska detaljhandelsföretag som tidigare bara bedrivit försäljning i fysiska butiker, eller bedrivit e-handelsverksamhet i liten skala. Ämnet för denna studie föreslogs av den svenska detaljhandelskedjan Företag X[1], som har beslutat att lansera en e-handel i början av år 2016. Företag X föreslog att författaren skulle kartlägga vad som krävs för att gå från en liten nyetablerad e-handel, till en stor och framgångsrik e-handel som bidrar till företagets lönsamhet och fortsatta välmående. Studiens syfte är således att kartlägga praxis och metoder som möjliggör framgångsrik expansion av e-handeln för medelstora, svenska detaljhandelsföretag. För att uppnå syftet valdes en kvalitativ multifallstudie-design. Information till den teoretiska referensramen samlades in från vetenskapliga artiklar och böcker, och kompletterades med sekundärempiri i form av rapporter från företag. Kvalitativ primärdata samlades sedan in från intervjuer med tre fallföretag, som hittades genom en mindre tvärsnittsstudie av 20 konkurrenter till Företag X där företagens EBITDA-marginal de senaste fem åren studerades. Studiens resultat visade att den svenska detaljhandelns e-handelsmarknad har sett en mycket expansiv trend de senaste åren, och att denna ser ut att hålla i sig under det kommande decenniet. Detta innebär att marknadsklimatet håller på att förändras och går mot att bli allt mer digitalt, vilket kräver att företagen ser över sina försäljningsorganisationer. Den största trenden på marknaden kallas ”omnikanalhandel”, och innebär i korthet att företaget integrerar sin butiks- och e-handelsförsäljning för att möjliggöra en sömlös köpupplevelse för kunden. Denna trend har framträtt som ett svar på ett nytt köpmönster som noterats bland kunderna, där dessa rör sig mellan företagens digitala och fysiska försäljningskanaler under köpresan. Studien pekar på att de e-handelsrelaterade frågor som är viktigast att adressera och arbeta med finns inom följande områden: Kundfokus (IT, kundrelationer, lyhördhet mot kund), Kostnadseffektivitet (supply chain och logistik, strategi och styrning) samt Motståndshantering (utbildning, samspel med butiker). Studien utgör författarens examensarbete i utbildningen Industriell Ekonomi vid Linköpings Tekniska Högskola. [1] Företag X heter egentligen något annat, men har bett att få vara anonyma i denna studie. / The Internet has become a central part of modern society and has changed our everyday life in many ways. An area that has been strongly affected by this development is the Swedish retail e-commerce market. E-commerce sales have increased six-fold over the past decade, and forecasts indicate that this expansive trend will continue. This trend poses significant challenges for Swedish retail companies with a legacy of exclusively carrying out their sales in the traditional brick-and-mortar format. The subject of this study was originally proposed by the Swedish retail company Företag X[1], who have decided to embark on their e-commerce initiative in early 2016. Företag X asked the author of this report to help find out what it takes to develop a small, newly launched e-commerce store into a successful part of the company that contributes to its overall profitability. The purpose of this study is therefore to identify practices and methods that enable successful expansion of e-commerce for medium-sized Swedish retail companies. To achieve this purpose, a qualitative multi-case study design was chosen. Information for the theoretical framework was collected from articles in scientific journals and books, and supplemented with secondary empirical data in the form of company reports. Qualitative primary data were then collected through interviews with three case companies, which were found through a small cross-sectional study of 20 competitors to Företag X in which their EBITDA margins over the past five years were studied. The results showed an expansive trend on the Swedish retail e-commerce market, and this trend seems set to continue over the next decade. As a consequence, the market climate is shifting towards becoming increasingly digital, which requires companies to review their sales organizations. The main trend in the market is called “omni-channel retailing” and means that a company integrates its brick-and-mortar and e-commerce sales to enable a seamless shopping experience for the customer. This trend has emerged as a response to new buying patterns observed among customers, who move between a company’s digital and physical sales channels during the purchase journey. The study indicates that the e-commerce-related issues that are most important to address and work with fall in the following areas: Customer focus (IT, customer relations, responsiveness to customers), Cost effectiveness (supply chain and logistics, strategy and governance), and Resistance management (education, synchronization between the brick-and-mortar stores and the e-commerce store). This study represents the author’s master’s thesis in the Industrial Engineering and Management program at Linköping University, Sweden. [1] Företag X are known by a different name but have asked to remain anonymous in this study.
295

Modelování přírodních katastrof v pojišťovnictví / Modelling natural catastrophes in insurance

Varvařovský, Václav January 2009 (has links)
Quantification of risks is one of the pillars of the contemporary insurance industry. Natural catastrophes and their modelling represent one of the most important areas of non-life insurance in the Czech Republic. One of the key inputs of catastrophe models is the spatial dependence structure in the portfolio of an insurance company. Copulas represent a more general view of dependence structures and broaden the classical approach, which implicitly uses the dependence structure of a multivariate normal distribution. Given the absence of comprehensive monographs on the topic in the Czech Republic, one goal of this work is to provide a theoretical basis for the use of copulas. It focuses on general properties of copulas and on the specifics of the two most commonly used families of copulas -- Archimedean and elliptical. The other goal is to quantify, for modelled flood losses in the Czech Republic, the difference between a given copula and the classical approach based on the dependence structure of a multivariate normal distribution. Results are largely dependent on the scale of losses in individual areas. If the areas have approximately a "tower" structure (i.e., one area significantly outweighs the others), the effect of changing the dependence structure relative to the classical approach is between 5 and 10% (up or down, depending on the copula) at the 99.5th percentile of original losses (a return period of once in 200 years). If all areas are approximately similarly distributed, the difference attributable to the dependence structure can be up to 30%, which is an important difference when buying the most common form of reinsurance -- an excess of loss treaty. The classical approach has an indisputable advantage in the simplicity with which data can be generated. Although Archimedean copulas have a simple form, generating from them becomes considerably more complex as the number of dimensions grows. For these reasons, it is worth checking whether the conditions -- similarly distributed areas and not too high a dimensionality -- are fulfilled before general forms of dependence are applied.
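As a hedged illustration of the contrast discussed above (not code from the thesis), the sketch below draws samples from a Gaussian copula and from a Clayton copula -- an Archimedean family with lower-tail dependence -- matched to the same Kendall's tau; the sample size and tau are assumptions made for the example.

```python
# A minimal sketch (assumed illustration, not the thesis code) comparing a
# Gaussian copula with a Clayton (Archimedean) copula at the same Kendall's tau.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n, tau = 200_000, 0.5

# Gaussian copula: tau = (2/pi) * arcsin(rho)  =>  rho = sin(pi * tau / 2)
rho = np.sin(np.pi * tau / 2)
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
u_gauss = norm.cdf(z)

# Clayton copula via the gamma-frailty (Marshall-Olkin) construction:
# tau = theta / (theta + 2)  =>  theta = 2 * tau / (1 - tau)
theta = 2 * tau / (1 - tau)
v = rng.gamma(shape=1 / theta, scale=1.0, size=n)
e = rng.uniform(size=(n, 2))
u_clayton = (1 - np.log(e) / v[:, None]) ** (-1 / theta)

# Lower-tail co-exceedance P(U1 < q and U2 < q) / q, a rough tail-dependence proxy.
q = 0.01
for name, u in [("Gaussian", u_gauss), ("Clayton", u_clayton)]:
    joint = np.mean((u[:, 0] < q) & (u[:, 1] < q)) / q
    print(f"{name:8s} lower-tail co-exceedance at q={q}: {joint:.2f}")
# The Clayton copula shows markedly stronger lower-tail clustering than the
# Gaussian copula, even though both are calibrated to the same Kendall's tau.
```

The point of the comparison mirrors the abstract: two dependence structures with the same overall association can imply very different joint behaviour at the 99.5th percentile, which is exactly where reinsurance decisions are made.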
296

保險公司因應死亡率風險之避險策略 / Hedging strategy against mortality risk for insurance company

莊晉國, Chuang, Chin Kuo Unknown Date (has links)
本篇論文主要討論在死亡率改善不確定性之下的避險策略。當保險公司負債面的人壽保單是比年金商品來得多的時候，公司會處於死亡率的風險之下。我們假設死亡率和利率都是隨機的情況，部分的死亡率風險可以經由自然避險而消除，而剩下的死亡率風險和利率風險則由零息債券和保單貼現商品來達到最適避險效果。我們考慮mean variance、VaR和CTE當成目標函數時的避險策略，其中在mean variance的最適避險策略可以導出公式解。由數值結果我們可以得知保單貼現的確是死亡率風險的有效避險工具。 / This paper proposes hedging strategies to deal with the uncertainty of mortality improvement. When an insurance company has more life insurance contracts than annuities on its liability side, it is exposed to mortality risk. We assume that both mortality and interest rates are stochastic. Part of the mortality risk is eliminated by natural hedging, and the remaining mortality risk and interest rate risk are optimally hedged with zero-coupon bonds and life settlement contracts. We consider hedging strategies with objective functions based on mean-variance, Value at Risk, and conditional tail expectation. A closed-form optimal hedging formula is derived under the mean-variance objective, and the numerical results show that life settlements are indeed an effective hedging instrument against mortality risk.
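As a generic illustration of a mean-variance hedge with two instruments (a sketch under made-up covariances, not the closed-form formula derived in the thesis), the minimum-variance hedge weights solve a small linear system:

```python
# Minimum-variance hedge of a liability L with two hedge instruments H1, H2
# (a generic sketch with hypothetical covariances, e.g. a zero-coupon bond and
# a life settlement; not the thesis's derived formula).
# Weights w minimise Var(L - w1*H1 - w2*H2)  =>  w = Sigma_H^{-1} * Cov(H, L).
import numpy as np

cov_H = np.array([[0.040, 0.006],   # covariance matrix of the hedge instruments
                  [0.006, 0.090]])
cov_HL = np.array([0.012, 0.045])   # covariance of each instrument with the liability

w = np.linalg.solve(cov_H, cov_HL)
print("hedge weights:", w)

var_L = 0.100                        # unhedged liability variance (assumed)
var_hedged = var_L - cov_HL @ w      # residual variance after hedging
print("variance reduction: {:.0%}".format(1 - var_hedged / var_L))
```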
297

Essays on tail risk in macroeconomics and finance: measurement and forecasting

Ricci, Lorenzo 13 February 2017 (has links)
This thesis is composed of three chapters that propose novel approaches to tail risk in financial markets and to forecasting in finance and macroeconomics. The first part of the dissertation focuses on financial market correlations and introduces a simple measure of tail correlation, TailCoR, while the second contribution addresses the identification of non-normal structural shocks in Vector Autoregressions, which are common in finance. The third part belongs to the vast literature on predicting economic growth; the problem is tackled using a Bayesian dynamic factor model to predict Norwegian GDP.
Chapter I: TailCoR
The first chapter introduces a simple measure of tail correlation, TailCoR, which disentangles linear and non-linear correlation. The aim is to capture all features of financial market co-movement when extreme events (i.e. financial crises) occur. Indeed, tail correlations may arise because asset prices are either linearly correlated (i.e. the Pearson correlations are different from zero) or non-linearly correlated, meaning that asset prices are dependent at the tails of the distribution. Since it is based on quantiles, TailCoR has three main advantages: i) it is not based on asymptotic arguments, ii) it is very general as it applies with no specific distributional assumption, and iii) it is simple to use. We show that TailCoR also disentangles easily between linear and non-linear correlations. The measure has been successfully tested on simulated data. Several extensions useful for practitioners are presented, such as downside and upside tail correlations. In our empirical analysis, we apply this measure to eight major US banks for the period 2003-2012. For comparison purposes, we compute the upper and lower exceedance correlations and the parametric and non-parametric tail dependence coefficients. On the overall sample, results show that both the linear and non-linear contributions are relevant. The results suggest that co-movement increases during the financial crisis because of both the linear and non-linear correlations. Furthermore, the increase of TailCoR at the end of 2012 is mostly driven by the non-linearity, reflecting the risks of tail events and their spillovers associated with the European sovereign debt crisis.
Chapter II: On the identification of non-normal shocks in structural VAR
The second chapter deals with the structural interpretation of the VAR using the statistical properties of the innovation terms. In general, financial markets are characterized by non-normal shocks. Under non-Gaussianity, we introduce a methodology based on the reduction of tail dependency to identify the non-normal structural shocks. Borrowing from statistics, the methodology can be summarized in two main steps: i) decorrelate the estimated residuals and ii) rotate the uncorrelated residuals in order to obtain a vector of independent shocks using a tail dependency matrix. We do not label the shocks a priori, but label them post-estimation on the basis of economic judgement. Furthermore, we show through a Monte Carlo study how our approach allows all the shocks to be identified. In some cases, the method turns out to be more effective when tail events are more prevalent. Therefore, the frequency of the series and the degree of non-normality are relevant for achieving accurate identification. Finally, we apply our method to two different VARs, both estimated on US data: i) a monthly trivariate model which studies the effects of oil market shocks, and ii) a VAR that focuses on the interaction between monetary policy and the stock market. In the first case, we validate the results obtained in the economic literature. In the second case, we cannot confirm the validity of an identification scheme based on a combination of short- and long-run restrictions that is used in part of the empirical literature.
Chapter III: Nowcasting Norway
The third chapter consists of predictions of Norwegian mainland GDP. Policy institutions have to set their policies without knowledge of the current economic conditions. We estimate a Bayesian dynamic factor model (BDFM) on a panel of macroeconomic variables (all followed by market operators) from 1990 until 2011. First, the BDFM is an extension of the dynamic factor model (DFM) to the Bayesian framework. The difference is that, compared with a DFM, the BDFM contains more dynamics, introduced in order to accommodate the dynamic heterogeneity of different variables. However, in order to introduce more dynamics, the BDFM requires estimating a large number of parameters, which can easily lead to volatile predictions due to estimation uncertainty. This is why the model is estimated with Bayesian methods, which limit estimation uncertainty by shrinking the factor model toward a simple naive prior model. The second aspect is the use of a small dataset. A common feature of the literature on DFMs is the use of large datasets. However, there is a literature that has shown how, for the purpose of forecasting, DFMs can be estimated on a small number of appropriately selected variables. Finally, through a pseudo real-time exercise, we show that the BDFM performs well both in terms of point forecasts and in terms of density forecasts. Results indicate that our model outperforms standard univariate benchmark models, performs as well as the Bloomberg Survey, and outperforms the predictions published by the Norges Bank in its monetary policy report. / Doctorat en Sciences économiques et de gestion / info:eu-repo/semantics/nonPublished
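As a hedged illustration related to the comparison measures mentioned in Chapter I (a sketch under simulated data, not the thesis's TailCoR implementation), upper and lower exceedance correlations can be computed as plain Pearson correlations restricted to joint tail observations:

```python
# Empirical lower/upper exceedance correlations for two return series
# (an assumed illustration; TailCoR itself is defined differently in the thesis).
import numpy as np

def exceedance_corr(x, y, q, lower=True):
    """Pearson correlation of (x, y) restricted to joint exceedances of quantile q."""
    qx, qy = np.quantile(x, q), np.quantile(y, q)
    mask = (x <= qx) & (y <= qy) if lower else (x >= qx) & (y >= qy)
    if mask.sum() < 10:
        return np.nan  # too few joint tail observations to estimate a correlation
    return np.corrcoef(x[mask], y[mask])[0, 1]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Hypothetical daily returns of two bank stocks with a common fat-tailed factor.
    common = rng.standard_t(df=3, size=5000)
    x = 0.7 * common + 0.7 * rng.standard_t(df=3, size=5000)
    y = 0.7 * common + 0.7 * rng.standard_t(df=3, size=5000)
    print("lower 10% exceedance corr:", round(exceedance_corr(x, y, 0.10, lower=True), 2))
    print("upper 90% exceedance corr:", round(exceedance_corr(x, y, 0.90, lower=False), 2))
```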
298

Tail Risk Protection via reproducible data-adaptive strategies

Spilak, Bruno 15 February 2024 (has links)
Die Dissertation untersucht das Potenzial von Machine-Learning-Methoden zur Verwaltung von Schwanzrisiken in nicht-stationären und hochdimensionalen Umgebungen. Dazu vergleichen wir auf robuste Weise datenabhängige Ansätze aus parametrischer oder nicht-parametrischer Statistik mit datenadaptiven Methoden. Da datengetriebene Methoden reproduzierbar sein müssen, um Vertrauen und Transparenz zu gewährleisten, schlagen wir zunächst eine neue Plattform namens Quantinar vor, die einen neuen Standard für wissenschaftliche Veröffentlichungen setzen soll. Im zweiten Kapitel werden parametrische, lokale parametrische und nicht-parametrische Methoden verglichen, um eine dynamische Handelsstrategie für den Schutz vor Schwanzrisiken in Bitcoin zu entwickeln. Das dritte Kapitel präsentiert die Portfolio-Allokationsmethode NMFRB, die durch eine Dimensionsreduktionstechnik hohe Dimensionen bewältigt. Im Vergleich zu klassischen Machine-Learning-Methoden zeigt NMFRB in zwei Universen überlegene risikobereinigte Renditen. Das letzte Kapitel kombiniert bisherige Ansätze zu einer Schwanzrisikoschutzstrategie für Portfolios. Die erweiterte NMFRB berücksichtigt Schwanzrisikomaße, behandelt nicht-lineare Beziehungen zwischen Vermögenswerten während Schwanzereignissen und entwickelt eine dynamische Schwanzrisikoschutzstrategie unter Berücksichtigung der Nicht-Stationarität der Vermögensrenditen. Die vorgestellte Strategie reduziert erfolgreich große Drawdowns und übertrifft andere moderne Schwanzrisikoschutzstrategien wie die Value-at-Risk-Spread-Strategie. Die Ergebnisse werden durch verschiedene Data-Snooping-Tests überprüft. / This dissertation shows the potential of machine learning methods for managing tail risk in a non-stationary and high-dimensional setting. For this, we compare in a robust manner data-dependent approaches from parametric or non-parametric statistics with data-adaptive methods. As these methods need to be reproducible to ensure trust and transparency, we start by proposing a new platform called Quantinar, which aims to set a new standard for academic publications. In the second chapter, we dive into the core subject of this thesis, which compares various parametric, local parametric, and non-parametric methods to create a dynamic trading strategy that protects against tail risk in the Bitcoin cryptocurrency. In the third chapter, we propose a new portfolio allocation method, called NMFRB, that deals with high dimensions thanks to a dimension reduction technique, convex Non-negative Matrix Factorization. This technique allows us to find latent interpretable portfolios that are diversified out-of-sample. We show in two universes that the proposed method outperforms other classical machine-learning-based methods, such as Hierarchical Risk Parity (HRP), in terms of risk-adjusted returns. We also test the robustness of our results via Monte Carlo simulation. Finally, the last chapter combines our previous approaches to develop a tail-risk protection strategy for portfolios: we extend the NMFRB to tail-risk measures; we address the non-linear relationships between assets during tail events by developing a specific non-linear latent factor model; and we develop a dynamic tail-risk protection strategy that deals with the non-stationarity of asset returns using classical econometric models. We show that our strategy successfully reduces large drawdowns and outperforms other modern tail-risk protection strategies such as the Value-at-Risk-spread strategy. We verify our findings by performing various data snooping tests.
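As a generic illustration of a dynamic tail-risk protection rule (a sketch with assumed thresholds and simulated returns, not the strategy developed in the thesis), exposure can be scaled down whenever a rolling historical VaR estimate breaches a risk budget:

```python
# A toy dynamic tail-risk protection rule (assumed illustration, not the thesis
# strategy): hold the risky asset while rolling 1% historical VaR stays within a
# budget, otherwise move mostly to cash.
import numpy as np

def protected_weights(returns, window=250, var_level=0.01, var_budget=0.035):
    """Risky-asset weight per day: 1.0 inside the risk budget, 0.2 when breached."""
    weights = np.ones_like(returns)
    for t in range(window, len(returns)):
        hist = returns[t - window:t]
        var_t = -np.quantile(hist, var_level)   # reported as a positive loss
        weights[t] = 1.0 if var_t <= var_budget else 0.2
    return weights

def max_drawdown(r):
    cum = np.cumsum(r)                          # additive cumulative return
    return float(np.min(cum - np.maximum.accumulate(cum)))

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    rets = rng.standard_t(df=4, size=2000) * 0.01   # hypothetical daily returns
    w = protected_weights(rets)
    strat = w * rets                                 # transaction costs ignored
    print("max drawdown, unprotected vs protected:",
          round(max_drawdown(rets), 2), round(max_drawdown(strat), 2))
```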
299

The development of the ethmoidal region of Ascaphus truei (Stejneger)

Baard, E. H. W. (Ernst Hendrik Wolfaardt) 12 1900 (has links)
Thesis (MSc)--University of Stellenbosch, 1982. / ENGLISH ABSTRACT: This study deals with the development of the ethmoidal region of Ascaphus truei Stejneger, with special attention to the development of the nasal sacs, the diverticulum medium and Jacobson's organ, in order to gather more information regarding the phylogeny of these structures. The opinions concerning the phylogenetic migration of Jacobson's organ are corroborated by the development of the organ in Ascaphus. The possible origin of the diverticulum medium from the nasal end of the nasolacrimal duct is also commented on. / No Afrikaans abstract available
300

衡量銀行市場風險-VaR與ETL模型的應用 / Measuring bank market risk: an application of VaR and ETL models

陳嘉敏, Chen, Jia Min Unknown Date (has links)
本文提出了一個新興風險衡量的工具的概念-期望尾端損失值(ETL)，其有別於風險值為百分位數且未考慮報酬分配的尾部風險(Tail Risk)，本研究期望能透過ETL的估計可以更完整表達投資組合所有可能面臨的風險，對於市場風險能更有效控管。 本文實證討論有關VaR與ETL穩定度的部分，VaR雖然在理論上證明無法滿足次可加性這個條件，但是在本研究實證中，即使在分配具厚尾狀況下，VaR仍滿足次加性的性質。這也表示，我們在現實生活中很難因VaR理論上缺乏次可加性，而捨棄VaR這個風險衡量工具，然ETL也有其貢獻性，其較VaR多考慮尾部資訊，可視為風險值外另一參考指標，此為本文貢獻一。 本文實證也探討移動窗口中歷史資料長度的不同，是否造成VaR與ETL估算準確性的差異，本文由實證結果發現：在歷史窗口的資料長度越長(1000日)下，並沒有正確預估VaR與ETL，而本研究中以移動窗口為500日下，使用內部模型較具正確性，故在使用風險值模型時，應謹慎選擇移動窗口之長度，此為本文貢獻二。 / This thesis introduces an emerging risk measure, the expected tail loss (ETL). Unlike Value at Risk (VaR), which is a percentile measure and ignores the tail risk of the return distribution, ETL estimation is expected to capture more completely the risks a portfolio may face and thereby support more effective control of market risk. The empirical part first examines the stability of VaR and ETL. Although VaR has been shown in theory not to satisfy subadditivity, the empirical results show that VaR remains subadditive even when the return distributions are heavy-tailed. This suggests that, in practice, the theoretical lack of subadditivity is hardly a reason to abandon VaR as a risk measure; ETL nevertheless adds value because it incorporates tail information beyond VaR and can serve as a complementary indicator. This is the first contribution of the thesis. The empirical analysis also examines whether the length of historical data in the rolling window affects the accuracy of VaR and ETL estimates. The results show that a longer window (1,000 days) does not yield accurate VaR and ETL forecasts, whereas internal models based on a 500-day rolling window are more accurate; the window length should therefore be chosen carefully when applying VaR models. This is the second contribution of the thesis.
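As a hedged illustration of the measures compared above (a sketch with simulated returns, not the thesis's internal model), historical VaR and ETL can be estimated from a fixed window of losses, and subadditivity can be checked directly on a two-asset portfolio:

```python
# Historical VaR and ETL (expected tail loss / expected shortfall) from a 500-day
# window, plus a direct subadditivity check (assumed illustration with simulated losses).
import numpy as np

def hist_var(losses, alpha=0.99):
    """Historical VaR: the alpha-quantile of the loss distribution."""
    return np.quantile(losses, alpha)

def hist_etl(losses, alpha=0.99):
    """Historical ETL: the average loss at or beyond the alpha-quantile."""
    var = hist_var(losses, alpha)
    return losses[losses >= var].mean()

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    # Hypothetical daily losses (positive = loss) of two positions, 500-day window.
    a = rng.standard_t(df=4, size=500) * 0.01
    b = 0.5 * a + rng.standard_t(df=4, size=500) * 0.008
    port = a + b

    for name, x in [("asset A", a), ("asset B", b), ("portfolio", port)]:
        print(f"{name:9s} VaR99={hist_var(x):.4f}  ETL99={hist_etl(x):.4f}")

    # Subadditivity holds empirically if VaR(A+B) <= VaR(A) + VaR(B);
    # ETL is a coherent (subadditive) risk measure for continuous distributions.
    print("VaR subadditive here:", hist_var(port) <= hist_var(a) + hist_var(b))
```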
