291 |
Living Structure for Understanding Human Activity Patterns Using Multi-Source Geospatial Big Data / Ren, Zheng, January 2023 (has links)
Geographic space is not neutral or lifeless, but an intricate living structure composed of numerous small features and a few large ones across all scales. The living structure is crucial for comprehending how geographic space shapes human activities. With emerging geospatial big data, researchers now have unprecedented opportunities to study the relationship between geographic space and human behaviour at a finer spatial resolution. This thesis leverages multi-source geospatial big data, including Twitter check-in locations, street networks from OpenStreetMap, building footprints, and night-time light images, to explore the fundamental mechanisms of human activities that underlie geographic space. To overcome the limitations of conventional analytics in this era of big data, we propose the topological representation and living structure based on Christopher Alexander's conception of space. We utilize scaling and topological analyses to reveal the underlying living structure of geographic space with various big datasets. Our results demonstrate that tweet locations or human activities at different scales can be accurately predicted by the underlying living structure of street nodes. We also capture and characterize human activities using big data and find that building footprints and tweets show similar scaling patterns in terms of the sizes of their spatial clusters. We further propose an improved spatial clustering method to increase the processing speed of geospatial big data. Finally, we adopt the topological representation to identify urban centres through the fusion of multi-source geospatial big data. The living structure, together with its topological representation, can help us better understand human activity patterns in geographic space at both city and country levels.
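The scaling analysis behind living structure in this line of work is often operationalized with head/tail breaks: a heavy-tailed dataset is recursively split at its mean into a small "head" and a large "tail", and the number of recursions indicates how many hierarchical scales the data contains. The abstract does not state the exact algorithm used, so the sketch below is only an illustration of the general idea (the 40% head threshold is a common convention, not a detail from the thesis):

```python
import numpy as np

def head_tail_breaks(values, head_limit=0.4, breaks=None):
    """Recursively split heavy-tailed data at the mean.

    Returns the list of mean values used as class breaks; the number of
    breaks indicates how many hierarchical levels ("scales") were found.
    """
    if breaks is None:
        breaks = []
    values = np.asarray(values, dtype=float)
    if values.size < 2:
        return breaks
    mean = values.mean()
    head = values[values > mean]
    # Stop when the head is no longer a small minority of the data.
    if head.size == 0 or head.size / values.size > head_limit:
        return breaks
    breaks.append(mean)
    return head_tail_breaks(head, head_limit, breaks)

# Example: a Zipf-like distribution of feature sizes (far more small
# things than large ones) yields several hierarchical levels.
sizes = 1000.0 / np.arange(1, 201)   # 1000, 500, 333.3, ...
levels = head_tail_breaks(sizes)
```

On such data the recursion terminates after a few levels, each break sitting well above the previous one, mirroring the "numerous small features and a few large ones" pattern described above.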
|
292 |
[pt] VALOR EM RISCO: UMA COMPARAÇÃO ENTRE MÉTODOS DE ESCOLHA DA FRAÇÃO AMOSTRAL NA ESTIMAÇÃO DO ÍNDICE DE CAUDA DE DISTRIBUIÇÕES GEV / [en] VALUE AT RISK: A COMPARISON OF METHODS TO CHOOSE THE SAMPLE FRACTION IN TAIL INDEX ESTIMATION OF GENERALIZED EXTREME VALUE DISTRIBUTION / CHRISTIAM MIGUEL GONZALES CHAVEZ, 28 August 2002 (has links)
[en] Value at Risk (VaR) is already part of the toolkit of financial analysts assessing market risk. In order to implement VaR, one needs to estimate low quantiles of the portfolio returns distribution. Traditional methodologies combine a normal conditional distribution with ARCH-type models to accomplish this goal. Albeit successful in evaluating risk for typical periods, this methodology has not been able to accommodate events that occur with very low probabilities. For these situations one needs conditional distributions with excess kurtosis. The use of distributions derived from Extreme Value Theory (EVT), collectively known as the Generalized Extreme Value (GEV) distribution, together with ARCH-type models has made it possible to address this problem in a proper framework. A key parameter in the GEV distribution is the tail index, which can be estimated by Hill's estimator. Hill's estimator is very sensitive, in terms of bias and RMSE, to the sample fraction that is used in its estimation. The objective of this dissertation is to compare three methods recently suggested in the statistical literature: the double bootstrap method (Danielsson, de Haan, Peng and de Vries, 1999), the threshold method (Guillou and Hall, 2001) and the alternative Hill plot (Drees, de Haan and Resnick, 2000). The methods have been evaluated with respect to the conditional coverage test of Christoffersen (1998), which has been applied to the following return series: NASDAQ, NIKKEI, MERVAL and IBOVESPA. Our empirical findings suggest that, overall, the three methods have the same performance, with some advantage of the bootstrap and threshold methods over the alternative Hill plot, which has a normative component in the determination of the optimal tail index.
|
293 |
Expansion av e-handelsverksamhet i detaljhandeln : En studie om utmaningar och framgångsfaktorer / Growing e-commerce operations within the retail industry : A study of challenges and success factors / Steen Lagerstam, Nathalie, January 2016 (has links)
The Internet has become a central part of modern society and has changed our everyday life in many ways. An area that has been strongly affected by this development is the Swedish retail e-commerce market. E-commerce sales have increased six-fold over the past decade, and forecasts indicate that this expansive trend will continue. This places significant challenges on Swedish retail companies with a legacy of carrying out their sales exclusively in the traditional brick-and-mortar format. The subject of this study was originally proposed by the Swedish retail company Företag X[1], which has decided to embark on an e-commerce initiative in early 2016. Företag X asked the author of this report to help find out what it takes to develop a small e-commerce store in such a way that it becomes a successful part of the company, contributing to the company's overall profitability. The purpose of this study is therefore to identify practices and methods that enable successful expansion of e-commerce for medium-sized Swedish retail companies.
In order to achieve this purpose, a qualitative multi-case study design was chosen. Information for the theoretical framework was collected from books and articles in scientific journals, and supplemented with secondary empirical data in the form of company reports. Qualitative primary data were then collected through interviews with three case companies, which were found through a small cross-sectional study of 20 competitors to Företag X in which their EBITDA margins over the past five years were studied. The results showed an expansive trend in the Swedish retail e-commerce market, and this trend seems set to continue over the next decade. As a consequence, the market climate is changing and becoming increasingly digital, which requires companies to review their sales organizations. The main trend in the market is called "omni-channel retailing", meaning that a company integrates its brick-and-mortar and e-commerce sales to enable a seamless shopping experience for the customer. This trend has emerged as a response to a new buying pattern observed among customers, who move between a company's digital and physical sales channels during the purchase journey. The study indicates that the most important e-commerce-related issues to address fall within the following areas: customer focus (IT, customer relations, responsiveness to customers), cost effectiveness (supply chain and logistics, strategy and governance) and resistance management (education, synchronization between the brick-and-mortar stores and the e-commerce store). This study represents the author's master's thesis in the Industrial Engineering and Management program at Linköping University, Sweden. [1] Företag X is known by a different name, but has asked to remain anonymous in this study.
|
294 |
Modelování přírodních katastrof v pojišťovnictví / Modelling natural catastrophes in insurance / Varvařovský, Václav, January 2009 (has links)
Quantification of risks is one of the pillars of the contemporary insurance industry. Natural catastrophes and their modelling represent one of the most important areas of non-life insurance in the Czech Republic. One of the key inputs of catastrophe models is the spatial dependence structure in the portfolio of an insurance company. Copulas offer a more general view of dependence structures and broaden the classical approach, which implicitly uses the dependence structure of a multivariate normal distribution. The first goal of this work, given the absence of comprehensive monographs on the subject in the Czech Republic, is to provide a theoretical basis for the use of copulas. It focuses on general properties of copulas and on the specifics of the two most commonly used families, Archimedean and elliptical. The second goal is to quantify, for modelled flood losses in the Czech Republic, the difference between a given copula and the classical approach based on the dependence structure of a multivariate normal distribution. The results depend largely on the scale of losses in the individual areas. If the areas have an approximately "tower" structure (i.e., one area significantly outweighs the others), the effect of a change in the dependency structure relative to the classical approach is between 5 and 10% (up or down depending on the copula) at the 99.5th percentile of original losses (a return period of once in 200 years). If all areas have approximately similar loss distributions, the difference owing to the dependency structure can be up to 30%, an important difference when buying the most common form of reinsurance, the excess-of-loss treaty. The classical approach has an indisputable advantage in the simplicity with which data can be generated. Archimedean copulas, despite their simple functional form, become considerably harder to generate as the number of dimensions grows. For these reasons, it is worth checking whether the conditions of similarly distributed areas and moderate dimensionality are fulfilled before general forms of dependence are applied.
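In low dimensions, generating an Archimedean copula is indeed straightforward. A common route is the Marshall-Olkin mixture construction, sketched below for a Clayton copula; the parameter values are illustrative and not taken from the thesis:

```python
import numpy as np
from scipy.stats import kendalltau

def sample_clayton(n, theta, d=2, seed=0):
    """Sample n points from a d-dimensional Clayton copula via the
    Marshall-Olkin construction, psi(t) = (1 + t)**(-1/theta)."""
    rng = np.random.default_rng(seed)
    v = rng.gamma(1.0 / theta, 1.0, size=(n, 1))   # frailty variable
    e = rng.exponential(size=(n, d))               # iid Exp(1) shocks
    return (1.0 + e / v) ** (-1.0 / theta)

u = sample_clayton(20000, theta=2.0)
# For Clayton, Kendall's tau = theta / (theta + 2) = 0.5 at theta = 2.
tau, _ = kendalltau(u[:, 0], u[:, 1])
```

Note that Clayton concentrates dependence in the lower tail; for joint large losses one would typically use its survival version (1 - u), and the margins of the loss distributions would be applied through the inverse CDFs.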
|
295 |
保險公司因應死亡率風險之避險策略 / Hedging strategy against mortality risk for insurance company / 莊晉國, Chuang, Chin Kuo, Unknown Date (has links)
This paper proposes hedging strategies to deal with the uncertainty of mortality improvement. When an insurance company has more life insurance contracts than annuities on the liability side, it is exposed to mortality risk. We assume that both mortality and interest rate risk are stochastic. Part of the mortality risk is eliminated by natural hedging, and the remaining mortality risk and interest rate risk are optimally hedged with zero-coupon bonds and life settlement contracts. We consider hedging strategies with objective functions of mean-variance, Value at Risk and Conditional Tail Expectation. A closed-form optimal hedging formula is derived under the mean-variance assumption, and the numerical results show that life settlements are indeed an effective hedging instrument against mortality risk.
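The natural-hedging idea, that mortality improvement moves life insurance and annuity liabilities in opposite directions, can be illustrated with a deliberately simple deterministic example. The flat mortality and interest rates below are assumptions for illustration only; the thesis models both as stochastic:

```python
def life_pv(q, r, term=30):
    """PV of a unit benefit paid at the end of the year of death,
    with a flat annual death probability q and discount rate r."""
    v, surv, pv = 1.0 / (1.0 + r), 1.0, 0.0
    for t in range(1, term + 1):
        pv += v ** t * surv * q          # die in year t, discounted
        surv *= 1.0 - q
    return pv

def annuity_pv(q, r, term=30):
    """PV of a unit annuity paid at the end of each year while alive."""
    v, surv, pv = 1.0 / (1.0 + r), 1.0, 0.0
    for t in range(1, term + 1):
        surv *= 1.0 - q
        pv += v ** t * surv
    return pv

q, r = 0.02, 0.03
q_improved = 0.9 * q                      # a 10% mortality improvement
d_life = life_pv(q_improved, r) - life_pv(q, r)       # liability falls
d_ann = annuity_pv(q_improved, r) - annuity_pv(q, r)  # liability rises
# A book holding both products in the right proportion offsets the shock:
mix = d_life + (-d_life / d_ann) * d_ann              # fully offset here
```

With more life business than annuities, as in the setting above, the offset is only partial, which is precisely the residual mortality risk left for the zero-coupon bonds and life settlements to hedge.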
|
296 |
Essays on tail risk in macroeconomics and finance: measurement and forecasting / Ricci, Lorenzo, 13 February 2017 (has links)
This thesis is composed of three chapters that propose novel approaches to tail risk in financial markets and to forecasting in finance and macroeconomics. The first part of this dissertation focuses on financial market correlations and introduces a simple measure of tail correlation, TailCoR, while the second contribution addresses the issue of identification of non-normal structural shocks in vector autoregressions, which is common in finance. The third part belongs to the vast literature on prediction of economic growth; the problem is tackled using a Bayesian dynamic factor model to predict Norwegian GDP.

Chapter I: TailCoR. The first chapter introduces a simple measure of tail correlation, TailCoR, which disentangles linear and non-linear correlation. The aim is to capture all features of financial market co-movement when extreme events (i.e. financial crises) occur. Indeed, tail correlations may arise because asset prices are either linearly correlated (i.e. the Pearson correlations are different from zero) or non-linearly correlated, meaning that asset prices are dependent at the tail of the distribution. Since it is based on quantiles, TailCoR has three main advantages: i) it is not based on asymptotic arguments, ii) it is very general as it applies with no specific distributional assumption, and iii) it is simple to use. We show that TailCoR also disentangles easily between linear and non-linear correlations. The measure has been successfully tested on simulated data. Several extensions that are useful for practitioners, such as downside and upside tail correlations, are presented. In our empirical analysis, we apply this measure to eight major US banks for the period 2003-2012. For comparison purposes, we compute the upper and lower exceedance correlations and the parametric and non-parametric tail dependence coefficients. On the overall sample, the results show that both the linear and non-linear contributions are relevant.
The results suggest that co-movement increases during the financial crisis because of both the linear and non-linear correlations. Furthermore, the increase of TailCoR at the end of 2012 is mostly driven by the non-linearity, reflecting the risks of tail events and their spillovers associated with the European sovereign debt crisis.

Chapter II: On the identification of non-normal shocks in structural VARs. The second chapter deals with the structural interpretation of the VAR using the statistical properties of the innovation terms. In general, financial markets are characterized by non-normal shocks. Under non-Gaussianity, we introduce a methodology based on the reduction of tail dependency to identify the non-normal structural shocks. Borrowing from statistics, the methodology can be summarized in two main steps: i) decorrelate the estimated residuals, and ii) rotate the uncorrelated residuals to obtain a vector of independent shocks using a tail dependency matrix. We do not label the shocks a priori, but after estimation, on the basis of economic judgement. Furthermore, we show through a Monte Carlo study how our approach allows us to identify all the shocks. The method can be more effective when the number of tail events is large; the frequency of the series and the degree of non-normality are therefore relevant to achieving accurate identification. Finally, we apply our method to two different VARs, both estimated on US data: i) a monthly trivariate model which studies the effects of oil market shocks, and ii) a VAR that focuses on the interaction between monetary policy and the stock market. In the first case, we validate the results obtained in the economic literature.
In the second case, we cannot confirm the validity of an identification scheme based on a combination of short- and long-run restrictions, which is used in part of the empirical literature.

Chapter III: Nowcasting Norway. The third chapter consists of predictions of Norwegian Mainland GDP. Policy institutions have to set their policies without knowledge of the current economic conditions. We estimate a Bayesian dynamic factor model (BDFM) on a panel of macroeconomic variables (all followed by market operators) from 1990 until 2011. First, the BDFM is an extension of the dynamic factor model (DFM) to the Bayesian framework. The difference is that, compared with a DFM, the BDFM contains more dynamics, introduced in order to accommodate the dynamic heterogeneity of different variables. However, in order to introduce more dynamics, the BDFM requires estimating a large number of parameters, which can easily lead to volatile predictions due to estimation uncertainty. This is why the model is estimated with Bayesian methods, which, by shrinking the factor model toward a simple naive prior model, are able to limit estimation uncertainty. The second aspect is the use of a small dataset. A common feature of the literature on DFMs is the use of large datasets; however, a strand of the literature has shown how, for the purpose of forecasting, DFMs can be estimated on a small number of appropriately selected variables. Finally, through a pseudo-real-time exercise, we show that the BDFM performs well in terms of both point forecasts and density forecasts. The results indicate that our model outperforms standard univariate benchmark models, that it performs as well as the Bloomberg Survey, and that it outperforms the predictions published by the Norges Bank in its monetary policy report. / Doctorat en Sciences économiques et de gestion
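One of the comparison benchmarks mentioned in Chapter I, the non-parametric tail dependence coefficient, has a particularly simple empirical version: the probability that one series sits below its q-quantile given that the other one does. The exact estimator used in the chapter is not specified in the abstract, so the following is only a generic sketch:

```python
import numpy as np

def lower_tail_dependence(x, y, q=0.05):
    """Empirical lower-tail dependence: P(Y <= F_Y^{-1}(q) | X <= F_X^{-1}(q)).
    Values well above q signal co-movement concentrated in the tail."""
    xq, yq = np.quantile(x, q), np.quantile(y, q)
    both = np.mean((x <= xq) & (y <= yq))
    return both / q

rng = np.random.default_rng(1)
n = 100_000
# Linearly correlated Gaussian "returns": some tail co-movement remains
# at q = 5%, though it fades as q -> 0 for a Gaussian copula.
z = rng.standard_normal((n, 2))
x = z[:, 0]
y = 0.8 * x + 0.6 * z[:, 1]
lam = lower_tail_dependence(x, y)
```

Comparing `lam` for raw returns against returns standardized by a volatility model is one simple way to see whether tail co-movement comes from the linear correlation or from genuine tail dependence, which is the distinction TailCoR is designed to draw.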
|
297 |
Tail Risk Protection via reproducible data-adaptive strategies / Spilak, Bruno, 15 February 2024 (has links)
This dissertation shows the potential of machine learning methods for managing tail risk in a non-stationary and high-dimensional setting. For this, we compare in a robust manner data-dependent approaches from parametric or non-parametric statistics with data-adaptive methods.
As these methods need to be reproducible to ensure trust and transparency, we start by proposing a new platform called Quantinar, which aims to set a new standard for academic publications. In the second chapter, we dive into the core subject of this thesis, comparing various parametric, local parametric, and non-parametric methods to create a dynamic trading strategy that protects against tail risk in the Bitcoin cryptocurrency. In the third chapter, we propose a new portfolio allocation method, called NMFRB, that deals with high dimensions thanks to a dimension reduction technique, convex Non-negative Matrix Factorization. This technique allows us to find latent, interpretable portfolios that are diversified out-of-sample. We show in two universes that the proposed method outperforms other classical machine-learning-based methods, such as Hierarchical Risk Parity (HRP), in terms of risk-adjusted returns. We also test the robustness of our results via Monte Carlo simulation. Finally, the last chapter combines our previous approaches to develop a tail-risk protection strategy for portfolios: we extend NMFRB to tail-risk measures; we address the non-linear relationships between assets during tail events by developing a specific non-linear latent factor model; and we develop a dynamic tail-risk protection strategy that deals with the non-stationarity of asset returns using classical econometric models. We show that our strategy successfully reduces large drawdowns and outperforms other modern tail-risk protection strategies such as the Value-at-Risk-spread strategy. We verify our findings by performing various data-snooping tests.
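NMFRB itself relies on convex NMF applied directly to returns, and the abstract does not spell out the allocation step. The following is therefore only a loose sketch of the general idea, using standard NMF on a non-negative volatility proxy (squared returns) to extract latent risk factors and then spreading an equal risk budget across them; every modelling choice here is an assumption for illustration:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(7)
n_days, n_assets = 500, 8
# Toy data: two hidden risk factors, each driving half the assets.
factors = rng.standard_normal((n_days, 2)) * [0.02, 0.01]
loadings = np.zeros((2, n_assets))
loadings[0, :4] = 1.0
loadings[1, 4:] = 1.0
returns = factors @ loadings + 0.002 * rng.standard_normal((n_days, n_assets))

# Factor the (non-negative) squared returns into latent risk sources.
model = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
model.fit(returns ** 2)
H = model.components_                     # (factors x assets) loadings

# Equal budget per latent factor, split within a factor in proportion
# to each asset's loading on it.
weights = (H / H.sum(axis=1, keepdims=True)).mean(axis=0)
weights /= weights.sum()
```

The resulting long-only weights line up with the latent block structure of the toy data; the thesis's convex NMF variant avoids the squared-returns workaround by handling mixed-sign return matrices directly.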
|
298 |
The development of the ethmoidal region of Ascaphus truei (Stejneger) / Baard, E. H. W. (Ernst Hendrik Wolfaardt), 12 1900 (has links)
Thesis (MSc)--University of Stellenbosch, 1982. / ENGLISH ABSTRACT: This study deals with the development of the ethmoidal region of Ascaphus truei Stejneger, with special attention to the development of the nasal sacs, the diverticulum medium and Jacobson's organ, in order to gather more information regarding the phylogeny of these structures. The prevailing opinions concerning the phylogenetic migration of Jacobson's organ are corroborated by the development of the organ in Ascaphus. The possible origin of the diverticulum medium from the nasal end of the nasolacrimal duct is also commented on. / No Afrikaans abstract available
|
299 |
衡量銀行市場風險-VaR與ETL模型的應用 [Measuring bank market risk: an application of VaR and ETL models] / 陳嘉敏, Chen, Jia Min, Unknown Date (has links)
This thesis introduces an emerging risk-measurement tool, the Expected Tail Loss (ETL). Unlike Value at Risk (VaR), which is a single percentile and does not take the tail risk of the return distribution into account, ETL is intended to give a more complete picture of all the risks a portfolio may face, allowing more effective control of market risk.

Our empirical analysis examines the stability of VaR and ETL. Although VaR has been shown theoretically not to satisfy subadditivity, in our empirical study VaR satisfies subadditivity even when the return distributions are fat-tailed. This suggests that, in practice, it is difficult to discard VaR as a risk-measurement tool merely because it lacks subadditivity in theory. ETL nevertheless has its own contribution: it takes more tail information into account than VaR and can serve as an additional reference indicator alongside VaR. This is the first contribution of this thesis.

Our empirical work also investigates whether different lengths of historical data in the moving window lead to differences in the accuracy of VaR and ETL estimates. The results show that a longer historical window (1,000 days) did not forecast VaR and ETL correctly, whereas the internal models were more accurate with a 500-day moving window. The length of the moving window should therefore be chosen carefully when applying VaR models. This is the second contribution of this thesis.
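Both risk measures have simple historical-simulation estimators: VaR is a percentile of the loss distribution, while ETL (also known as expected shortfall) averages the losses beyond it, so ETL is always at least as large as VaR. A sketch in Python (the sample data and the 99% confidence level are illustrative assumptions, not the thesis's data):

```python
import numpy as np

def var_etl(losses, alpha=0.99):
    """Historical-simulation VaR and ETL at confidence level alpha:
    ETL averages the worst (1 - alpha) share of the losses."""
    losses = np.sort(np.asarray(losses))
    k = max(1, int(round(len(losses) * (1.0 - alpha))))
    tail = losses[-k:]                  # the k largest losses
    return tail[0], tail.mean()         # (VaR, ETL)

rng = np.random.default_rng(0)
# Fat-tailed daily losses for two positions (Student-t, 3 d.o.f.).
a = rng.standard_t(3, size=2000)
b = rng.standard_t(3, size=2000)

var_a, etl_a = var_etl(a)
var_b, etl_b = var_etl(b)
var_p, etl_p = var_etl(a + b)
```

With this estimator, `etl_p <= etl_a + etl_b` holds by construction (the worst k portfolio losses cannot sum to more than each position's worst k losses combined), whereas the analogous inequality for VaR may or may not hold in a given sample, which is the subadditivity question examined above.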
|
300 |
The dynamics of Alfvén eigenmodes excited by energetic ions in toroidal plasmas / Tholerus, Emmi, January 2016 (has links)
The future fusion power plants that are based on magnetic confinement will deal with plasmas that inevitably contain energetic (non-thermal) particles. These particles come, for instance, from fusion reactions or from external heating of the plasma. Ensembles of energetic ions can excite eigenmodes in the Alfvén frequency range to such an extent that the resulting wave fields redistribute the energetic ions, and potentially eject them from the plasma. The redistribution of ions may cause a substantial reduction of heating efficiency. Understanding the dynamics of such instabilities is necessary to optimise the operation of fusion experiments and of future fusion power plants. Two models have been developed to simulate the interaction between energetic ions and Alfvén eigenmodes. One is a bump-on-tail model, of which two versions have been developed: one fully nonlinear and one quasilinear. The quasilinear version has a lower dimensionality of particle phase space than the nonlinear one. Unlike previous similar studies, the bump-on-tail model contains a decorrelation of the wave-particle phase in order to model stochasticity of the system. When the characteristic time scale for macroscopic phase decorrelation is similar to or shorter than the time scale of nonlinear wave-particle dynamics, the nonlinear and the quasilinear descriptions quantitatively agree. A finite phase decorrelation changes the growth rate and the saturation amplitude of the wave mode in systems with an inverted energy distribution around the wave-particle resonance. Analytical expressions for the correction of the growth rate and the saturation amplitude have been derived, which agree well with numerical simulations. A relatively weak phase decorrelation also diminishes frequency chirping events of the eigenmode. The second model is called FOXTAIL, and it has a wider regime of validity than the bump-on-tail model. 
FOXTAIL is able to simulate systems with multiple eigenmodes, and it includes the effects of different individual particle orbits relative to the wave fields. Simulations with FOXTAIL and the nonlinear bump-on-tail model have been compared in order to determine the regimes of validity of the bump-on-tail model quantitatively. Studies of two-mode scenarios confirmed the expected consequences of fulfilling the Chirikov criterion for resonance overlap. The influence of ICRH on the eigenmode-energetic ion system has also been studied, showing effects qualitatively similar to those seen in the presence of phase decorrelation. Another model, describing the efficiency of fast wave current drive, has been developed in order to study the influence of passive components close to the antenna, in which currents can be induced by the antenna-generated wave field. It was found that the directivity of the launched wave, averaged over model parameters, was generally lowered by the presence of passive components, except for low values of the single-pass damping of the wave, where the directivity was slightly increased but reversed in the toroidal direction.
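The Chirikov criterion referenced above can be stated compactly: two wave-particle resonances overlap, and particle motion between them becomes stochastic, when the sum of the resonance island half-widths exceeds the spacing between the resonances. For particles deeply trapped in a single wave, the island half-width in orbit-frequency space scales with the trapping (bounce) frequency, which grows as the square root of the mode amplitude. A toy sketch with purely illustrative numbers and an assumed order-unity coupling constant (none of the values are from the thesis):

```python
import numpy as np

def trapping_frequency(amplitude, coupling=1.0):
    """Bounce frequency of deeply trapped particles, ~ sqrt(mode amplitude).
    The coupling constant absorbs mode structure and particle charge/mass."""
    return coupling * np.sqrt(amplitude)

def islands_overlap(omega1, omega2, amp1, amp2):
    """Chirikov criterion: overlap when the island half-widths
    (taken here as ~2 * bounce frequency each) exceed the spacing."""
    half_widths = 2.0 * trapping_frequency(amp1) + 2.0 * trapping_frequency(amp2)
    return half_widths > abs(omega1 - omega2)

# Closely spaced modes overlap at amplitudes where distant ones do not.
print(islands_overlap(1.00, 1.10, 0.01, 0.01))   # expect True  (0.4 > 0.1)
print(islands_overlap(1.00, 1.50, 0.01, 0.01))   # expect False (0.4 < 0.5)
```

This is the mechanism behind the two-mode scenarios above: once the amplitudes grow enough for the islands to merge, fast ions diffuse across both resonances instead of being trapped by one.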
|