About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
221

Existentialismen är en sociologi : en essä om sociologi i en fragmenterad samtid

Grönqvist, Simon January 2010 (has links)
The purpose of this essay is to discuss the foundations of sociology from an existentialist perspective. The discussion takes its position in the debate on the crisis of sociology, and aligns with Alvin Gouldner's understanding of that crisis. Gouldner believed that the crisis of sociology was mirrored in a sociological method that failed to describe the social reality it was meant to describe: a lack of self-criticism and self-reflexivity, a lack of self-containment in relation to state interests, and a lack of moral engagement. This raises a number of questions, which I discuss in the essay. What is the role of sociology in our society? How should we form a method that responds to the social reality that we sociologists aim to describe? How can we make sociology more moral?

Existentialism offers a starting point for addressing these questions. I argue that the existentialist description of man as essence carries implications for social science. By constituting a critique of a computable morality, existentialism points to the necessity of a standing self-criticism and dialogue. An existentialist description of man as non-essence carries implications for the theory and method of sociology. Man's possibility of radically exceeding himself means that it is impossible to reach theories that describe reality as it is. At the same time, method and theory are necessary to create knowledge about social phenomena. I read existentialism as an imperative for a sociology that is reflexive in the sense of a) a reflexivity in relation to the basic presumptions (value-philosophical and ontological) that affect our sociological examinations, and b) a self-reflexivity that amounts to an understanding of the scientist's own role in relation to his study and the object being studied. Furthermore, I read existentialism as an imperative for a radicalization of dialogue as method.
222

Vägen till nyhetsvärdig : En kvalitativ textanalys för att utmana medielogiken genom att testa tre nyhetsvärderingsteorier / The road to being newsworthy : A qualitative text analysis to challenge the media logic by testing three news value theories

Ahlberg, Christofer, Trygged, Mattias, Wahlström, Alexander January 2011 (has links)
I augusti 2010 inträffade en olycka i San José-gruvan i Chile. Olyckan resulterade i att 33 gruvarbetare blev fast i ett skyddsrum i över två månader innan de kunde räddas. Händelsen fick ett stort utrymme i media världen över. I vår studie har vi utmanat medielogiken genom att testa tre nyhetsvärderingsteorier utifrån aftonbladet.ses och dn.ses nyhetsrapportering kring gruvolyckan. Tidningarna valdes därför att de är Sveriges största kvälls- respektive morgontidning. Vi undersökte vilken av teorierna som bäst kunde appliceras på rapporteringen och även på skillnader mellan tidningarnas rapportering sett utifrån teorierna. De nyhetsvärderingsteorier vi testat kommer från Håkan Hvitfelt, Tony Harcup & Deirdre O’Neill samt Pamela J. Shoemaker, Tsan-Kuo Chang & Nancy Brendlinger. Vid analysen utförde vi en kvalitativ textanalys och analyserade artiklarna hermeneutiskt genom åtta dimensioner i ett analysschema. Under studien kom vi fram till att det endast finns små skillnader mellan tidningarna i deras rapportering. De skiljer sig bara åt i två av de åtta dimensioner vi analyserade. Ingen av teorierna stämde helt överens med rapporteringen, men Hvitfelt är den teoretiker vars teorier stämmer bäst överens med hur tidningarna skrev. Tätt därefter följer Harcup & O’Neill och därefter Shoemaker et al. Det bör dock understrykas att teoriernas kriterier i många fall var vaga och svårtolkade, vilket ledde till att vi själva fick tolka vad teoretikerna menade. Fastän teoretikernas idéer skiljer sig åt finns det alltså ingen som lyckats skapa en teori som är fullständig. Det är först när teorierna konvergerar som de visar en godtagbar väg till hur en artikel blir nyhetsvärdig i dagens medielandskap. / In August 2010 an accident occurred in the San José mine in Chile. 33 miners were trapped in a shelter at a depth of 700 meters for over two months before they were rescued. The accident received extensive media coverage worldwide. In our study we have challenged the media logic by testing three news value theories on the content of aftonbladet.se's and dn.se's news articles about the mining accident. The newspapers were selected because they are the largest evening and morning newspapers in Sweden. We examined which of the theories could best be applied to the newspapers' articles, and also the differences between the newspapers' reporting from the perspective of the theories. The news value theories that we tested come from the theorists Håkan Hvitfelt, Tony Harcup & Deirdre O'Neill, and Pamela J. Shoemaker, Tsan-Kuo Chang & Nancy Brendlinger. In the analysis we performed a qualitative text analysis and analyzed the articles hermeneutically through eight dimensions in an analytical framework. During the study we concluded that there are only small differences between the newspapers in their reporting. We found differences in only two of the eight dimensions we analyzed. None of the theories is fully consistent with the newspapers' reporting, but Hvitfelt is the theorist whose theory agrees best with how the articles were written. He is followed closely by Harcup & O'Neill, then Shoemaker et al. It should be emphasized that the criteria in the theories are in many cases vague and difficult to interpret, which led us to make our own interpretations in those cases. Although the theorists' ideas differ, none of them has succeeded in creating a theory that is complete. It is only when the theories converge that they demonstrate an acceptable account of how an occurrence becomes newsworthy in today's media landscape.
224

Development Of Methods For Structural Reliability Analysis Using Design And Analysis Of Computer Experiments And Data Based Extreme Value Analysis

Panda, Satya Swaroop 06 1900 (has links)
The work reported in this thesis is in the area of computational modeling of the reliability of engineering structures. The emphasis of the study is on developing methods that are suitable for the analysis of large-scale structures such as aircraft structural components. This class of problems continues to offer challenges to an analyst, the most difficult aspects of the analysis being the treatment of nonlinearity in the structural behavior, the non-Gaussian nature of uncertainties, and the quantification of low levels of probability of failure (of the order of 10^-5 or less), which requires significant computational effort. The present study covers static/dynamic behavior, Gaussian/non-Gaussian models of uncertainties, and linear/nonlinear structures. The novel elements in the study consist of two components:
• application of modeling tools that already exist in the area of design and analysis of computer experiments, and
• application of data-based extreme value analysis procedures that are available in the statistics literature.
The first component of the work provides an opportunity to combine space-filling sampling strategies (which have promise for reducing the variance of estimation) with kriging-based modeling in reliability studies, an opportunity that has not been explored in the existing literature. The second component of the work exploits the virtues of the limiting behavior of extremes of sequences of random variables together with Monte Carlo simulations of structural response, a strategy for reliability modeling that has not been explored in the existing literature. The hope here is that failure events with probabilities of the order of 10^-5 or less could be investigated with a relatively small number of Monte Carlo runs. The study also brings out the issues related to combining the above sources of existing knowledge with finite element modeling of engineering structures, thereby leading to new tools for structural reliability analysis. The thesis is organized into four chapters. The first chapter provides a review of the literature covering methods of reliability analysis, along with background literature on design and analysis of computer experiments and extreme value analysis. The problem of reliability analysis of randomly parametered, linear or nonlinear structures subjected to static and/or dynamic loads is considered in Chapter 2. A deterministic finite element model of the structure, used to analyze sample realizations of the structure, is assumed to be available. The reliability analysis is carried out within the framework of response surface methods, which involves the construction of surrogate models for the performance functions to be employed in reliability calculations. These surrogate models serve as models of models, and are hence termed meta-models, for structural behavior in the neighborhood of the design point. This construction, in the present study, combines space-filling optimal Latin hypercube sampling and kriging models. Illustrative examples on the numerical prediction of the reliability of a ten-bay truss and a W-seal in an aircraft structure are presented. Limited Monte Carlo simulations are used to validate the approximate procedures developed. The reliability of nonlinear vibrating systems under stochastic excitations is investigated in Chapter 3 using a two-stage Monte Carlo simulation strategy. Systems subjected to Gaussian random excitation are considered for the study. It is assumed that the probability distribution of the maximum response in the steady state belongs to the basin of attraction of one of the classical asymptotic extreme value distributions. The first stage of the solution strategy consists of an objective selection of the form of the extreme value distribution based on hypothesis tests, and the next involves the estimation of the parameters of the relevant extreme value distribution. Both these steps are implemented using data from limited Monte Carlo simulations of the system response. The proposed procedure is illustrated with examples of linear/nonlinear single-degree- and multi-degree-of-freedom systems driven by random excitations. The predictions from the proposed method are compared with results from large-scale Monte Carlo simulations and also with classical analytical results, when available, from the theory of out-crossing statistics. The method is further extended to cover reliability analysis of nonlinear dynamical systems with randomly varying system parameters. Here the methods of meta-modeling developed in Chapter 2 are extended to develop response surface models for the parameters of the underlying extreme value distributions. The numerical examples presented cover a host of low-dimensional dynamical systems and also the analysis of a wind turbine structure subjected to turbulent wind loads and undergoing large-amplitude oscillations. A summary of the contributions made, along with a few suggestions for further research, is presented in Chapter 4.
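The two-stage idea described above (select an asymptotic extreme value form, then estimate its parameters from limited Monte Carlo data) can be sketched in a few lines. The snippet below is only an illustration under assumed inputs: a toy linear single-degree-of-freedom oscillator stands in for the finite element model, and a generalized extreme value fit replaces the hypothesis-test-based selection step used in the thesis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def peak_response(n_steps=5_000, dt=0.01, wn=2 * np.pi, zeta=0.05):
    """Peak |displacement| of a linear SDOF oscillator under Gaussian
    white-noise forcing (hypothetical stand-in for the FE model)."""
    x = v = peak = 0.0
    for _ in range(n_steps):
        a = rng.normal() - 2 * zeta * wn * v - wn ** 2 * x  # equation of motion
        v += a * dt
        x += v * dt
        peak = max(peak, abs(x))
    return peak

# Stage 1: limited Monte Carlo runs give a sample of response maxima.
maxima = np.array([peak_response() for _ in range(200)])

# Stage 2: fit an extreme value model to the maxima and extrapolate the tail.
shape, loc, scale = stats.genextreme.fit(maxima)
failure_level = maxima.mean() + 6 * maxima.std()          # hypothetical threshold
p_fail = stats.genextreme.sf(failure_level, shape, loc=loc, scale=scale)
print(f"estimated P(peak response > failure level) ~ {p_fail:.2e}")
```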
225

Analyzing value at risk and expected shortfall methods: the use of parametric, non-parametric, and semi-parametric models

Huang, Xinxin 25 August 2014 (has links)
Value at Risk (VaR) and Expected Shortfall (ES) are methods often used to measure market risk. Inaccurate and unreliable Value at Risk and Expected Shortfall models can lead to underestimation of the market risk that a firm or financial institution is exposed to, and may therefore jeopardize the well-being or survival of the firm or financial institution during adverse markets. The objective of this study is therefore to examine various Value at Risk and Expected Shortfall models, including fatter-tailed models, in order to analyze their accuracy and reliability. Thirteen VaR and ES models under three main approaches (parametric, non-parametric, and semi-parametric) are examined in this study. The results show that the proposed model (ARMA(1,1)-GJR-GARCH(1,1)-SGED) gives the most balanced Value at Risk results, and that the semi-parametric model (Extreme Value Theory, EVT) is the most accurate Value at Risk model in this study for the S&P 500. / October 2014
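For readers unfamiliar with the model families named above, the sketch below contrasts a parametric (normal) and a non-parametric (historical simulation) estimate of VaR and ES on a simulated fat-tailed return series; the data, confidence level, and sign convention are illustrative assumptions and do not reproduce the ARMA(1,1)-GJR-GARCH(1,1)-SGED or EVT specifications examined in the thesis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
returns = rng.standard_t(df=4, size=2_500) * 0.01   # toy fat-tailed daily returns
alpha = 0.99                                        # confidence level

# Parametric (normal) approach: fit mean and standard deviation.
mu, sigma = returns.mean(), returns.std(ddof=1)
z = stats.norm.ppf(1 - alpha)
var_param = -(mu + sigma * z)
es_param = -(mu - sigma * stats.norm.pdf(z) / (1 - alpha))

# Non-parametric (historical simulation): empirical quantile and tail average.
var_hist = -np.quantile(returns, 1 - alpha)
es_hist = -returns[returns <= -var_hist].mean()

print(f"99% VaR: parametric={var_param:.4f}  historical={var_hist:.4f}")
print(f"99% ES : parametric={es_param:.4f}  historical={es_hist:.4f}")
```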
226

Brown-Resnick Processes: Analysis, Inference and Generalizations

Engelke, Sebastian 14 December 2012 (has links)
No description available.
227

Characterization and construction of max-stable processes

Strokorb, Kirstin 02 July 2013 (has links)
No description available.
228

Measuring and managing operational risk in the insurance and banking sectors

Karam, Elias 26 June 2014 (has links) (PDF)
Our interest in this thesis is first to combine the different measurement techniques for operational risk in financial companies, with growing emphasis on the consequences of estimation risk, which is treated as a particular part of operational risk. In the first part, we present a full overview of operational risk, from the regulatory laws and regulations to the associated mathematical and actuarial concepts, together with a numerical application of the Advanced Measurement Approach, using the Loss Distribution Approach to calculate the capital requirement and then applying Extreme Value Theory. We conclude this first part by developing a scaling technique based on ordinary least squares (OLS) that enables us to normalize our external data to a local Lebanese bank. In the second part, we focus on estimation risk: we first measure the error induced on the SCR by the estimation error of the parameters, then propose an alternative yield curve estimation, and finally argue that reflecting on the assumptions of the calculation, rather than focusing on the so-called "consistent with market values" hypothesis, would be more appropriate and effective than complicating the models and generating additional errors and instability. The chapters in this part illustrate estimation risk, as a part of operational risk, in its different aspects, highlighting the attention that should be given to it when treating our models.
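As a rough sketch of the Extreme Value Theory step mentioned above, the following peaks-over-threshold fit estimates a high quantile of an operational loss distribution from simulated data; the loss sample, the 95% threshold, and the 99.9% target level are assumptions made for illustration and are not the Loss Distribution Approach calibration or the bank data used in the thesis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
losses = (rng.pareto(a=2.5, size=5_000) + 1.0) * 1e4   # toy heavy-tailed losses

# Peaks over threshold: keep exceedances above a high empirical quantile.
u = np.quantile(losses, 0.95)
excesses = losses[losses > u] - u

# Fit a generalized Pareto distribution to the excesses (location fixed at 0).
xi, _, beta = stats.genpareto.fit(excesses, floc=0)

# High quantile of the loss distribution via the standard POT formula.
p, n, n_u = 0.999, len(losses), len(excesses)
q = u + (beta / xi) * ((n / n_u * (1 - p)) ** (-xi) - 1.0)
print(f"GPD shape={xi:.2f}, scale={beta:.0f}, 99.9% loss quantile ~ {q:,.0f}")
```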
229

電路設計中電流值之罕見事件的統計估計探討 / A study of statistical method on estimating rare event in IC Current

彭亞凌, Peng, Ya Ling Unknown Date (has links)
距離期望值4至6倍標準差以外的罕見機率電流值，是當前積體電路設計品質的關鍵之一，但隨著精確度的標準提升，實務上以蒙地卡羅方法模擬電路資料，因曠日廢時愈發不可行，而過去透過參數模型外插估計或迴歸分析方法，也因變數蒐集不易、操作電壓減小使得電流值尾端估計產生偏差，上述原因使得尾端電流值估計困難。因此本文引進統計方法改善罕見機率電流值的估計:先以Box-Cox轉換觀察值為近似常態，改善尾端分配值的估計，再以加權迴歸方法估計罕見電流值，其中迴歸解釋變數為Log或Z分數轉換的經驗累積機率，而加權方法採用Down-weight加重極值樣本資訊的重要性，此外，本研究也考慮能蒐集完整變數的情況，改以電路資料作為解釋變數進行加權迴歸。另一方面，本研究也採用極值理論作為估計方法。 本文先以電腦模擬評估各方法的優劣，假設母體分配為常態、T分配、Gamma分配，以均方誤差作為衡量指標，模擬結果驗證了加權迴歸方法的可行性。而後參考模擬結果決定篩選樣本方式進行實證研究，資料來源為新竹某科技公司，實證結果顯示加權迴歸配合Box-Cox轉換能以十萬筆樣本數，準確估計左、右尾機率10^(-4)、10^(-5)、10^(-6)、10^(-7)極端電流值。其中右尾部分的加權迴歸解釋變數採用對數轉換，而左尾部分的加權迴歸解釋變數採用Z分數轉換，估計結果較為準確，又若能蒐集電路資訊作為解釋變數，在左尾部份可以有最準確的估計結果；而篩選樣本尾端1%和整筆資料的方式對於不同方法的估計準確度各有利弊，皆可考慮。另外，1%門檻值比例的極值理論能穩定且中等程度的估計不同電壓下的電流值，且有短程估計最準的趨勢。 / Obtaining the tail distribution of current beyond 4 to 6 sigma is nowadays a key issue in integrated circuit (IC) design, and computer simulation is a popular tool to estimate the tail values. Since creating rare events via simulation is time-consuming, linear extrapolation methods (such as regression analysis) are often applied to enhance efficiency. However, past work has shown that the tail values are likely to behave differently if the operating voltage is lowered. In this study, a statistical method is introduced to deal with the lower-voltage case. The data are evaluated via the Box-Cox (or power) transformation to see if they need to be transformed into normally distributed data, followed by weighted regression to extrapolate the tail values. Specifically, the independent variable is the empirical CDF with a logarithm or z-score transformation, and the weighting is a down-weighting scheme that emphasizes the information carried by the extreme observations. In addition to regression analysis, Extreme Value Theory (EVT) is also adopted in the research. Computer simulation and data sets from a well-known IC manufacturer in Hsinchu are used to evaluate the proposed method with respect to mean squared error. In the computer simulation, the data are assumed to be generated from a normal, Student's t, or Gamma distribution. For the empirical data, there are 10^8 observations, and tail values with probabilities 10^(-4), 10^(-5), 10^(-6), and 10^(-7) are set as the study goal, given that only 10^5 observations are available. Compared to the traditional methods and EVT, the proposed method has the best performance in estimating the tail probabilities. If the IC current is produced from a regression equation and the information on the independent variables can be provided, the weighted regression achieves the best estimation for the left-tailed rare events. EVT can also produce accurate estimates provided that the tail probabilities to be estimated and the observations available are on a similar scale, e.g., probabilities of 10^(-5) to 10^(-7) vs. 10^5 observations.
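A minimal sketch of the Box-Cox-plus-weighted-regression idea follows, with a Gamma-distributed toy sample standing in for the simulated currents: the observations are transformed toward normality, the upper 1% tail of the empirical CDF is regressed on z-scores with weights that grow toward the extremes, and the fit is extrapolated to a 10^-6 exceedance probability. The sample, weighting scheme, and cutoff are illustrative assumptions, not the exact specification tested in the thesis.

```python
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

rng = np.random.default_rng(3)
current = rng.gamma(shape=3.0, scale=1.0, size=100_000)   # toy stand-in for currents

# 1) Box-Cox transform toward approximate normality.
y, lam = stats.boxcox(current)

# 2) Upper 1% tail with empirical exceedance probabilities as plotting positions.
y_sorted = np.sort(y)
n = len(y_sorted)
k = n // 100
tail_y = y_sorted[-k:]
p_exceed = (n - np.arange(n - k, n)) / (n + 1.0)          # P(Y > y), smallest last

# 3) Weighted regression of transformed values on z-scores of the CDF,
#    with weights that grow toward the tail so extreme points dominate the fit.
z = stats.norm.ppf(1.0 - p_exceed)
w = 1.0 / p_exceed
coef = np.polyfit(z, tail_y, deg=1, w=np.sqrt(w))

# 4) Extrapolate to a 1e-6 exceedance probability and undo the Box-Cox transform.
z_target = stats.norm.ppf(1.0 - 1e-6)
print("estimated current at P = 1e-6:", inv_boxcox(np.polyval(coef, z_target), lam))
```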
230

Tail risk in the hedge fund industry

Santos, Eduardo Alonso Marza dos 28 May 2015 (has links)
The goal of this dissertation is to quantify the tail risk premium embedded in hedge funds' returns. Tail risk is the risk of extremely large losses. Although such events are rare, asset pricing theory suggests that investors demand compensation for holding assets sensitive to extreme market downturns. By definition, such events have a small likelihood of being represented in the sample, which poses a challenge to estimating the effects of tail risk by means of traditional approaches such as VaR. The results show that it is not sufficient to account for the tail risk stemming from equity markets alone. The active portfolio management employed by hedge funds demands a specific measure to estimate and control tail risk. Our proposed factor fills that void inasmuch as it presents explanatory power over both the time series and the cross-section of funds' returns. / O objetivo do trabalho é quantificar o prêmio de risco de cauda presente nos retornos de fundos de investimento americanos. Risco de cauda é o risco de perdas excepcionalmente elevadas. Apesar de ser um evento raro, a teoria de apreçamento de ativos sugere que os investidores exigem um prêmio de risco para reter ativos expostos a eventos negativos extremos (eventos de cauda). Por definição, observações extremas têm baixa probabilidade de estarem presentes na amostra, o que dificulta a estimação dos impactos de risco de cauda sobre os retornos e reduz o poder de técnicas tradicionais como VaR. Os resultados indicam que não é suficiente controlar somente para o risco de cauda do mercado de capitais. A gestão ativa de portfólio por parte dos gestores de fundos requer uma medida própria para estimação e o controle de risco de cauda. O fator de risco de cauda que propomos cumpre este papel ao apresentar poder explicativo tanto na série temporal dos retornos quanto no corte transversal.
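To make the idea of a tail risk factor with time-series explanatory power concrete, here is a small illustrative regression on simulated data: a crude factor is built from extreme market down moves and a fund's returns are regressed on the market and on that factor. The data generation and the factor construction are hypothetical and are not the factor proposed in the dissertation.

```python
import numpy as np

rng = np.random.default_rng(4)
T = 1_000
market = rng.standard_t(df=4, size=T) * 0.01            # toy market excess returns

# Crude tail-risk factor: payoff concentrated on extreme market down days.
cutoff = np.quantile(market, 0.05)
tail_factor = np.where(market < cutoff, market - cutoff, 0.0)

# Simulated fund with exposure to both factors plus idiosyncratic noise.
fund = 0.8 * market + 0.5 * tail_factor + rng.normal(0.0, 0.005, T)

# Time-series OLS regression of fund returns on the two factors.
X = np.column_stack([np.ones(T), market, tail_factor])
coef, *_ = np.linalg.lstsq(X, fund, rcond=None)
print(f"alpha={coef[0]:.5f}  market beta={coef[1]:.2f}  tail-risk beta={coef[2]:.2f}")
```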
