101 |
IFRS 9 replacing IAS 39: A study about how the implementation of the Expected Credit Loss Model in IFRS 9 is believed to impact comparability in accounting. Klefvenberg, Louise; Nordlander, Viktoria. January 2015
This thesis examines how the implementation process of the Expected Credit Loss Model in the accounting standard IFRS 9 – Financial Instruments is perceived and interpreted, and how these factors can affect comparability in accounting. One of the main changes in IFRS 9 is that companies need to account for expected credit losses rather than only incurred ones. The data are primarily collected through a web survey to which all Nordic banks and credit institutions with a book value of total assets of at least EUR 1 billion are invited. The collected survey data are reported as relative frequencies in tables. The analysis is carried out with the assistance of a theoretical framework consisting of Positive Accounting Theory and Agency Theory. The conclusion of the thesis is that the way the level of information in the implementation process is interpreted and perceived can affect comparability in accounting negatively, owing to the room left for subjective interpretations.
102 |
An Evaluation Framework for Adaptive User Interface. Noriega Atala, Enrique. January 2014
With the rise of powerful mobile devices and the broad availability of computing power, Automatic Speech Recognition (ASR) is becoming ubiquitous. A flawless ASR system is still far from existence; because of this, interactive applications that use ASR technology do not always recognize speech perfectly, and when they do not, the user must be engaged to repair the transcriptions. We explore a rational user interface that uses machine learning models to make its best effort at presenting the best available repair strategy, so as to reduce the time spent in the interaction between the user and the system as much as possible. A study is conducted to determine how different candidate policies perform, and the results are analyzed. After the analysis, the methodology is generalized into a decision-theoretic framework that can be used to evaluate the performance of other rational user interfaces that try to optimize an expected cost or utility.
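The policy comparison sketched in this abstract reduces, at its simplest, to picking the repair strategy with the lowest expected interaction cost given the model's confidence in the transcription. The following Python sketch illustrates that decision rule; the strategy names, timing figures, and the p_correct input are illustrative assumptions, not values from the thesis.

```python
from dataclasses import dataclass

@dataclass
class RepairStrategy:
    name: str
    cost_if_correct: float    # seconds spent when the transcription was actually right
    cost_if_incorrect: float  # seconds spent when a repair was actually needed

# Hypothetical strategies and timings, for illustration only.
STRATEGIES = [
    RepairStrategy("accept_as_is",      cost_if_correct=0.0, cost_if_incorrect=12.0),
    RepairStrategy("confirm_yes_no",    cost_if_correct=2.0, cost_if_incorrect=6.0),
    RepairStrategy("respeak_utterance", cost_if_correct=5.0, cost_if_incorrect=5.0),
]

def expected_cost(strategy: RepairStrategy, p_correct: float) -> float:
    """Expected interaction time given the estimated probability that
    the ASR transcription is correct."""
    return p_correct * strategy.cost_if_correct + (1.0 - p_correct) * strategy.cost_if_incorrect

def best_strategy(p_correct: float) -> RepairStrategy:
    """Rational choice: the strategy minimizing expected cost."""
    return min(STRATEGIES, key=lambda s: expected_cost(s, p_correct))

if __name__ == "__main__":
    for p in (0.95, 0.60, 0.20):
        s = best_strategy(p)
        print(f"p_correct={p:.2f} -> {s.name} (expected cost {expected_cost(s, p):.1f} s)")
```

Under this rule and these assumed costs, a confident recognizer accepts the transcription outright, a mid-confidence one asks for a yes/no confirmation, and a low-confidence one asks the user to respeak.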
103 |
Empirical Analysis of Value at Risk and Expected Shortfall in Portfolio Selection Problem. Ding, Liyuan. 14 March 2013
The safety-first criterion and the mean-shortfall criterion both address asset allocation in the presence of downside risk. In this paper, I compare the safety-first portfolio selection problem and the mean-shortfall portfolio optimization problem, considering risk-averse investors in practice. Safety-first portfolio selection uses Value at Risk (VaR) as its risk measure, while mean-shortfall portfolio optimization uses expected shortfall. VaR is estimated with a semi-parametric method based on extreme value theory. Expected shortfall is estimated by two nonparametric methods: a natural estimator and a kernel-weighted estimator.
I use daily data on three international stock indices, ranging from January 1986 to February 2012, to provide empirical evidence on asset allocation and to illustrate the performance of the safety-first and mean-shortfall approaches with their respective risk measures. The historical data are also divided in two ways; one sample is truncated at year 1998 to explore performance during the tech boom and the financial crisis. The mean-shortfall portfolio optimization with the kernel-weighted method performed better than the safety-first criterion, while the safety-first criterion performed better than the mean-shortfall portfolio optimization with the natural estimation method.
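As a rough illustration of the two estimation routes named above, the sketch below computes a tail VaR with a standard semi-parametric extreme value (Hill/Weissman-type) estimator and the "natural" nonparametric expected shortfall (the average loss beyond VaR). The tail fraction k and the simulated loss series are assumptions for the example; the thesis's exact estimators, including the kernel-weighted ES, are not reproduced here.

```python
import numpy as np

def hill_var(losses: np.ndarray, p: float, k: int) -> float:
    """Semi-parametric tail quantile (VaR) at exceedance probability p,
    using a Hill estimate of the tail index from the k largest losses."""
    x = np.sort(losses)[::-1]                                # descending order statistics
    inv_alpha = np.mean(np.log(x[:k]) - np.log(x[k]))        # 1 / (Hill tail index)
    return x[k] * (k / (len(losses) * p)) ** inv_alpha       # Weissman extrapolation

def natural_es(losses: np.ndarray, var: float) -> float:
    """'Natural' nonparametric expected shortfall: mean loss beyond VaR."""
    exceed = losses[losses >= var]
    return float(exceed.mean()) if exceed.size else var

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    losses = rng.standard_t(df=4, size=5000)                 # heavy-tailed stand-in for daily index losses
    var99 = hill_var(losses, p=0.01, k=200)
    print(f"VaR(99%) = {var99:.3f}, ES(99%) = {natural_es(losses, var99):.3f}")
```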
104 |
Significance of transport dynamics on concentration statistics and expected mass fraction based risk assessment in the subsurface. Srzic, Veljko. January 2013
This thesis relies on a Lagrangian framework used for conservative tracer transport simulations through 2-D heterogeneous porous media. The numerical simulations yield large sets of concentration values in both the spatial and temporal domains. In addition to advection, which acts on all scales, an additional mechanism considered is local scale dispersion (LSD), accounting for both mechanical dispersion and molecular diffusion; the ratio between these two mechanisms is quantified by the Peclet (Pe) number. At its core, the thesis characterizes contaminant concentration features under: i) different log-conductivity variances; ii) log-conductivity structures defined by the same global variogram but with different log-conductivity patterns correlated; and iii) a wide range of Peclet values. Results obtained by Monte Carlo (MC) analysis show a complex interplay between the aforementioned parameters, indicating the influence of aquifer properties on the temporal evolution of LSD. A stochastic characterization of the concentration scalar is done through moment analysis: the mean, coefficient of variation (CVC), skewness and kurtosis, as well as the concentration probability density function (PDF). A remarkable collapse of higher-order to second-order concentration moments leads to the conclusion that only two concentration moments are required for an accurate description of concentration fluctuations. This holds explicitly for the pure advection case, while in the presence of LSD the Moment Deriving Function (MDF) is involved to ensure the validity of the moment collapse. Furthermore, the expected mass fraction (EMF) concept is applied to groundwater transport. EMF is a function of the concentration, but it requires a smaller number of realizations for its determination than the one-point PDF. From a practical point of view, EMF excludes the meandering effect and incorporates information about the exposure time for each non-zero concentration value present. It is also shown that EMF clearly reflects the effects of aquifer heterogeneity and structure as well as the Pe value. To demonstrate the uniqueness of the moment collapse feature and the ability of the Beta distribution to account for the concentration frequencies even in real cases, Macrodispersion Experiment (MADE1) data sets are used.
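The two-moment collapse reported above is what makes a Beta model for the (normalized) concentration attractive: once the mean and variance are fixed, the Beta parameters follow from the standard method-of-moments identities shown below (a textbook relation, not a formula quoted from the thesis).

```latex
% Beta model for a normalized concentration C in [0,1], fitted by the method of moments
f(c;\alpha,\beta) \;=\; \frac{c^{\alpha-1}(1-c)^{\beta-1}}{B(\alpha,\beta)}, \qquad
\alpha \;=\; \bar{C}\!\left(\frac{\bar{C}(1-\bar{C})}{\sigma_C^{2}}-1\right), \qquad
\beta \;=\; (1-\bar{C})\!\left(\frac{\bar{C}(1-\bar{C})}{\sigma_C^{2}}-1\right)
```

Here $\bar{C}$ is the mean concentration and $\sigma_C = \mathrm{CV}_C\,\bar{C}$ its standard deviation, so the mean and the coefficient of variation alone determine the full distribution.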
105 |
Can we replace CAPM and the Three-Factor model with Implied Cost of Capital? Löthman, Robert; Pettersson, Eric. January 2014
Researchers criticize the predominant expected return models for being imprecise and based on fundamentally flawed assumptions. This dissertation evaluates the abilities of the Implied Cost of Capital, CAPM and the Three-Factor model to estimate returns. We study the association between each model's expected return and realized return and test for abnormal returns. Our sample covers the period 2000 to 2012 and includes 2,916 US firms. We find that the Implied Cost of Capital has a stronger association with realized returns than CAPM and the Three-Factor model. The Implied Cost of Capital also has lower abnormal returns not accounted for by expected returns. Our results suggest that we can replace CAPM and the Three-Factor model with the Implied Cost of Capital.
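For readers comparing the three approaches, the textbook forms of the benchmark models are given below; the implied cost of capital is the discount rate that equates the current price to expected future payoffs. These are the standard definitions only, not the specific estimation choices (growth assumptions, factor construction) made in the dissertation.

```latex
\text{CAPM:}\quad \mathbb{E}[R_i] = R_f + \beta_i\bigl(\mathbb{E}[R_m]-R_f\bigr)
\qquad
\text{Three-Factor:}\quad \mathbb{E}[R_i]-R_f = \beta_i\bigl(\mathbb{E}[R_m]-R_f\bigr) + s_i\,\mathbb{E}[\mathrm{SMB}] + h_i\,\mathbb{E}[\mathrm{HML}]
```

```latex
\text{Implied Cost of Capital:}\quad
P_0 = \sum_{t=1}^{\infty}\frac{\mathbb{E}[D_t]}{(1+r_{\mathrm{ICC}})^{t}},
\qquad r_{\mathrm{ICC}} \text{ is the rate solving this equation for the observed price } P_0 .
```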
106 |
The Practicality of Statistics: Why Money as Expected Value Does Not Make Statistics Practical. Reimer, Sean. 01 January 2015
This thesis covers the uncertainty of empirical prediction. As opposed to objectivity, I discuss the practicality of statistics, where practicality is defined as being "useful" in an unbiased sense, in relation to something in the external world that we care about. We want our model of prediction to give us unbiased inference whilst also being able to speak about something we care about. For the reasons explained, the inherent uncertainty of statistics undermines unbiased inference for many methods. Bayesian statistics, by valuing hypotheses, is more plausible but ultimately cannot arrive at an unbiased inference. I posit the value theory of money as a concept that might allow us to derive unbiased inferences while still concerning something we care about. However, money is of instrumental value, ultimately being worth less than an object of "transcendental value," which I define as something worth more than money, since money's purpose is to help us achieve "transcendental value" under the value theory. Ultimately, as long as an individual has faith in a given hypothesis, it will be worth more than any hypothesis valued with money. From there we undermine statistics' practicality: it seems that without the concept of money we have no way of valuing hypotheses unbiasedly, and uncertainty undermines the "objective" inferences we might have been able to make.
107 |
Broadcast Strategy for Delay-Limited Communication over Fading Channels. Yoo, Jae Won. 03 October 2013
Delay is an important quality-of-service measure for the design of next-generation wireless networks. This dissertation considers the problem of delay-limited communication over block-fading channels, where the channel state information is available at the receiver but not at the transmitter. For this communication scenario, the difference between the ergodic capacity and the maximum achievable expected rate (the expected capacity) for coding over a finite number of coherence blocks represents a fundamental measure of the penalty incurred by the delay constraint.
This dissertation introduces a notion of worst-case expected-capacity loss. Focusing on the slow-fading scenario (one-block delay), the worst-case additive and multiplicative expected-capacity losses are precisely characterized for the point-to-point fading channel. Extension to the problem of writing on fading paper is also considered, where both the ergodic capacity and the additive expected-capacity loss over one-block delay are characterized to within one bit per channel use.
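One natural way to formalize the two loss measures named here, offered as a reading of the abstract rather than the dissertation's exact definitions, is as the worst case over fading distributions $F$ of the gap between the ergodic capacity and the one-block-delay expected capacity:

```latex
L_{\mathrm{add}} \;=\; \sup_{F}\;\Bigl(C_{\mathrm{erg}}(F) - C_{\mathrm{exp}}(F)\Bigr),
\qquad
L_{\mathrm{mult}} \;=\; \sup_{F}\;\frac{C_{\mathrm{erg}}(F)}{C_{\mathrm{exp}}(F)}
```

Here $C_{\mathrm{erg}}$ is the ergodic capacity and $C_{\mathrm{exp}}$ the maximum achievable expected rate when coding over a single coherence block.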
The problem with multiple-block delay is considerably more challenging. This dissertation presents two partial results. First, the expected capacity is precisely characterized for the point-to-point two-state fading channel with two-block delay. Second, the optimality of Gaussian superposition coding with indirect decoding is established for a two-parallel Gaussian broadcast channel with three receivers. Both results reveal some intrinsic complexity in characterizing the expected capacity with multiple-block delay.
108 |
2D Correlated Diffusion Process for Mobility Modeling in Mobile Networks. Cakar, Tunc. 01 December 2004
This thesis introduces a novel mobility model based on the so-called "2D correlated diffusion process". In this model, the motion components over the x and y axes are dependent. The joint density function of the process is derived, and the expected exit time from an arbitrary domain is characterized by a boundary value problem. An analytical solution of this problem is given for a specific case, and a numerical solution is presented through several examples; the results obtained in these examples are verified by simulations. The expected exit time computed by this method holds for any given 2D domain and any given starting position inside it.
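For a driftless correlated diffusion with per-axis variances $\sigma_x^2,\sigma_y^2$ and correlation $\rho$ (an assumed parameterization; the thesis's model may differ in detail), the boundary value problem characterizing the expected exit time is the standard Dynkin-formula result:

```latex
\tfrac{1}{2}\Bigl(\sigma_x^{2}\,u_{xx} + 2\rho\,\sigma_x\sigma_y\,u_{xy} + \sigma_y^{2}\,u_{yy}\Bigr) = -1
\quad \text{in } D,
\qquad
u = 0 \quad \text{on } \partial D
```

Here $u(x,y)$ is the expected time for the process started at $(x,y)$ to leave the domain $D$.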
109 |
Applications of constrained non-parametric smoothing methods in computing financial risk. Wong, Chung To (Charles). January 2008
The aim of this thesis is to improve risk measurement estimation by incorporating extra information, in the form of constraints, into completely non-parametric smoothing techniques. A similar approach has been applied in empirical likelihood analysis. The method of constraints incorporates bootstrap resampling techniques, in particular the biased bootstrap. This thesis brings together formal estimation methods, the use of empirical information, and computationally intensive methods. The constraint approach is applied to non-parametric smoothing estimators to improve the estimation or modelling of risk measures. We consider estimation of Value-at-Risk, of intraday volatility for market risk, and of recovery rate densities for credit risk management.

Firstly, we study Value-at-Risk (VaR) and Expected Shortfall (ES) estimation. VaR and ES estimation are strongly related to quantile estimation, so tail estimation is of interest in its own right. We employ constrained and unconstrained kernel density estimators to estimate tail distributions, and we estimate quantiles from the fitted tail distribution. The constrained kernel density estimator is an application of the biased bootstrap technique proposed by Hall & Presnell (1998); the quantile estimator used with the constrained kernel estimator is the Harrell-Davis (H-D) quantile estimator. We calibrate the performance of the constrained and unconstrained kernel density estimators by estimating tail densities based on samples from Normal and Student-t distributions. We find a significant improvement in fitting heavy-tailed distributions using the constrained kernel estimator in conjunction with the H-D quantile estimator. We also present an empirical study demonstrating VaR and ES calculation.

A credit event in financial markets is defined as the event that a party fails to pay an obligation to another, and credit risk is defined as the measure of uncertainty of such events. The recovery rate, in the credit risk context, is the rate of recuperation when a credit event occurs. It is defined as Recovery rate = 1 - LGD, where LGD is the rate of loss given default. From this point of view, the recovery rate is a key element both for credit risk management and for pricing credit derivatives; only credit risk management is considered in this thesis. To avoid the strong assumptions about the form of the recovery rate density made in current approaches, we propose a non-parametric technique incorporating a mode constraint, with an adjusted Beta kernel employed to estimate the recovery rate density function. An encouraging result for the constrained Beta kernel estimator is illustrated by a large number of simulations, as genuine data are very confidential and difficult to obtain.

Modelling high-frequency data is a popular topic in contemporary finance. The intraday volatility patterns of standard indices and market-traded assets have been well documented in the literature; they show that volatility patterns reflect the different characteristics of different stock markets, such as the double U-shaped volatility pattern reported for the Hang Seng Index (HSI). We aim to capture this intraday volatility pattern using a non-parametric regression model. In particular, we propose a constrained function approximation technique to formally test the structure of the pattern and to approximate the location of the anti-mode of the U-shape. We illustrate this methodology on the HSI as an empirical example.
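The Harrell-Davis estimator mentioned above has a closed form as a Beta-weighted average of all order statistics; the sketch below implements that standard form (the constrained, biased-bootstrap kernel density step of the thesis is not reproduced, and the simulated sample is only a placeholder).

```python
import numpy as np
from scipy.stats import beta

def harrell_davis_quantile(sample: np.ndarray, p: float) -> float:
    """Harrell-Davis quantile estimator: a smooth, bootstrap-motivated
    weighted average of all order statistics."""
    x = np.sort(sample)
    n = x.size
    a, b = (n + 1) * p, (n + 1) * (1 - p)
    edges = np.arange(n + 1) / n                    # 0, 1/n, ..., 1
    weights = beta.cdf(edges[1:], a, b) - beta.cdf(edges[:-1], a, b)
    return float(np.dot(weights, x))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    returns = rng.standard_t(df=5, size=2000)       # heavy-tailed toy sample
    print("H-D 99% quantile:", round(harrell_davis_quantile(returns, 0.99), 4))
```

Because every order statistic receives some weight, the estimate varies smoothly with p, which is what makes it a convenient companion to a smoothed tail density.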
110 |
Postupy homogenizace pojistného kmene (Methods of homogenizing an insurance portfolio). Hrouz, David. January 2015
This diploma thesis deals with transferring the risk of an insurance company to another entity. The basic requirement is to homogenize the selected insurance portfolio. The amount of capital required is determined by identifying and quantifying the risk. An adjusted Economic Value Added (EVA) indicator determines the optimal ratio between the retained and the transferred risk. Several factors can affect the amount of the retained risk. The main objective is to determine the optimal retention itself and to select the appropriate type of reinsurance. The recommendation is based on the current development of insurance claims expenses.
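For reference, the unadjusted textbook EVA indicator that the thesis adapts to the reinsurance setting is

```latex
\mathrm{EVA} \;=\; \mathrm{NOPAT} \;-\; \mathrm{WACC}\times C_{\text{invested}}
```

Loosely, transferring more risk to the reinsurer lowers the capital the insurer must hold against the retained portfolio but also gives up part of the expected underwriting profit, and the optimal retention balances these two effects; the precise adjustment used in the thesis is not reproduced here.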