31

Data-Dependent Analysis of Learning Algorithms

Philips, Petra Camilla, petra.philips@gmail.com January 2005 (has links)
This thesis studies the generalization ability of machine learning algorithms in a statistical setting. It focuses on the data-dependent analysis of the generalization performance of learning algorithms in order to make full use of the potential of the actual training sample from which these algorithms learn.

First, we propose an extension of the standard framework for the derivation of generalization bounds for algorithms taking their hypotheses from random classes of functions. This approach is motivated by the fact that the function produced by a learning algorithm based on a random sample of data depends on this sample and is therefore a random function. Such an approach avoids the detour of the worst-case uniform bounds as done in the standard approach. We show that the mechanism which allows one to obtain generalization bounds for random classes in our framework is based on a “small complexity” of certain random coordinate projections. We demonstrate how this notion of complexity relates to learnability and how one can explore geometric properties of these projections in order to derive estimates of rates of convergence and good confidence interval estimates for the expected risk. We then demonstrate the generality of our new approach by presenting a range of examples, among them the algorithm-dependent compression schemes and the data-dependent luckiness frameworks, which fall into our random subclass framework.

Second, we study in more detail generalization bounds for a specific algorithm which is of central importance in learning theory, namely the Empirical Risk Minimization algorithm (ERM). Recent results show that one can significantly improve the high-probability estimates for the convergence rates for empirical minimizers by a direct analysis of the ERM algorithm.
These results are based on a new localized notion of complexity of subsets of hypothesis functions with identical expected errors and are therefore dependent on the underlying unknown distribution. We investigate the extent to which one can estimate these high-probability convergence rates in a data-dependent manner. We provide an algorithm which computes a data-dependent upper bound for the expected error of empirical minimizers in terms of the “complexity” of data-dependent local subsets. These subsets are sets of functions with empirical errors within a given range, and they can be determined based solely on empirical data. We then show that recent direct estimates, which are essentially sharp estimates of the high-probability convergence rate of the ERM algorithm, cannot be recovered universally from empirical data.
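The localized, data-dependent complexity described above can be illustrated numerically. The sketch below is not from the thesis — the data, the finite threshold class, and the radius are hypothetical — but it shows the idea: estimate the empirical Rademacher complexity of a data-dependent local subset (hypotheses with small empirical error) and compare it to the complexity of the full class.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary classification sample (hypothetical data).
n = 200
x = rng.uniform(0, 1, n)
y = (x > 0.5).astype(int)

# Finite class of threshold classifiers h_t(x) = 1[x > t].
thresholds = np.linspace(0, 1, 101)
preds = (x[None, :] > thresholds[:, None]).astype(float)  # shape (|H|, n)
emp_err = (preds != y[None, :]).mean(axis=1)              # empirical error per hypothesis

# Data-dependent local subset: hypotheses with empirical error at most r.
r = 0.1
local = preds[emp_err <= r]

# Monte Carlo estimate of the empirical Rademacher complexity:
#   R_hat = E_sigma [ sup_h (1/n) sum_i sigma_i h(x_i) ]
m = 500
sigma = rng.choice([-1.0, 1.0], size=(m, n))
R_local = (sigma @ local.T / n).max(axis=1).mean()   # complexity of local subset
R_full = (sigma @ preds.T / n).max(axis=1).mean()    # complexity of full class

print(R_local, R_full)
```

Since the local subset is contained in the full class, its estimated complexity is never larger, which is the quantitative gain localization buys.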
32

Risk Measurement, Management And Option Pricing Via A New Log-normal Sum Approximation Method

Zeytun, Serkan 01 October 2012 (has links) (PDF)
In this thesis we mainly focused on the use of Conditional Value-at-Risk (CVaR) in risk management and on the pricing of arithmetic-average basket and Asian options in the Black-Scholes framework via a new log-normal sum approximation method. First, we worked on the linearization procedure for CVaR proposed by Rockafellar and Uryasev. We constructed an optimization problem with the objective of maximizing the expected return under a CVaR constraint. Because we allowed intermediate payments, we had to deal with a re-investment problem, which turned the originally one-period problem into a multi-period one. To solve this multi-period problem, we used the linearization procedure for CVaR and developed an iterative scheme based on linear optimization. The numerical results obtained from the solution of this problem uncovered some surprising weaknesses of Value-at-Risk (VaR) and CVaR as risk measures. In the next step, we extended the problem by including liabilities and quantile hedging, to obtain a reasonable formulation for managing liquidity risk. In this formulation, the investor's objective was to maximize the probability that liquid assets minus liabilities exceed a threshold level, which is a type of quantile hedging. Since quantile hedging is not a perfect hedge, there is a non-zero probability that the liability value ends up higher than the asset value. To control the size of this probable deficit we used a CVaR constraint. In the Black-Scholes framework, the solution of this problem requires dealing with sums of log-normally distributed random variables, for which no closed-form distribution is known. We introduced a new, simple and highly efficient method for approximating a sum of log-normal random variables by a shifted log-normal distribution.
The method is based on a limiting approximation of the arithmetic mean by the geometric mean. Using our new approximation method we reduced the quantile hedging problem to a simpler optimization problem. Our new log-normal sum approximation method could also be used to price some options in the Black-Scholes model. With the help of our approximation method we derived closed-form approximation formulas for the prices of the basket and Asian options based on the arithmetic averages. Using our approximation methodology combined with the new analytical pricing formulas for the arithmetic average options, we obtained a very efficient performance for Monte Carlo pricing in a control variate setting. Our numerical results show that our control variate method outperforms the well-known methods from the literature in some cases.
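The control-variate setting mentioned above can be sketched generically. The example below is not the thesis's shifted log-normal method; it uses the classical closed-form price of the discretely monitored geometric-average Asian call as a control variate for the arithmetic-average payoff, with hypothetical Black-Scholes parameters.

```python
import numpy as np
from math import exp, log, sqrt
from statistics import NormalDist

rng = np.random.default_rng(1)

# Hypothetical Black-Scholes parameters and monthly monitoring dates.
s0, k, r, sigma, t_mat, n_obs = 100.0, 100.0, 0.05, 0.2, 1.0, 12
times = np.linspace(t_mat / n_obs, t_mat, n_obs)

# Closed-form price of the geometric-average Asian call: log G is normal
# with mean mu and variance v below, since Cov(W(t_i), W(t_j)) = min(t_i, t_j).
mu = log(s0) + (r - 0.5 * sigma**2) * times.mean()
v = sigma**2 / n_obs**2 * sum(min(ti, tj) for ti in times for tj in times)
d1 = (mu - log(k) + v) / sqrt(v)
d2 = d1 - sqrt(v)
phi = NormalDist().cdf
geo_exact = exp(-r * t_mat) * (exp(mu + v / 2) * phi(d1) - k * phi(d2))

# Simulate GBM paths on the monitoring grid.
n_paths = 20000
dt = np.diff(np.concatenate(([0.0], times)))
z = rng.standard_normal((n_paths, n_obs))
log_s = log(s0) + np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
s = np.exp(log_s)

disc = exp(-r * t_mat)
arith = disc * np.maximum(s.mean(axis=1) - k, 0.0)             # target payoff
geo = disc * np.maximum(np.exp(log_s.mean(axis=1)) - k, 0.0)   # control payoff

# Control-variate estimator: subtract the control's known simulation error.
c = np.cov(arith, geo)
beta = c[0, 1] / c[1, 1]
cv = arith - beta * (geo - geo_exact)

print(f"plain MC: {arith.mean():.4f} +- {arith.std() / sqrt(n_paths):.4f}")
print(f"with CV : {cv.mean():.4f} +- {cv.std() / sqrt(n_paths):.4f}")
```

Because arithmetic and geometric averages of the same path are highly correlated, the control variate cuts the standard error by an order of magnitude or more at no extra simulation cost.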
33

Finding Profitability of Technical Trading Rules in Emerging Market Exchange Traded Funds

Hallett, Austin P. 01 January 2012 (has links)
This thesis further investigates the effectiveness of 15 variable moving average strategies that mimic the trading rules used in the study by Brock, Lakonishok, and LeBaron (1992). Instead of applying these strategies to developed markets, it exploits unique characteristics of emerging markets that offer opportunity to investors and warrant further research. Before transaction costs, all 15 variable moving average strategies outperform the naïve benchmark strategy of buying and holding different emerging market ETFs over the volatile period of 858 trading days. However, the variable moving averages perform poorly in the "bubble" market cycle; in fact, sell signals become more unprofitable than buy signals are profitable. Furthermore, variations of 4 of the 5 variable moving average strategies demonstrate significant prospects of returning consistent abnormal returns after adjusting for transaction costs and risk.
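A variable moving average rule of the Brock-Lakonishok-LeBaron type can be sketched as follows. The window lengths, the band, and the synthetic price series are illustrative assumptions, not the thesis's data or exact rule set.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic daily price series (geometric random walk) standing in for an ETF.
n = 858                                  # matches the study's 858 trading days
returns = rng.normal(0.0003, 0.01, n)
price = 100 * np.exp(np.cumsum(returns))

def vma_signal(p, short=1, long=50, band=0.01):
    """+1 (buy) when the short MA exceeds the long MA by at least the band,
    -1 (sell) when it falls below by the band, else 0 (neutral)."""
    sig = np.zeros(len(p))
    for i in range(long - 1, len(p)):
        s = p[i - short + 1 : i + 1].mean()
        l = p[i - long + 1 : i + 1].mean()
        if s > l * (1 + band):
            sig[i] = 1
        elif s < l * (1 - band):
            sig[i] = -1
    return sig

sig = vma_signal(price)
# Rule return: hold yesterday's signal over today's log price move.
strat = sig[:-1] * np.diff(np.log(price))
print(f"{(sig[:-1] == 1).sum()} buy days, strategy log-return {strat.sum():.4f}")
```

Comparing `strat.sum()` against the buy-and-hold log return, before and after a per-trade cost deduction, reproduces the kind of test run in the thesis.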
34

Analýza vybraných ukazatelů na akciovém trhu / Analysis of selected indicators on stock market

BUREŠ, Otto January 2014 (has links)
This work evaluates the effectiveness of artificial neural networks for trading on stock markets. Its subject is the process of optimizing the parameters of artificial neural networks; the resulting predictive performance was determined by applying the networks with the optimized parameters.
35

A evolução do emprego formal industrial nas cidades médias do estado do Ceará (Juazeiro do Norte, Crato e Sobral) no período de 1990 a 2010 / The evolution of formal industrial employment in the medium-sized cities of Ceará state (Juazeiro do Norte, Crato and Sobral) from 1990 to 2010

Barbosa, Maria Nivania Feitosa 24 April 2013 (has links)
This work discusses the evolution of formal industrial employment in the medium-sized cities of Ceará state from 1990 to 2010, a period marked by important changes. The study is based on a survey of the relevant literature as well as on statistics from the Annual Report of Social Information (RAIS), published by the Ministry of Labour and Employment (MTE), and from the Brazilian Institute of Geography and Statistics (IBGE). The central question considered in this study is how formal industrial employment evolved in the medium-sized cities (Juazeiro do Norte, Crato and Sobral) of Ceará. The assumption guiding this work is that the economic policies of the 1990s and 2000s encouraged industrial relocation, with significant consequences for formal manufacturing employment in these cities. Regarding the results obtained in the survey, the industrial sector of these cities showed considerable dynamism in the expansion of establishments. In percentage terms, the medium-sized cities (345.5%) had the highest growth in the number of establishments in the 1990s, with rates higher than those of the Northeast region (285.9%) and Brazil (167.5%). The highlight was the city of Juazeiro do Norte, with the highest concentration of micro and small footwear companies in the state. The number of formal jobs created in the medium-sized cities went from 6,596 in 1990 to 41,660 in 2010, a growth rate of 532%. The sector that contributed most to employment generation was footwear. As for wage levels, the 1990s recorded the lowest levels; in the 2000s there were real wage gains in all cities. However, this momentum was not enough to prevent the medium-sized cities of Ceará from showing relatively low wage levels at the end of the study period.
36

On Ergodic Theorems for Cesàro Convergence of Spherical Averages for Fuchsian Groups: Geometric Coding via Fundamental Domains

Drygajlo, Lars 04 November 2021 (has links)
The thesis is organized as follows: First we state basic ergodic theorems in Section 2 and introduce the notation of Cesàro averages for multiple operators in Section 3. We state a general theorem in Section 3 for groups that can be represented by a finite alphabet and a transition matrix. In the second part we show that finitely generated Fuchsian groups, with certain restrictions to the fundamental domain, admit such a representation. To develop the representation we give an introduction into Möbius transformations (Section 4), hyperbolic geometry (Section 5), the concept of Fuchsian groups and their action in the hyperbolic plane (Section 6) and fundamental domains (Section 7). As hyperbolic geometry calls for visualization we included images at various points to make the definitions and statements more approachable. With those tools at hand we can develop a geometrical coding for Fuchsian groups with respect to their fundamental domain in Section 8. Together with the coding we state in Section 9 the main theorem for Fuchsian groups. The last chapter (Section 10) is devoted to the application of the main theorem to three explicit examples. 
We apply the developed method to the free group F3 and to a fundamental group of a compact manifold of genus two, and we show why the main theorem does not hold for the modular group PSL(2, Z).

Contents:
1 Introduction
2 Ergodic Theorems
 2.1 Mean Ergodic Theorems
 2.2 Pointwise Ergodic Theorems
 2.3 The Limit in Ergodic Theorems
3 Cesàro Averages of Sphere Averages
 3.1 Basic Notation
 3.2 Cesàro Averages as Powers of an Operator
 3.3 Convergence of Cesàro Averages
 3.4 Invariance of the Limit
 3.5 The Limit of Cesàro Averages
 3.6 Ergodic Theorems for Strictly Markovian Groups
4 Möbius Transformations
 4.1 Introduction and Properties
 4.2 Classes of Möbius Transformations
5 Hyperbolic Geometry
 5.1 Hyperbolic Metric
 5.2 Upper Half Plane and Poincaré Disc
 5.3 Topology
 5.4 Geodesics
 5.5 Geometry of Möbius Transformations
6 Fuchsian Groups and Hyperbolic Space
 6.1 Discrete Groups
 6.2 The Group PSL(2, R)
 6.3 Fuchsian Group Actions on H
 6.4 Fuchsian Group Actions on D
7 Geometry of Fuchsian Groups
 7.1 Fundamental Domains
 7.2 Dirichlet Domains
 7.3 Locally Finite Fundamental Domains
  7.3.1 Sides of Locally Finite Fundamental Domains
  7.3.2 Side Pairings for Locally Finite Fundamental Domains
  7.3.3 Finite Sided Fundamental Domains
 7.4 Tessellations of Hyperbolic Space
 7.5 Example Fundamental Domains
8 Coding for Fuchsian Groups
 8.1 Geometric Alphabet
  8.1.1 Alphabet Map
 8.2 Transition Matrix
  8.2.1 Irreducibility of the Transition Matrix
  8.2.2 Strict Irreducibility of the Transition Matrix
9 Ergodic Theorem for Fuchsian Groups
10 Example Constructions
 10.1 The Free Group with Three Generators
  10.1.1 Transition Matrix
 10.2 Example of a Surface Group
  10.2.1 Irreducibility of the Transition Matrix
  10.2.2 Strict Irreducibility of the Transition Matrix
 10.3 Example of PSL(2, Z)
  10.3.1 Irreducibility of the Transition Matrix
  10.3.2 Strict Irreducibility of the Transition Matrix
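The representation by a finite alphabet and transition matrix can be made concrete for the free group F3. The sketch below is an illustration, not the thesis's construction: it builds the 6x6 transition matrix over the generators and their inverses that forbids immediate cancellations, checks irreducibility, and recovers the sphere sizes |S_n| = 6·5^(n-1).

```python
import numpy as np

# Alphabet for the free group F3: generators a, b, c and their inverses
# (written here as capitals). A reduced word never contains x followed by x^{-1}.
letters = ["a", "A", "b", "B", "c", "C"]
inverse = {0: 1, 1: 0, 2: 3, 3: 2, 4: 5, 5: 4}

# Transition matrix: entry (i, j) = 1 iff letter j may follow letter i.
m = np.ones((6, 6), dtype=int)
for i, j in inverse.items():
    m[i, j] = 0

def sphere_size(n):
    """Number of reduced words of length n: total count of allowed
    letter paths with n - 1 transitions, i.e. the sum of M^(n-1)."""
    if n == 0:
        return 1
    return int(np.linalg.matrix_power(m, n - 1).sum())

# Irreducibility (in fact primitivity): some power of M is strictly positive.
irreducible = bool((np.linalg.matrix_power(m, 2) > 0).all())

print([sphere_size(n) for n in range(5)], irreducible)  # → [1, 6, 30, 150, 750] True
```

For PSL(2, Z) the analogous matrix fails the stricter positivity condition, which is why the main theorem does not carry over to that example.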
37

Uplatnění statistických metod při technické analýze akcií / The Use of Statistical Methods in Technical Analysis of Stocks

Pavlásek, Ondřej January 2013 (has links)
This diploma thesis focuses on the use of statistical methods in the technical analysis of stocks. The theoretical part describes the basic principles of regression analysis and technical analysis, with a description of the technical indicators used to predict the future development of share prices and to find the appropriate moment to buy or sell stocks. The result of the analysis is a comparison of the indicators and their applicability to trading the stock titles.
38

Convergence of Large Deviations Probabilities for Processes with Memory - Models and Data Study

Massah, Mozhdeh 17 April 2019 (has links)
A commonly used tool in data analysis is to compute a sample mean. Assuming a uni-modal distribution, its mean provides valuable information about which value is typically found in an observation. Also, it is one of the simplest and therefore very robust statistics to compute, and it suffers much less from sampling effects in the tails of the distribution than estimates of higher moments. In the context of a time series, the sample mean is a time average. Due to correlations among successive data points, the information stored in a time series might be much less than the information stored in a sample of independently drawn data points of equal size, since correlation always implies redundancy. Hence, how close the sample estimate of a time average is to the true mean value of the process depends on the correlations in the data. In this thesis, we will study the probability that a single time average deviates by more than some threshold value from the true process mean. This will be called the Large Deviation Probability (LDP), and it will be a function of the time interval over which the average is taken: the longer the time interval, the smaller this probability will be. However, it is the precise functional form of this decay which is the focus of this thesis. The LDP is proven to decay exponentially for independent identically distributed data. On the other hand, we will see in this thesis that this result does not apply to long-range correlated data, for which the LDP is found to decay more slowly than exponentially. It will be shown that for intermittent series this exponential decay breaks down severely and the LDP is a power law. These findings are outlined in the methodological explanations in chapter 3, after an overview of the theoretical background in chapter 2.

In chapter 4, the theoretical and numerical results for the models studied in chapter 3 are compared to two types of empirical data sets which are both known in the literature to be long-range correlated. The earth surface temperatures at two stations from two climatic zones are modelled and the error bars for the finite-time averages are estimated. Knowing that the data is long-range correlated, from the estimated scaling exponent of the so-called fluctuation function, the LDP estimation leads to noticeably enlarged error bars of time averages, based on the results in chapter 3. The same analysis is applied to heart inter-beat data in chapter 5. The contradiction to the classical large deviation principle is even more severe in this case, induced by the long-range correlations and additional inherent non-stationarity. It will be shown that the inter-beat intervals can be well modeled by bounded fractional Brownian motion. The theoretical and numerical LDP, both for the model and the data, surprisingly indicates no clear decay of the LDP on the time scales under study.
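The exponential decay of the LDP for independent data can be checked numerically. The sketch below uses illustrative parameters and i.i.d. Gaussian data only — the thesis's point is precisely that long-range correlated or intermittent data would decay more slowly than this baseline.

```python
import numpy as np

rng = np.random.default_rng(3)

# Empirical Large Deviation Probability P(|mean_n| > eps) for i.i.d.
# standard normal data, estimated over many independent realizations.
eps, trials = 0.5, 20000
ns = [5, 10, 20, 40]
ldp = []
for n in ns:
    means = rng.standard_normal((trials, n)).mean(axis=1)
    ldp.append((np.abs(means) > eps).mean())

# Cramer's theorem for N(0, 1) gives -log P ~ n * eps^2 / 2: the
# log-probability decays linearly in the averaging window n.
rates = [-np.log(p) / n for n, p in zip(ns, ldp)]
print(ldp, rates)
```

For i.i.d. data the estimated rates stay roughly constant in `n`; a rate that shrinks with `n` is the signature of the slower-than-exponential decay studied in the thesis.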
39

A Comparative Study of High School Academic Paths, Grade Point Averages, and ACT Composite Scores as Predictors of Success at Walters State Community College.

Reuschel, Jill C. 09 May 2009 (has links) (PDF)
With an overwhelming number of students attempting to enter college after high school, the competitive nature of college admissions continues to grow. Colleges and universities are attempting to find appropriate means to adequately predict collegiate success. Common methods of prediction draw on a variety of sources, chief among them high school performance and standardized college admissions testing. Walters State Community College was chosen for this study because of its open door admission policy, which allows for variability in high school academic paths as well as in the grade point averages and ACT scores students earned in high school. The purpose of this study was to examine the associations between high school grade point averages, high school academic paths, ACT scores, and 1st-year college success as measured by the number of college credit hours completed and college grade point averages at the end of the 1st semester and at the end of the 1st academic year. The study included 797 high school students entering the college in fall semester 2007 and completing their 1st academic year in spring semester 2008. The major findings of this study included: University Path students were (a) more likely to have a higher high school grade point average, (b) more likely to have a higher college grade point average and to have earned more college credit hours at the end of the 1st semester and year, and (c) less likely to enroll in remedial and developmental courses. Additionally, a moderate positive relationship was found between high school grade point averages and college grade point averages at the end of the college academic year. High school grade point averages and ACT scores were found to be statistically significant in predicting the number of college credit hours earned at the end of the college academic year.
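The kind of predictive relationship the study reports can be sketched with ordinary least squares on synthetic data. The coefficients and distributions below are hypothetical illustrations, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic cohort (hypothetical numbers): high school GPA and ACT composite,
# with college GPA moderately driven by both plus noise.
n = 797                                   # cohort size matches the study
hs_gpa = np.clip(rng.normal(3.0, 0.5, n), 0.0, 4.0)
act = np.clip(rng.normal(21, 4, n), 1, 36).round()
college_gpa = np.clip(0.6 * hs_gpa + 0.02 * act + rng.normal(0, 0.5, n), 0.0, 4.0)

# Ordinary least squares: college GPA ~ intercept + HS GPA + ACT.
x = np.column_stack([np.ones(n), hs_gpa, act])
coef, *_ = np.linalg.lstsq(x, college_gpa, rcond=None)

# Pearson correlation between HS and college GPA ("moderate positive").
r = np.corrcoef(hs_gpa, college_gpa)[0, 1]
print(coef.round(3), round(r, 2))
```

With these assumed effect sizes, the fitted slope on high school GPA is positive and the HS-college GPA correlation lands in the moderate range, mirroring the pattern the study describes.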
40

Forecasting annual tax revenue of the South African taxes using time series Holt-Winters and ARIMA/SARIMA Models

Makananisa, Mangalani P. 10 1900 (has links)
This study uses aspects of time series methodology to model and forecast major taxes such as Personal Income Tax (PIT), Corporate Income Tax (CIT), Value Added Tax (VAT) and Total Tax Revenue (TTAXR) in the South African Revenue Service (SARS). The monthly data used for modeling the tax revenues of the major taxes was drawn from January 1995 to March 2010 (in-sample data) for PIT, VAT and TTAXR. Due to higher volatility and emerging negative values, the CIT monthly data was converted to quarterly data from the first quarter of 1995 to the first quarter of 2010. The competing ARIMA/SARIMA and Holt-Winters models were derived, and the resulting models were used to forecast PIT, CIT, VAT and TTAXR for SARS fiscal years 2010/11, 2011/12 and 2012/13. The results show that both the SARIMA and Holt-Winters models perform well in modeling and forecasting PIT and VAT; however, the Holt-Winters model outperformed the SARIMA model in modeling and forecasting the more volatile CIT and TTAXR. It is recommended that these methods be used in forecasting future payments, as they forecast tax revenues precisely, with minimal errors and fewer model revisions being necessary. / Statistics / M.Sc. (Statistics)
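An additive Holt-Winters recursion of the type compared in the study can be sketched as follows. The smoothing constants and the synthetic revenue series are illustrative assumptions, not SARS data or the study's fitted parameters.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic monthly "tax revenue" with trend and seasonality (hypothetical
# numbers standing in for a series like PIT receipts).
m = 12                                   # seasonal period
t = np.arange(10 * m)
y = 100 + 0.5 * t + 10 * np.sin(2 * np.pi * t / m) + rng.normal(0, 2, len(t))

def holt_winters_additive(y, m, alpha=0.3, beta=0.05, gamma=0.2, horizon=12):
    """Additive Holt-Winters: smooth level, trend, and seasonal components,
    then extrapolate level + trend with the last seasonal cycle."""
    level = y[:m].mean()
    trend = (y[m:2 * m].mean() - y[:m].mean()) / m
    seas = list(y[:m] - level)
    for i in range(len(y)):
        s = seas[i % m]                  # seasonal value from m steps ago
        new_level = alpha * (y[i] - s) + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        seas[i % m] = gamma * (y[i] - new_level) + (1 - gamma) * s
        level = new_level
    return np.array([level + (h + 1) * trend + seas[(len(y) + h) % m]
                     for h in range(horizon)])

fcst = holt_winters_additive(y, m)
print(fcst.round(1))
```

Because all three components are updated recursively, the method adapts to the kind of volatility that, in the study, let Holt-Winters outperform SARIMA on CIT and TTAXR.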
