  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
31

Fundamentální a technická analýza vybraného aktiva / Fundamental and technical analysis of a particular asset

Nepomnyashchiy, Ilya January 2015 (has links)
The goal of the thesis is to evaluate the degree of efficiency of the selected markets and to apply the methods of fundamental and technical analysis to them in order to assess their efficiency in terms of profitability. The thesis analyses the degree of long-term memory of the selected commodities and stock indices via the Hurst coefficient. Fundamental and technical methods are then applied to the market with the highest degree of long-term memory, which is the feeder cattle market. Individual methods from both disciplines are applied first, after which a combination of both is applied as well. The result is the discovery of whether combining the two approaches leads to a higher profitability of the trading strategy. At the end, the effect of transaction costs is also evaluated and a final conclusion is made regarding the profit potential of both methods for an individual Czech investor.
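The long-term-memory screening described above rests on rescaled-range (R/S) analysis. A minimal sketch of that estimator follows; the dyadic window scheme and the `min_chunk` cutoff are illustrative assumptions, not the thesis's exact implementation:

```python
import numpy as np

def hurst_rs(series, min_chunk=8):
    """Estimate the Hurst exponent of a 1-D series by rescaled-range (R/S) analysis."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    sizes, rs_vals = [], []
    size = n
    while size >= min_chunk:
        rs = []
        for i in range(0, n - size + 1, size):
            c = x[i:i + size]
            dev = np.cumsum(c - c.mean())   # cumulative deviation from the chunk mean
            s = c.std(ddof=0)               # chunk standard deviation
            if s > 0:
                rs.append((dev.max() - dev.min()) / s)
        if rs:
            sizes.append(size)
            rs_vals.append(np.mean(rs))
        size //= 2
    # The slope of log(R/S) against log(window size) estimates the Hurst exponent.
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return slope
```

A value near 0.5 indicates no long-term memory (an efficient, random-walk-like market), while values well above 0.5 indicate the persistence that makes a market such as feeder cattle a candidate for the combined strategy.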
32

Capital Asset Prices Modelling - Concept VAPM / Capital Asset Price Modelling: Concept VAPM

Kuklik, Robert G. January 2008 (has links)
The key objective of this thesis is the outline of an alternative capital market modeling framework, the Volatility Asset Pricing Model (VAPM), inspired by the innovative dual approach of Mandelbrot and Hudson and based on the synthesis of two seemingly antagonistic factors: the volatility of market prices and their serial dependence, which together determine the capital markets' dynamics. Pilot tests of this model over various periods, using both a market index and a portfolio of selected securities, delivered generally satisfactory results. The work first delivers a brief recapitulation of the concepts of consumer/investor choice under hypothetical conditions of certainty. This outline is followed by a description of the "classical" methodologies for the risky environment of uncertainty, with an assessment of their key models (the CAPM, SIM, MIM, APTM, etc.) and of the related testing approaches. This assessment rests on an evaluation of the underlying doctrine of the Efficient Market Hypothesis in relation to the so-called Random Walk Model, and in this context the work also offers a brief exposure to a few selected tests of these controversial concepts. Finally, the main points of contemporary approaches such as the fractal dimension and the Hurst exponent, in the dynamic framework of information entropy, are described as the theoretical tools leading to the development of the VAPM. The major contribution of this thesis lies in its attempt to apply these concepts in practice, with the intention of inspiring further analytical research.
33

Probando la Hipótesis de Eficiencia de Mercado para el MILA utilizando el exponente de Hurst: Una aproximación dinámica del Rango reescalado (R/S) / Testing Efficient Market Hypothesis for MILA markets using the Hurst Exponent: A Dynamic Rescale/Range (R/S) approach

García Arroyo, Álvaro Leonardo 10 November 2021 (has links)
This work tests the efficient market hypothesis (EMH) through a measure of temporal persistence known as the Hurst exponent. This approach, in addition to being related to the fractal dimension, allows the analysis of the efficient market hypothesis proposed by Eugene Fama in 1970 to be expanded. The Hurst exponent is calculated with the rescaled range (R/S) method, and its application is extended to a dynamic estimation over the period 2006-2021. The indicator serves as a market efficiency index and is estimated for the daily return series of the MILA stock markets, made up of Chile, Colombia, Mexico, and Peru. The results show that Peru is the least efficient market, with the highest number of inefficiency cycles over the period studied. Mexico, on the other hand, turns out to be the only MILA market that remained within the efficiency region. / Research paper
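The dynamic estimation described above amounts to recomputing a Hurst exponent over a sliding window of returns. The sketch below illustrates the idea; the window length, step size, and the compact R/S estimator are illustrative assumptions rather than the thesis's exact parameters:

```python
import numpy as np

def rs_hurst(x):
    """Point estimate of the Hurst exponent via rescaled range over dyadic window sizes."""
    x = np.asarray(x, dtype=float)
    sizes, rs_means = [], []
    size = len(x) // 2
    while size >= 16:
        rs = []
        for i in range(0, len(x) - size + 1, size):
            c = x[i:i + size]
            dev = np.cumsum(c - c.mean())
            s = c.std()
            if s > 0:
                rs.append((dev.max() - dev.min()) / s)
        if rs:
            sizes.append(size)
            rs_means.append(np.mean(rs))
        size //= 2
    return np.polyfit(np.log(sizes), np.log(rs_means), 1)[0]

def rolling_hurst(returns, window=500, step=20):
    """Hurst exponent over a sliding window -- a dynamic market-efficiency index.

    Returns (end_index, H) pairs; values near 0.5 suggest efficiency, while
    sustained excursions above 0.5 flag cycles of inefficiency (long memory).
    """
    r = np.asarray(returns, dtype=float)
    return [(start + window, rs_hurst(r[start:start + window]))
            for start in range(0, len(r) - window + 1, step)]
```

Plotting the resulting series of H values against time, with a band around 0.5 marking the "efficiency region", reproduces the kind of dynamic diagnostic the study applies to the MILA markets.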
34

Prospective Control: Effect of Exploratory-task-generated-motion on Adaptation in Real and Virtual Environments

Littman, Eric Marshall 25 March 2009 (has links)
No description available.
35

Application of Wavelets to Filtering and Analysis of Self-Similar Signals

Wirsing, Karlton 30 June 2014 (has links)
Digital signal processing has been dominated by the Fourier transform since the Fast Fourier Transform (FFT) was developed in 1965 by Cooley and Tukey. In the 1980s a new transform, the wavelet transform, was developed, even though the first wavelet goes back to 1910. With the Fourier transform, all information about localized changes in signal features is spread out across the entire signal space, making local features global in scope. Wavelets are able to retain localized information about the signal by applying a function of limited duration, also called a wavelet, to the signal. As with the Fourier transform, the discrete wavelet transform has an inverse transform, which allows us to make changes to a signal in the wavelet domain and then transform it back into the time domain. In this thesis, we have investigated the filtering properties of this technique and analyzed its performance under various settings. Another popular application of the wavelet transform is data compression, as in the JPEG 2000 standard and the compressed digital storage of fingerprints developed by the FBI. Previous work on filtering has focused on the discrete wavelet transform. Here, we extended that method to the stationary wavelet transform and found that it gives a performance boost of as much as 9 dB over the discrete wavelet transform. We also found that the SNR of noise filtering decreases as the frequency of the base signal increases up to the Nyquist limit, for both the discrete and stationary wavelet transforms. Besides filtering the signal, the discrete wavelet transform can also be used to estimate the standard deviation of the white noise present in the signal. We extended this estimator from the discrete wavelet transform to the stationary wavelet transform. As with filtering, the quality of the estimate decreases as the frequency of the base signal increases.
Many interesting signals are self-similar, meaning that one of their properties is invariant across many different scales. One example is strict self-similarity, where an exact copy of the signal is replicated on many scales, but the most common property is statistical self-similarity, where a random segment of the signal is replicated on many different scales. In this work, we investigated wavelet-based methods for detecting statistical self-similarity in a signal and their performance on various types of self-similar signals. Specifically, we found that the quality of the estimate depends on the units of the signal being investigated for low Hurst exponents, and on the type of edge padding used for high Hurst exponents. / Master of Science
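The noise-standard-deviation estimator mentioned above is, in its standard form, a median-absolute-deviation rule on the finest-scale detail coefficients. A minimal sketch with a hand-rolled Haar step follows; the choice of the Haar wavelet and the constant 0.6745 follow the common robust-estimation recipe and are assumptions, not necessarily the thesis's exact setup:

```python
import numpy as np

def noise_std_haar(signal):
    """Estimate white-noise standard deviation from level-1 Haar detail coefficients.

    Uses the classic rule sigma ~ median(|d|) / 0.6745: the finest-scale detail
    coefficients of a smooth signal are dominated by the noise, and the median
    makes the estimate robust to the few large coefficients caused by signal edges.
    """
    x = np.asarray(signal, dtype=float)
    if len(x) % 2:
        x = x[:-1]                                   # Haar pairs samples; drop a trailing odd one
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)      # finest-scale Haar detail coefficients
    return np.median(np.abs(detail)) / 0.6745
```

Because the Haar step is an orthonormal transform, white noise of standard deviation sigma produces detail coefficients with the same standard deviation, which is what the median rule recovers.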
36

Análise de textura em imagens baseado em medidas de complexidade / Image Texture Analysis based on complex measures

Condori, Rayner Harold Montes 30 November 2015 (has links)
Texture analysis is one of the most basic and popular research areas in computer vision. It is also important in many other disciplines, such as the medical and biological sciences; for example, detecting non-healthy tissue in lung magnetic resonance images is a common texture analysis task. In this dissertation, we propose a novel method of texture characterization based on complexity measures such as the Hurst exponent, the Lyapunov exponent, and Lempel-Ziv complexity. These measures are applied to samples taken from images in the frequency domain. Three sampling methods are proposed: radial sampling, circular sampling, and sampling by partially self-avoiding deterministic walks (CDPA sampling). Each sampling method produces one feature vector per complexity measure, containing a set of descriptors that describe the processed image. Each image is therefore represented by nine feature vectors (three complexity measures over three sampling methods), which are compared on texture classification tasks. Finally, we concatenate the feature vectors obtained by computing the Lempel-Ziv complexity on radial and circular samples with descriptors obtained through traditional texture analysis techniques: local binary patterns (LBP), Gabor wavelets (GW), gray-level co-occurrence matrices (GLCM), and partially self-avoiding deterministic walks on graphs (CDPAg).
This approach was tested on three image databases: Brodatz, USPtex, and UIUC, each with its own well-known challenges. The success rates of all the traditional methods were increased by concatenating relatively few Lempel-Ziv descriptors; in the LBP case, for example, the rate went from 84.25% to 89.09% with the addition of only five descriptors. In fact, concatenating just five descriptors is enough to see an increase in the success rate of every traditional method studied. On the other hand, concatenating an excessive number of Lempel-Ziv descriptors (for example, more than 40) generally brings no further improvement. Given the similar results obtained on the three databases, we conclude that the proposed method can be used to increase success rates in other texture classification tasks. Finally, CDPA sampling also yields significant results, which can be improved in future work. / Master of Science
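The Lempel-Ziv complexity used above as a texture descriptor is typically computed on a binarized sequence with the LZ76 phrase-counting scheme. A small sketch follows; median binarization is a common convention and an assumption here, not necessarily the dissertation's exact choice:

```python
import numpy as np

def lempel_ziv_complexity(bits):
    """Count the distinct phrases in the LZ76 parsing of a binary sequence.

    A phrase is extended while it still occurs as a substring of everything
    seen up to (but not including) its final symbol; each completed phrase
    increments the count. Regular sequences yield few phrases, random ones many.
    """
    s = ''.join('1' if b else '0' for b in bits)
    i, count, n = 0, 0, len(s)
    while i < n:
        length = 1
        while i + length <= n and s[i:i + length] in s[:i + length - 1]:
            length += 1
        count += 1
        i += length
    return count

def binarize(signal):
    """Binarize a real-valued sample sequence around its median."""
    x = np.asarray(signal, dtype=float)
    return x > np.median(x)
```

On the classic example string `0001101001000101` this parsing yields 6 phrases, while a strictly periodic sequence of the same length collapses to a handful, which is what makes the count usable as a complexity descriptor.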
37

Comparing South African financial markets behaviour to the geometric Brownian Motion Process

Karangwa, Innocent January 2008 (has links)
This study examines the behaviour of the South African financial markets with regard to the geometric Brownian motion process. It uses the daily, weekly, and monthly stock-return time series of some major securities trading in the South African financial market, specifically the US dollar/Euro exchange rate, the JSE ALSI Total Returns Index, the South African All Bond Index, Anglo American Corporation, Standard Bank, Sasol, the US dollar gold price, the Brent spot oil price, and the South African white maize near future. The assumptions underlying geometric Brownian motion in finance, namely the stationarity, normality, and independence of stock returns, are tested using both graphical methods (histograms and normal plots) and statistical tests (the Kolmogorov-Smirnov test, the Box-Ljung statistic, and the augmented Dickey-Fuller test) to check whether or not Brownian motion holds as a model for the South African financial markets. The Hurst exponent, or independence index, is also applied to support the results of the previous tests. Theoretically, an independent or geometric Brownian motion time series should be characterised by a Hurst exponent of 1/2; a different value would indicate the presence of long memory, or fractional Brownian motion, in the time series. The study shows that at least one assumption is violated when the geometric Brownian motion process is examined assumption by assumption. It also reveals the presence of both long memory and random walk (geometric Brownian motion) behaviour in South African financial market returns when the Hurst index analysis is used, and finds that the currency market is the most efficient of the South African financial markets. The study concludes that although some assumptions underlying the process are violated, Brownian motion as a model for the South African financial markets cannot be rejected. It can be accepted in some instances if parameters such as the Hurst exponent are added.
39

Prospective control effect of exploratory-task-generated-motion on adaptation in real and virtual environments /

Littman, Eric Marshall. January 2009 (has links)
Thesis (M.A.)--Miami University, Dept. of Psychology, 2009. / Title from first page of PDF document. Includes bibliographical references (p. 43-47).
40

Comparação da análise de diferentes perfis de poços pelo método DFA / Comparative analysis of different well profiles by the DFA method

Silva, Ítalo Batista da 16 January 2014 (has links)
The study of complex systems has become a prestigious area of science, although a relatively young one. Its importance has been demonstrated by the diversity of applications that such studies have already provided to fields as varied as biology, economics, and climatology. In physics, the complex-systems approach has been creating paradigms that markedly influence new methods, bringing to statistical physics macroscopic-level problems no longer restricted to classical studies such as those of thermodynamics. The present work compares and verifies the statistical clustering of data from sonic (DT), gamma-ray (GR), induction (ILD), neutron (NPHI), and density (RHOB) logs, physical quantities measured during the drilling of exploratory wells that are of fundamental importance for locating, identifying, and characterizing oil reservoirs. The software packages Statistica, Matlab R2006a, Origin 6.1, and Fortran were used to compare and verify the well-log data from the Namorado School Field provided by the ANP (National Petroleum Agency). The work demonstrates the importance of the DFA method, which proved quite satisfactory here: the Hurst exponent H produces spatial data with greater clustering, so it is possible to find spatial patterns using the Hurst coefficient. The logs of the 56 wells confirmed the existence of spatial patterns in the Hurst exponents, that is, in the parameter B. The evaluated log does not directly catalog the geological lithology, but it reveals a non-random spatial distribution. / 2024-12-31
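The DFA method referenced in the title integrates the series into a profile, fits a local trend in each window, and measures the residual fluctuation as a function of scale. A minimal sketch follows; the logarithmic scale grid and first-order (linear) detrending are common defaults and assumptions here, not necessarily this dissertation's exact configuration:

```python
import numpy as np

def dfa_exponent(series, scales=None):
    """Detrended fluctuation analysis: slope of log F(n) versus log n.

    An exponent near 0.5 indicates uncorrelated data, above 0.5 persistence,
    and around 1.5 a nonstationary random-walk-like series.
    """
    x = np.asarray(series, dtype=float)
    y = np.cumsum(x - x.mean())                      # integrated profile
    n = len(y)
    if scales is None:
        scales = np.unique(np.logspace(np.log10(8), np.log10(n // 4), 12).astype(int))
    flucts = []
    for s in scales:
        f2 = []
        t = np.arange(s)
        for k in range(n // s):
            seg = y[k * s:(k + 1) * s]
            coeffs = np.polyfit(t, seg, 1)           # local linear trend
            f2.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope
```

Applied window by window to each well log (DT, GR, ILD, NPHI, RHOB), the exponent gives the kind of per-profile persistence measure whose spatial clustering the study examines.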
