11

資料窺探與交易策略之獲利性:以亞洲股票市場為例 / Data snooping and the profitability of trading strategies: evidence from the asian stock markets

李榮傑, Lee, Chung Chieh Unknown Date (has links)
李榮傑, Lee, Chung Chieh Unknown Date (has links)
In this paper, we examine the profitability of trading strategies using both White's (2000) Reality Check and Romano and Wolf's (2005) stepwise multiple test, which correct for data-snooping bias. Unlike previous studies employing the data-snooping methodology, our analysis builds the universe of forecasts (trading strategies) from both technical analysis and time-series prediction, and focuses on six major Asian stock markets. Overall, we find little evidence supporting the profitability of trading strategies. Our basic analysis detects only a few profitable trading rules, in two emerging markets, once transaction costs are taken into account. Moreover, the sub-period results show that the performance of the profitable strategies is unstable and that their profitability has weakened considerably in recent years. In further analysis, we find no trading strategy in our universe that outperforms the simple buy-and-hold strategy.
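The core of the data-snooping correction named in this abstract can be sketched in a few lines. The following is a simplified illustration of White's (2000) Reality Check, not the thesis's implementation: it resamples periods with an i.i.d. bootstrap for brevity (White's paper uses the stationary bootstrap to preserve serial dependence), and the toy data are pure noise.

```python
import numpy as np

rng = np.random.default_rng(0)

def reality_check_pvalue(excess_returns, n_boot=2000, rng=rng):
    """Simplified White (2000) Reality Check.

    excess_returns: (T, K) array, each strategy's per-period return
    minus the benchmark (e.g. buy-and-hold) return.
    Tests H0: no strategy beats the benchmark, against the best of K.
    """
    T, K = excess_returns.shape
    means = excess_returns.mean(axis=0)
    v_obs = np.sqrt(T) * means.max()            # observed max statistic
    count = 0
    for _ in range(n_boot):
        idx = rng.integers(0, T, size=T)        # i.i.d. resample of periods
        boot_means = excess_returns[idx].mean(axis=0)
        # recentre so the bootstrap mimics the null distribution
        v_boot = np.sqrt(T) * (boot_means - means).max()
        if v_boot >= v_obs:
            count += 1
    return count / n_boot

# Toy example: 5 noise strategies over 250 periods, none truly profitable,
# so the p-value should typically be large
x = rng.normal(0.0, 1.0, size=(250, 5))
p = reality_check_pvalue(x)
```

The recentring step is what distinguishes the Reality Check from naively bootstrapping the best strategy's mean: it evaluates the maximum over the whole universe under the null, which is exactly the data-snooping correction.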
12

Multiple Outlier Detection: Hypothesis Tests versus Model Selection by Information Criteria

Lehmann, Rüdiger, Lösler, Michael 14 June 2017 (has links) (PDF)
The detection of multiple outliers can be interpreted as a model selection problem. The models that can be selected are the null model, which indicates an outlier-free set of observations, or a class of alternative models, which contain a set of additional bias parameters. A common way to select the right model is a statistical hypothesis test; in geodesy, data snooping is the most popular such test. Another approach arises from information theory: here, the Akaike information criterion (AIC) is used to select an appropriate model for a given set of observations. The AIC is based on the Kullback-Leibler divergence, which describes the discrepancy between the model candidates. Both approaches are discussed and applied to test problems: the fitting of a straight line and a geodetic network. Some relationships between data snooping and information criteria are discussed. When compared, the information-criteria approach turns out to be simpler and more elegant. However, alongside the AIC there are many alternative information criteria, which may select different outliers, and it is not clear which one is optimal.
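The AIC-based selection described here can be illustrated on the straight-line test problem the abstract mentions. This is a minimal sketch under stated assumptions, not the authors' code: each alternative model adds one bias parameter for a candidate outlier, which (for least squares) is equivalent to refitting with that observation removed; the simulated data and the Gaussian AIC form used are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_line(x, y):
    """Least-squares fit of y = b0 + b1*x; returns coefficients and RSS."""
    A = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = np.sum((y - A @ beta) ** 2)
    return beta, rss

def aic(rss, n, k):
    # Gaussian AIC with unknown variance, additive constants dropped:
    # AIC = n * ln(RSS / n) + 2k
    return n * np.log(rss / n) + 2 * k

# Simulate a straight line with one gross error
n = 20
x = np.linspace(0.0, 1.0, n)
y = 2.0 + 3.0 * x + rng.normal(0.0, 0.1, n)
y[7] += 1.5                                   # inject an outlier at index 7

# Null model: no outlier (k = 2 line parameters + 1 variance)
_, rss0 = fit_line(x, y)
scores = {None: aic(rss0, n, 3)}

# Alternative models: one extra bias parameter per candidate outlier,
# equivalent to refitting with that observation excluded (k = 4)
for i in range(n):
    mask = np.arange(n) != i
    _, rss_i = fit_line(x[mask], y[mask])
    scores[i] = aic(rss_i, n, 4)

best = min(scores, key=scores.get)            # None, or the flagged index
```

The model with the smallest AIC wins; a classical data-snooping test would instead compare each normalized residual against a critical value, so the two approaches can disagree near the threshold, which is part of what the paper examines.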
13

Multiple Outlier Detection: Hypothesis Tests versus Model Selection by Information Criteria

Lehmann, Rüdiger, Lösler, Michael January 2016 (has links)
The detection of multiple outliers can be interpreted as a model selection problem. The models that can be selected are the null model, which indicates an outlier-free set of observations, or a class of alternative models, which contain a set of additional bias parameters. A common way to select the right model is a statistical hypothesis test; in geodesy, data snooping is the most popular such test. Another approach arises from information theory: here, the Akaike information criterion (AIC) is used to select an appropriate model for a given set of observations. The AIC is based on the Kullback-Leibler divergence, which describes the discrepancy between the model candidates. Both approaches are discussed and applied to test problems: the fitting of a straight line and a geodetic network. Some relationships between data snooping and information criteria are discussed. When compared, the information-criteria approach turns out to be simpler and more elegant. However, alongside the AIC there are many alternative information criteria, which may select different outliers, and it is not clear which one is optimal.
14

AJUSTAMENTO DE LINHA POLIGONAL NO ELIPSÓIDE / TRAVERSE ADJUSTMENT IN THE ELLIPSOID

Bisognin, Márcio Giovane Trentin 26 April 2006 (has links)
This thesis treats traverse adjustment on the surface of the ellipsoid, with the aims of guaranteeing a unique solution for the transport of curvilinear geodetic coordinates (latitude and longitude) and of azimuth, and of obtaining quality estimates. The coordinate and azimuth transport are derived from the Legendre series of the geodesic line. This series is based on the Taylor series, with the length of the geodesic line as the argument. For practical applications the series must be truncated, and error functions for latitude, longitude, and azimuth must be computed. In this research the series are truncated at the third derivative, and the error functions are expressed in terms of the fourth derivative. The adjustment models based on the least-squares method are described: the combined model with weighted parameters, the combined (mixed) model, the parametric model (observation equations), and the correlates model (condition equations). The practical application is the adjustment, via the parametric model, of a traverse measured by the Instituto Brasileiro de Geografia e Estatística (IBGE), consisting of 8 vertices with a total length of 129.661 km. Errors in the observations are located with Baarda's data-snooping test in the last iteration of the adjustment, which flagged some observations as erroneous. The quality estimates are contained in the variance-covariance matrices; the semi-axes of the error ellipse (or standard ellipse) of each point are computed through the spectral (Jordan) decomposition of the submatrices of the variance-covariance matrix of the adjusted parameters (the coordinates). The application of the Legendre series proves satisfactory for short distances of up to 40 km. The convergence of the series is fast for the adjusted coordinates: with a stopping criterion of four decimal places of a sexagesimal arc second, convergence is reached at the second iteration of the adjustment.
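The error-ellipse computation named in this abstract reduces to an eigendecomposition of each point's 2x2 covariance block. The sketch below illustrates that step only; the covariance matrix is a hypothetical stand-in, not data from the IBGE traverse.

```python
import numpy as np

def error_ellipse(cov2x2, scale=1.0):
    """Semi-axes and orientation of a point's standard error ellipse.

    cov2x2: the 2x2 submatrix of the adjusted-parameter
    variance-covariance matrix belonging to one point.
    The spectral (Jordan) decomposition gives the squared semi-axes
    as eigenvalues and the axis directions as eigenvectors.
    """
    eigvals, eigvecs = np.linalg.eigh(cov2x2)   # eigenvalues ascending
    a = scale * np.sqrt(eigvals[1])             # semi-major axis
    b = scale * np.sqrt(eigvals[0])             # semi-minor axis
    # angle of the major axis, measured from the first coordinate axis
    theta = np.arctan2(eigvecs[1, 1], eigvecs[0, 1])
    return a, b, theta

# Hypothetical 2x2 covariance block for one adjusted point
Q = np.array([[4.0, 1.2],
              [1.2, 2.0]])
a, b, theta = error_ellipse(Q)
```

With `scale=1.0` this yields the standard ellipse; confidence ellipses at a chosen probability level follow by multiplying the semi-axes with the appropriate chi-square-based factor.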
