11

外匯市場的技術分析與央行干預 / Technical trading rules in the exchange rate markets and central bank intervention

吳至剛 Unknown Date (has links)
In this paper we construct a large universe of simple trading rules and apply White's Reality Check to mitigate data-snooping bias when testing the profitability of technical trading rules. Using data from 1980 to 2008 for the yen/dollar and pound/dollar exchange rates, we find that technical analysis is profitable in both the full sample and each subsample period. The channel break-out method outperforms the other methods, while the most commonly used moving-average method performs worse than the others. Beyond testing profitability, and following Szakmary and Mathur (1997), we examine the relationship between the returns of technical trading rules and central bank intervention, extending the rule universe to all rules used in the Reality Check and lengthening the sample period as far as possible. The results suggest no evident relationship between trading-rule returns and central bank intervention, which is inconsistent with the earlier study we follow.
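The Reality Check behind these results can be sketched compactly. The following is a minimal pure-Python illustration, not the author's implementation: it computes a bootstrap p-value for the best rule's mean excess return over a benchmark using the stationary bootstrap (mean block length 1/q) that White's method relies on. The rule returns, `q`, and the number of resamples are illustrative choices.

```python
import random

def stationary_bootstrap_indices(n, q, rng):
    """One resample of 0..n-1 by the Politis-Romano stationary bootstrap:
    geometric block lengths with mean 1/q, wrapping around the sample."""
    idx = [rng.randrange(n)]
    while len(idx) < n:
        if rng.random() < q:
            idx.append(rng.randrange(n))       # start a new block
        else:
            idx.append((idx[-1] + 1) % n)      # continue the current block
    return idx

def reality_check_pvalue(perf, n_boot=500, q=0.1, seed=0):
    """perf[k][t] is rule k's excess return over the benchmark at time t.
    Returns a bootstrap p-value for the best rule's mean excess return."""
    rng = random.Random(seed)
    n = len(perf[0])
    means = [sum(p) / n for p in perf]
    v_max = max(means)                         # observed best mean performance
    exceed = 0
    for _ in range(n_boot):
        idx = stationary_bootstrap_indices(n, q, rng)
        # Recentre each rule around its own mean before taking the max.
        v_star = max(sum(p[i] - m for i in idx) / n
                     for p, m in zip(perf, means))
        if v_star >= v_max:
            exceed += 1
    return exceed / n_boot
```

Recentring each rule's returns around its own sample mean is what keeps the test honest: the best rule must beat not just the benchmark but also the luck of being the maximum over the whole universe of rules.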
12

Granskning och optimering av data- och IP-telefoninätverk / Review and Optimization of a Data and IP Telephony Network

Eriksson, Jhonny, Karlsson, Joel January 2010 (has links)
The company Västra Mälardalens Kommunalförbund, VMKF, wishes to review and optimize its present data and IP-telephony network, which today spans the three municipalities of Köping, Arboga and Kungsör. As a municipal corporation, it seeks consultation on an internal as well as external review of the main structure of the network, its functionality and its security. Today's growing use of the Internet places ever higher demands on accessibility, security and usability, and network design rests on the balance between these three concerns. It is therefore important to optimize the network design and hardware usage, and to minimize the administrative overhead, particularly when the municipality's resources are scarce and time means money. Starting from an impartial review of the network as it stands today, it was established that several improvements could be made. Among these are a restructured network design, covering routing, switching, security, quality of service (QoS) and technical administration, and the implementation of real-time monitoring of bandwidth consumption.
14

Methods for Creating and Exploiting Data Locality

Wallin, Dan January 2006 (has links)
The gap between processor speed and memory latency has led to the use of caches in the memory systems of modern computers. Programs must use the caches efficiently and exploit data locality for maximum performance. Multiprocessors, built from many processing units, are becoming commonplace not only in large servers but also in smaller systems such as personal computers. Multiprocessors require careful data locality optimizations, since accesses from other processors can lead to invalidations and false-sharing cache misses. This thesis explores hardware and software approaches for creating and exploiting temporal and spatial locality in multiprocessors. We propose the capacity prefetching technique, which efficiently reduces the number of cache misses yet avoids false sharing by distinguishing, at run time, cache lines involved in communication from non-communicating cache lines. Prefetching techniques often lead to increased coherence and data traffic. The new bundling technique avoids one of these drawbacks and reduces the coherence traffic caused by multiprocessor prefetchers. This is especially important in snoop-based systems, where coherence bandwidth is a scarce resource. Most of the studies have been performed on advanced scientific algorithms. This thesis demonstrates that a cc-NUMA multiprocessor, with hardware data migration and replication optimizations, efficiently exploits the temporal locality of such codes. We further present a method of parallelizing a multigrid Gauss-Seidel partial differential equation solver, which creates temporal locality at the expense of increased communication. Our conclusion is that on modern chip multiprocessors it is more important to optimize algorithms for data locality than to avoid communication, since communication can take place through a shared cache.
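As a toy illustration of creating temporal locality in a Gauss-Seidel smoother (single-threaded, unlike the parallel multigrid solver the thesis presents), two successive sweeps over a 1D Poisson problem can be fused into one pass, so each point receives its second update while its neighbourhood is still cache-resident. The fusion changes no operand values, so the result is identical to two plain sweeps:

```python
def gauss_seidel_sweep(u, f, h):
    """One in-place Gauss-Seidel sweep for the 1D Poisson problem -u'' = f."""
    for i in range(1, len(u) - 1):
        u[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i])
    return u

def fused_two_sweeps(u, f, h):
    """Two Gauss-Seidel sweeps fused into one pass: point i-1 receives its
    second update as soon as its neighbours have had their first, while the
    data is still hot in cache. Every update sees exactly the same operand
    values as in two separate sweeps, so the result is bitwise identical."""
    n = len(u)
    for i in range(1, n - 1):
        u[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i])          # 1st update of i
        if i >= 2:
            u[i - 1] = 0.5 * (u[i - 2] + u[i] + h * h * f[i - 1])  # 2nd update of i-1
    u[n - 2] = 0.5 * (u[n - 3] + u[n - 1] + h * h * f[n - 2])      # 2nd update of n-2
    return u
```

In pure Python the cache effect is invisible, but in a compiled language this kind of temporal blocking halves the number of passes over memory.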
15

Analýza a demonstrace vybraných L2 útoků / An Analysis of Selected Layer 2 Network Attacks

Lomnický, Marek January 2009 (has links)
This MSc thesis covers the principles, practical feasibility and countermeasures for four attacks used against contemporary local-area networks: CAM Table Overflow, which is capable of capturing traffic in switched networks; the ARP Man-in-the-Middle attack, whose goal is to redirect or modify traffic; and two variants of the VLAN Hopping attack, which allow an attacker to send data to, and capture data from, VLANs he has no access to.
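A toy simulation shows why CAM table overflow works; this is an illustrative model, not an attack tool, and the class and MAC addresses are invented for the example. A learning switch keeps a bounded MAC-to-port table; once an attacker fills it with bogus source addresses, legitimate hosts can no longer be learned, so frames addressed to them are flooded out of every port, where the attacker can capture them:

```python
class ToySwitch:
    """A toy learning switch with a bounded CAM (MAC-to-port) table.
    A simplified model for illustration, not real switch firmware."""
    def __init__(self, cam_size):
        self.cam = {}
        self.cam_size = cam_size

    def learn(self, src_mac, port):
        # A full CAM table silently ignores new source addresses.
        if src_mac in self.cam or len(self.cam) < self.cam_size:
            self.cam[src_mac] = port

    def forward(self, src_mac, dst_mac, in_port):
        self.learn(src_mac, in_port)
        # Known destination: unicast out of one port. Unknown: flood everywhere.
        return self.cam.get(dst_mac, "FLOOD")

sw = ToySwitch(cam_size=128)
for i in range(1000):                  # attacker on port 3 floods bogus MACs
    sw.learn("02:00:00:00:%02x:%02x" % (i // 256, i % 256), 3)
sw.forward("aa:bb:cc:00:00:01", "ff:ff:ff:ff:ff:ff", 1)  # victim can't be learned
print(sw.forward("aa:bb:cc:00:00:02", "aa:bb:cc:00:00:01", 2))  # prints FLOOD
```

Port security (limiting learned MACs per port) breaks the attack at the `learn` step, which is why it is the standard mitigation.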
16

資料窺探與交易策略之獲利性:以亞洲股票市場為例 / Data snooping and the profitability of trading strategies: evidence from the asian stock markets

李榮傑, Lee, Chung Chieh Unknown Date (has links)
In this paper, we examine the profitability of trading strategies using both White's (2000) Reality Check and Romano and Wolf's (2005) stepwise multiple test, which correct for data-snooping bias. Unlike previous studies using the data-snooping methodology, our analysis builds the universe of forecasts (trading strategies) from both technical analysis and time-series prediction, and the markets we investigate are six major Asian stock markets. Overall we find little evidence supporting the profitability of trading strategies. Our basic analysis shows that once transaction costs are taken into account, only a few profitable trading strategies are detected, and only in two emerging markets. Moreover, as the sub-period results show, the performance of the profitable strategies is unstable and their profitability has become much weaker in recent years. In further analysis, we find no trading strategy in our universe that outperforms the basic buy-and-hold strategy.
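As an example of one member of such a universe of forecasts, a long/flat moving-average crossover rule can be written in a few lines. The window lengths here are illustrative; studies of this kind vary them over a large grid, which is precisely what makes the data-snooping correction necessary:

```python
def sma(xs, n, t):
    """Simple moving average of xs over the n points ending at index t."""
    return sum(xs[t - n + 1:t + 1]) / n

def ma_crossover_returns(prices, short=5, long=20):
    """Per-period simple returns of a long/flat moving-average crossover rule:
    hold the asset while the short MA is above the long MA, else stay in cash."""
    rets = []
    for t in range(long - 1, len(prices) - 1):
        pos = 1 if sma(prices, short, t) > sma(prices, long, t) else 0
        rets.append(pos * (prices[t + 1] / prices[t] - 1.0))
    return rets
```

A buy-and-hold benchmark is the same computation with `pos` fixed at 1, which is the comparison the paper's further analysis makes.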
17

Multiple Outlier Detection: Hypothesis Tests versus Model Selection by Information Criteria

Lehmann, Rüdiger, Lösler, Michael 14 June 2017 (has links) (PDF)
The detection of multiple outliers can be interpreted as a model selection problem. The models that can be selected are the null model, which indicates an outlier-free set of observations, or a class of alternative models, which contain a set of additional bias parameters. A common way to select the right model is a statistical hypothesis test; in geodesy, data snooping is the most popular. Another approach arises from information theory: the Akaike information criterion (AIC) is used to select an appropriate model for a given set of observations. The AIC is based on the Kullback-Leibler divergence, which describes the discrepancy between the model candidates. Both approaches are discussed and applied to two test problems: the fitting of a straight line and a geodetic network. Some relationships between data snooping and information criteria are discussed. In comparison, the information-criterion approach turns out to be simpler and more elegant. Alongside the AIC there are many alternative information criteria that select different outliers, and it is not clear which one is optimal.
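The information-criterion approach can be sketched on the straight-line test problem. The sketch below assumes Gaussian errors and uses AIC = n ln(RSS/n) + 2k up to an additive constant; alternative model j carries one extra bias parameter for observation j, which then fits that observation exactly, so its RSS equals that of a fit without the observation. This illustrates the idea only, not the paper's exact formulation:

```python
import math

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b, RSS)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    rss = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    return a, b, rss

def aic_outlier_selection(xs, ys):
    """Model selection by AIC on the straight-line problem. The null model has
    k = 3 parameters (a, b, sigma); alternative model j adds one bias
    parameter absorbing observation j. Returns the index of the selected
    outlier, or None if the null model wins."""
    n = len(xs)
    _, _, rss0 = fit_line(xs, ys)
    best, best_aic = None, n * math.log(max(rss0, 1e-12) / n) + 2 * 3
    for j in range(n):
        # Removing observation j is equivalent to giving it a free bias term.
        _, _, rss = fit_line(xs[:j] + xs[j + 1:], ys[:j] + ys[j + 1:])
        aic = n * math.log(max(rss, 1e-12) / n) + 2 * 4
        if aic < best_aic:
            best, best_aic = j, aic
    return best
```

Unlike a hypothesis test, no significance level has to be chosen; the 2k penalty plays that role, which is part of what makes the approach simple and elegant.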
19

Architektury systémů na Internetu se skupinovým adresováním / Architectures of Internet-Based Systems with Multicasting

Veselý, Vladimír January 2009 (has links)
With the rapid expansion of multimedia and distributed computing applications across the Internet, the importance of optimized delivery of group traffic is growing. Under current conditions, the best practice for achieving this goal is multicasting. This master's thesis summarizes multicasting methods and facts, and draws conclusions from the practical application of the presented information in a commercial project.
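One concrete multicast mechanism worth illustrating: an IPv4 multicast group maps onto an Ethernet MAC address formed from the fixed prefix 01:00:5e and the low 23 bits of the group address (RFC 1112). Because 5 bits of the group address are discarded, 32 groups share each MAC address, and hosts must still filter at the IP layer:

```python
import socket
import struct

def ipv4_to_multicast_mac(group):
    """RFC 1112 mapping: 01:00:5e prefix plus the low 23 bits of the group."""
    addr, = struct.unpack("!I", socket.inet_aton(group))
    low23 = addr & 0x7FFFFF
    return "01:00:5e:%02x:%02x:%02x" % (low23 >> 16,
                                        (low23 >> 8) & 0xFF,
                                        low23 & 0xFF)

print(ipv4_to_multicast_mac("224.0.0.1"))        # prints 01:00:5e:00:00:01
print(ipv4_to_multicast_mac("239.255.255.250"))  # prints 01:00:5e:7f:ff:fa
```

The 32-to-1 ambiguity (e.g. 224.128.0.1 and 225.0.0.1 share a MAC) is one reason switches rely on IGMP snooping rather than MAC addresses alone to constrain group traffic.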
20

AJUSTAMENTO DE LINHA POLIGONAL NO ELIPSÓIDE / TRAVERSE ADJUSTMENT IN THE ELLIPSOID

Bisognin, Márcio Giovane Trentin 26 April 2006 (has links)
Traverses are adjusted on the surface of the ellipsoid with two objectives: to guarantee a unique solution in the transport of curvilinear geodetic coordinates (latitude and longitude) and of azimuths, and to obtain quality estimates. The coordinate and azimuth transport are derived from the Legendre series of the geodesic line. This series is based on the Taylor series, with the length of the geodesic line as its argument. For practical applications the series must be truncated and the error functions for latitude, longitude and azimuth must be evaluated. In this research the series are truncated at the third derivative, and the error functions are expressed in terms of the fourth derivative. The adjustment models based on the least-squares method are described: the combined model with weighted parameters, the combined (mixed) model, the parametric model (observation equations) and the correlates model (condition equations). The practical application is the adjustment, using the parametric model, of a traverse measured by the Instituto Brasileiro de Geografia e Estatística (IBGE), consisting of 8 vertices with a total length of 129.661 km. Errors in the observations are located with Baarda's data snooping test in the last iteration of the adjustment, which flagged some observations as erroneous. The quality estimates are contained in the variance-covariance matrices; the semi-axes of the error ellipse (standard ellipse) of each point are computed by the spectral (Jordan) decomposition of the submatrices of the variance-covariance matrix of the adjusted parameters (the coordinates). The application of the Legendre series is satisfactory for short distances, up to 40 km.
The series converges quickly for the adjusted coordinates: with a stopping criterion of four decimal places of a sexagesimal arc second, convergence is reached at the second iteration of the adjustment.
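Baarda's data snooping reduces to a test on standardized residuals. The sketch below shows the w-test for the simplest possible adjustment, n equally weighted direct observations of a single quantity, where the residual cofactors are known in closed form; in a real traverse they come from the diagonal of the full cofactor matrix of the residuals. The numbers and tolerances are invented for the example:

```python
import math

def baarda_w(observations, sigma0, crit=3.29):
    """Baarda w-test for n equally weighted direct observations of one
    quantity. Here the residual cofactors are q_vv = 1 - 1/n; in a general
    network they come from the diagonal of Q_vv = Q_ll - A*Qxx*A^T.
    crit = 3.29 corresponds to Baarda's conventional alpha = 0.1 %."""
    n = len(observations)
    x_hat = sum(observations) / n              # adjusted value (the mean)
    root_qvv = math.sqrt(1.0 - 1.0 / n)
    flagged = []
    for i, l in enumerate(observations):
        w = (x_hat - l) / (sigma0 * root_qvv)  # standardized residual
        if abs(w) > crit:
            flagged.append(i)
    return flagged
```

As in the thesis, the flagged observations are the candidates for removal or remeasurement before the final adjustment run.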
