71 |
Three Essays on Time Series Quantile Regression. Wang, Yini, 01 August 2012.
This dissertation considers quantile regression models with nonstationary or nearly nonstationary time series. The first chapter outlines the thesis and discusses its theoretical and empirical contributions. The second chapter studies inference in quantile regressions with cointegrated variables, allowing for multiple structural changes. The unknown break dates and regression coefficients are estimated jointly and consistently. The conditional quantile estimator has a nonstandard limit distribution, so a fully modified estimator is proposed to remove the second-order bias and nuisance parameters; the resulting limit distribution is mixed normal. A simulation study shows that the fully modified quantile estimator has good finite-sample properties. The model is applied to stock index data from the emerging markets of China and several mature markets, and financial market integration is found in some quantiles of the Chinese stock indices. The third chapter considers predictive quantile regression with a nearly integrated regressor. We derive nonstandard distributions for the quantile regression estimator and t-statistic in terms of functionals of diffusion processes. The critical values depend on both the quantile of interest and the local-to-unity parameter, which is not consistently estimable. Based on these critical values, we propose a valid Bonferroni bounds test for quantile predictability with persistent regressors. We employ this new methodology to test the ability of many commonly used and highly persistent regressors, such as the dividend yield, earnings-price ratio, and T-bill rate, to predict the median, shoulders, and tails of the stock return distribution. The fourth chapter proposes a cumulative sum (CUSUM) test for the null hypothesis of quantile cointegration. A fully modified quantile estimator is adopted for serial correlation and endogeneity corrections, and the CUSUM statistic is formed from the partial sums of the residuals of the fully modified quantile regression. Under the null, the test statistic converges to a functional of Brownian motions. In an application to U.S. interest rates of different maturities, evidence in favor of the expectations hypothesis of the term structure is found in the central part of the distributions of the Treasury bill rate and the financial commercial paper rate, but in the tails of the constant maturity rate distribution. / Thesis (Ph.D, Economics) -- Queen's University, 2012-07-30 15:20:38.253
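As a rough illustration of the predictive quantile regression idea studied in the third chapter, the sketch below regresses simulated returns on a lagged, nearly integrated predictor at several quantiles using statsmodels. It is a minimal sketch with made-up data and variable names; it implements neither the fully modified estimator nor the Bonferroni bounds correction for the local-to-unity parameter described in the abstract.

```python
# Minimal sketch of a predictive quantile regression (hypothetical data and names);
# the fully modified correction and Bonferroni bounds test are intentionally omitted.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
x = np.zeros(n)
for t in range(1, n):                          # nearly integrated predictor (rho close to 1)
    x[t] = 0.98 * x[t - 1] + rng.normal()
ret = 0.05 * x + rng.standard_t(df=5, size=n)  # heavy-tailed "returns"

df = pd.DataFrame({"ret": ret[1:], "x_lag": x[:-1]})
for q in (0.1, 0.5, 0.9):                      # lower tail, median, upper tail
    fit = smf.quantreg("ret ~ x_lag", df).fit(q=q)
    print(q, fit.params["x_lag"], fit.tvalues["x_lag"])
```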
|
72 |
Predictability effects in language acquisition. Pate, John Kenton, January 2013.
Human language has two fundamental requirements: it must allow competent speakers to exchange messages efficiently, and it must be readily learned by children. Recent work has examined effects of predictability on language production, with many researchers arguing that these “predictability effects” serve the efficiency requirement: talkers tend to reduce more probable linguistic forms more heavily. This dissertation proposes the “Predictability Bootstrapping Hypothesis” that predictability effects also make language more learnable. There is a great deal of evidence that adult grammars have substantial statistical components. Since predictability effects produce heavier reduction for more probable words and hidden structure, they provide infants with direct cues to the statistical components of the grammars they are trying to learn. The corpus studies and computational modeling experiments in this dissertation show that predictability effects could be a substantial source of information for language-learning infants, focusing on the potential utility of phonetic reduction, measured as word duration, for syntax acquisition. First, corpora of spontaneous adult-directed and child-directed speech (ADS and CDS, respectively) are compared to verify that predictability effects actually exist in CDS. While revealing some differences, mixed-effects regressions on those corpora indicate that predictability effects in CDS are largely similar, in kind and magnitude, to predictability effects in ADS. This result indicates that predictability effects are available to infants, however useful they may be. Second, this dissertation builds probabilistic, unsupervised, lexicalized models for learning about syntax from words and durational cues. One series of models is based on Hidden Markov Models and learns shallow constituency structure, while the other is based on the Dependency Model with Valence and learns dependency structure. These models are then used to measure how useful durational cues are for syntax acquisition, and to what extent their utility in this task can be attributed to effects of syntactic predictability on word duration. As part of this investigation, the models are also used to explore the venerable “Prosodic Bootstrapping Hypothesis” that prosodic structure, which is cued in part by word duration, may be useful for syntax acquisition. The empirical evaluations of these models provide evidence that effects of syntactic predictability on word duration are easier to discover and exploit than effects of prosodic structure, and that even gold-standard annotations of prosodic structure provide at most a relatively small improvement in parsing performance over raw word duration. Taken together, this work indicates that predictability effects provide useful information about syntax to infants, showing that the Predictability Bootstrapping Hypothesis for syntax acquisition is computationally plausible and motivating future behavioural investigation. Additionally, since talkers consider the probability of many different aspects of linguistic structure when reducing, this result also motivates investigating Predictability Bootstrapping of other aspects of linguistic knowledge.
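To make the corpus analysis concrete, here is a minimal, hypothetical sketch of a mixed-effects regression relating log word duration to a word's contextual predictability, with a by-speaker random intercept. The data frame, its columns, and the controls are assumptions for illustration; the dissertation's actual regressions involve many more predictors and both ADS and CDS corpora.

```python
# Hedged sketch: regress log word duration on log contextual probability with a
# per-speaker random intercept. The CSV file and its columns are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

cds = pd.read_csv("cds_tokens.csv")   # hypothetical corpus export: one row per word token
model = smf.mixedlm(
    "log_duration ~ log_prob + log_freq + n_phones",  # predictability plus simple controls
    data=cds,
    groups=cds["speaker"],            # random intercept for each speaker
)
result = model.fit()
print(result.summary())               # a negative log_prob coefficient = predictability effect
```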
|
73 |
An Analysis of Bitcoin Market Efficiency Through Measures of Short-Horizon Return Predictability and Market Liquidity. Brown, William L, 01 January 2014.
Bitcoins have the potential to fundamentally change the way value is transferred globally, and their rapid adoption over the past four years has led many to consider the possible consequences of such a technology. To be a viable currency, however, it is imperative that the market for trading Bitcoins is efficient. By examining changes in the availability of predictable outsized returns and in market liquidity over time, this paper assesses historical Bitcoin market efficiency and establishes correlations between market liquidity, price predictability, and return data. The results provide insight into the turbulent nature of Bitcoin market efficiency over the past years, but cannot definitively measure the magnitude of the change owing to the limitations of efficiency analysis. The most meaningful result of this study, however, is the statistically significant short-horizon price predictability that existed over the duration of the study, which has implications for Bitcoin market efficiency as well as for continued research on short-horizon Bitcoin price forecasting models.
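One simple form of such a short-horizon predictability check is to regress daily Bitcoin returns on their own lag with autocorrelation-robust standard errors; a significantly nonzero slope points to predictable returns and away from weak-form efficiency. The sketch below is illustrative only, and the price file and column names are assumptions rather than the paper's actual data.

```python
# Hedged sketch of a short-horizon predictability test: AR(1) regression of daily
# log returns with Newey-West (HAC) standard errors. File/column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

px = pd.read_csv("btc_daily.csv", parse_dates=["date"]).set_index("date")
ret = np.log(px["close"]).diff().dropna()

y = ret.iloc[1:].values                     # r_t
x = sm.add_constant(ret.iloc[:-1].values)   # r_{t-1}
fit = sm.OLS(y, x).fit(cov_type="HAC", cov_kwds={"maxlags": 5})
print(fit.params[1], fit.pvalues[1])        # a significant slope -> short-horizon predictability
```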
|
74 |
Predictive Radio Access Networks for Vehicular Content Delivery. Abou-zeid, Hatem, 01 May 2014.
An unprecedented era of “connected vehicles” is becoming an imminent reality. This is driven by advances in vehicular communications and the development of in-vehicle telematics systems supporting a plethora of applications. The diversity and multitude of such developments will, however, introduce excessive congestion across wireless infrastructure, compelling operators to expand their networks. An alternative to network expansion is to develop more efficient content delivery paradigms. In particular, alleviating Radio Access Network (RAN) congestion is important to operators as it postpones costly investments in radio equipment installations and new spectrum. Efficient RAN frameworks are therefore paramount to expediting this realm of vehicular connectivity.

Fortunately, the predictability of human mobility patterns, particularly that of vehicles traversing road networks, offers unique opportunities to pursue proactive RAN transmission schemes. Knowing the routes vehicles are going to traverse enables the network to forecast spatio-temporal demands and predict service outages that specific users may face. This can be accomplished by coupling the mobility trajectories with network coverage maps to provide estimates of the future rates users will encounter along a trip (see the sketch after this entry).

In this thesis, we investigate how this valuable contextual information can enable RANs to improve both service quality and operational efficiency. We develop a collection of methods that leverage mobility predictions to jointly optimize 1) long-term wireless resource allocation, 2) adaptive video streaming delivery, and 3) energy efficiency in RANs. Extensive simulation results indicate that our approaches provide significant user experience gains in addition to large energy savings. We emphasize the applicability of such predictive RAN mechanisms to video streaming delivery, as it is the predominant source of traffic in mobile networks, with projections of further growth. Although we focus on exploiting mobility information at the radio access level, our framework is a step towards a predictive end-to-end content delivery architecture. / Thesis (Ph.D, Electrical & Computer Engineering) -- Queen's University, 2014-04-30 06:15:34.31
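The core idea of coupling a predicted route with a coverage map can be sketched in a few lines: look up the expected data rate at each future position along the trip, and use the resulting rate profile to plan ahead, for example to prefetch video segments before an expected outage. Everything below (the grid, rates, trajectory, and threshold) is made-up illustrative data, not the thesis's actual framework.

```python
# Hedged sketch: turn a predicted vehicle trajectory plus a rate map into a
# per-second forecast of achievable rate, then flag predicted outage intervals.
import numpy as np

rate_map = np.array([[12.0, 9.0, 0.5, 7.0],     # average Mbit/s per 100 m x 100 m cell
                     [10.0, 6.0, 0.2, 8.0],
                     [11.0, 8.0, 4.0, 9.0]])

trajectory = [(0, 0), (0, 1), (1, 1), (1, 2), (2, 2), (2, 3)]  # predicted cells, one per second

predicted_rates = np.array([rate_map[r, c] for r, c in trajectory])
outage = predicted_rates < 1.0                  # seconds where streaming would likely stall

print("forecast rates:", predicted_rates)
print("predicted outage seconds:", np.flatnonzero(outage))
# A predictive scheduler could buffer extra video segments before these seconds.
```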
|
75 |
What About Short Run? Xu, Lai, January 2014.
This dissertation explores the short-lived temporal variation of the equity risk premium. In the past decade, the equity risk premium puzzle has been resolved by many competing consumption-based asset pricing models; however, before Bollerslev, Tauchen and Zhou (2009), the return predictability implied by such models had limited empirical support at short horizons. There is now a consensus in the literature that short-run equity return predictability is intimately linked with the variance risk premium, the difference between option-implied and actual realized variation measures.

In this work, I continue to argue for the importance of short-lived components of the equity risk premium. Specifically, I first provide simulation evidence on the strong return predictability based on the variance risk premium in the U.S. aggregate market, and document new empirical findings in the international setting. I then use a structural macro-finance model to guide the predictability estimation with considerable efficiency gains. Finally, I decompose the equity risk premium into two short-lived parts, tail risk and diffusive risk, and propose a semi-parametric estimation method for each part. The results are arranged in the following order.

Chapter 1 of the dissertation is co-authored with Tim Bollerslev, James Marrone and Hao Zhou. In this chapter, we demonstrate that statistical finite-sample biases cannot "explain" the apparent variance-risk-premium-based predictability in the U.S. market. Further corroborating the existing U.S. evidence, we show that country-specific regressions for France, Germany, Japan, Switzerland, the Netherlands, Belgium and the U.K. result in quite similar patterns. Defining a "global" variance risk premium, we uncover even stronger predictability and almost identical cross-country patterns through the use of panel regressions.

Chapter 2 of the dissertation is co-authored with Tim Bollerslev and Hao Zhou. In this chapter, we examine the joint predictability of returns and cash flows within a present-value framework, by imposing the implications of a long-run risk model that allows for both time-varying volatility and volatility uncertainty. We provide new evidence that the expected return variation and the variance risk premium positively forecast both short-horizon returns and dividend growth rates. We also confirm that the dividend yield positively forecasts long-horizon returns, but that it does not help in forecasting dividend growth rates. Our equilibrium-based "structural" factor GARCH model permits much more accurate inference than the reduced-form VAR and univariate regression procedures traditionally employed in the literature. The model also allows for direct estimation of the underlying economic mechanisms, including a new volatility leverage effect, the persistence of the latent long-run growth component and of the two latent volatility factors, as well as the contemporaneous impacts of the underlying "structural" shocks.

In Chapter 3 of the dissertation, I develop a new semi-parametric estimation method based on an extended ICAPM dynamic model incorporating jump tails. The model allows for time-varying, asymmetric jump size distributions and a self-exciting jump intensity process, while avoiding commonly used but restrictive affine assumptions on the relationship between jump intensity and volatility. The estimated model implies an average annual jump risk premium of 6.75%. The model-implied jump risk premium also has strong explanatory power for short-to-medium-run aggregate market returns. Empirically, I present new estimates of the model-based equity risk premia of the "Small-Big", "Value-Growth" and "Winners-Losers" portfolios; all are time-varying and all crashed in the 2008 financial crisis. Additionally, both the jump and volatility components of the equity risk premia are especially important for the "Winners-Losers" portfolio. / Dissertation
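For readers unfamiliar with the basic regression behind these chapters, a hedged sketch follows: the variance risk premium is measured as option-implied minus realized variance, and next-period aggregate excess returns are regressed on it. The data file and column names are hypothetical, and the sketch ignores the finite-sample and overlapping-horizon issues that the chapters address.

```python
# Hedged sketch of the variance-risk-premium predictability regression:
# next month's excess return regressed on VRP = implied variance - realized variance.
# File and column names are hypothetical; overlapping-return corrections are omitted.
import pandas as pd
import statsmodels.api as sm

d = pd.read_csv("us_monthly.csv", parse_dates=["date"]).set_index("date")
d["vrp"] = d["implied_var"] - d["realized_var"]
d["excess_ret_fwd"] = d["excess_ret"].shift(-1)        # next month's excess return
d = d.dropna()

fit = sm.OLS(d["excess_ret_fwd"], sm.add_constant(d["vrp"])).fit(
    cov_type="HAC", cov_kwds={"maxlags": 3}
)
print(fit.summary())   # a positive, significant vrp coefficient is the predictability result
```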
|
76 |
Anomalies on the London Stock Exchange : the influence of the bid-ask spread and nonsynchronous trading. Batty, Richard Andrew, January 1994.
This thesis tests for seasonal anomalies and daily predictability on the UK stock market and investigates how mispricing caused by the bid-ask spread, known as the 'touch', and by nonsynchronous trading in portfolio returns may explain these anomalies. Using constructed portfolios within a time-series regression framework, I show that seasonality, in the first instance, is prominent in returns around the turn of the week and the turn of the year. However, this seasonal returns behaviour disappears when the touch is accounted for; indeed, seasonality seems to occur in the touch rather than in returns. Despite this touch explanation, lagged returns remain significant, suggesting return predictability. In fact, when a price adjustment model is used, returns are predictable across portfolios. This predictability, while to some extent dependent upon firm size and the touch, may be accounted for by nonsynchronous trading. The first-order autocorrelation and cross-autocorrelation found in returns prove more indicative of infrequent trading than of return predictability. Thus, these results confirm that mismeasurement in portfolio returns caused by market microstructure and nonsynchronous trading can create false inferences about the extent of stock market anomalies in the UK and, consequently, about market efficiency.
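A minimal version of the seasonal regressions underlying this kind of study is sketched below: portfolio returns are regressed on day-of-week and January dummies, and the same regression can be re-run on touch-adjusted returns to test the spread explanation. The data file and column names are hypothetical, and the thesis's price adjustment model and nonsynchronous-trading corrections are not reproduced.

```python
# Hedged sketch of a seasonality regression: day-of-week and January dummies on
# portfolio returns. The CSV file and its columns are invented for illustration.
import pandas as pd
import statsmodels.formula.api as smf

r = pd.read_csv("portfolio_returns.csv", parse_dates=["date"])
r["weekday"] = r["date"].dt.day_name()
r["january"] = (r["date"].dt.month == 1).astype(int)

fit = smf.ols("ret ~ C(weekday, Treatment(reference='Wednesday')) + january", data=r).fit()
print(fit.summary())   # significant Monday/January dummies would indicate seasonality;
                       # rerunning on touch-adjusted returns tests the spread explanation
```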
|
77 |
Estetisk vägplanering : Förutsägbarhet i Genetiska Algoritmer och Theta* i dynamiska miljöer / Aesthetic pathfinding : Predictability in Genetic Algorithms and Theta* in dynamic environments. Glimmerfors, Hans, January 2018.
This work compares Theta* and a genetic algorithm in order to investigate whether a genetic algorithm gives more predictable paths in dynamic environments than a deterministic search technique. The chosen genetic algorithm is not restricted to moving to nearby nodes: the next node may be located anywhere in the environment. Theta* was chosen to match this behaviour and is considered a good representative of deterministic search techniques, as it is a refinement of the industry-standard A*. Results on predictability were collected through a survey in which four aspects were compared between Theta* and the genetic algorithm: which path is smoother, which path is straighter, which path is more direct, and which path is shorter. The results showed that the genetic algorithm performed 0.6-2.0% better than Theta* and is therefore considered able to give more predictable paths. However, further studies are needed to establish how important the individual criteria are for predictability.
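The four survey criteria can also be computed directly from a path's waypoints; a hedged sketch of two such metrics (total length, and accumulated turning angle as a proxy for smoothness and straightness) is given below. This is an illustration only, with invented waypoints; the study itself compared the paths through human judgements rather than code.

```python
# Hedged sketch: simple geometric metrics for comparing two paths, echoing the
# survey criteria (shorter, straighter/smoother). Waypoints are illustrative.
import math

def path_length(points):
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def total_turning(points):
    """Sum of absolute heading changes along the path (radians); lower = straighter."""
    total = 0.0
    for p, q, r in zip(points, points[1:], points[2:]):
        h1 = math.atan2(q[1] - p[1], q[0] - p[0])
        h2 = math.atan2(r[1] - q[1], r[0] - q[0])
        d = (h2 - h1 + math.pi) % (2 * math.pi) - math.pi   # wrap to [-pi, pi]
        total += abs(d)
    return total

theta_star_path = [(0, 0), (3, 1), (6, 5), (9, 9)]   # hypothetical paths
ga_path = [(0, 0), (4, 3), (6, 6), (9, 9)]
for name, p in [("Theta*", theta_star_path), ("GA", ga_path)]:
    print(name, round(path_length(p), 2), round(total_turning(p), 2))
```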
|
78 |
Squelettes algorithmiques méta-programmés : implantations, performances et sémantique / Metaprogrammed algorithmic skeletons : implementations, performances and semantics. Javed, Noman, 21 October 2011.
Structured parallelism approaches are a trade-off between automatic parallelisation and the concurrent and distributed programming offered by Pthreads or MPI. Skeletal parallelism is one of these structured approaches. An algorithmic skeleton can be seen as a higher-order function that captures the pattern of a classic parallel algorithm such as a pipeline or a parallel reduction. Often the sequential semantics of a skeleton is quite simple and corresponds to the usual semantics of similar higher-order functions in functional programming languages. The user constructs a parallel program by combining calls to the available skeletons. When designing a parallel program, parallel performance is of course important, so it is very useful for the programmer to rely on a simple yet realistic performance model; Bulk Synchronous Parallelism (BSP) offers such a model. As parallelism can now be found everywhere, from smart-phones to supercomputers, it also becomes critical for parallel programming models to rest on formal semantics that support proofs of correctness of the programs developed with them. The outcome of this work is the Orléans Skeleton Library (OSL). OSL provides a set of data-parallel skeletons that follow the BSP model of parallel computation. OSL is a library for C++, currently implemented on top of MPI and using advanced C++ programming techniques to achieve good efficiency. Because OSL is based on the BSP performance model, it is possible not only to predict the performance of OSL programs but also to obtain performance portability. The programming model of OSL has been formalized using a big-step semantics in the Coq proof assistant, and the use of this semantics for program proofs is illustrated by proving the correctness of an OSL example.
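To illustrate the general idea of an algorithmic skeleton as a higher-order function (OSL itself is a C++/MPI library; the snippet below is only a conceptual, language-agnostic sketch in Python, with names invented here), a data-parallel "map then reduce" skeleton can be written so that the caller supplies only the per-element function and the combining operator, while the parallel plumbing stays hidden inside the skeleton.

```python
# Hedged conceptual sketch of a data-parallel "map + reduce" skeleton as a
# higher-order function. This is NOT OSL code, just an illustration of the pattern.
from functools import reduce
from multiprocessing import Pool

def map_reduce_skeleton(f, combine, data, workers=4):
    """Apply f to every element in parallel, then fold the results with combine."""
    with Pool(workers) as pool:
        mapped = pool.map(f, data)      # local computation "superstep"
    return reduce(combine, mapped)      # combination step in the calling process

def square(x):
    return x * x

if __name__ == "__main__":
    # The caller composes skeletons; the parallelism is encapsulated inside them.
    print(map_reduce_skeleton(square, lambda a, b: a + b, range(10_000)))
```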
|
79 |
Geração de leiautes regulares baseados em matrizes de células / Regular Layout Generation based on Cell Matrices. Meinhardt, Cristina, January 2006.
This work investigates physical synthesis solutions for integrated circuits that are less susceptible to the variability effects observed in nanometer-scale fabrication technologies. It also presents R-CAT, a CAD tool developed to generate regular layouts. Geometric regularity is achieved by repeating basic layout patterns along a matrix structure. Regularity is regarded as one of the best alternatives for dealing with current submicron fabrication issues: regular designs are less susceptible to lithographic problems, improve yield and decrease re-spin time. Furthermore, regular circuits improve the predictability of power, timing and yield results, mainly because the cells are pre-characterized. The tool targets two types of physical synthesis for regular layouts, producing either circuits customized through all masks or circuits customized through only some masks. The main goal is easy conversion and adaptation to the chosen matrix approach, which simplifies the comparison of different matrix alternatives as well as the adoption of different logic blocks and new technologies. The R-CAT layout generator identifies adjacent cells placed in the same row that share connections and makes those connections directly in metal 1, reducing the number of connections left to the router by up to 10%. The tool is inserted into a design flow and depends on the logic synthesis methodology adopted. Two logic synthesis tools were used, SIS and OrBDD, offering two design lines: the first prioritizes area and the second prioritizes timing and short interconnections, both respecting the same geometric regularity imposed by the matrix. The results show that the SIS matrices occupy 53% less area than the OrBDD approach and reduce wire length by 30%; the smaller area follows from the SIS tool generating descriptions with about half as many logic cells and nets. The R-CAT OrBDD matrices, however, present lower average wire length, about 15% lower fan-out, lower delay and better routability, with few unrouted nets even without inserting extra tracks. Moreover, compared with MARTELO matrices, the R-CAT matrices achieve up to 40% shorter wire length and up to 46% smaller area.
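A toy illustration of the adjacency optimization described above: given a row-based placement and a netlist, find the two-pin nets whose cells sit next to each other in the same row, since those are the connections a generator could pre-route in metal 1 instead of leaving them to the router. The placement and netlist below are invented and are not R-CAT's actual data structures.

```python
# Hedged toy sketch of the adjacency idea: nets whose cells are adjacent in the
# same row are candidates for direct metal-1 connection. Data is illustrative.
rows = [
    ["inv1", "nand2", "nor1", "inv2"],     # each inner list is one matrix row
    ["nand3", "inv3", "xor1", "nor2"],
]
nets = [("inv1", "nand2"), ("nand2", "inv2"), ("inv3", "xor1"), ("inv1", "nor2")]

position = {cell: (r, c) for r, row in enumerate(rows) for c, cell in enumerate(row)}

def adjacent_same_row(a, b):
    (ra, ca), (rb, cb) = position[a], position[b]
    return ra == rb and abs(ca - cb) == 1

metal1 = [net for net in nets if adjacent_same_row(*net)]
print("pre-routable in metal 1:", metal1)
print("left for the router:", [n for n in nets if n not in metal1])
```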
|
80 |
Dividend yield e os retornos das ações brasileiras. Pinto, Bruno Pereira, 19 May 2017.
This paper tests whether stocks traded on the BM&FBovespa (São Paulo Stock Exchange) are predictable from the aggregate dividend yield. Excess returns (risk premia) and returns are regressed on the dividend yield of prior periods and on the change in dividends over the immediately preceding year. The results show the predictive capacity of the dividend yield, especially at a one-year lag, where the coefficient on the dividend yield is statistically and economically significant. However, in contrast to analogous studies conducted in other countries, the predictive capacity is gradually reduced over time until it becomes null.
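A minimal sketch of this kind of predictive regression, run at increasing lags to mirror the study's finding that predictability fades with the horizon, is shown below. The data file and column names are hypothetical, and no small-sample or overlapping-observation corrections are applied.

```python
# Hedged sketch: regress annual excess returns on the dividend yield lagged by
# k years, for k = 1..4, to see whether predictability fades with the lag.
# The CSV file and its columns are invented for illustration.
import pandas as pd
import statsmodels.api as sm

d = pd.read_csv("ibov_annual.csv").set_index("year")   # columns: excess_ret, div_yield

for k in range(1, 5):
    tmp = pd.DataFrame({
        "excess_ret": d["excess_ret"],
        "dy_lag": d["div_yield"].shift(k),
    }).dropna()
    fit = sm.OLS(tmp["excess_ret"], sm.add_constant(tmp["dy_lag"])).fit()
    print(f"lag {k}: beta={fit.params['dy_lag']:.3f}, t={fit.tvalues['dy_lag']:.2f}")
```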
|