931 |
Non-linear versus non-Gaussian volatility models in application to different financial markets. Miazhynskaia, Tatiana; Dorffner, Georg; Dockner, Engelbert J. January 2003
We used neural-network-based modelling to generalize linear econometric return models and compared their out-of-sample predictive ability in terms of different performance measures under three density specifications. As error measures we used the likelihood values on the test sets as well as standard volatility measures. The empirical analysis was based on return series of stock indices from different financial markets. The results indicate that for all markets the non-linear models yielded no improvement in forecast quality over the linear ones, while the non-Gaussian models significantly dominated the Gaussian models with respect to most performance measures. The likelihood performance measure mostly favours the linear model with a Student-t distribution, but the significance of its superiority differs between the markets. (author's abstract) / Series: Report Series SFB "Adaptive Information Systems and Modelling in Economics and Management Science"
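The density comparison in this abstract can be sketched as follows: fit a Gaussian and a Student-t return density on a training window and compare test-set log-likelihoods. This is a minimal illustration on simulated data, not the paper's models; the grid search for the degrees of freedom and the variance-matching of the t scale are illustrative assumptions.

```python
# Sketch: Gaussian vs Student-t density specification, scored by
# out-of-sample log-likelihood. Simulated heavy-tailed "returns".
import math
import numpy as np

rng = np.random.default_rng(0)
returns = 0.01 * rng.standard_t(df=4, size=2000)
train, test = returns[:1500], returns[1500:]

def gauss_loglik(x, mu, sigma):
    return float(np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                        - (x - mu) ** 2 / (2 * sigma**2)))

def t_loglik(x, mu, std, nu):
    scale = std * math.sqrt((nu - 2) / nu)   # match the sample variance
    z = (x - mu) / scale
    c = (math.lgamma((nu + 1) / 2) - math.lgamma(nu / 2)
         - 0.5 * math.log(nu * math.pi) - math.log(scale))
    return float(np.sum(c - (nu + 1) / 2 * np.log1p(z**2 / nu)))

mu, std = train.mean(), train.std()
nus = np.arange(2.5, 30.0, 0.5)              # crude MLE for the t dof
best_nu = max(nus, key=lambda nu: t_loglik(train, mu, std, nu))

ll_gauss = gauss_loglik(test, mu, std)
ll_t = t_loglik(test, mu, std, best_nu)
print(ll_gauss, ll_t)   # the Student-t density should score higher here
```

On heavy-tailed data the t specification wins the likelihood comparison, mirroring the abstract's finding that non-Gaussian models dominate Gaussian ones.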
|
932 |
Combined Use of Models and Measurements for Spatial Mapping of Concentrations and Deposition of Pollutants. Ambachtsheer, Pamela. January 2004
When modelling pollutants in the atmosphere, it is nearly impossible to get perfect results, because the chemical and mechanical processes that govern pollutant concentrations are complex. Results depend on the quality of the meteorological input as well as on the emissions inventory used to run the model, and models cannot currently take every process into consideration. The model may therefore produce results that are close to, or show the general trend of, the observed values, but are not perfect. Observations have the opposite limitation: due to the lack of observation stations, the resolution of the observational data is poor. Furthermore, the chemistry over large bodies of water differs from land chemistry, and in North America there are no stations located over the Great Lakes or the ocean, so the observed values cannot accurately cover these regions. We therefore combined model output and observational data when studying ozone concentrations in northeastern North America. We did this by correcting model output at observational sites with local data and then interpolating those corrections across the model grid, using a kriging procedure, to produce results that have the resolution of model results with the local accuracy of the observed values. Results showed that the corrected model output is much improved over either model results or observed values alone, both for sites that were used in the correction process and for sites that were omitted from it.
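The correction-and-interpolation step can be sketched as follows. The Gaussian covariance, range, and nugget values are illustrative assumptions, and the station coordinates and corrections are synthetic; this is a simple-kriging stand-in for the thesis's procedure, not its actual implementation.

```python
# Sketch: model-minus-observation corrections at stations, spread over
# a grid cell with a simple kriging interpolator (Gaussian covariance).
import numpy as np

def kriging_weights(stations, target, cov_range=2.0, nugget=1e-6):
    # station-to-station covariance matrix
    d = np.linalg.norm(stations[:, None, :] - stations[None, :, :], axis=2)
    K = np.exp(-(d / cov_range) ** 2) + nugget * np.eye(len(stations))
    # station-to-target covariance vector
    k = np.exp(-(np.linalg.norm(stations - target, axis=1) / cov_range) ** 2)
    return np.linalg.solve(K, k)

rng = np.random.default_rng(1)
stations = rng.uniform(0, 10, size=(15, 2))    # observation sites (x, y)
corrections = 0.5 * np.sin(stations[:, 0])     # obs minus model, per site

# correct one grid cell: model value plus interpolated local correction
cell = np.array([5.0, 5.0])
w = kriging_weights(stations, cell)
correction_at_cell = float(w @ corrections)
model_value = 42.0                             # hypothetical model ozone (ppb)
print(model_value + correction_at_cell)
```

Repeating the weight computation for every grid cell yields a corrected field with the model's resolution and the stations' local accuracy, which is the idea described above.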
|
933 |
Linearization Methods in Time Series Analysis. Chen, Bei. 08 September 2011
In this dissertation, we propose a set of computationally efficient methods based on approximating or representing nonlinear processes by linear ones, so-called linearization. Firstly, a linearization method is introduced for estimating the multiple frequencies in sinusoidal processes. It utilizes a regularized autoregressive (AR) approximation, which can be regarded as a "large p, small n" approach in a time series context. An appealing property of regularized AR is that it avoids a model selection step and allows for efficient updating of the frequency estimates whenever new observations are obtained. The theoretical analysis shows that the regularized AR frequency estimates are consistent and asymptotically normally distributed. Secondly, a sieve bootstrap scheme is proposed that uses the linear representation of generalized autoregressive conditional heteroscedastic (GARCH) models to construct prediction intervals (PIs) for the returns and volatilities. Our method is simple, fast and distribution-free, while providing sharp and well-calibrated PIs. A similar linear bootstrap scheme can also be used for diagnostic testing. Thirdly, we introduce a robust Lagrange multiplier (LM) test for detecting GARCH effects, which utilizes either a bootstrap or a permutation procedure to obtain critical values. We show that both the bootstrap and the permutation LM tests are consistent. Intensive numerical studies indicate that the proposed resampling algorithms significantly improve the size and power of the LM test in both skewed and heavy-tailed processes. Fourthly, we introduce a nonparametric trend test in the presence of GARCH effects (NT-GARCH) based on heteroscedastic ANOVA. Our empirical evidence shows that NT-GARCH can effectively detect non-monotonic trends under GARCH, especially in the presence of irregular seasonal components. We suggest applying the bootstrap procedure both for selecting the window length and for finding critical values.
The newly proposed methods are illustrated by applications to astronomical data, foreign currency exchange rates, and water and air pollution data. Finally, the dissertation concludes with an outlook on further extensions of linearization methods, e.g., in model order selection and change point detection.
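The first method above can be sketched in a few lines: fit a high-order AR model with a ridge penalty to a noisy sinusoid and read the frequency off the peak of the implied AR spectrum. The order, penalty, and signal below are illustrative choices, not the dissertation's tuning.

```python
# Sketch: frequency estimation via a regularized ("large p, small n")
# AR approximation of a sinusoidal process.
import numpy as np

rng = np.random.default_rng(2)
n, f_true = 400, 0.12                       # true frequency, cycles/sample
t = np.arange(n)
x = np.sin(2 * np.pi * f_true * t) + 0.3 * rng.standard_normal(n)

p, lam = 60, 1.0                            # AR order and ridge penalty
# lag matrix: column j holds x at lag j+1
X = np.column_stack([x[p - j - 1 : n - j - 1] for j in range(p)])
y = x[p:]
phi = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# AR spectrum peaks where |1 - sum_j phi_j e^{-i 2 pi f j}| is smallest
freqs = np.linspace(0.01, 0.49, 2000)
denom = np.abs(1 - np.exp(-2j * np.pi *
                          np.outer(freqs, np.arange(1, p + 1))) @ phi)
f_hat = float(freqs[np.argmin(denom)])
print(f_hat)   # close to the true 0.12
```

Because the ridge solution is a linear-algebra update, new observations can refresh `phi` cheaply, which reflects the updating property claimed in the abstract.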
|
934 |
Active Portfolio Management in the German Stock Market: A CAPM Approach. Wüsten, Nicolai. January 2012
An investor can generate higher returns on the German stock market by using an active portfolio management strategy rather than its passive counterpart. This is possible because the market is not efficient and the DAX, namely the market portfolio, can be outperformed with regard to both the average annual return and its variance. Therefore, the CAPM does not hold for the German stock market. The investor has to use the changes of the ifo Business Climate Index from 10 weeks earlier to forecast the DAX movement in the upcoming month. Even though this forecasting method gave the correct trading signal in only 56% of the months between 1991 and 2011, it outperformed the buy-and-hold strategy by 324 basis points, mainly because the business climate index was able to warn the investor of months in which the DAX lost over 10% of its value. The superiority of the active strategy remained valid when transaction costs were taken into account and was even stronger when call money, rather than cash, was the alternative investment to the DAX.
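The backtest logic can be sketched as follows. The return series and signal are made up, not DAX or ifo figures; the signal is simply constructed to be correct about 56% of the time, as in the thesis.

```python
# Sketch of the strategy comparison: hold the index in months where a
# lagged sentiment signal is positive, hold cash at 0% otherwise, and
# compare terminal wealth with buy and hold. All data simulated.
import numpy as np

rng = np.random.default_rng(3)
monthly_returns = rng.normal(0.005, 0.05, size=240)      # 20 "years"

# stand-in signal: agrees with the realized sign ~56% of the time
correct = rng.random(240) < 0.56
signal = np.where(correct, monthly_returns > 0, monthly_returns <= 0)

active_returns = np.where(signal, monthly_returns, 0.0)  # cash earns 0%
wealth_active = float(np.prod(1 + active_returns))
wealth_bh = float(np.prod(1 + monthly_returns))
print(wealth_active, wealth_bh)
```

Note that a raw 56% hit rate on random data does not by itself guarantee outperformance; the thesis attributes the active strategy's edge specifically to sidestepping months with losses above 10%, which this simplistic signal does not capture.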
|
935 |
Analysis of Structure and Tendencies of Qualified Immigrant Workforce on the Swedish Labor Market. Yefymov, Dmytro S. January 2006
The purpose of this paper is to present a quantitative and qualitative analysis of the foreign citizens who may participate in the Swedish labor market (referred to in the text as 'immigrants'). The research covers the period 1973-2005 and projects the immigrant population, its age and gender structure, and its educational attainment in 2010. To cope with data on immigrants from different countries, the population was divided into six groups. The main chapter consists of two parts. The first part specifies the division of immigrants into groups by country of origin according to geographical, ethnic, economic and historical criteria. Brief characteristics and a description of geographic position, dynamics and structure are given for each group; a historical review explains rapid changes in the immigrant population. Statistical models for describing and estimating the future population are given. The second part specifies the education and qualification level of the immigrants according to international and Swedish standards. Models for estimating the age and gender structure, level of education and professional orientation of immigrants in the different groups are given. Inferences are made regarding the ethnic, gender and education structure of immigrants, and the distribution of immigrants among Swedish counties is given. The discussion part presents the results of the research and gives perspectives for the future along with a brief evaluation of the role of immigrants on the Swedish labor market.
|
936 |
Daily Calls Volume ForecastingAJMAL, KHAN, TAHIR MAHMOOD, HASHMI January 2010 (has links)
A massive amount has been written about forecasting, but few articles address the development of time series models of call volumes for emergency services. In this study, we use different techniques to forecast the call volume of the emergency service Rescue 1122 in Lahore, Pakistan, and compare them. The data consist of 731 daily observations of emergency calls to Rescue 1122 from 1 January 2008 to 31 December 2009. Our goal is to develop a simple model that could be used for forecasting the daily call volume. Two different approaches are used: the Box-Jenkins (ARIMA) methodology and the smoothing methodology. We generate models for forecasting the call volume and present a comparison of the two techniques.
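A toy version of the smoothing approach, compared against a seasonal-naive baseline, looks like this. The call series is simulated with a weekly cycle, not the Rescue 1122 data, and the smoothing parameter is an illustrative choice.

```python
# Sketch: simple exponential smoothing vs a seasonal-naive forecast of
# daily call volume with a 7-day cycle, scored by mean absolute error.
import numpy as np

rng = np.random.default_rng(4)
days = np.arange(731)                       # two years of daily counts
calls = 200 + 30 * np.sin(2 * np.pi * days / 7) + rng.normal(0, 10, 731)

train, test = calls[:700], calls[700:]

def ses_level(series, alpha=0.3):
    """Simple exponential smoothing; returns the final smoothed level."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

ses_pred = np.full(len(test), ses_level(train))   # flat forecast
seasonal_pred = calls[700 - 7 : 731 - 7]          # value from 7 days before

mae_ses = float(np.mean(np.abs(test - ses_pred)))
mae_seasonal = float(np.mean(np.abs(test - seasonal_pred)))
print(mae_ses, mae_seasonal)   # the seasonal baseline wins on cyclic data
```

The point of the comparison: a smoother that ignores the weekly cycle forecasts poorly on call data, which is why seasonal ARIMA or seasonal smoothing variants are the natural candidates for series like this one.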
|
937 |
Piano Concerto. Schimmel, Carl William. 25 April 2008
The dissertation consists of a Piano Concerto, written for first performance in Fall 2008 by Blair McMillen, piano, and the Raleigh Civic Symphony conducted by Randolph Foy. The three-movement work is scored for piccolo, 2 flutes (1 doubling piccolo), 2 oboes, English horn, 2 clarinets in B-flat, bass clarinet in B-flat, 2 bassoons, 4 horns in F, 3 trumpets in C, 2 tenor trombones, bass trombone, tuba, timpani, 3 percussionists, strings, and piano solo. The work is approximately twenty minutes in duration.
The first movement, "Fantod," employs a neo-Romantic idiom, featuring the soloist as both aggressive virtuoso and as a subtle residual resonance which emerges from the orchestral texture. The second movement, "Lament," serves as a simple, pensive, and sorrowful aftermath to the frenzied first movement. In the third movement, "Rondoburlesque," the mood of the work becomes considerably more lighthearted, and moments of the first two movements are caricatured.
The Concerto's harmonic and melodic organization derives from a set theoretical design. The first movement uses the harmonic minor scale and its inversion, the second movement uses the melodic minor scale, and the last movement uses the natural minor scale (the major scale). Important and unique subsets of these scales are used to provide both contrast and interrelatedness between movements. In particular, the main melodic theme of the first movement returns at the end of the last movement. / Dissertation
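The set-theoretic design can be made concrete with pitch-class sets. The sketch below writes the three minor-scale forms (on C) as sets of pitch classes and computes what they share; the specific "important subsets" used in the Concerto are not documented here, so these intersections are only an illustration of the technique.

```python
# Sketch: the three minor-scale forms as pitch-class sets (C = 0),
# and the subsets they share, in the spirit of the set-theoretic design.
harmonic_minor = {0, 2, 3, 5, 7, 8, 11}   # C D Eb F G Ab B
melodic_minor  = {0, 2, 3, 5, 7, 9, 11}   # C D Eb F G A B (ascending)
natural_minor  = {0, 2, 3, 5, 7, 8, 10}   # C D Eb F G Ab Bb

shared_all = harmonic_minor & melodic_minor & natural_minor
print(sorted(shared_all))                        # [0, 2, 3, 5, 7]
print(sorted(harmonic_minor & melodic_minor))    # [0, 2, 3, 5, 7, 11]
print(sorted(harmonic_minor & natural_minor))    # [0, 2, 3, 5, 7, 8]
```

The common pentachord {0, 2, 3, 5, 7} is material all three movements could share, while the pairwise intersections differ by one pitch class each, one way such scales can "provide both contrast and interrelatedness between movements."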
|
938 |
Forecasting project progress and early warning of project overruns with probabilistic methods. Kim, Byung Cheol. 15 May 2009
Forecasting is a critical component of project management. Project managers must be able to make reliable predictions about the final duration and cost of projects from project inception onward. Such predictions need to be revised and compared with the project's objectives to obtain early warnings of potential problems. The effectiveness of project controls therefore relies on the capability of project managers to make reliable forecasts in a timely manner.
This dissertation focuses on forecasting project schedule progress with probabilistic methods. Currently available methods, for example the critical path method (CPM) and earned value management (EVM), are deterministic and fail to account for the inherent uncertainty in forecasting and project performance.
The objective of this dissertation is to improve the predictive capabilities of project managers by developing probabilistic forecasting methods that integrate all relevant information and uncertainties into consistent forecasts in a mathematically sound procedure usable in practice. Two probabilistic methods were developed: the Kalman filter forecasting method (KFFM) and the Bayesian adaptive forecasting method (BAFM). The KFFM and the BAFM have the following advantages over the conventional methods: (1) they are probabilistic methods that provide prediction bounds on their forecasts; (2) they are integrative methods that make better use of the prior performance information available from standard construction management practices and theories; and (3) they provide a systematic way of incorporating measurement errors into forecasting.
The accuracy and early warning capacity of the KFFM and the BAFM were also evaluated and compared against the CPM and a state-of-the-art EVM schedule forecasting method. The major conclusions from this research are: (1) the state-of-the-art EVM schedule forecasting method can be used to obtain reliable warnings only after the project performance has stabilized; (2) the CPM is not capable of providing early warnings due to its retrospective nature; (3) the KFFM and the BAFM can and should be used to forecast progress and to obtain reliable early warnings for all projects; and (4) the early warning capacity of forecasting methods should be evaluated and compared in terms of the timeliness and reliability of warnings in the context of formal early warning systems.
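The Kalman-filter idea behind the KFFM can be sketched with a minimal constant-velocity model: the state is (percent complete, progress rate), updated with noisy progress reports and then extrapolated with an uncertainty band. The noise values and measurements below are illustrative assumptions, not the dissertation's calibration.

```python
# Minimal Kalman-filter sketch for progress forecasting with bounds.
import numpy as np

dt = 1.0                                   # reporting period (months)
F = np.array([[1.0, dt], [0.0, 1.0]])      # progress driven by its rate
H = np.array([[1.0, 0.0]])                 # only % complete is measured
Q = np.diag([0.1, 0.05])                   # process noise (assumed)
R = np.array([[4.0]])                      # measurement noise, %^2 (assumed)

x = np.array([0.0, 5.0])                   # start: 0% done, ~5%/month
P = np.diag([1.0, 1.0])

measurements = [4.0, 11.0, 14.0, 22.0, 26.0]   # noisy monthly % complete
for z in measurements:
    # predict one period ahead
    x = F @ x
    P = F @ P @ F.T + Q
    # update with the new measurement
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.array([z]) - H @ x)
    P = (np.eye(2) - K @ H) @ P

# forecast k periods ahead with a 2-sigma prediction band
k = 6
xk, Pk = x.copy(), P.copy()
for _ in range(k):
    xk = F @ xk
    Pk = F @ Pk @ F.T + Q
forecast, sd = float(xk[0]), float(np.sqrt(Pk[0, 0]))
print(forecast, forecast - 2 * sd, forecast + 2 * sd)
```

The widening band is the point: unlike a deterministic CPM or EVM projection, the filter reports not just a forecast but how much to trust it, which is what an early-warning threshold needs.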
|
939 |
Dynamic resource allocation for energy management in data centers. Rincon Mateus, Cesar Augusto. 15 May 2009
In this dissertation we study the problem of allocating computational resources and managing applications in a data center so as to serve incoming requests while balancing energy usage, reliability and quality-of-service considerations. The problem is motivated by the growing energy consumption of data centers worldwide and their overall inefficiency. This work focuses on designing flexible and robust strategies to manage resources so that the system is able to meet its service agreements even when load conditions change. As a first step, we study the control of a Markovian queueing system with a controllable number of servers and controllable service rates (M/M_t/k_t) to minimize effort and holding costs. We present structural properties of the optimal policy and suggest an algorithm for finding well-performing policies even for large cases. We then present a reactive/proactive approach and a tailor-made wavelet-based forecasting procedure to determine the resource allocation in a single-application setting; the method is tested by simulation with real web traces. The main feature of this method is its robustness and flexibility in meeting QoS goals even when the traffic behavior changes. The system was tested by simulating a system with a time-service-factor QoS agreement. Finally, we consider the multi-application setting and develop a novel load consolidation strategy (combining applications that are traditionally hosted on different servers) to reduce server-load variability and the number of booting cycles in order to obtain a better capacity allocation.
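The server-allocation trade-off at the heart of this problem can be sketched with the classical Erlang-C formula for an M/M/k queue: find the smallest number of servers that meets a delay target, since every extra server costs energy. The arrival rate, service rate, and target below are illustrative numbers, not the dissertation's controlled M/M_t/k_t model.

```python
# Sketch: smallest M/M/k server count meeting a queueing-probability
# target, balancing energy (fewer servers) against QoS.
import math

def erlang_c(k, a):
    """P(an arrival must wait) in an M/M/k queue with offered load a."""
    if a >= k:
        return 1.0                       # unstable: everyone waits
    s = sum(a**n / math.factorial(n) for n in range(k))
    top = a**k / math.factorial(k) * k / (k - a)
    return top / (s + top)

lam, mu = 45.0, 10.0                     # requests/s, per-server rate
a = lam / mu                             # offered load (Erlangs)
target = 0.2                             # at most 20% of requests queue

k = math.ceil(a)                         # minimum for stability
while erlang_c(k, a) > target:
    k += 1
print(k, erlang_c(k, a))
```

A dynamic controller of the kind studied here would re-solve this sizing decision as the forecast load `lam` changes over time, shutting servers down when the load (and hence `k`) drops.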
|
940 |
Application of the Stretched Exponential Production Decline Model to Forecast Production in Shale Gas Reservoirs. Statton, James Cody. May 2012
Production forecasting in shale (ultra-low permeability) gas reservoirs is of great interest due to the advent of multi-stage fracturing and horizontal drilling. The well-known production forecasting model, Arps' hyperbolic decline model, is widely used in industry to forecast shale gas wells. Left unconstrained, the model often overestimates reserves by a great deal. A minimum decline rate is imposed to prevent overestimation of reserves, but with less than ten years of production history available to analyze, an accurate minimum decline rate is currently unknown; an educated guess of 5% minimum decline is often imposed. Other decline curve models have been proposed with the theoretical advantage of being able to match linear flow followed by a transition to boundary-dominated flow. This thesis investigates the applicability of the Stretched Exponential Production Decline Model (SEPD) and compares it to the industry standard, Arps' model with a minimum decline rate. When possible, we investigate an SEPD type curve.
Simulated data are analyzed to show the advantages of the SEPD model and to provide a comparison to Arps' model with an imposed minimum decline rate of 5% where the full production history is known. Long-term production behavior is provided by an analytical solution for a homogeneous reservoir with homogeneous hydraulic fractures. Various simulations, from short-term linear flow (~1 year) to long-term linear flow (~20 years), show the ability of the models to handle the onset of boundary-dominated flow at various times during the production history. SEPD provides more accurate reserves estimates when linear flow ends at 5 years or earlier. Both models provide sufficient reserves estimates for longer-term linear flow scenarios.
Barnett Shale production data demonstrate the ability of the models to forecast field data. Denton and Tarrant County wells are analyzed as groups and individually. SEPD type curves generated with 2004 well groups provide forecasts for wells drilled in subsequent years. This study suggests a type curve is most useful when 24 months or less of history is available to forecast from. The SEPD model generally provides more conservative forecasts and EUR estimates than Arps' model with a minimum decline rate of 5%.
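The two models being compared can be sketched side by side: Arps' hyperbolic rate with a 5%/yr terminal (minimum) decline, q = q_i / (1 + b D_i t)^(1/b) switching to an exponential tail, versus the stretched-exponential rate q(t) = q_i exp(-(t/tau)^n). All parameter values below are made up for illustration, not Barnett Shale fits, and the decline parameters are chosen so that the SEPD curve is the steeper one.

```python
# Sketch: EUR (cumulative production over a horizon) from Arps' hyperbolic
# decline with a 5%/yr minimum decline vs the SEPD model. Illustrative
# parameters; rates in arbitrary units, time in years.
import numpy as np

t = np.linspace(0.01, 30.0, 3000)

def trapz(y, x):
    """Trapezoidal cumulative (portable across NumPy versions)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def arps_min_decline(t, qi=1000.0, b=1.5, Di=3.0, Dmin=0.05):
    q = qi / (1 + b * Di * t) ** (1 / b)
    D = Di / (1 + b * Di * t)            # instantaneous decline rate
    if D[-1] < Dmin:                     # switch to exponential tail
        i = int(np.argmax(D < Dmin))
        q = q.copy()
        q[i:] = q[i] * np.exp(-Dmin * (t[i:] - t[i]))
    return q

def sepd(t, qi=1000.0, tau=1.0, n=0.5):
    return qi * np.exp(-(t / tau) ** n)

eur_arps = trapz(arps_min_decline(t), t)
eur_sepd = trapz(sepd(t), t)
print(eur_arps, eur_sepd)   # SEPD gives the smaller EUR with these inputs
```

With a fast stretched-exponential decay the SEPD integral comes in below the constrained Arps estimate, matching the thesis's observation that SEPD forecasts are generally the more conservative of the two; the direction of the gap, of course, depends on the fitted parameters.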
|