471 |
Public expenditures and crime in a free society
Chukwu, Idam Oko 01 January 1999 (has links)
No description available.
|
472 |
Predicting Electricity Consumption with ARIMA and Recurrent Neural Networks
Enerud, Klara January 2024 (has links)
Due to the growing share of renewable energy in countries' power systems, the need for precise forecasting of electricity consumption will increase. This paper considers two different approaches to time series forecasting, autoregressive moving average (ARMA) models and recurrent neural networks (RNNs). These are applied to Swedish electricity consumption data, with the aim of deriving simple yet efficient predictors. An additional aim is to analyse the impact of day of week and temperature on forecast accuracy. The models are evaluated on both long- and mid-term forecasting horizons, ranging from one day to one month. The results show that neural networks are superior for this task, although stochastic seasonal ARMA models also perform quite well. Including external variables only marginally improved the ARMA predictions, and had somewhat unclear effects on the RNN forecasting accuracy. Depending on the network model used, adding external variables had either a slightly positive or slightly negative impact on prediction accuracy.
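The ARMA side of such a comparison can be sketched in a few lines. Below is a minimal AR(1) baseline fitted by ordinary least squares; the consumption figures are hypothetical, and the thesis's actual models (seasonal ARMA, RNNs) are considerably richer than this sketch.

```python
def fit_ar1(series):
    """Least-squares fit of x[t] = c + phi * x[t-1] + e[t]."""
    xs, ys = series[:-1], series[1:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    var = sum((a - mx) ** 2 for a in xs)
    phi = cov / var
    c = my - phi * mx
    return c, phi

def forecast_ar1(c, phi, last, horizon):
    """Iterate the fitted recursion to produce multi-step forecasts."""
    out = []
    for _ in range(horizon):
        last = c + phi * last
        out.append(last)
    return out

# Hypothetical hourly consumption values (GWh), for illustration only.
history = [15.2, 14.8, 14.5, 14.9, 15.6, 16.1, 15.9, 15.4, 15.0, 14.7]
c, phi = fit_ar1(history)
print(forecast_ar1(c, phi, history[-1], 3))
```

Multi-step forecasts from such a model decay toward the series mean, which is one reason the thesis adds seasonal terms and external variables.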
|
473 |
Developing a SARIMAX model for monthly wind speed forecasting in the UK
Kritharas, Petros January 2014 (has links)
Wind is a fluctuating source of energy and can therefore cause several technical impacts. These can be tackled by forecasting wind speed and thus wind power. The introduction of several statistical models to this field of research has produced promising results for improving wind speed predictions, but there is no converging evidence on which method is optimal. Over the last three decades, significant research has been carried out in the field of short-term forecasting using statistical models, though less work focuses on longer timescales. The first part of this work concentrated on long-term wind speed variability over the UK. Two subsets were used to assess the variability of wind speed in the UK, in both temporal and spatial coverage, over a period representative of the expected lifespan of a wind farm. Two wind indices are presented, with a calculated standard deviation of 4%; variability of this magnitude corresponds to a 7% change in the average UK wind power capacity factor. A parallel line of the research reported herein aimed to develop a novel statistical forecasting model for generating monthly mean wind speed predictions. It utilised long-term historic wind speed records from surface stations as well as reanalysis data. The methodology employed a SARIMAX model that incorporated monthly autocorrelation of wind speed and seasonality, and also included exogenous inputs. Four different cases were examined, each incorporating different independent variables. The results disclosed a strong association between the independent variables and wind speed, with correlations up to 0.72. Depending on the case, this relationship occurred at lags of 4 to 12 months. The intercomparison revealed an improvement in the forecasting accuracy of the proposed model compared to a similar model that did not take exogenous variables into account.
This finding demonstrates the potential of SARIMAX models for long-term wind speed forecasting.
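The structure of such a SARIMAX-style predictor can be illustrated with a one-step forecast equation combining a monthly autoregressive term, a 12-month seasonal term, and an exogenous regressor. The coefficients, wind speeds, and exogenous index below are hypothetical placeholders, not the fitted thesis values.

```python
def sarx_forecast(history, exog_value, const=2.0, phi=0.4, Phi=0.3, beta=0.5):
    """One-step monthly forecast with the structure
        x[t] = const + phi*x[t-1] + Phi*x[t-12] + beta*z[t],
    i.e. monthly autocorrelation + annual seasonality + an exogenous
    input z (e.g. a reanalysis-based circulation index). All coefficients
    here are illustrative, not fitted values."""
    return const + phi * history[-1] + Phi * history[-12] + beta * exog_value

# Twelve months of hypothetical mean wind speeds (m/s), plus an index value.
speeds = [7.1, 6.8, 6.2, 5.5, 5.0, 4.6, 4.4, 4.9, 5.6, 6.3, 6.9, 7.3]
print(sarx_forecast(speeds, exog_value=1.2))  # 2 + 0.4*7.3 + 0.3*7.1 + 0.5*1.2 = 7.65
```

In a full SARIMAX the coefficients are estimated jointly and moving-average terms are added; this sketch only shows how the seasonal lag and the exogenous input enter the prediction.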
|
474 |
Network inference and data-based modelling with applications to stock market time series
Elsegai, Heba January 2015 (has links)
The inference of causal relationships between stock markets constitutes a major research topic in the field of financial time series analysis. A successful reconstruction of the underlying causality structure represents an important step towards the overall aim of improving stock market price forecasting. In this thesis, I utilise the concept of Granger-causality for the identification of causal relationships. One major challenge is the possible presence of latent variables that affect the measured components. An instantaneous interaction can arise in the inferred network of stock market relationships either spuriously, due to the existence of a latent confounder, or truly, as a result of hidden agreements between market players. I investigate the implications of such a scenario, proposing a new method that allows, for the first time, distinguishing between instantaneous interactions caused by a latent confounder and those resulting from hidden agreements. Another challenge is the implicit assumption of existing Granger-causality analysis techniques that the interactions have a time delay equal to, or a multiple of, the sampling interval of the observed data. Two sub-cases of this scenario are discussed: (i) when the collected data are simultaneously recorded, and (ii) when the collected data are non-simultaneously recorded. I propose two modified approaches based on time series shifting that provide correct inferences of the complete causal interaction structure. To investigate how the above-mentioned method improvements affect predictions, I present a modified version of the building block model for modelling stock prices that allows the causality structure between stock prices to be modelled. To assess the forecasting ability of the extended model, I compare predictions resulting from the network reconstruction methods developed throughout this thesis to predictions based on standard correlation analysis using stock market data.
The findings show that predictions based on the developed methods provide more accurate forecasts than predictions resulting from correlation analysis.
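The Granger-causality idea underlying this work can be sketched as a residual-variance comparison: x is said to Granger-cause y if adding x's lag to an autoregressive model of y reduces the one-step prediction error. The data below are a toy construction, not stock market data, and a real analysis would also test statistical significance.

```python
def ols(X, y):
    """Least-squares coefficients via the normal equations (X'X)b = X'y,
    solved by Gaussian elimination with partial pivoting."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    c = [sum(r[i] * t for r, t in zip(X, y)) for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        c[col], c[piv] = c[piv], c[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for j in range(col, k):
                A[r][j] -= f * A[col][j]
            c[r] -= f * c[col]
    b = [0.0] * k
    for i in reversed(range(k)):
        b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, k))) / A[i][i]
    return b

def rss(X, y, b):
    """Residual sum of squares of the fitted linear model."""
    return sum((t - sum(bi * xi for bi, xi in zip(b, r))) ** 2
               for r, t in zip(X, y))

def granger_gain(x, y):
    """Fraction of y's one-step prediction error removed by adding x's lag:
    compare an AR(1) model of y with a model that also uses x[t-1]."""
    target = y[1:]
    restricted = [[1.0, y[t - 1]] for t in range(1, len(y))]
    full = [[1.0, y[t - 1], x[t - 1]] for t in range(1, len(y))]
    r0 = rss(restricted, target, ols(restricted, target))
    r1 = rss(full, target, ols(full, target))
    return (r0 - r1) / r0

# Toy example: y is driven by the previous value of x, so x's lag should
# explain nearly all of y's residual variance (x "Granger-causes" y).
x = [1.0, 3.0, 2.0, 5.0, 4.0, 7.0, 6.0, 9.0, 8.0, 11.0, 10.0]
y = [0.0] + [0.9 * v for v in x[:-1]]
print(granger_gain(x, y))  # close to 1.0
```

The thesis's contributions concern subtler cases than this sketch covers: latent confounders and interactions whose delay is not a multiple of the sampling interval.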
|
475 |
RAINFALL-RUNOFF MODELING OF FLASH FLOODS IN SEMI-ARID WATERSHEDS
Michaud, Jene Diane 06 1900 (has links)
Flash floods caused by localized thunderstorms are a natural hazard of the semi-arid Southwest, and many communities have responded by installing ALERT flood forecasting systems. This study explored a rainfall-runoff modeling approach thought to be appropriate for forecasting in such watersheds. The kinematic model KINEROS was evaluated because it is a distributed model developed specifically for desert regions and can be applied to basins without historic data. This study examined the accuracy of KINEROS under data constraints that are typical of semi-arid ALERT watersheds. The model was validated at the 150 km², semi-arid Walnut Gulch experimental watershed. Under the conditions examined, KINEROS provided poor simulations of runoff volume and peak flow, but good simulations of time to peak. For peak flows, the standard error of estimate was nearly 100% of the observed mean. Surprisingly, when model parameters were based only on measurable watershed properties, simulated peak flows were as accurate as when parameters were calibrated on some historic data. The accuracy of KINEROS was compared to that of the SCS model. When calibrated, a distributed SCS model with a simple channel loss component was as accurate as KINEROS.

Reasons for poor simulations were investigated by examining a) rainfall sampling errors, b) model sensitivity and dynamics, and c) trends in simulation accuracy. The cause of poor simulations was divided between rainfall sampling errors and other problems. It was found that when raingage densities are on the order of 1/20 km², rainfall sampling errors preclude the consistent and reliable simulation of runoff from localized thunderstorms. Even when rainfall errors were minimized, the accuracy of simulations was still poor. Good results, however, have been obtained with KINEROS on small watersheds; the problem is not KINEROS itself but its application at larger scales.

The study also examined the hydrology of thunderstorm-generated floods at Walnut Gulch. The space-time dynamics of rainfall and runoff were characterized and found to be of fundamental importance. Hillslope infiltration was found to exert a dominant control on runoff, although flow hydraulics, channel losses, and initial soil moisture are also important. Watershed response was found to be nonlinear.
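The infiltration-excess mechanism that dominates runoff in such watersheds can be sketched minimally as rainfall beyond the soil's infiltration capacity becoming overland flow. KINEROS itself uses a physically based infiltration model and routes the excess kinematically downslope, so the figures and the constant capacity below are purely illustrative.

```python
def runoff_excess(rain_mm_per_hr, capacity_mm_per_hr):
    """Infiltration-excess (Hortonian) runoff: rainfall intensity beyond
    the soil's infiltration capacity becomes overland flow."""
    return [max(0.0, r - capacity_mm_per_hr) for r in rain_mm_per_hr]

# Hypothetical 6-hour convective storm hyetograph (mm/h) with an assumed
# constant infiltration capacity of 10 mm/h.
storm = [2.0, 15.0, 40.0, 25.0, 8.0, 1.0]
print(runoff_excess(storm, 10.0))  # [0.0, 5.0, 30.0, 15.0, 0.0, 0.0]
```

Even this toy version shows why raingage density matters: runoff is generated only where and when intensity exceeds capacity, so undersampled rainfall peaks translate directly into missed runoff.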
|
476 |
BAYES RISK ANALYSIS OF REGIONAL REGRESSION ESTIMATES OF FLOODS
Metler, William Arledge 02 1900 (has links)
This thesis defines a methodology for evaluating the worth of streamflow data using a Bayes risk approach. Using regional streamflow data in a regression analysis, the Bayes risk can be computed by considering the probability of the error in using the regionalized estimates of bridge or culvert design parameters. Cost curves for over- and underestimation of the design parameter can be generated based on the error of the estimate. The Bayes risk can then be computed by integrating the probability of estimation error over the cost curves. The methodology may then be used to analyze the regional data collection effort by considering the worth of data for a record site relative to the other sites contributing to the regression equations.

The methodology is illustrated using a set of actual streamflow data from Missouri. The cost curves for over- and underestimation of the streamflow design parameter for bridges and culverts are hypothesized so that the Bayes risk can be computed and the results of the analysis discussed. The results demonstrate the small-sample bias introduced into the estimate of the design parameter for the construction of bridges and culverts. The conclusions are that the small-sample bias in the estimation of large floods can be substantial and that the Bayes risk methodology can evaluate the relative worth of data when the data are used in regionalization.
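The core computation, integrating the probability of estimation error over the cost curves, can be sketched numerically. The normal error model and the asymmetric cost ratio below are assumptions for illustration only; the thesis hypothesizes its own cost curves from the Missouri data.

```python
import math

def bayes_risk(cost, sigma, n=20001, span=8.0):
    """Expected loss: integrate cost(e) against a N(0, sigma^2) density of
    the design-parameter estimation error e, by the trapezoidal rule."""
    lo, hi = -span * sigma, span * sigma
    h = (hi - lo) / (n - 1)
    def integrand(e):
        pdf = math.exp(-0.5 * (e / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
        return cost(e) * pdf
    s = 0.5 * (integrand(lo) + integrand(hi))
    s += sum(integrand(lo + i * h) for i in range(1, n - 1))
    return s * h

# Hypothetical asymmetric costs: underestimating a culvert design flood
# (flood damage) costs 5x per unit error what overestimating it
# (extra construction) costs.
cost = lambda e: 1.0 * e if e > 0 else 5.0 * (-e)
print(bayes_risk(cost, sigma=10.0))
```

Shrinking sigma (more or better streamflow data at a site) lowers the risk, which is exactly how the methodology expresses the worth of additional record length.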
|
477 |
How to calculate forecast accuracy for stocked items with a lumpy demand : A case study at Alfa Laval
Ragnerstam, Elsa January 2016 (has links)
Inventory management is an important part of well-functioning logistics. Nearly all the literature on optimal inventory management uses criteria of cost minimization and profit maximization. A well-functioning forecasting system requires a balanced inventory, but several factors can create uncertainty and make this balance difficult to maintain. One important factor is customer demand. Over half of the stocked items are held in stock to cover irregular orders and uncertain demand. Customer demand can be categorized into four categories: smooth, erratic, intermittent and lumpy. Items with a lumpy demand, i.e. items that are both intermittent and erratic, are the hardest to manage and to forecast, because both the quantity and the timing of demand vary greatly and these items may have periods of zero demand. This makes forecasting them a challenge for companies: random values appear at random intervals, leaving many periods with zero demand. Due to lumpy demand, an ongoing problem for most organizations is forecast inaccuracy. Exact forecasts are almost impossible; no matter how good the forecasts are or how complex the forecasting techniques, the instability of markets means that forecasts will always be wrong and that errors will always exist. This must be accepted, but the issue should still be worked on to keep the errors as small as possible. The purpose of measuring forecast errors is to identify single random errors and systematic errors that show whether the forecast is systematically too high or too low. Calculating forecast errors and measuring forecast accuracy also helps in dimensioning the safety stock and in checking that the forecast errors stay within acceptable margins.
The research questions answered in this master thesis are: How should one calculate forecast accuracy for stocked items with a lumpy demand? How do companies measure forecast accuracy for stocked items with a lumpy demand, and what are the differences between the methods? What information is needed to apply these methods? To collect data and answer the research questions, a literature study has been made comparing how different researchers and authors treat this specific topic. Two case studies have also been made. First, a benchmarking process compared how different companies work with this issue. Second, a case study in the form of a hypothesis test was made, testing hypotheses based on the analysis from the literature review and the benchmarking process. The analysis of the hypothesis test generated the conclusion that a combination of the measurements WAPE, Weighted Absolute Forecast Error, and CFE, Cumulative Forecast Error, is a suitable way to calculate forecast accuracy for items with a lumpy demand. The keywords used to search for scientific papers were: lumpy demand, forecast accuracy, forecasting, forecast error.
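The two recommended measures can be computed directly. The demand series below is hypothetical, chosen only to show the zero-demand periods typical of lumpy items; for such series a per-period percentage error like MAPE breaks down, which is the usual motivation for WAPE.

```python
def wape(actual, forecast):
    """Weighted absolute error: sum of |errors| divided by total actual
    demand. Stays defined for lumpy series with zero-demand periods
    (unlike MAPE, which divides by each period's actual)."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / sum(actual)

def cfe(actual, forecast):
    """Cumulative forecast error: running sum of signed errors, which
    exposes systematic over- or under-forecasting (bias)."""
    total, out = 0.0, []
    for a, f in zip(actual, forecast):
        total += a - f
        out.append(total)
    return out

# Hypothetical lumpy demand: many zero periods, occasional spikes,
# against a flat forecast of 2 units per period.
actual = [0, 0, 12, 0, 0, 0, 7, 0, 0, 5]
forecast = [2] * 10
print(wape(actual, forecast))      # 32/24 ≈ 1.333
print(cfe(actual, forecast)[-1])   # 4.0 (net under-forecast)
```

Used together, WAPE summarizes the size of the errors while the CFE trajectory reveals their direction, which is the combination the thesis concludes is suitable for lumpy items.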
|
478 |
The study of the combination of technical analysis and qualitative model in financial forecasting
李寶昇, Li, Po-sing. January 1998 (has links)
published_or_final_version / Business Administration / Master / Master of Business Administration
|
479 |
Alternative methods of raw product valuation for agricultural cooperatives : a forecasting approach
Wiese, Arthur Michael 10 June 1985 (has links)
Raw product value of vegetables for processing in the Northwest used to be established by a competitive market involving proprietary processors and growers. With the relocation of proprietary processors to the Midwest, this competitive market has eroded, forcing cooperative processors to seek other means of setting raw product values. In the absence of a competitive market for raw product, cooperatives must rely on an average of last year's prices paid by processors in a given region to value raw product. This method of lagged averages may result in misallocated contracted acreage among grower-members of cooperatives, and in production levels of the processed good that are inappropriate given market conditions. The principal objective of this research is therefore to develop and evaluate alternative methods of forecasting raw product value.

Since the market for processed vegetables at the retail level is competitive, one alternative method was to use a forecast of supply and determinants of demand affecting retail price to forecast raw product value. These explanatory variables were regressed against raw product values of various crops obtained from a Northwest processing and marketing cooperative. The raw product values were expressed as net returns/acre to the crops under investigation. The estimated equations, which had adjusted R² values ranging from .267 to .851, were used to forecast raw product value. A second forecasting method investigated in this study was an exponential smoothing model.

Raw product value forecasts were generated over two different time horizons, identified by the cooperative's accounting procedures. The two alternative forecasting methods were compared to each other, and to the method currently in use by the cooperative, with the aim of determining the most accurate forecasting technique. Results showed that both the econometric and smoothing approaches fit the data better over the estimation period than did a naive lagged-price estimate resembling the method presently in use by the cooperative. The econometric method also fit the data better than did the smoothing approach.

The econometric model provided poor forecasts over the longer forecast horizon but proved effective over the shorter one. The smoothing technique forecasted more effectively over the longer horizon than the shorter. These results suggest the importance of the forecast horizon in determining the more appropriate forecasting technique. Both forecasting techniques proposed in this study produced forecasts that were more accurate than the cooperative's present method at least half of the time, suggesting that viable alternatives to the present method of establishing raw product value exist for agricultural cooperatives. / Graduation date: 1986
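The exponential smoothing method compared in the study can be sketched in its simplest form, where the forecast is a geometrically weighted average of past observations. The smoothing constant and the returns figures below are illustrative assumptions, not the thesis's data.

```python
def exp_smooth_forecast(series, alpha=0.3):
    """Simple exponential smoothing: update a level estimate as
    level = alpha * observation + (1 - alpha) * level, then use the
    final level as a flat forecast for all future periods."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

# Hypothetical net returns/acre for a processing crop over past seasons.
returns = [410.0, 455.0, 430.0, 470.0, 440.0]
print(round(exp_smooth_forecast(returns), 2))  # 439.17
```

With alpha = 1 this collapses to the naive lagged value resembling the cooperative's present method, so the smoothing constant directly controls how far the forecast departs from simple lagged averages.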
|
480 |
Long-range predictors for saccadic eye movements.
Wu, Chao-Yen. January 1988 (has links)
Predicting the final eye position in the middle of a saccadic eye movement requires long-range prediction, and this dissertation investigated techniques for doing so. Many important results about saccadic eye movements and current prediction techniques were reviewed. New prediction techniques were developed and tested on real saccadic data by computer. Three block processing predictors were derived based on the matrix approach: the two-point linear predictor (TPLP), the five-point quadratic predictor (FPQP), and the nine-point cubic predictor (NPCP). A different approach to deriving the TPLP, FPQP, and NPCP, based on the difference equation, was also developed; it is preferable to the matrix approach because it requires no matrix inversion. Two polynomial predictors were derived: the polynomial-filter predictor 1 (PFP1), a linear combination of a TPLP and an FPQP, and the polynomial-filter predictor 2 (PFP2), a linear combination of a TPLP, an FPQP, and an NPCP. Two recursive predictors were also derived: the recursive-least-square (RLS) predictor and the least-mean-square (LMS) predictor. Results show that the RLS and LMS predictors perform better than the TPLP, FPQP, NPCP, PFP1, and PFP2 in the prediction of saccadic eye movements. A mathematical way of verifying the accuracy of the recursive-least-square predictor was developed; this technique also shows that the RLS predictor can be used to identify a signal. Results show that a sinusoidal signal can be described by a second-order difference equation with coefficients 2cosω and -1. In the same way, a cubic signal can be realized as a fourth-order difference equation with coefficients 4, -6, 4, and -1; a parabolic signal can be written as a third-order difference equation with coefficients 3, -3, and 1; and a triangular signal can be described as a second-order difference equation with coefficients 2 and -1.
In this dissertation, all predictors were tested with various signals such as saccadic eye movements, ECG, sinusoidal, cubic, triangular, and parabolic signals. The FFTs of these signals were studied and analyzed. Computer programs were written in the systems language C and run on a UNIX-supported VAX-11/750 minicomputer. Results were discussed and compared to those of short-range prediction problems.
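Two of the ideas above, the two-point linear predictor and the second-order difference equation for a sinusoid, can be sketched directly. This is one reading of the abstract's definitions, not the dissertation's C implementation, and the sample signal is synthetic.

```python
import math

def tplp(x_prev, x_curr, k):
    """Two-point linear predictor: extrapolate the straight line through
    the last two samples k steps ahead."""
    return x_curr + k * (x_curr - x_prev)

def sinusoid_next(x_nm2, x_nm1, omega):
    """A sampled sinusoid obeys x[n] = 2cos(w)*x[n-1] - x[n-2], so two
    consecutive samples determine the next one exactly."""
    return 2.0 * math.cos(omega) * x_nm1 - x_nm2

omega = 0.3
samples = [math.sin(omega * n) for n in range(5)]
pred = sinusoid_next(samples[2], samples[3], omega)
print(abs(pred - math.sin(omega * 4)))  # ~0: the recursion is exact
```

The exactness of the sinusoid recursion illustrates the abstract's signal-identification point: recovering the difference-equation coefficients (here 2cosω and -1) identifies the signal class.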
|