51 |
Empirical analysis of Italian electricity market / L'analyse empirique du marché italien de l'électricité
Ardian, Faddy 01 July 2016 (has links)
La dérégulation du marché de l'électricité a entraîné de nombreux changements dans l'économie et a incité les chercheurs à initier des études dans ce domaine. L'Italie fournit une étude de cas intéressante pour explorer le marché de l'électricité en raison de ses spécificités. Notre projet se compose de trois études quantitatives indépendantes qui examinent le marché de l'électricité italien sous trois angles différents. La première étude répond à la question de la prévision posée par la volatilité du marché de l'électricité. Le résultat suggère une méthode de prévision alternative pour la modélisation des prix de l'électricité en Italie. La deuxième recherche examine l'impact des énergies renouvelables sur l'apparition de la congestion et ses coûts. Nous analysons les propriétés quantitatives de l'estimation économétrique afin de comprendre le mécanisme économique et d'en tirer des suggestions de politique. Enfin, la dernière recherche analyse l'interdépendance des prix dans six macro-zones du marché italien de l'électricité. / Deregulation of the electricity market has brought many changes to the economy and has prompted researchers to initiate studies in this field. The issue of the deregulated market arises as the volatility of the wholesale price increases because of the new price-determination mechanism. Italy provides an interesting case study for exploring the electricity market because of its specific features. Our project consists of three independent quantitative studies that view the Italian electricity market from three different angles. The first study addresses the forecasting problem caused by the volatility of the electricity market. The results suggest an alternative forecasting method for modelling electricity prices in Italy and compare univariate and panel frameworks. The second study examines the impact of renewable energy on congestion occurrence and congestion costs. We analyse the quantitative properties of the econometric estimates in order to gain insight into the economic mechanism and to draw policy suggestions. Finally, the third study addresses the interdependence of prices across the six macro-zones of the Italian electricity market.
|
52 |
Changing view on future population development of the Republic of Kazakhstan according to the United Nations World Population Prospects since the 1992 till the 2008 revision
Kerembayev, Anuar January 2010 (has links)
Changing view on future population development of the Republic of Kazakhstan according to the United Nations World Population Prospects since the 1992 till the 2008 revision Abstract The objective of this study is to analyze the changing view on future population development and to assess the reliability of the forecast results produced for Kazakhstan by the United Nations World Population Prospects since the 1992 revision, with respect to the components of population development (fertility, mortality, migration), the principal results (total population, age and sex structure) and some other demographic indicators. In the first part, the United Nations forecasting methodology and its applicability to Kazakhstan's population development, the basic conceptual framework and its use are explored. Analysis of the main forecast results follows in the second part; although some findings based on these results support the general view on population development taken by the United Nations World Population Prospects, others are called into question. Furthermore, the impact of the transition period (1990-2000) on the development of the population components and its influence on the age and sex composition is demonstrated. In this thesis the connection between some scientific disciplines, economics and...
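The reliability assessment described in this abstract comes down to comparing each revision's projection with the value eventually observed. A minimal sketch of that comparison follows, assuming a simple percentage-error measure; the revision figures and the observed value are illustrative placeholders, not actual World Population Prospects data.

```python
# Hypothetical sketch: percentage error of successive UN revisions'
# projections of Kazakhstan's total population for one target year.
# All numbers are placeholders, not actual World Population Prospects data.
projections_2005 = {              # projected 2005 population, millions
    "1992 revision": 18.2,
    "1996 revision": 17.0,
    "2000 revision": 15.6,
}
observed_2005 = 15.1              # value eventually observed (placeholder)

for revision, projected in projections_2005.items():
    pct_error = 100.0 * (projected - observed_2005) / observed_2005
    print(f"{revision}: projected {projected:.1f}M, error {pct_error:+.1f}%")
```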
|
53 |
Improving the PEG ratio
I'Ons, Trevor Andrew 17 April 2011 (has links)
The effectiveness of the PEG ratio as a valuation tool has been a topical debate among market commentators ever since it was popularised by Lynch (1989). This study examines the appropriateness of the fair-value criterion of 1.0 (PEGL) in comparison with a time-series-based, share-specific benchmarking model (PEGT). Furthermore, influencing factors of analyst forecasting accuracy, namely the number of analyst contributions, forecast dispersion and forecast horizon, were tested and compared using subset portfolios for each category, with the objective of identifying a possible optimal PEG trading-rule strategy. The outcome showed consistent outperformance of PEGT portfolios over PEGL portfolios and the market benchmark. Unexpected results were obtained for the impact of analyst forecasts on the performance of the PEG ratio, with additional literature review providing possible reasons why analyst optimism may have a greater influence on the PEG ratio than forecasting accuracy. Finally, an optimised PEG trading-rule strategy delivered annual abnormal returns of 5.4% (CAGR: 19.7%) for a PEGL portfolio, versus 13.7% (CAGR: 28.5%) for a PEGT portfolio. The ensuing methodology appeared to single out small-cap firms with above-market growth prospects. Copyright / Dissertation (MBA)--University of Pretoria, 2010. / Gordon Institute of Business Science (GIBS) / unrestricted
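The two fair-value criteria compared in the study can be sketched in a few lines. PEG is the price-earnings ratio divided by the expected earnings growth rate in percent; the literal rule (PEGL) flags a share as cheap when PEG is below 1.0, while the share-specific rule (PEGT) compares the current PEG with the share's own history. The sketch below assumes a median-of-history benchmark as a stand-in for the dissertation's time-series model, and all figures are placeholders.

```python
# Sketch of the two fair-value criteria: PEG_L uses the literal 1.0
# threshold; PEG_T uses a share-specific historical benchmark (here a
# median of past PEGs, an assumption). All figures are placeholders.
from statistics import median

def peg(price: float, eps: float, growth_pct: float) -> float:
    """PEG = (P/E) divided by the expected EPS growth rate in percent."""
    return (price / eps) / growth_pct

history = [1.4, 1.6, 1.1, 1.8, 1.5]        # the share's past PEGs (placeholder)
current = peg(price=120.0, eps=8.0, growth_pct=12.0)   # P/E of 15 => PEG 1.25

cheap_by_pegl = current < 1.0               # literal fair-value rule
cheap_by_pegt = current < median(history)   # share-specific benchmark rule

print(f"current PEG = {current:.2f}")
print(f"PEG_L signal: {'buy' if cheap_by_pegl else 'no buy'}")
print(f"PEG_T signal: {'buy' if cheap_by_pegt else 'no buy'}")
```

In this toy case the literal rule rejects the share while the historical benchmark flags it as cheap relative to its own past, which is exactly the kind of disagreement the study exploits.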
|
54 |
HYDROLOGIC VARIABILITY WITHIN THE CLIMATE REGIONS OF CONTINENTAL UNITED STATES AND ITS TELECONNECTION WITH CLIMATE VARIABLES
Thakur, Balbhadra 01 September 2020 (has links)
The entropy of all systems is supposed to increase with time, and this is also observed in hydroclimatic records as increased variability. This dissertation is primarily focused on the variability of hydrologic records across the climate regions of the continental United States. The study evaluated the effects of serial correlation in historical streamflow records on both gradual trends and abrupt shifts in streamflow. It also evaluated the trend before and after any shift to determine whether the observed changes in streamflow are a result of long-term variability or a climate regime shift. Secondly, the dissertation evaluated the variability of western US hydrology, which is strongly driven by oscillations of the Pacific Ocean such as the El Niño-Southern Oscillation (ENSO). It examined the variability of snow water equivalent (SWE) in the western US, since the region's winter snow accumulation drives its spring-summer streamflow, which contributes the major portion of yearly streamflow. SWE variability during the individual phases of ENSO was analyzed to reveal the detailed influence of ENSO on historic snow accumulation. The study is not limited to evaluating hydrologic variability; it also obtains time-lagged spatiotemporal teleconnections between large-scale climate variables and streamflow and forecasts the latter based on those teleconnections. To accomplish these research goals the dissertation was subdivided into three tasks. The first task dealt with 419 unimpaired streamflow records, grouped into seven climate regions based on the National Climate Assessment, to evaluate regional changes in both seasonal streamflow and yearly streamflow percentiles. The non-parametric Mann-Kendall test and Pettitt's test were utilized to evaluate streamflow variability as gradual trend and abrupt shift, respectively. The Walker test was performed to assess the field significance of streamflow variability within each climate region based on the local trend and shift significance of each streamflow station. The task also evaluated, for the first time, the presence of serial correlation in the streamflow records and its effects on both trend and shift within the climate regions of the continental United States. Maximum variability in terms of both trend and shift was observed in summer compared to the other seasons. Similarly, more stations showed streamflow variability for the 5th and 50th percentile streamflows than for the 95th and 100th percentiles. It was also observed that serial correlation affected both trend and shift, and that accounting for lag-1 autocorrelation improved the shift results. The results indicated that streamflow variability has more likely occurred as a shift than as a gradual trend. These outcomes detailing historic variability may help to envision future changes in streamflow. The second task evaluated the spatiotemporal variability of western US SWE over 58 years (1961-2018) as a trend and a shift. It tested whether SWE is consistent across ENSO phases using the Kolmogorov-Smirnov (KS) test. Trend analysis was performed on the SWE data of each ENSO phase, shift analysis was performed on the entire 58-year time series, and the trend in the SWE data was additionally evaluated before and after shift years. The Mann-Kendall and Pettitt's tests were utilized for the detection of trend and shift, respectively. Serial correlation was considered during the trend evaluation, while the Theil-Sen approach was used to estimate the trend magnitude. Serial correlation in a time series, a potential cause of over- or underestimation in trend evaluation, was found to be absent in the SWE data. The results suggested a negative trend and a shift during the study period. The negative trend was absent during neutral years and present during El Niño and La Niña years. Trend magnitudes were largest during La Niña years, followed by El Niño years and the entire length of the data. It was also observed that, once the negative shift in SWE was accounted for, most stations did not show a significant trend before or after the shift. The third task forecasted streamflow at a regional scale within the Sacramento-San Joaquin (SSJ) River Basin using large-scale climate variables. The SSJ is an agricultural watershed located in a drought-sensitive region of California. The forecast technique involved a hybrid statistical framework that eliminates the bias resulting from predefined indices at the regional scale. The study was performed for eight unimpaired streamflow stations from 1962 to 2016. First, Singular Value Decomposition (SVD) teleconnections of the streamflow with 500 mbar geopotential height, sea surface temperature, 500 mbar specific humidity (SHUM500), and 500 mbar U-wind (U500) were obtained. Second, the skillful SVD teleconnections were screened non-parametrically. Finally, the screened teleconnections were used as streamflow predictors in non-linear regression models (K-nearest neighbor regression and the data-driven support vector machine). The SVD results identified new spatial regions that are not included in existing predefined indices. The non-parametric model indicated that the teleconnections of SHUM500 and U500 were better streamflow predictors than the other climate variables. The regression models were able to capture most of the sustained low flows, proving the approach effective for drought-affected regions. It was also observed that the forecasting approach showed better skill with preprocessed large-scale climate variables than with predefined indices. The techniques involved in this task are simple, yet robust in providing qualitative streamflow forecasts that may assist water managers in making policy-related decisions when planning and managing watersheds.
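The Mann-Kendall test and Theil-Sen slope used throughout the dissertation are easy to sketch. The version below is the textbook form, without the tie correction or the lag-1 autocorrelation adjustment that the study applies, and the SWE series is a placeholder.

```python
# Textbook Mann-Kendall trend test and Theil-Sen slope estimator, without
# the tie correction or lag-1 autocorrelation adjustment the study applies.
# The SWE series is an illustrative placeholder.
import math
from itertools import combinations

def mann_kendall(x):
    n = len(x)
    s = sum((xj > xi) - (xj < xi) for xi, xj in combinations(x, 2))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - math.copysign(1, s)) / math.sqrt(var_s) if s != 0 else 0.0
    return s, z            # |z| > 1.96 => significant trend at the 5% level

def theil_sen(x):
    slopes = [(xj - xi) / (j - i)
              for (i, xi), (j, xj) in combinations(enumerate(x), 2)]
    return sorted(slopes)[len(slopes) // 2]   # median pairwise slope

swe = [520, 515, 498, 505, 470, 460, 455, 440, 430, 415]  # placeholder SWE
s, z = mann_kendall(swe)
print(f"S = {s}, Z = {z:.2f}, Theil-Sen slope = {theil_sen(swe):.1f} per year")
```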
|
55 |
Population Forecasting for the Town of Ancaster
Allemang, Mark January 1986 (has links)
This paper applies a cohort survival model to an age- and sex-disaggregated 1985 'base' population of Ancaster. Using a Fortran programme, low, high, and 'most probable' projections were made for a 1986 to 2001 time horizon. The migration component was found to be the single most important projection variable. Consequently, only migration was varied between the three sets of projections. In analyzing migration for Ancaster, we identified a persistent trend in net migration over the 1971 to 1985 period. This finding allowed us to apply the 1985 male and female age profiles of net migration to the in-migrants. Thus, this study quantified net migration more accurately than previous studies. / Thesis / Bachelor of Arts (BA)
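The cohort survival mechanics described above can be sketched compactly. Below is one age- and sex-disaggregated projection step, written in Python rather than the original Fortran programme; the survival, fertility, and net-migration inputs are made-up placeholders, not the Ancaster study's data.

```python
# One projection step of a cohort survival model, disaggregated by sex and
# 5-year age group. All rates and counts are made-up placeholders, not the
# Ancaster study's inputs.
import numpy as np

AGES = 18                                     # 0-4, 5-9, ..., 85+
pop = {s: np.full(AGES, 500.0) for s in ("F", "M")}      # 'base' population
survival = {s: np.full(AGES, 0.97) for s in ("F", "M")}  # survival rates
net_mig = {s: np.full(AGES, 10.0) for s in ("F", "M")}   # 1985-style age profile
fert = np.zeros(AGES)
fert[3:9] = 0.08                              # fertility for ages 15-44 (F only)

def project_step(pop, survival, net_mig, fert, female_share=0.488):
    new = {}
    for s in ("F", "M"):
        aged = np.zeros(AGES)
        aged[1:] = pop[s][:-1] * survival[s][:-1]   # survive and age forward
        aged[-1] += pop[s][-1] * survival[s][-1]    # open-ended 85+ group
        new[s] = aged + net_mig[s]                  # add net migration profile
    births = (pop["F"] * fert).sum()
    new["F"][0] += births * female_share
    new["M"][0] += births * (1 - female_share)
    return new

pop = project_step(pop, survival, net_mig, fert)
print(f"projected total population: {sum(p.sum() for p in pop.values()):,.0f}")
```

Varying only the net-migration profile across low, high, and 'most probable' runs, as the paper does, amounts to calling project_step with three different net_mig inputs.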
|
56 |
Using Machine Learning Techniques to Improve Operational Flash Flood Forecasting
Della Libera Zanchetta, Andre January 2022 (has links)
Compared with other types of floods, timely and accurate prediction of flash floods is particularly challenging due to the small spatiotemporal scales at which the hydrologic and hydraulic processes tend to develop, and to the short lead time between the causative event and the inundation scenario. With the continuously increasing availability of data and computational power, interest in applying machine learning techniques for hydrologic purposes in the context of operational forecasting has also been increasing. The primary goal of the research activities developed in this thesis is to explore the use of emerging machine learning techniques for enhancing flash flood forecasting. The studies presented start with a review of the state of the art of documented forecasting systems suitable for flash floods, followed by an assessment of the potential of using multiple concurrent precipitation estimates for early prediction of high-discharge scenarios in a flashy catchment. Then, the problem of rapidly producing realistic high-resolution flood inundation maps is explored through the use of hybrid machine learning models based on Non-linear AutoRegressive with eXogenous inputs (NARX) and Self-Organizing Map (SOM) structures as surrogates of a 2D hydraulic model. In this context, the use of a k-fold ensemble is proposed and evaluated as an approach for estimating uncertainties related to surrogating a physics-based model.
The results indicate that, in a small and flashy catchment, the abstract nature of data processing in machine learning models benefits from the presentation of multiple concurrent precipitation products for rainfall-runoff simulations when compared to the business-as-usual single-precipitation approach. Also, it was found that hybrid NARX-SOM models, previously explored for slowly developing flood scenarios, perform acceptably when surrogating high-resolution models in rapidly evolving inundation events, producing both deterministic and probabilistic inundation maps in which uncertainties are adequately estimated. / Thesis / Doctor of Science (PhD) / Flash floods are among the most hazardous and impactful environmental disasters faced by societies across the globe. The timely adoption of mitigation actions by decision makers and response teams is particularly challenging due to the rapid development of such events after (or even during) an intense rainfall. The short time interval available for response imposes a constraint on the direct use of computationally demanding components in real-time forecasting chains. Examples of such components are high-resolution 2D hydraulic models based on physical laws, which can produce valuable flood inundation maps dynamically. This research explores the potential of using machine learning models to reproduce the behavior of hydraulic models designed to simulate the evolution of flood inundation maps in a configuration suitable for operational flash flood forecasting. Contributions of this thesis include (1) a comprehensive literature review of recent advances and approaches in operational flash flood forecasting systems, identifying and highlighting the main research gaps on this topic; (2) evidence that machine learning models can identify patterns in multiple quantitative precipitation estimates from different sources, enhancing the performance of rainfall-runoff estimation in urban catchments prone to flash floods; (3) the assessment that hybrid data-driven structures based on self-organizing maps (SOM) and non-linear autoregressive models with exogenous inputs (NARX), originally proposed for large-scale and slow-developing flood scenarios, can be successfully applied to flashy catchments; and (4) the proposal of k-fold ensembles as a technique to produce probabilistic flood inundation forecasts in which the uncertainty inherent to the surrogating step is represented.
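The NARX idea behind the hybrid surrogate can be illustrated with a generic neural regressor: train on lagged exogenous inputs (rainfall) and lagged outputs (discharge), then run in closed loop at forecast time, feeding predictions back as the autoregressive input. The sketch below uses synthetic data and omits the SOM stage that maps discharge to inundation maps; none of it reflects the thesis's actual configuration.

```python
# NARX-style surrogate: a neural regressor trained on lagged rainfall
# (exogenous input) and lagged discharge (autoregressive input), then run
# in closed loop. Synthetic data; the SOM stage mapping discharge to
# inundation maps is omitted.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
rain = rng.gamma(2.0, 1.0, 500)                               # synthetic forcing
flow = np.convolve(rain, [0.5, 0.3, 0.2], mode="full")[:500]  # synthetic response

LAGS = 3
X = np.column_stack([rain[LAGS - k - 1:500 - k - 1] for k in range(LAGS)] +
                    [flow[LAGS - k - 1:500 - k - 1] for k in range(LAGS)])
y = flow[LAGS:]

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:400], y[:400])

# Closed-loop forecast: feed predicted flow back as the autoregressive input.
hist = list(flow[400 - LAGS:400])
preds = []
for t in range(400, 450):
    x = np.r_[rain[t - LAGS:t][::-1], hist[-LAGS:][::-1]]
    preds.append(model.predict(x.reshape(1, -1))[0])
    hist.append(preds[-1])

rmse = np.sqrt(np.mean((np.array(preds) - flow[400:450]) ** 2))
print(f"50-step closed-loop RMSE: {rmse:.3f}")
```

The k-fold ensemble proposed in the thesis would amount to fitting k such surrogates on k different training folds and reading the spread of their closed-loop predictions as an estimate of the surrogating uncertainty.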
|
57 |
Forecasting Commodity Production Spread
Xiaoyu Hu (18431343) 26 April 2024
<p dir="ltr">This paper examines the resilience of global food and energy supply chains against the background of recent world disruptions such as China-US trade war, novel coronavirus disease 2019 (COVID-19) pandemic, and Russia’s incursion into Ukraine. It aims at improving forecast methodologies and providing early indications of market stressors by considering three key cracks or spreads within the food and energy industries soy crush spread, crude crack spread, and cattle finish spread. The study uses Autoregressive Integrated Moving Average (ARIMA), Exponential Smoothing State Space (ETS) and Vector Error Correction Model (VECM). The profit relationships are examined in these models with regard to potential problems for supply chains in the soybean crushing industry, cattle finishing, and crude oil refining sectors. It also compares forecasting approaches like univariate (ARIMA & ETS) and multivariate (VECM). This means that it tries to gauge how accurate each one is in predicting where a given sector may be heading or where there are risks likely to happen. The situation is further complicated by on-going capacity expansions in these sectors which are expected to face more challenges due to geopolitical tensions as well as efforts to mitigate climate change internationally.The overall goal of the research is to develop forecasting methods to help industry participants, policymakers, and small producers make informed decisions amid volatility and the threat of imminent supply chain disruptions.</p>
|
58 |
On the 3 M's of Epidemic Forecasting: Methods, Measures, and Metrics
Tabataba, Farzaneh Sadat 06 December 2017 (has links)
Over the past few decades, various computational and mathematical methodologies have been proposed for forecasting seasonal epidemics. In recent years, the deadly effects of major pandemics such as the H1N1 influenza virus, Ebola, and Zika have compelled scientists to find new ways to improve the reliability and accuracy of epidemic forecasts. The improvement and variety of these prediction methods are undeniable. Nevertheless, many challenges remain unresolved in forecasting outbreaks from surveillance data. Obtaining clean real-time data has always been an obstacle. Moreover, surveillance data is usually noisy, and handling the uncertainty of the observed data is a major issue for forecasting algorithms. Making correct modeling assumptions about the nature of the infectious disease is another dilemma. Oversimplified models can lead to inaccurate forecasts, whereas more complicated methods require additional computational resources and information; without those, the model may not converge to a unique optimum solution. Over the last decade, there has been a significant effort towards achieving better epidemic forecasting algorithms. However, the lack of standard, well-defined evaluation metrics impedes a fair judgment of the proposed methods.
This dissertation is divided into two parts. In the first part, we present a Bayesian particle filter calibration framework integrated with an agent-based model to forecast the epidemic trends of diseases like flu and Ebola. Our approach uses Bayesian statistics to estimate the underlying disease model parameters given the observed data and to handle uncertainty in the reasoning. An individual-based model with different intervention strategies can involve a large number of unknown parameters that must be properly calibrated. As particle filters can collapse in very large-scale systems (the curse-of-dimensionality problem), achieving the optimum solution becomes more challenging. Our proposed particle filter framework utilizes machine learning concepts to constrain the intractable search space. It incorporates a smart analyzer in the state dynamics unit that examines the predicted and observed data using machine learning techniques to guide the direction and amount of perturbation of each parameter in the searching process.
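The calibration loop described above follows the standard sequential importance resampling pattern: propagate parameter particles through the disease model, weight them by the likelihood of the observed counts, resample, and perturb. In the bare-bones sketch below, a trivial growth model stands in for the agent-based simulation and the ML-guided perturbation unit is reduced to plain random jitter; everything is a placeholder.

```python
# Bare-bones particle filter calibrating one unknown parameter (a weekly
# growth rate) against noisy case counts. A trivial growth model stands in
# for the agent-based simulation; the ML-guided perturbation unit described
# above is reduced to plain random jitter.
import numpy as np

rng = np.random.default_rng(42)
true_beta, n_weeks, n_particles = 1.3, 12, 2000
cases = 10.0
observed = [cases := cases * true_beta + rng.normal(0, 5) for _ in range(n_weeks)]

particles = rng.uniform(0.8, 2.0, n_particles)   # prior over the growth rate
states = np.full(n_particles, 10.0)              # each particle's case count

for y in observed:
    states = states * particles                          # propagate the model
    weights = np.exp(-0.5 * ((y - states) / 5.0) ** 2)   # Gaussian likelihood
    weights = weights + 1e-300                           # guard against underflow
    weights /= weights.sum()
    idx = rng.choice(n_particles, n_particles, p=weights)          # resample
    particles = particles[idx] + rng.normal(0, 0.01, n_particles)  # jitter
    states = states[idx]

print(f"posterior mean growth rate: {particles.mean():.3f} (truth {true_beta})")
```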
The second part of this dissertation focuses on providing standard evaluation measures for evaluating epidemic forecasts. We present an end-to-end framework that introduces epidemiologically relevant features (Epi-features), error measures, and ranking schema as the main modules of the evaluation process. Lastly, we provide the evaluation framework as a software package named Epi-Evaluator and demonstrate its potential and capabilities by applying it to the output of different forecasting methods. / PHD / Epidemics impose substantial costs on societies by deteriorating public health and disrupting economic trends. In recent years, the deadly effects of widespread pandemics such as H1N1, Ebola, and Zika have compelled scientists to find new ways to improve the reliability and accuracy of epidemic forecasts. Reliable prediction of future pandemics and efficient intervention plans for health care providers could prevent or control disease propagation. Over the last decade, there has been a significant effort towards achieving better epidemic forecasting algorithms. The mission, however, is far from accomplished. Moreover, there has been no significant leap towards standard, well-defined evaluation metrics and criteria for fair performance comparisons between proposed methods.
This dissertation is divided into two parts. In the first part, we present a Bayesian particle filter calibration framework integrated with an agent-based model to forecast the epidemic trends of diseases like flu and Ebola. We model disease propagation via a large-scale agent-based model that simulates the disease's spread across the contact network of a population. The contact network consists of millions of nodes and is constructed from demographic information on individuals obtained from census data. The agent-based model's configurations are mostly unknown parameters that must be properly calibrated. We present a Bayesian particle filter calibration approach to estimate the underlying disease model parameters given the observed data and to handle uncertainty in the reasoning. As particle filters can collapse in very large-scale systems, achieving the optimum solution becomes more challenging. Our proposed particle filter framework utilizes machine learning concepts to constrain the intractable search space. It incorporates a smart analyzer unit that examines the predicted and observed data using machine learning techniques to guide the direction and amount of perturbation of each parameter in the searching process.
The second part of this dissertation focuses on providing standard evaluation measures for evaluating and comparing epidemic forecasts. We present a framework that introduces epidemiologically relevant features (Epi-features), error measures, and ranking schema as the main modules of the evaluation process. Lastly, we provide the evaluation framework as a software package named Epi-Evaluator and demonstrate its potential and capabilities by applying it to the output of different forecasting methods.
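The three modules named here (Epi-features, error measures, ranking) reduce to a small pipeline: extract features from each forecast curve, score them against the observed curve, and rank the methods. The sketch below is a minimal illustration of that pipeline, not the Epi-Evaluator package itself; the curves and the feature set are placeholders.

```python
# Minimal evaluation pipeline: Epi-features (peak height and peak week),
# an error measure (absolute error per feature), and a ranking over
# methods. Illustrative placeholders only, not the Epi-Evaluator package.
import numpy as np

def epi_features(curve):
    curve = np.asarray(curve)
    return {"peak_height": curve.max(), "peak_week": int(curve.argmax())}

observed = [5, 12, 30, 55, 48, 22, 9]            # weekly case counts
forecasts = {
    "method_A": [6, 15, 28, 50, 45, 25, 10],
    "method_B": [4, 10, 20, 35, 52, 40, 18],
}

obs_feat = epi_features(observed)
scores = {name: sum(abs(epi_features(curve)[k] - obs_feat[k]) for k in obs_feat)
          for name, curve in forecasts.items()}

for rank, (name, err) in enumerate(sorted(scores.items(), key=lambda kv: kv[1]), 1):
    print(f"{rank}. {name}: total Epi-feature error = {err}")
```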
|
59 |
Futures-Based Forecasts of U.S. Crop Prices
Zhu, Jiafeng 03 October 2017 (has links)
Over the last decade, U.S. crop prices have become significantly more volatile. Volatile markets pose increased risks for agricultural market participants and create a need for reliable price forecasts. The research discussed in this paper aims to find different approaches to forecasting crop cash prices based on the prices of related futures contracts.
Corn, soybeans, soft red winter wheat, and cotton are the focus of this research. Since price data for these commodities is non-stationary, this paper used two approaches to address the problem. The first is to forecast the difference in prices between the current and future periods; the second is to use regimes. The paper takes the five-year moving average as the benchmark when comparing these approaches.
This research evaluated model performance using R-squared, mean errors, root mean squared errors, the modified Diebold-Mariano test, and the encompassing test. The results show that both the difference model and the regime model perform better than the benchmark in most cases, but without a significant difference between them. Based on these findings, the regime model was used to forecast the cash prices of corn and soybeans, the difference model was used for cotton, and the benchmark was used to forecast the SRW wheat cash price. / Master of Science / This research attempts to develop models to forecast cash prices of corn, soybeans, wheat and cotton using the underlying futures prices. Two alternative approaches are proposed. The difference model focuses on forecasting the differences between current and future prices. The regime model uses external data to determine potential structural breaks in price relationships. The out-of-sample performance of these models is compared to the benchmark of a five-year average using various performance criteria. The results show that the regime model performs better for corn and soybeans, while the difference model is best for cotton. For wheat, the results are mixed, but the benchmark seems to perform better than the proposed models.
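The difference approach can be sketched directly: regress the realized difference between the cash price at delivery and today's futures price on current information, then add the fitted difference back to the futures price. In the sketch below the predictor (the current basis), the horizon, and the data are all illustrative assumptions, not the paper's specification.

```python
# Difference-model sketch: regress the realized cash-minus-futures
# difference at delivery on the current basis, then add the fitted
# difference back to today's futures price. Horizon, predictor choice,
# and data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)
n, h = 200, 3                                     # months of data, horizon
futures = 450 + rng.normal(0, 10, n).cumsum()     # corn futures, cents/bu
basis = -20 + 8.0 * np.sin(np.arange(n) / 6) + rng.normal(0, 3, n)
cash = futures + basis                            # synthetic cash price

# y: difference between the cash price at t+h and the futures price at t.
y = cash[h:] - futures[:-h]
X = np.column_stack([np.ones(n - h), basis[:-h]])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

forecast = futures[-1] + beta[0] + beta[1] * basis[-1]
print(f"{h}-month-ahead cash forecast: {forecast:.1f} cents/bu")
```

A regime model, as described above, would instead let the intercept and slope of this regression switch according to externally determined structural breaks.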
|
60 |
Prévisions d'ensemble à l'échelle saisonnière : mise en place d'une dynamique stochastique / Ensemble predictions at the seasonal time scale : implementation of a stochastic dynamics technique
Saunier-Batté, Lauriane 23 January 2013 (has links)
La prévision d'ensemble à l'échelle saisonnière avec des modèles de circulation générale a connu un essor certain au cours des vingt dernières années avec la croissance exponentielle des capacités de calcul, l'amélioration de la résolution des modèles, et l'introduction progressive dans ceux-ci des différentes composantes (océan, atmosphère, surfaces continentales et glace de mer) régissant l'évolution du climat à cette échelle. Malgré ces efforts, prévoir la température et les précipitations de la saison à venir reste délicat, non seulement sur les latitudes tempérées mais aussi sur des régions sujettes à des aléas climatiques forts comme l'Afrique de l'ouest pendant la saison de mousson. L'une des clés d'une bonne prévision est la prise en compte des incertitudes liées à la formulation des modèles (résolution, paramétrisations, approximations et erreurs). Une méthode éprouvée est l'approche multi-modèle consistant à regrouper les membres de plusieurs modèles couplés en un seul ensemble de grande taille. Cette approche a été mise en œuvre notamment dans le cadre du projet européen ENSEMBLES, et nous montrons qu'elle permet généralement d'améliorer les rétro-prévisions saisonnières des précipitations sur plusieurs régions d'Afrique par rapport aux modèles pris individuellement. On se propose dans le cadre de cette thèse d'étudier une autre piste de prise en compte des incertitudes du modèle couplé CNRM-CM5, consistant à ajouter des perturbations stochastiques de la dynamique du modèle d'atmosphère ARPEGE-Climat. Cette méthode, baptisée “dynamique stochastique”, consiste à introduire des perturbations additives de température, humidité spécifique et vorticité corrigeant des estimations d'erreur de tendance initiale du modèle. Dans cette thèse, deux méthodes d'estimation des erreurs de tendance initiale ont été étudiées, basées sur la méthode de nudging (guidage) du modèle vers des données de référence. Elles donnent des résultats contrastés en termes de scores des rétro-prévisions selon les régions étudiées. Si on estime les corrections d'erreur de tendance initiale par une méthode de nudging itéré du modèle couplé vers les réanalyses ERA-Interim, on améliore significativement les scores sur l'hémisphère Nord en hiver en perturbant les prévisions saisonnières en tirant aléatoirement parmi ces corrections. Cette amélioration est accompagnée d'une nette réduction des biais de la hauteur de géopotentiel à 500 hPa. Une rétro-prévision en utilisant des perturbations dites“optimales” correspondant aux corrections d'erreurs de tendance initiale du mois en cours de prévision montre l'existence d'une information à l'échelle mensuelle qui pourrait permettre de considérablement améliorer les prévisions. La dernière partie de cette thèse explore l'idée d'un conditionnement des perturbations en fonction de l'état du modèle en cours de prévision, afin de se rapprocher si possible des améliorations obtenues avec ces perturbations optimales / Over the last twenty years, research in ensemble predictions at a seasonal timescale using general circulation models has undergone a considerable development due to the exponential growth rate of computing capacities, the improved model resolution and the introduction of more and more components (ocean, atmosphere, land surface and sea-ice) that have an impact on climate at this time scale. 
Despite these efforts, predicting temperature and precipitation for the upcoming season is a difficult task, not only over mid-latitudes but also over regions subject to high climate risk, like West Africa during the monsoon season. One key to improving predictions is to represent model uncertainties (due to resolution, parametrizations, approximations and model error). The multi-model approach is a well-tried method which consists of pooling members from different individual coupled models into a single superensemble. This approach was undertaken as part of the European Commission funded ENSEMBLES project, and we find that it usually improves seasonal precipitation re-forecasts over several regions of Africa with respect to individual model predictions. The main goal of this thesis is to study another approach to addressing model uncertainty in the global coupled model CNRM-CM5, by adding stochastic perturbations to the dynamics of the atmospheric model ARPEGE-Climat. Our method, called "stochastic dynamics", consists of adding additive perturbations to the temperature, specific humidity and vorticity fields, thus correcting estimates of the model's initial tendency errors. In this thesis, two initial tendency error estimation techniques were studied, based on nudging the model towards reference data. They yield different results in terms of re-forecast scores, depending on the regions studied. If the initial tendency error corrections are estimated using an iterative nudging method towards the ERA-Interim reanalysis, seasonal prediction scores over the Northern Hemisphere in winter are significantly improved by drawing random corrections. The 500 hPa geopotential height bias is also clearly reduced. A re-forecast using "optimal" perturbations, drawn from the initial tendency error corrections of the current forecast month, shows that useful information exists at a monthly timescale and could allow significant forecast improvement. The last part of this thesis explores the idea of conditioning the perturbations on the model's current state during the forecast, in order to take a step closer, if possible, to the improvements noted with these optimal perturbations.
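The "stochastic dynamics" scheme reduces to a draw-and-add step: at each perturbation time, one correction field is drawn at random from a pre-computed population of initial tendency error estimates and added to the model state. The sketch below shows that step on toy 2D grids standing in for the ARPEGE-Climat temperature, specific humidity, and vorticity fields; all shapes and values are placeholders.

```python
# Draw-and-add step of the "stochastic dynamics" scheme: pick one
# correction at random from a pool of pre-computed initial tendency error
# estimates and add it to each prognostic field. Toy 2D grids stand in for
# the ARPEGE-Climat temperature, specific humidity, and vorticity states.
import numpy as np

rng = np.random.default_rng(0)
shape = (32, 64)                                 # toy lat-lon grid
fields = {name: rng.normal(0.0, 1.0, shape)
          for name in ("temperature", "humidity", "vorticity")}

# Pool of corrections, e.g. estimated by nudging the model towards a
# reanalysis (100 samples per variable, placeholder values).
pool = {name: rng.normal(0.0, 0.05, (100, *shape)) for name in fields}

def perturb(fields, pool, rng):
    """One stochastic-dynamics step: add one randomly drawn correction."""
    k = rng.integers(100)                        # same draw for all variables
    return {name: field + pool[name][k] for name, field in fields.items()}

for _ in range(4):                               # perturb repeatedly in a forecast
    fields = perturb(fields, pool, rng)
print({name: round(float(np.abs(f).mean()), 3) for name, f in fields.items()})
```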
|