1

Analytic Long Term Forecasting with Periodic Gaussian Processes

Ghassemi, Nooshin Haji, January 2014
In many application domains, such as weather forecasting, robotics and machine learning, we need to model, predict and analyze the evolution of periodic systems. Time series with periodic or quasi-periodic patterns appear, for instance, in climatology, where CO2 emissions and temperature changes evolve periodically, and in robotics, where the joint angle of a rotating robotic arm follows a periodic pattern. It is often important to make long-term predictions of the evolution of such systems. Gaussian processes are powerful models for this purpose and can be adjusted to the properties of the problem at hand: they belong to the class of probabilistic kernel methods, in which the kernel encodes the characteristics of the problem into the model. For systems with periodic evolution, taking the periodicity into account can simplify the problem considerably, and Gaussian process models do so through a periodic kernel. Long-term prediction must deal with uncertain inputs, which are expressed as distributions rather than deterministic points. Unlike prediction at deterministic points, prediction at uncertain inputs is analytically intractable for Gaussian processes. Approximation methods such as moment matching allow the uncertainty to be handled in analytic closed form, but only for particular kernels, and the standard periodic kernel does not admit analytic moment matching for long-term prediction. This work presents an analytic approximation method for long-term forecasting of periodic systems. We propose a different parametrization of the standard periodic kernel that allows moment matching to be approximated in analytic closed form, and we evaluate the approximation on several periodic systems. The results indicate that the proposed method is valuable for long-term forecasting of periodic processes.
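To make the standard periodic kernel referred to above concrete (this is not the reparametrization proposed in the thesis), the following minimal Python sketch performs Gaussian-process regression with that kernel at deterministic test inputs; the hyperparameter values and the noisy sine data are assumptions for demonstration only.

```python
import numpy as np

def periodic_kernel(x1, x2, variance=1.0, lengthscale=1.0, period=2.0 * np.pi):
    # Standard periodic kernel: k(x, x') = s^2 * exp(-2 sin^2(pi |x - x'| / p) / l^2)
    d = np.abs(x1[:, None] - x2[None, :])
    return variance * np.exp(-2.0 * np.sin(np.pi * d / period) ** 2 / lengthscale ** 2)

def gp_predict(x_train, y_train, x_test, noise=0.1, **kern_args):
    # Exact GP posterior mean and variance at deterministic test points.
    K = periodic_kernel(x_train, x_train, **kern_args) + noise**2 * np.eye(len(x_train))
    K_s = periodic_kernel(x_train, x_test, **kern_args)
    K_ss = periodic_kernel(x_test, x_test, **kern_args)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss - v.T @ v)
    return mean, var

# Toy periodic data (assumed for illustration).
rng = np.random.default_rng(0)
x_train = rng.uniform(0, 4 * np.pi, 30)
y_train = np.sin(x_train) + 0.1 * rng.standard_normal(30)
x_test = np.linspace(0, 6 * np.pi, 200)
mean, var = gp_predict(x_train, y_train, x_test, period=2 * np.pi)
```

Prediction at an uncertain input would require propagating a Gaussian distribution over the test location through this posterior, which is precisely the step that is intractable for this kernel and motivates the reparametrization developed in the thesis.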
2

Modelling European Forest Products Consumption and Trade in a Context of Structural Change / Modélisation de la Consommation et du Commerce des Produits Forestiers en Europe dans un contexte de Changement Structurel

Rougieux, Paul, 09 March 2017
Forests in the European Union grow by 1.2 billion m³ per year. Half of this volume stays in the forest, in particular for sustainable forest management purposes. The other half flows into three industrial sectors: wooden materials, paper products and wood energy. These industrial product flows are set in motion and paid for by diverse final consumers. Since 2000, consumption has been undergoing important structural changes which cause large disturbances in material, paper and fuel flows. To predict the impact of these changes, economists model the relationships between raw material supply, final product demand, prices, production and international trade. This thesis uses panel data econometrics to estimate the parameters of empirical models. An introductory chapter sets the policy context of forest resources and of the forest products of interest at a macroeconomic level. I then review the major forest sector models and focus on issues encountered while estimating the parameters of demand models. A second chapter investigates the potential impact of a trade agreement between the EU and the US on the forest sector. We found that total welfare would increase in the region of the agreement and decrease slightly elsewhere, and that the agreement benefits consumers more than producers. Results also show that third-party countries are affected by the agreement, which highlights the importance of using a global trade model to analyse its impacts. In a third chapter I estimate revenue and price elasticities of demand for forest products on a panel of European countries. I deal with non-stationarity issues and estimate demand elasticities within cointegrated panels. I show that revenue elasticities of demand are lower than previous estimates from the literature. Simulations using these robust elasticities in a forest sector model project lower demand over a 20-year horizon. In a fourth chapter, I analyse structural changes in paper products consumption. For this purpose, I use a panel threshold model to estimate the relationship between information technology use and the consumption of paper products: newsprint, printing and writing paper. I show how paper demand elasticities depend on internet penetration in the population. A threshold occurs once a majority of the population has access to the internet; after the threshold, the coefficients linking paper consumption to its explanatory variables, revenue and price, become smaller in absolute terms or even change sign. Based on projections of the number of internet users per country, paper consumption projections could be updated with this type of threshold model. From a policy perspective, lower demand for graphic paper would free resources and make them available for innovative forest products and services.
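To illustrate the kind of demand equation from which such elasticities are read off, here is a minimal sketch of a log-log specification with country fixed effects; the file name, column names and the use of plain OLS (rather than the panel cointegration estimators applied in the thesis) are assumptions for demonstration only.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: one row per country and year, with apparent consumption,
# real price and GDP (file and columns are assumed for illustration).
panel = pd.read_csv("forest_products_panel.csv")  # columns: country, year, consumption, price, gdp

# Log-log demand equation with country fixed effects:
#   log(consumption) = a_i + e_y * log(gdp) + e_p * log(price) + u
model = smf.ols(
    "np.log(consumption) ~ np.log(gdp) + np.log(price) + C(country)",
    data=panel,
).fit(cov_type="HC1")

# e_y and e_p are the revenue and price elasticities of demand.
print(model.params[["np.log(gdp)", "np.log(price)"]])
```

When the underlying series are non-stationary, a static regression of this kind can give misleading elasticity estimates, which is why the thesis estimates them within cointegrated panels instead.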
3

Economic growth, unemployment and skills in South Africa : An Analysis of different recycling schemes of carbon tax revenue / Croissance, chômage et compétences en Afrique du Sud : Analyse de plusieurs plans de recyclage des revenus d’une taxe carbone

Schers, Jules, 21 December 2018
This PhD thesis gives a numerical illustration of how a carbon tax could affect South African GDP, employment, CO2 emissions and socio-economic inequality. It uses a "hybrid" computable general equilibrium model of an open economy in a one-step projection from 2005 to 2035. The model represents a second-best economy, notably skill-related rigidities in the labour market and in electricity production. Seven scenarios for recycling the carbon tax revenue are analysed, plus an option to invest part of the revenue in improving the skills of the labour force. The analysis shows that, under conventional assumptions about technological change, a carbon tax of around 100 ZAR2005 (18 USD2013) per tonne of CO2 has few negative consequences for GDP and employment when combined with the right type of revenue recycling: labour subsidies and a reduction of company profit taxes likely lead to the best macroeconomic outcomes, though they do not reduce inequality. Additional measures are needed to reduce "energy poverty". To achieve South Africa's NDC under the Paris Agreement, a carbon tax rate of around 300 ZAR2005 (55 USD2013) per tonne of CO2 is necessary. However, such a rate could have serious impacts on GDP growth, and without a change in the trend of labour productivity growth, the lower GDP would lead to higher unemployment than in the reference case. Investing 7.5 billion ZAR2005 of annual carbon tax revenue in skills, with the objective of increasing access to high-quality education and reducing the shortage of highly skilled labour, is found to have a very positive impact on GDP growth, although better calibration data are required. The findings furthermore call for a thorough examination of the type of technological change that can be expected for South Africa. Technological progress, consumer preferences and international circumstances limit the economy's capacity to restructure and decarbonise, and therefore to reduce the negative consequences of carbon taxation for GDP growth. Proper assessment of future technological change is relevant for all sectors and inputs. Examples show that energy and materials efficiency play an important role for future GDP growth under carbon constraints, because they determine the economy's flexibility to reduce energy consumption and to substitute for it, e.g. with labour; this holds not only for South Africa but also for the rest of the world. These results also imply that international climate policy has to address technology transfer and the differing potentials of national economies for deep decarbonisation.
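As a back-of-the-envelope illustration of the revenue-recycling mechanism discussed above (not of the CGE model itself), the short sketch below computes the revenue raised by a given tax rate and the uniform labour subsidy it could finance; the emissions level, wage bill and subsidy formula are assumptions for demonstration only.

```python
# Illustrative revenue-recycling arithmetic (all figures are assumptions, not thesis results).
tax_rate_zar_per_tco2 = 100.0      # carbon tax rate, ZAR per tonne of CO2
taxed_emissions_mt = 400.0         # taxed CO2 emissions, million tonnes per year (assumed)
wage_bill_billion_zar = 1200.0     # total formal-sector wage bill, billion ZAR (assumed)

revenue_billion_zar = tax_rate_zar_per_tco2 * taxed_emissions_mt * 1e6 / 1e9
labour_subsidy_rate = revenue_billion_zar / wage_bill_billion_zar

print(f"Carbon tax revenue: {revenue_billion_zar:.1f} billion ZAR/year")
print(f"Implied uniform labour subsidy: {labour_subsidy_rate:.1%} of the wage bill")
```

In the thesis the recycling scheme is chosen endogenously within the general equilibrium model; this sketch only shows the order of magnitude involved in redirecting the tax revenue towards labour.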
4

Time-series long-term forecasting for A/B tests

Jaunzems, Davis, January 2016
The technological development of computing devices and communication tools has made it possible to store and process more information than ever before. For researchers it is a means of making more accurate scientific discoveries; for companies it is a way of better understanding their clients and products and of gaining an edge over competitors. In industry, A/B testing is becoming an important and common way of obtaining insights that support data-driven decisions. An A/B test is a comparison of two or more versions to determine which performs better according to predetermined measurements. Combined with data mining and statistical analysis, these tests help answer important questions and move from a state of "we think" to "we know". Nevertheless, running bad test cases can hurt businesses and result in a poor user experience, which is why it is important to be able to forecast the long-term effects of an A/B test from short-term data. In this report, A/B tests and their forecasting are studied using univariate time-series analysis. Because of the short duration and high diversity of the series, providing accurate long-term forecasts is a great challenge. This is a quantitative and empirical study that uses a real-world data set from a social game development company, King Digital Entertainment PLC (King.com). First, the data are analysed and pre-processed through a series of steps. Since time-series forecasting has been studied for generations, an analysis and accuracy comparison of existing forecasting models, such as the mean forecast, ARIMA and artificial neural networks, is carried out. The results on the real data set are similar to what other researchers have found for long-term forecasts from short-term data. To improve forecasting accuracy, a time-series clustering method is proposed. The method exploits similarity between time series through Dynamic Time Warping and trains a separate forecasting model per cluster. Clusters are assigned with high accuracy using a Random Forest classifier, and certainty about the long-term range of a time series is obtained using historical tests and a Markov chain. The proposed method shows superior results compared with the existing models and can be used to obtain long-term forecasts for A/B tests.
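Since Dynamic Time Warping is the similarity measure underpinning the proposed clustering, a minimal dynamic-programming implementation is sketched below; the toy series and the unconstrained warping window are assumptions for illustration (a production pipeline would typically use an optimized library and a band constraint).

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D series (no window constraint)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three allowed warping moves.
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

# Two toy series with the same shape but shifted in time (assumed for illustration).
t = np.linspace(0, 2 * np.pi, 50)
s1 = np.sin(t)
s2 = np.sin(t - 0.5)
print(dtw_distance(s1, s2))    # small: DTW aligns the phase shift
print(np.abs(s1 - s2).sum())   # larger: point-wise distance penalizes the shift
```

Pairwise DTW distances of this kind can feed the clustering step, after which a Random Forest classifier assigns new short series to a cluster, as described in the abstract.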
5

Long Term Forecasting of Industrial Electricity Consumption Data With GRU, LSTM and Multiple Linear Regression

Buzatoiu, Roxana, January 2020
Accurate long-term energy consumption forecasting for industrial entities is of interest to distribution companies, as it can potentially help reduce churn and support decision making when hedging. This thesis presents different methods to forecast the energy consumption of industrial entities over a long prediction horizon of one year. Notably, it includes experiments with two variants of recurrent neural networks, the Gated Recurrent Unit (GRU) and Long Short-Term Memory (LSTM). Their performance is compared against traditional approaches, namely Multiple Linear Regression (MLR) and the Seasonal Autoregressive Integrated Moving Average (SARIMA). The investigation then focuses on tailoring the recurrent neural network model to improve performance: first on the impact of different model architectures, and second on the effect of time-related feature selection as additional input to the recurrent neural network (RNN). Specifically, it explores how traditional tools such as exploratory data analysis and autocorrelation (ACF) and partial autocorrelation (PACF) plots can contribute to the performance of the RNN model. Through an empirical study on three industrial datasets, the work shows that the GRU architecture is a powerful method for the long-term forecasting task and outperforms the LSTM in certain scenarios. Compared with the MLR model, the RNN achieved a reduction in RMSE of between 5% and 10%. The most important findings are: (i) the GRU architecture outperforms the LSTM on industrial energy consumption datasets when a lower number of hidden units is used, and on certain datasets regardless of the number of units; (ii) RNN variants yield better accuracy than statistical or regression models; (iii) using ACF and PACF as discovery tools in the feature selection process is inconclusive and inefficient when aiming for a general model; (iv) using deterministic features (such as day of the year or day of the month) has limited effect on improving the deep learning model's performance.
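To make the GRU approach concrete, here is a minimal Keras sketch of a sequence-to-sequence-window forecaster of the kind compared above; the window length, layer sizes, synthetic data and training settings are assumptions for illustration and are not the architectures tuned in the thesis.

```python
import numpy as np
import tensorflow as tf

def make_windows(series, window=168, horizon=24):
    """Turn an hourly consumption series into (input window, future horizon) pairs."""
    X, y = [], []
    for i in range(len(series) - window - horizon + 1):
        X.append(series[i:i + window])
        y.append(series[i + window:i + window + horizon])
    return np.array(X)[..., None], np.array(y)  # shapes: (samples, window, 1), (samples, horizon)

# Toy hourly consumption with daily seasonality (assumed for illustration).
hours = np.arange(24 * 365)
series = 100 + 20 * np.sin(2 * np.pi * hours / 24) + np.random.default_rng(0).normal(0, 2, hours.size)
X, y = make_windows(series)

model = tf.keras.Sequential([
    tf.keras.layers.GRU(64, input_shape=(X.shape[1], X.shape[2])),  # swap for LSTM(64, ...) to compare variants
    tf.keras.layers.Dense(y.shape[1]),                              # one output per forecast hour
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, validation_split=0.2, verbose=0)
```

Additional inputs such as calendar features or lagged aggregates would be appended as extra channels of the input window rather than as a separate model.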
6

Long-term forecasting model for future electricity consumption in French non-interconnected territories

Caron, Mathieu, January 2021
In the context of decarbonizing electricity generation in French non-interconnected territories, knowledge of future electricity demand, in particular annual and peak demand in the long term, is crucial for designing new renewable energy infrastructure. So far these territories, mainly islands located in the Pacific and Indian Oceans, rely largely on fossil-fuelled facilities. Energy policies envision a wide development of renewable energies to move towards a low-carbon electricity mix by 2028. This thesis focuses on long-term forecasting of hourly electricity demand. A methodology is developed to design and select a model able to fit historical data accurately and to forecast future demand in these particular territories. Historical data are first analyzed through a clustering analysis, based on a k-means algorithm, to identify trends and patterns. Specific calendar inputs are then designed to reflect these first observations, and external inputs such as weather data and economic and demographic variables are also included. Forecasting algorithms are selected from the literature and are then tested and compared on different input datasets. Besides the calendar and external variables mentioned, these input datasets include different numbers of lagged values, from zero to three. The combination of model and input dataset that gives the most accurate results on the testing set is selected to forecast future electricity demand. Including lagged values leads to considerable improvements in accuracy. Although gradient boosting regression features the lowest errors, it is not able to detect peaks of electricity demand correctly. In contrast, an artificial neural network (ANN) shows a great ability to fit the historical data and good accuracy on the testing set, including for peak demand prediction. The generalized additive model, a relatively new model in the energy forecasting field, gives promising results: its performance is close to that of the ANN, making it an interesting model for future research. Based on the future values of the inputs, the electricity demand of Réunion in 2028 was forecast using the ANN. Electricity demand is expected to reach more than 2.3 GWh and peak demand about 485 MW, a growth of 12.7% and 14.6% respectively compared with 2019 levels.
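As an illustration of the clustering step described above, the sketch below groups daily hourly load profiles with scikit-learn's k-means; the synthetic load series, the choice of daily profiles as clustering units and the number of clusters are assumptions for demonstration only.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic hourly demand for one year (assumed): weekday/weekend pattern plus noise.
rng = np.random.default_rng(1)
days, hours = 365, 24
base = 300 + 80 * np.sin(2 * np.pi * (np.arange(hours) - 7) / 24)  # daily load shape, MW
load = np.vstack([
    base * (1.10 if d % 7 < 5 else 0.85) + rng.normal(0, 10, hours)  # weekdays vs weekends
    for d in range(days)
])  # shape (365, 24): one row per daily profile

# Normalize each day by its mean so clusters capture shape rather than level.
profiles = load / load.mean(axis=1, keepdims=True)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(profiles)
labels = kmeans.labels_        # cluster index per day, usable as a calendar feature
print(np.bincount(labels))     # how many days fall in each typical profile
```

The resulting day labels (e.g. working-day versus weekend or holiday profiles) can then be encoded as calendar inputs, alongside lagged demand values, for the forecasting models compared in the thesis.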
