41

Classification of weather conditions based on supervised learning

Safia, Mohamad, Abbas, Rodi January 2023 (has links)
Forecasting the weather remains a challenging task because of the atmosphere's complexity and unpredictable nature. Weather conditions such as rain, clouds, clear skies, and sunshine are determined by factors including temperature, pressure, humidity, and wind speed and direction. Currently, sophisticated physical models are used to forecast weather, but they have several limitations, particularly in terms of computational time. In the past few years, supervised machine learning algorithms have shown great promise for the precise forecasting of meteorological events. Using historical weather data, these approaches train a model to predict future weather. This study employs supervised machine learning techniques, including k-nearest neighbors (KNN), support vector machines (SVM), random forests (RF), and artificial neural networks (ANN), to improve weather forecast accuracy. To conduct this study, we used historical weather data from the Weatherstack API. The data spans several years and contains information on several meteorological variables, including temperature, pressure, humidity, and wind speed and direction. The data is preprocessed, which includes normalization and division into separate training and testing sets. Finally, the effectiveness of the different models is compared to determine which produces the most accurate weather forecasts. The results of this study shed light on the application of supervised machine learning methods to weather forecasting and support the development of better weather prediction models.
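A minimal sketch of the kind of pipeline this abstract describes, assuming a tabular dataset with the listed variables and a categorical weather-condition label; the library (scikit-learn), file name, column names, and hyperparameters are illustrative assumptions, not taken from the thesis:

```python
# Hypothetical sketch: classify weather conditions with the model families named in the abstract.
# File name, column names, and hyperparameters are assumptions, not from the thesis.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

df = pd.read_csv("weatherstack_history.csv")               # assumed export of the API data
X = df[["temperature", "pressure", "humidity", "wind_speed", "wind_direction"]]
y = df["condition"]                                         # e.g. rain / cloudy / clear / sunny

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(kernel="rbf"),
    "RF":  RandomForestClassifier(n_estimators=200, random_state=0),
    "ANN": MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0),
}

for name, model in models.items():
    pipe = make_pipeline(StandardScaler(), model)           # normalization, as described
    pipe.fit(X_train, y_train)
    print(name, accuracy_score(y_test, pipe.predict(X_test)))
```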
42

Model Predictive Control for Ground Source Heat Pumps : Reducing cost while maintaining comfort

Bokne, Isak, Elf, Charlie January 2023 (has links)
Today, the control of heat pumps aims first and foremost to maintain a comfortable indoor temperature. This is primarily done by deciding input power based on the outside temperature. The cost of electricity, which can be rather volatile, is not taken into account. Electricity prices are available at an hourly rate, and since a house can store thermal energy for a duration of time, it is possible to move electricity consumption to hours when electricity is cheap. In this thesis, the strategy used in the developed controller is Model Predictive Control (MPC). It is a suitable strategy because it can incorporate an objective function designed to take the trade-off between indoor temperature and electricity cost into account. The MPC prediction horizon is dynamic, as the horizon of known electricity spot prices varies between 12 and 36 hours throughout the day. We model a residential house heated with a ground source heat pump for use in a case analysis. Sampled weather and spot price data for three different weeks are used in computer simulations. The developed MPC controller is compared with a classic heat curve controller, as well as with variations of the MPC controller, to estimate the effects of prediction and model errors. The MPC controller is found to be able to reduce the electricity cost and/or provide better comfort, and the prioritization of these factors can be changed depending on user preferences. When shifting energy consumption in time, it is necessary to store thermal energy somewhere. If the house itself is used for this purpose, variations in indoor temperature must be accepted. Further, accurate modeling of the Coefficient of Performance (COP) is essential for ground source heat pumps. The COP varies significantly depending on operating conditions, and the MPC controller must therefore have a correct perception of it. Publicly available weather forecasts are of sufficient quality to be usable for predicting the outside temperature. For future studies, it would be advantageous if better models could be developed for the prediction of global radiation. Including radiation in the MPC controller model would enable better comfort with very similar operating costs compared to when the MPC controller does not take radiation into account.
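A toy sketch of the optimization at the core of such a controller, assuming a one-state linear thermal model, hourly spot prices, and the cvxpy solver; the model coefficients, comfort band, horizon, and power limit are illustrative assumptions, not the thesis model:

```python
# Hypothetical MPC sketch: trade off electricity cost against comfort over the known price horizon.
# The thermal model T[k+1] = a*T[k] + b*P[k] + c*Tout[k] and all coefficients are assumptions.
import numpy as np
import cvxpy as cp

H = 24                                   # prediction horizon (known spot prices span 12-36 h)
price = np.random.uniform(0.3, 1.5, H)   # SEK/kWh, placeholder for hourly spot prices
t_out = np.full(H, -5.0)                 # forecast outside temperature [deg C]
a, b, c = 0.95, 0.4, 0.05                # assumed house/heat-pump dynamics
t_min, t_max, t_ref = 19.0, 23.0, 21.0   # comfort band and setpoint
comfort_weight = 5.0                     # user-tunable cost/comfort trade-off

P = cp.Variable(H, nonneg=True)          # heat pump input power [kW]
T = cp.Variable(H + 1)                   # indoor temperature [deg C]

constraints = [T[0] == 21.0, P <= 3.0]
for k in range(H):
    constraints += [T[k + 1] == a * T[k] + b * P[k] + c * t_out[k],
                    T[k + 1] >= t_min, T[k + 1] <= t_max]

cost = cp.sum(cp.multiply(price, P)) + comfort_weight * cp.sum_squares(T[1:] - t_ref)
cp.Problem(cp.Minimize(cost), constraints).solve()
print("optimal hourly input power:", np.round(P.value, 2))
```

The comfort_weight term captures the user-adjustable prioritization between cost and comfort that the abstract mentions; a real controller would also include a COP model mapping input power to delivered heat.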
43

ASSESSMENT OF WIND POWER FORECASTING ERROR FOR GOTLAND

Rengmyr, Simon January 2022 (has links)
When the wind blows and wind turbine generators harvest the kinetic energy and transform it into electrical power, there is a need to predict how much power will be dispatched from the turbines. Even the most perfect computer model with high computational power could not model the beauty of the forces of nature, and we must accept some degree of forecasting error in the predicted power output due to the inherently stochastic patterns in the atmosphere. This project set out to investigate the main reasons and factors that impact the forecasting error related to wind power assets on Gotland. From theory and the performed case study, wind speed is the strongest predictor of wind power production; to claim anything else would be severely inaccurate. The main predictors of wind power are summarized from a literature study, extracted from a weather model, and tried in a case study for the wind farm Stugylparken on Näsudden, Gotland. Three different prediction methods were tried, and the ensemble trees model was the best model according to the evaluation metrics that were chosen. The second-best performing model was the artificial neural network, and prediction by theoretical power curve performed worse than the standard machine learning methods that were tested in the study. It can be noted that the choice of model depends on how the evaluation is done and which metric is deemed most important. Besides wind speed having the most significant impact in all models, the forecasting error seems to correlate with the diurnal cycle. One reason could be land-sea interaction during the day, especially in the period April-September. Higher forecasting errors correlate strongly with periods of higher mean wind speed, and times of varying weather will impact the forecastability, so larger errors should be expected. In this project, numerical weather prediction data is used to investigate the forecasting error. A lower error can be seen in the first hours after the model run. This should be expected because that is when we are closest to the initial conditions, in other words, the real world. However, it seems that wind speed and the diurnal cycle are more significant than the performance of the numerical weather prediction model in the first 24 hours. Predicting the future power output of wind assets is expected to be even more important in the coming years due to larger installed capacity. Even with an increase in installed capacity, overcapacity is not wanted, and flexibility will become more important. There are challenges, but also opportunities, to use resources more efficiently in our society and to lower the climate impact that our society has on the planet through a more flexible use of resources.
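A hedged sketch of the comparison the abstract describes, assuming NWP features and measured production are available in one table; the feature names, turbine power-curve parameters, and use of scikit-learn are illustrative assumptions:

```python
# Hypothetical sketch: compare ensemble trees, an ANN, and a theoretical power curve
# for wind power prediction. Column names, file name, and curve parameters are assumptions.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

df = pd.read_csv("stugylparken_nwp_and_production.csv")   # assumed NWP features + measured output
X = df[["wind_speed", "wind_direction", "temperature", "hour_of_day"]]
y = df["power_output_kw"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, shuffle=False)

def power_curve(ws, rated_kw=900.0, cut_in=3.5, rated_ws=13.0, cut_out=25.0):
    """Simplified theoretical power curve: cubic ramp between cut-in and rated wind speed."""
    p = rated_kw * np.clip((ws - cut_in) / (rated_ws - cut_in), 0, 1) ** 3
    return np.where((ws < cut_in) | (ws > cut_out), 0.0, np.minimum(p, rated_kw))

models = {
    "ensemble trees": GradientBoostingRegressor(random_state=0),
    "ANN": MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0),
}
for name, m in models.items():
    m.fit(X_tr, y_tr)
    rmse = np.sqrt(mean_squared_error(y_te, m.predict(X_te)))
    print(f"{name}: RMSE = {rmse:.1f} kW")

rmse_pc = np.sqrt(mean_squared_error(y_te, power_curve(X_te["wind_speed"].to_numpy())))
print(f"theoretical power curve: RMSE = {rmse_pc:.1f} kW")
```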
44

Machine Learning for Improvement of Ocean Data Resolution for Weather Forecasting and Climatological Research

Huda, Md Nurul 18 October 2023 (has links)
Severe weather events like hurricanes and tornadoes pose major risks globally, underscoring the critical need for accurate forecasts to mitigate impacts. While advanced computational capabilities and climate models have improved predictions, the lack of high-resolution initial conditions still limits forecast accuracy. The Atlantic's "Hurricane Alley" region is where most storms arise, and it therefore needs robust in-situ ocean data plus atmospheric profiles to enable precise hurricane tracking and intensity forecasts. Examining satellite datasets reveals that radio occultation (RO) provides the most accurate atmospheric measurements at 5-25 km altitude. However, below 5 km, accuracy remains insufficient over oceans compared with land areas. Recent benchmark studies, e.g. Patil and Iiyama (2022) and Wei and Guan (2022), proposed deep learning models for sea surface temperature (SST) prediction, with low errors ranging from 0.35°C to 0.75°C in the Tohoku region and root-mean-square errors increasing from 0.27°C to 0.53°C over the China seas, respectively. The approach we have developed remains unparalleled in its domain as of this date. This research is divided into two parts and aims to develop a data-driven, satellite-informed machine learning system to combine high-quality but sparse in-situ ocean data with more readily available low-quality satellite data. In the first part of the work, a novel data-driven, satellite-informed machine learning algorithm was implemented that combines high-quality/low-coverage in-situ point ocean data (e.g. ARGO floats) and low-quality/high-coverage satellite ocean data (e.g. HYCOM, MODIS-Aqua, G-COM), generating high-resolution data with an RMSE of 0.58°C over the Atlantic Ocean. In the second part of the work, a novel GNN algorithm was implemented for the Gulf of Mexico and shown to successfully capture the complex interactions within the ocean and mimic the path of an ARGO float with an RMSE of 1.40°C. / Doctor of Philosophy / Severe storms like hurricanes and tornadoes are a major threat around the world. Accurate weather forecasts can help reduce their impacts. While climate models have improved predictions, lacking detailed initial conditions still limits forecast accuracy. The Atlantic's "Hurricane Alley" sees many storms form, needing good ocean and atmospheric data for precise hurricane tracking and strength forecasts. Studying satellite data shows radio occultation provides the most accurate measurements over oceans at 5-25 km altitude. But below 5 km, accuracy remains insufficient compared with over land. Recent research proposed using deep learning models for sea surface temperature prediction with low errors. Our approach remains unmatched in this area currently. This research has two parts. First, we developed a satellite-informed machine learning system combining limited high-quality ocean data with more available low-quality satellite data. This generated high-resolution Atlantic Ocean data with an error of 0.58°C. Second, we implemented a new algorithm for the Gulf of Mexico, successfully modeling complex ocean interactions and hurricane paths with an error of 1.40°C. Overall, this research advances hurricane forecasting by combining different data sources through innovative machine learning techniques. More accurate predictions can help better prepare communities in hurricane-prone regions.
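A minimal sketch of the first part's idea, assuming co-located satellite/in-situ match-ups exist in one table: learn a mapping from low-quality gridded satellite values to the high-quality in-situ (ARGO) measurements, then apply it wherever satellite data are available. The file name, column names, and the choice of a random forest are assumptions, not the thesis method:

```python
# Hypothetical sketch: calibrate low-quality satellite SST against sparse ARGO observations.
# File and column names are assumptions.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# One row per ARGO profile, with satellite values interpolated to the float position.
df = pd.read_csv("argo_satellite_matchups.csv")
features = ["hycom_sst", "modis_sst", "lat", "lon", "day_of_year"]
X_tr, X_te, y_tr, y_te = train_test_split(df[features], df["argo_sst"],
                                          test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_tr, y_tr)
rmse = np.sqrt(mean_squared_error(y_te, model.predict(X_te)))
print(f"hold-out RMSE vs ARGO: {rmse:.2f} degC")   # the thesis reports 0.58 degC over the Atlantic

# The trained model can then be applied to the full satellite grid to produce a
# high-resolution, in-situ-calibrated SST field.
```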
45

Design and Implementation of a Cost-Effective Sky Imager Station

Dehdari, Amirreza, Cazaubon, Tadj Anton January 2024 (has links)
Accurate and cost-effective weather prediction is crucial for various industries, yet current methods and tools are either expensive or lack real-time, local applicability. This thesis presents the development and evaluation of a cost-effective sky-imaging weather station designed to accurately track cloud cover using a combination of visual and environmental data. Our research focuses on constructing a system that utilises a single camera and image processing techniques for cloud separation. By employing colour-space filtering and modern image processing methods, we aim to enhance accuracy while minimising costs. The hardware design leverages consumer-grade components, reducing the unit cost to a fraction of existing solutions. The methodology involves an iterative design process, expert consultation, and rigorous testing to refine the prototype. We evaluate the system's performance by comparing sensor readings to METAR data and assessing accuracy. Additionally, we investigate the feasibility of using the Lifted Condensation Level as a substitute for Cloud Base Height. Our findings demonstrate that it is possible to create a sky-imaging weather station at a cost significantly lower than that of comparable products while achieving accurate cloud tracking and separation. This research contributes to the field by offering a practical, low-cost sky imager with potential applications in everyday weather preparedness, industrial forecasting, and solar energy management.
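The thesis mentions colour-space filtering for cloud separation and the Lifted Condensation Level (LCL) as a substitute for Cloud Base Height; a common sky-imager approach is a red/blue-ratio threshold, and the LCL is often estimated with Espy's approximation. Both choices below are assumptions about the method, not confirmed details of this system:

```python
# Hypothetical sketch: a red/blue-ratio cloud mask and an LCL estimate as a stand-in
# for cloud base height. The threshold and Espy's approximation are assumptions.
import numpy as np
from PIL import Image

def cloud_mask(image_path, rb_threshold=0.85):
    """Pixels whose red/blue ratio exceeds the threshold are classified as cloud."""
    rgb = np.asarray(Image.open(image_path).convert("RGB"), dtype=float)
    r, b = rgb[..., 0], rgb[..., 2]
    ratio = r / np.maximum(b, 1.0)          # avoid division by zero in dark pixels
    return ratio > rb_threshold

def lifted_condensation_level(temp_c, dewpoint_c):
    """Espy's approximation: LCL height in metres above the sensor."""
    return 125.0 * (temp_c - dewpoint_c)

mask = cloud_mask("sky_2024_05_01_1200.jpg")   # hypothetical station image
print(f"cloud cover: {100 * mask.mean():.1f} %")
print(f"estimated cloud base (LCL): {lifted_condensation_level(18.2, 11.5):.0f} m")
```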
46

Improvement of meteorological forecasts using artificial neural networks for the optimization of a building energy management system

Θραμπουλίδης, Εμμανουήλ 27 January 2014 (has links)
An important consideration in the design of modern buildings is the rational use of energy. Rational energy management is achieved by designing appropriate energy systems. For the efficient design of these systems, meteorological data should be taken into account, not only current observations but also forecasts. Numerical weather prediction models provide estimates of various meteorological parameters at given points in space near the surface and at various heights. These estimates can differ considerably from the actual data, which leaves a significant margin for improving the forecast. In this work we propose a method for improving the prediction of meteorological data in order to exploit them for optimizing energy consumption in building management systems. The method was developed using wind speed measurements from the meteorological station of the Laboratory of Atmospheric Physics of the Department of Physics of the University of Patras (LAPUP), together with LAPUP forecasts produced with the numerical weather prediction model WRF (Weather Research and Forecasting model) at the closest available grid point. The proposed method uses artificial neural networks and, being independent of the nature of its inputs, can be used to improve the forecasting of other meteorological parameters. Furthermore, we studied the contribution of the method to a more accurate calculation of the air flow of an experimental test chamber, which has been adopted by the European Commission for the harmonized study of building energy systems under real conditions.
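A hedged sketch of the post-processing step the abstract describes: a small neural network trained on station measurements to correct WRF wind-speed forecasts. The file name, feature columns, and scikit-learn MLP as a stand-in for the thesis ANN are assumptions:

```python
# Hypothetical sketch: correct WRF grid-point wind-speed forecasts with an ANN trained
# on co-located station observations. Column names and network size are assumptions.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_absolute_error

df = pd.read_csv("lapup_wrf_matchups.csv")        # WRF forecasts + station observations
X = df[["wrf_wind_speed", "wrf_wind_dir", "forecast_lead_h", "hour_of_day"]]
y = df["observed_wind_speed"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, shuffle=False)

ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(20, 10), max_iter=2000, random_state=0))
ann.fit(X_tr, y_tr)

print("raw WRF MAE:      ", mean_absolute_error(y_te, X_te["wrf_wind_speed"]))
print("ANN-corrected MAE:", mean_absolute_error(y_te, ann.predict(X_te)))
```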
47

Sensitivity of global and regional ensemble assimilation to initial conditions and lateral boundary conditions

El Ouaraini, Rachida 16 April 2016 (has links)
The implementation of ensemble assimilation methods is a fairly recent technique used to simulate the analysis and forecast errors within a data assimilation system. On the one hand, this makes it possible to estimate the spatial covariances of forecast errors, which are an essential component of data assimilation systems, insofar as they are used to filter the observed information and spread it spatially. The dependence of such error covariances on the weather situation becomes accessible with these ensemble techniques. On the other hand, ensemble assimilation is increasingly used to provide initial perturbations to ensemble prediction systems. Such an approach can be implemented not only in a system modelling the atmosphere over the whole globe, but also in a regional limited-area system, using suitable lateral boundary conditions. This thesis examines some sensitivity properties of these ensemble assimilation techniques in both contexts (global and regional, respectively). In the first part, the sensitivity of a global ensemble assimilation system to its initialization will be examined. This will be conducted by comparing a "cold" initialization technique (initial perturbations equal to zero) with a method based on initial perturbations drawn from a covariance model. In the second part, the sensitivity of a regional ensemble assimilation to lateral boundary conditions will be considered. In this context, different techniques for producing lateral boundary perturbations will be compared, namely perturbations that are equal to zero, drawn from a global ensemble, or generated using a covariance model. These sensitivity studies will be conducted using experiments with the global and regional modelling systems Arpège and Aladin, and will be supported by a formalization of the equations governing the evolution of perturbations within an ensemble assimilation. These studies should help document the properties of ensemble assimilation and define strategies for full-scale implementation, both for data assimilation and possibly for ensemble prediction.
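A minimal numpy sketch of the basic quantity an ensemble assimilation provides: a flow-dependent estimate of the background-error covariance from ensemble perturbations. The ensemble size and state dimension below are illustrative, and real systems localize the covariance rather than forming it explicitly:

```python
# Minimal sketch: estimate background-error covariances from an ensemble of perturbed forecasts.
# Ensemble size and state dimension are illustrative assumptions.
import numpy as np

n_members, n_state = 25, 100                      # e.g. 25 perturbed forecasts, 100 grid points
rng = np.random.default_rng(0)
ensemble = rng.normal(size=(n_members, n_state))  # stand-in for perturbed short-range forecasts

mean = ensemble.mean(axis=0)
perturbations = ensemble - mean                   # X' with shape (members, state)

# Sample covariance B ~ X'^T X' / (N - 1); in practice it is localized and/or used
# through its square root rather than formed explicitly.
B = perturbations.T @ perturbations / (n_members - 1)

print("estimated background-error std dev at the first points:", np.sqrt(np.diag(B))[:5])
```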
48

Modelling and assimilation of rainy microwave satellite observations in tropical systems

Guerbette, Jérémy 04 April 2016 (has links)
This thesis focuses on the use of satellite observations within cloudy and rainy areas for assimilation in numerical weather prediction models. The work has been undertaken in the context of tropical cyclone forecasting and takes advantage of the recent satellite mission MEGHA-TROPIQUES, which covers tropical regions with an unprecedented revisit time, with a focus on the humidity sounder SAPHIR at 183 GHz. We have used the numerical weather prediction model ALADIN-Réunion, operational at Météo-France since 2006, which covers a large fraction of the Indian Ocean with an 8 km horizontal resolution. The radiative transfer model RTTOV-SCATT has also been used, since it provides a good compromise between its accuracy in simulating scattering atmospheres and its computational cost. In a first step, the choice of radiative properties for solid precipitating particles was optimized in order to improve the simulation of SAPHIR brightness temperatures with the ALADIN-Réunion and RTTOV-SCATT models. Then, an inversion method for cloudy SAPHIR brightness temperatures based on a Bayesian technique was chosen in order to retrieve improved atmospheric profiles. The retrieved profiles were validated for a case study corresponding to the tropical cyclone Benilde (December 2011). Profiles of specific humidity were then introduced as new observations in the three-dimensional variational assimilation (3D-Var) system of the ALADIN-Réunion model. The capacity of the 3D-Var system to constrain the humidity analysis towards the retrieved profiles is demonstrated, together with improved short-range precipitation forecasts. On the other hand, the prediction of the tropical cyclone Benilde is degraded with these additional observations. A number of reasons are provided to explain and improve these first results. Finally, a study was carried out to prepare future evolutions of numerical weather prediction models. We examined the skill of a version of the ALADIN-Réunion model with a prognostic deep moist convection scheme in simulating the life cycle of tropical cyclone Bejisa (December 2013 - January 2014). Significant improvements were noted in both the trajectory and the intensification of this tropical system. Consistently, the simulation of SAPHIR brightness temperatures is in better agreement with observations. A fine-scale model (AROME), which describes deep moist convection explicitly at 2.5 km horizontal resolution, is planned to replace the ALADIN-Réunion model. Its ability to describe the cyclone Bejisa is demonstrated. However, it appears that the optimal choice of solid precipitating particle made for ALADIN-Réunion is not suited to the simulation of SAPHIR brightness temperatures with AROME and RTTOV-SCATT. Explanations for this inconsistency are given.
49

Potential of kilometric-resolution meteorological forecasts for snowpack modelling in mountainous terrain

Quéno, Louis 24 November 2017 (has links)
Monitoring and representing snowpack variability in mountains are crucial ecological and societal issues. The recent development of kilometric-scale meteorological models offers new potential for improving snowpack simulations in mountains. In this thesis, we assessed the potential of forecasts from the numerical weather prediction model AROME, at 2.5 km horizontal resolution, to drive the detailed snowpack model Crocus. Distributed AROME-Crocus simulations were first evaluated over the Pyrenees from 2010 to 2014. They showed benefits in representing the spatio-temporal variability of the snowpack compared with the massif-based approach of the current operational system SAFRAN-Crocus, despite an overestimation of snow depth. We then studied the added value of satellite-derived incoming-radiation products for simulating the snow cover in the French Alps and Pyrenees. These products were found to be of good quality in mountains, but their impact on the simulated snow cover was mixed. Finally, we showed how the cloud microphysics scheme of AROME, combined with Crocus, makes it possible to better predict ice formation at the surface of the snowpack due to freezing precipitation in the Pyrenees. This work paves the way for high-resolution distributed snowpack forecasting in mountains.
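To illustrate only the coupling idea (NWP forecasts driving a snowpack model), the sketch below uses a greatly simplified degree-day scheme as a stand-in; Crocus itself is a detailed multi-layer model, and the thresholds, melt factor, and synthetic forcing here are purely illustrative assumptions:

```python
# Greatly simplified, hypothetical stand-in for forcing a snowpack model with NWP forecasts:
# a degree-day scheme driven by forecast temperature and precipitation.
import numpy as np

def run_snowpack(temp_c, precip_mm, melt_factor=3.0, snow_threshold_c=1.0):
    """Accumulate and melt snow water equivalent (SWE, mm) from hourly forecast series."""
    swe = 0.0
    swe_series = []
    for t, p in zip(temp_c, precip_mm):
        if t < snow_threshold_c:
            swe += p                                        # precipitation falls as snow
        swe -= min(swe, melt_factor / 24.0 * max(t, 0.0))   # degree-day melt, hourly
        swe_series.append(swe)
    return np.array(swe_series)

# Stand-in for AROME grid-point forecasts at a mountain site.
hours = np.arange(72)
temp_forecast = 2.0 * np.sin(2 * np.pi * hours / 24) - 1.0
precip_forecast = np.where(hours % 12 == 0, 3.0, 0.0)

print(run_snowpack(temp_forecast, precip_forecast)[-1], "mm SWE after 72 h")
```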
50

Optimal predictive control of thermal storage in hollow core ventilated slab systems

Ren, Mei Juan January 1997 (has links)
The energy crisis, together with greater environmental awareness, has increased interest in the construction of low-energy buildings. Fabric thermal storage systems provide a promising approach for reducing building energy use and cost, and consequently, the emission of environmental pollutants. Hollow core ventilated slab systems are a form of fabric thermal storage system that, through the coupling of the ventilation air with the mass of the slab, are effective in utilizing the building fabric as a thermal store. However, the benefit of such systems can only be realized through the effective control of the thermal storage. This thesis investigates an optimum control strategy for hollow core ventilated slab systems that reduces the energy cost of the system without prejudicing the building occupants' thermal comfort. The controller uses the predicted ambient temperature and solar radiation, together with a model of the building, to predict the energy costs of the system and the thermal comfort conditions in the occupied space. The optimum control strategy is identified by exercising the model with a numerical optimization method, such that the energy costs are minimized without violating the building occupants' thermal comfort. The thesis describes the use of an Auto-Regressive Moving Average model to predict the ambient conditions for the next 24 hours. A building dynamic lumped-parameter thermal network model is also described, together with its validation. The implementation of a Genetic Algorithm search method for optimizing the control strategy is described, and its performance in finding an optimum solution analysed. The characteristics of the optimum schedule of control setpoints are investigated for each season, from which a simplified time-stage control strategy is derived. The effects of weather prediction errors on the optimum control strategy are investigated, and the performance of the optimum controller is analysed and compared to a conventional rule-based control strategy. The on-line implementation of the optimal predictive controller would require the accurate estimation of parameters for modelling the building, which could form part of future work.
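A hedged sketch of the prediction step the abstract describes: fitting an ARMA-type model to hourly ambient temperature and forecasting the next 24 hours. The model order, synthetic data, and use of statsmodels are assumptions, not details from the thesis:

```python
# Hypothetical sketch: ARMA-type forecast of ambient temperature for the next 24 hours.
# The order (2, 0, 1) and the synthetic history are assumptions.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Stand-in for a history of hourly ambient temperature measurements.
hours = pd.date_range("2024-01-01", periods=14 * 24, freq="h")
history = pd.Series(5 + 4 * np.sin(2 * np.pi * np.arange(len(hours)) / 24)
                    + np.random.default_rng(0).normal(0, 0.5, len(hours)), index=hours)

model = ARIMA(history, order=(2, 0, 1))        # ARMA(2, 1): d = 0 means no differencing
fitted = model.fit()
forecast_24h = fitted.forecast(steps=24)       # ambient temperature for the next 24 hours

print(forecast_24h.round(1))
# These predicted conditions would then feed the lumped-parameter building model, and a
# genetic algorithm would search the control setpoints that minimize cost within comfort limits.
```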
