1 |
Analyzing Arctic surface temperatures with Self-Organizing Maps: Influence of the map size
Mewes, Daniel; Jacobi, Ch. 26 September 2018 (has links)
We use ERA-Interim reanalysis data of 2 m temperature to perform a pattern analysis of Arctic temperatures using an artificial neural network called a Self-Organizing Map (SOM). The SOM method is used as a cluster analysis tool in which the number of clusters has to be specified by the user. SOMs of different sizes are analyzed in terms of how the size changes the representation of specific features. The results confirm that the larger the chosen SOM, the larger the root mean square error (RMSE) for that SOM, which goes along with the fact that a larger number of patterns can reproduce more specific features of the temperature field. / We used the artificial neural network Self-Organizing Map (SOM) to perform a pattern analysis of ERA-Interim reanalysis data. SOMs with different numbers of patterns were compared. The results show that SOMs with a larger number of patterns produce markedly more specific patterns than SOMs with few patterns. This is evident, among other things, from the root mean square error (RMSE) of the patterns with respect to the assigned ERA data.
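As a rough illustration of the size comparison described above, the sketch below trains SOMs of several sizes on synthetic temperature-anomaly fields and reports the RMSE between each field and its best-matching pattern. It uses the third-party MiniSom package and random data in place of ERA-Interim; the map sizes, training length and field dimensions are assumptions, and none of this is the authors' code.

```python
import numpy as np
from minisom import MiniSom  # third-party SOM implementation (assumed to be installed)

rng = np.random.default_rng(0)
# Stand-in for ERA-Interim 2 m temperature anomaly fields: n_days x n_gridpoints
n_days, n_grid = 1000, 500
data = rng.normal(0.0, 5.0, size=(n_days, n_grid))

for nx, ny in [(2, 2), (3, 3), (4, 5), (6, 6)]:
    som = MiniSom(nx, ny, input_len=n_grid, sigma=1.0, learning_rate=0.5, random_seed=1)
    som.train_random(data, 5000)
    weights = som.get_weights()                      # shape (nx, ny, n_grid)
    # Map each field to its best-matching SOM pattern and measure the mismatch
    mapped = np.array([weights[som.winner(x)] for x in data])
    rmse = np.sqrt(np.mean((data - mapped) ** 2))
    print(f"{nx}x{ny} SOM ({nx * ny} patterns): RMSE = {rmse:.2f} K")
```

The loop makes the trade-off explicit: each map size yields one RMSE, so the dependence of the quantization error on the number of patterns can be read off directly.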
|
2 |
Verifiering av WRF-modellen över Svalbard (Verification of the WRF model over Svalbard)
Waxegård, Anna January 2011 (has links)
Glaciologists have for a long time observed changes in the glaciers on Svalbard: some are shrinking in size and some are growing. Melting, with rising sea levels and potentially altered ocean circulation as a consequence, is a scenario that affects people all over the world. These changes may possibly be explained by linking the meteorological conditions in the area to larger-scale circulation changes. The meteorological conditions over Svalbard have been simulated with a regional climate model, WRF (Weather Research and Forecasting), for three domains with resolutions of 24 km, 8 km and 2.7 km. The model was tested in two versions, standard WRF with default process descriptions and WRF with process descriptions adapted to polar climate, and was driven with ERA-Interim data, a reanalysis of the global weather conditions produced by ECMWF. The WRF results were verified against observations measured by AWS (Automatic Weather Station) stations. The following parameters are included in the study: temperature, wind speed, specific humidity, incoming and outgoing shortwave radiation, and incoming longwave radiation. Simulations with standard WRF underestimate all radiation parameters. An incorrect radiation balance leads standard WRF to simulate temperatures that are too low. The underestimation of incoming shortwave and longwave radiation is probably because standard WRF simulates too much high cloud and too little low cloud. When the results of downscaling from 24 km to 8 km with standard WRF are analysed, the correlation increases for wind speed and decreases for incoming longwave radiation. The best correlation for wind simulations is obtained with standard WRF at 8 km resolution. For temperature, ERA-Interim gives better correlation with observations than simulations with standard WRF. A test of the polar-optimised WRF shows that this configuration of the model better predicts the radiation balance over the glaciers and, as a consequence, produces temperature simulations that agree better with observations. The polar-optimised WRF simulates less high cloud and more low cloud than standard WRF. Better cloud modelling, combined with a more suitable scheme describing the amount of shortwave radiation, gives an improved energy balance. Wind simulations at 2.7 km resolution performed with standard WRF and polar-optimised WRF show reduced correlation and increased scatter compared with simulations at 8 km resolution. This report shows that the polar-optimised WRF is a better alternative than standard WRF when simulating Svalbard's meteorological parameters.
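A minimal sketch of the kind of point verification described above: bias, RMSE and correlation of a modelled series against an AWS record. The hourly series below are synthetic, and the offsets are only placeholders chosen to mimic the reported behaviour (a cold standard-WRF run, a better polar-optimised run); they are not results from the thesis.

```python
import numpy as np

def verify(model, obs):
    """Basic verification scores for model output against station observations."""
    bias = np.mean(model - obs)
    rmse = np.sqrt(np.mean((model - obs) ** 2))
    corr = np.corrcoef(model, obs)[0, 1]
    return bias, rmse, corr

# Hypothetical hourly 2 m temperature series (deg C) from one AWS site and two WRF set-ups
rng = np.random.default_rng(42)
obs = rng.normal(-8.0, 4.0, size=720)
wrf_standard = obs - 2.5 + rng.normal(0.0, 2.0, size=720)   # assumed cold bias
wrf_polar = obs - 0.5 + rng.normal(0.0, 1.5, size=720)      # assumed smaller bias

for name, run in [("standard WRF", wrf_standard), ("polar WRF", wrf_polar)]:
    b, r, c = verify(run, obs)
    print(f"{name}: bias={b:+.2f} C  rmse={r:.2f} C  corr={c:.2f}")
```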
|
3 |
An Analysis of Using Error Metrics to Determine the Accuracy of Modeled Historical Streamflow on a Global Scale
Jackson, Elise Katherine 01 April 2018 (has links)
Streamflow data is used throughout the world in applications such as flooding, agriculture, and urban planning. Understanding daily and seasonal patterns in streamflow is important for decision makers, so that they can accurately predict and react to seasonal changes in streamflow for the region. This understanding of daily and seasonal patterns has historically been achieved through interpretation of observed historical data at stream reaches throughout the individual regions. Developing countries have limited and sporadic observed stream and rain gage data, making it difficult for stakeholders to manage their water resources to their fullest potential. In areas where observed historical data is not readily available, the European Reanalysis Interim (ERA-Interim) data provided by the European Centre for Medium-Range Weather Forecasts (ECMWF) can be used as a surrogate. The ERA-Interim data can be compared to historical observed flow to determine its accuracy using statistical measures such as the correlation coefficient, the mean difference, the root mean square error, R2 coefficients and spectral angle metrics. These statistical measures capture different aspects of the predicted data's accuracy: correlation, errors in magnitude, errors in timing, and errors in shape. This thesis presents a suite of tests that can be used to determine the accuracy and correlation of the ERA-Interim data compared to the observed data, the accuracy of the ERA-Interim data in capturing the overall events, and the accuracy of the data in capturing the magnitude of events. From these tests, and the cases presented in this thesis, we conclude that ERA-Interim is a sufficient model for simulating historical data on a global scale. It is able to capture the seasonality of the historical data, the magnitude of the events, and the overall timing of the events sufficiently well to be used as a surrogate dataset. The suite of tests can also be applied to other applications, making the comparison of two streamflow datasets a quicker and easier process.
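The metric suite named above (correlation, mean difference, RMSE, an R2 coefficient and a spectral angle) can be assembled along the following lines. This is a hedged sketch on synthetic daily flows: the R2 here is the coefficient-of-determination form and the spectral angle is the plain vector angle between the two series, while the exact definitions used in the thesis may differ.

```python
import numpy as np

def error_metrics(sim, obs):
    """Suite of metrics comparing simulated (e.g. reanalysis-derived) and observed streamflow."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    corr = np.corrcoef(sim, obs)[0, 1]                      # linear correlation
    mean_diff = np.mean(sim - obs)                          # bias in magnitude
    rmse = np.sqrt(np.mean((sim - obs) ** 2))               # error magnitude
    ss_res = np.sum((obs - sim) ** 2)
    ss_tot = np.sum((obs - obs.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot                              # fraction of variance captured
    spectral_angle = np.arccos(                             # mismatch in overall shape (radians)
        np.dot(sim, obs) / (np.linalg.norm(sim) * np.linalg.norm(obs)))
    return dict(corr=corr, mean_diff=mean_diff, rmse=rmse, r2=r2, spectral_angle=spectral_angle)

# Hypothetical daily flows (m^3/s): a gauge record and a reanalysis-driven simulation
rng = np.random.default_rng(1)
t = np.arange(365)
obs = 50 + 30 * np.sin(2 * np.pi * t / 365) + rng.gamma(2.0, 5.0, size=365)
sim = 0.9 * obs + rng.normal(0.0, 8.0, size=365)
print(error_metrics(sim, obs))
```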
|
4 |
A climatological study of Clear Air Turbulence over the North Atlantic / En klimatologisk studie av Clear Air Turbulence över Nordatlanten
Lee, Leon January 2013 (has links)
Clear Air Turbulence (CAT) is the turbulence experienced at high altitude on board an aircraft. The main mechanisms for its generation are often said to be Kelvin-Helmholtz instability and mountain waves. CAT is an issue for the aviation industry in the sense that its magnitude and exact location are hard to predict. Mostly, it is just a nuisance for the crew and passengers, but occasionally it causes serious injuries and aircraft damage. It also prevents air-to-air refuelling from being conducted in a safe manner. The microscale nature of CAT makes it necessary to describe it with turbulence indices. The first part of this study presents a verification of two commonly used turbulence indices, TI1 and TI2, developed by Ellrod and Knapp in 1992. The verification is done with AMDAR (Aircraft Meteorological Data Relay) reports and indices computed from ERA-Interim data. The second part presents a 33-year climatology of the indices used to describe CAT. Results show that the index TI1 is generally the better of the two based on hit rate, while TI2 performs better based on false alarm rate. The climatology suggests that CAT is more frequent along the northern east coast of the U.S., over the island of Newfoundland and east of Greenland. In the vertical, CAT seems to occur most frequently at the 225 hPa level but also occurs frequently at the 300 hPa level in the aforementioned areas. Based on AMDAR reports from 2011, only 0.014% of the reports were positive turbulence observations. The low number of reports suggests that CAT can be avoided effectively with current CAT prediction skill and flight planning. / Clear Air Turbulence (CAT) is the turbulence at high altitude experienced on board aircraft, and its cause is often said to be Kelvin-Helmholtz instability and lee waves. Because of the difficulty of predicting its strength and exact position, CAT is a problem in aviation. Often CAT is only a nuisance for crew and passengers, but it can occasionally cause personal injuries and aircraft damage. The microscale structure of CAT makes it necessary to describe it with turbulence indices. The first part of this study addresses the reliability of two commonly used turbulence indices, TI1 and TI2, developed by Ellrod and Knapp in 1992. The verification is done with AMDAR (Aircraft Meteorological Data Relay) reports and turbulence indices computed with data from ERA-Interim. The second part consists of a 33-year climatological study of CAT based on these indices. Based on hit rate, TI1 generally performs better than TI2, but TI2 performs better than TI1 in terms of false alarm rate. The climatological study suggests that CAT is more frequent over the northern US east coast, over Newfoundland and east of Greenland. In the vertical, CAT appears to occur most frequently around the 225 hPa level, but also around the 300 hPa level over the geographical areas mentioned above. AMDAR reports from 2011 show that only 0.014% of the reports contained turbulence observations. The low proportion suggests that CAT over the North Atlantic can be avoided effectively with the industry's current ability to predict CAT and good flight planning. / Research on a CRuiser Enabled Air Transport Environment, RECREATE
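For reference, the Ellrod and Knapp (1992) indices combine vertical wind shear with horizontal deformation (TI1) and additionally convergence (TI2). The sketch below evaluates them on synthetic wind fields with plain finite differences; the grid spacing, level choice and random fields are illustrative assumptions, not the set-up used in the thesis.

```python
import numpy as np

def ellrod_indices(u_lo, v_lo, u_hi, v_hi, dz, dx, dy):
    """Ellrod & Knapp (1992) turbulence indices on a regular grid.

    u/v at a lower and an upper level (m/s), dz layer thickness (m),
    dx/dy grid spacing (m). Returns TI1 and TI2 in s^-2.
    """
    u, v = 0.5 * (u_lo + u_hi), 0.5 * (v_lo + v_hi)     # layer-mean wind
    du_dy, du_dx = np.gradient(u, dy, dx)               # axis 0 is y, axis 1 is x
    dv_dy, dv_dx = np.gradient(v, dy, dx)
    vws = np.sqrt((u_hi - u_lo) ** 2 + (v_hi - v_lo) ** 2) / dz   # vertical wind shear
    dst = du_dx - dv_dy                                  # stretching deformation
    dsh = dv_dx + du_dy                                  # shearing deformation
    deform = np.sqrt(dst ** 2 + dsh ** 2)
    cvg = -(du_dx + dv_dy)                               # convergence
    ti1 = vws * deform
    ti2 = vws * (deform + cvg)
    return ti1, ti2

# Hypothetical 25 km-spaced wind fields between 300 hPa and 250 hPa (~1.5 km apart)
rng = np.random.default_rng(7)
shape = (50, 60)
u300, v300 = 30 + 5 * rng.standard_normal(shape), 5 * rng.standard_normal(shape)
u250, v250 = u300 + 8 + rng.standard_normal(shape), v300 + rng.standard_normal(shape)
ti1, ti2 = ellrod_indices(u300, v300, u250, v250, dz=1500.0, dx=25000.0, dy=25000.0)
print(ti1.mean(), ti2.mean())
```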
|
5 |
Observation et simulation de la température de surface en Antarctique : application à l'estimation de la densité superficielle de la neige / Observation and simulation of surface temperature in Antarctica: application to snow surface density estimation
Fréville, Hélène 24 November 2015 (has links)
The situation in Antarctica is complex. A poorly known and geographically isolated continent, the processes that control its mass balance and energy balance are still poorly understood. In this context, the study of surface temperature is receiving growing interest from the scientific community. Indeed, by strongly controlling the snow temperature down to tens, or even hundreds, of metres below the surface, the surface temperature influences the thermal state of the ice sheet of the Antarctic Plateau, its dynamics and, consequently, its mass balance. Moreover, by acting on the emission of thermal infrared fluxes and on the turbulent fluxes of sensible and latent heat, the surface temperature is directly linked to the surface energy budget of the Antarctic Plateau. The analysis of surface temperature and the study of the physical processes behind its variability contribute to a better understanding of the surface energy budget, a necessary step to determine the current state of the ice sheet and to make predictions about its potential contribution to sea level rise. This thesis contributes to this effort by focusing on the diurnal cycle of surface temperature and on the different factors contributing to its spatial and temporal variability on the Antarctic Plateau. It begins with an evaluation of different datasets between 2000 and 2012, showing the good potential of the MODIS surface temperature, which can therefore be used as a reference for the evaluation of models and reanalyses. A systematic warm bias of 3 to 6°C in the ERA-Interim reanalysis of surface temperature is thus identified over the Antarctic Plateau. The observation of the diurnal cycle of surface temperature made it possible to identify surface density among the factors of its variability. In the topmost centimetres of the snowpack, where most of the mass and energy exchanges between the atmosphere and the Antarctic ice sheet are concentrated, snow density is a crucial quantity because it affects the absorption of solar radiation within the snowpack as well as its thermal conductivity, and therefore the propagation of heat between the surface and the deeper layers. The surface density of the snow is, however, subject to many uncertainties regarding its spatio-temporal variability and the processes that control it. Furthermore, since it can only be measured in situ, surface density data in Antarctica are geographically restricted. This thesis explores a new application of surface temperature consisting of estimating the surface density of the snow through an inversion of numerical simulations. A map of surface density in Antarctica could thus be produced by minimising the simulation error on the diurnal amplitude. / The Antarctic ice sheet is a key element in the climate system and an archive of past climate variations. However, given the scarcity of observations due to the geographical remoteness of Antarctica and its harsh conditions, little is known about the processes that control its mass and energy balance. In this context, several studies focus on the surface temperature, which controls the snow temperature down to tens, if not hundreds, of meters beneath the surface. It also influences the thermal state of the Antarctic ice sheet, its dynamics, and thus its mass balance.
Surface temperature is also directly linked to the surface energy balance through its impact on thermal infrared emission and surface turbulent heat fluxes. Thus, surface temperature analysis and the study of the physical processes that control surface temperature variability contribute to a better understanding of the surface energy balance, which is a necessary step towards identifying the actual state of the Antarctic ice sheet and forecasting its impact on sea level rise. This thesis contributes to this effort by focusing on the surface temperature diurnal cycle and the various factors affecting spatial and temporal surface temperature variability on the Antarctic Plateau. First, an evaluation of MODIS data, done by comparison with in situ measurements, shows the great potential of MODIS for observing the surface temperature of the Antarctic Plateau under clear-sky conditions. Hourly MODIS surface temperature data from 2000 to 2011 were then used to evaluate the accuracy of snow surface temperature in the ERA-Interim reanalysis and the temperature produced by a stand-alone simulation with the Crocus snowpack model using ERA-Interim forcing. This reveals that ERA-Interim has a widespread warm bias on the Antarctic Plateau ranging from +3 to +6°C depending on the location. Afterwards, observations of the surface temperature diurnal cycle allow an identification of the surface density as a factor in surface temperature variability. In the topmost centimeters of the snowpack, where most mass and energy exchanges between the surface and the atmosphere happen, density is critical for the energy budget because it impacts both the effective thermal conductivity and the penetration depth of light. However, there are considerable uncertainties about the spatio-temporal variability of surface density and the processes that control it. Besides, since surface density can only be measured in situ, surface density measurements in Antarctica are restricted to limited geographical areas. Thus, this thesis also explores a new application of surface temperature by estimating surface density in Antarctica based on the monotonic relation between surface density and the diurnal amplitude of surface temperature. A map of surface density is obtained by minimising the simulation error related to the diurnal amplitude of the surface temperature.
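The inversion idea, estimating surface density from the diurnal amplitude of surface temperature, can be sketched as a one-dimensional look-up: simulate the amplitude for a range of candidate densities and keep the density that minimises the mismatch with the observed amplitude. The toy forward model below (a Yen-type snow conductivity and a simple damping by thermal effusivity) merely stands in for the Crocus simulations used in the thesis; the function toy_amplitude, its constants and the example amplitudes are all hypothetical.

```python
import numpy as np

def toy_amplitude(density, forcing_amp=12.0):
    """Toy stand-in for a simulated diurnal surface-temperature amplitude (K).

    Higher density -> larger thermal effusivity -> more damping of the diurnal cycle.
    This is NOT the thesis' forward model, only a monotonic proxy for illustration.
    """
    k = 2.22 * (density / 1000.0) ** 1.88          # rough snow conductivity, W/m/K (assumption)
    c = 2090.0                                     # heat capacity of ice, J/kg/K
    effusivity = np.sqrt(k * density * c)
    return forcing_amp / (1.0 + effusivity / 150.0)

def invert_density(observed_amp, candidates=np.arange(100.0, 450.0, 5.0)):
    """Pick the surface density whose simulated amplitude best matches the observation."""
    errors = np.abs(toy_amplitude(candidates) - observed_amp)
    return candidates[np.argmin(errors)]

# Hypothetical observed diurnal amplitudes (K) at three plateau sites
for amp in (7.5, 5.0, 3.0):
    print(f"amplitude {amp} K -> density ~ {invert_density(amp):.0f} kg/m3")
```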
|
6 |
Extreme value analysis of non-stationary time series: Quantifying climate change using observational data throughout Germany
Müller, Philipp 11 March 2019 (has links)
The overall subject of this thesis is the massively parallel application of extreme value analysis (EVA) to climatological time series. In this branch of statistics one strives to learn about the tails of a distribution and its upper quantiles, like the so-called 50 year return level, an event realized on average only once during its return period of 50 years. Since most studies focus only on average statistics, while it is the extreme events that have the biggest impact on our lives, such an analysis is key to a proper understanding of climate change. In this approach a time series is separated into blocks, whose maxima can be described using the generalized extreme value (GEV) distribution for sufficiently large block sizes.
Unfortunately, the estimation of its parameters is not possible on a massively parallel scale with any available software package, since they are all affected by conceptual problems in the maximum likelihood fit. Both the logarithms in the negative log-likelihood of the GEV distribution and the theoretical limitations on one of its parameters give rise to regions in the parameter space that are inaccessible to the optimization routines, causing them to produce numerical artifacts. I resolved this issue by incorporating all constraints into the optimization using the augmented Lagrangian method. With my implementation in the open source package **climex** it is now possible to analyze large climatological data sets. In this thesis I used temperature and precipitation data from measurement stations provided by the German weather service (DWD) and the ERA-Interim reanalysis data set, and analyzed them using both a qualitative method based on time windows and a more quantitative one relying on the class of vector generalized linear models (VGLM).
Due to climate change, a general shift of the temperature towards higher values, and thus more hot and fewer cold extremes, would be expected. Indeed, I found the location parameters of the GEV distributions, which can be thought of as the mean event size at a return period of approximately the block size of one year, to increase for both the daily maximum and minimum temperatures. But the overall changes are far more complex and depend on the geographical location as well as the considered return period, which is quite unexpected. For example, for the 100 year return levels of the daily maximum temperatures a decrease was found in the east and the center of Germany for both the raw series and their anomalies, as well as a quite strong reduction for the raw series in the very south of Germany.
The VGLM-based non-stationary EVA resulted in significant trends in the GEV parameters for the daily maximum temperatures of almost all stations, and for about half of them in the case of the daily minima. So there is statistically sound evidence for a change in the extreme temperatures and, surprisingly, it is not exclusively towards higher values. The analysis yielded several significant trends featuring a negative slope in the 10 year return levels.
The analysis of the temperature data of the ERA-Interim reanalysis data set yielded quite surprising results too. While in some parts of the globe, especially on land, the 10 year return levels were found to increase, they in general decrease in most parts of the earth and almost entirely over the sea. But since we found a huge discrepancy between the results of the analysis using the station data within Germany and the results obtained for the corresponding grid points of the reanalysis data set, we cannot be sure whether the patterns in the return levels of the ERA-Interim data are trustworthy. / The aim of this work is the massively parallel application of extreme value analysis (EVA) to climatological time series. This branch of statistics deals with the tails of probability distributions and their large quantiles, such as the so-called 50-year return level, an event that is realised on average only once within its return period of 50 years. Since the majority of scientific studies rely on the analysis of averaged statistical quantities, while it is the extreme events that substantially affect our lives, such an EVA is crucial for a comprehensive understanding of climate change. In extreme value analysis, a time series is split into individual blocks whose maxima, for sufficiently large block lengths, can be described by the generalized extreme value (GEV) distribution.
Estimating its parameters on such massively parallel scales is, however, not possible with any of the available software packages, since they are all affected by the same conceptual problem of the maximum likelihood method. Both the logarithms in the negative log-likelihood of the GEV distribution and the theoretical restrictions on the range of one of its parameters make parts of the parameter space inaccessible to the optimization algorithm and cause the routine to produce numerical artifacts. I solved this problem by incorporating the constraints into the optimization using the augmented Lagrangian method. With the improved fit provided in the open source package **climex**, it is now possible to treat an arbitrary number of time series in one parallel analysis. In this work I use temperature and precipitation time series of the German weather service (DWD) and the ERA-Interim reanalysis data set in combination with both a qualitative analysis based on time windows and a quantitative one based on the model class of vector generalized linear models (VGLM).
Because of climate change, one would intuitively expect a shift of the temperature distribution towards higher values and thus more hot and fewer cold temperature extremes. Indeed, for the daily maximum and minimum temperatures I found an increase of the location parameter, which can be thought of as the mean event size for a return period equal to the block length of one year. On the whole, however, the changes are considerably more complex and depend on both the location and the return period. For example, the 100-year return levels of the daily maximum temperatures decrease in the east and in the centre of Germany for both the raw time series and their anomalies, and show a particularly strong reduction in the south of the country for the processed series.
With the VGLM-based non-stationary EVA I was able to show that almost all stations for the daily maximum temperatures, as well as about half of all stations for the daily minimum temperatures, exhibit significant trends in the parameters of the GEV distribution. I was thus able to find statistically sound evidence for changes in the extreme temperatures, which, however, did not consist exclusively of a shift towards higher values. Several stations showed a negative trend in their 10-year return levels.
The analysis of the temperature time series of the ERA-Interim reanalysis data set also produced surprising results. While the 10-year return levels rise in some parts of the world, mainly over land, their value decreases for the majority of the time series and almost everywhere over the oceans. However, since there is a large discrepancy between the results for the DWD station data and the corresponding grid points of the ERA-Interim data set, it could not be conclusively determined to what extent the results of the gridded analysis reflect nature.
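A compact sketch of the constrained GEV fit and the return-level calculation discussed above. The support constraints (sigma > 0 and 1 + xi*(x - mu)/sigma > 0) are handled here with SciPy's SLSQP as a simple stand-in for the augmented Lagrangian approach implemented in **climex**; the synthetic annual maxima, the starting values and the penalty value are assumptions for illustration.

```python
import numpy as np
from scipy import optimize, stats

def gev_nll(params, x):
    """Negative log-likelihood of the GEV (xi != 0 branch) for block maxima x."""
    mu, sigma, xi = params
    if sigma <= 0:
        return 1e10                                # outside the support: large penalty
    t = 1.0 + xi * (x - mu) / sigma
    if np.any(t <= 0):
        return 1e10
    return (len(x) * np.log(sigma)
            + (1.0 + 1.0 / xi) * np.sum(np.log(t))
            + np.sum(t ** (-1.0 / xi)))

def fit_gev(x):
    """Constrained maximum likelihood fit of (mu, sigma, xi)."""
    start = (np.mean(x), np.std(x), 0.1)
    cons = [{"type": "ineq", "fun": lambda p: p[1] - 1e-6},                          # sigma > 0
            {"type": "ineq",
             "fun": lambda p: np.min(1.0 + p[2] * (x - p[0]) / p[1]) - 1e-6}]        # support
    res = optimize.minimize(gev_nll, start, args=(x,), method="SLSQP", constraints=cons)
    return res.x

def return_level(mu, sigma, xi, period):
    """Return level for a given return period (in blocks, e.g. years)."""
    y = -np.log(1.0 - 1.0 / period)
    return mu + sigma / xi * (y ** (-xi) - 1.0)

# Synthetic annual maxima of daily maximum temperature (deg C) at a hypothetical station
x = stats.genextreme.rvs(c=-0.1, loc=33.0, scale=1.8, size=60, random_state=3)  # scipy's c = -xi
mu, sigma, xi = fit_gev(x)
print("mu=%.2f sigma=%.2f xi=%.2f" % (mu, sigma, xi))
print("10-year return level: %.2f" % return_level(mu, sigma, xi, 10))
print("100-year return level: %.2f" % return_level(mu, sigma, xi, 100))
```

The explicit inequality constraints keep the optimizer out of the region where the log arguments become non-positive, which is exactly the failure mode of an unconstrained fit described in the abstract.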
|
7 |
A new way to quantify stratosphere-troposphere coupling in observations and climate models
Clemo, Thomas Daniel January 2017 (has links)
Atmospheric mass is transported in and out of the stratospheric polar cap region by a wave-driven meridional circulation. Using composites of polar cap pressure anomalies, defined as deviations from the average annual cycle, it is shown that this stratospheric mass flux is accompanied by a similar mass flux near the surface. This 'tropospheric amplification' of the stratospheric signal is introduced as a new way to quantify stratosphere-troposphere coupling. Regression analysis is used to create a vertical profile of atmospheric pressure during a tropospheric amplification event, and the regression slope profile is used as a tool to quantify the amplification. Using data from 5 reanalysis datasets and 11 climate models, it is shown that high-top models, with a model lid of above 1 hPa, are significantly better at reproducing tropospheric amplification than low-top models, due to having more detailed parameterisations of stratospheric processes. However, the regression slope profiles of all models, bar one, are significantly different to the profile of reanalysis data at a 95% confidence level. Tropospheric amplification is also investigated in historical and future simulations from these models, and it is concluded that there is not expected to be a large change in the phenomenon over the next 100 years. The processes needed to reproduce tropospheric amplification can be identified by comparing idealised models of different complexity. A simple dry-core model is not able to reproduce tropospheric amplification, while a model with a comprehensive radiation scheme does produce the basic regression slope profile under certain configurations. The associations between pressure change and mass flux are further investigated using primitive equations. It is found that vertical and horizontal contributions to mass flux act to mostly cancel each other out, leaving a poorly-conditioned residual, and that the horizontal mass flux across the polar cap boundary has both geostrophic and ageostrophic components.
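One way to picture the regression-slope profile described above is sketched below: synthetic polar-cap-mean pressure anomalies are generated per pressure level with a stratospheric signal that weakens towards the surface, and each level is regressed on a 50 hPa index. The data, the choice of 50 hPa as the reference level and the downward decay are illustrative assumptions, not the definitions used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(5)
n_time = 3650
levels = np.array([10, 30, 50, 100, 200, 300, 500, 700, 850, 1000])  # hPa

# Hypothetical daily polar-cap-mean pressure anomalies per level: a shared
# stratospheric signal whose imprint weakens towards the surface, plus noise.
signal = np.cumsum(rng.normal(0, 1, n_time))
signal -= signal.mean()
coupling = np.exp(-np.arange(len(levels)) / 4.0)          # amplification decays downwards
anom = np.outer(signal, coupling) + rng.normal(0, 2.0, (n_time, len(levels)))

ref = anom[:, levels.tolist().index(50)]                  # 50 hPa polar-cap anomaly as index
slope = np.array([np.polyfit(ref, anom[:, k], 1)[0] for k in range(len(levels))])
for p, s in zip(levels, slope):
    print(f"{p:5d} hPa  regression slope vs 50 hPa index: {s:.2f}")
```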
|
8 |
Padrão espaço temporal dos componentes do balanço de energia em clima subtropical úmido (Spatio-temporal pattern of the energy balance components in a humid subtropical climate)
Schirmbeck, Juliano January 2017 (has links)
Resumo: Considering the importance of understanding the spatio-temporal dynamics of the energy balance (EB) components at a regional scale for water resources management and agricultural management, the main objective of this thesis was to build and analyse a time series of the EB components suited to the humid subtropical climate conditions of the State of Rio Grande do Sul. To this end, the suitability of EB estimation models for the state was first evaluated. In this step, MODIS products and reference data measured at a micrometeorological tower installed in Cruz Alta - RS were used, with instantaneous values for a study period from 2009 to 2011. Next, the ability of the models to represent the spatial variability of the EB components was evaluated. In this step, MODIS products, ERA-Interim reanalysis data, reference data from the micrometeorological tower and data from INMET meteorological stations were used for the same study period. In the last stage of the work, the time series of the EB components was built using the METRIC model, covering a period of 14 years, from 2002 to 2016. The results showed that the three models analysed are consistent with the reference measurements, with the greatest limitations presented by the SEBAL model, attributed mainly to the eco-climatic conditions of the state and to the low spatial resolution of the images. In the analysis of spatial variability, the METRIC model showed the greatest consistency in the results and provided the largest number of days with valid results, and was therefore identified as the most suitable for the remainder of the study. The time series that was built made it possible to understand the spatio-temporal distribution patterns of the EB components in the state of Rio Grande do Sul. There is a marked seasonality in the EB components, with higher values in summer and lower values in winter. G (soil heat flux) is the component of smallest magnitude, and its spatial and temporal distribution is determined by the distribution of Rn (net radiation). The components LE (latent heat flux) and H (sensible heat flux) are those of largest magnitude and show spatial and temporal distribution patterns consistent with the climatic conditions and with the land use and land cover types in the study area. An inverse pattern is observed, with a gradient of LE from northwest to southeast and of H from southeast to northwest. This information is of great importance for water resources management at regional scale and for agricultural zoning studies. / Abstract: Given the importance of understanding the temporal and spatial dynamics of the energy balance (EB) components at a regional scale for water resources management and agriculture, the main objective of this thesis was to construct and analyze a time series of the EB components appropriate to the humid subtropical climate conditions of the State of Rio Grande do Sul. To reach this objective, the adequacy of the models for the humid climate conditions was first evaluated; in this step we used MODIS data and reference data measured at a micrometeorological tower installed in Cruz Alta - RS. The analyses were performed with instantaneous values and the study period was from 2009 to 2011.
The next step evaluated the spatial variability of the EB components; the data used were the MODIS products, ERA-Interim reanalysis data, reference data from the micrometeorological tower and INMET meteorological stations, for the same study period. In the last stage, the time series of the EB components was constructed with the METRIC model, covering a period of 14 years, from 2002 to 2016. The results showed that the three models analyzed were consistent with the reference measurements, with the greatest limitations presented by the SEBAL model, which are mainly attributed to the state's eco-climatic conditions and the low spatial resolution of the images. In the analysis of the spatial variability, the METRIC model presented greater consistency in the results and provided a greater number of days with valid results, and was thus indicated as the most suitable for the rest of the study. The time series constructed allowed us to understand the spatial and temporal distribution patterns of the EB components in the state of Rio Grande do Sul. There is a marked seasonality in the EB components, with higher values in summer and lower in winter. G (soil heat flux) is the smallest-magnitude component and its spatial and temporal distribution is determined by the Rn (net radiation) distribution. The LE (latent heat flux) and H (sensible heat flux) components are those of higher magnitude and present spatial and temporal distribution patterns consistent with the climatic conditions and the types of land use and cover in the study area. An inverse pattern is observed, with a LE gradient from northwest to southeast and an H gradient from southeast to northwest.
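Residual-based models such as METRIC obtain the latent heat flux as the residual of the surface energy balance, LE = Rn - G - H. The sketch below applies that closure to three hypothetical pixels at satellite overpass time; the G = 0.1 Rn fraction and the flux values are placeholders, not values from this study.

```python
import numpy as np

def energy_balance_residual(rn, g, h):
    """Latent heat flux (W/m2) as the residual of the surface energy balance: LE = Rn - G - H."""
    return rn - g - h

# Hypothetical instantaneous values at satellite overpass for three pixels
rn = np.array([620.0, 540.0, 480.0])   # net radiation, W/m2
g = 0.1 * rn                           # soil heat flux as a simple fraction of Rn (assumption)
h = np.array([150.0, 260.0, 90.0])     # sensible heat flux, W/m2
le = energy_balance_residual(rn, g, h)
print("LE =", le, "W/m2")
print("evaporative fraction =", le / (rn - g))
```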
|
9 |
Definitions of Clear-sky Fluxes and Implications
Verma, Abhishek December 2011 (has links)
Clear-sky top-of-atmosphere (TOA) fluxes are important in estimating the impact of clouds on our climate. In this study, we quantitatively compare the clear-sky flux measurements of the Clouds and the Earth's Radiant Energy System (CERES) instrument to clear-sky fluxes from two reanalyses, NASA's Modern-Era Retrospective analysis for Research and Applications (MERRA) and the European Centre for Medium-Range Weather Forecasts Interim reanalysis (ERA-Interim). In the first comparison, we compare observed fluxes from individual cloud-free fields of view to the reanalyses. In the second comparison, we compare monthly averaged observed clear-sky fluxes to those from the reanalyses. Monthly clear-sky fluxes are calculated by averaging fluxes from cloud-free regions.
In both comparisons, the fluxes generally agree within +/- 10 W/m^2. Finally, we show that, while the differences between the observed and reanalysis fluxes are several W/m^2, the inter-annual anomalies agree much better, with zonal and global average inter-annual anomalies typically agreeing within 1 W/m^2. The longwave clear-sky anomalies show excellent agreement even when comparing individual grid points, whereas the shortwave clear-sky anomalies are generally smaller at individual grid points.
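The distinction drawn above, mean fluxes differing by several W/m^2 while inter-annual anomalies agree to within about 1 W/m^2, can be reproduced schematically: remove each series' mean annual cycle and compare what remains. The series below are synthetic global means, and the offset and noise levels are assumptions chosen only to mimic that behaviour.

```python
import numpy as np

def interannual_anomaly(monthly, n_years):
    """Remove the mean annual cycle from a (n_years * 12,) monthly series."""
    m = monthly.reshape(n_years, 12)
    return (m - m.mean(axis=0)).ravel()

# Hypothetical global-mean clear-sky OLR (W/m2): an observation and a reanalysis with a
# constant offset of a few W/m2 but a very similar inter-annual signal.
rng = np.random.default_rng(11)
n_years = 10
cycle = 5.0 * np.sin(2 * np.pi * np.arange(12) / 12)
interannual = np.repeat(rng.normal(0, 0.8, n_years), 12)
ceres = 265.0 + np.tile(cycle, n_years) + interannual + rng.normal(0, 0.3, n_years * 12)
reana = ceres + 4.0 + rng.normal(0, 0.3, n_years * 12)     # several W/m2 mean difference

print("mean difference:", np.mean(reana - ceres))           # order of a few W/m2
a_obs = interannual_anomaly(ceres, n_years)
a_rea = interannual_anomaly(reana, n_years)
print("RMS anomaly difference:", np.sqrt(np.mean((a_rea - a_obs) ** 2)))  # much smaller
```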
|
10 |
Wind Speed Prediction using Global and Regional Based Virtual Towers in CFD Simulations
Moubarak, Roger January 2011 (has links)
Wind farm assessment is a costly and time-consuming process when it is carried out with traditional methods such as a met mast. Therefore, new models have been established and used for wind farm assessment to ease the process of wind farm planning. These are global-regional models, which add cost efficiency and time savings. There are several types of these models on the market, with different accuracy. This thesis discusses and uses in simulations global-regional model data outputs from the European Centre for Medium-Range Weather Forecasts (ECMWF) and the Weather Research and Forecasting (WRF) model; ECMWF currently produces ERA-Interim, a global reanalysis of the data-rich period since 1989. The goal of the master's thesis is to see whether it is useful and efficient to use global-regional weather model data, such as the ERA-Interim global reanalysis data, for wind assessment by comparing it with a real data series from a met mast located in Maglarp, in the south of Sweden. The comparison shows, for that specific area (hindcast) at Maglarp, very promising results for planning a wind farm at heights of 100 m, 120 m and 38 m.
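A minimal sketch of the kind of comparison described above: correlation and bias between a mast record and a reanalysis-based "virtual tower" series, plus a power-law height extrapolation between the measurement heights mentioned (38 m, 100 m, 120 m). The Weibull-like synthetic series, the shear exponent alpha = 0.14 and the scaling are assumptions for illustration only, not values from the thesis.

```python
import numpy as np

def power_law_extrapolate(v_ref, z_ref, z_target, alpha=0.14):
    """Extrapolate wind speed to another height with a power-law shear profile.
    alpha ~ 0.14 is a common open-terrain assumption, not a value from the thesis."""
    return v_ref * (z_target / z_ref) ** alpha

def compare(model, obs):
    """Bias and correlation between a modelled and an observed wind speed series."""
    bias = np.mean(model - obs)
    corr = np.corrcoef(model, obs)[0, 1]
    return bias, corr

# Hypothetical hourly wind speed (m/s): mast measurement at 100 m and a
# reanalysis-based "virtual tower" value at the same site.
rng = np.random.default_rng(21)
mast_100m = np.clip(rng.weibull(2.0, 2000) * 9.0, 0, None)
virtual_tower = 0.95 * mast_100m + rng.normal(0, 1.2, 2000)

print("bias, corr at 100 m:", compare(virtual_tower, mast_100m))
print("example 38 m -> 120 m extrapolation:", power_law_extrapolate(8.0, 38.0, 120.0))
```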
|