  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.

Extratropical cyclone climatology for eastern Canadian cities

Plante, Mathieu January 2014 (has links)
In this study, a Lagrangian tracking algorithm is applied to the 850-hPa relative vorticity field to characterize extratropical cyclone tracks across eastern Canada. Seasonal cycles are examined in terms of overall cyclone frequency, intensity, and regions of development and decay. We found that cyclones tend to develop over the Rockies, the Great Lakes, or the western Atlantic. They are most intense over Newfoundland and the North Atlantic, and decay over Greenland. Cyclones tracking across Toronto, Montreal, Halifax, and St-John's are further analyzed in terms of typical tracks, origin, frequency, mean local growth rate, and mean intensity. Among other results, we found that cyclone activity at the east coast cities (Halifax, St-John's) is dominated by Atlantic cyclones, most frequent in winter, while Montreal's and Toronto's cyclones travel primarily from the Great Lakes and are most frequent and intense in spring and autumn. Cyclones from the Gulf of Mexico are infrequent but account for a large share of the extremes. The relationship between winter cyclone tracks and modes of atmospheric variability is also examined, with an emphasis on the El Niño-Southern Oscillation (ENSO), the North Atlantic Oscillation (NAO), and the Pacific-North American pattern (PNA). An ENSO- and PNA-related oscillation between continental and coastal cyclones is confirmed. The inter-annual variability of winter cyclones crossing eastern Canadian cities is quantified. Cyclone activity in Toronto and Montreal is shown to be modulated by ENSO and the PNA, while the NAO dominates cyclone variability in Halifax and St-John's. The local cyclone variability is found to be small in terms of overall cyclone statistics, but important in terms of changes in the origins of the local cyclones. / A tracking algorithm is applied to the 850-hPa relative vorticity field to compute the trajectories of cyclones affecting eastern Canada.
The seasonal variations of these trajectories are examined through several parameters, such as cyclone frequency, intensity, origin, growth rate, and decay rate. The study shows that cyclones develop mainly over the Rockies, the Great Lakes, and the East Coast of the United States, and decay near the east and west coasts of Greenland. The most intense are found over Newfoundland and the North Atlantic. These cyclone statistics are then evaluated specifically for the cyclones reaching Toronto, Montreal, Halifax, and St-John's. Among other results, it is shown that the coastal cities are affected mainly by cyclones originating along the American East Coast, frequent in winter, while Toronto and Montreal are affected mainly by cyclones coming from the Great Lakes, more frequent in spring and autumn. Cyclones from the Gulf of Mexico are less frequent but account for a large share of the extremes. The inter-annual variation of cyclone activity is then evaluated against different modes of climate variability, namely ENSO (El Niño-Southern Oscillation), the NAO (North Atlantic Oscillation), and the PNA (Pacific-North America pattern). The results confirm the presence of an oscillation between continental and coastal cyclones during ENSO. The study shows that inter-annual cyclone variability in Toronto and Montreal is dominated by ENSO and the PNA, while the NAO has a greater impact in Halifax and St-John's.
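The Lagrangian tracking described above — detect vorticity centres at each time step and link them through time — can be caricatured in a few lines. This is an illustrative sketch only, not the thesis's algorithm: the threshold, the nearest-neighbour linking, and the maximum jump distance are all assumptions.

```python
import numpy as np

def track_vorticity_maxima(fields, threshold=1e-4, max_jump=5.0):
    """Toy Lagrangian tracker: find local maxima of a relative vorticity field
    at each time step and link each one to the nearest maximum at the previous
    step. `fields` is a (time, y, x) array; units and thresholds are illustrative."""
    tracks = []
    prev = []  # (track_id, (y, x)) pairs active at the previous time step
    for t, f in enumerate(fields):
        # A grid point is a cyclone centre if it exceeds the threshold and all
        # eight of its neighbours (edges of the grid are skipped for simplicity).
        maxima = []
        for i in range(1, f.shape[0] - 1):
            for j in range(1, f.shape[1] - 1):
                patch = f[i - 1:i + 2, j - 1:j + 2]
                if f[i, j] >= threshold and (patch < f[i, j]).sum() == 8:
                    maxima.append((i, j))
        linked = []
        for (i, j) in maxima:
            # Continue the nearest previous-step track within max_jump grid points,
            # otherwise start a new track.
            best = min(prev, key=lambda p: np.hypot(p[1][0] - i, p[1][1] - j),
                       default=None)
            if best and np.hypot(best[1][0] - i, best[1][1] - j) <= max_jump:
                tracks[best[0]].append((t, i, j))
                linked.append((best[0], (i, j)))
            else:
                tracks.append([(t, i, j)])
                linked.append((len(tracks) - 1, (i, j)))
        prev = linked
    return tracks
```

A real tracker would of course work on smoothed 850-hPa vorticity on a sphere and handle splits, mergers, and track-quality filtering.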

An investigation of carbon cycle dynamics since the last glacial maximum using a climate model of intermediate complexity

Simmons, Christopher January 2014 (has links)
The University of Victoria Earth System Climate Model (UVic ESCM) v. 2.9 is used in this thesis to investigate two important topics in paleoclimate research: the glacial-to-interglacial rise in CO2 and the Holocene carbon cycle. The UVic ESCM belongs to a class of models known as Earth system Models of Intermediate Complexity (EMICs) (Claussen et al. 2002) and provides a simplified yet comprehensive representation of the climate system and carbon cycle dynamics, including a three-dimensional ocean model, a dynamic-thermodynamic sea ice model, a dynamic global vegetation model, ocean sediments, and a fully interactive inorganic and organic carbon cycle in the ocean. First, a suite of transient simulations was conducted to cover the period from the Last Glacial Maximum (LGM) to the present (2000 A.D.). Simulations including only prescribed orbital forcing and continental ice sheet changes failed to produce an increase in atmospheric CO2 over the simulation period, although they demonstrated significant long-term sensitivity (10-15 ppm) to small (1.9 Tmol yr-1) variations in the weathering rate. Modelling experiments incorporating the full CO2 radiative forcing effect since the Last Glacial Maximum, however, resulted in much higher CO2 concentrations (a 20 ppm increase over those without CO2 radiative forcing) due to greater ventilation of deep-ocean DIC and decreased oceanic CO2 uptake, related in part to a larger decrease in southern hemisphere sea ice extent. The more thorough ventilation of the deep ocean in simulations with CO2 radiative forcing also caused a larger net alkalinity decrease during the late deglacial and interglacial, allowing atmospheric CO2 to increase by an additional 10 ppm in the simulations presented here. The inclusion of a high-latitude terrestrial carbon reservoir provided a net release of carbon to the atmosphere, mostly during the early deglacial, increasing atmospheric CO2 levels to 240-250 ppm.
This terrestrial release also provided better agreement with observed changes in carbonate concentrations in the deep ocean since the LGM (Yu et al. 2010). The addition of freshwater fluxes from ice sheet melting in North America underscored the importance of a lower weathering rate during the LGM and early deglacial, and indicated that deep water in the North Pacific may become more positively buoyant during freshwater fluxes into the Atlantic due to greater diffusion of heat to the deep ocean by enhanced Pacific intermediate water formation. Second, our results for the Holocene carbon cycle indicate that atmospheric CO2 should decrease between 6000 B.C. and 2000 A.D. without some kind of external forcing not represented in the model; however, the amount of the decrease (8-15 ppm) varied for different ocean circulation states. Furthermore, our simulations demonstrated significant sensitivity to Antarctic marine ice shelves, and these results indicate that more extensive marine ice shelves during the Holocene (relative to previous interglacials) may increase atmospheric CO2 levels by ~5 ppm (from purely physical mechanisms) and by as much as 10 ppm when different ocean circulation states or alkalinity changes are included. Various anthropogenic land-use scenarios added to the Holocene carbon cycle were unable to explain the CO2 trend, accounting for only a third of the ice-core CO2 increase by 1 A.D. in our most extreme scenario. However, the results imply that external mechanisms leading to a decrease in alkalinity during the Holocene (such as declining weathering rates, more extensive marine ice shelves, terrestrial uptake, more calcifiers, coral reef expansion, etc.) may prevent the ocean from absorbing more of the anthropogenic terrestrial release, allowing the deforestation flux to balance a greater fraction of the Holocene peatland uptake (not modelled) and permitting CO2 to increase through oceanic processes that are normally overwhelmed by northern peatlands.
/ This thesis applies the University of Victoria Earth System Climate Model (version 2.9) to two important areas of paleoclimate modelling research: the rise of atmospheric carbon dioxide (CO2) during the most recent glacial-interglacial transition, and the evolution of the carbon cycle during the Holocene. The model used in this study belongs to the class of models of intermediate complexity (Claussen et al. 2002), offering a simplified yet comprehensive treatment of the dynamics of the Earth's climate system and carbon cycle. It comprises a three-dimensional ocean model, a dynamic/thermodynamic sea-ice model, a dynamic global vegetation model, ocean sediments, and an interactive treatment of the organic and inorganic carbon cycle. First, a series of transient simulations is run to cover the period from the most recent glacial maximum (LGM) to the present (2000 A.D.). Simulations based only on prescribed orbital parameters and ice sheets do not reproduce the increase in atmospheric CO2 over this period, but do show some sensitivity (10-15 ppm) to small (1.9 Tmol/yr) variations in the weathering rate. In simulations that account for the full range of radiative effects associated with CO2, by contrast, atmospheric CO2 is much higher (an increase of 20 ppm relative to simulations without radiative effects). This difference is caused by greater ventilation of dissolved inorganic carbon from the deep ocean and a reduced rate of oceanic CO2 uptake, explained in part by accelerated sea-ice loss in the southern hemisphere.
The change in the deep-ocean ventilation regime also lowers marine alkalinity from the late deglacial onward, raising atmospheric CO2 by a further 10 ppm. A high-latitude terrestrial carbon reservoir provides an additional source of carbon, mainly during the early stages of the deglaciation, allowing atmospheric CO2 to reach 240-250 ppm. It also improves the agreement of our results with the changes in deep-ocean carbonate concentrations observed since the last glacial maximum (Yu et al. 2010). The low terrestrial weathering rate during the glacial maximum and the subsequent deglaciation is all the more significant given the increased meltwater flux from the North American ice sheets. Second, our results for the Holocene carbon cycle point to a decrease in atmospheric CO2 beginning around 6000 B.C. that, absent forcing external to the model, should persist to the present day; its magnitude varies (8-15 ppm) with the ocean circulation state. Moreover, atmospheric CO2 in our simulations is strongly sensitive to the extent of Antarctic ice shelves, leading to our conclusion that more extensive marine ice during the Holocene (relative to other interglacials) could raise atmospheric CO2 by nearly 5 ppm (direct physical effects), and by as much as 10 ppm when the range of ocean circulation states and changes in marine alkalinity are considered.

Operational mitigation of ground clutter using information from past and near-future radar scans

Anderson-Frey, Alexandra January 2014 (has links)
When a radar pulse encounters obstacles in its path, the accuracy of radar reflectivity data is adversely affected, which in turn decreases the quality of forecasting and nowcasting tools such as rainfall totals and cell-tracking algorithms. In this study, we seek an optimal solution for real-time, operational gap-filling in radar data contaminated by known areas of ground clutter, and explore a variety of algorithms of increasing complexity to that end, making use of a geostatistical method known as ordinary kriging. The final result is the development of a "smart" ordinary kriging algorithm. This method replaces clutter-contaminated pixels in radar data with the weighted average of a nearby collection of uncontaminated pixels, specially selected to sample independent spatial and temporal information while avoiding bogging down the calculations with redundant information. These data are obtained not only from the same reflectivity scan as the ground clutter to be corrected, but also from different heights and from both earlier and near-future times. The incorporation of the time dimension in particular adds a great deal of value to simplistic algorithms, even when only data from past times are considered. Radar scans from earlier times are thus shown to be a major untapped source of information that can be used to generate (and regenerate, using near-future data) more accurate radar products. / When a radar signal encounters obstacles, the accuracy of the reflectivity data suffers, which reduces the quality of weather-forecasting tools such as precipitation totals and the algorithms that track the evolution of storms. In this study, we look for an optimal solution for filling, in real time and in an operational context, the information gaps caused by ground clutter, exploring a variety of increasingly complex algorithms based on a geostatistical method: ordinary kriging.
The end result is the development of a "smart" ordinary kriging algorithm. This method replaces contaminated pixels with the weighted average of nearby uncontaminated pixels, specially selected to incorporate independent data and to avoid overcomplicating the calculations with redundant information. This information comes not only from the same time and level as the region to be corrected, but also from other levels as well as from the past and the near future. The inclusion of the time dimension in particular offers great value even to the simplest algorithms, and even when only information from the past, not the future, is considered. Radar data from earlier times thus constitute an untapped source of information that could make it possible to generate (and regenerate, using near-future data) more accurate radar products.
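The ordinary-kriging replacement at the heart of this record can be sketched concisely. The version below is a generic textbook formulation with an assumed exponential variogram and none of the "smart" pixel selection or temporal dimension the thesis develops:

```python
import numpy as np

def ordinary_kriging(coords, values, target,
                     variogram=lambda h: 1.0 - np.exp(-h / 2.0)):
    """Estimate the value at `target` from scattered samples by ordinary kriging.

    coords: (n, 2) sample locations; values: (n,) sample values;
    variogram: isotropic semivariogram model gamma(h) (an assumption here).
    """
    n = len(values)
    # Pairwise distances among samples, and from each sample to the target.
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    d0 = np.linalg.norm(coords - target, axis=-1)
    # Ordinary-kriging system with a Lagrange multiplier enforcing sum(w) = 1:
    # [ gamma(d)  1 ] [ w  ]   [ gamma(d0) ]
    # [   1^T     0 ] [ mu ] = [     1     ]
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[n, n] = 0.0
    b = np.append(variogram(d0), 1.0)
    w = np.linalg.solve(A, b)[:n]  # kriging weights (sum to 1)
    return float(w @ values)
```

Because gamma(0) = 0, the estimator interpolates exactly at sample locations; the "smart" algorithm would additionally choose which pixels (in space, height, and time) enter `coords` so that they carry independent information.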

An assessment of freezing rain processes in the Saint-Lawrence River Valley: synoptic-dynamic analysis and operational model verification

Splawinski, Sophie January 2014 (has links)
Freezing rain (FZRA), a hazardous meteorological phenomenon, poses a significant threat to the general public and can severely damage societal infrastructure. The phenomenon is well known throughout the St-Lawrence River Valley (SLRV), which has one of the highest frequencies of FZRA in the world owing to its orography and spatial orientation. Our focus is to provide meteorologists with the means to better predict both the onset and duration of FZRA at Montreal (CYUL), Quebec City (CYQB), and Massena (KMSS) in a two-stage process: introducing a new two-dimensional elliptic regression statistical forecast model, and assessing the synoptic and mesoscale conditions associated with past events. Analysis of a 27-year period, from 1979 through 2005, was conducted, with a total of 99, 102, and 70 FZRA events at CYQB, CYUL, and KMSS, respectively. Our statistical analysis provides meteorologists with the POZR (probability of freezing rain): the ability to input model-forecast temperatures at two pressure levels and determine the probability of the onset of FZRA based on a 30-year climatology of northeasterly-related precipitation. Synoptic-dynamic analysis of past events confirms the need for a high-resolution forecast model to adequately resolve the mesoscale processes crucial to FZRA maintenance. Tests performed using a verification dataset (FZRA events from 2006-2011) show the accuracy and feasibility of the model, which could be implemented by forecasting offices. Furthermore, synoptic-dynamic assessment of events within the verification dataset and forecast model comparison provide insight into missed forecasts, successful forecasts, and false alarms. Utilizing these methods could give meteorologists the opportunity to produce more accurate forecasts of both the onset and duration of freezing rain events.
/ Freezing rain (FZRA) is a form of precipitation that poses a danger not only to the public but also to the aviation sector. The phenomenon is particularly well known in the St. Lawrence River Valley (SLRV); the orientation of the valley and its orography explain its elevated number of events. The goal is therefore to provide meteorologists with the tools needed to improve forecasts of the duration and location of FZRA for Montreal (CYUL), Quebec City (CYQB), and Massena (KMSS). Two stages are required to get there: first, introducing a new statistical forecast model, and second, evaluating the synoptic and mesoscale conditions of events over the past 27 years. During this period, 99, 102, and 70 FZRA events occurred at CYQB, CYUL, and KMSS, respectively. Our statistical analysis provides meteorologists with a probability of freezing-rain occurrence (POZR): forecast temperatures at two different levels are input, and the probability of FZRA is determined using a model based on a climatology of precipitation and winds in the SLRV. Synoptic and dynamic analyses of past events showed us the need to incorporate a high-resolution forecast model in the valley, necessary to adequately resolve the orography, which is imperative for successful FZRA forecasts. Then, using a verification dataset, we assess the feasibility and accuracy of the model, which could be used in a forecast office. Finally, a comparison of events with varied POZR values shows the strengths and weaknesses of current forecast models. These, coupled with a new FZRA forecast model, give meteorologists an opportunity to produce more accurate FZRA forecasts in the SLRV.
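The POZR idea — map forecast temperatures at two levels onto a freezing-rain probability fitted from past events — can be caricatured with a two-dimensional Gaussian ellipse. To be clear, this is a stand-in, not the thesis's elliptic regression model: the choice of levels, the fit, and the probability mapping are all assumptions of this sketch.

```python
import numpy as np

def fit_pozr(event_temps):
    """Fit an ellipse (mean + covariance) to temperature pairs, e.g.
    (T_850hPa, T_surface), observed during past freezing-rain events.
    Returns a function mapping a new temperature pair to a pseudo-probability."""
    mu = event_temps.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(event_temps.T))

    def pozr(t_pair):
        # Squared Mahalanobis distance from the climatological freezing-rain
        # centre, mapped to (0, 1] so the ellipse centre scores 1.
        d2 = (t_pair - mu) @ cov_inv @ (t_pair - mu)
        return float(np.exp(-0.5 * d2))

    return pozr
```

An operational model would instead be calibrated so the output is a true event frequency, and verified against independent cases as the record describes.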

Hail Formation in Florida

Stanley, Matthew 18 June 2014 (has links)
Hail poses a substantial threat to life and property in the state of Florida. These losses could be minimized through a better understanding of the relationships between the atmospheric variables that influence hail formation in Florida. Improving hail forecasting in Florida requires analyzing a number of meteorological parameters and synoptic data related to hail formation. NOAA archive data were retrieved to create a database used to categorize text files of hail days. The text files were entered into the National Oceanic and Atmospheric Administration Earth System Research Laboratory website to create National Centers for Environmental Prediction/National Center for Atmospheric Research Reanalysis maps of atmospheric variables for Florida hail days as well as the days leading up to each hail event. These data were then analyzed with Statistical Product and Service Solutions to determine the relationships between variables that affect hail formation across different regions and seasons in Florida. The reasons why the factors affecting hail formation differ between regions, seasons, and hail sizes are discussed, along with forecasting suggestions by region and month in Florida. The study found that the majority of hail in Florida occurs during the wet season. A low Lifted Index, high Precipitable Water, and lower-than-average Sea Level Pressure are, in most cases, present on hail days in Florida. Furthermore, results show that Vector Wind magnitude increases as hail size increases. Additionally, several atmospheric variables useful for studying hail events, such as Lifted Index, Precipitable Water, Sea Level Pressure, Vector Wind, and Temperature, have significant correlations with each other depending on the region and season observed.
Strong correlations between low Lifted Index, high Precipitable Water values, and the occurrence of hail events are discussed, as well as the relationship between temperature anomalies at various pressure levels and the occurrence of hail events.

North American monsoon variability from paleoclimate era to climate change projection: A multiple dataset perspective

Carrillo Cruz, Carlos Mauricio 17 January 2015 (has links)
In the southwestern United States, the North American monsoon (NAM) is the main driver of severe weather. How the monsoon has behaved in the past and how it will change in the future is a question of importance for natural resource management and infrastructure planning. In this dissertation, I present the results of three studies that investigated NAM variability and change from the perspectives of paleoclimate records, future climate change projections, and simulation of low-frequency variability with the longest retrospective atmospheric reanalysis.

In the first study, a monsoon-sensitive network of tree-ring chronologies is evaluated for its ability to reproduce NAM variability during the past four centuries. The tree-ring chronologies reasonably characterize the dominant modes of NAM climate variability and reveal low-frequency climate variability at decadal and longer timescales that the instrumental record is too short to resolve well. This low-frequency climate variability appears to coincide with the occurrence of multiyear persistent droughts.

In the second study, we use these modes of climate variability to assess the degree of physical uncertainty in the climate change projection models of the North American Regional Climate Change Assessment Program (NARCCAP). The NARCCAP models are evaluated mainly on their ability to represent warm-season variability driven by quasi-stationary Rossby wave trains and El Niño-Southern Oscillation / Pacific Decadal Variability (ENSO-PDV). Only one of the eight NARCCAP models reasonably represents the seasonal cycle of monsoon precipitation and ENSO-driven variability in both the 20th and 21st centuries. No decadal variability was observed in any of the NARCCAP models.

In the third study, the low-frequency drought signal found in the tree-ring chronologies is further explored within a regional climate modelling framework. The Twentieth-Century Reanalysis is dynamically downscaled (DD-20CR), and statistical analysis suggests that the low-frequency drought signal in the Southwest is driven by atmospheric circulation changes on global to continental scales that affect precipitation in Central America as well. Low-frequency climate variability is therefore likely responsible for the multiyear persistent droughts of the last four centuries, as independently evaluated from the monsoon-sensitive tree-ring network.

Broadband interferometry of lightning

Stock, Michael 04 March 2015 (has links)
A lightning interferometer is an instrument that determines the direction to a lightning-produced radio point source by correlating the signal received at two or more antennas. Such instruments have been used with great success for several decades in the study of the physical processes at work in a lightning flash. However, previous instruments have either been sensitive to only a narrow radio bandwidth, so that the correlation could be done in analog hardware, or have been sensitive to a wide bandwidth but recorded only a short portion of the radiation produced by a lightning flash.

In this dissertation, a broadband interferometer is developed that is capable of recording the VHF radio emission over the entire duration of a lightning flash. To best utilize the additional data, the standard processing techniques have been redeveloped from scratch around a digital cross-correlation algorithm. This algorithm can and does locate sources as faint as the noise level of the antennas, typically producing 100,000 or more point-source locations over the course of a lightning flash.

At very low received power levels, the likelihood that a signal received at the antenna will be affected by environmental noise is substantially higher. For this reason, the processing allows the integration windows of the cross-correlation to be heavily overlapped, so that the location of each event can be based on a distribution of windows. Noise-identification techniques that leverage the heavily overlapped windows have been developed, based on the closure delay, the standard deviation, the correlation amplitude, and the number of contributing windows. These filtration techniques have proven very successful at identifying and removing mis-located sources, while removing a minimum of low-amplitude sources that are well located.

In the past, lightning interferometers have been limited to two perpendicular baselines for determining the direction to each point source. Additional techniques are developed in this dissertation for efficiently computing the image of a point source in the sky using an arbitrary number of antennas in an arbitrary configuration. These multiple-baseline techniques further improve the sensitivity and accuracy of the locations provided by broadband interferometers.

To demonstrate the usefulness of broadband interferometers, the activity of six flashes spanning a diverse selection of lightning flash types is examined in this dissertation. This includes detailed analysis of negative stepped leaders, positive un-stepped leaders, K-changes, and fast positive breakdown. Initial breakdown pulses seen at the beginning of the flash are found to be no different from horizontal negative leader steps seen later in the flash. Evidence is found that positive leaders produce VHF radiation, as opposed to all of the radiation in the positive breakdown region being produced by retrograde negative breakdown. The time-resolved three-dimensional velocity of 47 K-changes occurring in two flashes is measured. Finally, fast positive breakdown is characterized and found to be produced by a positive streamer process rather than a leader process.

Observations made with the instrument showcase the capabilities of a continuously sampling broadband interferometer. The instrument makes possible measurements that were difficult or impossible to obtain in the past, and the preliminary observations point to many exciting scientific findings to come.
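The single-baseline core of such an instrument — cross-correlate two antenna signals, take the peak lag as the arrival-time difference, and convert it to a direction cosine along the baseline — can be sketched as below. The plane-wave (far-field) signal model, sampling rate, and sign convention are assumptions of this sketch, not properties of the instrument described above.

```python
import numpy as np

def bearing_cosine(sig_a, sig_b, fs, baseline_m, c=3.0e8):
    """Direction cosine of a plane wave along one baseline, from the peak of
    the discrete cross-correlation of two antenna signals (far-field assumed).

    fs: sampling rate in Hz; baseline_m: antenna separation in metres.
    """
    a = sig_a - sig_a.mean()
    b = sig_b - sig_b.mean()
    xc = np.correlate(a, b, mode="full")
    # Lag (in samples) at which sig_a best matches a delayed copy of sig_b;
    # positive lag means the wavefront reached antenna b first.
    lag = int(np.argmax(xc)) - (len(b) - 1)
    tau = lag / fs  # arrival-time difference in seconds
    # cos(theta) = c * tau / d, clipped to the physically allowed range.
    return float(np.clip(c * tau / baseline_m, -1.0, 1.0))
```

A real broadband processor would interpolate the correlation peak for sub-sample delays, overlap integration windows as described above, and combine many baselines to image the source.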

A study of some aspects of extensive air showers at sea level

Enderby, Mark John January 1982 (has links)
An experiment to search for tachyons associated with extensive air showers, using an unshielded plastic scintillator, is described. No evidence for their existence has been found. A review of tachyon theory and previous searches is given. The design and implementation of an automatic data collection system for the Durham Air Shower Array, using a PET microcomputer, is described, along with various improvements to the array. The performance of the array and the methods of data analysis are discussed. The array has been operated with five detectors and the results compared to those expected from simulation. The power of the zenith angle distribution has been determined and found to agree with previous determinations. The size spectrum has also been determined in the range 2×10^4 to 10^6 electrons. Although the slopes are in agreement with other measurements, the absolute rate is low. It is concluded that this is mainly due to the lack of redundancy in the array and that more detectors are required.
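For the zenith-angle measurement mentioned above, the exponent n of an assumed I(θ) ∝ cos^n θ intensity law has a closed-form maximum-likelihood estimate. The sketch below illustrates that standard fit under the stated assumption; it is not the Durham array's actual analysis.

```python
import numpy as np

def zenith_power_index(thetas):
    """MLE of n for the pdf p(theta) = (n + 1) * cos(theta)**n * sin(theta)
    on [0, pi/2] (cos**n intensity times the sin(theta) solid-angle factor).

    Setting d/dn of the log-likelihood to zero,
      N / (n + 1) + sum(log cos theta) = 0,
    gives the closed form n_hat = -N / sum(log cos theta) - 1.
    """
    s = np.sum(np.log(np.cos(thetas)))
    return float(-len(thetas) / s - 1.0)
```

In practice the observed angles would first be corrected for the array's angular acceptance before fitting.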

Validation of a model for thermal emission

Hughes, P. A. January 1986 (has links)
An introduction to the more general aspects of thermal models is followed by a brief outline of the construction and range of possible applications of the computer models developed at the University of Durham. The physical basis behind one of these models, which predicts temperatures both at the surface and within a one-dimensional non-vegetated object, is considered in some detail, and the construction of the model is outlined. The sensitivity of the temperature predictions to changes in the values of the input parameters required by the model is also discussed. Equipment was designed specifically to collect sufficient ground truth data to enable the validation of the model, and there is a complete description of the construction and operation of the apparatus. The subsequent interpretation of the data using the computer is also described. The validation of the model was carried out for two roads with concrete and asphalt surfaces, and consisted of a comparison between surface temperatures predicted by the model and those measured by a radiometer. The results of the comparison are discussed in some detail. The suitability of the model for predicting temperature contrasts between two different surfaces was also investigated using the validation data. The model was next applied to the more complex problem of simulating the thermal behaviour of a south-facing vertical sandstone wall. A comparison between the predictions of the model and data measured by a radiometer is given, and the problems that this type of simulation entails are discussed. A summary is given of the work carried out with the model, and suggestions are made for improvements. Finally, the development of future types of model is considered.

Model investigations into the radiative forcing of climate by anthropogenic emissions of sulphate and soot aerosol

Haywood, James Matthew January 1995 (has links)
No description available.
