1 |
Statistical homogenization of undocumented monthly temperature data in British Columbia for trend analysis. Wang, Yaqiong, 30 April 2018 (has links)
Homogenization of monthly temperature data in BC is performed for 310 monthly maximum temperature series and 307 minimum temperature series from three networks: BC Hydro, the BC Ministry of Forests, Lands, Natural Resource Operations and Rural Development (Wildfire Management Branch), and the BC Ministry of Transportation and Infrastructure. The homogenization procedure uses a penalized maximal t-test with mean adjustment to detect inhomogeneities and adjust the data. Before homogenization, quality control is performed on 797 stations at the daily time step. Trends at each location, in three sub-regions, and across the province are analyzed from the resulting homogenized PCIC monthly temperature products. To measure the influence of homogenization on trends and to validate the trends calculated from the PCIC homogenized datasets, the resulting climate trends are compared both to those calculated from the PCIC datasets without homogenization and to those from the existing homogenized temperature products for BC from ECCC. The trend analysis has three components. Trends before and after homogenization are compared for the averaged time series within the three sub-regions based on PCIC station data. Trends from homogenized PCIC stations are also compared with those from AHCCD stations. In addition, spatial patterns of trends over BC are analyzed from PCIC gridded datasets and compared with those of CANGRD. Homogenization results show that 92 of 310 stations (29.6%) for maximum temperature and 75 of 307 stations (24.4%) for minimum temperature have no detected changepoint, meaning they appear to be homogeneous. BCH has the highest proportion of stations with changepoints (73.8% for maximum and 60.7% for minimum temperature), whereas FLNRO_WMB has the lowest (10.5% for Tmax and 27.3% for Tmin). A total of 80 stations for Tmax and 81 for Tmin have sufficient data and are analyzed for single-station trends over 1990-2014. Compared with the trends before homogenization, trends derived from the homogenized PCIC stations generally have the same sign but smaller magnitude. The single-station trend results agree well with the AHCCD results. Spatial patterns of trends based on the interpolated PCIC stations also agree well with those based on the CANGRD products. Warming trends predominate: most seasons show distinct positive trends across the province, with the exception of spring and some seasons over Vancouver Island.
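As a rough illustration of the detection-and-adjustment idea described above (not the actual PCIC procedure, which uses the penalized maximal t-test with a penalty factor, lag-1 autocorrelation handling, and reference series), a minimal mean-shift detector followed by mean adjustment might look like the sketch below; the toy series, step size, and minimum segment length are all illustrative assumptions.

```python
import numpy as np

def detect_mean_shift(series, min_seg=12):
    """Scan candidate changepoints and return the index with the largest
    two-sample t statistic. This is only a simplified stand-in for the
    penalized maximal t-test: no penalty factor, no autocorrelation
    correction, and no reference series are used here."""
    best_t, best_k = 0.0, None
    for k in range(min_seg, len(series) - min_seg):
        a, b = series[:k], series[k:]
        se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
        t = abs(a.mean() - b.mean()) / se
        if t > best_t:
            best_t, best_k = t, k
    return best_k, best_t

def mean_adjust(series, k):
    """Mean adjustment: shift the segment before the changepoint so its mean
    matches the most recent (presumed reliable) segment."""
    adjusted = series.copy()
    adjusted[:k] += series[k:].mean() - series[:k].mean()
    return adjusted

# Toy monthly Tmax series with an artificial +0.8 C step after month 120.
rng = np.random.default_rng(0)
tmax = rng.normal(10.0, 1.0, size=240)
tmax[120:] += 0.8
k, t_stat = detect_mean_shift(tmax)
homogenized = mean_adjust(tmax, k) if k is not None else tmax
```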
|
2 |
Benchmarking the performance of homogenisation algorithms on daily temperature data. Killick, Rachel Elizabeth, January 2016 (has links)
Reliable temperature time series are necessary to quantify how our world is changing. Unfortunately, many non-climatic artefacts, known as inhomogeneities, affect these time series. When looking at real-world data it is often not possible to distinguish between these non-climatic artefacts and the true climatic variations that occur naturally, so removing the non-climatic artefacts with complete confidence is problematic, yet leaving them in could lead to misinterpretation of climate variations. In creating realistic, homogeneous, synthetic daily temperature series, the truth about the data can be known completely. Known, created inhomogeneity structures can be added to these series, making it possible to distinguish between true and artificial artefacts. Applying homogenisation algorithms to these created inhomogeneous data allows algorithm performance to be assessed, since the returned series can be compared to a known standard or benchmark: the clean data. In this work a Generalised Additive Model (GAM) was used to create synthetic, clean, daily temperature series. Daily data pose new challenges compared to monthly or annual data owing to their increased variability and quantity, and this is the first intercomparison study to assess homogenisation algorithm performance on temperature data at the daily level. The inhomogeneity structures added to the clean data were created by perturbing the inputs to the GAM, which produced seasonally varying inhomogeneities, and by adding constant offsets, which produced constant inhomogeneities. Four climatically diverse regions of the United States were modelled, allowing exploration of the impact of regional climate on homogenisation algorithm performance. Four data scenarios, incorporating three different inhomogeneity structures, were added, and the evaluations also investigated how these affected algorithm performance. Eight homogenisation algorithms were contributed to this study, and their performance was assessed according to both their ability to detect changepoints and their ability to return series closer to the clean data than the released, inhomogeneous series. These evaluations sought to aid the improvement of these algorithms and to quantify the uncertainty remaining in daily temperature data even after homogenisation has taken place. The benchmarks themselves were also evaluated, since benchmark weaknesses had to be taken into account. It was found that more climatologically diverse regions were harder to model and less climatologically diverse regions were easier to homogenise. Station density in a network and the presence of artificial trend inhomogeneities did not affect algorithm performance as much as changes in autocorrelation did, and the latter is an area in which most algorithms could improve. This work feeds into the larger project of the International Surface Temperature Initiative, which is working on a wider scale and with monthly rather than daily data.
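To make the benchmarking workflow concrete, a stripped-down sketch of its three ingredients (a clean synthetic series, an added inhomogeneity with a known breakpoint, and scoring against the clean truth) could look like the following; the seasonal-plus-AR(1) generator is a crude stand-in for the GAM, and every parameter value is illustrative rather than taken from the thesis.

```python
import numpy as np

def make_clean_daily(n_years=20, seed=1):
    """Toy 'clean' daily series: seasonal cycle plus AR(1) noise
    (a crude stand-in for the GAM-based benchmark series)."""
    rng = np.random.default_rng(seed)
    days = np.arange(n_years * 365)
    seasonal = 10.0 + 12.0 * np.sin(2 * np.pi * days / 365.0)
    noise = np.zeros(len(days))
    for i in range(1, len(days)):
        noise[i] = 0.7 * noise[i - 1] + rng.normal(0.0, 1.5)
    return seasonal + noise

def add_constant_break(clean, day, offset):
    """Insert a constant (non-seasonal) inhomogeneity after a known day."""
    corrupted = clean.copy()
    corrupted[day:] += offset
    return corrupted

def score(clean, corrupted, homogenized, true_break, found_break, tol=90):
    """Benchmark-style scoring: was the break detected within a tolerance
    window, and did homogenisation move the series closer to the truth?"""
    hit = found_break is not None and abs(found_break - true_break) <= tol
    rmse_before = np.sqrt(np.mean((corrupted - clean) ** 2))
    rmse_after = np.sqrt(np.mean((homogenized - clean) ** 2))
    return hit, rmse_before, rmse_after

clean = make_clean_daily()
corrupted = add_constant_break(clean, day=3000, offset=1.5)
# a real study would run each contributed homogenisation algorithm here
```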
|
3 |
Relationships between Oceanographic Satellite Data and Alexandrium Distributions in the Gulf of Maine. Luerssen, Remy Martin, January 2001 (has links) (PDF)
No description available.
|
4 |
Analysis of the Trend of Historical Temperature and Historic CO2 Levels Over the Past 800,000 Years by Short Time Cross Correlation Technique. Patel, Tejashkumar, January 2021 (has links)
The carbon dioxide concentration in Earth's atmosphere is currently 417 parts per million (ppm) and keeps rising. Historic CO2 levels and historic temperature levels have been cycling over the past 800,000 years. To study the trend of CO2 and temperature over the past 800,000 years, one needs to find the relation between historic CO2 and historic temperature levels. In this project, we perform different tasks to identify which of CO2 and temperature drives the trend. Cross-correlation is used to find the relation between two random signals, with the temperature and CO2 records treated as the two signals. Resampling by interpolation is applied to both the CO2 and temperature data to change the sampling rate. A short-time cross-correlation technique is then applied to the CO2 and temperature data over different time windows to find the time lag, where the time lag refers to how far the signals are offset.
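A minimal sketch of that pipeline, resampling two irregularly sampled records onto a common age grid and then taking the lag of the peak cross-correlation in each window, might look like this; the grid spacing, window length, and maximum lag are illustrative choices, not the project's actual settings.

```python
import numpy as np

def resample(ages, values, step=1000.0):
    """Resample an irregularly sampled record (e.g. an ice-core series)
    onto a regular age grid by linear interpolation; 'step' is in years."""
    grid = np.arange(ages.min(), ages.max(), step)
    return grid, np.interp(grid, ages, values)

def short_time_lags(co2, temp, window=100, max_lag=10):
    """Short-time cross-correlation: for each window, return the lag (in
    samples) at which the cross-correlation peaks. The sign convention
    depends on argument order, so interpret leads and lags with care."""
    lags = []
    for start in range(0, len(co2) - window + 1, window):
        a = co2[start:start + window].astype(float)
        b = temp[start:start + window].astype(float)
        a -= a.mean()
        b -= b.mean()
        xc = np.correlate(a, b, mode="full")
        mid = len(xc) // 2                      # zero-lag index
        segment = xc[mid - max_lag: mid + max_lag + 1]
        lags.append(int(np.argmax(segment)) - max_lag)
    return np.array(lags)
```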
|
5 |
Specific Cation Effects in Biological Systems: Thermodynamic and Spectroscopic Insights. Kherb, Jaibir, December 2011 (has links)
Very specific protein-salt interactions are involved in a multitude of biological phenomena such as protein folding/stability, enzymatic activity, and signal transduction events. In this work, we used two very simple, protein-mimicking model biopolymers to obtain a better understanding of specific cation effects operating in aqueous protein environments: elastin-like polypeptides (ELPs) and poly(N-isopropylacrylamide) (PNIPAM). ELPs are an especially suitable model system because these polypeptides can easily be genetically engineered to observe the effect of specific amino acid residues and peptide chain length on salt interactions. Both biopolymers are also highly thermoresponsive, as their aqueous solutions undergo a hydrophobic collapse/aggregation-induced phase transition above a lower critical solution temperature (LCST). Thermodynamic measurements of these biopolymers were carried out under various salt solution conditions. Additionally, both biopolymers are suitable for surface-specific spectroscopic measurements. Vibrational sum frequency spectroscopy (VSFS), a non-linear, interface-sensitive spectroscopic technique, was employed here to investigate biologically relevant cation interactions that occur at peptide/protein surfaces.
First, the LCST response of a non-polar ELP and a neutral biopolymer, PNIPAM, was investigated in the presence of 12 different alkali, alkaline-earth, and transition metal chloride salts. Even though the salt interactions of uncharged proteins are dominated by anions, subtle specific cation effects were also observed, and the results followed a direct Hofmeister series for cations. Most alkali cations are excluded from the polar amide regions of proteins. More polarizable cations, however, can solvate the hydrophobic moieties and somewhat counter the salting-out effect of the chloride anion. More strongly charged and hydrated ions, such as lithium and the divalent cations, showed a weak interaction with the amide moiety through their hydration shell.
The role of acidic amino acid residues in inducing cation specificities was investigated using an aspartate-rich ELP system. Both thermodynamic and spectroscopic data conclusively demonstrated that the negative charge on protein surfaces is the main driving force for cation partitioning and specificity under physiologically relevant concentration regimes. Apparent binding constants of carboxylate moieties with cations were determined. This is the first quantitative and thoroughly systematic study of such biologically relevant cation-carboxylate interactions, which are prevalent in enzyme active sites and on protein surfaces.
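One common way such apparent binding constants are extracted is by fitting the LCST shift versus salt concentration to a linear (salting-out) term plus a Langmuir-type binding term; the sketch below uses made-up data and a generic functional form, and is not necessarily the exact model or data treatment used in this dissertation.

```python
import numpy as np
from scipy.optimize import curve_fit

def lcst_model(c, t0, m, b_max, kd):
    """LCST as a function of salt concentration c (M): a linear salting-out
    term (slope m) plus a Langmuir-type saturation term with apparent
    dissociation constant kd. Illustrative functional form only."""
    return t0 + m * c + b_max * c / (kd + c)

# Made-up example data: concentration (M) vs. transition temperature (deg C).
conc = np.array([0.0, 0.1, 0.25, 0.5, 0.75, 1.0])
lcst = np.array([32.0, 33.5, 34.6, 35.2, 35.0, 34.5])

popt, _ = curve_fit(lcst_model, conc, lcst, p0=[32.0, -1.0, 4.0, 0.2])
t0, m, b_max, kd = popt
print(f"apparent binding constant ~ {1.0 / kd:.1f} M^-1")
```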
|
6 |
3D Wavelet-Based Algorithms For The Compression Of Geoscience Data. Rucker, Justin Thomas, 10 December 2005 (has links)
Geoscience applications generate large datasets; thus, compression is necessary to facilitate the storage and transmission of geoscience data. One focus is the coding of hyperspectral imagery with the prominent JPEG2000 standard. Certain aspects of the encoder, such as rate allocation between bands and spectral decorrelation, are not covered by the JPEG2000 standard, and this thesis investigates the performance of several JPEG2000 encoding strategies. Additionally, a relatively low-complexity 3D embedded wavelet-based coder, 3D-tarp, is proposed for the compression of geoscience data. 3D-tarp employs an explicit estimate of the probability of coefficient significance to drive a nonadaptive arithmetic coder, resulting in a simple implementation suited to vectorized hardware acceleration. Finally, an embedded wavelet-based coder is proposed for the shape-adaptive coding of ocean-temperature data: 3D binary set splitting with k-d trees (3D-BISK), which replaces the octree splitting structure of other shape-adaptive coders with k-d trees, a simpler set-partitioning structure that is well suited to shape-adaptive coding.
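As context for the wavelet stage these coders share, a tiny sketch of a 3D dyadic wavelet decomposition of a data cube followed by naive coefficient thresholding is shown below; it uses the PyWavelets package and stops well short of the bit-plane and arithmetic-coding machinery of 3D-tarp, 3D-BISK, or JPEG2000, so it only illustrates where the compression leverage comes from.

```python
import numpy as np
import pywt  # PyWavelets

# Toy "hyperspectral" cube: bands x rows x cols (values are random stand-ins).
cube = np.random.default_rng(0).normal(size=(32, 64, 64)).astype(np.float32)

# 3-level 3D dyadic wavelet decomposition.
coeffs = pywt.wavedecn(cube, wavelet="bior4.4", level=3)
arr, slices = pywt.coeffs_to_array(coeffs)

# Crude stand-in for embedded coding: keep only the largest ~5% of coefficients.
threshold = np.percentile(np.abs(arr), 95)
arr[np.abs(arr) < threshold] = 0.0

reconstructed = pywt.waverecn(
    pywt.array_to_coeffs(arr, slices, output_format="wavedecn"), wavelet="bior4.4"
)
reconstructed = reconstructed[tuple(slice(s) for s in cube.shape)]  # trim any padding
mse = float(np.mean((cube - reconstructed) ** 2))
```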
|
7 |
Constraining 3D Petroleum Reservoir Models to Petrophysical Data, Local Temperature Observations, and Gridded Seismic Attributes with the Ensemble Kalman Filter (EnKF). Zagayevskiy, Yevgeniy, Unknown Date
No description available.
|
8 |
A theoretical one-dimensional analysis of the transient temperature and stress distributions in a long cylinder subjected to conductive cooling and heating. Hencke, Hartmut, January 1983 (has links)
In a new approach used to model quench tests conducted with long cylindrical specimens in fluids, heat transfer within the fluid is described by the heat conduction equations while heat convection is neglected. Analytical solutions for the temperatures and stresses as functions of time and of the radial coordinate in the cylindrical specimen are presented, and the maximum tensile stresses in the specimen are shown to depend on the ratio of thermal conductivities and on the ratio of volumetric heat capacities of the specimen and the quench bath. Plots of maximum tensile stresses are given for a wide range of these ratios, and plots of temperatures, stresses, and heat fluxes are included for five conductivity ratios.
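For reference, the governing relation in such a conduction-only treatment of a long cylinder and its surrounding bath is the one-dimensional radial heat conduction equation,

\[
\frac{\partial T}{\partial t} = \alpha \left( \frac{\partial^2 T}{\partial r^2} + \frac{1}{r}\,\frac{\partial T}{\partial r} \right),
\]

where \(\alpha = k/(\rho c_p)\) is the thermal diffusivity. The conductivity ratio and volumetric heat-capacity ratio highlighted above enter through the values of \(k\) and \(\rho c_p\) on either side of the specimen-bath interface; this is the standard form of the equation, stated here for orientation rather than quoted from the thesis.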
It was found that the maximum stresses predicted by this analysis are in satisfactory agreement with experimental results in the literature for high-conductivity fluids, while a larger discrepancy was noted for low-conductivity fluids.
|
9 |
Analysis of a novel thermoelectric generator in the built environment. Lozano, Adolfo, 05 October 2011 (has links)
This study centered on a novel thermoelectric generator (TEG) integrated into the built environment. Designed by Watts Thermoelectric LLC, the TEG is essentially a novel assembly of thermoelectric modules whose required temperature differential is supplied by hot and cold streams of water flowing through the TEG. Per its recommended operating conditions, the TEG nominally generates 83 watts of electrical power. In its default configuration in the built environment, solar-thermal energy serves as the TEG's hot-stream source and geothermal energy serves as its cold-stream source. Two systems-level thermodynamic analyses were performed, based on the TEG's upcoming characterization testing, scheduled for later in 2011 in Detroit, Michigan.
The first analysis considered the TEG coupled with a solar collector system. A numerical model of the coupled system was constructed in order to estimate the system’s annual energetic performance. It was determined numerically that over the course of a sample year, the solar collector system could deliver 39.73 megawatt-hours (MWh) of thermal energy to the TEG. The TEG converted that thermal energy into a net of 266.5 kilowatt-hours of electricity in that year. The second analysis focused on the TEG itself during operation with the purpose of providing a preliminary thermodynamic characterization of the TEG. Using experimental data, this analysis found the TEG’s operating efficiency to be 1.72%.
Next, the annual emissions that would be avoided by implementing the zero-emission TEG were considered. The emission factor of Michigan’s electric grid, RFCM, was calculated to be 0.830 tons of carbon dioxide-equivalent (CO2e) per MWh, and with the TEG’s annual energy output, it was concluded that 0.221 tons CO2e would be avoided each year with the TEG. It is important to note that the TEG can be linearly scaled up by including additional modules. Thus, these benefits can be multiplied through the incorporation of more TEG units.
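The avoided-emissions figure follows directly from the two numbers quoted above; a quick arithmetic check (values taken from the abstract):

```python
# Quick check of the avoided-emissions figure quoted above.
emission_factor = 0.830            # t CO2e per MWh for the RFCM grid region
annual_output_mwh = 266.5 / 1000   # 266.5 kWh of net annual output, in MWh
avoided = emission_factor * annual_output_mwh
print(f"{avoided:.3f} t CO2e avoided per year")  # ~0.221, matching the text
```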
Finally, the levelized cost of electricity (LCOE) of the TEG integrated into the built environment, with the solar-thermal hot source and passive ground-based cold source, was considered. The LCOE of the system was estimated to be approximately $8,404/MWh, which is substantially greater than that of current generation technologies. Note that this calculation was based on one particular configuration with a particular and narrow set of assumptions, and is not intended as a general conclusion about TEG systems. It was concluded that while solar-thermal energy systems can sustain the TEG, they are capital-intensive and therefore not economically suitable for the TEG given the assumptions of this analysis. In the end, because of the large costs associated with the solar-thermal system, waste heat recovery is proposed as a potentially more cost-effective provider of the TEG's hot-stream source.
|