391

Methods for the spatial modeling and evaluation of tree canopy cover

Datsko, Jill Marie 24 May 2022 (has links)
Tree canopy cover is an essential measure of forest health and productivity, which is widely studied due to its relevance to many disciplines. For example, declining tree canopy cover can be an indicator of declining forest health, insect infestation, or disease. This dissertation consists of three studies, focused on the spatial modeling and evaluation of tree canopy cover, drawing on recent developments and best practices in the fields of remote sensing, data collection, and statistical analysis.

The first study evaluates how well harmonic regression variables derived at the pixel level from a time series of all available Landsat images predict values of tree canopy cover. Harmonic regression approximates the reflectance curve of a given band across time, so the coefficients estimated by the harmonic regression model relate to the phenology of the area covered by each pixel. We use a time series of all available cloud-free observations in each Landsat pixel for the NDVI, SWIR1, and SWIR2 bands to obtain harmonic regression coefficients for each variable, and then use those coefficients to estimate tree canopy cover at two discrete points in time. This study compares models estimated using these harmonic regression coefficients to those estimated using Landsat median composite imagery, and to combined models. We show that (1) harmonic regression models that use a single harmonic provided the best quality models, (2) harmonic regression coefficients from Landsat-derived NDVI, SWIR1, and SWIR2 bands improve the quality of tree canopy cover models when added to the full suite of median composite variables, (3) the harmonic regression constant for the NDVI time series is an important variable across models, and (4) there is little to no additional information in the full suite of predictors compared to the harmonic regression coefficients alone, based on the information criterion provided by principal components analysis. The second study evaluates the use of crowdsourcing with Amazon's Mechanical Turk platform to obtain photointerpreted tree canopy cover data. We collected multiple interpretations at each plot from both crowd and expert interpreters, sampled these data in a Monte Carlo framework to estimate a classification model predicting the "reliability" of each crowd interpretation using expert interpretations as a benchmark, and identified the most important variables in estimating this reliability. The results show low agreement between crowd and expert groups, as well as between individual experts. We found that variables related to fatigue had the most bearing on the "reliability" of crowd interpretations, followed by whether the interpreter used false color or natural color composite imagery during interpretation. Recommendations for further study and future implementations of crowdsourced photointerpretation are also provided. In the final study, we explored sampling methods for the purpose of model validation. We evaluated a method of stratified random sampling with optimal allocation, based on measures of prediction uncertainty derived from random forest regression models, by comparing the accuracy and precision of estimates from samples drawn using this method to estimates from samples drawn using other common sampling protocols, with three large, simulated datasets as case studies. We further tested the effect of reduced sample sizes on one of these datasets and demonstrated a method to report the accuracy of continuous models for domains that are either regionally constrained or numerically defined based on other variables or the modeled quantity itself. We show that stratified random sampling with optimal allocation provides the most precise estimates of the mean of the reference Y and the RMSE of the population. We also demonstrate that all sampling methods provide reasonably accurate estimates on average. Additionally, we show that, as sample sizes are increased with each sampling method, the precision generally increases, eventually reaching a level of convergence where gains in estimate precision from adding additional samples would be marginal. / Doctor of Philosophy / Tree canopy cover is an essential measure of forest health, which is widely studied due to its relevance to many disciplines. For example, declining tree canopy cover can be an indicator of declining forest health, insect infestation, or disease. This dissertation consists of three studies, focused on the spatial modeling and evaluation of tree canopy cover, drawing on recent developments and best practices in the fields of remote sensing, data collection, and statistical analysis. The first study evaluates the utility of harmonic regression coefficients from time-series satellite imagery, which describe the timing and magnitude of green-up and leaf loss at each location, for estimating tree canopy cover. This study compares models estimated using these harmonic regression coefficients to those estimated using median composite imagery, which takes the median reflectance value over time at each location, and to models that used both types of variables. We show that (1) harmonic regression coefficients that use a simplified formula provided higher quality models compared to more complex alternatives, (2) harmonic regression coefficients improved the quality of tree canopy cover models when added to the full suite of median composite variables, (3) the harmonic regression constant, the coefficient that captures the average reflectance over time in the time-series vegetation index data, is an important variable across models, and (4) there is little to no additional information in the full suite of predictors compared to the harmonic regression coefficients alone.

The second study evaluates the use of crowdsourcing, which engages non-experts in paid online tasks, with Amazon's Mechanical Turk platform to obtain tree canopy cover data interpreted from aerial images. We collected multiple interpretations at each location from both crowd and expert interpreters, sampled these data in a repeated sampling framework to estimate a classification model predicting the "reliability" of each crowd interpretation using expert interpretations as a benchmark, and identified the most important variables in estimating this "reliability". The results show low agreement between crowd and expert groups, as well as between individual experts. We found that variables related to fatigue had the most bearing on the reliability of crowd interpretations, followed by variables related to the display settings used to view imagery during interpretation. Recommendations for further study and future implementations of crowdsourced photointerpretation are also provided. In the final study, we explored sampling methods for the purpose of model validation. We evaluated stratified random sampling with optimal allocation, a sampling method specifically designed to improve the precision of sample estimates, using measures of prediction uncertainty that describe the variability in predictions from different models in an ensemble of regression models. We compared the accuracy and precision of estimates from samples drawn using this method to estimates from samples drawn using other common sampling protocols, using three large, mathematically simulated data products as case studies. We further tested the effect of smaller sample sizes on one of these data products and demonstrated a method to report the accuracy of continuous models for different land cover classes and for classes defined using 10% tree canopy cover intervals. We show that stratified random sampling with optimal allocation provides the most precise sample estimates. We also demonstrate that all sampling methods provide reasonably accurate estimates on average, and we show that, as sample sizes are increased with each sampling method, the precision generally increases, eventually leveling off where gains in estimate precision from adding additional samples would be marginal.
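A minimal sketch of the kind of per-pixel harmonic fit the first study describes, assuming a single first-order harmonic on decimal-year timestamps; the function name, the synthetic NDVI series, and the use of NumPy least squares are illustrative assumptions, not the dissertation's code.

```python
# Sketch: first-order harmonic regression fitted per pixel to a time series of
# cloud-free Landsat NDVI observations (illustrative, not the thesis's code).
import numpy as np

def harmonic_coefficients(t_years, ndvi):
    """Fit NDVI(t) ~ c0 + a1*cos(2*pi*t) + b1*sin(2*pi*t) by least squares.

    t_years : decimal year of each cloud-free observation
    ndvi    : NDVI value at each observation
    Returns [c0, a1, b1]; c0 is the harmonic "constant" (mean level), while
    a1 and b1 encode the timing and amplitude of the seasonal cycle.
    """
    t = np.asarray(t_years, dtype=float)
    X = np.column_stack([np.ones_like(t),
                         np.cos(2 * np.pi * t),
                         np.sin(2 * np.pi * t)])
    coef, *_ = np.linalg.lstsq(X, np.asarray(ndvi, dtype=float), rcond=None)
    return coef

# Example: a synthetic pixel with a summer green-up peak plus noise.
rng = np.random.default_rng(0)
t_obs = np.sort(rng.uniform(2015.0, 2020.0, size=80))
ndvi_obs = 0.55 + 0.25 * np.sin(2 * np.pi * (t_obs - 0.25)) + rng.normal(0, 0.03, t_obs.size)
c0, a1, b1 = harmonic_coefficients(t_obs, ndvi_obs)
print(c0, a1, b1)  # per-pixel predictors for a canopy-cover regression model
```

The fitted constant plays the role of the NDVI harmonic regression constant highlighted as an important predictor, while the cosine and sine terms summarize the phenology at that pixel.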
392

Lead and Copper Contamination in Potable Water: Impacts of Redox Gradients, Water Age, Water Main Pipe Materials and Temperature

Masters, Sheldon 06 May 2015 (has links)
Potable water can become contaminated with lead and copper due to the corrosion of pipes, faucets, and fixtures. The US Environmental Protection Agency Lead and Copper Rule (LCR) is intended to target sampling at high-risk sites to help protect public health by minimizing lead and copper levels in drinking water. The LCR is currently under revision with a goal of better crafting sampling protocols to protect public health. This study examined an array of factors that determine the location and timing of "high-risk" in the context of sampling site selection and consumer health risks. This was done using field studies and well-controlled laboratory experiments. A pilot-scale simulated distribution system (SDS) was used to examine the complex relationship between disinfectant type (free chlorine and chloramine), water age (0-10.2 days), and pipe main material (PVC, cement, and iron). Redox gradients developed in the distribution system as controlled by water age and pipe material, which affected the microbiology and chemistry of the water delivered to consumer homes. Free chlorine disinfectant was the most stable in the presence of PVC while chloramine was most stable in the presence of cement. At shorter water ages where disinfectant residuals were present, chlorine tended to cause as much as 4 times more iron corrosion when compared to chloramine. However, the worst localized attack on iron materials occurred at high water age in the system with chloramine. It was hypothesized that this was due to denitrification-a phenomenon relatively unexplored in drinking water distribution systems and documented in this study. Cumulative chemical and biological changes, such as those documented in the study described above, can create "high-risk" hotspots for elevated lead and copper, with associated concerns for consumer exposure and regulatory monitoring. In both laboratory and field studies, trends in lead and copper release were site-specific and ultimately determined by the plumbing material, microbiology and chemistry. In many cases, elevated levels of lead and copper did not co-occur suggesting that, in a revised LCR, these contaminants will have to be sampled separately in order to identify worst case conditions. Temperature was also examined as a potentially important factor in lead and copper corrosion. Several studies have attributed higher incidence of childhood lead poisoning during the summer to increased soil and dust exposure; however, drinking water may also be a significant contributing factor. In large-scale pipe rigs, total and dissolved lead release was 3-5 times higher during the summer compared to the winter. However, in bench scale studies, higher temperature could increase, decrease, or have no effect on lead release dependent on material and water chemistry. Similarly, in a distribution system served by a centralized treatment plant, lead release from pure lead service lines increased with temperature in some homes but had no correlation in other homes. It is possible that changes throughout the distribution system such as disinfectant residual, iron, or other factors can create scales on pipes at individual homes, which determines the temperature dependency of lead release. Consumer exposure to lead can also be adversely influenced by the presence of particulate iron. 
In the case of Providence, RI, a well-intentioned decrease in the finished water pH from 10.3 to 9.7 resulted in an epidemic of red water complaints due to the corrosion of iron mains and a concomitant increase in water lead levels. Complementary bench scale and field studies demonstrated that higher iron in water is sometimes linked to higher lead in water, due to sorption of lead onto the iron particulates. Finally, one of the most significant emerging challenges associated with evaluating corrosion control and consumer exposure is the variability in lead and copper during sampling due to semi-random detachment of lead particles into water, which can pose an acute health concern. Well-controlled test rigs were used to characterize the variability in lead and copper release, which was compared to consumer sampling under the LCR. The variability due to semi-random particulate detachment is equal to the typical variability observed in LCR sampling, suggesting that this inherent variability is much more important than other common sources, including customer error, failure to follow sampling instructions, or long stagnation times. While instructing consumers to collect samples at low flow rates reduces variability, it will fail to detect elevated lead from many hazardous taps. Moreover, collecting a single sample to characterize health risks from a given tap is not adequately protective of consumers in homes with lead plumbing, in an era when corrosion control has reduced the presence of soluble lead in water. Future EPA monitoring and public education should be changed to address this concern. / Ph. D.
393

Efficient Community Detection for Large Scale Networks via Sub-sampling

Bellam, Venkata Pavan Kumar 18 January 2018 (has links)
Many real-world systems can be represented as network graphs. Some of these networks have an inherent community structure based on interactions. The problem of identifying this grouping structure given a graph is termed the community detection problem, for which several algorithms exist. This thesis contributes specific improvements to community detection algorithms such as spectral clustering and the extreme points algorithm. One of the main contributions is a new sub-sampling method that makes the existing spectral clustering method scalable by reducing its computational complexity. We have also implemented the extreme points algorithm for the general case of detecting multiple communities, along with a sub-sampling based version to reduce the computational complexity. In addition, we have developed a spectral clustering algorithm for graphs based on the popularity-adjusted block model (PABM) that makes the method exact, thus improving its accuracy. / Master of Science
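A hedged sketch of the sub-sampling idea applied to spectral clustering: cluster only a random subset of nodes, then assign the remaining nodes by majority vote over their sub-sampled neighbors. The assignment rule, the sample size, and the use of scikit-learn's KMeans are assumptions for illustration, not the thesis's exact algorithm.

```python
# Sub-sampled spectral clustering sketch (assumed scheme, not the thesis's).
import numpy as np
from sklearn.cluster import KMeans

def spectral_subsample(A, k, m, seed=0):
    """A: dense adjacency matrix (n x n), k: communities, m: subsample size."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    idx = rng.choice(n, size=m, replace=False)            # sub-sampled nodes
    A_sub = A[np.ix_(idx, idx)]
    deg = A_sub.sum(axis=1)
    deg[deg == 0] = 1.0
    L = np.diag(deg ** -0.5) @ A_sub @ np.diag(deg ** -0.5)  # normalized affinity
    vals, vecs = np.linalg.eigh(L)
    U = vecs[:, -k:]                                       # top-k eigenvectors
    sub_labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(U)

    labels = np.full(n, -1)
    labels[idx] = sub_labels
    rest = np.setdiff1d(np.arange(n), idx)
    for v in rest:                                         # neighbor majority vote
        votes = labels[idx][A[v, idx] > 0]
        labels[v] = np.bincount(votes, minlength=k).argmax() if votes.size else rng.integers(k)
    return labels

# Tiny planted two-block example just to exercise the function.
n = 200
z = np.repeat([0, 1], n // 2)
P = np.where(z[:, None] == z[None, :], 0.15, 0.02)
A = (np.random.default_rng(1).random((n, n)) < P).astype(float)
A = np.triu(A, 1); A = A + A.T
print(spectral_subsample(A, k=2, m=80)[:10])
```

Only the m-node eigendecomposition is expensive here, which is where the reduction in computational complexity comes from under this assumed scheme.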
394

Adhesive areal sampling of gravel bed streams

Fripp, Jon Brooks 05 September 2009 (has links)
The characteristics of a given stream or river are linked to the material that makes up its channel bed. Usually, a vertical stratification by particle size can be recognized. The presence of a coarser surface layer is considered to be one of the most important features of a gravel bed stream. Since this surface layer consists of a distinct population of material, it is necessary to be able to separate it from the underlying material and quantify it distinctly. This is done through surface sampling. Two of the most common adhesive areal sampling techniques, and the subject of the present work, are known as clay and wax sampling. If the material obtained in an areal sample is analyzed as a frequency distribution by weight, it has been shown that the size distribution is biased in favor of the larger particles when compared to the results of a bulk sample. The present research shows that this bias depends not only on the sampling method used to remove the material but also on the size distribution of the sample itself. Not only are the raw results of areal samples not comparable with volumetric samples, they are not comparable with other areal samples. Before any comparisons are made among areal samples, it is recommended that the size distribution of each areal sample first be converted into the size distribution that would have resulted from an equivalent volumetric sample. The features and limitations of the gravel simulation model used to obtain the necessary conversion formula are also the subject of the present work. In addition, the conversion of both matrix- and framework-supported gravel mixtures that have been areally sampled with either clay or wax is addressed. Finally, criteria for approximating the minimum depth required for a volumetric sample are presented. / Master of Science
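For orientation, a simple sketch of the classical power-law style areal-to-volumetric conversion: re-weight each size fraction by grain diameter raised to an exponent and renormalize. The fixed exponent below is the textbook voidless-cube assumption; this abstract argues that the proper conversion depends on the sampling method (clay vs. wax) and on the sample's own size distribution, so treat the exponent as an assumption rather than the thesis's conversion formula.

```python
# Illustrative areal-to-volumetric grain-size conversion (assumed exponent).
import numpy as np

def areal_to_volumetric(mid_sizes_mm, areal_weight_frac, x=-1.0):
    """Re-weight each size fraction by D**x and renormalize to fractions."""
    w = np.asarray(areal_weight_frac, dtype=float) * np.asarray(mid_sizes_mm, dtype=float) ** x
    return w / w.sum()

# Example: weight fractions from a hypothetical clay areal sample.
mids = np.array([4, 8, 16, 32, 64])             # size-class midpoints, mm
areal = np.array([0.05, 0.15, 0.25, 0.30, 0.25])
print(areal_to_volumetric(mids, areal))          # coarse fractions shrink, fines grow
```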
395

On Grouped Observation Level Interaction and a Big Data Monte Carlo Sampling Algorithm

Hu, Xinran 26 January 2015 (has links)
Big Data is transforming the way we live. From medical care to social networks, data is playing a central role in various applications. As the volume and dimensionality of datasets keeps growing, designing effective data analytics algorithms emerges as an important research topic in statistics. In this dissertation, I will summarize our research on two data analytics algorithms: a visual analytics algorithm named Grouped Observation Level Interaction with Multidimensional Scaling and a big data Monte Carlo sampling algorithm named Batched Permutation Sampler. These two algorithms are designed to enhance the capability of generating meaningful insights and utilizing massive datasets, respectively. / Ph. D.
396

A variable sampling interval chart for a combined statistic

Rao, Naresh Krishna January 1988 (has links)
This thesis extends the work on variable sampling interval (VSI) charts for monitoring a single parameter. An attempt is made to develop a chart that can simultaneously monitor both the process mean and the process variance. The chart is based on a statistic that combines the mean and the variance. After developing such a chart, variable sampling intervals are introduced, and the chart is evaluated against alternative methods of monitoring the mean and variance with variable sampling intervals. The statistic chosen is an approximate statistic, and simulation studies are performed for the evaluation. The results are at times counter-intuitive; thus, an analysis of the properties of the chart is made and explanations are provided. / Master of Science
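A generic sketch of how a VSI rule can be attached to one combined mean/variance statistic: sample again sooner when the statistic lands in a warning region, later when it is comfortably in control. The particular statistic (squared standardized mean plus a chi-square variance term) and all limits and intervals below are illustrative assumptions, not the thesis's approximate statistic.

```python
# VSI scheme around a combined mean/variance statistic (illustrative only).
import numpy as np

MU0, SIGMA0, N = 0.0, 1.0, 5            # in-control parameters, subgroup size
UCL, WARN = 16.0, 6.0                    # control and warning limits (assumed)
D_SHORT, D_LONG = 0.25, 2.0              # sampling intervals, hours (assumed)

def combined_statistic(sample):
    """Large values indicate a shift in the mean, the variance, or both."""
    xbar, s2 = np.mean(sample), np.var(sample, ddof=1)
    z2 = N * (xbar - MU0) ** 2 / SIGMA0 ** 2        # ~ chi-square(1) in control
    w = (N - 1) * s2 / SIGMA0 ** 2                  # ~ chi-square(N-1) in control
    return z2 + w

def next_interval(stat):
    """Sample again soon if the statistic falls in the warning region."""
    if stat > UCL:
        return None                                  # out-of-control signal
    return D_SHORT if stat > WARN else D_LONG

rng = np.random.default_rng(2)
sample = rng.normal(MU0, SIGMA0, N)
q = combined_statistic(sample)
print(q, next_interval(q))
```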
397

Potential for City Parks to Reduce Exposure to Hazardous Air Pollutants

Milazzo, Michael J. 21 May 2018 (has links)
Benzene, toluene, ethylbenzene, and xylenes (BTEX) are hazardous air pollutants commonly found in outdoor air. Several studies have explored the potential of vegetation to mitigate BTEX in outdoor air, but they are limited to a northern temperate climate and present conflicting results. To investigate this issue in a subtropical climate, we deployed passive air samplers for two weeks in parks and nearby residences at four locations: three in an urban area and one in a rural area in Alabama, USA. All BTEX concentrations were below health-based guidelines and were comparable to those found in several other studies in populated settings. Concentrations of TEX, but not benzene, were 3-39% lower in parks than at nearby residences, and the differences were significant. In and around two of the parks, toluene:benzene ratios fell outside the range expected for vehicular emissions (p<0.01), suggesting that there are additional, industrial sources of benzene near these two locations. The ratio of m-,p-xylene:ethylbenzene was high at all locations except one residential area, indicating that BTEX were freshly emitted. Concentrations of individual BTEX compounds were highly correlated with each other in most cases, except for locations that may be impacted by nearby industrial sources of benzene. Results of this study suggest that parks can help reduce BTEX exposure by a modest amount, but future research is needed to ascertain this potential through more measurements at higher spatial and temporal resolution and analysis of vegetation for evidence of uptake of BTEX. / Master of Science
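A small sketch of the ratio diagnostics described above: flag locations whose toluene:benzene ratio falls outside a range typical of vehicle exhaust (suggesting an additional benzene source) and use the m-,p-xylene:ethylbenzene ratio as a rough indicator of freshly emitted BTEX. The numeric cutoffs are illustrative assumptions, not values taken from the study.

```python
# BTEX source/freshness ratio checks (cutoffs are assumed, not from the study).
def btex_flags(benzene, toluene, ethylbenzene, mp_xylene,
               tb_vehicular=(1.5, 4.0), xe_fresh=2.5):
    tb = toluene / benzene
    xe = mp_xylene / ethylbenzene
    return {
        "toluene_benzene": round(tb, 2),
        "nonvehicular_benzene_suspected": not (tb_vehicular[0] <= tb <= tb_vehicular[1]),
        "mp_xylene_ethylbenzene": round(xe, 2),
        "freshly_emitted": xe >= xe_fresh,
    }

# Hypothetical two-week passive-sampler concentrations (ug/m3).
print(btex_flags(benzene=0.9, toluene=1.1, ethylbenzene=0.3, mp_xylene=1.0))
```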
398

Development of an Autonomous Unmanned Aerial Vehicle for Aerobiological Sampling

Dingus, Benjamin Ross 25 May 2007 (has links)
The ability to detect, monitor, and forecast the movement of airborne plant pathogens in agricultural ecosystems is essential for developing rational approaches to managing these habitats. We developed an autonomous (self-controlling) unmanned aerial vehicle (UAV) platform for aerobiological sampling tens to hundreds of meters above agricultural fields. Autonomous UAVs have the potential to extend the range of aerobiological sampling, improve positional accuracy of sampling paths, and enable coordinated flight with multiple aircraft at different altitudes. We equipped a Senior Telemaster model airplane with two spore-sampling devices and a MicroPilot autonomous system, and we conducted over 60 autonomous microbe-sampling flights at Virginia Tech's Kentland Farm. To determine the most appropriate sampling path for aerobiological sampling, we explored a variety of different sampling patterns for our autonomous UAVs including multiple GPS waypoints plotted over a variety of spatial scales. We conducted a total of 25 autonomous aerobiological sampling flights for five different aerobiological sampling patterns. The pattern of a single waypoint exhibited the best flight characteristics with good positional accuracy and standard deviations in altitude from 1.6 to 2.8 meters. The four point pattern configured as a rectangle also demonstrated good flight characteristics and altitude standard deviations from 1.6 to 4.7 meters. / Master of Science
399

Association of Campylobacter spp. Levels between Chicken Grow-Out Environmental Samples and Processed Carcasses

Schroeder, Matthew William 31 May 2012 (has links)
Campylobacter spp. have been isolated from live poultry, the production environment, processing facilities, and raw poultry products. The detection of Campylobacter using both quantitative and qualitative techniques would provide a more accurate assessment of pre- or post-harvest contamination. Environmental sampling in a poultry grow-out house, combined with carcass rinse sampling from the same flock, may provide a relative assessment of Campylobacter contamination and transmission. Air samples, fecal/litter samples, and feed pan/drink line samples were collected from four commercial chicken grow-out houses. Birds from the sampled house were the first flock slaughtered the following day and were sampled by post-chill carcass rinses. Quantitative (direct plating) and qualitative (direct plating after an enrichment step) detection methods were used to determine Campylobacter contamination in each environmental sample and carcass rinse. Campylobacter was detected, from post-enrichment samples, in 27% (32/120) of house environmental samples and 37.5% (45/120) of carcass rinse samples. All sample types from each house included at least one positive sample except the house 2 air samples. Samples from house 1 and the associated carcass rinses accounted for the highest total of Campylobacter positives (29/60). The fewest Campylobacter positives, based on both house environmental samples (4/30) and carcass rinse samples (8/30), were detected from flock B. Environmental sampling techniques provide a non-invasive and efficient way to test for foodborne pathogens. Correlating qualitative or quantitative Campylobacter levels from house and plant samples may enable the scheduled processing of flocks with lower pathogen incidence or concentrations, as a way to reduce post-slaughter pathogen transmission. / Master of Science in Life Sciences
400

Determination of a novel mine tracer gas and development of a methodology for sampling and analysis of multiple mine tracer gases for characterization of ventilation systems

Patterson, Rosemary Rita 29 April 2011 (has links)
Ventilation in underground mines is vital to creating a safe working environment. Though there have been numerous improvements in mine ventilation, it is still difficult to ascertain the state of the ventilation system following a disaster in which ventilation controls have been potentially damaged. This information is important when making the decision to send rescue personnel into the mine. By utilizing tracer gas techniques, which are powerful techniques for monitoring ventilation systems, especially in remote or inaccessible areas, the state of the ventilation system immediately following a mine emergency can be ascertained more rapidly. However, the success of this technique is largely dependent on the accuracy of release and sampling methods. Therefore, an analysis of sampling methods is crucial for rapid response and dependable results during emergencies. This research project involves evaluating and comparing four well-accepted sampling techniques currently utilized in the mining industry using sulfur hexafluoride, an industry standard, as the tracer gas. Additionally, Solid Phase Microextraction (SPME) fibers are introduced and evaluated as an alternative sampling means. Current sampling methods include plastic syringes, glass syringes, Tedlar bags, and vacutainers. SPME fibers have been successfully used in a variety of industries from forensics to environmental sampling and are a solvent-less method of sampling analytes. To analyze these sampling methods, samples were taken from a 0.01% standard mixture of SF6 in nitrogen and analyzed using electron capture gas chromatography (GC). The technical and practical issues surrounding each sampling method were also observed and discussed. Furthermore, the use of multiple tracer gases could allow for rapid assessment of the functionality of ventilation controls. This paper describes experimentation related to the determination of a novel mine tracer gas. Multiple tracer gases greatly increase the level of flexibility when conducting ventilation surveys to establish and monitor controls. A second tracer would substantially reduce the time it takes to administer multiple surveys, since it is not necessary to wait for the first tracer to flush out of the mine, which can take up to a few days. Additionally, it is possible to release different tracers at different points and follow their respective airflow paths, analyzing multiple or complex circuits. This would be impossible to do simultaneously with only one tracer. Three different tracer gases, carbon tetrafluoride, octafluoropropane, and perfluoromethylcyclohexane, were selected and evaluated on various GC columns using different gas chromatographic protocols. Perfluoromethylcyclohexane was selected as the novel tracer, and a final protocol was established that ensured adequate separation of a mixture of SF6 and perfluoromethylcyclohexane. Since there is limited literature comparing sampling techniques in the mining industry, the findings and conclusions gained from the sampling comparison study provide a benchmark for establishing optimal sampling practices for tracer gas techniques. Additionally, the determination of a novel tracer gas that can be used with and separated from SF6 using the same analytical method increases the practicality and robustness of multiple mine tracer gas techniques.
This initial work will contribute to the larger project goal of determining a methodology for the remote characterization of mine ventilation systems utilizing multiple mine tracer gases and computational fluid dynamics (CFD). This will be completed in several phases, including initial laboratory testing of novel tracer gases in a model mine apparatus, development of a methodology for releasing and sampling tracers, modeling of a mine ventilation plan and tracer gas dispersion in CFD, and eventually field trials to validate and enhance the multiple tracer gas methodology. / Master of Science
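For context, the standard constant-injection tracer relationship that underlies this kind of ventilation survey is Q = q / C_ss, where q is the tracer release rate and C_ss the steady-state concentration measured downstream. The short sketch below is a textbook calculation included for orientation, not code or data from the thesis; the flow rate and concentration are hypothetical.

```python
# Constant-injection tracer dilution: airflow = release rate / steady-state
# concentration. Values below are hypothetical, for illustration only.
def airflow_from_tracer(release_rate_m3_per_s, concentration_ppb):
    """Return airway airflow in m^3/s from a steady constant-rate release."""
    c_fraction = concentration_ppb * 1e-9           # ppb -> volume fraction
    return release_rate_m3_per_s / c_fraction

# Example: 0.5 L/min of SF6 released, 120 ppb measured at steady state.
q = 0.5 / 1000 / 60                                  # L/min -> m^3/s
print(round(airflow_from_tracer(q, 120.0), 1), "m^3/s")
```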
