About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Paleoecological analysis of faunal disparity within a constrained horizon of the Monte León Formation, early Miocene, southern Argentina

Crawford, Robert S. January 2007 (has links)
Thesis (M.S.)--Kent State University, 2007. / Title from PDF t.p. (viewed Mar. 19, 2009). Advisor: Rodney Feldmann. Keywords: Monte León Formation, Argentina, mass mortality, Chaceon peruvianus, Miocene, marine ashfall. Includes bibliographical references (p. 94-110).
2

Scale insect infestation of Phragmites australis in the Mississippi River delta, USA: Do fungal microbiomes play a role?

January 2020 (has links)
archives@tulane.edu / 1 / Caitlin Rose Bumby
3

Prototype campaign assessment of disturbance-induced tree loss effects on surface properties for atmospheric modeling

Villegas, Juan Camilo, Law, Darin J., Stark, Scott C., Minor, David M., Breshears, David D., Saleska, Scott R., Swann, Abigail L. S., Garcia, Elizabeth S., Bella, Elizabeth M., Morton, John M., Cobb, Neil S., Barron-Gafford, Greg A., Litvak, Marcy E., Kolb, Thomas E. 03 1900 (has links)
Changes in large-scale vegetation structure triggered by processes such as deforestation, wildfires, and tree die-off alter surface structure, energy balance, and associated albedo, all critical for land surface models. Characterizing these properties usually requires long-term data, precluding characterization of rapid vegetation changes such as those increasingly occurring in the Anthropocene. Consequently, the characterization of rapid events is limited and only possible in a few specific areas. We use a campaign approach to characterize surface properties associated with vegetation structure. In our approach, a profiling LiDAR and hemispherical image analyses quantify vegetation structure, and a portable mast instrumented with a net radiometer, wind-humidity-temperature stations in a vertical profile, and soil temperature and heat flux sensors characterizes surface properties. We illustrate the application of our approach in two forest types (boreal and semiarid) with disturbance-induced tree loss. Our prototype characterizes major structural changes associated with tree loss, changes in vertical wind profiles, surface roughness, energy balance partitioning, a proxy for NDVI (Normalized Difference Vegetation Index), and albedo. Multi-day albedo estimates, which differed between control and disturbed areas, were similar to tower-based multiyear characterizations, highlighting the utility and potential of the campaign approach. Our prototype provides general characterization of surface and boundary-layer properties relevant for land surface models, strategically enabling preliminary characterization of rapid vegetation disturbance events.
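The albedo and NDVI-proxy quantities described above reduce to simple ratios of radiometer measurements. The sketch below assumes a four-component radiometer supplying separate downwelling and upwelling shortwave fluxes, and near-infrared/red reflectances for the NDVI proxy; the function names and numbers are illustrative, not the study's instrumentation specifics:

```python
def albedo(sw_in, sw_out):
    """Shortwave albedo: upwelling / downwelling flux (both in W m^-2)."""
    if sw_in <= 0:
        raise ValueError("downwelling shortwave must be positive")
    return sw_out / sw_in

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from band reflectances."""
    return (nir - red) / (nir + red)

# Illustrative midday readings over an undisturbed forest plot:
print(albedo(800.0, 120.0))  # 0.15
print(ndvi(0.45, 0.05))      # 0.8
```

Disturbed plots with exposed soil would typically show higher albedo and lower NDVI than the control, which is the contrast the campaign approach is designed to capture in days rather than years.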
4

Fate and Transport of Pathogen Indicators from Pasturelands

Soupir, Michelle Lynn 15 April 2008 (has links)
The U.S. EPA has identified pathogen indicators as a leading cause of impairments in rivers and streams in the U.S. Elevated levels of bacteria in streams draining agricultural watersheds cause concern because they indicate the potential presence of pathogenic organisms. Limited understanding of how bacteria survive in the environment and are released from fecal matter and transported along overland flow pathways results in high uncertainty in the design and selection of appropriate best management practices (BMPs) and in the bacterial fate and transport models used to identify sources of pathogens. The overall goal of this study was to improve understanding of the fate and transport mechanisms of two pathogen indicators, E. coli and enterococci, from grazed pasturelands. This goal was addressed by monitoring pathogen indicator concentrations in fresh fecal deposits for an extended period of time. Transport mechanisms of pathogen indicators were examined by developing a method to partition between the attached and unattached phases and then applying this method to analyze runoff samples collected from small box plots and large transport plots. The box plot experiments examined the partitioning of pathogen indicators in runoff from three different soil types, while the transport plot experiments examined partitioning at the edge-of-the-field from well-managed and poorly-managed pasturelands. A variety of techniques have been previously used to assess bacterial attachment to particulates, including filtration, fractional filtration, and centrifugation. In addition, a variety of chemical and physical dispersion techniques are employed to release attached and bioflocculated cells from particulates. This research developed and validated an easy-to-replicate laboratory procedure for separation of unattached from attached E. coli with the ability to identify particle sizes to which indicators preferentially attach.
Testing of physical and chemical dispersion techniques identified a hand shaker treatment for 10 minutes followed by dilutions in 1,000 mg L-1 of Tween-85 as increasing total E. coli concentrations by 31% (P value = 0.0028) and enterococci concentrations by 17% (P value = 0.3425) when compared to a control. Separation of the unattached and attached fractions was achieved by fractional filtration followed by centrifugation. Samples receiving the filtration and centrifugation treatments did not produce statistically different E. coli (P value = 0.97) or enterococci (P value = 0.83) concentrations when compared to a control, indicating that damage was not inflicted upon the cells during the separation procedure. In-field monitoring of E. coli and enterococci re-growth and decay patterns in cowpats applied to pasturelands was conducted during the spring, summer, fall and winter seasons. First order approximations were used to determine die-off rate coefficients and decimal reduction times (D-values). Higher order approximations and weather parameters were evaluated by multiple regression analysis to identify environmental parameters impacting in-field E. coli and enterococci decay. First order kinetics approximated E. coli and enterococci decay rates with regression coefficients ranging from 0.70 to 0.90. Die-off rate constants were greatest in cowpats applied to pasture during late winter and monitored into summer months for E. coli (k = 0.0995 d-1) and applied to the field during the summer and monitored until December for enterococci (k = 0.0978 d-1). Decay rates were lowest in cowpats applied to the pasture during the fall and monitored over the winter (k = 0.0581 d-1 for E. coli and k = 0.0557 d-1 for enterococci). Higher order approximations and the addition of weather variables improved regression coefficients (R2) to values ranging from 0.81 to 0.97. 
Statistically significant variables used in the models for predicting bacterial decay included temperature, solar radiation, rainfall and relative humidity. Attachment of E. coli and enterococci to particulates present in runoff from highly erodible soils was evaluated through the application of rainfall to small box plots containing different soil types. Partitioning varied by indicator and by soil type. In general, enterococci had a higher percent attached to the silty loam (49%) and silty clay loam (43%) soils while E. coli had a higher percent attached to the loamy fine sand soils (43%). At least 50% of all attached E. coli and enterococci were associated with sediment and organic particles ranging from 8 to 62 μm in diameter. Much lower attachment rates were observed from runoff samples collected at the edge-of-the-field, regardless of pastureland management strategy. On average, 4.8% of E. coli and 13% of enterococci were attached to particulates in runoff from well-managed pasturelands. A second transport plot study found that on average only 0.06% of E. coli and 0.98% of enterococci were attached to particulates in runoff from well-managed pasturelands, but percent attachment increased slightly in runoff from poorly-managed pasture with 2.8% of E. coli and 1.23% of enterococci attached to particulates. Equations to predict E. coli and enterococci loading rates in the attached and unattached forms as a function of total suspended solids (TSS), phosphorous and organic carbon loading rates appeared to be a promising tool for improving prediction of bacterial loading rates from grazed pasturelands (R2 values ranged from 0.61 to 0.99). This study provides field-based seasonal die-off rate coefficients and higher order approximations to improve predictions of indicator re-growth and decay patterns.
The transport studies provide partitioning coefficients that can be implemented into NPS models to improve predictions of bacterial concentrations in surface waters and regression equations to predict bacterial partitioning and loading based on TSS and nutrient data. Best management practices to reduce bacterial loadings to the edge-of-the-field from pasturelands (regardless of management strategy) should focus on retention of pathogen indicators moving through overland flow pathways in the unattached state. Settling of particulates prior to release of runoff to surface waters might be an appropriate method of reducing bacterial loadings by as much as 50% from highly erodible soils. / Ph. D.
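The first-order die-off kinetics and decimal reduction times (D-values) used in the abstract above follow standard relationships. Assuming base-e first-order decay with the rate constant k in d-1, as in the rates quoted (e.g., k = 0.0995 d-1), the D-value is ln(10)/k:

```python
import math

def die_off(c0, k, t):
    """First-order decay: concentration after t days, given rate k (1/day)."""
    return c0 * math.exp(-k * t)

def d_value(k):
    """Decimal reduction time: days needed for a 1-log10 (90%) reduction."""
    return math.log(10) / k

# Using the late-winter/summer E. coli rate reported above, k = 0.0995 d-1:
print(round(d_value(0.0995), 1))  # 23.1 days per log10 reduction
```

The same two functions reproduce the contrast in the abstract: the fall/winter rates (k around 0.056 d-1) imply D-values roughly twice as long, i.e., much slower die-off in cowpats over winter.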
5

Effect of Anaerobic Soil Disinfestation on Salmonella Concentration Using Different Soil Amendments

Marik, Claire Margaret 21 May 2020 (has links)
Salmonella has been shown to survive in soils for extended periods. Anaerobic soil disinfestation (ASD) represents a promising alternative to fumigation used to manage soilborne diseases and pests; however, little is known about ASD's impact on Salmonella. The study aimed to compare Salmonella die-off following inoculation in ASD and non-ASD processed soil and compare Salmonella die-off in amended and non-amended soils following ASD. Two independent experiments were arranged in randomized complete block designs (four replications per treatment). Sandy-loam soil was inoculated with a Salmonella cocktail (5.5±0.2 log CFU/g) and amended with field-applicable rates of rye (R), rapeseed (RS), hairy vetch (HV), or pelletized poultry litter (PPL). Non-amended, anaerobic (ANC) and non-amended, aerobic controls (AC) were performed in parallel. Soils were irrigated to saturation and covered with plastic mulch. ASD was terminated by removal of plastic (after 3 weeks). Triplicate soil samples were collected pre-ASD and 0, 1, 2, 3, 7, 10, 14, 21, 28, 31, 35, 38, and 42 d post-ASD. Post-ASD soil was irrigated weekly. Salmonella was quantified using standard methods and a modified MPN enrichment protocol. Concentrations between treatments and time-points were analyzed for significance (P≤0.05). Separate log-linear models were used to examine effect of amendment and irrigation on Salmonella die-off during ASD and post-ASD. Salmonella concentrations decreased in all treatments during ASD with the greatest decrease being observed in ASD and non-ASD controls. Among ASD-processed, amended soil, the rye and rapeseed amendments had the greatest decrease in Salmonella concentrations. Salmonella concentrations decreased by ~1 log between pre-ASD and post-soil saturation (95% Confidence Interval (CI) =-1.31, -0.99), and by approximately 2 logs between pre-ASD and termination of ASD (CI=-2.14, -1.83).
Salmonella concentrations were ~1 log higher in ASD-processed, pelletized poultry litter-amended soil, compared to the ASD control (CI=0.81, 1.26). The average daily die-off rate of Salmonella post-ASD was -0.05 log per g (CI=-0.05, -0.04). Following irrigation, Salmonella concentrations were 0.14 log greater, compared to no irrigation within 7 d (CI=0.05, 0.23). Salmonella serovar distribution differed by treatment, with >70% survival of Newport in pelletized poultry litter. ASD does not eliminate Salmonella concentrations in soil; instead some amendments may enhance Salmonella survival. / Master of Science in Life Sciences / Anaerobic soil disinfestation (ASD) is the process of suppressing soilborne weeds, diseases, and insect pests by creating an anaerobic environment in the soil by incorporating easily decomposable soil amendments, covering with plastic mulch, and irrigating to saturation. The anaerobic soil environment persists for two to six weeks. ASD represents a promising alternative to fumigation used to manage soil-borne diseases and pests. However, little is known about ASD's impact on Salmonella, which has been shown to survive in soils for extended periods of time and can contaminate the edible part of produce. This study aimed to determine if ASD is an effective strategy to reduce or eliminate Salmonella from soils that contain typical amendments added to promote soil health, determine the distribution of Salmonella serovars in ASD treated soils, and examine the impact of irrigation. Two independent experiments were conducted in growth chambers. A common soil type in produce growing regions of the Eastern Shore of VA, sandy-loam soil was inoculated with a five-serovar Salmonella cocktail (~5.5±0.2 log CFU/g) and mixed with field-applicable rates of rye, rapeseed, hairy vetch, or pelletized poultry litter. ASD processed soils were irrigated to field saturation, covered with plastic mulch, and left for 3 weeks. Post-ASD, soils were irrigated weekly.
Salmonella was quantified at least once per week for up to 6 weeks in soil samples collected in triplicate pre-ASD, post-field saturation, and post-ASD. The three-week anaerobic soil environment, created during ASD, allowed for greater Salmonella survival, compared to the aerobic soil environment. Salmonella survival in ASD processed soil was dependent on amendment. Salmonella concentrations decreased in all treatments during ASD with the greatest decrease being observed in ASD and non-ASD controls. Among ASD-processed, amended soil, the rye and rapeseed amendments had the greatest decrease in Salmonella concentrations. Pelletized poultry litter amended soil, in combination with ASD, had the highest Salmonella concentration, while rye had the lowest at each time-point post-ASD. Salmonella serovar distribution differed by treatment with greater survival of S. Poona in rye, S. Braenderup in hairy vetch and S. Newport in pelletized poultry litter. Following irrigation, Salmonella concentrations were 0.14 log greater within 7 d compared to no irrigation. Overall, ASD did not eliminate Salmonella in soils. Compared to non-ASD processed soil, ASD processed soil had greater survival of Salmonella, and the soil amendment used influenced the survival ability. While more research is needed on ASD and different soil amendments, the findings of this research suggest that pelletized poultry litter not be used as an amendment (i.e., carbon source in ASD) for fields used to grow produce eaten raw, especially when soils are potentially contaminated with Salmonella.
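The log-linear models mentioned above fit log10 Salmonella concentration as a straight line in time, so the fitted slope is the daily die-off rate (e.g., the -0.05 log per day reported post-ASD). A minimal sketch of that fit, using invented counts for illustration rather than the study's data:

```python
import numpy as np

# Hypothetical log10 CFU/g counts at weekly sampling days (illustrative only)
days    = np.array([0, 7, 14, 21, 28, 35, 42], dtype=float)
log_cfu = np.array([5.5, 5.2, 4.8, 4.5, 4.1, 3.8, 3.4])

# Log-linear die-off model: log10 C(t) = intercept + slope * t
slope, intercept = np.polyfit(days, log_cfu, 1)
print(f"daily die-off rate: {slope:.3f} log10 per day")  # -0.050
```

Comparing slopes fitted separately per amendment (rye, rapeseed, hairy vetch, pelletized poultry litter) against the controls is what distinguishes amendments that speed die-off from those, like poultry litter here, that appear to prolong survival.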
6

The Effects of Labyrinthula sp. Infection, Salinity, and Light on the Production of Phenolic Compounds in Thalassia testudinum

Sneed, Jennifer M 18 July 2005 (has links)
In the fall of 1987, several areas of Florida Bay were severely affected by the sudden die-off of the seagrass Thalassia testudinum Banks ex Konig (turtle grass). Although the cause is still unknown, several factors were suggested as influencing the on-set of the die-off event, including increased salinity, light stress due to self-shading, and disease. Blades of seagrass plants found in the area of die-off were infected by Labyrinthula sp., a pathogenic protist. A similar die-off occurred in another species of seagrass, Zostera marina, in the 1930s that was attributed to the pathogenic protist Labyrinthula zosterae. Zostera marina produces inhibitory phenolic acids in response to infection by L. zosterae, a response that is diminished in plants exposed to low light and high temperature. This study examined the differences in phenolic content of healthy and infected T. testudinum leaf blades in laboratory cultures to determine if T. testudinum produces a chemical defense against pathogens similar to that of Z. marina. The possible increased susceptibility of turtle grass to Labyrinthula sp. infection under high salinity and low light was also examined. In culture, infection by Labyrinthula sp. induced a rapid, short-term production of total phenolics in Thalassia testudinum under normal, non-stressed conditions. The initial induction was followed by a sharp decline. The production of individual phenolic acids was not induced by infection. In contrast, the production of caffeic acid was inhibited by infection. Environmental stress (low salinity and low light) caused a decrease in both total phenolics and several phenolic acids. Levels of PHBA, vanillic acid, and caffeic acid decreased in low salinity (25 ppt) treatments, and caffeic acid decreased in response to low light stress. There was an interaction between stress and infection that resulted in higher levels of phenolics in plants exposed to infection and stress compared to those exposed to stress alone.
In culture, plants did not survive exposure to high salinity (45 ppt) similar to that found in Florida Bay during the die-off event.
7

Microbial Contamination Assessment with SWAT in a Tile-Drained Rural Watershed

Fall, Claudia 10 June 2011 (has links)
Microbial contamination of drinking water poses an important health risk and can cause severe illness and epidemics. To improve surface and drinking water quality, understanding of fecal pathogen contamination processes, including their prevention and control, needs to be enhanced. The watershed model Soil and Water Assessment Tool (SWAT) is commonly used to simulate the complex hydrological, meteorological, erosion, land management and pollution processes within river basins. In recent years, it has been increasingly applied to simulate microbial contaminant transport at the watershed scale. SWAT is used in this study to simulate Escherichia coli (E. coli) and fecal coliform densities for the agriculturally dominated Payne River Basin in Ontario, Canada. Unprecedentedly extensive monitoring data, consisting of 30 years of daily hydrological data and 5 years of bi-weekly nutrient data, were used to calibrate and validate the model presented here. The calibration and validation of streamflow and nutrients indicate that the model represents these processes well. The model performs well for periods of lower E. coli and fecal coliform loadings; however, the frequency and magnitude of higher microbial loads are not always accurately represented.
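The abstract does not name its goodness-of-fit criterion; the Nash-Sutcliffe efficiency (NSE) is a common choice for judging SWAT streamflow and water-quality calibration, so the sketch below uses it as an assumption, with made-up numbers rather than Payne River data:

```python
def nash_sutcliffe(observed, simulated):
    """NSE = 1 - SSE/SST; 1 is a perfect fit, <= 0 means no better
    than simply predicting the observed mean."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    sst = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - sse / sst

# Illustrative log10 E. coli densities (CFU/100 mL), observed vs simulated:
obs = [2.1, 2.4, 3.0, 2.8, 2.2]
sim = [2.0, 2.5, 2.8, 2.9, 2.3]
print(round(nash_sutcliffe(obs, sim), 2))  # 0.87
```

Because NSE squares the errors, it is dominated by the largest events, which is consistent with the finding above that the model scores well overall yet misses the frequency and magnitude of the highest microbial loads.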
8

Screwbean Mesquite (Prosopis pubescens) Die-off: Population Status at Restored and Unrestored Sites in the Lower Colorado River Watershed

January 2016 (has links)
abstract: Die-off of screwbean mesquite (Prosopis pubescens), a species native to the American Southwest, has been documented regionally within the last decade. Historical causes for episodic mortality of the more widely distributed velvet mesquite (Prosopis velutina) and honey mesquite (Prosopis glandulosa) include water table declines and flood scour. Causes of the recent die-offs of P. pubescens have received little study. Numerous riparian restoration projects have been implemented regionally that include screwbean mesquite. Restoration propagules from foreign sources can introduce diseases, and low genetic diversity plantings may allow for disease irruptions. I asked: 1) Are die-offs associated with a particular age class, 2) Is die-off suggestive of a pathogen or related to specific environmental stressors, 3) Are mortality influences and outcomes the same between restoration and local populations, 4) Are particular land uses and management associated with die-off, and 5) Are populations rebounding or keeping pace with mortality? I documented the screwbean mesquite population status at rivers and wetlands in Arizona with varying levels of restoration. I used logistic regression and Pearson correlation analysis to explore mortality response to site factors and disease related variables. I compared mortality response and disease severity between local and restoration populations. Biotic damage surfaced as the most important factor in statistical analyses, suggesting that mortality was caused by a pathogen. Mortality was greatest for young size classes (3 to 14 cm), and biotic damage was higher for individuals at infrequently flooded areas. Strong differences were not found between local and restoration populations; however, restoration populations were less stressed and had lower biotic damage. Novel urban and restored sites may provide refuge as site conditions at other locations deteriorate.
A culmination of past water diversion, development and land use may be surfacing, rendering riparian species vulnerable to diseases and triggering such events as region-wide die-off. / Dissertation/Thesis / Masters Thesis Plant Biology 2016
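The logistic regression used above to relate mortality to site factors can be sketched as a single-predictor fit of P(dead) against biotic damage. The data and the plain gradient-descent fit below are invented for illustration; the study's actual survey variables and fitting software are not specified here:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(x, y, lr=0.1, epochs=5000):
    """Fit P(dead) = sigmoid(b0 + b1*x) by averaged gradient descent."""
    b0 = b1 = 0.0
    n = len(x)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            err = sigmoid(b0 + b1 * xi) - yi  # prediction error per tree
            g0 += err
            g1 += err * xi
        b0 -= lr * g0 / n
        b1 -= lr * g1 / n
    return b0, b1

# Illustrative data: biotic damage score (0-1) vs. tree death (1 = dead)
damage = [0.1, 0.2, 0.3, 0.5, 0.6, 0.8, 0.9]
dead   = [0,   0,   0,   1,   0,   1,   1]
b0, b1 = fit_logistic(damage, dead)
print(b1 > 0)  # True: higher biotic damage raises the odds of mortality
```

A positive slope b1 (equivalently, an odds ratio exp(b1) > 1) is the pattern consistent with the finding that biotic damage was the most important predictor of die-off.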
