
Determining the Sustainability of Coal Mine Cavity Discharge as a Drinking Water Source

Anderson, Eric T. 14 April 1999 (has links)
In southwestern Virginia, adequate sources of public water for small, isolated communities are difficult to find. Although many alternatives exist, one of the largest sources of water in this region is flooded abandoned coal mines. One such coal mine aquifer, in Dickenson County, Virginia, was chosen for a sustainability study. A flowrate monitoring system was installed at the point of discharge from the mine, and three months of flow records were analyzed. The recording period included one of the driest stretches in recent years, so the flowrate data provided useful information on the sustainability of the system. A study of the geology and groundwater flow patterns in the region showed that a coal mine aquifer closely resembles the extremely heterogeneous systems seen in karst landscapes, so techniques common to karst phenomena were used to analyze the spring hydrograph. A spring recession analysis was performed on five storm recessions, and the coefficients for each recession were compared and discussed in light of known geologic information. The recession coefficients described the flow from the mine well, and the mine's response to a rainfall pulse was very similar to that of certain types of karst aquifers. This information was used to predict a sustainable flow from the mine. A cross-correlation analysis was performed both to fit a "black box" model to the flow data and to verify the results of the spring recession analysis. The correlation analysis showed that a single rainfall event produced several separate responses in the flowrate at the mine discharge point, strengthening the conclusions of the recession analysis. The flow record proved too short to support a statistical model, but a procedure was described that could be used to model flows once a longer record is available. / Master of Science
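The spring recession analysis described above can be sketched numerically. Assuming the Maillet exponential model Q(t) = Q0·e^(-a·t) that is standard in karst hydrograph analysis, the recession coefficient a can be recovered by log-linear regression on a single recession limb. The data below are synthetic and illustrative, not the thesis's mine discharge record.

```python
import numpy as np

def recession_coefficient(t, q):
    """Fit Q(t) = Q0 * exp(-a * t) to one recession limb by linear
    regression on log-transformed discharge; returns (a, Q0)."""
    t = np.asarray(t, dtype=float)
    log_q = np.log(np.asarray(q, dtype=float))
    # Slope of log Q versus t is -a; intercept is log Q0.
    slope, intercept = np.polyfit(t, log_q, 1)
    return -slope, np.exp(intercept)

# Synthetic recession limb: Q0 = 50 L/s, a = 0.08 per day (assumed values)
t_days = np.arange(0, 30)
q_obs = 50.0 * np.exp(-0.08 * t_days)

alpha, q0 = recession_coefficient(t_days, q_obs)
```

Comparing fitted coefficients across several storm recessions, as the thesis does, would then indicate whether the aquifer drains like a single homogeneous store or like the multi-compartment systems typical of karst.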

Assessing biofilm development in drinking water distribution systems by Machine Learning methods

Ramos Martínez, Eva 02 May 2016 (has links)
[EN] One of the main challenges for drinking water utilities is to ensure a high-quality supply, particularly in chemical and microbiological terms. However, biofilms invariably develop in all drinking water distribution systems (DWDSs), despite the presence of residual disinfectant, so water utilities cannot ensure total bacteriological control. Biofilms currently represent a real paradigm in water quality management for all DWDSs. Biofilms are complex communities of microorganisms bound by an extracellular polymer that provides them with structure, protects them from toxic agents, and helps them retain nutrients. Besides the health risk biofilms pose as a shelter for pathogens, a number of additional problems are associated with biofilm development in DWDSs; among others, aesthetic deterioration of the water, biocorrosion, and disinfectant decay are universally recognized. A large amount of research has been conducted in this field since the early 1980s; however, because of the complexity of the environment and the community studied, most studies have relied on simplifying assumptions. We draw on this prior work and the accumulated knowledge of biofilm growth in DWDSs to change the usual approach of such studies. Our proposal is based on intensive preprocessing followed by analysis with Machine Learning methods. A multi-disciplinary procedure is undertaken as a practical route to a decision-making tool that helps DWDS managers keep biofilm at the lowest possible level and mitigate its negative effects on the service. A methodology is proposed to detect the areas of a DWDS most susceptible to biofilm development. Knowing the location of these hot-spots in the network, mitigation actions could be targeted more precisely, saving resources and money. Prevention programs could also be developed, acting before the consequences of biofilm are noticed by consumers.
In this way, the economic cost would be reduced and the service quality would improve, eventually increasing consumers' satisfaction. / Ramos Martínez, E. (2016). Assessing biofilm development in drinking water distribution systems by Machine Learning methods [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/63257
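The hot-spot detection methodology summarized above could, in spirit, be sketched as a supervised classifier scoring each pipe of the network by its susceptibility to biofilm development. The feature names, the toy labeling rule, and the choice of a random forest below are illustrative assumptions, not the thesis's actual data or model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical per-pipe features: water age (h), residual chlorine (mg/L),
# pipe age (yr). Ranges are invented for illustration.
X = rng.random((200, 3)) * np.array([120.0, 1.0, 60.0])

# Toy ground truth: long water age combined with low residual disinfectant
# is labeled biofilm-prone (a deliberate simplification).
y = ((X[:, 0] > 60.0) & (X[:, 1] < 0.3)).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
scores = model.predict_proba(X)[:, 1]       # susceptibility score per pipe
hot_spots = np.argsort(scores)[::-1][:10]   # ten most susceptible pipes
```

Ranking pipes by such a score is what would let a utility focus mitigation and prevention programs on the likely hot-spots rather than the whole network.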

Opportunistic Pathogens and the Brain-eating Amoeba, Naegleria fowleri in Reclaimed Water, Municipal Drinking Water, and Private Well Water

Strom, Laurel Elisabeth 13 October 2017 (has links)
Opportunistic pathogens (OPs) are of special concern for immunocompromised populations and are known to grow in both drinking water and reclaimed water (i.e., non-potable recycled water) distribution systems, with aerosol inhalation and other non-ingestion exposure routes that are not addressed by existing regulatory frameworks. The factors enabling the growth of OPs in main distribution and premise (i.e., building) plumbing systems carrying reclaimed and other water sources are poorly understood, especially for the emerging OP Naegleria fowleri (the 'brain-eating amoeba'). Three phases of investigation were carried out to identify factors that facilitate the growth of OPs in main distribution and premise plumbing systems, with particular attention to reclaimed water systems, aging water mains, and private well systems. Phase one examined the roles of biological treatment to remove organic carbon and of disinfectant type in the occurrence of OPs during distribution of reclaimed water. Laboratory-scale simulated reclaimed water distribution systems were employed to systematically examine the effects of prior granular activated carbon (GAC) biofiltration of the water; chlorine, chloramines, or no disinfectant; and water ages of up to five days. The second and third phases of research explored the roles of nitrification, iron corrosion, and disinfectant in the growth of N. fowleri, both in municipal drinking water from a city grappling with aging water infrastructure and in untreated private well water. Results from the simulated reclaimed water distribution systems suggested that biologically active GAC filtration may unintentionally select for specific OPs, contrary to expectations and experiences with oligotrophic conditions in potable water systems. While GAC biofiltration was associated with lower total bacteria and Legionella spp. gene markers, there were no apparent benefits for the other OPs analyzed.
Similarly, disinfectant treatments successful at controlling OPs in potable water were either ineffective or associated with increased levels of OPs, such as Mycobacterium spp. and Acanthamoeba spp., in the reclaimed water examined. In the potable water study, it was possible to recreate the conditions associated with growth of N. fowleri in the aged main distribution system from which the water for the experiment was collected, including corroding iron mains, nitrification, and disinfectant decay. While the effects of nitrification could not be confirmed, there was a clear association between iron corrosion and N. fowleri proliferation. The role of iron was explored further in what, to the author's knowledge, was the first study of N. fowleri in private wells. Analysis of 40 wells found correlations between N. fowleri and stagnant iron levels, further supporting the hypothesis that iron corrosion, or iron itself, encourages the growth of N. fowleri, and, because wells are not routinely disinfected, not necessarily by promoting disinfectant decay. As this study took place following a major flooding event, it provided insight not only into how surface water contamination may influence private well water microbial communities, but also reinforced that current recommendations for disinfecting private wells are inadequate and that standards should be implemented to aid homeowners in the event of flooding. This exploratory research illuminated several factors influencing OP growth in a range of water systems. Identifying the key variables that control growth is crucial to improving the safety of these systems. / MS / Waterborne microorganisms that infect people with weakened immune systems, known as opportunistic pathogens (OPs), have become a major health concern. These organisms are known to grow in drinking water and reclaimed water (i.e., non-potable recycled water) distribution systems, yet there are no regulations aimed at prevention.
There is also limited knowledge of how premise plumbing and water sources affect the growth of, populations of, and risk of infection by OPs, especially Naegleria fowleri (the 'brain-eating amoeba'). An investigation was carried out in three parts to determine what influences the growth of OPs in water distribution and household plumbing systems, with particular attention to reclaimed water, municipal drinking water, and private well systems. Phase one examined the roles of biological treatment to remove organic carbon and of disinfectant type in the occurrence of OPs during distribution of reclaimed water. Laboratory-scale simulated reclaimed water distribution systems were used to examine the effects of granular activated carbon (GAC) biofiltration of the water, disinfectants (chlorine, chloramines, or no disinfectant), and water ages ranging from zero to five days. The second and third phases of research explored the roles of nitrification, iron corrosion, and disinfectant in the growth of N. fowleri, both in municipal drinking water from a city with aging water infrastructure and in untreated private well water. Results from the simulated reclaimed water distribution systems suggested that biologically active GAC filtration may allow the growth of specific OPs. While GAC biofiltration was associated with lower total bacteria and Legionella spp., there were no apparent benefits in reducing the presence of other OPs. Similarly, common disinfectant treatments for preventing OPs in drinking water were either ineffective or increased levels of OPs, such as Mycobacterium spp. and Acanthamoeba spp., in the reclaimed water. In the drinking water study, conditions were introduced to grow N. fowleri in aged drinking water distribution systems with the addition of corroding iron, nitrification (using nitrifying bacteria), and disinfectant. While the effects of nitrification could not be confirmed, there was a clear relationship between iron corrosion and N. fowleri growth.
The role of iron was explored further in what, to the author's knowledge, was the first study of N. fowleri in private wells. Forty wells were examined, and the relationships between N. fowleri and stagnant iron levels supported the hypothesis that iron corrosion, or iron itself, increases the growth of N. fowleri. As this study took place following a major flooding event, it provided insight not only into how surface water contamination may influence private well water microbial communities, but also reinforced that current recommendations for disinfecting private wells are inadequate and that standards should be implemented to aid homeowners in the event of flooding. This exploratory research highlighted several variables that may allow the growth of OPs in a range of water systems. Identifying the key variables that control growth is crucial to improving the safety of these systems.
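A correlation analysis of the kind described for the 40-well study might look like the following sketch. The data are synthetic, and the choice of a Spearman rank correlation (robust to the skewed, non-normal distributions typical of environmental concentrations) is an assumption, not necessarily the statistic used in the thesis.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)

# Hypothetical stagnant-water iron concentrations (mg/L) across 40 wells;
# a lognormal is a common assumption for such measurements.
iron_mg_per_L = rng.lognormal(mean=-1.0, sigma=1.0, size=40)

# Toy response: N. fowleri signal loosely increasing with iron,
# with multiplicative noise (synthetic, for illustration only).
naegleria_signal = iron_mg_per_L * rng.lognormal(mean=0.0, sigma=0.5, size=40)

rho, p_value = spearmanr(iron_mg_per_L, naegleria_signal)
```

A positive rho with a small p-value would be the statistical signature of the iron-N. fowleri association the study reports.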

Advancing Rural Public Health: From Drinking Water Quality and Health Outcome Meta-analyses to Wastewater-based Pathogen Monitoring

Darling, Amanda Victoria 07 October 2024 (has links)
A rural-urban divide in health status and healthcare infrastructure is well documented in the U.S.: populations residing in census regions classified as rural often exhibit worse health outcomes and more adverse health behaviors, and have less access to affordable and proximal health services, than their urban and peri-urban counterparts, although such disparities vary with the specific rural region and individual circumstances. Rural areas may face elevated risk factors for infectious diseases, such as closer proximity to wildlife and livestock and a disproportionately high reliance on private, non-federally regulated primary drinking water sources. Chronic conditions prevalent in rural communities, such as diabetes and hypertension, are frequently linked with longer duration and greater severity of symptoms than in urban areas; this association suggests that the risk of exposure to infectious diseases and the likelihood of progression to serious illness and hospitalization may be elevated, although this is not universally the case across rural settings. Alongside documented urban-rural health disparities, there are also disparities in the nature and quality of data on health-related behaviors, outcomes, and service provision in rural areas compared to urban and peri-urban regions. In this dissertation, two key environmental matrices, drinking water and wastewater, were highlighted as vectors of information for better estimating contaminant exposures and health outcomes in rural communities. First, baseline data on drinking water contaminant levels and associated health outcomes were highlighted as crucial for refining holistic exposure estimates and for understanding drinking-water-related health burdens in rural communities, where a larger proportion of households use private drinking water sources, such as well water, that are not federally regulated.
Second, systematic sampling and testing of pathogen biomarkers in wastewater to non-invasively measure population-level health status, known as wastewater-based surveillance (WBS) and, depending on the context, wastewater-based epidemiology (WBE), is not constrained by the disadvantages of clinical testing, e.g., limited healthcare access, long travel times to testing facilities, and the delay between symptom onset and testing. Thus, expanded implementation of WBS in rural communities is proposed here as a strategy to address data disparities in clinical testing for infectious diseases. Collectively, this dissertation advances knowledge of estimated drinking water contaminant levels, exposures, and associated public health outcomes, and the corresponding research gaps, in the rural Appalachian U.S., and elucidates pathways toward best practices and considerations for adopting public-health-focused wastewater testing in rural communities. For the latter, the question of whether WBS challenges unique to rural wastewater systems hinder the application of WBS in small, rural communities was explored, along with methods to advance best practices for rural WBS. To summarize the publicly available peer-reviewed literature on drinking water contaminants in the rural Appalachian U.S., Chapter 2 presents a systematic review and meta-analysis of microbial and chemical drinking water contaminants. The meta-analyses identified key contaminants in rural Appalachian drinking water sources, including E. coli, lead, arsenic, and uranium, at levels exceeding regulatory, health-based maximum contaminant levels. Overall, we found that data on drinking water source quality under baseline conditions (i.e., outside of anomalous contamination events such as chemical spills) in the rural Appalachian U.S. were sparse relative to the widespread media coverage of the issue.
Epidemiologic research studies that collected both drinking water exposure data and paired health outcome data were also limited. As a result, although some instances of anomalously high levels of drinking water contaminants in rural Appalachia were identified in the published literature, we could not obtain a clear picture of baseline exposures to drinking water contaminants in most rural Appalachian communities, highlighting the need to address these knowledge gaps. In Chapter 3, to evaluate whether wastewater can serve as a reliable metric for estimating community circulation of viruses and antimicrobial resistance (AMR) markers, even when sourced from aging, low-resource sewer collection networks, a 12-month wastewater monitoring study was conducted in a small, rural sewer conveyance system with pronounced infrastructural challenges. Specifically, the field site under study was compromised by heavy inflow and infiltration (I&I). Detection rates and concentrations of viral, AMR, and human fecal markers were grouped by the level of I&I impact across the sewershed, and location-, date-, and sample-specific variables were assessed for their relative influence on viral, AMR, and human fecal marker signal using generalized linear models (GLMs). We found that while I&I likely diminished the magnitude of wastewater biomarker signal to some extent throughout the sewershed, especially up-sewer at sites with more pronounced I&I, no substantial diminishment of wastewater signal at WWTP influent was observed in response to precipitation events. Thus, our data indicated that WWTP influent sampling alone can still be used to assess and track community circulation of pathogens in heavily I&I-impacted systems, particularly for ubiquitously circulating viruses less prone to dilution-induced decay.
Delineations were also made of the circumstances in which up-sewer sampling may be necessary to better capture population shedding of pathogens, especially where I&I is prevalent. Various normalization strategies have been proposed to account for sources of variability when deriving population-level pathogen shedding from wastewater, including I&I-driven dilution. Thus, in Chapter 4, we evaluated the temporal and spatial variability of viral and AMR marker signal in wastewater at different levels of I&I, both unnormalized and under several normalization strategies. We found that normalization using physicochemical wastewater strength metrics (chemical oxygen demand, total suspended solids, phosphate, and ammonia) resulted in higher temporal and site-specific variability of SARS-CoV-2 and human fecal biomarker signal compared to unnormalized data, especially for viral and AMR marker signal measured at sites with pronounced I&I. Viral wastewater signal normalized to physicochemical strength metrics and flow data also closely mirrored precipitation trends, suggesting such normalization approaches may scale wastewater trends toward precipitation patterns rather than per capita signal in an I&I-compromised system. We also found that, in most cases, normalization did not significantly alter the relationship between wastewater trends and clinical infection trends. These findings suggest a degree of caution is warranted for some normalization approaches, especially where precipitation-driven I&I is pronounced.
However, the data and findings largely supported the utility of human fecal markers such as crAssphage for normalizing wastewater signal to address site-specific differences in dilution: viral signal scaled to this metric did not correlate strongly with precipitation trends, did not exhibit higher spatial or temporal variation, and correlated strongly with viral infection trends. Finally, in Chapter 5, we assessed the relationship between monthly Norovirus GII, Rotavirus, and SARS-CoV-2 wastewater trends and the seasonal infection trends for each virus, to ascertain whether WBE could be used in a rural sewershed of this size, with substantial I&I impacts, to track and potentially predict population-level infection trends. Although up-sewer (near-source) sampling at sites with permanent I&I impacts did not exhibit a clear relationship with seasonal infection trends for Rotavirus, SARS-CoV-2, and Norovirus GII, WWTP influent signal and consensus signals aggregated from multiple up-sewer sites largely mirrored expected seasonal trends. The findings also suggested that for more ubiquitous viral targets, such as SARS-CoV-2, viral trends measured at WWTP influent in a small I&I-impacted system may still provide a sufficiently useful measure of infection trends to inform the use of WBE (assuming appropriate normalization to sewershed population). These findings elucidate the potential utility and relative robustness of wastewater testing for ascertaining community-level circulation of pathogens in small, rural sewersheds, even those compromised by extensive I&I inputs. Overall, this dissertation examined drinking water and wastewater as critical metrics for assessing contaminant exposures and infectious disease trends in rural communities, particularly small, rural communities, which tend to have more limited health infrastructure and lower-resource wastewater systems.
Overall, the findings underscore the need for baseline data on drinking water quality by identifying gaps in current knowledge and calling for further research to better understand drinking water contaminant exposure levels in rural areas. Wastewater as a non-invasive, population-level health metric was evaluated in the context of a small, rural sewer system, both overall and by observed level of I&I, along with the tradeoffs associated with adopting each normalization approach. By evaluating these environmental surveillance metrics through both desk-based and field-based research designs, this dissertation offers valuable insights and practical recommendations for improving baseline drinking water quality monitoring and wastewater pathogen testing, with the overarching goal of supporting more targeted public health interventions in rural settings. / Doctor of Philosophy / In the United States, there is a significant health and healthcare gap between rural and urban areas. Rural communities often face worse health outcomes and poorer health behaviors, and have less access to affordable, nearby healthcare services, than their urban and peri-urban counterparts. Additionally, rural areas are exposed to higher risks of infectious diseases due to closer proximity to wildlife and livestock and proportionately lower access to regulated drinking water sources. Chronic conditions like diabetes and hypertension, which are more common in rural populations, can exacerbate the severity and duration of infectious disease symptoms, potentially leading to more serious illness and hospitalization. Despite these heightened risks, data on health behaviors, outcomes, and healthcare services in rural areas are often lacking and less comprehensive than for urban regions.
This dissertation investigates two promising avenues for improving monitoring to better understand and address contaminant exposures and health trends in rural communities: drinking water and wastewater. First, it underscores the importance of establishing baseline data on drinking water quality. This is essential for accurately estimating exposure levels and understanding the health impacts associated with elevated drinking water contaminants, particularly in rural areas, where a higher share of primary drinking water sources is not federally regulated than in urban areas. The study reveals significant gaps in current knowledge and highlights the need for more research to provide a clearer picture of drinking water quality in these communities. Second, it explores the use of wastewater as a non-invasive tool for assessing community health. This method, known as wastewater-based surveillance (WBS) or wastewater-based epidemiology (WBE), offers a way to measure population-level health trends without relying on clinical testing, which can be limited by factors such as access to healthcare and delays in testing. The dissertation evaluates how effective wastewater monitoring can be in small, rural sewer systems, even when these systems face challenges such as aging infrastructure and significant inflow and infiltration (I&I) from groundwater and surface water. It examines how different normalization strategies for wastewater data influence the reliability of this method and how wastewater testing can be adapted to account for varying levels of I&I. Overall, the dissertation provides valuable insights into the effectiveness of using drinking water and wastewater as environmental metrics for informing public health intervention strategies in rural settings.
It offers justifications for improving drinking water quality monitoring and wastewater testing practices, aiming to support more targeted and effective public health interventions in rural communities. By addressing the challenges and limitations associated with these environmental monitoring strategies, this research contributes to a better understanding of how to reduce health data disparities in rural areas.
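The fecal-marker normalization the dissertation favors can be illustrated simply: dividing a viral concentration by a co-measured human fecal marker such as crAssphage cancels I&I-driven dilution, since rain dilutes both targets by roughly the same factor. The numbers below are toy values, not measurements from the study.

```python
import numpy as np

def normalize_viral_signal(target_gc_per_L, fecal_marker_gc_per_L):
    """Normalize a viral wastewater concentration to a human fecal marker
    (e.g., crAssphage) to correct for dilution. Inputs are gene copies
    per liter; the output is a dimensionless ratio."""
    target = np.asarray(target_gc_per_L, dtype=float)
    fecal = np.asarray(fecal_marker_gc_per_L, dtype=float)
    return target / fecal

# Toy example: a rain event dilutes both targets fivefold, so raw
# concentrations drop while the normalized ratio stays stable.
sars_gc = np.array([1.0e4, 2.0e3])   # dry day, wet day
crass_gc = np.array([1.0e7, 2.0e6])  # dry day, wet day

ratio = normalize_viral_signal(sars_gc, crass_gc)
```

A flat ratio across wet and dry days, despite a fivefold drop in the raw signal, is exactly the dilution-robust behavior that makes fecal-marker normalization attractive in I&I-compromised sewersheds.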

Characterization of Metallic Flavor in Drinking Water: An Interdisciplinary Exploration through Sensory Science, Medicine, Health, and the Environment

Mirlohi, Susan 02 April 2012 (has links)
Scientific explorations can lead to life-changing discoveries or light the path for new ones as scientists carry, or pass on, the torch of knowledge to current and future generations. This torch of knowledge radiates in many directions, as the path of discovery often demands a multidimensional perspective. This research explored the many aspects of metallic flavor in drinking water through applications of sensory science, medicine, health, and the environment. Humans interact with their environment through the five senses and are often exposed to contaminants through multiple routes; oral intake of trace metal contaminants through drinking water is a likely source. The biochemical mechanism by which humans detect the flavor of strongly metallic agents such as iron has been previously elucidated, but little is known about population variability in the ability to sense metallic flavors. This research evaluated sensory thresholds and biochemical indicators of metallic flavor perception for ferrous iron in drinking water in healthy adults; 61 subjects aged 19-84 years participated. The findings demonstrated an age-dependent sensitivity to iron, indicating that sensitivity to metallic flavor declines with age; impairment of olfactory function is a contributing factor. Unlike healthy adults, whose senses are often protective against overexposure to contaminants and supportive of everyday life's pleasures, cancer patients often suffer from chemosensory dysfunction. A metallic phantom taste is a commonly experienced, yet little-studied, aspect of this debilitating disorder. The impact of cancer therapy on the chemosensory function of patients with malignant brain tumors undergoing combined modality treatment (CMT) was explored. The results indicated that patients' chemosensory dysfunction can range from minimal to moderate impairment, with maximum impairment developing during the 6-week CMT.
Study of salivary constituents may provide clues as to the causes of chemosensory dysfunction. On the health side, the implications of individual sensitivity to metallic flavor for beverage choices and overall water consumption were assessed in 33 healthy adults through a self-reported beverage questionnaire. The results indicated that among the elderly, reduced intake of drinking water coincided with reduced sensitivity to metallic flavor. The findings have important health implications in terms of hydration status and beverage choices. Finally, with relevance to environmental exposure, preliminary findings on the sensory properties of zerovalent iron nanoparticles (nZVI) indicated that oral exposure to nZVI may induce sensory properties different from those of ferrous iron, likely predictive of diminished detection of metallic flavor by humans. Further research is warranted in this area. / Ph. D.
626

Effect of Installation Practices on Galvanic Corrosion in Service Lines, Low Flow Rate Sampling for Detecting Water-Lead Hazards, and Trace Metals on Drinking Water Pipeline Corrosion: Lessons in Unintended Consequences

Clark, Brandi Nicole 17 April 2015 (has links)
Corrosion of drinking water distribution systems can cost water utilities and homeowners tens of billions of dollars each year in infrastructure damage, adversely impacting public health and causing water loss through leaks. Often, seemingly innocuous choices made by utilities, plumbers, and consumers can have a dramatic impact on corrosion and pipeline longevity. This work demonstrated that the brass pipe connectors used in partial lead service line replacements (PLSLRs) can significantly influence galvanic corrosion between lead and copper pipes. Galvanic crevice corrosion was implicated in a fourfold increase in lead release compared to a traditional direct connection, which had previously been assumed to be the worst-case connection method. In field sampling conducted in two cities, a new sampling method designed to detect particulate lead risks demonstrated that the choice of flow rate has a substantial impact on lead-in-water hazards. On average, lead concentrations detected in water at high flow without stagnation were at least 3X-4X higher than in traditional regulatory samples with stagnation, demonstrating a new 'worst case' lead release scenario due to detachment of lead particulates. Although galvanized steel was previously considered a minor lead source, it can contain up to 2% lead on its surface, and elevated lead-in-water samples from several cities were traced to galvanized pipe, including at the home of a child with elevated blood lead. Furthermore, if both galvanized and copper pipe are present, as occurs in large buildings, deposition corrosion is possible, leading to both increased lead exposure and pipe failures in as little as two years. Systematic laboratory studies of deposition corrosion identified key factors that increase or decrease its likelihood; soluble copper concentration and flow pattern were identified as controlling factors. 
Because of the high copper concentrations and continuous flow associated with mixed-metal hot water recirculating systems, these systems were identified as a worst-case scenario for galvanic corrosion. Deposition corrosion was also confirmed as a contributing mechanism to increased lead release, if copper pipe is placed before a lead pipe as occurs in partial service line replacements. Dump-and-fill tests confirmed copper solubility as a key factor in deposition corrosion impacts, and a detailed analysis of lead pipes from both laboratory studies and field tests was consistent with pure metallic copper deposits on the pipe surface, especially near the galvanic junction with copper. Finally, preliminary experiments were conducted to determine whether nanoparticles from novel water treatment techniques could have a negative impact on downstream drinking water pipeline infrastructure. Although increases in the corrosion of iron, copper, and stainless steel pipes in the presence of silver and carbon nanomaterials were generally small or non-existent, in one case the presence of silver nanoparticles increased iron release from stainless steel by more than 30X via a localized corrosion mechanism, with pitting rates as high as 1.2 mm/y, implying serious corrosion consequences are possible for stainless steel pipes if nanoparticles are present. / Ph. D.
627

Characterizing Waterborne Lead in Private Water Systems

Pieper, Kelsey J. 21 July 2015 (has links)
Lead is a common additive in plumbing components despite its known adverse health effects. Recent research has attributed cases of elevated blood lead levels in children, and even fetal death, to the consumption of drinking water containing high levels of lead. Although the U.S. Environmental Protection Agency (USEPA) strives to minimize lead exposure from water utilities through the Lead and Copper Rule (LCR), an estimated 47 million U.S. residents reliant on private unregulated water systems (generally individual and rural) are not protected. Detection, evaluation, and mitigation of lead in private systems are challenging due to the lack of monitoring data, appropriate sampling protocols, and entities to fund research. Through a statewide sampling survey, over 2,000 homeowners submitted water samples for analysis. This survey documented that 19% of households had lead concentrations in the first-draw sample (i.e., a 250 mL sample collected after 6+ hours of stagnation) above the USEPA action level of 15 µg/L, with concentrations as high as 24,740 µg/L. Due to the high incidence observed, this research focused on identifying system and household characteristics that increased a household's susceptibility to lead in water. However, 1% of households had elevated lead concentrations even after flushing for five minutes, which highlighted potential sources of lead release beyond the faucet. Therefore, a follow-up study was conducted to investigate sources and locations of lead release throughout the entire plumbing network. Using profiling techniques (i.e., sequential and time series sampling), three patterns of waterborne lead release were identified: no elevated lead, or lead elevated in the first draw of water only (Type I); erratic spikes of particulate lead mobilized from plumbing during periods of water use (Type II); and sustained detectable lead concentrations (>1 µg/L) even with extensive flushing (Type III). 
Lastly, emphasis was given to understand potential lead leaching from NSF Standard 61 Section 9 certified lead-free plumbing components as the synthetic test water is not representative of water quality observed in private water systems. Overall, this dissertation research provides insight into a population that is outside the jurisdiction of many federal agencies. / Ph. D.
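The three release patterns above lend themselves to a simple screening rule over a sequential sampling profile. A minimal sketch, not the dissertation's actual classification procedure: the thresholds (the 15 µg/L action level for spikes, a 1 µg/L detection cutoff for sustained lead) follow the text, but the decision logic and example profiles are hypothetical.

```python
# Hypothetical screen for the three waterborne lead release patterns (Types I-III)
# described in the abstract. Input: sequential profile samples in ug/L, first draw first.

def classify_lead_profile(profile_ug_per_L):
    """Classify a sequential lead profile into Type I, II, or III (sketch)."""
    first_draw, flushed = profile_ug_per_L[0], profile_ug_per_L[1:]
    if all(c > 1 for c in flushed):
        return "Type III"  # sustained detectable lead despite flushing
    if any(c > 15 for c in flushed):
        return "Type II"   # erratic particulate spikes during water use
    return "Type I"        # lead absent, or elevated in the first draw only

print(classify_lead_profile([24.0, 0.5, 0.3, 0.2]))    # first draw only -> Type I
print(classify_lead_profile([24.0, 0.5, 180.0, 0.4]))  # erratic spike   -> Type II
print(classify_lead_profile([24.0, 6.0, 4.0, 3.5]))    # sustained lead  -> Type III
```

In a real profile the patterns can overlap (a spiking Type II system may also show a detectable baseline), so an operational protocol would need tie-breaking rules this sketch omits.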
628

The performance of free chlorine and chlorine dioxide oxidation and/or alum coagulation for the removal of complexed Fe(II) from drinking water

Shorney, Holly L. 12 September 2009 (has links)
Past research regarding complexed iron has focused on the resistance to and kinetics of oxidation by O₂(aq) and the extent of stabilization. A 0.45 µm filter was typically used to differentiate between dissolved and particulate iron. This research investigated Fe(II) oxidation by free chlorine and ClO₂ in the presence of DOC by varying the pH, DOC-to-iron ratios, DOC sources, oxidant dosages, and contact time. Complexed iron removal by alum coagulation with and without oxidant addition was also examined. Particulate, colloidal, and soluble iron were differentiated by the use of 0.2 µm filters and 100K ultrafilters. Ultrafiltration and oxidation studies revealed that, at the DOC-to-iron ratios used for this research, not all of the Fe(II) in solution was actually complexed. Thus, the oxidation studies represented the oxidation of uncomplexed Fe(II) to Fe(III), which was then complexed by the higher molecular weight DOC. Results indicated that particulate iron formation (defined as retention by a 0.2 µm filter) was a function of the DOC source and the oxidant used for testing. The formation of colloidal iron (defined as retention by a 100K ultrafilter) due to oxidation was dependent upon the initial DOC-to-iron ratio and the DOC source. A correlation between DOC adsorption to iron oxide solids and the solution pH, initial DOC-to-iron ratio, and oxidant used was also evident. Complexed Fe(II) was removed from solution by alum coagulation. Oxidant addition to alum coagulation was necessary to effectively remove uncomplexed Fe(II) (in the presence of DOC) from solution. / Master of Science
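The operational size cuts described above (a 0.2 µm filter and a 100K ultrafilter) imply a simple mass balance for the three iron fractions. A hedged sketch with hypothetical concentrations, not data from the study:

```python
# Operational iron fractionation by difference, per the size cuts in the abstract:
# particulate = retained on the 0.2 um filter; colloidal = passes 0.2 um but is
# retained by the 100K ultrafilter; soluble = passes the 100K ultrafilter.
# Example concentrations are invented (mg/L).

def iron_fractions(total, filtrate_0_2um, ultrafiltrate_100K):
    """Return (particulate, colloidal, soluble) iron in the sample's units."""
    particulate = total - filtrate_0_2um              # retained on 0.2 um filter
    colloidal = filtrate_0_2um - ultrafiltrate_100K   # between the two size cuts
    soluble = ultrafiltrate_100K                      # passes the 100K ultrafilter
    return particulate, colloidal, soluble

print(iron_fractions(3.0, 1.0, 0.5))  # -> (2.0, 0.5, 0.5) for this invented sample
```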
629

Lead silicate solubility and the control of lead contamination in drinking water

Weaver, Cameron L. 11 July 2009 (has links)
The intake of lead into the human body has become an area of major concern because high levels of lead are harmful and can cause physiological damage, especially in children. It has been suggested that adding Na₂SiO₃ might control Pb²⁺ contamination of drinking water supplies (e.g., Thresh 1922). PbSiO₃(am), the white, gelatinous precipitate formed by mixing Na₂SiO₃ solutions with Pb²⁺-bearing solutions, dissolves at pH < 7 by the reaction: PbSiO₃ + H₂O + 2H⁺ = Pb²⁺ + H₄SiO₄. Measurements of the solubility of PbSiO₃ show that the Keq for this reaction is 7.41x10⁵ and the ΔGf° of PbSiO₃(am) is −1061.81 kJ mol⁻¹. This high value of Keq means that extreme amounts of a Na₂SiO₃ additive are required in a water supply system to reduce the Pb²⁺ concentration to the EPA action level of 15 ppb. Furthermore, the high pH values that result from Na₂SiO₃ water treatments lead to the formation of lead hydroxycarbonate (hydrocerussite), because this carbonate phase is more stable than lead silicate in the pH range of natural waters (pH 5-8). / Master of Science
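The conclusion above can be checked with a back-of-the-envelope calculation: the mass-action expression for the dissolution reaction is Keq = [Pb²⁺][H₄SiO₄]/[H⁺]², which can be solved for the silicic acid concentration needed to hold Pb²⁺ at 15 ppb. The sketch below is illustrative only; the Keq is the abstract's measured value, while the chosen pH values and the assumption of unit activity coefficients are mine.

```python
# Silicic acid required to suppress Pb2+ to 15 ppb via PbSiO3(am) equilibrium,
# assuming ideal (unit) activity coefficients. Keq from the abstract.

K_EQ = 7.41e5          # Keq = [Pb2+][H4SiO4]/[H+]^2, measured in the study
PB_MOLAR_MASS = 207.2  # g/mol
pb_target = 15e-6 / PB_MOLAR_MASS   # 15 ppb (15 ug/L) converted to mol/L

for pH in (5.0, 6.0, 7.0):
    h = 10.0 ** -pH
    h4sio4 = K_EQ * h * h / pb_target   # mol/L H4SiO4 required at this pH
    print(f"pH {pH}: [H4SiO4] required ~ {h4sio4:.3g} M")
```

Even at pH 7 the required silicic acid is on the order of 0.1 M (roughly 10 g/L), consistent with the abstract's point that impractically large Na₂SiO₃ doses would be needed.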
630

Consumers and Their Drinking Water: Communicating Water Quality and Assessing the Reaction of Zerovalent Nanoiron (nZVI) with Saliva

Phetxumphou, Katherine 01 July 2014 (has links)
Human senses for taste, odor, and visual assessment allow consumers to be selective when it comes to choosing their drinking water. In addition to wanting aesthetically pleasing water to drink, consumers want to know whether their water is safe and may have misconceptions about what possible health-risk contaminants could be lurking in their water supply. This thesis aimed to measure the reaction of zerovalent nanoiron (nZVI) in water and human saliva, evaluate consumers' perceptions of taste, odor, and risk in their drinking water, and investigate the effectiveness of community water systems in communicating water quality information to their consumers. Since nZVI, including the commercially available Nanofer 25S, is being widely used in water treatment processes and has future potential for use in fortifying foods, human and aquatic exposure to these engineered nanoparticles will increase. Thus, the first part of the thesis was to develop a quantitative analytical technique to measure iron levels at environmentally relevant concentrations. A colorimetric assay using 1,10-phenanthroline was developed to determine the amount of ferrous ions produced from different iron materials, including ferrous sulfate, nZVI, and goethite. The resulting measurements indicated that the maximum production of ferrous ions varied among the iron materials: Nanofer 25S did not undergo 100% conversion to ferrous ions; goethite, as expected, produced no ferrous ions; and ferrous sulfate was 100% ferrous ions. Total iron, as measured by atomic absorption, was equal for all iron materials. The reactivity of these iron materials was also assessed in waters of varying salt concentration. The capacity to produce ferrous ions did not change in nanopure water, tap water, or an inorganic solution equivalent to the high ionic strength of saliva. 
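The 1,10-phenanthroline method quantifies Fe(II) photometrically, so the core arithmetic is Beer's law, A = εbc. A hedged sketch: the molar absorptivity (~11,100 L mol⁻¹ cm⁻¹ at 510 nm, a commonly cited value for the Fe(II)-phenanthroline complex) and the absorbance reading are assumptions, not the study's calibration.

```python
# Beer's law arithmetic behind a 1,10-phenanthroline ferrous assay.
# EPSILON is a literature value for the Fe(II)-phenanthroline complex at 510 nm;
# the absorbance below is an invented example reading.

EPSILON = 11100.0   # L mol^-1 cm^-1 (assumed, commonly cited)
PATH_CM = 1.0       # cuvette path length, cm
FE_MOLAR_MASS = 55.85  # g/mol

def ferrous_mg_per_L(absorbance, blank=0.0):
    """Convert a 510 nm absorbance into ferrous iron in mg/L via A = e*b*c."""
    molar = (absorbance - blank) / (EPSILON * PATH_CM)  # mol/L Fe(II)
    return molar * FE_MOLAR_MASS * 1000.0               # mol/L -> mg/L

print(round(ferrous_mg_per_L(0.222), 3))  # ~1.117 mg/L Fe(II) for this example
```

In practice the assay is run against a ferrous sulfate calibration curve rather than a single literature ε, which is why the coefficient here is flagged as an assumption.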
Toxicology data for nZVI exposure in humans and aquatic organisms are limited. For that reason, the salivary lipid oxidation (SLO) potential of the different iron materials in human saliva was measured. An artificial saliva recipe, containing salts, proteins, and lipids, was also developed to ensure repeatability and comparable results among laboratories, given the day-to-day variability of human saliva. Using thiobarbituric acid reactive substances (TBARS), both Nanofer 25S and ferrous sulfate induced in-vitro SLO with human saliva; goethite was unreactive. The SLO results from this study have implications for the flavor effects of nZVI in drinking water. The second chapter of this thesis assesses the clarity of message communication in Consumer Confidence Reports (CCRs). In 1998, the United States Environmental Protection Agency (USEPA) mandated that community water systems (CWSs) provide annual water quality reports to their consumers. These CCRs summarize information regarding water sources, any detected contaminants, compliance with federal regulations, and educational information. Thirty CCRs across all ten USEPA regions, a national representation of CWSs, were analyzed for clarity using the Centers for Disease Control and Prevention's (CDC) Clear Communication Index (CCI) tool. The analysis revealed that currently distributed CCRs performed poorly on the CDC's CCI, all failing to meet the 90% passing mark. The overall average score for all CCRs was 50.3 ± 13.5%. The clarity scores were based on seven key areas: 1) Main message and call to action; 2) Language; 3) Information design; 4) State of the science; 5) Behavioral recommendations; 6) Numbers; and 7) Risk. 
Improvements in all seven areas of the CCI will greatly improve the quality and educational capabilities of CCRs; the lowest average scores were 3.3 ± 18.1% for state of the science, 21.7 ± 26.6% for language, and 37.7 ± 27.1% for main message and call to action. The failing scores highlight the challenges facing CWSs in communicating water quality information. This assessment can serve as a tool for water utilities to effectively prepare and distribute information to their consumers in the future. CWSs must promote a two-way dialogue with their consumers. They should address consumers' concerns and wants in the CCRs, and they should also effectively communicate risks so that consumers are not under the misconception that their water is unsafe to drink. CWSs should use the CCRs as a way to educate the public and promote drinking tap water. The last chapter of this thesis addresses the concerns that consumers may have about their drinking water and methods that could be implemented to quickly and efficiently respond to consumer complaints and to contaminants with sensory properties. Just like CWSs, consumers are concerned about their water; they are the sentinels of water quality monitoring because they are uniquely positioned at the tap. Consumers are able to detect the slightest change in taste, odor, or appearance in their drinking water because doing so is instinctive. Thus, consumer feedback and complaint data provided to a utility should be taken seriously and stored for future comparisons. Consumer complaints represent a fruitful data stream that should be harnessed routinely to gain knowledge about aesthetic water quality in the distribution system. Four utilities provided consumer complaint data on water quality, which were categorized and visualized using radar and run-time plots. 
As a result, major taste, odor, and appearance patterns emerged that clarified the issues and could guide the utilities on the nature and extent of each problem. Consumer complaint data are valuable for identifying water quality issues, but CWSs should understand that even though humans readily identify visual issues with water, such as color, cloudiness, or rust, describing specific tastes, and particularly odors, in drinking water is acknowledged to be a much more difficult task without training. This was demonstrated with two utility groups, laboratory personnel and plant operators, and a group of consumers identifying the odors of orange, 2-MIB, and DMTS. All of the groups were able to identify the familiar orange odor. However, the two utility groups were much better able to identify the musty odor of 2-MIB; this may be because the utility groups are more familiar with raw and finished water. DMTS, a garlic-onion odor associated with sulfur compounds in drinking water, was the least familiar to all three groups. The laboratory personnel were the best describers of this odor, but results within the group still varied significantly. These results suggest that utility personnel should be mindful of consumers who complain that their water is different but cannot describe the problem. To reduce this inability to describe an odor or taste issue, a taste-and-odor (T&O) program at a utility can be beneficial. The safety and aesthetic characteristics of drinking water are what matter most to consumers, and the two complement each other: if consumers think their water tastes funny, they will probably assume it is unsafe to drink. Since nZVI is increasingly being introduced into the drinking water supply, researchers must be able to understand how it reacts in humans and the environment. 
Additionally, CCRs would be an effective method for CWSs to communicate water quality information and address any concerns consumers may have about their water. CWSs can implement radar and run-time plots to identify issues in their drinking water systems. Also, taste-and-odor programs will allow CWSs and their consumers to better describe and identify issues in their drinking water as they arise, so that they can be promptly addressed and alleviated. Thus, promoting communication between water utilities and their consumers will improve the relationship and instill confidence in consumers about their drinking water. / Master of Science
