
Lead and Copper Contamination in Potable Water: Impacts of Redox Gradients, Water Age, Water Main Pipe Materials and Temperature

Potable water can become contaminated with lead and copper due to the corrosion of pipes, faucets, and fixtures. The US Environmental Protection Agency Lead and Copper Rule (LCR) is intended to target sampling at high-risk sites to help protect public health by minimizing lead and copper levels in drinking water. The LCR is currently under revision with a goal of better crafting sampling protocols to protect public health. Using field studies and well-controlled laboratory experiments, this study examined an array of factors that determine where and when "high-risk" conditions arise, in the context of both sampling site selection and consumer health risks.

A pilot-scale simulated distribution system (SDS) was used to examine the complex relationship between disinfectant type (free chlorine and chloramine), water age (0-10.2 days), and pipe main material (PVC, cement, and iron). Redox gradients developed in the distribution system, controlled by water age and pipe material, which affected the microbiology and chemistry of the water delivered to consumer homes. Free chlorine disinfectant was the most stable in the presence of PVC, while chloramine was most stable in the presence of cement. At shorter water ages, where disinfectant residuals were present, chlorine tended to cause as much as 4 times more iron corrosion than chloramine. However, the worst localized attack on iron materials occurred at high water age in the system with chloramine. It was hypothesized that this was due to denitrification, a phenomenon relatively unexplored in drinking water distribution systems and documented in this study.

Cumulative chemical and biological changes, such as those documented in the study described above, can create "high-risk" hotspots for elevated lead and copper, with associated concerns for consumer exposure and regulatory monitoring. In both laboratory and field studies, trends in lead and copper release were site-specific and ultimately determined by the plumbing material, microbiology, and chemistry. In many cases, elevated levels of lead and copper did not co-occur, suggesting that, in a revised LCR, these contaminants will have to be sampled separately in order to identify worst-case conditions.

Temperature was also examined as a potentially important factor in lead and copper corrosion. Several studies have attributed the higher incidence of childhood lead poisoning during the summer to increased soil and dust exposure; however, drinking water may also be a significant contributing factor. In large-scale pipe rigs, total and dissolved lead release was 3-5 times higher during the summer than during the winter. However, in bench-scale studies, higher temperature could increase, decrease, or have no effect on lead release, depending on the material and water chemistry. Similarly, in a distribution system served by a centralized treatment plant, lead release from pure lead service lines increased with temperature in some homes but showed no correlation with temperature in others. It is possible that changes throughout the distribution system, such as disinfectant residual, iron, or other factors, can create scales on pipes at individual homes, which determine the temperature dependency of lead release.

Consumer exposure to lead can also be adversely influenced by the presence of particulate iron. In the case of Providence, RI, a well-intentioned decrease in the finished water pH from 10.3 to 9.7 resulted in an epidemic of red water complaints due to the corrosion of iron mains and a concomitant increase in water lead levels. Complementary bench-scale and field studies demonstrated that higher iron in water is sometimes linked to higher lead in water, due to sorption of lead onto the iron particulates.

Finally, one of the most significant emerging challenges in evaluating corrosion control and consumer exposure is the variability in lead and copper levels during sampling, caused by the semi-random detachment of lead particles into water, which can pose an acute health concern. Well-controlled test rigs were used to characterize the variability in lead and copper release, and the results were compared to consumer sampling under the LCR. The variability due to semi-random particulate detachment was equal to the typical variability observed in LCR sampling, suggesting that this inherent variability is much more important than other common sources, including customer error, failure to follow sampling instructions, or long stagnation times. While instructing consumers to collect samples at low flow rates reduces variability, it fails to detect elevated lead from many hazardous taps. Moreover, collecting a single sample to characterize health risks from a given tap is not adequately protective of consumers in homes with lead plumbing, in an era when corrosion control has reduced the presence of soluble lead in water. Future EPA monitoring and public education should be changed to address this concern. / Ph. D.

Identifier oai:union.ndltd.org:VTETD/oai:vtechworks.lib.vt.edu:10919/73338
Date 06 May 2015
Creators Masters, Sheldon
Contributors Civil and Environmental Engineering, Edwards, Marc A., Dietrich, Andrea M., Pruden, Amy, Lambrinidou, Ioanna
Publisher Virginia Tech
Source Sets Virginia Tech Theses and Dissertation
Detected Language English
Type Dissertation
Format ETD, application/pdf
Rights In Copyright, http://rightsstatements.org/vocab/InC/1.0/