
Validating Pathogen Reduction in Ozone-Biofiltration Water Reuse Applications

Advanced water treatment (AWT) for water reuse has become a necessity for many utilities across the globe as the quantity and quality of water resources have diminished. In some locations, including California, the full advanced treatment (FAT) train is mandated, comprising membrane filtration, reverse osmosis, and UV advanced oxidation. Carbon-based treatment has emerged as a cost-effective alternative to FAT in locations that cannot manage brine disposal. However, given the relative novelty of this treatment technology for water reuse, the process still requires full-scale validation of treatment goals, including pathogen reduction. While there are many constituents of concern in water reuse, exposure to pathogens remains the greatest acute health risk. The studies described herein examine pathogen and microbial surrogate reduction at both a full-scale and a pilot-scale floc/sed-ozone-biofiltration advanced water treatment facility. Both culture- and molecular-based methods were used to demonstrate removal, and pilot challenge testing was employed to address the shortcomings of full-scale monitoring and additional research objectives.
The reduction of Cryptosporidium, Giardia, enteric viruses, pathogenic bacteria, and their corresponding surrogate microorganisms (e.g., spore-forming bacteria, coliphage) was quantified across the upstream wastewater treatment process and the AWT. In general, the removal of surrogate microorganisms was less than or equal to that of the pathogens of interest, thereby justifying their use in full-scale monitoring. Several limitations of full-scale monitoring were noted, including low starting concentrations, which required large sample volumes to demonstrate log reduction. Additionally, while molecular methods were sufficient to demonstrate reduction by physical treatment steps, they are unable to demonstrate inactivation. Therefore, ozone pilot testing was performed to evaluate the use of capsid integrity PCR for demonstrating inactivation by ozonation. Additional testing related the log reduction value (LRV) obtained with culture methods to the LRV obtained with PCR, establishing a relationship that can be used in future monitoring.
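The sample-volume limitation noted above can be illustrated with a back-of-the-envelope calculation (a minimal sketch with hypothetical numbers, not values from the study): a log reduction value is the base-10 logarithm of the influent-to-effluent concentration ratio, so demonstrating a large LRV from a low starting concentration requires processing a correspondingly large effluent volume to detect any surviving organisms.

```python
from math import log10

def lrv(influent_conc, effluent_conc):
    """Log reduction value from paired influent/effluent concentrations."""
    return log10(influent_conc / effluent_conc)

# Hypothetical numbers: to demonstrate a 4-log reduction, the effluent must
# be quantified at 1/10,000 of the influent concentration.
influent = 50.0                                 # organisms per liter entering treatment
target_lrv = 4.0
detection_limit = influent / 10**target_lrv     # 0.005 organisms/L required in effluent

# Minimum effluent sample volume needed to expect at least one detectable
# organism at that concentration:
min_volume_L = 1.0 / detection_limit
print(round(min_volume_L))   # on the order of hundreds of liters
```

This is why low influent concentrations were flagged as a limitation: the required sample volume scales as 10^LRV, quickly becoming impractical for routine grab sampling.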
While pathogen inactivation is a major concern in water reuse, these objectives must be balanced against the formation of disinfection byproducts (DBPs) during ozonation. Given the elevated concentration of dissolved organic matter, relatively higher ozone doses are required in reuse applications than in drinking water treatment to achieve the desired treatment goals (oxidation, disinfection). Pilot-scale ozone testing was performed to evaluate ozone disinfection performance in unfiltered secondary effluent while balancing the formation of bromate and the oxidation of trace organic contaminants (TrOCs). Two chemical bromate control methods were compared: preformed monochloramine (NH2Cl) and hydrogen peroxide (H2O2). Neither method had a demonstrable impact on virus or coliform inactivation; however, H2O2 eliminated measurable ozone exposure, which is necessary for the inactivation of more resistant spore-forming bacteria. Additionally, NH2Cl was shown to suppress •OH exposure and thus negatively impacted the oxidation of ozone-resistant TrOCs, while H2O2 marginally improved TrOC oxidation.
Finally, the use of H2O2 for bromate control necessitates the validation of an alternative framework for ozone process control. The existing ozone Ct framework has been shown to be prohibitively conservative, especially for virus inactivation. In this study, the applied specific ozone dose (O3:TOC) and the change in UV254 absorbance were evaluated as ozone monitoring frameworks across a range of water quality characteristics. Elevated temperature and pH significantly impacted ozone decay kinetics but only marginally impacted virus inactivation. Both frameworks were shown to be valid across all water quality conditions evaluated.
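The two alternative dose-based metrics can be sketched as simple ratios of routine online measurements (function names and numbers below are illustrative assumptions, not the study's implementation):

```python
def specific_ozone_dose(o3_dose_mg_per_L, toc_mg_per_L):
    """Applied specific ozone dose, O3:TOC, in mg O3 per mg TOC."""
    return o3_dose_mg_per_L / toc_mg_per_L

def delta_uv254(uv254_in, uv254_out):
    """Fractional reduction in UV254 absorbance across the ozone contactor."""
    return (uv254_in - uv254_out) / uv254_in

# Hypothetical readings: 8 mg/L transferred ozone dose into water with
# 10 mg/L TOC, and UV254 absorbance falling from 0.18 to 0.11 cm^-1.
print(specific_ozone_dose(8.0, 10.0))   # 0.8 mg O3 / mg TOC
print(delta_uv254(0.18, 0.11))          # ~0.39, i.e. ~39% UV254 reduction
```

Unlike a Ct calculation, neither metric requires a measurable dissolved ozone residual, which is what makes such frameworks compatible with H2O2 addition for bromate control.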
Validating pathogen reduction across carbon-based reuse treatment trains is imperative to allow for more widespread application of, and regulatory confidence in, the technology. Coagulation, flocculation/sedimentation, ozone, and biofiltration were shown to be robust barriers for pathogen and surrogate reduction, and recommended concentration and quantification methods are presented herein. The ozone challenge testing results also provide guidance to utilities using ozone for disinfection while controlling DBPs and enhancing organics oxidation in water reuse applications.

Doctor of Philosophy

Water reuse has become a necessity for many utilities across the globe as the quantity and quality of water resources have diminished. In some locations, including California, the full advanced treatment (FAT) train is required, comprising membrane filtration, reverse osmosis, and UV advanced oxidation. Carbon-based treatment has emerged as a cost-effective alternative to FAT in locations that cannot manage brine disposal. However, given the relative novelty of this treatment technology for water reuse, the process still requires full-scale validation of treatment goals, including pathogen reduction. While there are many constituents of concern in water reuse, exposure to pathogens remains the greatest acute health risk. The studies described herein examine pathogen and microbial surrogate reduction at both a full-scale and a pilot-scale flocculation/sedimentation-ozone-biofiltration advanced water treatment facility. Both culture- and molecular-based methods were used to demonstrate removal, and pilot challenge testing was employed to address the shortcomings of full-scale monitoring and additional research objectives.
The reduction of protozoa, viruses, bacteria, and their corresponding surrogate microorganisms was quantified across the upstream wastewater treatment process and the water reuse treatment train. In general, the removal of surrogate microorganisms was less than or equal to that of the pathogens of interest, thereby justifying their use in full-scale monitoring. Several limitations of full-scale monitoring were noted, including low starting concentrations, which required large sample volumes to demonstrate log reduction. Additionally, while molecular methods were sufficient to demonstrate reduction by physical treatment steps, they are unable to demonstrate inactivation. Therefore, ozone pilot testing was performed to evaluate approaches for adapting these methods to reflect inactivation.
While pathogen inactivation is a major concern in water reuse, these objectives must be balanced against the formation of disinfection byproducts during ozonation. Given the elevated concentration of dissolved organic matter, relatively higher ozone doses are required in reuse applications than in drinking water treatment to achieve the desired treatment goals (oxidation, disinfection). Pilot-scale ozone testing was performed to evaluate ozone disinfection performance in wastewater effluent while balancing the formation of byproducts and the oxidation of trace organic contaminants. Two chemical byproduct control methods were compared: preformed monochloramine and hydrogen peroxide. Neither control method had a demonstrable impact on virus or coliform inactivation; however, hydrogen peroxide eliminated measurable ozone exposure, which is necessary for the inactivation of more resistant spore-forming bacteria. Additionally, monochloramine was shown to suppress hydroxyl radical exposure and thus negatively impacted the oxidation of ozone-resistant organic contaminants, while hydrogen peroxide marginally improved oxidation.
Finally, the use of hydrogen peroxide for bromate control necessitates the validation of an alternative framework for ozone process control. The existing framework, which relies on ozone exposure, has been shown to be conservative, especially for virus inactivation. In this study, the applied specific ozone dose and the change in UV254 absorbance were evaluated as ozone monitoring frameworks across a range of water quality characteristics. Elevated temperature and pH were shown to impact ozone decay kinetics and virus inactivation to varying degrees. Both frameworks were shown to be valid across all water quality conditions evaluated.
Validating pathogen reduction across carbon-based reuse treatment trains is imperative to allow for more widespread application of, and regulatory confidence in, the technology. Coagulation, flocculation/sedimentation, ozone, and biofiltration were shown to be robust barriers for pathogen and surrogate reduction, and recommended concentration and quantification methods are presented herein. The ozone challenge testing results also provide guidance to utilities using ozone for disinfection while controlling disinfection byproducts and enhancing organics oxidation in water reuse applications.

Identifier: oai:union.ndltd.org:VTETD/oai:vtechworks.lib.vt.edu:10919/117299
Date: 03 January 2024
Creators: Hogard, Samantha Ann
Contributors: Civil and Environmental Engineering, Knocke, William R., Bott, Charles B., Edwards, Marc A., Pruden-Bagchi, Amy Jill
Publisher: Virginia Tech
Source Sets: Virginia Tech Theses and Dissertation
Language: English
Detected Language: English
Type: Dissertation
Format: ETD, application/pdf
Rights: In Copyright, http://rightsstatements.org/vocab/InC/1.0/
