
A novel model for the prediction of iron release in drinking water distribution pipe networks

Mutoti, Ginasiyo 01 October 2003 (has links)
No description available.

Risk of injection using reclaimed water for aquifer recharge using rotavirus as surrogate contaminant

Unknown Date (has links)
Groundwater aquifers are precious resources that have served human consumption for many centuries. This resource is pristine in comparison with surface waters such as lakes and canals; however, as the population grows exponentially, so do the demand for groundwater and the need to study the potential of groundwater replenishment programs. Injecting treated water or wastewater into an aquifer is one method of protecting this resource for current and future generations. Health concerns arise because migration of water of “impaired quality” can contaminate drinking water supplies. Regulatory barriers, resulting from the perceived risks of adverse health effects from pathogens such as viruses, have limited the use of these impaired water resources in groundwater replenishment programs. The objective of this study was to perform a risk assessment using computational modeling with MODFLOW and the MT3D groundwater transport simulator. The simulation showed that after two years, the risk of contamination, based on concentration contours from the injection well to the production wellfields for the City of Hollywood, stabilized below 10⁻⁶. The risk assessment demonstrated that injection of treated water is a viable option for groundwater replenishment. / Includes bibliography. / Thesis (M.S.)--Florida Atlantic University, 2014. / FAU Electronic Theses and Dissertations Collection
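The 10⁻⁶ figure in this abstract is a standard annual-infection-risk screening target. The sketch below is not the thesis's MODFLOW/MT3D workflow; it is a minimal screening calculation that combines first-order virus inactivation along a flow path with a beta-Poisson dose-response (the decay rate, travel time, ingestion volume, and dose-response parameters are illustrative assumptions, though the alpha/beta values shown are commonly cited for rotavirus):

```python
import math

def virus_conc_per_L(c0, travel_time_days, decay_per_day):
    """First-order virus inactivation along the flow path: C = C0 * exp(-k t)."""
    return c0 * math.exp(-decay_per_day * travel_time_days)

def annual_infection_risk(daily_dose, alpha=0.253, beta=0.422):
    """Beta-Poisson dose-response; daily risk compounded over a year."""
    p_daily = 1.0 - (1.0 + daily_dose / beta) ** (-alpha)
    return 1.0 - (1.0 - p_daily) ** 365

# Illustrative screening: two years of travel from injection to production well
conc = virus_conc_per_L(c0=1.0, travel_time_days=730, decay_per_day=0.1)
risk = annual_infection_risk(daily_dose=conc * 2.0)  # assume 2 L/day ingestion
acceptable = risk < 1e-6  # the screening target cited in the abstract
```

With these assumed parameters the residual concentration after two years is vanishingly small, so the computed annual risk falls far below the 10⁻⁶ target.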

Groundwater Surface Trends in the North Florence Dunal Aquifer, Oregon Coast, USA

Doliber, Sarah Rebecca 01 January 2012 (has links)
The North Florence Dunal Aquifer is the only feasible drinking water source for the coastal city of Florence, Oregon, and Florence's Urban Growth Boundary. High infiltration rates and a shallow groundwater table leave the aquifer highly susceptible to contamination from septic tank effluent, storm runoff, chemical fertilizers, and recreational ATV use throughout the dunes. Public interest in the quality and quantity of the aquifer water has grown since the City of Florence received a grant from the Environmental Protection Agency for a watershed protection and restoration project. Delineating the shallow groundwater surface and its relationship to the surface water bodies within the dunes is crucial to protecting this drinking water source from contamination. This thesis project created a GIS representation of the shallow groundwater elevation and an associated prediction error map. Surface water bodies were confirmed to be window lakes into the dunal aquifer, and no signs of perched aquifer conditions were observed between the Holocene and Pleistocene dunes. Ground-penetrating radar, well data provided by the City of Florence, and LiDAR were the primary data sources.
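The thesis's water-table surface was built by GIS interpolation with a prediction-error map (consistent with kriging). A minimal inverse-distance-weighting sketch conveys the basic idea of estimating head between wells, though unlike kriging it yields no prediction variance; the well coordinates and heads below are hypothetical, not thesis data:

```python
def idw_head(x, y, wells, power=2):
    """Inverse-distance-weighted water-table elevation at (x, y).
    wells: iterable of (wx, wy, head_m) observations (hypothetical here)."""
    num = den = 0.0
    for wx, wy, head in wells:
        d2 = (x - wx) ** 2 + (y - wy) ** 2
        if d2 == 0.0:
            return head  # query point coincides with a well
        w = d2 ** (-power / 2)  # weight falls off with distance^power
        num += w * head
        den += w
    return num / den

# Three hypothetical monitoring wells: (x m, y m, head m)
wells = [(0.0, 0.0, 10.0), (100.0, 0.0, 12.0), (0.0, 100.0, 11.0)]
grid_estimate = idw_head(50.0, 50.0, wells)  # always bounded by observed heads
```

Evaluating this on a regular grid produces a raster surface analogous to the GIS product described, minus the error map that kriging adds.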

Drinking water arsenic and uranium: associations with urinary biomarkers and diabetes across the United States

Spaur, Maya January 2023 (has links)
Inorganic arsenic is a potent carcinogen and toxicant associated with numerous adverse health outcomes, and it ranks first on the Agency for Toxic Substances and Disease Registry Substance Priority List. Uranium is also a carcinogen and nephrotoxicant; however, health effects at the levels experienced by general populations are unclear. Chronic exposure to inorganic arsenic (As) and uranium (U) in the United States (US) occurs through unregulated private wells and federally regulated community water systems (CWSs). Geogenic arsenic contamination typically occurs in groundwater rather than surface water supplies, and groundwater is a major source for many CWSs in the US. Although the US Environmental Protection Agency sets a maximum contaminant level (MCL, enforceable since 2006: 10 µg/L) for arsenic in CWSs, private wells are not federally regulated. The contribution of drinking water from private wells and regulated CWSs to total inorganic arsenic and uranium exposure is not clear. In the US, type 2 diabetes (T2D) affects approximately 37.3 million people (11.3% of the population), with the highest burden in American Indian communities. Toxic metal exposures have been identified as risk factors for T2D. Most studies rely on biomarkers, which can be affected by early disease processes, and studies directly measuring metals in drinking water in US populations have been limited. In Chapter 2, we evaluated county-level associations between modeled values of the probability of private well arsenic exceeding 10 µg/L and CWS arsenic concentrations for 2,231 counties in the conterminous US, using time-invariant private well arsenic estimates and CWS arsenic estimates for two time periods.
Nationwide, county-level CWS arsenic concentrations increased by 8.4 µg/L per 100% increase in the probability of private well arsenic exceeding 10 µg/L for 2006–2008 (the initial compliance monitoring period after MCL implementation), and by 7.3 µg/L for 2009–2011 (the second monitoring period following MCL implementation), a 1.1 µg/L mean decline over time. Regional differences in this temporal decline suggest that interventions to implement the MCL were more pronounced in regions served primarily by groundwater. The strong association between private well and CWS arsenic in "Rural, American Indian" and "Semi-Urban, Hispanic" counties suggests that future research and regulatory support are needed to reduce water arsenic exposures in these vulnerable subpopulations. This comparison of arsenic exposure values from major private and public drinking water sources nationwide is critical to future assessments of drinking water arsenic exposure and health outcomes. In Chapter 3, we aimed to determine the association between drinking water arsenic estimates and urinary arsenic concentrations in the 2003-2014 National Health and Nutrition Examination Survey (NHANES). We evaluated 11,088 participants from the 2003-2014 NHANES cycles. For each participant, we assigned private well and CWS arsenic levels according to county of residence, using estimates previously derived by the U.S. Environmental Protection Agency and U.S. Geological Survey. We used recalibrated urinary dimethylarsinate (rDMA) to reflect the internal dose of estimated water arsenic, applying a previously validated, residual-based method that removes the contribution of dietary arsenic sources. We compared the adjusted geometric mean ratios and corresponding percent change of urinary rDMA across tertiles of private well and CWS arsenic levels, with the lowest tertile as the reference. Comparisons were made overall and stratified by census region and race/ethnicity.
Overall, the geometric mean of urinary rDMA was 2.52 (2.30, 2.77) µg/L among private well users and 2.64 (2.57, 2.72) µg/L among CWS users. Urinary rDMA was highest among participants in the West and South, and among Mexican American, Other Hispanic, and Non-Hispanic Other participants. Urinary rDMA levels were 25% (95% confidence interval (CI): 17-34%) and 20% (95% CI: 12-29%) higher comparing the highest to the lowest tertile of CWS and private well arsenic, respectively. The strongest associations between water arsenic and urinary rDMA were observed among participants in the South and West, and among Mexican American and Non-Hispanic White and Black participants. Arsenic from both private wells and regulated CWSs contributes to inorganic arsenic internal dose as reflected in urine in the general U.S. population. In Chapter 4, our objective was to evaluate regional and sociodemographic inequalities in water arsenic exposure reductions associated with the US Environmental Protection Agency's Final Arsenic Rule, which lowered the arsenic maximum contaminant level to 10 µg/L in public water systems. We analyzed 8,544 participants from the 2003-2014 National Health and Nutrition Examination Survey (NHANES) reliant on community water systems (CWSs). We estimated arsenic exposure from water by recalibrating urinary dimethylarsinate (rDMA) to remove smoking and dietary contributions. We evaluated mean differences and corresponding percent reductions of urinary rDMA, comparing subsequent survey cycles to 2003-04 (baseline), stratified by region, race/ethnicity, educational attainment, and tertile of CWS arsenic assigned at the county level. The overall difference (percent reduction) in urine rDMA was 0.32 µg/L (9%) among participants in the highest tertile of CWS arsenic, comparing 2013-14 to 2003-04. Declines in urinary rDMA were largest in regions with the highest water arsenic: the South [0.57 µg/L (16%)] and West [0.46 µg/L (14%)].
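The percent differences quoted above come from geometric mean ratios (GMRs) of urinary rDMA across exposure tertiles. A small sketch of that arithmetic (the concentrations below are made up for illustration, not NHANES data):

```python
import math

def geometric_mean(values):
    """Geometric mean, the natural summary for right-skewed urinary metals."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

def percent_change(gmr):
    """A geometric mean ratio of 1.25 corresponds to a 25% higher level."""
    return (gmr - 1.0) * 100.0

high_tertile = [3.1, 4.2, 2.8, 3.6]  # hypothetical rDMA, ug/L
low_tertile = [2.4, 3.0, 2.2, 2.9]   # hypothetical reference tertile
gmr = geometric_mean(high_tertile) / geometric_mean(low_tertile)
```

In the actual analyses the GMRs are adjusted for covariates in regression models; the identity `percent_change(gmr)` is how a ratio like 1.25 becomes the reported "25% higher".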
Declines in urinary rDMA levels were significant and largest among Mexican American [0.99 µg/L (26%)] and Non-Hispanic White [0.25 µg/L (10%)] participants. Reductions in rDMA following the Final Arsenic Rule were greatest among participants with the highest CWS arsenic concentrations, supporting the conclusion that legislation can benefit those who need it most, although additional efforts are still needed to address remaining inequalities in CWS arsenic exposure. In Chapter 5, we examined the contribution of water As and U to urinary biomarkers in the Strong Heart Family Study (SHFS), a prospective study of American Indian communities, and the Multi-Ethnic Study of Atherosclerosis (MESA), a prospective study of racially/ethnically diverse urban US communities. We assigned residential zip code-level estimates in CWSs (µg/L) and private wells (90th percentile probability of As >10 µg/L) to up to 1,485 and 6,722 participants with dietary information and urinary biomarkers in the SHFS (2001-2003) and MESA (2000-2002; 2010-2011), respectively. Total inorganic As exposure was estimated as the sum of inorganic and methylated species in urine (urine As). We used linear mixed-effects models to account for participant clustering and removed the effect of dietary sources of As and U via regression adjustment. The median (interquartile range) urine As was 5.32 (3.29, 8.53) and 6.32 (3.34, 12.48) µg/L for the SHFS and MESA, respectively, and urine U was 0.037 (0.014, 0.071) and 0.007 (0.003, 0.018) µg/L. In a mixed-effects meta-analysis of pooled effects across the SHFS and MESA, urine As was 11% (95% CI: 3, 20%) higher and urine U was 35% (5, 73%) higher per 2-fold higher CWS As and U, respectively. In the SHFS, CWS and private well As explained >40% of the variability in urine As, and CWS U explained >20% of that in urine U. In MESA, CWS As and U explained >50% of urine As and U.
Water from public water supplies and private wells represents a major contributor to inorganic As and U exposure in diverse US populations. In Chapter 6, we examined the association of arsenic exposures in community water systems (CWS) and private wells with T2D incidence in the Strong Heart Family Study (SHFS), a prospective cohort of American Indian communities, and the Multi-Ethnic Study of Atherosclerosis (MESA), a prospective study of racially/ethnically diverse urban US communities, to evaluate direct associations between drinking water metal exposures and T2D risk. We evaluated adults in the SHFS free of T2D at baseline (2001-2003) and followed through 2010, with available private well and CWS arsenic (N=1,791) estimates assigned by residential zip code. We also evaluated adults in the MESA free of T2D at baseline (2000-2002) and followed through 2019, with available zip code level CWS arsenic (N=5,577) estimates. We used mixed effects Cox models to account for clustering by family and residential zip code, with adjustment for sex, baseline age, body mass index (BMI), smoking status, and education. T2D incidence in the SHFS was 24.4 cases per 1,000 people (mean follow-up 5.6 years) and T2D incidence in MESA was 11.2 per 1,000 people (mean follow-up 6.0 years). In a meta-analysis of pooled effects across the SHFS and MESA, the corresponding hazard ratio (95% confidence interval) per 2-fold increase in water arsenic was 1.09 (1.01, 1.16). Differences were observed by BMI category and sex; positive associations were observed among participants with BMI <25 kg/m2 and among female participants. In categorical analyses, >10% probability of private well arsenic (<10% reference) in the SHFS and >1 µg/L of CWS arsenic (<1 µg/L reference) in MESA were associated with increased diabetes risk. Low to moderate water arsenic levels in unregulated private wells and federally regulated CWSs were associated with T2D incidence in the SHFS and MESA. 
In supplementary analyses, we also observed that CWS uranium was associated with T2D risk among SHFS and MESA participants with BMI<25 kg/m2.
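The hazard ratios above are reported per 2-fold increase in water arsenic, which corresponds to fitting log-transformed exposure in the Cox model. A sketch of the conversion (the coefficient value is an assumption chosen only to reproduce an HR near the reported 1.09; it is not taken from the fitted models):

```python
import math

def hr_per_doubling(beta_per_ln_unit):
    """Hazard ratio per 2-fold exposure increase, given a Cox coefficient
    fitted on ln(exposure): HR = exp(beta * ln 2)."""
    return math.exp(beta_per_ln_unit * math.log(2.0))

def hr_for_ratio(beta_per_ln_unit, fold_change):
    """Generalization: HR for an arbitrary fold-change in exposure."""
    return math.exp(beta_per_ln_unit * math.log(fold_change))

beta = 0.124                    # assumed coefficient on ln(water As)
hr2 = hr_per_doubling(beta)     # about 1.09 per doubling
hr4 = hr_for_ratio(beta, 4.0)   # a 4-fold increase compounds two doublings
```

Reporting per doubling is convenient for right-skewed exposures: the HR compounds multiplicatively, so a 4-fold increase corresponds to the per-doubling HR squared.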

An evaluation of well-water nitrate exposure and related health risks in the Lower Umatilla Basin of Oregon

Mitchell, Thomas J. 04 May 1993 (has links)
Excessive nitrates in drinking water pose a human health threat, especially to infants. Methemoglobinemia, or blue-baby syndrome, is a potentially fatal condition that inhibits the ability of red blood cells to bind and transport oxygen. Nitrates/nitrites have also been linked to conditions such as cancer, birth defects, and behavioral and developmental abnormalities. Nitrates are frequently found in wells in rural farming areas because synthetic fertilizers (containing nitrates) leach from the soil into the groundwater. The Lower Umatilla Basin (LUB) in Morrow and Umatilla counties of Oregon is an intensively farmed and irrigated area in which relatively high amounts of nitrates are present in the groundwater and domestic well water. This study investigated population demographics for the rural Lower Umatilla Basin, comparing these data to identified well-water nitrate levels in order to estimate nitrate exposures and the potential risk of adverse health effects in the survey area. The investigation revealed that 25 percent of the domestic-use wells in the survey area had nitrate levels in excess of the 10 ppm nitrate-N MCL for drinking water established by the U.S. Environmental Protection Agency. Through use of these wells, 23 percent of the surveyed population was exposed to nitrate concentrations in excess of the MCL standard. However, resident infants were neither exposed to well-water nitrates in excess of the standard, nor were they exposed to illness that could have increased the risk of methemoglobinemia. The LUB survey population was generally older than the populations from cities in the LUB or the combined populations of rural areas of Morrow and Umatilla counties. The population included few women of childbearing age, and it was not subject to an appreciable increase in the proportion of younger to older families.
These factors reduced the likelihood of a significant increase in the infant population, which also minimized the risk of methemoglobinemia to this population. Even though the risk of methemoglobinemia to infants was low in the LUB area, it is recommended that exposures to well-water nitrates be prevented, if possible even for adults, to reduce the potential for chronic, adverse health effects from excess nitrate ingestion. Continued monitoring of private wells by state agencies is recommended, with attention directed at domestic-use wells with nitrate levels in excess of 10 ppm nitrate-N. This information should be shared with local health departments for follow-up, investigation, and educational efforts as needed. Future studies by the Oregon DEQ, or other agencies that seek to document the sources of well-water nitrate contamination in the LUB, should include an investigation of the influence of local sources of nitrate contamination. / Graduation date: 1993
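The exceedance percentages in this abstract reduce to simple fractions over the surveyed wells. A toy version of that tally (the well values below are hypothetical, not LUB survey data):

```python
MCL_NITRATE_N_PPM = 10.0  # US EPA drinking-water MCL for nitrate as N

def fraction_exceeding(levels_ppm, mcl=MCL_NITRATE_N_PPM):
    """Fraction of sampled wells with nitrate-N above the MCL."""
    return sum(1 for v in levels_ppm if v > mcl) / len(levels_ppm)

wells_ppm = [2.1, 4.5, 11.3, 9.8, 15.0, 3.2, 12.7, 6.4]  # hypothetical
share_over_mcl = fraction_exceeding(wells_ppm)  # 3 of 8 wells exceed
```

The population-exposure figure (23 percent here) is the analogous fraction computed over people served by each well rather than over the wells themselves.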

Quality of drinking water sources in the Bloemfontein area of the Mangaung Metropolitan Municipality

Ratikane, Mosepeli January 2013 (has links)
Thesis (M. Tech. (Environmental Health)) -- Central University of Technology, Free State, 2013 / Introduction: Drinking water of poor quality can cause a variety of diseases and may even result in death. The impact of poor drinking water is a cause for concern even in South Africa. Therefore, the physical, chemical and microbiological drinking water quality was investigated in the peri-urban area of Bainsvlei and the Woodlands Hills Estate in Bloemfontein, Free State. Materials and Methods: The water quality was assessed at 20 identified sampling sites in three series, ten weeks apart. These sites use treated municipal and untreated borehole water for drinking. The determinants analysed were pH, electrical conductivity (EC), turbidity, temperature, Ca, Mg, Na, F, Cl, N, SO₄, free chlorine, Al, As, CN, Fe, Mn, Pb, Hg, total coliforms and E. coli. The water samples were collected and analysed on site and in the laboratory. The physical and chemical determinants were measured using standard methods, whereas the microbiological determinants were measured using the Defined Substrate Technology (DST) method. The measurements were first compared to SANS 241 (2011) for compliance. ANOVA tests were used to investigate whether any seasonal variations existed in the water quality, as well as to compare the levels of the determinants between borehole and municipal water. The water quality index (WQI) was used to assess the overall drinking water quality of the different sampling sites. Results and Discussions: Effects were considered significant if the p-values of the ANOVA and Scheffe tests were at a significance level of 5% (p < 0.05). The study results revealed that, of the four physical determinants measured, turbidity exceeded the standard at many sampling sites in all three series. Of the chemical determinants, nitrates exceeded the standard.
Likewise, coliforms exceeded the standard at a number of sampling sites, while E. coli was found at a few sampling sites in the first series. ANOVA tests revealed that seasonal variations existed for pH, EC, temperature, cyanide and iron at a significance level of 5% (p < 0.05), while the post-hoc Scheffe test further revealed the series in which the effect existed. Similarly, the ANOVA tests revealed that the levels of the determinants in municipal versus borehole water varied for pH, EC, Ca, Mg, Na, F, Cl, N, and SO₄ at a significance level of 5% (p < 0.05). The WQI showed that, when the good and excellent categories were combined, season 2 had the highest percentage (80%), followed by season 3 (79%) and season 1 (70%). Only borehole sampling sites fell in the poor, very poor and unsuitable categories, and all the highest WQI values were found at borehole sampling sites. Conclusion: This study revealed that the water quality in the Bainsvlei area and Woodlands Hills Estate of the Mangaung Metropolitan Municipality in Bloemfontein, Free State, South Africa is generally good. However, the presence of E. coli, though found at only a few sampling sites, and the high levels of turbidity, nitrates and coliforms are of concern to public health.
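The abstract does not state which WQI formulation was used; the weighted-arithmetic form is one common choice in drinking-water studies. A hedged sketch of that formulation (the determinants, limits, and weights below are illustrative, loosely styled on SANS 241 limits, not the thesis's actual inputs):

```python
def weighted_arithmetic_wqi(measured, standards, weights):
    """Weighted-arithmetic water quality index: quality rating per
    determinant q_i = 100 * measured_i / standard_i, aggregated by weights.
    One common formulation; higher values indicate worse quality here."""
    q = [100.0 * m / s for m, s in zip(measured, standards)]
    return sum(w * qi for w, qi in zip(weights, q)) / sum(weights)

# hypothetical site: turbidity (NTU), nitrate (mg/L as N), EC (mS/m)
measured = [1.2, 13.0, 80.0]
standards = [1.0, 11.0, 170.0]  # illustrative SANS 241-style limits
weights = [0.4, 0.4, 0.2]
wqi = weighted_arithmetic_wqi(measured, standards, weights)
```

Sites are then binned into categories (excellent, good, poor, very poor, unsuitable) by thresholds on the index, which is how the season-by-season percentages in the abstract arise.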

Granular Media Supported Microbial Remediation of Nitrate Contaminated Drinking Water

Malini, R January 2014 (has links) (PDF)
Increasing nitrate concentration in groundwater from improper disposal of sewage and excessive use of fertilizers is deleterious to human health, as ingestion of nitrate contaminated water can cause methaemoglobinemia in infants and possibly cancer in adults. The permissible limit for nitrate in potable water is 45 mg/L. Unacceptable levels of nitrate in groundwater are an important environmental issue, as nearly 80% of the Indian rural population depends on groundwater as a source of drinking water. Though numerous technologies exist, such as reverse osmosis, ion exchange, electro-dialysis, and permeable reactive barriers using zero-valent iron, nitrate removal from water using affordable, sustainable technology continues to be a challenging issue, as the nitrate ion is not amenable to precipitation or removable by mineral adsorbents. Tapping the denitrification potential of soil denitrifiers inherently available in the soil matrix is a possible sustainable approach to removing nitrate from contaminated drinking water. In-situ denitrification is a useful process for removing NO3–N from water and wastewater. In biological denitrification, nitrate ions function as the terminal electron acceptor instead of oxygen; the carbon source serves as electron donor, and the energy generated in the redox process is utilized for microbial cell growth and maintenance. In this process, microorganisms first reduce nitrate to nitrite and then produce nitric oxide, nitrous oxide, and nitrogen gas. The pathway for nitrate reduction can be written as: NO3− → NO2− → NO → N2O → N2. In-situ denitrification occurring in soil environments, utilizing indigenous soil microbes, is the technique chosen for nitrate removal from drinking water in this thesis. As the presence of clay in soil promotes bacterial activity, bentonite clay was mixed with natural sand, and this mix, referred to as bentonite enhanced sand (BES), acted as the habitat for the denitrifying bacteria.
Nitrate reduction experiments were carried out in batch studies using laboratory-prepared nitrate contaminated water spiked with ethanol; the batch studies examined the mechanisms, kinetics and parameters influencing the heterotrophic denitrification process. Optimum conditions for effective nitrate removal by sand and bentonite enhanced sand (BES) were evaluated. Heterotrophic denitrification reactors were constructed with sand and BES as porous media, and the efficiency of these reactors in removing nitrate from contaminated water was studied. Batch experiments were performed at 40°C with sand and bentonite enhanced sand specimens that were wetted with nutrient solution containing 22.6 mg of nitrate-nitrogen and ethanol to give a C/N ratio of 3. The moist sand and BES specimens were incubated for periods ranging from 0 to 48 h. During nitrate reduction, nitrite ions were formed as an intermediate by-product and were converted to gaseous nitrogen. There was little formation of ammonium ions in the soil–water extract during reduction of nitrate ions. Hence it was inferred that nitrate reduction occurred by denitrification rather than through dissimilatory nitrate reduction to ammonium (DNRA). The reduction in nitrate concentration with time was fitted to rate equations and was observed to follow first-order kinetics with a rate constant of 0.118 h-1 at 40°C. Results of the batch studies also showed that the first-order rate constant for nitrate reduction decreased to 5.3x10-2 h-1 for sand and 4.3x10-2 h-1 for bentonite-enhanced sand (BES) at 25°C. Changes in pH, redox potential and dissolved oxygen in the soil-solution extract served as indicators of the nitrate reduction process. The nitrate reduction process was associated with increasing pH and decreasing redox potential. The oxygen depletion process followed first-order kinetics with a rate constant of 0.26 h-1.
From the first-order rate equation of the oxygen depletion process, the nitrate reduction lag time was computed to be 12.8 h for bentonite enhanced sand specimens. Ethanol added as an electron donor formed acetate ions as an intermediate by-product that converted to bicarbonate ions; each mole of nitrate reduced generated 1.93 moles of bicarbonate ions, which increased the pH of the soil-solution extract. The alkaline pH of the BES specimen (8.78) rendered it an ideal substrate for the soil denitrification process. In addition, bentonite stimulated respiration by maintaining adequate pH levels for sustained bacterial growth and protected bacteria in its microsites against the effects of hypertonic osmotic pressures, promoting the rate of denitrification. The buffering capacity of bentonite was mainly due to the high cation exchange capacity of the clay. The presence of small pores in BES specimens increased the water retention capacity, which aided the quick onset of anaerobiosis within the soil microsites. The biochemical process of nitrate reduction was affected by physical parameters such as bentonite content, water content and temperature, and by chemical parameters such as C/N ratio, initial nitrate concentration and the presence of indigenous micro-organisms in the contaminated water. The rate of the nitrate reduction process progressively increased with bentonite content, but the presence of bentonite retarded the conversion of nitrite ions to nitrogen gas; hence there was significant accumulation of nitrite ions with increase in bentonite content. The dependence of the nitrate reduction process on water content was controlled by the degree of saturation of the soil specimens. The rate of the nitrate reduction process increased with water content until the specimens were saturated. The threshold water content for the nitrate reduction process for sand and bentonite enhanced sand specimens was observed to be 50%.
The rate of nitrate reduction increased linearly with C/N ratio until steady state was attained. The optimum C/N ratio was 3 for both sand and bentonite enhanced sand specimens. The activation energy (Ea) for this biochemical reaction was 35.72 and 47.12 kJ mol-1 for the sand and BES specimens respectively. The temperature coefficient (Q10) is a measure of the rate of change of a biological or chemical system as a consequence of increasing the temperature by 10°C. The temperature coefficients of the sand and BES specimens were 2.0 and 2.05 respectively in the 15–25°C range, and 1.62 and 1.77 respectively in the 25–40°C range. The rate of nitrate reduction decreased linearly with increase in initial nitrate concentration. The biochemical process of nitrate reduction was unaffected by the presence of co-ions and nutrients such as phosphorus but was influenced by the presence of pathogenic bacteria. Since nitrate leaching from agricultural lands is the main source of nitrate contamination in groundwater, batch experiments were performed to examine the role of the vadose (unsaturated soil) zone in nitrate mitigation, employing sand and BES specimens with degree of soil saturation and C/N ratio as controlling parameters. Batch studies with sand and BES specimens showed that the incubation period required to reduce nitrate concentrations below 45 mg/L (t45) depends strongly on the degree of saturation when there is inadequate carbon source available to support denitrifying bacteria; once the optimum C/N ratio is provided, the rate of denitrification becomes independent of the degree of soil saturation. The theoretical lag time (the period required for denitrification to commence) for nitrate reduction for sand specimens at Sr = 81 and 90%, C/N ratio = 3 and temperature = 40°C corresponded to 24.4 h and 23.1 h respectively.
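The rate constants and Q10 values above follow from two standard relations: first-order decay C(t) = C0·exp(−kt) and Q10 = (k2/k1)^(10/(T2−T1)). A short numeric sketch using the constants reported in this abstract (the half-life is derived, not stated in the thesis):

```python
import math

def concentration(c0, k_per_h, t_h):
    """First-order decay of nitrate: C(t) = C0 * exp(-k t)."""
    return c0 * math.exp(-k_per_h * t_h)

def q10(k_low, k_high, temp_low_c, temp_high_c):
    """Temperature coefficient: rate-change factor per 10 degC rise."""
    return (k_high / k_low) ** (10.0 / (temp_high_c - temp_low_c))

# With k = 0.118 1/h at 40 degC, the nitrate half-life is ln(2)/k, about 5.9 h
half_life_h = math.log(2.0) / 0.118

# Sand rate constants from the abstract: 5.3e-2 1/h at 25 degC, 0.118 1/h at 40 degC
q10_sand = q10(5.3e-2, 0.118, 25.0, 40.0)  # lands near the reported ~1.6-1.7
```

The small discrepancy between this back-calculated Q10 and the thesis's reported 1.62 presumably reflects rounding in the quoted rate constants.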
The lag time for BES specimens at Sr = 84 and 100%, C/N ratio = 3 and temperature = 40°C corresponded to 13.9 h and 12.8 h respectively. Though the theoretically computed nitrate reduction lag time for BES specimens was nearly half that of sand specimens, it was experimentally observed that nitrate reduction proceeds immediately, without any lag phase, in both sand and BES specimens, suggesting the simultaneous occurrence of anaerobic microsites in both. Denitrification soil columns (height = 5 cm, diameter = 8.2 cm) were constructed using sand and bentonite-enhanced sand as porous reactor media. The columns were permeated with nitrate-spiked solutions (100 mg/L) and the outflow was monitored for various chemical parameters. The sand denitrification column (packing density of 1.3 Mg/m3) showed low nitrate removal efficiency because of its low hydraulic residence time (1.32 h) and the absence of a carbon source. A modified sand denitrification column constructed with a higher packing density (1.52 Mg/m3) and ethanol addition to the influent nitrate solution improved the reactor performance such that near-complete nitrate removal was achieved after the passage of 50 pore volumes. In comparison, the BES denitrification column achieved 87.3% nitrate removal after the passage of 28.9 pore volumes, corresponding to 86 h of operation of the BES reactor. This period represents the maturation period of a bentonite enhanced sand bed containing 10% bentonite. Though nitrate reduction is favored by a sand bed containing 10% bentonite, the low flow rate (20-25 cm3/h) impedes its use for large-scale removal of nitrate from drinking water. Hence a new reactor was designed using a lower bentonite content of 5%, which required a maturation period of only 9.6 h. The 5 and 10% bentonite-enhanced sand reactor beds required shorter maturation periods than the sand reactor, as the presence of bentonite increases the hydraulic retention time of nitrate within the reactor.
On continued operation of the BES reactors, a reduction in flow rate was observed, caused by blocking of pores through microbial growth on soil particles and accumulation of gas; this was resolved by backwashing the reactors.
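The pore-volume and residence-time figures in this abstract follow from the column geometry, packing porosity, and flow rate. A sketch of the arithmetic (the porosity and flow rate below are assumptions chosen only to illustrate the relations; the thesis reports neither directly):

```python
import math

def pore_volume_cm3(height_cm, diameter_cm, porosity):
    """Pore volume of a cylindrical packed column: bulk volume * porosity."""
    bulk = math.pi * (diameter_cm / 2.0) ** 2 * height_cm
    return bulk * porosity

def residence_time_h(pore_vol_cm3, flow_cm3_per_h):
    """Hydraulic residence time = pore volume / volumetric flow rate."""
    return pore_vol_cm3 / flow_cm3_per_h

def pore_volumes_passed(flow_cm3_per_h, hours, pore_vol_cm3):
    """Throughput expressed in pore volumes, as the abstract reports it."""
    return flow_cm3_per_h * hours / pore_vol_cm3

# Thesis column geometry: height 5 cm, diameter 8.2 cm; porosity assumed 0.4
pv = pore_volume_cm3(5.0, 8.2, 0.4)   # roughly 106 cm^3 under this assumption
hrt = residence_time_h(pv, 80.0)      # ~1.3 h at an assumed 80 cm^3/h flow
```

With these assumed values the residence time lands near the 1.32 h the abstract reports for the sand column, which is why lowering the flow rate (as bentonite does) lengthens contact time and improves removal.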

Metals Exposure and Cardiovascular Health: Characterizing Novel Risk Factors of Heart Failure

Martinez Morata, Irene January 2024 (has links)
Heart failure is a leading cause of death and disability worldwide. The identification of risk factors for heart failure in healthy individuals is key to improving disease prevention and reducing mortality. Metal exposures have recently been established as cardiovascular disease risk factors, but their association with heart failure remains understudied, and prospective studies across diverse populations are needed. Metals are widespread in the environment; sources of exposure include drinking water, air, and soil contamination. Some population groups, particularly American Indian, Hispanic/Latino, and Black communities in the United States, are exposed to higher levels of environmental metals as a result of sociodemographic and structural factors, including structural racism. These population groups suffer a higher burden of heart failure compared to White populations. Importantly, the burden of heart failure in American Indian communities in the United States, a population group with high rates of diabetes, hypertension, and other cardiovascular disease risk factors, is underreported, and key risk factors for heart failure in these population groups remain understudied. This dissertation characterized relevant risk factors for heart failure in American Indian participants from the Strong Heart Study. Toward the goal of identifying novel preventable cardiovascular disease risk factors, it comprehensively assessed the sources of exposure and biomarkers for multiple non-essential and essential metals, with a focus on characterizing drivers of disparities in drinking water metal concentrations. It then evaluated the role of exposure to multiple metals (individually and as a mixture) in the risk of heart failure, overall cardiovascular disease, and all-cause mortality, leveraging three geographically, racially, and ethnically diverse population-based cohorts: the Multi-Ethnic Study of Atherosclerosis (MESA), the Strong Heart Study (SHS), and the Hortega cohort.
Lastly, it identified and evaluated new opportunities for the mitigation of metal toxicity through nutritional interventions. Chapter 1 provides background information about heart failure epidemiology and pathophysiology and the role of environmental metals in cardiovascular disease, and introduces the framework necessary to contextualize the work included in this dissertation. Chapter 2 estimated the incidence of heart failure in the SHS, a large epidemiological cohort of American Indian adults from Arizona, Oklahoma, North Dakota, and South Dakota, followed from 1989-1991 through 2019. A parsimonious heart failure risk prediction equation that accounts for relevant cardiovascular risk factors affecting American Indian communities was developed. The incidence rate of heart failure was 9.5 per 1,000 person-years, with higher rates among participants with diabetes, hypertension, and albuminuria. Significant predictors of heart failure risk at 5 and 10 years included age, smoking, albuminuria, and previous myocardial infarction. Diabetes diagnosis and higher levels of HbA1c were significant predictors of risk at 10 and 28 years. Models achieved high discrimination performance (C-index (95% CI): 0.81 (0.76, 0.84) at 5 years, 0.78 (0.75, 0.81) at 10 years, and 0.77 (0.74, 0.78) up to 28 years), and some associations varied across heart failure subtypes. Chapter 3 developed a comprehensive overview of the main sources and routes of exposure, biotransformation, and biomarkers of exposure and internal dose for 12 metals/metalloids, including 8 non-essential elements (arsenic, barium, cadmium, lead, mercury, nickel, tin, uranium) and 4 essential elements (manganese, molybdenum, selenium, and zinc), providing a set of recommendations for the use and interpretation of metal biomarkers in epidemiological studies.
Chapter 4 conducted the first nationwide geospatial analysis of racial/ethnic inequalities in arsenic and uranium concentrations in public drinking water across the conterminous United States. The association between county-level racial/ethnic composition and public water arsenic and uranium concentrations (2000-2011) was assessed. Higher proportions of Hispanic/Latino and American Indian/Alaskan Native residents were associated with 6% (95% CI: 4-8%) and 7% (3-11%) higher levels of arsenic, and 17% (13-22%) and 2% (-4-8%) higher levels of uranium, respectively, in public drinking water, after accounting for relevant social and geological indicators. Higher county-level proportions of non-Hispanic Black residents were associated with higher arsenic and uranium in the Southwest, where concentrations of these contaminants are high. These findings identified the key role of structural racism as a driver of inequalities in drinking water metal concentrations. Chapter 5 evaluated the prospective association between urinary metal levels, an established biomarker of internal dose, and incident heart failure across three geographically and ethnically/racially diverse cohorts: MESA and the SHS in the United States, and the Hortega Study in Spain. These analyses consistently identified significant associations across cohorts for cadmium (pooled hazard ratio: 1.15 (95% CI: 1.07, 1.24)), tungsten (1.07 (1.02, 1.12)), copper (1.31 (1.18, 1.45)), molybdenum (1.13 (1.05, 1.22)), and zinc (1.22 (1.14, 1.32)). Higher levels of urinary metals analyzed as a mixture were significantly associated with increased incident heart failure risk in MESA and the SHS, and non-significantly increased risk in the Hortega Study, which had a smaller number of events.
Chapter 6 assessed the prospective association of urinary metals with incident cardiovascular disease (CVD) and all-cause mortality in MESA, including a total of 6,599 participants at baseline (2000-2001), followed through 2019. Significant associations were identified between higher levels of urinary cadmium, tungsten, uranium, cobalt, copper, and zinc and higher risk of CVD and all-cause mortality. A positive linear dose-response was identified for cadmium and copper with both endpoints. The adjusted HRs (95% CI) for an interquartile range (IQR) increase in the mixture of these six urinary metals, and the corresponding 10-year survival probability difference (95% CI), were 1.29 (1.11, 1.56) and -1.1% (-2.0, -0.05) for incident CVD, and 1.66 (1.47, 1.91) and -2.0% (-2.6, -1.5) for all-cause mortality. Chapter 7 investigated the effects of a nutritional intervention with folic acid (FA) and vitamin B12 supplementation on arsenic methylation in children exposed to high levels of drinking water arsenic in Bangladesh. The randomized controlled trial included a total of 240 children 8-11 years old. Compared to placebo, the supplementation group experienced a significant increase in the concentration of blood DMAs, a less toxic arsenic metabolite, by 14.0% (95% CI: 5.0, 25.0), and in the blood secondary methylation index (DMAs/MMAs) by 0.19 (95% CI: 0.09, 0.35). Similarly, urinary %DMAs was significantly higher by 1.62% (95% CI: 0.43, 2.83) and urinary %MMAs significantly lower by -1.10% (95% CI: -1.73, -0.48) compared to the placebo group after 1 week. These results confirmed that FA+B12 supplementation increases arsenic methylation in children, as reflected by decreased MMAs and increased DMAs in blood and urine. Altogether, the findings presented in this dissertation consistently identify urinary metals as robust risk factors for heart failure, overall cardiovascular disease, and all-cause mortality across diverse populations.
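Hazard ratios like those above are commonly reported per IQR increase in log-transformed exposure, obtained by rescaling the Cox coefficient estimated per log-unit. A sketch of that rescaling arithmetic, using entirely hypothetical numbers (the underlying model coefficients and metal quartiles are not given in this abstract):

```python
import math

def hr_per_iqr(beta_per_log_unit, q25, q75):
    """Rescale a Cox log-hazard coefficient, estimated per log-unit of
    exposure, to a hazard ratio per IQR increase on the log scale."""
    iqr_log = math.log(q75) - math.log(q25)  # IQR width in log-exposure units
    return math.exp(beta_per_log_unit * iqr_log)

# Hypothetical coefficient and urinary-metal quartiles (e.g. µg/g creatinine)
beta = 0.25          # log-hazard per unit increase in log(metal)
q25, q75 = 0.3, 1.2  # hypothetical 25th and 75th percentiles of exposure
print(round(hr_per_iqr(beta, q25, q75), 2))  # → 1.41
```

Reporting per IQR rather than per unit makes HRs comparable across metals whose concentrations span very different ranges, which is how the mixture results above can be summarized on a common scale.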
Findings were consistent across multiple assessments of the dose-response relationship and across mixture approaches. Additionally, this dissertation contributes to addressing disparities in environmental exposures and in the burden of heart failure by characterizing the impact of structural racism on disparities in drinking water metal exposures and by identifying relevant risk factors of heart failure in American Indian populations, who are historically underrepresented in epidemiological cohorts. Finally, this dissertation identifies the role of folic acid and B12 supplementation in reducing arsenic toxicity in children. These findings have direct clinical and policy implications: they can inform the development of clinical guidelines that incorporate environmental factors into risk prediction, guide drinking water regulation and infrastructure efforts to support at-risk communities, and inform population-level nutritional recommendations and policies.
