Hu J, Dong H, Xu Q, Ling W, Qu J, Qiang Z. Impacts of water quality on the corrosion of cast iron pipes for water distribution and proposed source water switch strategy. Water Research. 2017 Oct 31;129:428-435. doi: 10.1016/j.watres.2017.10.065.
A switch of source water may induce “red water” episodes. This study investigated the impacts of water quality on iron release, dissolved oxygen consumption (ΔDO), corrosion scale evolution and bacterial community succession in cast iron pipes used for drinking water distribution at pilot scale, and proposed a source water switch strategy accordingly. Three sets of old cast iron pipe sections (named BP, SP and GP), which had historically transported blended water, surface water and groundwater, respectively, were excavated on site and assembled in a test base. Results indicate that an increasing Cl- or SO42- concentration accelerated iron release, whereas increasing alkalinity and calcium hardness had the opposite effect. A disinfectant shift from free chlorine to monochloramine slightly inhibited iron release, while the impact of peroxymonosulfate depended on the source water historically transported in the test pipes. The ΔDO was highly consistent with iron release in all three pipe systems. The mass ratio of magnetite to goethite in the corrosion scales of SP was higher than those of BP and GP and remained almost unchanged over the whole operation period. Siderite and calcite formation confirmed that increasing alkalinity and hardness inhibited iron release. Iron-reducing bacteria decreased in the BP but increased in the SP and GP; meanwhile, sulfur-oxidizing, sulfate-reducing and iron-oxidizing bacteria increased in all three pipe systems. To avoid the occurrence of “red water”, a source water switch strategy was proposed based on the difference between local and foreign water qualities.
Newton SR, McMahen RL, Sobus JR, Mansouri K, Williams AJ, McEachran AD, Strynar MJ.
Suspect screening and non-targeted analysis of drinking water using point-of-use filters. Environ Pollut. 2017 Nov 25;234:297-306. doi: 10.1016/j.envpol.2017.11.033.
Monitored contaminants in drinking water represent a small portion of the total compounds present, many of which may be relevant to human health. To understand the totality of human exposure to compounds in drinking water, broader monitoring methods are imperative. In an effort to more fully characterize the drinking water exposome, point-of-use water filtration devices (Brita® filters) were employed to collect time-integrated drinking water samples in a pilot study of nine North Carolina homes. A suspect screening analysis was performed by matching high resolution mass spectra of unknown features to molecular formulas from EPA’s DSSTox database. Candidate compounds with those formulas were retrieved from the EPA’s CompTox Chemistry Dashboard, a recently developed data hub for approximately 720,000 compounds. To prioritize compounds into those most relevant for human health, toxicity data from the US federal collaborative Tox21 program and the EPA ToxCast program, as well as exposure estimates from EPA’s ExpoCast program, were used in conjunction with sample detection frequency and abundance to calculate a “ToxPi” score for each candidate compound. From ∼15,000 molecular features in the raw data, 91 candidate compounds were ultimately grouped into the highest-priority class for follow-up study. Fifteen of these compounds were confirmed using analytical standards, including the highest-priority compound, 1,2-benzisothiazolin-3-one, which appeared in 7 out of 9 samples. The majority of the other high-priority compounds are not targets of routine monitoring, highlighting major gaps in our understanding of drinking water exposures. General product-use categories from EPA’s CPCat database revealed that several of the high-priority chemicals are used in industrial processes, indicating that the drinking water in central North Carolina may be impacted by local industries.
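The prioritization step described above can be illustrated with a minimal sketch of a ToxPi-style score, in which each candidate compound receives normalized, weighted “slices” for toxicity, predicted exposure, detection frequency, and abundance. The compound names, slice values, and equal weighting below are invented for illustration; this is not the study’s data or the official ToxPi software.

```python
# Hypothetical sketch of ToxPi-style prioritization: each compound gets
# slices normalized to [0, 1]; the score is a weighted average of slices.
candidates = {
    "compound_A": {"toxicity": 0.9, "exposure": 0.7, "detection_freq": 7 / 9, "abundance": 0.8},
    "compound_B": {"toxicity": 0.4, "exposure": 0.2, "detection_freq": 3 / 9, "abundance": 0.5},
    "compound_C": {"toxicity": 0.6, "exposure": 0.9, "detection_freq": 5 / 9, "abundance": 0.3},
}

def toxpi_score(slices, weights=None):
    """Weighted average of normalized slices (equal weights by default)."""
    weights = weights or {k: 1.0 for k in slices}
    total = sum(weights.values())
    return sum(slices[k] * weights[k] for k in slices) / total

# Rank candidates from highest to lowest priority.
ranked = sorted(candidates, key=lambda c: toxpi_score(candidates[c]), reverse=True)
print(ranked)
```

In practice each slice would itself be a scaled summary of many assay or model outputs (e.g. ToxCast hit rates, ExpoCast exposure estimates), but the ranking logic reduces to this weighted aggregation.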
Lane K, Stoddart AK, Gagnon GA. Water safety plans as a tool for drinking water regulatory frameworks in Arctic communities. Environmental Science and Pollution Research International. 2017 Jul 14. doi: 10.1007/s11356-017-9618-9.
Arctic communities often face drinking water supply challenges that are unique to their location. Consequently, conventional drinking water regulatory strategies often do not meet the needs of these communities. A literature review of Arctic jurisdictions was conducted to evaluate current water management approaches and how these techniques could be applied to the territory of Nunavut in Canada. The jurisdictions included are all member countries of the Arctic Council, together with other Canadian jurisdictions considered important to understanding water management for Northern Canadian communities. The communities in Nunavut face many challenges in delivering safe water to customers due to remoteness, small community size and the resulting staffing constraints, a lack of guidelines and monitoring procedures specific to Nunavut, and water treatment and distribution systems that are vastly different from those used in southern communities. Water safety plans were explored as an alternative to water quality regulations, as recent case studies have demonstrated the utility of this risk management tool, especially in the context of small communities. Iceland and Alberta both currently have regulated water safety plans (WSPs) and were examined to understand the shortcomings and benefits of applying WSPs as a possible strategy in Nunavut. Finally, this study discusses specific considerations that are necessary should a WSP approach be applied in Nunavut.
Gunnarsdottir MJ, Gardarsson SM, Jonsson GS, Bartram J. Chemical quality and regulatory compliance of drinking water in Iceland. Int J Hyg Environ Health. 2016 Sep 26. pii: S1438-4639(16)30175-4. doi: 10.1016/j.ijheh.2016.09.011.
Assuring sufficient quality of drinking water is of great importance for public wellbeing and prosperity. Nations have developed regulatory systems with the aim of providing drinking water of sufficient quality and of minimizing the risk of contamination of the water supply in the first place. In this study the chemical quality of Icelandic drinking water was evaluated by systematically analyzing results from audit monitoring, in which 53 parameters were assessed for 345 samples from 79 aquifers serving 74 water supply systems. Compliance with the Icelandic Drinking Water Regulation (IDWR) was evaluated with regard to parametric values, minimum sampling requirements, and limit of detection. Water quality compliance was divided into health-related chemicals and indicator parameters, and analyzed according to system size. Samples from a few individual locations were benchmarked against natural background levels (NBLs) in order to identify potential pollution sources. The results show that drinking water compliance was 99.97% for health-related chemicals and 99.44% for indicator parameters, indicating that Icelandic groundwater abstracted for drinking water supply is generally of high quality with no expected health risks. In 10 of the 74 water supply systems tested there was an indication of anthropogenic chemical pollution, either at the source or in the network, and in another 6 water supplies there was a need to improve the water intake to prevent surface water intrusion. Benchmarking against the NBLs proved useful in tracing potential pollution sources, providing a practical tool for identifying pollution at an early stage.
Cui C, Jin L, Jiang L, Han Q, Lin K, Lu S, Zhang D, Cao G.
Removal of trace level amounts of twelve sulfonamides from drinking water by UV-activated peroxymonosulfate. Science of the Total Environment. 2016 Aug 5;572:244-251. doi: 10.1016/j.scitotenv.2016.07.183.
Trace levels of residual antibiotics in drinking water may threaten public health and become a serious problem in modern society. In this work, we investigated the degradation of twelve sulfonamides (SAs) at environmentally relevant trace-level concentrations by three different methods: ultraviolet (UV) photolysis, peroxymonosulfate (PMS) oxidation, and UV-activated PMS (UV/PMS). Sulfaguanidine, sulfadiazine, sulfamerazine, sulfamethazine, sulfathiazole, sulfamethoxydiazine, and sulfadimethoxine were effectively removed by direct UV photolysis and PMS oxidation. However, sulfanilamide, sulfamethizole, sulfamethoxazole, sulfisoxazole, and sulfachloropyridazine were not completely degraded, despite prolonging the UV irradiation time to 30 min or increasing the PMS concentration to 5.0 mg·L-1. UV/PMS provided more thorough elimination of SAs, as demonstrated by the complete removal of 200 ng·L-1 of all SAs within 5 min at an initial PMS concentration of 1.0 mg·L-1. UV/PMS promoted SA decomposition more efficiently than UV photolysis or PMS oxidation alone. Bicarbonate concentration and pH had a negligible effect on SA degradation by UV/PMS; however, humic acid retarded the process. Removal of 200 ng·L-1 of each SA from a sample of sand-filtered effluent from a drinking water treatment plant (DWTP) was quickly and completely achieved by UV/PMS. Meanwhile, about 41% of the total organic carbon (TOC) was eliminated. Scavenging experiments showed that the sulfate radical (SO4•-) was the predominant species involved in the degradation. It is concluded that UV/PMS is a rapid and efficient method for removing trace-level SAs from drinking water.
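Advanced-oxidation degradation of micropollutants such as these SAs is commonly modeled with pseudo-first-order kinetics. The sketch below shows how a single assumed rate constant translates into “complete removal within 5 min”; the rate constant is an invented value for illustration, not one fitted by the study.

```python
import math

# Pseudo-first-order decay, C(t) = C0 * exp(-k * t), a common kinetic model
# for UV/PMS degradation. k here is an assumed illustrative value.
c0 = 200.0   # initial SA concentration, ng/L (as in the abstract)
k = 1.0      # assumed pseudo-first-order rate constant, 1/min

def remaining(t_min):
    """Concentration (ng/L) remaining after t_min minutes of treatment."""
    return c0 * math.exp(-k * t_min)

# With k = 1.0 min^-1, over 99% of the SA is gone after 5 minutes.
print(f"After 5 min: {remaining(5):.1f} ng/L")
```

Fitting ln(C0/C) versus time to a straight line is the standard way such rate constants are extracted from measured concentration series.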
The use of water residence time as a proxy for contamination by intrusion into a water distribution system is unsupported and speculative at best. It seems these researchers do not fully understand drinking water distribution systems. Lastly, the ORs and CIs reported here are very close to 1 and well within the range of no effect. Why speculate with such a weak finding? Let’s use some common sense to operate and maintain water distribution systems and to maintain disinfectant residuals.
Levy K, Klein M, Sarnat SE, Panwhar S, Huttinger A, Tolbert P, Moe C. Refined assessment of associations between drinking water residence time and emergency department visits for gastrointestinal illness in Metro Atlanta, Georgia. Journal of Water and Health. 2016 Aug;14(4):672-681.
Recent outbreak investigations suggest that a substantial proportion of waterborne disease outbreaks are attributable to water distribution system issues. In this analysis, we examine the relationship between modeled water residence time (WRT), a proxy for probability of microorganism intrusion into the distribution system, and emergency department visits for gastrointestinal (GI) illness for two water utilities in Metro Atlanta, USA during 1993-2004. We also examine the association between proximity to the nearest distribution system node, based on patients’ residential address, and GI illness using logistic regression models. Comparing long (≥90th percentile) with intermediate WRTs (11th to 89th percentile), we observed a modestly increased risk for GI illness for Utility 1 (OR = 1.07, 95% CI: 1.02-1.13), which had substantially higher average WRT than Utility 2, for which we found no increased risk (OR = 0.98, 95% CI: 0.94-1.02). Examining finer, 12-hour increments of WRT, we found that exposures >48 h were associated with increased risk of GI illness, and exposures of >96 h had the strongest associations, although none of these associations was statistically significant. Our results suggest that utilities might consider reducing WRTs to <2-3 days or adding booster disinfection in areas with longer WRT, to minimize risk of GI illness from water consumption.
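For readers weighing the reported effect sizes, an odds ratio and its 95% confidence interval follow directly from a logistic regression coefficient and its standard error: OR = exp(beta), CI = exp(beta ± 1.96·SE). The beta and se values below are hypothetical, back-solved to be roughly consistent with the Utility 1 result; they are not the study’s fitted model.

```python
import math

# Hypothetical coefficient and standard error for the long-WRT indicator,
# chosen to reproduce an interval like OR = 1.07 (95% CI: 1.02-1.13).
beta = 0.0677   # log-odds coefficient (assumed value)
se = 0.0262     # standard error of beta (assumed value)

or_point = math.exp(beta)                 # point estimate of the odds ratio
ci_low = math.exp(beta - 1.96 * se)       # lower 95% confidence bound
ci_high = math.exp(beta + 1.96 * se)      # upper 95% confidence bound

print(f"OR = {or_point:.2f}, 95% CI: {ci_low:.2f}-{ci_high:.2f}")
```

The same arithmetic explains why a CI whose lower bound sits just above 1.0 (as for Utility 1) is a statistically significant but modest association, while one that straddles 1.0 (as for Utility 2) is consistent with no effect.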
Sun R, An D, Lu W, Shi Y, Wang L, Zhang C, Zhang P, Qi H, Wang Q. Impacts of a flash flood on drinking water quality: case study of areas most affected by the 2012 Beijing flood. Heliyon. 2016 Feb 19;2(2):e00071. doi: 10.1016/j.heliyon.2016.e00071.
In this study, we present a method for identifying sources of water pollution and their relative contributions in pollution disasters. The method uses a combination of principal component analysis and factor analysis. We carried out a case study in three rural villages close to Beijing after torrential rain on July 21, 2012. Nine water samples were analyzed for eight parameters, namely turbidity, total hardness, total dissolved solids, sulfates, chlorides, nitrates, total bacterial count, and total coliform groups. All of the samples showed different degrees of pollution, and most were unsuitable for drinking as concentrations of various parameters exceeded recommended thresholds. Principal component analysis and factor analysis showed that two factors, namely (1) the degree of mineralization together with agricultural runoff and (2) flood entrainment, explained 82.50% of the total variance. The case study demonstrates that this method is useful for evaluating and interpreting large, complex water-quality data sets.
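The “explained 82.50% of the total variance” figure comes from the eigenvalues of the covariance (or correlation) matrix of the measured parameters: each component’s share is its eigenvalue divided by the eigenvalue sum. A minimal pure-Python sketch for two standardized parameters follows; the nine measurements below are invented for illustration, not the study’s data.

```python
import math

# Hypothetical standardized values of two water-quality parameters
# (say, turbidity and total coliform count) across nine samples.
turbidity = [-1.2, -0.8, -0.5, -0.1, 0.0, 0.3, 0.6, 0.7, 1.0]
coliform  = [-1.0, -0.9, -0.3, -0.2, 0.1, 0.2, 0.5, 0.8, 0.8]
n = len(turbidity)

def cov(x, y):
    """Sample covariance of two equal-length lists."""
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n - 1)

# 2x2 covariance matrix [[sxx, sxy], [sxy, syy]]
sxx, syy, sxy = cov(turbidity, turbidity), cov(coliform, coliform), cov(turbidity, coliform)

# Closed-form eigenvalues of a symmetric 2x2 matrix: tr/2 +/- sqrt(tr^2/4 - det)
tr, det = sxx + syy, sxx * syy - sxy * sxy
disc = math.sqrt(tr * tr / 4 - det)
lam1, lam2 = tr / 2 + disc, tr / 2 - disc

# Fraction of total variance carried by the first principal component.
explained = lam1 / (lam1 + lam2)
print(f"PC1 explains {explained:.1%} of total variance")
```

With eight parameters the covariance matrix is 8x8 and the eigenvalues come from a numerical routine rather than a closed form, but the variance-explained ratio is computed the same way, summing the shares of however many factors are retained.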