Ecologic studies such as this are not very informative because of their limited exposure assessment, but they can certainly generate alarm. Even so, small associations such as these are questionable regardless of the precision of the underlying computations. Note also the absence of confidence intervals on the estimates, which by itself sends a strong message to ignore the study altogether.
Saint-Jacques N, Brown P, Nauta L, Boxall J, Parker L, Dummer TJB. Estimating the risk of bladder and kidney cancer from exposure to low-levels of arsenic in drinking water, Nova Scotia, Canada. Environment International. 2017 Oct 28. pii: S0160-4120(17)31385-5. doi: 10.1016/j.envint.2017.10.014.
Arsenic in drinking water impacts health. The highest arsenic levels have historically been observed in Taiwan and Bangladesh, but the contaminant affects the health of people globally. Strong associations have been confirmed between exposure to high levels of arsenic in drinking water and a wide range of diseases, including cancer. At lower levels of exposure, however, especially near the current World Health Organization regulatory limit (10μg/L), the association is inconsistent, as effects there are mostly extrapolated from high-exposure studies. This ecological study used Bayesian inference to model the relative risk of bladder and kidney cancer at these lower concentrations (<2μg/L; 2-5μg/L; and ≥5μg/L of arsenic) among 864 bladder and 525 kidney cancers diagnosed in the study area, Nova Scotia, Canada, between 1998 and 2010. The model included proxy measures of lifestyle (e.g. smoking) and accounted for spatial dependencies. Overall, bladder cancer risk was 16% (2-5μg/L) and 18% (≥5μg/L) greater than that of the referent group (<2μg/L), with posterior probabilities of 88% and 93% that these relative risks exceed 1. Effect sizes for kidney cancer were 5% (2-5μg/L) and 14% (≥5μg/L) above the referent group, with corresponding probabilities of 61% and 84%. High-risk areas clustered in the southwest of the province, where higher arsenic levels are associated with the local geology. The study suggests an increased risk of bladder cancer, and potentially kidney cancer, from exposure to drinking-water arsenic at levels within the current World Health Organization maximum acceptable concentration.
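The posterior probabilities quoted above (88% and 93%) are simply the share of posterior draws in which the relative risk exceeds 1. A minimal sketch of how such a probability is read off a Bayesian model's output; the lognormal posterior below is a hypothetical stand-in for the authors' fitted spatial model, not their actual posterior:

```python
import math
import random

random.seed(1)

# Hypothetical posterior draws for the >=5 ug/L bladder-cancer relative risk,
# centred near the reported 18% excess (RR ~ 1.18); spread is assumed.
draws = [math.exp(random.gauss(math.log(1.18), 0.11)) for _ in range(50_000)]

# Pr(RR > 1) is just the fraction of draws above 1.
pr_above_1 = sum(rr > 1.0 for rr in draws) / len(draws)
posterior_mean = sum(draws) / len(draws)
```

With these assumed values, `pr_above_1` comes out near the paper's reported 93%; the point is that the "probability the risk is above 1" is a direct summary of the posterior sample, not a frequentist p-value.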
Li F, Qiu Z, Zhang J, Liu C, Cai Y, Xiao M. Spatial Distribution and Fuzzy Health Risk Assessment of Trace Elements in Surface Water from Honghu Lake. International Journal of Environmental Research and Public Health. 2017 Sep 4;14(9). pii: E1011. doi: 10.3390/ijerph14091011.
Previous studies revealed that Honghu Lake was polluted by trace elements from anthropogenic activities. This study investigated the spatial distribution of trace elements in Honghu Lake and identified the major pollutants and priority control areas based on a screening-level fuzzy health risk assessment. The mean total content of trace elements in surface water decreased in the order Zn (18.04 μg/L) > Pb (3.42 μg/L) > Cu (3.09 μg/L) > Cr (1.63 μg/L) > As (0.99 μg/L) > Cd (0.14 μg/L), all within the limits of drinking water guidelines. The fuzzy health risk assessment indicated no obvious non-carcinogenic risk to human health, while carcinogenic risk was observed in the descending order As > Cr > Cd > Pb. As was regarded as posing the highest carcinogenic risk among the selected trace elements because it generally accounted for 64% of the integrated carcinogenic risk. The potential carcinogenic risk of trace elements at each sampling site was approximately at the medium risk level (10⁻⁵ to 10⁻⁴). Areas in the south (S4, S13, and S16) and northeast (S8, S18, and S19) of Honghu Lake were regarded as the risk priority control areas. However, the corresponding maximum memberships of integrated carcinogenic risk in S1, S3, S10-S13, S15, and S18 were of relatively low credibility (50-60%), and may mislead decision-makers in identifying the risk priority areas. The fuzzy assessment presented the subordinate grade and corresponding reliability of each risk, providing more comprehensive results for decision-makers and compensating, to some extent, for the limitations of deterministic assessment.
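The "integrated carcinogenic risk" above is the sum of per-element risks, each the product of a chronic daily intake and a cancer slope factor; the 64% figure for As is its share of that sum. A sketch of the aggregation under assumed, illustrative slope factors and intakes (not the paper's inputs):

```python
# Illustrative oral slope factors in (mg/kg/day)^-1; values are assumptions
# for the sketch, not the study's parameters.
slope_factors = {"As": 1.5, "Cr": 0.5, "Cd": 0.38, "Pb": 0.0085}

def carcinogenic_risks(daily_intakes):
    """daily_intakes maps element -> chronic daily intake in mg/kg/day.

    Returns per-element risk, the integrated (summed) risk, and each
    element's share of the integrated risk.
    """
    risks = {el: daily_intakes[el] * sf for el, sf in slope_factors.items()}
    total = sum(risks.values())
    shares = {el: r / total for el, r in risks.items()}
    return risks, total, shares
```

With plausible intakes this lands in the 10⁻⁵ to 10⁻⁴ band the paper calls "medium risk", and As dominates because its slope factor is the largest of the four.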
Li Z, Jennings A. Worldwide Regulations of Standard Values of Pesticides for Human Health Risk Control: A Review. International Journal of Environmental Research and Public Health. 2017 Jul 22;14(7). pii: E826. doi: 10.3390/ijerph14070826.
The impact of pesticide residues on human health is a worldwide problem, as human exposure to pesticides can occur through ingestion, inhalation, and dermal contact. Regulatory jurisdictions have promulgated standard values for pesticides in residential soil, air, drinking water, and agricultural commodities for years. To date, more than 19,400 pesticide soil regulatory guidance values (RGVs) and 5,400 pesticide drinking water maximum concentration levels (MCLs) have been regulated by 54 and 102 nations, respectively. Over 90 nations have provided pesticide agricultural commodity maximum residue limits (MRLs) for at least one of the 12 most commonly consumed agricultural foods. A total of 22 pesticides have been regulated with more than 100 soil RGVs, and 25 pesticides have more than 100 drinking water MCLs. This research indicates that the RGVs and MCLs for an individual pesticide can vary over seven (DDT drinking water MCLs), eight (Lindane soil RGVs), or even nine (Dieldrin soil RGVs) orders of magnitude. Human health risk uncertainty bounds and the implied total exposure mass burden model were applied to analyze the most commonly regulated and used pesticides for human health risk control. For the 27 most commonly regulated pesticides in soil, at least 300 RGVs (8% of the total) lie above all of the computed upper bounds for human health risk uncertainty. For the 29 most commonly regulated pesticides in drinking water, at least 172 MCLs (5% of the total) exceed the computed upper bounds; and for the 14 most widely used pesticides, at least 310 computed implied dose limits (28.0% of the total) are above the acceptable daily intake values.
The results show that some worldwide standard values were not derived conservatively enough to protect against human health risk from these pesticides, and that some were not computed comprehensively, in that not all major human exposure pathways were considered.
Zhang LE, Huang D, Yang J, Wei X, Qin J, Ou S, Zhang Z, Zou Y. Probabilistic risk assessment of Chinese residents' exposure to fluoride in improved drinking water in endemic fluorosis areas. Environmental Pollution. 2017 Mar;222:118-125. doi: 10.1016/j.envpol.2016.12.074.
Studies have yet to evaluate the effects of water improvement on fluoride concentrations in drinking water and the corresponding health risks to Chinese residents in endemic fluorosis areas (EFAs) at a national level. This paper summarized available data in the published literature (2008-2016) on water fluoride from the EFAs in China before and after water quality was improved. Based on these data, a health risk assessment of Chinese residents' exposure to fluoride in improved drinking water was performed using a probabilistic approach. The uncertainties in the risk estimates were quantified using Monte Carlo simulation and sensitivity analysis. Our results showed that, in general, the average fluoride levels (0.10-2.24 mg/L) in the improved drinking water in the EFAs of China were lower than the pre-intervention levels (0.30-15.24 mg/L). The highest fluoride levels were detected in North and Southwest China. The mean non-carcinogenic risks associated with consumption of the improved drinking water were mostly acceptable for Chinese residents (hazard quotient < 1), but the non-carcinogenic risk to children in most of the EFAs exceeded the safe level of 1 at the 95th percentile, indicating potential non-carcinogenic health effects in this fluoride-exposed population. Sensitivity analyses indicated that the fluoride concentration in drinking water, the ingestion rate of water, and the exposure time in the shower were the most influential variables in the model; efforts should therefore focus on defining their probability distributions for a more accurate risk assessment.
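The probabilistic approach described above amounts to drawing exposure inputs from distributions and propagating them through the hazard-quotient formula HQ = (C × IR) / (BW × RfD). A minimal Monte Carlo sketch; the distributions and the reference dose are illustrative assumptions, not the paper's fitted inputs, and the shower pathway is omitted for brevity:

```python
import random

random.seed(0)
RFD = 0.06  # mg/kg/day; an assumed oral reference dose for fluoride

def draw_hq():
    c = random.uniform(0.10, 2.24)           # water fluoride, mg/L (reported post-intervention range)
    ir = max(random.gauss(1.5, 0.5), 0.1)    # drinking-water intake, L/day (assumed)
    bw = max(random.gauss(60.0, 10.0), 30.0) # body weight, kg (assumed adult)
    return (c * ir) / (bw * RFD)

hqs = sorted(draw_hq() for _ in range(20_000))
mean_hq = sum(hqs) / len(hqs)
p95 = hqs[int(0.95 * len(hqs))]
```

Under these assumptions the mean HQ sits below 1 while the 95th percentile is markedly higher, which mirrors the paper's pattern of acceptable mean risk but upper-percentile exceedance in sensitive groups.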
Amoueyan E, Ahmad S, Eisenberg JNS, Pecson B, Gerrity D. Quantifying pathogen risks associated with potable reuse: A risk assessment case study for Cryptosporidium. Water Research. 2017 Apr 19;119:252-266. doi: 10.1016/j.watres.2017.04.048.
This study evaluated the reliability and equivalency of three different potable reuse paradigms: (1) surface water augmentation via de facto reuse with conventional wastewater treatment; (2) surface water augmentation via planned indirect potable reuse (IPR) with ultrafiltration, pre-ozone, biological activated carbon (BAC), and post-ozone; and (3) direct potable reuse (DPR) with ultrafiltration, ozone, BAC, and UV disinfection. A quantitative microbial risk assessment (QMRA) was performed to (1) quantify the risk of infection from Cryptosporidium oocysts; (2) compare the risks associated with different potable reuse systems under optimal and sub-optimal conditions; and (3) identify critical model/operational parameters based on sensitivity analyses. The annual risks of infection associated with the de facto and planned IPR systems were generally consistent with those of conventional drinking water systems [mean of (9.4 ± 0.3) × 10⁻⁵ to (4.5 ± 0.1) × 10⁻⁴], while DPR was clearly superior [mean of (6.1 ± 67) × 10⁻⁹ during sub-optimal operation]. Because the advanced treatment train in the planned IPR system was highly effective in reducing Cryptosporidium concentrations, the associated risks were generally dominated by the pathogen loading already present in the surface water. As a result, risks generally decreased with higher recycled water contributions (RWCs). Advanced treatment failures were generally inconsequential either due to the robustness of the advanced treatment train (i.e., DPR) or resiliency provided by the environmental buffer (i.e., planned IPR). Storage time in the environmental buffer was important for the de facto reuse system, and the model indicated a critical storage time of approximately 105 days. Storage times shorter than the critical value resulted in significant increases in risk.
The conclusions from this study can be used to inform regulatory decision making and aid in the development of design or operational criteria for IPR and DPR systems.
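The annual infection risks above come from the standard QMRA chain: a dose-response model converts an ingested oocyst dose into a daily infection probability, and daily risks are then compounded over a year. A sketch using the common exponential dose-response form for Cryptosporidium; the parameter values are assumptions for illustration, not the values fitted in this study:

```python
import math

R = 0.09           # assumed exponential dose-response parameter (per oocyst)
INGESTION_L = 1.0  # assumed unboiled drinking water ingested per day, litres

def daily_risk(conc_per_litre):
    """Probability of infection from one day's ingestion at a given
    oocyst concentration (oocysts/L), via P = 1 - exp(-r * dose)."""
    dose = conc_per_litre * INGESTION_L
    return 1.0 - math.exp(-R * dose)

def annual_risk(conc_per_litre, days=365):
    """Compound independent daily risks into an annual risk of infection."""
    return 1.0 - (1.0 - daily_risk(conc_per_litre)) ** days
```

At the very low concentrations advanced treatment achieves, the annual risk is roughly 365 times the daily risk, which is why log-removal through the treatment train translates almost directly into log-reduction of annual risk.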
Vinceti M, Filippini T, Cilloni S, Bargellini A, Vergoni AV, Tsatsakis A, Ferrante M. Health risk assessment of environmental selenium: Emerging evidence and challenges (Review). Molecular Medicine Reports. 2017 Mar 24. doi: 10.3892/mmr.2017.6377.
New data have accumulated in the scientific literature in recent years that allow a more adequate risk assessment of selenium with respect to human health. This new evidence comes from environmental studies, carried out in populations characterized by abnormally high or low selenium intakes, and from large, high-quality randomized controlled trials of selenium recently carried out in the US and other countries. These trials have consistently shown no beneficial effect on cancer or cardiovascular risk, and have yielded indications of unexpected toxic effects of selenium exposure. Overall, these studies indicate that the minimal amount of environmental selenium that poses a risk to human health is much lower than anticipated on the basis of older studies, since toxic effects were shown at intake levels as low as around 260 µg/day for organic selenium and around 16 µg/day for inorganic selenium. Conversely, populations with an average selenium intake of less than 13-19 µg/day appear to be at risk of a severe cardiomyopathy, Keshan disease. Overall, there is a need to reconsider the selenium standards for dietary intake, drinking water, and outdoor and indoor air levels, taking into account the recently discovered adverse health effects of low-dose selenium overexposure and carefully assessing the significance of selenium-induced proteomic changes.
Lumen A, George NI. Evaluation of the risk of perchlorate exposure in a population of late-gestation pregnant women in the United States: Application of probabilistic biologically-based dose response modeling. Toxicology and Applied Pharmacology. 2017 Mar 2. pii: S0041-008X(17)30098-4. doi: 10.1016/j.taap.2017.02.021.
The risk of ubiquitous perchlorate exposure and its dose-response effect on thyroid hormone levels in pregnant women in the United States (U.S.) have yet to be characterized. In the current work, we integrated a previously developed perchlorate submodel into a recently developed population-based pregnancy model to predict reductions in maternal serum free thyroxine (fT4) levels for late-gestation pregnant women in the U.S. Our findings indicated no significant difference in geometric mean estimates of fT4 when perchlorate exposure from food only was compared to no perchlorate exposure. The reduction in maternal fT4 levels reached statistical significance when an added contribution from drinking water (i.e., 15μg/L, 20μg/L, or 24.5μg/L) was assumed in addition to the 90th percentile of food intake for pregnant women (0.198μg/kg/day). We determined that a daily intake of 0.45 to 0.50μg/kg/day of perchlorate was necessary to produce results significantly different from those obtained with no perchlorate exposure. Adjusting for the food intake dose, the relative source contribution of perchlorate from drinking water (or other non-dietary sources) was estimated to range from 0.25 to 0.3μg/kg/day. Assuming a drinking water intake rate of 0.033L/kg/day, the drinking water concentration allowance for perchlorate equates to 7.6-9.2μg/L. In summary, we have demonstrated the utility of a probabilistic biologically-based dose-response model for perchlorate risk assessment in a sensitive life stage at the population level; however, there is a need for continued monitoring in regions of the U.S. where perchlorate exposure may be higher.
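The final conversion in the abstract is a simple unit check: dividing the relative source contribution (μg/kg/day) by the drinking-water intake rate (L/kg/day) cancels the body-weight and time units and leaves a concentration in μg/L. A quick arithmetic verification of the reported 7.6-9.2μg/L range:

```python
intake_rate = 0.033  # drinking-water intake, L/kg/day (as assumed in the abstract)

# Relative source contribution bounds from drinking water, ug/kg/day.
low = 0.25 / intake_rate   # ~7.6 ug/L
high = 0.30 / intake_rate  # ~9.1 ug/L (the abstract rounds the range up to 9.2)
```

The (μg/kg/day) / (L/kg/day) division is the same relative-source-contribution arithmetic used in setting many drinking-water health advisories.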