Abstract
Water safety plans (WSPs) are intended to assure safe drinking water (DW). WSPs involve assessing and managing risks associated with microbial, chemical, physical and radiological hazards from the catchment to the consumer. Currently, chemical hazards in WSPs are assessed by targeted chemical analysis, but this approach fails to account for the mixture effects of the many chemicals potentially present in water supplies and omits the possible effects of non-targeted chemicals. Consequently, effect-based monitoring (EBM) using in vitro bioassays and well plate-based in vivo assays is proposed as a complementary tool to targeted chemical analysis to support risk analysis, risk management and water quality verification within the WSP framework. EBM is frequently applied to DW and surface water and can be utilised in all defined monitoring categories within the WSP framework (including ‘system assessment’, ‘validation’, ‘operational’ and ‘verification’). Examples of how EBM can be applied within the different WSP modules are provided, along with guidance on where to apply EBM and how frequently. Since this is a new area, guidance documents, standard operating procedures (SOPs) and decision-making frameworks are required for both bioassay operators and WSP teams to facilitate the integration of EBM into WSPs, with these resources currently under development.
HIGHLIGHTS
Effect-based monitoring (EBM) captures the mixture effects of the many chemicals present in water.
EBM can be integrated into water safety plans (WSPs) to assess risks associated with chemical hazards.
EBM can be applied in all monitoring categories in the WSP framework.
While EBM has not been integrated into WSPs yet, uptake can be supported through the development of guidance documents, frameworks and standard operating procedures.
INTRODUCTION
Water safety plans (WSPs) focus on assessing the risks stemming from microbial, chemical, physical and radiological hazards from the catchment, through water treatment processes and distribution, to the customer. The risk can be characterised based on the likelihood of exposure to the hazard and the severity of the consequences if exposure occurs. For some hazards, such as specific metals and radioisotopes, the assessment of risk is relatively straightforward since testing can target specific atoms, ions and molecules. However, chemical hazards in drinking water (DW) can include diverse and complex organic micropollutants, such as pesticides, pharmaceuticals and industrial chemicals, which enter catchments from municipal and industrial wastewater discharges and run-off from agricultural and urban areas (Fawell & Ong 2012). Additionally, DW treatment processes, such as disinfection and advanced oxidation, may result in the formation of disinfection by-products (DBPs) and micropollutant transformation products (TPs) (Postigo & Richardson 2014; Richardson & Postigo 2015).
Hence, the traditional approach of monitoring targeted chemical hazards can only provide information about a small portion of the chemical burden in water, as any chemicals omitted from the target method are not detected (Escher et al. 2020). Non-target screening (NTS) using high-resolution mass spectrometry regularly detects thousands of peaks (i.e., detected signals) in water samples (Hollender et al. 2017), and many thousands of new synthetic chemicals are developed annually. This means that simply adding more and more chemicals to target lists is a futile exercise, as there will always be chemicals that are not considered or not even detectable with NTS (e.g., due to unfavourable properties such as poor ionisation in the mass spectrometer) (Escher et al. 2020). In addition, chemical analyses cannot account for the mixture effects of micropollutants, DBPs and TPs potentially present in water. Moreover, guidelines and standards for chemicals in DW are presently limited to approximately 100–200 chemicals, depending on the jurisdiction.
Instead, effect-based monitoring (EBM) using in vitro bioassays and well plate-based in vivo assays can be used to assess the risk of chemical hazards in DW supply systems (Neale & Escher 2019). In vitro bioassays and well plate-based in vivo assays are indicative of specific endpoints, i.e., measurable biological events that indicate an effect relevant to human and/or ecological health. In vitro bioassays use cell lines and are typically run in a 96-well or 384-well plate format, making them high-throughput. In vitro bioassays indicative of different stages of cellular toxicity pathways, including the induction of xenobiotic metabolism, receptor-mediated effects, adaptive stress responses and reactive toxicity, as well as cytotoxicity, have been applied to both surface water and treated DW (e.g., Muller et al. 2018; Rosenmai et al. 2018; De Baat et al. 2019; Neale et al. 2020a), with the first application of in vitro bioassays to DW dating back several decades (Loper et al. 1978). In vitro bioassays are also applied in sediment quality monitoring and biomonitoring (Simon et al. 2013; Vethaak et al. 2017; Baumer et al. 2021). While an effect at the cellular level may not always result in an adverse effect at higher levels of biological organisation, in vitro bioassays can be viewed as proxies for chronic effects, and cytotoxicity is an adequate proxy for acute toxicity (Escher et al. 2021). Furthermore, many in vitro reporter gene assays are highly sensitive and can detect the effects of chemicals present at concentrations below their analytical limit of detection (e.g., the estrogenic chemicals 17β-estradiol and 17α-ethinylestradiol) (Konemann et al. 2018).
Complementary to in vitro bioassays, well plate-based in vivo assays using whole organisms can be used to detect acute effects in certain aquatic organisms and are generally indicative of apical effects, such as growth, immobilisation and mortality, covering effects related to multiple toxicity pathways (Wernersson et al. 2015). Well plate-based in vivo assays are also used for surface water quality assessment (e.g., Tousova et al. 2017; Hamers et al. 2018; De Baat et al. 2020), but most will not be sensitive enough for DW, with the exception of bacterial toxicity assays, such as the established Microtox and BLT-Screen assays that are in routine use by some water supply agencies (Neale et al. 2012; van de Merwe & Leusch 2015). In vivo assays target adverse effects on aquatic life rather than human health effects, but some toxicity pathways are highly conserved across the animal kingdom (Colbourne et al. 2015). Hence, some well plate-based in vivo assays are also suitable for human health hazard identification.
Applied as a complementary tool to targeted chemical analysis, EBM can provide valuable input for risk analysis and risk management in the WSP framework. EBM cannot identify individual chemicals in a water sample, but it can capture the mixture effects of groups of chemicals that elicit the same toxicological mode of action. Therefore, it cannot replace instrumental analysis, but it complements it with a wealth of relevant information on water quality that would otherwise be overlooked. This includes detecting the effects of non-targeted chemicals, including contaminants of emerging concern. EBM also provides information that is scaled to risk, with more potent chemicals having a greater effect than less potent chemicals. To determine whether the chemical water quality is acceptable or unacceptable, the observed effect can be compared with effect-based trigger values (EBTs) (Escher et al. 2021). EBTs have been developed for a range of biological endpoints, including estrogenic activity, glucocorticoid activity and the oxidative stress response, for DW (Brand et al. 2013; Been et al. 2021), recycled water for indirect potable reuse (Escher et al. 2015) and surface water (van der Oost et al. 2017; Escher et al. 2018). Furthermore, in vitro bioassays indicative of activation of the estrogen receptor (ER) and activation of the aryl hydrocarbon receptor (AhR) have already been implemented to monitor estrogenic activity and dioxin-like activity, respectively, in recycled water used for groundwater recharge and reservoir water augmentation in California (State Water Resources Control Board 2019).
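As a minimal illustration of the comparison against EBTs described above, a bioassay result expressed as a bioanalytical equivalent concentration (BEQ) can be checked against the corresponding EBT for each endpoint. The sketch below assumes results are expressed as BEQs in ng/L of each assay's reference compound; all numerical values and endpoint names are placeholders for illustration only and are not guideline values, which should be taken from published compilations (e.g., Neale et al. 2020c).

```python
# Illustrative sketch only: compares hypothetical bioassay results, expressed as
# bioanalytical equivalent concentrations (BEQs), against effect-based trigger
# values (EBTs). All numbers are placeholders, not actual guideline values.

# Hypothetical EBTs for drinking water (ng/L of each assay's reference compound)
ebt_ng_per_L = {
    "estrogenic_activity": 0.2,       # placeholder, 17beta-estradiol equivalents
    "oxidative_stress_response": 150,  # placeholder, reference-compound equivalents
}

# Hypothetical measured BEQs for a treated drinking water sample
measured_beq_ng_per_L = {
    "estrogenic_activity": 0.05,
    "oxidative_stress_response": 210,
}

def assess_sample(measured, ebts):
    """Flag each endpoint as acceptable or requiring follow-up investigation."""
    for endpoint, beq in measured.items():
        ebt = ebts[endpoint]
        status = "acceptable" if beq <= ebt else "exceeds EBT, investigate"
        print(f"{endpoint}: BEQ = {beq} ng/L, EBT = {ebt} ng/L -> {status}")

assess_sample(measured_beq_ng_per_L, ebt_ng_per_L)
```

In practice, the choice of assays, the units in which effects are expressed and the applicable EBTs are assay-specific and would be defined in the WSP's monitoring plan.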
This article demonstrates how EBM can be integrated into WSPs, with a focus on how EBM can be applied within the different monitoring categories and modules in the WSP framework. Furthermore, this article considers what is required from both a bioassay operator and a WSP team perspective to facilitate the uptake of EBM in WSPs.
Monitoring within the WSP
System assessment monitoring
System assessment monitoring provides baseline and ongoing background information used to characterise water resources, understand their quality, inform risk assessments and define treatment requirements. EBM can be included in the suite of monitoring activities used to assess the quality of source water and inform an assessment of the system, with EBM providing information about the mixture effects of all active chemicals in a water sample. EBM can be applied to describe the current quality of the water supply system from the source to the customer (Module 2).
EBM can be applied to identify chemical hazards (Module 3). For example, highly specific bioassays indicative of receptor-mediated effects, such as estrogenic activity or photosystem II inhibition, could be used to identify chemical hazards associated with wastewater and with agricultural/domestic herbicide use, respectively. As an example, Hebert et al. (2018) used an oxidative stress response assay to differentiate the contributions of micropollutants and formed DBPs to the effect in treated DW after chlorination, with DBPs contributing up to 58% of the oxidative stress response. EBM could also be conducted after potentially hazardous extreme events (e.g., heavy rainfall, chemical spills and bushfires) to determine how such events have altered the chemical hazards in the system (Crawford et al. 2022). Climate change is expected to increase the frequency and severity of such events, and WSPs need to be updated accordingly (WHO 2017a). EBM can also be applied to assess changes in chemical hazards associated with climate-related events, such as drought, increased rainfall and higher temperatures (WHO 2017a).
New activities in a catchment or altered DW treatment processes require revision of the WSP (Module 10), and EBM can be applied to characterise any resulting changes in water quality. A prominent example is the implementation of ozonation in a Swiss wastewater treatment plant (WWTP), where baseline effects were measured using in vitro bioassays prior to the implementation of ozonation (Escher et al. 2008). Ozonation following secondary biological treatment was effective at reducing estrogenic activity and phytotoxicity, supporting the implementation of ozonation as a tertiary treatment step, particularly when discharging effluents into receiving waters under low-flow conditions (Escher et al. 2009). In another example, Cavallin et al. (2021) applied assays indicative of activation of the peroxisome proliferator-activated receptor-gamma (PPARγ), ER and glucocorticoid receptor (GR) to evaluate surface water quality in the Colorado River before and after the replacement of an ageing WWTP. The new WWTP resulted in a substantial improvement in water quality, as evidenced by a significant decrease in bioassay responses in the river water samples.
In a system assessment monitoring context, EBM can theoretically be applied anywhere in the system from the catchment to the customer. For practical application, the measured effects should be compared with EBTs for the respective bioassays. The monitoring frequency will depend on the module and the characteristics of the catchment. For example, EBM can be used to describe the water supply system every 3–5 years for large catchments, though smaller catchments or catchments with more variable water quality may require more frequent monitoring. As discussed above, monitoring is also required after a hazardous event in the catchment or after the implementation of new treatment processes.
Validation monitoring
Validation monitoring seeks to provide evidence of control measure effectiveness and includes research to understand and validate processes and identify critical control points. Consequently, validation can be conducted in laboratories to verify whether a process works; such processes can then be implemented and are considered ‘validated’. Testing is typically carried out on processes that can be replicated and then installed and implemented many times without having to be retested subsequently. However, validation is also conducted for specific processes, particularly in larger water supplies for which the benefit:cost ratio of such tailored studies is favourable. EBM can be used to assess and validate existing control measures to demonstrate their ability to reduce or remove effects (Module 4). For example, water can be collected before and after preventive measures, such as different treatment processes, to evaluate their efficacy (a worked example is sketched below). Information about effect removal during DW treatment is available for some endpoints, such as estrogenic activity (e.g., Xiao et al. 2016; Conley et al. 2017). EBM can also be used to compare different treatment options. For example, Farre et al. (2013) compared DBP formation potential and effects in a battery of assays, including bacterial toxicity, genotoxicity and the oxidative stress response, in coagulation-treated source waters after chlorination and chloramination. Both the observed effects and DBP concentrations increased after disinfection, with chlorinated water typically having a greater effect and higher DBP formation potential than chloraminated water. This suggests that chloramination led to the formation of lower concentrations of DBPs or less potent DBPs in the studied source waters.
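As a minimal sketch of how such before-and-after comparisons could be quantified, the percent reduction in effect across a treatment barrier can be calculated from bioanalytical equivalent concentrations (BEQs) measured upstream and downstream of the barrier. The values and the choice of ozonation as the barrier below are hypothetical and purely illustrative.

```python
# Minimal sketch with hypothetical values: estimating how effectively a treatment
# barrier removes a bioassay-detected effect, expressed as bioanalytical
# equivalent concentrations (BEQs) before and after the barrier.

def effect_removal_percent(beq_in, beq_out):
    """Percent reduction in effect across a treatment step.
    Negative values indicate that the effect increased across the step,
    e.g., due to the formation of DBPs or transformation products."""
    return 100.0 * (beq_in - beq_out) / beq_in

# Hypothetical estrogenic activity (17beta-estradiol equivalents, ng/L)
beq_before_ozonation = 1.8
beq_after_ozonation = 0.1

removal = effect_removal_percent(beq_before_ozonation, beq_after_ozonation)
print(f"Effect removal across ozonation: {removal:.0f}%")
```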
If EBM helps identify that existing controls are not fully effective in reducing chemical hazards, it is necessary to develop, implement and maintain an improvement/upgrade plan (Module 5). The efficacy of a treatment train is conventionally estimated based on the removal of a specific suite of chemicals (e.g., active pharmaceutical ingredients), but this approach fails to account for the effects of TPs and other unknown chemicals. EBM can be used to confirm that the new or improved controls are effective and to ensure that new risks, such as from the formation of TPs, are not introduced. For example, using assays indicative of activation of AhR, the oxidative stress response and cytotoxicity, Lundqvist et al. (2019) found that conventional DW treatment with coagulation, sand filtration and disinfection did not reduce AhR activity or the oxidative stress response. The same source water was treated in a novel pilot plant with suspended ion exchange, ozonation, microfiltration and granular activated carbon, after which both the AhR activity and the oxidative stress response were reduced. In another example, Heringa et al. (2011) assessed changes in the genotoxicity of pre-treated surface water from three locations after advanced DW treatment with UV/H2O2 and granular activated carbon. Treatment with UV/H2O2 increased the genotoxicity in all water types, with granular activated carbon subsequently reducing the genotoxicity to baseline levels.
For both Modules 4 and 5, the effect in the treated water could also be compared to an EBT. For validation monitoring, monitoring before and after the control measure is required. Ideally, this is done with a bioassay test battery covering different stages of cellular toxicity pathways to gain a more thorough understanding of the chemical burden and its potential risks. A suggested practical test battery for monitoring is discussed later in the article.
Operational monitoring
Under the WSP paradigm, the intent of ‘operational monitoring’ is to detect potentially unsafe water and trigger a response that prevents it from reaching consumers. Changes in water quality and associated operational corrections need to be implemented rapidly – within a toxicologically relevant and protective timeframe. For controlling microbial risk, operational monitoring is used to monitor the performance of treatment processes on an ongoing basis, often continually, and provide timely evidence that they are functioning properly. It is typically conducted by measuring simple parameters, such as pH, turbidity and disinfectant residual, rather than the hazard itself, provided that the relationship between the measured parameter and the hazard has been established. Such monitoring would typically be coupled to a target or limit that, in turn, would trigger a correction if exceeded. For example, if the target or limit is exceeded, the affected water supply might be shut off and an alternative water source used. A response action would follow to understand the occurrence and ensure that the water is safe to use before restoring normal operations.
EBM has the potential to be applied to ensure that preventive measures are operating correctly (Module 6). Operational monitoring of preventive measures needs to be undertaken in a timely manner to provide an early warning (O'Connor et al. 2006). The time required to process and run samples in cell-based in vitro bioassays (at least 3–4 days) means that current EBM may not be suitable for operational monitoring of rapid changes in conditions and their consequences, but it may be more suitable for operational monitoring of long-term trends in treatment performance and surveillance. To meet the requirements for operational monitoring, further research is required to develop online EBM options, with work ongoing to develop online bacterial toxicity monitoring (e.g., Bodini et al. 2018; Hassan et al. 2019). In the meantime, rapid tools, such as the bacterial Microtox and BLT-Screen assays, could be applied for operational monitoring. These assays only provide information about non-specific effects, but they respond to all chemicals present in the water, weighted by their potency. If non-specific toxicity exceeds a certain threshold, then a more detailed investigation with EBM would be warranted. An alternative approach could be to establish a relationship between easily measured simple parameters, such as turbidity, the simple bacterial bioassays or adsorbable organic halogens (AOX), and relevant endpoints detected using EBM (illustrated below). Once such a relationship is established, only the simple parameter would be used for operational monitoring. Chemical sum parameters, such as AOX or adsorbable organic fluorine, might also be useful for targeting specific chemical classes of concern, such as DBPs or per- and polyfluoroalkyl substances (PFAS), which act via many different modes of action and do not dominate the mixture effects in any single bioassay.
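As a simple illustration of how such a surrogate relationship could be established, the sketch below fits a linear calibration between an easily measured parameter (here, hypothetically, AOX) and a bioassay endpoint (here, hypothetically, the oxidative stress response). All paired data, the trigger value and the linear form of the relationship are assumptions for illustration only; a real calibration would require site-specific measurements and a proper assessment of fit and uncertainty.

```python
# Sketch of a surrogate relationship between an operational parameter (AOX)
# and a bioassay endpoint (oxidative stress response). All data are invented
# for illustration; this is not a published calibration.

import numpy as np

# Hypothetical paired measurements from the same samples
aox_ug_per_L = np.array([20, 35, 50, 80, 110, 150])           # surrogate parameter
oxidative_stress_beq = np.array([40, 70, 95, 160, 210, 300])  # bioassay response (BEQ)

# Simple linear calibration: BEQ ~ slope * AOX + intercept
slope, intercept = np.polyfit(aox_ug_per_L, oxidative_stress_beq, 1)

def predicted_beq(aox):
    """Predict the bioassay response from the routinely measured parameter."""
    return slope * aox + intercept

# Once validated, only AOX would be measured routinely; an alarm could be raised
# when the predicted BEQ approaches the endpoint's effect-based trigger value.
ebt_beq = 250   # placeholder trigger value
aox_reading = 95
print(f"Predicted BEQ: {predicted_beq(aox_reading):.0f}, EBT: {ebt_beq}, "
      f"alarm: {predicted_beq(aox_reading) > ebt_beq}")
```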
Nonetheless, a small number of major water utilities have routinely implemented EBM and bioindicators linked to automatic alarms. For example, the Hong Kong Water Supplies Department uses a biosensing alert system with zebrafish to monitor water quality (Water Supplies Department 2017). An alert is issued after any zebrafish mortality or abnormal activity, with the water samples further tested using a rapid bacterial bioluminescence assay. Due to the sheer volume of DW supplied by most public water supplies, the reasonably foreseeable concentrations of most chemical contaminants are not acutely toxic (Bartram et al. 2009; Leusch et al. 2020). This potentially allows a timely protective corrective response to an EBT exceedance within days to weeks, rather than the minutes to hours required when protecting against microbial contamination.
Verification monitoring
Verification monitoring, also called surveillance or compliance monitoring, is used to verify routine operations and confirm that the system has been producing good-quality water. Verification monitoring of different parameters is typically conducted on a routine basis, such as weekly, monthly or annually, depending on the measured parameter and the level of confidence required by the water utility. For individual chemicals, verification monitoring is recommended on a quarterly or biannual basis, as chemicals are not normally expected to be present at acutely hazardous concentrations (Bartram et al. 2009), so a similar monitoring frequency would also be suitable for EBM. An advantage of EBM over chemical analysis is that it can detect the effect of both known and unknown active chemicals. EBM can be used for verification monitoring of control measures and to confirm the quality of the treated water (Module 7), with the observed effect in the product water compared with an EBT. Treatment plant operators may decide to use EBM for verification monitoring of treated water as a first application. In the California State Water Resources Control Board example, EBM of treated water is recommended prior to release on a quarterly basis during the initial assessment and baseline monitoring phases, with semi-annual or annual monitoring recommended during standard operation for systems that are in a relatively steady state (State Water Resources Control Board 2019).
Similar to validation monitoring, samples collected before and after the preventive measures can be tested in a bioassay test battery to verify treatment performance. This provides ongoing verification of the ability of the preventive measures to mitigate any adverse effects, while building a body of evidence that can feed back into and bolster validation. As an example, a test battery of in vitro assays indicative of receptor-mediated effects, adaptive stress responses and mutagenicity was applied to assess micropollutant removal and DBP formation in three DW treatment plants (DWTPs) over four seasons (Neale et al. 2020a). While estrogenic activity was detected in the source water, all three DWTPs consistently removed estrogenic activity to levels below the DW EBT.
What is required to support the uptake of EBM into WSPs?
Standard operating procedures and analysis workflows
The availability of tools and resources was identified as a key factor enabling the uptake of WSPs (Baum & Bartram 2018), and the same is required for the implementation of EBM in WSPs. To facilitate EBM application, standard operating procedures (SOPs) and guidance documents for sample collection and processing, bioanalysis and data analysis are required to provide consistency and ensure that appropriate quality assurance/quality control steps are taken. This fits within Module 8 (prepare management procedures) of the WSP framework. Some SOPs are available from assay suppliers or in the peer-reviewed literature. For example, SOPs are available for 20 in vitro bioassays and well plate-based in vivo assays in Neale et al. (2017). In addition, Organisation for Economic Co-operation and Development (OECD) and International Organization for Standardization (ISO) guidelines are available for some in vitro bioassays and well plate-based in vivo assays. Some of the OECD guidelines were developed for toxicity testing of individual chemicals but can be adapted to water samples.
A practical test battery
As the previous examples highlight, numerous bioassays have been applied for water quality monitoring in different water types, raising the question of which bioassays should be used for DW. Ideally, a practical test battery consisting of at least three or four bioassays representative of the effects detected in source water and DW samples should be deployed. Assays indicative of activation of AhR, activation of ER and the oxidative stress response are recommended, as these endpoints are commonly detected in water (e.g., Escher et al. 2014; Rosenmai et al. 2018) and represent different stages of cellular toxicity pathways (i.e., xenobiotic metabolism, receptor-mediated effects and adaptive stress responses). EBTs are available for many assays indicative of these endpoints, with currently available EBTs summarised in Neale et al. (2020c). Furthermore, an assay indicative of genotoxicity or mutagenicity is recommended to detect toxic DBPs formed after disinfection. Additional information about test battery selection is available in Neale et al. (2020b).
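As an illustration only, the recommended battery could be documented in a monitoring plan as a simple structured record, with assay-specific EBTs filled in from published compilations (e.g., Neale et al. 2020c). The example assay types named in the sketch below are indicative choices for each endpoint, not prescriptions.

```python
# Illustrative sketch of how a practical test battery could be recorded in a
# monitoring plan. The 'ebt' field is deliberately left empty as a placeholder
# to be filled with the assay-specific value from published compilations.

from dataclasses import dataclass
from typing import Optional

@dataclass
class BatteryAssay:
    endpoint: str                   # biological endpoint
    pathway_stage: str              # stage of the cellular toxicity pathway covered
    example_assay: str              # example assay type (operator's choice)
    ebt: Optional[float] = None     # effect-based trigger value, assay-specific

test_battery = [
    BatteryAssay("Activation of AhR", "Xenobiotic metabolism", "AhR reporter gene assay"),
    BatteryAssay("Activation of ER", "Receptor-mediated effects", "ER reporter gene assay"),
    BatteryAssay("Oxidative stress response", "Adaptive stress responses", "Nrf2/ARE reporter gene assay"),
    BatteryAssay("Genotoxicity/mutagenicity", "Reactive toxicity", "Ames or micronucleus assay"),
]

for assay in test_battery:
    print(f"{assay.endpoint} ({assay.pathway_stage}) - e.g., {assay.example_assay}")
```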
Guidance to operators and managers on how to deploy and interpret bioassay results
While bioassay operators require technical guidance, the WSP team, which includes managers and treatment plant operators, requires resources that focus on how EBM can be implemented in WSPs. This includes guidance on where in the catchment-to-customer process to apply EBM and at what test frequencies, as addressed in Figure 2. Managers also require guidance on the steps to follow should the effect of a sample exceed its EBT. Such frameworks have previously been proposed in the literature (Leusch & Snyder 2015), with work now underway to refine currently available frameworks. Support programmes including training to develop people's skills and knowledge about EBM fit within Module 9 (develop supporting programmes), as does research and development to improve the understanding of water quality. Finally, case studies to demonstrate how EBM has been applied for monitoring (system assessment, validation and verification) should be developed. These can act as useful examples for water utilities and can be uploaded on the Water Safety Portal (https://wsportal.org/).
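By way of a purely hypothetical illustration of the kind of tiered sequence such a decision framework might encode (this is not a reproduction of Leusch & Snyder (2015) or of any other published framework), the response to an EBT exceedance could be sketched as a small set of ordered steps.

```python
# Hypothetical illustration only: one possible tiered sequence of actions a WSP
# team might follow when a bioassay result exceeds its effect-based trigger
# value (EBT). The specific steps and their order should come from the guidance
# documents under development and the utility's own WSP.

def respond_to_exceedance(beq, ebt, confirmed_on_resample, source_identified):
    """Return an ordered list of suggested actions for an EBT exceedance."""
    if beq <= ebt:
        return ["No action: result below EBT; continue routine monitoring."]
    actions = ["Check bioassay QA/QC and re-test the retained sample extract."]
    if not confirmed_on_resample:
        actions.append("Collect and test a fresh sample to confirm the exceedance.")
        return actions
    actions.append("Notify the WSP team and intensify monitoring up- and downstream of the barrier.")
    actions.append("Apply targeted and non-target chemical analysis to identify likely drivers.")
    if source_identified:
        actions.append("Implement corrective action at the identified source or treatment barrier.")
    else:
        actions.append("Review catchment activities and treatment performance for unidentified hazards.")
    actions.append("Verify with follow-up EBM that effects have returned to below the EBT.")
    return actions

for step in respond_to_exceedance(beq=300, ebt=250, confirmed_on_resample=True, source_identified=False):
    print("-", step)
```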
Community expectations and public confidence
The general public is becoming increasingly concerned about chemical (and other forms of) contamination of public DW supplies, and some consumers are turning to bottled water or point-of-use treatment. For example, surveys have found that approximately 30–40% of the population in the USA (Rosinger et al. 2022) and in major European centres (Tosun et al. 2020) prefer bottled water despite the tap water being safe. Undertaking EBM has the potential to help reassure consumers of the safety of their tap water, which, in turn, has environmental, economic and health benefits. As noted by the WHO (2022), consumers who lose confidence in DW may consume less water or turn to alternatives that may not be safe. Drinking bottled water involves the wasteful use of packaging and, along with point-of-use treatment, involves unnecessary expenditure. Alternatives to tap water may also include beverages with excessive calories, including sugar-sweetened beverages, which may be detrimental to dental health.
CONCLUSIONS
The ability of EBM to account for the effects of complex mixtures of chemicals in water, and to detect a wide range of chemicals, even those present below analytical detection limits, offers fundamental advantages for risk assessment and risk management, particularly when applied as a complement to targeted chemical analysis. However, EBM has not been integrated into WSPs to date.
EBM can be applied for system assessment, validation, operational and verification monitoring, with examples from the literature provided for the relevant WSP modules. Of these, the greatest challenge relates to applying in vitro bioassays for operational monitoring, owing to the need for monitoring and response within a preventive timeframe.
Guidance documents, frameworks and SOPs are required for both bioassay operators and the WSP team to support the uptake of EBM into WSPs, with work currently underway to develop these resources. Notwithstanding ongoing developments, EBM has strong applicability in multiple monitoring levels of the WSP framework and can aid in the future safeguarding of water resources.
ACKNOWLEDGEMENTS
This study was supported by the Global Water Research Coalition project 2057/19, which was funded by Public Utilities Board (PUB), the Foundation for Applied Water Research (STOWA), Water Research Australia, the Water Research Commission and the Water Services Association of Australia. In-kind support was kindly provided by Veolia Environnement – Research and Innovation (VERI), SUEZ, and KWR.
DATA AVAILABILITY STATEMENT
All relevant data are included in the paper.
CONFLICT OF INTEREST
The authors declare there is no conflict.