The water quality risk assessment is the foundation of every drinking Water Safety Plan. The need to conform to a ‘Corporate’ risk assessment model, commonly dominated by asset management frequency data, can cause misjudgement of microbial risk. Well-performed risk assessments should identify potential risks before they are realised. Risk assessments which place too much emphasis on historical evidence to demonstrate the ‘likelihood’ of microbial contamination are unsuitable for drinking water quality because they fail to recognise ‘latent’ risks associated with absent or underperforming barriers to contamination. Most outbreaks occur when these ‘latent’ factors align to create a failure. Good risk assessments need to provide foresight. This is achieved if drinking water quality risk assessments are based on a ‘barrier’ approach. Where adequate and reliable multiple barriers to contamination are present, the likelihood of a hazardous event should be categorised as rare. Where barriers are absent, inadequate or unreliable, a higher likelihood is appropriate, depending on the nature and extent of the barrier shortfall. Practical examples show how the ‘barrier’ approach can be applied. The barrier risk assessment directly informs the operational monitoring programme, enabling regular confirmation that the challenge and barrier performance are consistent with the predictions of the risk assessment.

  • Explains why commonly used risk assessment methods can lead to water quality risks being misjudged.

  • Provides a ‘barrier’ risk assessment method to account for ‘latent’ risks.

  • Demonstrates that the greater the gap between barriers required and those existing, the higher the likelihood of contamination.

  • Explains how operational monitoring should be used to continuously validate the risk assessment.

In Australia, the Australian Drinking Water Guidelines (ADWG) is the authoritative reference for drinking water quality (NHMRC 2022). In 2004, the ADWG incorporated the Framework for Management of Drinking Water Quality (the Framework). The Framework advocates a proactive, risk-based approach to drinking water quality management and is a total water quality management blueprint comprising 12 elements.

Water safety plans (WSPs) are sometimes used in Australia as a vehicle to meet and implement the risk management aspects of good practice drinking water quality management as set out in the Framework, in particular Element 2, Assessment of the drinking water supply system; Element 3, Preventive measures for drinking water quality management; and Element 4, Operational procedures and process control. The centrepiece of any WSP is the ‘risk assessment’, that is, the identification of ‘hazards and hazardous events’, followed by assessment of the consequences if such an event occurred and the likelihood of such an event happening. The challenge for WSPs to be effective is not only to accurately identify those hazards or hazardous events that are most important to ensure safe drinking water, but also to correctly assign their consequence and likelihood.

Guidance provided by the ADWG has been combined with decades of experience in providing drinking water to a large number of diverse systems, ranging from the city of Perth with a population of ∼2 million to 220 regional and remote schemes, including ∼100 with populations of less than 1,000. These systems have been evaluated for achieving the objective of consistently and reliably ensuring the safety of drinking water. This evaluation has also considered relevant international published accounts of failure to ensure safe drinking water. Together, this has been used to develop and explain a rational and practical approach for reliably performing the essential risk assessment stage of a WSP, together with the companion operational monitoring needed to ensure that the WSP is achieving the objective of ensuring safe drinking water.

Understanding the problem

A major contribution to the problem of misjudging water quality risk is assigning the incorrect likelihood of microbial contamination.

Chapter 3.2 (Element 2) of the ADWG outlines a qualitative risk assessment methodology for assessing the risk from potential hazardous events. The risk assessment requires assigning a consequence ranging from ‘insignificant’ through ‘minor’, ‘moderate’, ‘major’ and up to ‘catastrophic’ and likelihood is assigned a rating from ‘rare’ through ‘unlikely’, ‘possible’, ‘likely’, and ‘almost certain’.

Consequence and likelihood can be combined to determine the residual risk from the hazardous event as illustrated in Table 1.

Table 1

Qualitative risk analysis matrix: level of risk (reproduced from the ADWG (NHMRC 2022))

Likelihood       Insignificant   Minor      Moderate    Major       Catastrophic
Almost certain   Moderate        High       Very high   Very high   Very high
Likely           Moderate        High       High        Very high   Very high
Possible         Low             Moderate   High        Very high   Very high
Unlikely         Low             Low        Moderate    High        Very high
Rare             Low             Low        Moderate    High        High

(Columns give the consequence rating.)
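The qualitative matrix lends itself to direct encoding. The following Python sketch is illustrative only: the function and variable names are our own, and the cell values are transcribed from Table 1.

```python
# Qualitative risk matrix (Table 1, ADWG): likelihood and consequence
# ratings combine to give a level of risk.
LIKELIHOODS = ["Rare", "Unlikely", "Possible", "Likely", "Almost certain"]
CONSEQUENCES = ["Insignificant", "Minor", "Moderate", "Major", "Catastrophic"]

# Rows follow LIKELIHOODS (best to worst), columns follow CONSEQUENCES.
RISK_MATRIX = [
    ["Low", "Low", "Moderate", "High", "High"],                   # Rare
    ["Low", "Low", "Moderate", "High", "Very high"],              # Unlikely
    ["Low", "Moderate", "High", "Very high", "Very high"],        # Possible
    ["Moderate", "High", "High", "Very high", "Very high"],       # Likely
    ["Moderate", "High", "Very high", "Very high", "Very high"],  # Almost certain
]

def risk_level(likelihood: str, consequence: str) -> str:
    """Look up the level of risk for a hazardous event."""
    return RISK_MATRIX[LIKELIHOODS.index(likelihood)][CONSEQUENCES.index(consequence)]

print(risk_level("Possible", "Major"))  # -> Very high
```

For example, a ‘possible’ event with ‘major’ consequences reads off as ‘Very high’ risk, matching the matrix.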

This is a ‘standard’ approach and is not the issue. Difficulties can arise when determining how to apply the advice in the ADWG in a water utility environment. The issues centre on consideration of:

  • ‘Latent’ risks and how they should factor into the assignment of likelihood.

  • Ensuring the risk assessment effectively operationalises the ADWG requirement for multiple barriers to contamination.

Risk assessment in a water utility

Prior to retirement in 2016, I worked in a water utility for over 40 years, the last 20 years as manager of drinking water quality for the Water Corporation of Western Australia, which supplies water to the city of Perth with a population of over 2 million and to 220 regional and remote towns spread across one-third of the Australian continent.

The Water Corporation is very committed to implementing the ADWG. The ADWG provides sound advice on water quality risk assessments. However, water quality risk assessments conducted in a water utility environment cannot usually be done in isolation. They have to be conducted in accordance with the ‘Corporate’ model so all risks being managed across the utility can be consistently assessed and reported to the Executive and Board. This is a precursor to securing approved funding for improvements. This is where consideration of ‘latent’ risk can become an issue in a utility.

Water supply is a capital-intensive industry. Large and expensive assets are required to treat, store and transport water. Unsurprisingly, there are well-developed asset management systems in most utilities to aid decisions about asset adequacy, replacement, and augmentation. In this environment the likelihood of an event usually aligns well with historical performance. For example, the history of burst mains can justify replacing an old water main. The history of low-pressure incidents or tanks running critically low can justify pipe and pump upgrades. Alternatively, such problems can be easily anticipated by looking at historical asset performance trends. In this ‘asset rich’ environment, for an event to be ‘almost certain’ of occurring it is expected to happen at least several times per year. This common practice definition (in a water utility) of ‘almost certain’ is then applied to other areas of the business, including water quality.

In the absence of an ‘outbreak’, Water Quality Managers can sometimes struggle to make the case for water quality improvements. The argument against water quality investment usually takes the line that ‘it has been like that for years and nothing has gone wrong’ or ‘if you say the likelihood is ‘almost certain’ then show me the bodies or modify the risk assessment’.

‘Latent’ risk

The problem with a risk assessment which places too much emphasis on historical evidence is that it fails to account for the ‘latent’ risks which are the cause of most water quality incidents. A well-known example is the Walkerton tragedy in Ontario, Canada in May 2000 (O'Connor 2002; Hrudey & Walker 2005). In a town of 5,000 people, 2,300 residents and visitors became ill and seven died after heavy rainfall washed microbial pathogens from manure into a shallow bore, one of three groundwater sources which supplied the town. Inadequate chlorination was overwhelmed by the organic load, and the absence of online monitoring and alarms meant the problem went unrecognised for several days.

With the benefit of hindsight, it is easy to conclude that the bore was under the influence of surface water and there were ample pathogens in the proximity of the bore which could contaminate the raw water. There were inadequate barriers to inactivate or remove pathogens and the single barrier (chlorination) was unreliable and its control was inadequate. Experts testified that the Walkerton system was so deficient that it was essentially inevitable that the water would become contaminated by pathogens.

The Walkerton system operated for over 20 years without reported waterborne disease before disaster struck. Apart from advice in 1978 that the shallow bore was under the influence of surface water, there was no water quality risk assessment carried out for the Walkerton scheme before the incident in May 2000. The hypothetical question arises: if a risk assessment had been carried out as late as April 2000 (one month before the incident), what would have been the conclusion? Any valid risk assessment should have identified the system weaknesses and concluded that the system was vulnerable to contamination and urgent intervention was required. Arguably, a precautionary boil water advisory should have been implemented years earlier. But the phrase ‘years earlier’ can be a problem if too much reliance is placed on historical performance to determine likelihood. An inappropriate risk assessment may well not have looked critically for weaknesses, or may even have rationalised them, because the water supply system had been operating for over 20 years without a documented contamination incident. Based on history, the likelihood of pathogen contamination could be deemed ‘unlikely’ or ‘rare’ according to a narrow application of risk assessment guidance. Clearly, subsequent events demonstrated that this is the wrong answer, so what must change? How do we incorporate the latent factors which are obvious with hindsight into foresight when carrying out a water quality risk assessment?

The case for adoption of the barrier approach

Waterborne disease outbreaks are rare in Australia, but that does not mean that all systems have adequate barriers to contamination. Reason (1990, 2000) proposed the ‘Swiss Cheese Model’ to explain the role of latent flaws and human error in causing major failures. Wu et al. (2009) adapted this model for analysing drinking water failures. The concept is that latent flaws may be seen as layers of Swiss cheese that only lead to failure when circumstances allow the holes in the layers to align. The Walkerton tragedy demonstrated that the preconditions for an outbreak can exist for many years before disaster strikes. At Walkerton, it took 22 years before the holes in the ‘Swiss cheese’ lined up, when:

  • The neighbouring farmer applied fresh manure to the field.

  • Heavy and unseasonal rain washed the manure into the shallow aquifer.

  • The duty bore was under the influence of surface water.

  • The chlorinator did not have residual control and was overwhelmed by the increased organic load.

  • The problem was undetected because residual monitoring and alarms were not installed on the chlorinator and the operators failed to manually measure the chlorine residual daily with an accurate method as they were required to do.

Any valid water quality risk assessment method must identify these ‘latent’ risks and give them appropriate weight. The point is that when disaster strikes no one goes back to the risk assessment to say that everything is forgiven because the model predicted that there should be no more than one incident every 20–50 years. No one praises the financial prowess of the utility for deferring obviously needed expenditure on water quality improvements for 20 years. Instead, an Inquiry will find, as occurred at Walkerton, that the surprise is not that the incident happened but that it took so long before it happened. What lessons can we learn from incidents like Walkerton to ensure that water quality risk assessments protect customers from foreseeable harm?

The most obvious learning is that when barriers are absent or seriously compromised, it is inevitable that pathogens, which are prevalent wherever humans or animals are present, will enter the water supply system, with waterborne community disease a likely outcome. In such deficient systems, this could happen today, tomorrow, next week, next year, or maybe not for decades, as at Walkerton. But it will happen – it is a question of ‘when’, not ‘if’, determined by when the prerequisite conditions coincide, and these factors are not within the control of the water provider. The fact that an outbreak may not occur for some time is only a matter of good luck, not good management. It does not validate an incorrect risk assessment.

The barrier approach to likelihood assessment

The Water Corporation followed the events at Walkerton closely. It was of particular interest because Walkerton was a groundwater supply and about half the supplies managed by the Corporation are also supplied from groundwater and such sources are often thought of as ‘intrinsically safe’. It brought into focus the validity of existing water quality risk assessments and the practice of using ‘historical performance’ to assess likelihood.

It was noted that the common denominator of nearly all microbial water quality incidents was that the barriers to contamination were absent or performance was inadequate. It follows that any water quality risk assessment should not be based primarily on historical performance such as whether there has been a previous waterborne disease outbreak; rather the likelihood of the hazardous event should be determined by the presence and performance of barriers to contamination – the ‘barrier’ approach.

The Corporation is committed to implementing the ADWG, and the ‘barrier’ approach was believed to be very much in line with the principles the ADWG outlines. The 2004 edition of the ADWG was current at the time, but the importance of multiple barriers to contamination has remained a consistent theme of the ADWG since 1996. With respect to barriers to contamination, the ADWG (NHMRC 2022) contains relevant guidance, summarised in the following:

  • The greatest risks to consumers of drinking water are pathogenic microorganisms (Chapter 1.1).

  • Water supplies should contain no harmful concentrations of chemicals or pathogenic microorganisms (Chapter 1.3.1).

  • Drinking water systems must have, and continuously maintain, robust multiple barriers appropriate to the level of potential contamination facing the raw water supply (Chapter 1.1).

  • The advantage of multiple barriers is that short-term reductions in performance of one barrier may be compensated by performance of other barriers (Chapter 1.1).

  • A critical control point (CCP) is defined as an activity, procedure or process at which control can be applied and which is essential to prevent a hazard or reduce it to an acceptable level (Chapter 3.2.2).

  • Deviation from critical limits indicates loss of control of the process or activity and should be regarded as representing an unacceptable health risk (Chapter 3.2.2).

The applicable rationale is captured in Figure 1 (Hrudey 2001).
Figure 1

Relationship of risk to barriers and the level of challenge (adapted from Hrudey 2001).

If adequate barriers exist, then the risk is lowered, and so too is the likelihood of the hazardous event (e.g., harmful concentrations of pathogens entering the water supply system).

The ‘barrier’ approach employed for a subjective water quality risk assessment is based on this principle – if multiple and adequate barriers are operating reliably and correctly, then the likelihood of the hazardous event is reduced to ‘rare’.

If there are inadequate barriers, if their operation is inadequate, or if it cannot be reliably demonstrated that their operation is adequate, then the likelihood is not rare; it is ‘unlikely’, ‘possible’, ‘likely’ or ‘almost certain’, depending on the extent of the shortfall in barriers and the extent of hazards present. The greater the gap between what is required and what exists, the higher the likelihood of the hazardous event occurring.

To put the barrier approach into practice, for each hazardous event a water provider needs to:

  • (a) Establish a suite of challenge categories for the hazardous event under consideration.

  • (b) Determine the multiple barriers required to meet each category of challenge.

  • (c) Prepare a matrix which estimates the likelihood of the hazardous event occurring for the range of challenges and barriers determined in (a) and (b).

  • (d) Determine the challenge category applicable to the scheme being assessed, together with the presence, effectiveness and reliability of existing barriers.

  • (e) Plot the scheme being assessed on the matrix to determine the likelihood of the hazardous event occurring.

Knowledge of water quality principles and experience in water quality management are essential requirements to identify the challenge categories and barrier requirements described previously. Since the risk assessment is qualitative, considerable judgement is required to populate the likelihood matrix.

The starting point for preparing the matrix is:

  • Where the required barriers exist and are performing effectively and reliably, the likelihood is ‘rare’.

  • Where barriers are completely absent or not effective and reliable, the likelihood is ‘almost certain’.

  • Other combinations of challenge and barriers will lie between these ‘bookends’.
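The steps and ‘bookend’ rules above can be sketched as a small procedure. The Python sketch below is illustrative only: the barrier names are hypothetical, and the linear scaling between the bookends is one possible judgement, not a prescribed rule; in practice this step requires expert judgement.

```python
# Likelihood scale ordered from best case to worst case.
LIKELIHOOD_SCALE = ["Rare", "Unlikely", "Possible", "Likely", "Almost certain"]

def assess_likelihood(required_barriers: list[str], effective_barriers: list[str]) -> str:
    """Bookend rule: all required barriers present, effective and reliable -> 'Rare';
    all absent or ineffective -> 'Almost certain'; otherwise the likelihood
    scales with the size of the barrier gap."""
    missing = [b for b in required_barriers if b not in effective_barriers]
    if not missing:
        return "Rare"
    if len(missing) == len(required_barriers):
        return "Almost certain"
    # Scale between the bookends in proportion to the barrier shortfall,
    # never dropping back to 'Rare' while any required barrier is missing.
    gap = len(missing) / len(required_barriers)
    return LIKELIHOOD_SCALE[max(1, round(gap * (len(LIKELIHOOD_SCALE) - 1)))]

# Hypothetical scheme: three barriers required, only chlorination operating reliably.
required = ["catchment protection", "filtration", "chlorination"]
existing = ["chlorination"]
print(assess_likelihood(required, existing))  # -> Likely
```

The clamp to at least ‘Unlikely’ reflects the principle that a missing or underperforming barrier rules out a ‘rare’ rating, regardless of historical performance.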

Example 1. An existing and well-established barrier assessment model

Section 3.3 of the ADWG requires ‘evaluating whether the preventive measures, when considered together, are effective in reducing risk to acceptable levels’. Until recently, the acceptable level of residual risk for the microbial quality of drinking water was not explicitly defined. In 2022, the ADWG was updated to include a microbial health-based target (HBT). HBTs provide a quantitative measure of the microbial quality of drinking water, including an assessment of enteric pathogen risks in the source water, which informs appropriate risk management measures (barriers to contamination). The HBT used in the ADWG is 1 × 10⁻⁶ Disability Adjusted Life Years (DALYs) per person per year (pp pa), or one micro-DALY pp pa. The concept of applying DALYs to the management of drinking water quality was first introduced in the third edition of the WHO Guidelines for Drinking Water Quality (WHO 2008). One DALY can be thought of as one year of healthy life lost. The sum of these DALYs across the population is the burden of disease. The derivation of microbial treatment targets for enteric pathogens is explained in Appendix 3 of the ADWG (NHMRC 2022).

In Australia the microbial HBT is used as an operational benchmark rather than a pass/fail guideline value (Walker 2016). The benchmark serves two important purposes:

  • (a) Setting a definitive target for defining microbially safe drinking water.

  • (b) Informing improvement programmes to enhance the safety of drinking water, as per Element 12 of the Framework for Management of Drinking Water Quality.

In preparation for the introduction of a microbial HBT in Australia, the Water Services Association of Australia published the Manual for the application of health-based targets for drinking water (WSAA 2015), usually referred to as the ‘HBT Manual’. Both the ADWG and the HBT Manual provide similar advice on how to conduct a source risk and water treatment performance assessment in order to determine whether the HBT has been met for the water supply system under consideration. This assessment process is now well established in the Australian water industry. The methodology is, in effect, a working model of the barrier approach which is the subject of this paper. The process is summarised as follows.

Source challenge assessment

Surface water sources are allocated to one of four categories based on the degree of source protection as outlined in Table 2.

Table 2

Source categories

Category 1 sources have negligible sources of contamination from humans and stock animals. This means treatment is not required to deal with human infectious viruses or protozoa. Contamination by bacteria from native animals and birds is unavoidable. However, natural inactivation, dilution and settling in the large storages associated with a Category 1 source means E. coli levels in the raw water are less than 20 per 100 mL.

Category 2 sources have minimal sources of contamination. There may be some low-density housing or low intensity stock grazing in the outer catchment, but the inner catchment is well protected from contaminating activities. There is therefore a risk of low levels of human infectious viruses, bacteria and protozoa in the raw water. Catchment activities and/or the absence of a large reservoir mean Category 2 sources typically experience E. coli in the raw water in the range 20–2,000 per 100 mL.

Category 3 sources typically have moderate sources of faecal contamination. For example, there may be rural or urban subdivisions, extensive stock grazing on cleared pastures and catchment recreation. Accordingly, there is much higher risk of contamination by virus and protozoa than Category 2. However, all these activities are confined to the outer catchment and there are effective measures in place to protect the inner catchment and water body from contaminating activities. E. coli levels in the raw water typically fall in the range 20–2,000 per 100 mL.

Category 4 sources are typically unprotected with high contamination risk from humans (urban developments), stock (intensive grazing) and industry (piggeries, dairies). The inner catchment is not protected, and recreation may occur throughout the catchment and on the water body. Accordingly, contamination by virus and protozoa is very likely. E. coli levels in the raw water typically fall in the range 2,000–20,000 per 100 mL.

The recommended minimum pathogen reduction requirements to achieve one micro-DALY pp pa are specified in the ADWG for each source category as listed in Table 3. Note there are some differences between the requirements of the ADWG and HBT Manual.

Water treatment barrier assessment

The recommended water treatment requirements for each source category are listed in Table 4.

Table 3

Log removal requirements to achieve one micro-DALY pp pa vs. source category (as per ADWG (NHMRC 2022))

Source category   Protozoa log removal   Virus log removal   Bacteria log removal
Category 1        –                      –                   –
Category 2        3.0                    4.0                 4.0
Category 3        4.0                    5.0                 5.0
Category 4        5.0                    6.0                 6.0
Table 4

Water treatment vs. source category

Category 1: Category 1 source waters have a very low level of bacterial contamination, easily dealt with by chlorination alone. However, chlorination at the doses (concentration × time, or Ct) realistic for bulk water treatment will not inactivate Cryptosporidium, so it is critical that the sanitary survey confirms negligible sources of protozoan pathogens.

Category 2: Filtration is required to remove protozoan, viral and bacterial pathogens, followed by chlorination. However, the performance requirements for Category 2 filtration provide only modest removal of protozoa, so it is important that the sanitary survey confirms minimal sources of protozoa and that these are in the outer catchment and buffered from feeder streams. Note that where a large reservoir barrier is absent, filtration is usually required to reduce turbidity to ensure effective disinfection by chlorine and/or UV disinfection.

Category 3: The pathogen risk from a Category 3 source requires filtration with a higher performance specification than that required for Category 2 sources to ensure sufficient reduction of virus and protozoa prior to chlorination.

Category 4: The absence of a source protection barrier, and often a reservoir barrier, means a double treatment barrier is required to achieve not only the required pathogen removal but also the multiple barriers to contamination required by the ADWG. Typical treatment is filtration followed by ultraviolet disinfection followed by chlorination.

The log reduction credits for commonly used water treatment processes are listed in the ADWG and the HBT Manual, together with the operational performance criteria which must be achieved to claim the credits.

Quantitative risk assessment

The assignment of log reduction values to the source challenge and water treatment processes allows the risk assessment to be quantitative. The risk assessment process (called the Water Safety Assessment in the HBT Manual) involves comparing the source challenge (log reduction required for each reference pathogen) to the water treatment barrier performance (log credits available for each reference pathogen).

Where the water treatment barrier performance equals or exceeds the source challenge then the HBT is deemed to have been met.

Where the source challenge exceeds the water treatment capability, the residual risk will be greater than one micro-DALY. For example, if there is a one log shortfall in water treatment then the scheme would be operating at 10 micro-DALY, a two log shortfall at 100 micro-DALY, and so on.
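The arithmetic is a straightforward power-of-ten relationship. The following Python sketch is illustrative only: the required protozoa log reductions are transcribed from Table 3, while the treatment credit figure in the example is hypothetical.

```python
# Required protozoa log reductions per source category (from Table 3, ADWG).
# Category 1 sources are omitted: no protozoa treatment is required.
REQUIRED_PROTOZOA_LOGS = {2: 3.0, 3: 4.0, 4: 5.0}

def residual_micro_daly(source_category: int, treatment_log_credits: float) -> float:
    """Residual risk in micro-DALY pp pa: the HBT of one micro-DALY is met at
    zero shortfall, and each full log of shortfall raises the risk tenfold."""
    shortfall = max(0.0, REQUIRED_PROTOZOA_LOGS[source_category] - treatment_log_credits)
    return 10.0 ** shortfall

# Hypothetical Category 3 source credited with 3.0 log of protozoa treatment:
print(residual_micro_daly(3, 3.0))  # one log short -> 10.0 micro-DALY
```

Where the credits meet or exceed the challenge, the function returns 1.0 micro-DALY, i.e. the HBT is deemed to have been met.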

Subjective source risk assessment

At the Water Corporation, work on developing the ‘HBT’ approach to source assessments commenced in about 2010. Prior to then, subjective source risk assessments were the only option, using similar criteria for source classification and water treatment to those described previously, developed in about 2004 by Keith Cadee (a General Manager in the Water Corporation). With a sound method of determining water treatment requirements now available, the next question was how to assess the risk for systems whose water treatment barriers were inadequate to meet the challenge.

For the hazardous event of ‘harmful concentrations of pathogens entering the water supply through inadequate water treatment’, a table was developed showing likelihood for each combination of source challenge and treatment as shown in Table 5.

Table 5

Likelihood of harmful concentrations of source pathogens entering the water supply

Source challenge level   Level 1          Level 2    Level 3    Level 4
Category 1               Rare             Rare       Rare       Rare
Category 2               Likely           Rare       Rare       Rare
Category 3               Almost certain   Possible   Rare       Rare
Category 4               Almost certain   Likely     Possible   Rare

(Columns give the existing source treatment level.)

The rationale for the selection of likelihood follows.

Where the level of treatment meets or exceeds that required for the source category then the likelihood is rare (i.e., adequate barriers).

Where a source needs filtration but there is none (i.e., only Level 1 treatment by chlorination alone), then the likelihood is likely for Category 2 sources since, although the bacteria and virus load is moderate and may still be dealt with by chlorination alone, there is no barrier to protozoa. For the more heavily contaminated Category 3 and 4 sources, likelihood is almost certain because there is no protozoa barrier, and the bacteria and virus load is likely too high for chlorination alone.

For a Category 3 source with Level 2 treatment, at least there is some filtration which means disinfection is effective and there is a modest protozoa barrier. Bacteria and virus treatment may well be adequate, but it is still possible protozoan pathogens will enter the supply system on occasions.

Likewise for a Category 4 source with a Level 3 treatment, there is a very good filtration barrier, which together with chlorination (Ct > 15 mg/L min), probably means treatment for bacteria and virus is adequate. However, without UV it is possible protozoan pathogens will enter the water supply when there is a high challenge.
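The Table 5 matrix lends itself to a simple lookup. The sketch below is illustrative only (the encoding and function name are mine, not the Water Corporation's), keyed by source challenge category and existing treatment level:

```python
# Illustrative encoding of Table 5: likelihood of harmful pathogen
# concentrations entering the supply, keyed by (source challenge
# category, existing treatment level).
TABLE_5 = {
    (1, 1): "Rare",           (1, 2): "Rare",     (1, 3): "Rare",     (1, 4): "Rare",
    (2, 1): "Likely",         (2, 2): "Rare",     (2, 3): "Rare",     (2, 4): "Rare",
    (3, 1): "Almost certain", (3, 2): "Possible", (3, 3): "Rare",     (3, 4): "Rare",
    (4, 1): "Almost certain", (4, 2): "Likely",   (4, 3): "Possible", (4, 4): "Rare",
}

def source_pathogen_likelihood(category: int, treatment_level: int) -> str:
    """Look up the qualitative likelihood for a source/treatment pairing."""
    return TABLE_5[(category, treatment_level)]
```

For a Category 3 source with Level 2 treatment this returns ‘Possible’, matching the rationale above.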

With the availability of log values from the HBT work over the past decade, it is possible to reverse-engineer Table 5 to show the treatment log shortfall. Assuming chlorination achieving a Ct of 15 mg/L min is installed, the shortfall for protozoa was found to be higher than for the other pathogens, and is shown in Table 6.

Table 6

Likelihood of protozoa entering the water supply vs. water treatment log removal shortfall

Source challenge level | Existing treatment Level 1 | Level 2 | Level 3 | Level 4
Category 1 | Rare (0 log) | Rare | Rare | Rare
Category 2 | Likely (3 log) | Rare (0 log) | Rare | Rare
Category 3 | Almost certain (4 log) | Possible (1 log) | Rare (0 log) | Rare
Category 4 | Almost certain (5 log) | Likely (2 log) | Possible (1 log) | Rare (0 log)

The point of this exercise is to demonstrate that there is a good correlation between the qualitative assessment of likelihood and the actual shortfall in water treatment capability. This should provide confidence that the ‘barrier’ approach, when used for a qualitative risk assessment, can provide plausibly accurate results.
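As a sketch of that correlation (my own illustration, not part of the original method), the protozoa log shortfalls in Table 6 map monotonically onto the qualitative likelihoods:

```python
# Illustrative mapping from protozoa treatment log shortfall (Table 6)
# to qualitative likelihood, showing the correlation described above.
def likelihood_from_shortfall(shortfall_log: int) -> str:
    if shortfall_log <= 0:
        return "Rare"            # barriers meet or exceed the challenge
    if shortfall_log == 1:
        return "Possible"        # modest barrier gap
    if shortfall_log <= 3:
        return "Likely"          # substantial gap
    return "Almost certain"      # effectively no protozoa barrier
```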

More importantly, it can be appreciated that the ‘barrier’ approach recognises latent risks. If barriers are absent or underperforming, then the likelihood must be higher than ‘rare’, regardless of historical performance. The greater the gap between the barriers required and the barriers existing, the higher the likelihood of the hazardous event. Given the widespread acceptance of the ‘barrier’ approach for source risk assessment in Australia, the question arises as to whether it would also be an effective process for other water quality risk assessments. The answer is yes, as the next examples illustrate.

Example 2: Contamination of stored water

The ADWG requires barriers to contamination to be installed and maintained from catchment to tap. Whenever a water supply system is depressurised, there is a possibility of entry of contaminants. Tanks and reservoirs provide such an opportunity, and it is necessary to seal the storage to prevent ingress from storm and drainage water and access by animals and birds.

Notable water quality incidents resulting from contamination in water storages have occurred in Gideon, Missouri, USA in 1993 which infected more than 600 consumers and caused seven deaths (Clark et al. 1996; Angulo et al. 1997) and in Alamosa, Colorado, USA in 2008 (Falco & Williams 2009; Hrudey & Hrudey 2014) with 434 reported cases of gastroenteritis, 20 hospitalisations and one death.

About the turn of the century, the Water Corporation checked the sealing and security of all tanks and reservoirs. As there were over 900 storages to be checked, spread across a geographical area seven times that of the UK, the task was split among six teams comprising Head Office Water Quality staff and local asset managers. These teams used a common checklist for the review, but they did not rely on visual inspection alone. They innovated by accessing the inside of the tank and looking for leaks while other team members sprayed water on the roof. For larger in-ground storages, they rowed around in boats looking for visible ‘stars’ in the roof.

The process revealed that every tank had some sealing issue. The teams were required to do a risk assessment for each tank. The hazardous event to be assessed was ‘entry of harmful concentrations of pathogens to the stored water’. With Gideon in mind, it was accepted that the consequence of this event was major or catastrophic. However, the team members could not agree on the likelihood of the event. The Water Quality staff were very concerned that the primary barrier to contamination was unsound. They reasoned that it was probably the presence of a chlorine residual in the tank which was masking the problem by inactivating bacteria. Consequently, the routine but infrequent microbial samples were mostly free of indicator E. coli, but the teams knew the chlorine was a single barrier and not continuously monitored. The more frequent chlorine grab samples from the reticulation confirmed there were times when chlorine residual was absent in many systems, and there were plausible circumstances in which the residual could be lost in any tank. They reasoned the likelihood of pathogens in the tank had to be at least possible, if not likely. The Asset Management members noted that in many cases the roof had been like that for decades and, since ‘nothing had gone wrong’, they contended that under their risk framework the likelihood of the hazardous event must therefore be unlikely or rare.

The issue of latent risks was in focus again. Having the same risk scenario assessed differently depending on who does the assessment is untenable. The impasse went on for some time, with every tank improvement needing to be justified individually as funding was sought for sealing improvements. This was very inefficient. What was needed was a process with corporate endorsement that is repeatable and independent of who does the assessment. Most importantly, it should be consistent with the ADWG and ensure the utility meets its duty of care to protect its customers from foreseeable harm, but it should not result in ‘gold plating’ (unjustified investment).

This situation was the catalyst for development of the first ‘barrier’ approach to water quality risk assessment. It preceded the work on source risk assessment but followed the same process.

  • Step 1. Challenge assessment categories.

  • Step 2. Barrier requirement assessment.

  • Step 3. Likelihood matrix assessment.

  • Step 4. Determine challenge and barriers applicable to the scheme being assessed.

  • Step 5. Plot scheme characteristics on the likelihood matrix.

For the hazardous event of ‘entry of harmful concentrations of pathogens to the stored water’, these steps are explained in the following.

Challenge assessment categories

The challenge involves access to the water in the tank by animals and birds or ingress of rain and drainage water contaminated by animal and bird faeces. This is a straightforward challenge and there is no need to have gradations (or levels) of challenge for this event.

Barrier requirement assessment

This step involves deciding the multiple barriers which are required to meet the challenge. It was decided that two barriers were required.

The primary barrier is the sealing of tanks and reservoirs to prevent ingress of water which may be contaminated (e.g., from bird droppings on a tank roof). Overflow pipes and other points of access to the stored water should be screened to prevent access by birds and animals.

The secondary barrier is maintaining a chlorine residual >0.5 mg/L at all times. This should be sufficient to inactivate any bacteria which may enter the tank should there be a short-term failure of the primary barrier. It was decided that bacterial contamination was the primary concern, since access to the water body by humans and animals which may carry human-infectious virus and protozoa was only a remote possibility.

Likelihood assessment

This step involves constructing a matrix of likelihood for all combinations of challenge and barriers. In this case, since there is only one level of challenge, the matrix is simply likelihood vs. any combination of barriers as shown in Table 7.

Table 7

Barriers vs. likelihood

Tank sealed (primary barrier) | Chlorine residual >0.5 mg/L (secondary barrier) | Likelihood
Yes | Yes | Rare
Yes | No | Possible
No | Yes | Likely
No | No | Almost certain

This assessment is by necessity subjective. The likelihoods were estimated by a team of experienced water quality professionals. The rationale for the selection of likelihood follows:

Birds and animals can carry bacterial pathogens which are infectious to humans. With two reliable barriers to contamination, it will be rare to have a contamination event as both barriers must fail simultaneously.

Conversely, without any barriers, it is inevitable (almost certain) that contamination of the stored water will occur.

This finding does not mean it will happen many times per year. It means it is inevitable – a question of ‘when’, not ‘if’, the hazardous event will occur. It does mean that when it rains and there are fresh bird droppings with pathogens on the roof, contamination will occur and waterborne disease may result. As at Walkerton, ignoring the issue is a case of rolling the dice until the holes in the Swiss cheese line up.

Tank roofs are a live asset where hatches can be accidentally left open, seals degrade, holes get drilled, etc. Where the roof is the only barrier, it is possible that a breach can occur between inspections and that it could rain in that period and wash bird droppings into the tank.

There are many operational scenarios which can cause the chlorine residual in a tank to be depleted or lost altogether. Where the roof (being the primary barrier to contamination) is not secure then it is likely that contamination of the stored water will occur if maintaining a chlorine residual is the only barrier.
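The rationale above can be condensed into a small decision sketch mirroring Table 7 (hypothetical code; the function and argument names are mine):

```python
# Table 7 as a decision function: likelihood of pathogen entry to
# stored water given the state of the two barriers. Illustrative only.
def stored_water_likelihood(tank_sealed: bool, residual_ok: bool) -> str:
    if tank_sealed and residual_ok:
        return "Rare"            # both barriers must fail simultaneously
    if tank_sealed:
        return "Possible"        # sealing alone; a breach can occur between inspections
    if residual_ok:
        return "Likely"          # residual alone; residual can be depleted or lost
    return "Almost certain"      # no barriers: 'when', not 'if'
```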

Because the risk assessment is subjective, other assessors may determine different likelihoods. Regardless of who does the assessment, the methodology demands that the latent risk be recognised. The likelihood can only be ‘rare’ when all the required barriers to contamination are in place and operating correctly and reliably, regardless of ‘historical’ performance. Again, the greater the gap between the barriers existing and those required, the higher the likelihood of the hazardous event.

Operational monitoring

Every risk assessment informs the associated operational monitoring plan. Operational monitoring comprises measurements and observations which provide ongoing confirmation that the challenge and barrier performance align with the WSP, or early warning that there has been a change in system status.

To confirm the tank sealing barrier is functioning correctly it is necessary to instigate an appropriate inspection programme such as a weekly site security check including roof hatches and screens on overflow and scour pipes; a monthly visual check of the roof; and an annual asset management check including from inside the tank while the roof is doused with water. At the Water Corporation, date-based work orders are generated by the Assets work management system and issued to Operations staff to undertake the required inspections. Where the inspection indicates barrier remediation is required then additional work orders are generated with a nominal time for work completion which is then tracked. Sometimes additional actions are initiated where a barrier is compromised. For example, chlorine residuals will be increased and checked more frequently if a roof is damaged in a storm.

Similarly, the residual on the tank outlet should be regularly checked (or, preferably, continuously monitored) to provide confidence this barrier is maintained at the desired level. All chlorine residual monitoring points have a target and an upper and lower bound. Out-of-specification conditions are alarmed on the Supervisory Control and Data Acquisition (SCADA) system where monitoring is continuous, and on personal digital assistants (PDAs) for grab sample checks. Out-of-specification conditions require an urgent response.

Records of the inspections, chlorine residual, out of specification conditions and remedial actions need to be kept so that the risk assessment can be updated as part of the annual review of the WSP. If the review indicates the barriers are not as reliable as estimated previously, then the likelihood will need to be modified, resulting in a higher risk. This will be the trigger for improvements to restore barriers to contamination to the required standard.

Example 3: Naegleria risk assessment

Naegleria challenge

The hazardous event which is the subject of this assessment is supplying water which is contaminated by Naegleria. Naegleria fowleri is a free-living, thermophilic amoeboflagellate which causes the waterborne disease primary amoebic meningoencephalitis (PAM). The route of infection is via the nasal passage, and PAM is almost always fatal. The Western Australian Department of Health requires all drinking water systems in WA to be free of Naegleria.

The ADWG advises that water with a temperature continuously above 25 °C or seasonally above 30 °C can support the growth of N. fowleri.

The Water Corporation has detected Naegleria in water at temperatures below 25 °C and therefore 20 °C has been adopted in WA to provide a safety factor.

All Water Corporation schemes are sampled and tested for Naegleria at least once per week. The level of challenge is therefore largely indicated by the history of detections over the previous five years.

Table 8 summarises the levels of challenge.

Table 8

Naegleria challenge and characteristics

Naegleria challenge | Characteristics
Level 1 | Water temperature always <20 °C; no Naegleria detections in the reticulation system, ever
Level 2 | Water temperature >20 °C in the past 5 years; no Naegleria detections in the reticulation system in the past 5 years
Level 3 | Water temperature >20 °C in the past 5 years; Naegleria detected in the reticulation system in the past 5 years
Level 4 | Reticulation system is colonised, as indicated by multiple detections each year
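Table 8 can be read as a classification rule. The sketch below is one hypothetical encoding (the parameter names are mine, not part of the published method):

```python
# Illustrative classifier for the Naegleria challenge level (Table 8).
def naegleria_challenge(temp_exceeded_20c: bool,
                        detections_past_5yr: int,
                        ever_detected: bool,
                        colonised: bool) -> int:
    if colonised:                               # multiple detections each year
        return 4
    if not temp_exceeded_20c and not ever_detected:
        return 1                                # water never warm enough
    if detections_past_5yr == 0:
        return 2                                # warm water, no recent detections
    return 3                                    # warm water with recent detections
```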

Naegleria barrier assessment

Naegleria is a difficult microorganism to control because it can quickly change into different forms:

  • Trophozoite – infective, feeding, and replicative stage.

  • Flagellate – temporary non-feeding stage stimulated by adverse conditions, e.g., a reduced food source.

  • Cyst – formed when the environment is not conducive to continued feeding and growth, e.g., low temperature. Cysts are environmentally resistant, increasing Naegleria survival until conditions improve.

Chlorination at Ct 15 mg/L min will control Naegleria in the trophozoite stage. However, in the cyst form Naegleria are more resistant to chlorine and can survive for months, even years, in a chlorinated water supply where sediments, biofilm, pipe joints and filters provide protection from chlorine.
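Ct is simply the disinfectant concentration multiplied by contact time. A minimal check against the 15 mg/L min figure used here would be (function names are illustrative):

```python
# Ct = free chlorine concentration (mg/L) x contact time (minutes).
def ct_value(residual_mg_per_l: float, contact_time_min: float) -> float:
    return residual_mg_per_l * contact_time_min

def meets_ct_target(residual_mg_per_l: float, contact_time_min: float,
                    target: float = 15.0) -> bool:
    """True if the achieved Ct meets the 15 mg/L min target."""
    return ct_value(residual_mg_per_l, contact_time_min) >= target
```

For example, a 0.5 mg/L residual needs at least 30 minutes of contact time to achieve Ct 15.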

Water industry experience in WA (from a combination of observation and science) is that the pre-conditions for managing the Naegleria risk down to an acceptable level are:

  • Continuous disinfection at Ct >15 mg/L min. This will inactivate mobile Naegleria in the trophozoite form, but some cysts may still survive.

  • A continuous chlorine residual greater than 0.5 mg/L maintained throughout the water supply system for 18 months. This keeps the Naegleria in the cyst form; they do appear to ‘die off’ over that time, and the likelihood of further detections is very low, so long as chlorination and the chlorine residual are continuous.

If chlorination fails for even a short time during these 18 months, the Naegleria will change to the trophozoite form and commence feeding and replicating. This means another 18 months of continuous chlorine residual is required to purge the Naegleria from the water supply system.

The 2008 paper titled ‘Operational management of Naegleria spp. in drinking water supplies in Western Australia’ by Trolio et al. (2008) explained how these principles were being applied in Water Corporation schemes. Subsequent research by CSIRO on a live Water Corporation scheme explained the science behind the observations and experience. The paper titled ‘Elimination of Naegleria fowleri from bulk water and biofilm in an operational drinking water distribution system’ by Miller et al. (2017) notes that Naegleria can survive intermittent doses of chlorine and that Naegleria associated with pipe wall biofilm is even more resistant to chlorine due to the added protection of the biofilm. The research showed that maintaining the target chlorine residual removed the Naegleria from the bulk water phase and also the pipe wall associated biofilm. The reason for this was changes in the biofilm which prevented it being recolonised by Naegleria. This explains the apparent ‘die-off’ after 18 months of continuous chlorination and chlorine residual in the pipe network.

For the purpose of the barrier risk assessment, the barriers to Naegleria are summarised in Table 9.

Table 9

Characteristics of Naegleria barrier levels

Barrier level | Characteristics
Level 0 | Chlorination not to CCP standards (a)
Level 1 | Reliable chlorination to CCP standard
Levels 2 and 3 | Reliable chlorination to CCP standard; free chlorine residual >0.5 mg/L maintained at all times in the last 5 years
Level 4 | Filtration to remove at least 3 log Naegleria; reliable chlorination to CCP standard; free chlorine residual >0.5 mg/L maintained at all times in the last 5 years

(a) CCP stands for critical control point. Because continuous chlorination is imperative to manage Naegleria, the chlorination facility must incorporate the following CCP features: flow-paced and chlorine residual control; chlorine residual continuously monitored and alarmed; and fail-safe operation, i.e., flow shut down if the chlorine residual is <0.5 mg/L.

Naegleria likelihood assessment

The likelihood of supplying water with Naegleria can be determined by considering the challenge and barrier level, as summarised in Table 10.

Table 10

Likelihood of supplying water with Naegleria

Naegleria challenge | Barrier Level 0 | Level 1 | Levels 2 and 3 | Level 4
Level 1 | Unlikely | Rare | Rare | Rare
Level 2 | Likely | Unlikely | Rare | Rare
Level 3 | Almost certain | Likely | Possible | Rare
Level 4 | Almost certain | Almost certain | Likely | Rare
Table 11

Typical set points for chlorination

Set point | Chlorination (mg/L)
Target | 1.5
Upper limit | 1.8
Lower limit | 1.2
Critical limit | 1.0

Table 10 was prepared by a team of experienced water quality professionals.

The rationale for the selection of likelihood follows:

Wherever the barrier equals or exceeds the challenge then the likelihood is ‘rare’.

A scheme with a Level 1 challenge always has a water temperature below that required to support Naegleria.

With high quality chlorination and no history of Naegleria detections, the likelihood of Naegleria in the water supply is ‘rare’.

Where the chlorination barrier is not to CCP standard, there is only a slightly higher likelihood (‘unlikely’), since the low water temperature and the absence of any history of Naegleria detections militate against Naegleria growth in the water supply.

A scheme with a Level 2 challenge has a water temperature which could support Naegleria growth but no history of detections. Given the ubiquitous nature of Naegleria, operating without CCP standard chlorination leaves the scheme vulnerable to entry of Naegleria if the chlorinator fails and the likelihood assigned is ‘likely’. If CCP chlorination is in place the likelihood reduces to ‘unlikely’ as a chlorinator failure will result in supply ceasing and water with a low chlorine residual (which might contain Naegleria) will not be admitted to the system.

A scheme with a Level 3 challenge has a track record of Naegleria growth. If the system only has Barrier Level 0 (i.e., it does not have a CCP-standard chlorinator and does not maintain a chlorine residual >0.5 mg/L), then there are no adequate barriers and it is ‘almost certain’ Naegleria will populate the system.

If this Level 3 challenge scheme has CCP chlorination but does not maintain a residual (Barrier level 1) then most Naegleria will be inactivated by the chlorinator as they enter the system, but it is ‘likely’ some Naegleria will survive and populate the system in areas where the chlorine residual is low or absent.

If this scheme has CCP chlorination and maintains a chlorine residual it is still ‘possible’ the system has Naegleria as they can survive these conditions for about 18 months. Once the system has operated Naegleria free for five years then the challenge will revert to Level 2 and the likelihood of Naegleria in the system will be ‘rare’.

A challenge Level 4 system is so severely contaminated with Naegleria that it is considered colonised and requires filtration to provide a Naegleria free supply. The author has not encountered this situation.

Chlorination and chlorine residual maintenance are unlikely to be sufficient to inactivate all the Naegleria in such a grossly contaminated system and therefore, it is still ‘likely’ that Naegleria will be present in systems with a Level 2/3 barrier. Barrier Level 0 and Level 1 systems will be completely overwhelmed, and the likelihood is ‘almost certain’ that Naegleria will be in the system.

One of the schemes managed by the Water Corporation is a very large rural water supply covering an area about 600 km × 300 km and comprising about 12,000 km of pipe and numerous pump stations and tanks. It was constructed after World War 2 to provide a water supply to farming communities and encourage servicemen back onto the land. When this scheme was built, water quality was not a consideration.

After two deaths in the 1980s caused by N. fowleri (after swimming in a pool supplied from the scheme), attention was focussed on improving water quality, but the barrier requirements were not well understood at that time, particularly the need for continuous chlorine residual throughout the pipe network. Chlorination and rechlorination were installed throughout the scheme with varying degrees of success. Naegleria detections were still common.

While there were regular Naegleria detections from towns in the scheme, it had proven difficult to mount a strong case for resources to improve water quality performance, for the reasons outlined earlier in this paper. As at Walkerton, it was nearly 20 years since there had been a case of PAM in WA, and it was difficult to gain acceptance that the likelihood of an incident was ‘almost certain’. In addition, the funding model in place at the time discouraged investment in such rural schemes.

Knowledge of the barriers required to control Naegleria was sufficient by the early 2000s to undertake a ‘barrier’ assessment for the approximately 30 towns in the scheme.

Most of these towns were determined to be a Level 3 challenge: the predominantly above-ground pipes ensured the water temperature always exceeded 20 °C (and was considerably higher in summer), and most towns had experienced Naegleria detections in the preceding five years.

The barrier level for most towns was assessed as Level 0 because the chlorinators were not to CCP standard. Note that most of these facilities were rechlorination facilities (not source chlorinators), so there were no concerns about the safety of the water for drinking, only about maintaining a continuous chlorine residual. Even where chlorination was to CCP standard, many towns could not maintain a chlorine residual due to the hydraulic characteristics of the system, which resulted in extended water age. These towns were classified as having a Level 1 barrier.

With a Level 3 challenge and Level 0 barrier, reference to Table 10 indicates the likelihood of Naegleria in the water supply is rated ‘almost certain’. If the barrier is Level 1 the likelihood is ‘likely’. The Water Corporation rates the consequence of Naegleria contamination as ‘catastrophic’. With a likelihood of ‘almost certain’ or ‘likely’ the risk is rated as ‘Very high’.
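This likelihood-times-consequence step can be sketched as a lookup row for a ‘catastrophic’ consequence. Note this is illustrative only: the paper quotes the ratings for ‘almost certain’ and ‘likely’, but the Water Corporation's full corporate matrix is not reproduced here, so the remaining ratings are my assumption.

```python
# Illustrative risk-matrix row for a 'catastrophic' consequence,
# consistent with the ratings quoted in the text. Entries marked
# 'assumed' are NOT from the paper.
CATASTROPHIC_ROW = {
    "Almost certain": "Very high",
    "Likely": "Very high",
    "Possible": "High",        # assumed
    "Unlikely": "High",        # assumed
    "Rare": "Medium",          # assumed
}

def naegleria_risk(likelihood: str) -> str:
    """Risk rating for the (catastrophic) Naegleria hazardous event."""
    return CATASTROPHIC_ROW[likelihood]
```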

The Naegleria barrier risk process and results were reviewed and endorsed by the Corporate Water Quality Committee, which is a governance subcommittee of the Executive. Accordingly, the Naegleria risk results were accepted for incorporation in the Corporation-wide risk profile.

There are very few risks rated as high as ‘Very high’ across the Corporation, so it immediately attracted the attention of the Board. Having shone a light on this risk via the ‘barrier’ risk assessment, corporate priority was now given to reducing it through operational and capital investment. This was neither an easy nor an inexpensive task. Upgrading chlorination facilities to CCP standard was straightforward; maintaining a chlorine residual was the big challenge. A small water demand over a vast supply area automatically implies extended water age and excessive chlorine residual depletion. As mentioned, the scheme was designed to deliver quantity, not quality. Tanks were bypassed at times, meaning the water stagnated and lost chlorine residual, only to be released into the system on days of high demand. Most tanks had combined inlet/outlet pipes, which created null points in the pipe network with resultant loss of chlorine residual. This vast system had to be hydraulically re-engineered to minimise water age and maximise chlorine residual.

In terms of risk reduction, the results speak for themselves. In 2000, about the time when the barrier assessment was undertaken, there were 20 Naegleria detections that year. The detections declined over the next decade as capital and operational improvements were made. In the six-year period from 2011 to 2016 there were a total of two detections and in five out of the six years there were no detections.

The barrier assessment had been instrumental in identifying which schemes had gaps with respect to barriers required versus those existing. This enabled the latent risk to be correctly included in the risk assessment. The inevitability of an incident could therefore be well demonstrated, and the correct likelihood applied in the risk assessment, despite the absence of bodies!

It is never easy to gain resources for new initiatives like the Naegleria improvements which flowed from the barrier assessment. However, an important result of the ‘barrier’ approach is that it makes it clear what needs to be done to mitigate the risk and therefore the water supplier's duty of care to protect consumers from foreseeable harm. As a governance committee, the Corporate Water Quality Committee well understood this requirement which then became a driver for timely reduction in risk.

This leads into one of the other advantages of the barrier approach. Reductions in risk can be accurately assessed and reported. The likelihood of Naegleria detections for many towns in the scheme can be reported to have reduced from ‘almost certain’ to ‘rare’. Executives and the Board naturally like to see a risk reduction result following significant investment and the ‘barrier’ approach can demonstrate this.

Operational monitoring

The Naegleria challenge category is based on water temperature and Naegleria detections from verification monitoring in the reticulation of towns (sampled at least weekly). Water temperature is routinely monitored with microbiological samples. Any result which indicates temperature above 20 °C for the first time will flag as an ‘exception’ and is investigated to see if it affects the Naegleria risk. Any Naegleria detection is treated as a major corporate incident and triggers a significant operational response. This includes revising the ‘risk’ status of the town if appropriate.

In terms of monitoring barrier performance, the key parameters are:

  • Continuous chlorination which achieves a Ct of 15 mg/L min.

  • A continuous chlorine residual >0.5 mg/L across the pipe network.

The chlorine residual after chlorination should be continuously monitored and typically have the set points shown in Table 11.

These set points are higher than commonly found in an urban supply but necessary given the long distances and therefore extended water age typical of these rural schemes.

If the upper or lower limits are exceeded, then alarms are activated, and maintenance staff will be called to site. Most chlorinators are located at pump stations. If the critical limit is breached the pump will be shut off to prevent water with inadequate chlorine residual entering the water supply system. Supply can be maintained from storage tanks pending repair of the chlorinator. Many sites are rechlorinating water and therefore they are not CCPs in the sense of being critical to the supply of safe drinking water. However, they are managed as CCPs because they are critical to the control of Naegleria.
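The set-point logic described above amounts to a simple banded check against the Table 11 values. The sketch below is hypothetical (not the Corporation's actual SCADA configuration):

```python
# Banded response to a chlorine residual reading, using the Table 11
# set points (target 1.5, limits 1.2/1.8, critical 1.0 mg/L).
CRITICAL, LOWER, UPPER = 1.0, 1.2, 1.8

def residual_response(residual_mg_per_l: float) -> str:
    if residual_mg_per_l < CRITICAL:
        return "shutdown"   # pump shut off: inadequately chlorinated water must not enter supply
    if residual_mg_per_l < LOWER or residual_mg_per_l > UPPER:
        return "alarm"      # alarm activated; maintenance staff called to site
    return "ok"             # within the operating band around the 1.5 mg/L target
```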

To confirm chlorine residual > 0.5 mg/L across the pipe network, grab samples are taken from near the end of each system two or three times per week. With chlorinator performance continuously monitored and available for review on SCADA, experience has shown this frequency is adequate for assurance purposes. Trends are monitored and if necessary, chlorinator dose rates are adjusted, or tank operating storages lowered to reduce water age.

Records of chlorination performance, out of specification conditions and remedial actions, chlorine residuals and Naegleria monitoring results from the pipe network are kept so that the risk assessment can be updated as part of the annual review of the WSP. If the review indicates the barriers are not as reliable as estimated previously or the challenge has increased, then the likelihood will need to be modified, resulting in a higher risk. This will trigger a review of barrier performance and upgrades will be initiated where warranted.

Applicability of the barrier approach

The ‘barrier’ approach was conceived in a large water utility environment to ensure latent risks are acknowledged in the risk assessment and to demonstrate the key principles in the ADWG relating to multiple barriers to contamination are being achieved. The reality is that microbial risk assessments are not easy and difficulties with their implementation are not confined to large utilities.

My experience with smaller utilities and municipal water providers is that they want to do their own risk assessments. They have often read the ADWG and have a good idea of what to do. But how do you undertake an effective microbial risk assessment?

The often-small water quality team may not face the pressure of conforming to the assets or corporate risk assessment model described for a large utility, but how do you decide the likelihood of pathogens in the water supply if you are starting from scratch? It is impractical to measure specific pathogens as you can with chemical and other contaminants.

Because it is so hard, some will just assume the supply is safe because there have been no reports of illness and turn their attention to chemical and other risks which can be easily determined and measured. In such cases they have inadvertently adopted a de facto ‘historical’ approach to likelihood and failed to explore the possibility of ‘latent’ risk in their water supplies.

Some industries, like mining, also manage drinking water supplies. Because water supply is not their core business, these companies sometimes ask consultants to carry out a water quality risk assessment for their drinking water supplies. Consultants usually have their own or proprietary models for assessing risk. Feedback from some clients is that the resultant risk assessment is voluminous, with hundreds of hazards identified but little information on how the final risk was determined (particularly likelihood). This lack of understanding of the risk assessment process by the client can lead to a lack of confidence in the outcomes. With so many problems identified, the client can struggle to prioritise tasks and work out where to start in terms of improvements. Unfortunately, this can result in delays or no progress at all.

In contrast, when I have presented clients with a barrier risk assessment, the concept is quickly understood, and the outcomes more readily accepted. Improvement programmes are more easily prioritised as microbial risk is always the highest. The direct connection between the risk assessment and operational monitoring is a neat package which operators can also appreciate, since every parameter which is monitored has an action level and they know the monitoring is for a reason.

The examples above demonstrate that the ‘barrier’ approach to microbial risk assessment can be applied to all schemes and in all circumstances. The ‘barrier’ approach is not some radical new idea about how to ensure the success of WSPs for a wide variety of schemes. It is an explanation of successful experience over nearly two decades of implementation. In that time the scope of assessments has been expanded to include a suite of ‘generic’ risks applicable to Western Australian water supplies, as listed in the following:

  • Harmful concentrations of pathogens entering the water supply due to:

    - Inadequate treatment

    - Contamination of treated water storages

    - Accidental bypassing of treatment

    - Contamination from backflow and cross connection to non-potable supply

  • Supply of water containing Naegleria

  • Supply of water containing a chemical with a concentration greater than the health guideline in the ADWG

  • Supply of water containing a pesticide

  • Supply of water containing nitrate above the guideline for infants

  • Supply of water containing nitrate above the guideline for adults

  • Supply of water with high THM concentration

  • Supply of water with high concentration of TDS and hardness

The same ‘barrier’ process is effective no matter the size, type and location of the scheme.

The source risk assessment process described in the WSAA HBT Manual and the ADWG (NHMRC 2022) has quickly become well accepted and established, even in the non-utility sector. As mentioned, the source risk and water treatment performance assessments in the HBT Manual and ADWG are an example of the ‘barrier’ approach to risk assessment. As chair of the group which developed the WSAA Manual, I became very aware of the need and desire within the Australian water industry for practical and robust guidance on how to do source risk assessments. Following the successful adoption of the microbial HBT in Australia, it now seems appropriate to share the ‘barrier’ approach for non-source risks. This is a ‘lived’ experience from the Water Corporation, and the process is easily transferable.

If taken up, it would mean the same ‘barrier’ approach would be used to assess the likelihood of all water quality risks from catchment to tap. This makes sense for consistency of approach and outcomes. More importantly, this approach would operationalise the key requirement in the ADWG for multiple barriers to contamination from catchment to tap which would be an outstanding national outcome in terms of addressing latent risks and therefore improving drinking water safety.

A word on consequence

The barrier assessment is used to determine the likelihood of a hazardous event. The consequence is fixed for each hazardous event, as experience has shown that variable consequences introduce a degree of complexity that is not warranted for a subjective risk assessment. The consequence of the hazardous event ‘concentration of harmful pathogens in the water supply’ has been rated as catastrophic by the Water Corporation due to the possibility of an outbreak of waterborne disease resulting in widespread illness and even death. Note that this consequence is not just based on public health outcomes but also on ‘corporate’ factors including reputation, financial, service interruption and compliance criteria. Others may have a different view about what consequence is applicable for their locality. The point is that the consequence ‘is what it is’ and a water provider cannot change it. What a water provider can do is change the likelihood of the hazardous event. Accordingly, the focus should be on the provision of adequate and reliable barriers to contamination.
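The fixed-consequence scheme described above amounts to a risk matrix in which only the likelihood varies between assessments. The sketch below illustrates the idea; the likelihood labels follow standard qualitative risk matrices, but the specific risk ratings shown are hypothetical, not the Water Corporation's actual matrix.

```python
# Illustrative only: a qualitative risk matrix where the consequence of a
# hazardous event is fixed ('catastrophic') and only likelihood is assessed.
LIKELIHOODS = ["rare", "unlikely", "possible", "likely", "almost certain"]

# Hypothetical ratings for the 'catastrophic' consequence row; a real
# matrix would be calibrated corporately.
CATASTROPHIC_ROW = ["medium", "high", "extreme", "extreme", "extreme"]

def risk_rating(likelihood, consequence="catastrophic"):
    """Look up the risk rating for a given likelihood with fixed consequence."""
    if consequence != "catastrophic":
        raise ValueError("only the fixed 'catastrophic' row is sketched here")
    return CATASTROPHIC_ROW[LIKELIHOODS.index(likelihood)]

print(risk_rating("rare"))      # with adequate, reliable barriers in place
print(risk_rating("possible"))  # with a barrier shortfall
```

Because the consequence row never changes, the only lever available to the water provider is to move leftwards along the likelihood axis by providing adequate and reliable barriers.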

Risk assessment approaches which primarily rely on asset management-orientated historical performance to determine the likelihood of a hazardous event are unsuitable for drinking water quality because they do not recognise ‘latent’ risks associated with absent or underperforming barriers to contamination. Most outbreaks occur when circumstances result in the ‘latent’ risk transforming into a ‘real’ risk.

Drinking water quality risk assessments should be based on a ‘barrier approach’ to align with the ADWG and other related international guidance. Where truly adequate and reliable multiple barriers to contamination are present, then the likelihood of a hazardous event is rare. This is where water providers should position themselves with respect to microbial contamination. Where barriers are absent, inadequate, or unreliable then the likelihood is NOT rare. The likelihood will be unlikely, possible, likely or almost certain depending on the nature and extent of the barrier shortfall.
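The core rule above can be expressed very simply: the wider the gap between the barriers required to meet the challenge and the adequate, reliable barriers actually in place, the higher the assessed likelihood. The sketch below assumes a one-step-per-missing-barrier mapping purely for illustration; in practice the mapping would be assigned by experienced practitioners for each matrix of challenge and barriers.

```python
# Hypothetical sketch of the 'barrier' likelihood rule: likelihood rises
# with the shortfall between barriers required and barriers in place.
# The one-step-per-barrier mapping is illustrative only.
LIKELIHOOD_BY_SHORTFALL = ["rare", "unlikely", "possible", "likely", "almost certain"]

def barrier_likelihood(barriers_required, barriers_in_place):
    """Map the barrier shortfall to a qualitative likelihood category."""
    shortfall = max(0, barriers_required - barriers_in_place)
    # cap at the top of the scale for large shortfalls
    return LIKELIHOOD_BY_SHORTFALL[min(shortfall, len(LIKELIHOOD_BY_SHORTFALL) - 1)]

print(barrier_likelihood(3, 3))  # all required barriers present -> 'rare'
print(barrier_likelihood(3, 1))  # two barriers missing -> higher likelihood
```

Note that ‘in place’ here must mean adequate and reliable: an underperforming barrier counts towards the shortfall, not towards the total.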

The ‘barrier’ approach to water quality risk assessment is neither radical nor new, having been used for over a decade at the Water Corporation. The methodology is not complex. The most difficult aspect is working out what barriers to contamination are required to meet the challenge from each hazardous event under consideration. This is the essence of good water quality management and a necessary step for any drinking water quality risk assessment. This risk assessment should be completed by people with relevant knowledge of water quality principles and experience with water quality management of water supply schemes. Likewise, such people are also required to use their knowledge and experience to assign likelihood to the matrix of challenge and barriers relevant to each hazardous event.

Once this framework is complete, the utility has a risk model it can apply consistently across all its schemes, regardless of type, size or location. This means the results are consistent, repeatable and are not dependent on who carries out the assessment.

The barrier risk assessment links directly to an operational monitoring programme which should be designed to regularly confirm that the estimated level of challenge has not been exceeded and the performance of barriers to contamination is adequate. Thus, the risk assessment is continuously validated, or early warning will be provided of a system change.

The Water Corporation of Western Australia supplies drinking water to the City of Perth, with a population of over 2 million people, and 220 regional and rural towns spread across one-third of continental Australia. I served as the manager of Drinking Water Quality at the Water Corporation from 1996 to 2016, the last two years on secondment to the Water Services Association of Australia (WSAA) as Chair of the group preparing the Manual for the Application of Health-Based Targets for Drinking Water Safety (the HBT Manual). This paper is based primarily on my experience with water quality management at the Water Corporation where I was responsible for implementing the 1996, 2004, and 2011 versions of the Australian Drinking Water Guidelines (ADWG). I am a strong advocate of the ADWG, particularly the Framework for Management of Drinking Water Quality which was the centrepiece of the 2004 ADWG. The ADWG has always been my ‘bible’, but I found it necessary to develop supplementary internal water quality risk assessment guidance to ensure consistency of approach across all schemes and that the outcomes were accepted corporately, which is a prerequisite for obtaining scarce funds for water quality improvements.

The work undertaken for source risk assessment at the Water Corporation in the early 2000s was the foundation for the HBT Manual, published by WSAA in 2015. The HBT Manual has been widely accepted and adopted within the water industry for assessing source risk and adequacy of source treatment. The approach (with some modification) was incorporated in the ADWG in 2022. The source risk methodology in the HBT Manual and ADWG is a ‘barrier’ approach. This paper builds on that success by explaining the ‘barrier’ approach for non-source risks which was also developed at the Water Corporation in the early 2000s and used internally ever since.
This paper is based on my experience and recollections as manager of the Drinking Water Quality Branch at the Water Corporation. I extend my appreciation to all the staff who contributed to the development of the barrier approach for source and non-source risk assessments. I also acknowledge the support of the executive for their encouragement and subsequent endorsement of the approach and outcomes. Finally, I thank the current Head of Water Quality, Rachael Miller, for her support in sharing the ‘barrier’ approach with others outside the Water Corporation and continuing to work with the wider Australian Water Industry on the application of the ADWG by water utilities, with the objective of supporting the wider drinking water quality management community with this most important aspect of drinking water quality management.


Richard Walker was the manager of Drinking Water Quality at the Water Corporation of Western Australia from 1996 to 2016 during which the ‘barrier’ approach to water quality risk assessments was developed and implemented. The method has been successfully deployed for all Water Corporation schemes for nearly two decades. The examples and discussion in this paper are based on his experience at the Water Corporation, international collaborations, and subsequent consulting.

All relevant data are included in the paper or its Supplementary Information.

The author declares there is no conflict.

Angulo, F. J., Tippen, S., Sharp, D. J., Payne, B. J., Collier, C., Hill, J. E., Barrett, T. J., Clark, R. M., Geldreich, E. E., Donnell, H. D. & Swerdlow, D. L. 1997 A community waterborne outbreak of salmonellosis and the effectiveness of a boil water order. Am. J. Public Health 87 (4), 580–584.

Clark, R. M., Geldreich, E. E., Fox, K. R., Rice, E. W., Johnson, C. H., Goodrich, J. A., Barnick, J. A. & Abdesaken, F. 1996 Tracking a Salmonella serovar typhimurium outbreak in Gideon, Missouri: role of contaminant propagation modelling. J. Water SRT – Aqua 45 (4), 171–183.

Falco, R. & Williams, S. I. 2009 Waterborne Salmonella Outbreak in Alamosa, Colorado March and April 2008: Outbreak Identification, Response and Investigation. Colorado Department of Public Health and Environment, Safe Drinking Water Program, Water Quality Division, Denver, CO.

Hrudey, S. E. 2001 Drinking water quality – a risk management approach. Water 26 (1), 29–32.

Hrudey, S. E. & Hrudey, E. J. 2014 Ensuring Safe Drinking Water – Learning from Frontline Experience with Contamination. American Water Works Association, Denver, CO, p. 269.

Hrudey, S. E. & Walker, R. 2005 Walkerton – 5 years later. Tragedy could have been prevented. Opflow 31 (6), 1, 4–7.

Miller, H., Morgan, M., Wylie, J., Kaksonen, A., Sutton, D., Braun, K. & Puzon, G. 2017 Elimination of Naegleria fowleri from bulk water and biofilm in an operational drinking water distribution system. Water Res. 110, 15–26.

NHMRC 2022 Australian Drinking Water Guidelines 6, 2011, Version 3.8, Updated September 2022. National Health and Medical Research Council, Canberra.

O'Connor, D. R. 2002 Report of the Walkerton Inquiry – Part 1 – The Events of May 2000 and Related Issues. Available from: http://www.archives.gov.on.ca/en/ (accessed 10 April 2023).

Reason, J. 1990 Human Error. Cambridge University Press, Cambridge, UK.

Reason, J. 2000 Human error: models and management. Brit. Med. J. 320, 768–770.

Trolio, R., Bath, A., Gordon, C., Walker, R. & Wyber, A. 2008 Operational management of Naegleria spp. in drinking water supplies in Western Australia. Water Sci. Technol. Water Supply 8 (2), 207–215.

Walker, R. 2016 The water safety continuum: a practical way to implement a health-based target for microbial quality. Water 1 (1), 008.

WHO 2008 Guidelines for Drinking Water Quality, 3rd edn. World Health Organization, Geneva.

WSAA 2015 Manual for the Application of Health-Based Targets for Drinking Water Safety. Water Services Association of Australia.

Wu, S., Hrudey, S. E., French, S., Bedford, T., Soane, E. & Pollard, S. 2009 A role for human reliability analysis in preventing drinking water incidents and securing safe drinking water. Water Res. 43, 3227–3238.
This is an Open Access article distributed under the terms of the Creative Commons Attribution Licence (CC BY-NC-ND 4.0), which permits copying and redistribution for non-commercial purposes with no derivatives, provided the original work is properly cited (http://creativecommons.org/licenses/by-nc-nd/4.0/).