With increasing interest in implementing contaminant warning systems for water distribution systems, questions remain about their application to real distribution systems. A methodology is described to assess the impact of the number of sensors on the time delay required to detect a contaminant intrusion event, and to maximize sensor detection redundancy as protection against false positives. The methodology is used to explore the point of diminishing marginal return in detection likelihood and in the average time delay of detected intrusion events. Pareto-front performance improvement with increasing numbers of sensors (from 2 through 50) is characterized through a case study application to the City of Guelph water distribution system (WDS). The results provide utilities with a methodology for deciding how many sensors to deploy in a system. For the two scenarios applied, five and four sensors, respectively, are shown to be the points of diminishing marginal return for the Guelph WDS in terms of Pareto-front performance improvement, detection likelihood, and average time delay. Nevertheless, given that the time to detect a contamination event may be lengthy, deploying more sensors than the point of diminishing marginal return suggests may be appropriate.
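To make the trade-off concrete, the following is a minimal sketch of how the two objectives (detection likelihood and average time delay) can be evaluated for a candidate sensor set, and how diminishing marginal returns appear as sensors are added. It is illustrative only: the detection-time matrix is synthetic random data standing in for results that would, in practice, come from hydraulic and water-quality simulations of intrusion events, and the greedy placement heuristic is an assumption for brevity, not the multi-objective optimization used in the paper.

```python
import random

# Hypothetical detection-time matrix (synthetic stand-in for simulation
# output): detect_time[e][n] is the time in minutes at which a sensor at
# node n would first detect intrusion event e, or None if the contaminant
# never reaches that node.
random.seed(0)
NUM_EVENTS, NUM_NODES = 200, 30
detect_time = [
    [random.uniform(10, 600) if random.random() < 0.4 else None
     for _ in range(NUM_NODES)]
    for _ in range(NUM_EVENTS)
]

def evaluate(sensors):
    """Return (detection likelihood, average time delay) for a sensor set."""
    delays = []
    for event in detect_time:
        times = [event[n] for n in sensors if event[n] is not None]
        if times:
            delays.append(min(times))  # first sensor to see the event
    likelihood = len(delays) / len(detect_time)
    avg_delay = sum(delays) / len(delays) if delays else float("inf")
    return likelihood, avg_delay

# Greedy placement (illustrative heuristic): at each step add the node
# that most improves detection likelihood, breaking ties by shorter
# average delay, and report the objectives at each sensor count.
chosen = []
candidates = set(range(NUM_NODES))
for k in range(1, 11):
    best = max(candidates - set(chosen),
               key=lambda n: (evaluate(chosen + [n])[0],
                              -evaluate(chosen + [n])[1]))
    chosen.append(best)
    lik, delay = evaluate(chosen)
    print(f"{k:2d} sensors: likelihood={lik:.2f}, avg delay={delay:6.1f} min")
```

Printing the objectives at each sensor count makes the point of diminishing marginal return visible directly: past a small number of sensors, each additional sensor yields only marginal gains in likelihood and delay, which is the pattern the case study reports at four to five sensors for the Guelph WDS.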
