Tipping-bucket rain gauges (TBRs) have become the most common devices for measuring rainfall intensity in urban hydrology. Owing to the measurement principle, the time resolution depends on rainfall intensity and bucket size. The present study investigated the influence of calibration uncertainties and bucket size on the accuracy of rainfall measurement and runoff simulation. Synthetic rainfall events with a time resolution of 6 seconds were generated from measured data. These rainfall series served as input to a model that simulated a TBR. Different TBR data series were produced by varying the calibration parameters and bucket size of the simulated rain gauge. Together with the original rainfall events, these data series were used as input to a rainfall-runoff model, and the computed runoff and overflow volumes from a CSO weir were compared. The differences in rainfall depth, intensity peak and computed runoff due to the depth resolution of the TBR were smaller than expected: a depth resolution of 0.2 - 0.3 mm per tip appears to fulfil the requirements of urban hydrology. Errors resulting from depth resolution are small compared with those from calibration (especially an incorrect rainfall depth per tip), site exposure, the influence of wind, or disregarded areal rainfall distribution.
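The TBR simulation described above can be sketched in a few lines: rainfall depth accumulates in a virtual bucket, a tip is registered each time the bucket volume is reached, and a calibrated depth per tip is assigned to each tip when the output series is reconstructed. This is a minimal illustrative sketch, not the study's actual model; the bucket size, calibrated depth per tip, and 6 s step values are assumptions, and a deliberate mismatch between `bucket_mm` and `calibrated_mm_per_tip` mimics a calibration error in depth per tip.

```python
def simulate_tbr(rain_mm, bucket_mm=0.2, calibrated_mm_per_tip=0.2):
    """Return the rainfall depth registered by a simulated TBR per time step.

    rain_mm               -- true rainfall depth per time step (mm), e.g. at 6 s resolution
    bucket_mm             -- actual depth needed to trigger one tip (mm)
    calibrated_mm_per_tip -- depth assigned to each tip when reconstructing
                             the series; differs from bucket_mm under
                             calibration error
    """
    registered = []
    fill = 0.0  # current bucket content (mm)
    for r in rain_mm:
        fill += r
        tips = int(fill // bucket_mm)   # number of tips during this step
        fill -= tips * bucket_mm        # carry the remainder to the next step
        registered.append(tips * calibrated_mm_per_tip)
    return registered

# Example: 1.0 mm of rain spread over ten 6 s steps, 0.2 mm bucket.
# Low-intensity rain produces sparse tips, so the TBR series is coarser
# in time than the true series even though the total depth is preserved.
series = simulate_tbr([0.1] * 10, bucket_mm=0.2)
```

Running the same input with a larger `bucket_mm` (coarser depth resolution) or a `calibrated_mm_per_tip` that deviates from `bucket_mm` reproduces, in miniature, the two error sources the study compares.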
