Abstract
Inspection of deep tunnel networks is extremely challenging because they are inaccessible and constitute an unknown, potentially hazardous environment. Unmanned aerial vehicles (UAVs) provide a viable means of access and are unaffected by debris or sewer flow. However, commercial UAVs are designed for high-altitude aerial imagery and are not appropriate for short-range, detailed imaging of tunnel surfaces. In addition, autonomous flight is usually achieved using GPS, which is unavailable underground. This paper presents the design and development of a smart UAV platform, the Surveyor with Intelligent Rotating Lens (SWIRL), customized for autonomous operation in tunnels. It captures high-resolution images for subsequent image processing, defect detection and classification. An innovative rotating camera system enables undistorted imaging of the tunnel's inner circumferential surface using a single camera. The proposed localization method, which uses limited sensor data, yields substantial reductions in unit weight and power consumption compared with existing systems, making more than 35 minutes of autonomous flight possible.
INTRODUCTION
The Deep Tunnel Sewerage System (DTSS) in Singapore comprises extensive underground sewerage infrastructure that requires periodic inspection. Phase I of the DTSS is a 48 km-long sewer tunnel (3 to 6 m in diameter) and is fully operational. It is protected by a specially designed corrosion protection lining (CPL) (Loganathan et al. 2011), and regular inspection is required to monitor the physical condition of the CPL and the tunnel's structural integrity. However, the tunnel's remote location and the potentially hazardous environment within make manual inspection challenging and dangerous. This paper describes an autonomous aerial robot capable of accessing the tunnel and acquiring the visual information needed to evaluate the physical condition of the CPL and the structural integrity of the main tunnel.
Conventional pipeline and tunnel robots are not ideal for inspecting large-diameter deep tunnel networks (Norbert et al. 2006; Law et al. 2015) like the DTSS. The deployment of additional winching and hoisting systems, for the insertion and subsequent retrieval of the robot, makes routine inspections cumbersome and complex (Chen et al. 2016). Furthermore, a fully operational sewerage tunnel with sewage, silt, debris and other (unknown) obstacles at the bottom makes the locomotion of such robots extremely challenging. Unmanned aerial vehicles (UAVs) provide a viable alternative for inspecting deep tunnel networks. The manoeuvrability of UAVs enables access via vertical access shafts without the need to deploy winch and hoisting systems. Further, the ability to traverse 3D space means that aerial robots are relatively unaffected by the presence of sewage flow, silt, debris, etc., in an operating sewerage system. However, commercial off-the-shelf UAVs are not suitable for tunnel inspection: their optics are optimised for high-altitude aerial imagery and are inappropriate for short-range, detailed tunnel surface imaging. UAVs designed for aerial use are also unable to fly autonomously in enclosed underground environments because the Global Positioning System (GPS) signals they rely on for autonomous outdoor flight are unavailable there.
Existing navigation systems for similar GPS-denied tunnels rely on heavy, power-consuming sensing and computational setups, which limit the flight time of these UAVs to 15 minutes at most (Özaslan et al. 2015, 2016, 2017). Hence, a purpose-built UAV platform is needed to inspect tunnels autonomously. The design and development of a smart UAV platform – Surveyor With Intelligent Rotating Lens (SWIRL, Figure 1) – is described in this paper. SWIRL integrates a rotating camera system, an efficient propulsion powertrain and computationally lightweight sensing systems (see Figure 2) to enable extended navigation and inspection of the DTSS.
The SWIRL platform autonomously navigating a horizontal tunnel (left) and a vertical shaft (right).
The approach was to first develop a vision-based imaging system capable of capturing high-resolution images of the DTSS's inner surface with minimal distortion, and then to build a UAV platform around it. This is very different from most UAV-based applications, where extrinsic sensing systems are designed around existing UAVs. The methodology was inspired by the evolution of capsule endoscopy for advanced biomedical imaging. Capsule endoscopy uses a remote, self-contained capsule to examine (by capturing images) parts of the lower gastrointestinal tract that are not accessible using traditional tethered endoscopy methods. The capsule is the size of a pill, and contains a tiny camera and light source, as well as all the other electronics required for operation. Traditionally, cameras are mounted along the capsule's longitudinal axis to capture the forward and rear views; because of this mounting direction, they are normally paired and fitted with wide-angle lenses to maximize the field of view. While this arrangement provides good coverage, image quality suffers from serious distortion introduced by the wide-angle lenses, despite the use of high-fidelity cameras. A recent approach (RF Co. 2001) alleviates this problem by repositioning the camera perpendicular, rather than parallel, to the intestine surface to minimize image distortion. As this reduces the field of view (FOV), an actuator is incorporated to rotate the camera about the capsule's longitudinal axis. By rotating the camera while capturing image data, and taking advantage of the capsule's natural forward motion along the intestine, the intestine's entire inner surface can be imaged with high fidelity and minimal optical distortion using a single camera. A curvilinear 2D map of the intestine can then be reconstructed from the overlapping 'mosaic' of images captured. The high image quality also allows large digital magnification of abnormalities for further analysis and classification.
The DTSS is similar to a gastrointestinal tract in some ways: it is fairly inaccessible, and its inspection requires high-quality imaging for proper evaluation of its condition. A forward-facing camera could be fixed to the front of a UAV flying along the tunnel centre to capture internal surface images, but the images would be distorted like those from a conventional capsule endoscope. Similarly, many commercial UAV platforms designed for aerial filming use a conventional mechanical gimbal system (mounted at the bottom or top of the UAV) to provide independent control of the camera's pan, roll and tilt angles. While this provides good manoeuvrability and hemispheric visual coverage, a single camera cannot cover the entire tunnel because the UAV's structural frame prevents an omnidirectional view.
SYSTEM DESCRIPTION
SWIRL houses a novel rotating camera system, designed to minimize potential obstructions to its FOV. It captures spiral panoramic shots of the tunnel's inner surface that can subsequently be reconstructed offline for visual inspection of the walls. The mechanical frame is made of high strength-to-weight-ratio carbon-fibre-reinforced polymer (CFRP) to make the platform rugged and to reduce its structural weight. The structure's backbone has a dual-Y shape, with each Y-shaped sub-structure comprising custom sets of rods and plates. The powertrain – from DJI (Shenzhen, China) – is mounted on these rods and plates so as to minimize occlusion of the camera system's FOV by the powertrain and/or propellers. The camera system rotates about a central 25 mm diameter CFRP rod, which joins the two Y-shaped sub-structures together. The camera system, with its own battery and microcontroller, has control and power electronics independent of the aerial platform's.
The electronic control, power, and sensing components are mounted on CFRP plates, with their electrical wiring and digital signal lines routed through the central rod on which the camera system rotates. The control electronics are split into flight avionics for low-level UAV stabilization and a companion computer for high-level instructions. The flight avionics consist of a Pixhawk 2.1 flight controller from ProfiCNC (Ballarat, Australia) with a built-in inertial measurement unit, and an Intel Edison companion computer, which runs the perception algorithm that interfaces with the various sensors and supplies high-level navigation and obstacle-avoidance commands to the Pixhawk. The navigation and obstacle-avoidance sensing system comprises a lightweight array of six TeraRanger One rangefinders (10 g each) from Terabee (Saint-Genis-Pouilly, France) and a downward-pointing SF11/C laser altimeter from LightWare (Gauteng, South Africa). The rangefinder array is arranged to localise effectively in both tunnels and shafts. The arrangement was found using a genetic algorithm, which searches the large design space of possible sensor placements and produces an optimised configuration with the lowest UAV tracking error in tunnels and shafts for a given range of geometric parameters. The perception algorithm can operate on any four to six of the sensors, providing redundancy that accommodates the failure of up to two sensors. Finally, SWIRL is powered by a pair of 6S 10C Li-Ion batteries with a total effective capacity of 21,000 mAh. The system's total mass is 5 kg; the mass distribution between the various sub-systems is given in Table 1.
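As an illustration of this sensor-placement search, the sketch below shows a minimal genetic algorithm over candidate mounting angles for the six rangefinders. The `tracking_error()` fitness function is a hypothetical placeholder – in practice it would evaluate each configuration against simulated tunnel and shaft geometries – and none of the parameter values shown are those actually used for SWIRL.

```python
import random

NUM_SENSORS = 6      # TeraRanger One units in the planar array
POP_SIZE = 40
GENERATIONS = 100
MUTATION_STD = 5.0   # degrees

def tracking_error(angles):
    """Hypothetical fitness: simulated UAV tracking error (m) for a set of sensor
    mounting angles. A real evaluation would run the perception algorithm against
    simulated tunnel and shaft geometries; this is a smooth dummy placeholder."""
    return sum((a % 60.0 - 30.0) ** 2 for a in angles) / 1000.0

def random_config():
    return [random.uniform(0.0, 360.0) for _ in range(NUM_SENSORS)]

def crossover(a, b):
    cut = random.randint(1, NUM_SENSORS - 1)
    return a[:cut] + b[cut:]

def mutate(config):
    return [(a + random.gauss(0.0, MUTATION_STD)) % 360.0 for a in config]

population = [random_config() for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=tracking_error)    # lower tracking error = fitter
    parents = population[: POP_SIZE // 2]  # elitist truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = min(population, key=tracking_error)
print("Best sensor angles (deg):", [round(a, 1) for a in best])
```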
Mass distribution of SWIRL
| Subsystem | Weight (g) | Weight (%) |
|---|---|---|
| Mechanical frame | 828 | 16.0 |
| Powertrain | 1,380 | 27.6 |
| Rotating camera system | 372 | 8.0 |
| Flight avionics | 100 | 2.0 |
| Battery | 1,750 | 35.0 |
| Sensing array | 110 | 2.2 |
| Companion computer | 30 | 0.6 |
| Total | 5,000 | 100 |
360° revolving camera system and imaging

The design parameters of the revolving camera system include the radial FOV, lateral FOV, percentage radial overlap, percentage lateral overlap, camera shutter period, driver and follower gear teeth counts, revolution rate of the stepper motor, and tunnel radius.
Left: Acquiring a series of images at different camera inclinations and using the set of images to create a composite stitched image, which can be reconstructed geometrically. Right: Prototype revolving camera system.
As no such rotary imaging system exists, the one shown in Figure 3 was developed. It consists of a stepper motor with a 1.8° step angle controlled digitally by an on-board Teensy 3.5 microcontroller from PJRC (Sherwood, USA). A gear transmission (ratio 16:48) has the driver gear attached to the stepper motor and the follower gear fixed rigidly on the central carbon fibre rod. As a result, the stepper motor actuates the camera's inclination in precise 0.6° steps, with a mechanical advantage of 3. The rotating camera also has built-in real-time video streaming, and a Li-Po battery powers all of the camera system's components. To ensure that the stepper motor has sufficient torque to drive the assembly, the components are arranged so that their moments are balanced – i.e., there is no net moment – and a bushing reduces the friction between the rotating assembly and the central carbon fibre rod.
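As a quick check of the drive geometry described above, the sketch below reproduces the 0.6° camera step from the 1.8° stepper step angle and the 16:48 gearing, and then estimates the number of shots needed per camera revolution under assumed values for the lateral FOV and overlap (these last two values are illustrative and not taken from the paper).

```python
import math

# Drive geometry, using the values given in the text.
STEPPER_STEP_DEG = 1.8   # stepper motor step angle (deg)
DRIVER_TEETH = 16        # gear on the stepper motor shaft
FOLLOWER_TEETH = 48      # gear fixed on the central CFRP rod

gear_ratio = FOLLOWER_TEETH / DRIVER_TEETH       # 3.0 (mechanical advantage)
camera_step_deg = STEPPER_STEP_DEG / gear_ratio  # 0.6 deg of camera inclination per motor step
steps_per_rev = 360.0 / camera_step_deg          # 600 motor steps per full camera revolution

# Assumed (illustrative) imaging parameters: lateral FOV and desired lateral overlap.
LATERAL_FOV_DEG = 60.0
LATERAL_OVERLAP = 0.30

shot_spacing_deg = LATERAL_FOV_DEG * (1.0 - LATERAL_OVERLAP)  # angle advanced between shots
shots_per_rev = math.ceil(360.0 / shot_spacing_deg)           # shots needed to cover 360 deg

print(f"camera step: {camera_step_deg:.1f} deg ({steps_per_rev:.0f} steps/rev)")
print(f"shot spacing: {shot_spacing_deg:.1f} deg ({shots_per_rev} shots/rev)")
```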
Localization and navigation in the tunnel
The localization and navigation sensing system comprises the lightweight planar array of laser rangefinders (Figure 4). Using prior knowledge of the local environment's geometry, the perception algorithm translates the range inputs from the sensing array into position and heading estimates for the UAV in the tunnel. The range-input point cloud is used to fit a circle or a pair of parallel lines for navigation (see Figure 4), and the fitted geometry's parameters are used to determine the UAV's location relative to the origin of the fitted geometry.
Measurements from the rangefinders are used to fit a circle (left) or a pair of parallel lines (right), in the least squares sense, to determine the UAV's 2D planar position and heading in a shaft or tunnel respectively.
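As an illustration of this geometry-fitting step, the sketch below shows one common way of fitting a circle to a planar range point cloud in the least-squares sense (the algebraic Kåsa formulation); the parallel-line case for tunnels can be handled analogously with two line fits. This is a generic sketch, not necessarily the exact fitting routine used on SWIRL.

```python
import numpy as np

def fit_circle(points):
    """Least-squares (Kasa) circle fit to an Nx2 array of planar range points
    in the UAV body frame. Returns (cx, cy, r): the fitted centre and radius."""
    x, y = points[:, 0], points[:, 1]
    # Solve x^2 + y^2 + D*x + E*y + F = 0 for (D, E, F) in the least-squares sense.
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x**2 + y**2)
    D, E, F = np.linalg.lstsq(A, b, rcond=None)[0]
    cx, cy = -D / 2.0, -E / 2.0
    r = np.sqrt(cx**2 + cy**2 - F)
    return cx, cy, r

# Example: six noisy returns from a 2.5 m radius shaft, with the UAV offset by (0.3, -0.2) m.
rng = np.random.default_rng(0)
angles = np.linspace(0.0, 2.0 * np.pi, 6, endpoint=False)
pts = np.column_stack([0.3 + 2.5 * np.cos(angles), -0.2 + 2.5 * np.sin(angles)])
pts += rng.normal(scale=0.02, size=pts.shape)
print(fit_circle(pts))  # approximately (0.3, -0.2, 2.5)
```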
To determine how far along the tunnel the UAV has travelled, an optical flow sensor and a time-of-flight laser rangefinder augment the planar array sensing system (see Figure 5). Together they provide closed-loop feedback of motion along the tunnel axis, which GPS would normally provide for over-ground flight. This is achieved by estimating the apparent motion of the structure between consecutive vision-sensor frames and accumulating the estimated instantaneous displacements.
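A minimal sketch of this dead-reckoning idea: the per-frame pixel displacement reported by the optical flow sensor is scaled to metres using the measured range to the observed surface and an assumed focal length, then accumulated over frames. The numbers and variable names are illustrative, not SWIRL's actual parameters.

```python
FOCAL_LENGTH_PX = 320.0  # assumed camera focal length in pixels (illustrative)

def flow_to_metres(flow_px, range_m):
    """Convert the apparent pixel displacement between two frames into metres,
    using the pinhole model and the measured range to the observed surface."""
    return flow_px * range_m / FOCAL_LENGTH_PX

def integrate_flow(flow_samples, range_samples):
    """Accumulate per-frame displacements into distance travelled along the tunnel axis."""
    distance = 0.0
    for flow_px, range_m in zip(flow_samples, range_samples):
        distance += flow_to_metres(flow_px, range_m)
    return distance

# Example: 100 frames, each with ~8 px of flow at ~4 m range -> ~10 m travelled.
print(integrate_flow([8.0] * 100, [4.0] * 100))
```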
RESULTS AND DISCUSSION
360° revolving camera system and imaging
The camera system was tested at the Connaught Drive underpass, Singapore (see Figure 6(a)), whose cylindrical geometry and flat base mimic a tunnel partially filled with sewage. The image stitching algorithms worked well. Three sample images are shown in Figure 6(b), while Figure 6(c) shows a single strip from the algorithm's panorama output – the system can be seen working at: https://youtu.be/gHReZ_mjTjs.
(a) Connaught Drive underpass used to evaluate the spiral panoramic stitching system; (b) three sample images from the experiment, and (c) strip segment of panoramic stitching.
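For reference, this kind of strip panorama can be prototyped with off-the-shelf tools; the sketch below uses OpenCV's high-level stitcher on a directory of captured frames (the path is illustrative). The paper's own pipeline, which reconstructs a curvilinear map from the spiral shots, is more specialised than this generic example.

```python
import glob
import cv2

# Load the sequence of frames captured during one pass of the rotating camera
# (the directory path is illustrative).
frames = [cv2.imread(p) for p in sorted(glob.glob("underpass_frames/*.jpg"))]

# OpenCV's high-level stitcher estimates pairwise homographies, then warps and blends.
stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:
    cv2.imwrite("strip_panorama.jpg", panorama)
else:
    print("Stitching failed with status", status)
```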
Autonomous flight in tunnels and shafts
The autonomous navigation system was first tested in simulation using Gazebo with RotorS (Furrer et al. 2016). The UAV model used in the simulation was identical to the physical platform, enabling accurate simulation of the system's dynamics. The algorithm was then tested on the actual UAV in the field; the experimental results are summarized in Table 2.
Root mean square (rms) and maximum errors across the autonomous navigation tests
| Environment | rms error (m) | max error (m) |
|---|---|---|
| (a) | 0.13 | 0.41 |
| (b) | 0.53 | 1.44 |
Autonomous flight in tunnels
To evaluate the system's performance in a tunnel, the UAV was tested in the covered section of the Eu Tong Sen Canal (see Figure 7(a) and Table 3), which replicates the poor illumination in the DTSS tunnel. Take-off and landing were performed manually by a human pilot but, once the UAV was near the tunnel's centre axis, it was switched to autonomous mode. In autonomous mode the UAV holds its position midway between the tunnel walls and at a set altitude, while its horizontal speed along the tunnel is controlled by the human pilot. Autonomous flight was maintained for 36 seconds while travelling 45 m horizontally. Figure 8 shows the horizontal (y-axis) position error of the UAV while travelling through the tunnel. At 28 seconds there is a sudden spike in the position error, caused by an outlet on the tunnel's left wall. Over the entire distance travelled autonomously, the UAV's rms position error was 0.13 m and its maximum deviation from the centreline was 0.41 m. (It can be seen operating at: https://youtu.be/_qyYFYeT02Q).
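As a conceptual sketch of this centring behaviour – not the actual flight stack, which runs on the Pixhawk avionics – the lateral loop can be thought of as a PD controller that converts the estimated offset from the tunnel centreline into a bounded lateral velocity setpoint. All gains and limits below are illustrative.

```python
class CentrelineController:
    """Toy PD loop that converts the estimated lateral offset from the tunnel
    centreline into a bounded lateral velocity setpoint. Gains are illustrative."""

    def __init__(self, kp=0.8, kd=0.3, v_max=0.5):
        self.kp, self.kd, self.v_max = kp, kd, v_max
        self.prev_error = None

    def update(self, lateral_offset_m, dt):
        # Error is the signed distance back towards the centreline.
        error = -lateral_offset_m
        d_error = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        v_cmd = self.kp * error + self.kd * d_error
        return max(-self.v_max, min(self.v_max, v_cmd))  # clamp to a safe speed

# Example: hovering 0.2 m to the right of centre, 20 Hz updates.
ctrl = CentrelineController()
print(ctrl.update(0.2, dt=0.05))  # small velocity command back towards the centreline
```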
Dimensions of the tunnels and shafts shown in Figure 7
| Environment | L (m) | w (m) | H (m) | r (m) |
|---|---|---|---|---|
| (a) | 45 | 6 | 2 | – |
| (b) | – | – | 45 | 2.5 |
Sites for experimental performance evaluation of the autonomous navigation. The horizontal tunnel (a), and the vertical shaft (b).
Autonomous flight in shafts
The second experiment evaluated the UAV's autonomous performance in vertical shafts, simulating access to or exit from the DTSS tunnel (Figure 7(b)). The UAV was inserted into an actual DTSS entry shaft via a 1 m manhole at ground level, with a safety tether attached to a winching system for emergency retrieval if necessary. A human pilot controlled take-off and flight to an initial position roughly on the shaft's centreline. Once in autonomous mode, the UAV automatically tried to maintain its position at the shaft centre, while its vertical speed (and hence altitude) was controlled manually by the pilot. The UAV flew autonomously for about 4.5 minutes through a total vertical distance of approximately 8 m (Figure 9). Because the DTSS is pressurised, opening the access shaft's manhole vents air at high velocity – wind speeds of up to 16 m/s were measured at the opening. This constant updraft adversely affected the UAV's flight performance, and the effect was compounded by the turbulence generated when flying a multi-rotor craft in an enclosed environment. As a result, the UAV's rms position error for the vertical flight test was 0.53 m, with a maximum deviation from the shaft centreline of 1.44 m, higher than the errors from the horizontal trial at the canal. (The UAV can be seen operating at: https://youtu.be/cAZB3ULbJzA).
Endurance evaluation
To assess the system's operational endurance, the UAV was commanded to hold its centreline position automatically in a vertical shaft mock-up. In this configuration, with all systems operating, the UAV flew for 35 minutes and 41 seconds, during which the rms and maximum position errors were 0.16 and 0.46 m respectively.
Optical flow sensing evaluation
The optical flow sensing was evaluated in two experiments. The first tested the y-axis performance in the open section of the Eu Tong Sen Canal: SWIRL was flown between two known points 31 m apart along the canal, the separation having been measured with a Fluke 414D laser distance meter from Fluke (Everett, US). The distance recorded by the optical flow sensor was 31.44 m, a 1.42% deviation from the ground truth. The second experiment tested the x-axis performance indoors, where the optical flow sensor was compared with the on-board, downward-pointing SF11/C laser altimeter. Figure 10 shows the vertical flight path of the quadrotor as reported by both sensors. The two sensors show similar performance throughout the flight, with a maximum absolute deviation of 0.17 m (Figure 11) and a mean deviation of 3.61% from the laser altimeter.
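The quoted 1.42% figure follows directly from the two logged distances; a quick check using the values above:

```python
ground_truth_m = 31.0   # laser-measured distance between the two markers
optical_flow_m = 31.44  # distance reported by the optical flow sensor

deviation_pct = 100.0 * (optical_flow_m - ground_truth_m) / ground_truth_m
print(f"{deviation_pct:.2f}% deviation")  # ~1.42%
```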
The results show that the optical flow sensor is a suitable alternative to altitude-estimation schemes that typically combine data from a barometer and an external range sensor, such as an ultrasonic sensor, to obtain a reliable altitude estimate. The sensor can also provide an accurate estimate of the change in position over time along the y-axis, without compensation data from other sensors.
CONCLUSIONS
A novel UAV and imaging system has been designed to enable efficient tunnel inspection. The platform features a rotary imaging system that allows the entire tunnel wall surface to be imaged systematically, and its autonomous localisation and navigation systems proved successful in both horizontal tunnel and vertical shaft trials.
ACKNOWLEDGEMENTS
This project is supported by the National Research Foundation, Prime Minister's Office, Singapore under its Environment & Water Research Programme (Project Ref No. 1502-IRIS-02). This programme is administered by PUB, Singapore's national water agency.