Abstract

Inspection of deep tunnel networks is extremely challenging because they are inaccessible, largely unknown, and potentially hazardous environments. Unmanned aerial vehicles (UAVs) provide a viable alternative means of access, and are unaffected by debris or sewer flow. However, commercial UAVs are designed for high-altitude aerial imagery and are not appropriate for short-range, detailed imaging of tunnel surfaces. In addition, autonomous flight is usually achieved using GPS, which is unavailable underground. This paper presents the design and development of a smart UAV platform, Surveyor With Intelligent Rotating Lens (SWIRL), customized for autonomous operation in tunnels. It captures high-resolution images for subsequent image processing, defect detection and classification. An innovative rotating system enables undistorted imaging of the tunnel's inner circumferential surface using a single camera. The proposed localization method, which requires only limited sensor data, yields substantial reductions in unit weight and power consumption compared to existing systems, making more than 35 minutes of autonomous flight possible.

INTRODUCTION

The Deep Tunnel Sewerage System (DTSS) in Singapore comprises extensive underground sewerage infrastructure that requires periodic inspections. Phase I of DTSS is a 48 km-long sewer tunnel (3 to 6 m diameter) and is fully operational. It is protected by specially designed corrosion protection lining (CPL) (Loganathan et al. 2011), and regular inspection is required to monitor the physical condition of the CPL and the tunnel's structural integrity. However, the tunnel's remote location and the potentially hazardous environment within make manual inspection challenging and dangerous. This paper describes an autonomous aerial robot capable of accessing and acquiring visual information, to evaluate and determine the physical condition of the CPL and the structural integrity of the main tunnel.

Conventional pipeline and tunnel robots are not ideal for inspecting large-diameter deep tunnel networks (Norbert et al. 2006; Law et al. 2015) like the DTSS. The deployment of additional winching and hoisting systems for the insertion and subsequent retrieval of the robot makes routine inspections cumbersome and complex (Chen et al. 2016). Furthermore, a fully operational sewerage tunnel with sewage, silt, debris and other (unknown) obstacles at the bottom makes the locomotion of such robots extremely challenging. Unmanned aerial vehicles (UAVs) provide a viable alternative for inspecting deep tunnel networks. The manoeuvrability of UAVs enables access via vertical direct-access shafts without the need to deploy winch and hoisting systems. Further, the ability to traverse 3D space means that aerial robots are relatively unaffected by the sewage flow, silt and debris present in an operating sewerage system. However, commercial off-the-shelf UAVs designed for high-altitude aerial imagery are not suitable for tunnel inspection: their optics are optimised for aerial imagery and are inappropriate for short-range, detailed tunnel surface imaging. UAVs designed for aerial use are also unable to fly autonomously in enclosed underground environments, because the Global Positioning System (GPS) signals they rely on for autonomous outdoor flight are unavailable there.

Existing navigation systems for similar GPS-denied tunnels rely on heavy, power-consuming sensing and computational setups, which limit the flight time of these UAVs to 15 minutes at most (Özaslan et al. 2015, 2016, 2017). Hence, a purpose-built UAV platform is needed to inspect tunnels autonomously. The design and development of a smart UAV platform – Surveyor With Intelligent Rotating Lens (SWIRL, Figure 1) – is described in this paper. SWIRL integrates a rotating camera system, an efficient propulsion powertrain and computationally lightweight sensing systems (see Figure 2) to enable extended navigation and inspection of the DTSS.

Figure 1

The SWIRL platform autonomously navigating a horizontal tunnel (left) and a vertical shaft (right).


Figure 2

Key components of SWIRL.


The approach was to first develop a vision-based imaging system capable of capturing high-resolution images of the DTSS's inner surface with minimal distortion, and then to build a UAV platform around it. This is very different from most UAV-based applications, where extrinsic sensing systems are designed around existing UAVs. The methodology was inspired by the evolution of capsule endoscopy for advanced biomedical imaging. Capsule endoscopy uses a remote, self-contained capsule to examine, by capturing images, parts of the lower gastrointestinal tract that are not accessible using traditional tethered endoscopy. The capsule is the size of a pill, and contains a tiny camera and light source, as well as all other electronics required for operation. Traditionally, the camera lies along the capsule's longitudinal axis to capture the forward and rear views; because of this mounting direction, cameras are normally paired and fitted with wide-angle lenses to maximize the field of view (FOV). While this arrangement provides broad coverage, image quality suffers serious distortion from the wide-angle lens despite the use of high-fidelity cameras. A recent approach (RF Co. 2001) alleviates this problem by repositioning the camera perpendicular, rather than parallel, to the intestine surface to minimize image distortion. As this reduces the FOV, an actuator is incorporated to rotate the camera about the capsule's longitudinal axis. By rotating the camera while capturing image data, and taking advantage of the capsule's natural forward motion along the intestine, the entire inner surface can be imaged with high fidelity and minimal optical distortion using a single camera. A curvilinear 2D map of the intestine can then be reconstructed from the overlapping 'mosaic' of captured images. The high image quality also allows large digital magnification of abnormalities for further analysis and classification.

The DTSS is similar to a gastrointestinal tract in some ways: it is fairly inaccessible, and inspection requires high-quality imaging for proper evaluation of its condition. A forward-facing camera could be fixed to the front of a UAV flying along the tunnel centre to capture internal surface images, but these would be distorted like those from a conventional capsule endoscope. Alternatively, many commercial UAV platforms designed for aerial filming use a conventional mechanical gimbal system (mounted at the bottom or top of the UAV) to provide independent control of the camera's pan, roll and tilt angles. While this provides good manoeuvrability and hemispheric visual coverage, a single camera still cannot cover the entire tunnel because the UAV's structural frame prevents an omnidirectional view.

SYSTEM DESCRIPTION

SWIRL houses a novel rotating camera system, designed to minimize potential obstructions to its FOV; it captures spiral panoramic shots of the tunnel's inner surface that can subsequently be reconstructed offline for visual inspection of the walls. The mechanical frame is made of high strength-to-weight-ratio carbon-fibre-reinforced polymer (CFRP), making the platform rugged while reducing its structural weight. The structure's backbone has a dual-Y shape, each Y-shaped sub-structure comprising custom sets of rods and plates. The powertrain, from DJI (Shenzhen, China), is mounted on the rods and plates so as to minimize occlusion of the camera system's FOV by the powertrain and/or propellers. The camera system rotates about a central 25 mm diameter CFRP rod, which joins the separate parts of the Y-structure together. The camera system has its own battery and microcontroller, giving it control and power electronics independent of the aerial platform.

The electronic control, power and sensing components are mounted on CFRP plates, with their electrical wiring and digital signal lines routed through the central rod on which the camera system rotates. The control electronics are separated into flight avionics for low-level UAV stabilization and a companion computer for high-level instructions. The flight avionics consist of a Pixhawk 2.1 flight controller from ProfiCNC (Ballarat, Australia) with a built-in inertial measurement unit, and an Intel Edison companion computer, which runs the perception algorithm, interfaces with the various sensors, and supplies high-level navigation and obstacle-avoidance commands to the Pixhawk. The navigation and obstacle-avoidance sensing system comprises a lightweight array of six TeraRanger One rangefinders (10 g each) from Terabee (Saint-Genis-Pouilly, France) and a down-pointing SF11/C laser altimeter from LightWare (Gauteng, South Africa). The rangefinder array is arranged to localise effectively in both tunnels and shafts; its arrangement was optimised using a genetic algorithm, which searches the large design space of possible sensor placements and returns the configuration with the lowest UAV tracking error in tunnels and shafts for a given range of geometric parameters. The perception algorithm operates with a dynamic number of four to six sensors, providing redundancy that accommodates the failure of up to two sensors. Finally, SWIRL is powered by a pair of 6S 10C Li-Ion batteries with a total effective capacity of 21,000 mAh. The system's mass is 5 kg, and its distribution between the various sub-systems is given in Table 1.
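The sensor-placement search described above can be illustrated with a minimal genetic-algorithm sketch. The encoding (six azimuth angles on a planar ring), the fitness proxy (minimum pairwise angular separation, standing in for the paper's tracking-error metric), and all parameters are our own illustrative assumptions, not the authors' implementation:

```python
import math
import random

N_SENSORS, POP, GENS = 6, 40, 60

def fitness(angles):
    # Toy stand-in for the tracking-error objective: beams spread evenly
    # around the ring localize better, so reward the smallest angular gap.
    s = sorted(a % (2 * math.pi) for a in angles)
    gaps = [(s[(i + 1) % len(s)] - s[i]) % (2 * math.pi) for i in range(len(s))]
    return min(gaps)  # larger minimum gap = more even spread

def crossover(p1, p2):
    # Uniform crossover: each sensor angle comes from either parent.
    return [random.choice(pair) for pair in zip(p1, p2)]

def mutate(ind, sigma=0.2):
    # Gaussian perturbation of every angle.
    return [a + random.gauss(0.0, sigma) for a in ind]

random.seed(0)
pop = [[random.uniform(0, 2 * math.pi) for _ in range(N_SENSORS)]
       for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    elite = pop[: POP // 4]  # keep the best quarter unchanged
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(POP - len(elite))]
best = max(pop, key=fitness)
```

A real fitness function would instead simulate the localization filter over the stated range of tunnel and shaft geometries and score each candidate array by its tracking error.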

Table 1

Mass distribution of SWIRL

Subsystem                Weight (g)   Weight (%)
Mechanical frame             828         16.0
Powertrain                 1,380         27.6
Rotating camera system       372          8.0
Flight avionics              100          2.0
Battery                    1,750         35.0
Sensing array                110          2.2
Companion computer            30          0.6
Total                      5,000        100

360° revolving camera system and imaging

In the rotating system, the camera is positioned with its optical axis perpendicular to the tunnel wall – see Figure 3. The FOV is shown by the red projection onto the tunnel's inner surface. Illumination of the camera's FOV is provided by energy-efficient, high-luminance LEDs integrated into the camera system. As the UAV moves at constant linear velocity along the tunnel, each camera rotation images a different part of the tunnel surface, offset not only along the u-axis but also the v-axis (along the tunnel's Y-axis) – see Figure 3. In Equation (1), the camera's inclination is denoted by θ and its position along the tunnel's longitudinal axis by Yc. The imaging path, plotted over the tunnel surface, traces out a spiral; if the camera's rotation rate and the UAV's linear velocity are matched appropriately, the entire tunnel surface is mapped in an orderly, structured manner without requiring abrupt camera motions. The camera system's rotation rate and the UAV's speed of movement must therefore be coupled to ensure sufficient overlap for high-quality reconstruction of the spiral panoramic images. Given the required lateral and radial overlaps and the camera's rotation rate, the UAV's speed of movement along the tunnel is given by Equation (1):
formula
(1)
where:
  • radial FOV

  • lateral FOV

  • Percentage radial overlap

  • Percentage lateral overlap

  • Camera shutter period

  • Driver gear teeth

  • Follower gear teeth

  • Revolution rate of stepper motor

  • Tunnel radius
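As a rough numerical sketch of the coupling Equation (1) expresses, assume the strip the lateral FOV projects onto a wall at distance r has width r × FOV (flat-wall, small-angle approximation); each full camera revolution may then advance along the tunnel axis by at most that width times (1 − overlap). The geometry and all parameter values below are illustrative assumptions:

```python
import math

def max_forward_speed(rot_rate_rps, tunnel_radius_m, lateral_fov_rad,
                      lateral_overlap):
    """Upper bound on the UAV's forward speed so that successive turns of
    the imaging spiral still overlap laterally.

    footprint = strip width the lateral FOV projects onto a wall at
    distance r (approximated as r * FOV); per revolution the UAV may
    advance at most footprint * (1 - overlap) along the tunnel axis.
    """
    footprint_m = tunnel_radius_m * lateral_fov_rad
    return rot_rate_rps * footprint_m * (1.0 - lateral_overlap)

# 0.5 rev/s, 3 m tunnel radius, 40-degree lateral FOV, 30% lateral overlap
v = max_forward_speed(0.5, 3.0, math.radians(40), 0.30)
```

With these assumed values the bound comes to roughly 0.73 m/s; a tighter overlap requirement or a slower rotation rate lowers it proportionally.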

Figure 3

Left: Acquiring a series of images at different camera inclinations and using the set of images to create a composite stitched image, which can be reconstructed geometrically. Right: Prototype revolving camera system.


No suitable rotary imaging system was available, so the one shown in Figure 3 was developed. It consists of a stepper motor with a 1.8° step angle, controlled digitally by an on-board Teensy 3.5 microcontroller from PJRC (Sherwood, USA). A gearing transmission (16:48 driver:follower, a 3:1 torque step-up) has the driver gear attached to the stepper motor and the follower gear fixed rigidly on the central carbon fibre rod. As a result, the stepper motor actuates the camera's inclination in precise 0.6° steps, with a mechanical advantage of 3. The rotating camera also supports built-in real-time video streaming. A Li-Po battery powers all components. To ensure that the stepper motor has sufficient torque to drive the assembly, the components are arranged so that their moments are balanced – i.e., there is no net moment – and a bushing reduces friction between the rotating assembly and the central carbon fibre rod.
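The transmission arithmetic can be checked directly; the numbers come from the text, while the helper function name is ours:

```python
def camera_step_deg(motor_step_deg, driver_teeth, follower_teeth):
    """Angle the camera assembly moves per motor step after the gear stage.

    The 16-tooth driver must turn 48/16 = 3 times for one follower turn,
    so each 1.8-degree motor step moves the camera 1.8 * 16/48 = 0.6 degrees.
    """
    return motor_step_deg * driver_teeth / follower_teeth

step = camera_step_deg(1.8, 16, 48)   # 0.6 degrees of camera motion per step
mech_advantage = 48 / 16              # torque multiplied by 3
steps_per_rev = 360 / step            # motor steps per full camera revolution
```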

Localization and navigation in the tunnel

The localization and navigation sensing system comprises the lightweight planar array of laser rangefinders – Figure 4. Using prior knowledge of the local environment's geometry, the perception algorithm translates the range input from the sensing array into position and heading estimates for the UAV in the tunnel. The range-input point cloud is used to fit a circle or a pair of parallel lines – see Figure 4. The fitted parameters are used to determine the UAV's location in relation to the origin of the fitted geometry.
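As a sketch of the circle-fitting step, the algebraic (Kåsa) least-squares fit below recovers a circle's centre and radius from a planar point cloud; the parallel-line case can be handled analogously with a standard linear least-squares fit. This is an illustrative stand-in, not necessarily the authors' exact formulation:

```python
import numpy as np

def fit_circle(points):
    """Kasa least-squares circle fit.

    Writes the circle as x^2 + y^2 + a*x + b*y + c = 0, solves the
    overdetermined linear system for (a, b, c), then recovers the
    centre (-a/2, -b/2) and radius sqrt(cx^2 + cy^2 - c).
    """
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x ** 2 + y ** 2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    cx, cy = -a / 2.0, -b / 2.0
    r = np.sqrt(cx ** 2 + cy ** 2 - c)
    return cx, cy, r

# Simulated rangefinder hits on a 2.5 m shaft wall, sensor offset (0.3, -0.2)
theta = np.linspace(0.0, 2.0 * np.pi, 12, endpoint=False)
pts = np.column_stack([0.3 + 2.5 * np.cos(theta),
                       -0.2 + 2.5 * np.sin(theta)])
cx, cy, r = fit_circle(pts)
```

The UAV's planar offset from the shaft axis then follows directly from the fitted centre expressed in the sensor frame.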

Figure 4

Measurements from the rangefinders are used to fit a circle (left) or a pair of parallel lines (right), in the least squares sense, to determine the UAV's 2D planar position and heading in a shaft or tunnel respectively.


To determine how far along the tunnel the UAV has travelled, an optical flow sensor and time-of-flight laser rangefinder augment the planar array sensing system (see Figure 5). Their combined purpose is to provide a closed-loop feedback of the motion, typically provided by GPS for over-ground flight, along the tunnel axis. This is achieved by estimating the apparent motion of the structure between consecutive vision sensor frames and accumulating the estimated instantaneous displacement.
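The displacement accumulation can be sketched as follows; the pinhole-model conversion and all numbers are illustrative assumptions rather than the authors' calibration:

```python
class FlowOdometer:
    """Integrates optical-flow measurements into travelled distance.

    Under a pinhole model, pixel motion p (px/s) of a surface at range
    Z (m) seen through a lens of focal length f (px) corresponds to a
    metric velocity of p * Z / f; accumulating v * dt over consecutive
    frames gives the distance travelled along the tunnel axis.
    """

    def __init__(self, focal_px):
        self.focal_px = focal_px
        self.distance_m = 0.0

    def update(self, flow_px_per_s, range_m, dt_s):
        velocity = flow_px_per_s * range_m / self.focal_px  # m/s
        self.distance_m += velocity * dt_s                  # accumulate
        return self.distance_m

# 100 frames at 100 Hz, steady 50 px/s flow, wall 2 m away, f = 500 px
odo = FlowOdometer(focal_px=500.0)
for _ in range(100):
    d = odo.update(flow_px_per_s=50.0, range_m=2.0, dt_s=0.01)
```

This is why the rangefinder is paired with the flow sensor: without the range measurement, pixel flow alone cannot be scaled into metres.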

Figure 5

Visual Odometry System mounted on SWIRL.


RESULTS AND DISCUSSION

360° revolving camera system and imaging

The camera system was tested at the Connaught Drive underpass, Singapore (see Figure 6(a)), whose cylindrical geometry and flat base mimic a tunnel partially filled with sewage. The image stitching algorithms performed well: three sample images are shown in Figure 6(b), while Figure 6(c) shows a single strip from the algorithm's panorama output. The system can be seen working at: https://youtu.be/gHReZ_mjTjs.

Figure 6

(a) Connaught Drive underpass used to evaluate the spiral panoramic stitching system; (b) three sample images from the experiment, and (c) strip segment of panoramic stitching.


Autonomous flight in tunnels and shafts

The autonomous navigation system was first tested in simulation using Gazebo with RotorS (Furrer et al. 2016). The UAV model used in the simulation was identical to the physical platform, enabling accurate simulation of the system's dynamics. The algorithm was then tested on the actual UAV in the field, and the experimental results are summarized in Table 2.

Table 2

Root mean square (rms) and maximum errors across the autonomous navigation tests

Environment   rms error (m)   max error (m)
(a)                0.13            0.41
(b)                0.53            1.44

Autonomous flight in tunnels

To evaluate the system's performance in a tunnel, the UAV was tested in the covered section of the Eu Tong Sen Canal (see Figure 7(a) and Table 3), which simulates the poor illumination in the DTSS tunnel. Take-off and landing were performed manually by a human pilot but, once the UAV was near the tunnel's centre axis, it was switched to autonomous mode, in which the UAV tries to stay centred between the tunnel walls at a set altitude while its horizontal speed is controlled by the human pilot. Autonomous flight was maintained for 36 seconds while travelling 45 m horizontally. Figure 8 shows the horizontal position error (y-axis) of the UAV while travelling through the tunnel. At 28 seconds there is a sudden spike in the position error, caused by an outlet on the tunnel's left wall. Over the entire distance travelled autonomously, the UAV's rms position error was 0.13 m and its maximum deviation from the centreline was 0.41 m. (It can be seen operating at: https://youtu.be/_qyYFYeT02Q).

Table 3

Dimension of the tunnels and shafts shown in Figure 7 

Environment    L (m)    w (m)    H (m)    r (m)
(a)             45                          –
(b)              –        –       45       2.5
Figure 7

Sites for experimental performance evaluation of the autonomous navigation. The horizontal tunnel (a), and the vertical shaft (b).


Figure 8

Estimated position, set-point, and absolute error.


Autonomous flight in shafts

The second experiment evaluated the UAV's autonomous performance in vertical shafts, simulating access to and exit from the DTSS tunnel (Figure 7(b)). The UAV was inserted into an actual DTSS entry shaft via a 1 m manhole at ground level, with a safety tether attached to a winching system for emergency retrieval if necessary. A human pilot controlled take-off and flight to an initial position roughly on the shaft's centreline. Once in autonomous mode, the UAV automatically tried to maintain its position at the shaft centre, while its vertical speed (and hence altitude) was controlled manually by the pilot. The UAV flew autonomously for about 4.5 minutes through a total vertical distance of approximately 8 m – Figure 9. Because the DTSS is pressurised, opening the access shaft's manhole causes venting, and air escapes at high velocity from the manhole – wind speeds of up to 16 m/s were measured at the opening. This constant updraft affected flight performance adversely, and was made worse by the turbulence generated when a multi-rotor craft flies in an enclosed environment. As a result, the UAV's rms position error for the vertical flight test was 0.53 m, with a maximum deviation from the shaft centreline of 1.44 m, both higher than the errors from the horizontal trial at the canal. (The UAV can be seen operating at: https://youtu.be/cAZB3ULbJzA).

Figure 9

Estimated position and position error during autonomous flight.


Endurance evaluation

To assess the system's operational endurance, the UAV was commanded to maintain its centreline position automatically in a vertical mock-up of a shaft. In this configuration, with all systems operating, the UAV flew for 35 minutes and 41 seconds, during which the rms and maximum position errors were 0.16 and 0.46 m respectively.

Optical flow sensing evaluation

The first of these experiments tested y-axis performance in the open section of the Eu Tong Sen Canal. SWIRL was flown between two known points 31 m apart along the canal, with the distance between the points measured using a Fluke 414D laser distance meter (Fluke, Everett, USA). The distance recorded by the optical flow sensor was 31.44 m, a 1.42% deviation from the ground truth. The second experiment tested x-axis performance indoors, where the optical flow sensor was compared with the on-board, downward-pointing SF11/C laser altimeter. Figure 10 shows the vertical flight path of the quadrotor as reported by both sensors. They show similar performance throughout the flight, with a maximum absolute deviation of 0.17 m (Figure 11) and a mean deviation of 3.61% from the LIDAR.

Figure 10

Vertical displacement test against LIDAR.


Figure 11

Absolute error between Optical Flow and LIDAR.


The results show that the optical flow sensor is a suitable alternative to the altitude estimate typically obtained by combining barometer data with an external range sensor, such as an ultrasonic sensor. The sensor can also provide an accurate estimate of the change in position over time along the y-axis, even without compensation data from other sensors.

CONCLUSIONS

A novel UAV and imaging system has been designed to enable efficient tunnel inspection. The system features a rotary imaging mechanism that allows the entire tunnel wall surface to be imaged systematically. The UAV's autonomous localisation and navigation systems also proved successful in field trials.

ACKNOWLEDGEMENTS

This project is supported by the National Research Foundation, Prime Minister's Office, Singapore under its Environment & Water Research Programme (Project Ref No. 1502-IRIS-02). This programme is administered by PUB, Singapore's national water agency.

REFERENCES

Chen, I.-M., Asadi, E., Nie, J., Yan, R.-J., Law, W. C., Kayacan, E. & Tiong, R. 2016 Innovations in infrastructure service robots. ROMANSY 21 – Robot Design, Dynamics and Control 569, 3–16.

Furrer, F., Burri, M., Achtelik, M. & Siegwart, R. 2016 RotorS – A modular Gazebo MAV simulator framework. Robot Operating System (ROS): The Complete Reference 625 (SCI), 595–625.

Law, W.-C., Chen, I.-M., Yeo, S.-H., Seet, G.-L. & Low, K.-H. 2015 A study of in-pipe robots for maintenance of large-diameter sewerage tunnel. The 14th IFToMM World Congress 3, 225–232.

Loganathan, L. N., Carroll, J. O., Flanagan, R. & Van Weele, B. 2011 Corrosion Protection Lining (CPL) for the Deep Tunnel Sewer System in Singapore – a case history. In: Geo-Frontiers 2011. ASCE, Dallas, Texas, United States, pp. 1902–1911.

Norbert, E., Kutzer, S., Saenz, J., Reimann, B., Schultke, F. & Athoff, H. 2006 Fully automatic inspection systems for large underground concrete pipes partially filled with wastewater. In: Intelligent Robots and Systems, 2006 IEEE/RSJ International Conference, pp. 4234–4238.

Özaslan, T., Shen, S., Yash, M., Michael, N. & Vijay, K. 2015 Inspection of penstocks and featureless tunnel-like environments using micro UAVs. Field and Service Robotics: Results of the 9th International Conference 105 (STAR), 123–136.

Özaslan, T., Kartik, M., James, K., Yash, M., Camillo, J. T., Vijay, K. & Thomas, H. 2016 Towards fully autonomous visual inspection of dark featureless dam penstocks using MAVs. In: 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 4998–5005.

Özaslan, T., Giuseppe, L., James, K., Camillo, J. T., Vijay, K., Jennifer, M. W. & Thomas, H. 2017 Autonomous navigation and mapping for inspection of penstocks and tunnels with MAVs. IEEE Robotics and Automation Letters 2 (3), 1740–1747.

RF Co., L. 2001 Sayaka: The Next Generation Capsule Endoscope. Nagano, Japan. http://rfsystemlab.com/en/sayaka/ (accessed 30 April 2018).