
Drone Sense: Opportunities for Novel Sensor Systems

April 10, 2018

The paradigm of mobile sensing, in which a mobile data mule carrying multiple sensors moves around a region collecting data, is an emerging facet of IoT. Compared to handhelds, vehicles, and ground robots, drones offer a great platform for mobile 4D sensing, and early experiments are already hinting at some compelling applications. Let’s explore some of the sensing modalities useful for such applications, and the challenges that research is focused on solving.

Visual Sensing

Drones can carry a variety of sensors as payload for data collection. Each new sensor opens up a new data stream and offers a potentially new way to solve an existing or altogether new inspection problem. The most popular drone sensor is the visual camera, which captures images and videos of objects of interest.

Major classes of object information sought are presence, location, pose, identity, visual properties (including color, shape, and texture), and spatial relations between objects. These attributes are then used for further analysis to answer high-level queries of business interest, such as object counting, change detection, and deformation or damage estimation.
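
To make this concrete, the sketch below shows the kind of aggregation behind an object-counting query: per-image detections expressed as (label, confidence) pairs are thresholded and tallied per class. The detection format and the confidence threshold are illustrative assumptions, not the exact pipeline described here.

```python
# Illustrative only: turn per-image detections into a per-class object count.
from collections import Counter

def count_objects(detections, min_confidence=0.5):
    """detections: iterable of (label, confidence) pairs for one image."""
    counts = Counter()
    for label, confidence in detections:
        if confidence >= min_confidence:   # keep only confident detections
            counts[label] += 1
    return counts

# Example: count_objects([("car", 0.91), ("car", 0.42), ("person", 0.88)])
# returns Counter({"car": 1, "person": 1})
```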

Drone image analysis poses several challenges in both indoor and outdoor environments. In many cases, the objects of interest appear too small in the frame because a closer view is not feasible due to drone aerodynamic constraints. Image quality can be enhanced by taking advantage of multiple frames and fusing them through super-resolution techniques. We have used this approach to enhance the accuracy of barcode decoding on the fly.
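
As a rough illustration of multi-frame fusion, the sketch below aligns several grayscale frames to a reference with OpenCV's ECC alignment and averages them, which reduces noise before a decoding step. The published super-resolution pipeline is not reproduced here, so treat the function and parameter choices as assumptions.

```python
# Minimal multi-frame fusion sketch: ECC-align frames to the first one, then average.
import cv2
import numpy as np

def fuse_frames(frames):
    """frames: list of single-channel (grayscale) images of the same size."""
    ref = frames[0].astype(np.float32)
    acc = ref.copy()
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 50, 1e-6)
    for frame in frames[1:]:
        cur = frame.astype(np.float32)
        warp = np.eye(2, 3, dtype=np.float32)        # initial affine warp estimate
        _, warp = cv2.findTransformECC(ref, cur, warp, cv2.MOTION_AFFINE, criteria)
        aligned = cv2.warpAffine(cur, warp, (ref.shape[1], ref.shape[0]),
                                 flags=cv2.INTER_LINEAR + cv2.WARP_INVERSE_MAP)
        acc += aligned
    fused = acc / len(frames)                        # temporal average of aligned frames
    return np.clip(fused, 0, 255).astype(np.uint8)
```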

One of the challenges we faced in our experiments with object detection in certain inspection scenarios was discriminating an object from a background of similar color and appearance. Consider, for example, inspecting a black rubber gasket or cable against a black metallic background. We used a camera array with polarizers at multiple angles to identify the material type, which in turn helped improve object detection and segmentation. This sensor assembly is now being integrated on drones for machine inspection tasks.
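
A minimal sketch of the underlying idea, assuming four intensity images captured through polarizers at 0, 45, 90, and 135 degrees: the degree of linear polarization (DoLP), computed from the Stokes parameters, tends to differ between dielectric materials such as rubber and metallic surfaces, and can feed a segmentation step. This is not the exact sensor-array pipeline described above.

```python
# Estimate degree of linear polarization (DoLP) from four polarizer-angle images.
import numpy as np

def degree_of_linear_polarization(i0, i45, i90, i135, eps=1e-6):
    """Inputs are float intensity images of the same shape."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity (Stokes S0)
    s1 = i0 - i90                        # Stokes S1
    s2 = i45 - i135                      # Stokes S2
    dolp = np.sqrt(s1 ** 2 + s2 ** 2) / (s0 + eps)
    return np.clip(dolp, 0.0, 1.0)       # per-pixel DoLP map in [0, 1]
```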

Deep neural networks are becoming adept at computer vision tasks, including image classification, detection, recognition, and segmentation. Convolutional Neural Networks (CNNs) have been successfully applied to many multi-object classification and localization applications. However, CNNs require substantial annotated training data. This is a concern in the drone application domain, as there are no standard drone datasets available to generate pre-trained models.
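
A common way to cope with scarce annotated drone imagery is transfer learning: fine-tuning a model pre-trained on a generic dataset. The sketch below adapts an ImageNet-pretrained ResNet-18 via the torchvision API; the class count and the training data are placeholders, and this is not presented as the approach used in our work.

```python
# Transfer-learning sketch: freeze a pre-trained backbone, train a new classifier head.
import torch
import torch.nn as nn
from torchvision import models

num_classes = 5                                            # hypothetical drone classes
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():                           # freeze the backbone
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, num_classes)    # new trainable head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
# A training loop over a DataLoader of annotated drone images would go here.
```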

Beyond Vision

Vision is a small part of the whole sensing spectrum. Beyond the visible spectrum, an extended range of sensors can be used, such as thermal, multi-spectral, hyperspectral, acoustic, radar, and laser sensors. We have also tested something called ‘aerial drones with ears’, wherein acoustic sensing is used to locate sound sources. Potential application scenarios include locating a shout for help in a disaster scenario, receiving voice requests from field workers, capturing conversations in a crowd-monitoring application, inspecting the hum of an electric transformer in the field, and so on.

The major challenge in this application was overcoming the dominating propeller noise. We used a microphone array with smart signal processing to suppress it and localize external sound sources. This work was recognized with the ‘Best Poster’ award at SenSys 2016, and an extended project on it has been accepted as a full paper at ICRA 2018.
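
As an illustration of the kind of array processing involved, the sketch below implements GCC-PHAT time-delay estimation between two microphones, a standard building block for acoustic source localization that is comparatively robust to broadband interference such as propeller noise. It is not the specific algorithm used in the work above.

```python
# GCC-PHAT: estimate the time delay of arrival between two microphone signals.
import numpy as np

def gcc_phat(sig, ref, fs, max_tau=None):
    """Return the estimated delay (seconds) of `sig` relative to `ref`."""
    n = sig.shape[0] + ref.shape[0]              # zero-pad to avoid circular wrap
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    R = SIG * np.conj(REF)
    R /= np.abs(R) + 1e-12                       # phase transform (PHAT) weighting
    cc = np.fft.irfft(R, n=n)                    # generalized cross-correlation
    max_shift = n // 2
    if max_tau is not None:
        max_shift = min(int(fs * max_tau), max_shift)
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    shift = np.argmax(np.abs(cc)) - max_shift    # lag of the correlation peak
    return shift / float(fs)
```

With delays estimated between microphone pairs, the direction of a sound source can then be triangulated from the known array geometry.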

With sparsely distributed short-range sensor nodes such as RFID (which cannot be networked), a drone can be used as a mobile gateway that flies past and collects sensor data. To enable low-power operation, drones can wake the sensors up from deep sleep using radio waves, optical pulses, or even the rotor noise as a trigger. For example, a building inspection may involve reading temperature, humidity, and strain sensors embedded inside walls.

In such a scenario, a drone will go around and collect data from these sensors by activating them. For passive embedded sensors, the triggering modality can also be used for wireless charging. We have validated a representative scenario in which a drone uses a flashlight to wake up and charge a sensor node and then collect its data.
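
The drone-side logic can be pictured as a simple ‘wake, then read’ loop. The sketch below is purely illustrative: the OpticalTrigger-style and RadioLink-style interfaces are hypothetical placeholders, not real device drivers or the protocol used in our validation.

```python
# Hypothetical drone-side gateway loop: pulse a wake-up trigger, then poll for a reply.
import time

class MobileGateway:
    def __init__(self, trigger, radio, retries=3, reply_timeout=2.0):
        self.trigger = trigger              # e.g. an LED/flashlight driver (hypothetical)
        self.radio = radio                  # e.g. a low-power radio transceiver (hypothetical)
        self.retries = retries
        self.reply_timeout = reply_timeout

    def collect(self, sensor_id):
        """Wake one sensor node and return its reading, or None on failure."""
        for _ in range(self.retries):
            self.trigger.pulse(duration_s=0.5)          # optical wake-up (and charging) pulse
            reply = self.radio.receive(timeout=self.reply_timeout)
            if reply and reply.get("id") == sensor_id:
                return reply.get("payload")             # temperature, humidity, strain, ...
            time.sleep(0.2)                             # back off and retry
        return None
```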

Pushing the Boundaries

But there are other challenging situations where novel and cost-effective sensing approaches are required. For example, seeing through canopy cover is a challenge in drone-based road surveillance applications, as some urban roads run under a green canopy. Another example is the inspection of blast furnace chimneys, where, among other things, the thickness of the chimney wall needs to be monitored periodically. What sensors can a drone use to perform such tasks, where the phenomenon to be observed is occluded?

For drone-based underground pipeline inspection aimed at detecting gas and water leakages, research is still assessing the effectiveness of infrared sensors and quantum cascade lasers. Is it feasible to use radio-frequency sensing or ground-penetrating radar with small drones? How useful is magnetic sensing for these types of sub-surface explorations?

These are just a few of several open problems in drone sensing applications, where innovative use of sensors and drone technologies is required.


Dr. Balamuralidhar P is a Principal Scientist in Embedded Systems and Robotics Research, TCS Research and Innovation, at Tata Consultancy Services Ltd. (TCS). He has over 30 years of research and development experience in systems for signal processing, communications, and the Internet of Things. His current research spans Cyber Physical Systems, Cognitive Robotics, and Computer Vision. He has authored the book IoT – Technical Challenges and Solutions, published by Artech House.