Intelligence, Surveillance and Reconnaissance with Unmanned Aerial Vehicles

The comprehensive nature of ISR missions requires that information from multiple distributed sensors be fused into an overall picture available at the lowest possible tactical level. The sensors involved can comprise ground-based stationary or mobile sensors (e.g. attached to a vehicle) as well as sensors integrated in UAVs. Furthermore, the sensors may be of different types (cameras, laser scanners, etc.) and generate different types of data (such as images, range information, etc.). The main challenge is therefore to transmit these heterogeneous data, gathered by distributed and possibly moving sensors, to a control station and to fuse them in a suitable way into a common, comprehensive situational picture.

This problem is addressed here under the additional assumption that small UAVs are used as flying sensor platforms. Small UAVs impose hard constraints on payload weight, power consumption and on-board processing capacity. In a first approach it is assumed that the small UAV under consideration is equipped only with a 2D camera, in addition to GPS and a low-cost IMU for navigation and control. It will be investigated how the resulting 2D image data can be fused with the on-board GPS and IMU data, as well as with information from ground-based sensors, so that realistic 3D models of an observed area, including geometry and texture, can be created in the control station. It will further be investigated how these 3D models can be updated dynamically in (near) real time, how previously defined objects (e.g. humans or vehicles) can be recognized and tracked, and how predefined events can be detected. Finally, it will be investigated which additional sensors (e.g. range sensors) might be integrated in the UAV and how they would improve the described information processing.
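To illustrate the first step of such a fusion pipeline, georeferencing 2D image data using the on-board GPS and IMU pose, the following minimal Python/NumPy sketch back-projects a pixel into the world frame and intersects its viewing ray with a flat ground plane. All function names, the pin-hole camera model and the flat-ground assumption are illustrative simplifications, not part of the project description:

```python
import numpy as np

def rotation_from_euler(roll, pitch, yaw):
    """Camera-to-world rotation from IMU-style Euler angles (radians)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx  # z-y-x (yaw-pitch-roll) convention

def georeference_pixel(u, v, K, R_wc, cam_pos, ground_z=0.0):
    """Intersect the viewing ray of pixel (u, v) with the plane z = ground_z.

    K       : 3x3 pin-hole camera intrinsics
    R_wc    : camera-to-world rotation (from the GPS/IMU pose estimate)
    cam_pos : camera position in a local world frame (e.g. ENU from GPS)
    """
    # Back-project the pixel into a ray in the camera frame.
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    # Rotate the ray into the world frame using the IMU attitude.
    ray_world = R_wc @ ray_cam
    # Scale the ray so that it reaches the ground plane.
    t = (ground_z - cam_pos[2]) / ray_world[2]
    return cam_pos + t * ray_world
```

For example, with a straight-down camera attitude (`rotation_from_euler(np.pi, 0, 0)`, i.e. the camera's optical axis pointing along the negative world z-axis), the principal point of the image maps to the ground coordinates directly below the UAV. In a real system the same geometry, applied to many overlapping views, provides the initial pose estimates for the structure-from-motion step that recovers the 3D geometry and texture mentioned above.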

Researchers: Dr. Miguel Olivares Mendez, Dr. Somasundar Kannan, Arun Annaiyan, Prof. Dr. Holger Voos.