Object detection, recognition, and tracking from UAVs using a thermal camera

  • Frederik S. Leira
    Department of Engineering Cybernetics, Autonomous Marine Operations and Systems, Norwegian University of Science and Technology, Trondheim, Norway
  • Håkon Hagen Helgesen
    Department of Engineering Cybernetics, Autonomous Marine Operations and Systems, Norwegian University of Science and Technology, Trondheim, Norway
  • Tor Arne Johansen
    Department of Engineering Cybernetics, Autonomous Marine Operations and Systems, Norwegian University of Science and Technology, Trondheim, Norway
  • Thor I. Fossen
    Department of Engineering Cybernetics, Autonomous Marine Operations and Systems, Norwegian University of Science and Technology, Trondheim, Norway

Description

Abstract

In this paper, a multiple-object detection, recognition, and tracking system for unmanned aerial vehicles (UAVs) is studied. The system can be implemented on any UAV platform, with the main requirement being that the UAV carries a suitable onboard computational unit and a camera. It is intended for use in a maritime object tracking framework for UAVs, enabling a UAV to perform multi-object tracking and maintain situational awareness of the sea surface in real time during an operation. Using machine vision to automatically detect objects in the camera's image stream, combined with the UAV's navigation data, the onboard computer georeferences each detection to measure the location of the detected objects in a local North-East (NE) coordinate frame. A tracking algorithm based on a Kalman filter with a constant-velocity motion model uses these position measurements, found automatically by the object detection algorithm, to track and estimate each object's position and velocity. Furthermore, a global-nearest-neighbor algorithm is applied for data association. This is achieved using a measure of distance based not only on the physical distance between an object's estimated position and the measured position, but also on how similar the objects appear in the camera image. Four field tests were conducted at sea to verify the object detection and tracking system. One of the flight tests was a two-object tracking scenario, which is also reused in three scenarios with two additional simulated objects. The tracking results demonstrate the effectiveness of using visual recognition for data association to avoid interchanging the two estimated object trajectories. Furthermore, real-time computations performed on the gathered data show that the system is able to automatically detect and track the position and velocity of a boat.

Given at least 100 georeferenced measurements of the boat's position, the position was estimated and tracked with an accuracy of 5–15 m from a 400 m altitude while the boat was in the camera's field of view (FOV). The estimated speed and course also converged to the object's true trajectories (measured by the Global Positioning System, GPS) in the tested scenarios. This enables the system to track boats while they are outside the camera's FOV for extended periods of time, with tracking results showing a drift in the boat's position estimate as low as 1–5 m/min outside the FOV.
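The pipeline summarized above (a constant-velocity Kalman filter per track in a local NE frame, plus global-nearest-neighbor data association with a distance that combines physical separation and appearance similarity) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the state layout, noise covariances, and the appearance weight `w_app` are tuning assumptions, the appearance descriptor is an arbitrary feature vector, and the greedy pairing below stands in for an optimal global-nearest-neighbor assignment (e.g. via the Hungarian algorithm).

```python
import numpy as np

class CVKalmanTracker:
    """Constant-velocity Kalman filter in a local North-East frame.

    State: [n, e, vn, ve]; measurements are georeferenced (n, e) positions.
    The covariances below are illustrative tuning choices, not the paper's values.
    """

    def __init__(self, n, e, appearance, dt=1.0):
        self.x = np.array([n, e, 0.0, 0.0])
        self.P = np.diag([10.0, 10.0, 25.0, 25.0])   # initial uncertainty (assumption)
        self.F = np.array([[1, 0, dt, 0],            # constant-velocity transition
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], float)
        self.H = np.array([[1, 0, 0, 0],             # observe position only
                           [0, 1, 0, 0]], float)
        self.Q = np.eye(4) * 0.5                     # process noise (assumption)
        self.R = np.eye(2) * 4.0                     # measurement noise (assumption)
        self.appearance = appearance                 # e.g. a normalized feature vector

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z):
        y = z - self.H @ self.x                      # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

def combined_distance(track, z, feat, w_app=50.0):
    """Physical distance plus an appearance-dissimilarity penalty (w_app is a tuning assumption)."""
    d_pos = np.linalg.norm(z - track.x[:2])
    d_app = np.linalg.norm(feat - track.appearance)
    return d_pos + w_app * d_app

def gnn_associate(tracks, measurements, features):
    """Greedy nearest-neighbor pairing on the combined distance.

    A true GNN solver minimizes the total cost globally; this greedy pass
    is a simplification that picks the lowest-cost pairs first.
    """
    pairs = sorted(
        ((combined_distance(t, z, f), ti, zi)
         for ti, t in enumerate(tracks)
         for zi, (z, f) in enumerate(zip(measurements, features))),
        key=lambda p: p[0])
    used_t, used_z, assignment = set(), set(), {}
    for _, ti, zi in pairs:
        if ti not in used_t and zi not in used_z:
            assignment[ti] = zi
            used_t.add(ti)
            used_z.add(zi)
    return assignment
```

With two tracks that are close in position but distinct in appearance, the appearance term keeps the association from interchanging the trajectories, which is the effect the field tests demonstrate.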

