Sandia Labs, together with the Department of Homeland Security’s Science and Technology Directorate, is working on a novel drone-detection technique. Sandia’s method is video-based. Called “temporal frequency analysis,” it analyzes pixel fluctuations in recorded footage of a drone, then applies machine learning to that footage to train an algorithm to recognize similar drones in the future, and even to anticipate their movements.
A new technique, developed by Sandia Labs in partnership with the Science and Technology Directorate of the Department of Homeland Security, analyzes video of drones for movement patterns. (Randy Montoya/Sandia Labs)
While drones today can be detected by everything from radar to visual confirmation to transmitted radio signals, most of these methods face limitations. Radar has a hard time discriminating between birds and small drones, and struggles on its own with the smallest craft. Radio signals are revealing for as long as the drone and its controller are sending them, but increased autonomy reduces the traffic sent, eliminating signals as a viable tracking tool.
Electronic warfare is a driving force behind the development of autonomous systems, but even outside a denied environment, a vehicle that repeatedly broadcasts its location is one that is easier to spot and stop. Anticipating an autonomous future, video-based systems focus on what is observable.
Key to temporal frequency analysis is not just the video of the drone itself, though there is that, but how the drone moves through space. The whole frame becomes relevant information, and the process captures tens of thousands of frames.
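Sandia has not published its algorithm, but the core idea the article describes, watching how individual pixels fluctuate over time across many frames, can be sketched with a per-pixel temporal FFT. The frequency band, frame rate, and the `flicker_map` function below are all illustrative assumptions, not details from Sandia's work:

```python
import numpy as np

def flicker_map(frames, fps, band=(50.0, 200.0)):
    """Per-pixel temporal FFT: energy of each pixel's fluctuation in a band.

    frames: (T, H, W) array of grayscale intensities.
    band: hypothetical "rotor flicker" frequency range in Hz -- an
    assumption for illustration, not a figure from Sandia's research.
    """
    T = frames.shape[0]
    spectrum = np.abs(np.fft.rfft(frames, axis=0))   # (T//2+1, H, W)
    freqs = np.fft.rfftfreq(T, d=1.0 / fps)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[in_band].sum(axis=0)             # high = periodic pixel

# Synthetic demo: one pixel flickering at 100 Hz in a noisy 720 fps clip.
rng = np.random.default_rng(0)
T, H, W = 720, 8, 8
t = np.arange(T) / 720.0
frames = rng.normal(0.0, 0.1, (T, H, W))
frames[:, 0, 0] += np.sin(2 * np.pi * 100.0 * t)     # the "rotor" pixel
energy = flicker_map(frames, fps=720.0)
assert energy[0, 0] == energy.max()                  # flicker stands out
```

In this toy version, pixels whose intensity oscillates periodically, as rotor blades sweeping past a camera would make them, light up in the energy map; the machine-learning step the article mentions would then sit on top of features like these.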
To build a baseline of data, Sandia flew three different multirotor drones (think quadcopters, hexacopters and the like) in front of a streaming video camera. For the baseline flights, the drone would move forward and backward, side to side, up as well as down. (No word on whether the drones also hit A, B, select or start.) Analysis of the video then rendered the drone’s flight paths.
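The article doesn't say how the flight paths were rendered from the footage. One simple stand-in, assuming per-frame "drone-likeness" maps like the flicker energy above, is to take an intensity-weighted centroid of each frame and string the centroids into a path; `track_path` and the detection maps here are hypothetical:

```python
import numpy as np

def track_path(detection_maps):
    """Reduce a stack of per-frame detection maps to a 2-D path.

    detection_maps: (T, H, W) array where larger values mean "more
    drone-like" at that pixel. The intensity-weighted centroid per
    frame is a simplistic stand-in for whatever path reconstruction
    Sandia actually uses.
    """
    T, H, W = detection_maps.shape
    ys, xs = np.mgrid[0:H, 0:W]
    path = np.empty((T, 2))
    for i, m in enumerate(detection_maps):
        total = m.sum()
        path[i] = (np.sum(ys * m) / total, np.sum(xs * m) / total)
    return path

# Demo: a single bright pixel drifting left to right across 16x16 frames.
T, H, W = 10, 16, 16
maps = np.zeros((T, H, W))
for i in range(T):
    maps[i, 8, 3 + i] = 1.0        # detection moves one column per frame
path = track_path(maps)             # row stays 8, column runs 3..12
```

A sequence of such centroids is exactly the kind of motion signature (forward/back, side to side, up and down) the baseline flights were designed to produce.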
The baseline established, Sandia then tested the tool in distracting environments, with helicopters, cars and especially birds entering the frame. The lab says the tool was ultimately able to distinguish between bird and drone. Iterating the design in a research setting yields a tool that should have some utility outside it. While this was a project primarily for Homeland Security, the same algorithms could likely be applied in other settings where drone detection and path-tracking are at a premium.
A battlefield is a terrible place to iterate on data. Set aside the limitations of data transfer, the obstacles put in place by jamming and interference, and the constraints on bandwidth. Battlefields emerge in a hail of bullets or with the sudden boom of an explosion, and having data collection tools in place to even begin training sensors for machine learning is hardly a given. Which means that if the military or the government is to see iteration done on threat detection, it will likely happen far from active war zones, with the hope that the data gathered and studied is valuable enough to inform defenses in the future.