Learning Vision-based Cohesive Flight in Drone Swarms

09/03/2018
by Fabian Schilling et al.

This paper presents a data-driven approach to learning vision-based collective behavior from a simple flocking algorithm. We simulate a swarm of quadrotor drones and formulate the controller as a regression problem in which we generate 3D velocity commands directly from raw camera images. The dataset is created by simultaneously acquiring omnidirectional images and computing the corresponding control command from the flocking algorithm. We show that a convolutional neural network trained on the visual inputs of the drone can learn not only robust collision avoidance but also coherence of the flock in a sample-efficient manner. The neural controller effectively learns to localize other agents in the visual input, which we show by visualizing the regions with the most influence on the motion of an agent. This weakly supervised saliency map can be computed efficiently and may be used as a prior for subsequent detection and relative localization of other agents. We remove the dependence on sharing positions among flock members by taking only local visual information into account for control. Our work can therefore be seen as the first step towards a fully decentralized, vision-based flock without the need for communication or visual markers to aid detection of other agents.
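The expert that generates the training labels is described only as "a simple flocking algorithm" that maps the states of nearby agents to a 3D velocity command. A minimal sketch of such an algorithm, assuming a Reynolds-style combination of cohesion, separation, and alignment terms (the specific terms and gains here are illustrative, not taken from the paper):

```python
import numpy as np

def flocking_command(own_pos, own_vel, neighbor_pos, neighbor_vel,
                     sep_radius=1.0, k_coh=1.0, k_sep=2.0, k_align=0.5,
                     v_max=2.0):
    """Reynolds-style flocking expert: returns a 3D velocity command
    for one agent from its neighbors' positions/velocities, shape (N, 3).
    Gains and radii are placeholder values, not the paper's."""
    neighbor_pos = np.asarray(neighbor_pos, dtype=float)
    neighbor_vel = np.asarray(neighbor_vel, dtype=float)

    # Cohesion: steer toward the centroid of the neighbors.
    cohesion = neighbor_pos.mean(axis=0) - own_pos

    # Separation: repel from neighbors closer than sep_radius,
    # weighted inversely with distance so nearer agents repel harder.
    offsets = own_pos - neighbor_pos          # vectors pointing away from neighbors
    dists = np.linalg.norm(offsets, axis=1)
    close = dists < sep_radius
    separation = np.zeros(3)
    if close.any():
        separation = (offsets[close] / dists[close, None] ** 2).sum(axis=0)

    # Alignment: match the average velocity of the neighbors.
    alignment = neighbor_vel.mean(axis=0) - own_vel

    cmd = k_coh * cohesion + k_sep * separation + k_align * alignment

    # Saturate to the platform's maximum speed.
    speed = np.linalg.norm(cmd)
    if speed > v_max:
        cmd = cmd / speed * v_max
    return cmd
```

During dataset creation, each simulated drone would record its omnidirectional camera image alongside the command returned by such an expert; the convolutional network is then trained to regress the command from the image alone, so no position sharing is needed at deployment time.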

Related research

- Learning Vision-based Flight in Drone Swarms by Imitation (08/08/2019): Decentralized drone swarms deployed today either rely on sharing of posi...
- Vision-based Drone Flocking in Outdoor Environments (12/02/2020): Decentralized deployment of drone swarms usually relies on inter-agent c...
- VGAI: A Vision-Based Decentralized Controller Learning Framework for Robot Swarms (02/06/2020): Despite the popularity of decentralized controller learning, very few su...
- Visual Attention Prediction Improves Performance of Autonomous Drone Racing Agents (01/07/2022): Humans race drones faster than neural networks trained for end-to-end au...
- Learning Deep Sensorimotor Policies for Vision-based Autonomous Drone Racing (10/26/2022): Autonomous drones can operate in remote and unstructured environments, e...
- Vision-based Detection of Acoustic Timed Events: a Case Study on Clarinet Note Onsets (06/29/2017): Acoustic events often have a visual counterpart. Knowledge of visual inf...
- Machine Vision based Sample-Tube Localization for Mars Sample Return (03/17/2021): A potential Mars Sample Return (MSR) architecture is being jointly studi...
