On yer bike! Boffins teach AI drone to fly itself using cams on bicycles, self-driving car


At head height… don’t annoy this bot

As tech companies toy with the idea of using mini-drones for delivery, monitoring buildings, or surveillance, figuring out how to fly them without human control is vital if sales are to scale up. A paper published in the journal IEEE Robotics and Automation Letters (here’s a free pre-print version) shows how AI can help make that process easier.

Known as DroNet, the convolutional neural network has just eight layers, making it much smaller and less complex than rival architectures. It works by examining images from a camera and produces two outputs: a steering angle, so it can hover and skirt around obstacles, and a collision probability, so it knows whether it’s likely to bump into something and can take appropriate action.
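The article doesn’t spell out how those two outputs become a flight command, but a rough sketch is easy to imagine: throttle the forward speed by the collision probability and smooth both signals so the drone doesn’t twitch. The function below is a hypothetical illustration – the names, the smoothing factor, and the speed law are assumptions, not the paper’s actual controller:

```python
def drone_command(steering_angle, collision_prob,
                  max_speed=1.0, alpha=0.7, prev=(0.0, 0.0)):
    """Turn DroNet-style outputs into a (steering, speed) command.

    Hypothetical sketch: forward speed is throttled by the predicted
    collision probability, and both signals are exponentially smoothed
    against the previous command to damp jitter in the raw network output.
    """
    prev_steer, prev_speed = prev
    # A certain collision (prob = 1.0) drives the target speed to zero.
    target_speed = (1.0 - collision_prob) * max_speed
    # Exponential smoothing: keep alpha of the old command, blend in the new.
    steer = alpha * prev_steer + (1.0 - alpha) * steering_angle
    speed = alpha * prev_speed + (1.0 - alpha) * target_speed
    return steer, speed
```

Starting from a standstill with a clear path, the speed ramps gradually toward the maximum rather than jumping, which is one plausible way to get the smooth stops in front of pedestrians seen in the video.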

The team of researchers from the University of Zurich and the Technical University of Madrid in Spain taught the software how to steer using more than 70,000 images from an open-source dataset created by cyber-uni Udacity, which is developing open-source self-driving car software. The pictures depict a range of scenarios captured by a front-facing camera tacked onto the vehicle.

A second dataset for collision-avoidance training was scraped together by attaching a GoPro camera to the handlebars of a bicycle to capture 32,000 images. Stills were taken from video recordings of the bicycle approaching different objects, such as pedestrians, vehicles, or trees. The images were then marked as 0 if the bike was far from an object, and 1 if it was very close to crashing into an object.
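In spirit, that labelling is just a distance threshold applied per frame. A toy version looks like this – note the two-metre cutoff is an assumed figure for illustration; the article only says “far” versus “very close”:

```python
def label_frames(distances, threshold=2.0):
    """Assign binary collision labels to video frames.

    `distances` is the camera-to-obstacle distance (metres) for each frame.
    Frames closer than `threshold` are labelled 1 (about to crash),
    everything else 0 (safe). The threshold value is an assumption.
    """
    return [1 if d < threshold else 0 for d in distances]
```

Fed a bicycle run that closes in on a tree, the labels flip from 0 to 1 as the distance drops below the cutoff, giving the network a clean binary target to learn the collision probability from.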

Low buzz

Antonio Loquercio, first author of the paper and a PhD student at the University of Zurich, explained to The Register that “drones operating outdoors generally fly at high altitudes, where the GPS signal is reliable enough to allow them to navigate without problems.” But by learning from data taken from cars and bicycles, the drones can also fly at the low heights those vehicles operate at, he explained.

When swooping along at low heights, the bot needs to cope with large groups of people, vehicles, and other obstacles in the way. “So, how should we develop drones that can also work there? Our intuition was that cars, bicycles, or similar vehicles, already have this great ability. Therefore, we developed an algorithm to make drones that can imitate them,” said Loquercio.

The video below shows a Parrot Bebop 2.0 drone buzzing loudly as it hovers along streets, turning corners and stopping in front of cyclists and pedestrians. It travelled up to 245 metres in one of the flying experiments in an urban setting, and managed 50 metres in a parking garage, even though it wasn’t explicitly trained on parking environments.

[Youtube video]

It’s not bad for a small network with eight layers. Loquercio said reducing the complexity of the system means each action performed by the drone requires less computation, allowing it to react faster. That’s particularly useful for responding quickly to sudden obstacles and dangerous situations. It also means the system should draw less power, which is good news for battery-powered setups.

And now the downsides

There are significant downsides, though. Since DroNet was trained on data taken from bicycles and cars, it can only move on a single plane at groin height and cannot fly up or down. It has to be constantly in radio contact with its controller – a laptop running the machine-learning software that makes all the decisions – and it’s not particularly agile or fast, either.

Loquercio told El Reg research like DroNet was “a first, but important step, toward the integration of autonomous drones into our city.”

“[Drones] will help us in our everyday activities, deliver our packages, or support search and rescue teams in their operations in case of disaster. However, a lot of work is still required in this direction – not only from us, but from the robotics community in general. For this reason, we open-sourced all our findings, our code, and our datasets,” he concluded. ®

PS: You can play around with the code here.
