A drone aboard the International Space Station has taught itself to judge distances using only one camera. Humans and robots alike normally use two eyes to estimate distances; things get tricky with only one. Humans can still manage well with a single eye but, until now, this wasn’t easy for robots.

“It is a mathematical impossibility to extract distances to objects from one single image, if the object has not been encountered before,” says Guido de Croon, assistant professor at the TU Delft Robotics Institute. The new technology uses deep learning to recognise objects; once an object has been recognised, the software can estimate its distance from the object’s known physical characteristics.
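As an illustration of that principle, the sketch below shows how a distance can be recovered from a recognised object’s apparent size in the image using the standard pinhole-camera relation. This is not the team’s actual code: the object classes, real-world sizes and focal length are hypothetical placeholders.

```python
# Minimal sketch of distance-from-known-size estimation (pinhole camera model).
# All object classes, sizes and the focal length are illustrative assumptions,
# not values from the actual experiment.

KNOWN_HEIGHTS_M = {
    "hatch": 1.2,    # hypothetical real-world heights of recognised objects
    "laptop": 0.25,
}

FOCAL_LENGTH_PX = 600.0  # camera focal length expressed in pixels (assumed)

def estimate_distance(label: str, apparent_height_px: float) -> float:
    """Distance = focal_length * real_height / apparent_height."""
    real_height_m = KNOWN_HEIGHTS_M[label]
    return FOCAL_LENGTH_PX * real_height_m / apparent_height_px

# e.g. a recognised laptop spanning 50 pixels in the image:
print(estimate_distance("laptop", 50.0))  # -> 3.0 metres
```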

"It is a mathematical impossibility to extract distances to objects from one single image"

For the experiment, a drone navigated its way through the Japanese module of the ISS. It first used its stereo-vision camera to learn the distances to walls and nearby objects. Once it had done so, the stereo camera was turned off and the drone continued to navigate autonomously with just one camera.
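In outline, this self-supervised scheme treats the stereo camera’s depth estimates as free training labels for a monocular model, which can then take over once the second camera is switched off. The sketch below illustrates the idea with a deliberately tiny PyTorch-style training loop; the network architecture and all names are assumptions for illustration, not the team’s implementation.

```python
import torch
import torch.nn as nn

# Minimal sketch of self-supervised monocular depth learning:
# while the stereo camera is on, its depth maps serve as training labels;
# a network learns to predict those depths from a single image.
# The tiny architecture below is purely illustrative.

class MonoDepthNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),  # per-pixel depth prediction
        )

    def forward(self, x):
        return self.net(x)

model = MonoDepthNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.L1Loss()

def train_step(mono_image, stereo_depth):
    # mono_image: frame from one camera; stereo_depth: label from stereo matching
    pred = model(mono_image)
    loss = loss_fn(pred, stereo_depth)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Once trained, the stereo camera can be switched off and the drone
# navigates on predictions from the single remaining camera:
with torch.no_grad():
    depth = model(torch.rand(1, 1, 64, 64))  # stand-in for a camera frame
```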

[Image: quadcopter used for ground tests]

This self-supervised learning software had previously been tested with quadcopters in TU Delft’s CyberZoo, at the Faculty of Aerospace Engineering.

The experiment also posed an additional degree of difficulty. Until then, the technology had only been tested on Earth. Aboard the International Space Station, conditions were different: the drone had to operate in weightlessness, with no favoured up or down direction, which meant that objects could be viewed from many more angles.

Operating in a weightless environment proved to be no problem, however. The drone managed to navigate normally, despite one of its cameras being disabled.

About the experiment
The experiment was a joint operation by the European Space Agency (ESA), the Massachusetts Institute of Technology (MIT) and the Micro Air Vehicles Lab (MAVLab) of Delft University of Technology.