
Vision-based pose estimation using 3D markers

Unmanned Aerial Vehicles (UAVs) already have many applications and, being a good platform for service robotics, are rapidly increasing in number. Sometimes UAVs must fly in GPS-denied environments, where there is no easy way for them to determine their own pose relative to their surroundings. This research line addresses the problem using visual information from an onboard camera.

Conventional camera pose estimation methods, such as those used for Augmented Reality, rely on printed markers placed at known positions. However, in large flight areas, either the markers need to be very large to be seen from afar or many of them must be distributed over the terrain, which can be a very tedious task.

In this research line, we propose an alternative kind of marker and a method for estimating the pose of the UAV from views of the marker captured by an onboard camera. The marker is a 3D structure on the ground that may be formed by two or more poles. Since the structure model is known, the algorithm infers the point of view from which the onboard camera sees the structure, yielding the position and attitude of the UAV relative to it. Special attention is paid to situations where some poles are outside the field of view or only partially visible. The estimate produced by this method may be fused with the odometry given by the onboard inertial sensors.
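As an illustration of how this kind of inference can be set up, the sketch below casts it as a Perspective-n-Point (PnP) problem: the known 3D endpoints of the poles are matched to their detected 2D image positions, and only the visible correspondences are used, so a partially visible marker can still yield a pose. This is a minimal Python/OpenCV sketch under an assumed marker geometry and a hypothetical detector interface, not the project's actual implementation.

```python
# Minimal sketch (not the project's algorithm): pose from visible marker points.
import numpy as np
import cv2

# Hypothetical marker model: three vertical poles, base and tip of each pole,
# expressed in metres in a marker-fixed frame (assumed geometry).
MARKER_POINTS_3D = np.array([
    [0.0, 0.0, 0.0], [0.0, 0.0, 2.0],   # pole 1
    [3.0, 0.0, 0.0], [3.0, 0.0, 2.0],   # pole 2
    [0.0, 3.0, 0.0], [0.0, 3.0, 2.0],   # pole 3
], dtype=np.float64)

def estimate_camera_pose(detections, camera_matrix, dist_coeffs):
    """Estimate the camera pose from the marker points that are visible.

    detections: list aligned with MARKER_POINTS_3D; each entry is an (u, v)
    pixel coordinate, or None when that point is out of view or occluded.
    Returns (R, t) mapping marker coordinates into the camera frame,
    or None when too few points are visible.
    """
    # Keep only the correspondences whose 2D detection is available,
    # so a partially visible marker can still be used.
    pairs = [(p3d, p2d) for p3d, p2d in zip(MARKER_POINTS_3D, detections)
             if p2d is not None]
    if len(pairs) < 4:              # require a reasonable minimum of points
        return None
    obj_pts = np.array([p for p, _ in pairs], dtype=np.float64)
    img_pts = np.array([q for _, q in pairs], dtype=np.float64)

    # SQPnP (OpenCV >= 4.5) copes with small and near-planar point sets.
    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, camera_matrix,
                                  dist_coeffs, flags=cv2.SOLVEPNP_SQPNP)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)      # rotation vector -> 3x3 rotation matrix
    return R, tvec
```

The position of the UAV camera in the marker frame then follows from the returned pose as C = -R^T t, and this estimate is what could be fused with the inertial odometry.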

First, a small simulator was programmed in Matlab. It allows us to test the performance of the method from different points of view and to quantify the positioning error as a function of the camera parameters and the detection noise. The following figure shows a sample of the simulator output:

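The kind of Monte-Carlo test run in the simulator can be sketched as follows. The original simulator was written in Matlab; this is an assumed Python re-creation with made-up intrinsics, ground-truth pose, and noise level, meant only to show how the positioning error can be quantified against detection noise.

```python
# Illustrative re-creation of a simulator run: project the marker from a known
# camera pose, perturb the detections with pixel noise, re-estimate the pose
# and measure the resulting 3D positioning error.
import numpy as np
import cv2

MARKER_POINTS_3D = np.array([
    [0.0, 0.0, 0.0], [0.0, 0.0, 2.0],   # assumed three-pole marker geometry
    [3.0, 0.0, 0.0], [3.0, 0.0, 2.0],
    [0.0, 3.0, 0.0], [0.0, 3.0, 2.0],
], dtype=np.float64)

K = np.array([[700.0, 0.0, 320.0],       # assumed pinhole intrinsics
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)                       # no lens distortion in this sketch

rvec_true = np.array([0.3, -0.2, 0.1])   # assumed ground-truth pose
tvec_true = np.array([-1.5, 1.0, 10.0])  # (marker frame -> camera frame)

def camera_position(rvec, tvec):
    # Camera centre expressed in the marker frame: C = -R^T t
    R, _ = cv2.Rodrigues(np.asarray(rvec, dtype=np.float64))
    return (-R.T @ np.asarray(tvec, dtype=np.float64).reshape(3, 1)).ravel()

rng = np.random.default_rng(0)
errors = []
for _ in range(500):
    img_pts, _ = cv2.projectPoints(MARKER_POINTS_3D, rvec_true, tvec_true, K, dist)
    img_pts = img_pts.reshape(-1, 2)
    img_pts += rng.normal(0.0, 1.0, img_pts.shape)   # 1-pixel detection noise
    ok, rvec, tvec = cv2.solvePnP(MARKER_POINTS_3D, img_pts, K, dist,
                                  flags=cv2.SOLVEPNP_SQPNP)
    if ok:
        errors.append(np.linalg.norm(camera_position(rvec, tvec)
                                     - camera_position(rvec_true, tvec_true)))

print("mean 3D positioning error [m]:", np.mean(errors))
```

Sweeping the noise level, the camera parameters, or the true viewpoint in such a loop gives the kind of error curves the simulator is used to produce.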
In addition, the algorithm has been implemented for real-time processing of video streams. Once the camera location is known, it is possible to embed virtual objects in the images as if they really were in the scene. The concept is shown in the video below, which was recorded with a Parrot AR.Drone camera. This technique could enable the use of augmented reality in the video feedback from a UAV to assist an operator.
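The overlay step can be illustrated as follows: given the estimated pose and the camera intrinsics, a virtual object defined in the marker frame is projected into the video frame and drawn on top of it. This is a hedged sketch (a wireframe cube at an assumed location), not the code used for the AR.Drone demonstration.

```python
# Illustrative augmented-reality overlay: project a virtual cube defined in the
# marker frame into the current video frame using the estimated pose.
import numpy as np
import cv2

def draw_virtual_cube(frame, rvec, tvec, camera_matrix, dist_coeffs,
                      origin=(1.5, 0.0, 0.0), size=1.0):
    """Draw the edges of a cube of the given size (metres) placed at 'origin'."""
    o = np.asarray(origin, dtype=np.float64)
    s = size
    corners = o + np.array([[0, 0, 0], [s, 0, 0], [s, s, 0], [0, s, 0],
                            [0, 0, s], [s, 0, s], [s, s, s], [0, s, s]],
                           dtype=np.float64)
    pts, _ = cv2.projectPoints(corners, rvec, tvec, camera_matrix, dist_coeffs)
    pts = pts.reshape(-1, 2)
    edges = [(0, 1), (1, 2), (2, 3), (3, 0),        # bottom face
             (4, 5), (5, 6), (6, 7), (7, 4),        # top face
             (0, 4), (1, 5), (2, 6), (3, 7)]        # vertical edges
    for i, j in edges:
        p1 = tuple(int(v) for v in pts[i])
        p2 = tuple(int(v) for v in pts[j])
        cv2.line(frame, p1, p2, (0, 255, 0), 2)
    return frame
```

In a real-time loop, each incoming frame would first go through detection and pose estimation, and the resulting rvec/tvec would be passed to a routine like this before the frame is shown to the operator.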


Researchers: