
"Omnidirectional vision applied to Unmanned Aerial Vehicles (UAVs) attitude and heading estimation"

Journal: 
Robotics and Autonomous Systems

doi:10.1016/j.robot.2010.02.012

Paper reference: 
OmnidirectionalforAttitude
Publication Date: 
04/03/2010

 

Abstract
This paper presents an aircraft attitude and heading estimator that uses catadioptric images as the principal sensor for a UAV, or as a redundant system complementing IMU (Inertial Measurement Unit) and gyro sensors. First, we show how the unified model for central catadioptric cameras is used for attitude and heading estimation: the skyline is projected onto the catadioptric image, segmented, and used to compute the UAV's attitude. Then, appearance images are used as a visual compass to compute the relative rotation and heading of the aerial vehicle. Finally, tests and results obtained with the COLIBRI UAV platform in real flights are presented, validating the approach by comparing the estimated data with the inertial values measured on board.
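
The abstract describes two building blocks: sky/ground segmentation of the catadioptric image for attitude, and an appearance-based visual compass for relative heading. The sketch below is a minimal, hedged illustration of how such a pipeline might be organised in Python with OpenCV and NumPy (tools the paper does not prescribe). The function names (segment_sky, unwrap_panorama, relative_heading) and parameters (mirror center, r_min, r_max, one panorama column per degree of azimuth) are illustrative assumptions; the paper's actual attitude computation, which fits the skyline conic under the unified central catadioptric model, is not reproduced here.

```python
import numpy as np
import cv2


def segment_sky(catadioptric_bgr):
    """Rough sky/ground segmentation via Otsu thresholding on the blue channel.

    Illustrative only: the paper's skyline segmentation and the conic fit on
    the unified sphere model are not reproduced here.
    """
    blue = catadioptric_bgr[:, :, 0]
    _, mask = cv2.threshold(blue, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask  # 255 where "sky", 0 elsewhere


def unwrap_panorama(catadioptric_bgr, center, r_min, r_max, width=360, height=64):
    """Unwrap the omnidirectional image into a cylindrical 'appearance' panorama.

    Samples along radial lines of the mirror image, one column per degree of
    azimuth. The mirror center and radial limits are assumed known from a
    prior calibration.
    """
    cx, cy = center
    thetas = np.linspace(0.0, 2.0 * np.pi, width, endpoint=False)
    radii = np.linspace(r_min, r_max, height)
    map_x = (cx + np.outer(radii, np.cos(thetas))).astype(np.float32)
    map_y = (cy + np.outer(radii, np.sin(thetas))).astype(np.float32)
    return cv2.remap(catadioptric_bgr, map_x, map_y, cv2.INTER_LINEAR)


def relative_heading(prev_pano, curr_pano):
    """Visual-compass style heading estimate between two consecutive panoramas.

    Returns the circular column shift (in degrees, given one column per
    degree) that minimises the sum of squared differences, wrapped to
    [-180, 180). This is a coarse stand-in for the appearance-image compass
    described in the paper.
    """
    a = cv2.cvtColor(prev_pano, cv2.COLOR_BGR2GRAY).astype(np.float32)
    b = cv2.cvtColor(curr_pano, cv2.COLOR_BGR2GRAY).astype(np.float32)
    width = a.shape[1]
    errors = [np.mean((a - np.roll(b, s, axis=1)) ** 2) for s in range(width)]
    shift = int(np.argmin(errors))
    return shift if shift <= width // 2 else shift - width
```

As a usage sketch, two consecutive catadioptric frames would each be unwrapped with unwrap_panorama (using the calibrated mirror center and radii), and relative_heading applied to the pair gives an incremental yaw that can be accumulated over a flight; segment_sky provides the sky mask from which a skyline boundary could be extracted for the attitude step.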

Keywords: Omnidirectional images; Catadioptric systems; Unmanned Aerial Vehicles (UAV); Sky segmentation
 
More Info and additional video test:
 

 

Attachments:
- omni_bibtex.txt (692 bytes)
- Omnidirectional Vision applied to Unmanned Aerial Vehicles UAVs attitude and heading estimation.pdf (758.05 KB)