
Special visual systems

Natural User Interfaces for Human-Drone Multi-Modal Interaction

Publication
Personal drones are becoming part of everyday life. To fully integrate them into society, it is crucial to design safe and intuitive ways to interact with these aerial systems. Recent advances in User-Centered Design (UCD) applied to Natural User Interfaces (NUIs) aim to make use of innate human features, such as speech, gestures and vision, to interact with technology the way humans would with one another. In this paper, a Graphical User Interface (GUI) and several NUI methods are studied and implemented, along with computer vision techniques, in a single software framework for aerial robotics called Aerostack, which allows for intuitive and natural human-quadrotor interaction in indoor GPS-denied environments. These strategies include speech, body position, hand gesture and visual marker interactions used to directly command tasks to the drone. The NUIs presented are based on devices such as the Leap Motion Controller, microphones and small-size monocular on-board cameras, which are unnoticeable to the user. Thanks to this UCD perspective, users can choose the most intuitive and effective type of interaction for their application. Additionally, the proposed strategies allow for multi-modal interaction between multiple users and the drone, since several of these interfaces can be integrated in a single application, as shown in various real flight experiments performed with non-expert users.
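The multi-modal integration described above can be pictured as a small dispatcher that normalizes events coming from different interfaces into one shared task vocabulary. This is only an illustrative sketch: the function names, task tokens and event format below are hypothetical, not Aerostack's actual API.

```python
# Hypothetical sketch: several interaction modalities (speech, gesture,
# visual marker) emit normalized (modality, token) events, and a single
# dispatcher maps them onto one task vocabulary for the drone.

TASKS = {"take off", "land", "follow me", "stop"}

def dispatch(event):
    """event = (modality, token); returns the task to command, or None."""
    modality, token = event
    token = token.lower().strip()
    return token if token in TASKS else None

# Events from different users/interfaces resolve to the same task space.
events = [("speech", "Take off"), ("gesture", "FOLLOW ME"), ("marker", "spin")]
commands = [dispatch(e) for e in events]
# commands -> ['take off', 'follow me', None]  (unknown tokens are ignored)
```

Keeping the per-modality recognizers separate from a single command mapper is one way several interfaces can coexist in one application, as the abstract describes.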
Conference name: 

2016 International Conference on Unmanned Aircraft Systems (ICUAS'16)

Place: 

Arlington, VA, USA

Date: 
June 7 - 10, 2016


On-board Light Low-cost Efficient ARM Architecture Pre-processing System for UAVs


We have tested different types of cameras since May 2014:

1. With a Point Grey camera.

2. With uEye cameras (frames captured from our Embedded Stereo Vision System; the bottom figure shows the terminal output in the red rectangle).

Research: 

An Approach Towards Visual Autonomous Ship Board Landing of a VTOL UAV

Journal: 
Journal of Intelligent & Robotic Systems
Publication Date: 
2013

We present the design and implementation of a vision-based autonomous landing algorithm using a downward-looking camera. To demonstrate the efficacy of our algorithms, we emulate the dynamics of the ship deck, for various sea states and different ships, using a six-degrees-of-freedom motion platform. We then present the design and implementation of our robust computer vision system to measure the pose of the ship deck with respect to the vehicle. A Kalman filter is used in conjunction with our vision system to ensure the robustness of the estimates.
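The Kalman filter/vision fusion mentioned in the abstract can be sketched as follows. This is a minimal illustrative model, not the paper's implementation: it assumes a 1-D constant-velocity state, a scalar deck-position measurement from the vision system, and made-up noise values.

```python
import numpy as np

# Minimal 1-D constant-velocity Kalman filter sketch (hypothetical model):
# state x = [position, velocity]; the vision system supplies noisy
# position measurements of the ship deck relative to the vehicle.

dt = 0.05                               # assumed camera period (20 Hz)
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
H = np.array([[1.0, 0.0]])              # only position is measured
Q = 0.01 * np.eye(2)                    # process noise (assumption)
R = np.array([[0.25]])                  # vision measurement noise (assumption)

x = np.zeros((2, 1))                    # initial state
P = np.eye(2)                           # initial covariance

def kf_step(x, P, z):
    """One predict/update cycle with a scalar vision measurement z."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Feed a few simulated deck-position measurements
for z in [0.0, 0.1, 0.18, 0.31]:
    x, P = kf_step(x, P, np.array([[z]]))
```

The filter smooths the raw vision measurements, which is what makes the deck-pose estimate robust to per-frame detection noise.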

Visual Quadrotor Swarm for IMAV 2013 Indoor Competition

Conference name: 

Robot 2013: First Iberian Robotics Conference (ROBOT 2013)

Place: 

Madrid (Spain)

Date: 
Nov. 28-29, 2013

This paper presents a low-cost framework for visual quadrotor swarm prototyping which will be utilized to participate in the 2013 International Micro Air Vehicle Indoor Flight Competition. The testbed facilitates the swarm design problem by utilizing a cost-efficient quadrotor platform, the Parrot AR Drone 2.0; by using markers to simplify the visual localization problem; and by broadcasting the estimated location of the swarm members to obviate the partner detection problem.

3D Object following based on visual information for Unmanned Aerial Vehicles

Publication
Abstract—This article presents a novel system and a control strategy for visual following of a 3D moving object by an Unmanned Aerial Vehicle (UAV). The presented strategy is based only on the visual information given by an adaptive tracking method based on color information, which, jointly with the dynamics of a camera fixed to a rotary-wing UAV, is used to develop an Image-Based Visual Servoing (IBVS) system. This system is focused on continuously following a 3D moving target object, maintaining a fixed distance from it and keeping it centered on the image plane. The algorithm is validated in real flights in outdoor scenarios, showing the robustness of the proposed system against wind perturbations, illumination and weather changes, among others. The obtained results indicate that the proposed algorithm is suitable for complex control tasks, such as object following and pursuit or flying in formation, as well as for indoor navigation.
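As an illustration of the IBVS idea described in the abstract, the sketch below maps a color tracker's bounding box to velocity commands with proportional gains: pixel error from the image center drives yaw and climb, and the change in apparent size keeps a fixed distance. The resolution, gains and sign conventions are assumptions, not values from the paper.

```python
# Hedged IBVS sketch: the color tracker returns the target's bounding box;
# centering error and apparent size are mapped to velocity commands.
# All constants below are illustrative assumptions.

IMG_W, IMG_H = 640, 360          # assumed camera resolution
KP_YAW, KP_ALT, KP_FWD = 0.002, 0.003, 0.5
REF_AREA_RATIO = 0.05            # desired apparent size (keeps distance)

def ibvs_command(bbox):
    """bbox = (cx, cy, w, h) in pixels from the color tracker."""
    cx, cy, w, h = bbox
    err_x = cx - IMG_W / 2                      # horizontal pixel error
    err_y = cy - IMG_H / 2                      # vertical pixel error
    area_ratio = (w * h) / (IMG_W * IMG_H)      # apparent target size
    yaw_rate = -KP_YAW * err_x                  # rotate to center target
    climb_rate = -KP_ALT * err_y                # climb/descend to center it
    forward = KP_FWD * (REF_AREA_RATIO - area_ratio)  # keep fixed distance
    return yaw_rate, climb_rate, forward

# Target right of center and slightly small: yaw toward it, move forward.
yaw, climb, fwd = ibvs_command((400, 180, 80, 80))
```

Because only image-plane quantities appear in the control law, no explicit 3D position of the target is ever reconstructed, which is the defining property of an image-based (rather than position-based) visual servoing scheme.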
Conference name: 

The Latin American Robotics Competition (LARC), The Latin American Robotics Symposium (LARS), The Colombian Conference on Automatic Control (CCAC)

Place: 

Bogotá-Colombia

Date: 
Oct. 1-4, 2011


Omnidirectional bearing-only see-and-avoid for small aerial robots

Publication
Abstract—Motivated by the growing interest in unmanned aerial system applications in indoor and outdoor settings and the standardization of visual sensors as vehicle payload, this work presents a collision avoidance approach based on omnidirectional cameras that does not require estimating the range between two platforms to resolve a collision encounter. It guarantees a minimum separation between the two vehicles involved by maximising the view angle given by the omnidirectional sensor. Only visual information is used to achieve avoidance, under a bearing-only visual servoing approach. We provide the theoretical problem formulation, as well as results from simulations and real flights using small quadrotors.
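A toy version of the bearing-only rule described above: steer so that the intruder's bearing, as seen by the omnidirectional camera, is pushed toward ±90° from our heading, so the view angle grows and the intruder drifts to the side instead of staying ahead. The gain, target angle and sign convention are illustrative assumptions, not the paper's formulation.

```python
import math

# Hedged sketch of bearing-only avoidance: no range estimate is used,
# only the intruder's bearing. We command a turn rate that drives the
# view angle toward 90 degrees (intruder abeam). Constants are assumed.

K_TURN = 1.2                 # proportional steering gain (assumption)
TARGET_ANGLE = math.pi / 2   # desired |bearing|: intruder fully to the side

def avoidance_turn_rate(bearing):
    """bearing in radians: 0 = dead ahead, positive = to the right."""
    # Error between desired and current view angle, signed so that we
    # always turn away from the side the intruder is on.
    err = TARGET_ANGLE - abs(bearing)
    return -math.copysign(K_TURN * err, bearing)

# Intruder 10 degrees to the right -> turn left (negative rate)
rate = avoidance_turn_rate(math.radians(10.0))
```

Note that the command depends only on the bearing angle, which is exactly what an omnidirectional camera provides without any depth estimation.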
Conference name: 

ICARA 2011: The 5th IEEE International Conference on Automation, Robotics and Applications

Place: 

Wellington, New Zealand

Date: 
Dec. 6, 2011


"On-board and Ground Visual Pose Estimation Techniques for UAV Control"

Publication
In this paper, two techniques to control UAVs (Unmanned Aerial Vehicles), based on visual information are presented. The first one is based on the detection and tracking of planar structures from an on-board camera, while the second one is based on the detection and 3D reconstruction of the position of the UAV based on an external camera system. Both strategies are tested with a VTOL (Vertical take-off and landing) UAV, and results show good behavior of the visual systems (precision in the estimation and frame rate) when estimating the helicopter’s position and using the extracted information to control the UAV.
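The second technique above, recovering the UAV's 3D position from an external camera system, can be illustrated with linear (DLT) triangulation from two calibrated views. The projection matrices and pixel coordinates below are toy values for illustration; this is not the paper's exact method.

```python
import numpy as np

# Illustrative sketch (not the paper's implementation): given two
# calibrated external cameras with 3x4 projection matrices P1, P2 and the
# vehicle's detected image coordinates in each view, recover its 3D
# position by linear triangulation.

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one 3D point from two views."""
    u1, v1 = uv1
    u2, v2 = uv2
    # Each view contributes two linear constraints on the homogeneous point.
    A = np.stack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]              # null-space vector = homogeneous 3D point
    return X[:3] / X[3]     # dehomogenise

# Two toy cameras: one at the origin, one shifted 1 m along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X = triangulate(P1, P2, (0.0, 0.0), (-0.2, 0.0))   # -> approx (0, 0, 5)
```

With more ground cameras, each detection simply adds two more rows to the system, so the same least-squares machinery extends directly to a multi-camera setup.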
Journal: 
Journal of Intelligent & Robotic Systems

ISSN: 0921-0296 (Print), 1573-0409 (Online).

Paper reference: 
Volume 61, Issue 1-4, Pages 301-320. DOI: 10.1007/s10846-010-9505-9
Publication Date: 
March 2011


"Onboard and Ground Visual Pose Estimation Techniques for UAV Control"

Publication
Conference name: 

The 3rd International Symposium on Unmanned Aerial Vehicles (UAV’10).

Place: 

Dubai, United Arab Emirates.

Date: 
June 21-23, 2010.

Abstract: In this paper, two techniques to control UAVs (Unmanned Aerial Vehicles) based on visual information are presented. The first one is based on the detection and tracking of planar structures from an onboard camera, while the second one is based on the detection and 3D reconstruction of the position of the UAV based on an external camera system.

"Omnidirectional vision applied to Unmanned Aerial Vehicles (UAVs) attitude and heading estimation"

Journal: 
Robotics and Autonomous Systems

doi:10.1016/j.robot.2010.02.012

Publication Date: 
04/03/2010

 
