Visual Control

Computer Vision control for multi-rotor UAVs

The communication protocol establishes a link between a ground station laptop, the on-board Intel Atom computer, and the Pelican flight controller. The on-board computer receives commands over a WLAN, whether they originate from a ground-based source or from an on-board control algorithm. These commands are interpreted by the on-board computer and passed directly to the flight controller as navigation commands. The system therefore allows vision-based control algorithms to issue navigation commands to the Pelican according to the images received from an on-board camera.
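The relay role of the on-board computer can be sketched as follows. This is a minimal illustration, not the real AscTec Pelican protocol: the JSON message format, the binary frame layout, and the function names are all assumptions made for clarity.

```python
import json
import struct

def encode_nav_command(roll: float, pitch: float, yaw: float, thrust: float) -> bytes:
    """Pack a navigation command into a fixed binary frame for the
    flight-controller serial link (illustrative layout, not the real one)."""
    payload = struct.pack("<4f", roll, pitch, yaw, thrust)
    checksum = sum(payload) % 256  # toy 1-byte checksum
    return b">*>" + payload + bytes([checksum])

def relay(wlan_message: bytes) -> bytes:
    """Interpret a JSON command received over WLAN (from the ground
    station or an on-board vision algorithm) and translate it into
    the frame the flight controller expects."""
    cmd = json.loads(wlan_message)
    return encode_nav_command(cmd["roll"], cmd["pitch"], cmd["yaw"], cmd["thrust"])

frame = relay(b'{"roll": 0.0, "pitch": 0.1, "yaw": 0.0, "thrust": 0.5}')
```

In a real deployment the WLAN side would be a socket listener and the output would be written to the flight controller's serial port; the translation step in the middle is the part the paragraph above describes.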

Visual Control of UAVs using soft computing

...God said "let there be light" and a trillion evolving brain-eye systems began their journey toward intelligence... Flying is a wonderful resource and a difficult one too. Trying to control a scale helicopter on a windy day is as difficult as trying to balance a broomstick on one of your fingers on that same windy day. And yet there exist wonderful, neurally controlled small beings which have mastered the delights of flying since early creation. We speak, of course, of insects, the first to conquer our gas-filled atmosphere. In one sense, insects are bio-machines governed by finite neural machines that have self-learned to solve complex vision-to-control problems. We are thus motivated to follow this strategy and define self-learned neural controllers capable of flying real-world machines using real-world images.

Moving planar target following by vision system

The purpose of this research is to implement real-time following of a moving object from a UAV. A pan & tilt visual platform is used to keep the moving object in view in spite of the motion of the helicopter, vibrations, and perturbations from the environment.
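A common way to drive such a pan & tilt platform is a simple PD loop on the pixel error between the detected target and the image centre. The sketch below assumes the platform accepts angular-rate commands; the class name, gains, and image size are illustrative, not taken from the actual system.

```python
class PanTiltTracker:
    """PD visual servoing sketch: turn pixel error into pan/tilt rates."""

    def __init__(self, kp=0.002, kd=0.0005, image_size=(640, 480)):
        self.kp, self.kd = kp, kd                      # illustrative gains
        self.cx = image_size[0] / 2.0                  # image centre
        self.cy = image_size[1] / 2.0
        self.prev_err = (0.0, 0.0)

    def update(self, target_px, dt):
        """Return (pan_rate, tilt_rate) that drives the detected target
        toward the image centre; derivative term damps camera vibration."""
        ex = target_px[0] - self.cx
        ey = target_px[1] - self.cy
        dex = (ex - self.prev_err[0]) / dt
        dey = (ey - self.prev_err[1]) / dt
        self.prev_err = (ex, ey)
        return (self.kp * ex + self.kd * dex,
                self.kp * ey + self.kd * dey)

tracker = PanTiltTracker()
pan, tilt = tracker.update((320, 240), 0.05)  # target centred: zero command
```

The detector feeding `target_px` (e.g. colour or feature tracking on the on-board camera images) is independent of this loop, which is what lets the platform compensate helicopter motion regardless of how the target is found.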