Vision-based Landing of an Unmanned Aerial Vehicle

Omid Shakernia, Hoam Chung, David Shim1, and Ron Tal2
(Professor S. Shankar Sastry)
(ONR) N00014-00-1-0621 and (ONR) N00014-97-1-0946

Computer vision is gaining importance as a cheap, passive, and information-rich source that complements the sensor suite for control of unmanned aerial vehicles (UAVs). A vision system on board a UAV typically augments a suite that may include a Global Positioning System (GPS), an inertial navigation system (INS), laser range finders, a digital compass, and sonar. Because of its structured nature, the task of autonomous landing is well suited to vision-based state estimation and control, and it has recently been an active topic of research.

In our research [1], we have designed and implemented a real-time vision system for a rotorcraft unmanned aerial vehicle that estimates its state relative to a known landing target at 30 Hz. The vision system consists of off-the-shelf hardware and performs image processing, segmentation, feature point extraction, camera control, and both linear and nonlinear optimization for model-based state estimation. Flight test results show that our vision-based state estimates are accurate to within 5 cm in each axis of translation and 5 degrees in each axis of rotation, making the vision system a viable sensor to place in the control loop of a hierarchical flight vehicle management system.
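To make the model-based estimation step concrete, the following is a minimal sketch (not the implementation flown in [1]) of a linear pose-estimation stage: it assumes calibrated, normalized image coordinates of feature points on the known planar landing target, fits a homography by the standard DLT method, and decomposes it into the rotation and translation of the target in the camera frame. The function names are hypothetical.

```python
import numpy as np

def estimate_homography(target_xy, image_xy):
    """DLT fit of the 3x3 homography mapping planar target coordinates
    (X, Y) to normalized image coordinates (u, v); needs >= 4 points."""
    A = []
    for (X, Y), (u, v) in zip(target_xy, image_xy):
        A.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        A.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    # Fix the overall sign so the target lies in front of the camera.
    return H if H[2, 2] > 0 else -H

def pose_from_homography(H):
    """For a calibrated camera viewing the target plane Z = 0, H is
    proportional to [r1 r2 t], where r1, r2 are the first two columns
    of the rotation R and t is the translation."""
    scale = 1.0 / np.linalg.norm(H[:, 0])
    r1, r2 = scale * H[:, 0], scale * H[:, 1]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    # Project onto SO(3) to absorb noise in the linear estimate.
    U, _, Vt = np.linalg.svd(R)
    R = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    t = scale * H[:, 2]
    return R, t
```

With at least four non-collinear target points, pose_from_homography(estimate_homography(target_xy, image_xy)) yields a linear pose estimate that a nonlinear optimization stage could then refine.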

More recently [2], we developed a multiple view algorithm for vision-based landing of an unmanned aerial vehicle. The algorithm builds on our recent results in multiple view geometry that exploit the rank deficiency of the so-called multiple view matrix. We show how the use of multiple views significantly improves motion and structure estimation, and we compare the algorithm to our previous linear and nonlinear two-view algorithms on data from an actual flight test.
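For reference, the rank condition being exploited can be sketched as follows, in the standard notation of the multiple view geometry literature (the details of how [2] uses it are in the paper). For a point with calibrated image x_1 in the first view and x_i in view i, where (R_i, T_i) is the motion of view i relative to the first and a hat denotes the skew-symmetric matrix of a vector:

```latex
M_p =
\begin{bmatrix}
\widehat{x_2}\,R_2\,x_1 & \widehat{x_2}\,T_2 \\
\widehat{x_3}\,R_3\,x_1 & \widehat{x_3}\,T_3 \\
\vdots                  & \vdots             \\
\widehat{x_m}\,R_m\,x_1 & \widehat{x_m}\,T_m
\end{bmatrix}
\in \mathbb{R}^{3(m-1)\times 2},
\qquad
\operatorname{rank}(M_p) \le 1 .
% The two columns are linearly dependent because the perspective
% projection equations give M_p [\lambda_1, 1]^T = 0, where \lambda_1
% is the depth of the point in the first view; stacking all m-1 views
% into one matrix lets every view constrain a single depth.
```

The rank condition couples all m views of a feature point at once, rather than pairwise, which is the source of the improvement over two-view estimation.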

Furthermore, we have conducted vision-in-the-loop flight experiments, using the vision system described above to successfully track a pitching landing deck that simulates the motion of a ship in 2-meter waves and 5-knot winds.


Figure 1: Berkeley UAV test-bed with on-board vision system: Yamaha R-50 helicopter, with pan/tilt camera and computer box, hovering above a pitching landing platform

Figure 2: Berkeley UAV test-bed with on-board vision system: Yamaha R-50 helicopter tracking a pitching landing platform; the foreground shows the camera's view of the landing target

Figure 3: Vision monitoring station

[1] C. S. Sharp, O. Shakernia, and S. S. Sastry, "A Vision System for Landing an Unmanned Aerial Vehicle," Proc. IEEE Int. Conf. on Robotics and Automation, Seoul, Korea, May 2001.
[2] O. Shakernia, C. S. Sharp, R. Vidal, Y. Ma, and S. S. Sastry, "Multiple View Motion Estimation and Control for Landing an Unmanned Aerial Vehicle," Proc. IEEE Int. Conf. on Robotics and Automation, Washington, DC, May 2002.
1Outside Adviser (non-EECS)
2Staff

More information: http://robotics.eecs.berkeley.edu/~omids

Send mail to the author: omids@eecs.berkeley.edu

