
Real-Time Simultaneous Localisation and Mapping with a Single Camera

Andrew Davison

Conference or Workshop Paper
IEEE International Conference on Computer Vision
October, 2003
pp.1403–1410
IEEE
ISBN 0-7695-1950-4
Abstract

Ego-motion estimation for an agile single camera moving through general, unknown scenes becomes a much more challenging problem when real-time performance is required rather than under the off-line processing conditions under which most successful structure from motion work has been achieved. This task of estimating camera motion from measurements of a continuously expanding set of self-mapped visual features is one of a class of problems known as Simultaneous Localisation and Mapping (SLAM) in the robotics community, and we argue that such real-time mapping research, despite rarely being camera-based, is more relevant here than off-line structure from motion methods due to the more fundamental emphasis placed on propagation of uncertainty.
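The emphasis on propagation of uncertainty can be illustrated with a minimal Kalman-filter prediction step. This is a hedged sketch, not the paper's implementation: the 1D constant-velocity state, the matrices `F`, `G`, `Q`, the 30 Hz timestep, and the noise level are all illustrative assumptions. The point is that between measurements the filter explicitly grows its covariance, so it always knows how uncertain its estimate is.

```python
import numpy as np

def ekf_predict(x, P, dt=1.0 / 30, accel_noise=1.0):
    """Constant-velocity motion model: predict the state and grow covariance."""
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])           # state transition: position += velocity * dt
    G = np.array([[0.5 * dt**2], [dt]])  # how acceleration noise enters the state
    Q = G @ G.T * accel_noise**2         # process noise covariance
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q             # uncertainty is explicitly propagated
    return x_pred, P_pred

x = np.array([0.0, 1.0])  # at the origin, moving at 1 unit/s
P = np.eye(2) * 0.01      # small initial uncertainty
for _ in range(30):       # one second of prediction with no measurements
    x, P = ekf_predict(x, P)
# The mean drifts forward as predicted, while the covariance trace grows:
# without measurements, the filter becomes progressively less certain.
```

In a full SLAM filter the state additionally contains the map features, and the joint covariance couples camera and feature uncertainty, which is exactly what off-line structure from motion pipelines do not maintain incrementally.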

We present a top-down Bayesian framework for single-camera localisation via mapping of a sparse set of natural features using motion modelling and an information-guided active measurement strategy, in particular addressing the difficult issue of real-time feature initialisation via a factored sampling approach. Real-time handling of uncertainty permits robust localisation via the creation and active measurement of a sparse map of landmarks, such that regions can be re-visited after periods of neglect and localisation can continue through periods when few features are visible. Results are presented of real-time localisation for a hand-waved camera with very sparse prior scene knowledge and all processing carried out on a desktop PC.
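The factored sampling idea behind real-time feature initialisation can be sketched as a 1D particle filter over depth: a new feature's unknown depth along its viewing ray is represented by a set of hypotheses, each reweighted by how well it predicts subsequent image measurements. This is a toy illustration under stated assumptions, not the paper's implementation: the projection model, baseline, noise levels, and particle counts are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

true_depth = 2.0                              # ground-truth depth of the new feature
depths = rng.uniform(0.5, 5.0, size=100)      # uniform depth hypotheses along the ray
weights = np.full(100, 1.0 / 100)             # start with an uninformative distribution

def predicted_measurement(depth, baseline=0.1):
    """Toy projection: image displacement after a small sideways camera motion.
    Parallax shrinks with depth, so each hypothesis predicts a different shift."""
    return baseline / depth

for _ in range(10):                           # ten frames of observations
    z = predicted_measurement(true_depth) + rng.normal(0, 0.001)
    residual = z - predicted_measurement(depths)
    likelihood = np.exp(-0.5 * (residual / 0.001) ** 2)
    weights *= likelihood                     # Bayesian reweighting of each hypothesis
    weights /= weights.sum()                  # renormalise

estimate = np.sum(weights * depths)           # posterior mean depth
# After a few frames the distribution collapses near the true depth; at that
# point the feature could be handed over to the main Gaussian map as a fully
# initialised landmark.
```

The design point this captures is why sampling helps: a single observation constrains a feature's bearing but not its depth, so the depth distribution is highly non-Gaussian at first and only becomes well-approximated by a Gaussian once parallax has accumulated.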

PDF of full publication (403 kilobytes)
BibTeX file for the publication