3D Immersive Visual Maps


In this work we have produced immersive 3D maps for real-time localisation and autonomous navigation. In particular, we have developed a method and apparatus for building dense visual maps of large-scale environments. We propose a spherical ego-centric representation of the environment that can reproduce photo-realistic omnidirectional views of the captured scenes. This representation consists of a graph of locally accurate augmented spherical panoramas, from which varying viewpoints can be generated through novel view synthesis. The spheres are linked by a graph of 6-d.o.f. poses estimated through multi-view spherical registration. We show that this representation can be used to accurately localise a vehicle navigating within the spherical graph using only a monocular camera. To perform this task, an efficient direct image registration technique is employed, which directly exploits the advantages of the spherical representation by minimising a photometric error between the current image and a reference sphere. Autonomous navigation results are shown in challenging urban environments containing pedestrians and other vehicles.
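To give a flavour of the direct photometric registration idea, the following is a minimal sketch, not the authors' implementation: the full method estimates a 6-d.o.f. pose against a reference sphere, whereas this toy reduces the warp to a 2D image translation and minimises the same kind of photometric error with Gauss-Newton. All function names and the Gaussian test images are illustrative assumptions.

```python
import numpy as np

def make_image(tx=0.0, ty=0.0, size=64):
    # Synthetic test image: a smooth Gaussian blob, optionally shifted by (tx, ty).
    ys, xs = np.mgrid[0:size, 0:size].astype(float)
    return np.exp(-(((xs - tx) - size / 2) ** 2 + ((ys - ty) - size / 2) ** 2) / 80.0)

def photometric_gauss_newton(I_ref, I_cur, iters=50):
    """Estimate a 2D translation p = (tx, ty) aligning I_cur to I_ref by
    minimising the photometric error sum_x (I_cur(x + p) - I_ref(x))^2.
    Toy stand-in for the paper's 6-d.o.f. spherical registration."""
    H, W = I_ref.shape
    ys, xs = np.mgrid[1:H - 1, 1:W - 1]
    # Jacobian of the residual w.r.t. p, approximated by the reference
    # image gradients (valid near convergence, as in inverse approaches).
    gy, gx = np.gradient(I_ref)
    J = np.stack([gx[1:-1, 1:-1].ravel(), gy[1:-1, 1:-1].ravel()], axis=1)
    p = np.zeros(2)
    for _ in range(iters):
        # Bilinear sampling of the current image at the warped coordinates.
        xw, yw = xs + p[0], ys + p[1]
        x0 = np.clip(np.floor(xw).astype(int), 0, W - 2)
        y0 = np.clip(np.floor(yw).astype(int), 0, H - 2)
        ax = np.clip(xw - x0, 0.0, 1.0)
        ay = np.clip(yw - y0, 0.0, 1.0)
        Iw = ((1 - ax) * (1 - ay) * I_cur[y0, x0] + ax * (1 - ay) * I_cur[y0, x0 + 1]
              + (1 - ax) * ay * I_cur[y0 + 1, x0] + ax * ay * I_cur[y0 + 1, x0 + 1])
        # Photometric residual and Gauss-Newton update.
        r = (Iw - I_ref[1:-1, 1:-1]).ravel()
        dp = np.linalg.lstsq(J, -r, rcond=None)[0]
        p = p + dp
        if np.linalg.norm(dp) < 1e-8:
            break
    return p

I_ref = make_image()
I_cur = make_image(1.3, -0.8)       # current view shifted by (1.3, -0.8) pixels
p = photometric_gauss_newton(I_ref, I_cur)  # recovers approximately (1.3, -0.8)
```

In the actual system the warp is a full rigid-body transformation of the augmented sphere (using its depth information), but the structure of the solver is the same: warp, residual, linearise, update.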

Example of an augmented spherical panorama obtained using a multi-baseline spherical camera:





This research was carried out in collaboration with the Arobas team of INRIA Sophia-Antipolis. The related publication can be found here:

Meilland, M., Rives, P. & Comport, A. I. (2012). Dense RGB-D mapping for real-time localisation and navigation. In IV12 Workshop on Navigation, Positioning and Mapping. Alcalá de Henares, Spain.