3D High Dynamic Range Dense Visual SLAM and Its Application to Real-time Object Re-lighting

Acquiring High Dynamic Range (HDR) light-fields from several images with different exposures (sensor integration periods) has been widely studied for static camera positions. In this paper, a new approach is proposed that acquires 3D HDR environment maps directly from a dynamic set of images in real-time. In particular, a method is proposed that uses an RGB-D camera as a dynamic light-field sensor, based on a dense real-time 3D tracking and mapping approach, avoiding the need for a light-probe or the observation of reflective surfaces. The 6-DOF pose and dense scene structure are estimated simultaneously with the observed dynamic range, so as to compute the radiance map of the scene and fuse a stream of low dynamic range (LDR) images into an HDR image. This map is then used to create an arbitrary number of virtual omni-directional light-probes, placed at the positions where virtual augmented objects are to be rendered. In addition, a solution is provided to the problem of automatic shutter variations in visual SLAM. Augmented reality results demonstrate real-time 3D HDR mapping, virtual light-probe synthesis and light-source detection for rendering reflective objects with shadows seamlessly into the real video stream.
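The core LDR-to-HDR fusion step described above can be sketched as follows. This is a minimal illustration only, assuming a linear camera response and the classic hat-shaped weighting that down-weights under- and over-exposed pixels; the function name and interface are hypothetical and do not reproduce the authors' dense SLAM pipeline, which fuses registered frames with estimated exposures.

```python
import numpy as np

def fuse_ldr_to_hdr(images, exposures, eps=1e-6):
    """Fuse LDR frames (float arrays in [0, 1]) with known exposure
    times into a single radiance map.

    Assumes a linear camera response: a pixel value Z observed with
    exposure t gives the radiance estimate E = Z / t. Each estimate is
    weighted by a hat function that trusts mid-tones and discards
    saturated or under-exposed pixels (hypothetical sketch, not the
    paper's implementation)."""
    num = np.zeros_like(images[0], dtype=np.float64)
    den = np.zeros_like(images[0], dtype=np.float64)
    for img, t in zip(images, exposures):
        w = 1.0 - np.abs(2.0 * img - 1.0)  # hat weight: 1 at 0.5, 0 at 0 and 1
        num += w * (img / t)               # exposure-normalized radiance
        den += w
    return num / (den + eps)               # weighted average (eps avoids 0/0)
```

For example, a bright region that saturates in a long exposure still receives a valid radiance estimate from a shorter exposure, which is exactly why a stream of differently exposed frames can recover the full scene radiance.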

Some videos of this work can be found here:

Related publications:

3D High Dynamic Range Dense Visual SLAM and Its Application to Real-time Object Re-lighting, Maxime Meilland, Christian Barat, Andrew I. Comport, International Symposium on Mixed and Augmented Reality (ISMAR), Adelaide, Australia, 2013. BibTeX