Real-time 3D Reconstruction in Dynamic Scenes using Point-based Fusion

Maik Keller1,  Damien Lefloch2,  Martin Lambers2,  Shahram Izadi3,  Tim Weyrich4,  Andreas Kolb2

1 pmdtechnologies
2 University of Siegen
3 Microsoft Research
4 University College London

Abstract

Real-time or online 3D reconstruction has wide applicability and has received increased interest due to the availability of consumer depth cameras. Typical approaches use a moving sensor to accumulate depth measurements into a single, continuously refined model. Designing such systems involves an intricate balance between reconstruction quality, speed, spatial scale, and scene assumptions. Existing online methods either trade scale to achieve higher-quality reconstructions of small objects/scenes, or handle larger scenes by trading real-time performance and/or quality, or by limiting the bounds of the active reconstruction. Additionally, many systems assume a static scene and cannot robustly handle scene motion or reconstructions that evolve to reflect scene changes. We address these limitations with a new system for real-time dense reconstruction that matches the quality of existing online methods while supporting additional spatial scale and robustness in dynamic scenes. Our system is designed around a simple and flat point-based representation, which works directly with the input acquired from range/depth sensors, without the overhead of converting between representations. The use of points enables speed and memory efficiency, directly leveraging the standard graphics pipeline for all central operations, i.e., camera pose estimation, data association, outlier removal, fusion of depth maps into a single denoised model, and detection and update of dynamic objects. We conclude with qualitative and quantitative results that highlight robust tracking and high-quality reconstructions of a diverse set of scenes at varying scales.
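The fusion step the abstract describes can be illustrated with a minimal sketch: each model point (surfel) carries a position, a normal, and an accumulated confidence, and new depth measurements are merged in by a confidence-weighted running average. The class, function names, and weighting scheme below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

class Surfel:
    """A single model point with position, normal, and accumulated confidence.
    (Hypothetical structure for illustration; the paper's surfels also carry
    attributes such as radius and timestamp.)"""
    def __init__(self, position, normal, confidence=1.0):
        self.position = np.asarray(position, dtype=float)
        self.normal = np.asarray(normal, dtype=float)
        self.confidence = float(confidence)

def fuse(surfel, measured_position, measured_normal, weight):
    """Merge one associated depth measurement into an existing surfel via a
    confidence-weighted running average, which denoises the model over time."""
    c = surfel.confidence
    p = np.asarray(measured_position, dtype=float)
    n = np.asarray(measured_normal, dtype=float)
    # Weighted average of positions; normals are averaged then renormalized.
    surfel.position = (c * surfel.position + weight * p) / (c + weight)
    blended_normal = c * surfel.normal + weight * n
    surfel.normal = blended_normal / np.linalg.norm(blended_normal)
    # Accumulated confidence grows with each consistent observation.
    surfel.confidence = c + weight
    return surfel

# Example: a surfel at depth 1.0 m fused with a measurement at 1.2 m of equal
# weight moves to the midpoint-weighted depth of 1.1 m.
s = Surfel(position=[0.0, 0.0, 1.0], normal=[0.0, 0.0, 1.0])
fuse(s, measured_position=[0.0, 0.0, 1.2], measured_normal=[0.0, 0.0, 1.0], weight=1.0)
```

In the full system, the same per-point update also drives outlier removal and dynamics handling: points whose confidence stays low, or that conflict with new observations, can be demoted or removed rather than fused.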

Publication

Real-time 3D Reconstruction in Dynamic Scenes using Point-based Fusion.
Maik Keller, Damien Lefloch, Martin Lambers, Shahram Izadi, Tim Weyrich, Andreas Kolb.
In Proc. of Joint 3DIM/3DPVT Conference (3DV), 8 pages, Seattle, USA, June 2013.
Selected for oral presentation.

Acknowledgments

This research has partly been funded by the German Research Foundation (DFG), grant GRK-1564 Imaging New Modalities, and by the FP7 EU collaborative project BEAMING (248620). We thank Jens Orthmann for his work on the GPU framework osgCompute.

