Damien Lefloch1, Tim Weyrich2, Andreas Kolb1
1 University of Siegen
2 University College London
We propose a new real-time framework that efficiently reconstructs large-scale scenes by accumulating anisotropic point representations, combined with a memory-efficient representation of point attributes. The reduced memory footprint allows us to store additional point properties that represent the accumulated anisotropic noise of the input range data in the reconstructed scene. We propose an efficient processing scheme for the extended and compressed point attributes that does not obstruct real-time reconstruction. Furthermore, we evaluate the positive impact of the anisotropy handling on data accumulation and 3D reconstruction quality.
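To illustrate the general idea of accumulating points with anisotropic noise, the following is a minimal sketch (not the paper's actual accumulation or compression scheme) of covariance-weighted, information-form fusion of a stored surfel with a new anisotropic range measurement, assuming the Eigen library; the `Surfel` struct and `fuse` function are hypothetical names used only for this example.

```cpp
#include <Eigen/Dense>
#include <iostream>

// Hypothetical surfel carrying a full 3x3 position covariance,
// i.e. an anisotropic noise estimate accumulated over time.
struct Surfel {
    Eigen::Vector3f position;
    Eigen::Matrix3f covariance;
};

// Covariance-weighted fusion (information-filter form): the fused
// covariance is the "parallel combination" of both covariances, and the
// fused position weights each input by its inverse covariance.
Surfel fuse(const Surfel& model, const Surfel& measurement) {
    const Eigen::Matrix3f infoModel = model.covariance.inverse();
    const Eigen::Matrix3f infoMeas  = measurement.covariance.inverse();

    Surfel fused;
    fused.covariance = (infoModel + infoMeas).inverse();
    fused.position   = fused.covariance *
                       (infoModel * model.position +
                        infoMeas  * measurement.position);
    return fused;
}

int main() {
    // Stored model point: noise elongated along the viewing (z) axis,
    // as is typical for depth sensors.
    Surfel model;
    model.position   = Eigen::Vector3f(0.0f, 0.0f, 1.00f);
    model.covariance = Eigen::Vector3f(1e-4f, 1e-4f, 4e-3f).asDiagonal();

    // New measurement of the same surface point.
    Surfel meas;
    meas.position   = Eigen::Vector3f(0.0f, 0.0f, 1.02f);
    meas.covariance = Eigen::Vector3f(1e-4f, 1e-4f, 4e-3f).asDiagonal();

    Surfel fused = fuse(model, meas);
    std::cout << "fused z:       " << fused.position.z()      << "\n"; // ~1.01
    std::cout << "fused z var:   " << fused.covariance(2, 2)  << "\n"; // halved
    return 0;
}
```

With equal input covariances the fused position is simply the midpoint and the variance along each axis is halved; with unequal, anisotropic covariances the update shifts the point more strongly along directions in which the new measurement is more reliable.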
Damien Lefloch, Tim Weyrich, and Andreas Kolb. Anisotropic point-based fusion. In Proceedings of the International Conference on Information Fusion (FUSION), pages 1–9, Washington, D.C., USA, ISIF, July 2015.
This work was funded by the German Research Foundation (DFG) as part of the research training group GRK 1564 Imaging New Modalities, and by the UK Engineering and Physical Sciences Research Council (grant EP/K023578/1).