Neural BTF Compression and Interpolation

Gilles Rainer (1),  Wenzel Jakob (2),  Abhijeet Ghosh (3),  Tim Weyrich (1)

(1) University College London
(2) École Polytechnique Fédérale de Lausanne (EPFL)
(3) Imperial College London

Abstract

The Bidirectional Texture Function (BTF) is a data-driven solution for rendering materials with complex appearance. A typical capture contains tens of thousands of images of a material sample under varying viewing and lighting conditions. While BTFs faithfully record complex light interactions in the material, their main drawback is the massive memory requirement, both for storage and rendering, which makes effective compression of BTF data a critical component in practical applications. Common compression schemes used in practice are based on matrix factorization techniques, which preserve the discrete format of the original dataset. While this approach generalizes well to different materials, rendering with the compressed dataset still relies on interpolating between the closest samples. Depending on the material and the angular resolution of the BTF, this can lead to blurring and ghosting artefacts. An alternative approach uses analytic model fitting to approximate the BTF data, using continuous functions that naturally interpolate well, but whose expressive range is often not wide enough to faithfully recreate materials with complex non-local lighting effects (subsurface scattering, inter-reflections, shadowing and masking, etc.). In light of these observations, we propose a neural network-based BTF representation inspired by autoencoders: our encoder compresses each texel to a small set of latent coefficients, while our decoder additionally takes in a light and view direction and outputs a single RGB vector at a time. This allows us to continuously query reflectance values in the light and view hemispheres, eliminating the need for linear interpolation between discrete samples. We train our architecture on fabric BTFs with challenging appearance and compare against standard PCA compression as a baseline. We achieve competitive compression ratios and high-quality interpolation/extrapolation without blurring or ghosting artefacts.
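To make the representation above concrete, the decoder can be pictured as a small fully-connected network that takes a texel's latent coefficients together with a light and a view direction and returns one RGB value. The sketch below, written in PyTorch, is only illustrative: the latent dimension, layer widths, activations, and the 2D parameterization of directions are assumptions, not the exact architecture or values from the paper.

    import torch
    import torch.nn as nn

    class BTFDecoder(nn.Module):
        # Maps a per-texel latent code plus a light/view direction pair to one RGB value.
        # Latent dimension and layer widths are illustrative placeholders.
        def __init__(self, latent_dim=8, hidden=64):
            super().__init__()
            # Input: latent code + 2D light direction + 2D view direction
            self.net = nn.Sequential(
                nn.Linear(latent_dim + 4, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU(),
                nn.Linear(hidden, 3),  # single RGB reflectance value
            )

        def forward(self, latent, wi, wo):
            # latent: (N, latent_dim); wi, wo: (N, 2) directions projected onto the tangent plane
            return self.net(torch.cat([latent, wi, wo], dim=-1))

    # Query a texel at an arbitrary light/view configuration, with no interpolation
    # between captured samples:
    decoder = BTFDecoder()
    latent = torch.randn(1, 8)           # latent coefficients produced by the encoder
    wi = torch.tensor([[0.10, 0.20]])    # light direction
    wo = torch.tensor([[-0.30, 0.05]])   # view direction
    rgb = decoder(latent, wi, wo)        # shape (1, 3)

Because the decoder is a continuous function of its direction inputs, reflectance can be evaluated at any light/view configuration directly, which is what removes the need to blend between the nearest captured samples.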

Citation

Neural BTF Compression and Interpolation.
Gilles Rainer, Wenzel Jakob, Abhijeet Ghosh, Tim Weyrich.
Computer Graphics Forum (Proc. Eurographics), 38(2), pp. 235–244, 2019.

Acknowledgments

We would like to thank Change of Paradigm Ltd. for supporting this work, and Reinhard Klein and his team for providing data and helping with comparisons. We would also like to acknowledge the EPSRC Early Career Fellowship EP/N006259/1, and the EPSRC grant EP/K023578/1.

