Published in ACM Transactions on Graphics, Volume 34, Issue 4 (SIGGRAPH 2015)

Image Based Relighting Using Neural Networks

Rendering results

Relighting of various scenes using light transport captured by our method from a small number of images.

Abstract

We present a neural network regression method for relighting real-world scenes from a small number of images. Relighting in this work is formulated as the product of the scene’s light transport matrix and new lighting vectors, with the light transport matrix reconstructed from the input images. Based on the observation that the light transport matrix exhibits non-linear local coherence, our method approximates matrix segments using neural networks that model light transport as a non-linear function of light source position and pixel coordinates. Central to this approach is a proposed neural network design that incorporates various elements to facilitate modeling of light transport from a small image set. In contrast to most image based relighting techniques, this regression-based approach allows input images to be captured under arbitrary illumination conditions, including light sources moved freely by hand. We validate our method on light transport data of real scenes containing complex lighting effects, and demonstrate that fewer input images are required in comparison to related techniques.
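The relighting formulation above reduces to a matrix-vector product once the light transport matrix is reconstructed. The following is a minimal sketch of that step, not the authors' code; the image size, number of light sources, and the random stand-in transport data are all illustrative assumptions:

```python
import numpy as np

# Hypothetical dimensions: a 64x64 image lit by 16 light sources.
num_pixels, num_lights = 64 * 64, 16

# T is the light transport matrix: T[p, s] is the contribution of
# light source s to pixel p. Here it is filled with random stand-in
# data; in the paper it is regressed from the input images.
rng = np.random.default_rng(0)
T = rng.random((num_pixels, num_lights))

# A new lighting condition is a vector of per-source intensities.
lighting = np.zeros(num_lights)
lighting[3] = 1.0   # turn on one source at full intensity
lighting[7] = 0.5   # and another at half intensity

# Relighting is the product of the transport matrix and the
# lighting vector; by linearity of light transport, the result is
# the corresponding weighted sum of single-source images.
relit = T @ lighting
image = relit.reshape(64, 64)
```

Because light transport is linear in the illumination, the relit image equals the sum of the individual source images weighted by their intensities; the non-linearity the neural networks model is in how transport varies with light position and pixel coordinates, not in the relighting product itself.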

Keywords

image based relighting, light transport, neural network, clustering

Downloads

Acknowledgements

The authors thank Zheng ZHANG, Jiaxing ZHANG, Dong YU, and Zhiheng HUANG for insightful discussions on deep neural networks, Yi MA, Gong CHENG, and Jinyu LI for discussions on robust PCA, and Yang LIU for discussions on non-linear optimization. The authors also thank the anonymous reviewers for their helpful suggestions and comments. The Waldorf and Bull scenes are from the public data shared by Matthew O’TOOLE and Kiriakos N. KUTULAKOS.