Image-space Adaptive Sampling for Fast Inverse Rendering

Abstract

Inverse rendering is crucial for many scientific and engineering disciplines. Recent progress in differentiable rendering enables efficient differentiation of the full image-formation process with respect to scene parameters, allowing gradient-based optimization. However, the computational demands remain a significant challenge, particularly when all pixels of high-resolution or multi-view images must be rendered, which makes each optimization iteration slow. Meanwhile, naively reducing the sampling budget by uniformly sampling the pixels rendered in each iteration can cause high gradient variance and ultimately degrade overall performance. Our goal is to accelerate inverse rendering by reducing the sampling budget without sacrificing quality. In this paper, we introduce a novel image-space adaptive sampling framework that accelerates inverse rendering by dynamically adjusting per-pixel sampling probabilities based on gradient variance and contribution to the loss function. Our approach efficiently handles high-resolution images and complex scenes, converging faster and achieving better results than uniform sampling, making it a robust solution for efficient inverse rendering.
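The sketch below illustrates, in Python, one way an image-space adaptive pixel-sampling loop of this kind could be structured; it is not the paper's implementation. Per-pixel sampling probabilities are built from running statistics of each pixel's loss contribution and a gradient-magnitude proxy, and sampled pixels are reweighted by 1/(Np) so the stochastic loss estimate stays unbiased. The renderer interface (`render_pixels`), the residual-based gradient proxy, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch (assumptions noted above) of image-space adaptive pixel sampling
# for inverse rendering: sample a small pixel subset per iteration with
# probabilities driven by per-pixel loss and gradient statistics.
import numpy as np

H, W = 64, 64                      # toy image resolution
n_pixels = H * W
budget = 512                       # pixels rendered per iteration (<< n_pixels)

loss_stats = np.ones(n_pixels)     # running per-pixel loss contribution
grad_stats = np.ones(n_pixels)     # running gradient-magnitude proxy
momentum = 0.9                     # smoothing factor for the running statistics
uniform_mix = 0.1                  # defensive mixing with uniform sampling

def sampling_probabilities():
    """Combine the two statistics into a normalized pixel distribution."""
    score = loss_stats * np.sqrt(grad_stats)
    p = score / score.sum()
    # Mix with a uniform distribution so every pixel keeps nonzero probability.
    return (1.0 - uniform_mix) * p + uniform_mix / n_pixels

def training_step(render_pixels, target):
    """One stochastic inverse-rendering step over a subset of pixels.

    `render_pixels(idx)` stands in for a differentiable renderer returning the
    rendered values of the requested pixels; `target` is the reference image
    flattened to shape (n_pixels,).
    """
    global loss_stats, grad_stats
    p = sampling_probabilities()
    idx = np.random.choice(n_pixels, size=budget, replace=True, p=p)

    rendered = render_pixels(idx)                  # differentiable in practice
    residual = rendered - target[idx]
    per_pixel_loss = residual ** 2

    # Importance weights 1/(N * p) make this an unbiased estimate of the
    # mean per-pixel loss over the full image.
    weights = 1.0 / (n_pixels * p[idx])
    loss = np.mean(weights * per_pixel_loss)

    # Refresh running statistics for the sampled pixels; |residual| serves as a
    # cheap gradient-magnitude proxy in this sketch.
    loss_stats[idx] = momentum * loss_stats[idx] + (1 - momentum) * per_pixel_loss
    grad_stats[idx] = momentum * grad_stats[idx] + (1 - momentum) * np.abs(residual)
    return loss

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    target = rng.random(n_pixels)
    # Dummy "renderer": the current scene reproduces the target plus noise.
    dummy_render = lambda idx: target[idx] + 0.1 * rng.standard_normal(idx.size)
    for it in range(5):
        print(f"iter {it}: stochastic loss = {training_step(dummy_render, target):.4f}")
```

Two design choices in this sketch mirror the general idea: mixing with a uniform distribution keeps every pixel's probability nonzero so stale statistics can still be refreshed, and the 1/(Np) reweighting ensures that concentrating samples on high-variance, high-loss pixels does not bias the optimization objective.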

Publication
Proceedings of the Special Interest Group on Computer Graphics and Interactive Techniques (SIGGRAPH) Conference Papers
Kai Yan
Ph.D. Candidate in Computer Science at UC Irvine
Cheng Zhang
Research Scientist at Meta Reality Labs
Sébastien Speierer
Senior Research Engineer at Meta Reality Labs Research
Yufeng Zhu
Research Engineer at Meta Reality Labs Research
Zhao Dong
Senior Research Lead & Manager at Meta Reality Labs Research
Shuang Zhao
Associate Professor of Computing and Data Science at the University of Illinois Urbana-Champaign