Differentiable time-gated rendering

Dec 1, 2021
Lifan Wu, Guangyan Cai, Ravi Ramamoorthi, Shuang Zhao
Abstract
The continued advancements of time-of-flight imaging devices have enabled new imaging pipelines with numerous applications. Consequently, several forward rendering techniques capable of accurately and efficiently simulating these devices have been introduced. However, general-purpose differentiable rendering techniques that estimate derivatives of time-of-flight images are still lacking. In this paper, we introduce a new theory of differentiable time-gated rendering that enjoys the generality of differentiating with respect to arbitrary scene parameters. Our theory also allows the design of advanced Monte Carlo estimators capable of handling cameras with near-delta or discontinuous time gates. We validate our theory by comparing derivatives generated with our technique and finite differences. Further, we demonstrate the usefulness of our technique using a few proof-of-concept inverse-rendering examples that simulate several time-of-flight imaging scenarios.
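To make the notion of differentiating a time-gated measurement concrete, the sketch below is a hypothetical toy illustration (not the paper's estimator) written in JAX. It models a single scene parameter, the distance d to a diffuse reflector, weights the returned intensity by a narrow Gaussian time gate standing in for a near-delta gate, and checks the derivative obtained by automatic differentiation against a central finite difference, mirroring the validation strategy described in the abstract. The gate center, gate width, and inverse-square falloff model are all assumptions chosen for readability.

```python
# Hypothetical toy example (not the paper's method): a single-bounce,
# time-gated "measurement" of a diffuse reflector at distance d, with a
# narrow Gaussian gate standing in for a near-delta time gate.
import jax
import jax.numpy as jnp

C = 1.0            # speed of light in scene units (assumption)
GATE_CENTER = 4.0  # gate opens around this time of flight (assumption)
GATE_WIDTH = 0.05  # narrow gate -> near-delta behavior (assumption)

def gate(t):
    """Smooth stand-in for a near-delta time gate."""
    return jnp.exp(-0.5 * ((t - GATE_CENTER) / GATE_WIDTH) ** 2)

def measurement(d, albedo=0.7):
    """Toy time-gated intensity: inverse-square falloff times the gate
    evaluated at the round-trip time of flight 2d/c."""
    tof = 2.0 * d / C
    return albedo / (d * d) * gate(tof)

# Derivative of the time-gated measurement w.r.t. the scene parameter d,
# computed by automatic differentiation.
d_measurement = jax.grad(measurement)

d0 = 2.01
auto = d_measurement(d0)

# Central finite difference for validation.
eps = 1e-4
fd = (measurement(d0 + eps) - measurement(d0 - eps)) / (2.0 * eps)

print(f"autodiff: {float(auto):.6f}  finite difference: {float(fd):.6f}")
```

In a full path-traced setting the gate instead multiplies each path's contribution by a function of its optical path length, so with a near-delta or discontinuous gate most naively sampled paths contribute nothing; handling such gates efficiently is the role of the specialized Monte Carlo estimators mentioned in the abstract.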
Type: Publication
Publication: ACM Transactions on Graphics
Authors
Lifan Wu, Research Scientist at NVIDIA
Guangyan Cai, Ph.D. Candidate
Ravi Ramamoorthi, Professor of Computer Science at UC San Diego
Shuang Zhao, Assistant Professor of Computer Science at UC Irvine