Differentiable Owen Scrambling

Abstract

Quasi-Monte Carlo integration is at the core of rendering. This technique estimates the value of an integral by evaluating the integrand at well-chosen sample locations. These sample points are designed to cover the domain as uniformly as possible to achieve better convergence rates than purely random points. Deterministic low-discrepancy sequences have been shown to outperform many competitors by guaranteeing good uniformity, as measured by the so-called discrepancy metric and, indirectly, by an integer 𝑡-value relating the number of points falling into each domain stratum to the stratum area (lower 𝑡 is better). To introduce randomness, scrambling techniques produce multiple realizations that preserve the 𝑡-value, making the construction stochastic. Among them, Owen scrambling is a popular approach that recursively permutes intervals in each dimension. However, its reliance on discrete permutation trees makes it incompatible with smooth optimization frameworks. We present a differentiable Owen scrambling that regularizes these permutations. We show that it can effectively be used with automatic differentiation tools to optimize low-discrepancy sequences for metrics such as optimal transport uniformity, integration error, designed power spectra, or projective properties, while maintaining their initial 𝑡-value as guaranteed by Owen scrambling. In some rendering settings, we show that our optimized sequences reduce the rendering error.

Publication
ACM Transactions on Graphics

Owen scrambling is a popular tool in Quasi-Monte Carlo for randomizing samples by permuting the elementary intervals of [0,1)^𝑠. It relies on a tree of Boolean flags, each of which flips one digit of the positional decomposition of a sample coordinate (see Fig. 2). Our Owen permutation tree replaces these discrete permutations with smooth transitions between intervals, making the Owen tree differentiable. This can be used in a smooth optimization framework to improve the quality of low-discrepancy point sets while preserving their low discrepancy. Here, starting from a random Owen tree (left), our optimization produces low-discrepancy samples with minimized optimal transport cost (right). These samples give lower error at equal sample counts in simple rendering settings. For each node of the tree, we also show the effect of smoothly exchanging two intervals on an optimal transport loss when starting from the identity permutation (middle).
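To make the digit-flipping mechanism concrete, here is a minimal sketch of classic base-2 Owen scrambling in one dimension. It is illustrative only: the flag tree is realized lazily by hashing each digit prefix (a stand-in for any per-node random bit), and `depth` truncates the binary expansion; both choices are ours, not the paper's.

```python
def owen_scramble(x, seed=0, depth=16):
    """Base-2 Owen scrambling of a point x in [0, 1).

    The k-th binary digit of x is XORed with a Boolean flag attached to
    the permutation-tree node indexed by the preceding digits, so every
    point sharing that digit prefix receives the same flip.  This
    permutes elementary intervals and hence preserves the t-value.
    Illustrative sketch: flags come from hashing each node (assumed
    stand-in for a random tree), and `depth` truncates the expansion.
    """
    y = 0.0
    prefix = (seed,)                  # root of the flag tree
    for k in range(1, depth + 1):
        d = int(x * (1 << k)) & 1     # k-th binary digit of x
        f = hash(prefix) & 1          # Boolean flag at this tree node
        y += (d ^ f) / (1 << k)       # flipped digit's contribution
        prefix += (d,)                # descend to the child node
    return y
```

Because all points with the same prefix share a flag, stratification survives: every point in [0, 0.5) lands in the same half of the output, every point in [0, 0.25) in the same quarter, and so on, which is exactly why the 𝑡-value is preserved.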
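The smooth transitions between intervals can be sketched by relaxing each Boolean flag into a continuous weight. This is a simplified linear relaxation of our own devising, not the paper's exact construction: each digit is blended with its flipped value, so the output is differentiable in every node weight, and weights of exactly 0 or 1 recover a discrete Owen tree.

```python
def soft_owen_scramble(x, thetas, depth=8):
    """Smooth relaxation of base-2 Owen scrambling (illustrative
    sketch, not the paper's exact construction).

    Each Boolean flag of the permutation tree becomes a continuous
    weight theta in [0, 1]: the k-th digit d is blended with its flip
    as d + theta * (1 - 2 * d).  The output is linear (and hence
    differentiable) in every theta, so the tree can be optimized by
    gradient descent; theta in {0, 1} reproduces a discrete tree.

    `thetas` maps a digit prefix (tuple) to its node's weight;
    missing nodes default to 0, i.e. the identity permutation.
    """
    y = 0.0
    prefix = ()
    for k in range(1, depth + 1):
        d = int(x * (1 << k)) & 1               # k-th binary digit
        t = thetas.get(prefix, 0.0)             # soft flag at this node
        y += (d + t * (1 - 2 * d)) / (1 << k)   # blend digit and flip
        prefix += (d,)
    return y
```

For example, `thetas = {(): 1.0}` exactly swaps the two halves of [0, 1), while intermediate weights interpolate between the two discrete trees (at the cost of momentarily losing bijectivity). In an autodiff framework, the same loop over tensor-valued weights yields gradients of any loss, such as an optimal transport cost, with respect to every node of the tree.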