Abstract Fourier analysis is gaining popularity in image synthesis as a tool for the analysis of error in Monte Carlo (MC) integration. Still, existing tools can only analyse convergence under simplifying assumptions (such as randomized shifts) which are not applied in practice during rendering. We reformulate the expressions for bias and variance of sampling-based integrators to unify non-uniform sample distributions [importance sampling (IS)] as well as correlations between samples while respecting finite sampling domains. Our unified formulation hints at fundamental limitations of Fourier-based tools in performing variance analysis for MC integration. At the same time, it reveals that, when combined with correlated sampling, IS can impact the convergence rate by introducing or inhibiting discontinuities in the integrand. We demonstrate that the convergence of multiple importance sampling (MIS) is determined by the strategy which converges slowest, and propose several simple approaches to overcome this limitation. We show that smoothing light boundaries (as commonly done in production to reduce variance) can improve (M)IS convergence (at the cost of introducing a small amount of bias), since it removes C^0 discontinuities within the integration domain. We also propose practical integrand- and sample-mirroring approaches which cancel the impact of boundary discontinuities on the convergence rate of estimators.
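To make the MIS claim concrete, here is a minimal one-dimensional sketch of the balance heuristic combining a uniform strategy with one proportional to exp(-x); the integrand, the second strategy, and all function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    return np.exp(-x) * np.cos(x)          # smooth toy integrand on [0, 1]

Z = 1.0 - np.exp(-1.0)                     # normalization of exp(-x) on [0, 1]

def pdf_uniform(x):
    return np.ones_like(x)

def pdf_exp(x):
    return np.exp(-x) / Z

def sample_uniform(n):
    return rng.uniform(0.0, 1.0, n)

def sample_exp(n):
    u = rng.uniform(0.0, 1.0, n)
    return -np.log(1.0 - u * Z)            # inverse-CDF sampling of exp(-x)/Z

def mis_estimate(n):
    total = 0.0
    for sample, pdf in ((sample_uniform, pdf_uniform), (sample_exp, pdf_exp)):
        x = sample(n)
        # Balance heuristic: weight each sample by its own pdf over the
        # sum of all strategies' pdfs (equal sample counts assumed).
        w = pdf(x) / (pdf_uniform(x) + pdf_exp(x))
        total += np.mean(w * f(x) / pdf(x))
    return total

print(mis_estimate(10_000))   # exact: (1 + e**-1 * (sin 1 - cos 1)) / 2 ≈ 0.5554
```

With equal sample counts the balance-heuristic weight reduces to p_i/(p_1 + p_2), which keeps the combined estimator unbiased regardless of which strategy dominates.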
Gaussian Product Sampling for Rendering Layered Materials
Abstract To increase diversity and realism, surface bidirectional scattering distribution functions (BSDFs) are often modelled as consisting of multiple layers, but accurately evaluating layered BSDFs while accounting for all light transport paths is a challenging problem. Recently, Guo et al. [GHZ18] proposed an accurate and general position-free Monte Carlo method, but this method introduces variance that leads to longer render times compared to non-stochastic layered models. We improve on the previous work by presenting two new sampling strategies, pair-product sampling and multiple-product sampling. Our new methods better exploit the layered structure and reduce variance compared to the conventional approach of sequentially sampling one BSDF at a time. Our pair-product sampling strategy importance samples the product of two BSDFs from a pair of adjacent layers. We further generalize this to multiple-product sampling, which importance samples the product of a chain of three or more BSDFs. To compute these products, we developed a new approximate Gaussian representation of individual layer BSDFs. This representation incorporates spatially varying material properties as parameters, so our techniques can support an arbitrary number of textured layers. Compared to previous Monte Carlo layering approaches, our results demonstrate substantial variance reduction when rendering isotropic layered surfaces.
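The pair-product idea can be illustrated in one dimension: the product of two Gaussians is again Gaussian, so it can be importance sampled in closed form. Below is a hedged sketch under that 1-D simplification; the paper's representation operates on directional BSDF lobes, and the names here are illustrative.

```python
import numpy as np

def gaussian_product(m1, s1, m2, s2):
    """Mean and std of the Gaussian proportional to N(m1, s1^2) * N(m2, s2^2)."""
    iv = 1.0 / s1**2 + 1.0 / s2**2          # precisions (inverse variances) add
    m = (m1 / s1**2 + m2 / s2**2) / iv      # precision-weighted mean
    return m, np.sqrt(1.0 / iv)

rng = np.random.default_rng(0)
m, s = gaussian_product(0.2, 0.3, 0.5, 0.2)
samples = rng.normal(m, s, size=8)          # draws from the product lobe
print(m, s, samples)
```

Because the product density is available analytically, samples can be drawn from it directly instead of sampling one factor and paying the variance of evaluating the other.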
- PAR ID: 10136771
- Publisher / Repository: Wiley-Blackwell
- Date Published:
- Journal Name: Computer Graphics Forum
- Volume: 39
- Issue: 1
- ISSN: 0167-7055
- Page Range / eLocation ID: p. 420-435
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Meila, Marina; Zhang, Tong (Eds.) Black-box variational inference algorithms use stochastic sampling to analyze diverse statistical models, like those expressed in probabilistic programming languages, without model-specific derivations. While the popular score-function estimator computes unbiased gradient estimates, its variance is often unacceptably large, especially in models with discrete latent variables. We propose a stochastic natural gradient estimator that is as broadly applicable and unbiased, but improves efficiency by exploiting the curvature of the variational bound, and provably reduces variance by marginalizing discrete latent variables. Our marginalized stochastic natural gradients have intriguing connections to classic coordinate ascent variational inference, but allow parallel updates of variational parameters, and provide superior convergence guarantees relative to naive Monte Carlo approximations. We integrate our method with the probabilistic programming language Pyro and evaluate real-world models of documents, images, networks, and crowd-sourcing. Compared to score-function estimators, we require far fewer Monte Carlo samples and consistently converge orders of magnitude faster.
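As a hedged illustration of why marginalizing a discrete latent variable helps, this sketch compares a score-function (REINFORCE) gradient estimate with the exact marginalized gradient for a single Bernoulli latent; the toy objective and names are assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

f = np.array([1.0, 5.0])                      # f(z=0), f(z=1)

def score_function_grad(theta, n):
    """Monte Carlo estimate of d/dtheta E_{z~Bern(sigmoid(theta))}[f(z)]."""
    p = sigmoid(theta)
    z = rng.random(n) < p                     # sample the latent
    logp_grad = np.where(z, 1.0 - p, -p)      # d/dtheta log Bern(z | sigmoid(theta))
    return np.mean(f[z.astype(int)] * logp_grad)

def marginalized_grad(theta):
    """Exact gradient after summing out z: d/dtheta [(1-p) f(0) + p f(1)]."""
    p = sigmoid(theta)
    return p * (1.0 - p) * (f[1] - f[0])

print(score_function_grad(0.3, 10_000), marginalized_grad(0.3))
```

Both quantities share the same expectation, but the marginalized form has zero Monte Carlo variance in this latent, which is the intuition behind the paper's provable variance reduction.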
- Abstract Quantile is an important quantity in reliability analysis, as it is related to the resistance level for defining failure events. This study develops a computationally efficient sampling method for estimating extreme quantiles using stochastic black-box computer models. Importance sampling has been widely employed as a powerful variance reduction technique to reduce estimation uncertainty and improve computational efficiency in many reliability studies. However, when applied to quantile estimation, importance sampling faces challenges, because a good choice of the importance sampling density relies on information about the unknown quantile. We propose an adaptive method that refines the importance sampling density parameter toward the unknown target quantile value over the iterations. The proposed adaptive scheme allows us to use simulation outcomes from previous iterations to steer the simulation process toward important input areas. We prove some convergence properties of the proposed method and show that our approach can achieve variance reduction over crude Monte Carlo sampling. We demonstrate its estimation efficiency through numerical examples and a wind turbine case study.
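A minimal sketch of the adaptive idea, assuming a Gaussian response and a proposal whose mean is shifted toward the current quantile estimate each iteration; the specific update rule below is illustrative, not the paper's scheme.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, n, mu = 0.999, 5_000, 0.0               # target level, budget, proposal mean

for it in range(5):
    x = rng.normal(mu, 1.0, n)                 # proposal N(mu, 1)
    # Likelihood ratio of target N(0, 1) to proposal N(mu, 1).
    w = np.exp(-0.5 * x**2 + 0.5 * (x - mu) ** 2)
    order = np.argsort(x)
    cdf = np.cumsum(w[order]) / np.sum(w)      # self-normalized weighted CDF
    q = x[order][np.searchsorted(cdf, alpha)]  # weighted quantile estimate
    mu = q                                     # refine proposal toward the tail
    print(f"iter {it}: q_hat = {q:.3f}")       # exact 0.999-quantile ≈ 3.090
```

After the first iteration the proposal concentrates samples near the tail region that determines the quantile, which is the mechanism behind the claimed variance reduction over crude Monte Carlo.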
- Electron probe microanalysis is a nondestructive technique widely used to determine the elemental composition of bulk samples. It has been extended to layered specimens with the development of appropriate software. The traditional quantification method requires matrix correction procedures based upon models of the ionization depth distribution, the so-called ϕ(ρz) distribution. Most of these models have led to commercial quantification programs, but only a few of them allow the quantification of layered specimens. We therefore developed BadgerFilm, a free open-source thin-film program available to the general public. This program implements a documented ϕ(ρz) model as well as algorithms to calculate fluorescence in bulk and thin-film samples. Part 1 of the present work describes the operation of the implemented ϕ(ρz) distribution model and validates its implementation against experimental measurements and Monte Carlo simulations on bulk samples. The program can predict absolute X-ray intensities that can be directly compared to Monte Carlo simulations. We demonstrate that the implemented model works very well for bulk materials; as will be shown in Part 2, BadgerFilm's predictions for thin-film specimens are also in good agreement with experimental and Monte Carlo results.
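The abstract gives no formulas, but the standard EPMA absorption step that any ϕ(ρz)-based program must perform can be sketched as numerically integrating the generated depth distribution attenuated along the take-off path; the Gaussian-shaped ϕ(ρz) and the constants below are placeholders, not BadgerFilm's documented model.

```python
import numpy as np

def emitted_fraction(phi, chi, rz_max=1.0e-3, n=4000):
    """f(chi): emitted / generated X-ray intensity for a given phi(rho*z)."""
    rz = np.linspace(0.0, rz_max, n)           # mass-depth grid (g/cm^2)
    dx = rz[1] - rz[0]
    generated = np.sum(phi(rz)) * dx           # intensity generated at depth
    emitted = np.sum(phi(rz) * np.exp(-chi * rz)) * dx   # attenuated on exit
    return emitted / generated

phi = lambda rz: np.exp(-((rz - 1.0e-4) / 1.5e-4) ** 2)  # toy Gaussian phi(rho z)
mu_rho = 4000.0                                # mass absorption coefficient (cm^2/g)
psi = np.radians(40.0)                         # X-ray take-off angle
chi = mu_rho / np.sin(psi)                     # effective absorption parameter
print(emitted_fraction(phi, chi))
```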
- Normalizing flows (NFs) provide uncorrelated samples from complex distributions, making them an appealing tool for parameter estimation. However, the practical utility of NFs remains limited by their tendency to collapse to a single mode of a multimodal distribution. In this study, we show that annealing with an adaptive schedule based on the effective sample size (ESS) can mitigate mode collapse. We demonstrate that our approach can converge the marginal likelihood for a biochemical oscillator model fit to time-series data in ten-fold less computation time than a widely used ensemble Markov chain Monte Carlo (MCMC) method. We show that the ESS can also be used to reduce variance by pruning the samples. We expect these developments to be of general use for sampling with NFs and discuss potential opportunities for further improvements.
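A hedged sketch of the ESS machinery: compute the ESS from importance weights and pick the next annealing temperature by bisection so the ESS stays above a threshold. The bisection rule is an illustrative stand-in for the paper's adaptive schedule.

```python
import numpy as np

def ess(log_w):
    """Effective sample size 1 / sum(w_bar^2) from log importance weights."""
    w = np.exp(log_w - np.max(log_w))          # stabilize before normalizing
    w /= w.sum()
    return 1.0 / np.sum(w**2)

def next_beta(log_like, beta, n_min):
    """Largest temperature step from beta toward 1 keeping ESS >= n_min."""
    lo, hi = beta, 1.0
    for _ in range(50):
        mid = 0.5 * (lo + hi)
        # Incremental weights for tempering beta -> mid are L^(mid - beta).
        if ess((mid - beta) * log_like) >= n_min:
            lo = mid
        else:
            hi = mid
    return lo

rng = np.random.default_rng(0)
log_like = -0.5 * rng.normal(size=1000) ** 2   # toy per-sample log-likelihoods
print(next_beta(log_like, beta=0.0, n_min=500.0))
```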