Enhanced Direct Lighting Using Visibility-Aware Light Sampling

Geonu Noh, Hajin Choi, Bochang Moon

Gwangju Institute of Science and Technology

Advances in Computer Graphics (CGI 2023 Conference Proceedings)

Equal-time comparisons (in 20 seconds) of light sampling techniques for a scene with three lights. We render the scene without indirect illumination to clearly show the visual differences among the tested methods. We measure the numerical accuracy of the rendered images using the relative L2 error (relL2) [25].
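
As a reference for the metric above, a commonly used form of the relative L2 error in rendering (the precise definition in [25] may differ slightly) is

\[ \mathrm{relL2} = \frac{1}{N} \sum_{i=1}^{N} \frac{(\hat{I}_i - I_i)^2}{I_i^2 + \epsilon}, \]

where \(\hat{I}_i\) is the rendered pixel value, \(I_i\) the reference pixel value, \(N\) the number of pixels, and \(\epsilon\) a small constant that prevents division by zero in dark regions.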

Abstract

Next event estimation has been widely applied to Monte Carlo rendering methods such as path tracing, since estimating direct and indirect lighting separately often makes it possible to find light paths from the eye to the lights effectively. Its success heavily relies on light sampling for direct lighting when a scene contains multiple light sources, since each light can contribute differently to the reflected radiance at a surface point. We present a light sampling technique that guides this light selection to improve direct lighting. We estimate a spatially-varying function that approximates the contribution of each light to surface points within a discretized local area (i.e., a voxel in an adaptive octree) while accounting for the visibility between lights and surface points. We then construct, per voxel, a probability distribution function for sampling lights that is proportional to our estimated function. We demonstrate that our light sampling technique can significantly improve rendering quality thanks to more accurate direct lighting.
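
To illustrate the per-voxel light selection described in the abstract, the following is a minimal sketch, not the authors' implementation: the class name VoxelLightTable and the input contribution estimates are hypothetical placeholders for whatever visibility-aware estimates the method accumulates per octree voxel. It builds a discrete distribution over lights proportional to those estimates and samples a light from it.

```cpp
// Minimal sketch (not the paper's implementation): per-voxel light selection
// where the probability of picking light i is proportional to a stored,
// visibility-aware contribution estimate. Names here are hypothetical.
#include <cstddef>
#include <numeric>
#include <random>
#include <vector>

struct VoxelLightTable {
    std::vector<float> pdf;  // normalized selection probability per light
    std::vector<float> cdf;  // running sum of pdf for inverse-CDF sampling

    // Build the per-voxel distribution from non-negative contribution
    // estimates (e.g., unshadowed radiance scaled by an estimated visibility
    // factor, accumulated over shading points that fall into this voxel).
    explicit VoxelLightTable(const std::vector<float>& estimatedContribution) {
        const float total = std::accumulate(estimatedContribution.begin(),
                                            estimatedContribution.end(), 0.0f);
        pdf.resize(estimatedContribution.size());
        cdf.resize(estimatedContribution.size());
        float running = 0.0f;
        for (std::size_t i = 0; i < pdf.size(); ++i) {
            // Fall back to uniform selection if every estimate is zero.
            pdf[i] = (total > 0.0f) ? estimatedContribution[i] / total
                                    : 1.0f / static_cast<float>(pdf.size());
            running += pdf[i];
            cdf[i] = running;
        }
    }

    // Sample a light index proportionally to pdf; u is uniform in [0, 1).
    // Returns the index and writes its selection probability to outPdf.
    std::size_t sample(float u, float* outPdf) const {
        std::size_t i = 0;
        while (i + 1 < cdf.size() && u > cdf[i]) ++i;
        *outPdf = pdf[i];
        return i;
    }
};

int main() {
    // Hypothetical contribution estimates for three lights in one voxel.
    VoxelLightTable table({0.6f, 0.1f, 0.3f});
    std::mt19937 rng(42);
    std::uniform_real_distribution<float> uniform(0.0f, 1.0f);
    float selectionPdf = 0.0f;
    std::size_t light = table.sample(uniform(rng), &selectionPdf);
    return static_cast<int>(light);
}
```

In a renderer, such a table would be stored per octree voxel, and the direct-lighting estimator would divide the sampled light's contribution by the returned selection probability (in addition to any area or solid-angle sampling density on the light itself).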
