Pixel History Linear Models for Real-Time Temporal Filtering

Jose A. Iglesias-Guitian1, Bochang Moon1, Charalampos Koniaris1, Eric Smolikowski2, Kenny Mitchell1

Disney Research (The Walt Disney Company)1, Walt Disney Imagineering (The Walt Disney Company)2

Computer Graphics Forum (Proc. of Pacific Graphics 2016)

Our filtering results obtained for frame #320 of the SPACELAND sequence. This scene showcases non-linear camera motion, an animated directional light, and various shading effects including shadows and reflections. The scene has been rendered with Unreal Engine 4 [Epi] using 1 sample per pixel (Non-AA). The presence of fine geometric details and detailed textures produces significant temporal aliasing and flickering artifacts. Our filtering method effectively reduces flickering without creating ghosting artifacts (please watch the supplementary video). Moreover, our approach produces less visual overblur (see insets) than current state-of-the-art solutions for real-time temporal antialiasing, e.g., 1.33 dB better PSNR on average than the Unreal Engine temporal filter (UE4-TAA).

Abstract

We propose a new real-time temporal filtering and antialiasing (AA) method for rasterization graphics pipelines. Our method is based on Pixel History Linear Models (PHLM), a new concept for modeling the history of pixel shading values over time using linear models. Based on PHLM, our method can predict per-pixel variations of the shading function between consecutive frames, combining temporal reprojection with per-pixel shading predictions to provide temporally coherent shading, even in the presence of very noisy input images. Our method can address both spatial and temporal aliasing problems under a unified filtering framework that minimizes filtering error through a recursive least squares algorithm. We demonstrate our method working with a commercial deferred shading engine for rasterization and with our own OpenGL deferred shading renderer. We have implemented our method on the GPU, and it shows a significant reduction of temporal flicker in very challenging scenarios including foliage rendering, complex non-linear camera motions, dynamic lighting, reflections, shadows and fine geometric details. Our approach, based on PHLM, avoids the creation of visible ghosting artifacts and reduces the filtering overblur characteristic of temporal deflickering methods, while producing results comparable to state-of-the-art real-time filters in terms of temporal coherence.
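To illustrate the core idea described in the abstract, the following is a minimal NumPy sketch of fitting a per-pixel linear model of shading values over time with a recursive least squares (RLS) update. It is an assumption-laden illustration, not the paper's implementation: the function names, the feature choice [1, t], and the forgetting factor are ours, and a real renderer would run such an update per pixel on the GPU with reprojected history.

```python
import numpy as np

def make_rls_state(dim=2, delta=1e3):
    """Initial RLS state: weight vector w and inverse covariance P."""
    return np.zeros(dim), delta * np.eye(dim)

def rls_update(w, P, x, y, lam=0.95):
    """One recursive-least-squares step with forgetting factor lam.

    x: feature vector for this frame (here [1, t] for a linear trend),
    y: observed (noisy) shading value for this pixel.
    """
    Px = P @ x
    k = Px / (lam + x @ Px)           # gain vector
    e = y - w @ x                     # prediction error for the new sample
    w = w + k * e                     # update model weights
    P = (P - np.outer(k, Px)) / lam   # update inverse covariance
    return w, P

# Simulate a noisy pixel history: a slowly varying shading value
# observed at 1 sample per pixel (hypothetical noise level).
rng = np.random.default_rng(0)
w, P = make_rls_state()
for t in range(200):
    true_val = 0.5 + 0.002 * t              # ground-truth linear trend
    y = true_val + rng.normal(0.0, 0.1)     # noisy per-frame observation
    w, P = rls_update(w, P, np.array([1.0, float(t)]), y)

# Predict the shading value for the next frame from the fitted model.
pred = w @ np.array([1.0, 200.0])
print(pred)
```

The forgetting factor `lam` controls how quickly the model discounts old history, which is one way to trade temporal stability against responsiveness to shading changes; the prediction `pred` should land near the ground-truth value 0.9 despite the heavy per-frame noise.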

Video (available to download in the last section)

Contents