Dual-Band Feature Fusion for Neural Global Illumination with Multi-Frequency Reflections

Shaohua Mo1, Chuankun Zheng1, Zihao Lin1, Dianbing Xi1, Qi Ye2, Rui Wang1, Hujun Bao1, Yuchi Huo1,3
1State Key Laboratory of CAD&CG, Zhejiang University, 2Zhejiang University, 3Zhejiang Lab

Abstract

In this paper, we present a novel neural global illumination approach that enables multi-frequency reflections in dynamic scenes. Our method uses object-centric spatial feature grids as the core framework to model rendering effects implicitly. A lightweight scene query, based on single-bounce ray tracing, is then performed on these feature grids to extract principal and secondary features separately. The principal features capture a wide range of relatively low-frequency global illumination effects, such as diffuse indirect lighting and reflections on rough surfaces. In contrast, the secondary features provide sparse, scene-specific reflection details, typically of much higher frequency than the final observed radiance. Inspired by the physical process of light propagation, we introduce a novel dual-band feature fusion module that seamlessly blends these two types of features, producing fused features capable of modeling multi-frequency rendering effects. Additionally, we propose a two-stage training strategy tailored to the distinct characteristics of each feature type, significantly enhancing overall quality and reducing artifacts in the rendered results. Experimental results demonstrate that our method delivers high-quality, multi-frequency dynamic reflections, outperforming state-of-the-art baselines, including path tracing with screen-space neural denoising and other neural global illumination methods.
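To make the pipeline in the abstract concrete, the following is a minimal, non-authoritative sketch of the query-and-fuse idea: trilinearly sample a "principal" feature from a grid at the primary hit point, sample a "secondary" feature at a hypothetical single-bounce reflection hit, and blend the two bands with a toy linear-plus-ReLU layer. The grid resolutions, feature width `C`, the `reflect`/`fuse` helpers, and all weights are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def trilerp(grid, p):
    """Trilinearly interpolate a feature grid at a normalized point p in [0,1]^3.
    grid: (R, R, R, C) array; returns a (C,) feature vector."""
    R = grid.shape[0]
    x = np.clip(p, 0.0, 1.0) * (R - 1)
    i0 = np.floor(x).astype(int)
    i1 = np.minimum(i0 + 1, R - 1)
    f = x - i0  # fractional offsets within the cell
    out = np.zeros(grid.shape[-1])
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                w = ((f[0] if dx else 1 - f[0]) *
                     (f[1] if dy else 1 - f[1]) *
                     (f[2] if dz else 1 - f[2]))
                idx = (i1[0] if dx else i0[0],
                       i1[1] if dy else i0[1],
                       i1[2] if dz else i0[2])
                out += w * grid[idx]
    return out

def reflect(d, n):
    """Mirror direction d about unit normal n."""
    return d - 2.0 * np.dot(d, n) * n

def fuse(principal, secondary, W, b):
    """Toy dual-band fusion: concatenate both bands and mix with one
    ReLU layer. The paper's learned fusion module stands in here."""
    h = np.concatenate([principal, secondary])
    return np.maximum(W @ h + b, 0.0)

rng = np.random.default_rng(0)
C = 8
grid_p = rng.standard_normal((16, 16, 16, C))  # coarse "principal" grid
grid_s = rng.standard_normal((32, 32, 32, C))  # finer "secondary" grid

x = np.array([0.3, 0.5, 0.7])     # primary hit point (normalized coords)
d = np.array([0.0, -1.0, 0.0])    # incoming view direction
n = np.array([0.0, 1.0, 0.0])     # surface normal
x_refl = x + 0.2 * reflect(d, n)  # stand-in for the single-bounce hit

f_p = trilerp(grid_p, x)       # low-frequency band at the primary hit
f_s = trilerp(grid_s, x_refl)  # high-frequency band at the bounce hit

W = rng.standard_normal((C, 2 * C))
b = np.zeros(C)
fused = fuse(f_p, f_s, W, b)
print(fused.shape)  # (8,)
```

In the actual method the fused features are decoded to radiance by a neural renderer and the grids are trained end to end; this sketch only illustrates the dual-band query and blend.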

Video

BibTeX

@inproceedings{mo2025dual,
  title={Dual-Band Feature Fusion for Neural Global Illumination with Multi-Frequency Reflections},
  author={Mo, Shaohua and Zheng, Chuankun and Lin, Zihao and Xi, Dianbing and Ye, Qi and Wang, Rui and Bao, Hujun and Huo, Yuchi},
  booktitle={Proceedings of the Special Interest Group on Computer Graphics and Interactive Techniques Conference Conference Papers},
  pages={1--11},
  year={2025}
}