Title: Estimation of Global Illumination using Cycle-Consistent Adversarial Networks
Authors: Oh, Junho; Abbott, A. Lynn
Author ORCID: Abbott, Amos [0000-0003-3850-6771]
Date issued: 2024-10-15
Date accessioned: 2025-03-05
Date available: 2025-03-05
Type: Conference proceeding
Series: Lecture Notes in Computer Science, vol. 15046
Pages: 73-86
ISSN: 0302-9743
eISSN: 1611-3349
DOI: https://doi.org/10.1007/978-3-031-77392-1_6
Handle: https://hdl.handle.net/10919/124800
Format: application/pdf
Language: en
Rights: In Copyright
Subjects: Virtual Reality; Video Games; Graphics; Lighting; Global Illumination; GAN; CycleGAN

Abstract: Synthesis of realistic virtual environments requires careful rendering of light and shadows, a task often bottlenecked by the high computational cost of global illumination (GI) techniques. This paper introduces a new GI approach that improves computational efficiency without a significant reduction in image quality. The proposed system transforms initial direct-illumination renderings into globally illuminated representations by incorporating a Cycle-Consistent Adversarial Network (CycleGAN). Our CycleGAN-based approach demonstrates superior performance over the Pix2Pix model according to the LPIPS metric, which emphasizes perceptual similarity. To facilitate such comparisons, we have created a novel dataset (to be shared with the research community) that provides in-game images obtained with and without GI rendering. This work aims to advance real-time GI estimation without the need for costly, specialized computational hardware. Our work and the dataset are publicly available at https://github.com/junhofive/CycleGAN-Illumination.
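The cycle-consistency objective that gives CycleGAN its name can be sketched as follows. The generators G and F below are hypothetical linear stand-ins (the paper's actual generators are convolutional networks), used only to illustrate how the L1 cycle loss between direct-illumination and globally illuminated image domains is computed:

```python
import numpy as np

# Hypothetical stand-ins for the two CycleGAN generators:
# G maps direct-illumination renders toward GI renders, F maps back.
# Simple affine "tone" transforms here, purely for illustration.
def G(x):
    return np.clip(1.2 * x + 0.05, 0.0, 1.0)

def F(y):
    return np.clip((y - 0.05) / 1.2, 0.0, 1.0)

def cycle_consistency_loss(x, y):
    """L1 cycle loss: mean |F(G(x)) - x| + mean |G(F(y)) - y|."""
    forward = np.mean(np.abs(F(G(x)) - x))    # x -> G(x) -> F(G(x)) should recover x
    backward = np.mean(np.abs(G(F(y)) - y))   # y -> F(y) -> G(F(y)) should recover y
    return forward + backward

# Dummy "images": a direct-illumination render x and a GI render y.
rng = np.random.default_rng(0)
x = rng.uniform(0.1, 0.8, size=(64, 64, 3))
y = rng.uniform(0.1, 0.8, size=(64, 64, 3))
print(cycle_consistency_loss(x, y))  # near zero, since F approximately inverts G
```

In the full CycleGAN training objective this cycle term is added to the adversarial losses of the two discriminators; it is what lets the model learn the direct-to-GI mapping from unpaired images.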