Course Description


Modern video games employ a variety of sophisticated algorithms to produce groundbreaking 3D rendering, pushing the visual boundaries and interactive experience of rich environments. This course presents state-of-the-art, production-proven techniques for fast, interactive rendering of the complex and engaging virtual worlds of video games.

In 2025, SIGGRAPH will celebrate the 20th anniversary of the Advances in Real-Time Rendering in Games program - one of the most enduring and influential research innovation forums in computer graphics. Since its inception in 2006, the program has served as a launchpad for groundbreaking rendering techniques that have fundamentally reshaped how artists, rendering engineers and game developers simulate lighting, geometry, and motion in real-time applications, especially in video games.

From the introduction of physically based shading models and temporal antialiasing to breakthroughs in ray tracing and neural-enhanced image quality, the Advances in Real-Time Rendering in Games course has consistently spotlighted state-of-the-art techniques shaping the evolution of video games, virtual productions, architectural visualization, and interactive experiences at scale.

The 2025 program will feature speakers from leading studios and engine teams, including Activision, Ubisoft, Epic Games, id Software, MachineGames, HypeHype, and NVIDIA. To mark the course's 20th anniversary, this year's program will also include a special retrospective honoring two decades of innovation, impact, and shared technical progress in the real-time graphics community.

The presenters will cover a wide range of topics: innovations in subsurface scattering and real-time path tracing, new methods for performant order-independent transparency, practical ray tracing for large-scale dynamic open worlds, efficient multi-platform strand-based hair and fur rendering, real-time stochastic direct lighting approaches for many-light rendering on hardware ranging from high-end GPUs to low-end mobile devices, and other performant real-time global illumination methods.

Whether you’re a rendering engineer, a game developer, or simply passionate about real-time visuals, this is the course to attend if you want to see the latest and greatest rendering techniques in production today.

Previous years’ Advances course slides: go here

 

Syllabus

Advances in Real-Time Rendering in Games: Part I

Tuesday, 12 August 2025, 9:00 am - 12:15 pm PDT

Location: West Building, Ballroom C

Advances in Real-Time Rendering in Games - a 20th Year Retrospective, and a Look Ahead
Natalya Tatarchuk (Activision)

Adaptive Voxel-Based Order-Independent Transparency
Michał Drobot (Activision)

Ray Tracing the World of Assassin's Creed Shadows
Luc Leblanc (Ubisoft)
Melino Conte (Ubisoft)

Strand-based hair and fur rendering in Indiana Jones and the Great Circle
Sergei Kulikov (MachineGames Sweden AB)

Closing Notes for Part I
Natalya Tatarchuk (Activision)

 

Advances in Real-Time Rendering in Games: Part II

Tuesday, 12 August 2025, 2:00 pm - 5:15 pm PDT

Location: West Building, Ballroom C

 

Welcome and Introduction to Part II
Natalya Tatarchuk (Activision)

Fast as Hell: idTech8 Global Illumination
Tiago Sousa (id Software)

Stochastic Tile-Based Lighting in HypeHype
Jarkko Lempiäinen (HypeHype)

Real-Time Subsurface Scattering via Hybrid ReSTIR-Path-Tracing and Diffusion
Tanki Zhang (NVIDIA)

MegaLights: Stochastic Direct Lighting in Unreal Engine 5
Krzysztof Narkowicz (Epic Games)
Tiago Costa (Epic Games)

Closing Notes for Advances in Real-Time Rendering in Games, 2025
Natalya Tatarchuk (Activision)

 

Prerequisites

 

Working knowledge of modern real-time graphics APIs such as DirectX, Vulkan, or Metal, and a solid grounding in commonly used graphics algorithms. Familiarity with the concepts of programmable shading and shading languages. Familiarity with the hardware and software capabilities of shipping game consoles is a plus but not required.

Intended Audience

 

Technical practitioners and developers of graphics engines for visualization, games, or effects rendering who are interested in interactive rendering.

Course Organizer

Natalya Tatarchuk is a graphics engineer and rendering enthusiast at heart, currently serving as Chief Technology Officer at Activision Publishing. In this role, she leads the technology strategy and execution across major Activision franchises—including Call of Duty—driving innovation at the intersection of cutting-edge tech and game development at scale.

Previously, Natalya was Distinguished Technical Fellow and Chief Architect, VP of Wētā Tools at Unity, where she advanced state-of-the-art rendering, graphics performance, and character creation tools for film and games. Before that, as VP of Graphics for the Unity Editor and Engine, she led Unity’s graphics initiatives across the real-time rendering stack.

Natalya’s roots in AAA game development include nearly a decade at Bungie, where she contributed to the groundbreaking visuals and engine architecture of Destiny and the Halo franchise—including Halo 3: ODST and Halo: Reach. She led the graphics group and contributed to the engine development, visual innovation, and cross-platform rendering of the Destiny franchise, which still ships on that technology today.

Earlier in her career, she worked at AMD’s Graphics Products Group, where she pushed the boundaries of parallel computing and explored advanced real-time graphics techniques, graphics hardware design, and next-generation API development.

One of Natalya’s passions is fostering knowledge-sharing in the real-time graphics community, as she strongly believes that advancing the state of the art is always more powerful when done together. For over two decades, she has organized and curated some of the industry's most influential technical forums, including the Advances in Real-Time Rendering, Open Problems in Real-Time Rendering, and Rendering Engine Architecture courses. Most recently, she has co-organized Rendering Engine Architecture conferences with a few gaming industry colleagues. 

 

 

Talks

 

Advances in Real-Time Rendering in Games - a 20th Year Retrospective, and a Look Ahead

Abstract: This talk provides the context behind the history of the Advances in Real-Time Rendering in Games course since its inception, and explains what this session aims to achieve. This year, the speaker not only delves further into the analysis of current trends seen in games and player perspectives—exploring what they mean for gaming technology in rendering and related areas, platforms, and beyond—but also builds a retrospective of the last two decades of the Advances in Real-Time Rendering in Games session, looking back at what the program has introduced and shared with the graphics development and research community over the years. It’ll be a fun trip down memory lane!

Speaker Bio:

Natalya Tatarchuk is a graphics engineer and rendering enthusiast at heart, currently serving as Chief Technology Officer at Activision Publishing. In this role, she leads the technology strategy and execution across major Activision franchises—including Call of Duty—driving innovation at the intersection of cutting-edge tech and game development at scale.

Previously, Natalya was Distinguished Technical Fellow and Chief Architect, VP of Wētā Tools at Unity, where she advanced state-of-the-art rendering, graphics performance, and character creation tools for film and games. Before that, as VP of Graphics for the Unity Editor and Engine, she led Unity’s graphics initiatives across the real-time rendering stack.

Natalya’s roots in AAA game development include nearly a decade at Bungie, where she contributed to the groundbreaking visuals and engine architecture of Destiny and the Halo franchise—including Halo 3: ODST and Halo: Reach. She led the graphics group and contributed to the engine development, visual innovation, and cross-platform rendering of the Destiny franchise, which still ships on that technology today.

Earlier in her career, she worked at AMD’s Graphics Products Group, where she pushed the boundaries of parallel computing and explored advanced real-time graphics techniques, graphics hardware design, and next-generation API development.

One of Natalya’s passions is fostering knowledge-sharing in the real-time graphics community, as she strongly believes that advancing the state of the art is always more powerful when done together. For over two decades, she has organized and curated some of the industry's most influential technical forums, including the Advances in Real-Time Rendering, Open Problems in Real-Time Rendering, and Rendering Engine Architecture courses. Most recently, she has co-organized Rendering Engine Architecture conferences with a few gaming industry colleagues. 

Materials (Updated 9/29/2025): PDF (8 MB)

 

 

ADAPTIVE VOXEL-BASED ORDER-INDEPENDENT TRANSPARENCY


Abstract: Rendering transparent objects and effects in real-time with high performance remains a significant challenge in game development. This talk explores the journey of the Call of Duty rendering engine as it transitioned to order-independent transparency (OIT) while supporting active game releases.

Several established algorithms for rendering transparency offer varying trade-offs in accuracy, robustness, and performance. However, none fully met the Call of Duty franchise’s unique requirements, where visual accuracy is critical for gameplay and performance is paramount. This insight led to the development of a novel OIT technique: Adaptive Voxel-Based Order-Independent Transparency (AVBOIT).

This presentation covers the principles behind AVBOIT, its implementation details, and its performance profile across various hardware platforms. It also evaluates the advantages and limitations of this novel method, providing a balanced perspective on its practical impact. The talk includes a comparative analysis with industry-standard transparency algorithms and discusses practical applications using existing game content, highlighting game-specific challenges and solutions. Finally, it addresses ongoing work and potential future extensions for the method.

 

Speaker Bio:

Michał Drobot, a Technical Fellow at Activision, serves as the Franchise Rendering Director for the Call of Duty series.

Over the past decade, Michal has driven rendering architecture for seven Call of Duty titles, collaborating with Activision studios to bring their creative visions to life. Previously, he contributed to the design and optimization of the 3D renderer for Far Cry 4 at Ubisoft Montreal. Prior to that, he worked at Guerrilla Games, where he designed and optimized the rendering pipeline for the PlayStation 4 launch title Killzone: Shadow Fall.

Michal specializes in rendering algorithms, rendering architectures, hardware optimization, and low-level performance enhancements.

Materials (Updated 9/29/2025): PDF (6 MB)

 

 

 

RAY TRACING THE WORLD OF ASSASSIN'S CREED SHADOWS


Abstract: Assassin’s Creed Shadows was developed with the Anvil game engine and is set in feudal Japan. For this game, we developed a ray-traced global illumination solution supporting large-scale dynamic open-worlds and their specificities. We will present the algorithms we rely on, the choices made for the foundation of our ray tracing pipeline, the reasoning behind them and their detailed performance. The talk will also cover the many challenges we faced due to the accurate depiction of this historical period such as translucent geometries, thin walls, small windows and large quantities of dense vegetation. We will also give insight into the challenges of implementing specular reflection inside our global illumination solution in the short period given by the delayed launch.

 

Speaker Bios:

 


Luc Leblanc is a Technical Lead for Anvil’s rendering team specializing in ray tracing and lighting topics. He has a research background from his PhD in addition to 25 years of experience in the rendering field and over 12 years of experience in the video-game industry. Notably, he was the main contributor for global illumination and ray tracing development at Eidos Montreal and for ray-traced specular global illumination on Assassin's Creed Shadows at Ubisoft Montreal.


Melino Conte is a Team Lead and contributor on Anvil’s rendering team, specializing in ray tracing and lighting topics. He has 10 years of experience in the rendering field and 7 years in the video game industry. Melino previously worked on rendering for Marvel's Guardians of the Galaxy at Eidos Montreal and contributed to ray tracing topics for Assassin's Creed Shadows at Ubisoft Montreal.

 

Materials (Updated 8/26/2025): PDF (14 MB)

 

 

STRAND-BASED HAIR AND FUR RENDERING IN INDIANA JONES AND THE GREAT CIRCLE


Abstract: Strand hair systems are quickly becoming more widely adopted in video games. These systems offer very high levels of visual fidelity and help artists reduce iteration time when creating hair assets. However, performance constraints often considerably limit strand hair usage, especially on lower-end hardware.

In this talk, we will present our take on strand-based hair rendering with a focus on performance across a wide range of hardware. We will cover design constraints and decisions, various GPU optimizations for hair rasterization and shading, and problems we had to solve on our road to ship "Indiana Jones and the Great Circle" as a 60 Hz game, using strands as the only solution for human hair rendering.

Speaker Bio:


Sergei Kulikov is a senior render programmer at MachineGames, where he has worked on Indiana Jones and the Great Circle for the last 2.5 years. His current focus area is character rendering and lighting technology.

 

Earlier, he worked as a graphics programmer at My.Games on War Robots: Frontiers and Armored Warfare. Before joining game development, he spent several years at Siemens working on CAD systems.

 

 

Materials (Updated 11/11/2025): PPTX (114 MB), PDF (3 MB)

 

 

FAST AS HELL: IDTECH8 GLOBAL ILLUMINATION

Abstract: Learn about idTech 8 and how id Software transitioned from pre-baked global illumination to a "fast as hell" real-time solution, enabling DOOM The Dark Ages to achieve 60 Hz or higher on all platforms.

 

Speaker Bio:


Tiago Sousa is id Software’s Rendering Technical Director, where he most recently contributed to the id Tech 8 game engine and the critically acclaimed DOOM The Dark Ages.

 

With over 20 years of experience in the video game industry, Tiago is passionate about computer graphics technology and has worked on notable titles, including Far Cry, the Crysis trilogy, DOOM (2016), DOOM Eternal, Wolfenstein II: The New Colossus, and others.

 

Materials (Updated 8/25/2025): PPTX (96 MB), PDF (3 MB)

 

 

STOCHASTIC TILE-BASED LIGHTING IN HYPEHYPE


Abstract: HypeHype is a gaming platform supporting user-generated content across a wide range of devices, from low-end mobile phones to high-end PCs. To give creators complete freedom in lighting their scenes, our lighting solution must deliver consistent results with high performance across all the target platforms. In this talk, we present a novel stochastic tile-based lighting algorithm that enables fully dynamic, fixed-cost local lighting with shadows—even on low-end mobile GPUs.

The algorithm is optimized for GPU wave coherence and efficient memory bandwidth usage, and it uses a two-stage light sampling strategy:

  1. Big-Tile Sampling: The screen is divided into large tiles, and a subset of lights is selected per tile using Stratified Reservoir Sampling (SRS) guided by a coarse PDF.
  2. Small-Tile Resampling: We resample smaller tiles from the big-tile reservoirs using a more refined PDF for high-quality light samples.

By sharing light samples across small tile pixels, the algorithm amortizes resampling costs and improves GPU coherence, achieving efficient performance even on constrained devices. We will discuss the design choices, implementation details, and performance optimizations that make this approach scalable down to entry-level mobile hardware.
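The two-stage scheme described above can be sketched in a few lines of Python. This is only a hedged illustration of the idea, not HypeHype's implementation: the light weights, reservoir size, and helper names are all invented for the example, and the real system runs per GPU tile rather than on the CPU.

```python
import random

def weighted_reservoir_pick(indices, weights, k, rng):
    """Pick k indices with probability proportional to weight using
    Efraimidis-Spirakis keys (a one-pass weighted reservoir sample)."""
    keyed = [(rng.random() ** (1.0 / w), i)
             for i, w in zip(indices, weights) if w > 0.0]
    keyed.sort(reverse=True)
    return [i for _, i in keyed[:k]]

def big_tile_lights(coarse_pdf, k, rng):
    # Stage 1: per big screen tile, keep a small reservoir of candidate
    # lights chosen against a coarse per-tile PDF (e.g. intensity/distance).
    return weighted_reservoir_pick(range(len(coarse_pdf)), coarse_pdf, k, rng)

def small_tile_light(candidates, refined_pdf, rng):
    # Stage 2: resample one light from the big-tile reservoir with a more
    # refined PDF; all pixels in the small tile then share this sample,
    # which amortizes resampling cost and keeps GPU waves coherent.
    weights = [refined_pdf[i] for i in candidates]
    total = sum(weights)
    if total == 0.0:
        return None
    r, acc = rng.random() * total, 0.0
    for i, w in zip(candidates, weights):
        acc += w
        if r <= acc:
            return i
    return candidates[-1]

rng = random.Random(7)
coarse = [1.0, 4.0, 0.5, 8.0, 2.0, 0.1, 3.0, 6.0]   # one weight per light
cands = big_tile_lights(coarse, k=4, rng=rng)
refined = [w * 1.5 for w in coarse]                  # stand-in refined PDF
light = small_tile_light(cands, refined, rng)
```

Sharing the stage-2 sample across a small tile's pixels is what makes the cost fixed: per-pixel work reduces to shading a small, bounded number of shared light samples.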

 

Speaker Bio:


Jarkko Lempiäinen is a veteran graphics programmer who has been professionally developing real-time rendering technology for games and interactive applications since 1999. Before joining HypeHype as Principal Graphics Engineer to help shape its mobile platform, he contributed to numerous large-scale AAA productions, working across multiple studios and game engines in both lead and senior engineering roles.

 

His passion for computer graphics began well before his professional career. In the early 1980s, he started writing small games on the ZX Spectrum 48 and continued experimenting with graphics on 286-era PCs - long before the advent of GPUs. Starting from the early ’90s, he was active in the demoscene, sharpening his skills in low-level optimization and visual effects. That deep-rooted enthusiasm for pushing visual fidelity and technical limits continues to inspire his work today.

 

 

Materials (Updated 8/25/2025): PPTX (108 MB), PDF (3 MB)

 

 

 

REAL-TIME SUBSURFACE SCATTERING VIA HYBRID RESTIR-PATH-TRACING AND DIFFUSION


Abstract: This presentation introduces a novel hybrid solution for real-time subsurface scattering (SSS) that approaches path-traced quality while maintaining interactive performance. Traditional real-time SSS relies on diffusion approximations, but these methods often produce artifacts when handling thin or curved regions, such as nostrils and ears, due to their semi-infinite medium assumption. To address this limitation, we combine a brute-force volumetric path tracing component for single scattering with a newly derived, physically based diffusion profile for multiple scattering. This hybrid approach accurately captures the subtle transmission and scattering behaviors across a wide range of translucent materials, improving visual fidelity and reducing unwanted artifacts. Additionally, we demonstrate how ReSTIR can be integrated into this workflow to accelerate path-traced single-scattering computation. By leveraging ReSTIR’s reservoir-based sampling, our method strategically reuses samples over space and time, reducing noise and computational cost without sacrificing quality. As a result, users can achieve the appearance of path tracing—even in challenging scenarios—at frame rates suitable for real-time applications.

Attendees will learn practical strategies for implementing this hybrid approach in their own rendering pipelines. We emphasize how to balance physical rigor with efficient sampling and filtering, ensuring consistent, near-ground-truth results for numerous types of assets and lighting conditions. This talk empowers developers and technical artists to adopt path-tracer-level SSS in real-time environments, marking a significant step toward more lifelike digital humans, film-quality virtual production, and other use cases demanding high-fidelity translucency at interactive speeds.
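As background on the ReSTIR reuse the abstract mentions, its core primitive is a single-sample weighted reservoir that can be updated in a stream and merged across pixels and frames. The sketch below is a generic illustration of that primitive only (not the presented renderer's code); the candidate names and weights are hypothetical.

```python
import random

class Reservoir:
    """Single-sample streaming reservoir, the building block of
    ReSTIR-style resampled importance sampling (RIS)."""
    def __init__(self):
        self.sample = None   # current winning candidate
        self.w_sum = 0.0     # running sum of candidate weights
        self.m = 0           # number of candidates seen so far

    def update(self, candidate, weight, rng):
        # Keep the new candidate with probability weight / w_sum, so each
        # candidate ends up selected proportionally to its weight.
        self.m += 1
        self.w_sum += weight
        if self.w_sum > 0.0 and rng.random() < weight / self.w_sum:
            self.sample = candidate

def merge(a, b, rng):
    # Spatiotemporal reuse: combine a neighbor's (or last frame's)
    # reservoir with ours without revisiting its candidates.
    out = Reservoir()
    for r in (a, b):
        if r.m > 0:
            out.update(r.sample, r.w_sum, rng)
            out.m += r.m - 1
    return out

rng = random.Random(3)
r1, r2 = Reservoir(), Reservoir()
for c, w in [("pathA", 0.2), ("pathB", 1.5), ("pathC", 0.6)]:
    r1.update(c, w, rng)
for c, w in [("pathD", 0.9), ("pathE", 0.1)]:
    r2.update(c, w, rng)
combined = merge(r1, r2, rng)
```

Repeating this merge across neighboring pixels and the previous frame is how ReSTIR reuses samples "over space and time" at low cost, which the talk applies to single-scattering computation.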

 

Speaker Bio:


Tianyi "Tanki" Zhang is a senior real-time rendering engineer at NVIDIA, working on the Omniverse RTX Renderer. His focus includes real-time path-traced light transport algorithms and systems, real-time path tracing of neural graphics primitives (Gaussian splatting, NeRF), and ultra-realistic XR pipelines integrating path tracing and compositing. Previously, Tanki was a Rendering Engineer Intern at Epic Games, contributing to real-time ray tracing features for Unreal Engine. With a background in computer science, game development, and art design, he is passionate about pushing the boundaries of computer graphics and shaping its future.

 

Materials (Updated 8/25/2025): PPTX (395 MB), PDF (6.3 MB)

 

 

 

MEGALIGHTS: STOCHASTIC DIRECT LIGHTING IN UNREAL ENGINE 5


Abstract: MegaLights, Unreal Engine 5’s new stochastic direct lighting path, enables artists to place orders of magnitude more dynamic, shadowed area lights than ever before. It is designed to support current-generation consoles and leverages ray tracing to enable realistic soft shadows from various types of area lights.

We surveyed existing explicit light sampling techniques, such as light hierarchies and reservoir resampling-based approaches, and found them difficult to scale to current-generation consoles. Our talk will discuss how we overcame these challenges.

We will explain all parts of a complete direct lighting solution: light sampling, ray guiding, the ray tracing pipeline, handling of ray tracing geometry mismatches, scalability, translucency, volumetric effects, sample shading, and denoising, all while fitting within the target hardware constraints.

Speaker Bios:


 

Krzysztof Narkowicz is an Engineering Fellow in Graphics at Epic Games, where he co-founded Lumen and works on lighting. Prior to that, he spent over a decade working on smaller game titles at 11 Bit Studios and Flying Wild Hog. He loves working with artists, pretty pixels and coding “close to the metal”.

 


Tiago Costa is a Principal Rendering Programmer at Epic Games, working on ray tracing and other rendering features. Previously he worked at Rockstar North, Apple and Meta Reality Labs.

 

Materials (Updated 8/25/2025): PDF (14 MB)

 

 

 

 

Direct contact:

ceruleite.bsky.social
Mastodon @mirror2mask