Course Description

Advances in real-time graphics research and the ever-increasing power of mainstream GPUs and consoles continue to generate an explosion of innovative algorithms suitable for fast, interactive rendering of complex and engaging virtual worlds. Every year, the latest video games employ a wide variety of sophisticated algorithms to produce ground-breaking 3D rendering that pushes the boundaries of visual fidelity and interactivity in rich environments.

 

This course is the next installment in the established series of SIGGRAPH courses on real-time rendering. It presents the best graphics practices and research from the game-development community and provides practical and production-proven algorithms. The focus of the course is on the intersection between the game-development community and state-of-the-art 3D graphics research, and the potential for cross-pollination of knowledge in future games and other interactive applications.

 

The first part of the course features speakers from several innovative game studios, including Bungie, Activision Blizzard, Insomniac Games, CCP, Vicarious Visions and Gobo Games, as well as researchers from Bosch and Disney Research. Topics covered in the first part include practical methods used in real game-rendering pipelines and shipping game engines, real-time global illumination and reflections, high-quality motion blur and ambient occlusion, complex lighting techniques, subsurface scattering and character rendering, practical approaches to shadow rendering in production scenarios, and other exciting production-proven techniques.

 

The second part of the course features speakers from the studios behind state-of-the-art games, such as Epic, Ubisoft, Firaxis, thatgamecompany, and Avalanche Studios, among others. Topics covered in the second part of the course include practical methods used in real game-rendering pipelines and shipping game engines, improvements to deferred shading and global illumination, new material models, particle rendering and simulation, post-processing pipelines, shader antialiasing, dynamic sand simulation and rendering, and other exciting production-proven techniques.

 

This is the course to attend if you are in the game development industry or want to learn the latest and greatest techniques in the real-time rendering domain!

Syllabus

Advances in Real-Time Rendering in Games: Part I

WEDNESDAY, 8 AUGUST 9:00 AM - 12:15 PM | Los Angeles Convention Center - Room 515AB

Advances in Real-Time Rendering in Games: Part II

WEDNESDAY, 8 AUGUST 2:00 PM - 5:15 PM | Los Angeles Convention Center - Room 515AB

Advances in Real-Time Rendering in Games: Part I

WEDNESDAY, 8 AUGUST 9:00 AM - 12:15 PM | Los Angeles Convention Center - Room 515AB

09:00 am

Natalya Tatarchuk (Bungie)

Welcome!

 

09:15 am

Padraic Hennessy (Vicarious Visions)

Scalable High Quality Motion Blur and Ambient Occlusion

 

10:00 am

Hugh Malan (CCP)

Real-Time Global Illumination and Reflections in Dust 514

 

10:40 am

Jorge Jimenez (Activision Blizzard)

Separable Subsurface Scattering & Photorealistic Eyes Rendering

 

11:10 am

Lei Yang (Bosch Research North America) and Huw Bowles (Gobo Games)

Accelerating Rendering Pipelines Using Bidirectional Iterative Reprojection

 

11:40 am

Mike Acton (Insomniac)

CSM Scrolling, an Acceleration Technique for the Rendering of Cascaded Shadow Maps

 

12:10 pm

Q&A

 

Advances in Real-Time Rendering in Games: Part II

WEDNESDAY, 8 AUGUST 2:00 PM - 5:15 PM | Los Angeles Convention Center - Room 515AB

 

02:00 pm

Natalya Tatarchuk (Bungie)

Welcome Back!

 

02:05 pm

Martin Mittring (Epic)

The Technology Behind the Unreal Engine 4 Elemental Demo

 

03:05 pm

Stephen Hill (Ubisoft) & Dan Baker (Firaxis)

Rock-Solid Shading: Image Stability without Sacrificing Detail

 

04:00 pm

John Edwards (thatgamecompany)

Dynamic Sand Simulation and Rendering in Journey

 

04:20 pm

Emil Persson (Avalanche Studios)

Graphics Gems for Games: Findings from Avalanche Studios

 

04:50 pm

Game Industry Panel: Advances in Real-Time Graphics and Practical Challenges of Game Development


Scalable High Quality Motion Blur and Ambient Occlusion

Abstract: In this talk, the author will discuss advances and extensions to classic post-processing techniques for Motion Blur and Ambient Occlusion, to be featured in an upcoming AAA videogame. The advances allow a dramatic increase in visual quality and a reduction in objectionable artifacts while maintaining the performance of previous techniques. In some cases asset-authoring costs are also reduced. The resulting techniques scale from current-generation consoles to DirectX 11® by simply adjusting meaningful parameters. Specifically, the presenter will discuss how carefully selecting fall-off functions for Ambient Occlusion and observing the phenomenological characteristics of Motion Blur can lead to substantial reductions in the computational cost of these otherwise expensive effects.
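
The remark about fall-off functions is worth a small, concrete illustration. The C++ sketch below shows one common style of ambient-occlusion fall-off that reaches exactly zero at the sampling radius, which is the property that makes it safe to skip or coarsen samples beyond that radius. It is only an assumed example; the talk's actual function and parameters are not given in this abstract.

    #include <algorithm>
    #include <cstdio>

    // Illustrative ambient-occlusion fall-off (not the presenter's specific choice):
    // a smooth polynomial that reaches exactly zero at the sampling radius, so
    // contributions (and texture fetches) beyond the radius can be skipped entirely.
    float aoFalloff(float distance, float radius)
    {
        float x = std::min(distance / radius, 1.0f); // normalized distance, clamped to [0,1]
        float t = 1.0f - x * x;                      // quadratic core, zero at the radius
        return t * t;                                // squared for a softer tail
    }

    int main()
    {
        for (float d = 0.0f; d <= 1.01f; d += 0.25f)
            std::printf("d = %.2f  falloff = %.3f\n", d, aoFalloff(d, 1.0f));
        return 0;
    }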

Presenters:

Padraic Hennessy

Affiliation:

Vicarious Visions

Bios:

Padraic Hennessy is a Senior Graphics Engineer at Vicarious Visions, an Activision Blizzard studio. Over the past five years he has worked on some of Activision’s largest franchises. Padraic is a primary member of the studio’s Visual Alchemy Team. Work from this team, including the topics being presented, has been published at SIGGRAPH, I3D, and HPG, and will appear in an upcoming chapter of GPU Pro 4. The team has also worked on AAA franchises such as Doom, Marvel Ultimate Alliance, and Guitar Hero. While graphics is his main focus at the studio, he has also contributed as a core engine architect, tools engineer, network engineer, and gameplay systems engineer. When not developing new graphics techniques, Padraic strives to improve artist workflows and to help artists understand complex graphics techniques through training seminars. Padraic holds a B.S. in Computer Engineering from Binghamton University (2006).

 

Materials:
(Updated September 8th 2012)

PowerPoint Slides (74 MB), PDF Slides (16 MB), Video 1 (21 MB), Video 2 (31 MB)


Real-Time Global Illumination and Reflections in Dust 514

Abstract: This talk will present a method for approximating the first bounce of diffuse global illumination, and also obtaining the approximate point hit by a reflection ray. The method is fast enough to be affordable for current console games; it is shipping in Dust 514 by CCP, for the Sony PlayStation 3®. The method works by dividing the scene into layers and building a height field imposter for each layer. The imposters are updated at runtime, so dynamic changes to lighting, shadowing, and materials are supported. Performance details, quality problems with the method, and its use in Dust 514 will be covered, as well as some of the improvements possible with a larger rendering budget.

 

Presenters:

Hugh Malan (CCP Games)

 

Bios:

Hugh Malan is a senior graphics programmer working on Dust 514 at CCP in Newcastle. Previously he was graphics lead for Crackdown and MyWorld at Realtime Worlds, and developed the "Realworldz" real-time procedural planet demo for 3Dlabs. Hugh is a graduate of Victoria University and Otago University, New Zealand.

 

Materials:
(Updated August 24th 2012)

PPT Slides (13 MB)

 


Separable Subsurface Scattering and Photorealistic Eyes Rendering

Abstract: In this session, the author will present a technique to simulate subsurface scattering for human skin that runs at a cost similar to a simple bloom shader. Previous real-time approaches approximate the non-separable diffusion kernel using a sum of Gaussians, which requires several (usually five) 1D convolutions. In this work we decompose the exact 2D diffusion kernel into only two 1D functions. This allows rendering subsurface scattering with only two screen-space convolutions, reducing both time and memory without a decrease in quality. A technique to render ambient subsurface scattering will also be presented. The author will also show our latest advances in photorealistic eye rendering, including realistic reflections, refraction of view and light rays, caustics, ambient occlusion, eye redness, asset modeling and tear-fluid representation.
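
To make the cost argument concrete, here is a minimal C++ sketch of the separable structure: one horizontal and one vertical 1D convolution over a single-channel image. It deliberately omits the parts that make the real technique work for skin (per-channel kernels fitted to the diffusion profile, depth-dependent sample offsets), so treat it purely as an illustration of why two 1D passes are far cheaper than one 2D kernel of the same footprint.

    #include <algorithm>
    #include <vector>

    // Sketch only: single-channel image, row-major layout, clamped borders.
    using Image = std::vector<float>; // width * height values

    Image convolve1D(const Image& src, int width, int height,
                     const std::vector<float>& kernel, bool horizontal)
    {
        Image dst(src.size(), 0.0f);
        const int radius = static_cast<int>(kernel.size()) / 2;
        for (int y = 0; y < height; ++y)
            for (int x = 0; x < width; ++x)
            {
                float sum = 0.0f;
                for (int k = -radius; k <= radius; ++k)
                {
                    const int sx = horizontal ? std::clamp(x + k, 0, width - 1)  : x;
                    const int sy = horizontal ? y : std::clamp(y + k, 0, height - 1);
                    sum += kernel[k + radius] * src[sy * width + sx];
                }
                dst[y * width + x] = sum;
            }
        return dst;
    }

    // Two 1D passes stand in for the 2D diffusion kernel: O(n) samples per pixel
    // instead of O(n^2) for an n-tap kernel.
    Image separableDiffusion(const Image& irradiance, int width, int height,
                             const std::vector<float>& kernel)
    {
        const Image h = convolve1D(irradiance, width, height, kernel, true);
        return convolve1D(h, width, height, kernel, false);
    }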

Presenters:

Jorge Jimenez (Activision Blizzard)

Bios:

Jorge Jimenez is a real-time graphics researcher at Activision Blizzard. He received his PhD in real-time graphics from Universidad de Zaragoza (Spain) in 2012. His interests include real-time photorealistic rendering, special effects, and squeezing rendering algorithms to be practical in game environments. He has made numerous contributions to books, journals and conferences, including the GPU Pro series, Game Developer Magazine and ACM Transactions on Graphics. He co-organized the SIGGRAPH 2011 course on filtering approaches for real-time anti-aliasing, declaring open war against the jaggies. Some of his key achievements include Jimenez's MLAA, SMAA and the separable subsurface scattering technique.

 

Materials:
(Updated August 24th 2012)

PowerPoint Slides (162 MB)

Accelerating Rendering Pipelines Using Bidirectional Iterative Reprojection

Abstract: In this talk, the authors will discuss some recent tools that aim to avoid redundant rendering computation through pixel data reuse. In particular, a new frame-to-frame pixel reprojection approach based on an iterative frame warping algorithm will be presented. This approach is completely image-based, simple to implement and very efficient on legacy hardware. Based on this algorithm, the authors will then introduce Bidirectional Reprojection, a scheme that boosts frame rate by reconstructing interpolated frames from neighboring pairs of rendered frames on the fly. Reusing data from both temporal directions significantly improves the accuracy and efficiency of data reuse, since very few pixels are simultaneously occluded in both sources. Finally, the authors will provide some implementation details, demonstrate convincing results, and discuss the practicalities of integrating these algorithms into production renderers.
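
As a rough illustration of what an "iterative frame warping algorithm" can look like, the sketch below shows a fixed-point warping iteration reduced to one dimension. The names and simplifications are assumptions made for this example; it is not the authors' implementation. In a bidirectional scheme, the same search is run against both neighboring rendered frames and the two results are blended, so a pixel occluded in one source is usually still visible in the other.

    #include <cstdio>
    #include <functional>

    // To reconstruct an in-between frame at parameter t in (0,1), each destination
    // pixel xDst asks which source pixel, displaced by t times its motion vector,
    // lands on it.  Instead of scattering source pixels, iterate
    //     x[k+1] = xDst - t * motion(x[k])
    // which converges quickly wherever the motion field is smooth.
    float iterativeWarpSource(float xDst, float t,
                              const std::function<float(float)>& motion,
                              int iterations = 3)
    {
        float x = xDst;                   // initial guess: assume no motion
        for (int i = 0; i < iterations; ++i)
            x = xDst - t * motion(x);     // fixed-point update
        return x;                         // fetch the source frame at this coordinate
    }

    int main()
    {
        // Hypothetical motion field: the whole image slides 4 pixels per frame.
        auto motion = [](float) { return 4.0f; };
        std::printf("destination pixel 100 reads from source pixel %.1f\n",
                    iterativeWarpSource(100.0f, 0.5f, motion));
        return 0;
    }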

Presenters:

Lei Yang (Bosch Research North America) and Huw Bowles (Gobo Games)

Bios:

Lei Yang is a Research Scientist/Engineer in the Visual Computing group of the Bosch Research and Technology Center, Palo Alto. He obtained his PhD from The Hong Kong University of Science and Technology in 2011. Lei's current work spans a wide variety of topics related to real-time rendering. Before joining Bosch, he also spent two summers at Adobe ATL and Black Rock Studio, working on production rendering systems. On the subject of exploiting coherence in computer graphics, he has co-instructed a course and published several papers at renowned graphics conferences.

 

Huw Bowles is a Research Scientist/Engineer at Gobo Games, a game-development studio in Brighton, UK. He obtained his MSc from ETH Zurich, during which time he worked as an intern at Disney Research Zurich on a non-photorealistic rendering project. He later joined Black Rock Studio/Disney Interactive Studios during production of the console racing game Split/Second, where he worked on a screen-space stereoscopic 3D conversion. He further developed and subsequently published this work at Eurographics 2012, and has since been actively advocating the use of reprojection to exploit coherence in rendering pipelines.

 

Materials:
(Updated August 24th 2012)

PowerPoint Slides (15 MB)

CSM Scrolling, an Acceleration Technique for the Rendering of Cascaded Shadow Maps

Abstract: This talk will explain how a bitmap-scrolling technique, whose inspiration derives from the era of 8-bit games, can be combined with a shadow map caching scheme to significantly increase the performance of real-time cascaded shadow mapping in games. The two systems integrate well into the standard model of cascaded shadow mapping, and take advantage of frame-to-frame coherence to preserve much of the rendered shadow map information across frames. The technique is well-suited to current games consoles, and will ship in Overstrike, the forthcoming title by Insomniac.
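
To give a flavor of the frame-to-frame coherence being exploited, here is a small C++ sketch of the dirty-region bookkeeping a scrolled shadow-map cache might use. It reflects a generic scrolled-cache scheme with hypothetical names, not Insomniac's actual system: when the cascade's light-space window is snapped to the texel grid and shifts by (dx, dy) texels, only the newly exposed border strips need shadow casters re-rendered, while the rest of the map survives from the previous frame (typically addressed with wrap-around, toroidal coordinates, much like 8-bit tile scrolling).

    #include <cstdio>
    #include <vector>

    struct Rect { int x, y, w, h; };

    // Returns the border strips exposed when the cascade window moves by (dx, dy) texels.
    std::vector<Rect> exposedStrips(int mapSize, int dx, int dy)
    {
        std::vector<Rect> dirty;
        if (dx > 0)      dirty.push_back({ mapSize - dx, 0, dx, mapSize }); // strip on the +x edge
        else if (dx < 0) dirty.push_back({ 0, 0, -dx, mapSize });           // strip on the -x edge
        if (dy > 0)      dirty.push_back({ 0, mapSize - dy, mapSize, dy }); // strip on the +y edge
        else if (dy < 0) dirty.push_back({ 0, 0, mapSize, -dy });           // strip on the -y edge
        return dirty; // the corner where two strips overlap is simply rendered twice
    }

    int main()
    {
        for (const Rect& r : exposedStrips(1024, 16, -8))
            std::printf("re-render rect: x=%d y=%d w=%d h=%d\n", r.x, r.y, r.w, r.h);
        return 0;
    }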

 

Presenter:

Mike Acton (Insomniac)

Bio:

Mike Acton is Engine Director at Insomniac Games. When he’s not searching for new ways to optimize Insomniac’s engine, he dreams up new ways to help the development community. Mike can often be found extolling the virtues of understanding the data and hardware first along with programming for performance.

 

Materials:
(Updated August 24th 2012)

PowerPoint Slides (1.2 MB)

PDF Slides (1 MB)

 


The Technology Behind the Unreal Engine 4 Elemental Demo

Abstract: The Elemental demo was developed to demonstrate the capabilities and drive the development of the new Unreal Engine 4. In this talk we want to present some technical details on our implementation of the UE4 rendering features, along with the goals that shape them.

 

Starting from the Unreal Engine 3 code base, major changes to the rendering internals have been implemented. The real-time demo takes per-pixel deferred shading to the next level: light emitters can be image-based light sources, area point lights or emissive materials. Our new light transport method, based on voxel cone tracing, supports dynamic and static geometry, and it shines when it comes to glossy materials. Lighting affects both opaque and translucent materials, subsurface scattering and deferred decals. The lighting is complemented by the new GPU-accelerated particle simulation and the new post-processing pipeline.
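
Because "voxel cone tracing" carries most of the technical weight in that description, a generic sketch of the accumulation loop may help readers unfamiliar with the idea. It follows the general approach in the literature (Crassin et al. 2011), with placeholder types and a stubbed voxel fetch; it is emphatically not the UE4 code, whose specifics are not given in this abstract.

    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };
    struct Vec4 { float r, g, b, a; };

    // Placeholder for a trilinear fetch from a pre-filtered (mip-mapped) voxelization
    // of the scene storing radiance and occlusion; here it just returns uniform fog.
    static Vec4 sampleVoxels(const Vec3& /*position*/, float /*mipLevel*/)
    {
        return { 0.2f, 0.2f, 0.2f, 0.05f };
    }

    // March a cone through the voxel mips, compositing front to back.
    Vec4 traceCone(const Vec3& origin, const Vec3& dir, float halfAngleTan,
                   float voxelSize, float maxDist)
    {
        Vec4 acc = { 0.0f, 0.0f, 0.0f, 0.0f };
        float dist = voxelSize;                      // start one voxel out to avoid self-lighting
        while (dist < maxDist && acc.a < 1.0f)
        {
            const float diameter = 2.0f * dist * halfAngleTan;                 // cone footprint here
            const float mip = std::log2(std::max(diameter / voxelSize, 1.0f)); // matching pre-filter level
            const Vec3 p = { origin.x + dir.x * dist,
                             origin.y + dir.y * dist,
                             origin.z + dir.z * dist };
            const Vec4 s = sampleVoxels(p, mip);
            const float vis = 1.0f - acc.a;          // remaining visibility
            acc.r += vis * s.a * s.r;
            acc.g += vis * s.a * s.g;
            acc.b += vis * s.a * s.b;
            acc.a += vis * s.a;
            dist  += 0.5f * diameter;                // step grows with the footprint
        }
        return acc;
    }

    int main()
    {
        const Vec4 gi = traceCone({ 0, 0, 0 }, { 0, 0, 1 }, std::tan(0.3f), 0.1f, 50.0f);
        std::printf("cone result: %.3f %.3f %.3f (alpha %.3f)\n", gi.r, gi.g, gi.b, gi.a);
        return 0;
    }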

 

Presenter:

Martin Mittring

Affiliation:

Epic

Bio:

Martin always wanted to work on high-end real-time graphics, so he targeted the small workstation world for his career. Luckily the world changed and the big PC world became his playground. He started at a game company in Munich, joined Crytek in Coburg, Germany, and shipped the technically advanced PC shooter “Far Cry” as Lead Network Programmer. He then became Lead Graphics Programmer and worked with many others on CryEngine, shipping the famous “Crysis” shooter. About three years ago he moved to the United States and is now working as Senior Graphics Architect at Epic Games, pushing pixels in the Unreal Engine. After working on Unreal Engine 3 and shipping “Gears of War 3”, he now works on the next generation: Unreal Engine 4.

 

Materials:
(Updated August 24th 2012)

PowerPoint Slides (47 MB)


Rock-Solid Shading: Image Stability Without Sacrificing Detail

Abstract: Over the last decade, huge improvements in hardware have increased the visual fidelity of games to unprecedented levels. However, even the best-looking games still lack the visual cleanness of animated movies from the mid-90s, despite exceeding them in level of detail. One reason is the lack of solid anti-aliasing in our shaders, which has typically been avoided because it was considered too expensive.

 

This talk will focus on advances in hardware and research that enable practical solutions to shader anti-aliasing and appearance preservation in the context of physically-based rendering. We will present the underlying theory behind recent work, implementation trade-offs, content creation and pipeline implications, as well as integration with popular rendering techniques such as deferred shading. We will also cover ongoing investigations into environmental lighting and geometric sources of shader aliasing.
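
For readers who want one concrete anchor for "shader anti-aliasing", the sketch below shows the classic Toksvig adjustment, a widely used building block in this space: the length of a filtered (mip-mapped) normal measures how much the underlying normals disagree, and the specular power is reduced accordingly so highlights stay stable under minification. It is offered as background under that assumption; the exact formulations and trade-offs presented in the talk may differ.

    #include <cmath>
    #include <cstdio>

    // Toksvig-style specular anti-aliasing: shortened filtered normals -> wider specular lobe.
    float toksvigPower(float filteredNormalLength, float specularPower)
    {
        const float len = std::fmax(filteredNormalLength, 1e-4f);     // guard against zero-length normals
        const float ft  = len / (len + specularPower * (1.0f - len)); // Toksvig factor in (0, 1]
        return ft * specularPower;                                    // effective, reduced power
    }

    int main()
    {
        std::printf("|n| = 1.00 -> power %6.1f\n", toksvigPower(1.00f, 256.0f));
        std::printf("|n| = 0.95 -> power %6.1f\n", toksvigPower(0.95f, 256.0f));
        std::printf("|n| = 0.80 -> power %6.1f\n", toksvigPower(0.80f, 256.0f));
        return 0;
    }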

 

Presenters:

Stephen Hill (Ubisoft) & Dan Baker (Firaxis)

Bios:

Stephen Hill is a 3D Technical Lead at Ubisoft Montreal. He has previously worked on Splinter Cell: Chaos Theory, developing new effects for the first wave of Shader Model 3.0 hardware, followed by Splinter Cell: Conviction, where he created many of the core sub-systems and supporting tools of a (then) new renderer for Xbox 360. He presented highlights of this work at Gamefest and GDC.

 

Dan Baker is a rendering architect for Firaxis Games. He helped architect and build Firaxis’ new engine technology, which is built around D3D11 and GPU Compute. With over a decade of experience, Dan began his career at Microsoft working on some of the very first shaders for DirectX 8, built the back end of the first version of HLSL, and later led the development of HLSL for D3D10. Dan is a regular speaker at many conferences, including SIGGRAPH, GDC, I3D, Gamefest, and EGSR. His main areas of expertise include graphics architecture and lighting theory. While at Firaxis, Dan has worked on Sid Meier’s Railroads!, Sid Meier’s Civilization Revolution, and most recently Sid Meier’s Civilization V.

 

Materials:
(Updated September 2012)

PowerPoint Slides (16 MB)

PDF Slides (17.6 MB)

 


Dynamic Sand Simulation and Rendering in Journey

Abstract: In this talk, the author will describe the techniques used to create the dynamic sand dunes in the PlayStation® Network title Journey. Specifically, the physics of footprints and trails will be discussed, as well as how to convey the sense of trillions of sparkling grains of sand on a TV that doesn't have the resolution to display them and a console that doesn't have the horsepower to render them.

Presenter:

John Edwards

Affiliation:

thatgamecompany

Bio:

John Edwards is Lead Programmer at thatgamecompany, where he worked on the PlayStation Network games flOw, Flower and Journey. John has presented on technical and design topics at SIGGRAPH, GDC, and Sony's internal tech conference. Growing up in the Pacific Northwest, with its lush grass and forests, and the occasional oppressive rain that makes them possible, taught him the joys of both exploring nature and spending too much time playing games in front of the TV. He now tries to reconcile the two pursuits by creating simulations that bring the feeling of the great outdoors into the living room.

 

Materials:
(Updated August 24th 2012)

PowerPoint Slides (8 MB)

Video 1 (10 MB), Video 2 (24 MB), Video 3 (12 MB)

Graphics Gems for Games: Findings from Avalanche Studios

Abstract: This talk will cover a collection of rendering techniques addressing common issues in games today. A particle trimming algorithm is described that automatically finds an optimal enclosing polygon for a given texture and target vertex count, resulting in substantial performance gains. Also covered is a method for instancing a mix of diverse meshes together in a single draw call where traditional instancing does not work. An analytical antialiasing scheme based on a second-depth buffer is described, as well as a method for eliminating aliasing from thin geometry such as phone-wires.
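
The phone-wire item lends itself to a tiny illustration. The sketch below captures the commonly described recipe for anti-aliasing sub-pixel geometry, written with hypothetical names and constants; read it as a sketch of that general idea rather than as Avalanche Studios' implementation: never let the rendered radius of the wire drop below roughly half a pixel's footprint at its distance, and fade the wire's opacity by the ratio of true to inflated radius so thin wires dim out instead of shimmering.

    #include <algorithm>
    #include <cstdio>

    // pixelsPerUnitAngle would come from the projection and viewport height (assumption).
    void phoneWireAA(float trueRadius, float distance, float pixelsPerUnitAngle,
                     float* drawRadius, float* alpha)
    {
        // smallest radius whose projection still covers about one pixel at this distance
        const float minRadius = 0.5f * distance / pixelsPerUnitAngle;
        *drawRadius = std::max(trueRadius, minRadius);
        *alpha      = trueRadius / *drawRadius; // sub-pixel wires fade instead of aliasing
    }

    int main()
    {
        float r = 0.0f, a = 0.0f;
        phoneWireAA(0.01f, 200.0f, 1000.0f, &r, &a); // a 1 cm wire seen from 200 m
        std::printf("draw radius %.3f m, alpha %.2f\n", r, a);
        return 0;
    }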

Presenter:

Emil Persson

Affiliation:

Avalanche Studios

Bio:

Emil Persson is a Senior Graphics Programmer at Avalanche Studios, working on rendering techniques for large-scale open-world games. Previously he was an ISV Engineer at ATI/AMD, assisting game developers with high-end rendering techniques and optimizations.

 

Materials:
(Updated August 24th 2012)

PowerPoint Slides (6 MB)


Contact: