
A short list of rendering techniques used in modern AAA games

By Oleg Sidorkin, CTO and Co-Founder of Cinevva

Figure: Composite scene showing PBR materials, virtualized geometry, and ray-traced lighting in a modern AAA game.

If you crack open the rendering pipeline of a 2026 AAA game, most of what you'll find traces back to a small set of techniques shared across studios. The names change between engines, but the ideas are the same. Here's a short list of what's actually doing the work on screen, with a figure for each.

1. Physically based rendering (PBR)

Materials are described by albedo, roughness, metallic, normal, and ambient occlusion textures, and lit by energy-conserving shaders (Cook-Torrance specular, Lambertian or Disney diffuse). It's the baseline every modern AAA engine assumes. If a surface looks consistent under sunlight, lamp light, and a flashlight, PBR is why.

Figure: Iron helmet rendered with PBR materials, with floating texture swatches showing albedo, roughness, metallic, and normal maps.
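To make the shading model concrete, here is a minimal scalar Python sketch of the Cook-Torrance specular term (GGX normal distribution, Smith-Schlick geometry, Schlick Fresnel). Real engines evaluate this per pixel in shader code over full RGB; the function names and the UE4-style k remap in the geometry term are illustrative choices, not a canonical implementation.

```python
import math

def ggx_ndf(n_dot_h, roughness):
    # GGX / Trowbridge-Reitz normal distribution function
    a2 = (roughness * roughness) ** 2
    d = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * d * d)

def smith_g1(n_dot_x, roughness):
    # Schlick-GGX masking term for one direction (UE4-style k remap)
    k = (roughness + 1.0) ** 2 / 8.0
    return n_dot_x / (n_dot_x * (1.0 - k) + k)

def fresnel_schlick(v_dot_h, f0):
    # Schlick's approximation to Fresnel reflectance
    return f0 + (1.0 - f0) * (1.0 - v_dot_h) ** 5

def cook_torrance_specular(n_dot_l, n_dot_v, n_dot_h, v_dot_h, roughness, f0):
    # Specular BRDF: D * G * F / (4 * NdotL * NdotV)
    d = ggx_ndf(n_dot_h, roughness)
    g = smith_g1(n_dot_v, roughness) * smith_g1(n_dot_l, roughness)
    f = fresnel_schlick(v_dot_h, f0)
    return d * g * f / max(4.0 * n_dot_l * n_dot_v, 1e-6)
```

For dielectrics f0 is typically around 0.04; metals take f0 from the albedo map, which is exactly what the metallic texture selects between.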


2. Deferred and visibility-buffer shading

Geometry first writes attributes (normals, material IDs, depth) into a G-buffer or visibility buffer. Lighting runs as a fullscreen pass that reads those buffers and shades each pixel once. Visibility buffers (used by Nanite and similar) push this further by storing only triangle IDs and resolving material parameters per pixel later, which keeps overdraw cheap on dense geometry.

Figure: Deferred shading G-buffer split into final color, world normals, depth, and material ID panels.
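A toy Python version of the two passes (names hypothetical, a 4x4 "frame" of Python lists standing in for GPU render targets) shows the key property: lighting cost scales with pixel count, not with scene geometry.

```python
W = H = 4  # tiny frame for illustration

def geometry_pass():
    # Pass 1: rasterize surface attributes into the G-buffer; no lighting yet
    return {
        "albedo": [[(0.8, 0.2, 0.2)] * W for _ in range(H)],
        "normal": [[(0.0, 0.0, 1.0)] * W for _ in range(H)],
        "depth":  [[1.0] * W for _ in range(H)],
    }

def lighting_pass(gbuffer, light_dir):
    # Pass 2: fullscreen pass reads the G-buffer and shades each pixel once
    out = [[None] * W for _ in range(H)]
    for y in range(H):
        for x in range(W):
            n = gbuffer["normal"][y][x]
            n_dot_l = max(sum(a * b for a, b in zip(n, light_dir)), 0.0)
            out[y][x] = tuple(c * n_dot_l for c in gbuffer["albedo"][y][x])
    return out

frame = lighting_pass(geometry_pass(), light_dir=(0.0, 0.0, 1.0))
```

A visibility buffer shrinks pass 1 further: it stores only a triangle ID per pixel and defers even the attribute fetch to pass 2.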


3. Virtualized geometry (Nanite-style)

Meshes are pre-built into a hierarchy of clusters. At runtime the GPU streams and selects clusters at the resolution that matches each pixel, so you get sub-pixel-accurate detail without manual LODs. Unreal's Nanite is the most visible example. Other engines now ship their own variants. The practical result is film-quality assets in real time without LOD authoring.

Figure: Smooth render on the left, sub-pixel meshlet clusters in vivid colors on the right.
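The core runtime decision is cheap: project each cluster level's precomputed geometric error to pixels and pick the coarsest level that stays under a pixel. A hedged Python sketch of that selection, with made-up parameter defaults:

```python
import math

def projected_error_px(geometric_error, distance, fov_y, screen_height):
    # Perspective-project a world-space error bound to a pixel count
    return geometric_error * screen_height / (2.0 * distance * math.tan(fov_y / 2.0))

def select_cluster_level(cluster_errors, distance,
                         fov_y=math.radians(60), screen_height=1080):
    # cluster_errors: per-level geometric error, coarse -> fine (decreasing).
    # Pick the coarsest level whose error projects below one pixel.
    for level, err in enumerate(cluster_errors):
        if projected_error_px(err, distance, fov_y, screen_height) < 1.0:
            return level
    return len(cluster_errors) - 1
```

Nanite makes this hierarchical and parallel (cluster groups tested on the GPU), but the error-versus-distance test is the same idea.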


4. Real-time ray tracing for shadows, reflections, and AO

Hardware ray tracing (DXR, Vulkan RT) traces shadow rays, mirror and glossy reflection rays, and ambient occlusion rays against a BVH built each frame. Even a few rays per pixel beat what screen-space techniques can do, especially for off-screen reflections and contact shadows. Most games use it surgically, not for everything.

Figure: Glossy red sports car in a luxury showroom with accurate ray-traced reflections and contact shadows.
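Stripped to its essence, a shadow ray is an any-hit query between the shaded point and the light. This Python sketch tests analytic spheres instead of a BVH of triangles, purely for illustration:

```python
import math

def ray_sphere_t(origin, direction, center, radius):
    # Nearest positive intersection distance along a unit-length ray, or None
    oc = [o - c for o, c in zip(origin, center)]
    b = sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c
    if disc < 0.0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-4 else None  # epsilon avoids self-intersection

def in_shadow(point, light_pos, occluders):
    # Any hit strictly between the point and the light means shadow
    d = [l - p for l, p in zip(light_pos, point)]
    dist = math.sqrt(sum(x * x for x in d))
    direction = [x / dist for x in d]
    for center, radius in occluders:
        t = ray_sphere_t(point, direction, center, radius)
        if t is not None and t < dist:
            return True
    return False
```

In DXR or Vulkan RT the occluder loop is replaced by a hardware BVH traversal with an any-hit flag, which lets the query terminate on the first intersection.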


5. Software ray tracing (Lumen-style)

Not every player has an RTX card, so engines also ship distance-field or surface-cache fallbacks. Unreal's Lumen, for instance, traces against signed distance fields and surface caches for cheap diffuse GI, and only escalates to hardware rays when needed. It's how AAA games hit "ray-traced look" on consoles.

Figure: Cathedral interior with stained-glass color bleed and an SDF wireframe overlay revealing approximated geometry.
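The key trick is sphere tracing: a signed distance field tells you a radius guaranteed to be empty, so the ray can safely step by that amount. A minimal Python sketch against a single unit-sphere SDF (a stand-in for Lumen's per-mesh distance fields):

```python
import math

def scene_sdf(p):
    # Signed distance to a unit sphere at the origin
    return math.sqrt(sum(x * x for x in p)) - 1.0

def sphere_trace(origin, direction, max_steps=64, hit_eps=1e-3, max_dist=100.0):
    # March the ray by the conservative free distance the SDF guarantees
    t = 0.0
    for _ in range(max_steps):
        p = [o + t * d for o, d in zip(origin, direction)]
        d = scene_sdf(p)
        if d < hit_eps:
            return t  # hit
        t += d
        if t > max_dist:
            break
    return None  # miss
```

Because each step only needs a distance query, not exact geometry, the scene representation can be heavily approximated, which is what makes it affordable for diffuse GI rays.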


6. ReSTIR and reservoir sampling

For direct and indirect lighting with thousands of light sources, ReSTIR (reservoir-based spatiotemporal importance resampling) reuses light samples across pixels and frames. It's how games like Cyberpunk 2077 with Path Tracing keep noise low at one or two rays per pixel. Expect to see it in more engines as path tracing becomes the high-end target.

Figure: Cyberpunk street with ray-traced reflections in wet pavement and visible bounced light paths.
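The primitive underneath is weighted reservoir sampling: stream candidate light samples through a one-slot reservoir, keeping each with probability proportional to its weight. A simplified single-pixel Python sketch; real ReSTIR additionally merges reservoirs across neighboring pixels and previous frames and derives an unbiased contribution weight from w_sum, M, and the target pdf, all of which is omitted here.

```python
import random

class Reservoir:
    # Single-sample weighted reservoir (the core of ReSTIR's resampling step)
    def __init__(self):
        self.sample = None
        self.w_sum = 0.0
        self.M = 0  # number of candidates seen

    def update(self, sample, weight, rng):
        self.w_sum += weight
        self.M += 1
        # Keep the new candidate with probability weight / w_sum
        if self.w_sum > 0.0 and rng.random() < weight / self.w_sum:
            self.sample = sample

def resample_lights(light_weights, rng, num_candidates=8):
    # Stream a few random candidate lights through one pixel's reservoir
    r = Reservoir()
    for _ in range(num_candidates):
        i = rng.randrange(len(light_weights))
        r.update(i, light_weights[i], rng)
    return r
```

The payoff is that spatial and temporal reuse multiply the effective candidate count per pixel far beyond what you could afford to trace.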


7. Volumetric clouds, fog, and atmosphere

Skies are ray-marched through 3D noise and density volumes. Atmosphere uses precomputed scattering tables (Bruneton-style) for sun and moon transitions. Fog is a froxel grid (think a 3D texture aligned to the view frustum) that captures local lighting. Together they give you "weather as a system" instead of a skybox.

Figure: Mountain valley at sunset with volumetric clouds and god rays piercing through misty air.
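At the heart of clouds, fog, and froxel volumes is the same loop: march front to back, accumulate in-scattered light, and attenuate by Beer-Lambert transmittance. A scalar Python sketch with illustrative scattering and extinction coefficients:

```python
import math

def march_fog(densities, light_at_sample, step_size, sigma_s=0.5, sigma_t=1.0):
    # Front-to-back ray march through pre-sampled density and lighting.
    # sigma_s: scattering coefficient, sigma_t: extinction coefficient.
    transmittance = 1.0
    radiance = 0.0
    for density, light in zip(densities, light_at_sample):
        # In-scattered light, attenuated by transmittance back to the camera
        radiance += transmittance * sigma_s * density * light * step_size
        # Beer-Lambert attenuation across this step
        transmittance *= math.exp(-sigma_t * density * step_size)
    return radiance, transmittance
```

In a froxel fog system the densities and per-sample lighting come from a frustum-aligned 3D texture filled earlier in the frame, so this loop runs over a handful of slices per pixel.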


8. Cascaded shadow maps and virtual shadow maps

For sun shadows, cascaded shadow maps split the frustum into ranges and render each at appropriate resolution. Virtual shadow maps go further: a single huge shadow map is split into pages and only the pages visible from the camera get rendered. It's how AAA games keep crisp shadows near the player without a giant memory bill.

Figure: Outdoor scene with three colored cascade frustums overlaid showing shadow detail near the camera.
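Cascade placement usually blends uniform and logarithmic split distances (the "practical split scheme"), so near cascades stay tight without starving the far ones. A small Python sketch, with the blend factor as a tunable assumption:

```python
def cascade_splits(near, far, num_cascades, blend=0.75):
    # Blend uniform and logarithmic split distances per cascade boundary
    splits = []
    for i in range(1, num_cascades + 1):
        f = i / num_cascades
        uniform = near + (far - near) * f          # linear in view depth
        logarithmic = near * (far / near) ** f     # constant perspective error
        splits.append(blend * logarithmic + (1.0 - blend) * uniform)
    return splits
```

Virtual shadow maps sidestep the per-cascade resolution compromise entirely by paging one conceptually huge map, but the frustum-partitioning intuition above still explains why near-camera shadows get the most texels.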


9. Screen-space effects (SSAO, SSR, SSGI, SSSSS)

Reading the depth and normal buffers cheaply gives you ambient occlusion (SSAO), reflections (SSR), one-bounce global illumination (SSGI), and subsurface scattering for skin (SSSSS). They miss off-screen detail, which is why ray tracing is taking over, but they're still everywhere as a fast baseline.

Figure: Side-by-side kitchen scene comparing flat lighting on the left with SSAO, SSR, and SSGI on the right.
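SSAO in particular reduces to "how many nearby depth samples sit in front of me". A deliberately crude Python version over a depth buffer of lists; production SSAO samples a hemisphere around the reconstructed view-space position rather than a screen-aligned window, but the occlusion-counting idea is the same.

```python
def ssao(depth, x, y, radius=2, bias=0.02):
    # Fraction of neighboring depth samples that do NOT occlude the pixel.
    # depth: 2D list of view depths (smaller = closer to camera).
    h, w = len(depth), len(depth[0])
    center = depth[y][x]
    occluded = total = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx == 0 and dy == 0:
                continue
            sx, sy = x + dx, y + dy
            if 0 <= sx < w and 0 <= sy < h:
                total += 1
                # A neighbor sufficiently closer than us counts as an occluder
                if depth[sy][sx] < center - bias:
                    occluded += 1
    return 1.0 - occluded / max(total, 1)
```

The bias term is the same self-occlusion guard real implementations need; without it, flat surfaces darken themselves from depth-precision noise.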


10. Temporal anti-aliasing and ML upscaling (DLSS, FSR, XeSS)

The frame is rendered at a lower internal resolution and reconstructed using motion vectors, depth, and history. ML-based upscalers (DLSS 3/4, FSR 3, XeSS) add frame generation on top, interpolating intermediate frames from optical flow. Most AAA titles now ship assuming an upscaler is on, which changes how you budget the rest of the frame.

Figure: Side-by-side of a low-resolution input and a sharp ML-reconstructed output of the same character.
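The reconstruction core, minus the ML, fits in a few lines: reproject history along motion vectors, clamp it to the current neighborhood to reject stale data, and blend exponentially. A 1D Python sketch with an assumed blend factor; ML upscalers replace the hand-tuned clamp-and-blend with a learned network but consume the same inputs.

```python
def taa_resolve(history, current, motion, alpha=0.1):
    # history/current: per-pixel values; motion: integer pixel offsets
    n = len(current)
    out = []
    for x in range(n):
        # Reproject: fetch where this pixel was in the previous frame
        px = min(max(x - motion[x], 0), n - 1)
        h = history[px]
        # Neighborhood clamp: reject history outside the current 3-tap range
        neighborhood = (current[max(x - 1, 0)], current[x],
                        current[min(x + 1, n - 1)])
        h = min(max(h, min(neighborhood)), max(neighborhood))
        # Exponential blend: mostly history, a little new sample
        out.append(alpha * current[x] + (1.0 - alpha) * h)
    return out
```

The clamp is why TAA ghosts less than naive accumulation, and why disocclusions (where no valid history exists) are the hard case every upscaler fights.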


11. GPU-driven rendering and mesh shaders

Culling, LOD selection, and draw submission all run on the GPU. Mesh shaders replace the vertex/geometry/tessellation pipeline with a more flexible compute-style stage that emits meshlets. Combined with multi-draw indirect, this keeps the CPU out of the per-object hot loop entirely.

Figure: Alien spaceport scene with colored meshlet cluster overlay showing GPU-driven rendering.
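Conceptually, the culling compute pass is a filter-and-compact: test each instance's bounding sphere against the frustum planes and append survivors to a buffer that multi-draw indirect then consumes, with the CPU never seeing per-object work. A CPU-side Python sketch of that logic; the plane convention and field names are illustrative.

```python
def cull_and_compact(instances, planes):
    # instances: (center, radius) bounding spheres.
    # planes: (normal, d) with inward-facing normals; a point p is inside
    # a plane when dot(normal, p) + d >= 0.
    visible = []
    for i, (center, radius) in enumerate(instances):
        inside = all(
            sum(n * c for n, c in zip(normal, center)) + d >= -radius
            for normal, d in planes
        )
        if inside:
            visible.append(i)  # compacted visible-instance buffer
    # Stand-in for the indirect draw arguments the GPU would write
    draw_args = {"instance_count": len(visible), "first_instance": 0}
    return visible, draw_args
```

On real hardware this runs as a compute shader with an atomic counter for the append, and mesh shaders then fetch meshlets straight from the compacted list.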


12. Hair, cloth, and skin rendering

Hair uses Marschner-style anisotropic shading with strand-based geometry (NVIDIA HairWorks, AMD TressFX, or engine-native systems). Cloth is simulated on the GPU with position-based dynamics and rendered with anisotropic specular. Skin uses screen-space subsurface scattering plus pre-integrated wrap lighting. These three are usually where you spot the budget gap between AAA and indie.

Figure: Close-up of a warrior with strand-based hair, cloth cloak, and subsurface-scattered skin.
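Cloth is the most self-contained of the three to sketch: position-based dynamics repeatedly projects particle pairs back toward their rest length, weighted by inverse mass. A minimal Python version of one distance constraint, the building block a GPU solver runs over every cloth edge each iteration:

```python
import math

def solve_distance_constraint(p0, p1, rest_length, w0=1.0, w1=1.0):
    # PBD: move both particles along their connecting axis so the distance
    # returns to rest_length, split by inverse masses w0 and w1.
    delta = [b - a for a, b in zip(p0, p1)]
    dist = math.sqrt(sum(d * d for d in delta)) or 1e-9
    corr = (dist - rest_length) / (dist * (w0 + w1))
    new_p0 = [a + w0 * corr * d for a, d in zip(p0, delta)]
    new_p1 = [b - w1 * corr * d for b, d in zip(p1, delta)]
    return new_p0, new_p1
```

Setting an inverse mass to zero pins a particle, which is how a cloak stays attached at the shoulders while the rest of the mesh swings.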


13. Decals, virtual texturing, and material layering

Surface variation comes from layered decals (bullet holes, dirt, blood, grime) projected onto the depth buffer, plus virtual textures that stream high-resolution detail just-in-time. Material layering blends multiple PBR sets per pixel using masks and triplanar projection, which is how a single rock looks like five rocks.

Figure: Weathered concrete bunker wall with bullet decals, graffiti, and a virtual texture page atlas inset.
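Virtual texturing boils down to translating a (u, v, mip) sample into a page ID the streaming system can fetch on demand. A Python sketch with assumed page and virtual-texture sizes; real systems run this per pixel to build a feedback buffer of requested pages.

```python
def page_request(u, v, mip, page_size=128, texture_size=16384):
    # Which page of the virtual texture a sample at (u, v, mip) needs.
    # u, v in [0, 1]; each mip halves the texture until one page remains.
    mip_size = max(texture_size >> mip, 1)
    pages_per_row = max(mip_size // page_size, 1)
    px = min(int(u * mip_size) // page_size, pages_per_row - 1)
    py = min(int(v * mip_size) // page_size, pages_per_row - 1)
    return mip, px, py
```

Only requested pages get loaded into a physical atlas, with an indirection texture mapping virtual pages to their atlas slots; that indirection is the part the sketch leaves out.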


14. Order-independent transparency

Hair, foliage, particles, and glass don't sort cleanly. AAA engines use techniques like weighted blended OIT, depth peeling, or per-pixel linked lists to render them correctly without a CPU sort step. It's quietly one of the most expensive parts of the frame on a foliage-heavy scene.

Figure: Forest scene with overlapping translucent foliage, smoke, glass droplets, and translucent hair.
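Weighted blended OIT is the cheapest of these: accumulate weighted premultiplied color plus a revealage product, then resolve once, with no sorting anywhere. A scalar (single-channel) Python sketch using one published depth-weight shape; the exact weight function varies by engine and is the main tuning knob.

```python
def weighted_blended_oit(fragments, background):
    # fragments: (color, alpha, depth) tuples in ARBITRARY order,
    # depth in [0, 1] with smaller = nearer.
    accum = 0.0      # weighted premultiplied color
    accum_a = 0.0    # weighted alpha
    revealage = 1.0  # how much background survives
    for color, alpha, depth in fragments:
        # Depth-based weight: nearer fragments dominate the average
        w = max(1e-2, 3e3 * (1.0 - depth) ** 3)
        accum += color * alpha * w
        accum_a += alpha * w
        revealage *= 1.0 - alpha
    avg = accum / accum_a if accum_a > 0.0 else 0.0
    return avg * (1.0 - revealage) + background * revealage
```

Because the accumulators are commutative sums and products, fragment order cannot change the result, which is the whole point; the cost is that the blend is an approximation rather than true sorted compositing.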


15. Neural radiance caching and ML denoisers

The newest layer. NVIDIA's neural radiance cache learns indirect lighting per-scene and queries it instead of tracing more rays. ML denoisers (OptiX, Intel Open Image Denoise, custom in-house) clean up sparse ray-traced signals in milliseconds. Expect this category to grow fast over the next two years.

Figure: Noisy 1-spp ray-traced cathedral on the left, clean ML-denoised result on the right.


What this means for the browser

We've shipped several of these techniques in WebGPU for our open-world browser engine. Cascaded shadow maps, GPU-driven instancing, triplanar PBR, screen-space fog, and clipmap-based virtualized terrain all run at 120 FPS in a tab. The rest (hardware ray tracing, mesh shaders, ML upscaling) is coming to the web as the WebGPU spec catches up. For deeper dives, see our guides on browser open-world tech and landscape generation.

Further reading across the whole stack

If you want to read one book, Real-Time Rendering, 4th edition is the standard reference covering most of the topics above. For ongoing research, the SIGGRAPH "Advances in Real-Time Rendering in Games" course archive (advances.realtimerendering.com) has free PDFs of the AAA-engine deep dives going back to 2006. For production tear-downs of how specific games render their frame, Adrian Courrèges' graphics-study frame breakdowns are required reading.