
Volumetric clouds and weather effects in modern games

By Oleg Sidorkin, CTO and Co-Founder of Cinevva

Stylized AAA scene with towering volumetric storm clouds, lightning, rain shafts, and a wet stone road reflecting the sky

A few weeks ago I wrote about the rendering techniques modern AAA games actually ship. One area I left thin was sky and weather, because it deserves its own list. Clouds, fog, rain, and snow are the systems that turn a terrain demo into a place. They also share more code than it might appear: volumetric clouds, ground fog, and god rays are all the same ray march; wet roads, snow accumulation, and footprint trails are all the same displacement-plus-PBR trick; wind is one direction vector that everything in the scene reads from.

Here's a short, opinionated tour of how the bigger studios build this stuff in 2026, with the papers and engine talks behind each piece.

1. Physically based sky and atmosphere

Atmospheric scattering is the foundation. The sky color, the horizon haze, the way distant mountains turn blue, the orange of sunset, all come from light scattering through the air. Modern engines compute this from physics: Rayleigh scattering for the blue, Mie scattering for the haze around the sun, ozone absorption for the deep violet at the zenith.

The original 2008 Bruneton method baked everything into 4D lookup tables, which limited dynamic time-of-day and added LUT artifacts at low sun angles. Sébastien Hillaire's 2020 update, which is what ships in UE5's Sky Atmosphere component, replaces the high-dimensional LUT with a few 2D textures and a multiple-scattering approximation that updates per frame. It scales from phones to high-end PCs.
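The two scattering terms named above have simple closed-form phase functions, which is why the physical approach is tractable at all. A minimal Python sketch of both, using Henyey-Greenstein as a common stand-in for the full Mie solution (the eccentricity `g = 0.8` is an assumed value, not taken from any specific engine):

```python
import math

def rayleigh_phase(cos_theta):
    # Rayleigh phase function: symmetric, slightly favoring
    # forward and backward scattering. Integrates to 1 over the sphere.
    return 3.0 / (16.0 * math.pi) * (1.0 + cos_theta * cos_theta)

def mie_phase_hg(cos_theta, g=0.8):
    # Henyey-Greenstein approximation of Mie scattering: a strong
    # forward lobe (g near 1) produces the bright haze around the sun.
    denom = 1.0 + g * g - 2.0 * g * cos_theta
    return (1.0 - g * g) / (4.0 * math.pi * denom ** 1.5)
```

The per-wavelength scattering coefficients (stronger for blue in the Rayleigh term) are what turn these into sky color once integrated along the view ray.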

Physically based sky at golden hour with smooth orange-to-blue gradient over a distant mountain silhouette


2. Volumetric clouds with Perlin-Worley noise

The bedrock cloud technique in modern games started in Andrew Schneider's 2015 Horizon Zero Dawn talk. Clouds are not meshes. They are a 3D density function defined by layered noise: a low-frequency Perlin-Worley mix gives the overall cloud shape, and a higher-frequency Worley noise erodes the silhouette into wispy edges. A weather map (a 2D texture sampled by world XZ) controls coverage, cloud type, and precipitation per region. A height-based gradient blends between cumulus, stratus, and cirrus profiles by altitude.
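As a rough illustration of that density recipe, here is a Python sketch built around Schneider-style `remap` calls. The noise samples are treated as inputs (a real implementation reads them from precomputed 3D textures), and the breakpoint constants are illustrative, not the shipped values:

```python
def remap(v, lo0, hi0, lo1, hi1):
    # Re-express v from the range [lo0, hi0] into [lo1, hi1].
    return lo1 + (v - lo0) * (hi1 - lo1) / (hi0 - lo0)

def cloud_density(base_shape, detail, coverage, height_frac):
    # base_shape:  low-frequency Perlin-Worley sample in [0, 1]
    # detail:      high-frequency Worley sample in [0, 1]
    # coverage:    weather-map coverage for this XZ column in (0, 1]
    # height_frac: normalized altitude inside the cloud layer in [0, 1]
    # Carve the base shape against coverage: low coverage eats clouds away.
    shape = max(remap(base_shape, 1.0 - coverage, 1.0, 0.0, 1.0), 0.0)
    # Height gradient: no density at the very bottom or top of the layer.
    gradient = max(0.0, min(height_frac / 0.1, (1.0 - height_frac) / 0.2, 1.0))
    shape *= gradient
    # Erode the silhouette with the high-frequency Worley detail.
    return max(remap(shape, detail * 0.4, 1.0, 0.0, 1.0), 0.0)
```

The same structure maps almost line-for-line to the shader version; the key idea is that every knob (coverage, type, altitude) modulates one scalar density field.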

The renderer marches a ray from the camera through the cloud volume, accumulating density and scattering. The "Nubis" iteration in 2017 added regional-scale authoring and animation, and the original PS4 implementation ran in about 2 ms for the entire sky. Most studios that ship volumetric clouds today still trace their lineage to this paper.
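The march itself can be sketched in a few lines of Python. `density_at` is any density callback, `sigma_t` is an assumed extinction coefficient, and the secondary march toward the sun that a shipping implementation runs per sample is omitted for brevity:

```python
import math

def march_clouds(density_at, origin, direction, t0, t1, steps=64, sigma_t=0.05):
    # March from t0 to t1 along the ray, accumulating Beer-Lambert
    # transmittance and an energy-conserving in-scatter term.
    dt = (t1 - t0) / steps
    transmittance = 1.0
    radiance = 0.0
    for i in range(steps):
        t = t0 + (i + 0.5) * dt
        p = tuple(o + d * t for o, d in zip(origin, direction))
        rho = density_at(p)
        if rho <= 0.0:
            continue
        sample_trans = math.exp(-sigma_t * rho * dt)
        # Integrate the light scattered in over this step, attenuated by
        # everything already in front of it.
        radiance += transmittance * (1.0 - sample_trans)
        transmittance *= sample_trans
        if transmittance < 1e-3:
            break  # early-out: the rest of the ray is effectively occluded
    return radiance, transmittance
```

The early-out and the coarse step count are where most of the "2 ms for the entire sky" budget discipline lives.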

A 3D cloud shape decomposed into stacked Perlin and Worley noise patterns showing how detail erodes the silhouette


3. Voxel clouds and Nubis³

The 2023 evolution of Nubis abandoned the 2.5D shape representation entirely in favor of true 3D voxels. Each voxel stores cloud density directly, which lets artists carve and animate cloud shapes the way they sculpt terrain. The cost of moving to a denser representation is paid back by ray-march acceleration with compressed signed distance fields and clever up-rezzing of sparse voxel data.
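The empty-space skipping that a distance field buys can be illustrated with a simple sphere-trace hybrid. This Python sketch is only a cartoon of the idea (the real system works on compressed voxel data): outside the cloud the march jumps the full safe distance the SDF reports, inside it falls back to fine fixed steps:

```python
def adaptive_march(sdf, density_at, origin, direction, t_max, fine_dt=1.0):
    # sdf(p): signed distance to the nearest cloud surface (positive outside).
    # Returns the (t, density) samples actually taken along the ray.
    t = 0.0
    samples = []
    while t < t_max:
        p = tuple(o + d * t for o, d in zip(origin, direction))
        d_surf = sdf(p)
        if d_surf > fine_dt:
            t += d_surf          # sphere-trace: skip empty space in one jump
        else:
            samples.append((t, density_at(p)))
            t += fine_dt         # dense sampling inside or near the cloud
    return samples
```

Against a cloud slab occupying a tenth of the ray, this takes a handful of samples where a fixed-step march would take a hundred.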

The result is the kind of cloudscape you can fly through without seeing the underlying tricks fall apart. It's overkill for most studios, but it's the direction the high end is moving.

A cumulus cloud shown decomposed into a 3D voxel grid, with smoothed wispy edges blending the chunky interior


4. Layered cloud rendering and 2D backdrops

Not every studio can afford full-volume clouds, and not every camera angle needs them. A lot of games combine techniques: high-altitude cirrus rendered as a scrolling 2D layer, mid-altitude cumulus as raymarched volumetrics, and low-altitude stratus as a thin participating-media slab. The horizon often gets a pre-baked sky cubemap that the volumetric pass blends into beyond a fade distance.

This layering is what keeps the cloud budget honest. A single cloud type at full quality can eat 4-6 ms; layering different qualities for different cloud altitudes can hold the same look at half the cost.

Sunset sky split into three cloud layers: high cirrus wisps, mid cumulus puffs, and low stratus haze near the horizon


5. Volumetric fog with froxel grids

Fog is a 3D field, not a 2D screen effect. The standard modern approach is the froxel grid: a 3D texture aligned to the camera's view frustum, with each cell ("froxel" = frustum + voxel) storing density and lit color. A compute shader injects scattering from every light source into the grid, accumulates extinction along the view ray, and applies the result as a fullscreen pass.

This is what gives you light shafts through windows, colored fog around point lights, and visible volumes around explosions. It's also the underlying machinery for "atmospheric perspective" that fades distant objects into the air. The technique was introduced by Bart Wronski for Assassin's Creed 4 and standardized by Sébastien Hillaire in Frostbite.

Camera frustum visualized as a 3D froxel grid with smaller cells near the camera and bigger cells far away, fog particles inside


6. God rays and crepuscular shafts

Visible rays of sunlight in misty air are not a separate effect. They fall out of the same fog system, as long as the fog density and the shadow map are both available to the same compute shader. When the shader injects light into a froxel, it samples the shadow map at that froxel's world position. Cells in shadow stay dark, cells in light pick up the sun color. March the camera ray through the result and the bright cells form continuous shafts.
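Along a single view ray, that shadow-coupled injection reduces to a short loop. In this 1D Python sketch, `in_shadow` stands in for the shadow-map test at each cell's world position, and `sigma_s` is an assumed scattering coefficient:

```python
import math

def integrate_shafts(cells, in_shadow, sigma_s=0.02, dt=2.0, sun=1.0):
    # cells: fog densities along the view ray, ordered near to far.
    # in_shadow(i): True if cell i fails the shadow-map test.
    # Shadowed cells inject no sunlight, so the lit runs between them
    # show up as bright shafts after integration.
    transmittance = 1.0
    radiance = 0.0
    for i, rho in enumerate(cells):
        if not in_shadow(i):
            radiance += transmittance * sigma_s * rho * sun * dt
        transmittance *= math.exp(-sigma_s * rho * dt)
    return radiance
```

Note that extinction still applies in shadowed cells; only the in-scattered sun term is gated, which is exactly why shafts read as bright air rather than painted streaks.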

Cheaper screen-space variants exist (radial blur from the sun position into the depth buffer) and are still the right pick on mobile or low-end hardware. They miss off-screen sun positions but cost almost nothing.

Dawn forest with sun rays slicing through tree trunks and ground fog, forming clear god ray shafts


7. Lightning and stochastic weather events

Lightning is a one-frame effect with two parts: the bolt mesh, and the scene-wide tonemap and lighting response. The bolt itself is usually a procedural billboard mesh built from a recursive line-segment subdivision algorithm, jittered for chaos and tapered toward the ground. Some engines render it as a screen-space additive flash, others as fully lit emissive geometry that casts light on the world via a one-frame point-light injection.
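The subdivision idea is a few lines of recursion. This Python sketch splits each segment at a jittered midpoint and halves the sway per level; tapering and side branches are omitted, and the sway scale is arbitrary:

```python
import random

def lightning_bolt(p0, p1, depth=5, sway=0.25, rng=None):
    # Recursive midpoint displacement: split the segment, jitter the
    # midpoint horizontally in proportion to the segment's vertical span,
    # recurse with half the sway. Returns the bolt polyline.
    rng = rng or random.Random(0)   # seeded for a deterministic demo bolt
    if depth == 0:
        return [p0, p1]
    mx = (p0[0] + p1[0]) / 2 + rng.uniform(-sway, sway) * abs(p1[1] - p0[1])
    mid = (mx, (p0[1] + p1[1]) / 2)
    left = lightning_bolt(p0, mid, depth - 1, sway * 0.5, rng)
    right = lightning_bolt(mid, p1, depth - 1, sway * 0.5, rng)
    return left[:-1] + right        # drop the duplicated midpoint
```

Five levels of subdivision turn a single cloud-to-ground segment into a 33-point jagged polyline, which is then extruded into camera-facing quads.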

The interesting part is everything else: cloud bottoms light from below, the ground brightens for two frames, the auto-exposure metering takes a few frames to recover, and a delayed thunder cue plays based on distance. Done well, this turns a 16 ms flash into a five-second sequence that sells weather as something happening in the world, not just over it.

Lightning fork striking from a thundercloud at dusk, lighting the cloud bottoms and a small village silhouette


8. Rain particles, rain meshes, and screen-space drops

Falling rain in modern games is rarely just particles. The cheap and convincing solution is a small set of scrolling textures stretched across vertical or screen-aligned quads, lit by the same sun and sky probes as everything else. Closer to the camera, individual streak particles add detail. On the camera lens itself, droplet textures, sliding trails, and impact ripples sell the "you are inside the storm" feel.

Wind affects the rain direction. The same wind vector pushes the cloud weather map, bends grass, and tilts the rain quads. One scene-wide vector, dozens of consumers.

Heavy night rainstorm with sheets of rain illuminated by distant headlights and ripples on a wet road


9. Wet surfaces, puddles, and ripples

Rain that doesn't change the ground looks fake immediately. Wet surfaces respond by darkening their albedo (water absorbs incoming light), flattening their normals (the water film smooths microsurface), and dropping their roughness (water is a near-perfect mirror at glancing angles). The shader change is small, the visual change is huge.

Puddles are mask-driven: a height-based or vertex-painted mask defines low spots that fill with water as a "wetness" parameter rises. Ripples are flipbook normal-map textures triggered by raindrop impacts. The really nice implementations build the wetness state up over time, so a long rainstorm slowly soaks the world and a brief shower only darkens the high spots.
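The shader change really is small. A Python sketch of the three adjustments driven by one wetness parameter; the darkening and smoothing constants, and the `porosity` input, are made-up illustration values rather than any engine's numbers:

```python
def apply_wetness(albedo, roughness, normal_strength, wetness, porosity=0.4):
    # Wet materials darken (water absorbs light), smooth out (the water
    # film fills the microsurface), and lose normal-map detail.
    # Porous materials (dirt, fabric) darken more than sealed ones.
    darken = 1.0 - 0.6 * wetness * porosity
    wet_albedo = tuple(c * darken for c in albedo)
    wet_roughness = roughness * (1.0 - wetness) + 0.05 * wetness
    wet_normal_strength = normal_strength * (1.0 - 0.8 * wetness)
    return wet_albedo, wet_roughness, wet_normal_strength
```

A puddle mask simply pushes `wetness` to 1 and roughness toward zero in the low spots, which is why the same code path covers damp walls and mirror-flat puddles.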

Wet cobblestone street at night with shallow puddles reflecting neon signs and ripple rings from raindrop impacts


10. Snow accumulation and deformation

Snow is the symmetrical problem to rain: the world has to remember it, not just receive it. The standard approach uses a top-down "snow accumulation" texture that builds up over time wherever the sky is visible (computed against a top-down depth or shadow map). The terrain shader samples this mask and blends in the snow material and depth displacement in the exposed areas, leaving sheltered regions bare.

Footprints and tire tracks are rendered into a sliding deformation map centered on the player. As the camera moves, old footprints scroll out and the texture wraps around. The terrain or snow shader samples this deformation map and pushes vertices down where it has been written. Battlefield 5's snow does this with hardware tessellation; cheaper approaches use a high-density terrain mesh with vertex displacement only.
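The sliding, wrapping window can be modeled in a few lines. This Python sketch uses invented size and texel density, and omits the pass that clears texels as they scroll out of range behind the player:

```python
class DeformationMap:
    # A texture window that slides with the player; world positions wrap
    # toroidally into the texture, so a stamp stays valid for as long as
    # its world position remains inside the window.
    def __init__(self, size=256, texel=0.25):
        self.size = size        # texels per side
        self.texel = texel      # world units per texel
        self.data = [0.0] * (size * size)

    def _index(self, wx, wz):
        tx = int(wx / self.texel) % self.size
        tz = int(wz / self.texel) % self.size
        return tz * self.size + tx

    def stamp(self, wx, wz, depth):
        # Record a footprint; the deepest press at a texel wins.
        i = self._index(wx, wz)
        self.data[i] = max(self.data[i], depth)

    def sample(self, wx, wz):
        return self.data[self._index(wx, wz)]
```

The vertex shader equivalent of `sample` pushes snow vertices down by the stored depth, which is all a footprint is.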

Stylized snowy landscape with fresh footprints, drifts piled against rocks, falling snowflakes, and hazy distant mountains


11. Wind as a global system

Wind is not a particle effect. In production engines it's a single global vector (sometimes a low-resolution 3D field) that every dynamic system reads from in its vertex shader. Grass blades bend, tree branches sway, cloth flaps, leaves drift, rain tilts, smoke advects, cloud weather maps scroll. One uniform updated per frame, dozens of consumers.

The richer version is a "wind grid" that stores direction and strength sampled by world position, allowing for storms with localized gusts, sheltered valleys, and wakes behind buildings. Foliage also typically gets a per-vertex offset baked at authoring time so identical trees don't sway in lockstep. The result is a world that breathes at the same rate.
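The per-instance phase trick looks like this as a Python sketch of a sway function; the two frequencies, the flutter amplitude, and the `stiffness` knob are arbitrary illustration values:

```python
import math

def sway_offset(wind_dir, wind_strength, stiffness, phase, time):
    # Per-vertex wind sway: a slow primary gust plus a faster flutter,
    # both offset by a per-instance phase baked at authoring time so
    # identical trees don't move in lockstep. Returns an XZ offset.
    gust = math.sin(time * 1.3 + phase)
    flutter = 0.3 * math.sin(time * 4.7 + phase * 2.0)
    amplitude = wind_strength * (1.0 - stiffness) * (gust + flutter)
    return (wind_dir[0] * amplitude, wind_dir[1] * amplitude)
```

Because every consumer reads the same `wind_dir` and `wind_strength` uniforms, cranking the storm up in one place moves the grass, the trees, and the rain tilt together.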

Strong wind blowing grass and trees sideways with leaves swirling in the foreground under stormy clouds


12. Sandstorms, blizzards, and dense weather

Severe weather is its own rendering category. A sandstorm is a thick, opaque, ground-aligned fog with a strong directional bias and aggressive distance fog. A blizzard adds a near-camera particle blast and reduced visibility. Volcanic ash and smoke are the same architecture with different colors.

The thing that sells these is not the particles, it's the coupling: the sun darkens, the sky tints, the post-process color grading shifts, ambient audio swaps, footstep sounds change, the player's voice gets muffled if they have one. The renderer is the messenger; the immersion comes from every system in the game responding at the same time.

A wall of orange dust and sand rolling across a desert plain with bruised sky above and clear blue behind


13. Time of day and dynamic skies

Real-time time of day is the multiplier that makes every other system in this list worth shipping. The sun direction and color update over a 24-minute or 24-hour cycle. The atmosphere LUT updates with the sun angle. The cloud lighting recomputes per frame. The shadow cascades repoint. The reflection probes refresh. The ambient color shifts. The post-process exposure adapts.

Doing this without visible artifacts is mostly a story of texture caching and temporal stability. Fast techniques precompute the sky at fixed sun angles and interpolate; slower ones recompute every frame. The Hillaire 2020 atmosphere model is fast enough to recompute, which is why UE5 ships it. The cloud weather map scrolls with wind, so coverage shifts naturally without anyone authoring keyframes.
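The precompute-and-interpolate path can be sketched as a plain lerp over keyframed entries. The elevations and sky colors below are invented placeholders for what would be offline-rendered LUT slices:

```python
def sample_sky_keyframes(keyframes, sun_elevation):
    # keyframes: list of (elevation_degrees, sky_color) sorted by elevation,
    # precomputed offline. Linearly interpolate between the two entries
    # bracketing the current sun elevation; clamp at the ends.
    if sun_elevation <= keyframes[0][0]:
        return keyframes[0][1]
    if sun_elevation >= keyframes[-1][0]:
        return keyframes[-1][1]
    for (e0, c0), (e1, c1) in zip(keyframes, keyframes[1:]):
        if e0 <= sun_elevation <= e1:
            t = (sun_elevation - e0) / (e1 - e0)
            return tuple(a + (b - a) * t for a, b in zip(c0, c1))
```

Per-frame recomputation, when the budget allows it, replaces this lookup entirely, which is the trade the Hillaire model makes affordable.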

Three vertical bands across one landscape: dawn pink, noon blue, sunset red, all sharing the same hill silhouette


14. The weather state machine

Underneath all of this is a tiny state machine. Most games ship somewhere between 4 and 12 weather states (clear, partly cloudy, overcast, light rain, heavy rain, thunderstorm, fog, snow, blizzard, sandstorm), each defined by a set of parameters: cloud coverage and type, wind speed and direction, precipitation type and intensity, ambient color tints, audio profile, post-process grade.

Transitions are linear interpolations between parameter sets over 30 to 120 seconds. The transition isn't a special case, it's just two states being lerped, with each rendering subsystem reading the current parameter values that frame. Weather can be scripted (cutscene needs a storm), seeded (deterministic per region per in-game day so that two players in the same world see the same weather), or fully authored on a region grid. The cleanest pipelines treat all three as different schedulers writing into the same parameter buffer.
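Because a transition is just a lerp between parameter sets, the core of the state machine is tiny. The two states and their parameter names below are illustrative, not a real game's schema:

```python
# Illustrative parameter sets; a real game carries many more fields
# (audio profile, post grade, precipitation type, ...).
CLEAR = {"cloud_coverage": 0.1, "wind_speed": 2.0, "rain": 0.0, "fog": 0.02}
STORM = {"cloud_coverage": 0.95, "wind_speed": 14.0, "rain": 0.8, "fog": 0.15}

def blend_weather(a, b, elapsed, duration):
    # Lerp every parameter from state a toward state b. Each rendering
    # subsystem reads the resulting values fresh every frame, so the
    # transition needs no special-case code anywhere downstream.
    t = min(max(elapsed / duration, 0.0), 1.0)
    return {k: a[k] + (b[k] - a[k]) * t for k in a}
```

A scripted storm, a seeded schedule, and an authored region grid can all drive this by writing target states and durations into the same buffer.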

Side-by-side of the same scene under clear sun on the left and stormy rain on the right with wet glistening surfaces


15. The cinematic moments

The whole stack exists for a few signature moments. Standing on a ridge as a storm front rolls in. Watching a sun shaft burn through a clearing in the canopy. Walking out of a cave into snow. Flying through a cumulus cloud and seeing the light on the inside.

These are the moments players take screenshots of. They're also the moments where every system above has to be working at the same time: cloud volumetrics, atmospheric scattering, fog with shadow integration, wet PBR, wind on the foliage, time-of-day color grading, and a transition between weather states all composing into one frame. Get any one of them wrong and the magic snaps.

Looking down across a sea of clouds onto a single mountain peak rising through, with sunlit cloud tops and shadowed valleys below

What this means for the browser

Most of these techniques map cleanly onto WebGPU. We've shipped basic atmospheric fog, equirectangular skybox blending, and screen-space distance haze in the open-world browser engine. The harder pieces (full volumetric clouds, froxel grid fog, wet PBR with dynamic puddles, snow deformation maps) are the obvious next step now that the terrain pipeline is stable. Spike 24's per-fragment fog color sampling from the skybox is one piece of this puzzle. A compute-shader cloud raymarcher feeding into the same atmosphere LUT is the next.

The good news is that the browser hardware floor is now high enough. WebGPU compute, 3D textures, indirect dispatch, and timestamp queries all exist. The Hillaire 2020 atmosphere has been ported to WebGL multiple times. Schneider's Nubis has open-source reference implementations in GLSL that translate to WGSL with mechanical edits. There is no longer a rendering reason that a browser game can't have the same sky as a console one. There are just engineering reasons, and engineering reasons are the kind we like.

Further reading across the whole stack

If you want one source that pulls all of this together, the SIGGRAPH "Advances in Real-Time Rendering in Games" archive (advances.realtimerendering.com) has the canonical weather and atmosphere talks going back to 2014. For production tear-downs of how specific games render their sky and weather, Adrian Courrèges' GPU profiling articles include detailed frame-by-frame breakdowns of GTA V, Horizon Zero Dawn, and Doom Eternal. For the sky and atmosphere math specifically, scratchapixel.com's volume rendering chapter is the gentlest introduction, and Hillaire's open-source implementation repository is the production-quality reference.