Water Rendering

Thursday, April 03, 2014

by Helmut Garstenauer, Martin Garstenauer

As requested, we will describe some implementation details of our water rendering solution. This will not be a complete step-by-step introduction – there are already many articles out there (see Water Rendering Resources [1]). Instead, we will briefly describe all relevant parts, link to the relevant literature and go into detail where we have added something new.

[Image: water rendering example]

Water Geometry

To create water, we can either render a 3D mesh with a water shader, or render a 2D quad and apply a deferred rendering/post-processing effect. The second method is described in the article Rendering Water as a Post-process Effect [2]. This method requires a G-buffer [3] – at a minimum, a texture containing the depth values of the scene. The post-processing effect checks the depth buffer to see if a pixel is above or below the water surface. If the water surface should be displaced to create waves, the waves are stored in a height map, and the shader raymarches through the height field – similar to Parallax Occlusion Mapping [4]. This method is also used in the jMonkeyEngine [5].

We decided to render water using a mesh. This feels more intuitive: a mesh lets us define a local body of water, store information (such as flow direction) in vertex attributes, move work to the vertex shader, and so on.

The water mesh is rendered as an opaque object (no alpha blending).

Our solution distinguishes between local bodies of water (puddles, ponds, rivers, etc.) and an infinite ocean plane. A local body of water can be defined using any of the built-in shapes, e.g. a horizontal rectangle for a swimming pool, or a curved triangle mesh for a river. When the shape is a closed volume (e.g. a box or a closed triangle mesh), the volume defines the underwater area: When the camera is inside the volume, an underwater effect is applied.

An infinite ocean plane (without vertex displacement for waves) is rendered as a single horizontal quad that covers the view frustum.

To render the water surface with vertex displacement, the mesh needs to be finely tessellated. In case of a local body of water, a triangle mesh with a fixed resolution is sufficient. But a special solution is required to render the infinite ocean plane with vertex displacement. Automatic tessellation (using DX11 hardware) is one possible solution. The projected grid method is another solution.

Projected Grid

We use the projected grid method, which was introduced in Interactive Animation of Ocean Waves [6] and Real Time Water Rendering—Introducing the Projected Grid Concept [7]: A grid is defined in screen space. When the grid is rendered, the vertices are projected onto the water surface. The vertices are displaced by waves in the vertex shader.
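For illustration, here is a minimal sketch of the projection step. This is our shorthand, not code from the cited papers; ProjectorViewProjInverse (the inverse view-projection matrix of the projecting camera) and SurfaceLevel are assumed parameter names:

float3 ProjectGridVertex(float2 gridPos)   // gridPos in [0, 1] x [0, 1].
{
  // Unproject the grid point at the near (z = 0) and far (z = 1) clip planes.
  float4 nearPoint = mul(float4(gridPos * 2 - 1, 0, 1), ProjectorViewProjInverse);
  float4 farPoint = mul(float4(gridPos * 2 - 1, 1, 1), ProjectorViewProjInverse);
  nearPoint /= nearPoint.w;
  farPoint /= farPoint.w;
  
  // Intersect the resulting ray with the horizontal plane y = SurfaceLevel.
  float3 direction = farPoint.xyz - nearPoint.xyz;
  float t = (SurfaceLevel - nearPoint.y) / direction.y;
  return nearPoint.xyz + t * direction;
}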

Here is a wireframe rendering of the projected grid:

[Image: wireframe rendering of the projected grid]

Our own implementation turned out to be very similar to the projected grid implementation described in Procedural Ocean Effects [8].

The projected grid method has undersampling problems, which are discussed in the literature, e.g. Oceans on a Shoestring: Shape Representation, Meshing and Shading [9]. These artifacts are tolerable as long as the wave displacement is within a certain limit. To reduce undersampling in the distance, we fade out the wave displacement.

When the grid vertices are displaced by waves, some gaps appear at the borders of the screen:

[Image: gaps at the screen borders caused by wave displacement]

To avoid these gaps, we move the projecting camera back behind the player camera using a small offset. Thus, the projected grid covers more than the visible field of view. The correct offset depends on the maximal wave height – which is not known in all cases. We let the user define the offset. If the user sets an offset which is too small, gaps could still appear. To fix this, we also reduce the wave displacement to 0 near the grid borders.
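Both the distance fade and the border fade can be expressed as simple scale factors applied to the wave displacement. A minimal sketch, with illustrative parameter names:

// gridPos is the vertex position within the grid in [0, 1] x [0, 1].
float border = min(min(gridPos.x, 1 - gridPos.x), min(gridPos.y, 1 - gridPos.y));
float borderFade = saturate(border / BorderFadeWidth);

// Fade out the displacement with view distance to reduce undersampling.
float distanceFade = 1 - saturate((viewDistance - FadeStart) / (FadeEnd - FadeStart));

displacement *= borderFade * distanceFade;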

Camera Near Plane vs. Water Surface

Other artifacts can appear if the water surface cuts the camera near plane. In most games, this situation is prevented by keeping the camera well above or below the water surface, avoiding problematic camera positions. However, this moves the responsibility to the camera game logic. We chose to handle this in the water renderer: If the camera near plane touches the water surface, vertex displacement is faded out and vertices are clamped to positions below the camera. If the camera origin is under the water surface, vertices are clamped to positions above the camera. The resulting deformation of the water surface around the camera near plane is hardly noticeable.
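In shader terms, the clamping could look like the following sketch. This is our illustration of the idea, not the actual code; NearPlaneEpsilon is an assumed safety margin:

// Clamp the displaced vertex so the water surface cannot cut the near plane.
float3 position = basePosition + displacement * nearPlaneFade;  // Displacement already faded out.
if (CameraPosition.y >= SurfaceLevel)
  position.y = min(position.y, CameraPosition.y - NearPlaneEpsilon);  // Camera above: keep water below.
else
  position.y = max(position.y, CameraPosition.y + NearPlaneEpsilon);  // Camera below: keep water above.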

Ripples

Small waves/ripples on the water are created using two normal maps, scrolled in different directions. Each normal map is sampled once; the normals are scaled by a user-defined "normal map strength" parameter, then added and normalized. The result is an animated normal map. Let's call it the "ripple map". (We will refer to this map later in the article.)
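A minimal sketch of the ripple map computation (sampler and parameter names are illustrative):

// Sample two normal maps, scrolled in different directions.
float3 n0 = tex2D(NormalSampler0, texCoord + Time * ScrollVelocity0).xyz * 2 - 1;
float3 n1 = tex2D(NormalSampler1, texCoord + Time * ScrollVelocity1).xyz * 2 - 1;

// Scale by the user-defined strengths, add and renormalize.
float3 rippleNormal = normalize(n0 * NormalMapStrength0 + n1 * NormalMapStrength1);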

Here are two example normal maps which are used as input.

[Images: two example normal maps]

These normal maps can be created in Photoshop:

  • Create a new 256x256 or 512x512 image. (File | New…)
  • Apply difference clouds twice. (Filter | Render | Difference Clouds)
  • Invert the image. (Image | Adjustments | Invert)
  • Create a normal map using the NVIDIA Texture Tools. (Filter | NVIDIA Tools | NormalMapFilter…)

Reflection and Refraction

Reflections are created using a single skybox cube map or a planar reflection. For HDR cube maps we use RGBM encoding [10].

To create refractions, the scene render target is used as the refraction texture. We do not use a physically-based refraction effect. We simply use the water normal to distort the lookup position. The G-buffer is checked to reject any pixels above the water surface. If a sample is rejected, we sample again at the original (undistorted) position. The refraction distortion linearly fades to 0 if the underwater geometry is close to the water surface (see Soft Intersections below).
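A minimal sketch of this lookup (GetGBufferDepth is a hypothetical helper that samples the depth texture; the other names are illustrative):

// Distort the screen-space lookup position using the ripple normal.
float2 refractionTexCoord = screenTexCoord + rippleNormal.xz * RefractionDistortion;

// Reject samples that belong to geometry above the water surface.
if (GetGBufferDepth(refractionTexCoord) < waterSurfaceDepth)
  refractionTexCoord = screenTexCoord;   // Fall back to the undistorted position.

float3 refraction = tex2D(SceneSampler, refractionTexCoord).rgb;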

A Fresnel term [11] (similar to Schlick's approximation [12]) is used to lerp between reflection and refraction.
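For reference, a sketch of such a Fresnel lerp (R0 is the reflectance at normal incidence, about 0.02 for water):

// Schlick's approximation: F = R0 + (1 - R0) * (1 - cos θ)^5.
float cosTheta = saturate(dot(normal, directionToCamera));
float fresnel = R0 + (1 - R0) * pow(1 - cosTheta, 5);
float3 color = lerp(refraction, reflection, fresnel);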

Lighting

Water is affected by the dominant ambient and directional light sources:

float3 lightColor = WaterColor * (AmbientLightIntensity + nDotL * DirectionalLightIntensity);

where nDotL is the dot product of the up normal (not the ripple normal) and the light direction. The WaterColor parameter is a user-defined tint color. The resulting light color is the input for the inscatter effect used in the underwater fog.

The specular highlight is created using standard Blinn-Phong – no special BRDF or normal map filtering yet.

Underwater Fog

The refracted scene color is attenuated based on the vertical underwater depth and the view distance from the refracted point to the water surface (or the camera if the player is underwater). Additional light is scattered to the viewer based on the underwater depth and view distance. We have tried several extinction/scatter formulas. Currently, we use the formulas presented in Real-time Atmospheric Effects in Games Revisited [13].
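A minimal sketch of the general structure (the exact formulas and constants follow [13]; the coefficient names here are illustrative, and lightColor is the water light color from the Lighting section):

// Attenuate the refracted scene color and add inscattered light.
float3 extinction = exp(-(ExtinctionCoefficient * viewDistance + DepthExtinction * underwaterDepth));
float3 inscatter = lightColor * (1 - exp(-(ScatterCoefficient * viewDistance + DepthScatter * underwaterDepth)));
float3 waterColor = refraction * extinction + inscatter;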

Important note: The underwater fog is the effect which determines the color of the water, e.g. transparent vs. blue ocean vs. brown murky water.

Soft Intersections

To avoid hard intersections between the scene and the water surface, the water effect fades out when the underwater geometry is near the water surface. The depth of the water surface pixel is compared with the depth in the G-buffer. When the delta is small we linearly interpolate between the computed water color and the undistorted refraction color (i.e. the normal scene).
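A minimal sketch (Softness is an illustrative fade range; refractionDepthDelta is the depth difference, which also appears in the foam code below):

// Fade out the water effect where underwater geometry is close to the surface.
float intersectionFade = saturate(refractionDepthDelta / Softness);
color = lerp(undistortedRefraction, waterColor, intersectionFade);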

Flow

Our implementation supports

  • constant flow with a user-defined velocity,
  • automatic flow based on the normal vectors of the water surface,
  • and flow maps.

Using the normal vectors is a natural way to define water flow: If the normal points up, the water surface shows ripples but no overall flow direction. If the water surface is tilted, the water flows downhill. This makes it easy to define the flow, for example, for a meandering river.
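A sketch of this mapping (FlowSpeed is an illustrative parameter):

// Derive the flow velocity from the surface normal: zero flow when the
// normal points straight up, downhill flow when the surface is tilted.
float2 flowVelocity = surfaceNormal.xz * FlowSpeed;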

Constant flow in one direction is simulated using the ripple maps mentioned above: If the two normal maps scroll in opposite directions with equal speed, the water surface shows ripples but doesn't have an overall flow direction. If the velocities of the normal maps do not cancel each other, water appears to be flowing in one direction.

A more sophisticated solution is required to simulate water flowing in different directions, for example, to create a vortex or a curved river. It is not possible to simply scroll a normal map along the flow direction: each point on the water surface has to scroll the map in a different direction, so the normal map quickly gets distorted beyond recognition.

Our flow implementation is similar to Water Flow in Portal 2 [14]: Each surface point alternates between two normal maps. One normal map is faded in and scrolled in the flow direction. Then, as the normal map gets distorted, it is faded out. A second, undistorted normal map is faded in and scrolled in the flow direction. And so on. Example code for water flow can be found here: Animating Water Using Flow Maps [15]
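A minimal sketch of this cycling, following [14, 15] (sampler and parameter names are illustrative):

// Flow direction for this surface point (from a flow map, vertex data, etc.).
float2 flow = tex2D(FlowMapSampler, texCoord).xy * 2 - 1;

// Two phases, offset by half a cycle.
float phase0 = frac(Time * FlowCycleSpeed);
float phase1 = frac(Time * FlowCycleSpeed + 0.5);

// Each layer scrolls along the flow and is faded out as it gets distorted.
float3 normal0 = tex2D(NormalSampler, texCoord - flow * phase0).xyz * 2 - 1;
float3 normal1 = tex2D(NormalSampler, texCoord - flow * phase1).xyz * 2 - 1;

// Weight is 0 for the layer that is about to be reset (most distorted).
float blend = abs(2 * phase0 - 1);
float3 normal = normalize(lerp(normal0, normal1, blend));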

This solution suffers from a pulsing artifact created by cycling the normal maps. This artifact is also analyzed in the article Water Flow Shader [16]. We use noise, as discussed in the Portal 2 presentation, to hide the problem. More tips and details can be found in Water Technology of Uncharted [17].

In addition, we found another trick to hide pulsing: The basic implementation alternates between two fixed normal maps. The improved implementation alternates between two "ripple maps" (as described above, see Ripples). So instead of cycling between two static normal maps (= two texture lookups), we cycle between two animated normal maps (= four texture lookups).

Additionally, the shader limits the flow velocity and reduces the normal strength if the water speed exceeds a certain limit.

The article Water Shader [18] describes a different, more recent method to create flow. This method does not suffer from the pulsing problem. However, we did not implement this method. While it works well with a flow texture, it is not obvious how to use it when flow is defined per vertex (e.g. using vertex normals).

Waves and Vertex Displacement

Vertex displacement mapping in the vertex shader is used to create waves. Without vertex displacement, the geometry is flat and only the scrolling normal maps create the impression of small waves.

For large waves the water renderer needs a displacement map and a normal map. The displacement map contains the x, y and z displacement. It is not a simple height map. Horizontal displacement is necessary to create "choppy" waves. Without it, waves are round and unnatural.
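For illustration, a vertex shader sketch (tex2Dlod is required because we sample in the vertex shader; DisplacementSampler and DisplacementStrength are illustrative names):

// Fetch the x, y, z displacement for this vertex and displace the position.
float2 waveTexCoord = input.PositionWorld.xz * WaveMapScale + WaveMapOffset;
float3 displacement = tex2Dlod(DisplacementSampler, float4(waveTexCoord, 0, 0)).xyz;
float3 position = input.PositionWorld.xyz + displacement * DisplacementStrength;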

The displacement and normal maps have to be animated. Developers can provide their own maps, e.g. by running a grid-based water simulation on the CPU. The wave maps can be tiling (e.g. for an infinite ocean) or non-tiling (e.g. for a local wave simulation around the player).

We provide an ocean wave renderer, which simulates waves using a statistical ocean model and Fast Fourier Transform (FFT) [19], as described by Jerry Tessendorf (Simulating Ocean Water [20]). Translating the theoretical formulas to code can be challenging. I found the ocean source code in the Meshuggah demo [21] very helpful.
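At the heart of the statistical model is the Phillips spectrum, which describes the amplitude of each wave vector depending on wind speed and direction. Here is a sketch of the standard formula from [20] (our code; parameter names are illustrative):

// Phillips spectrum P(k): amplitude for wave vector k, a normalized wind
// direction and a wind speed in m/s.
float PhillipsSpectrum(float2 k, float2 windDirection, float windSpeed, float amplitude)
{
  const float g = 9.81;                   // Gravitational acceleration.
  float kLength = length(k);
  if (kLength < 1e-6)
    return 0;                             // Suppress the DC term.
  
  float L = windSpeed * windSpeed / g;    // Largest wave arising from the wind.
  float kDotW = dot(k / kLength, windDirection);
  return amplitude
         * exp(-1 / (kLength * L * kLength * L))  // Suppress waves much longer than L.
         / pow(kLength, 4)
         * kDotW * kDotW;                         // Favor waves moving along the wind.
}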

We compute the FFT on the GPU. GPU FFT is, for example, discussed in Simulating Ocean Waves with FFT on GPU [22] (source code [23]). Optionally, we also compute the FFT on the CPU. This is done in case the user wants to query the ocean displacement on the CPU, for example, to compute the position of swimming objects. The CPU FFT solves a simpler problem, ignoring higher frequencies in the ocean spectrum. Example: The GPU performs a 256x256 FFT and the CPU a 16x16 FFT. The difference between a 256x256 FFT and a 16x16 FFT is noticeable, but the result is still convincing for swimming objects as long as the wave height is not too high. CPU FFT code is available on Paul Bourke's website [24].

I have recently discovered another article about FFT water, which looks helpful, but I have not read it yet: Ocean Simulation Part Two: Using the Fast Fourier Transform [25]

Foam

Foam is an optional effect. Foam can be added near the shore, where the water is shallow, and on top of large waves.

We use a single opaque foam texture. The foam is lit by ambient and directional light. The normal of the water surface is used to create local movement caused by waves:

float3 foam = FoamColor * tex2D(FoamSampler, (input.PositionWorld.xz - normal.xz * FoamDistortion) * FoamMapScale).rgb;
foam *= AmbientLightIntensity + nDotL * DirectionalLightIntensity;

Then we compute the foam intensity:

// No foam where foam map is black.
float foamIntensity = length(foam);

float foamShoreAmount = 1 - saturate(refractionDepthDelta / FoamShoreIntersection);
float foamCrestAmount = saturate((input.PositionWorld.y - SurfaceLevel - FoamCrestMin) / (FoamCrestMax - FoamCrestMin));

// Use max of shore and crest foam. Apply non-linear curve to change contrast.
foamIntensity *= pow(max(foamShoreAmount, foamCrestAmount), 3);
foamIntensity = saturate(foamIntensity);

Where the foam intensity is high, foam replaces the computed water color:

color = lerp(color, foam, foamIntensity);

Subsurface Scattering

If a wave is high and in front of the sun, light is scattered through the wave to the viewer. We fake this effect using a hack combining several terms:

float scatter = 1;
// More scattering when we look towards the light.
scatter *= saturate(-dot(viewDirection, DirectionalLightDirection) * 0.5 + 0.5);
// More scattering when we look straight onto the wave surface.
scatter *= saturate(-dot(viewDirection, normal));
// Less scattering when camera is above water and looks down.
scatter *= max(0, 1-abs(viewDirection.y));
// More scattering when wave is high above surface level.
scatter *= max(0, input.PositionWorld.y - SurfaceLevel);

The scatter factor influences the light intensity (discussed above, see Lighting):

lightColor += lightColor * scatter * ScatterColor;

Caustics

I assume underwater caustics are usually created using predefined animated textures or, at runtime, using beam-tracing; see Periodic Caustic Textures [26] or Deep-Water Animation and Rendering [27]. We considered caustics a nice-to-have feature and did not give ourselves much time to experiment with the topic. After one day, we found a good-enough method to fake simple caustics in the water shader.

Caustics are a phenomenon that does not map well to the GPU. Computing them is basically a scatter operation: A ray of light reaches the water surface, is refracted by the curved surface and finally hits the underwater geometry. The refracted light rays do not cover the underwater geometry uniformly. Similar to light that shines through a lens, we get bright spots and darker areas.

The initial idea behind our implementation is to turn the scatter operation into a gather operation: We take a few samples of the wave normal map and use a probability distribution to estimate how much each sample contributes to the underwater brightness. Using a Phong distribution, we multiply each sample with (Normal · LightDirection)^n.

First, I tried taking only a single sample from the wave normal map. I added several specular lobes to get a more interesting effect:

// Project the underwater position along the light direction onto the water surface.
float3 causticPosition = underwaterPosition + LightDirection * waterDepth / LightDirection.y;
float2 causticTexCoord = causticPosition.xz * WaveMapScale + WaveMapOffset;
float3 causticNormal = normalize(tex2D(WaveNormalSampler, causticTexCoord).xzy * 2 - 1);
// For a vertical light, causticNormal.y is the cosine between normal and light.
float n = causticNormal.y;
// Sum of several Phong lobes with different exponents.
float caustics = 0.05 * pow(n, 100) + 0.1 * pow(n, 2000) + 0.1 * pow(n, 5000) + 0.5 * pow(n, 10000);

The result was not satisfying, but it hinted at something useful:

[Image: caustics created from a single normal map sample]

The next step was to use more samples. I found a similar method in the NVIDIA DirectX 11 Island sample. (Unfortunately, the NVIDIA DirectX 11 samples webpage no longer works. It seems that with the release of NVIDIA GameWorks the old DirectX 11 samples were removed and replaced by … fewer samples!? You can still download the sample here: https://developer.nvidia.com/sites/default/files/akamai/gamedev/files/sdk/11/Island11.zip [28])

However, I didn't understand NVIDIA's implementation and created something similar myself. The following method computes the caustic intensity for an underwater pixel:

float3 ComputeCaustics(float3 position, float3 lightIntensity)
{
  // We could make the caustics depth-dependent, but without a high sample count,
  // this introduces artifacts.
  float depth = 1;  //SurfaceLevel - position.y
  
  float3 refractionToSurface = -DirectionalLightDirection * depth;
  float3 causticPosition = position + refractionToSurface;
  
  // For method 1 (Add), the initial value must be 0.
  // For method 2 (Multiply), the initial value must be 1.
  float caustics = 1;
  
  // Combine NxN samples.
  for (int i = 0; i < CausticsSampleCount; i++)
  {
    for (int j = 0; j < CausticsSampleCount; j++)
    {
      // A horizontal offset for the sample position
      float3 offset = depth * float3(
        (i - CausticsSampleCount / 2) * CausticsSampleOffset,
        0,
        (j - CausticsSampleCount / 2) * CausticsSampleOffset);
      
      // Lookup position.
      float2 texCoord = (causticPosition.xz + offset.xz) * WaveMapScale + WaveMapOffset;
      
      // The direction to the lookup position is:
      float3 refractedDirection = refractionToSurface + offset;
      
      // Sample wave normal map.
      float3 n = tex2Dlod(WaveNormalSampler, float4(texCoord, 0, 0)).xzy * 2 - 1;
      
      // Use wave normal to perturb refractedDirection.
      refractedDirection -= depth * CausticsDistortion * float3(n.x, 0, n.z);
      
      // Light contribution of this sample.
      float sampleIntensity = pow(
        max(0.0001, dot(normalize(refractedDirection), -DirectionalLightDirection)),
        CausticsExponent);
      
      // Method 1: Add intensity of samples. Creates smoother caustics.
      //caustics += sampleIntensity;
      
      // Method 2: Multiply intensities. Creates caustics with more contrast.
      caustics *= sampleIntensity;
    }
  }
  
  caustics *= CausticsIntensity;
  
  // Optional: Apply a non-linear curve to enhance contrast.
  //caustics = pow(caustics, 3);
  
  // Modulate by the light intensity (per channel).
  return caustics * lightIntensity;
}

Astonishingly, we get some interesting fake caustics with as few as 3 x 3 samples. For example:

[Image: fake caustics rendered with 3 x 3 samples]

To see the caustics in motion, check out this video: Water Rendering (DigitalRune YouTube Channel)

This method is simple, and the caustics are fully dynamic: they change with the light direction and wave height. However, it is not good at creating very crisp, intense caustics. In the future, we might explore more sophisticated solutions.

Underwater Distortion Effect

If the camera is underwater, the scene should be distorted to create an underwater feeling. The water surface shader creates refraction distortion only when the camera is looking through the water surface. If we want a distortion effect when the camera is underwater and looking at underwater geometry, we have to apply an additional effect. In our example project, we use a post-processing effect to add a fullscreen distortion when the camera origin is inside the water volume. The distortion post-processing shader is very simple: When sampling from the source image, add an offset to each pixel. The offset is a sine/cosine function of the current time and pixel position. The offset is reduced to 0 at the screen edges – otherwise, we would create gaps at the screen borders.

The method was adapted from an NVIDIA sample. Here is the pixel shader:

float4 PSPostProcess(float2 texCoord : TEXCOORD0) : COLOR0
{
  // Calculate a perturbation for the texture coordinates.
  float2 perturbation;
  perturbation.x = sin(6.28 * frac(Time / 3) + 20 * texCoord.x);
  perturbation.y = cos(6.28 * frac(Time / 3) + 20 * texCoord.y);
  perturbation *= 0.005;

  // Perturb the texture coordinates. Fall off to zero on the borders to avoid
  // artifacts on edges.
  perturbation *= saturate(0.8 - (2.0 * texCoord - 1.0) * (2.0 * texCoord - 1.0));
  
  return tex2D(SourceSampler, texCoord + perturbation);
}

Other underwater effects, such as god rays or depth-based blur, are not implemented yet.

Fog

In our deferred lighting pipeline, fog is applied to opaque geometry in a single full-screen pass. Transparent, forward-rendered geometry applies the fog when the geometry is rendered. Since water isn't rendered into the G-buffer either, the fog is likewise applied directly in the water shader.
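A minimal sketch of a matching distance fog applied in the water shader, assuming a simple exponential fog model (which may differ from the fog formula the pipeline actually uses; names are illustrative):

// Blend the water color towards the fog color based on view distance.
float fog = 1 - exp(-FogDensity * viewDistance);
color = lerp(color, FogColor, fog);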

Conclusion

I would like to note that our implementation is not the pinnacle of water rendering tech. Various features will be added or refined in the future: underwater light shafts, local wave simulation from impacts, better foam, etc. However, we are very pleased with the current results. The features, implemented in a few weeks, exceeded our initial expectations.

References

[1] DigitalRune Blog: "Water Rendering Resources", http://www.digitalrune.com/Support/Blog/tabid/719/EntryId/208/Water-Rendering-Resources.aspx
[2] Wojciech Toman: "Rendering Water as a Post-process Effect", http://www.gamedev.net/page/reference/index.html/_/technical/graphics-programming-and-theory/rendering-water-as-a-post-process-effect-r2642
[3] Wikipedia: "Deferred shading", http://en.wikipedia.org/wiki/Deferred_shading
[4] DigitalRune Blog: "Parallax Mapping", http://www.digitalrune.com/Support/Blog/tabid/719/EntryId/201/Parallax-Mapping.aspx
[5] jMonkeyEngine: "Rendering Water as Post-Process Effect", http://hub.jmonkeyengine.org/wiki/doku.php/jme3:advanced:post-processor_water
[6] Damien Hinsinger, Fabrice Neyret, Marie-Paule Cani: "Interactive Animation of Ocean Waves", ACM-SIGGRAPH/EG Symposium on Computer Animation (SCA), 2002, http://www-evasion.imag.fr/Publications/2002/HNC02/
[7] Claes Johanson: "Real-time water rendering - introducing the projected grid concept", Master of Science thesis in computer graphics, March 2004, http://fileadmin.cs.lth.se/graphics/theses/projects/projgrid/
[8] László Szécsi, Khashayar Arman: "Procedural Ocean Effects", ShaderX6 - Advanced Rendering Techniques, http://www.shaderx6.com/
[9] Huw Bowles: "Oceans on a Shoestring: Shape Representation, Meshing and Shading", Advances in Real-Time Rendering in Games, SIGGRAPH 2013, http://advances.realtimerendering.com/s2013/index.html
[10] Brian Karis: "RGBM color encoding", http://graphicrants.blogspot.co.at/2009/04/rgbm-color-encoding.html
[11] Wikipedia: "Fresnel equations", http://en.wikipedia.org/wiki/Fresnel_equations
[12] Wikipedia: "Schlick's approximation", http://en.wikipedia.org/wiki/Schlick's_approximation
[13] Carsten Wenzel: "Real-time Atmospheric Effects in Games Revisited", GDC 2007, http://www.crytek.com/cryengine/presentations/real-time-atmospheric-effects-in-games-revisited
[14] Alex Vlachos: "Water Flow in Portal 2", SIGGRAPH 2010, http://www.valvesoftware.com/publications/2010/siggraph2010_vlachos_waterflow.pdf
[15] Kyle Hayward: "Animating Water Using Flow Maps", http://graphicsrunner.blogspot.co.at/2010/08/water-using-flow-maps.html
[16] IceFall Games: "Water Flow Shader", http://mtnphil.wordpress.com/2012/08/25/water-flow-shader/
[17] Carlos Gonzalez Ochoa, Doug Holder: "Water Technology of Uncharted", GDC 2012, http://cgzoo.files.wordpress.com/2012/04/water-technology-of-uncharted-gdc-2012.pdf
[18] "Water shader", http://www.rug.nl/science-and-society/centre-for-information-technology/research/hpcv/publications/watershader/
[19] Wikipedia: "Fast Fourier transform", http://en.wikipedia.org/wiki/Fast_Fourier_transform
[20] Jerry Tessendorf: "Simulating Ocean Water", http://graphics.ucsd.edu/courses/rendering/2005/jdewall/tessendorf.pdf
[21] Carsten Wenzel: "Meshuggah’s Effects explained", ShaderX2 - Shader Tips & Tricks, http://www.shaderx2.com/
[22] Shiqiu Liu: "Simulating Ocean Waves with FFT on GPU", http://www.edxgraphics.com/2/post/2011/09/simulating-ocean-waves-with-fft-on-gpu.html
[23] EDXGraphics, http://edxgraphics.codeplex.com/
[24] Paul Bourke: "DFT (Discrete Fourier Transform), FFT (Fast Fourier Transform)", http://paulbourke.net/miscellaneous/dft/
[25] Keith Lantz: "Ocean Simulation Part Two: Using the Fast Fourier Transform", http://www.keithlantz.net/2011/11/ocean-simulation-part-two-using-the-fast-fourier-transform/
[26] "Periodic Caustic Textures", http://www.dgp.toronto.edu/~stam/reality/Research/PeriodicCaustics/index.html
[27] Lasse Staff Jensen, Robert Goliáš: "Deep-Water Animation and Rendering", http://whitenight-games.chez-alice.fr/Docs/Water.pdf
[28] NVIDIA: DirectX 11 Island Sample, https://developer.nvidia.com/sites/default/files/akamai/gamedev/files/sdk/11/Island11.zip
