
Chapter 20: Advanced Lighting
Written by Marius Horga


As you’ve progressed through this book, you’ve encountered various lighting and reflection models:

  • In Chapter 5, “Lighting Fundamentals,” you started with the Phong reflection model which defines light as a sum of three distinct components: ambient light, diffuse light and specular light.
  • In Chapter 7, “Maps and Materials,” you briefly looked at physically based rendering and the Fresnel effect.
  • In Chapter 12, “Environment,” you implemented skybox-based reflection and image-based lighting, and you used a Bidirectional Reflectance Distribution Function (BRDF) look-up table.

In this chapter, you’ll learn about global illumination and the famous rendering equation that defines it.

While reflection is possible using the local illumination techniques you’ve seen so far, advanced effects — like refraction, subsurface scattering, total internal reflection, caustics and color bleeding — are only possible with global illumination. By the end of this chapter, you’ll be able to render beautiful content like this:

You’ll start by examining the rendering equation. From there, you’ll move on to reflection and refraction, and you’ll render water in two ways: ray marching and rasterization.

The rendering equation

Two academic papers — one by James Kajiya and the other by David Immel et al. — introduced the rendering equation in 1986. In its raw form, this equation might look intimidating:

Note: Pixar’s Julian Fong recently wrote a friendlier form of it in a tweet. See references.markdown for more information.

The rendering equation is based on the law of conservation of energy, and in simple terms, it translates to an equilibrium equation where the sum of all source lights must equal the sum of all destination lights:

incoming light + emitted light = transmitted light + outgoing light

If you rearrange the terms of the equilibrium equation, you get the most basic form of the rendering equation:

outgoing light = emitted light + incoming light - transmitted light

The incoming light - transmitted light part of the equation is recursive, because light bounces multiple times at that point. That recursion translates to an integral over a unit hemisphere centered on the normal vector at the point, containing all possible values for the negated incoming light direction.
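In symbols, a standard form of this hemisphere integral (consistent with the BRDF you met in Chapter 12) is:

```latex
L_o(x, \omega_o) = L_e(x, \omega_o)
  + \int_{\Omega} f_r(x, \omega_i, \omega_o)\,
      L_i(x, \omega_i)\,(\omega_i \cdot n)\, d\omega_i
```

Here, L_o is the outgoing radiance at point x in direction omega_o, L_e is the emitted radiance, L_i is the incoming radiance from direction omega_i, f_r is the BRDF, and the (omega_i · n) term attenuates light arriving at grazing angles over the hemisphere Omega around the normal n.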

Although the rendering equation might be a bit intimidating, think of it like this: all the light leaving an object is what remains from all the lights coming into the object after some of them were transmitted through the object.

The transmitted light can be either absorbed by the surface of the object (material), changing its color; or scattered through the object, which leads to a range of interesting optical effects such as refraction, subsurface scattering, total internal reflection, caustics and so on.

Reflection

Reflection is one of the most common interactions between light and objects. Imagine looking into a mirror. Not only would you see your image being reflected, but you’d also see the reflection of any nearby objects.

Getting started

Open the starter playground named AdvancedLighting, and select the 1. reflection playground page. Run the playground, and you’ll see this:

Drawing a checkerboard pattern

To draw a pattern on the plane, you first need a way of identifying objects within the scene by comparing their distances to the camera.

// Give each object an ID so you can tell them apart later.
constant float PlaneObj = 0.0;
constant float SphereObj = 1.0;

// In distToScene, return both the distance and the closest object's ID:
float dts = distToSphere(r, s);
float object = (dtp > dts) ? SphereObj : PlaneObj;
return float2(dist, object);

// Back in the kernel, read the ID and checker the plane:
float2 dist = distToScene(cam.ray);
float closestObject = dist.y;
// 1
if (closestObject == PlaneObj) {
  // 2
  float2 pos = cam.ray.origin.xz;
  pos *= 0.1;
  // 3
  pos = floor(fmod(pos, 2.0));
  float check = mod(pos.x + pos.y, 2.0);
  // 4
  col *= check * 0.5 + 0.5;
}

The two modulo variants differ only in how they round the quotient:

fmod = numerator - denominator * trunc(numerator / denominator)
mod = numerator - denominator * floor(numerator / denominator)

MSL provides fmod but not a floor-based mod, so define one yourself:

float mod(float x, float y) {
  return x - y * floor(x / y);
}

// Bounce the camera ray: nudge the origin off the surface to avoid
// self-intersection, then mirror the direction about the normal.
Camera reflectRay(Camera cam, float3 n, float eps) {
  cam.ray.origin += n * eps;
  cam.ray.dir = reflect(cam.ray.dir, n);
  return cam;
}

// After a hit, reflect the ray off the surface normal:
float3 normal = getNormal(cam.ray);
cam = reflectRay(cam, normal, eps);

// Ray missed: shade a sky gradient. Ray hit: apply ambient occlusion.
if (!hit) {
  col = mix(float3(0.8, 0.8, 0.4), float3(0.4, 0.4, 1.0),
            cam.ray.dir.y);
} else {
  float3 n = getNormal(cam.ray);
  float o = ao(cam.ray.origin, n);
  col = col * o;
}

// Tint the reflected color with the same sky gradient:
col *= mix(float3(0.8, 0.8, 0.4), float3(0.4, 0.4, 1.0),
           cam.ray.dir.y);

// Pass the elapsed time into the kernel:
constant float &time [[buffer(0)]]

// Replace the static camera position...
float3 camPos = float3(15.0, 7.0, 0.0);
// ...with one that orbits the scene over time:
float3 camPos = float3(sin(time) * 15.0,
                       sin(time) * 5.0 + 7.0,
                       cos(time) * 15.0);

Refraction

Refraction is another common interaction between light and objects that you often see in nature. While it’s true that most objects in nature are opaque — thus absorbing most of the light they receive — the few objects that are translucent, or transparent, allow light to propagate through them.

For water (index of refraction of about 1.33), Snell’s law gives:

sin(theta2) = sin(theta1) / 1.33

// Track whether the ray is currently traveling inside the sphere:
bool inside = false;
float2 dist = distToScene(cam.ray);
// Signed distances are negative while inside an object:
dist.x *= inside ? -1.0 : 1.0;

// Replace the plain normal lookup...
float3 normal = getNormal(cam.ray);
if (dist.x < eps) {
// ...with one that flips the normal while inside the sphere:
float3 normal = getNormal(cam.ray) * (inside ? -1.0 : 1.0);
cam = reflectRay(cam, normal, eps);
// 1
else if (closestObject == SphereObj) {
  inside = !inside;
  // 2: entering water uses eta = 1/1.33; leaving uses 1.33.
  float ior = inside ? 1.0 / 1.33 : 1.33;
  cam = refractRay(cam, normal, eps, ior);
}

// Push the origin through the surface, then bend the direction
// with the built-in refract():
Camera refractRay(Camera cam, float3 n, float eps, float ior) {
  cam.ray.origin -= n * eps * 2.0;
  cam.ray.dir = refract(cam.ray.dir, n, ior);
  return cam;
}

Raytraced water

It’s relatively straightforward to create a cheap, fake water-like effect on the sphere.

// In distToScene, distort the point with layered sine waves
// before measuring the distance to the sphere:
float object = (dtp > dts) ? SphereObj : PlaneObj;
if (object == SphereObj) {
  // 1
  float3 pos = r.origin;
  pos += float3(sin(pos.y * 5.0),
                sin(pos.z * 5.0),
                sin(pos.x * 5.0)) * 0.05;
  // 2
  Ray ray = Ray{pos, r.dir};
  dts = distToSphere(ray, s);
}

// Replace the full march step...
cam.ray.origin += cam.ray.dir * dist.x;
// ...with a half step, so the distorted surface isn't overshot:
cam.ray.origin += cam.ray.dir * dist.x * 0.5;

Rasterized water

From this point on, and until the end of the chapter, you’ll adapt an algorithm for realistic water developed by Michael Horsch in 2005 (for more information, see references.markdown). This algorithm is based purely on lighting and its optical properties, as opposed to a physics-based water simulation.

1. Create the water surface

First, you’ll create a plane for the water surface. In Renderer.swift, add these properties to Renderer:

lazy var water: MTKMesh = {
  do {
    let mesh = Primitive.plane(device: Renderer.device)
    let water = try MTKMesh(mesh: mesh, device: Renderer.device)
    return water
  } catch let error {
    fatalError(error.localizedDescription)
  }
}()

var waterTransform = Transform()
var waterPipelineState: MTLRenderPipelineState!
// water pipeline state
descriptor.vertexFunction = 
  library.makeFunction(name: "vertex_water")
descriptor.fragmentFunction = 
  library.makeFunction(name: "fragment_water")
descriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
descriptor.vertexDescriptor = 
  MTKMetalVertexDescriptorFromModelIO(water.vertexDescriptor)
try waterPipelineState = 
  device.makeRenderPipelineState(descriptor: descriptor)
func renderWater(renderEncoder: MTLRenderCommandEncoder) {
  renderEncoder.pushDebugGroup("water")
  renderEncoder.setRenderPipelineState(waterPipelineState)
  renderEncoder.setVertexBuffer(water.vertexBuffers[0].buffer, 
                                offset: 0, index: 0)
  uniforms.modelMatrix = waterTransform.matrix
  renderEncoder.setVertexBytes(&uniforms,
                     length: MemoryLayout<Uniforms>.stride,
                     index: Int(BufferIndexUniforms.rawValue))
  for submesh in water.submeshes {
    renderEncoder.drawIndexedPrimitives(type: .triangle,
             indexCount: submesh.indexCount,
             indexType: submesh.indexType,
             indexBuffer: submesh.indexBuffer.buffer,
             indexBufferOffset: submesh.indexBuffer.offset)
  }
  renderEncoder.popDebugGroup()
}
renderWater(renderEncoder: renderEncoder)
#import "Common.h"

struct VertexIn {
  float4 position [[attribute(0)]];
  float2 uv [[attribute(2)]];
};

struct VertexOut {
  float4 position [[position]];
  float2 uv;
};

vertex VertexOut 
     vertex_water(const VertexIn vertex_in [[stage_in]],
                  constant Uniforms &uniforms
                             [[buffer(BufferIndexUniforms)]]) {
  VertexOut vertex_out;
  float4x4 mvp = uniforms.projectionMatrix * uniforms.viewMatrix
                         * uniforms.modelMatrix;
  vertex_out.position = mvp * vertex_in.position;
  vertex_out.uv = vertex_in.uv;
  return vertex_out;
}

fragment float4 
     fragment_water(VertexOut vertex_in [[stage_in]]) {
  return float4(0.0, 0.3, 0.5, 1.0);
}

2. The reflection render pass

The water plane should reflect its surroundings. In Chapter 12, “Environment,” you reflected the skybox onto objects, but this time you’re also going to reflect the house and terrain on the water.

let reflectionRenderPass: RenderPass
reflectionRenderPass = RenderPass(name: "reflection",
                                  size: metalView.drawableSize)
reflectionRenderPass.updateTextures(size: size)
// 1
let reflectEncoder = 
  commandBuffer.makeRenderCommandEncoder(
         descriptor: reflectionRenderPass.descriptor)!
reflectEncoder.setDepthStencilState(depthStencilState)
// 2
reflectionCamera.transform = camera.transform
reflectionCamera.transform.position.y = -camera.transform.position.y
reflectionCamera.transform.rotation.x = -camera.transform.rotation.x
uniforms.viewMatrix = reflectionCamera.viewMatrix
// 3
renderHouse(renderEncoder: reflectEncoder)
renderTerrain(renderEncoder: reflectEncoder)
renderSkybox(renderEncoder: reflectEncoder)
reflectEncoder.endEncoding()
renderEncoder.setFragmentTexture(reflectionRenderPass.texture, 
                                 index: 0)
texture2d<float> reflectionTexture [[texture(0)]]
// 1
constexpr sampler s(filter::linear, address::repeat);
// 2
float width = float(reflectionTexture.get_width() * 2.0);
float height = float(reflectionTexture.get_height() * 2.0);
float x = vertex_in.position.x / width;
float y = vertex_in.position.y / height;
float2 reflectionCoords = float2(x, 1 - y);
// 3
float4 color = reflectionTexture.sample(s, reflectionCoords);
color = mix(color, float4(0.0, 0.3, 0.5, 1.0), 0.3);
return color;

3. Clipping planes

A clipping plane, as its name suggests, clips the scene using a plane. It’s hardware accelerated, meaning that if geometry is not within the clip range, the GPU immediately discards it and doesn’t put it through the rasterizer.

// In Common.h, add a clipping plane to Uniforms:
vector_float4 clipPlane;

// Declare a hardware clip distance on the vertex output...
float clip_distance [[clip_distance]] [1];
// ...and redeclare it, without the attribute, on the fragment input:
float clip_distance;

// In the vertex function, the clip distance is the signed distance
// from the world-space position to the plane:
vertex_out.clip_distance[0] =
    dot(uniforms.modelMatrix * vertex_in.position,
        uniforms.clipPlane);

// Reflection pass: plane (0, 1, 0, 0.1) keeps geometry above y = -0.1:
var clipPlane = float4(0, 1, 0, 0.1)
uniforms.clipPlane = clipPlane
// Main pass: plane (0, -1, 0, 6) keeps geometry below y = 6:
clipPlane = float4(0, -1, 0, 6)
uniforms.clipPlane = clipPlane

// Later, loosen the main-pass plane so nothing visible gets clipped:
clipPlane = float4(0, -1, 0, 100)

4. Rippling normal maps

The project contains a tiling normal map for the water ripples.

var waterTexture: MTLTexture?
waterTexture = 
    try Renderer.loadTexture(imageName: "normal-water.png")
renderEncoder.setFragmentTexture(waterTexture, index: 2)
renderEncoder.setFragmentBytes(&timer, 
                               length: MemoryLayout<Float>.size, 
                               index: 3)
texture2d<float> normalTexture [[texture(2)]],
constant float& timer [[buffer(3)]]
// 1
float2 uv = vertex_in.uv * 2.0;
// 2
float waveStrength = 0.1;
float2 rippleX = float2(uv.x + timer, uv.y);
float2 rippleY = float2(-uv.x, uv.y) + timer;
float2 ripple = 
    ((normalTexture.sample(s, rippleX).rg * 2.0 - 1.0) +
     (normalTexture.sample(s, rippleY).rg * 2.0 - 1.0)) 
      * waveStrength;
reflectionCoords += ripple;
// 3  
reflectionCoords = clamp(reflectionCoords, 0.001, 0.999);

5. The refraction render pass

Refraction is very similar to reflection, except that you only need to preserve the part of the scene where the Y coordinate is negative.

let refractionRenderPass: RenderPass
refractionRenderPass = RenderPass(name: "refraction",
                                  size: metalView.drawableSize)
refractionRenderPass.updateTextures(size: size)
// 1
clipPlane = float4(0, -1, 0, 0.1)
uniforms.clipPlane = clipPlane
uniforms.viewMatrix = camera.viewMatrix
// 2
let refractEncoder = 
    commandBuffer.makeRenderCommandEncoder(
          descriptor: refractionRenderPass.descriptor)!
refractEncoder.setDepthStencilState(depthStencilState)
renderHouse(renderEncoder: refractEncoder)
renderTerrain(renderEncoder: refractEncoder)
renderSkybox(renderEncoder: refractEncoder)
refractEncoder.endEncoding()
renderEncoder.setFragmentTexture(refractionRenderPass.texture, 
                                 index: 1)
// Matching texture argument in fragment_water:
texture2d<float> refractionTexture [[texture(1)]]

// Refraction coordinates aren't flipped vertically:
float2 refractionCoords = float2(x, y);
refractionCoords += ripple;
refractionCoords = clamp(refractionCoords, 0.001, 0.999);

// For now, replace the reflection sample...
float4 color = reflectionTexture.sample(s, reflectionCoords);
// ...with the refraction sample to test the new pass:
float4 color = refractionTexture.sample(s, refractionCoords);

6. The Fresnel effect

The Fresnel effect is a concept you’ve met in previous chapters. As you may remember, the viewing angle plays a significant role in how much reflection you can see. What’s new in this chapter is that the viewing angle also affects refraction, but in inverse proportion:

// Add the camera's world position to Uniforms:
vector_float3 cameraPosition;
uniforms.cameraPosition = camera.transform.position

// Pass the world position and the vector toward the camera
// from the vertex function to the fragment function:
float3 worldPosition;
float3 toCamera;
vertex_out.worldPosition =
    (uniforms.modelMatrix * vertex_in.position).xyz;
vertex_out.toCamera =
    uniforms.cameraPosition - vertex_out.worldPosition;

// Replace the plain refraction sample...
float4 color = refractionTexture.sample(s, refractionCoords);
// ...with a view-angle-dependent mix of reflection and refraction:
float3 viewVector = normalize(vertex_in.toCamera);
float mixRatio = dot(viewVector, float3(0.0, 1.0, 0.0));
float4 color =
    mix(reflectionTexture.sample(s, reflectionCoords),
        refractionTexture.sample(s, refractionCoords),
        mixRatio);

7. Add smoothness using a depth texture

The light propagation varies for different transparent media, but for water, the colors with longer wavelengths (closer to infrared) quickly fade away as the light ray goes deeper. The bluish colors (closer to ultraviolet) tend to be visible at greater depths because they have shorter wavelengths.

// Bind the refraction pass's depth texture:
renderEncoder.setFragmentTexture(
                  refractionRenderPass.depthTexture,
                  index: 4)

// Enable alpha blending on the water pipeline so shallow
// water can fade out:
guard let attachment = descriptor.colorAttachments[0]
      else { return }
attachment.isBlendingEnabled = true
attachment.rgbBlendOperation = .add
attachment.sourceRGBBlendFactor = .sourceAlpha
attachment.destinationRGBBlendFactor = .oneMinusSourceAlpha

// Matching argument in fragment_water:
depth2d<float> depthMap [[texture(4)]]

// Projection terms used to linearize depth-buffer values:
float proj33 = far / (far - near);
float proj43 = proj33 * -near;
// Distance from the camera to the terrain under the water:
float depth = depthMap.sample(s, refractionCoords);
float floorDistance = proj43 / (depth - proj33);
// Distance from the camera to the water surface itself:
depth = vertex_in.position.z;
float waterDistance = proj43 / (depth - proj33);
// The water depth at this fragment drives the alpha:
depth = floorDistance - waterDistance;
color.a = clamp(depth * 0.75, 0.0, 1.0);

Challenge

Your challenge for this chapter is to use the normal map from the ripples section, this time to add surface lighting.

Where to go from here?

You’ve certainly made a splash with this chapter! If you want to explore more about water rendering, the references.markdown file for this chapter contains links to interesting articles and videos.

© 2021 Razeware LLC
