Metal by Tutorials

Second Edition · iOS 13 · Swift 5.1 · Xcode 11


10. Fragment Post-Processing
Written by Marius Horga


Before embarking on complex features like tessellation and instancing, it’s best to start with simple techniques to improve your render quality. After the fragments have been processed in the pipeline, a series of operations are run on the GPU. These operations are sometimes referred to as Per-sample Processing (https://www.khronos.org/opengl/wiki/Per-Sample_Processing), and include:

- Alpha testing
- Depth testing
- Stencil testing
- Scissor testing
- Blending
- Antialiasing

As you go through this chapter, you’ll learn about most of these.

Getting started

In the projects directory for this chapter, open the starter playground, then run it. You’ll see a scene containing a tree.

When you move closer to the tree (using either the scroll wheel or a two-finger gesture on your trackpad), you’ll notice the leaves have an unpleasant look. In the playground’s Resources folder, take a look at treeColor.png. The area of the texture surrounding each leaf is white in the color channels but fully transparent in the alpha channel.

To make the leaves look more natural, you’ll discard the transparent parts of the texture instead of rendering them as white. However, before changing anything, it’s important to understand the difference between transparent objects and those that are translucent.

Alpha testing

An object that is transparent allows light to pass entirely through it. A translucent object, on the other hand, distorts light as it passes through. Objects like water, glass, and plastic are translucent. Objects can also be opaque; in fact, most objects in nature are opaque, meaning they don’t allow any light to pass through, like trees and rocks.

First, give the renderer a transparency toggle and a dedicated pipeline state for the tree:

var transparencyEnabled = false
var treePipelineState: MTLRenderPipelineState!

// 1: create a fragment function just for the tree
let treeFragmentFunction =
    library.makeFunction(name: "fragment_tree")
descriptor.fragmentFunction = treeFragmentFunction
// 2: build the tree's pipeline state from the updated descriptor
treePipelineState =
  try device.makeRenderPipelineState(descriptor: descriptor)

// render tree
renderEncoder.setRenderPipelineState(treePipelineState)
// send the transparency toggle to the fragment shader in buffer 0
renderEncoder.setFragmentBytes(&transparencyEnabled,
                  length: MemoryLayout<Bool>.size, index: 0)
// 1: accept keyboard events in the playground's view
public override var acceptsFirstResponder: Bool {
  return true
}

public override func keyDown(with event: NSEvent) {
  enum KeyCode: UInt16 {
    case t = 0x11
  }

  guard
    let renderer = renderer,
    let keyCode = KeyCode(rawValue: event.keyCode)
  else { return }

  // 2: toggle transparency when T is pressed
  switch keyCode {
  case .t:
    renderer.transparencyEnabled.toggle()
  }
}
// 1: the tree's fragment function receives the toggle in buffer 0
fragment float4 fragment_tree(VertexOut vertex_in [[stage_in]],
             texture2d<float> texture [[texture(0)]],
             constant bool &transparencyEnabled [[buffer(0)]]) {
  // 2: sample the tree texture with linear filtering
  constexpr sampler s(filter::linear);
  float4 color = texture.sample(s, vertex_in.uv);
  // 3: alpha test - discard fragments that are almost fully transparent
  if (transparencyEnabled && color.a < 0.1) {
    discard_fragment();
  }
  // 4: everything else keeps its sampled color
  return color;
}
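Note: discard_fragment() isn’t free. Once a fragment function is allowed to discard, the GPU may no longer be able to rely on early depth testing for that draw call, so it’s best to use alpha testing only on the objects that need it.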

To light the tree, add a vertex function that will carry lighting information:

vertex VertexOut
         vertex_light(const VertexIn vertex_in [[stage_in]],
                constant float4x4 &mvp_matrix [[buffer(1)]]) {
  VertexOut vertex_out;
  vertex_out.position = mvp_matrix * vertex_in.position;
  vertex_out.uv = vertex_in.uv;
  return vertex_out;
}

Back on the CPU, assign the new vertex function to the descriptor before building the tree’s pipeline state:

let lightVertexFunction =
     library.makeFunction(name: "vertex_light")
descriptor.vertexFunction = lightVertexFunction
treePipelineState =
    try device.makeRenderPipelineState(descriptor: descriptor)

Then, inside vertex_light, compute a simple directional light:

// 1: transform the normal (w = 0 so translation is ignored)
vertex_out.normal = (mvp_matrix
                       * float4(vertex_in.normal, 0)).xyz;
// 2: a fixed light direction and a white light color
float3 light_direction = {1.0, 1.0, -1.0};
float4 light_color = float4(1.0);
// 3: Lambertian intensity: the cosine of the angle between
//    the surface normal and the light direction
float intensity = dot(normalize(vertex_out.normal),
                      normalize(light_direction));
// 4: clamp to [0, 1] and pass the lit color to the rasterizer
vertex_out.color = saturate(light_color * intensity);

Finally, in fragment_tree, modulate the sampled texture color by the interpolated lighting, brightened by a factor of 2:

color *= vertex_in.color * 2;

Depth testing

Depth testing compares the depth value of the current fragment to one stored in the framebuffer. If a fragment is farther away than the current depth value, this fragment fails the depth test and is discarded since it’s occluded by another fragment. You’ll learn more about depth testing in Chapter 14, “Multipass and Deferred Rendering.”
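Depth testing isn’t something you write in a shader; you configure it on the CPU with a depth-stencil state. Here’s a minimal sketch, where device and renderEncoder stand in for the playground’s existing objects, and the render pass is assumed to have a depth attachment (for example, an MTKView with depthStencilPixelFormat set to .depth32Float):

let depthDescriptor = MTLDepthStencilDescriptor()
// keep a fragment only if it's closer than the stored depth value
depthDescriptor.depthCompareFunction = .less
// write the depths of surviving fragments back to the depth buffer
depthDescriptor.isDepthWriteEnabled = true
let depthState = device.makeDepthStencilState(descriptor: depthDescriptor)
renderEncoder.setDepthStencilState(depthState)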

Stencil testing

Stencil testing compares the value stored in a stencil attachment to a masked reference value. If a fragment makes it through the mask, it’s kept; otherwise, it’s discarded.
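Stencil testing is configured the same way, via a depth-stencil state. A minimal sketch, again assuming device and renderEncoder from the playground, and a view whose depth-stencil format includes a stencil component (such as .depth32Float_stencil8):

// keep only fragments whose masked stencil value equals the reference
let stencilDescriptor = MTLStencilDescriptor()
stencilDescriptor.stencilCompareFunction = .equal
stencilDescriptor.readMask = 0xFF
let depthStencilDescriptor = MTLDepthStencilDescriptor()
depthStencilDescriptor.frontFaceStencil = stencilDescriptor
depthStencilDescriptor.backFaceStencil = stencilDescriptor
let stencilState =
  device.makeDepthStencilState(descriptor: depthStencilDescriptor)
renderEncoder.setDepthStencilState(stencilState)
// the reference value fragments are compared against
renderEncoder.setStencilReferenceValue(1)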

Scissor testing

If you only want to render part of the screen, you can tell the GPU to only render within a particular rectangle. This is much more efficient than rendering the whole screen. The scissor test checks whether a fragment is inside a defined 2D area called the scissor rectangle. If the fragment falls outside of this rectangle, it’s discarded.

// discard all fragments outside this rectangle
renderEncoder.setScissorRect(MTLScissorRect(x: 500, y: 500,
                                   width: 600, height: 400))
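Note: the scissor rectangle must lie entirely within the render target; if it extends past the attachment’s bounds, Metal’s API validation reports an error.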

Alpha blending

Alpha blending is different from alpha testing in that the latter only works with total transparency; in that case, all you have to do is discard fragments. For translucent or partially transparent objects, discarding fragments is no longer the solution, because you want the fragment color to contribute a certain amount to the existing framebuffer color, not simply replace it.
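With the classic source-over configuration you’re about to set up, the GPU computes the new framebuffer color as: source.rgb × source.a + destination.rgb × (1 − source.a), where the source is the incoming fragment and the destination is the color already in the framebuffer.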

var windowTexture: MTLTexture?
var windowPipelineState: MTLRenderPipelineState!
// lazily load the window's plane mesh and color texture
lazy var window: MTKMesh = {
  do {
    let primitive = self.loadModel(name: "plane")!
    let model = try MTKMesh(mesh: primitive,
                            device: device)
    windowTexture = loadTexture(imageName: "windowColor")
    return model
  } catch {
    fatalError("Failed to load the window model")
  }
}()
var windowTransform = Transform()
windowTransform.scale = [2, 2, 2]
windowTransform.position = [0, 3, 4]
windowTransform.rotation = [-Float.pi / 2, 0, 0]

// give the window its own fragment function and pipeline state
let windowFragmentFunction =
  library.makeFunction(name: "fragment_window")
descriptor.fragmentFunction = windowFragmentFunction
descriptor.vertexFunction = lightVertexFunction
windowPipelineState =
  try device.makeRenderPipelineState(descriptor: descriptor)
// render window
renderEncoder.setRenderPipelineState(windowPipelineState)
modelViewProjectionMatrix = camera.projectionMatrix * 
  camera.viewMatrix * windowTransform.matrix
renderEncoder.setVertexBytes(&modelViewProjectionMatrix, 
                     length: MemoryLayout<float4x4>.stride, 
                     index: 1)
renderEncoder.setVertexBuffer(window.vertexBuffers[0].buffer, 
                              offset: 0, index: 0)
renderEncoder.setFragmentTexture(windowTexture, index: 0)
draw(renderEncoder: renderEncoder, model: window)
fragment float4
            fragment_window(VertexOut vertex_in [[stage_in]],
                     texture2d<float> texture [[texture(0)]]) {
  // sample the window texture; its alpha has no effect
  // until blending is enabled
  constexpr sampler s(filter::linear);
  float4 color = texture.sample(s, vertex_in.uv);
  return color;
}

Configure blending on the descriptor’s color attachment before building the window’s pipeline state:

// 1: grab the pipeline descriptor's first color attachment
guard let attachment = descriptor.colorAttachments[0] else { return }
// 2: turn on fixed-function blending
attachment.isBlendingEnabled = true
// 3: add the weighted source and destination colors together
attachment.rgbBlendOperation = .add
// 4: weight the incoming (source) fragment by its alpha
attachment.sourceRGBBlendFactor = .sourceAlpha
// 5: weight the existing (destination) color by one minus the source alpha
attachment.destinationRGBBlendFactor = .oneMinusSourceAlpha

windowPipelineState =
    try device.makeRenderPipelineState(descriptor: descriptor)
As with transparency, add a runtime toggle. First, pass a flag to the fragment shader:

var blendingEnabled = false
renderEncoder.setFragmentBytes(&blendingEnabled,
           length: MemoryLayout<Bool>.size, index: 0)

Then extend KeyCode and the switch in keyDown so B toggles blending:

case b = 0xB
case .b:
  renderer.blendingEnabled.toggle()

Next, add the flag to fragment_window’s signature:

fragment float4
            fragment_window(VertexOut vertex_in [[stage_in]],
                constant bool &blendingEnabled [[buffer(0)]],
                     texture2d<float> texture [[texture(0)]])

And, before returning the color, force half transparency when blending is on:

if (blendingEnabled) {
  color.a = 0.5;
}

Next, add a second window:

var window2Transform = Transform()
window2Transform.scale = [3, 2, 2]
window2Transform.position = [0, 3, 5]
window2Transform.rotation = [-Float.pi / 2, 0, 0]
modelViewProjectionMatrix = camera.projectionMatrix *
  camera.viewMatrix * window2Transform.matrix
renderEncoder.setVertexBytes(&modelViewProjectionMatrix,
                       length: MemoryLayout<float4x4>.stride,
                       index: 1)
draw(renderEncoder: renderEncoder, model: window)

Now change the second window’s transform and run the playground again:

window2Transform.scale = [2, 2, 2]
window2Transform.position = [0, 2.75, 3]
window2Transform.rotation = [-Float.pi / 2, 0, 0]
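Keep in mind that blending reads whatever is already in the framebuffer, so draw order matters for translucent surfaces: to composite correctly, render opaque objects first, then translucent ones sorted back to front. Try swapping which window you draw first to see the difference.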

Antialiasing

Often, rendered models show slightly jagged edges that become visible when you zoom in. This is called aliasing, and it’s caused by the rasterizer when generating the fragments. If you look at the edge of a triangle, or any straight line (especially one with a slope), you’ll notice the line doesn’t always go precisely through the center of a pixel; some pixels are colored above the line and some below it.
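Multisample antialiasing (MSAA) reduces these artifacts by taking several coverage samples per pixel and resolving them to a single color, so edge pixels end up partially shaded. In Metal, you opt in by setting a sample count greater than 1 on both the view and the render pipeline state, which is exactly what the following code does with a count of 4.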

var antialiasingEnabled = false
var treePipelineStateAA: MTLRenderPipelineState!
var windowPipelineStateAA: MTLRenderPipelineState!

// build a second, multisampled pipeline state for the tree...
treePipelineState =
  try device.makeRenderPipelineState(descriptor: descriptor)
descriptor.sampleCount = 4
treePipelineStateAA =
  try device.makeRenderPipelineState(descriptor: descriptor)
descriptor.sampleCount = 1

// ...and another for the window
windowPipelineState =
  try device.makeRenderPipelineState(descriptor: descriptor)
descriptor.sampleCount = 4
windowPipelineStateAA =
  try device.makeRenderPipelineState(descriptor: descriptor)
descriptor.sampleCount = 1

// the view's sample count must match the active pipeline's
view.sampleCount = antialiasingEnabled ? 4 : 1

// replace renderEncoder.setRenderPipelineState(treePipelineState) with:
var aaPipelineState = antialiasingEnabled ?
                        treePipelineStateAA! : treePipelineState!
renderEncoder.setRenderPipelineState(aaPipelineState)

// replace renderEncoder.setRenderPipelineState(windowPipelineState) with:
aaPipelineState = antialiasingEnabled ?
                    windowPipelineStateAA! : windowPipelineState!
renderEncoder.setRenderPipelineState(aaPipelineState)

Extend KeyCode and the switch so A toggles antialiasing:

case a = 0x0
case .a:
  renderer.antialiasingEnabled.toggle()

Fog

If you still haven’t had enough fun in this chapter, why don’t you add some fog to the scene to make it even more interesting?
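The fog function you’re about to write computes an exponential fog factor, fog = 1 − clamp(e^(−density · distance), 0, 1), so the farther a fragment is from the camera, the more its color blends toward a white fog color.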

var fogEnabled = false
renderEncoder.setFragmentBytes(&fogEnabled,
                               length: MemoryLayout<Bool>.size,
                               index: 1)

Extend KeyCode and the switch so F toggles fog:

case f = 0x3
case .f:
  renderer.fogEnabled.toggle()

Add the flag to the fragment function’s parameter list:

constant bool &fogEnabled [[buffer(1)]]

Then add the fog function itself:

float4 fog(float4 position, float4 color) {
  // 1: the fragment's depth
  float distance = position.z / position.w;
  // 2: exponential fog factor that grows with distance
  float density = 0.2;
  float fog = 1.0 - clamp(exp(-density * distance), 0.0, 1.0);
  // 3: blend the fragment color toward the white fog color
  float4 fogColor = float4(1.0);
  color = mix(color, fogColor, fog);
  return color;
}

And apply it before returning the final color:

if (fogEnabled) {
  color = fog(vertex_in.position, color);
}
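With all four toggles in place, you can press T, B, A and F while the playground runs to turn transparency, blending, antialiasing and fog on and off.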

Challenge

I hope you had a blast playing with various fragment processing techniques. I know I did!

Where to go from here?

In this chapter, you only looked into fixed-function blending and antialiasing. Per-fragment or per-sample programmable blending is possible by using the [[color(n)]] attribute, which identifies a color attachment, on an argument to the fragment function.
