Metal by Tutorials

Second Edition · iOS 13 · Swift 5.1 · Xcode 11


6. Textures
Written by Caroline Begbie


Now that you have light in your scene, the next step is to add color. The easiest way to add fine details to a model is to use an image texture. Using textures, you can make your models come to life!

In this chapter, you’ll learn about:

  • UV coordinates: How to unwrap a mesh so that you can apply a texture to it.
  • Texturing a model: How to read the texture in a fragment shader.
  • Samplers: Different ways you can read (sample) a texture.
  • Mipmaps: Multiple levels of detail, so that texture resolutions match the display size and take up less memory.
  • The asset catalog: How to organize your textures.

Textures and UV maps

The following image shows a simple house model with twelve vertices. So you can experiment, the Blender and .obj files are included in the Resources ▸ LowPolyHouse folder for this chapter. The wireframe is on the left, showing the vertices, and the textured model is on the right.

The house started as a cube but has four extra vertices, two of which raise the roof.

To texture the model, you first have to flatten it using a process called UV unwrapping. With UV unwrapping, you create a UV map by unfolding the model; this unfolding can be done by marking and cutting seams using your modeling app. The following image is the result of UV unwrapping the house in Blender and exporting the UV map:

The walls of the house have been marked as seams so they can lie flat, and the roof has been separated out with seams of its own. If you print this UV map on paper and cut it out, you can fold it back into the house model. In Blender, you have complete control over where the seams are and how to cut up your mesh. Blender automatically unwraps the model by cutting the mesh at these seams, but if necessary, you can also move every vertex in the UV Unwrap window to suit your texture.

Now that you have a flattened map, you can paint on it by using the UV map exported from Blender as a guide. This is the house texture made in Photoshop. It was created by cutting up a photo of a real house.

Note how the edges of the texture aren’t perfect, and there’s a copyright message. In the spaces where there are no vertices in the map, you can put whatever you want, as it won’t show up on the model. It’s a good idea not to match the UV edges exactly, but to let the color bleed past them: floating point sampling near a seam isn’t perfectly precise, and without that bleed, the GPU can pick up whatever color lies just outside the UV island.

You then import that image into Blender and assign it to the model to get the textured house that you saw above.

When you export a UV mapped model to an .obj file, Blender adds the UV coordinates to the file. Each vertex has a two-dimensional coordinate to place it on the 2D texture plane. The top-left is (0, 1) and the bottom-right is (1, 0).

The following diagram indicates some of the house vertices, with the matching coordinates from the .obj file. You can look at the contents of the .obj file using TextEdit.
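If you do, the texture coordinates appear as vt lines, and each corner of a face indexes a position, a UV coordinate and a normal. The values below are illustrative rather than copied from the house file:

vt 0.625000 0.750000   # a UV coordinate: u, then v
vt 0.875000 0.750000
f 1/1/1 2/2/1 3/3/1    # each face corner: position/uv/normal indices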

One of the advantages of mapping from 0 to 1 is that you can swap in lower or higher resolution textures. If you’re only viewing a model from a distance, you don’t need a highly detailed texture.

The house is easy to unwrap, but imagine how complex unwrapping curved surfaces might be. This is the UV map of the train (which is still a simple model):

Photoshop, naturally, is not the only solution for texturing a model. You can use any image editor for painting on a flat texture. In the last few years, several other apps that allow painting directly on the model have become mainstream:

  • Blender (free)
  • Substance Designer and Substance Painter by Adobe ($$): In Designer, you can create complex materials procedurally. Using Painter, you can paint these materials on the model. The yellow house you’ll encounter in the next chapter was textured in Substance Painter.
  • 3DCoat by 3Dcoat.com ($$)
  • Mudbox by Autodesk ($$)
  • Mari by Foundry ($$$)

In addition to texturing, Blender, 3DCoat and Mudbox let you sculpt models in a similar fashion to ZBrush and create low poly models from the high poly sculpt.

As you’ll find out in the next chapter, color is not the only texture you can paint using these apps, so having a specialized texturing app is invaluable.

Texture the model

Open up the starter project for this chapter. The code is almost the same as the challenge project from the previous chapter, except that the scene lighting is refactored into a new Lighting struct, and the light debugging code is gone. The initial scene contains the house model you’ve already been introduced to, with a background color more appropriate to a pastoral scene.

1. Add UV coordinates to the vertex descriptor

As you learned previously, when you unwrap a model in Blender (or the modeling app of your choice), it saves the UV coordinates with the model. To load these into your app, Model I/O needs to have a texture coordinate attribute set up in the vertex descriptor.

// In Common.h, add a UV case to the vertex attribute enum:
typedef enum {
  Position = 0,
  Normal = 1,
  UV = 2
} Attributes;

// Then, in the vertex descriptor, tell Model I/O where to place the
// texture coordinates, and account for them in the offset:
vertexDescriptor.attributes[Int(UV.rawValue)] =
    MDLVertexAttribute(name: MDLVertexAttributeTextureCoordinate,
                       format: .float2,
                       offset: offset,
                       bufferIndex: Int(BufferIndexVertices.rawValue))
offset += MemoryLayout<float2>.stride

2. Update the shader attributes

In Shaders.metal, the vertex function’s vertexIn parameter uses the stage_in attribute, which relies on the vertex descriptor layout. Simply by updating the VertexIn struct with the new texture coordinate attribute, the vertex function will read in the texture coordinate data.

// In the VertexIn struct, add the new attribute:
float2 uv [[attribute(UV)]];
// In VertexOut, add a member to carry it to the rasterizer:
float2 uv;
// And in the vertex function, pass the coordinate through:
.uv = vertexIn.uv
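For reference, here’s roughly how those three lines slot into Shaders.metal. This is a sketch: the other members of VertexIn and VertexOut come from the previous chapters, and yours may differ.

struct VertexIn {
  float4 position [[attribute(Position)]];
  float3 normal [[attribute(Normal)]];
  float2 uv [[attribute(UV)]];
};

struct VertexOut {
  float4 position [[position]];
  float3 worldPosition;
  float3 worldNormal;
  float2 uv;   // the new interpolated texture coordinate
};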

3. Load the image

Each submesh of a model’s mesh has a different material characteristic. In the next chapter, you’ll use a model that has a submesh for each unique color. For textured models, each submesh will contain a reference to a unique texture.

import MetalKit

protocol Texturable {}

extension Texturable {
  static func loadTexture(imageName: String) throws -> MTLTexture? {
    // 1 - the texture loader creates MTLTextures from image files
    let textureLoader = MTKTextureLoader(device: Renderer.device)

    // 2 - match the texture origin to the bottom-left UV origin
    let textureLoaderOptions: [MTKTextureLoader.Option: Any] =
                [.origin: MTKTextureLoader.Origin.bottomLeft]
    // 3 - default to a png extension when the name doesn't include one
    let fileExtension =
      URL(fileURLWithPath: imageName).pathExtension.isEmpty ?
        "png" : nil
    // 4 - locate the image file in the main bundle
    guard let url = Bundle.main.url(forResource: imageName,
                                    withExtension: fileExtension)
      else {
        print("Failed to load \(imageName)")
        return nil
    }

    let texture =
      try textureLoader.newTexture(URL: url,
                                   options: textureLoaderOptions)
    print("loaded texture: \(url.lastPathComponent)")
    return texture
  }
}
// Conform Submesh to the new protocol:
extension Submesh: Texturable {}

// In Submesh.swift, group the submesh's textures together:
struct Textures {
  let baseColor: MTLTexture?
}

// Add a property to Submesh to hold them:
let textures: Textures

// Initialize the textures from the submesh's Model I/O material:
private extension Submesh.Textures {
  init(material: MDLMaterial?) {
    func property(with semantic: MDLMaterialSemantic)
          -> MTLTexture? {
      guard let property = material?.property(with: semantic),
        property.type == .string,
        let filename = property.stringValue,
        let texture =
            try? Submesh.loadTexture(imageName: filename)
      else { return nil }
      return texture
    }
    baseColor = property(with: MDLMaterialSemantic.baseColor)
  }
}

// Finally, in Submesh's initializer, load the textures:
textures = Textures(material: mdlSubmesh.material)

4. Pass the loaded texture to the fragment function

In the next chapter, you’ll learn about several other texture types and how to send them to the fragment function using different indices. So in Common.h, set up a new enum to keep track of these texture buffer index numbers:

// In Common.h:
typedef enum {
  BaseColorTexture = 0
} Textures;

// In draw(in:), bind each submesh's texture before drawing it:
renderEncoder.setFragmentTexture(submesh.textures.baseColor,
                       index: Int(BaseColorTexture.rawValue))
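For context, this call belongs inside the per-submesh loop in draw(in:), just before the draw call. The sketch below assumes the render loop structure from the earlier chapters; the property names may differ slightly in your starter project:

for mesh in model.meshes {
  for submesh in mesh.submeshes {
    // bind the submesh's base color texture before drawing it
    renderEncoder.setFragmentTexture(submesh.textures.baseColor,
                           index: Int(BaseColorTexture.rawValue))
    let mtkSubmesh = submesh.mtkSubmesh
    renderEncoder.drawIndexedPrimitives(
      type: .triangle,
      indexCount: mtkSubmesh.indexCount,
      indexType: mtkSubmesh.indexType,
      indexBuffer: mtkSubmesh.indexBuffer.buffer,
      indexBufferOffset: mtkSubmesh.indexBuffer.offset)
  }
}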

5. Update the fragment function

You’ll now change the fragment function to accept the texture and read from it.

// Add the texture as a parameter of the fragment function:
texture2d<float> baseColorTexture [[texture(BaseColorTexture)]],
// Declare a default sampler inside the function:
constexpr sampler textureSampler;
// Then replace the hard-coded white base color:
float3 baseColor = float3(1, 1, 1);
// with a color sampled from the texture at the interpolated UVs:
float3 baseColor = baseColorTexture.sample(textureSampler, 
                                           in.uv).rgb;
return float4(baseColor, 1);
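Put together, the fragment function now reads roughly like this. This is a sketch: the lighting parameters and calculations from the previous chapter are elided, so match it against your own Shaders.metal rather than copying it verbatim.

fragment float4 fragment_main(
  VertexOut in [[stage_in]],
  texture2d<float> baseColorTexture [[texture(BaseColorTexture)]])
{
  constexpr sampler textureSampler;
  float3 baseColor = baseColorTexture.sample(textureSampler,
                                             in.uv).rgb;
  // ... lighting calculations from the previous chapter go here ...
  return float4(baseColor, 1);
}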

sRGB color space

You’ll notice that the rendered texture looks much darker than the original image. That’s because the image file stores its colors in the nonlinear sRGB color space, which encodes a linear color value roughly as:

sRGBcolor = pow(linearColor, 1.0/2.2);

When the loader creates the texture in an sRGB pixel format, the GPU converts each sample back to linear space as you read it, but nothing converts the final color to sRGB again for display, so the render comes out darker. The simplest fix is to tell the texture loader not to treat the image as sRGB. In loadTexture(imageName:), change:

let textureLoaderOptions: [MTKTextureLoader.Option: Any] = 
  [.origin: MTKTextureLoader.Origin.bottomLeft]

to:

let textureLoaderOptions: [MTKTextureLoader.Option: Any] = 
  [.origin: MTKTextureLoader.Origin.bottomLeft, .SRGB: false]
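As a worked example of that curve: a mid-gray sRGB value of 0.5 decodes to pow(0.5, 2.2) ≈ 0.22 in linear space. If that linear value then reaches the screen without being re-encoded, a texel that should look half-bright displays at roughly a fifth of full brightness, which is exactly the darkening you saw.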

GPU frame capture

There’s an easy way to find out what format your texture is in on the GPU, and also to look at all the other Metal buffers currently residing there: the GPU frame capture tool (also called the GPU Debugger).

Samplers

When sampling your texture just now, you used a default sampler. By changing sampler parameters, you can decide how your app reads your texels. You’ll now add a ground plane to your scene to see how you can control the appearance of the ground texture.

let ground = Model(name: "plane.obj")
ground.scale = [40, 40, 40]
models.append(ground)

// The default sampler picks the nearest texel:
constexpr sampler textureSampler;
// Change it to filter linearly between neighboring texels:
constexpr sampler textureSampler(filter::linear);

// Repeat the texture beyond the 0 to 1 range, and tile the UV
// coordinates 16 times across the plane:
constexpr sampler textureSampler(filter::linear, 
                                 address::repeat);
float3 baseColor = baseColorTexture.sample(textureSampler, 
                                           in.uv * 16).rgb;

// In Model.swift, give each model a tiling property:
var tiling: UInt32 = 1

// In Common.h, add a matching field to FragmentUniforms:
uint tiling;

// When setting up the scene, tile the ground texture 16 times:
ground.tiling = 16

// In draw(in:), copy the value into the fragment uniforms and
// send them to the GPU:
fragmentUniforms.tiling = model.tiling
renderEncoder.setFragmentBytes(&fragmentUniforms,
           length: MemoryLayout<FragmentUniforms>.stride,
           index: Int(BufferIndexFragmentUniforms.rawValue))

// Finally, in the fragment function, replace the hard-coded 16:
float3 baseColor = baseColorTexture.sample(textureSampler, 
                         in.uv * fragmentUniforms.tiling).rgb;
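Driving the tiling factor through FragmentUniforms, rather than hard-coding 16 in the shader, means each model chooses its own value: the ground tiles 16 times while the house keeps the default of 1.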

Metal API sampler states

Creating a sampler in the shader is not the only option. Instead, you’re going to create an MTLSamplerState in the Metal API and hold it with the model. You’ll then send the sampler state to the fragment function.

// In Model.swift, store a sampler state with the model:
let samplerState: MTLSamplerState?

private static func buildSamplerState() -> MTLSamplerState? {
  let descriptor = MTLSamplerDescriptor()
  // repeat the texture in both the s (u) and t (v) directions
  descriptor.sAddressMode = .repeat
  descriptor.tAddressMode = .repeat
  let samplerState = 
       Renderer.device.makeSamplerState(descriptor: descriptor)
  return samplerState
}

// In Model's initializer:
samplerState = Model.buildSamplerState()

// In draw(in:), send the sampler state to the fragment function:
renderEncoder.setFragmentSamplerState(model.samplerState, 
                                      index: 0)

// In the fragment function, receive the sampler as a parameter:
sampler textureSampler [[sampler(0)]],
// and remove the constexpr sampler that it replaces:
constexpr sampler textureSampler(filter::linear, 
                                 address::repeat);
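The benefit of building the sampler through the Metal API is that sampler settings become runtime data: each model can carry its own filtering and addressing modes without needing a separate shader variant for every combination.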

Mipmaps

Check out the relative sizes of the roof texture and how it appears on the screen: the texture is far larger than the handful of pixels it covers, so the GPU has to skip over most of the texels when sampling, which wastes memory bandwidth and produces shimmering artifacts. Mipmaps solve this: they’re a chain of pre-filtered copies of the texture, each half the size of the previous one, and the GPU picks whichever level best matches the on-screen size.

// Ask the texture loader to generate the full mipmap chain on load:
let textureLoaderOptions: [MTKTextureLoader.Option: Any] =
      [.origin: MTKTextureLoader.Origin.bottomLeft,
       .SRGB: false,
       .generateMipmaps: NSNumber(booleanLiteral: true)]

// In buildSamplerState(), blend between the two nearest mip levels:
descriptor.mipFilter = .linear

// Optionally, sharpen surfaces seen at glancing angles by taking up
// to eight anisotropic samples per pixel:
descriptor.maxAnisotropy = 8
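Each mip level is half the size of the one before it, all the way down to 1×1, so the number of levels a texture needs follows directly from its dimensions. This helper isn’t part of the starter project; it’s just a sketch of the arithmetic:

import Foundation

// Hypothetical helper: the number of mip levels needed for a
// texture of the given size (not part of the starter project).
func mipLevelCount(width: Int, height: Int) -> Int {
  Int(log2(Double(max(width, height)))) + 1
}

// A 1024×1024 texture needs 11 levels: 1024, 512, 256, ..., 2, 1.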

The asset catalog

As its name suggests, the asset catalog can hold all of your assets, whether they be data, images, textures or even colors. You’ve probably used the catalog for app icons and images. Textures differ from images in that the GPU uses them, and thus they have different attributes in the catalog. To create textures, you add a new texture set to the asset catalog.

print("Failed to load \(imageName)")
return nil
print(
  "Failed to load \(imageName)\n - loading from Assets Catalog")
return try textureLoader.newTexture(name: imageName, 
                                    scaleFactor: 1.0,
                                    bundle: Bundle.main, 
                                    options: nil)
map_Kd ground.png
#map_Kd ground.png
map_Kd grass

The right texture for the right job

Using asset catalogs gives you complete control over how to deliver your textures. Currently, you only have two color textures. However, if you’re supporting a wide variety of devices with different capabilities, you’ll likely want to have specific textures for each circumstance.

Texture compression

In recent years, much effort has gone into compressing textures to save both CPU and GPU memory. There are various formats you can use, such as ETC and PVRTC. Apple has embraced ASTC as the highest-quality compressed format; it’s available on the A8 chip and newer.
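Because ASTC isn’t available everywhere, it’s worth checking for it at runtime. On iOS 13, this book’s target, MTLDevice exposes GPU family queries, and the .apple2 family corresponds to the A8 chip:

// ASTC pixel formats require GPU family .apple2 (the A8) or newer.
let supportsASTC = Renderer.device.supportsFamily(.apple2)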

Where to go from here?

In this chapter, you found out how to wrap a model in a texture, how to sample that texture in a shader and how to enhance your renders using mipmaps. You also learned how to use the invaluable GPU frame capture tool, which is great for inspecting what’s happening on the GPU and checking that your shaders are doing what you expect.

Have a technical question? Want to report a bug? You can ask questions and report bugs to the book authors in our official book forum here.
© 2024 Kodeco Inc.
