OpenGL ES 2.0 for iPhone Tutorial Part 2: Textures


In this tutorial series, our aim is to take the mystery and difficulty out of OpenGL ES 2.0, by giving you hands-on experience using it from the ground up!

In the first part of the series, we covered the basics of initializing OpenGL, creating some simple vertex and fragment shaders, and presenting a simple rotating cube to the screen.

In this part of the tutorial series, we’re going to take things to the next level by adding some textures to our cube!

Caveat: I am not an OpenGL expert! I am learning this myself, and am writing tutorials as I go. If I make any boneheaded mistakes, feel free to chime in with corrections or insights! :]

OK, let’s dive into OpenGL ES 2.0 textures!

Getting Started

If you don’t have it already, download a copy of the sample project where we left it off in the previous tutorial.

Compile and run the project, and you’ll see a rotating cube:

Rotating Cube with OpenGL ES 2.0 Example

Right now the cube looks green and red because we colored the vertices that way – it doesn’t have any texture applied.

But don’t worry – that’s exactly what we’re going to do in this tutorial!

Start by downloading these textures made by my lovely wife and unzip the folder. Then drag the folder into your Xcode project, make sure that “Copy items into destination group’s folder” is selected, and click Finish.

You’ll see there are two images – one that looks like a floor tile, and one that looks like a fish. We’ll start by applying the floor tile around each face of the cube.

Reading the Pixel Data

Our first step is to somehow get the image data into OpenGL.

The problem is that OpenGL doesn’t accept images in the form that’s handy for us as programmers (a path to a PNG). Instead, OpenGL requires you to send the image as a buffer of raw pixel data – and you need to specify the exact format.

Luckily, you can get this buffer of pixel data quite easily using some built-in Quartz2D functions. If you’ve read the Core Graphics 101 tutorial series, many of these calls will look familiar.

There are four main steps to get this to work:

  1. Get Core Graphics image reference. Since we’re going to use Core Graphics to write out the raw pixel data, we need a reference to the image! This is quite simple – UIImage has a CGImageRef property we can use.
  2. Create Core Graphics bitmap context. The next step is to create a Core Graphics bitmap context, which is a fancy way of saying a buffer in memory to store the raw pixel data.
  3. Draw the image into the context. We can do this with a simple Core Graphics function call – and then the buffer will contain raw pixel data!
  4. Send the pixel data to OpenGL. To do this, we need to create an OpenGL texture object and get its unique ID (called its “name”), and then we use a function call to pass the pixel data to OpenGL.

OK, so let’s see what this looks like in code. In OpenGLView.m, add a new method right above initWithFrame:

- (GLuint)setupTexture:(NSString *)fileName {    
    // 1
    CGImageRef spriteImage = [UIImage imageNamed:fileName].CGImage;
    if (!spriteImage) {
        NSLog(@"Failed to load image %@", fileName);
        exit(1);
    }
    
    // 2
    size_t width = CGImageGetWidth(spriteImage);
    size_t height = CGImageGetHeight(spriteImage);
    
    GLubyte * spriteData = (GLubyte *) calloc(width*height*4, sizeof(GLubyte));
    
    CGContextRef spriteContext = CGBitmapContextCreate(spriteData, width, height, 8, width*4, 
        CGImageGetColorSpace(spriteImage), kCGImageAlphaPremultipliedLast);    
    
    // 3
    CGContextDrawImage(spriteContext, CGRectMake(0, 0, width, height), spriteImage);
    
    CGContextRelease(spriteContext);
    
    // 4
    GLuint texName;
    glGenTextures(1, &texName);
    glBindTexture(GL_TEXTURE_2D, texName);
    
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, spriteData);
    
    free(spriteData);        
    return texName;    
}

There’s a lot of code here, so let’s go over it section by section.

1) Get Core Graphics image reference. As you can see this is the simplest step. We just use the UIImage imageNamed initializer I’m sure you’ve seen many times, and then access its CGImage property.

2) Create Core Graphics bitmap context. To create a bitmap context, you have to allocate space for it yourself. Here we use some function calls to get the width and height of the image, and then allocate width*height*4 bytes.

“Why times 4?” you may wonder. When we call the method to draw the image data, it will write one byte each for red, green, blue, and alpha – so 4 bytes in total.

“Why 1 byte each?” you may wonder. Well, we tell Core Graphics to do this when we set up the context. The fourth parameter to CGBitmapContextCreate is the bits per component, and we set this to 8 bits (1 byte).
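To make the buffer math concrete, here’s a small sketch in plain C. These helper names (`bytesPerRow`, `bufferSize`, `pixelOffset`) are hypothetical – they aren’t part of the tutorial’s code – but they compute exactly the sizes described above for an RGBA image with 8 bits per component:

```c
#include <stddef.h>

/* For an RGBA image with 8 bits (1 byte) per component, each pixel
   takes 4 bytes, so a row is width * 4 bytes (the fifth parameter to
   CGBitmapContextCreate) and the whole buffer is width * height * 4
   bytes (the amount we calloc). */
static size_t bytesPerRow(size_t width) {
    return width * 4;
}

static size_t bufferSize(size_t width, size_t height) {
    return width * height * 4;
}

/* Byte offset of the red component of pixel (x, y) in the buffer;
   green, blue, and alpha follow at the next three bytes. */
static size_t pixelOffset(size_t x, size_t y, size_t width) {
    return y * bytesPerRow(width) + x * 4;
}
```

For a 64×64 texture, that works out to 256 bytes per row and a 16,384-byte buffer.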

3) Draw the image into the context. This is also a pretty simple step – we just tell Core Graphics to draw the image at the specified rectangle. Since we’re done with the context at this point, we can release it.

4) Send the pixel data to OpenGL. We first need to call glGenTextures to create a texture object and give us its unique ID (called “name”).

We then call glBindTexture to load our new texture name into the current texture unit.

The next step is to set a texture parameter for our texture, using glTexParameteri. Here we’re setting the GL_TEXTURE_MIN_FILTER (the case where we have to shrink the texture for far away objects) to GL_NEAREST (when drawing a vertex, choose the closest corresponding texture pixel).

Another easy way to think of GL_NEAREST is “pixel art-like” while GL_LINEAR is “smooth”.

Note: Setting the GL_TEXTURE_MIN_FILTER is actually required if you aren’t using mipmaps (like this case!) I didn’t know this at first and didn’t include this line, and nothing showed up. I found out later on that this is actually listed in the OpenGL common mistakes – d’oh!

The final step is to send the pixel data buffer we created earlier over to OpenGL with glTexImage2D. When you call this function, you specify the format of the pixel data you send in. Here we specify GL_RGBA and GL_UNSIGNED_BYTE to say “there’s 1 byte for red, green, blue, and alpha.”

OpenGL supports other pixel formats if you’d like (this is how the Cocos2D pixel formats work). But for this tutorial, we’ll stick with this simple case.

Once we’ve sent the image data to OpenGL, we can deallocate the pixel buffer – we don’t need it anymore because OpenGL is storing the texture in the GPU. We finish by returning the texture name, which we’ll need to refer to it later when drawing.

Using the Texture Data

Now that we have a helper method to load an image, send it to OpenGL, and return to us its texture name, let’s make use of this to skin our cube.

Let’s start with our vertex and fragment shaders. Open up SimpleVertex.glsl and replace it with the following:

attribute vec4 Position; 
attribute vec4 SourceColor; 

varying vec4 DestinationColor; 

uniform mat4 Projection;
uniform mat4 Modelview;

attribute vec2 TexCoordIn; // New
varying vec2 TexCoordOut; // New

void main(void) { 
    DestinationColor = SourceColor; 
    gl_Position = Projection * Modelview * Position;
    TexCoordOut = TexCoordIn; // New
}

Here we declare a new attribute called TexCoordIn. Remember that an attribute is a value that you can set for each vertex. So for each vertex, we’ll specify the coordinate on the texture that it should map to.

Texture Coordinates are kind of weird in that they’re always in the range of 0-1. So (0,0) would be the bottom left of the texture, and (1,1) would be the upper right of the texture.

Or it would be, but Core Graphics flips images when you load them in. So in the code here (0,1) is actually the bottom left and (0, 0) is the top left, oddly enough.

We also declare a new varying called TexCoordOut, and set it to TexCoordIn. Remember that a varying is a value that OpenGL will automatically interpolate for us by the time it gets to the fragment shader. So for example if we set the bottom left corner of a square we’re texturing to (0,0) and the bottom right to (1, 0), if we’re rendering the pixel in-between on the bottom, our fragment shader will be automatically passed (0.5, 0).
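The interpolation OpenGL performs on a varying is just a linear blend along the primitive. Here’s a tiny C sketch of that idea (the `TexCoord` struct and `interpolate` function are illustrative, not part of the project):

```c
/* At parameter t (0 at the first vertex, 1 at the second), a varying
   arrives at the fragment shader as a linear blend of the two vertex
   values. With TexCoordOut = (0,0) at one corner and (1,0) at the
   other, the midpoint fragment receives (0.5, 0). */
typedef struct { float u, v; } TexCoord;

static TexCoord interpolate(TexCoord a, TexCoord b, float t) {
    TexCoord r = { a.u + (b.u - a.u) * t,
                   a.v + (b.v - a.v) * t };
    return r;
}
```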

Next replace SimpleFragment.glsl with the following:

varying lowp vec4 DestinationColor;

varying lowp vec2 TexCoordOut; // New
uniform sampler2D Texture; // New

void main(void) {
    gl_FragColor = DestinationColor * texture2D(Texture, TexCoordOut); // New
}

We used to set the output color to just the destination color – now we multiply the color by whatever is in the texture at the specified coordinate. texture2D is a built-in GLSL function that samples a texture for us.

Now that our new shaders are ready to go, let’s make use of them! Open up OpenGLView.h and add the following new instance variables:

GLuint _floorTexture;
GLuint _fishTexture;
GLuint _texCoordSlot;
GLuint _textureUniform;

These will keep track of the texture names for our two textures, the new input attribute slot, and the new texture uniform slot.

Then open up OpenGLView.m and make the following changes:

// Add texture coordinates to Vertex structure as follows
typedef struct {
    float Position[3];
    float Color[4];
    float TexCoord[2]; // New
} Vertex;

// Add texture coordinates to Vertices as follows
const Vertex Vertices[] = {
    {{1, -1, 0}, {1, 0, 0, 1}, {1, 0}},
    {{1, 1, 0}, {1, 0, 0, 1}, {1, 1}},
    {{-1, 1, 0}, {0, 1, 0, 1}, {0, 1}},
    {{-1, -1, 0}, {0, 1, 0, 1}, {0, 0}},
    {{1, -1, -1}, {1, 0, 0, 1}, {1, 0}},
    {{1, 1, -1}, {1, 0, 0, 1}, {1, 1}},
    {{-1, 1, -1}, {0, 1, 0, 1}, {0, 1}},
    {{-1, -1, -1}, {0, 1, 0, 1}, {0, 0}}
};

// Add to end of compileShaders
_texCoordSlot = glGetAttribLocation(programHandle, "TexCoordIn");
glEnableVertexAttribArray(_texCoordSlot);
_textureUniform = glGetUniformLocation(programHandle, "Texture");

// Add to end of initWithFrame
_floorTexture = [self setupTexture:@"tile_floor.png"];
_fishTexture = [self setupTexture:@"item_powerup_fish.png"];

// Add inside render:, right before glDrawElements
glVertexAttribPointer(_texCoordSlot, 2, GL_FLOAT, GL_FALSE, 
    sizeof(Vertex), (GLvoid*) (sizeof(float) * 7));    
glActiveTexture(GL_TEXTURE0); 
glBindTexture(GL_TEXTURE_2D, _floorTexture);
glUniform1i(_textureUniform, 0);

Based on what we covered in the tutorial series so far, most of this should be straightforward.

The only thing worth mentioning in more detail is the final three lines. This is how we set the “Texture” uniform we defined in our fragment shader to the texture we loaded in code.

First, we activate the texture unit we want to load our texture into. On iOS, we’re guaranteed to have at least 2 texture units, and most of the time 8. This can be good if you need to perform computations on more than one texture at a time. However, for this tutorial, we don’t really need to use more than one texture unit at a time, so we’ll just stick with the first texture unit (GL_TEXTURE0).

We then bind the texture into the current texture unit (GL_TEXTURE0). Finally, we set the texture uniform to the index of the texture unit it’s in (0).

Note that lines 1 and 3 aren’t strictly necessary, and a lot of times you’ll see code that doesn’t even include those lines. This is because it’s assuming GL_TEXTURE0 is already the active texture unit, and doesn’t bother setting the uniform because it defaults to 0 anyway. However, I’m including the lines here because I think it makes it a lot easier to understand for beginners.

Compile and run the code, and you’ll see a textured cube!

Reused vertices with same texture coordinates cause strange stretching

Well… sort of. The front of the cube looks good, but the sides of the cube look wonky and stretched – what’s going on with that?

Fixing the Stretch Effect

The problem is that we’re currently setting just one texture coordinate per vertex, and reusing vertices across faces.

For example, we map the bottom left corner of the front face to (0,0). But on the left side, that same vertex is the bottom right, so it doesn’t make sense to have a (0,0) texture coordinate – it should be (1,0).

In OpenGL, you can’t think of a vertex as just its vertex coordinates – it’s the unique combination of its coordinate, color, texture coordinate, and anything else you have in your structure.

Go ahead and replace your Vertices and Indices arrays with the following, which make a different vertex/color/texture coord combo for each face to ensure the texture coordinates map properly:

#define TEX_COORD_MAX   1

const Vertex Vertices[] = {
    // Front
    {{1, -1, 0}, {1, 0, 0, 1}, {TEX_COORD_MAX, 0}},
    {{1, 1, 0}, {0, 1, 0, 1}, {TEX_COORD_MAX, TEX_COORD_MAX}},
    {{-1, 1, 0}, {0, 0, 1, 1}, {0, TEX_COORD_MAX}},
    {{-1, -1, 0}, {0, 0, 0, 1}, {0, 0}},
    // Back
    {{1, 1, -2}, {1, 0, 0, 1}, {TEX_COORD_MAX, 0}},
    {{-1, -1, -2}, {0, 1, 0, 1}, {TEX_COORD_MAX, TEX_COORD_MAX}},
    {{1, -1, -2}, {0, 0, 1, 1}, {0, TEX_COORD_MAX}},
    {{-1, 1, -2}, {0, 0, 0, 1}, {0, 0}},
    // Left
    {{-1, -1, 0}, {1, 0, 0, 1}, {TEX_COORD_MAX, 0}}, 
    {{-1, 1, 0}, {0, 1, 0, 1}, {TEX_COORD_MAX, TEX_COORD_MAX}},
    {{-1, 1, -2}, {0, 0, 1, 1}, {0, TEX_COORD_MAX}},
    {{-1, -1, -2}, {0, 0, 0, 1}, {0, 0}},
    // Right
    {{1, -1, -2}, {1, 0, 0, 1}, {TEX_COORD_MAX, 0}},
    {{1, 1, -2}, {0, 1, 0, 1}, {TEX_COORD_MAX, TEX_COORD_MAX}},
    {{1, 1, 0}, {0, 0, 1, 1}, {0, TEX_COORD_MAX}},
    {{1, -1, 0}, {0, 0, 0, 1}, {0, 0}},
    // Top
    {{1, 1, 0}, {1, 0, 0, 1}, {TEX_COORD_MAX, 0}},
    {{1, 1, -2}, {0, 1, 0, 1}, {TEX_COORD_MAX, TEX_COORD_MAX}},
    {{-1, 1, -2}, {0, 0, 1, 1}, {0, TEX_COORD_MAX}},
    {{-1, 1, 0}, {0, 0, 0, 1}, {0, 0}},
    // Bottom
    {{1, -1, -2}, {1, 0, 0, 1}, {TEX_COORD_MAX, 0}},
    {{1, -1, 0}, {0, 1, 0, 1}, {TEX_COORD_MAX, TEX_COORD_MAX}},
    {{-1, -1, 0}, {0, 0, 1, 1}, {0, TEX_COORD_MAX}}, 
    {{-1, -1, -2}, {0, 0, 0, 1}, {0, 0}}
};

const GLubyte Indices[] = {
    // Front
    0, 1, 2,
    2, 3, 0,
    // Back
    4, 5, 6,
    4, 5, 7,
    // Left
    8, 9, 10,
    10, 11, 8,
    // Right
    12, 13, 14,
    14, 15, 12,
    // Top
    16, 17, 18,
    18, 19, 16,
    // Bottom
    20, 21, 22,
    22, 23, 20
};

Just like last time, I got these by sketching out the cube on paper and figuring them out – it’s a good exercise if you want to try it yourself.

Note we’re repeating a lot more data than we did last time. I don’t know a smart way around this, while still allowing the texture coordinates to be different (but if someone else knows please chime in!)

Compile and run, and now we have a beautiful looking textured cube!

A beautiful textured cube made with OpenGL ES 2.0

Repeating Textures

In OpenGL, it’s nice and easy to make a texture repeat over and over across a surface if you’d like. The stone tile we’re using happens to be a seamless texture, so let’s try repeating it several times across each face.

Simply make the following change to OpenGLView.m:

#define TEX_COORD_MAX   4

So now we’re mapping each cube face so the bottom left is (0,0) and the upper right is (4,4).

When a texture coordinate goes beyond the 0–1 range, OpenGL wraps it around modulo 1 – for example, a texture coordinate of 1.5 samples the texture the same way 0.5 would.

Compile and run, and you’ll see the texture repeating nicely along the cube!

OpenGL ES 2.0 texture with GL_REPEAT

Note: This works automatically because the default value of GL_TEXTURE_WRAP_S and GL_TEXTURE_WRAP_T are set to GL_REPEAT. If you don’t want textures to repeat like this (maybe you want them clamped to the last pixel value), you can override this behavior with glTexParameteri.

Adding the Decal

We’ll wrap this tutorial up by putting a small fishbone on top of this cube. Why? Because the Grand Cat Dispatch got hungry!

The code to do this is mainly just extra practice for what we’ve done already in this tutorial series. So let’s jump right in.

Open up OpenGLView.h and add the following new instance variables:

GLuint _vertexBuffer;
GLuint _indexBuffer;
GLuint _vertexBuffer2;
GLuint _indexBuffer2;

Before we only had one vertex and index buffer, so when we created it we just bound it as the active buffer and never needed a reference to it. We’re going to need two vertex/index buffers now (one for the cube, and one for the face that will hold the fishbone decal), so now we need some references.

Switch to OpenGLView.m and make the following changes:

// 1) Add to top of file
const Vertex Vertices2[] = {
    {{0.5, -0.5, 0.01}, {1, 1, 1, 1}, {1, 1}},
    {{0.5, 0.5, 0.01}, {1, 1, 1, 1}, {1, 0}},
    {{-0.5, 0.5, 0.01}, {1, 1, 1, 1}, {0, 0}},
    {{-0.5, -0.5, 0.01}, {1, 1, 1, 1}, {0, 1}},
};

const GLubyte Indices2[] = {
    1, 0, 2, 3
};

// 2) Replace setupVBOs with the following
- (void)setupVBOs {
    glGenBuffers(1, &_vertexBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, _vertexBuffer);
    glBufferData(GL_ARRAY_BUFFER, sizeof(Vertices), Vertices, GL_STATIC_DRAW);
    glGenBuffers(1, &_indexBuffer);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _indexBuffer);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(Indices), Indices, GL_STATIC_DRAW);
    glGenBuffers(1, &_vertexBuffer2);
    glBindBuffer(GL_ARRAY_BUFFER, _vertexBuffer2);
    glBufferData(GL_ARRAY_BUFFER, sizeof(Vertices2), Vertices2, GL_STATIC_DRAW);
    glGenBuffers(1, &_indexBuffer2);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _indexBuffer2);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(Indices2), Indices2, GL_STATIC_DRAW);
}

// 3) Add inside render:, right after call to glViewport
glBindBuffer(GL_ARRAY_BUFFER, _vertexBuffer);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _indexBuffer);

// 4) Add to bottom of render:, right before [_context presentRenderbuffer:...]
glBindBuffer(GL_ARRAY_BUFFER, _vertexBuffer2);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _indexBuffer2);

glActiveTexture(GL_TEXTURE0); 
glBindTexture(GL_TEXTURE_2D, _fishTexture);
glUniform1i(_textureUniform, 0); 

glUniformMatrix4fv(_modelViewUniform, 1, 0, modelView.glMatrix);

glVertexAttribPointer(_positionSlot, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), 0);
glVertexAttribPointer(_colorSlot, 4, GL_FLOAT, GL_FALSE, sizeof(Vertex), (GLvoid*) (sizeof(float) * 3));
glVertexAttribPointer(_texCoordSlot, 2, GL_FLOAT, GL_FALSE, sizeof(Vertex), (GLvoid*) (sizeof(float) * 7));

glDrawElements(GL_TRIANGLE_STRIP, sizeof(Indices2)/sizeof(Indices2[0]), GL_UNSIGNED_BYTE, 0);

In the first section, we define a new set of vertices for the rectangle where we’ll draw the fish texture. Note we make it a little bit smaller than the front face, and we also nudge the z coordinate slightly forward (0.01 instead of 0) so it sits just in front of the face. Otherwise, it could be discarded by the depth test.

In the second section, we store the vertex/index buffers we create in the new instance variables rather than local variables. We also create a second vertex/index buffer with our new vertices/indexes for the fish rectangle.

In the third section, we bind the cube vertex/index buffer before drawing it, because we can no longer assume it’s already set.

In the fourth section, we bind the fish rectangle vertex/index buffers, load in the fish texture, and set up all the attributes. Note we draw the triangles with a new method – GL_TRIANGLE_STRIP.

After the first three vertices, GL_TRIANGLE_STRIP makes new triangles by combining the previous two vertices with the next vertex. This can be nice to use because it can reduce the index buffer size. I use it here mainly to show you how it works.
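Here’s a small C sketch of how a strip expands into triangles – `expandStrip` is an illustrative helper, not GL code, and it ignores the winding flip that real OpenGL applies to every other triangle in a strip:

```c
/* GL_TRIANGLE_STRIP consumes indices so that the first triangle uses
   indices 0,1,2 of the strip, and each index after that forms a new
   triangle with the previous two. Expanding our 4-index strip
   {1, 0, 2, 3} yields 2 triangles -- 4 indices instead of the 6 that
   GL_TRIANGLES would need. (Real GL also reverses the winding of
   every other triangle; that detail is ignored here.) */
static int expandStrip(const unsigned char *strip, int count,
                       unsigned char *out /* 3 * (count - 2) entries */) {
    int triangles = count - 2;
    for (int t = 0; t < triangles; t++) {
        out[t * 3 + 0] = strip[t];
        out[t * 3 + 1] = strip[t + 1];
        out[t * 3 + 2] = strip[t + 2];
    }
    return triangles;
}
```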

Compile and run, and it works (sort of):

Textures drawn with blending not enabled - black background

It drew our fish image, but it didn’t nicely blend the fish image with the rest of the drawing going on. To enable this, just add the following two lines to the top of render:

glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_BLEND);

The first line uses glBlendFunc to set the blending algorithm. It’s set to GL_ONE for the source (which means “take all of the source”) and GL_ONE_MINUS_SRC_ALPHA for the destination (which means “take all of the destination except where the source is set”).
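Per color channel, the equation those factors select can be sketched like this in C (`blendChannel` is an illustrative name; this pairing is the standard one for premultiplied alpha, which is what Core Graphics produced for us with kCGImageAlphaPremultipliedLast):

```c
/* The blend equation selected by glBlendFunc(GL_ONE,
   GL_ONE_MINUS_SRC_ALPHA), applied to one color channel:
   result = source * 1 + destination * (1 - source alpha). */
static float blendChannel(float src, float srcAlpha, float dst) {
    return src * 1.0f + dst * (1.0f - srcAlpha);
}
```

A fully transparent source pixel (src 0, alpha 0) leaves the destination untouched, while a fully opaque one (alpha 1) replaces it outright.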

For more discussion on blending modes, check out this tutorial for more information and a pointer to a cool online tool.

The second line enables blending. And that’s it! Compile and run, and now you have a strange textured cube with a strange fish bone on it!

A textured cube with OpenGL ES 2.0

It’s a strange world, the full moon must be out tonight!

Where To Go From Here?

Here is the full sample project we created in the above tutorial.

At this point you’re starting to know some of the most important aspects of using OpenGL ES 2.0 – adding vertices, creating vertex buffer objects, creating shaders, texturing objects, and more!

There’s still a lot more to learn though, so at some point I hope to return to this tutorial series to add some more cool info.

In the meantime, if you want to learn more I recommend iPhone 3D Programming by Philip Rideout – I got started by reading this book!

If you have any questions, suggestions, or tips, please join the forum discussion below!