OpenGL ES 2.0 for iPhone Tutorial Part 2: Textures

By Ray Wenderlich

In this tutorial series, our aim is to take the mystery and difficulty out of OpenGL ES 2.0, by giving you hands-on experience using it from the ground up!

In the first part of the series, we covered the basics of initializing OpenGL, creating some simple vertex and fragment shaders, and presenting a simple rotating cube to the screen.

In this part of the tutorial series, we’re going to take things to the next level by adding some textures to our cube!

Caveat: I am not an OpenGL expert! I am learning this myself, and am writing tutorials as I go. If I make any boneheaded mistakes, feel free to chime in with corrections or insights! :]

OK, let’s dive into OpenGL ES 2.0 textures!

Getting Started

If you don’t have it already, download a copy of the sample project where we left it off in the previous tutorial.

Compile and run the project, and you’ll see a rotating cube:

Rotating Cube with OpenGL ES 2.0 Example

Right now the cube looks green and red because we colored the vertices that way – it doesn’t have any texture applied.

But don’t worry – that’s exactly what we’re going to do in this tutorial!

Start by downloading these textures made by my lovely wife and unzip the folder. Then drag the folder into your Xcode project, make sure that “Copy items into destination group’s folder” is selected, and click Finish.

You’ll see there are two images – one that looks like a floor tile, and one that looks like a fish. We’ll start by applying the floor tile around each face of the cube.

Reading the Pixel Data

Our first step is to get the image data into a form OpenGL can accept.

The problem is that OpenGL doesn’t accept images in the form that’s most convenient for us as programmers (a path to a PNG). Instead, OpenGL requires you to send the image as a buffer of raw pixel data – and you need to specify the exact format.

Luckily, you can get this buffer of pixel data quite easily using some built-in Quartz2D functions. If you’ve read the Core Graphics 101 tutorial series, many of these calls will look familiar.

There are four main steps to get this to work:

  1. Get Core Graphics image reference. Since we’re going to use Core Graphics to write out the raw pixel data, we need a reference to the image! This is quite simple – UIImage has a CGImageRef property we can use.
  2. Create Core Graphics bitmap context. The next step is to create a Core Graphics bitmap context, which is a fancy way of saying a buffer in memory to store the raw pixel data.
  3. Draw the image into the context. We can do this with a simple Core Graphics function call – and then the buffer will contain raw pixel data!
  4. Send the pixel data to OpenGL. To do this, we need to create an OpenGL texture object and get its unique ID (called its “name”), and then we use a function call to pass the pixel data to OpenGL.

OK, so let’s see what this looks like in code. In OpenGLView.m, add a new method right above initWithFrame:

- (GLuint)setupTexture:(NSString *)fileName {    
    // 1
    CGImageRef spriteImage = [UIImage imageNamed:fileName].CGImage;
    if (!spriteImage) {
        NSLog(@"Failed to load image %@", fileName);
        exit(1);
    }
    
    // 2
    size_t width = CGImageGetWidth(spriteImage);
    size_t height = CGImageGetHeight(spriteImage);
    
    GLubyte * spriteData = (GLubyte *) calloc(width*height*4, sizeof(GLubyte));
    
    CGContextRef spriteContext = CGBitmapContextCreate(spriteData, width, height, 8, width*4, 
        CGImageGetColorSpace(spriteImage), kCGImageAlphaPremultipliedLast);    
    
    // 3
    CGContextDrawImage(spriteContext, CGRectMake(0, 0, width, height), spriteImage);
    
    CGContextRelease(spriteContext);
    
    // 4
    GLuint texName;
    glGenTextures(1, &texName);
    glBindTexture(GL_TEXTURE_2D, texName);
    
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST); 
    
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width, (GLsizei)height, 0, GL_RGBA, GL_UNSIGNED_BYTE, spriteData);
    
    free(spriteData);        
    return texName;    
}

There’s a lot of code here, so let’s go over it section by section.

1) Get Core Graphics image reference. As you can see, this is the simplest step. We just use UIImage’s imageNamed: method – which I’m sure you’ve seen many times – and then access its CGImage property.

2) Create Core Graphics bitmap context. To create a bitmap context, you have to allocate space for it yourself. Here we use some function calls to get the width and height of the image, and then allocate width*height*4 bytes.

“Why times 4?” you may wonder. When we call the method to draw the image data, it will write one byte each for red, green, blue, and alpha – so 4 bytes in total.

“Why 1 byte each?” you may wonder. Well, we tell Core Graphics to do this when we set up the context. The fourth parameter to CGBitmapContextCreate is the bits per component, and we set this to 8 bits (1 byte).

3) Draw the image into the context. This is also a pretty simple step – we just tell Core Graphics to draw the image at the specified rectangle. Since we’re done with the context at this point, we can release it.

4) Send the pixel data to OpenGL. We first need to call glGenTextures to create a texture object and give us its unique ID (called “name”).

We then call glBindTexture to bind our new texture name to the GL_TEXTURE_2D target, making it the texture that subsequent texture calls operate on.

The next step is to set a texture parameter for our texture, using glTexParameteri. Here we’re setting the GL_TEXTURE_MIN_FILTER (the case where we have to shrink the texture for far-away objects) to GL_NEAREST (when sampling, pick the single texel closest to the texture coordinate).

Another easy way to think of GL_NEAREST is “pixel art-like” while GL_LINEAR is “smooth”.

Note: Setting the GL_TEXTURE_MIN_FILTER is actually required if you aren’t using mipmaps (as in this case!). I didn’t know this at first and didn’t include this line, and nothing showed up. I found out later that this is actually listed in the OpenGL common mistakes – d’oh!

The final step is to send the pixel data buffer we created earlier over to OpenGL with glTexImage2D. When you call this function, you specify the format of the pixel data you send in. Here we specify GL_RGBA and GL_UNSIGNED_BYTE to say “there’s 1 byte each for red, green, blue, and alpha.”

OpenGL supports other pixel formats if you’d like (this is how the Cocos2D pixel formats work). But for this tutorial, we’ll stick with this simple case.

Once we’ve sent the image data to OpenGL, we can deallocate the pixel buffer – we don’t need it anymore because OpenGL is storing the texture in the GPU. We finish by returning the texture name, which we’ll need to refer to it later when drawing.
