WebGL from Scratch: Textures, part I

My previous posts on model loading might have gotten a little carried away. See, I’m learning as I go here, so breaking free of specifying vertices in literal arrays and pulling them in from models exported from a class-A piece of software like Blender was quite exciting. So, in my haste, I’ve skipped over some fundamental topics, and texturing’s one of them.

In the traditional sense, textures are images mapped onto geometry to enhance realism. A column made out of only a few rectangles can look awesome when wrapped in a cracked marble texture, which offers detail above and beyond what the geometry itself describes. As with lighting, describing geometry with triangles is an approximation, and textures help fool the brain into seeing detail that really isn’t there.

I’m going to start afresh with the code: aside from the borrowed createProgram function (which will be essentially the same in every WebGL program that I write), everything else has been hand-written from a blank text file. Well, that and the body of createFlatMesh, which I lifted from a C++ OpenGL program that I wrote a while ago.

Rather than trying to puzzle that code out line by line, it helps to know that I’m drawing the mesh a patch at a time, where each patch defines 4 squares using 2 triangles each, drawn counter-clockwise around a central grid point. After spinning around that point, the centre moves 2 grid units along to become the centre of a new cluster of squares. When a row of clusters is complete, the loop bumps down 2 units to the centre of the next row and starts again. This means that a single cluster is created like this:

[Image: mesh-segment]

This might seem like an awkward way to do this, but it generates a mesh that I think looks nicer when deformed (hint, hint).

[Image: mesh]
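Stripped of the buffer bookkeeping, the walk through the grid looks something like this. It’s a simplified sketch of the createFlatMesh function below, where point and emitTriangle are hypothetical helpers standing in for the offset arithmetic and the array pushes:

// Eight (row, col) steps from a cluster's centre to its perimeter,
// in counter-clockwise order.
var ring = [[-1,1], [-1,0], [-1,-1], [0,-1], [1,-1], [1,0], [1,1], [0,1]];

for (var r = 1; r <= MAX_ROWS; r += 2) {    // a row of cluster centres
  for (var c = 1; c <= MAX_COLS; c += 2) {  // one cluster centre
    for (var i = 0; i < 8; i++) {
      var a = ring[i], b = ring[(i + 1) % 8];
      // One triangle: the centre, a perimeter point, and the next
      // perimeter point around the ring.
      emitTriangle(point(r, c),
                   point(r + a[0], c + a[1]),
                   point(r + b[0], c + b[1]));
    }
  }
}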

Note that this code still requires gl-matrix—our ability to do without that library vanished when we moved into the third dimension. I’m no longer using jQuery here, as it doesn’t add anything useful. While you can use any image, the code below uses bricks.png.

Chrome Users: as with the OBJ data models used in the last few posts, Chrome considers disk-sourced images to be cross-domain and won’t allow them to be loaded into a WebGL texture. You’re going to need a web server, even if it’s just python -m SimpleHTTPServer.

Mapping Textures to Surfaces

The texture that I’m going to be working with is the simplest, most commonly used form: a plain, two-dimensional image. I’ll associate texture coordinates with each vertex; their interpolated values will be used in the fragment shader to look up a sample from the texture. The shape I’ll be drawing—a square rotated around the X-axis, with perspective—is built from a regularly-spaced mesh, so the texture coordinates are easy to calculate. In effect, we’re stitching an image onto a mesh:

[Image: texture-mapping]

One thing to note is that texture coordinates range from (0.0, 0.0) at the bottom left to (1.0, 1.0) at the top right. Even though your shape might be defined to straddle the origin, with negative coordinates on any axis, the same is not true of texture coordinates.
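For the regular grid used here, both position and texture coordinate fall straight out of the loop indices. As a sketch of the arithmetic (this matches what createFlatMesh does in the listing below):

// Column c of MAX_COLS: positions span -0.75 to 0.75 on the x axis,
// while the matching texture coordinate spans 0.0 to 1.0.
var x = -0.75 + (1.5 / MAX_COLS) * c;  // model-space x
var s = (1.0 / MAX_COLS) * c;          // texture s coordinate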

<!doctype html>
<html>
  <head>
    <title>Hacking WebGL</title>
    <script type="text/javascript" src="gl-matrix.js"></script>
    <script id="vertex-shader" type="x-shader/x-vertex">
      precision mediump float;

      uniform mat4 modelMatrix, viewMatrix, projectionMatrix;

      attribute vec3 pos;
      attribute vec2 texCoords;

      varying vec2 tc;

      void main() {
        tc = texCoords;
        gl_Position = 
          projectionMatrix * viewMatrix *
          modelMatrix * vec4(pos, 1.0);
      }      
    </script>
    <script id="fragment-shader" type="x-shader/x-fragment">
      precision mediump float;

      uniform sampler2D image;

      varying vec2 tc;

      void main() {
        gl_FragColor = texture2D(image, tc.st);
      }
    </script>
    <script type="text/javascript" src="gl-matrix.js"></script>
    <script type="text/javascript">

    function createCanvas() {
      var canvas = document.createElement('canvas');
      document.getElementById('content').appendChild(canvas);
      return canvas;      
    }

    function createProgram(gl, shaderSpecs) {
      var program = gl.createProgram();
      for ( var i = 0 ; i < shaderSpecs.length ; i++ ) {
        var spec = shaderSpecs[i];
        var shader = gl.createShader(spec.type);
        gl.shaderSource(
          shader, document.getElementById(spec.container).text
        );
        gl.compileShader(shader);
        if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
          throw gl.getShaderInfoLog(shader);
        }
        gl.attachShader(program, shader);
        gl.deleteShader(shader);
      }
      gl.linkProgram(program);
      if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
        throw gl.getProgramInfoLog(program);
      }
      return program;
    }

    function render(gl, scene) {
      // Depth testing is on, so clear the depth buffer too.
      gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
      gl.useProgram(scene.program);
      gl.uniformMatrix4fv(
        scene.program.modelMatrixUniform, false,
        scene.object.modelMatrix);
      gl.bindBuffer(gl.ARRAY_BUFFER, scene.object.buffer);
      gl.bindTexture(gl.TEXTURE_2D, scene.object.texture);

      gl.drawArrays(
        scene.object.primitiveType, 0,
        scene.object.vertexCount);

      gl.bindTexture(gl.TEXTURE_2D, null);

      gl.bindBuffer(gl.ARRAY_BUFFER, null);
      gl.useProgram(null);
      requestAnimationFrame(function() {
        render(gl, scene);
      });
    }

    function createFlatMesh(gl) {
      var MAX_ROWS=32, MAX_COLS=32;
      var points = [];

      // Build a (MAX_ROWS+1) x (MAX_COLS+1) grid of candidate
      // vertices, each carrying a position and a texture coordinate.
      for ( var r = 0 ; r <= MAX_ROWS ; r++ ) {
        for ( var c = 0 ; c <= MAX_COLS ; c++ ) {
          points.push({
            location: [-0.75 + (1.5 / MAX_COLS) * c,
                        0.75 - (1.5 / MAX_ROWS) * r,
                        0.0],
            texture: [1.0 / MAX_COLS * c,
                      1.0 / MAX_ROWS * r]
          });
        }
      }
      // Index of the grid point at row R, column C.
      var OFFSET = function(R,C) {
        return ((R) * ((MAX_COLS)+1) + (C));
      };
      // The walk around a cluster's perimeter: for step i, the row
      // offset is rotations[i] and the column offset rotations[i+6]
      // (the same cycle, six entries out of phase). The array is long
      // enough that i+1 and i+7 never run off the end.
      var
        vertices = [],
        rotations = [-1,-1,-1,0,1,1,1,0,-1,-1,-1,0,1,1,1,0];
      // Visit every cluster centre (odd row and column), fanning
      // eight triangles around it: centre, perimeter point i, and
      // perimeter point i+1.
      for ( var r = 1 ; r <= MAX_ROWS ; r += 2 ) {
        for ( var c = 1 ; c <= MAX_COLS ; c += 2 ) {
          for ( var i = 0 ; i < 8 ; i++ ) {
            var off1 = OFFSET(r, c);
            var off2 = OFFSET(r + rotations[i],   c + rotations[i+6]);
            var off3 = OFFSET(r + rotations[i+1], c + rotations[i+7]);
            Array.prototype.push.apply(
              vertices, points[off1].location);
            Array.prototype.push.apply(
              vertices, points[off1].texture);
            Array.prototype.push.apply(
              vertices, points[off2].location);
            Array.prototype.push.apply(
              vertices, points[off2].texture);
            Array.prototype.push.apply(
              vertices, points[off3].location);
            Array.prototype.push.apply(
              vertices, points[off3].texture);
          }
        }
      }

      var buffer = gl.createBuffer();
      gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
      gl.bufferData(
        gl.ARRAY_BUFFER, new Float32Array(vertices),
        gl.STATIC_DRAW);
      gl.bindBuffer(gl.ARRAY_BUFFER, null);

      return {
        buffer: buffer,
        primitiveType: gl.TRIANGLES,
        vertexCount: vertices.length / 5  // 5 floats per vertex: x,y,z + s,t
      };
    }

    function loadTexture(name, gl, mesh, andThenFn) {
      var texture = gl.createTexture();
      var image = new Image();
      image.onload = function() {
        gl.bindTexture(gl.TEXTURE_2D, texture);
        gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, false);
        gl.texImage2D(
          gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
        gl.texParameteri(
          gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
        gl.texParameteri(
          gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
        gl.texParameteri(
          gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
        gl.texParameteri(
          gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
        gl.bindTexture(gl.TEXTURE_2D, null);
        mesh.texture = texture;
        andThenFn();
      }
      image.src = name;
    }

    function init() {
      var canvas = createCanvas();
      var gl = canvas.getContext('experimental-webgl');
      var resize = function() {
        canvas.width = window.innerWidth;
        canvas.height = window.innerHeight;
        gl.viewport(0,0,canvas.width,canvas.height);
      };
      window.addEventListener('resize', resize);
      resize();

      gl.enable(gl.DEPTH_TEST);
      gl.clearColor(0.0, 0.0, 0.0, 0.0);

      var mesh = createFlatMesh(gl);

      var program = createProgram(
        gl,
        [{container: 'vertex-shader', type: gl.VERTEX_SHADER},
         {container: 'fragment-shader', type: gl.FRAGMENT_SHADER}]);


      var projectionMatrix = mat4.create();
      mat4.perspective(
        projectionMatrix, 0.75, canvas.width/canvas.height,
        0.1, 100);
      var viewMatrix = mat4.create();
      var modelMatrix = mat4.create();
      mat4.translate(modelMatrix, modelMatrix, [0,0,-2]);
      mat4.rotate(modelMatrix, modelMatrix, -1, [1,0,0]);

      mesh.modelMatrix = modelMatrix;

      gl.useProgram(program);

      program.modelMatrixUniform =
        gl.getUniformLocation(program, 'modelMatrix');
      program.viewMatrixUniform =
        gl.getUniformLocation(program, 'viewMatrix');
      program.projectionMatrixUniform =
        gl.getUniformLocation(program, 'projectionMatrix');
      
      gl.uniformMatrix4fv(
        program.projectionMatrixUniform, false,
        projectionMatrix);
      gl.uniformMatrix4fv(
        program.viewMatrixUniform, false, viewMatrix);

      gl.bindBuffer(gl.ARRAY_BUFFER, mesh.buffer);

      program.positionAttribute =
        gl.getAttribLocation(program, 'pos');
      program.textureCoordsAttribute =
        gl.getAttribLocation(program, 'texCoords');
      gl.enableVertexAttribArray(program.positionAttribute);
      gl.enableVertexAttribArray(program.textureCoordsAttribute);
      gl.vertexAttribPointer(
        program.positionAttribute, 3, gl.FLOAT, false,
        5 * Float32Array.BYTES_PER_ELEMENT,
        0);
      gl.vertexAttribPointer(
        program.textureCoordsAttribute, 2, gl.FLOAT, false,
        5 * Float32Array.BYTES_PER_ELEMENT,
        3 * Float32Array.BYTES_PER_ELEMENT);

      gl.bindBuffer(gl.ARRAY_BUFFER, null);
      gl.useProgram(null);

      loadTexture('bricks.png', gl, mesh,
        function() {
          requestAnimationFrame(function() {
            render(gl, {
              program: program,
              object: mesh
            });
          })
        });
    }
    </script>
  </head>
  <body onload="init()">
    <div id="content">
    </div>
  </body>
</html>

What’s New

  1. The vertex shader now accepts texture coordinates: a plain vec2 (i.e. an X/Y position within a 2D image)
  2. The fragment shader is finally getting some action: it has a sampler2D uniform that provides access to the texture data
  3. The fragment shader also calls a function, texture2D, that uses that sampler to look up the sample at the given coordinates
  4. On the JavaScript/HTML front, I’m creating the <canvas> element programmatically. I’m also attaching an event handler to the window resize event which will resize the canvas and adjust the WebGL viewport.

Finally—arguably most importantly—I have a new loadTexture function. This is responsible for requesting the load of the source image and, once it has loaded, ensuring that a WebGL texture is created and attached to the to-be-rendered object. Since image loading is asynchronous, we can’t just march on to requestAnimationFrame after calling it; instead, we parcel that step up in a post-load callback for loadTexture to invoke. This kicks off the animation loop, but only once the texture data has been successfully loaded.
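In miniature, it’s the familiar image-onload dance (startRenderLoop here is a hypothetical stand-in for the callback passed to loadTexture):

var image = new Image();
image.onload = function() {
  // Only now are there pixels to hand to texImage2D:
  // create, bind and fill the WebGL texture here.
  startRenderLoop();  // hypothetical: safe to start drawing now
};
image.src = 'bricks.png';  // kicks off the asynchronous load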

Like buffers, texture handles are created with a ‘create’ function, createTexture, which returns an opaque identifier that your code can subsequently use to refer to the texture. Also like buffers, creating a texture does not allocate storage for it; it just reserves the identifier. It’s not until you call texImage2D that texture data is actually uploaded to GPU memory. This create/bind/manipulate/unbind workflow is a very common pattern in WebGL.
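The same shape shows up for both object types; here’s a condensed recap of calls that already appear in the listing above:

var buffer = gl.createBuffer();          // reserve a handle...
gl.bindBuffer(gl.ARRAY_BUFFER, buffer);  // ...make it current...
gl.bufferData(gl.ARRAY_BUFFER, data, gl.STATIC_DRAW);  // ...manipulate...
gl.bindBuffer(gl.ARRAY_BUFFER, null);    // ...and unbind.

var texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
gl.bindTexture(gl.TEXTURE_2D, null);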

Depending upon your image, it may show up upside-down: browsers supply the image’s rows top-first, while WebGL’s t axis runs from bottom to top. In that case, a call to pixelStorei with UNPACK_FLIP_Y_WEBGL set to true will set things right.
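If your image does need the flip, it’s a one-liner before texImage2D:

gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);  // flip rows at upload time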

So, what’s with all the texParameteri calls? TEXTURE_MAG_FILTER says what to do when the texture is magnified, i.e. a single texture pixel spans several screen pixels; TEXTURE_MIN_FILTER covers the opposite case, where several texture pixels land on a single screen pixel. A value of LINEAR causes an interpolation between the neighbouring/contributing values, and generally offers good visual quality. A value of NEAREST, as the name implies, just grabs the nearest single value. It’s worth experimenting with both to see what they do to the visual quality of your scene.
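To see the blockier alternative, swap LINEAR for NEAREST while the texture is bound:

gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);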

TEXTURE_WRAP_{S,T} says what to do when a texture coordinate falls outside the 0.0 to 1.0 range along the S (horizontal) or T (vertical) axis. Should the image tile? Should it mirror? CLAMP_TO_EDGE does neither: it clamps the coordinate to the edge, so the border pixels simply smear outward.
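The common alternative is REPEAT, which tiles the image every time a coordinate passes a whole number; to actually see it, you’d also scale the texture coordinates beyond 1.0. One caveat: WebGL 1 only permits REPEAT on textures whose dimensions are powers of two.

gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.REPEAT);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.REPEAT);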

That wraps up simple texturing: upload a texture, specify how it maps to a given surface, and bind it during the draw call.

Next Up: If you run that and see a perspective-projected brick square, then all is well. You might, however, be skeptical that this really is a fairly complex mesh rather than just a pair of triangles. In desktop OpenGL, you’d just flip glPolygonMode to GL_LINE and see the wireframe, but WebGL doesn’t offer this (and switching the primitive type to LINES doesn’t count: you’d have to restructure the model data for line drawing, and I don’t want to do that).

So the next post is going to expose the wireframe with a quick hack involving some extra vertex data and the introduction of a feature of the fragment shader: the discard keyword.
