Fractal Terrain: Midpoint Displacement Algorithm

I just finished an introductory class on OpenGL and the mathematics behind computer graphics. The final project was fairly open ended, so I decided to tackle an old problem and produce something pretty while at it.

My project was to make a super simple flight simulator in OpenGL with about 10 hours' worth of work.  The end result looks pretty nice.

Finished product running in high detail


Now, the midpoint displacement algorithm is fairly easy to get working right, especially in 2D.  Here is the page I got all of my information from.  I started by simply getting a 2D version rendered on the screen, and eventually built my way up to 3D.
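In 2D the algorithm really is short: split each segment at its midpoint, nudge that midpoint by a random offset, and shrink the offset range at every level. A minimal sketch in Python (function and parameter names are my own, not from the project):

```python
import random

def midpoint_displacement_2d(left, right, depth, roughness=0.5, seed=None):
    """Recursively displace midpoints of a 1D height profile.

    Returns a list of 2**depth + 1 heights. `roughness` controls how
    quickly the random offsets shrink at each subdivision level.
    """
    rng = random.Random(seed)
    heights = [left, right]
    spread = 1.0
    for _ in range(depth):
        new_heights = []
        for a, b in zip(heights, heights[1:]):
            mid = (a + b) / 2 + rng.uniform(-spread, spread)
            new_heights.extend([a, mid])
        new_heights.append(heights[-1])
        heights = new_heights
        spread *= roughness  # shrink the displacement range each pass
    return heights
```

Smaller `roughness` gives smoother terrain; values near 1 keep the profile jagged at every scale.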

Generating the 3D terrain is a bit trickier than it looks at first glance.  First off, I wanted to use triangles instead of quads, and right away that raises questions about how to store everything.  Since the easiest data layout is quad-based, I took an interesting approach to get triangles working.

What you see above are two different coordinate frames.  The one on the left, "square space," is more easily understood and makes terrain generation very nice.  The one on the right, "triangle space," is harder to work with for some things, but looks much better when rendered.  To get from one to the other you simply apply a shear transformation.
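The shear itself is a one-liner in each direction. A sketch, assuming the common choice of constants that turns the split quads into equilateral triangles (the post doesn't give the exact values, so these are illustrative):

```python
import math

def square_to_triangle(x, y):
    """Shear a square-space point into triangle space: each row is
    offset by half a unit and the rows are compressed vertically so
    the split quads become equilateral triangles (assumed constants)."""
    return x + 0.5 * y, y * math.sqrt(3) / 2

def triangle_to_square(u, v):
    """Inverse shear, handy later for viewpoint lookups."""
    y = v * 2 / math.sqrt(3)
    return u - 0.5 * y, y
```

Any invertible shear of this form works; the constants only change how the triangles are proportioned on screen.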

I was able to generate tiles of terrain in square space by starting with four vertices.  Using the midpoint displacement algorithm, I simply increased the depth until I had very fine terrain.  The result was then transformed to triangle space (note that only x and y are affected, not the vertical z direction), and I used these vertices to display my terrain in OpenGL.
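Extending midpoint displacement from a line to a square grid of vertices is usually done with the diamond-square variant: alternately fill in square centres and edge midpoints, shrinking the noise each pass. The post doesn't show the exact variant used, but a sketch looks like:

```python
import random

def diamond_square(depth, corners=(0.0, 0.0, 0.0, 0.0), roughness=0.5, seed=None):
    """Generate a (2**depth + 1)^2 height grid from four corner heights.

    Standard diamond-square; the original project may differ in detail.
    """
    rng = random.Random(seed)
    n = 2 ** depth + 1
    h = [[0.0] * n for _ in range(n)]
    h[0][0], h[0][n - 1], h[n - 1][0], h[n - 1][n - 1] = corners
    step, spread = n - 1, 1.0
    while step > 1:
        half = step // 2
        # Diamond step: each square's centre gets the corner average plus noise.
        for y in range(half, n, step):
            for x in range(half, n, step):
                avg = (h[y - half][x - half] + h[y - half][x + half] +
                       h[y + half][x - half] + h[y + half][x + half]) / 4.0
                h[y][x] = avg + rng.uniform(-spread, spread)
        # Square step: edge midpoints get the average of their in-bounds neighbours.
        for y in range(0, n, half):
            for x in range((y + half) % step, n, step):
                total, count = 0.0, 0
                for dy, dx in ((-half, 0), (half, 0), (0, -half), (0, half)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < n and 0 <= nx < n:
                        total += h[ny][nx]
                        count += 1
                h[y][x] = total / count + rng.uniform(-spread, spread)
        step = half
        spread *= roughness
    return h
```

The z heights are generated entirely in square space, which is exactly why the shear only needs to touch x and y.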

Basically the same image as above, but rendered in wireframe

So now I can create a tile of any size and render it on the screen.  To get nice looking terrain I had to be able to stitch tiles together.  I ended up creating a 10 x 10 grid of tiles, generated in a manner that ensures their edges are seamless.  I also made sure that the far edges wrap around as well.  Thus, when flying around in my virtual world there is basically an endless set of scrolling terrain.
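One compact way to get the same seamless, wrapping property is to slice every tile out of a single global height map with modulo indexing, so adjacent tiles share edge vertices and the far edges wrap automatically. A sketch of that idea (an illustration of the property, not the post's actual per-tile generation):

```python
def tile_heights(world, tile_size, row, col):
    """Slice one tile out of a square global height map.

    Modulo indexing makes neighbouring tiles share their edge row and
    column, and makes tiles past the far edge wrap back around.
    """
    n = len(world)
    return [[world[(row * tile_size + y) % n][(col * tile_size + x) % n]
             for x in range(tile_size + 1)]
            for y in range(tile_size + 1)]
```

Each tile is (tile_size + 1) vertices on a side, so the last column of one tile is literally the same set of heights as the first column of the next.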

Why did I bother with tiles?  Because that way I can choose which tiles to render and which not to.  I take my viewpoint, convert it from triangle space to square space, use that to figure out where in the grid I am, then only render the tiles I can actually see.
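That lookup is just the inverse shear followed by a floor-divide into the grid. A sketch, assuming a particular set of shear constants (the post doesn't specify them) and the 10 x 10 wrapping grid:

```python
import math

def visible_tile(cam_u, cam_v, tile_size, grid=10):
    """Map a triangle-space camera position to a (row, col) grid tile.

    Undo the shear back to square space, then floor-divide by the tile
    size, wrapping with modulo to match the endless scrolling terrain.
    The sqrt(3)/2 shear constants are an assumption, not from the post.
    """
    y = cam_v * 2 / math.sqrt(3)  # inverse of the assumed shear
    x = cam_u - 0.5 * y
    return (int(math.floor(y / tile_size)) % grid,
            int(math.floor(x / tile_size)) % grid)
```

From that home tile, rendering only the tiles within some radius of it gives the visibility culling described above.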

Some additional things I had to do to get it looking nice were adding color and lighting.  The first was done simply by lerping between colors based on vertex altitude.  Getting the vertex normals was a tougher job.
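The altitude-based coloring is just a clamped lerp per vertex. A sketch with illustrative color stops (green lowlands up to white peaks; the actual palette isn't given in the post):

```python
def lerp(a, b, t):
    """Componentwise linear interpolation between two RGB tuples."""
    return tuple(ca + (cb - ca) * t for ca, cb in zip(a, b))

def altitude_color(z, z_min, z_max,
                   low=(0.1, 0.5, 0.1), high=(1.0, 1.0, 1.0)):
    """Pick a vertex color from its altitude (color stops are assumed)."""
    t = (z - z_min) / (z_max - z_min)
    t = max(0.0, min(1.0, t))  # clamp, in case z is outside the range
    return lerp(low, high, t)
```

A few stacked lerps (water, grass, rock, snow) would give distinct bands, but a single lerp already reads as terrain.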

The surface normal for a triangle is well-defined, but for a vertex shared by six triangles you have to do some magic to get a good vector.  I ended up computing each of the six face normals with a cross product (getting that to work at the edge between two tiles was tricky), then averaging them to get the vertex normal.  The end result I normalized to get my vector.  This was all computed up front during terrain generation.
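In code, the per-vertex normal is the normalized average of the adjacent face normals, each obtained from a cross product of two edge vectors. A sketch:

```python
import math

def triangle_normal(p0, p1, p2):
    """Face normal of a triangle: cross product of two edge vectors."""
    ux, uy, uz = (p1[i] - p0[i] for i in range(3))
    vx, vy, vz = (p2[i] - p0[i] for i in range(3))
    return (uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx)

def vertex_normal(triangles):
    """Average the face normals of the (up to six) triangles that share
    a vertex, then normalize the sum to unit length."""
    nx = ny = nz = 0.0
    for tri in triangles:
        fx, fy, fz = triangle_normal(*tri)
        nx, ny, nz = nx + fx, ny + fy, nz + fz
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / length, ny / length, nz / length)
```

The triangles must all wind the same way (e.g. counter-clockwise seen from above), otherwise opposing face normals cancel in the average.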

The last fun thing I did was add camera flight control and distance rendering.  Flight control lets the user set heading and camera pitch with the arrow keys while flying over the terrain.  Pretty neat, nothing too hard there.

Distance rendering was a way to cut down on triangles farther away from the user.  This was somewhat tricky.

Here I have distance-based rendering enabled.  The closer terrain is rendered at a higher depth, whereas the farther terrain is rendered at a lower depth.  To make it look nice I have a smooth transition between the two.  Thus, I can render at non-integer depths, e.g., depth 1.5, by rendering depth 1 with the depth 2 vertices displaced only halfway from the midpoint.
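The half-displaced-midpoint idea is easiest to see in 1D: keep the coarse vertices fixed and move each newly introduced vertex only a fraction of the way from the plain midpoint toward its fully displaced position. A sketch (the real version operates on 2D tiles):

```python
def blend_depths(coarse, fine, frac):
    """Render at a non-integer depth between two precomputed profiles.

    `coarse` has n vertices, `fine` has 2n - 1, and every other fine
    vertex coincides with a coarse one.  frac = 0 leaves the new
    vertices at the undisplaced midpoints (pure coarse shape); frac = 1
    gives the fully displaced fine profile; frac = 0.5 is "depth d + 0.5".
    """
    out = list(fine)
    for i in range(1, len(fine), 2):  # only the vertices the finer depth added
        mid = (coarse[i // 2] + coarse[i // 2 + 1]) / 2
        out[i] = mid + (fine[i] - mid) * frac
    return out
```

Because the coarse vertices never move, tiles at neighbouring detail levels at least agree on the coarse lattice, which keeps the transition smooth.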

I did not have the time to get the tile edges to stitch together perfectly at different depths, but I'm happy with it nonetheless.  The user can toggle between uniform depth rendering and distance based rendering.

As always, code is available.  See here.