Visit the Ejecta website for more info on what it is and how to use it. I will talk a bit more about some implementation details for the Canvas API here.
Implementing a general purpose drawing API, such as the HTML5 Canvas API, on top of OpenGL is by no means an easy endeavor. Before I decided to roll my own solution (you know, I have this problem), I looked at a number of graphics libraries, including Google's Skia and OpenVG.
I discovered exactly what I feared beforehand: these libraries do way too much, are too large and too hard to integrate. You can't just use them here and there to draw – instead, they replace your whole drawing stack. Getting them to compile alone is a huge pain; getting them to compile on the iPhone and then getting them to do what you want seemed close to impossible.
So I began working on my own solution. Implementing the path methods for moveTo(), lineTo(), bezierCurveTo(), etc. was fairly straightforward: have an array of subpaths, where each subpath is an array of points (x,y). Each call to the API methods pushes one or more points to the current subpath or closes it.
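The subpath bookkeeping described above can be sketched in a few lines. This is a minimal illustration of the idea, not Ejecta's actual code; the class and method names are my own.

```python
# Sketch of the path storage: a list of subpaths, each subpath a list
# of (x, y) points. Hypothetical names, not Ejecta's internals.

class Path:
    def __init__(self):
        self.subpaths = []   # list of lists of (x, y) tuples
        self.current = None  # subpath currently being built

    def move_to(self, x, y):
        # moveTo() starts a fresh subpath at (x, y)
        self.current = [(x, y)]
        self.subpaths.append(self.current)

    def line_to(self, x, y):
        # lineTo() pushes one point onto the current subpath
        if self.current is None:
            self.move_to(x, y)  # per the Canvas spec, lineTo without a subpath acts like moveTo
        else:
            self.current.append((x, y))

    def close_path(self):
        # closePath() connects back to the subpath's first point
        if self.current and len(self.current) > 1:
            self.current.append(self.current[0])
        self.current = None
```

Curve methods like bezierCurveTo() fit the same model: they just push several flattened points instead of one.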
However, I struggled a bit with getting bezier curves to behave in a manner that makes sense for the current scale; i.e. push more points for large bezier curves and at sharp corners, fewer points for smaller ones and straight lines. After a few days of reading and experimenting, I found this excellent article on adaptive bezier curves and adopted its solution.
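The core of that adaptive approach is a recursive subdivision with a flatness test: keep splitting the curve at t = 0.5 until a piece is close enough to its chord, so flat stretches emit few points and tight bends emit many. Here's a simplified sketch of that idea (the full article adds angle tests and handles degenerate cases like p0 == p3, which this version does not):

```python
def subdivide_cubic(p0, p1, p2, p3, tolerance=0.25, points=None):
    """Recursively flatten a cubic bezier; pushes more points where the
    curve deviates from a straight line, fewer on flat stretches.
    Simplified sketch of adaptive subdivision, not Ejecta's exact code."""
    if points is None:
        points = [p0]
    # flatness test: how far do the control points stray from the chord p0-p3?
    dx, dy = p3[0] - p0[0], p3[1] - p0[1]
    d1 = abs((p1[0] - p3[0]) * dy - (p1[1] - p3[1]) * dx)
    d2 = abs((p2[0] - p3[0]) * dy - (p2[1] - p3[1]) * dx)
    if (d1 + d2) ** 2 <= tolerance * (dx * dx + dy * dy):
        points.append(p3)  # flat enough: one line segment suffices
        return points
    # otherwise split at t = 0.5 (de Casteljau) and recurse on both halves
    mid = lambda a, b: ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
    p01, p12, p23 = mid(p0, p1), mid(p1, p2), mid(p2, p3)
    p012, p123 = mid(p01, p12), mid(p12, p23)
    p0123 = mid(p012, p123)
    subdivide_cubic(p0, p01, p012, p0123, tolerance, points)
    subdivide_cubic(p0123, p123, p23, p3, tolerance, points)
    return points
```

Because the tolerance is measured against the chord, scaling the curve up automatically yields more points – exactly the behavior described above.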
The hard part was getting that array of points onto the screen. For drawing lines (.stroke()) I didn't want to go with the obvious solution of just using GL_LINES, because it has a number of drawbacks, especially on iOS: no anti-aliasing, limited line widths and no miters or line caps.
So instead of using GL_LINES to draw, I ended up creating two triangles for each line segment and calculating the miter values myself. This correctly honors the API's .miterLimit property, though the bevel it then draws is still a bit off. The code I ended up with is a bit on the ugly side, because it handles a lot of edge cases, but all in all this solution works very well and is extremely fast.
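The miter math itself is compact: at each joint, the miter direction bisects the normals of the two adjoining segments, and its length grows as the angle gets sharper. Here's a simplified sketch of that calculation – not Ejecta's actual code, and it skips degenerate cases like segments that exactly reverse direction:

```python
import math

def miter_join(p_prev, p, p_next, half_width, miter_limit=10.0):
    """For a joint at p between segments p_prev->p and p->p_next, compute
    the miter offset vector (applied +/- to p to get the two triangle
    vertices). Returns None when the miter exceeds miter_limit, i.e. the
    join should fall back to a bevel, as the Canvas spec requires."""
    def norm(vx, vy):
        l = math.hypot(vx, vy)
        return (vx / l, vy / l)
    # unit direction and left-hand normal of both segments
    d1 = norm(p[0] - p_prev[0], p[1] - p_prev[1])
    d2 = norm(p_next[0] - p[0], p_next[1] - p[1])
    n1 = (-d1[1], d1[0])
    n2 = (-d2[1], d2[0])
    # the miter direction bisects the two normals
    mx, my = norm(n1[0] + n2[0], n1[1] + n2[1])
    # project onto one normal to get the miter length
    cos_half = mx * n1[0] + my * n1[1]
    miter_length = half_width / cos_half
    if miter_length / half_width > miter_limit:
        return None  # miterLimit exceeded: draw a bevel instead
    return (mx * miter_length, my * miter_length)
```

For collinear segments the miter degenerates to the plain segment normal; the sharper the turn, the longer the spike, until miterLimit cuts it off.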
Implementing .fill() proved to be yet another challenge. With OpenGL, before you can draw a primitive to the screen, you have to break it down into triangles first. This is quite easy to do for convex polygons, but not so much for concave ones that potentially have holes in them.
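For the convex case, the triangulation really is trivial – a fan around any vertex does it. A quick sketch of why convex is the easy case:

```python
def fan_triangulate(polygon):
    """Triangulate a convex polygon as a fan around its first vertex.
    This only works for convex shapes; it produces wrong triangles for
    the concave, holed polygons mentioned above."""
    return [(polygon[0], polygon[i], polygon[i + 1])
            for i in range(1, len(polygon) - 1)]
```

An n-vertex convex polygon always yields exactly n-2 triangles this way; it's the concave case that needs a real triangulation algorithm.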
I spent a few days looking for a triangulation library and soon realized that this is serious business. Triangle, for instance, sports 16k lines of code – I'm quite allergic to libraries that need that much code to solve seemingly simple problems. Poly2Tri looked much more sane, but apparently has some stability problems.
After a bit of searching, I found libtess2, which is based on OpenGL's libtess and is supposed to be extremely robust and quite fast. The code base is excellent and I had no problem integrating it into Ejecta.
However, some tests showed that it's much slower than I hoped it would be. Realtime triangulation of complex polygons isn't very feasible on the iPhone.
In the end, I found a trick that lets you draw polygons in OpenGL without triangulating them first. It is so simple and elegant to implement, yet so ingenious: You can draw polygons with a simple triangle fan and mark those areas that you overdraw in the stencil buffer. See Drawing Filled, Concave Polygons Using the Stencil Buffer. It's a hacker's solution – thinking outside the box – and it fills me with joy.
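The trick, in short: render a triangle fan from the first vertex across the whole polygon, but instead of writing colors, invert a stencil bit for every fragment a triangle covers. Areas covered an odd number of times end up marked – that's exactly the even-odd fill rule – and a final full-polygon quad drawn with the stencil test fills just those areas. Here's a small CPU simulation of that parity logic (the GPU version does the same per pixel with glStencilOp(GL_INVERT)); the helper names are mine, not from the article:

```python
def point_in_triangle(pt, a, b, c):
    # the sign of the cross product tells which side of each edge pt is on
    def cross(o, p, q):
        return (p[0] - o[0]) * (q[1] - o[1]) - (p[1] - o[1]) * (q[0] - o[0])
    s1, s2, s3 = cross(a, b, pt), cross(b, c, pt), cross(c, a, pt)
    has_neg = s1 < 0 or s2 < 0 or s3 < 0
    has_pos = s1 > 0 or s2 > 0 or s3 > 0
    return not (has_neg and has_pos)

def stencil_fill_covers(polygon, pt):
    """CPU simulation of the stencil trick: 'render' a triangle fan from
    vertex 0 and invert a stencil bit for every triangle covering pt.
    The point is inside the polygon iff it was toggled an odd number of
    times (even-odd fill rule)."""
    stencil = 0
    for i in range(1, len(polygon) - 1):
        if point_in_triangle(pt, polygon[0], polygon[i], polygon[i + 1]):
            stencil ^= 1  # what GL_INVERT does in the stencil buffer
    return stencil == 1
```

The beauty is that overlapping fan triangles in concave regions cancel each other out, so no triangulation is needed at all – just one fan pass plus one fill pass.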
There are still some parts missing in my Canvas implementation, namely gradients, shadows and, most notably, text. I believe the best solution for drawing text in OpenGL, while honoring the Canvas spec, would be drawing to a texture using the iPhone's CG methods. This will make it quite slow, but should be good enough for a few paragraphs of text.
What makes this possible is a compatibility layer that mimics the HTML5 Canvas and Audio APIs but is implemented with OpenGL and OpenAL behind the scenes. Think of it as a browser that can only display a Canvas element and play Audio elements, but does not render generic HTML pages. A browser perfectly suited for HTML5 games.
If you have a license for Impact, you will find the complete source code for all of this on your download page. I also wrote some basic documentation to get you started. In theory, you don't have to know anything about Objective-C to use this, but at this stage some Objective-C knowledge will surely come in handy. Again, this is very experimental. Don't expect it to work at all.
Anyway, the performance of both games is pretty good. I still get some occasional slowdowns (~20fps) on my iPhone 3GS in Biolab Disaster when there are too many particles on the screen, but it remains playable at all times. It's also nice to have perfectly working sound on iOS now – something even some desktop browsers still struggle with.
I know you're waiting for the release of the Impact Game Engine, and I promise you, it's coming. I just get distracted too easily. So here's my game Biolab Disaster running on the iPhone 3GS with 60 frames per second:
Songfever, at its heart, is a tangible front-end for iTunes. Album covers are projected onto four physical objects standing on a shelf. With a scroll wheel attached to the shelf, you can scroll through your music library in a Cover Flow-like fashion. The goal was to reintroduce the aesthetic quality of a physical music collection (CDs, LPs), while maintaining the comfort of a digital one.
Each of the four covers is tracked by a webcam mounted above the shelf. Their positions are transformed in software to match how the projector is located relative to the webcam in the real world. When rendered in a 3D OpenGL view and projected back onto the shelf, the album covers line up with the physical objects again. Approximately, that is.
Songfever was our main project in the 4th semester of “Digital Media” at the Hochschule Darmstadt. I was responsible for the programming; the UVC Camera Control I posted some months ago was a small part of it.
We finally managed to build a website for our last year's semester project Klangpong. I was mainly responsible for the programming and 3D graphics. The full Processing source code of the game is available at the site, but it’s a bit complicated to get it up and running.
We will present the game at the Mediale (Darmstadt/Germany) starting tomorrow – so if you want to play it, pay us a visit!
In my last post I said I’d add some exhaust streams to my robot. I also said I’d use particles for that. Well, I didn’t. I spent several hours trying to get it right, but it always looked like my robot’s taking a bubble bath instead of riding on a powerful jet engine. I ended up using a single alpha blended sprite for each of the streams. This may sound cheap but actually looks really good.
As you will see in the following video, I did use particles for another effect: the huge explosion you’ll face every time you lose. Since there really is no way to “win” this game, I at least wanted to make the Game Over as visually appealing as possible. I also added a start animation – the robot now launches like a real rocket from a girder that is disconnected just after the engines have warmed up.
This video shows the game in its current state – it is by no means final. There’s still a lot of stuff missing: a properly designed HUD, effects when crates are collected or new ones spawn and of course: sound! I can’t really talk about sound yet, because I haven’t done anything so far. The only thing I can tell you is that Yuckfu will have sound and music. Hopefully.
Read on for some more in depth (and embarrassing) game developer geekery.
After my last post I read a bit more about displaying text with OpenGL. One of the more popular solutions is to have a single texture containing all (ASCII) characters of a font. As it turned out, there are several applications to build such a texture for you. Bitmap Font Builder is one of them. It looks a bit clunky, but gets the job done nicely – and it’s free!
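With such a texture atlas, rendering a character boils down to computing the texture-coordinate rectangle of its cell. A small sketch, assuming the common layout of a fixed 16×16 grid of equal cells with code point 0 in the top-left (the grid size is an assumption on my part, configurable in tools like Bitmap Font Builder):

```python
def glyph_uv(char, cols=16, rows=16):
    """Compute the texture-coordinate rectangle (u0, v0, u1, v1) for a
    character in a font texture laid out as a cols x rows grid of equal
    cells, code point 0 in the top-left cell. Assumed layout, adjust to
    whatever your font builder actually exports."""
    code = ord(char)
    col, row = code % cols, code // cols
    u0, v0 = col / cols, row / rows
    return (u0, v0, u0 + 1 / cols, v0 + 1 / rows)
```

Each character is then just a textured quad using that rectangle; the quads are advanced horizontally by the glyph width.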
Here’s a quick test I did with the Helvetica typeface. I exported the texture as a transparent PNG file, cropped it and added a simple drop shadow effect in Photoshop. For now, I only used the upper case letters. I’ll probably come back later and export the full set of Latin-1 characters.
It’s been a while… I finished the basic gameplay mechanics of my iPhone version of YuckFu just a day after my last post, but hadn’t yet found the time to write about it. So here’s just a short update.
As you can see in the screenshot, the game still looks like crap. But that’s intentional (no really, it is). The main focus of YuckFu was always on the gameplay, so that’s the first thing I wanted to finish. Just to make sure it works on the iPhone – and it does! A game with good graphics that is no fun to play isn’t worth anything. A game, however, that doesn’t look nice and shiny but is fun to play, is still a good game. Just to clarify: I’m aiming for both, but I’ll still need to tweak some values and let some friends test it before I move on to the graphics side.
And of course there’s one thing still missing: the score. Displaying text in OpenGL applications has always been somewhat tedious. I haven’t yet really looked into whether OSX provides some different solutions, but as far as I know there are only two viable options: using a single texture with all glyphs in it (built in Photoshop or whatever), or loading a TrueType font and rendering each glyph into a texture. With both methods you’ll face the same problems: each glyph has to be stored in memory – this is fine for ASCII characters, but can get challenging with UTF-8 characters. So a scoreboard with names consisting of characters outside of ASCII is probably not going to happen for YuckFu. The other problem is the lack of support for “features” like kerning. Even displaying proportional fonts can be a pain. So maybe I’ll just settle for a retro-looking bitmap font – but not before I’ve checked if OSX does indeed have some magic way of solving these problems! Although they are not exactly on the OS side…