Impact Is Now Free & Open Source

My HTML5 Game Engine Impact launched almost 8 years ago. The last update was published in 2014. While Impact still works nicely in modern browsers, it lacks support for the better graphics and sound APIs that are now available. I felt increasingly bad for selling a product that is hardly maintained or improved.

So as of today Impact will be available completely for free, published under the permissive MIT License.

Impact's source is available on

Thanks to anyone who bought a license in the last few years!

Play GTA V In Your Browser – Sort Of

Inspired by a blog post about running your own cloud gaming service using a VPN and Steam's In-Home Streaming, I thought I could do this in the browser, too.

With the tech I developed for Instant Webcam I had all the major building blocks ready and just needed to glue them together into a nice, usable Windows app.

Demo Video for jsmpeg-vnc

The result is jsmpeg-vnc – a tiny Windows application written in C. It captures the screen at 60 frames per second, encodes it into MPEG1 using FFmpeg, sends it over WebSockets to the browser where it is finally decoded again in JavaScript using jsmpeg. All the mouse and keyboard input in the browser is captured as well and sent back to the server again.

This works way better than it should. Latency is minimal and only affected by the network conditions. It's certainly good enough to play most types of games.

Update: I did some latency measurements. On my system it hovers around 50-70ms (video, photo). Tested with an 800x600 window at ~7% CPU utilization on a Core i5. My desktop monitor is a Dell U2711, which seems to add about 15ms of latency itself.

Full source code and binary releases are on GitHub: jsmpeg-vnc

What makes an On-Screen Keyboard Fun?

We recently released our typing game ZType for the iPhone and learned a whole lot in the process.


The original Web Version of ZType was hastily created for Mozilla's Game On competition a few years ago and won the Community Choice Award back then. For that version, it was a conscious decision not to cater to mobile devices. After all, what fun would a typing game be if you don't have a real keyboard?

The Web Version of ZType remains quite popular and we got more and more requests for a mobile version of the game. So a few months back we started investigating the possibilities. There are already a few typing games for iOS out there. Many of them can hardly be called a "game" (to be fair, some don't aim to be one), most use the default iOS keyboard, and gameplay is cumbersome – sort of what we expected from typing games with an on-screen keyboard.

The default iOS on-screen keyboard needs a lot of screen real estate for things we don't care about: language settings, microphone input, a space bar and word suggestions. Most modern on-screen keyboards are predictive in that they try to figure out what it is you want to write. This works okay-ish for writing a quick SMS, but quickly becomes obstructive for a typing game. It was clear that if we wanted to bring ZType to mobile devices and make it fun to play, we'd have to come up with a custom keyboard solution.

In ZType we carefully control which words appear on the screen. We prevent having two words with the same first letter in the game at the same time, so that targeting words remains unambiguous. With a layout map of a QWERTY keyboard, we can further ensure that these letters spread out over the keyboard, instead of clustering to nearby keys.

Now, here's the crux: if we control which words are on the screen and we ensure keys are spread out over the keyboard, we can enlarge the touch area for each valid key, making it easier to hit the right key. After having targeted a word, there's only one valid key to hit and we can similarly enlarge the touch area for it as well.

We carefully designed the keyboard so that even with those enlarged touch areas, it's still possible to hit all keys. For example, if S is a valid key and thus has a larger touch area, the touch areas for all surrounding keys (W, A, D, X and Y) simply get a bit smaller. This took some fiddling around and hours of play testing to get right, but in the end it was worth it.

Another thing we noticed is that sometimes you hit a key while performing a downward motion with your finger. Typically an on-screen keyboard only emits a letter when you lift your finger from a key, not when you touch it. This allows you to swipe over the keyboard until you see the right key highlighted, but it also creates another problem: say you want to type the letter T – there's one smooth motion by your thumb, starting somewhat above the keyboard, hitting the surface of the touch screen at the T key and continuing to move further down on the glass before lifting again. At the moment you lift your thumb, it may well be over the G key.

The iOS default keyboard seems to recognize this motion and correctly determines that the letter you meant to type was the T, even though you ended up lifting your thumb over the G. In our tests this happened often enough that we had to take care of it too. We ended up detecting whether the key where you lifted your finger was held for less than 30msec and the key where you started the touch was one row above. If both conditions are met, we determine that the key you actually wanted to hit was the key where you started the touch.
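A sketch of that correction logic – the function and row numbering are hypothetical, but the 30msec threshold is the one described above:

```javascript
// Decide which key was actually meant, compensating for a downward
// swipe: if the key under the lifted finger was held only briefly and
// the touch started one keyboard row above it, use the starting key.
// Rows are numbered top to bottom, starting at 0.
function resolveIntendedKey(startKey, endKey, holdMs) {
    if (holdMs < 30 && endKey.row === startKey.row + 1) {
        return startKey;
    }
    return endKey;
}

var T = { id: 'T', row: 0 };
var G = { id: 'G', row: 1 };

resolveIntendedKey(T, G, 12); // quick downward swipe: counts as T
resolveIntendedKey(T, G, 80); // deliberate hold on G: counts as G
```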

The end result is a game where you can almost blindly type on an on-screen keyboard, without being patronised. I'm proud to say that it's actually a fun typing game.

Try it yourself: Download ZType from the AppStore

Xibalba & WebVR

WebVR is an effort of Mozilla and Google to enable Virtual Reality content on the web. Experimental builds of Chromium and Firefox that provide a JavaScript API for WebVR have been available for a while now.

My game Xibalba already provided a stereo rendering mode and supported an external tracking server to get orientation updates from the VR headset, but it was clumsy and only worked for the old Oculus Rift DK1. So I decided to ditch the old stereo rendering mode and implement WebVR proper.

Xibalba VR

WebVR handles the distortion for the Headset by itself. All you have to do is render your game into a side-by-side stereo format and request fullscreen mode on your Headset. Orientation updates are provided by a nice, clean API that's easily implemented in the game.
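The side-by-side rendering itself is mostly viewport math: draw the scene once per eye into each half of the canvas. A minimal sketch, with names of my own choosing rather than Xibalba's actual code:

```javascript
// Compute the two half-width viewport rectangles for side-by-side
// stereo rendering. Each eye gets one half of the canvas.
function stereoViewports(width, height) {
    var half = Math.floor(width / 2);
    return {
        left:  { x: 0,    y: 0, width: half, height: height },
        right: { x: half, y: 0, width: half, height: height }
    };
}

var vp = stereoViewports(1920, 1080);
// The game then renders the scene twice, once per viewport – e.g. via
// gl.viewport() in WebGL or clipping/translation in a 2D context.
```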

I also added support for the JavaScript Gamepad API with a plugin for Impact – the game plays great with an XBox360 Gamepad, even if you don't have an Oculus Rift.

Play the game here:

If you have a DK2 but no WebVR enabled build of Chrome or Firefox, you can also try a standalone Windows version of the game. It's essentially a Chromium build bundled with the game and a batch file to start it. Makes it a bit easier to get going.

Download: ~80mb

XType Plus – an HTML5 Game for the Nintendo Wii U

For many indie game developers, getting their game onto a real gaming console is something like the holy grail. At least it was for me. The entry barrier for any gaming console seems disproportionately high compared to PC or mobile development.

Nintendo is trying to change that with their Nintendo Web Framework – it's essentially a WebKit browser for the Wii U, but it offers custom APIs to interface with the various controllers and Nintendo's online services. Nintendo also did a tremendous job of speeding up Canvas2D rendering on their console and providing excellent HTML5 Audio and WebAudio support. The result is a game development framework suitable for just about any 2D game you can think of, but with an extremely low entry barrier.

Gameplay Trailer for XType Plus

A few years ago I made an HTML5 game called X-Type, built on top of the Impact Game Engine. It was mostly meant to test the rendering speed of Browsers at the time, but turned out to be quite fun to play as well.

In 2013 Nintendo invited me to GDC Europe, where I presented a quick port of the game for the Wii U. The feedback I got there was extremely positive, so I decided to push the prototype as far as I could, polish all the rough edges and make a real game out of it.

A prototype of XType Plus at GDC Europe

Today, XType Plus launches in the Nintendo eShop in North America, Europe, the UK, Australia and New Zealand. I'm very excited to see this come to life!

For the most part, porting the game to the Nintendo Web Framework was very straightforward. My Game Engine Impact worked without any modifications and now officially supports the NWF as well to tightly integrate the Wii U controllers and other platform features.

There were some rough edges, particularly with Nintendo's online services, but these have mostly been smoothed out in current NWF releases. Interfacing with the GamePad screen and various controllers was a breeze and most features supported by a "real" Browser work as expected in NWF. Canvas and Audio work exactly as in the Browser and the settings in XType Plus for instance are simply saved to localStorage.

Even with all the additions for the Wii U, the game still runs in a desktop browser as well. This made debugging core gameplay features very simple.

Screenshot of XType Plus from the all new Plus Mode

If you want to know more about the creation process of XType Plus and the Nintendo Web Framework, please have a look at the interview I did back in February.

I hope to see your names pop up in the Highscores of XType Plus!

Xibalba – A WebGL First Person Shooter

More than a year ago I started to work on a project that was supposed to be released for 7DFPS – a game dev competition about making a First Person Shooter in 7 Days. I didn't meet the deadline, but the game I started back then is now finally finished.

Please Enjoy:

Xibalba – A WebGL First Person Shooter

Xibalba is also available in the iOS AppStore

The game was built on top of my HTML5 Game Engine Impact and is entirely written in JavaScript. While Impact is intended for 2D games, this first person shooter fit very naturally with the engine.

Xibalba, at its heart, is just a 2D game with a 3D viewport. So all the necessary elements for the gameplay were already provided by the game engine: collision detection and response, entity management, sound playback, a versatile level editor and much more. The only thing missing was a 3D view.

I created a 10 minute Making Of screencast that explains the ideas and the design behind the game in a bit more detail.

The Making Of Xibalba on Youtube

The iPhone and iPad version of the game was made with Ejecta. The game also runs just fine in Mobile Safari on the current iOS 8 beta, but the browser's UI, especially in landscape mode, unfortunately hinders the gameplay quite a bit. It's much more playable in Chrome for Android.

I also published the 3D Viewport part of Xibalba, along with another tiny demo game, under the name TwoPointFive – a tribute to the excellent ThreeJS library.

The TwoPointFive Plugin is available on Github:

Fast Image Filters with WebGL

WebGLImageFilter is a small JavaScript library for applying a chain of filters to an image.

Sergey Brin in his badass Tesla, sporting Chrome wheels

But to quote myself from twitter:

That awkward moment when you realize the lib you've been polishing for the past 6 hours already exists. With more features and a better API.

~ @phoboslab, Nov 3.

So, yes, there's already a library called glfx.js which basically does the same thing. And I'm almost saddened to say that it's excellent.

It's not all in vain, however. My implementation features raw convolution and color matrix filters. The latter should be particularly interesting for those coming from the Flash world: it's exactly the same as Flash's ColorMatrixFilter and allows for some nice "Instagrammy" photo effects, among others. There are some JavaScript libraries around that implement this ColorMatrixFilter, but they all do it directly in JavaScript, which is quite slow. Mine can be used in realtime, even on iOS with Ejecta.

I also learned some things about WebGL and shaders with all this. One interesting aspect is that you can't draw a framebuffer texture onto itself. So in order to apply more than one effect to the same image, you need to juggle two textures – use one as the source and the other as the target, then switch. And if you don't want to waste any draw calls, you need to know which draw call will be the last, so that you can draw directly onto the target canvas instead of onto a framebuffer texture.
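The texture ping-pong can be sketched as a tiny planning function: for N filter passes, alternate between two textures and send only the final pass to the canvas. This is my own illustration, not WebGLImageFilter's actual code:

```javascript
// Plan the source/target for a chain of filter passes using two
// ping-pong textures ('tex0' and 'tex1'). The first pass reads the
// original image; the last pass renders straight to the canvas, so no
// draw call is wasted on an extra copy.
function planPasses(filterCount) {
    var passes = [];
    for (var i = 0; i < filterCount; i++) {
        var last = (i === filterCount - 1);
        passes.push({
            source: i === 0 ? 'image' : 'tex' + (i % 2),
            target: last ? 'canvas' : 'tex' + ((i + 1) % 2)
        });
    }
    return passes;
}

var plan = planPasses(3);
// Each pass reads what the previous pass wrote; only the last pass
// targets the canvas.
```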

WebGL's coordinate system also complicates things a bit. It has the origin in the bottom left corner, instead of in the top left like most 2D APIs, essentially flipping everything on the Y axis. This has caused me a lot of pain with Ejecta already. The solution sounds trivial: just scale your drawing coordinates by -1, but this only gets you so far. If you need to get the pixel data of your image or draw one canvas into another, you suddenly have to invert everything again.
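A sketch of that Y-axis conversion, assuming a 2D-API coordinate system with the origin in the top left:

```javascript
// Convert a 2D-API point (origin top-left, Y pointing down) into WebGL
// clip space (origin at center, Y pointing up, both axes in [-1, 1]).
function toClipSpace(x, y, width, height) {
    return {
        x: (x / width) * 2 - 1,
        y: (1 - y / height) * 2 - 1   // flip the Y axis
    };
}

toClipSpace(0, 0, 320, 240);     // top-left corner → { x: -1, y: 1 }
toClipSpace(160, 120, 320, 240); // center          → { x: 0, y: 0 }
```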

Lesson learned: Always google before you start working.

Download: WebGLImageFilter on github.

HTML5 Live Video Streaming via WebSockets

When I built my Instant Webcam App, I was searching for solutions to stream live video from the iPhone's Camera to browsers. There were none.

When it comes to (live) streaming video with HTML5, the situation is pretty dire. HTML5 Video currently has no formalized support for streaming whatsoever. Safari supports the awkward HTTP Live Streaming and there's an upcoming Media Source Extensions standard as well as MPEG-DASH. But all these solutions divide the video into shorter segments, each of which can be downloaded by the browser individually. This introduces a minimum lag of 5 seconds.

So here's a totally different solution that works in any modern browser: Firefox, Chrome, Safari, Mobile Safari, Chrome for Android and even Internet Explorer 10.


A live recording from our office in Darmstadt, Germany. For a live streaming example, please check the free iOS app instead.

It's quite backwards, uses outdated technology and doesn't support audio at the moment. But it works. Surprisingly well.

The Camera Video is encoded to MPEG by ffmpeg on a local machine and then sent to a public webserver via HTTP. On the webserver a tiny nodejs script simply distributes the MPEG stream via WebSockets to all connected Browsers. The Browser then decodes the MPEG stream in JavaScript and renders the decoded pictures into a Canvas Element.

You can even use a Raspberry Pi to stream the video. It's a bit on the slow side, but in my tests it had no problem encoding 320x240 video on the fly at 30fps. This makes it, to my knowledge, the best video streaming solution for the Raspberry Pi right now.

Here's how to set this up. First get a current version of ffmpeg. Up to date packages are available at deb-multimedia. If you are on Linux, your Webcam should be available at /dev/video0 or /dev/video1. On OSX or Windows you may be able to feed ffmpeg through VLC somehow.

Make sure you have nodejs installed on the server through which you want to distribute the stream. Get the stream-server.js script from jsmpeg.

Now install its dependency to the ws WebSocket package and start the server with a password of your choosing. This password is there to ensure that no one can hijack the video stream:

npm install ws
node stream-server.js yourpassword

You should see the following output when the server is running correctly:

Listening for MPEG Stream on /<secret>/<width>/<height>
Awaiting WebSocket connections on ws://

With the nodejs script started on the server, you can now start ffmpeg on the local machine and point it to the domain and port where the nodejs script is running:

ffmpeg -s 640x480 -f video4linux2 -i /dev/video0 -f mpeg1video \
-b 800k -r 30

This starts capturing the webcam video in 640x480 and encodes an MPEG video with 30fps and a bitrate of 800kbit/s. The encoded video is then sent to the specified host and port via HTTP. Make sure to provide the correct secret as specified in the stream-server.js. The width and height parameters in the destination URL also have to be set correctly; the stream server otherwise has no way to figure out the correct dimensions.
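The stream server can recover everything it needs from that request path. Here's a sketch of the parsing step – my own illustration of the /<secret>/<width>/<height> scheme, not the actual stream-server.js code:

```javascript
// Parse an incoming request path of the form /<secret>/<width>/<height>
// and validate the secret. Returns null if the URL doesn't match.
function parseStreamUrl(path, expectedSecret) {
    var parts = path.split('/').filter(function(p) { return p.length > 0; });
    if (parts.length !== 3 || parts[0] !== expectedSecret) {
        return null;
    }
    var width = parseInt(parts[1], 10);
    var height = parseInt(parts[2], 10);
    if (!width || !height) {
        return null;
    }
    return { width: width, height: height };
}

parseStreamUrl('/yourpassword/640/480', 'yourpassword'); // { width: 640, height: 480 }
parseStreamUrl('/wrong/640/480', 'yourpassword');        // null – stream rejected
```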

On the Raspberry Pi you will probably have to turn down the resolution to 320x240 to still be able to encode with 30fps.

To view the stream, get the stream-example.html and jsmpg.js from the jsmpeg project. Change the WebSocket URL in the stream-example.html to the one of your server and open it in your favorite browser.

If everything works, you should be able to see a smooth camera video with less than 100ms lag. Quite nice for such hackery and a humble MPEG decoder in JS.

Again, for an easier to use solution, check the Instant Webcam App.

Drawing Pixels is Hard

Way harder than it should be.

Back in 2009 when I first started to work on what would become my HTML5 game engine Impact, I was immediately presented with the challenge of scaling the game screen while maintaining crisp, clean pixels. This sounds like an easy problem to solve – after all Flash did this from day one and "retro" games are a big chunk of the market, especially for browser games, so it really should be supported – but it's not.

Let's say I have a game with an internal resolution of 320×240 and I want to scale it up 2x to 640×480 when presented on a website. With the HTML5 Canvas element, there are essentially two different ways to do this.

a) Creating the Canvas element in the scaled up resolution (640×480) and draw all images at twice the size:

var canvas = document.createElement('canvas');
canvas.width = 640;
canvas.height = 480;

var ctx = canvas.getContext('2d');
ctx.scale( 2, 2 );
ctx.drawImage( img, 0, 0 );

b) Using CSS to scale the Canvas – In my opinion this is the cleaner way to do it. It nicely decouples the internal canvas size from the size at which it is presented:

var canvas = document.createElement('canvas');
canvas.width = 320;
canvas.height = 240; = '640px'; = '480px';

var ctx = canvas.getContext('2d');
ctx.drawImage( img, 0, 0 );

Both methods have a problem though – they use a bilinear (blurry) filtering instead of nearest-neighbor (pixel repetition) when scaling.

For the internal scaling approach (method a), you can set the context's imageSmoothingEnabled property to false in order to have crisp, nearest-neighbor scaling. This has been supported in Firefox for a few years now, but Chrome only just recently implemented it and it is currently unsupported in Safari (including Mobile Safari) and Internet Explorer (test case).

When doing the scaling in CSS (method b), you can use the image-rendering CSS property to specify the scaling algorithm the browser should use. This works well in Firefox and Safari, but all other browsers simply ignore it for the Canvas element (test case).

Of course Internet Explorer is the only browser that currently doesn't support any of these methods.

Not having crisp scaling really bothered me when I initially started to work on Impact. Keep in mind that at the time no browser supported either of the two methods described above. So I experimented a lot to find a solution.

And I found one. It's incredibly backwards and really quite sad: I do the scaling in JavaScript. Load the pixel data of each image, loop through all pixels and copy and scale the image, pixel by pixel, into a larger canvas then throw away the original image and use this larger canvas as the source for drawing instead.

var resize = function( img, scale ) {
    // Takes an image and a scaling factor and returns the scaled image

    // The original image is drawn into an offscreen canvas of the same size
    // and copied, pixel by pixel, into another offscreen canvas with the
    // new size.

    var widthScaled = img.width * scale;
    var heightScaled = img.height * scale;

    var orig = document.createElement('canvas');
    orig.width = img.width;
    orig.height = img.height;
    var origCtx = orig.getContext('2d');
    origCtx.drawImage(img, 0, 0);
    var origPixels = origCtx.getImageData(0, 0, img.width, img.height);

    var scaled = document.createElement('canvas');
    scaled.width = widthScaled;
    scaled.height = heightScaled;
    var scaledCtx = scaled.getContext('2d');
    var scaledPixels = scaledCtx.getImageData( 0, 0, widthScaled, heightScaled );

    for( var y = 0; y < heightScaled; y++ ) {
        for( var x = 0; x < widthScaled; x++ ) {
            var index = (Math.floor(y / scale) * img.width + Math.floor(x / scale)) * 4;
            var indexScaled = (y * widthScaled + x) * 4;
  [ indexScaled ] =[ index ];
  [ indexScaled+1 ] =[ index+1 ];
  [ indexScaled+2 ] =[ index+2 ];
  [ indexScaled+3 ] =[ index+3 ];
        }
    }
    scaledCtx.putImageData( scaledPixels, 0, 0 );
    return scaled;
};
This worked surprisingly well and has been the easiest way to scale up pixel-style games in Impact from day one. The scaling is only done once when the game first loads, so the performance hit isn't that bad, but you still notice the longer load times on mobile devices or when loading big images. After all, it's a stupidly costly operation to do, even in native code. We usually use GPUs for stuff like that.

All in all, doing the scaling in JavaScript is not the "right" solution, but the one that works for all browsers.

Or rather worked for all browsers.

Meet the retina iPhone

When Apple introduced the iPhone 4, it was the first device with a retina display. The pixels on the screen are so small that you can't discern them. This also means that in order to read anything on a website at all, the website has to be scaled up 2x.

So Apple introduced the devicePixelRatio. It's the ratio of real hardware pixels to CSS pixels. The iPhone 4 has a device pixel ratio of 2, i.e. one CSS pixel is displayed with 2 hardware pixels on the screen.

This also means that the following canvas element will be automatically scaled up to 640×480 hardware pixels on a retina device, when drawn on a website. Its internal resolution, however, still is 320×240.

<canvas width="320" height="240"></canvas>

This automatic scaling again happens with the bilinear (blurry) filtering by default.

So, in order to draw at the native hardware resolution, you'd have to do your image scaling in JavaScript as usual but with twice the scaling factor, create the canvas with twice the internal size and then scale it down again using CSS.
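The sizing math behind that dance, as a sketch (assuming devicePixelRatio is available; not the exact code from Impact):

```javascript
// Compute the internal (backing) size and the CSS size for a canvas
// that should appear as cssWidth×cssHeight but render at the display's
// native resolution.
function retinaCanvasSize(cssWidth, cssHeight, devicePixelRatio) {
    return {
        internalWidth:  cssWidth * devicePixelRatio,
        internalHeight: cssHeight * devicePixelRatio,
        cssWidth:  cssWidth + 'px',
        cssHeight: cssHeight + 'px'
    };
}

// On a retina iPhone (devicePixelRatio = 2), a 320×240 game canvas
// needs a 640×480 internal resolution, scaled back down via CSS:
retinaCanvasSize(320, 240, 2); // internal 640×480, CSS '320px'×'240px'
```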

Or, in recent versions of Safari, use the image-rendering: -webkit-optimize-contrast; CSS property. Nice!

This certainly makes things a bit more complicated, but devicePixelRatio was a sane idea. It makes sense.

Meet the retina MacBook Pro

For the new retina MacBook Pro (MBP), Apple had another idea. Instead of behaving in the same way as Mobile Safari on the iPhone, Safari for the retina MBP will automatically create a canvas element with twice the internal resolution you requested. In theory, this is quite nice if you only want to draw shapes onto your canvas – they will automatically be in retina resolution. However, it significantly breaks drawing images.

Consider this Canvas element:

<canvas width="320" height="240"></canvas>

On the retina MBP, this will actually create a Canvas element with an internal resolution of 640×480. It will still behave as if it had an internal resolution of 320×240, though. Sort of.

This ingenious idea is called backingStorePixelRatio and, you guessed it, for the retina MBP it is 2. It's still 1 for the retina iPhone. Because… yeah…

(Paul Lewis recently wrote a nice article about High DPI Canvas Drawing, including a handy function that mediates between the retina iPhone and MBP and always draws in the native resolution)

Ok, so what happens if you now draw a 320×240 image to this 320×240 canvas that is in reality a 640×480 canvas? Yep, the image will get scaled using bilinear (blurry) filtering. Granted, if it didn't use bilinear filtering, this whole pixel ratio dance wouldn't make much sense. The problem is, there's no opt-out.

Let's say I want to analyze the colors of an image. I'd normally just draw the image to a canvas element, retrieve an array of pixels from the canvas, and then do whatever I want to do with them. Like this:

ctx.drawImage( img, 0, 0 );
var pixels = ctx.getImageData( 0, 0, img.width, img.height );
// do something with

On the retina MBP you can't do that anymore. The pixels that getImageData() returns are interpolated pixels, not the original pixels of the image. The image you have drawn to the canvas was first scaled up, to meet the bigger backing store and then scaled down again when retrieved through getImageData(), because getImageData() still acts as if the canvas was 320×240.

Fortunately, Apple also introduced a new getImageDataHD() method to retrieve the real pixel data from the backing store. So all you'd have to do is draw your image to the canvas with half the size, in order to draw it at the real size. Confused yet?

var ratio = ctx.webkitBackingStorePixelRatio || 1;
ctx.drawImage( img, 0, 0, img.width/ratio, img.height/ratio );

var pixels = null;
if( ratio != 1 ) {
    pixels = ctx.webkitGetImageDataHD( 0, 0, img.width, img.height );
}
else {
    pixels = ctx.getImageData( 0, 0, img.width, img.height );
}
(Did I say it's called getImageDataHD()? I lied. You gotta love those vendor prefixes. Imagine how nice it would be if there also was a moz, ms, o and a plain variant!)

The "Good" News

Ok, take a deep breath, there are only 3 different paths you have to consider when drawing sharp pixels on a scaled canvas.

The CSS image-rendering property and the Canvas' imageSmoothingEnabled really make things a bit easier, but it would be nice if they were universally supported. Especially Safari is in desperate need for imageSmoothingEnabled-support, with all the crazy retina stuff they have going on.

Let me also go on record saying that backingStorePixelRatio was a bad idea. It would have been a nice opt-in feature, but it's not a good default. A comment from Jake Archibald on Paul Lewis' article tells us why:

<canvas> 2D is a bitmap API, it's pixel dependent. An api that lets you query individual pixels shouldn't be creating pixels you don't ask for.

Apple's backingStorePixelRatio completely breaks the font rendering in Impact, makes games look blurry and breaks a whole bunch of other apps that use direct pixel manipulation. But at least Apple didn't have to update all their dashboard widgets for retina resolution. How convenient!

Update September 18th 2012: To demonstrate the bug in Safari, I built another test case and filed a report with Apple.

What the Fucking Fuck, Apple?

Update Tuesday, July 16th 2012: The Bug appears to be fixed in iOS 6 Beta 3.

Almost two years ago I noticed a strange multitouch problem in Mobile Safari with one of my games. This problem was new; it was introduced with the iOS 4.2.1 update and wasn't present on older iOS versions.

I built a test case and filed a bug report with Apple. It was marked as a duplicate of a bug they already knew about. I have no idea what that original bug report said, because fuck you, Apple's bug tracker is private. That was almost two years ago.

Two years.

I re-submitted this bug report 4 more times, after it wasn't fixed in the subsequent iOS updates. It was marked as a duplicate and closed each time. It still hasn't been fixed in the iOS 6 Beta.

Two fucking years.

Annoyed by this bug, Jayson Potter recently contacted me, asking for the bug id and description of my report and re-submitted it by himself. This time it wasn't marked as a duplicate. Instead, he got this reply from Apple a week later:

Please know that our engineers have not been able to reproduce this reported behavior with iOS 5.x

WHAT THE FUCKING FUCK, APPLE? You need two fucking years to decide that you can't reproduce a bug with a test case so simple that a three year old could understand it?

This is beyond frustrating. And there's no alternative to Mobile Safari on iOS; Apple doesn't allow it. On the iPhone, you're as locked in with HTML5 as you were with Flash everywhere else before - you're completely at the mercy of one fucking company.

So I'm really desperate to get this bug fixed – I recorded a nice video and voice over, explaining the bug itself and why it's such a big deal. I re-submitted the bug report again this morning (Bug #11796586), with a link to it:

Demonstrating the bug in the test case and my game Biolab Disaster

All in all I probably spent about 40 hours dealing with this shit: building the test case, reporting it to Apple, writing documentation that states where this bug is and why it exists, explaining to my customers that it's not their fault, nor mine, that their on-screen button doesn't work, and apologizing to them for the fact that Apple is a fucking ignorant piece of shit company.

Fucking fix it, Apple.

X-Type – Making Of

For this year's Google IO, Google asked me to do a Chrome experiment for Mobile for them. They initially wanted me to vamp up Biolab Disaster – it's still a good game, but because of its retro style it wouldn't be that impressive. Modern mobile browsers can do a lot more.

I suggested I would try to take another game of mine – X-Type – and make it work on mobile browsers. The game was made with my JavaScript Game Engine, so it mostly "just worked" on mobile browsers already. Yet, I still had a lot of work to do.

Have a look at X-Type over at

X-TYPE running on various mobile devices

Screen Size

One of the most difficult things for HTML5 games is dealing with the vast amount of different screen sizes and aspect ratios out there. I experimented a lot with different solutions and ended up with a fairly simple one: the internal width of the game's viewport is always 480px. These 480px get scaled to whatever is available on the device. The height of the viewport is variable, so that it fills the screen with the same scaling factor as the width.

// The internal width for our canvas is fixed at 480px.
// The internal height is set so that it fills the screen when scaled
canvas.width = 480;
canvas.height = window.innerHeight * (canvas.width / window.innerWidth); = window.innerWidth + 'px'; = window.innerHeight + 'px';

In older browsers (Mobile and Desktop), scaling the <canvas> element was a horrible idea – it decreased performance to a tenth of what it would be unscaled. I'm happy to report that this is no longer true; Mobile Safari on iOS5 and the Chrome Beta on Android work just fine with scaled canvas. It still makes the game unplayable in Android's "Browser", though.

I also took care to only display the game in portrait mode and show a "Please Rotate the Device" message otherwise. Mobile Safari and Chrome both support the orientationchange event, which makes this easy. However, we cannot rely on window.orientation, which reports the rotation in degrees (0, 90, 180 or 270), because some devices report 0° for portrait mode, while others report 0° for landscape. How convenient!

The solution is to just check if the window height is bigger than the width – if so, we're obviously in portrait mode! But as this would be too easy, Chrome's Browser offers another challenge for us: it only updates the window dimensions after it has fired the orientationchange event. So we listen for orientationchange and resize events. Sigh.

var wasPortrait = -1;
var checkOrientation = function() {
    var isPortrait = (window.innerHeight > window.innerWidth);
    if( isPortrait === wasPortrait ) { return; } // Nothing to do here
    wasPortrait = isPortrait;

    // Do your stuff...
};
window.addEventListener( 'orientationchange', checkOrientation, false );
window.addEventListener( 'resize', checkOrientation, false );


Since iOS 5 the <canvas> element is hardware accelerated and it really shows. You can draw hundreds of sprites on the screen without any slowdowns at all. The same is true for Chrome on Android – to a certain degree.

All drawing is scheduled via requestAnimationFrame and thus bound to the display's refresh rate. This works nicely on iOS, but Chrome refuses to draw at 60Hz even for the simplest scenes. You can use setInterval to process more frames, but only a portion of them is really presented on the screen.
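The scheduling in question boils down to a self-rescheduling callback – a minimal sketch, where update() and draw() stand in for the game's own step and render functions:

```javascript
// A minimal requestAnimationFrame loop; update() and draw() are
// placeholders for the game's own step and render functions
var running = true;

function frame() {
    update();  // advance the game state
    draw();    // render the current state to the canvas
    if( running ) {
        window.requestAnimationFrame( frame );  // re-schedule for the next repaint
    }
}
```

In the browser the loop is started with a single window.requestAnimationFrame( frame ) call; stopping it is just a matter of setting running to false.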

So while Chrome's JavaScript engine is fast enough to process and render 60 FPS, it fails to display all of the rendered frames. I have no doubt that this bug(?) will get fixed.

While the game works in a multitude of other browsers on Android, such as Firefox, Dolphin or Opera, none of them provided good performance. I suspect the <canvas> element is not hardware accelerated in any of these, but I didn't investigate further.


In the desktop version of the game you move the player with the arrow keys and aim and shoot with the mouse. Of course this doesn't work on touch screens, so I opted for dual virtual analog sticks. This worked out surprisingly well – with a bit of practice, you can control your spaceship quite precisely.

I also tried to make the analog sticks appear where you touch the screen first, so that you can always change their position. This made everything quite a bit confusing; it's easier to grasp the concept when the position of the analog sticks is fixed. The dynamic positioning is probably more of a "pro gamer" feature that should be optional, if offered at all.


This is the sad part. I complained about support for the <audio> element in mobile browsers last year already – and guess what: it's still the same shit. Apple hasn't done anything at all to improve the situation; same goes for Android's Browser. The Chrome Beta on Android seems to have some support for Audio, but it's not really usable for real time games at the moment. I'll investigate this further.

As always, I have high hopes though. Never give up, never surrender!

All in all, I'm very pleased with the results. Rendering performance in modern mobile browsers is really awesome and the quirks I encountered were workaroundable.

I learned a lot with this project and will use this newly gained knowledge to make mobile browser support much easier in the next version of Impact.

Measuring Input Lag in Browsers

For games, the time between a key-press and the appropriate action happening on the screen can often mean the difference between virtual life and death.

It is no surprise that game developers go to great lengths to reduce input lag as much as possible. Guitar Hero, for instance, even has an option to compensate for lag. One of the biggest time hogs these days seems to be displays, but a poorly written application or game can introduce a lot of lag as well.

The typical path a key-press travels – Keyboard » USB Driver » OS » Application – is bad enough, as each of these layers can introduce some lag. Yet JavaScript games have to go through an additional layer they have no control over: the browser – which in itself may have several layers that introduce lag before a JavaScript callback for a keydown event can be called.

I decided to try and measure this lag for different browsers to see if there's room for improvement. The good news: most browsers do a good job delivering input to JavaScript events as fast as possible. The bad news: Chrome does not.

I built a simple website that draws to a canvas element at 60 frames per second, counts the number of frames and captures input events. I used my camera (a Samsung NX200) to record video at 120 frames per second, then slowed the video down even further to have a close look at the results.
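The relevant part of such a test page is tiny – a sketch of it, assuming a 2D canvas context is passed in (the variable and function names here are mine, not the original page's):

```javascript
var frameCount = 0;   // frames drawn so far
var keyFrames = [];   // frame number visible when each keydown arrived

// Called once per display frame (via requestAnimationFrame in the browser)
function drawFrame( ctx ) {
    frameCount++;
    ctx.fillStyle = '#000';
    ctx.fillRect( 0, 0, 640, 480 );
    ctx.fillStyle = '#fff';
    ctx.fillText( 'frame ' + frameCount, 20, 20 );
}

// Called from a keydown listener; the slow-motion video then shows how many
// frames pass between the physical key-press and this number reacting on screen
function recordKey() {
    keyFrames.push( frameCount );
}
```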

The response time from the command prompt was used as a baseline (I measured the same lag with Quake 3 and other native games): 5 frames, or about 83ms. This is pretty awful to begin with. I'm not really sure where all this lag comes from, but I blame the display. Since I measured all (Windows) browsers with the same setup, the results are still comparable.

Firefox 14 had the fastest response time of only 5 frames, followed by IE9, Opera 12 and Safari 5.1.5 (Mac) with 6 frames. The real surprise here is Chrome's response time of 8 frames or 133ms – 50ms more than Firefox. This doesn't sound like much, but if you directly compare a real JavaScript game in Chrome and Firefox, you can definitely feel the difference.

The input lag for Mobile Safari on iOS6 is around the 5 frame (~83ms) mark as well. I also tried to measure Android's "Browser" and the Chrome Beta on my Galaxy Nexus, but couldn't get accurate results.

Chrome on Android refused to render at more than 20 frames per second and the "Browser", while proclaiming to render at 60 frames per second, only really presented every fourth frame. This is clearly visible in the slow motion video – 4 boxes appear at the same time, instead of one after the other. This is also why HTML5 games in the "Browser" (damn, I hate that "name") still seem to stutter, even though they are "rendered" at 60 frames per second.

Android has a lot of catching up to do.

I built another simple website that behaves like Guitar Hero's lag calibration. As humans, we try to compensate for lag by pressing a bit early, so the results from this aren't as accurate as those measured with a camera, but you can clearly see and feel the difference between Chrome and just about any other desktop browser.

Try it here:
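The arithmetic behind such a calibration page is simple: measure how far each key-press lands from the nearest beat. A sketch (the function name is mine):

```javascript
// Given the time since the first beat and the beat interval (both in ms),
// return the tap's distance from the nearest beat:
// positive = pressed late, negative = pressed early
function tapOffset( elapsed, interval ) {
    var offset = elapsed % interval;
    if( offset > interval / 2 ) {
        offset -= interval;  // closer to the *next* beat, so count as early
    }
    return offset;
}
```

Averaging these offsets over a number of taps gives a rough lag estimate – rough, because players instinctively compensate by pressing early.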

Update June 26th, 2012

As suggested by Filip Svendsen in the comments, I made another video to show mouse input lag in Firefox 14 and Chrome 22. To make the lag easier to spot, I wrote a Python script to move the mouse cursor at constant speeds.

From the video, it's quite obvious that the mouse lag in Chrome is much higher than in Firefox. However, the movement looks much smoother in Chrome. I guess you can't have everything - at least not yet.

What the requestAnimationFrame API Should Have Looked Like

Chrome, Firefox and IE10 now support requestAnimationFrame – a noble idea, with a totally backwards API.

Imagine you have this JavaScript animation that you want to constantly update. An animation that constantly updates, needs to be updated more than once, i.e. constantly. After all it's an animation. So why on earth do I have an API to request one update at a time?

Mozilla's documentation even warns developers that the API might not work as expected:

Note: Your callback routine must itself call requestAnimationFrame() unless you want the animation to stop.

Ah, yes.

Car analogy time: I have this very nice car that makes a beeping noise whenever I open the door with the ignition off and the headlights still on. You know, to warn me that my headlights might run the battery dry in my absence. "Thank you car, that's very kind of you. But let me ask you something: why didn't you just turn the headlights off yourself instead of notifying me?"

The requestAnimationFrame API is modeled after setTimeout when it should've been modeled after setInterval instead.

Here's my proposal:

// Install an animation that continously calls the callback
var animationId = setAnimation( callback, element );

// Stop the animation
clearAnimation( animationId );

And the code to make this happen:

(function(w) {
    "use strict";

    // Find vendor prefix, if any
    var vendors = ['ms', 'moz', 'webkit', 'o'];
    for( var i = 0; i < vendors.length && !w.requestAnimationFrame; i++ ) {
        w.requestAnimationFrame = w[vendors[i]+'RequestAnimationFrame'];
    }

    // Use requestAnimationFrame if available
    if( w.requestAnimationFrame ) {
        var next = 1,
            anims = {};

        w.setAnimation = function( callback, element ) {
            var current = next++;
            anims[current] = true;

            var animate = function() {
                if( !anims[current] ) { return; } // deleted?
                callback();
                w.requestAnimationFrame( animate, element );
            };
            w.requestAnimationFrame( animate, element );
            return current;
        };

        w.clearAnimation = function( id ) {
            delete anims[id];
        };
    }

    // [set/clear]Interval fallback
    else {
        w.setAnimation = function( callback, element ) {
            return w.setInterval( callback, 1000/60 );
        };
        w.clearAnimation = w.clearInterval;
    }
})(window);

Gist on Github

Hate IE with a Passion

There's no denying that HTML5 Gaming is going to be huge. It was a big topic at this year's GDC and there's an arcade machine running an HTML5 game currently at SXSW. I've been at Future of Mobile and onGameStart last year, giving some insight and showing off my Game Engine. HTML5 Gaming is here to stay.

In my talk at onGameStart I had a short section dedicated to Internet Explorer: I apologized to the one guy from Microsoft in the audience - then switched to a slide stating "Hate IE with a Passion" and a next one showing the IE logo with a speech bubble, proclaiming "Please kill me!", disgustingly set in Comic Sans because "IE doesn't deserve a better font".

The part about IE is 26 minutes in.

But wait, didn't I say in an earlier post that IE9 works great with my Game Engine?

IE9's smoothness on the other hand is remarkable. Of all browsers and systems I tested, IE9 subjectively produced the best results.

IE9 was a huge leap forward from IE8 and all the things that are implemented in IE9 work pretty well. There are some problems with

So where's the problem?

It's all the things that IE9 doesn't do. And I'm not talking about some nice to have features, but Microsoft's general attitude. They pretty much ensure that IE will always stay obsolete and be every developer's last choice. Here's how.

Auto Updates and Release Cycle

This is a big one. IE8 was released in March 2009, IE9 in March 2011 - two years later. There's no release date for IE10 yet, but it will take at least until March next year. Microsoft has finally decided to have some sort of auto update, but it will leave a lot of users behind. It will take years until all IE users have updated.

Compare this with the Chrome and Firefox guys: they release a new version every 6 weeks now. And the best thing - users don't even notice. Chrome's update mechanism is truly invisible to the average user. If you're doing some cutting edge web stuff, you no longer have to say "works in Chrome 17", but just "works in Chrome". Nobody cares about the version number, because everyone is always on the newest version, or at most two versions behind.

From a developer's standpoint, this is invaluable. It gives us certainty. We can be sure that any feature that was added 12 weeks ago has now arrived for the vast majority of users.

Not so with IE. Paul Irish recently had a beautiful article about the clusterfuck that is Microsoft's lifecycle policy. It's just sad.

<Video> and <Audio> Codecs

Firefox, Chrome and Opera support the open source WebM and Ogg Theora video codecs and the open source Ogg Vorbis audio codec. IE supports none of these.

I know WebM is controversial, but damn, they are Microsoft. They should be able to come up with a solution that's good for the web: buy the MPEG-LA and release all h264 patents into the public domain; improve WebM; invent a new, open video codec. Whatever, just do something.

With Ogg Vorbis I can't even think of a single reason not to support it. Over the last few months I asked a few guys from Microsoft why they don't support Ogg Vorbis - nobody could give me an answer. Nobody knows. Maybe we are not complaining loudly enough?

WebAudio, Fullscreen, Mouselock…

There are ongoing efforts to implement better audio APIs, fullscreen support, mouse lock (all desperately needed for games) and a whole bunch of other stuff. And browser vendors are working together to form a common standard - yet, Microsoft is curiously absent from these discussions.

Of course IE will support those features eventually. Maybe in version 11 or 12, after every other major browser made it clear that these are useful and important. Then it will take another 3-4 years for IE users to update. So we can safely use the fullscreen API for IE in about 6 years. Awesome.

Microsoft should spark those discussions instead of trying to catch up years later.

IE is slowing everyone down and makes your life as a developer more complicated. It actively prevents progress. Microsoft just looks at what everyone else does, implements the bare minimum they need in order to not be completely ignored, and annoys us with a yearly update cycle and millions of users who are unable to update or unaware that they should. This is not how the web works.

I wrote this post because I still care about IE - I still haven't given up completely.

What can you do? Force Microsoft to fix IE or let it die?

Hate IE with a passion!

Multiple Channels for HTML5 Audio

Now that I've calmed down a bit after my The State of HTML5 Audio article, let's see if I can actually write something productive.

Here's the deal: for a typical game you may have a sound effect that you want to play very often. In Biolab Disaster the sound of your plasma gun is such an effect. This particular sound has a length of 0.6 seconds, but you sure can mash the shoot button quicker than 1.66 times per second – so to not interrupt the playing sound each time you press the button, but instead play a new one, we need multiple sound “channels”.
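Once such a channel pool exists, playing a sound just means picking a channel that isn't busy – a minimal sketch (the helper name is mine; building the channels array is the hard part, covered below):

```javascript
// Play a sound on the first idle channel; if all are busy, steal one.
// `channels` is an array of audio elements sharing the same source.
function playOnFreeChannel( channels ) {
    for( var i = 0; i < channels.length; i++ ) {
        if( channels[i].paused || channels[i].ended ) {
            channels[i].currentTime = 0;  // rewind and fire
            channels[i].play();
            return channels[i];
        }
    }
    channels[0].currentTime = 0;  // all busy: restart the first channel
    channels[0].play();
    return channels[0];
}
```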

There are a few different ways to create an <audio> element:

var a1 = new Audio();
var a2 = document.createElement( 'audio' );
var a3 = a1.cloneNode( true );

(You could also use .innerHTML or document.write() if you're of the adventurous type.)

An important thing to know about the

So, the most logical way to create our sound channels I can think of is this (let's say we want to have four channels):

var channels = [];
for( var i = 0; i < 4; i++ ) {
    var a = new Audio( 'sound.ogg' );
    a.preload = 'auto';
    channels.push( a );
}

But hold on! Doesn't this result in the browser requesting that sound file 4 times? Well, it really shouldn't – if I create 100 <img> elements with the same source file, that source file is only loaded once. The same should apply to all other resources.

But guess what?

Second try. Let's create only one <audio> element and clone it:

var a = new Audio( 'sound.ogg' );

// Add the audio element to the DOM, because otherwise a certain 
// retarded Browser from Redmond will refuse to properly clone it
document.body.appendChild( a );

var channels = [a];
for( var i = 0; i < 3; i++ ) {
    channels.push( a.cloneNode(true) );
}

Result? Firefox 3.6 and Opera 11 load the sound file only once. Firefox 4, Chrome 11, Safari 5 and IE9 still request it multiple times.

Ok, what if we wait till the sound file has been loaded completely and only then clone it? Pfft, fine:

a.addEventListener( 'canplaythrough', function(ev){
    for( var i = 0; i < 3; i++ ) {
        channels.push( a.cloneNode(true) );
    }
}, false );

(Boring test page)

With that, Firefox 3.6, Firefox 4, Opera 11, Chrome 11 and IE9 all load the sound file just once. Note however, that the canplaythrough event is not exactly the right place to clone the audio element. The audio file might not be loaded completely, but just enough to play the whole thing without interrupting, provided that the download rate doesn't change. It's a nice event to have for long sound files (music), but a bit pointless for short samples.

Why didn't I use the onload event? Well, the HTML5 Media Elements define a number of events, but onload is not one of them. There's also no “completed” event or something like that.

Why didn't I use the progress event and check for e.loaded and e.total? Only Firefox supports those two properties. For other browsers there's the buffered object that specifies a number of “TimeRanges”.

In a quick test though, Chrome only fired the progress event once for my short sound file. And at the time it did that, the buffered object was still empty. If I wanted to check if the file has been loaded completely, I would need to set up an interval and poll for the loaded ranges. Pretty.
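For completeness, such a poll could look like this – a sketch, assuming the buffered TimeRanges behave as specified (the function name is mine):

```javascript
// Check whether the buffered TimeRanges cover the entire file
function isFullyBuffered( audio ) {
    var b = audio.buffered;
    return b.length === 1 &&        // one contiguous range…
           b.start( 0 ) === 0 &&    // …starting at the beginning…
           b.end( 0 ) >= audio.duration;  // …reaching the end
}
```

You would then call this from a setInterval until it returns true.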

At this point I stopped for a moment and remembered why I'm doing this in the first place: Browsers, could you please properly cache audio files? Not you Opera; today I like you.

But there's more: Did you notice that Safari was missing in the list of browsers that only loaded the sound file once for the last example? Did you also notice how I always wrote the file was loaded “multiple times” instead of just saying “4 times” as in "once for each element"?

Here's the reason: for that simple HTML page with one external resource, Safari does not send 2 HTTP requests as one might expect, but 18:

"GET /html5audio/ HTTP/1.1" 200 506
"GET /html5audio/beep.mp3 HTTP/1.1" 206 2
"GET /html5audio/beep.mp3 HTTP/1.1" 206 16744
"GET /html5audio/beep.mp3 HTTP/1.1" 206 2
"GET /html5audio/beep.mp3 HTTP/1.1" 206 16744
"GET /html5audio/beep.mp3 HTTP/1.1" 206 2
"GET /html5audio/beep.mp3 HTTP/1.1" 206 2
"GET /html5audio/beep.mp3 HTTP/1.1" 304 -
"GET /html5audio/beep.mp3 HTTP/1.1" 206 16744
"GET /html5audio/beep.mp3 HTTP/1.1" 206 16744
"GET /html5audio/beep.mp3 HTTP/1.1" 206 2
"GET /html5audio/beep.mp3 HTTP/1.1" 304 -
"GET /html5audio/beep.mp3 HTTP/1.1" 304 -
"GET /html5audio/beep.mp3 HTTP/1.1" 206 16744
"GET /html5audio/beep.mp3 HTTP/1.1" 206 7068
"GET /html5audio/beep.mp3 HTTP/1.1" 206 15620
"GET /html5audio/beep.mp3 HTTP/1.1" 206 5620
"GET /html5audio/beep.mp3 HTTP/1.1" 206 14172

The last number on each line indicates the number of bytes sent. The beep.mp3 file is 17kb in size, but Safari only requested certain byte ranges of it. Now that's clever!

The State of HTML5 Audio

When I started to work on my JavaScript Game Engine back in October 2009, the biggest problems I encountered were with the new HTML5 Audio Element. The Canvas Element already worked nicely in all browsers that supported it at the time, albeit some were a little slow.

Now, in 2011, the rendering performance for Canvas has been improved dramatically, audio however is still broken in large parts. I think it is time for a change in tone. Be warned, there's some profanity ahead because HTML5 Audio is still that fucked up.

Before we start, you may want to play a quick round of Biolab Disaster or Z-Type and have a look at a simple test case to experience the sound issues first hand.

Browsers From Companies That Actually Care About HTML5 Audio:

Firefox 4, Beta 13pre

Almost perfect Audio support with occasional clicks & pops when playing many sounds. Notably, Firefox 4 also features the Audio Data API for generating and modifying sounds on the fly – which I think is brilliant and desperately needed to compete with Flash.

Note that the current Beta 12 still has some larger timing issues, but this seems to be fixed in the Beta 13 nightly build.

Opera 11

Good Audio support overall. Some clicks & pops and minor timing issues. Works nicely for games but still needs improvement.

Chrome 11

Severe problems with short sound clips, timing issues, clicks & pops, goes completely silent after a few seconds. Pretty much useless for games at the moment.

(Come on guys, you've done some amazing things in the past, I'm sure you could get HTML5 Audio right if you just tried!)

Update: As mentioned on Twitter and in the comments, Chrome also has an Audio Data API that is somewhat similar to the one in Firefox 4.

Browsers From Companies That Hate the Web Enough to Not Support Ogg/Vorbis, but do Have an <Audio> Tag So They Can Say They Have an <Audio> Tag (Seriously, Fuck You):

(Wohoo, big surprise, it's Apple and Microsoft! Who would've thought?)

Safari 5

Some clicks and pops, lag for short sound files. Doesn't support Ogg/Vorbis, but instead favors MP3 – a patent encumbered codec with a far worse quality to size ratio that was never meant for short sound clips and introduces a leading and trailing silence. Ride on.

Internet Explorer 9

Doesn't play audio in my test case at all. Sure, I could adjust my test case so that it would work, by doing something like this:

if( IE ) {
    // do things the IE way
}
else {
    // do things the way everyone else does
}

But quite frankly, I don't give a shit anymore about this stupid piece of crap that Microsoft calls a browser and throws out a release for every 3 years, just in time to not let it fade out completely but to annoy web developers for years to come. IE9 is not even out yet and is already 2-3 years behind every other major browser.

Internet Explorer, you would do us all a huge favor if you would just die a silent death. Please go away. Nobody likes you.

Browsers That Say They Support HTML5 Audio But Actually Don't Support HTML5 Audio:

Android Browser

Understands the <Audio> element, but can't play sounds at all, because it has no codecs. Not even Wave. Utterly useless for anything.

Mobile Safari

Can't pre-load sound files, only plays one sound at a time, severe lag, timing issues, clicks & pops, completely ignores every other call to play a sound, doesn't support Ogg/Vorbis. Utterly useless for anything. Oh, and did I mention it doesn't support Ogg/Vorbis?

Final Thoughts

Surprisingly, Google's Chrome has the worst HTML5 Audio support of all the good Desktop Browsers - that is every Browser but IE. I'm not an audio engineer, but before Browser vendors took their shot at it, my impression of digital audio was that it is a solved problem. I'm amazed that after so many iterations HTML5 Audio is still that broken.

The Audio support on mobile Browsers (iOS and Android) is laughable at best. It's completely unusable for even the simplest of tasks. You can jump through hoops and ask real nice, but it still sucks ass.

I've heard some arguments that it is crippled on purpose, because you don't want a website to start blaring music at you as soon as you open it. Fine, I understand that argument. But why not ask the user for permission to play audio and then do it right? It's already been done with Geo Location and Storage.

And as for Vorbis support, Adobe should send some flowers to Microsoft and Apple for being in the same douchebag company league as they are. By trying to make things as complicated as possible they ensure that the easiest way to have audio in all Browsers is still Flash. Good job.

But seriously, I really don't understand why they can't support Ogg/Vorbis. This is not the same discussion as with HTML5 Video, mind you. Quality wise there is no question that Vorbis is better than MP3; there are no patent issues with Vorbis like there are with WebM and I even fail to see any "company politics" reason for not supporting it.

“Mimimi, there are no hardware decoders for Vorbis” you say? Fuck you. The first gen iPod touch can decode Vorbis in software while running Wolfenstein 3D with 60fps.

To conclude this rant, let me say that I'm a bit angry at myself for giving Internet Explorer that much screen time again, when instead we should all stop caring about that piece of shit.

Game On Spotlight: Z-Type

This post first appeared in Mozilla's Gaming Blog Game On Spotlight: Z-Type

The Idea

Z-Type was specifically created for Mozilla’s Game On. I immediately wanted to participate in the competition when I first heard of it, but the deadline seemed so far away that I didn’t bother to begin working on a game back then. Fast forward to this tweet announcing that the deadline was only one week away – it took me by surprise. I still hadn’t even begun working on anything. The thought of just submitting my earlier game Biolab Disaster crossed my mind but was immediately dismissed again.

Because time was short, I decided to create an open ended arcade game where you shoot for the highscore. A game like Tetris that just gets faster and faster the longer you survive. A game where I wouldn’t have to design multiple levels or write a story.

I enjoyed Crimsonland’s Type-o-Shooter mode back in 2003 and recently discovered Typing of the Dead. And while many JavaScript and Flash apps out there test your typing speed, they make no attempt at being a game. So I set out to fill that void with Z-Type.

Although the games don’t look or play alike, the name Z-Type is an homage to the classic space shooter R-Type – but with a Z! Because Z is a really cool letter.


I initially wanted to create a Resident Evil-like setting with zombies and lots of blood and gore. But considering the time frame and my limited artistic abilities, I ultimately decided on a more abstract space theme. I also wanted a game that looks modern, to show that Impact can be used for more than just retro pixel style Jump’n’Runs. The simplistic, yet shiny looks of Geometry Wars came to my mind.

The most important thing when you want to create such a neon look is that you don’t just draw some semi-transparent PNGs to the canvas, but draw them with an additive blend mode. You can switch to the additive blend mode in the canvas API using ctx.globalCompositeOperation = 'lighter'. Now, when you draw an image onto the canvas, the resulting color of each pixel will only be lighter, never darker, than the destination pixel. This is great for explosions, fire and all kinds of self-illuminating effects. All the spaceships in Z-Type are drawn in this mode to give them the glowing look.
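Wrapped in a small helper, that looks something like this (a sketch; the function name is mine):

```javascript
// Draw a sprite with additive blending, then restore the previous mode
function drawGlowing( ctx, sprite, x, y ) {
    var previousMode = ctx.globalCompositeOperation;
    ctx.globalCompositeOperation = 'lighter';  // additive: pixels only get brighter
    ctx.drawImage( sprite, x, y );
    ctx.globalCompositeOperation = previousMode;
}
```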

The background of the game consists of two layers: one is just a static gradient from black to blue, the other is a slightly tilted grid that constantly scrolls downwards to give a sense of movement and progress.

All fonts in the game are drawn from font bitmaps instead of using the canvas APIs .fillText() method. This is done for two reasons: it’s fast and it looks exactly the same in all browsers on all systems, regardless of the installed fonts or how .fillText() is implemented.
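Drawing from a font bitmap boils down to copying one source rectangle per character. A sketch, assuming a monospaced font image with all glyphs in a single row, starting at the space character (ASCII 32) – the layout and function name are mine, not the engine's:

```javascript
// Draw `text` at (x, y) from a bitmap font image laid out as one row of
// fixed-width glyphs, the first one being ' ' (char code 32)
function drawBitmapText( ctx, fontImage, text, x, y, charWidth, charHeight ) {
    for( var i = 0; i < text.length; i++ ) {
        var index = text.charCodeAt( i ) - 32;
        ctx.drawImage(
            fontImage,
            index * charWidth, 0, charWidth, charHeight,  // source rect in the font image
            x + i * charWidth, y, charWidth, charHeight   // destination rect on the canvas
        );
    }
}
```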


Z-Type is built on top of my JavaScript Game Engine. It solely relies on the Canvas element for drawing. This had some negative performance implications in the past, compared to using lots of HTML elements, but the Canvas drawing performance in the newest generation of browsers is really excellent. Also, from a game programming standpoint, drawing in a Canvas feels much more natural; it feels like the right thing to do.

The engine already provides much of the stuff any 2D game needs. That includes animations, background drawing, collision detection and response, pre-loading, timing, sound and input facilities and game entity management. With all this in place, the source for the game itself (minus the word list) clocks in at just 1100 lines of code.

All enemies are subclassed from a base enemy class that draws the label, manages targeting, health (remaining letters), being hit and spawning explosions on death. The actual enemy classes used in the game are only about 30 lines of code each, as their individual AI is very simple – deciding for a direction to fly to and (for some enemies) spawning a missile once in a while.


It’s the small details, the polish, that make or break a game.

The one thing I took special care of with Z-Type is that every shot you take, every keystroke, is as satisfying as possible. It’s a lesson that is preached and lived by id Software – the creators of the Doom and Quake series. The plasma bullet you shoot has to look awesome, it has to sound awesome and there has to be an explosion for every single hit. It is absolutely necessary that you feel powerful while playing the game, even if you are not.

Consider this: what is more fun – shooting at an enemy twenty times with a pistol, or shooting at twenty enemies with a rocket launcher? Both might be equally challenging, but the latter is much more satisfying because you get better feedback. Tweaking a game to feel right in this regard is hard work, but work that certainly pays off.

Z-Type also tries to be smart about which enemy you are targeting. If two enemies start with the same letter, it always chooses the one that is closer to you. Furthermore, the game tries to avoid making this decision in the first place: it tries to spawn enemies whose first letter is unique at the given moment.
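The first part of that targeting logic can be sketched in a few lines (names are mine; the real engine code differs):

```javascript
// Among enemies whose word starts with the typed letter, pick the one
// closest to the player (squared distance is enough for comparison)
function findTarget( enemies, letter, player ) {
    var best = null, bestDistance = Infinity;
    for( var i = 0; i < enemies.length; i++ ) {
        var e = enemies[i];
        if( e.word.charAt( 0 ) !== letter ) { continue; }
        var dx = e.x - player.x, dy = e.y - player.y;
        var distance = dx * dx + dy * dy;
        if( distance < bestDistance ) {
            bestDistance = distance;
            best = e;
        }
    }
    return best;  // null if no enemy matches the letter
}
```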

Another important lesson I learned throughout my years of playing video games is to be forgiving to the player: always decide in the player’s favor. There’s nothing more frustrating than a bullet hitting you that doesn’t look like it’s hitting you. On the other hand, nobody minds when it looks like a bullet hits you but doesn’t actually do any damage. Quite the contrary: you feel lucky. Thus, the collision boxes for the enemies and the player in Z-Type are a good deal smaller than their graphics. It doesn’t matter if the game is fair, as long as any unfairness is in the player’s favor.
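In code, “smaller than their graphics” is just a margin subtracted from each box before the overlap test – a sketch (names are mine):

```javascript
// Axis-aligned overlap test with each box shrunk by `margin` pixels
// on every side, so near misses are decided in the player's favor
function forgivingCollision( a, b, margin ) {
    return a.x + margin            < b.x + b.width  - margin &&
           a.x + a.width  - margin > b.x + margin &&
           a.y + margin            < b.y + b.height - margin &&
           a.y + a.height - margin > b.y + margin;
}
```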

This also means that if you are unsure whether your game is too hard or too easy – which is often difficult to tell because you as the developer already know the game inside and out – always err on the “too easy” side. That’s something I learned the hard way: I got it horribly wrong with my way too hard iPhone game Yuckfu a few years ago.


Similarly to some gameplay aspects, sound is an often overlooked topic. There are only 4 different sound effects in Z-Type, yet I spent more than a day to find the perfect set. The important thing here again was the feedback you get from these sounds. Your plasma bullets have to sound powerful, but not at the cost of being annoying when you type 90 words per minute.

I used the excellent Audacity sound editor to cut, combine and modify my sound files. The plasma sound for instance, was created from 3 different source files: An anti tank gun (from a sound CD), a bullet hitting a watermelon (from a YouTube video) and a “laser” sound (from an old PC game). Combining these was a tedious process, because I didn’t really know what I was doing. And even when it sounded good in Audacity it sometimes just felt wrong in the game when played in context. Nevertheless I’m quite happy with the result.

The music was done by my good friend Andreas Lösch, who also provided the music for my previous games. I initially wanted something really ambient and unobtrusive, but decided on a slightly different direction after hearing one of his piano arrangements again. I asked him to make some modifications to it for the game – removing vocals, making it loop-able and so on. I’m very happy with the result and the mood it creates in the game.

The state of HTML5 Audio support in browsers is sadly still okay-ish at best. You have to deal with pops and clicks (IE9, Opera, Safari), severe lag (IE9, Firefox 3.6 & 4b9), refusal to play short sounds at all after a few seconds (Chrome 9 & 11 dev) and downright broken implementations (Mobile Safari). And don’t get me started on the fact that you have to provide your sound files in both the Ogg Vorbis and MP3 codecs, because Microsoft and Apple are too arrogant to support an open standard.

I hate to say it, but as good as the Canvas support in all browsers has become, HTML5 Audio remains just as broken, ignored and unpleasant to work with.

Release

Shortly after publishing Z-Type it exploded on Reddit, StumbleUpon and some other sites, bringing in hundreds of thousands of visitors. This really took me by surprise – I liked what I had achieved, but I had no idea that a typing game would be this popular. My guess is that it has to do with the game’s accessibility – anyone who can type on a keyboard can play the game and be good at it, even if they have never played a video game before.

Many complained that the game was too easy, or that the difficulty level doesn’t increase fast enough. And while I suspect that many others were happy with the difficulty, it really does plateau at a certain level. So this is something I still want to fix in the coming days.

All in all developing Z-Type has been a really fun and rewarding experience. My thanks go to Andreas Lösch for providing the music, everyone who voted for the game and of course the Mozilla Game On staff for organizing this contest. Thank you!


I had planned for a while to participate in Mozilla's Game On contest. I didn't want to submit Biolab Disaster as my entry, but instead make something new and fresh. After all, the deadline for the contest was still months away. When I read this tweet I realized it wasn't anymore.

At this point I started to work on a game I had wanted to make for a long time. I was always a fan of the Type-o-Shooter mode in Crimsonland (a 2003 arcade game) and had recently discovered the Dreamcast classic Typing of the Dead. So the overall direction of the game was clear.

Today I finished Z-Type. Please Enjoy!


hqx Scaling in JavaScript

I recently came across the hqx Scaling Algorithm for upscaling pixel art images. hqx was originally developed for the ZSNES emulator to scale up each frame in realtime and is thus written in fast (read: ugly) C code.

I took on the challenge of implementing hqx in JavaScript for Biolab Disaster and I think the result looks great. The algorithm does an extremely nice job with "organic" structures such as all the rock tiles, but tends to make low-contrast zones quite blurry.
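The core trick in hqx is that each source pixel is compared against its eight neighbors in YUV space; the eight yes/no results form a pattern byte that selects one of 256 interpolation rules. This is not the actual implementation – the rules themselves are a huge generated switch – just a sketch of that first stage, and the exact threshold values below are the ones commonly quoted for hqx, so treat them as an assumption:

```javascript
// Thresholds for deciding whether two pixels are "different" in YUV space.
var Y_THRESHOLD = 48, U_THRESHOLD = 7, V_THRESHOLD = 6;

function rgbToYuv(r, g, b) {
    return {
        y: 0.299 * r + 0.587 * g + 0.114 * b,
        u: -0.169 * r - 0.331 * g + 0.5 * b,
        v: 0.5 * r - 0.419 * g - 0.081 * b
    };
}

// Are two RGB pixels "different" under the YUV thresholds?
function yuvDiff(a, b) {
    var p = rgbToYuv(a[0], a[1], a[2]), q = rgbToYuv(b[0], b[1], b[2]);
    return Math.abs(p.y - q.y) > Y_THRESHOLD ||
           Math.abs(p.u - q.u) > U_THRESHOLD ||
           Math.abs(p.v - q.v) > V_THRESHOLD;
}

// Build the pattern byte for the pixel at (x, y). pixels is a 2D array
// of [r, g, b] triples; edges are simply clamped here for brevity.
function neighborPattern(pixels, x, y) {
    var center = pixels[y][x], pattern = 0, bit = 1;
    for (var dy = -1; dy <= 1; dy++) {
        for (var dx = -1; dx <= 1; dx++) {
            if (dx === 0 && dy === 0) { continue; }
            var ny = Math.min(Math.max(y + dy, 0), pixels.length - 1);
            var nx = Math.min(Math.max(x + dx, 0), pixels[0].length - 1);
            if (yuvDiff(center, pixels[ny][nx])) { pattern |= bit; }
            bit <<= 1;
        }
    }
    return pattern;
}
```

A pixel surrounded by similar colors yields pattern 0 and is scaled up as a plain block, which is why the lookup stays cheap enough for realtime use.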

hqx Test Page

hqx comes in three flavours: hq2x, hq3x and hq4x. All of these are implemented in the JavaScript version and you can directly test them with Biolab Disaster hqx (the links below the game).

The source for the JavaScript implementation of hqx is available on GitHub under the GNU LGPL.

Brace for Impact!

(Silly title, I know, but how could I ever resist such an opportunity?)

My HTML5 Game Engine Impact is now ready. It took some time, but I think it was worth it. I'm proud of what I have achieved and I hope you'll like it too.

Part of why it took so long to put it all together is that it now runs on the iPhone, iPod Touch and iPad. Try it yourself or watch a short video:

All those platforms still have their problems with sound, and the iPhone 4 has a hard time filling all its pixels, but the games remain playable even on the 1st gen iPod Touch. You can read a bit more about Impact on mobile platforms in the documentation.

Even with iOS support, it might come as a shock to some of you that I am selling Impact, rather than releasing it for free. I love free and open source software and I've been contributing stuff for quite some time now. I had a hard time thinking about whether to release my Game Engine for free. The reason I decided to charge for Impact is a) it is easily the biggest thing I've ever made and I'd love to continue working on it full time, and b) I believe it is worth the money.

Ironically, my decision to sell Impact set back the release date quite a bit. If I'm selling something, I want it to be worth every penny. And even though the engine had been close to completion for quite some time, I hadn't written a single line of documentation.

I feel that good documentation is crucial for the success and adoption of any software project. So I set myself the goal of writing the best documentation I possibly could.

I'm not a big fan of inline documentation (with documentation generators like JSDoc) because it tends to clutter the source code with trivial statements and – more importantly – makes it easy to write bad documentation. If you are writing the documentation separately from the code, you think about it differently. You think about the documentation as something that works without the source code, something that makes sense without the source code.

You rarely see code examples in automatically generated documentation, but for me as a developer, code examples are oftentimes exactly what I need. Take a look at the documentation for the ig.Entity class – one of the more complex classes of Impact. This is something documentation generators just can't do.

Of course it took me longer to write the documentation separately than it would have if I wrote it inline, but this is only because it is more in-depth, more thorough.

But don't take my word for it. Please see for yourself!

On a lighter note, I'm currently sending out a few thousand emails to those who signed up on the old Impact landing page. I'm using a 10-line PHP script for that. Let's see how this turns out...

Flash Animation Without Flash

More than two years ago, I created a Flash Animation for my university class. Today, I converted the whole thing to plain Javascript and HTML5, using the new <canvas> tag to draw and the <audio> tag for music playback. It now runs smoother than it ever did in Flash.

Without further ado: Venetianization / HTML5 Animation

Some technical notes: For the original Flash Animation I used ActionScript 3 and created my own classes. Javascript doesn't have classes per se, but you can build something that looks and feels exactly like it. MooTools did an awesome job at that. Converting my ActionScript classes to MooTools classes was a no-brainer.
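MooTools ships a full-featured Class construct; purely as an illustration of the underlying idea (not MooTools' actual implementation), a minimal class-like helper can be built in a few lines – a constructor whose prototype carries the methods, with an initialize() convention like the one MooTools uses:

```javascript
// A bare-bones sketch of class emulation in JavaScript: the returned
// constructor calls initialize() and shares methods via the prototype.
function Class(members) {
    var constructor = function() {
        if (this.initialize) {
            this.initialize.apply(this, arguments);
        }
    };
    for (var name in members) {
        constructor.prototype[name] = members[name];
    }
    return constructor;
}

// Looks and feels much like an ActionScript 3 class (hypothetical example):
var Sprite = Class({
    initialize: function(x, y) { this.x = x; this.y = y; },
    moveBy: function(dx, dy) { this.x += dx; this.y += dy; }
});

var s = new Sprite(10, 20);
s.moveBy(5, 0); // s.x is now 15
```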

I was able to reuse most of the code with some basic search and replace throughout the source. One thing however, that is (to date) completely missing in Javascript, is the ability to analyze the current sound spectrum of an audio file. I ended up extracting the raw values of the spectrum with a sampling rate of 15Hz (which is enough for an animation that initially ran at 30Hz) and put them in a large array in one of the source files.
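The runtime lookup for such a pre-extracted array is trivial: map the audio's current playback time to a frame index. A sketch, with the sample rate and data as stand-ins for the real pre-extracted values:

```javascript
// The spectrum was pre-extracted at 15 frames per second; at playback
// time the right frame is found by scaling currentTime to that rate.
var SAMPLE_RATE = 15; // Hz

function spectrumFrame(spectrum, currentTime) {
    var index = Math.floor(currentTime * SAMPLE_RATE);
    // Clamp to the last frame so the animation doesn't break at the end.
    return spectrum[Math.min(index, spectrum.length - 1)];
}

// With an <audio> element, called once per animation frame:
// var frame = spectrumFrame(spectrum, audioElement.currentTime);
```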

The thing that annoyed me the most, however, is that I now have the music in two different formats: Ogg Vorbis for Opera, Firefox and Chrome, and MP3 for Safari. I totally understand that Firefox, being open source and all, can't include MP3 support. What I don't get is that Apple doesn't support Ogg Vorbis – an audio format that is clearly superior to MP3 – instead, they choose to sit on their high horse and twiddle their thumbs. This is exactly the behavior that made Internet Explorer a laughable side note.

Side note: Of course none of this works in any version of Internet Explorer.