PHOBOSLAB

Reverse Engineering WipEout (PSX)

In 1995 one of my all-time favorite video games was released: the original WipEout for PlayStation. The brand-new PlayStation produced 3D graphics previously unseen on living room TVs, and WipEout exploited its capabilities like no other game at the time. It was one of the pioneering titles of the fifth generation console era.

WipEout's art style was distinctly different from that of other games, too. With the help of the UK-based design studio The Designers Republic, the game achieved a mature look that stood in stark contrast to the comic style found in most other games.

I remember poking around on the CD of the PC version of WipEout back in the day, looking for ways to modify the game. I was thrilled to find .pcx images of all the textures and tried to change one of the in-game billboard graphics to show my name. I wasn't able to get it working.

Now, almost 20 years later, I thought I'd give it another shot.

WipEout Model Viewer – A WebGL Experiment

Extracting 3D Models

I didn't have much experience with reverse engineering data formats, but given the game's age I expected the 3D format to be quite simple and straightforward. What surprised me was the number of different variations in which scene models were stored. The track itself was stored separately, in yet another format that spans multiple files.

Knowing the PlayStation didn't have a Floating Point Unit (FPU), I assumed vertex data had to be stored as integers. Using JavaScript and the three.js 3D library, I quickly whipped up a page that would load one of the game's .PRM files, which I suspected contained the 3D objects. I loaded all data in the file as coordinates of three integers (x, y, z) and rendered them as a point cloud. This allowed me to quickly shift the stride (the spacing between distinct coordinates in the file) and look for patterns.
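
A rough sketch of that approach, using current three.js names (the loader classes were named differently back then) – the 32-bit integer size and the stride are exactly the assumptions you'd tweak while hunting for patterns:

var loader = new THREE.FileLoader();
loader.setResponseType('arraybuffer');
loader.load('WIPEOUT.PRM', function(buffer) {
	var data = new DataView(buffer);
	var stride = 12; // bytes between candidate (x, y, z) triplets – adjust and re-render
	var positions = [];
	for (var offset = 0; offset + 12 <= data.byteLength; offset += stride) {
		positions.push(
			data.getInt32(offset, true),     // x, little-endian
			data.getInt32(offset + 4, true), // y
			data.getInt32(offset + 8, true)  // z
		);
	}
	var geometry = new THREE.BufferGeometry();
	geometry.setAttribute('position', new THREE.Float32BufferAttribute(positions, 3));
	scene.add(new THREE.Points(geometry, new THREE.PointsMaterial({size: 16})));
});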

What I found were indeed chunks of raw vertex data, along with an object header specifying the object's internal name and the number of vertices and polygons in that object. Each polygon is preceded by a polygon header and a chunk of index pointers into the object's vertices. These polygon headers are where I probably spent most of my time. I identified 11 different polygon types used by the game – triangles, quads, sprites, textured or flat, with vertex colors or face colors, etc. Two types are still ignored because I couldn't figure out what they were.
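
Parsing boils down to reading a type id and dispatching on it. To be clear, the field layout below is a hypothetical stand-in for illustration, not the verified WipEout format:

function readPolygon(data, offset) {
	// Every polygon starts with a type id; the layout of what follows depends on it
	var type = data.getUint16(offset, true);
	var polygon = {type: type, indices: []};
	offset += 2;

	// Hypothetical layout for one type: a textured quad with four 16-bit
	// indices into the object's vertex list, followed by a texture id
	if (type === 5 /* assumed id for textured quads */) {
		for (var i = 0; i < 4; i++, offset += 2) {
			polygon.indices.push(data.getUint16(offset, true));
		}
		polygon.texture = data.getUint16(offset, true);
	}
	// ...one branch per polygon type the game uses...
	return polygon;
}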

The track itself is stored in several distinct files. The most interesting of these are TRACK.TRV, containing the raw vertices, and TRACK.TRF, containing the track faces as quads of four index pointers into those vertices. Pretty straightforward.
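
Loading the track then looks roughly like this – the record sizes (16 bytes per vertex, 20 bytes per face) are assumptions for the sketch, not gospel:

function loadTrack(trvBuffer, trfBuffer) {
	// TRACK.TRV: assumed x, y, z as int32, plus 4 bytes of padding per vertex
	var trv = new DataView(trvBuffer);
	var vertices = [];
	for (var v = 0; v + 16 <= trv.byteLength; v += 16) {
		vertices.push([trv.getInt32(v, true), trv.getInt32(v + 4, true), trv.getInt32(v + 8, true)]);
	}

	// TRACK.TRF: assumed four 16-bit vertex indices at the start of each face record
	var trf = new DataView(trfBuffer);
	var faces = [];
	for (var f = 0; f + 20 <= trf.byteLength; f += 20) {
		faces.push([
			trf.getUint16(f, true), trf.getUint16(f + 2, true),
			trf.getUint16(f + 4, true), trf.getUint16(f + 6, true)
		]);
	}
	return {vertices: vertices, faces: faces};
}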

Reading textures

Most textures in early PSX games were stored in the TIM image format, containing data straight in the PSX's native frame buffer format. A TIM file stores pixels either directly as 16 BPP colors, or as 8 BPP or 4 BPP indices into a color palette. And indeed, you can find a number of TIM files on the original WipEout CD: the start splash screen, background images, loading images, etc. However, the images I was most interested in – the textures used for 3D models – were missing.

Some googling revealed that the textures are stored in the compressed SCENE.CMP files. These files have a very simple header that specifies the number of TIM images in the file, along with the uncompressed size of each image. The data itself is compressed using LZ77. I even found some C code in a reverse engineering wiki that would decompress these files. After quickly porting this code to JavaScript and poking around in the polygon data to find the texture indices and coordinates, I was finally able to draw textured models.
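
For illustration, here's a generic LZSS-style decompressor of the kind that C code implements – the exact flag and offset/length encoding of the .CMP files differs in detail, so treat the bit layout here as an assumption:

function lzssDecompress(src, uncompressedSize) {
	var dst = new Uint8Array(uncompressedSize);
	var srcPos = 0, dstPos = 0;
	while (dstPos < uncompressedSize) {
		// Each flag byte controls the next 8 items: bit set = literal byte,
		// bit clear = (offset, length) back-reference into the output so far
		var flags = src[srcPos++];
		for (var bit = 0; bit < 8 && dstPos < uncompressedSize; bit++) {
			if (flags & (1 << bit)) {
				dst[dstPos++] = src[srcPos++];
			}
			else {
				var b1 = src[srcPos++], b2 = src[srcPos++];
				var offset = ((b2 & 0xf0) << 4) | b1; // 12-bit offset into the window
				var length = (b2 & 0x0f) + 3;         // 4-bit length, minimum match of 3
				for (var i = 0; i < length; i++, dstPos++) {
					dst[dstPos] = dst[dstPos - offset];
				}
			}
		}
	}
	return dst;
}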

The textures for the track, however, were an entirely different beast. Along with the vertex and face data (TRACK.TRV and TRACK.TRF), each track comes with an additional TRACK.TRS, containing track sections of some sort, and a TRACK.VEW, containing visibility lists (I think). I spent a lot of time trying to find the texture index for each track face in any of these files.

As it turned out, the unhelpfully named TRACK.CMP did not contain the track textures at all. Instead, they were stored in a file called LIBRARY.CMP, which seems to have been compiled from a set of source images.

As I painstakingly found out, the tracks in WipEout had a Level of Detail (LOD) system, subdividing each track face into up to 4x4 quads when it's near the camera. This was probably done to lessen the impact of the PSX's missing perspective correction when drawing textures. These subdivided faces also used higher resolution textures.

Now, here's the kicker: the LIBRARY.CMP file contains about 300 images, but only 19 distinct textures – each in 3 different LOD levels: 4x4, 2x2 and 1x1 tiles, with each tile being 32x32 pixels in size. And, as if that wasn't complicated enough, these 4x4 and 2x2 tiles are not stored in the right order. Instead, yet another file, LIBRARY.TTF, stores indices into LIBRARY.CMP to compose the 4x4 and 2x2 versions.
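
Composing one of the 4x4 LOD levels then looks something like the following sketch, where tiles is an array of decoded 32x32 tile images from LIBRARY.CMP and indices holds the 16 tile indices read from LIBRARY.TTF (both names are mine, not the game's):

function composeTexture(tiles, indices) {
	var canvas = document.createElement('canvas');
	canvas.width = canvas.height = 4 * 32; // 4x4 tiles of 32x32 pixels
	var ctx = canvas.getContext('2d');
	for (var i = 0; i < 16; i++) {
		// Place each referenced tile at its position in the 4x4 grid
		ctx.drawImage(tiles[indices[i]], (i % 4) * 32, ((i / 4) | 0) * 32);
	}
	return canvas;
}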

Knowing that the texture index has to be between 0 and 18, I found it stored as a single byte along with each track face. I also found that another single byte for each track face specifies a few flags for that face. The most important one for drawing: whether to draw with flipped X texture coordinates.

Wrapping up

Finally, I had a complete, textured, vertex-lit scene and track. Using three.js, drawing everything was a no-brainer, as it happily lets you create models from all kinds of different polygon data. I also wanted to have a simple fly-through animation for each track, and three.js was tremendously helpful for creating a smooth camera spline along the track. The optional orbit controls for three.js are top notch, too. They just work without any modifications on mobile and touch devices.
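
The fly-through boils down to a few lines. This sketch uses the current THREE.CatmullRomCurve3 (the class had a different name back then) and assumes trackCenterPoints is a list of points along the track:

var curve = new THREE.CatmullRomCurve3(trackCenterPoints, true); // closed loop

function animate(time) {
	var t = (time * 0.00002) % 1; // position along the spline, 0..1
	camera.position.copy(curve.getPointAt(t));
	camera.lookAt(curve.getPointAt((t + 0.01) % 1)); // look slightly ahead
	renderer.render(scene, camera);
	requestAnimationFrame(animate);
}
requestAnimationFrame(animate);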

With all the work that went into this project, handling the WebGL drawing with three.js turned out to be one of the simplest parts.

The finished WipEout Model Viewer loads all the original data files and does all the binary file reading, unpacking and scene creation in JavaScript.

All in all, I wrote about 800 lines of JavaScript to load and draw 3D scenes for a 20-year-old game. I wonder how big the original WipEout source is. It's quite sad that probably nobody will ever see it again, considering that it now belongs to Sony.

Full source on github: github.com/phoboslab/wipeout

Tuesday, April 14th 2015 / Comments (59)

Xibalba & WebVR

WebVR is an effort of Mozilla and Google to enable Virtual Reality content on the web. Experimental builds of Chromium and Firefox that provide a JavaScript API for WebVR have been available for a while now.

My game Xibalba already provided a stereo rendering mode and supported an external tracking server to get orientation updates from the VR headset, but it was clumsy and only worked for the old Oculus Rift DK1. So I decided to ditch the old stereo rendering mode and implement WebVR proper.

Xibalba VR

WebVR handles the lens distortion for the headset by itself. All you have to do is render your game in a side-by-side stereo format and request fullscreen mode on the headset. Orientation updates are provided by a nice, clean API that's easily implemented in the game.
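
Roughly, the experimental API of those builds looked like this – the exact names varied between Chromium and Firefox, and the game hook at the end is a hypothetical stand-in:

navigator.getVRDevices().then(function(devices) {
	var hmd = devices.filter(function(d) { return d instanceof HMDVRDevice; })[0];
	var sensor = devices.filter(function(d) {
		return d instanceof PositionSensorVRDevice &&
			d.hardwareUnitId === hmd.hardwareUnitId;
	})[0];

	// Fullscreen on the headset; the vrDisplay option makes the browser
	// apply the lens distortion to the side-by-side image
	canvas.webkitRequestFullscreen({vrDisplay: hmd});

	(function onFrame() {
		var state = sensor.getState(); // state.orientation is a quaternion
		game.setCameraOrientation(state.orientation); // hypothetical game hook
		// ...render the left and right eye into each half of the canvas...
		requestAnimationFrame(onFrame);
	})();
});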

I also added support for the JavaScript Gamepad API with a plugin for Impact – the game plays great with an Xbox 360 gamepad, even if you don't have an Oculus Rift.
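
The Gamepad API is just a polling affair. A minimal loop looks like this, assuming the standard button mapping; the game.input() hook is a hypothetical placeholder:

(function pollGamepad() {
	var pads = navigator.getGamepads ? navigator.getGamepads() : [];
	var pad = pads[0];
	if (pad) {
		var moveX = pad.axes[0]; // left stick, -1..1
		var moveY = pad.axes[1];
		var firing = pad.buttons[7].pressed; // right trigger in the standard mapping
		game.input(moveX, moveY, firing); // hypothetical game hook
	}
	requestAnimationFrame(pollGamepad);
})();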

Play the game here: phoboslab.org/xibalba/

If you have a DK2 but no WebVR enabled build of Chrome or Firefox, you can also try a standalone Windows version of the game. It's essentially a Chromium build bundled with the game and a batch file to start it. This makes it a bit easier to get going.

Download: xibalba-vr.zip ~80mb

Monday, February 2nd 2015 / Comments (7)

XType Plus – an HTML5 Game for the Nintendo Wii U

For many indie game developers, getting your game onto a real gaming console is something like the holy grail. At least it was for me. The entry barrier for any gaming console seems disproportionately high compared to PC or mobile development.

Nintendo is trying to change that with their Nintendo Web Framework – essentially a WebKit browser for the Wii U, extended with custom APIs to interface with the various controllers and Nintendo's online services. Nintendo also did a tremendous job of speeding up Canvas2D rendering on their console and providing excellent HTML5 Audio and WebAudio support. The result is a game development framework suitable for just about any 2D game you can think of, but with an extremely low entry barrier.

Gameplay Trailer for XType Plus

A few years ago I made an HTML5 game called X-Type, built on top of the Impact Game Engine. It was mostly meant to test the rendering speed of Browsers at the time, but turned out to be quite fun to play as well.

In 2013 Nintendo invited me to GDC Europe, where I presented a quick port of the game for the Wii U. The feedback I got at GDC Europe was extremely positive, so I decided to push the prototype as far as I could, polish all the rough edges and make a real game out of it.

A prototype of XType Plus at GDC Europe

Today, XType Plus launches in the Nintendo eShop in North America, Europe, the UK, Australia and New Zealand. I'm very excited to see this come to life!

For the most part, porting the game to the Nintendo Web Framework was very straightforward. My game engine Impact worked without any modifications and now officially supports the NWF as well, tightly integrating the Wii U controllers and other platform features.

There were some rough edges, particularly with Nintendo's online services, but these have mostly been smoothed out in current NWF releases. Interfacing with the GamePad screen and the various controllers was a breeze, and most features supported by a "real" browser work as expected in NWF. Canvas and Audio work exactly as in the browser, and the settings in XType Plus, for instance, are simply saved to localStorage.

Even with all the additions for the Wii U, the game still runs in a desktop browser as well. This made debugging core gameplay features very simple.

Screenshot of XType Plus from the all new Plus Mode

If you want to know more about the creation process of XType Plus and the Nintendo Web Framework, please have a look at the interview I did with NintendoLife.com back in February.

I hope to see your names pop up in the Highscores of XType Plus!

Thursday, July 31st 2014 / Comments (11)

Xibalba – A WebGL First Person Shooter

More than a year ago I started to work on a project that was supposed to be released for 7DFPS – a game dev competition about making a first person shooter in 7 days. I didn't meet the deadline, but the game I started back then is now finally finished.

Please Enjoy:

Xibalba – A WebGL First Person Shooter

Xibalba is also available in the iOS AppStore

The game was built on top of my HTML5 game engine Impact and is entirely written in JavaScript. While Impact is intended for 2D games, this first person shooter fit very naturally with the engine.

Xibalba, at its heart, is just a 2D game with a 3D viewport. So all the necessary elements for the gameplay were already provided by the game engine: collision detection and response, entity management, sound playback, a versatile level editor and much more. The only thing missing was a 3D view.
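
The core trick of such a 3D view is simple: walk the 2D map and emit an upright textured quad wherever a solid tile borders a walkable one. This is just a sketch of that idea, not the actual engine code:

function buildWallQuads(map, tileSize, wallHeight) {
	var quads = [];
	for (var y = 0; y < map.length; y++) {
		for (var x = 0; x < map[y].length; x++) {
			if (!map[y][x]) { continue; } // 0 = walkable, anything else = solid
			// South face of this tile, if the neighboring tile is walkable
			if (y + 1 < map.length && !map[y + 1][x]) {
				quads.push({
					x1: x * tileSize, x2: (x + 1) * tileSize,
					z: (y + 1) * tileSize,
					height: wallHeight,
					tile: map[y][x] // which texture to use
				});
			}
			// ...same for the north, east and west faces...
		}
	}
	return quads;
}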

I created a 10 minute Making Of screencast that explains the ideas and the design behind the game in a bit more detail.

The Making Of Xibalba on Youtube

The iPhone and iPad version of the game was made with Ejecta. The game also runs just fine in Mobile Safari on the current iOS 8 beta, but the browser's UI, especially in landscape mode, unfortunately hinders the gameplay quite a bit. It's much more playable in Chrome for Android.

I also published the 3D Viewport part of Xibalba, along with another tiny demo game, under the name TwoPointFive – a tribute to the excellent ThreeJS library.

The TwoPointFive Plugin is available on Github: github.com/phoboslab/twopointfive

Monday, July 28th 2014 / Comments (21)

Fast Image Filters with WebGL

WebGLImageFilter is a small JavaScript library for applying a chain of filters to an image.

Sergey Brin in his badass Tesla, sporting Chrome wheels

But to quote myself from twitter:

That awkward moment when you realize the lib you've been polishing for the past 6 hours already exists. With more features and a better API.

~ @phoboslab, Nov 3.

So, yes, there's already a library called glfx.js which basically does the same thing. And I'm almost saddened to say that it's excellent.

It's not all in vain, however. My implementation features raw convolution and color matrix filters. The latter should be particularly interesting for those coming from the Flash world: it's exactly the same as Flash's ColorMatrixFilter and allows for some nice "Instagrammy" photo effects, among others. There are some JavaScript libraries around that implement this ColorMatrixFilter, but they all do it in JavaScript directly, which is quite slow. Mine can be used in realtime, even on iOS with Ejecta.
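
The whole effect is a single fragment shader. This is a sketch of the idea rather than the library's exact shader – a 4x5 matrix in a flat uniform array, with the offset column assumed to be normalized to 0..1 instead of Flash's 0..255:

precision highp float;
uniform sampler2D texture;
uniform float m[20]; // 4x5 color matrix, row-major
varying vec2 vUv;

void main(void) {
	vec4 c = texture2D(texture, vUv);
	// Each output channel is a weighted sum of all input channels, plus an offset
	gl_FragColor = vec4(
		m[0]*c.r  + m[1]*c.g  + m[2]*c.b  + m[3]*c.a  + m[4],
		m[5]*c.r  + m[6]*c.g  + m[7]*c.b  + m[8]*c.a  + m[9],
		m[10]*c.r + m[11]*c.g + m[12]*c.b + m[13]*c.a + m[14],
		m[15]*c.r + m[16]*c.g + m[17]*c.b + m[18]*c.a + m[19]
	);
}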

I also learned some things about WebGL and shaders with all this. One interesting aspect is that you can't draw a framebuffer texture onto itself. So in order to apply more than one effect to the same image, you need to juggle two textures – use one as the source and the other as the target, then switch. And if you don't want to waste any draw calls, you need to know which draw call will be the last, so that you can draw directly onto the target Canvas instead of onto a framebuffer texture.
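
In code, the ping-pong scheme looks roughly like this – filters[i].draw() is a hypothetical stand-in for running one filter's shader over a fullscreen quad:

function applyFilterChain(gl, filters, sourceTexture, fbo) {
	var source = sourceTexture;
	for (var i = 0; i < filters.length; i++) {
		var last = (i === filters.length - 1);
		// null framebuffer = the canvas itself; otherwise alternate between two FBOs
		gl.bindFramebuffer(gl.FRAMEBUFFER, last ? null : fbo[i % 2].framebuffer);
		gl.bindTexture(gl.TEXTURE_2D, source);
		filters[i].draw(gl); // hypothetical: runs this filter's shader
		source = fbo[i % 2].texture; // this pass's target is the next pass's source
	}
}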

WebGL's coordinate system also complicates things a bit. It has the origin in the bottom left corner, instead of in the top left like most 2D APIs, essentially flipping everything on the Y axis. This has caused me a lot of pain with Ejecta already. The solution sounds trivial: just scale your drawing coordinates by -1, but this only gets you so far. If you need to get the pixel data of your image or draw one canvas into another, you suddenly have to invert everything again.
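
The usual workaround is a uniform that the vertex shader multiplies into the Y coordinate – flipped for framebuffer passes, unflipped for the final canvas pass. A minimal sketch:

uniform float flipY; // -1.0 when rendering into a framebuffer, +1.0 for the canvas
attribute vec2 position; // fullscreen quad in clip space, -1..1
varying vec2 vUv;

void main(void) {
	vUv = (position + 1.0) / 2.0;
	gl_Position = vec4(position.x, position.y * flipY, 0.0, 1.0);
}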

Lesson learned: Always google before you start working.

Download: WebGLImageFilter on github.

Sunday, November 3rd 2013 / Comments (8)

A Tale of Bad UX

A few weeks ago, I completed my iOS App Instant Webcam and eagerly submitted it to Apple for review. They approved it about a week later and I was free to sell it in the AppStore. About 30 minutes later my App showed up when you searched for it. All ready to launch, I thought.

Before publishing the accompanying blog post and tweeting about it, I tested the App one last time: I downloaded it directly from the AppStore on my iPhone and made sure it worked. Which it did. Then, just to see what it looks like, I searched for the App on my iPad 1, which, as you may know, doesn't have a camera. You shouldn't be able to download the App on a camera-less device at all. The AppStore should indicate this, but it didn't.

You could still download the App just fine on the iPad 1, despite the device having no camera to use it. When you start the App, it instantly crashes.

Shit.

Of course this was my fault. I forgot to set the appropriate key for UIRequiredDeviceCapabilities in the App's Info.plist file - namely the video-camera key that tells the AppStore that this App indeed requires camera capabilities.
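
For reference, the missing piece is this entry in Info.plist (shown here in plist XML):

<key>UIRequiredDeviceCapabilities</key>
<array>
	<string>video-camera</string>
</array>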

You can't test this at all before submitting your App for review – there's no "staging area" where you can see what your App will look like in the store. The bug only presents itself after release. Apple's review process should probably have caught it, but didn't.

The UIRequiredDeviceCapabilities property only affects the presentation in the AppStore, yet it has to be set in the App directly. This means that in order to change it, you have to recompile your App, submit it for review again and wait a week or two until it's approved. Not fun.

Worse still, you can't release an update for your App that requires features that your original version did not require. So this bug that only presents itself after release is essentially unfixable.

The official way to "mitigate" this, confirmed via iTunes Connect support, is the following.

Suffice to say, you can still download Instant Webcam on your camera-less iPad or iPod and have it crash immediately.

Tuesday, October 29th 2013 / Comments (6)

HTML5 Live Video Streaming via WebSockets

When I built my Instant Webcam App, I was searching for solutions to stream live video from the iPhone's camera to browsers. There were none.

When it comes to (live) streaming video with HTML5, the situation is pretty dire. HTML5 Video currently has no formalized support for streaming whatsoever. Safari supports the awkward HTTP Live Streaming and there's an upcoming Media Source Extensions standard, as well as MPEG-DASH. But all these solutions divide the video into shorter segments, each of which can be downloaded by the browser individually. This introduces a minimum lag of 5 seconds.

So here's a totally different solution that works in any modern browser: Firefox, Chrome, Safari, Mobile Safari, Chrome for Android and even Internet Explorer 10.
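
The receiving end of such a solution is tiny. A sketch, assuming the server pushes each encoded video frame as one binary WebSocket message; decodeFrame() stands in for whatever decoder paints into a canvas:

var ws = new WebSocket('ws://example.com:8084/');
ws.binaryType = 'arraybuffer';
ws.onmessage = function(event) {
	var frame = new Uint8Array(event.data);
	decodeFrame(frame); // hypothetical: decode and draw into a <canvas>
};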

Read complete post »

Wednesday, September 11th 2013 / Comments (91)

Quake for Oculus Rift

With the many "modern" games like Half-Life 2 and Doom 3 already modified to support the Oculus Rift, I decided to give one of my favorite classic games a shot: the original Quake.

Id Software long ago released the full source code of Quake under the GPL license. It has been ported to virtually every hardware platform in existence by the community and is still maintained and kept up to date by several open source projects. One of them is the excellent Quakespasm – it aims to be faithful to the original, with no changes to the gameplay but tons of smaller improvements and bugfixes. It's built on top of SDL, so it runs on Windows, OSX and Linux.

Quakespasm provided a nice and clean starting point for implementing support for the Rift. It's evident that Quake's C source code has a radically different style than John Carmack's later games, with almost all data stored in global variables and tons of functions that take no arguments. Still, the source is very straightforward and easily understandable. I had almost no problems implementing the Rift support, but in the end I took quite a few shortcuts with some dirty hacks.

All in all, it works really well with the Rift. Quake feels even grittier and darker than it ever did before. The textures are pretty coarse and you can count the polygons on the enemies by hand, but it all comes together so nicely when you run through the corridors, completely immersed in the world with the original Nine Inch Nails soundtrack blasting through your headphones. For me personally, this ended up being one of the most enjoyable Rift experiences.

There are still various problems. Making the UI readable in Rift mode is a single dirty hack, some values for the eye offsets are educated guesses and in general this thing could have been implemented a lot cleaner. But as far as I can tell, everything works like it should. I only tried this on Windows; my guess is that it works on OSX as well, but I haven't changed the project files to incorporate the new code.

To enable Rift support, bring up the console (~ key) and type the following:

vr_enabled 1

If your Rift is connected, this should be all that's needed to get started. On my system, turning on VSync was a good idea - updates are a bit slower, but less stuttery. There's no tearing without VSync, so I assume that SDL is doing something funky here behind the scenes.

You can still buy the full game at idsoftware.com or on Steam.

Edit July 6th: I fixed some rendering issues that warped the view. Please re-download.

Edit July 7th: Switched to predicted view orientation and added vr_supersample option.

Edit August 22nd: renamed all cvars to have a vr_ prefix. See the Readme on Github for a list of available cvars.

Friday, July 5th 2013 / Comments (62)