PHOBOSLAB

Posts tagged with “Objective-C”

Ejecta

Ejecta is a fast JavaScript, Canvas & Audio implementation for iOS. Today, I'm releasing it under the MIT Open Source license.

Visit the Ejecta website for more info on what it is and how to use it. I will talk a bit more about some implementation details for the Canvas API here.

Implementing a general purpose drawing API, such as the HTML5 Canvas API, on top of OpenGL is by no means an easy endeavor. Before I decided to roll my own solution (you know, I have this problem), I looked at a number of graphics libraries, including Google's Skia and OpenVG.

I discovered exactly what I feared beforehand: these libraries do way too much, are too large and too hard to implement. You can't just use them here and there to draw – instead, they replace your whole drawing stack. Getting them to compile alone is a huge pain; getting them to compile on the iPhone and then do what you want seemed close to impossible.

So I began working on my own solution. Implementing the path methods moveTo(), lineTo(), bezierCurveTo(), etc. was fairly straightforward: have an array of subpaths, where each subpath is an array of points (x,y). Each call to one of these API methods pushes one or more points onto the current subpath or closes it.
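In C, that structure boils down to something like this (a rough sketch with names of my own choosing, not Ejecta's actual code):

    typedef struct { float x, y; } vec2;

    typedef struct {
        vec2 *points;        // filled by moveTo(), lineTo() and flattened curves
        int count, capacity;
        int closed;          // set by closePath()
    } subpath_t;

    typedef struct {
        subpath_t *subpaths; // one entry per moveTo()
        int count, capacity;
    } path_t;

    // lineTo(x, y) appends one point to the last subpath;
    // moveTo(x, y) pushes a new subpath starting at (x, y).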

However, I struggled a bit with getting bezier curves to behave in a manner that makes sense for the current scale; i.e. push more points for large bezier curves and at sharp corners, fewer points for smaller ones and straight lines. After a few days of reading and experimenting, I found this excellent article on adaptive bezier curves and adopted its solution.
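The core of it is a recursive de Casteljau subdivision with a flatness test. Here's my condensed sketch of the article's algorithm, using the vec2 type from above (emit_point() is an assumed helper that pushes onto the current subpath; the tolerance would be scaled with the current transform, so bigger curves get more points):

    #include <math.h>

    static void emit_point(vec2 p); // assumed: pushes p onto the current subpath

    static void bezier_subdivide(vec2 p1, vec2 p2, vec2 p3, vec2 p4,
                                 float tolerance, int depth) {
        if (depth > 16) { emit_point(p4); return; }

        // Flatness test: how far do the control points stray from the chord p1-p4?
        float dx = p4.x - p1.x, dy = p4.y - p1.y;
        float d2 = fabsf((p2.x - p4.x) * dy - (p2.y - p4.y) * dx);
        float d3 = fabsf((p3.x - p4.x) * dy - (p3.y - p4.y) * dx);
        if ((d2 + d3) * (d2 + d3) < tolerance * (dx * dx + dy * dy)) {
            emit_point(p4);
            return;
        }

        // Not flat enough: split the curve at t = 0.5 and recurse into both halves.
        vec2 p12   = { (p1.x + p2.x) * .5f, (p1.y + p2.y) * .5f };
        vec2 p23   = { (p2.x + p3.x) * .5f, (p2.y + p3.y) * .5f };
        vec2 p34   = { (p3.x + p4.x) * .5f, (p3.y + p4.y) * .5f };
        vec2 p123  = { (p12.x + p23.x) * .5f, (p12.y + p23.y) * .5f };
        vec2 p234  = { (p23.x + p34.x) * .5f, (p23.y + p34.y) * .5f };
        vec2 p1234 = { (p123.x + p234.x) * .5f, (p123.y + p234.y) * .5f };
        bezier_subdivide(p1, p12, p123, p1234, tolerance, depth + 1);
        bezier_subdivide(p1234, p234, p34, p4, tolerance, depth + 1);
    }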

The hard part was getting that array of points on the screen. For drawing lines (.stroke()) I didn't want to go with the obvious solution of just using GL_LINES, because it has a number of drawbacks, especially on iOS: no anti-aliasing, limited line width and no miters or line caps.

So instead of using GL_LINES to draw, I ended up creating 2 triangles for each line segment and calculating the miter values myself. This correctly honors the API's .miterLimit property, though the bevel it then draws is still a bit off. The code I ended up with is a bit on the ugly side, because it handles a lot of edge cases, but all in all this solution worked very well and is extremely fast.
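The gist of the miter math, stripped of all those edge cases (a simplified sketch, not the actual Ejecta code): the miter tip lies on the bisector of the two segment normals, and the .miterLimit check falls out of the same dot product.

    #include <math.h>

    // n1, n2: unit normals of the two segments meeting at the joint;
    // w: half the line width. Returns 0 if the join exceeds miterLimit
    // and should be drawn as a bevel instead.
    static int miter_offset(vec2 n1, vec2 n2, float w, float miterLimit, vec2 *out) {
        vec2 m = { n1.x + n2.x, n1.y + n2.y };
        float len = sqrtf(m.x * m.x + m.y * m.y);
        if (len < 1e-6f) { return 0; }               // segments double back, no miter
        m.x /= len; m.y /= len;

        float cosHalf = m.x * n1.x + m.y * n1.y;     // cos of half the joint angle
        float miterLen = w / cosHalf;                // joint center to miter tip

        if (miterLen / w > miterLimit) { return 0; } // spec: fall back to bevel
        out->x = m.x * miterLen;
        out->y = m.y * miterLen;
        return 1;
    }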

Implementing .fill() proved to be yet another challenge. With OpenGL, before you can draw a primitive to the screen, you have to break it down into triangles first. This is quite easy to do for convex polygons, but not so much for concave ones that potentially have holes in them.

I spent a few days looking for a triangulation library and soon realized that this is serious business. Triangle, for instance, sports 16k lines of code – I'm quite allergic to libraries that need that much code to solve seemingly simple problems. Poly2Tri looked much more sane, but apparently has some stability problems.

After a bit of searching, I found libtess2, which is based on OpenGL's libtess and is supposed to be extremely robust and quite fast. The code base is excellent and I had no problem integrating it into Ejecta.

However, some tests showed that it's much slower than I hoped it would be. Realtime triangulation of complex polygons isn't very feasible on the iPhone.

In the end, I found a trick that lets you draw polygons in OpenGL without triangulating them first. It is so simple and elegant to implement, yet so ingenious: You can draw polygons with a simple triangle fan and mark those areas that you overdraw in the stencil buffer. See Drawing Filled, Concave Polygons Using the Stencil Buffer. It's a hacker's solution – thinking outside the box – and it fills me with joy.
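In GLES 1.x terms, the two passes look roughly like this (a condensed sketch; drawTriangleFan() and drawBoundingQuad() stand in for the actual vertex submission):

    // Pass 1: fan out from the first point and toggle the lowest stencil bit
    // for every covered pixel. Pixels covered an odd number of times - i.e.
    // those inside the polygon - end up with the bit set.
    glEnable(GL_STENCIL_TEST);
    glStencilFunc(GL_ALWAYS, 0, 0x1);
    glStencilOp(GL_KEEP, GL_KEEP, GL_INVERT);
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    drawTriangleFan(points, count);

    // Pass 2: draw one quad over the polygon's bounding box in the fill
    // color, letting only the stencil-marked pixels through; GL_ZERO resets
    // the stencil for the next fill() call.
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glStencilFunc(GL_EQUAL, 0x1, 0x1);
    glStencilOp(GL_KEEP, GL_KEEP, GL_ZERO);
    drawBoundingQuad(points, count);

    glDisable(GL_STENCIL_TEST);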

There are still some parts missing in my Canvas implementation, namely gradients, shadows and, most notably, text. I believe the best solution for drawing text in OpenGL, while honoring the Canvas spec, would be drawing to a texture using the iPhone's CoreGraphics methods. This will make it quite slow, but should be good enough for a few paragraphs of text.
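Something along these lines, rendering a string into a CGBitmapContext and uploading the pixels as a texture (a hypothetical sketch, not what will necessarily end up in Ejecta):

    #include <stdlib.h>
    #include <string.h>
    #include <CoreGraphics/CoreGraphics.h>
    #include <OpenGLES/ES1/gl.h>

    GLuint textureFromText(const char *text, float fontSize, int width, int height) {
        // Draw the text into a software bitmap via CoreGraphics
        void *pixels = calloc(width * height * 4, 1);
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef ctx = CGBitmapContextCreate(pixels, width, height, 8,
            width * 4, colorSpace, kCGImageAlphaPremultipliedLast);
        CGColorSpaceRelease(colorSpace);

        CGContextSelectFont(ctx, "Helvetica", fontSize, kCGEncodingMacRoman);
        CGContextSetRGBFillColor(ctx, 1, 1, 1, 1);
        CGContextShowTextAtPoint(ctx, 0, fontSize, text, strlen(text));
        CGContextRelease(ctx);

        // Hand the bitmap over to OpenGL
        GLuint texture;
        glGenTextures(1, &texture);
        glBindTexture(GL_TEXTURE_2D, texture);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
            GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        free(pixels);
        return texture;
    }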

If you want to help out with anything, grab the Ejecta source code on GitHub – I'd be honored.

Wednesday, September 26th 2012 / Comments (13)

iOS and JavaScript - for Real this Time!

Less talk, more action – Apple just approved two of my JavaScript games for the AppStore: Biolab Disaster and Drop. Both are free; go check them out. You can also play them in your browser here and here.

Both games are pretty simple (the source for Drop is only 300-odd lines long) and written with my JavaScript game engine Impact.

These are certainly not the first games written in JavaScript to be available in the AppStore. Tools like AppMobi, PhoneGap or Titanium make it easy to bundle some HTML pages and JavaScript together in an App and display them in a UIWebView, which is basically just a browser window (correction: Titanium doesn't use a UIWebView but instead has some native bindings). Games written with Impact already work okay-ish in the iPhone's browser and thus also in AppMobi and PhoneGap.

So what's so special about these two games now? They don't use PhoneGap or Titanium. They don't even use a UIWebView. Instead, they bypass the iPhone's browser altogether and use Apple's JavaScript interpreter (JavaScriptCore) directly. All graphics are rendered with OpenGL instead of in a browser window and all sound and music is played back with OpenAL instead of… well, having no sound at all.

What makes this possible is a compatibility layer that mimics the HTML5 Canvas and Audio APIs but is implemented with OpenGL and OpenAL behind the scenes. Think of it as a browser that can only display a Canvas element and play Audio elements, but does not render generic HTML pages. A browser perfectly suited for HTML5 games.
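The JavaScriptCore C API makes this quite pleasant to work with. A stripped-down sketch of the principle – create a context, expose a native function, evaluate a script (error handling omitted, names of my own choosing):

    #include <JavaScriptCore/JavaScriptCore.h>

    // Called from JavaScript; this is where an OpenGL-backed drawing
    // command would be issued.
    static JSValueRef nativeDrawCallback(JSContextRef ctx, JSObjectRef fn,
        JSObjectRef thisObj, size_t argc, const JSValueRef argv[],
        JSValueRef *exception)
    {
        // ... issue OpenGL calls here ...
        return JSValueMakeUndefined(ctx);
    }

    void runScript(const char *source) {
        JSGlobalContextRef ctx = JSGlobalContextCreate(NULL);
        JSObjectRef global = JSContextGetGlobalObject(ctx);

        // Expose nativeDraw() to the JavaScript side
        JSStringRef name = JSStringCreateWithUTF8CString("nativeDraw");
        JSObjectRef fn = JSObjectMakeFunctionWithCallback(ctx, name, nativeDrawCallback);
        JSObjectSetProperty(ctx, global, name, fn, kJSPropertyAttributeNone, NULL);
        JSStringRelease(name);

        // Evaluate the game's script
        JSStringRef src = JSStringCreateWithUTF8CString(source);
        JSEvaluateScript(ctx, src, NULL, NULL, 0, NULL);
        JSStringRelease(src);
        JSGlobalContextRelease(ctx);
    }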

This means you can take your JavaScript games written for Impact and run them on iOS with perfect sound and touch input and way better drawing performance than with Mobile Safari. Now, to be clear, I only implemented a bare minimum of the Canvas API - just enough to be able to run Impact. The whole thing is still in a very experimental state.

If you have a license for Impact, you will find the complete source code for all of this on your download page. I also wrote some basic documentation to get you started. In theory, you don't have to know anything about Objective-C to use this, but at this stage some Objective-C knowledge will surely come in handy. Again, this is very experimental. Don't expect it to work at all.

If you've been following this blog for a while, you might remember a post from October 2010 where I attempted the exact same thing. Back then I used the JavaScriptCore library that is already available on iOS and used by Apple in Mobile Safari. The problem was that this library is "private", meaning that Apple does not want you to use it. And since you can't publish anything in the AppStore that uses private libraries, I abandoned the idea.

However, I recently revisited the project, because there was still a chance to make this work: JavaScriptCore is part of the open source WebKit project. Instead of using the private library that comes with iOS, you could theoretically compile your own version of this library and bundle it together with your App. Which is exactly what I did.

Since Apple does not provide any project files to compile JavaScriptCore for iOS (presumably to annoy us) and JavaScriptCore itself uses some of iOS' private APIs, compiling this beast into a static library - in an App Store-compatible fashion - took me a few days.

I also had to make a small sacrifice: JavaScriptCore uses libicu to sort strings according to a Unicode locale. Sadly, libicu is also private on iOS, and bundling it is not an option because of its size. So I got rid of libicu completely. This means that only ASCII strings are now sorted correctly (e.g. the umlaut "Ä" will come after "Z", not after "A" as it should). Other than that, the JavaScript library should behave exactly like the private one that comes with iOS.

Also, the JavaScriptCore library bundled with iOSImpact does not use the JIT compiler (Nitro). You can't allocate executable memory on the iPhone, and if Apple doesn't lift that restriction, there's nothing I can do about it. However, Apple recently made an exception for Mobile Safari – if they made their JavaScriptCore API public, they could probably enable the JIT for everyone. That's a big if though; I don't see it happening, because Apple loves native code (Objective-C) and hates scripting languages.

I was afraid Apple would reject the two games for some obscure reason, but they didn't. Which leaves me to wonder why the JavaScriptCore library that comes with iOS is private in the first place. Bundling JavaScriptCore with your App adds about 2MB in size – not much, but I fail to see how this can be in Apple's interest.

Anyway, the performance of both games is pretty good. I still get some occasional slowdowns (~20fps) on my iPhone 3GS in Biolab Disaster when there are too many particles on the screen, but it remains playable at all times. It's also nice to have perfectly working sound on iOS now - something even some desktop browsers still struggle with.

Wednesday, April 27th 2011 / Comments (52)

Impact for iOS

I know you're waiting for the release of the Impact Game Engine, and I promise you, it's coming. I just get distracted too easily. So here's my game Biolab Disaster running on the iPhone 3GS with 60 frames per second:

The game is running in its own process and is not using the iPhone's browser at all. Instead, it's just using the JavaScriptCore Framework to run the game. All the necessary calls to the Canvas API have been reimplemented with OpenGL-ES and the touch input is passed over to JavaScript to be evaluated by the engine. I of course had to make some changes to the engine, but the game source code is exactly the same as for the web version.
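The input direction works the same way in reverse: look up a handler function on the JavaScript global object and call it with the touch coordinates. Roughly like this (a sketch; the handler name is made up, not what the engine actually uses):

    #include <JavaScriptCore/JavaScriptCore.h>

    // Call a global JS function touchStart(x, y) from the native side
    void forwardTouch(JSGlobalContextRef ctx, float x, float y) {
        JSObjectRef global = JSContextGetGlobalObject(ctx);
        JSStringRef name = JSStringCreateWithUTF8CString("touchStart");
        JSValueRef fnValue = JSObjectGetProperty(ctx, global, name, NULL);
        JSStringRelease(name);

        JSValueRef args[2] = {
            JSValueMakeNumber(ctx, x),
            JSValueMakeNumber(ctx, y)
        };
        JSObjectCallAsFunction(ctx, JSValueToObject(ctx, fnValue, NULL),
            NULL, 2, args, NULL);
    }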

The JavaScriptCore Framework is still private on iOS. So as it is now, I won't be allowed to distribute the game in the AppStore. But my understanding is that if I bundle my own copy of the JavaScriptCore Framework (which is part of WebKit and thus freely available) with my game, I should be on the safe side. Let's see how this works out.

Monday, October 11th 2010 / Comments (38)

Songfever

Songfever, at its heart, is a tangible front-end for iTunes. Album covers are projected onto four physical objects standing on a shelf. With a scroll wheel attached to the shelf, you can scroll through your music library in a Cover Flow-like fashion. The goal was to reintroduce the aesthetic quality of a physical music collection (CDs, LPs) while maintaining the comfort of a digital one.

Each of the four covers is tracked by a webcam mounted above the shelf. Their positions are transformed in software to account for how the projector is located relative to the webcam in the real world. When rendered in a 3D OpenGL view and projected back onto the shelf, the album covers line up with the physical objects again. Approximately, that is.

Songfever was our main project in the 4th semester of "Digital Media" at the Hochschule Darmstadt. I was responsible for the programming; the UVC Camera Control I posted some months ago was a small part of it.

Visit the Songfever Website for some more photos and videos.

Friday, June 25th 2010 / Comments (0)

UVC Camera Control for Mac OS X

For a recent computer vision project I needed to pull images out of a Logitech QuickCam 9000 and track some markers with the help of the ARToolKitPlus library. I connected the camera to my Mac and was quite surprised to see that it just works. There was no need to install any drivers. As I learned later, that's because the QuickCam 9000 is a UVC webcam for which Mac OS X 10.4.9 already provides a driver. I was able to get to the raw camera images through the QTKit Framework in no time.

However, the QuickCam 9000 has its auto exposure enabled by default, which is absolutely deadly for stable tracking results. I thought I could just turn the auto exposure off and set it to a fixed value through some QTKit API – but no, there's no way to change the exposure of a UVC camera with QTKit. In fact, there's no way to change any parameters of your camera. No exposure values, no white balance, no gain, nothing. Apple just implemented the bare minimum in its UVC driver and the QTKit Framework.

Well, maybe I could get to these parameters through the older Sequence Grabber Framework then? After all, there's a VDIIDCSetFeatures function and a vdIIDCFeatureExposure key! But nope, as the name implies, this stuff only works for IIDC cameras. What's an IIDC camera? Wil Shipley asked the same questions almost 3 years ago - even back then, IIDC cameras were pretty much deprecated. Still, these cameras are the only devices that Apple provides an API for, if you want to change some esoteric parameters no one would ever need to change, like oh, the exposure time or white balance temperature for instance.

Apple is aware of the problem but hasn't done anything to solve it. And Logitech apparently doesn't see the need to provide a Mac driver for their cameras, since Mac OS X already ships with one. Great.

But wait, UVC is a standard, right? USB.org provides documentation for all device classes, and the Video Class is no exception. So, I poked around in the documentation for a while, read some Linux UVC driver source code and used Apple's USB Prober to see what's going on. After some more hours of playing around with the Kernel Framework's USB API, I was finally able to control some of the QuickCam's settings!
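The gist of it: UVC controls are ordinary USB control requests against the camera's video control interface – a class-specific SET_CUR request with the control selector in wValue and the unit and interface IDs in wIndex. A rough sketch (the unit and interface IDs are device-specific; USB Prober will tell you the real ones):

    #include <IOKit/usb/IOUSBLib.h>

    #define UVC_SET_CUR            0x01
    #define UVC_CT_AE_MODE_CONTROL 0x02  // auto-exposure mode selector (UVC spec)

    // intf: an already opened interface for the camera's video control
    // interface; unitId and ifaceNum come from the USB descriptors.
    static int setAutoExposureMode(IOUSBInterfaceInterface190 **intf,
                                   int unitId, int ifaceNum, UInt8 mode) {
        UInt8 data = mode;  // e.g. 0x01 = manual exposure, per the UVC spec

        IOUSBDevRequest request;
        request.bmRequestType = USBmakebmRequestType(kUSBOut, kUSBClass, kUSBInterface);
        request.bRequest      = UVC_SET_CUR;
        request.wValue        = UVC_CT_AE_MODE_CONTROL << 8;
        request.wIndex        = (unitId << 8) | ifaceNum;
        request.wLength       = sizeof(data);
        request.pData         = &data;

        return (*intf)->ControlRequest(intf, 0, &request) == kIOReturnSuccess;
    }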

Read complete post »

Wednesday, July 15th 2009 / Comments (116)

Yuckfu Dev Diary #3 – Loading and Displaying 3D Models

After some hours of reading to figure out how exactly the memory management with alloc, dealloc, retain and release works with NSObjects in Objective-C (here’s a good article) and some more hours to get used to the funny syntax, I was finally able to do something with it.

To get my 3D model onto the iPhone, I had to find a data format that is easy to load and can be used directly in my application without much modification. I decided to export the model from Wings3D as a Wavefront .OBJ file, which is a pretty straightforward ASCII format. I, however, didn't want to go through the hassle of parsing ASCII data in C, so I wrote a small PHP script to transform this .obj file into a binary format ready to be used with OpenGL ES.
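The loading side then becomes trivial. Assuming a layout of a vertex count followed by interleaved position/UV/normal floats – my guess at a sensible format, the actual output of the PHP script may differ – it's just two fread() calls:

    #include <stdio.h>
    #include <stdlib.h>
    #include <stdint.h>

    typedef struct {
        float pos[3], uv[2], normal[3];
    } vertex_t;

    vertex_t *loadModel(const char *path, uint32_t *count) {
        FILE *f = fopen(path, "rb");
        if (!f) { return NULL; }
        fread(count, sizeof(uint32_t), 1, f);   // number of vertices
        vertex_t *v = malloc(*count * sizeof(vertex_t));
        fread(v, sizeof(vertex_t), *count, f);  // interleaved vertex data
        fclose(f);
        return v;
    }

    // Drawing with the GLES 1 fixed function pipeline then works directly
    // on the interleaved data:
    //   glVertexPointer(3, GL_FLOAT, sizeof(vertex_t), &v[0].pos);
    //   glTexCoordPointer(2, GL_FLOAT, sizeof(vertex_t), &v[0].uv);
    //   glNormalPointer(GL_FLOAT, sizeof(vertex_t), &v[0].normal);
    //   glDrawArrays(GL_TRIANGLES, 0, count);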

This is where the fun starts.

Read complete post »

Tuesday, July 29th 2008 / Comments (3)