Benchmarking in your pants

Just how fast is the iPhone?

Let’s run some benchmarks comparing the iPhone to my iMac running Safari 3 on a 1.83 GHz Intel Core Duo processor:

Test                        iMac (secs.)   iPhone (secs.)   Slower by
100,000 iterations          0.041          3.209            78x
10,000 divisions            0.005          0.413            82x
10,000 sin(x) calls         0.009          0.709            79x
10,000 string allocations   0.010          0.777            78x
10,000 function calls       0.010          0.904            90x

This means that Javascript on the iPhone will take about 80 times longer to run than it does on the desktop. Keep that in mind while you’re writing your Web 2.0 applications.
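
If you want to reproduce these numbers on your own hardware, a timing harness along these lines will do the trick. It’s only a sketch, not the exact test page I used, but it times the same five operations using the only clock available in MobileSafari: a Date object.

    // A rough timing harness for the micro-benchmarks in the table above.
    // Results go straight into the page with document.write(), which is the
    // simplest option when there's no console on the phone.
    function time(label, fn) {
      var start = new Date();
      fn();
      var elapsed = (new Date() - start) / 1000;
      document.write(label + ": " + elapsed + " secs.<br>");
    }

    time("100,000 iterations", function () {
      for (var i = 0; i < 100000; i++) {}
    });

    time("10,000 divisions", function () {
      var x = 1234.5678;
      for (var i = 1; i <= 10000; i++) { x = x / i; }
    });

    time("10,000 sin(x) calls", function () {
      var s = 0;
      for (var i = 0; i < 10000; i++) { s += Math.sin(i); }
    });

    time("10,000 string allocations", function () {
      var strings = [];
      for (var i = 0; i < 10000; i++) { strings.push(new String("marble " + i)); }
    });

    function noop() { return 0; }
    time("10,000 function calls", function () {
      for (var i = 0; i < 10000; i++) { noop(); }
    });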

If you need any more proof of how much slower Javascript is on the phone, take a look at the plotting application I did for C4[1]. Make sure to run it on the phone and the desktop: the performance difference is palpable.

But our benchmarking story doesn’t end there. What kind of performance would we get from a native iPhone application running these same kinds of tests?

Thanks to the hard work of many developers congregating at the iPhone Dev Wiki, there is now a toolchain that allows us to create applications that don’t rely on sweet technologies like Javascript.

(Note: use Google if you’re interested in these tools; I’m honoring the request to not link directly.)

That, combined with the talents of Lucas Newman (author of Lights Off and other goodness at Delicious Monster), resulted in a series of tests written in Objective-C. Here’s the source code: Marbles.zip

Let’s see how Javascript compares to native code running on the iPhone:

Test                        Native (secs.)   Javascript (secs.)   Slower by
100,000 iterations          0.015            3.209                214x
10,000 divisions            0.004            0.413                103x
10,000 sin(x) calls         0.105            0.709                7x
10,000 string allocations   0.085            0.777                9x
10,000 function calls       0.004            0.904                226x

Looking at these numbers, it’s easy to see that there are huge performance wins with native applications: lower function call overhead along with faster iteration and calculation. Even transcendental functions and object allocation see a 7-9x speedup.

More reasons for developers to crave a real iPhone SDK.

One thing that’s interesting to note: in the process of writing the Marbles test, Lucas noticed that _objc_msgSend_fpret was an undefined symbol in the ARM runtime libraries (triggered when calling NSDate’s timeIntervalSinceDate: method.) I looked around the libraries on the iPhone and couldn’t find any definitions, nor could a workaround with NSDecimal or NSDecimalNumber be found. Maybe this is one reason why the iPhone SDK isn’t available yet: important stuff is missing.

My guess is that developers just need to be patient. Apple is smart enough to realize that there are major advantages to having third party applications running natively on the iPhone. Not only will these applications be easier to write and faster to run, but they will also add to an ecosystem that makes the iPhone even more attractive to consumers. Apple also realizes that releasing an SDK means having interfaces that don’t change, code that works correctly, and documentation that explains its use.

If you’re a Cocoa developer and haven’t started looking at UIKit, now would be a good time. You have what you need to start experimenting. And you’ll be ready whenever Apple is.

Why stop at the Dock?

If the changes to the Leopard Dock are a good idea, shouldn’t Apple go all the way and do the same thing to the Finder? And then applications, too! Hell, I can totally see these windows flying around with Spaces and Exposé and Core OMFG!

The Finder of the future?

Forget about October, I’m stoked about 10.6! Let’s hope they add more reflections and transparency, too!

iPhone scrolling tip

If you’re an iPhone owner, you’ve probably encountered a problem with scrolling. For the most part it’s very intuitive, but there are occasions where you can’t get to what you want. The problem is that there aren’t any traditional scroll bars, so it seems like you are stuck. Even very smart engineers who know a lot about mobile computing are having problems.

Here are a couple of situations I’ve encountered:

  • While editing in a text area with a lot of text, you’ve probably resorted to using the insertion point to (slowly) reposition the text.
  • On a web page with content displayed using the CSS overflow: auto property, you’re stuck looking at the top of a <div> and can’t get to the lower part of the content.

Fortunately there is a very simple, yet unintuitive, solution for both of these situations. Use two fingers to scroll the content.

Flicking doesn’t work with two fingers, but the gesture is better than nothing. I certainly wouldn’t design an interface around this “feature” due to its lack of discoverability—just use it as a workaround to deal with existing sites. And considering that people are submitting bug reports about scrolling not working, it looks like Apple has some work left to do with this “scroll within a scroll” gesture.

Credit for finding this trick goes to Matthew Krivanek on the iPhoneWebDev mailing list. I found his post while looking for solutions to the fixed positioning problem in MobileSafari.

Multi-touch on the desktop

Now that the iPhone has given us all a taste of a multi-touch user interface, I have been hearing many people say how cool it would be to have touch-based input on a new line of desktop displays from Apple.

If you’re one of the people who think that a multi-touch monitor is a good idea, try this little experiment: touch the top and bottom of your display repeatedly for five minutes. Unless you’re able to beat the governor of California in an arm wrestling match, you’ll give up well before that time limit. Now can you imagine using an interface like this for an eight hour work day?

One of the things that people don’t realize about the iPhone is that it works at a low angle (as opposed to the high angle of your desktop or laptop display.) Our bodies are more comfortable and adept at handling repetitive physical tasks when they are performed at these low angles. What works well for the eyes does not work well for the hands.

If you’re old enough to remember a time before CAD systems, you’ll likely remember drafting tables. These tables were adjustable from completely flat (a low angle) to somewhere around 40° (a medium angle.) A drafting table is an environment where it is easy to work with your hands and associated tools for hours on end.

The iPhone’s multi-touch UI works similarly: if you watch people use it, I think you’ll see a lot more people working at waist level than at chest level. The only time you need the interface close to your head is when you’re enjoying those 3 pt fonts in MobileSafari :-)

Of course, Apple could come up with some kind of ergonomic multi-touch desk. Or we could all go out and buy a Microsoft Surface real soon now. However, I’m pretty happy with the recent demise of the glass-based CRT and not looking forward to the extra weight that a touch-based interface would add to a 30″ LCD monitor.

But even if there was a solution to the ergonomic issues, there would be problems mixing mouse-based applications (with small hit areas) with touch-based inputs (and large hit areas). Touch-based UI is not something you just bolt onto existing applications—it’s something that has to be designed in from the start.

You can already see this mismatch between the mouse-based and touch-based environments. All you need to do is view a web application that is targeted at the iPhone browser. In a desktop environment the controls seem large, but on the phone they are comfortably sized.

Resolution independent interfaces may solve some of the problems with control sizes, but the fact remains that a desktop interface has a much higher information density than a mobile one. A desktop is a multi-tasking environment while a mobile device is typically oriented towards a single task (making a call, finding a restaurant, getting directions, etc.) Don’t assume that the multi-touch you are using in a single task environment, with its lower information density and more focused interface, will be equally successful in a high density, multi-tasking desktop. Take another look at Jeff Han’s amazing demo and realize that he’s only working in one application at a time—what happens when you add a browser, an e-mail client, and some of your other favorite applications to that desktop?

I also find it difficult to believe that any kind of touch-based UI will replace the keyboard anytime soon. For people who are touch typists, you can’t beat the feedback of a key press for common applications like word processing and e-mail. Eventually we’ll have haptic interfaces with simulated feedback which will obviate the need for a separate keyboard.

The bottom line is that we’ve only just begun a journey that will fundamentally change the way we interact with machines. A major part of this change will be evaluating new and better ways to use computers—what has worked well in the past may not work so well in the future. And because of the magnitude of this change, I think there will be an extended period where touch-based, mouse-based and keyboard-based interfaces will need to coexist. If we’re not careful about developing these new interfaces, we’ll end up with something like Victor Frankenstein’s creation: pieced together and frightening.

Update: Tog agrees with me. Make sure to check out the Starfire video for ideas on how horizontal and vertical work surfaces can be integrated. Even though the cultural and technological elements are a bit dated, the human-centered design is still relevant.

Quartz and Javascript, sitting in a tree…

Even if everything isn’t copacetic in the land of “sweet”, at least Javascript and Quartz are getting along.

Thanks to Apple’s contribution to the WHATWG’s HTML 5 specification, it’s pretty easy to use Quartz graphics technology in an iPhone application. Together with MobileSafari’s event handling, you can start to do some fairly sophisticated drawing using a <canvas> element on the iPhone.

Here’s a sample application that draws and updates a graphic based on user input: canvas_test.html

(Make sure to resize your browser to have a 360 pixel height if you’re running on a desktop instead of the iPhone. And don’t be a fool: view the source.)
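
To give you an idea of the structure, here’s a stripped-down sketch of that kind of page: create a <canvas>, draw a filled circle, and redraw it wherever the user taps. It isn’t the actual source of canvas_test.html (the size, color and hit handling here are made up), so view the real thing for the details.

    // A bare-bones canvas drawing sketch. Run it after the page has loaded so
    // that document.body exists.
    var canvas = document.createElement("canvas");
    canvas.width = 320;
    canvas.height = 360;
    document.body.appendChild(canvas);

    var context = canvas.getContext("2d");
    var x = 100;
    var y = 100;

    function draw() {
      // Erase the previous frame, then draw a filled circle at the current point.
      context.clearRect(0, 0, canvas.width, canvas.height);
      context.fillStyle = "rgb(0, 128, 255)";
      context.beginPath();
      context.arc(x, y, 20, 0, Math.PI * 2, true);
      context.closePath();
      context.fill();
    }

    // MobileSafari reports taps as ordinary mouse events, so a click handler
    // works on both the desktop and the phone. This assumes a simple layout
    // where the canvas isn't nested inside positioned elements.
    canvas.onclick = function (event) {
      x = event.pageX - canvas.offsetLeft;
      y = event.pageY - canvas.offsetTop;
      draw();
    };

    draw();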

A few things to note:

  • The Javascript timer events only fire if the page is frontmost in Safari. Don’t make assumptions about when stuff will happen based upon your previous AJAX experience.
  • The minimum interval for the timer is much higher on the iPhone than in a typical desktop browser. Change the setTimeout() parameter from 1000 to 10 milliseconds, and you’ll see that it’s not fast enough for serious gaming. You might also want to look at CPU usage on the desktop as a clue to why the iPhone developers chose to limit the timer interval. (See the sketch after this list if you want to measure the effective interval yourself.)
  • It appears that MobileSafari isn’t very fast at recognizing mouse (multi-touch) events. Try pressing the screen quickly in different locations: you’ll see that many of the events are not captured.
  • I’ve said it before, and I’ll say it again. The finger is a very imprecise pointing instrument. Try to get the graphic centered at 100, 100 and you’ll see what I mean.
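
If you’re curious about the timer resolution mentioned above, here’s a quick sketch that measures it: ask for a 10 millisecond timer a hundred times and average how long the callbacks actually take to arrive. The counts and interval are arbitrary, so adjust them to taste and compare the desktop against the phone.

    // Measure the effective setTimeout() interval by requesting a 10 ms timer
    // repeatedly and averaging the elapsed time.
    var fired = 0;
    var started = new Date();

    function tick() {
      fired++;
      if (fired < 100) {
        setTimeout(tick, 10);
      } else {
        var elapsed = new Date() - started;
        alert("100 timers in " + elapsed + " ms (about " +
              Math.round(elapsed / 100) + " ms each)");
      }
    }

    setTimeout(tick, 10);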

And if you had any doubts about the WHATWG being a good idea: this example works just fine in Firefox, Camino and Opera. I’ll let you guess how well it works in Internet Explorer…

Update: I built a graphing calculator using the concepts presented in this essay. Try launching the application on your desktop and iPhone: there’s quite a performance difference between the two environments. To understand the differences, I ran some benchmarks.