Nike – iPod

Having just presented a talk at C4[2] about the human factors involved in developing touch-based applications, I find it rather ironic to see the Nike + iPod integration move into the latest iPod touch.

Why? Because I see some serious problems with how our bodies will interact with this device and its software.

Note: These are first impressions. I obviously haven’t had a chance to use the product, so my take on the Apple/Nike implementation is just a guess at this point. I’ve heard that the software includes some special controls that allow “eyes off” operation: if that’s the case, then we’ll all learn something from that UI.

The first problem is the size. When I’m running, I want the smallest piece of equipment possible. The simple reason is that any extra mass acts as a pendulum as you move. Sure, the new iPod touch is lighter than its predecessor, but it’s still bigger and heavier than its nano sibling. Bigger is certainly not better.

There’s also a problem with where this device attaches to your body. Because of its size and weight, you’ll likely need to wear it on an armband or carry it in your pocket. Both of these locations present interaction problems. You can’t see the buttons on a touch screen while the device is in your pocket. And it’s awkward to operate an interface strapped to your arm: try unlocking, launching, and using an application while it’s positioned on your upper arm. Now do it while you’re running.

Good luck finding the PowerSong button without looking, too. Unlike with the Nike + iPod application on the nano, this and other operations can’t be performed solely by feel. Being able to operate the device while running is essential: you literally don’t want to take your eyes off the road. This “eyes off” approach to interaction is why the new iPod touch has physical volume buttons, and why those buttons were the most popular customer request.

Some may argue that this device will be fine in a more controlled setting such as a gym. But if you’re running on a treadmill, there is already plenty of feedback from the machine’s built-in sensors and monitors. You don’t need a sensor in your shoe.

But in either case, you’ll find the biggest interaction problem is that sweaty fingers don’t work well on a capacitive touch screen. The salts in your body fluids make it much harder for the device to recognize your input. If you don’t believe me, dissolve a teaspoon of salt in a cup of water, dip your finger in the salty water, and try using the screen. You’ll find the touch controls jerky and unresponsive. Now add some body movement and you’ve got a real interaction problem.

Sometimes a few simple buttons, not a fancy multi-touch UI, are the best way to solve an interaction problem.