Wearing Apple

Tim Cook has openly stated that Apple is working on “new product” categories. Many people, customers and competitors alike, assume that means some kind of wearable computing device. And of course that means it has to be some kind of “smartwatch”, right?

I don’t think so.

The Market

First, let’s look at the market for quality timepieces: the kind you’d be proud to wear on your wrist. It’s dominated by companies with centuries of experience. It’s also a high-end market, one where spending a few thousand dollars on a nice watch is chump change. You’re buying a work of art.

Apple certainly has great designers, but they’re going to be competing against craftsmen who’ve been refining their craft since the 15th century.

I realize that this sounds a lot like Ed Colligan talking about mobile phones:

“PC guys are not going to just figure this out. They’re not going to just walk in.”

There’s a big difference here: the companies that dominated the music player and mobile phone markets were making complete crap prior to Apple’s arrival. Granted, there are a lot of cheap and crappy watches on the market, but they’re not remotely interesting to the demographic that buys Apple products. And to many people, a fine timepiece is more about status than technology.

Taking all of this into consideration, watches don’t sound like a product category that fits Apple well. One of the many reasons we love their products is that they are best of breed.

The Watch

Now, let’s look at the state of the “smartwatch” market. Smart companies, like Nike and Fitbit, are creating products that aren’t really a “watch” at all. It’s like the “phone” being the least interesting thing about our “smartphone”.

In fact, Tim Cook is already wearing one of these devices:

Walt: Is wearables a thing — is it part of the post-PC era wearables that go beyond fitness devices?

Tim: Yes, I think so. I wear a Fuelband, I think Nike did a great job.

Note: This quote, and the others in this piece, come from Tim Cook’s interview at D11.

However, once manufacturers try to put a lot of smarts in a watch, they end up with a product that’s a bulky mess or ugly as sin. Probably both.

Again, Tim Cook has noticed this same thing:

“There are lots of gadgets in the space. I would say that the ones that are doing more than one thing, there’s nothing great out there that I’ve seen. Nothing that’s going to convince a kid that’s never worn glasses or a band or a watch or whatever to wear one. At least I haven’t seen it. So there’s lots of things to solve in this space.”

We all know that Apple’s designers and developers can solve these problems. But can they do it without making compromises?

“It’s an area that’s ripe for exploration, it’s ripe for us to get excited about. Lots of companies will play in this space.”

That exploration is all about saying no a thousand times. It’s easy to create a great-looking concept, but turning that fantasy into a product is where it gets hard and where compromises are often made.

It’s my view that the technology needed to make that great product that does everything just isn’t there yet. There are too many compromises.

The New Hub

Let’s assume for a second that you can make a compelling device for a wrist. I’d still be asking myself this question: “Why do I need a computer on my wrist?” I already have one in my pocket.

With the introduction of CarPlay, Apple’s answer to this question is becoming clearer. That device in your pocket is the new hub for your digital data. Ben Thompson calls it Digital Hub 2.0.

You don’t need a computer on your wrist. Apple’s wearable devices will become a part of this circle that surrounds your main device. And as with CarPlay, it’s going to be a two-way street where interactions and information on the peripheral make their way back to the main device.

The Fashion

The development of wearable technology is going to enter a space where most companies in Silicon Valley have no experience. As a geek, do you know who Anna Wintour is and why she would be important to the success of a wearable device?

Angela Ahrendts certainly does…

Put aside your geek sensibilities for a second and think about this device as a piece of fashion instead of a cool gadget. What’s the primary function of fashion?

It’s about expressing a personal style. It’s an extension of your personality and a way to show it publicly.

Look at the huge variety of iPhone cases that are on the market. That’s only 16 examples of 4.7 million search results. Fashion is as diverse as the people who wear it.

An important part of this diversity is the customer’s gender: men have different tastes than women. That concept watch mentioned above looks fantastic, if you’re a man. I can’t imagine my wife wearing it.

Then there’s the question of size: as someone who’s 6’7″ tall, I know firsthand that “one size fits all” is the greatest lie ever told by clothing manufacturers. Since everyone’s body is unique, wearable devices must also consider comfort and fit in the design.

Apple may be able to use something like an iPhone case to make a wearable device appeal to both sexes. Or adjustments that make a single device (SKU) adapt to multiple body shapes. But I suspect that it’s going to take more than that to be appealing as a piece of fashion.

The Customer

Now that we’ve looked at the challenges in developing these devices, let’s think about the most important question Apple is asking itself:

Who is going to buy this wearable technology?

Trends are always set by the younger generation, especially with clothing, jewelry, and other items that appeal to a demographic with a lot of disposable income. To me, this quote by Tim Cook is the most telling:

“To convince people they have to wear something, it has to be incredible. If we asked a room of 20-year olds to stand up if they’re wearing a watch, I don’t think anyone would stand up.”

This response to Kara Swisher’s question about Apple’s interests in wearable technology covers all the bases. It includes the target market (“20-year-olds”), product focus (“has to be incredible”), and most importantly, he’s seeing the same thing I am: people don’t need to wear watches because they already have that computer in their pocket.

Note also that in the response he doesn’t say “wear a watch”, it’s “wear something”. It’s implied, but not stated. Remember that he learned from the master of misdirection: Steve Jobs.

The Product

Given everything presented above, it’s pretty clear to me that a “smartwatch” isn’t in Apple’s immediate future. But they’re clearly interested in wearable technology. So what are the alternatives for a product that could be released this year?

The first step is to start looking at things from Apple’s point of view. I ask myself, “What problems can a wearable device solve?”

As I think about answers to that question, it leads me to the conclusion that Jony Ive and crew aren’t looking solely at the wrist. Wearable technology could take cues from other kinds of jewelry: rings and necklaces, for example.

What if Apple’s entry into this space is a ring?

  • Limited display — A discreet way to provide notifications—just an LED or two that indicate what’s happening on your iPhone. Maybe even a small, flexible E Ink display for high contrast text.
  • Tactile — A way for your finger to sense a notification. It’s easy to miss audio cues in a noisy room or a vibration in your pocket.
  • Small & light — A hallmark of Apple design. Simplicity in form and function.
  • Customer appeal — Those 20-year-olds don’t wear watches, but jewelry is already a part of their culture. “Bling” came from the world of hip hop.
  • Sensors — Your finger can provide pulse and blood pressure readings for Healthbook.
  • Power — The Lightning connector is small for a reason.
  • Competition — While all its competitors are rushing watches to market, Apple catches them with their pants down.
  • Low cost — How many of these rings could Apple sell if they were priced at $99?

But the biggest feature of all would be that this wearable device could support iBeacon. This technology is based on Bluetooth LE, which Apple describes as “a new class of low-powered, low-cost transmitters that can notify nearby iOS 7 devices of their presence.”

Let this sink in for a second: your wearable device is transmitting a signal with a unique identifier that can be picked up by an iOS 7 device. And the proximity detection is sensitive to within a few inches. Presumably, this signal could also be detected on your Mac, since Macs have supported Bluetooth 4.0 since mid-2011.

By wearing this ring on your finger, your devices can know how close you are to them.

This opens up a world of possibilities: imagine the joy we’d all feel when a notification only popped up on the device we’re closest to. Right now my ring finger is hovering 2-3 inches above my MacBook Air’s keyboard, while the phone in my pocket is over a foot away. Notification Center needs this information.
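To make this concrete, here’s a minimal sketch of how an app might figure out which device the ring is closest to, using Core Location’s existing iBeacon ranging calls. The ring, its UUID, and the identifier below are all hypothetical; only the API is real.

import CoreLocation

// A rough sketch: range for a (hypothetical) ring beacon and decide whether
// this device should be the one showing notifications.
class RingProximityMonitor: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private let ringRegion = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
        identifier: "com.example.ring")

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startRangingBeacons(in: ringRegion)
    }

    func locationManager(_ manager: CLLocationManager,
                         didRangeBeacons beacons: [CLBeacon],
                         in region: CLBeaconRegion) {
        guard let ring = beacons.first else { return }
        switch ring.proximity {
        case .immediate, .near:
            // The wearer is right here: this device should show the notification.
            print("Ring is within arm's reach")
        default:
            // The wearer is elsewhere: stay quiet and let a closer device handle it.
            print("Ring is out of range")
        }
    }
}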

Conclusion

Predicting Apple’s future is always fraught with difficulties. I may not be right about the end result being something other than a watch, but I’m certain that there are people at Apple thinking about the issues I’ve outlined above.

And I, like many others, am really looking forward to wearing Apple.

Font Points and the Web

When sizing fonts with CSS, there’s a rule of thumb that states:

1em = 12pt = 16px = 100%

There are even handy tools that help you calculate sizes based on this rule.

But this rule of thumb leaves out a very important piece of information. And without that information, you could be left wondering why the size of your type doesn’t look right.

Testing Points and Ems

To illustrate, let’s take a look at some work I’ve been doing recently to measure text accurately on the web. I have a simple test page that displays text in a single font at various sizes:

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8" />
  <title>Em Test</title>
  <meta name="generator" content="BBEdit 10.5" />
  <style>
    body {
      font-family: "Helvetica Neue", sans-serif;
      font-size: 100%;
      line-height: 1;
    }
    p {
      padding: 0;
      margin: 16px 0 16px 0;
    }
  </style>
</head>
<body>

<p style="font-size: 2em; background-color: #eee;">MjṎ @ 24pt / 2em</p>

</body>
</html>

You can view a slightly more complex version of the page here.

The W3C recommends specifying fonts using either ems or pixels. Since high resolution displays are becoming ever more common, that really leaves you with just one choice: the em.

(Note: In the examples that follow, I’m using text set at 2em since it’s easier to see the problem I’m going to describe. The effect, however, will happen with any em setting.)

The body is set in Helvetica Neue at 100% so 1em will be the equivalent of 16px when the user’s browser is using a default zoom setting. The line-height is 1 so there’s no additional padding around the text: the default of “normal” would make the line approximately 20% taller.

When you display the test page, everything looks good because the sizes are all scaled correctly relative to each other. A quick check of the light gray background with xScope shows that the height of the paragraph element is 32 pixels (2 ems):

Browser

But then things got confusing.

Points aren’t Points

I had just been measuring some text in TextEdit and noticed something was amiss. Comparing the two “24 point” fonts looked like this:

TextEdit versus Browser

I’m no typography expert, but there’s something clearly wrong here: a 24pt font in TextEdit was noticeably smaller than the same size in my web browser.

I confirmed this behavior in three browsers running on my Mac: Safari/Chrome (rendering with WebKit) and Firefox (rendering with Gecko) displayed the larger version of 24 point text. Why were the sizes different?

After some experimentation, it appeared that the rendered glyphs were from a 32pt font:

72 DPI

What the hell?

A Brief History of Type

When confronted with a problem like this, it’s always a good idea to question your assumptions. Was I measuring the right thing? So I started learning more about type…

In general, points are meaningless. They’re a relic of the days of metal type, where the size specified the height of the metal, not the mark it made on the page. Many people mistake the size of a glyph (shown below with red bars) for the size of a point (green bars):

Point versus Glyph

(Photo by Daniel Ullrich. Modifications by yours truly.)

Even things like the word “Em” got lost in translation when moving to the web: it originally referred to the width of a typeface at a given point size. This worked in the early days of printing because the metal castings for the letter “M” were square. Thankfully, we’re no longer working in the 16th century.

In modern fonts, like Helvetica Neue, a glyph for a capital “M” may be square, but the flexibility of digital type means that the width and height of a point rarely match.

Thanks Microsoft!

After doing this research, I was sure I was measuring the right thing. The sizes of the glyphs should match, regardless of point size. Something else was clearly in play here.

Let’s just say I spent a lot of quality time with Google before eventually stumbling across a hint on Microsoft’s developer site. The document talks about using a default setting of 96 DPI. I’ve been spending a lot of time lately with the Mac’s text system, so I knew that TextEdit was using 72 DPI to render text.

I quickly opened a new Photoshop document and set the resolution to 96 pixels per inch (DPI). And guess what?

96 DPI

All the text got larger and it matched what I saw in my browser. Mystery solved!

72 DPI on the Mac is 75% of the 96 DPI used on the web. So the 24pt font I was seeing in TextEdit was rendered at 75% of the size of the 32pt font used in the browser.

Further research about how 96 DPI is used on the web turned up this Surfin’ Safari post on CSS units written by Dave Hyatt back in 2006:

This is why browsers use the 96 dpi rule when deciding how all of the absolute units relate to the CSS pixel. When evaluating the size of absolute units like pt browsers simply assume that the device is running at 96 CSS pixels per inch. This means that a pt ends up being 1.33 CSS pixels, since 96/72 = 1.33. You can’t actually use the physical DPI of the device because it could make the Web site look horribly wrong.

That’s another way to think about this problem: a single point of text on your Mac will be 1.33 times larger in your browser.
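If it helps to see that arithmetic spelled out, here’s a tiny sketch of the conversion (in Swift, purely for illustration; the function is mine, not part of any framework):

import Foundation

// A point is defined as 1/72 of an inch, so the number of pixels used to
// draw it depends entirely on the resolution you assume.
func pixels(forPointSize points: Double, dpi: Double) -> Double {
    return points * (dpi / 72.0)
}

pixels(forPointSize: 24, dpi: 72)    // 24.0: TextEdit on a standard Mac display
pixels(forPointSize: 24, dpi: 96)    // 32.0: what a browser draws for "24pt"
pixels(forPointSize: 24, dpi: 144)   // 48.0: a Retina Mac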

Now What?

Now that we know what caused the problem, how can we avoid it?

The key piece of information that’s missing in the “1em = 12pt = 16px = 100%” rule is resolution. If you’re specifying point sizes without also specifying resolution, you can end up with widely varying results. For example, each of these is “24 points tall”:

Comparison

If you’re lucky enough to have a Retina Display on your Mac, you’ll be rendering text at 144 DPI (2 × 72 DPI). That’s 20% larger than the last example above (Windows’ 120 DPI setting). Display resolution is increasing, and so is the number of pixels needed to represent “a point”.

Note that you can’t “fix” this problem by specifying sizes in pixels. If you specify a font size of 16px, your browser will still treat that as 12pt and display it accordingly.

As a designer or developer, you’ll want to make sure that any layout you’re doing for the web takes these size differences into account. Some apps, like Photoshop, allow you to specify the resolution of a document (that’s how I performed my sizing tests). Make sure it’s set to 96 DPI. Resolution may not matter when you’re exporting images from Photoshop, but it will make a difference in the size of the text in your design comps.

Most other Mac apps will rely on Cocoa’s text system to render text, so if there’s no setting to change the document’s resolution, a default of 72 DPI should be assumed.

An alternative that would suck is to start doing your design and development work on Windows, where the default is 96 DPI. That setting also explains why web browsers use this resolution: remember when Internet Explorer had huge market share?

And finally, expect some exciting new features for measuring, inspecting and testing text in your favorite tool for design and development. I’m doing all this precise measurement for a reason :-)

Updated February 25, 2014: It turns out Todd Fahrner first identified this problem back in the late 1990s. His work with the W3C was instrumental in standardizing on 96 DPI. What’s depressing is that I knew about this work: Jeffrey Zeldman and I even developed the Photoshop filters mentioned on his site. Maybe I should send in my AARP membership form now.

Fingerprints

It looks like my hunch about the iPhone invite was right: new phones are likely to have “silver rings” that are fingerprint sensors embedded in the home button.

So what does this mean? Most people assume that it’s just going to be easier to access your home screen (without entering a passcode). But I think it goes much deeper than that.

iCloud services are a huge part of Apple’s next decade. Everything the company is doing these days has some kind of connection to data that is stored remotely. They’re investing heavily in new data centers.

And anytime you want to access this data, you’re logging into iCloud. Wouldn’t it be great if you could skip the part where you have to type in your Apple ID?

It’s clear to me that your unique fingerprint will be tied to your unique Apple ID. Once this connection between your physical and online presences is established, some very interesting things become possible. Let’s take a look at a few things I think might happen:

Protecting access to apps

From the beginning, I’ve wanted a way to protect my personal information when sharing a device with friends and family. But any secure solution to that problem would be a pain in the butt. Typing a password before launching an app? No thanks!

But imagine if opening your favorite Twitter app only required a fingerprint scan before proceeding. Everyone’s favorite Twitter prank would thankfully die. And more importantly, we’d all feel a lot better about using things like online banking apps.
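Here’s a rough sketch of what that gate could look like from the app’s side, assuming Apple eventually exposes a simple pass/fail check to developers. The LocalAuthentication-style calls below are illustrative, not anything Apple has announced, and the app never sees the fingerprint data itself:

import LocalAuthentication

// A sketch of an app-level fingerprint gate: ask the system for a match and
// get back a simple yes/no before showing anything sensitive.
func unlockPrivateContent(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // Bail out gracefully on devices without a sensor or an enrolled finger.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        completion(false)
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your account") { success, _ in
        DispatchQueue.main.async { completion(success) }
    }
}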

Corporate security

Most corporate networks are protected by VPN. The profiles that enable this network configuration often specify that a user must use a passcode lock. And it’s rarely a simple passcode. And it kicks in immediately.

Imagine needing to type in an eight-character password with letters and numerals just to check the current weather. That’s a reality for millions of people who use their device for both personal and business tasks.

A fingerprint scanner that avoids this complex password will sell a lot of devices.

Multiple accounts

There are many scenarios where you want to let someone do whatever they want with your personal device: a partner providing directions while you drive, a kid playing games in a waiting room, your parents looking at vacation photos. All these people have something different than you: a fingerprint.

Entering a login and password has always seemed out of place in the simplified world of iOS. But detecting which account to use when tapping on the home button actually makes it easier for members of your family to use your personal device: they don’t even have to slide to unlock.

And once everyone has their own personal space on the device, no one can mess with it. This is important in many contexts: even something simple like a game has sensitive information as soon as a sibling comes along and screws up high scores or play position.

Borrowing devices

Most of us back up our data to iCloud. That data is restored to a device when it’s first put into service or if something goes wrong.

Now imagine if you had the ability to restore your data onto any device with the touch of your finger. Borrow an iPad and make it yours instantly. Businesses that strive to make a customer feel at home, like hotels and airlines, would love this capability.

Personal identification

If you think Apple is going to give developers access to this biometric information, think again. Google would love this data, so you know it’s not going to happen.

Slowly but surely

Don’t expect all these things to appear on September 10th. Apple will start out simply and continue to improve the software that uses this new sensor. That’s how they roll.

Acknowledgments: The genesis of these ideas came from a conversation with my colleague Sean Heber. He’s the one who first made the connection between iCloud and your finger. Thanks also go to Ryan Jones for the links about the sensor.

Cheap as in Computing

There’s a good case for an iPhone that costs less. But with this lower cost, developers fear that device specifications will suffer.

As someone who’s actively working on an iOS 7 update, I’m noticing a definite pattern emerging: we’re removing a lot of shadows, gradients, and transparency. A lot of views that were previously required to make an app look at home on iOS 6 are no longer needed.

The visual simplification of iOS has led directly to a simplified implementation. As every developer knows, the less work your app does on a mobile device, the better it performs. It’s a lot easier now to make an app that feels fluid and uses less CPU and GPU resources.

While everyone focuses on what Jony Ive has put on the screen, he’s also made the hardware under that screen able to do more with less. And yet again, Apple’s tight integration of hardware and software is going to kick everyone’s ass.

Updated August 14th, 2013: A lot of Twitter followers are saying, “But what about the live blurs? Aren’t those CPU/GPU intensive?”

Yes, they are. And you should also note that access to those features is strictly through private API. An iPhone 4 turns off blur; an iPhone 5C could do the same if necessary.

(If you look closely at the blur behind a toolbar, you notice that there’s some kind of sub-sampling of the screen image. Because this implementation is private, the algorithm could also be adapted for other devices.)

There’s also the question of all the new dynamics in the UI (like bouncy messages). The highest costs on a GPU, in both computation and hardware components, come from dealing with a lot of textures. The math for a physics engine is relatively easy to handle.
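For a sense of how little code (and how little GPU work) those dynamics require, here’s a minimal sketch using UIKit Dynamics, the kind of API behind these bouncy effects. A couple of behaviors and a few floating-point properties describe the whole simulation; the bubble view is just a stand-in:

import UIKit

class BouncyViewController: UIViewController {
    private var animator: UIDynamicAnimator!

    override func viewDidLoad() {
        super.viewDidLoad()

        // A stand-in for a message bubble; any UIView can be a dynamic item.
        let bubble = UIView(frame: CGRect(x: 40, y: 40, width: 120, height: 44))
        bubble.backgroundColor = .lightGray
        view.addSubview(bubble)

        animator = UIDynamicAnimator(referenceView: view)

        // Gravity pulls the bubble down; a collision boundary at the edges of
        // the reference view makes it bounce instead of falling off screen.
        let gravity = UIGravityBehavior(items: [bubble])
        let collision = UICollisionBehavior(items: [bubble])
        collision.translatesReferenceBoundsIntoBoundary = true

        // Elasticity controls how much energy survives each bounce.
        let bounce = UIDynamicItemBehavior(items: [bubble])
        bounce.elasticity = 0.6

        animator.addBehavior(gravity)
        animator.addBehavior(collision)
        animator.addBehavior(bounce)
    }
}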

AMBER Alert Usability

If you live in the State of California, you got an AMBER Alert last night just before 11 PM. If you have an iOS device, you saw something like this on your lock screen:

Or like this in Notification Center:

And if you’re like most people, you had no idea what it meant. Considering it’s a broadcast to locate missing children, that’s a bad thing. Let’s examine the usability of this alert and think about how this system can be improved.

Updated August 6th, 2013: Lex Friedman has written a great summary of AMBER Alerts at Macworld. Among the revelations: the text of the messages is limited to just 90 characters!

First Impressions

In our household, there are many iOS devices. At the time of the alert, we were in the living room with two iPads and an iPhone. The alert’s sound is designed to get your attention, and that it did!

Both my wife and I gave each other that “what the hell is wrong” look as I grabbed my iPhone from my pocket. Turns out we weren’t the only ones who were frightened by the sound:

My wife’s first question as I looked at my phone was “Are we having a tsunami?” (we’ve had these kinds of emergency broadcasts before). I replied with, “No, it’s an AMBER Alert.” To which she replied, “What’s that?”

And therein lies the first problem: I had no idea.

Unlike all other notifications on my iPhone, I couldn’t interact with the alert. There was no way to slide the icon for more information or to tap on it in Notification Center to learn more. Through a combination of Google and my Twitter timeline, I eventually figured it out:

But I was also seeing a lot of people on Twitter whose response to the confusion was to ask how to turn the damn thing off. And since AMBER Alerts aren’t affected by the “Do Not Disturb” settings, a lot of people went to Settings > Notification Center so they wouldn’t get woken again in the future.

That’s exactly what you don’t want to happen when a child is abducted.

Alert Text

In looking at the message itself, it’s hard to tell what it’s about. Starting an alert with an acronym may make sense to the government, but there’s wording that could resonate much more effectively with normal folks:

It’s also hard to tell if the problem is with “Boulevard, CA” or a “Nissan Versa”. And where is Boulevard? What does a Nissan Versa look like? Who do I call if I see license plate 6WCU986? In short, this notification raises more questions than it answers.

This one image has answers to all of these questions and more. Why can’t I see that image after tapping on the notification?

The Alert Sound

As I mentioned above, the sound definitely let us know that something was wrong. But we were sitting comfortably in our living room. A friend of mine was driving at the time and probably listening to music from their iPhone on a car stereo. Being startled at high speed is dangerous:

Unlike the devices that existed when the AMBER Alert system was first introduced in 1996, our iPhones and iPads do a lot more than calls and texts. Music and video are immersive activities and hearing a loud horn can be a cure that’s worse than the disease.

Also, we’re all conditioned to immediately look at our phone when the normal alert sounds are used: a simple ding would have gotten just as much attention.

Do Not Disturb

And what the heck is up with this crazy sound happening if Do Not Disturb is turned on? Dammit, it’s my phone and I get to tell it what to do. Stupid Apple!

Well, take a look at your government first:

And if you want the details, it’s only going to cost you $205 to download:

Bottom line: these bugs aren’t going to be easy fixes.

(And if you think getting woken up because of an AMBER Alert is such a terrible thing, why don’t you explain your pitiful situation to this boy’s parents.)

File A Radar

Even though these bugs probably aren’t Apple’s direct problem, I’m still going to file a bug report. If you have a developer account, please feel free to duplicate that Radar.

Apple is in a unique position to address these issues:

  • It has direct access to millions of customers and their mobile devices.
  • It employs many people with a deep understanding of how mobile devices are affecting our lives.

This is clearly a problem where cooperation between Apple, the Department of Justice, and the public can improve a system so that everyone benefits. Better usability with AMBER Alerts is a case where “think of the children” isn’t a trite platitude.