Bug Writing Season

It’s that time of year: Apple has just released a bunch of amazing new things and we start poking at them. And of course, we encounter things that don’t work like we’d want or expect.

Apple’s engineers like to remind us that there’s a good way to handle that situation. You’ll also see a lot of bug report numbers pass by on the Twitter and Facebook feeds as your colleagues ask for duplicates. We’ll all be spending a lot of time writing Radars.

Now’s the perfect time to start using QuickRadar. As its name suggests, this project, run by Amy Worrall, makes creating or duplicating bug reports much quicker. You’ll also find that a native Mac user interface is much easier to deal with than some web form pretending to be iOS 6.

The QuickRadar icon can be shown in your dock or in the menu bar. When you click on the icon or use the global hot key, you’ll be presented with the following UI:

[Screenshot: the QuickRadar submission window]

Note: Since Apple doesn’t provide an API for Radar, the app works by scraping web pages. You’ll need to enter your Apple ID and password in Preferences before you can submit a bug report (and you have the source, so you know exactly how those credentials are being stored and used—it’s all done securely.)

Just fill in the fields and hit Submit. If you have your own internal bug tracker or want to document a workaround, you’ll probably want to copy the Radar number to the clipboard. Sending the bug to Open Radar is awesome if you want to let other developers know about the issue and/or get some duplicate bug reports.

If you want to duplicate a Radar, all you need is the bug report number. For example, say you want to complain about the new developer website being a huge pain in the ass. Just select File a Duplicate… from the QuickRadar menu and you’ll be presented with:

[Screenshot: the File a Duplicate window]

After entering the bug number, the user interface will be pre-populated with the information from Open Radar along with a header saying “This is a duplicate of rdar://21372721”. Click Submit and you’re done.

We all know that writing bugs takes time away from doing more interesting things, but it’s still our best way to effect change within Apple. With these tips, you’ll be able to get back to coding as soon as possible!

An @import-ant Change in Xcode

How many times have you done something like this?

(lldb) p self.window.bounds
error: property 'bounds' not found on object of type 'UIWindow *'
error: 1 errors parsing expression

Followed by a quietly whispered “crap” and some more typing:

(lldb) p (CGRect)[self.window bounds]
(CGRect) $0 = (origin = (x = 0, y = 0), size = (width = 375, height = 667))

And even if you remember to do it when debugging views, you’ll forget in some of the more arcane situations:

(lldb) p CGPointFromString(@"{10.0, 20.0}")
error: 'CGPointFromString' has unknown return type; cast the call to its declared return type
error: 1 errors parsing expression
(lldb) p (CGPoint)CGPointFromString(@"{10.0, 20.0}")
(CGPoint) $1 = (x = 10, y = 20)

Buried deep within the Xcode 6.3 release notes there is a true gem that can relieve this daily frustration.

LLDB’s parser for Objective-C can now go through any module used in your app and determine the types used for all functions and methods it defines. If you’re using UIKit in your app, you can do this:

(lldb) expr @import UIKit

Which will save a lot of subsequent typing:

(lldb) p self.window.bounds
(CGRect) $4 = (origin = (x = 0, y = 0), size = (width = 375, height = 667))
(lldb) p CGPointFromString(@"{10.0, 20.0}")
(CGPoint) $5 = (x = 10, y = 20)

Note that the app must be linked against the module being used in the @import. For example, if you try to use that command in a Mac app, you’ll get:

(lldb) expr @import UIKit
error: Header search couldn't locate module UIKit
error: 1 errors parsing expression

And if you’re a forgetful developer like I am, you can save yourself some more typing by adding this to ~/.lldbinit:

command alias uikit expr @import UIKit
command alias foundation expr @import Foundation

For those of us who work on both Mac and iOS projects, it’s just a matter of typing “uikit” or “foundation” at the beginning of a debugging session:

(lldb) uikit
(lldb) p self.window.bounds
(CGRect) $4 = (origin = (x = 0, y = 0), size = (width = 375, height = 667))

Note that you’ll need to use the “uikit” alias every time you start a new debugging session: the @import is not maintained across build and run cycles. Similarly, if you add something to ~/.lldbinit, it won’t be read until the next time you start debugging. Also, as much as we’d all love to automatically do the @import in the LLDB initialization file without typing “uikit”, that’s not currently possible (probably because this file is loaded before any modules are known by the debugger.)

Updated: Steve Streza came up with a brilliant hack: set a breakpoint in the application delegate that runs the “uikit” alias and automatically continues.
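If you’d rather set that up from the LLDB console than click through Xcode’s breakpoint editor, a rough sketch looks like this (the AppDelegate class and method names are just placeholders for whatever your app uses, and the breakpoint is assumed to get ID 1):

(lldb) breakpoint set --name "-[AppDelegate application:didFinishLaunchingWithOptions:]"
(lldb) breakpoint command add 1
Enter your debugger command(s).  Type 'DONE' to end.
> uikit
> continue
> DONE

You can get the same effect from Xcode’s breakpoint editor by adding a “Debugger Command” action that runs “uikit” and checking the option to automatically continue; that version has the advantage of sticking around between debugging sessions.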

Thanks to Oliver Letterer for the tweet that inspired this post and Peter Steinberger for bringing it to my attention with a retweet. This little trick is going to save us all a lot of time and frustration!

discoveryd Clusterfuck

I usually keep things fairly clean on this site. I have a simple metric: would I be embarrassed if my Mom read this post? As you’ve probably guessed from the title, this post is going to be different.

So, Mom, it’s time to stop reading. I’m pissed off and you know how I get when that happens.

In case you’re wondering what I’m talking about, look at this shit. A network process using 100% of the CPU, WiFi disconnecting at random times, and device names incrementing themselves: names, names (1), names (2), names (4). All caused by a crappy piece of software called discoveryd.

I started reporting these issues early in the Yosemite beta release and provided tons of documentation to Apple engineering. It was frustrating to have a Mac that lost its network connection every few days because the network interfaces were disabled while waking from sleep (and there was no way to disable this new “feature”.)

Despite the many issues people were reporting with discoveryd, Apple went ahead and released it anyway. As a result, this piece of software is responsible for a large portion of the thousand cuts. Personally, I’ve wasted many hours just trying to keep my devices talking to each other. Macs that used to go months between restarts were being rebooted weekly. The situation is so bad that I actually feel good when I can just kill discoveryd and toggle the network interface to get back to work.
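For the record, that “get back to work” incantation is nothing fancy. From Terminal, it’s something like this (assuming the Wi-Fi interface is en0; adjust to match your hardware, and note that launchd should bring discoveryd right back after it’s killed):

# force discoveryd to restart
sudo killall discoveryd

# bounce the Wi-Fi interface
networksetup -setairportpower en0 off
networksetup -setairportpower en0 on

Some days even that isn’t enough and a full reboot is the only fix.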

The only good thing that’s come out of this whole situation is that we now have more empathy for the bullshit that folks using Windows have suffered with for years. It’s too bad that Apple only uses place names from California, because OS X Redmond would be a nice homage.

It’s no secret in the tech community that discoveryd is the root cause of so many problems. There are even crazy workarounds. With so many issues, you’d expect some information from Apple explaining ways to mitigate the problems.

Nope.

The only explanation I can come up with for this astounding lack of information is that there’s some mid-level product manager at Apple who’s covering their ass. I hope this person who’s responsible for withholding advice feels good about themselves, because the rest of us hate them with the burning passion of a thousand suns. Being stingy with knowledge in an engineering organization is a fucking stupid career move.

To give you an idea of how much a tiny piece of information can do for people’s productivity, let me give you a simple example that’s already saved me hours of frustration.

For months, I’ve seen bullshit like this in Bonjour:

[Screenshot: phantom xScope services in a Bonjour browser]

That shows the xScope service on the Mac that provides data for the Mirror on iOS. In that screenshot, the service is being shown as available on three devices: one with just an IPv6 address, one with no IP addresses, and one with a duplicate IPv6 address and a valid IPv4 address. Renaming the Mac to “CedarX” was the only way I could find to keep its name from incrementing (and breaking everything that relies on that device’s host name.)
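If you want to poke at this kind of garbage on your own network, the dns-sd tool that ships with OS X will browse and resolve Bonjour services from the command line. The service type and instance name below are just examples; substitute whatever you’re chasing:

# browse for services of a given type on the local network
dns-sd -B _http._tcp local.

# resolve a specific instance to its host name, port, and TXT record
dns-sd -L "Some Service" _http._tcp local.

Watching those results while devices come and go makes it pretty obvious when a stale advertisement is hanging around.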

The “funny” thing is that this Mac is running the latest version of 10.10 with fixes for “WiFi issues”. And after tweeting about it in frustration, I got a helpful response from Hendrik.

I followed Hendrik’s advice and guess what? No more network issues.

Bonjour keeps a cache that’s shared amongst devices on the network. This is so that if a device is asleep, another one that’s awake can provide the necessary information. I suspect that a device running an older version of discoveryd poisoned this cache. For some reason, the invalid cache information couldn’t be corrected by a newer version of the very software that screwed things up in the first place.

But this is all just conjecture because Apple hasn’t written that fucking tech note.

This situation also shows another important aspect of the discoveryd clusterfuck: this code is all over the place. It’s in use by iOS, OS X and presumably whatever is running on the Apple Watch. As such, any one of those devices can poison Bonjour for everything else on your network.

This workaround is fairly simple if you’re on a home network where you have direct physical access to all the devices. But as we all know, wireless networking is essential in places like an office, an airport or a coffee shop. Good luck rebooting everything in that kind of environment. And what happens when someone running an older version of OS X connects to that network and poisons it? Time to reboot!

You also can’t rely on software updates to fix everything: I have both an Airport Express and Apple TV that are no longer receiving fixes. Having to buy new hardware because of crappy software adds insult to injury.

Ironically, these issues are most likely to affect Apple’s best customers. The more devices you have, and the longer you have them, the more likely you are to get an unstable network. The only advice I can offer is to restart your entire network.

C:\ONGRTLNS.OSX

A New Way to Display

To date, Apple’s Retina displays have relied on LCD technologies. Since the iPhone 4, our mobile devices have used in-plane switching (IPS) technology to achieve high resolution with accurate color from a wide range of viewing angles. The IPS LCD panels are also present in many of the desktop monitors we use on our Macs.

Chances are good that’s about to change with an OLED display in the Apple Watch.

Unless you’ve done work on Android, you’re probably unaware of how different AMOLED displays are from LCDs. Let’s take a look at what lies ahead.

Physical Differences

An LCD relies on a backlight that shines through a layer of liquid crystals and red, green and blue color filters. Those colors appear on screen because the crystals can be aligned electrically to let the backlight pass through each sub-pixel.

OLED has no backlight and only one layer that produces light. That layer is an organic compound that emits light when subjected to an electric current. Drive those emitters with a two-dimensional matrix of transistors (the “active matrix”) and you end up with an AMOLED display.

Based on these short descriptions, it’s easy to see why an OLED is thinner than its LCD counterpart: there are simply fewer layers of electronics. And it gets even better when you discover that the compounds and electronics can be fabricated on flexible plastic substrates.

LCDs have always been problematic in direct sunlight because the backlight must pass through filters that can also reflect ambient light. This has also been an issue for OLED displays, where the light is emitted below a reflective metal cathode. The good news is that recent advances in the technology are producing results brighter than an LCD.

The biggest challenge for OLED is the organic material where light is emitted. Basically, byproducts of the chemical reaction that produces light accumulate over time and reduce the efficiency of the output. Again, this is an area where manufacturers are focusing their efforts. In just a few years the lifespan of these devices has increased by several orders of magnitude, but it’s still limited to tens of thousands of hours. Don’t expect your Apple Watch to become a family heirloom.

How Not to Do It

OLED displays have gotten a bad rap on mobile devices primarily because of a thing called PenTile.

PenTile mimics how our eye works: 72% of the luminance we perceive is determined by the green wavelengths of the electromagnetic spectrum. The RGBG arrangement of sub-pixels lets a display get brighter without increasing the overall number of transistors needed. This, of course, keeps manufacturing costs down.
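That 72% figure lines up with the standard Rec. 709 relative luminance weighting, where R, G and B are the linear red, green and blue components and green carries most of the perceptual load:

Y = 0.2126 R + 0.7152 G + 0.0722 B

With roughly 72% of perceived brightness coming from green, doubling up on green sub-pixels (the second G in RGBG) is the cheapest way to make a panel look brighter.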

Unfortunately, this physical layout of the light emitters makes colors grainy and text hard to read, and color accuracy suffers. PenTile is also a trademark of Samsung. I can’t see Apple using this approach in their Retina displays.

It’s much more likely that Apple’s industrial designers have been working hard to find a new and better way to use OLED technology without losing fidelity. I can’t wait for someone to look at the Apple Watch’s display under a microscope.

Black is Best

From Apple’s point of view, one of the most important things about OLED is how it consumes power. A pixel on the display only uses energy when it’s producing light. Compare this with an LCD backlight, which must be lit for any pixel to be visible.

Folks with OLED displays on their Android devices have figured out that a lot of black pixels make the battery last longer. So has Apple.

It’s also important to remember that each pixel on the display has a limited lifetime. The less time the OLED spends producing light, the longer it lasts.

One of my first impressions of the Apple Watch user interface was that it used a lot of black. This makes the face of the device feel more expansive because you can’t see the edges. But more importantly, those black pixels are saving power and extending the life of the display. It’s rare that engineering and design goals can align so perfectly.

And from what we’ve seen so far of the watch, that black is really really black. We’ve become accustomed to blacks on LCD displays that aren’t really dark: that’s because the crystals that are blocking light let a small amount pass through. Total darkness lets the edgeless illusion work.

Flat Black

I’ve always felt that the flattening of Apple’s user interface that began in iOS 7 was as much a strategic move as an aesthetic one. Our first realization was that an unadorned interface makes it easier to focus on content.

But with this new display technology, it’s clear that interfaces that light up fewer pixels have another advantage. A richly detailed button from iOS 6 would need more of that precious juice strapped to our wrists. Never underestimate the long-term benefits of simplification.

The Future

Apple is a company that likes to leverage its technologies across a wide range of products. Look at how many of their devices are using IPS LCD displays now, then imagine a move to an OLED display pioneered by the Apple Watch.

When Jony Ive taps the home button on his iPhone and says, “The whole of the display comes on. That, to me, feels very, very old,” it’s a sign that individually addressable sources of light are the wave of the future.

Of course it will take time for this to happen, but you know it’s going to look awesome when they’re done. And along the way, we’ll learn to think about pixels differently.