iCloud Clusterfuck

Luckily, I don’t have to use this kind of title often. But when I do, there’s a good reason: this year’s beta release cycle for all of Apple’s operating systems has been a mess. The months since WWDC in June have been a terrible experience for customers and developers alike, and at the center of the chaos was Apple’s iCloud syncing service.

For us, it all started with customers reporting lost Linea sketches in their iCloud Drive. Initial investigations led to a common factor: all of the people affected had installed the iOS 13 beta release.

And when I say lost, I mean really lost. Entire folders were either gone or corrupted. Apple’s mechanism to recover deleted files was of no help. The customers with weird folder duplicates were the “lucky” ones.

We couldn’t find any problems in our code and there was no information from Apple about problems with iCloud, so we did the only thing possible: we blocked people from using Linea on iOS 13. If a customer absolutely needed the app, we got them onto TestFlight after a stern warning about the real possibility of losing data.

A few weeks later, Apple finally indicated that there were some issues with iCloud and the beta release. In the same week, they released a public beta and sent out an email to customers encouraging them to try out iOS 13.

We did our best to understand the situation and provide information to Apple, but it felt like we were tossing bug reports into a black hole. The most discouraging part was when we tried to open an incident with Apple Developer Technical Support (DTS). After writing up a detailed report, we were informed that they don’t support beta releases!

By the time beta 6 rolled around, things were improving, but there were still isolated reports that the service might not be completely back in working order. Confirming the fixes on our end was impossible: the folks who’d encountered the previous data loss had no desire to mess with iCloud again.

Now it appears that the entire stack is getting rolled back and there won’t be new iCloud features in iOS 13 (at least initially). I honestly think that’s the wisest course of action at this point. My only wish is that Apple would make an official statement.

Now that we’re past the worst of it, it’s time to think about why this whole episode happened and how it can be prevented in the future.

Folks don’t understand beta

Apple’s biggest fuck-up was a bad assumption about who is testing a beta release. In WWDC presentations and developer documentation, there are always warnings about using the release only on non-production devices and only with test accounts.

But there are many folks who are just looking to get the new and shiny features. In past iOS beta releases, Apple hasn’t suffered too much from this because the early software was relatively stable. Maybe you got some dropped calls or bad battery life, but it was nothing too serious.

These early adopters installed iOS 13 and expected a similar experience. They also weren’t using an iCloud test account, so any instability in the beta release propagated bad data to their other devices.

Developers have long known to unhook external drives when testing a new OS release. Shit happens, and that’s OK because it’s a beta and we expect a bumpy road. In fact, I currently have an external Photos Library that I can’t use in Mojave because it got upgraded when I booted into Catalina: it’s my dumb mistake and I have no complaints.

Anyone who’s not a developer, and hasn’t been burned by a bad OS, does not know the kind of trouble that lies ahead. It’s irresponsible for Apple to release a public beta with known issues in iCloud. It’s doubly egregious to then promote that release with an email campaign to customers. For a company that prides itself on presenting a unified front, it sure looks like the left hand doesn’t know what the right hand is doing.

Disconnect iCloud by default

The solution to this situation is relatively simple, but with painful consequences. Apple needs to help unwitting customers by automatically disconnecting that external hard drive called iCloud.

If a device is using an Apple ID that’s also being used on a non-beta device, then iCloud shouldn’t be allowed. If you install an iOS beta on your iPad, it doesn’t get to use any cloud services because it puts the data on your iPhone or Mac at risk.

Of course, once this restriction is put into place, Apple will quickly realize that no one is testing iCloud or anything that touches it. If a beta tester can’t have their contacts, synced notes, reminders, and appointments over the course of three months, they’re not going to be testing much. I’d also expect to see lots of guides on “How to Get Back on Normal iOS” around the web.

But there’s a better, and even more unlikely solution…

iCloud can’t be a beta

Because it’s a service, iCloud doesn’t get to go into beta. It needs to be reliable all the time, regardless of whether iOS or any other platform is in beta test.

As it is now, Apple is effectively telling you that your storage device will be unreliable for a few months. It’s like having a hard drive where the manufacturer tells you it won’t work well for ¼ of the year. Would you purchase storage with a caveat that “the drive mechanism may not work properly during the hot summer months”?

You don’t see these kinds of issues with Google Drive or Dropbox because they don’t get a pass while an OS is being tested. Dropbox moved your folders from AWS to their own servers without you noticing even the slightest hiccup. Compare this to the last three months of iCloud in beta.

And yes, this is a hard task: akin to swapping out the engines of an airplane in mid-flight. But data services, like hard drives, must work all the time.

We’re well past that point with iCloud and it’s just going to get worse as Apple moves more of our data there. This time next year, will I suddenly not know which episodes I’ve watched on Apple TV? Or will I suddenly lose all my scores and achievements in Apple Arcade? Will I shoot myself in the foot with some cool new feature in iOS 14?

Unfortunately for Apple, breaking iCloud away from the OS isn’t a simple problem to solve. It’s not a technical challenge as much as it is an organizational one: a move from a functional form to a divisional one.

So while we all dream of an OS release roadmap to ease the transition to new features, Apple needs to pay very close attention to the underlying services that would enable that new way of working.

The lack of substantive communication about iCloud’s problems this summer shows how far the company has to go in that regard. You can’t use a roadmap without letting people know about your successes and failures. There will be wrong turns, and you can’t continue on in silence when people depend on the service’s reliability.

iCloud gives data a bad name

As a developer whose job requires frequent beta installations, this year’s problems with Apple’s service have brought something clearly into focus: as much as I love the functionality that iCloud provides, I cannot put my valuable data at risk.

And you can bet your ass that I have a new backup strategy for anything that’s in iCloud Drive. (Hint: rsync and ~/Library/Mobile Documents.)

As an Apple shareholder, I also worry about how these failures will damage the iCloud brand. Apple’s growth hinges on services and when they get a bad name, the stock price can only go one way. There is only one brand that people think of when you mention “click of death”.

I sincerely hope that Apple can address these issues, because I love the idea of services that focus on simplicity and privacy. I know my last clusterfuck memo got some notice within the company, and I hope this one does as well.

The Future of Interaction

Shortly after finishing my treatise on Marzipan, I started thinking about what lies beyond. Some of those initial thoughts made it into a thread on Twitter.

This post can be considered an addendum or a hell of a long footnote: in either case, you’ll want to start by reading my thoughts on Marzipan. Because what’s happening this year is just the start of a major shift in how we’re going to build apps.

So while everyone else is making predictions about WWDC 2019, what you’ll find below are the ones I’m making for 2020 and beyond.

Update June 7th, 2019: It turns out I was making predictions for 2019, after all.

What is Apple’s Problem?

Before we get into thinking about the future, let’s look at a problem that Apple has today: too many products on too many platforms.

Historically, a person only had one computer to deal with. For several generations it was a mainframe; more recently it was a PC. One of the disruptive changes that started with the iPhone was the need to juggle two computers. Now we have watches generating and displaying data: another computer. Increasingly, the audio and video devices in our living rooms are added to the mix.

Syncing and cloud services help manage the data, but we all know the challenges of keeping a consistent view on so many machines.

If you’re an iMessage developer, you have to think about a product that works on iOS, macOS, and watchOS. You get a pass on tvOS, but that’s small consolation. The same situation exists in various combinations for all of Apple’s major apps: Music, Calendar, Reminders, Notes, Mail, etc.

It’s likely that all of these apps share a common data model, probably supported by an internal framework that can be shared amongst platforms. That leaves the views and the controllers as an area where code can’t be shared.

Marzipan is About Views

With this insight, it’s easy to see Marzipan as a way towards views that share code. A UIView can be used on your TV, on your desktop, on your wrist, and in your pocket. That’s a big win for developer productivity.

It’s also a win for designer productivity: you can share app design elements. We already see this in Apple’s cross-platform apps when colors in Calendar match, speech bubbles in Messages have the same shape, and Notes shares a special glyph for the “A”.

Everyone’s excited to know what the Dark Mode on iOS is going to look like. My guess is that people who have been running a dark user interface on their Mac have already seen it. It’s hard to find a balance of readability and contrast with dark elements and I don’t see Apple’s designers making any major changes in next week’s announcement. There will certainly be refinements, but it makes no sense to throw out the huge amount of work that’s already been done.

I also see the Mac leading the way with techniques that provide a more vibrant interface and allow a customer to customize their device. The accent color in System Preferences would be a welcome addition in the iOS Settings app.

All of this leads to a common appearance across platforms. In the near future, we’ll be in a nice place where our architecture can be shared in models, and our designs shared in views.

That leaves us with one final problem to solve: how do we share our interactions and controllers?

The Arrival of New Interactions

Apple ties interactions to platforms and their associated hardware. The Mac has interactions that are different than iOS. And watchOS has ones that are different than iOS. On tvOS, you can be limited to four arrow keys and two buttons.

There are some interactions, such as a swipe, that appear on multiple platforms, but as a whole each platform is different. This approach lets the customer get the most out of the device they purchase.

As we start to think about how interactions are shared amongst platforms, it’s wise to consider that new hardware might be arriving soon.

For the past few years, Apple has been putting a lot of effort into augmented reality (AR). And I have no doubt that this hard work is not for our current devices.

AR is a great demo on an iPhone or iPad, but the reality is that you can’t hold a device in front of your face for an extended period of time: your arm gets tired after just a few minutes. I made this argument over a decade ago when everyone was getting excited about multi-touch displays coming to their desktop. It still holds true because it’s a dumb idea, just like AR on a mobile phone.

Apple will solve this problem with new hardware. And these devices will run “headgearOS” with new and completely different interactions. If you’re that Messages developer, it means you’ll be writing new code for yet another platform. Yay.

There are other twists to this story: rumors about iPads with mice and Macs with touch screens. And let’s not forget about interactions with voice commands using Siri technologies.

It all adds up to a situation where the complexity of products is increasing exponentially as new devices and interactions are introduced. There has to be a better way.

How Not to Do It

Cross-platform frameworks have a long history of sucking. If you ever used a Java app during the early days of Mac OS X, you know immediately what I’m talking about: the interactions were from a different universe. The design of the system was a “least common denominator” where only a limited set of capabilities was exposed. It just felt wrong.

More recent attempts have had more success but they still fail to address the problem of ever-expanding interactivity.

Apple’s been down this road before and I don’t see them making the journey again. Instead, I see them taking a new and forward-thinking direction, the kind of bold and pragmatic change that Apple is famous for: they’d be setting themselves up for the next decade of user interaction.

So what could they do that no one else in the industry is doing?

Declarative Interactions

As Matt Gallagher notes, we’ve slowly been heading towards a declarative programming style.

Declarative programming is describing a system using a set of rules and relationships. The rules and relationships cannot be changed during the lifetime of the system (they are invariant), so any dynamic behavior in the system must be part of the description from the beginning.

Syntactically, declarative programming is often about assembling a whole system of rules as either a single expression or domain-specific language whose structure reflects the relationship between the rules in the system.

Layout is an inherently declarative task. Layout is a set of rules (which Auto Layout calls “constraints”) that apply to the contents of a view, ideally for the entire lifetime of the contents. Constraint programming itself is sometimes considered a sub-discipline of declarative programming.

He pulls this together with his own experiences into a prediction about a Swift-only framework for “declarative views”.

Independently, John Gruber has made similar observations.

The general idea is that rather than writing classic procedural code to, say, make a button, then configure the button, then position the button inside a view, you instead declare the button and its attributes using some other form. HTML is probably the most easily understood example. In HTML you don’t procedurally create elements like paragraphs, images, and tables — you declare them with tags and attributes in markup.

In my opinion, limiting this thinking to just views and layout is short-sighted.

That’s because it doesn’t address the interaction problem with an ever increasing set of platforms. But what if this new framework not only let you declare views, but also the behaviors they enable?

The developer would describe the interactions an app supports. There would be relationships between those declared interactions. All this immutable information would then be processed by user interface frameworks. Your app’s behavior would be determined at runtime, not when it was compiled.

Think about how this would work using auto layout as a point of comparison. Until that declarative system came along, we all worried about frame placement. Now we just worry about the relationships between those frames and let the system pick what’s best.
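You can see that shift in a few lines of code. Before Auto Layout, you computed a frame; after, you declare the relationships and let the system do the math. (A minimal Swift sketch; the button and the view controller’s view are assumed.)

```swift
// The old way: imperative placement. You do the arithmetic.
button.frame = CGRect(x: 147, y: 40, width: 80, height: 44)

// The declarative way: describe the relationships and let the system
// pick the frames, on any screen size, in any orientation.
button.translatesAutoresizingMaskIntoConstraints = false
NSLayoutConstraint.activate([
    button.centerXAnchor.constraint(equalTo: view.centerXAnchor),
    button.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor, constant: 40)
])
```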

With our interactions, we still have this tight coupling between a user action and the app’s behavior. Tapping on a button or a swipe gesture invokes some code directly. A declarative interaction would be a layer of abstraction between what a customer does and how your app reacts.

Again, using auto layout to help form our thinking, what if there were “interaction classes” that functioned like size classes? Your iPad would behave as it always has until you plugged in a mouse. At that point, how you interact with the device adapts:

  • Controls could get smaller because of the increased pointing accuracy
  • Views could gain a hover state to display additional information
  • Drag & drop could change because it no longer depends on two fingers
  • A mechanical wheel could replace a finger for scrolling

This kind of adaptability would work across platforms – your app would behave differently when it was running in augmented reality or on a TV screen. As a developer you wouldn’t have to worry about what kind of hardware is available; you’d have to worry about what to do when a customer used it to perform a task.

I’m not going to predict how this would be accomplished. Yes, it could draw inspiration from React or other similar technologies. The only thing I’m confident of at this point is that Apple knows it has a problem and is working actively to solve it in a platform-independent fashion.

Marzipan is our first step. And what I’ve described above is Amber, the next step.

Adulterated Swift

There has been some discussion lately about Swift not having the dynamic features of the Objective-C runtime. Brent Simmons has been doing a great job of pointing out why this is a problem.

It’s very easy to overlook the importance of the dynamic runtime environment. Many of the things it enables happen behind the scenes. There’s no better example of this misunderstanding than a developer who says their app is “Pure Swift”.

That’s because you can’t currently write an app that only uses Swift. The purity of your code is lost as soon as you import UIKit.

A picture’s worth a thousand words, so here’s a project that demonstrates why it’s impossible. The app itself is ridiculously simple: there’s a button, a few labels, and a single action method. It’s as “Pure Swift” as you can get.

Download Pure.zip

This project also contains an Objective-C category named NSObject+Adulterated. This category overrides two methods that lie at the heart of the dynamic runtime: -respondsToSelector: and -methodForSelector:. The normal operation of these methods is not affected; the only additional functionality is some logging to the console when they’re used. Other than being in the bridging header, the code is not used directly by the app.
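Based on that description, the heart of the category looks something like this (a sketch reconstructed from the description above, not the exact code in the download):

```objc
// NSObject+Adulterated.m
// Logs use of two dynamic runtime entry points while preserving their
// normal behavior with direct calls into the Objective-C runtime.
#import <Foundation/Foundation.h>
#import <objc/runtime.h>

@implementation NSObject (Adulterated)

- (BOOL)respondsToSelector:(SEL)aSelector
{
    NSLog(@"-respondsToSelector: %@", NSStringFromSelector(aSelector));
    return class_respondsToSelector(object_getClass(self), aSelector);
}

- (IMP)methodForSelector:(SEL)aSelector
{
    NSLog(@"-methodForSelector: %@", NSStringFromSelector(aSelector));
    return class_getMethodImplementation(object_getClass(self), aSelector);
}

@end
```

Replacing methods on NSObject with a category is, of course, exactly the kind of trick you should never ship; it’s only useful here as a diagnostic.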

When you run the app, you’ll immediately see all the places where Swift is not being used. Pretty much everything you take for granted in your app is using the dynamic runtime. Layout, drawing, animation, and event handling are all dependent on something that Swift doesn’t have yet.

Of course, you’ll immediately say, “This isn’t Swift’s problem! Just rewrite the frameworks!”

And that is exactly the point I’m trying to make.

The community around Swift’s evolution is amazing. The language is improving quickly and dramatically thanks to talented developers inside and outside of Apple. It’s a remarkable open source project.

My concern is that there isn’t a corresponding discussion about the things we’re going to build with this new language. As you’ve just seen, frameworks are important, yet there is no uikit-evolution mailing list. There is an imbalance between the tool and the craft.

I’m guessing that part of this problem lies within Apple itself. There are plenty of developers in Cupertino who have built large applications and the frameworks that they use. I’m absolutely certain that this subject has been discussed in detail by some very smart folks. And they, of course, can’t talk about internal projects.

My suggestion for the folks on the Swift project is to be a little more forthcoming about future plans with frameworks and other infrastructure besides the language itself. It’s a big piece of a puzzle that long-time app developers want and need.

A lot of hand-wringing could be ameliorated with a simple “Yep, we’re working on it.”

The Forensic Shit Show

It turns out someone at the FBI advised another law enforcement officer in San Bernardino to reset the iPhone that the government wants Apple to unlock.

This is just another episode in a complete forensic shit show.

Remember, this is the same case where the media was allowed to roam freely through a crime scene. One of the photos in that gallery shows a computer without an Ethernet connection on the wall (the age of the apartment also suggests that there would be no wired Internet).

What are the chances that there was a wireless network in that apartment? What are the chances that there are IP logs on that router? Or maybe some kind of data backed up to a disk on the router? Here’s another wild guess: maybe that router was used to connect to an online backup service.

Yep, someone did the equivalent of a “restore factory defaults” on a device under active investigation.

What we’re seeing here is law enforcement’s complete lack of understanding of how digital devices store and transmit data. This new evidence is much more intricate than smoking guns or blood spatter. The important stuff is what you don’t see: it’s a hard problem where the people dealing with it are untrained. Shit, I work in this business and trying to decipher what’s going on makes my head spin.

Yet law enforcement is asking Apple to not only provide data, but also to create a forensic instrument that allows them to extract information from any device. And by its very nature, this tool would be made widely available throughout the forensic and law enforcement community.

Basically, the government is asking Apple to hand over a golden key that can defeat the security of any device to folks who can’t even secure a wireless network. Worse, this whole process is being overseen by politicians who think the problem is predators getting access to their grandkid’s PlayStation.

This is why the entire tech community is saying “No fucking way.”

Updated February 21st, 2016: Several people have commented about my use of “restore factory defaults” in the post above. My intention was figurative, not literal.

The folks involved with the investigation were pressing buttons without understanding the consequences of their actions. To me, it feels like a “reboot to fix” approach. The password reset did not damage any data, it just made automatic backups stop working because iCloud information on the device needed to be updated, and that can’t be done without a passcode.

Others have reminded me that the FBI had cleared the crime scene. That’s true, but since the Wi-Fi equipment was not collected as evidence, it still shows that the investigators were out of their league. In an electronic investigation, a router is a key piece of the puzzle.

Both of these things are details in a bigger picture: the FBI wants to hold the private keys to a public key encryption system that affects the privacy of hundreds of millions of people. If they can’t get the details of an online backup service right, how the hell do we expect them to guard a back door?

There’s also a possibility that the iCloud password reset was intentional. If this is the case, we have a government that is extorting Apple by essentially planting evidence. Imagine what they could do with a private key.

Half-Assed

Every Mac developer that uses iCloud has a dirty little secret:

They don’t fully test their software before they ship it to customers on the Mac App Store. It’s because Apple won’t let them.

iOS developers, on the other hand, can upload a build to TestFlight and use the app with the iCloud production servers to make sure everything is working great before it gets sent to the App Store for review.

TestFlight has been available to internal developers since iOS 8 was announced in 2014. The system was opened up to external testers who have an iTunes account in the early part of 2015.

Mac developers have never had access to TestFlight, either internally or externally. It’s “coming soon”, and until that day comes, there’s no way to test apps that use the iCloud servers. Which sucks for both the developer and the customer.

But wait, there’s more.

Apple is touting analytics as an awesome new feature for developers that use the App Store to distribute their creations. It’s a huge benefit to our businesses, but only when you’re selling solely on iOS. This feature is nowhere to be found on the Mac App Store. Again, it’s “coming soon”.

Just yesterday, Apple did something great for developers. They now block reviews on beta OS releases. Unless that operating system is for the Mac.

Let me guess: it’s “coming soon”.

It doesn’t take a genius to see that Apple is doing something it rarely does: a half-assed job.

As developers, we completely understand how much work it is to announce these kinds of initiatives and get them working on multiple platforms. It’s not easy and takes a lot of resources. But it’s clear that these precious resources are not being allocated.

Apple needs to change its priorities for the Mac App Store or just shut the whole thing down. As it now stands, developers who are tired of being second-class citizens are making that decision for them and leaving on their own.

This is a pity because the Mac App Store is a great way for customers to download and purchase software. No one benefits from this half-assed job.

Updated July 23rd, 2015: I think the thing that bothers me most about this situation is the inequality. Mac developers aren’t getting the same value from the App Store as their counterparts on iOS. We all pay Apple 30% of our earnings to reach our customers; we should all get the same functionality for that fee.

Dupe this Radar if you agree.