Behind The App: Twitterrific 5

I’ve always loved shows that take you behind the scenes of creative efforts: Project Runway and Classic Albums being two of my favorites. Here’s one about the development of Twitterrific 5.

Since a lot of the people reading this aren’t developers, I’m going to keep the jargon to a minimum. You’ll be able to enjoy this piece even if you don’t know what an Xcode is.

Statistics

To get an idea of the scope of the project, and see who worked on it, let’s look at a few numbers.

The first source code file was created on February 15th, 2012. Since that first check-in, there have been 2032 changes resulting in 824 files. Of those files, 73% are source code, 25% are images, and 2% are other assets such as fonts and sounds.

Sean Heber was responsible for 976 changes, while I came in second with a bit more than half as many (557). Tyler Anderson made 397 commits and David Lanham followed that with 95. Anthony Piraino and Gedeon Maheux picked up the rest.

We did weekly beta releases between October 26th and November 19th. The following day, we submitted the build to Apple for approval, which happened one week later. The app was released today, which is actually yesterday in Japan. :-)

Twitter Changes

If you’re at all interested in Twitter’s development ecosystem, you know that approximately six months into our product development there were some major changes announced.

In August, a posting on Twitter’s developer blog announced that changes were coming. In spite of the new requirements and limitations, we were happy to know that our API access wasn’t going to be revoked. Since the early days of the project, we had been worried that third-party development might be eliminated completely.

As more details about API 1.1 became available in September, we became even more upbeat about the changes. There were a lot of improvements in the API and the new infrastructure was extremely robust. Adapting our existing code to the new API took very little time and allowed us to simplify a lot of our internal code. It’s good to know that management’s decision to control the core Twitter experience hasn’t affected their great API engineering.

At this point in time, we also learned that we had plenty of user tokens for the upcoming version. We agreed with our friends at Tapbots: there was no reason to panic.

One downside to the limit of user tokens is that we can’t afford the “cost” of a free app. If someone downloads a free version of Twitterrific and then uses it once, we can never reclaim that token. Tokens are a scarce resource that developers have to manage carefully in order to recoup their development costs. If you’re holding out for a price drop with our app, you’ll be disappointed. The price of Twitter apps is going to rise from now on.

Development Approach

So when you have five people working on a piece of software, how do you go about it?

This time around, we took a much more design-centric approach. When you have a large project, be it a movie or a piece of software, you have to have someone calling the creative shots. For Twitterrific 5, that person was David Lanham.

We’d done Twitter clients before and honestly weren’t that enthused about doing another one. That was until we saw a complete and clickable prototype that David had created. That “wow!” moment you have when you launch and start to use Twitterrific 5 is something we experienced last February. We all knew we had to make this real, even if it was a bit crazy. Our foreman, Gedeon Maheux, made the call and the project began.

Some natural pairings occurred between the three developers and single designer.

As senior developers, Sean Heber and I decided to split the workload between the front-end and the back-end. Sean had a part in everything you see on screen. I had a part in everything you see going over port 443 (for you non-developers who won’t get that joke, that’s the secure network connection that the Twitter API uses.)

This turned out to be a great decision. The user interface doesn’t depend on knowledge of how data is transported over the network, processed and then cached in a database. In fact, in the early stages of the project, much of our back-end existed as a separate project with no user interface. It used automated tests that exercised multi-threaded access to the Twitter API. This independence between the UI and the network services allows new APIs to be added quickly and easily.

(AND IT MAKES ME KING OF THE BACK END IF YOU KNOW WHAT I MEAN)
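In practice, the boundary between those two worlds can be as thin as a protocol. Here’s a purely illustrative sketch, not our actual code:

#import <Foundation/Foundation.h>

// The UI talks to something like this, and never needs to know how
// tweets travel over port 443 or get cached in a database.
@protocol TimelineSource <NSObject>
- (void)refreshTimelineWithCompletion:(void (^)(NSArray *tweets, NSError *error))completion;
@end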

One of the benefits of writing your own version of UIKit is that you have a pretty complete understanding of how to use it effectively. All the sexy animations and views you see in the app are Sean’s doing. As he continued to amaze us all, we came up with a word for what he does: juice. For example, just the other day, he closed an issue titled “Add JUICE™ to Photo view”. (You’ll see that change in the 5.0.1 release that’s in review now, and yeah, it’s great.)

Splitting the front and back-end work between Sean and myself had another benefit: we were constantly optimizing our code so that it would make the other guy’s code work better. A lot of people have commented on how fast Twitterrific feels: hundreds of iterations made that happen.

Another pairing was between David Lanham and Tyler Anderson. From the start, we knew that we wanted customers to have control over the appearance of their timelines. Since Sean had done a version of the UIAppearance proxy in Chameleon, that was a natural choice to accomplish our goals.

What happened next, though, surprised us in a very good way. David started using Xcode.

Since Tyler was writing code that allowed all the appearance modifications to be collected in a single file, it was easy for David to go into that file and tweak the settings to get things exactly as he wanted. Any coder who’s worked closely with a designer has heard these phrases: “Can you replace this image?” “Tweak this color please!” “Nudge that button to the right a few pixels.”

Instead, David was adding files to Xcode and modifying UIColor and UIEdgeInset definitions by himself. Letting the designer take control of his designs, do test builds, and refine the interface produced some amazing results. And it let our developers focus on what they do best: typing code.
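To give you a rough idea of what that looks like, here’s a tiny, purely illustrative sketch of the kind of thing collected in such a file (the names and values are invented, not our actual code):

#import <UIKit/UIKit.h>

// Hypothetical appearance settings, gathered in one place so the
// designer can tweak colors and metrics and rebuild on his own.
static const UIEdgeInsets kTimelineTextInsets = { 6.0, 10.0, 6.0, 10.0 };

static void ApplyTimelineAppearance(void)
{
    // One line to change the navigation tint across the whole app.
    [[UINavigationBar appearance] setTintColor:
        [UIColor colorWithRed:0.18 green:0.16 blue:0.14 alpha:1.0]];
}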

As work progressed on the product, we realized we were making something special. When that happens, all sorts of crazy and wonderful things start to form without you really knowing how or why. The best example is when Sean and David added gestures. Twitterrific was already in beta at that point and we were completely happy with the action menus, but the two of them went quiet for a couple of days saying only they were working on “something cool”. And they were right: we can’t imagine using the app without gestures now.

A Clean Slate

So now that you’ve heard about what led up to today’s release, let’s finish with some thoughts about future directions.

As soon as you launch Twitterrific 5, you’ll realize that it’s a clean slate. The visual design is obviously a fresh start, but everything under the covers is as well. When presented with a clean slate, you’re very careful what goes on it. Its best feature is simplicity. I wrote about Gruber’s First Law of iPhone Development four years ago:

Figure out the absolute least you need to do to implement the idea, do just that, and then polish the hell out of the experience.

That still holds true, and we feel like we’ve done the least and polished the most.

We are well aware that people are going to complain about missing features: push notifications and streaming are obvious examples. But so are trends, and video support, and in-line photos, and… well, none of that matters. We believe in building opinionated software.

The product you have in your hands is what we wanted and needed it to be. As the tagline on the website says, “A simply beautiful way to tweet.” We achieved that goal.

Personally, I find myself actively disabling notifications in most of the apps I install these days. Notifications are great when used in moderation, but it’s very easy to use them to the point of distraction. Since I read Twitter as free time permits, I don’t need a reminder. Similarly, a constant flow of streaming tweets interrupting my day sounds more like a bug than a feature.

But guess what? That doesn’t matter either. I’m just one of the opinions that made this software.

And now we’re going to start listening to our customers’ opinions. We’re all excited to get feedback from a larger audience and see how that fits in with our new minimalist vision. If you think Twitterrific is good now, just wait until you see what happens with your collaboration.

iPhone 5 Scuff Remover

Removes unsightly scratches, dings and scuffs IN MERE SECONDS! *

Get yours NOW!

And we think you’re going to love it.

* black iPhones only

Retina for Masochists

Today we released an update for xScope that supports the Retina display. As I alluded to on Episode 14 of The Talk Show, this update was harder than most. The 68k to PowerPC, Carbon to Cocoa, and PowerPC to Intel transitions were no walk in the park, but this update really kicked my butt. Here’s hoping that sharing some of the things I learned along the way will help you with your own Retina work.

For most developers who are working strictly in window points, an update for the Retina display is a fairly straightforward process. xScope, however, does a lot of work with both pixels and points. And that’s where the fun begins…

Mouse Input

The first gotcha I encountered while doing the Retina update was with mouse input using NSEvent’s +mouseLocation. The team at Apple has done some amazing work making sure output looks stunning on the Retina display, but high-resolution input is definitely lacking.

There are two problems at play here. The first is that mouse locations can be reported at coordinates that don’t exist on any attached screen. The second is that an NSPoint doesn’t contain enough resolution to address every pixel on screen.

To deal with the first problem, I used an NSEvent category that clamps +mouseLocation results to valid coordinates.
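Here’s a minimal sketch of such a category (the method name is hypothetical, and the clamping is simplified):

#import <Cocoa/Cocoa.h>

@interface NSEvent (ClampedLocation)
+ (NSPoint)clampedMouseLocation; // hypothetical name
@end

@implementation NSEvent (ClampedLocation)

+ (NSPoint)clampedMouseLocation
{
    NSPoint location = [NSEvent mouseLocation];

    // If the reported location is on an attached screen, use it as-is.
    for (NSScreen *screen in [NSScreen screens]) {
        if (NSPointInRect(location, [screen frame])) {
            return location;
        }
    }

    // Otherwise, clamp it to the main screen's frame.
    NSRect frame = [[NSScreen mainScreen] frame];
    location.x = MIN(MAX(location.x, NSMinX(frame)), NSMaxX(frame));
    location.y = MIN(MAX(location.y, NSMinY(frame)), NSMaxY(frame));
    return location;
}

@end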

For the second problem, the only workable solution was to capture the +mouseLocation and then track -keyDown: events so the arrow keys can home in on the destination pixel. Yes, kids, that’s what we call a painful fricken’ hack.
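In a custom view, that tracking might look something like this (a sketch; _trackedPoint is assumed to be an NSPoint instance variable):

// In a custom NSView subclass: each arrow key nudges the tracked
// point by one device pixel.
- (void)keyDown:(NSEvent *)event
{
    CGFloat pixelWidth = 1.0 / [self.window backingScaleFactor];
    NSString *characters = [event charactersIgnoringModifiers];
    if ([characters length] == 0) {
        return;
    }
    switch ([characters characterAtIndex:0]) {
        case NSUpArrowFunctionKey:    _trackedPoint.y += pixelWidth; break;
        case NSDownArrowFunctionKey:  _trackedPoint.y -= pixelWidth; break;
        case NSLeftArrowFunctionKey:  _trackedPoint.x -= pixelWidth; break;
        case NSRightArrowFunctionKey: _trackedPoint.x += pixelWidth; break;
        default: [super keyDown:event]; return;
    }
    [self setNeedsDisplay:YES];
}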

Window Positioning

The next big headache was caused by the fact that you can’t set a window’s frame using non-integral points.

xScope does this a lot. The best example is with the Ruler tool: the origin of the ruler can be positioned to both even and odd pixels on screen. The red position indicators are also small windows that point to individual pixels in the ruler window.

The workaround is to make an NSWindow that’s larger than you need and then adjust the bounds origin of the NSViews it contains. The pain here is that it immediately introduces a dependency between windows and views. For example, the position indicator windows need to know if the view they’re hovering over has had its bounds origin shifted.
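In code, the idea looks something like this (a sketch; the function and variable names are mine):

#import <Cocoa/Cocoa.h>

// Position a window so its content appears at a fractional screen
// coordinate, even though the frame origin itself must be integral.
static void PositionWindowAtFractionalOrigin(NSWindow *window,
    NSPoint desiredOrigin, NSSize contentSize)
{
    // Integral frame, one point larger than the content needs.
    NSRect windowFrame = NSMakeRect(floor(desiredOrigin.x), floor(desiredOrigin.y),
        contentSize.width + 1.0, contentSize.height + 1.0);
    [window setFrame:windowFrame display:YES];

    // Shift the content view's bounds so drawing at (0, 0) lands on
    // the fractional position the window can't represent.
    CGFloat fractionalX = desiredOrigin.x - floor(desiredOrigin.x);
    CGFloat fractionalY = desiredOrigin.y - floor(desiredOrigin.y);
    [[window contentView] setBoundsOrigin:NSMakePoint(-fractionalX, -fractionalY)];
}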

Pixel Alignment

There are many cases where xScope has to align to a pixel boundary. I found myself using this pattern many times throughout NSView -drawRect: code:

CGFloat backingScaleFactor = [self.window backingScaleFactor];
CGFloat pixelWidth = 1.0 / backingScaleFactor;
CGFloat pointOffset = pixelWidth / 2.0;

The pixelWidth tells you how wide a single pixel is in points, while the pointOffset can be used to align a coordinate so that it straddles the Quartz drawing point:

NSBezierPath *line = [NSBezierPath bezierPath];
[line setLineWidth:pixelWidth];
[line moveToPoint:NSMakePoint(point.x + pointOffset, 0.0)];
[line lineToPoint:NSMakePoint(point.x + pointOffset, 100.0)];
[line stroke];

Another common pattern was to use the backingScaleFactor to align a coordinate to the nearest pixel in the view:

NSPoint point;
point.x = floor(x * backingScaleFactor) / backingScaleFactor;
point.y = floor(y * backingScaleFactor) / backingScaleFactor;

Of course you can do much the same thing with NSView’s -centerScanRect:, but in my experience it’s much more common to need aligned NSPoint values when you’re doing custom drawing. Creating an NSRect just to align the origin is a pain.
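If you need this alignment in more than a couple of places, a tiny helper keeps the call sites readable (the function name is mine, not AppKit’s):

#import <Foundation/Foundation.h>

// Round an NSPoint down to the nearest backing pixel.
static NSPoint AlignedPoint(NSPoint point, CGFloat backingScaleFactor)
{
    return NSMakePoint(floor(point.x * backingScaleFactor) / backingScaleFactor,
                       floor(point.y * backingScaleFactor) / backingScaleFactor);
}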

Flipped Coordinates

As Cocoa developers, we’re used to the pain and suffering caused by flipped view coordinates. Retina support can add a new dimension to this headache when you’re dealing with pixel coordinates.

Say you have a flipped NSView with an origin of 0,0 and dimensions of 1440 x 900 (e.g. it covers the entire Retina screen in points.) Y coordinates on the screen can range in value from 0.0 to 899.5. When those coordinates are flipped, the range of values becomes 0.5 to 900.0, which is off by a pixel (a half point) if you’re trying to address the full range of view coordinates. The solution is to adjust the flipped coordinates like this:

NSRect windowRect = [self.window convertRectFromScreen:NSMakeRect(screenPoint.x, screenPoint.y, 0.0, 0.0)];
NSPoint viewPoint = [self convertPoint:windowRect.origin fromView:nil];
	
CGFloat backingScaleFactor = [self.window backingScaleFactor];
CGFloat pixelWidth = 1.0 / backingScaleFactor;
viewPoint.y -= pixelWidth;

I’ve always wondered why there’s this odd note in the +mouseLocation documentation:

Note: The y coordinate of the returned point will never be less than 1.

Even though it’s a lie, the extra pixel does make the coordinate flip work correctly—and now you have to take care because that pixel can be a fractional point.

Summary

Hopefully my trials and tribulations will help you in your own development. The good news is that the beautiful results on the Retina display make all this hard work worthwhile.

The First Apple Channel

Dear Tech Media,

While you’re looking for meaning in the shadows of an Apple press invite, you’re missing something important: Apple is producing content for its own distribution channel.

For the month of September, Apple is letting customers view live shows through a combination of apps, the web, and Apple TV. It’s the fourth year of the iTunes Festival in London, but this is the first year that it’s been broadcast via iTunes.

Why is this important? Let’s look at what this means for the various players involved:

Artists

As an app developer, I know what it’s like to be featured by Apple in one of its promotions. It sells a lot of product. And that, in turn, funds our creative efforts.

I’m sure the featured artists will gain fans as a result of their performances. I’ve watched a few shows and have already seen some bands that I’ll be keeping my eye on.

A lot of these artists are also probably working with Apple for the first time, getting a feel for a more direct relationship with a distributor.

Customers

As a customer, I’m all too familiar with the hassles and restrictions on digital content. It’s an eye opener to be able to play this content wherever and however I want. No crap, just good shows.

Tickets for the events are also free: seeing your favorite band in a small venue where all you have to buy are the drinks? Sign me up!

Apple has chosen the artists wisely. I couldn’t care less about some of the bands, but you should have seen my niece’s eyes light up when I told her that she could watch a free One Direction show on September 20th. Talk about keeping your customers happy!

Media Industry

The iTunes Festival shows everyone above what a world without a middle man would be like. We’re loving it: they’re fearing it.

Apple

You need an iTunes account to view these shows. If you don’t have one already, you’ll certainly get one to see your favorite band.

The best viewing experience for these shows is on a $99 Apple TV. That’s less than the cost of a couple of tickets to see the big name acts. The drinks aren’t watered down, either.

It also sets a precedent for the future. Could this be akin to HBO creating premium content for its subscribers? Or Netflix producing its own shows to make its streaming service more desirable?

Apple first got its feet wet in the content business with music in iTunes. What we’re seeing here may be the company’s first effort in the video business.

Updated September 6th, 2012: I’ve heard from several sources that last year’s iTunes Festival was an iPad-only app (with AirPlay capabilities.) Apple has taken small, calculated steps with the Apple TV platform and this is another example of that approach.

Responding to App Store Reviews

When developers talk about wanting to respond to reviews, many of them haven’t thought through the social implications of what that means. Matt Gemmell has. As Marco Arment points out, replying publicly also leaves iTunes (more) open for abuse by unscrupulous or uninformed developers.

One idea I’ve had is giving developers the ability to add a support link to a review. This helps both the developer and customer in several ways:

  • The customer who reported the problem could be notified that a support link was added to their review and would be directed to a site which is designed to help them out. This could also lead to direct contact if there are other issues to be resolved.
  • Potential customers that are reading reviews can see how a developer responds to problems. If you come across a product with lots of support links, you know that’s a developer who cares about his customers.
  • Putting customer service front and center in iTunes makes it desirable for developers to create and maintain sites that provide helpful information. There are far too many products where the customer support link just goes to a product page that’s unhelpful.

Of course, restrictions would be needed to prevent abuse of these external links. For example, Apple could decide to only allow links to a developer’s support domain. There could also be limits on the number of support links a developer has at their disposal (like promotion codes, we would then use them judiciously.)

Finally, these thoughts only cover the information we exchange with the customers publicly. I still think there are cases where private contact via email is vital.