Introducing Waterscope

It’s common wisdom that you should release a software product when it is minimally viable: get an early version out in the world as soon as it can perform a useful task for a customer.

When that product is for a developer who’s been coding since the dawn of time, the equation gets flipped on its head. Waterscope is a Maximally Viable Product™ and the customer is me.

The app got its start when Swift 1.0 was announced back in 2014: I wanted to build something with the new language. About that same time, I had also started learning about tides and how they are predicted. It’s a complex problem that has been vexing scientists since Newton first posed the three-body problem with the publication of the Principia in 1687. As with determining the time and place of a lunar eclipse, we rely on derived approximations.

For learning a new language, tide prediction provided a lot of interesting work: data collection, complex calculations, graphical presentation, and animatable data. It also let me know when it was a good time for a dog walk.

As an ocean swimmer, I also wanted my weather app to provide information about water conditions. It turns out the scientists at the National Oceanic and Atmospheric Administration (NOAA) have that all figured out. As do the meteorologists at the National Weather Service (NWS) with their API for weather observations and forecasts. There are even high-resolution images from environmental satellites launched by NASA. The United States government provides a treasure trove of data; the challenge with Waterscope was to organize and present it in a consistent manner.

Screenshots of Waterscope showing the home screen graphs, a map view with sea and land weather conditions, and a satellite view of California.

Which leads to a secondary goal for this app: to make it completely by myself. I work with some incredibly talented designers, but I wanted Waterscope to be uniquely my own. All the design, for better or worse, was created by my own hand. (The only exception is the use of SF Symbols when showing weather conditions.)

So not only was I learning Swift, I was also learning Sketch and, of course, how much time it takes to get something to feel right. Like coding, it’s not as easy as we sometimes make it look. Don’t take your designers for granted :-)

Along the way, there were some interesting hurdles. Some visual, some simple, and some complex. Many of the things I wanted to do required learning about astronomy and orbital mechanics. My sketches from Linea will give you an idea of the breadth of the challenges. (And since it’s the day after the solstice, that first sketch is particularly relevant.)

Sketches for a user interface design that shows seasons using the illumination of the Earth, the math for linear interpolation, and a spherical projection of a point on a satellite image.

Here I am, six years later. I’ve learned a lot, but as with my first app in Objective-C, the most important thing about this exercise was learning how not to use Swift. It will probably take me another 18 years to come to terms with this new language and feel like I’ve mastered it, but a journey can’t start without the first steps. Another insight is that a programming language is just a means to an end: the hard part is not the code, it’s understanding what needs to be done.

I’m releasing Waterscope today because there are certainly other folks who will benefit from my personal weather app. There may even be some educational value in seeing how I approached a data-rich user interface (hint: Edward Tufte’s books taught me). Information can be dynamic and beautiful.

At the same time, if you’re outside the U.S., it’s unlikely to be a satisfying experience: most of the data sources and their presentation are oriented towards North America. An example: in the southern hemisphere, the sun and moon appear to move counterclockwise across the sky as you look north, but Waterscope displays a clockwise orbit.

But the good news is that Waterscope, like the data it uses, is FREE to download and use. Enjoy!

Luna Display

What if I told you that you could add a Retina Display to your MacBook Pro for under $100? And what would you think if I showed you how it plugs into your computer?

The business card underneath this hardware gives you some hints, but where did this magical device come from and how does it work?

This story starts in the Iconfactory office in Minnesota where we do custom app development. It’s a co-working space that we share with the fine folks at Astro HQ, the makers of the popular Astropad app.

Troy Gaul showed me the video from the Kickstarter page, Matt Ronge asked me if I’d give it a try (“Hell yeah!”), and Savannah Reising sent me the prototype hardware you see above.

I’ve had two displays on my development machines since the 1990s. By now, it’s fully ingrained in my habits and makes for a very efficient workflow. So much so that it’s the capability I miss the most when I travel, especially when I have an iPad with a kick-ass display. It’s not hyperbole to say that this new product, called Luna Display, solves this problem completely.

It’s also got me thinking about how to reorganize my home office, because this is a great setup no matter where you are!

The prototype hardware arrived late last Saturday, so here is how my Sunday morning started…

Setup

I put the USB-C hardware you saw above into my wife’s MacBook Pro. My laptop is a bit older and uses Mini DisplayPort for attaching external displays (Astro HQ is working on this interface but the prototypes aren’t quite ready.) The device immediately started blinking. Who doesn’t love a blinking light on new hardware?

I then downloaded the Luna Display app for the Mac. It’s clearly labeled as a “Technology Preview” and I’m fully aware of what that means :-) For example, there are some menu options in this version, like “Reload Codec Config” and “Luna Device Reset”, that tell me engineers are still tweaking things. Like any Kickstarter project, anything I write about now may not be in the shipping product.

Even that blinking light, which stopped after I launched the app, might not be in the final product. But I hope it is, because it’s simple feedback that the Mac side isn’t running. And it blinks!

Earlier in the week, I got a TestFlight invite for the Luna Display app on iOS, so I got that all set up. After launching this app, you get a message saying “Look at your Mac Screen”. Man, I love it when developers think about how you’re going to use something!

On the Mac, there was a dialog asking if PRO BABY could connect via Wi-Fi. That’s my 9.7” iPad Pro, so of course I clicked “Allow”, but it’s good to know that they’re thinking about who gets to see what’s on my Mac.

At this point, Luna Display is all set up and I’ve got dual displays. This is awesome!

Tweaking Things

The Mac app presents two buttons: “Enable HiDPI” and “Display Arrangement”. Getting the Luna Display in the right place was the first thing I wanted to do, so it was very helpful to have that second button.

The first button presented a message to “Install the Luna System Extension”. This extension is needed to run the display at full Retina resolution, but years of happy marriage have taught me that you don’t install software that needs an admin password on your spouse’s laptop. So I declined, and I can’t wait to do it when my miniDP prototype arrives!

Since the Luna Display is running on an iPad, touches and Pencil input work as you’d expect. In fact, they work a little too well, because I caught myself moving my finger over to the MacBook sitting next to it. Maybe Microsoft is onto something here, after all.

I was also curious if I could use multiple iPads simultaneously. I’m an iOS developer, so conceivably I could have a MacBook with six displays of varying resolution and speediness. As we’ll see next, there’s a good reason you’re limited to only one iOS device at a time.

A Networked Display

The first thing on my mind, and probably yours, is how’s the display quality? And the answer to that question leads us to the most important part of this technology: the link between your Mac and iPad.

When we renovated our 1920s bungalow in Laguna Beach, Wi-Fi was just getting started and the speeds weren’t great. As a result, we have a lot of gigabit Ethernet and not a lot of motivation to upgrade a bunch of old AirPort Extremes. The network isn’t terribly fast, but most of the devices that can outrun it can be hardwired.

Except now I have a Mac that’s sending screenfuls of compressed pixels to my iPad’s Retina Display over an aging network. And guess what?

The display looks pretty damn good! If I look closely, there’s some blockiness as I move the windows around quickly, but things like the iTunes visualizer look much better than I expected.

I have some experience flinging pixels between a Mac and an iOS device so the effectiveness of Astro HQ’s LIQUID technology is not lost on me. It’s clearly something they’ve honed over the years with their Astropad product. It also explains that oddly named menu item we saw above: “Reload Codec Config”.

There’s also a “Vitals” window that you can open on the Mac to see how well this technology is working. I love to geek out on graphs, and the data shown above helped me understand what was going on under the hood: it’s all about the bandwidth. The more you can give Luna Display, the better it looks. You want those blue throughput lines to be as high as possible. The bump in the graphs is from when I was moving a window around furiously on the Luna Display — I saw similar results while running the iTunes visualizer.

To make my wireless network worse than it normally is, I started downloading Xcode on another Mac connected to the same access point. This chewed up about 50 Mbps, and the quality of the Luna Display decreased dramatically.

So yes, someone else in your office could ruin your productivity. Bummer.

A Wired Display

It was at this point I remembered that you could also use a USB connection between the Mac and iPad. Let’s give that a try!

As soon as the cable went in, the display quality was perfect. Wowza.

And those blue graph lines you saw above? They were literally off the chart with the USB connection.

(Something tells me that the price to upgrade my network will be way more than this Kickstarter is costing me.)

Practically speaking, I think this is a likely configuration if you’re using these devices for an entire workday. The USB cable lets you share the power load and recharge during extended use. It also completely avoids the problem of your coworkers watching 4K video on YouTube all day.

In the likely event that you’ll take your Luna Display with you when you travel, make sure to pack a cable, because we all know how great hotel Wi-Fi can be.

A Display With an OS

Everything else behaves just as you’d expect and acts like any other display you’d plug into your Mac. The only difference with Luna Display is that there’s an operating system controlling what gets sent to your second screen.

When you put the Luna Display app into the background with the iPad’s home button, the display disappears from the Mac after a few minutes. All iOS apps have a limited time in the background and Luna Display is no different. When this timeout occurs, you’ll see the Mac’s display reconfigure and windows will move to new spaces. As soon as the iOS app is brought back to the foreground, all your windows move back to their original locations.

Quirks

All new software has its quirks, but I encountered surprisingly few of them while testing Luna Display. This product is much more mature than I expected from a Technology Preview.

One thing I didn’t expect to see was “Luna Display” as the color profile for the display. Considering that I’m running on the first “Display P3” device that Apple shipped, that wider gamut should be utilized. Changing the profile didn’t appear to have any effect, either. I’m kind of a stickler for this kind of thing :-)

In talking with the Astro HQ developers, I learned the reason: this version uses an 8-bit color pipeline. They are working on 10-bit color and wider gamuts, since it’s a great feature for designers using Astropad.

There were a few times where things disconnected unexpectedly or got hung up. Nothing serious, and it was easy to recover from (it’s like accidentally unplugging a display: the Mac handles that gracefully.)

Even so, the developers are working to improve this situation, which is mainly caused by congested Wi-Fi. They still have some work to do on their UDP-based network protocol when other HTTP requests start hoarding bandwidth.

As I’ve only had the device for a few days and been testing on my wife’s computer, I didn’t do any battery life measurements. I did notice that plugging in the USB cable between the Mac and iPad caused the iOS device to charge, decreasing the battery life of the MacBook Pro.

In Conclusion

I know the folks behind this device. They’ve been shipping great products on macOS and iOS for many years, and I have great confidence that they will iron out any of the kinks in Luna Display.

I also know that I’ll never have a MacBook or an iPad without this tiny bit of magic. Please join me in backing the Kickstarter.

Accessibility Matters

At the Iconfactory, we look at accessibility as a way to give back to a community that’s given us so much. We were thrilled to be inducted into the AppleVis Hall of Fame.

Adding VoiceOver and other accessibility features to your own app is extra work. But as soon as you realize that you’re making someone else’s life better, it’s all worth it.

This is also a good time to remind you that we have an accessibility project of our own: xScope’s vision defect simulation is open source.

Pretending You’re Not Busy As Hell

You know those last few weeks of a project where it seems like every ball you own is up in the air? Your desktop looks like a bomb went off: stuff like “website comp (with hero)-20160414-final-2.1 copy.psd” and “DO NOT DELETE YET” scattered all over the place. You’re busy as hell.

And then you realize that you need to take product screenshots. Or do a screencast.

While doing screenshots for my upcoming book, I solved this problem by writing a simple shell script. It updates an undocumented Finder preference that controls whether the desktop is drawn. Without the desktop, all of your icons disappear (don’t worry, the files are still there!)

Simply typing finder_icons off in your Terminal lets you pretend that you’re working in complete zen and take the shots you need. Doing finder_icons on quickly brings you back to reality and lets you create an even bigger mess.
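If you’d like to roll your own, here’s the gist of it as a minimal sketch. It assumes the undocumented preference is the CreateDesktop key in the com.apple.finder domain (the commonly known key for this trick; the script I actually use may differ), and it quits the Finder politely with AppleScript rather than killall:

```shell
#!/bin/sh
# finder_icons — show or hide the icons on the macOS desktop.
# Assumption: the undocumented preference is com.apple.finder CreateDesktop
# (true = draw the desktop, false = leave it blank).

finder_icons() {
  case "$1" in
    on)  defaults write com.apple.finder CreateDesktop -bool true ;;
    off) defaults write com.apple.finder CreateDesktop -bool false ;;
    *)   echo "usage: finder_icons [on|off]" >&2; return 1 ;;
  esac

  # Ask the Finder to quit via AppleScript instead of killall, so it can
  # finish any in-progress file operations, then launch it again.
  osascript -e 'tell application "Finder" to quit'
  open -a Finder
}

# When run as a script, pass the arguments through to the function.
if [ $# -gt 0 ]; then
  finder_icons "$@"
fi
```

Save it somewhere on your PATH (or source the function from your shell profile) and you’re ready to tidy up.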

Enjoy!

Updated April 29th, 2016: Dr. Drang points out that this technique also works well for screen sharing. I try to avoid the use of killall when dealing with the Finder because you never know when it’s in the middle of a file operation (such as copying a file or deleting a folder.) Using the AppleScript quit command lets the Finder determine when it’s safe to shut down.

Updated October 16th, 2018: For those of you who prefer not to use a script, download the FreeMyDesktop app. This simple app modifies the preferences and kills the Finder from a button in the menubar. Handy!

The Forensic Shit Show

It turns out someone at the FBI advised another law enforcement officer in San Bernardino to reset the iPhone that the government wants Apple to unlock.

This is just another episode in a complete forensic shit show.

Remember, this is the same case where the media was allowed to roam freely through a crime scene. One of the photos in that gallery shows a computer without an Ethernet connection on the wall (the age of the apartment also suggests that there would be no wired Internet.)

What are the chances that there was a wireless network in that apartment? What are the chances that there are IP logs on that router? Or maybe some kind of data backed up to a disk on the router? Here’s another wild guess: maybe that router was used to connect to an online backup service.

Yep, someone did the equivalent of a “restore factory defaults” on a device under active investigation.

What we’re seeing here is law enforcement’s complete lack of understanding of how digital devices store and transmit data. This new evidence is much more intricate than smoking guns or blood spatter. The important stuff is what you don’t see: it’s a hard problem, and the people dealing with it are untrained. Shit, I work in this business and trying to decipher what’s going on makes my head spin.

Yet law enforcement is asking Apple to not only provide data, but also to create a forensic instrument that allows them to extract information from any device. And by its very nature, this tool would be made widely available throughout the forensic and law enforcement community.

Basically, the government is asking Apple to hand over a golden key that can defeat the security of any device to folks who can’t even secure a wireless network. Worse, this whole process is being overseen by politicians who think the problem is predators getting access to their grandkids’ PlayStation.

This is why the entire tech community is saying “No fucking way.”

Updated February 21st, 2016: Several people have commented about my use of “restore factory defaults” in the post above. My intention was figurative, not literal.

The folks involved with the investigation were pressing buttons without understanding the consequences of their actions. To me, it feels like a “reboot to fix” approach. The password reset did not damage any data, it just made automatic backups stop working because iCloud information on the device needed to be updated, and that can’t be done without a passcode.

Others have reminded me that the FBI had cleared the crime scene. That’s true, but since the Wi-Fi equipment was not collected as evidence, it still shows that the investigators were out of their league. In an electronic investigation, a router is a key piece of the puzzle.

Both of these things are details in a bigger picture: the FBI wants to hold the private keys to a public key encryption system that affects the privacy of hundreds of millions of people. If they can’t get the details of an online backup service right, how the hell do we expect them to guard a back door?

There’s also a possibility that the iCloud password reset was intentional. If this is the case, we have a government that is extorting Apple by essentially planting evidence. Imagine what they could do with a private key.