Notes on Migrating from Aperture to Photos for OS X

I have enjoyed photography for many years, particularly since the transition to digital. I've been shooting exclusively digitally since I got my first digital camera in 2002, and the result is a rather large body of digital pictures.

This is the story of migrating from a system that involved Aperture and a bunch of jury-rigged hacks to Apple's new Photos for OS X.

Background

From the time Aperture first shipped, I used it to manage all my digital photographs - until the iPhone came along and wrecked my workflow. I can't properly explain what went wrong, but I think we all now recognise that there is a sense in which photos "live" on an iPhone in a way that they don't on a digital camera. At least they didn't for us "serious" photographers; if you recall the days before smartphones, though, many regular users did store photos on their digital camera as a way to carry them around and show friends.

Since the iPhone, I got by with occasionally dumping photos out into Aperture and then, later, relying on the never-very-good iCloud Photo Stream to get photos into Aperture. The implementation of Photo Stream in both Aperture and iPhoto was a mess, as evidenced by the dozens of projects I have named "Photo Stream (month year)", each roughly correlating to the dates I opened Aperture.

So, at the end of this era, I had:

  • About 31,000 photographs in Aperture, totalling over 300GB.
  • The Aperture library itself residing on my Mac's internal SSD, taking up 38GB.
  • All master images referenced on a 3TB external hard drive.
  • About 6,000 photographs in iCloud Photo Library, totalling around 6-7GB of iCloud storage space.
  • A folder containing an unknown number of other photographs, around 40GB in size, from the time between the early iOS 7 betas and iOS 8 shipping.

So, how to get all of this into iCloud Photo Library and back down to my iPhone and iPad?

Migrating the Library

The first step after installing the 10.10.3 beta was to migrate the Aperture library. Photos did this more or less automatically and quite well. The software is obviously still in flux, so the exact details of the UI are not worth discussing right now; I want to focus on the data migration.

Photos correctly maintained the connection of photos in the library to their referenced masters on the external drive. Everything worked well as long as the drive was connected. At this point I discovered that images with referenced masters cannot be uploaded to iCloud Photo Library. Only images with managed masters can.

The result at this point was that I could browse around 36,000 photos on my Mac but my iOS devices only showed about 6,000 photos. These were the 'native' iOS photos taken on my various devices in the iCloud Photo Library era, but none of my 'legacy' DSLR photos were coming across.

It is also possible to consolidate the masters into the Photos library (using File > Consolidate). Now, recall that my Aperture library was over 300GB in size. My MacBook Air only has 512GB of internal storage and a fair amount of that is being used for other things. This seemed like a bit of a stalemate since I couldn't consolidate the library for lack of disk space.

At this point I left things for a while as other parts of life interfered... and a few new 10.10.3 betas arrived.

Migrating the Data to the Cloud

The story resumes this week as I sat down to try to solve the problem once and for all. The impetus was only somewhat increased by an unexpected burst of techno-lust for the new MacBook Apple announced this week.

My first try was to consolidate the entire library. This quickly failed due to lack of disk space, as expected: Photos indicated that around 200GB was required over and above what was available.

The next step was to consolidate some images. I started off consolidating a year at a time, which allowed me to make some progress. The default setup for Photos is to keep the full-resolution masters on the Mac's storage, but there is an option to optimise local storage instead. I turned this on.

At this point, it's worth noting that I'm still struggling a bit with a mental model of where all this data is going and what's happening to it. As a precaution, I have made backups of my entire Aperture library and masters on the external hard drive.

The result of this was that I was able to, in batches, consolidate around 25,000 of the legacy images into the Photos library. The library on disk grew to around 214GB. At the same time, I was monitoring outbound network traffic from my Mac in Activity Monitor. This indicated that uploads were ongoing, which I assumed to be Photos sending these images to the cloud now that they were no longer referenced.

I opened Photos on my iPhone and iPad and could see photos streaming down to these devices. It was helpful to turn off Summarize Photos (Settings > Photos & Camera > Summarize Photos) on iOS to see the full extent of progress when viewing the Years view on the devices.

The Impact of 30,000 Photos

As the number of photos in my iOS device libraries grew, I started to notice some impacts on performance and correctness in some apps on my devices. Twitter correspondents were telling me that they had seen poor performance on iPhone 5s devices.

Once I crossed 20,000 images, I started to notice the following on my iPhone 6 Plus running iOS 8.2:

  • The first effect was occasional instability in the Photos app on iOS. It was by no means unusable but the crash rate went from zero to not-zero (note: this is much better - but still not perfect - on iOS 8.3).
  • The next effect was that iOS apps that implement their own photo-picking UI started to struggle either to remain responsive or even to show a correct view of the photos on the device. Particular offenders include Instagram and Explain Everything (the sketch after this list shows the kind of lazy loading a custom picker needs at this scale).
  • Apps that use the system photo picker UI continued to work well.
  • The Camera app started to be slightly slower to launch.
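
For context on why custom pickers fall over at this scale, here is a minimal sketch, in Swift against the PhotoKit framework that arrived alongside iOS 8, of the lazy-loading approach a custom photo grid needs once a library reaches tens of thousands of assets. The class name, thumbnail size and sort order are illustrative assumptions, and photo-library authorisation is omitted for brevity.

    import Photos
    import UIKit

    // Sketch of a data source that fetches asset metadata once and loads
    // thumbnails only as they are needed, rather than eagerly loading the
    // whole library.
    final class LazyPhotoGridDataSource {
        private let allPhotos: PHFetchResult<PHAsset>
        private let imageManager = PHCachingImageManager()
        private let thumbnailSize = CGSize(width: 150, height: 150)

        init() {
            // Fetching assets returns lightweight metadata only; no pixel data
            // is loaded at this point, even for a 30,000-photo library.
            let options = PHFetchOptions()
            options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
            allPhotos = PHAsset.fetchAssets(with: .image, options: options)
        }

        var count: Int { allPhotos.count }

        // Request a thumbnail only when a cell actually scrolls on screen.
        func thumbnail(at index: Int, completion: @escaping (UIImage?) -> Void) {
            let asset = allPhotos.object(at: index)
            let options = PHImageRequestOptions()
            options.isNetworkAccessAllowed = true   // allow iCloud to supply the image if needed
            imageManager.requestImage(for: asset,
                                      targetSize: thumbnailSize,
                                      contentMode: .aspectFill,
                                      options: options) { image, _ in
                completion(image)
            }
        }
    }

My guess is that the apps that struggled here were doing something more eager than this, but I have no visibility into their code.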

Time Taken

Having started the migration on Friday evening, by the time I woke up on Tuesday morning my devices were showing a more-or-less consistent view of my photo library. My iPhone had downloaded tiny thumbnails for all images although my iPad was still catching up.

The image counts were a little inconsistent at this point:

  • Mac: 30,931 Photos, 211 Videos
  • iPhone and iPad: 30,190 Photos, 211 Videos
  • iCloud Settings Panel: 31,030 Photos, 213 Videos (Settings > iCloud > Storage > Manage Storage > iCloud Photo Library)

The only reliable way I found to determine whether my Mac had completely finished migrating all the data to the cloud was to observe the Network tab in Activity Monitor. While Photos was migrating, there was a very obvious pattern to the upstream bandwidth usage.

During this time, the system did not instantly sync anything. If I deleted an image on one device, it persisted for days on the others. This is hardly surprising given the background workload that was going on and I wasn’t particularly concerned about it.

Mike Bradshaw let me know via Twitter that he had observed image counts being incorrect for around 48 hours after migration as the various devices reached a quiescent state between them.

Data Usage and Optimisations

As I said earlier, I started with over 300GB of Aperture master images. In the final tally, here is how much data the system used after all optimisations:

  • Mac Library: 95GB (as reported by du(1))
  • iCloud Storage: 269.3GB (as reported by Settings > iCloud > Storage)
  • iPhone on-board storage: 10GB (as reported by Settings > General > Usage > Manage Storage)
  • iPad on-board storage: 8.6GB

I have turned on “optimise storage” on every device, including the Mac.

I still don’t have a perfect mental model of the data placement in Photos but my current understanding of what has happened is this:

  • I consolidated (copied) all my master files from my external drive to my Mac's internal storage.
  • Photos proceeded to upload that entire collection to iCloud.
  • Once the photos were in iCloud, the consolidated masters were replaced by Mac-optimised versions.
  • On the iPhone and iPad, Photos was aware of the existence of these images, but no data was downloaded to any device until required.
  • When I accessed the Years view and tiny thumbnails were required, these were fetched from iCloud.
  • When I accessed any individual image, a high-resolution version was downloaded from iCloud; a circular pie-chart progress meter in the corner indicates this (the sketch after this list shows the PhotoKit mechanism behind it).
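
This on-demand download is also visible to third-party apps through PhotoKit. As a rough illustration only (the function name is mine and error handling is omitted), requesting a full-quality image with network access allowed looks something like this; the progress handler is the kind of hook that drives that pie-chart style indicator.

    import Photos
    import UIKit

    // Sketch: ask PhotoKit for the full-quality version of an asset whose
    // original may only exist in iCloud, reporting download progress as it goes.
    func requestFullImage(for asset: PHAsset, completion: @escaping (UIImage?) -> Void) {
        let options = PHImageRequestOptions()
        options.deliveryMode = .highQualityFormat
        options.isNetworkAccessAllowed = true          // permit a fetch from iCloud
        options.progressHandler = { progress, _, _, _ in
            // Called repeatedly while the original is pulled down from iCloud.
            print("iCloud download progress: \(Int(progress * 100))%")
        }
        PHImageManager.default().requestImage(for: asset,
                                              targetSize: PHImageManagerMaximumSize,
                                              contentMode: .aspectFit,
                                              options: options) { image, _ in
            completion(image)
        }
    }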

Steady State Operation

Seven days into the migration, I woke up to find my iPad and iPhone finally in total agreement about the number of photos I had:

  • Mac Library: 109GB (as reported by du(1))
  • iCloud Storage: 269.3GB (as reported by Settings > iCloud > Storage)
  • iPhone on-board storage: 10.3GB (as reported by Settings > General > Usage > Manage Storage)
  • iPad on-board storage: 8.6GB

At this point, my Mac’s photo count was still ahead by 21 photos. I’m now assuming that there are 21 photos that are either corrupt or missing their master files in some way somewhere in my photo library. I doubt I’ll ever find them.

Sync performance is good at this scale:

  • Photos deleted from one device disappear from the other in under 5 seconds.
  • Photo edits on one device appear on the other in about 15 seconds.
  • New photos from the phone appear on the Mac in under a minute (I have image optimisation turned on at both ends so this is likely to take some extra time).

It's rare that I'll want or need faster syncing than this. What most people really want, I'd guess, is "my photos are on the other device when I get back to it", and Photos certainly seems to offer that. There is one caveat, however: Photos on iOS will not upload newly-taken photos unless the device is on WiFi, and there seems to be no way around this. I have Settings > Cellular > Use Cellular Data For: turned ON for Photos, but it won't upload unless the phone is on WiFi. I understand why that is the way it is, but I have an eat-all-you-can data plan for my phone and it'd be nice to have the option.

Overall, I can say that I'm really very pleased with Photos for OS X and with iCloud Photo Library.

Conference Room Display on Apple TV

Apple TV comes out of the box with all of Apple's movie and TV services enabled, as well as all the additional channels that keep appearing on the Apple TV home screen.

In times past, the typical practice was to laboriously go through and hide all the channels. This becomes a game of whack-a-mole as channels come and go.

There is a way, however, to have the Apple TV boot into Conference Room mode by default. Conference Room mode is a view that hides all of the extraneous features and just provides information about how to connect to the Apple TV.

Here are the steps:

  • Settings > AirPlay > AirPlay > On
  • Set up the requirement for on-screen codes if desired
  • Settings > AirPlay > Conference Room Display > On
  • Set a background picture if desired
  • Settings > General > Restrictions > Turn On Restrictions
  • Configure the restrictions PIN here
  • Settings > General > Restrictions > AirPlay Settings > Show
  • Settings > General > Restrictions > Conference Room Display > Ask
  • Settings > General > Restart

The restart step is really important. This won't work unless you restart.

The result of these instructions should be:

  • Apple TV boots directly into Conference Room Mode, hiding all the channels and rental services.
  • A verification code is requested when starting AirPlay from any device (this is optional).
  • The restrictions PIN is required to exit Conference Room Mode.

You can also load a custom background from iCloud if you're logged into a specific iCloud account on the Apple TV.

The result: an Apple TV that boots into Conference Room mode.

The Post-Mobile Era

Twitter followers will know that I've been interested in Chrome OS for a while. Podcast listeners will know that I've been crazily frustrated with Apple's technology since iOS 7 shipped, particularly from a quality standpoint.

Put these two things together and it's time to experiment further with Chromebooks.

When you work in educational technology, you have to be a little like the Roman god Janus and look both forward and backward. You look backward because everyone else is behind you: pupils, parents, colleagues, administrators, regulators, government. These are the people you have to take with you into the new.

At the same time, we have to periodically make very clear judgment calls about what is happening right now - without reference to the past or the future. This is what happens in your summer refresh: it doesn't matter what's coming out in October or at CES and it doesn't much matter what you've deployed in the past - you have to sign your PO in June and the trucks roll up in August with whatever is the best possible decision at the time. Such are the hard scheduling realities of school life.

Like Janus, it's also essential to keep one eye on the future. Trends change, the conversation moves on and, if you want to serve your school community correctly and well, you have to not just be abreast of them but be leading and living those changes well before you expect others to.

This is what keeps me up at night.

When we started with iPad in 2010, the argument was around the appropriateness of "mobile devices" in the classroom. Could we manage without the standard computer tropes that adults of the time had been brought up with?

Douglas Adams:

I've come up with a set of rules that describe our reactions to technologies:

1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.

2. Anything that's invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.

3. Anything invented after you're thirty-five is against the natural order of things.

I was 32 when we started with the iPad and, you know what? Douglas Adams was right.

It's odd to think back to those days and remember the heat we took for doing what is now an accepted (if not yet widely-implemented) part of the educational technology stack.

They said children couldn't multi-task on iPads. Wrong. They said children couldn't type on iPads. Wrong. They said children would break their iPads. Wrong. They said children would lose their iPads. Wrong. They said Android tablets would be better and cheaper within a year. Wrong.

What the over-35s meant was that they couldn't multi-task on, type on or handle an iPad without breaking it.

Having said that, the move to mobile devices wasn't as much of a paradigm shift as some people thought. An iPad or an iPhone is, after all, a fairly doctrinaire computer of the type that Apple has made since 1984. Many people couldn't see past the introduction of the touchscreen and thought it heralded a completely new era of computing. Many thought that the tablet had to be understood from first principles rather than looking at it as an evolution of the laptop computer. While the touchscreen and the tablet form factor have enabled a number of new and important use cases and contexts, I'm not so sure it represents a completely new era. The smartphone, because of scale, reach and carrier subsidies, is genuinely fundamentally a different proposition but that's another post.

Like every Macintosh before it, an iOS device is essentially a package of processing capability, IO, sensors and local data storage and state maintenance. Later revisions brought some online syncing capabilities with iCloud. Even with iCloud, a user's "suite" of devices - their Mac, iPad and iPhone - remain three distinct bundles of local data and state, some parts of which are synchronised through the cloud but don't live in the cloud.

This distinction is crucial. It is the spring from which all the confusion arises when your colleagues and relatives don't understand why buying more iCloud storage space won't solve the storage space problems on their 16GB iPhone.

Continuity and Handoff in iOS 8 and Yosemite attempt to bridge the divide between devices. While we have had data syncing for some time in OS X and iOS, Continuity is about attempting - at some level - to synchronise state between devices.

You can understand why iOS is built that way. The first iPhone had only EDGE networking and a weak battery. Any software process that depended on constant connectivity to the network was a total non-starter in the mid-2000s. Today, though, we have much more power-efficient hardware in our devices, better batteries and much faster cellular networks.

It seems to me that the prospect of a cloud-only existence is very close. Hence my interest in Chromebook.

I don't wish to reiterate the simplistic arguments about "you can't do this or that on whatever device". When we're looking at longer-term trends, rather than making tactical decisions about the current deployment, we need to think deeper. We need to avoid the human tendency to over-estimate the short term and vastly under-estimate the long term.

What I want to think about more is the idea that we are moving into a post-mobile era. Encapsulated in that phrase "post-mobile" is all kinds of opportunity for misunderstanding and erroneous refutation, so let me be clear: post-mobile doesn't connote that mobile devices are going away. Far from it. They may eventually be the only devices we own.

What I mean by "post-mobile" is that we may be about to move away from the idea of local state and storage, even on our mobile devices. To a certain extent - even possibly to a great extent - most people have already done this on the desktop (and laptop). Every significant application or service that has arisen in the last ten years or more on the desktop has been a web app. The last exception I can think of is possibly iTunes and, in the broad scheme of computing, it's even debatable if iTunes counts as "significant".

I started to notice this when I began describing my iPhone as "a remote control for cloud services". It seemed that every app I touched regularly on my iPhone was an app that more or less totally depended on networking for its function. Let's look at the main ones:

  • Mail
  • Safari
  • Twitter
  • Music streaming (iTunes Match)
  • Google Drive
  • Maps
  • Feedly (RSS)
  • Pocket
  • Travel apps
  • Instagram
  • Netflix, BBC iPlayer, Amazon Instant Video, YouTube, Plex
  • Evernote

It seemed, ultimately, that my iPhone was becoming a stateless device. This hit home to me when I upgraded to my latest iPhone. Instead of restoring my backups, I set the phone up as new. There was almost no data loss: everything I had access to on that phone came back from cloud services almost immediately.

I think this is largely a function of the use cases that a smartphone is put to: communication and entertainment-oriented tasks that depend on up-to-date information. Writing a Keynote presentation on your iPhone can be done, but it's not comfortable.

The iPad, however, is a different story. There, I do build movies in iMovie, work in GarageBand and create in Keynote. There is a lot of local state on the iPad and it can be quite difficult to manage at times.

So, where does my interest in Chromebooks come from? Well, Chrome OS has always felt to me like it really had the soul of Google in it, in a way that Android never did. Google is all about the web, and Chrome OS is all about the web.

My interest in Chrome OS also waxes and wanes in direct proportion to my frustration with Apple. Right now, it is waxing strongly, as Apple's ability to ship reliable software appears to be disappearing like snow off a dyke, as we say in Scotland.

Chrome OS isn't interesting because it has better apps than iOS. Generally, it doesn't. It's not interesting because Chromebooks are nicer tools than Apple computers; they're not. I won't lie: Chrome OS is partly interesting because Chromebooks are 20-50% of the price of Apple computers.

Chrome OS is really interesting, though, because it's a computer platform whose entire existence is built around the idea that neither state nor data is local to the machine. In some ways, we had this before, when we used OS X Server to manage OS X machines with auto-mounted home directories and so forth. Auto-mounted home directories barely worked across a LAN, however, far less a WAN. Software just wasn't designed to talk sparingly to storage in those days.

The total decoupling of state and data from the machine and coupling it to the user's account has a number of interesting implications. The device becomes essentially disposable or at least highly fungible. It becomes secure, since there's little or no local data to attack and even logging into the computer can require 2-factor authentication.

When I first started looking at Chromebooks, they were cheap and quite weak computers. They were slow and made of poor plastics. Today, though, they are much faster and much better built, and have achieved this without the kind of price increases we have seen from the once-cheap Android tablets now trying to compete with iPad on performance and quality. Chromebooks are reaping the dividend of 30 years of development on PCs.

At its heart, though, a Chromebook is a computer built around Google Drive and Google Docs. The Drive suite is the killer app for Chromebook; everything else is secondary. It is interesting that there increasingly exists a class of software that is "synchronised local state" and another class that is "cloud state accessed locally". This is the difference between Pages and Google Docs, between OmniFocus and Todoist, or between iMessage and Slack.

The long-term strategic part of this is that it appears to be much harder to build a robust cloud-coordinated back-end to previously local-state software than it is to make a cloud-backed application work offline. Witness the rather sorry state of collaboration tools like iWork's iCloud collaboration, OneDrive or even Dropbox.

The flipside of this coin is that it's not just about having your state and data in the cloud; it's also about having your applications running continuously, even when you're not actively using them. There's no IFTTT channel for Microsoft Office. I'm very interested in what happens when our tools are no longer tools that we start and stop using but rather are processes that operate continually in the cloud working on our behalf and which we check in with from time to time as we need. This is the difference between Google Now and Siri: Google Now works for you when you're not watching; Siri works only when you whistle.

Phase one was about adopting "mobile" technology in schools. It worked and it's embedded now. iPad is the workhorse tool and I appreciate that very much. It just means it's no longer particularly intellectually interesting. For me, phase one is over.

To my mind, phase two - the next five years or so - is about making full use of the cloud in schools. I hope Google moves Chrome OS beyond the laptop form factor, so that we don't lose some of those benefits of mobility. I also hope Apple decides to be part of that conversation at all.

The Department of Ungrateful Users

So iOS 8 is upon us and it brings many of the features that I've been waiting for in iOS for a very, very long time.

So what's wrong with iOS 8? I've already criticised the bugs, and bugs are bad, but bugs get fixed. What about the design choices and feature set?

Safari and the Web

On our podcast, and online, Bradley and I have criticised Safari as one of the areas of iOS that is materially holding the platform back. Safari's compatibility with the majority of websites is very good. No real complaints there.

The remaining issues centre on two things: web designers being either too clever or too dumb for their own good, and file transfer over HTTP.

The first is harder to fix. How do you convince web designers to take touch seriously? The current solution for most is basically to hive off touch compatibility into an m.example.org ghetto. It's not that the desktop web is unusable on an iOS device; it's more that many navigation designs that depend on things like mouseover actions on page elements are extremely non-obvious on touch devices. Even though a touch followed by a second touch usually makes such elements work, the result is indistinguishable from "broken" in many cases.

The second issue is that the browser isn't just about loading and rendering HTML. Many productivity tasks involve downloading files from, or uploading files to, the web. It could be updating a website, filling in an online expenses form, or whatever.

The download part of this problem has been kind of solved for a while now. You can download one file and, when it's done, use Open In... to send it to another application. This is functional but basic: you have to wait for the file to complete downloading before you can do any action on it, or open another tab (which risks overwhelming Safari if you open too many).

A download manager for Safari would be most welcome. Even better - and Google is starting to do this in Chrome - would be a "download this URL into my iCloud Drive" feature, in which Apple's servers would download the file directly into your iCloud Drive, from where you could later access it.

The big blocker right now is file upload. Since around iOS 6, it has been possible to upload photos to a website through Safari. This mechanism needs to be generalised to any file. Today, the procedure for any company wanting to accept arbitrary file uploads from iOS looks like this:

  • Design, build, test and ship an iOS app.
  • Enable it to accept and upload files via Open In... or picking from iCloud Drive.

We now have a filesystem-like representation on iOS - it's called iCloud Drive - so it should be possible to pick any file from iCloud Drive and upload that through Safari.
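
At the app level, the building block for this already exists in the shape of the iOS 8 document picker. Here is a minimal sketch in Swift (the class and method names are mine, and the modern delegate callback is used) of letting the user pull an arbitrary file out of iCloud Drive, which is exactly the kind of flow a generalised file-upload control in Safari would need to expose:

    import UIKit
    import MobileCoreServices

    // Sketch: let the user pick any file from iCloud Drive (or another
    // document provider) so the app can upload it somewhere.
    final class UploadViewController: UIViewController, UIDocumentPickerDelegate {

        @objc func pickFileForUpload() {
            // kUTTypeItem ("public.item") matches any file type.
            let picker = UIDocumentPickerViewController(documentTypes: [kUTTypeItem as String],
                                                        in: .import)
            picker.delegate = self
            present(picker, animated: true)
        }

        func documentPicker(_ controller: UIDocumentPickerViewController,
                            didPickDocumentsAt urls: [URL]) {
            guard let fileURL = urls.first else { return }
            // The picked file is copied into the app's sandbox; from here it
            // could be handed to an upload task.
            print("Would upload: \(fileURL.lastPathComponent)")
        }
    }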

All of this, by the way, should also apply to Mail on iOS when it comes to picking attachments.

Inconsistent File Presentation

I do a lot of work on iOS and I use a wide range of apps. What I see right now is a highly inconsistent approach to file handling in many apps. This is not unexpected as developers have spent substantial time over the years building custom integrations to services like Dropbox, Box and Google Drive.

What needs to happen soon is for Apple to seriously tighten up the App Store Review guidelines on file presentation. Everyone needs to use the iOS 8 document picker to present file operations to the user.

Here's a concrete example: I use Auria by WaveMachine Labs to record our podcast. It's a great, powerful app for editing audio on iOS. When I'm done with the mixdown, I'm presented with a fixed range of export options that is, in full: Dropbox, SoundCloud, AudioShare, email or none. We use Google Drive to transfer the audio files for the show, but there's no way to get there directly from Auria.

Basically, all of these custom integrations need to go away, and Dropbox, Google Drive, et al. should present themselves as iOS 8 Document Providers. To their credit, Dropbox already does, but too many apps right now do not present the user with that option when moving files around.
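
For completeness, here is what the export direction looks like when an app hands the job to the system document picker instead of a hard-coded list of services; any installed Document Provider then shows up automatically. This is a sketch only, not Auria's actual code, and the names are mine.

    import UIKit

    // Sketch: export a finished file via the system document picker so the user
    // can choose any installed provider (Google Drive, Dropbox, iCloud Drive...).
    final class ExportViewController: UIViewController, UIDocumentPickerDelegate {

        func exportMixdown(at localURL: URL) {
            // .exportToService copies the file to whichever provider the user picks.
            let picker = UIDocumentPickerViewController(url: localURL, in: .exportToService)
            picker.delegate = self
            present(picker, animated: true)
        }

        func documentPicker(_ controller: UIDocumentPickerViewController,
                            didPickDocumentsAt urls: [URL]) {
            print("Exported to: \(urls.map { $0.lastPathComponent })")
        }
    }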

Back to the Mac

As iOS evolves, I keep using the same question to gauge its progress: what is it that keeps me going back to the Mac? The list is shorter now than it's ever been. Clipping to Evernote is now easy in iOS 8 with their Safari extension. Using 1Password is now as slick and integrated on iOS as it is on OS X. There remain a few stumbling blocks, but not many.

I ask myself what it would take for me to completely eschew owning a Mac. I'm not there yet, and in practical terms I'm not even all that close. Like your pal who doesn't have a car, but only manages that because you give him a lift, I could possibly do without my own personal Mac only because I have access to Macs at school.

One of the reasons for this is that the Mac is how you recover an iOS device. If your device turns up its toes completely, one way to get it back is to plug it into a Mac and perform various incantations to revive it. If your iOS device ends up totally full of images and video, the fastest way to solve that problem is to plug it into a Mac and download them all through Image Capture.

You may wish to argue that a "mobile OS" doesn't need to have all the features and power of a "desktop OS" but I disagree. For many, the mobile OS is their first OS. It may even be their only OS. I argue that these devices need to be a superset of desktop functionality, not a subset. They can't be that today because of power, CPU, storage and bandwidth constraints but the gap is closing fast.