Presenting with Apple Watch

I've been using Keynote on the iPhone to control my presentations for a few years now. My standard presentation setup is to run Keynote on my iPad connected to the projection equipment and use my iPhone as the controller. This is effective, reliable and convenient as it's one less thing to remember.

With the Apple Watch come two ways to control presentations from this new device. Both Keynote and PowerPoint for iOS have companion Apple Watch apps that can control their respective parent apps.

Here is a brief comparison of each without, at this point, any comment on reliability as I have not used either in anger so far.

Keynote

The Keynote Watch app can operate in two modes. Firstly, it can control a Keynote presentation that's running on your paired iPhone. In this mode, you would connect the phone to the projector and control it with the watch.

In the second mode, Keynote on the Watch can control Keynote on your iPhone, which is itself operating in the existing remote-control mode to control Keynote on another device. It's a bit like Keynote Inception and, at first blush, carries at least 100% more chance of wireless failure than I'm entirely happy with. Still, this is the only way to use your Apple Watch to control Keynote on your Mac or iPad.

I was initially confused about this as it's not obvious what puts it into each mode but here's the rule: if you have a Keynote presentation open in Keynote on your iPhone when you launch the Watch app, it will control that presentation. If you're looking at the Keynote file picker when you launch the Watch app, it will go into Keynote Remote Remote mode and start trying to connect to Keynote on your Mac or iPad. That's a lot of Keynote.

Once the presentation is running, the entire surface of the screen is a big "forward" button. This is obviously ideal for no-look advancing of your slides: just mash your big thumb anywhere on the screen of the watch. You do get a slide progress counter at the bottom and there's the current time in the top-right corner - but no indication of elapsed time since the start of the presentation.

A force touch on the screen reveals two additional options: Back and Exit Slideshow. I quite like this approach as it's rare enough that you want to go backwards through slides.

What is lacking on the watch is any of the other niceties of the Keynote presenter display. Presenter Notes are obviously not really going to work here but it would be extremely helpful if the watch screen included the red/green "Ready to Advance" indicator and the number of builds remaining in the slide.

PowerPoint

The PowerPoint watch app is basically the same idea as the Keynote app: forward and back through your slides. PowerPoint, however, only has the ability to control a presentation running on the Watch's paired iPhone. There's no ability to control PowerPoint on a third device.

As with Keynote, the PowerPoint app has to be running on the iPhone. First, you're presented with the Start Slideshow button.

When you're presenting, PowerPoint offers a couple of additional options on the main screen. You get the ability to navigate both back and forward as well as an elapsed time counter and a slide progress counter. To me, this is a win-one-lose-one scenario: the elapsed time counter is a great addition. However, putting both back and forward buttons on such a small screen would - I imagine - increase the chances of a navigation error.

PowerPoint provides two options with a force touch: Restart and Exit Slideshow. I can't think of too many PowerPoint presentations after which I would wish the presenter to have such easy power to restart the show, but I suppose there is a use case in there somewhere.

Setting up the Watch for presenting

To use the Watch effectively as a presentation remote, you're going to want to make a few adjustments.

Firstly, I think that you're going to want to take the watch off your wrist. Unless you're a very stationary and gesture-free speaker, it's going to be really obvious when you go to advance a slide with your Watch. Also, for the next few months at least, you're going to be That Speaker Who Controlled Their Presentation With Their Watch And Was A Bit of a Douche rather than the Speaker Who Was Awesome.

You want the face of the watch nestling in your cupped fingers, the same place you'd hold a TV remote or a more traditional presenter's remote. I found that taking the watch off, re-closing the sport band and placing three fingers through the band, in the way that you might pick up a watch to look at it, was an effective way to hold it. The most important thing here is that you don't distract your audience by fiddling to switch slides and you don't make a mistake when navigating.

Secondly, you want to make sure that the watch doesn't turn itself off or otherwise jump to some other function while you're using it. To minimise the chance of this, you should:

  • Disable Wrist Detection, the feature that locks the watch if it comes off your wrist. You do this in the Apple Watch app on your iPhone, in General > Wrist Detection.
  • Set "Activate on Wrist Raise" to "Resume Previous Activity". You can do this right on the Watch in Settings > General > Activate on Wrist Raise. This ensures that, if the watch does sleep during your presentation, tapping the screen will bring you back to the remote app, rather than the watch face.

I've been presenting exclusively with iPad and iPhone for several years now and it has been unfailingly reliable for me. With the advent of the iPhone 6 Plus and the Apple Watch, it seems entirely possible to me that my presenting kit just got a whole lot smaller.

As always, if you're interested in hiring me to present to your school, university or business, I'm available.

Notes on Migrating from Aperture to Photos for OS X

I have enjoyed photography for many years, particularly since the transition to digital. I've been shooting exclusively digitally since I got my first digital camera in 2002 and the result is rather a large body of digital pictures.

This is the story of migrating from a system that involved Aperture and a bunch of jury-rigged hacks to Apple's new Photos for OS X.

Background

Since Aperture first shipped, I used it to manage all my digital photographs - until the iPhone came along and wrecked my workflow. I can't properly explain what went wrong, but I think we all now recognise that there is a sense in which photos "live" on an iPhone in a way that they don't live on a digital camera. At least they didn't for us "serious" photographers but, if you recall the days before smartphones, many regular users did store photos on their digital camera as a way to carry them around and show friends.

Since the iPhone, I relied on occasionally dumping photos out into Aperture and then, later, on the never-very-good iCloud Photo Stream to get photos into Aperture. The implementation of Photo Stream in both Aperture and iPhoto was a mess, as evidenced by the dozens of projects I have named "Photo Stream (month year)", each roughly correlating to the dates I opened Aperture.

So, at the end of this era, I had:

  • About 31,000 photographs in Aperture, totalling over 300GB.
  • The Aperture library residing on my Mac's internal SSD, taking up 38GB.
  • All master images referenced on a 3TB external hard drive.
  • About 6,000 photographs in iCloud Photo Library, totalling around 6-7GB of iCloud storage space.
  • A folder containing an unknown number of other photographs, around 40GB in size, from the time between the early iOS 7 betas and iOS 8 shipping.

So, how to get all of this into iCloud Photo Library and back down to my iPhone and iPad?

Migrating the Library

The first step after installing the 10.10.3 beta was to migrate the Aperture library. Photos did this more or less automatically and quite well. The software is obviously still in flux, so the exact details of the UI aren't worth discussing right now; I want to focus on the data migration.

Photos correctly maintained the connection of photos in the library to their referenced masters on the external drive. Everything worked well as long as the drive was connected. At this point I discovered that images with referenced masters cannot be uploaded to iCloud Photo Library. Only images with managed masters can.

The result at this point was that I could browse around 36,000 photos on my Mac but my iOS devices only showed about 6,000 photos. These were the 'native' iOS photos taken on my various devices in the iCloud Photo Library era, but none of my 'legacy' DSLR photos were coming across.

It is also possible to consolidate the masters into the Photos library (using File > Consolidate). Now, recall that my master images totalled over 300GB. My MacBook Air only has 512GB of internal storage and a fair amount of that is used for other things. This looked like a stalemate: I couldn't consolidate the library for lack of disk space.
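
If you're weighing the same move, the arithmetic is easy to check in Terminal before attempting a consolidation. A minimal sketch - the path to the external masters here is hypothetical, so substitute wherever your referenced masters actually live:

    # Free space remaining on the internal disk
    df -h /

    # Total size of the referenced masters (hypothetical path)
    du -sh "/Volumes/External/Aperture Masters"

If the second number is bigger than the first, consolidating everything in one go isn't going to work.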

At this point I left things for a while as other parts of life interfered... and a few new 10.10.3 betas arrived.

Migrating the Data to the Cloud

The story resumes this week as I sat down to try and solve the problem once and for all. The impetus was only somewhat increased by an unexpected burst of techno-lust for the new MacBook Apple announced this week.

My first try was to consolidate the entire library. As expected, this quickly failed for lack of disk space: Photos indicated that around 200GB was required over and above what was available.

The next step was to consolidate some images. I started off consolidating a year at a time, which allowed me to make some progress. The default setup for Photos is to keep the full-resolution masters on the Mac's storage but there is an option to optimise instead. I turned this on.

At this point, it's worth noting that I'm still struggling a bit with a mental model of where all this data is going and what's happening to it. As a precaution, I have made backups of my entire Aperture library and masters on the external hard drive.

The result was that, working in batches, I was able to consolidate around 25,000 of the legacy images into the Photos library. The library on disk grew to around 214GB. At the same time, I was monitoring outbound network traffic from my Mac in Activity Monitor. This indicated that uploads were ongoing, which I took to be Photos sending these images to the cloud now that they were no longer referenced.

I opened Photos on my iPhone and iPad and could see photos streaming down to these devices. It was helpful to turn off Summarize Photos (Settings > Photos & Camera > Summarize Photos) on iOS to see the full extent of progress when viewing the Years view on the devices.

The Impact of 30,000 Photos

As the number of photos in my iOS device libraries grew, I started to notice some impacts on performance and correctness in some apps on my devices. Twitter correspondents were telling me that they had seen poor performance on iPhone 5s devices.

Once I crossed 20,000 images, I started to notice the following on my iPhone 6 Plus running iOS 8.2:

  • The first effect was occasional instability in the Photos app on iOS. It was by no means unusable but the crash rate went from zero to not-zero (note: this is much better - but still not perfect - on iOS 8.3).
  • The next effect was that iOS apps that implement their own photo-picking UI started to really struggle to remain performant, or even to show a correct view of the photos on the device. Particular offenders included Instagram and Explain Everything.
  • Apps that use the system photo picker UI continued to work well.
  • The Camera app started to be slightly slower to launch.

Time Taken

Having started the migration on Friday evening, by the time I woke up on Tuesday morning my devices were showing a more-or-less consistent view of my photo library. My iPhone had downloaded tiny thumbnails for all images although my iPad was still catching up.

The image counts were a little inconsistent at this point:

  • Mac: 30,931 Photos, 211 Videos
  • iPhone and iPad: 30,190 Photos, 211 Videos
  • iCloud Settings Panel: 31,030 Photos, 213 Videos (Settings > iCloud > Storage > Manage Storage > iCloud Photo Library)

The only reliable way I found to determine whether my Mac was completely finished migrating all the data to the cloud was to observe the Networking tab in Activity Monitor. When Photos was migrating, there was a very obvious pattern to the upstream bandwidth usage.
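
If you prefer the command line to Activity Monitor, nettop(1) gives a similar per-process view of network traffic. A minimal sketch, with the caveat that exactly which process performs the uploads is an assumption on my part - it may well be a background helper rather than the Photos app itself:

    # Show per-process network totals, updated as deltas; sustained
    # outbound traffic suggests the migration is still uploading
    nettop -P -d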

During this time, the system did not instantly sync anything. If I deleted an image on one device, it persisted for days on the others. This is hardly surprising given the background workload that was going on and I wasn’t particularly concerned about it.

Mike Bradshaw let me know via Twitter that he had observed image counts being incorrect for around 48 hours after migration as the various devices reached a quiescent state between them.

Data Usage and Optimisations

As I said earlier, I started with over 300GB of Aperture master images. In the final tally, here is how much data the system used after all optimisations:

  • Mac Library: 95GB (as reported by du(1); see the command after this list)
  • iCloud Storage: 269.3GB (as reported by Settings > iCloud > Storage)
  • iPhone on-board storage: 10GB (as reported by Settings > General > Usage > Manage Storage)
  • iPad on-board storage: 8.6GB
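
For reference, the du(1) figure above comes from pointing it at the library bundle, assuming the library is in its default location:

    # Report the total on-disk size of the Photos library bundle
    du -sh ~/Pictures/Photos\ Library.photoslibrary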

I have turned on “optimise storage” on every device, including the Mac.

I still don’t have a perfect mental model of the data placement in Photos but my current understanding of what has happened is this:

  • I consolidated (copied) all my master files from my external drive to my Mac's internal storage.
  • Photos proceeded to upload that entire collection to iCloud.
  • Once the photos were in iCloud, the consolidated masters were replaced by Mac-optimised versions.
  • On the iPhone and iPad, Photos was aware of the existence of these images, but no data was downloaded to any device until required.
  • When I accessed the Years view and tiny thumbnails were required, they were loaded from iCloud.
  • When I accessed any individual image, a high-resolution version was downloaded from iCloud; a circular pie-chart progress meter in the corner indicates this.

Steady State Operation

Seven days into the migration, I woke up to find my iPad and iPhone in total agreement about the number of photos I finally had:

  • Mac Library: 109GB (as reported by du(1))
  • iCloud Storage: 269.3GB (as reported by Settings > iCloud > Storage)
  • iPhone on-board storage: 10.3GB (as reported by Settings > General > Usage > Manage Storage)
  • iPad on-board storage: 8.6GB

At this point, my Mac’s photo count was still ahead by 21 photos. I’m now assuming that there are 21 photos that are either corrupt or missing their master files in some way somewhere in my photo library. I doubt I’ll ever find them.

Sync performance is good at this scale:

  • Photos deleted from one device disappear from the other in under 5 seconds.
  • Photo edits on one device appear on the other in about 15 seconds.
  • New photos from the phone appear on the Mac in under a minute (I have image optimisation turned on at both ends so this is likely to take some extra time).

It’s rare that I’ll want or need faster syncing than this. What most people really want, I’d guess, is “my photos are on the other device when I get back to it”, and Photos certainly seems to offer that. There is one caveat, however: Photos on iOS will not upload newly-taken photos unless the phone is on WiFi, and there seems to be no way around this. I have Settings > Cellular > Use Cellular Data For: turned ON for Photos, but it still won’t upload over cellular. I understand why that is the way it is, but I have an eat-all-you-can data plan for my phone and it’d be nice to have the option.

Overall, I can say that I'm really very pleased with Photos for OS X and with iCloud Photo Library.

Conference Room Display on AppleTV

Apple TV comes out of the box with all of Apple's movie and TV services enabled, as well as all the additional channels that keep appearing on the Apple TV home screen.

In times past, the typical practice was to laboriously go through and hide all the channels. This becomes a game of whack-a-mole as channels come and go.

There is a way, however, to have the Apple TV boot into Conference Room mode by default. Conference Room mode is a view that hides all of the extraneous features and just provides information about how to connect to the Apple TV.

Here are the steps:

  • Settings > AirPlay > AirPlay > On
  • Set up the requirement for on-screen codes if desired
  • Settings > AirPlay > Conference Room Display > On
  • Set background picture if desired
  • Settings > General > Restrictions > Turn On Restrictions
  • Configure the restrictions PIN here.
  • Settings > General > Restrictions > AirPlay Settings > Show
  • Settings > General > Restrictions > Conference Room Display > Ask
  • Settings > General > Restart

The restart step is really important. This won't work unless you restart.

The result of these instructions should be:

  • Apple TV boots directly into Conference Room Mode, hiding all the channels and rental services.
  • A verification code is requested when starting AirPlay from any device (this is optional).
  • The restrictions PIN is required to exit Conference Room Mode.

You can also load a custom background from iCloud if you're logged into a specific iCloud account on the Apple TV.

The result: an Apple TV that boots into Conference Room mode.

The Post-Mobile Era

Twitter followers will know that I've been interested in Chrome OS for a while. Podcast listeners will know that I've been crazily frustrated with Apple's technology since iOS 7 shipped, particularly from a quality standpoint.

Put these two things together and it's time to experiment further with Chromebooks.

When you work in educational technology, you have to be a little like the Roman god Janus and look both forward and backward. You look backward because everyone else is behind you: pupils, parents, colleagues, administrators, regulators, government. These are the people you have to take with you into the new.

At the same time, we have to periodically make very clear judgment calls about what is happening right now - without reference to the past or the future. This is what happens in your summer refresh: it doesn't matter what's coming out in October or at CES and it doesn't much matter what you've deployed in the past - you have to sign your PO in June and the trucks roll up in August with whatever is the best possible decision at the time. Such are the hard scheduling realities of school life.

Like Janus, it's also essential to keep one eye on the future. Trends change, the conversation moves on and, if you want to serve your school community correctly and well, you have to not just keep abreast of those changes but lead and live them well before you expect others to.

This is what keeps me up at night.

When we started with iPad in 2010, the argument was around the appropriateness of "mobile devices" in the classroom. Could we manage without the standard computer tropes that adults of the time had been brought up with?

Douglas Adams:

I've come up with a set of rules that describe our reactions to technologies:

1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.

2. Anything that's invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.

3. Anything invented after you're thirty-five is against the natural order of things.

I was 32 when we started with the iPad and, you know what? Douglas Adams was right.

It's odd to think back to those days and remember the heat we took for doing what is now an accepted (if not yet widely-implemented) part of the educational technology stack.

They said children couldn't multi-task on iPads. Wrong. They said children couldn't type on iPads. Wrong. They said children would break their iPads. Wrong. They said children would lose their iPads. Wrong. They said Android tablets would be better and cheaper within a year. Wrong.

What the over-35s meant was that they couldn't multi-task, type on or handle their iPad without breaking it.

Having said that, the move to mobile devices wasn't as much of a paradigm shift as some people thought. An iPad or an iPhone is, after all, a fairly doctrinaire computer of the type that Apple has made since 1984. Many people couldn't see past the introduction of the touchscreen and thought it heralded a completely new era of computing. Many thought that the tablet had to be understood from first principles rather than as an evolution of the laptop computer. While the touchscreen and the tablet form factor have enabled a number of new and important use cases and contexts, I'm not so sure they represent a completely new era. The smartphone, because of scale, reach and carrier subsidies, genuinely is a fundamentally different proposition, but that's another post.

Like every Macintosh before it, an iOS device is essentially a package of processing capability, IO, sensors, local data storage and state maintenance. Later revisions brought some online syncing capabilities with iCloud. Even with iCloud, a user's "suite" of devices - their Mac, iPad and iPhone - remains three distinct bundles of local data and state, some parts of which are synchronised through the cloud but don't live in the cloud.

This distinction is crucial. This distinction is the spring from which all the confusion arises when your colleagues and relatives don't understand that buying more iCloud storage space won't solve their storage space problems on their 16GB iPhone.

Continuity and Handoff in iOS 8 and Yosemite attempt to bridge the divide between devices. While we have had data syncing for some time in OS X and iOS, Continuity is about attempting - at some level - to synchronise state between devices.

You can understand why iOS is built that way. The first iPhone had only EDGE networking and a weak battery. Any software process that depended on constant connectivity to the network was a total non-starter in the mid-2000s. Today, though, we have much more power-efficient hardware in our devices, better batteries and much faster cellular networks.

It seems to me that the prospect of a cloud-only existence is very close. Hence my interest in Chromebook.

I don't wish to reiterate the simplistic arguments about "you can't do this or that on whatever device". When we're looking at longer-term trends, rather than making tactical decisions about the current deployment, we need to think deeper. We need to avoid the human tendency to over-estimate the short term and vastly under-estimate the long term.

What I want to think about more is the idea that we are moving into a post-mobile era. Encapsulated in that phrase "post-mobile" is all kinds of opportunity for misunderstanding and erroneous refutation, so let me be clear: post-mobile doesn't connote that mobile devices are going away. Far from it. They may eventually be the only devices we own.

What I mean by "post-mobile" is that we may be about to move away from the idea of local state and storage, even on our mobile devices. To a certain extent - even possibly to a great extent - most people have already done this on the desktop (and laptop). Every significant application or service that has arisen in the last ten years or more on the desktop has been a web app. The last exception I can think of is possibly iTunes and, in the broad scheme of computing, it's even debatable if iTunes counts as "significant".

I started to notice this when I began describing my iPhone as "a remote control for cloud services". It seemed that every app I touched regularly on my iPhone more or less totally depended on networking for its function. Let's look at the main ones:

  • Mail
  • Safari
  • Twitter
  • Music streaming (iTunes Match)
  • Google Drive
  • Maps
  • Feedly (RSS)
  • Pocket
  • Travel apps
  • Instagram
  • Netflix, BBC iPlayer, Amazon Instant Video, YouTube, Plex
  • Evernote

It seemed, ultimately, that my iPhone was becoming a stateless device. This hit home to me when I upgraded to my latest iPhone. Instead of restoring my backups, I set the phone up as new. There was almost no data loss: everything I had access to on that phone came back from cloud services almost immediately.

I think this is largely a function of the use cases that a smartphone is put to: communication and entertainment-oriented tasks that depend on up-to-date information. It can be done but it's not comfortable to write a Keynote presentation on your iPhone.

The iPad, however, is a different story. There, I do build movies in iMovie, work in GarageBand and create in Keynote. There is a lot of local state on the iPad and it can be quite difficult to manage at times.

So, where does my interest in Chromebook arise from? Well, ChromeOS has always felt to me like it really has the soul of Google in it, in a way that Android never did. Google is all about the web and ChromeOS is all about the web.

My interest in ChromeOS definitely also waxes and wanes in inverse proportion to my frustration with Apple. Right now, it waxes strongly, as Apple's ability to ship reliable software appears to be disappearing like snow off a dyke, as we say in Scotland.

ChromeOS isn't interesting because it's got better apps than iOS. Generally, it doesn't. It's not interesting because Chromebooks are nicer tools than Apple computers; they're not. I won't lie: Chrome OS is partly interesting because Chromebooks are 20-50% of the price of Apple computers.

ChromeOS is really interesting, though, because it's a computer whose entire existence is built around the idea that neither state nor data is local to the machine. In some ways, we had this before, when we used OS X Server to manage OS X machines with auto-mounted home directories and so forth. Auto-mounted home directories barely worked across a LAN, however, far less a WAN. Software just wasn't designed to talk sparingly to storage in those days.

The total decoupling of state and data from the machine and coupling it to the user's account has a number of interesting implications. The device becomes essentially disposable or at least highly fungible. It becomes secure, since there's little or no local data to attack and even logging into the computer can require 2-factor authentication.

When I first started looking at Chromebooks, they were cheap and quite weak computers. They were slow and made of poor plastics. Today, though, they are much faster and much better built, and they have achieved this without the kind of price increases we have seen from the once-cheap Android tablets now trying to compete with iPad on performance and quality. Chromebooks are reaping the dividend of 30 years of development on PCs.

At its heart, though, a Chromebook is a computer built around Google Drive and Google Docs. The Drive suite is the killer app for Chromebook; the rest is secondary. It is interesting that there increasingly exists a class of software that is "synchronised local state" and another class that is "cloud state accessed locally". This is the difference between Pages and Google Docs, between OmniFocus and Todoist, or between iMessage and Slack.

The long-term strategic part of this is that it appears to be much harder to build a robust cloud-coordinated back-end to previously local-state software than it is to make a cloud-backed application work offline. Witness the rather sorry state of collaboration tools like iWork's iCloud collaboration, OneDrive or even Dropbox.

The flipside of this coin is that it's not just about having your state and data in the cloud; it's also about having your applications running continuously, even when you're not actively using them. There's no IFTTT channel for Microsoft Office. I'm very interested in what happens when our tools are no longer tools that we start and stop using but rather are processes that operate continually in the cloud working on our behalf and which we check in with from time to time as we need. This is the difference between Google Now and Siri: Google Now works for you when you're not watching; Siri works only when you whistle.

Phase one was about adopting "mobile" technology in schools. It worked and it's embedded now. iPad is the workhorse tool and I appreciate that very much. It just means it's no longer particularly intellectually interesting. For me, phase one is over.

To my mind, phase two - the next five years or so - is about making full use of the cloud in schools. I hope Google moves ChromeOS beyond the laptop form factor, so that we don't lose some of those benefits of mobility. And I just hope Apple decides to be part of that conversation at all.