There’s more to breaking down the barriers between desktop and mobile than making the UIs on an iPhone and a MacBook look alike, and Apple used its WWDC 2014 opening keynote to hint at how. Continuity came up several times during the presentation, on both the OS X and the iOS sides, explaining how later this year our Macs and iPhones will work together to transfer tasks as you move between them. Yet the potential for Continuity goes far beyond simply giving you a bigger keyboard on which to finish your email.
As Apple explained, Continuity is relatively straightforward. By using proximity-awareness, tasks that you’re completing on one device - say, your iPhone - can be switched over to another - like the iMac on your desktop - depending on where you physically are.
So, if you’re working on a document on your iPad, and then you walk up to your iMac, the same document can be brought up at the same point in your workflow, giving you the full-sized screen and peripherals to use.
The converse is also true: move away from the Mac and you can swipe up on the iPad to bring up the current task on the tablet, instead.
It’s called Handoff, and it works with compatible apps - like Mail, Safari, the iWork suite, and third-party apps in which developers have added support - as long as you’re signed into the same iCloud account on all devices. There’s also the ability to place and answer calls from your Mac through your iPhone, turning the computer into a speakerphone, as well as to send SMS and MMS messages from an iPad or Mac.
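For developers, adding that support is built around the NSUserActivity API Apple introduced alongside Handoff. The sketch below shows roughly how a third-party editing app might advertise and resume an activity; the activity type string, userInfo keys, and helper method are hypothetical examples, not Apple sample code.

```swift
import UIKit

// A rough sketch of Handoff adoption via NSUserActivity.
// "com.example.editor.editing" and the userInfo keys are
// illustrative; a real app would declare its activity types
// in Info.plist under NSUserActivityTypes.
class DocumentViewController: UIViewController {

    func startHandoffActivity(documentID: String, cursorPosition: Int) {
        let activity = NSUserActivity(activityType: "com.example.editor.editing")
        activity.title = "Editing document"
        // Carry just enough state to resume at the same point on the other device.
        activity.userInfo = ["documentID": documentID,
                             "cursorPosition": cursorPosition]
        userActivity = activity
        activity.becomeCurrent()   // advertise the activity to nearby devices
    }

    // Called on the receiving device once the user accepts the Handoff.
    override func restoreUserActivityState(_ activity: NSUserActivity) {
        super.restoreUserActivityState(activity)
        if let info = activity.userInfo,
           let documentID = info["documentID"] as? String {
            // Reopen the same document where the user left off.
            openDocument(documentID, at: info["cursorPosition"] as? Int ?? 0)
        }
    }

    func openDocument(_ id: String, at position: Int) {
        // App-specific document loading would go here.
    }
}
```

Because only the lightweight userInfo dictionary travels between devices, apps that need the document itself still rely on iCloud (or their own sync) to have the data waiting on the receiving device.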
That understanding of proximity unlocks huge potential for persistent computing. Our lives are already filled with screens, yet our workflow is often limited by the size of the screen we first started each task on. There can be ergonomic considerations - it’s usually easier to type on a regular keyboard than on an on-screen equivalent - or convenience ones, like wanting to leave the Mac in your office without necessarily leaving behind the project you’re currently working on.
Continuity and Handoff open the door to treating any display, and any peripheral, as a dynamically-assigned terminal. Think of iCloud as an identity rather than a storage location: your experience of your own data and tasks, with seamless handover between whichever form-factors make the most sense.
We’ve already seen Apple do that with multimedia, and streaming audio and video through AirPlay, but it’s clear that Continuity is an ecosystem play on several levels. Most obvious is to encourage you to buy more Apple hardware, but with HomeKit and HealthKit also in the pipeline, there’s the potential to tie proximity and device context into the conjoined devices around your smart home and on your wrist.
How successful that is will depend, to some extent, on how accurately and consistently Apple can track exactly where you are. Right now, the most granular that tracking gets is the location of your iPhone. Should the company indeed step into wearables - as it’s broadly expected to do with a health-centric iWatch - then its understanding of location (not to mention activity) could be refined even further.
If you’re on the couch, are you watching TV, in which case it might make sense to bring HomeKit notifications to the big screen rather than the phone in your pocket? Or are you sleeping, so that only the most important alerts should be let through? Should your phone calls be automatically routed through your Apple TV instead?
One of the key benefits of the cloud has always been the idea that your data isn’t locked up on any one, single device. The next challenge may be figuring out a way to deliver that in the most timely way, to the most appropriate display.