Apple's 3D Touch was doomed from the start

3D Touch was meant to unlock a new level of interaction with our iPhones: instead, it just added to the confusion. Now, with the iPhone XR arriving without Apple's pressure-sensitive screens, it looks like the technology is finally being abandoned. Then again, the writing has been on the wall for 3D Touch for some time now.

Apple first started using pressure-sensitive displays with the original Apple Watch. Dubbed Force Touch, the technology uses an array of electrodes that, when you press down, figures out how much pressure is being applied. With that, watchOS can differentiate between a regular tap and a firmer press, doubling the input options.
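
On the developer side, that firmer press mostly surfaced as the watchOS context menu: apps attach menu items to an interface controller, and the system shows them only when the screen is force-pressed. A minimal sketch of how that looked, with the trash icon and delete action as illustrative placeholders:

```swift
import WatchKit

// Sketch of the watchOS-facing side of Force Touch: a firm press on this
// controller's screen pops up a context menu with the items added below.
// The icon choice and deleteItem action are illustrative placeholders.
class ItemInterfaceController: WKInterfaceController {

    override func awake(withContext context: Any?) {
        super.awake(withContext: context)
        // Menu items only appear when the user force-presses the display.
        addMenuItem(with: .trash, title: "Delete", action: #selector(deleteItem))
    }

    @objc func deleteItem() {
        // Handle the action triggered from the force-press menu.
    }
}
```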

The technology made its way to Apple's smartphones, rebranded as 3D Touch, with the iPhone 6s and iPhone 6s Plus in late 2015, though the reception there was less positive. The official line was the same: 3D Touch "senses force to enable intuitive new ways to access features and interact with content." Apple even baked new gestures into its interface, dubbed Peek and Pop, which promised faster ways to preview messages, photos, and webpages, then jump straight to them.
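
For app developers, Peek and Pop came down to UIKit's previewing API: register a view as a preview source, return a controller to "peek" at on a firm press, then commit it to "pop". A rough sketch of that flow, with DetailViewController as a hypothetical destination:

```swift
import UIKit

// Hypothetical destination controller, standing in for whatever content gets previewed.
class DetailViewController: UIViewController {
    init(item: Int) {
        super.init(nibName: nil, bundle: nil)
    }
    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}

// Sketch of UIKit's Peek and Pop API as it existed from iOS 9 through iOS 12.
class ListViewController: UITableViewController, UIViewControllerPreviewingDelegate {

    private var previewingContext: UIViewControllerPreviewing?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Only register for previewing when the hardware actually supports 3D Touch.
        if traitCollection.forceTouchCapability == .available {
            previewingContext = registerForPreviewing(with: self, sourceView: tableView)
        }
    }

    // "Peek": called on a firm press; return the controller to preview, or nil to do nothing.
    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           viewControllerForLocation location: CGPoint) -> UIViewController? {
        guard let indexPath = tableView.indexPathForRow(at: location) else { return nil }
        previewingContext.sourceRect = tableView.rectForRow(at: indexPath)
        return DetailViewController(item: indexPath.row)
    }

    // "Pop": called when the user presses harder still; commit to the full view.
    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           commit viewControllerToCommit: UIViewController) {
        show(viewControllerToCommit, sender: self)
    }
}
```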

Still, it's not hard to see how 3D Touch stumbled. As a concept, it wasn't a bad idea: give access to extra features from a "different" type of tap, whether that be contextual menus, previews, or something else. The problem was that actually figuring out where 3D Touch was available was always trial and error.

With no standardized way to flag whether an icon, dialog, or button supported 3D Touch, there was little incentive to try it. Either you were an avid 3D Toucher, hard-pressing on everything on the off-chance it worked, or you ignored the feature altogether and just used iOS normally.

Of those who did embrace 3D Touch, I suspect most are like me: aware of a handful of places where the feature is supported and genuinely useful – like summoning the context menu from lock screen notifications, so you can quickly delete unwanted emails before they clog up your inbox – but not going out of their way to find other implementations. Apple could've encouraged some sort of graphic or stylistic element to highlight where 3D Touch was available, but it didn't, and so I never bothered to dig much deeper.

What may well have sealed 3D Touch's fate, though, wasn't so much that its core features went unused, but that its potential above and beyond a regular old long-press did. The differentiator between a pressure-sensitive tap and a time-sensitive one is, obviously, how hard you're pressing the screen. Apple unlocked that data for third-party developers to use, but few did.
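
That data arrived through UIKit's standard touch events: each UITouch reports a force alongside the maximum force the hardware can register, which an app can normalize and map onto whatever behavior it likes. A minimal sketch, with the pressure-to-alpha mapping purely illustrative:

```swift
import UIKit

// Minimal sketch of reading 3D Touch pressure in a custom view; the
// normalized value (0.0–1.0) is what an app would map to its own behavior.
class PressureView: UIView {

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesMoved(touches, with: event)
        guard let touch = touches.first else { return }

        // On hardware without 3D Touch, maximumPossibleForce is 0, so guard against it.
        guard traitCollection.forceTouchCapability == .available,
              touch.maximumPossibleForce > 0 else { return }

        let normalizedForce = touch.force / touch.maximumPossibleForce
        alpha = 0.25 + 0.75 * normalizedForce // illustrative: fade the view in with pressure
    }
}
```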

Indeed, the only real applications we saw were in music apps. ROLI, maker of squishy, pressure-sensitive keyboards, gave its Noise music app the 3D Touch treatment: the firmer you pressed the iPhone's screen, the more you could change the way the notes sounded. DJay Pro then put the technology to work in its iPhone DJ app.

In the end, though, the possibilities unlocked by figuring out just how firmly you were tapping a phone screen simply weren't broad enough in their appeal to make it all worthwhile. At the same time, 3D Touch brought with it multiple compromises Apple needed to accommodate. The capacitive sensors involved add cost, and the sandwich of layers adds thickness. Without mainstream adoption, absorbing those compromises must've started to look foolhardy.

I suspect the same technology won't be so quick to expire on the Apple Watch. There, Force Touch serves a much more practical purpose: a smartwatch screen is small, and it demands different ways of interacting than a phone display does. It's also arguably easier to gauge how much pressure you're applying when your fingertip is pushing down and – through the watch – onto your wrist.

Apple's transition to OLED hasn't been straightforward, as numerous leaks in the run-up to the first iPhone X last year illustrated. Getting sufficient yields from suppliers – panels that live up to the Cupertino firm's display quality demands, in quantities and at prices that make widespread deployment practical – saw big names stumble. That those displays also had to integrate with 3D Touch is unlikely to have made the process any simpler.

Interestingly, the same "force" data that iOS offers to app developers with 3D Touch is supplied another way, too. Apple was slow to embrace the stylus, but the Apple Pencil delivers its pressure data through the same mechanism, only on the iPad Pro rather than the iPhone. A pen arguably makes a lot more sense for that, too: it's much easier to modulate pressure across multiple levels with a stylus when, for example, sketching or painting, than it is with a fingertip.
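
In code, the two look nearly identical: a Pencil touch shows up as a UITouch of type .pencil, and its pressure is read from the same force property 3D Touch uses. A rough sketch, with the pressure-to-stroke-width mapping purely illustrative:

```swift
import UIKit

// Sketch of how Apple Pencil pressure arrives through the same UITouch.force
// property as 3D Touch; strokeWidth here is an illustrative mapping only.
class SketchView: UIView {

    private(set) var strokeWidth: CGFloat = 1.0

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesMoved(touches, with: event)
        guard let touch = touches.first, touch.type == .pencil,
              touch.maximumPossibleForce > 0 else { return }

        // Same force API as a 3D Touch finger press, just reported by the Pencil.
        let pressure = touch.force / touch.maximumPossibleForce
        strokeWidth = 1.0 + pressure * 9.0 // map 0–1 pressure onto a 1–10pt stroke
        setNeedsDisplay()
    }
}
```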

Apple Pencil support for iPhone is one of those perennial rumors yet to pan out in practice. For 3D Touch, though, it seems the writing is on the wall. The iPhone XR may not be the cheapest handset in Apple's smartphone range, but it's likely to be the most successful, given its balance of 2018 design language and relative affordability compared to the iPhone Xs and iPhone Xs Max. That seems a fitting first step to putting 3D Touch out to pasture.