Sony is an unexpected victim of its own success

Sony smartphones may not be flying off the shelves, but even if you bought a recent iPhone or one of many Android phones instead, you're still a Sony customer – and you're contributing to its production headache. While Sony Xperia handsets remain niche, despite the company's best efforts to pitch unlocked and SIM-free devices, its camera sensor business is struggling to keep up with its own success.

Samsung often gets the lion's share of attention when it comes to dominating the supply chain, what with Samsung Display churning out panels, its semiconductor arm delivering memory, and its battery division handling power. However, Sony has a similar success story, with its semiconductor unit responsible for wildly popular camera chips.

Although most phone-makers have their own branding for cameras, not to mention their own photography boasts, many dip into the same camera chip supply. Sony is one of the most successful, providing the CMOS sensor for Apple's iPhone 11 Pro, for example. While individual devices layer on their own software and tweaks, the core sensor comes from Sony's production lines.

As the number of cameras packed into a single device has risen, keeping up has become a challenge for Sony. Indeed, the company plans to run its production lines constantly throughout the holidays, division chief Terushi Shimizu confirmed to Bloomberg. Even then, meeting demand is apparently difficult.

"We are having to apologize to customers because we just can't make enough," Shimizu says. Sony Semiconductor is now only second to PlayStation in the company's various businesses when it comes to profit, and of that 86-percent of revenue comes from image sensors.

Unsurprisingly, then, reinvestment is on the roadmap. Sony plans to open a new production facility in Nagasaki, Japan, though that won't be ready to manufacture sensors until April 2021, the company says. Before then, capital spending is more than doubling this fiscal year.

That'll go not only on maximizing production but developing new sensor types, as Sony tries to keep ahead of the curve of what its phone-making customers want to offer their users. Time-of-Flight, or ToF, is one area of interest, where sensors can accurately create depth maps of an area by rapidly bouncing laser light off the scene and timing its return.
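The underlying math is straightforward. As a simplified illustration (hypothetical code, not Sony's actual sensor pipeline), the depth measurement comes down to converting each pixel's measured round-trip time into a distance:

```python
# Simplified sketch of the time-of-flight principle (illustrative only):
# the sensor times how long emitted laser light takes to bounce off an
# object and return, then converts that round trip into a depth reading.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second


def tof_depth_meters(round_trip_seconds: float) -> float:
    # Light travels out to the object and back, so the one-way depth is
    # half the total path: depth = c * t / 2.
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0


# A return after roughly 13.3 nanoseconds puts the object about 2 meters away;
# repeating this measurement per pixel builds up a depth map of the scene.
print(f"{tof_depth_meters(13.3e-9):.2f} m")
```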

It's a technology likely to be instrumental as augmented reality gains traction and app-makers look to blend digital graphics into real-world scenes. Apple is believed to be readying new iPhone models with ToF capabilities for release later in 2020. Beyond that, the same sensor technology could be applied to augmented reality eyewear, such as the much-rumored Apple smart glasses.

For a while, it seemed like the megapixel race had stalled: smartphone cameras reached a certain peak, and the hunt to squeeze in more and more pixels was sidelined in favor of alternative lenses, zooms, and computational photography. That momentum has built up again over the past year or so, with Samsung rumored to be including a 108-megapixel camera in the upcoming Galaxy S11 flagship. At the same time, though, accommodating multiple sensors in a single phone presents clear packaging headaches, limiting the ability to boost resolution simply by using a larger overall sensor.

The solution may well involve artificial intelligence. Sony announced last month that it was forming a new global division, Sony AI, which would explore how machine learning, neural networks, and other technologies could be used to improve different areas of its business. One of the three key divisions expected to benefit is sensors & imaging.