Ford taps Intel's Mobileye for smarter car vision and crowdsourced cleverness

Ford and Intel have inked a deal to give future cars better vision, with Mobileye vision-sensing tech powering the automaker's driver assistance systems. Mobileye EyeQ, both hardware and software, will be used for systems like Ford Co-Pilot360, and will eventually form part of the technology behind Ford Active Drive Assist in 2021.

It's a big win for Intel, which has been positioning Mobileye as its major play in ADAS tech for some time now. The chipmaker acquired Mobileye in early 2017, in a deal worth over $15 billion. However, while much of the focus around tomorrow's vehicles has been on fully-autonomous cars, the reality is that the market right now is still centered on driver-assistance, not driver-replacement.

So-called Level 2 vehicles, and the first hints of Level 3 models to come, still expect to have human drivers behind the wheel. The ADAS tech, however, is able to better support them, with features like adaptive cruise control and lane-centering. Next year, meanwhile, Ford will bring its Active Drive Assist to market, a hands-off system in which vehicles like the Mustang Mach-E will be able to keep pace with highway traffic, and stay in lane, without the driver's hands being on the wheel. Cameras will ensure the driver is still paying attention to the road, however.

Even before that, EyeQ chips and software will be used for features more broadly available across Ford's line-up. A camera integrated into the windshield will be used to identify and track things like lane markings and traffic signs, as well as identifying other vehicles and pedestrians. It'll be instrumental for features such as lane-keeping, pre-collision assistance, and automatic emergency braking.

"Ford will take advantage of Mobileye's technology throughout the life of its next-generation production vehicles," the automaker says, "including F-150 and Mustang Mach-E, as well as future products that offer advanced driver-assistance systems features."

This isn't the first time Ford and Mobileye have worked together. The difference this time around, the automaker says, is that it's now committing to EyeQ "for the entire lifecycle of its next-generation vehicles." Given a vehicle lifecycle can easily be 4+ years, that's a big win for Intel.

Ford and Mobileye will be working with the car company's Tier 1 suppliers to better integrate the EyeQ tech for mass production. Meanwhile, Ford is also considering whether to use Mobileye's Roadbook technology in the future.

Roadbook effectively turns vehicles into crowdsourced road-traffic and mapping probes. The vehicle cameras feed back anonymized data that's used to build and maintain a high-definition map; it's a system we've seen HERE promote too, and indeed the two companies signed a collaboration deal in 2016 that could see them share that data. That navigation info can be used by driver-assistance and Active Drive Assist technologies, so that they're better-equipped to handle changing road conditions.

One of the issues with highway assistance systems, for example, can be basing the technology on a pre-assessed map: if the roadway changes, the car can be left ill-equipped to deal with the alterations. Rather than automakers or mapping-providers going out to physically re-survey each road, which is both time-consuming and inefficient, Roadbook could use crowdsourced data from a fleet of consumer cars to warn the system that a highway has changed. That data could either be integrated into the core map, or it could trigger a professional re-surveying, or both.
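The basic idea behind that crowdsourced trigger can be sketched in a few lines. This is purely an illustrative assumption, not Mobileye's actual Roadbook algorithm: segment IDs, the report format, and the corroboration threshold are all hypothetical.

```python
from collections import Counter

# Hypothetical sketch (not Mobileye's published method): flag a road
# segment for a map update once enough independent vehicles report a
# discrepancy between what their cameras see and the stored HD map.
REPORT_THRESHOLD = 3  # assumed number of corroborating reports

def segments_to_update(reports, threshold=REPORT_THRESHOLD):
    """reports: iterable of (segment_id, observed_change) tuples sent
    anonymously by consumer vehicles. Returns the segment ids whose
    discrepancy count meets the threshold."""
    counts = Counter(segment for segment, _ in reports)
    return {segment for segment, n in counts.items() if n >= threshold}

# Three cars report new lane markings on one segment; a single car
# reports a change elsewhere, which could just be sensor noise.
reports = [
    ("I-94:12", "lane_markings_changed"),
    ("I-94:12", "lane_markings_changed"),
    ("I-94:12", "lane_markings_changed"),
    ("M-10:4", "sign_missing"),
]
print(segments_to_update(reports))  # {'I-94:12'}
```

Requiring multiple corroborating reports before touching the map is the key design choice here: it filters out one-off camera errors while still catching genuine road changes quickly across a large fleet.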

It's not the first time we've seen Ford talk about crowdsourced data in recent months. In June it revealed a way to better estimate range for the all-electric Mustang Mach-E, with Intelligent Range using data shared by other EVs in similar road conditions to better gauge how much driving can be done on the battery charge that remains. The same technology is likely to feature in the all-electric F-150 EV, which Ford says should arrive sometime in the next two years.
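One plausible shape for that kind of estimate, purely as an assumption since Ford hasn't published Intelligent Range's actual model, is to blend the vehicle's own recent energy use with the average rate reported by other EVs in similar conditions, then divide the remaining battery energy by that blended rate. The function name, units, and weighting below are all hypothetical.

```python
# Hypothetical sketch of crowdsourced range estimation (not Ford's
# actual Intelligent Range model): blend the car's own consumption
# history with the fleet-reported rate for similar conditions.
def estimate_range_km(remaining_kwh, own_kwh_per_km, fleet_kwh_per_km,
                      fleet_weight=0.5):
    """Remaining range in km, using a weighted average of the vehicle's
    own kWh/km and the crowdsourced fleet kWh/km for this route."""
    blended = (1 - fleet_weight) * own_kwh_per_km + fleet_weight * fleet_kwh_per_km
    return remaining_kwh / blended

# 50 kWh left; the car's history says 0.20 kWh/km, but the fleet reports
# 0.25 kWh/km on this cold, hilly stretch -> blended rate of 0.225 kWh/km.
print(round(estimate_range_km(50, 0.20, 0.25), 1))  # 222.2
```

The appeal of folding in fleet data is that it captures conditions the car itself hasn't seen yet, such as terrain or weather further along the route, rather than extrapolating purely from its own recent driving.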