Eyes on the Road: Deep Learning and Real-time Crowdsourcing at Mobileye

Mobileye uses deep learning and crowd-sourcing to sense, map, and navigate the road. New ownership under Intel can help provide the processing and connectivity needed to take this technology to the next level, but with data being the driver of success, how can Mobileye build a broad network across automakers in a competitive market?

 

Human error is responsible for over 90% of automotive accidents, causing 1.5 million deaths, 50 million injuries, and $600 billion in property damage each year [1]. Advanced Driver-Assistance Systems (ADAS) help reduce human error by automating and enhancing vehicle systems, including adaptive cruise control, anti-lock braking, and collision avoidance.

In March 2017, Intel announced the acquisition of Mobileye, a leader in vision-based ADAS, for ~$15 billion [2], hoping to pair its computing and connectivity capabilities with Mobileye’s expertise in computer vision to deliver autonomous driving solutions across a network of vehicles.

Mobileye’s technology identifies vehicles, pedestrians, and other objects, and reads roadway markings, traffic signs, and traffic lights (sensing). Using this input, its software builds a real-time, 360-degree environmental model of the surrounding area (mapping), which then feeds decision-making algorithms that avoid collisions and safely navigate the road (driving policy). Mobileye has partnered with 27 automakers, including BMW, Ford, General Motors, Nissan, Volvo, Audi, and Hyundai, and its ADAS vision safety technology is currently installed on over 25 million vehicles [3].

Autonomous driving requires a host vehicle to navigate complex situations while accounting for the movement of other vehicles, unexpected behavior from other drivers or pedestrians, unpredictable weather, and a myriad of other circumstances that require split-second judgments. Mobileye uses artificial intelligence to train vehicles to navigate these situations, based on environmental models and maps created from the data collected by its cameras.

Mobileye uses deep learning, a subset of machine learning built on layered neural networks loosely inspired by the human brain, to recognize complex patterns in this data and react accordingly. Deep learning goes beyond the pattern recognition typical of classical machine learning: rather than being explicitly taught, the system derives its own decision rules from large amounts of data, figuring out how to perform the task itself. Mobileye’s sensing and driving-policy (negotiating behavior) algorithms combine two deep learning techniques, supervised learning (training on pre-labeled data sets to find correlations) and reinforcement learning (training through rewards and punishments), so the AI can process live sensor input alongside historical data and make decisions informed by past learning and by simulated hypotheses about how drivers will react to potential actions [4].
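To make the reinforcement-learning idea concrete, here is a toy sketch (illustrative only; the states, rewards, and code are hypothetical and bear no relation to Mobileye’s actual algorithms or data). A tabular Q-learning agent learns to keep a car centered among five discretized lane positions, earning a reward for the centered state and penalties for drifting toward the edges:

```python
import random

# Toy reinforcement-learning sketch: an agent learns to keep a car
# centered among 5 discretized lateral lane positions. Position 2 is
# the lane center; drifting toward 0 or 4 is penalized.
STATES = 5            # discretized lateral positions
ACTIONS = [-1, 0, 1]  # steer left, hold, steer right
CENTER = 2

def reward(state):
    # +1 for being centered, increasing penalty for drifting away
    return 1.0 if state == CENTER else -abs(state - CENTER)

def step(state, action):
    # lateral position after steering, clamped to the road
    return min(max(state + action, 0), STATES - 1)

random.seed(0)
q = {(s, a): 0.0 for s in range(STATES) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.1

for episode in range(500):
    s = random.randrange(STATES)
    for _ in range(20):
        # epsilon-greedy: mostly exploit the best-known action,
        # occasionally explore a random one
        if random.random() < epsilon:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: q[(s, act)])
        s2 = step(s, a)
        # Q-learning update: immediate reward plus discounted
        # best achievable value from the next state
        q[(s, a)] += alpha * (
            reward(s2) + gamma * max(q[(s2, a2)] for a2 in ACTIONS) - q[(s, a)]
        )
        s = s2

# The learned policy steers back toward the lane center from any position
policy = {s: max(ACTIONS, key=lambda act: q[(s, act)]) for s in range(STATES)}
print(policy)
```

Real driving-policy networks replace this tiny lookup table with deep neural networks over continuous sensor input, but the core loop is the same: act, observe a reward or punishment, and update the value estimates.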

The process described above relies on the decision-making power of a single vehicle, but Mobileye also uses the internet to share location-based data across a network of connected vehicles. In a world of fully autonomous vehicles, this network would allow driverless cars to communicate and coordinate their actions, negotiating movement to navigate the road together safely and efficiently. Today, Mobileye uses its mapping technology to crowdsource data across its 25-million-vehicle install base [4]. These vehicles accumulate images and data over millions of miles, building a map of the drivable path on each lane of every road traveled.
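The crowdsourcing step can be pictured as a simple fusion problem: many cars upload sparse, noisy observations of the same road segment, and a server combines them into one robust estimate. The sketch below is hypothetical (the segment IDs, schema, and use of a median are illustrative assumptions, not Mobileye’s actual protocol):

```python
from collections import defaultdict
from statistics import median

# Hypothetical crowdsourced-map fusion: each vehicle reports a
# (segment_id, lateral_offset_m) observation of a lane marking;
# the server fuses many noisy reports into one estimate per segment.

def fuse_observations(reports):
    """Return a per-segment median lateral offset (meters).

    Using the median rather than the mean keeps a single bad
    report (sensor glitch, occlusion) from skewing the map.
    """
    by_segment = defaultdict(list)
    for segment_id, offset in reports:
        by_segment[segment_id].append(offset)
    return {seg: round(median(vals), 2) for seg, vals in by_segment.items()}

reports = [
    ("A1", 1.48), ("A1", 1.52), ("A1", 1.50), ("A1", 3.90),  # one outlier
    ("B7", 0.75), ("B7", 0.73),
]
print(fuse_observations(reports))
```

The more vehicles contribute, the tighter the estimate for each segment becomes, which is why install-base scale matters so much to map quality.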

Deep learning algorithms require substantial processing power, and an Internet of Things (IoT)-enabled fleet of vehicles requires strong networking capabilities, making the business a natural fit for Intel’s strategy of entering data-intensive markets. With the acquisition of Mobileye, Intel is placing a long-term bet on the fast-growing market for highly and fully autonomous vehicles, estimated at ~$70 billion by 2030 [1]. ADAS is one of the fastest-growing fields within automotive electronics, driven by public awareness and acceptance of active safety technologies and by regulators advocating new ADAS functions that manufacturers must adopt to maintain safety ratings [5].

Mobileye and Intel have partnered with BMW to develop production-ready fully autonomous vehicles launching in 2021, and with Delphi to deliver a “turnkey” solution, launching in 2019, that can be applied across auto manufacturers. Based on their current technological advantage, Mobileye and Intel hope to increase market penetration in newer vehicle vintages to build scale. The network effect of new installations, and the incremental millions of miles of data they generate, is critical to improving the software through deep learning and the common map through crowdsourcing. Given time and data, these proprietary algorithms (especially those governing driving policy between assisted vehicles) may pave the way for fully autonomous vehicles.

The difficulty is ensuring that Mobileye positions itself for the widest reach possible, a challenge complicated by the entrance of well-funded competitors like Alphabet and Uber into the autonomous driving race. To make matters worse, GM, Ford, and other prominent automotive manufacturers are each developing their own self-driving cars [6]. By partnering with BMW over larger customers like GM, Intel and Mobileye may miss opportunities to mass-market their platform. Is it better to stay neutral and risk missing the momentum of being first to market, or to commit to a camp and risk picking the wrong one?

(800 words)

 

Endnotes

  1. “Future of Mobility.” Mobileye, www.mobileye.com/future-of-mobility/.
  2. “Intel Acquisition of Mobileye.” Intel Corporation, 2017, intelandmobileye.transactionannouncement.com/.
  3. “About.” Mobileye, https://www.mobileye.com/about/.
  4. “Advanced Technologies.” Mobileye, https://www.mobileye.com/future-of-mobility/mobileye-advanced-technologies/.
  5. Choi, Seunghyuk, et al. “Advanced Driver-Assistance Systems: Challenges and Opportunities Ahead.” McKinsey & Company, 2016, mckinsey.com/industries/semiconductors/our-insights/advanced-driver-assistance-systems-challenges-and-opportunities-ahead.
  6. Colias, Mike. “GM Aims for Self-Driving Taxi Fleet by 2019.” The Wall Street Journal, Dow Jones & Company, 30 Nov. 2017, www.wsj.com/articles/gm-aims-for-self-driving-taxi-fleet-by-2019-1512085260?ns=prod%2Faccounts-wsj.
