The Fusion Project aims to optimize data collection from vehicles
Airbiquity, Cloudera, NXP Semiconductors, Teraki, and Wind River today launched the Fusion Project, which promises a more efficient way to collect the data required to train AI models for autonomous vehicles.
The goal is to compress the data collected from autonomous vehicles to the point where it becomes possible to update the AI models employed in an autonomous vehicle faster. Today, autonomous vehicles rely on inference engines based on AI models trained in the cloud. The automotive industry is a long way from being able to train AI models in real time on the vehicle itself. In the meantime, the members of the Fusion Project are committing to making it easier to collect data by compressing data on the vehicles before it is transferred back to AI models residing in the cloud.
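The project's core idea, as described above, is to compress sensor data on the vehicle before uploading it to the cloud. The sketch below is purely illustrative and assumes nothing about the partners' actual software: it serializes a hypothetical batch of telemetry frames and compresses it with a general-purpose codec to show the bandwidth trade-off involved.

```python
import json
import zlib

def compress_telemetry(frames):
    """Serialize a batch of sensor readings and compress the payload
    before transmission, trading on-vehicle CPU time for bandwidth."""
    raw = json.dumps(frames).encode("utf-8")
    packed = zlib.compress(raw, level=9)
    return raw, packed

# Hypothetical telemetry: repeated lane-keeping sensor frames,
# which compress well because consecutive readings are similar.
frames = [{"speed_kph": 88, "lane_offset_m": 0.12, "steer_deg": -1.5}] * 500
raw, packed = compress_telemetry(frames)
print(len(packed) < len(raw))  # compressed payload is smaller
```

Real automotive pipelines would likely use domain-specific reduction (downsampling, feature extraction, selective upload) rather than generic compression, but the economics are the same: less data per vehicle means lower transfer and storage costs in the cloud.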
Those data compression techniques will eventually be applied to other forms of transportation such as trains and planes, said David LeGrand, senior industry and solutions marketing manager for manufacturing and retail at Cloudera.
The members of the Fusion Project are pledging to develop an integrated embedded system for collecting compressed data from vehicles that can be fed back to a cloud platform. That capability will substantially reduce the cost of collecting data from what one day might be millions of vehicles, noted LeGrand.
In addition to compressing the collected data using software developed by Cloudera, the members of the Fusion Project will use over-the-air software management technology from Airbiquity to update the inference engines installed in a vehicle.
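An over-the-air model update conceptually follows a check-download-verify-swap loop. The sketch below is a generic illustration, not Airbiquity's actual API; the manifest fields, URL, and `download` callback are all hypothetical.

```python
import hashlib

def apply_ota_update(local_version, manifest, download):
    """Install a newer inference model only when the cloud manifest
    advertises a higher version and the downloaded bytes match the
    published checksum. Returns (version, model_bytes_or_None)."""
    if manifest["version"] <= local_version:
        return local_version, None  # already up to date
    blob = download(manifest["url"])
    if hashlib.sha256(blob).hexdigest() != manifest["sha256"]:
        raise ValueError("checksum mismatch; refusing to install")
    return manifest["version"], blob

# Hypothetical manifest published by the cloud platform.
model_bytes = b"lane-change-model-v2"
manifest = {
    "version": 2,
    "url": "https://example.com/models/lane-change-v2.bin",
    "sha256": hashlib.sha256(model_bytes).hexdigest(),
}
version, blob = apply_ota_update(1, manifest, lambda url: model_bytes)
print(version)  # 2
```

Production OTA systems add cryptographic signing, staged rollouts, and rollback on failure, but the integrity check before swapping models is the essential safety step.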
NXP, meanwhile, will provide the vehicle processing platforms, while Teraki provides the AI software that will be deployed at the edge. Finally, Wind River will provide the embedded system software.
Initially, the Fusion Project will focus its efforts on improving autonomous vehicles' ability to recognize when to change lanes optimally, based on data gathered by vision AI engines installed in the vehicle, said LeGrand. The first tests of vehicles embedded with Fusion Project technologies will take place in Europe, he added.
The immediate goal is not to eliminate the need for drivers, but to take the alert systems most vehicles already have to the next level by training AI models on the data about the actual driving experience being collected by vehicles, noted LeGrand. "It's not going to be fully autonomous," said LeGrand. "It's more like a driver-assist system."
There are, of course, fully autonomous vehicles that can follow a highly prescribed set of programming instructions to get from one point to another. The challenge is that the level of responsiveness required to navigate traffic that includes vehicles driven by humans, who are likely to make unpredictable decisions, remains elusive.
There may eventually come a day when AI models embedded within a vehicle could be trained and updated in real time. Today, achieving that goal would require the equivalent of a server based on a graphics processing unit (GPU) to be installed in the trunk of every vehicle. Naturally, that would make autonomous vehicles prohibitively expensive.
In the meantime, the process of transferring data between inference engines and the AI models on which they are based will continue to become more efficient. The AI model might not make it all the way out to the vehicle itself, but it will become more feasible to deploy AI models at the network edge. The challenge, of course, is finding a way to achieve that goal in a way that is economically viable for automotive manufacturers.