Nvidia builds a co-pilot into its autonomous drive computer

January 4, 2017

Nvidia teased a powerful system last year that would serve as an AI supercomputer, giving cars the ability to drive themselves. But while there has been plenty of progress toward safer autonomous driving, we may not yet be at the point where drivers can fully trust the car to do everything.

Enter Co-Pilot. Nvidia is baking new technology into its Xavier drive computers that acts as an AI assistant to drivers when they are the ones doing the driving. It pulls in data from sensors inside and outside the car, whether through cameras or microphones, and builds a profile of what's going on in and around the vehicle. It then alerts the driver, through voice or potentially other means, when they need to react to something.

“This car has incredible perception capability if only the AI was running all the time,” Nvidia CEO Jen-Hsun Huang said. “We believe the AI is either driving you or looking out for you. When it’s not driving you, it is still completely engaged. Even though it doesn’t have the confidence to drive — maybe the mapping has changed, or there are too many pedestrians. It should still be completely aware, and it should continue to look out for you.”

For now, this still seems more like a curiosity or a research project. Nvidia would still have to partner with a manufacturer to build a car that has the system built in and carries all the sensors necessary to support it. So we won't know exactly how things will go when and if the technology is widely adopted, and it won't come easy: the head of the Toyota Research Institute, Gill Pratt, has said we're "nowhere close" to fully autonomous driving.

The company is partnering with ZF, a commercial supplier to the trucking industry in Europe. Bosch, another automotive supplier, will also adopt the Nvidia Drive computer, Huang said, and Nvidia is working with Audi to build cars powered by its Drive computer. (More on that coming soon.)

One example, shown on stage at CES where the announcement was made, is a biker in a nearby lane. If a driver isn't fully paying attention, a quick notification can help ensure they don't turn in front of a biker they may not have seen. The technology can also observe the driver's face, seeing where he or she is looking or potentially what kind of facial expression they have.

“The AI should pay attention. Maybe the AI notices you’re a little aggravated and probably should pull over,” Huang said. “Those capabilities, modern AI networks can absolutely do.”

That requires a lot of computing oomph, which is where Nvidia has thrown its resources in building GPUs. The company has found itself in a good position because much of AI processing demands the kind of compute capability that GPUs offer rather than conventional processors. That helped Nvidia to a huge run last year, with its stock price roughly tripling.
