USAToday Column

January 19, 2019
Autonomous cars? Not yet. Digital cockpits, assisted driving the latest auto tech focus

By Bob O'Donnell

FOSTER CITY, CALIF. – In all the hoopla surrounding fully autonomous cars, it’s easy to forget that, for the foreseeable future, most of the automobiles we find ourselves in are going to be driven by people. Yes, real people, responsible for getting their vehicle from point A to point B.
Car companies and the tech companies they increasingly work with seem to acknowledge this reality. The focus has clearly shifted away from unrealistic expectations of fully autonomous cars being available to purchase in the near future, and toward valuable, practical, and safety-focused enhancements to the driving experience.

Earlier this month at the Consumer Electronics Show in Las Vegas, several vendors demonstrated “digital cockpits” that dramatically improve the displays and controls with which drivers interact. Additionally, chipmakers Nvidia and Intel discussed advances in assisted driving platforms, commonly known as ADAS (Advanced Driver Assistance Systems).
Though these two developments may seem distinct, they are actually closely related. The enhanced information available to drivers via digital cockpits is designed to provide more insights into how the car is functioning and how – or why – certain driving assistance technologies are engaging.

The idea is that by better understanding how and when the technologies start to do their work, people will be able to trust them and the driving experience in these new cars. Given the concerns and fears that have been raised due to early autonomous car experiments, it’s an extremely important goal.

Digital Cockpits
Though Tesla initially made a huge splash in the automotive segment for the enormous 17-inch display in its Model S and Model X cars, the new digital cockpit designs on display at CES went even further by extending the experiences across several screens that surround the driver and passengers.

One of the more interesting designs came from Samsung, which purchased automotive supplier Harman back in November 2016 and leveraged both companies' technologies in its digital cockpit concept. In addition to multiple screens across the entire dashboard, the system offered individual voice recognition via Samsung's Bixby personal assistant, which let each driver control key car functions and adjust personal settings by voice. The cockpit also used computer vision to detect things like driver drowsiness and to enable features such as camera-based mirror replacement.

Qualcomm announced three different versions of its third-generation cockpit experience, driven by its automotive-grade Snapdragon 820A processors. The Performance, Premiere, and Paramount solutions – cleverly targeted at entry-level, mid-tier, and high-end cars, respectively – all feature AI-enabled capabilities such as voice recognition. In addition, the company's cockpit concept featured driver and passenger personalization options, as well as an enormous dashboard-length display segmented into multiple groupings. Leveraging its expertise in wireless communication, Qualcomm also announced support for its C-V2X (cellular vehicle-to-everything) technology, which enables appropriately equipped vehicles to communicate with each other, as well as with things like stoplights and other transportation infrastructure. The company also announced a partnership with Ford to bring the technology to market in 2022.

One of the more surprising, and complex, digital cockpit announcements came from BlackBerry – yes, the former phone company, now a player in automotive and cybersecurity software. It showed a platform leveraging its QNX operating system – important, though little known to consumers – which sits at the heart of more than 120 million car infotainment systems currently on the road. The new QNX Platform for Digital Cockpits combines BlackBerry's ISO 26262 safety-certified hypervisor with separate modules for instrument clusters and infotainment systems.

It also allows Android applications to run securely in cars. Though it's a complicated software story, the bottom line is that the company is enabling automakers like Volvo and Jaguar Land Rover to build highly secure, highly informative, and highly efficient cockpit systems that work across multiple chip architectures.

Mercedes-Benz exhibited its new MBUX cockpit design, which it built in conjunction with graphics chip specialist Nvidia. Unlike the other conceptual cockpit announcements, the MBUX system will be available in the company's A-Class sedan, which it formally introduced at CES with a starting price of $32,500. Like some of the other systems, MBUX supports multiple high-resolution displays and responds to voice commands – in this case, "Hey, Mercedes."

Assisted Driving
Nvidia also made an important announcement in the field of assisted driving with the debut of its Drive AutoPilot platform, which it claims is the first commercially available Level 2+ system. (The National Highway Traffic Safety Administration, or NHTSA, uses the SAE's scale of driving automation, which runs from Level 0, no automation, to Level 5, full automation.) The new platform is based on the company's Xavier chip designed for automotive applications, along with its Drive AV software for sensing the environment outside the vehicle and its Drive IX software for interior-focused applications like driver monitoring.

What the Level 2/2+ rating means is that the system provides a number of tech-driven safety enhancements around braking, accelerating, and steering, but it requires drivers to keep their hands on the wheel and stay in control of the driving process. All told, carmakers that use the technology (starting in 2020) will be able to provide a safer, more enjoyable driving experience without having to worry about the concerns around fully autonomous driving.

The CEO of Intel's Mobileye subsidiary talked about leveraging some of its RSS (Responsibility-Sensitive Safety) autonomous driving technology – originally designed to support full Level 5 autonomy – for Level 2-type implementations. Again, this reflects the more realistic perspective that the auto industry and its tech partners now have on autonomy, and it shows how Intel is adapting its offerings to meet these more realistic needs.

While there’s no doubt that we’ll continue to see plenty of advancements in the world of autonomous vehicles, particularly for applications like robo-taxis or delivery services, it’s also very clear that automotive tech advancements are now focused on improving the quality, safety, and enjoyment of the traditional driving experience.

That may not be as sexy as fully autonomous cars, but it’s something we can all look forward to.


USA TODAY columnist Bob O'Donnell is the president and chief analyst of TECHnalysis Research, a market research and consulting firm that provides strategic consulting and market research services to the technology industry and professional financial community. His clients are major technology firms including Microsoft, HP, Dell, and Intel. You can follow him on Twitter @bobodtech.