TECHnalysis Research Guest Columns
TECHnalysis Research founder Bob O'Donnell writes intermittent guest columns in various publications, including FastCompany, engadget and re/code. Those columns are reprinted here.

February 17, 2020
How the race to autonomous cars got sidetracked by human nature

The National Highway Traffic Safety Administration’s terminology for self-driving vehicles has sent the auto and tech industries on a wild-goose chase.

If you believed the early hype about autonomous cars, we’d all be buying them by now, or at least enjoying the view from autonomously driven ride-sharing vehicles. Needless to say, that hasn’t happened. In fact, the day that becomes a reality seems to get pushed further out every day.

In one sense, it’s easy to understand why the delay occurred. Overenthusiastic tech industry titans were eager to make an impact on a large industry that seemed ripe for disruption. Automakers, frightened of potential tech competitors, were so engulfed by FOMO that they found themselves caught in an environment that didn’t just enable unrealistic time frames but encouraged them. This started a cycle of overpromising and underdelivering that continues today.

Both automakers and tech industry suppliers have pushed back their timelines for delivering real-world products. The challenge of achieving full autonomy turned out to be a significantly harder problem than many people initially acknowledged, and the grim reality of several autonomous vehicle-related deaths didn’t help matters.

In addition to these more overt problems, there’s another underlying and lesser-known one: the problem of defining the evolutionary stages of autonomous driving technology. The impact of that challenge may last even longer.

The Society of Automotive Engineers (SAE) and the US National Highway Traffic Safety Administration (NHTSA) created definitions for six different levels of autonomous driving—commonly called Levels 0-5. (Here’s a link to the NHTSA page with complete descriptions.) The goal, seemingly, was to provide a logical, technological progression made up of what appeared to be concrete, achievable steps that would take the automotive industry into its autonomous future.

The levels range from Level 0, or “No Automation,” in which there is absolutely no machine-based control of any kind (not even cruise-control type features), up to Level 5, or “Full Automation,” in which the vehicle is able to function on its own in all types of environments (though the driver may have the option to take control). In between, Level 1 is Driver Assistance, with basic automation like cruise control. Level 2 is Partial Automation, including things like automatic braking. Level 3 is Conditional Automation, in which the car drives itself, but requires the driver to be ready to take over at any time. And Level 4 is High Automation, where the car can handle all driving tasks in most (though not all) environments.
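The six levels described above can be sketched as a simple enumeration. This is purely an illustrative encoding of the SAE/NHTSA taxonomy for clarity; the class and function names are invented for this example and are not any official API:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """Illustrative encoding of the six SAE/NHTSA automation levels."""
    NO_AUTOMATION = 0           # no machine-based control of any kind
    DRIVER_ASSISTANCE = 1       # basic aids, e.g. cruise control
    PARTIAL_AUTOMATION = 2      # e.g. automatic braking; driver still in control
    CONDITIONAL_AUTOMATION = 3  # car drives itself; driver must stay ready
    HIGH_AUTOMATION = 4         # handles most, though not all, environments
    FULL_AUTOMATION = 5         # operates on its own in all environments

def driver_must_stay_vigilant(level: SAELevel) -> bool:
    """Captures the column's point: every level short of full
    autonomy still depends on a human ready to take over."""
    return level < SAELevel.FULL_AUTOMATION
```

Note that the vigilance check returns True for every level from 0 through 4, which is exactly the grey zone the rest of the column argues is the problem.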


Automakers and their tech suppliers can follow these levels like a roadmap, progressing from one step to the next and adding greater amounts of automated control along the way. From a pure technology perspective, it makes perfect sense.

The problem is that this system doesn’t factor in human nature. This is particularly true in the middle steps along the path—Levels 2, 3 and 4—which is exactly where most autonomous driving efforts are today.

Developers tend to assume that people can “partially” pay attention while using a “semiautonomous” car. Levels 2-4 describe certain amounts of autonomy that the vehicles can achieve, but always with the caveat that the driver needs to be ready to take control.

But people don’t behave that way in the car. They either pay attention, or they don’t. In fact, test drivers of various autonomous vehicles have told me that it’s actually harder for a human to remain ready to take control than it is to fully control the vehicle. On top of that, studies have shown that even in controlled test environments, human reaction time for taking over from an autonomous system is often too slow to avoid potential safety issues. People get lulled into thinking that their vehicle is safely under autonomous control, and then often fail to take back control quickly enough.

If the point of autonomous vehicles is to make our lives easier (as well as safer), that seems like a pretty big failure.

Unfortunately, some of the fatal accidents involving autonomous vehicles are directly related to this false sense of security. The death caused by a self-driving Uber car in Arizona in March of 2018, for example, was due in large part to the fact that the safety driver wasn’t paying attention when a pedestrian suddenly stepped into the street. In addition, Tesla owners using the cars’ AutoPilot feature have found themselves in similar situations and have paid for it, not with their lives but through scary near-accidents.

In fact, the Tesla AutoPilot has become a bit of a lightning rod for many autonomous vehicle fans and detractors on several different levels. The problem is that the very name implies that drivers do not have to be in charge (and has encouraged some rather outrageous examples of people testing the capabilities of the system). U.S. Senator Edward Markey recently publicly rebuked Tesla and demanded that it “rebrand and remarket” the feature because of its potentially misleading name.

Tesla’s AutoPilot has also triggered debates about which driving level it represents. Some call it a Level 3 system, while Tesla still officially deems it to be Level 2. Others in the industry suggest it fits somewhere in between. In fact, many companies have recently been describing a new Level “2+” standard that lies between Levels 2 and 3 (though there is no official definition or standardization of what a vehicle should be able to do at this level).

Though that may be an interesting discussion, it sets the stage for development decisions driven by semantics and technical minutiae, rather than by human behavior in the car.

All the variations on Levels 2, 2+, and 3 require the driver to remain constantly vigilant in case the autonomous driving system requires intervention, and that simply isn’t a viable or sustainable long-term option given typical human behavior. The grey area between remaining in control and ceding control to the car is ripe for safety problems. In fact, we may see more accidents involving semiautonomous vehicles than either human-controlled or fully autonomous cars.

It makes much more sense to simplify any kind of driving level classification to two simple options: assisted or autonomous—an idea that some automotive industry suppliers were discussing at the latest CES convention.

Assisted driving (commonly referred to as ADAS—Advanced Driver-Assistance System) would incorporate any type of capability that assists the driver and improves the driving experience, including things like smart cruise control, veer warnings, and automatic braking. It would include types of automation that still require the driver to be in control. Autonomous, on the other hand, would mean full autonomy, or the same as the current Level 5.
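The two-category scheme the column proposes collapses the levels above into a single binary question. The sketch below is only an illustration of that idea; the feature names and groupings are assumptions made up for this example:

```python
# Illustrative sketch of the proposed two-category classification:
# anything short of full autonomy is "assisted" (ADAS), and only
# full autonomy (today's Level 5) counts as "autonomous".
ASSISTED = "assisted"      # ADAS: the driver remains in control
AUTONOMOUS = "autonomous"  # full autonomy, no driver vigilance required

# Hypothetical feature names for illustration only.
FEATURE_CATEGORY = {
    "smart_cruise_control": ASSISTED,
    "veer_warning": ASSISTED,
    "automatic_braking": ASSISTED,
    "full_self_operation": AUTONOMOUS,
}

def classify(feature: str) -> str:
    """Classify a feature, defaulting conservatively to 'assisted'
    so unknown capabilities never imply the driver can disengage."""
    return FEATURE_CATEGORY.get(feature, ASSISTED)
```

The deliberate design choice here is the conservative default: with no in-between levels, an unrecognized feature can never be mistaken for one that lets the driver stop paying attention.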

Most automakers are just bringing ADAS features to market now–in both electric and gas-powered vehicles. It’s important that they introduce these features without also introducing the potential risks that semiautonomous driving levels carry with them.

Most of the fully autonomous technology is being developed for electric cars. But given that many more gas-powered vehicles are still sold than electric ones, ADAS features will reach a lot more people much sooner.

Full-throated advocates of full vehicle autonomy may decry any further delays to an autonomous driving world, but simplifying and focusing car technology developments into two simple categories is a safer and more realistic choice for the long term. Anything in between isn’t only impractical, it’s dangerous.

Here’s a link to the original column:

Bob O’Donnell is the president and chief analyst of TECHnalysis Research, LLC, a market research firm that provides strategic consulting and market research services to the technology industry and the professional financial community. You can follow him on Twitter @bobodtech.
