Visualize: Insights that power innovation

Self-driving cars no cure for distracted driving

By Lucian McMahon  |  April 16, 2018

“Distracted driving will stop when cars drive themselves,” a Forbes article boldly asserted as recently as December 2017.

More dangerous form of distracted driving

Sadly, data from recently reported fatal crashes involving semi- and fully autonomous vehicles suggests that day has not yet arrived, and that the “self-driving” cars now on the road may represent distracted driving in a new, more dangerous form.

Commercially available vehicles with semi-autonomous features typically require the vehicle operator to remain prepared to take control in the event of a safety-critical event. Many autonomous vehicle tests also are still conducted with a human operator present, who monitors vehicle performance and intervenes if needed. Manufacturers recognize that, at present, safe operation of such vehicles requires human involvement.

However, the recent crashes suggest that remaining alert enough to take control of such a vehicle in time to prevent accidents may not be so easy. Given the number of accidents attributable to distracted driving in conventional vehicles, this should not be surprising. Distracted driving claimed 3,450 lives in 2016 alone, according to the National Highway Traffic Safety Administration (NHTSA).

Complacency and distraction

Distracted driving lengthens a vehicle operator’s response time, and texting can roughly double it. Current studies and the recent self-driving vehicle crashes suggest that complacency, which is not typically an element of conventional driving, may also slow responses.

According to a study published in the Journal of Safety Research: “Drivers using the automated systems responded worse than those manually driving in terms of reaction time, lane departure duration, and maximum steering wheel angle to an induced lane departure event. These results also found that non-driving tasks further impaired driver responses.”

Thus, automated vehicle systems seem to encourage complacency among the humans who are supposed to serve as the vehicle’s fail-safe mechanism.

Reports about the fatal crashes involving automated vehicle technology in the United States also bear this out. In cases involving semi- and fully autonomous vehicles, the operators either did not react in time to a safety-critical event or were distracted at the time of the crash. In the case of a fatal 2016 crash involving a vehicle operating in semi-autonomous mode, the National Transportation Safety Board (NTSB) reported, “[the vehicle] driver’s pattern of use of the [semi-autonomous] system indicated an over-reliance on the automation and a lack of understanding of the system limitations.”

Semi-autonomous mode

Regarding the 2016 crash, the NTSB report further states: “Vehicle performance data showed that for the 41-minute trip … [semi-autonomous mode] was active for 37 minutes. During the trip, while [semi-autonomous mode] was in use, the system detected driver-applied torque on the steering wheel on seven different occasions for a total of 25 seconds. The longest period between alerts during which [semi-autonomous mode] did not detect the driver’s hands on the steering wheel was nearly 6 minutes. For the entire trip, [semi-autonomous mode] was in some form of warning mode for a total of approximately 2 minutes.”
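To put the NTSB’s figures in perspective, the hands-on-wheel share works out as follows (a quick arithmetic sketch in Python; the numbers come straight from the report quoted above):

```python
# Figures from the NTSB report on the 2016 crash
trip_minutes = 41          # total trip length
autopilot_minutes = 37     # time semi-autonomous mode was active
hands_on_seconds = 25      # driver-applied steering torque, across seven detections

autopilot_seconds = autopilot_minutes * 60  # 2,220 seconds
hands_on_share = hands_on_seconds / autopilot_seconds

print(f"Hands detected on the wheel for {hands_on_share:.1%} "
      f"of semi-autonomous operation")
# → roughly 1.1%
```

In other words, for about 99 percent of the time the system was engaged, it detected no driver input at all.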

The NTSB found “lack of reaction” on the part of the vehicle operator to be partly responsible for the crash.

Subsequently, on March 18, 2018, a vehicle operating in autonomous mode reportedly struck and killed a woman in Tempe, Arizona, resulting in the first fatal accident involving an autonomous vehicle and a pedestrian. While the investigation is ongoing, initial reports indicate the vehicle was traveling at 40 mph with no signs of slowing down and that the safety driver was looking down for most of the 10 seconds leading up to the crash.
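A back-of-the-envelope calculation illustrates what those initial reports imply (a sketch in Python using the reported 40 mph speed; the exact distraction duration remains under investigation):

```python
# Distance covered at the reported speed while the safety driver looked down
MPH_TO_MPS = 1609.344 / 3600   # metres per second per mile per hour

speed_mps = 40 * MPH_TO_MPS    # ≈ 17.9 m/s
distracted_seconds = 10        # duration cited in initial reports
distance_m = speed_mps * distracted_seconds

print(f"~{distance_m:.0f} m (~{distance_m * 3.28084:.0f} ft) "
      f"travelled while looking down")
# → about 179 m (587 ft)
```

Even a two-second glance away at that speed covers roughly 36 metres, well over the length of a city intersection.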

So, it appears that response times during automated vehicle operation are worse than during manual driving; even a driver who is prepared to take control of the vehicle will take longer to do so, and distraction may exacerbate this hazard.

Safety and technology experts reportedly have begun calling for safety standards for semi-autonomous vehicles in the wake of these crashes, expressing concerns about current methods of engaging driver attention while the vehicle is operating autonomously and about the possible inability of human operators to take control quickly enough.

Technology challenges remain

Taking control quickly would not be an issue if semi- and fully autonomous vehicles were infallible, but currently they are not. Manufacturers seem to be recognizing these technological challenges.

In 2014, Volvo executives said the company’s plan to have 100 self-driving Volvos on public roads in everyday driving conditions was “moving forward rapidly.” However, as 2017 came to a close, Volvo stated it would not reach its goal until 2021. “In some areas, we are finding that there were more issues to dig into and solve than we expected,” Marcus Rothoff, Volvo’s autonomous driving program director, told Automotive News Europe.

“Sensors still have a long way to go,” writes Bryan Salesky, who heads Ford-backed autonomous-vehicle firm Argo AI. “Cameras are challenged in poor lighting and tend to struggle to provide enough focus and resolution at all desired ranges of operation…. Individual sensors don’t fully reproduce what they capture, so the computer has to combine the inputs from multiple sensors, then sort out the errors and inconsistencies. Combining all of this into one comprehensive and robust picture of the world for the computer to process is incredibly difficult.”

“Those who think fully self-driving vehicles will be ubiquitous on city streets months from now or even in a few years are not well connected to the state of the art or committed to the safe deployment of the technology,” Salesky writes.

Until manufacturers can produce vehicles that are significantly better than people at detecting and reacting to the myriad issues that arise during routine driving conditions, fallible machines will have to be backed up by fallible, distractible humans.


Lucian McMahon, CPCU, ARM-E, AU-M, is a product development specialist with the ISO Emerging Issues team. You can contact Lucian at lucian.mcmahon@verisk.com.