Tesla's Autopilot Makes it Too Easy for Drivers to Tune Out, NTSB Says

Consumer Reports has no financial relationship with advertisers on this site.

The design of Tesla’s Autopilot driver-assist technology was a factor in a high-profile crash in California last year, federal investigators said Wednesday. The conclusions underscore a growing body of research suggesting automakers need to do more to make sure these systems are used safely.

The National Transportation Safety Board said in its report that Tesla’s design allows drivers to disengage too easily from the driving task. The report also blamed the driver’s inattention and overreliance on Autopilot for the January 2018 crash, when a Model S plowed into a fire truck blocking a carpool lane on Interstate 405. No one was injured in the incident.

The NTSB, which is known for its in-depth examinations of plane crashes, investigates a limited number of highway incidents each year. But the agency has taken a special interest in new driver-assist technology. Autopilot has been engaged in at least three fatal crashes, all investigated by the agency.

The NTSB released its final report Wednesday listing probable causes for the crash. The day before, the board released most of the technical data it collected during its investigation. It found that Autopilot was engaged nearly continuously during the final 13 minutes and 48 seconds before the crash and that the driver, Robin Geoulla, didn’t touch the steering wheel for the last 3 minutes and 41 seconds. The safety board noted that Geoulla was distracted by several things, including coffee, a bagel, and the radio.

The NTSB conclusions about Autopilot track with the latest research and Consumer Reports’ testing of advanced driving assistance features, says Jake Fisher, director of auto testing at CR. He says that the cited causes of the crash demonstrate the need for automakers to improve how well the systems monitor driver attention.

Systems such as GM’s Super Cruise use cameras to make sure the driver is looking at the road. If the driver isn’t engaged, the car delivers an escalating series of warnings before bringing itself to a stop. Super Cruise was the highest-rated of four advanced driver-assist systems CR tested last year. In contrast, Autopilot’s sensors detect only whether a driver’s hands are on the steering wheel, not whether the driver is actually looking at the road.

“The main flaw in Tesla’s system is that checking to see if a driver’s hand is on the wheel isn’t sufficient,” Fisher said. “It’s about doing enough to make sure the driver is engaged.”

Before the crash, the Model S was following a lead vehicle in the carpool lane. After that vehicle changed lanes, the Tesla began accelerating from 21 mph toward a preset cruise control speed of 80 mph. Autopilot detected the fire truck 0.49 seconds before impact and displayed a visual warning; the car struck the stopped truck at 30.9 mph, the report said. Geoulla told the NTSB he didn’t see the fire truck until after the crash.

Billions of Miles Driven

Tesla owners have driven billions of miles with Autopilot engaged, and company data indicate that drivers using the system remain safer than those operating without assistance, the company said in an emailed response to questions from Consumer Reports on Wednesday. Tesla says it has made updates to the system since the crash, including adjusting the time intervals between hands-on-the-wheel warnings and the conditions under which they’re activated.

“While our driver-monitoring system for Autopilot repeatedly reminds drivers of their responsibility to remain attentive and prohibits the use of Autopilot when warnings are ignored, we’ve also introduced numerous updates to make our safeguards smarter, safer, and more effective across every hardware platform we’ve deployed,” the company said in its statement.

Advanced driver-assistance systems such as Autopilot aren’t the same as self-driving cars; the human driver is still responsible for paying attention to the road. Cadillac, Infiniti, Mercedes-Benz, Nissan, and Volvo offer systems similar to Autopilot, under various names. These systems can maintain a vehicle’s place in the flow of traffic and keep it within the lines of its lane.

In its report, the NTSB noted that following its investigation of a fatal 2016 Tesla crash in Florida, it issued a recommendation to Tesla and five other automakers to “develop applications to more effectively sense the driver’s level of engagement.” Volkswagen, BMW, Nissan, Mercedes-Benz, and Volvo have responded with explanations about their technology and efforts to reduce misuse. Tesla hasn’t responded, the safety board said.

Tesla should immediately fix Autopilot by limiting it to conditions where it can be used safely and installing a far more effective system to verify driver engagement, says William Wallace, manager of home and safety policy at Consumer Reports.

“NTSB first raised fundamental flaws in 2017, and Tesla has failed to address them ever since,” he said. “If the company won’t take these steps, then the National Highway Traffic Safety Administration should use its authority to require them to do so.”

The Center for Auto Safety, a consumer advocacy group, called on Wednesday for NHTSA to recall Tesla vehicles to fix Autopilot.

“The time to allow an unregulated, unsafe experiment on our roads is over,” the center said in its statement. “NHTSA needs to do its job by issuing rules and removing unsafe vehicles from the road until they can meet minimum performance standards.”

Consumer Reports is an independent, nonprofit organization that works side by side with consumers to create a fairer, safer, and healthier world. CR does not endorse products or services, and does not accept advertising. Copyright © 2019, Consumer Reports, Inc.