NTSB Findings Put Pressure on Tesla to Change Autopilot

The fatal crash of a Tesla Model X in California two years ago shows that drivers put too much trust in driver-assist technology and that federal regulators aren’t doing enough to make sure automakers are deploying their systems safely, the National Transportation Safety Board said.

The March 2018 crash killed the driver, an engineer who was commuting to his job at Apple while using Autopilot, Tesla's driver-assistance technology that enables the vehicle to steer and brake itself. Crash data showed that the driver failed to react in the moments before impact, and a video game was running on his phone at the time. The NTSB concluded that the car steered itself toward a barrier where a left-hand exit led off the highway.

It was the overreliance on Autopilot in this and three other crashes that drew the NTSB's attention. Decades of research have shown that people put too much trust in such technology, using it in ways that are both unintended and dangerous. Tesla hasn't responded to these known risks, and the National Highway Traffic Safety Administration hasn't set standards that could prevent fatalities, the safety board said.

Tesla didn’t immediately respond to a request for comment.

“Industry keeps implementing technology in such a way that people can get injured or killed,” Robert Sumwalt, chairman of the NTSB, said at a hearing in Washington Tuesday. “It is foreseeable that some drivers will attempt to inappropriately use driving automation systems.”

Tesla is the only one of six automakers that hasn't responded to the NTSB's call, issued three years ago, for better monitoring of drivers while they use systems like Autopilot that can steer or brake the vehicle, Sumwalt said.

The Mountain View crash was caused by Autopilot steering the vehicle into the barrier, combined with the driver's distraction and overreliance on the driver-assistance technology, the NTSB said. Tesla's ineffective driver-monitoring system, which relied on sensors that detect a hand on the steering wheel, was a contributing factor, it said.

“Manufacturers and NHTSA must make sure that these driver-assist systems come with critical safety features that actually verify drivers are monitoring the road and ready to take action at all times,” said Ethan Douglas, senior policy analyst for cars and product safety at Consumer Reports. “Otherwise, the safety risks of these systems could end up outweighing their benefits.”

Recommendations for NHTSA

The NTSB called on NHTSA to evaluate the risk of Autopilot being used in ways it wasn't designed for and to use its authority to recall vehicles if needed. It said a previous investigation the agency launched after a 2016 Florida crash wasn't adequate. It also called the agency's overall approach to autonomous-vehicle oversight misguided, because the agency waits for problems to happen rather than proactively implementing policies to prevent crashes.

The driver himself was aware of Autopilot's limitations before the crash, the NTSB said. He had told family members and colleagues that the Model X, while on Autopilot, had previously steered toward the same highway barrier it later hit.

A crash attenuator, a safety device designed to absorb impact in front of highway barriers, would have mitigated damage to the vehicle and possibly saved the driver's life, the NTSB said. But the attenuator at that barrier had been crushed in a previous crash and hadn't been replaced.

The NTSB investigation showed that a video game called Three Kingdoms was running on the driver's phone during the fatal trip. The world-building strategy game usually requires both hands to play, the NTSB said. But the board stopped short of concluding that the driver was holding the phone at the time of the crash.

The safety board called on Apple and other employers to implement policies that discourage employees from using distracting devices while driving, and it asked mobile-phone developers to lock out distracting features while vehicles are in motion.

For its part, NHTSA said in a statement that all commercially available motor vehicles require the human driver to be in control at all times, and all states hold the human driver responsible for vehicle operations. Crashes related to distraction remain a major concern, it said. The agency said it would review the NTSB’s new recommendations.

The Center for Auto Safety, a watchdog group, has called for Federal Trade Commission action over Tesla's use of names like "Autopilot" and "Full Self-Driving." These systems can't replace human drivers as their names imply, said Jason Levine, executive director of the Washington-based center. The group also wants NHTSA to investigate, force a recall if necessary, and consider new regulations.

NHTSA “has itself desisted from pursuing obvious safety concerns that arise from allowing this feature to be unregulated and used on public streets, highways, and anywhere else Tesla owners choose to engage it,” Levine said.



