U.S. opens probe into Tesla’s Autopilot over emergency vehicle crashes


By David Shepardson and Hyunjoo Jin

WASHINGTON (Reuters) - U.S. auto safety regulators on Monday opened a formal safety probe into Tesla Inc's driver assistance system Autopilot after a series of crashes involving Tesla models and emergency vehicles.

The National Highway Traffic Safety Administration (NHTSA) said it had identified 11 crashes since January 2018 in which Teslas "have encountered first responder scenes and subsequently struck one or more vehicles involved with those scenes."

The probe covers 765,000 U.S. Tesla vehicles built since 2014 with Autopilot. Tesla shares closed down 4.3% on the news.

NHTSA, which closed an earlier investigation into Autopilot in 2017 without taking any action, has come under fire for failing to ensure the safety of the system that handles some driving tasks and allows drivers to keep their hands off the wheel for extended periods.

After the new probe, the auto safety agency could opt to take no action, or it could demand a recall, which might effectively impose limits on how, when and where Autopilot operates.

Any restrictions could narrow the competitive gap between Tesla's system and similar advanced driver assistance systems offered by established automakers.

Tesla did not immediately respond to a request for comment. Chief Executive Elon Musk has repeatedly defended Autopilot and in April tweeted that "Tesla with Autopilot engaged now approaching 10 times lower chance of accident than average vehicle."

NHTSA said it had reports of 17 injuries and one death in the 11 crashes, including the December 2019 crash of a Tesla Model 3 that left a passenger dead after the vehicle collided with a parked fire truck in Indiana.

Four of the 11 crashes occurred this year. NHTSA said it had opened a preliminary evaluation of Autopilot covering 2014-2021 Tesla Model Y, X, S and 3 vehicles, and that the vehicles in the crashes were "all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control."

Before NHTSA can demand a recall, it must first upgrade the investigation to an engineering analysis. The two-step investigative process often takes a year or more.

AFTER DARK

NHTSA said most of the 11 crashes took place after dark and crash scenes included measures like emergency vehicle lights, flares or road cones.

Its investigation will assess the technologies "used to monitor, assist, and enforce the driver's engagement" with the driving task while Autopilot is in use.

Musk tweeted last month that Tesla's advanced camera-only driver assistance system, known as "Tesla Vision," will soon "capture turn signals, hazards, ambulance/police lights & even hand gestures."

Autopilot was operating in at least three fatal Tesla U.S. crashes since 2016, the National Transportation Safety Board (NTSB) has said.

The NTSB has criticized Tesla's lack of system safeguards for Autopilot and NHTSA's failure to ensure the safety of Autopilot.

NTSB chair Jennifer Homendy on Monday praised the new probe. She said the board has urged the agency to develop standards for driver monitoring systems and require automakers to "incorporate system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed."

In February 2020, Tesla's director of autonomous driving technology, Andrej Karpathy, identified a challenge for its Autopilot system: how to recognize when a parked police car's emergency flashing lights are turned on.

"This is an example of a new task we would like to know about," Karpathy said at a conference.

KEY CONCERNS

Bryant Walker Smith, a law professor at the University of South Carolina, said the parked emergency crashes "really seem to illustrate in vivid and even tragic fashion some of the key concerns with Tesla's system."

NHTSA, he said, "has been far too deferential and timid, particularly with respect to Tesla."

Democratic Senators Richard Blumenthal and Ed Markey, who have previously questioned the Autopilot system, urged a thorough and transparent probe that would lead to improvements in the safety of Tesla's automated driving and driver assistance technology and "prevent future crashes."

NHTSA said Monday it has sent teams to review 31 Tesla crashes involving 10 deaths since 2016 in which it suspected advanced driver assistance systems were in use. It ruled out Autopilot in three of those crashes.

It noted that "no commercially available motor vehicles today are capable of driving themselves" and said drivers must use them correctly and responsibly.

NHTSA has been without a Senate-confirmed administrator since January 2017. President Joe Biden has yet to nominate anyone for the post.

(Reporting by David Shepardson in Washington and Hyunjoo Jin in Oakland, California; Editing by David Holmes, Nick Zieminski and Richard Pullin)