Tesla Model 3 Autopilot Involved in Third Fatal Crash

Photo credit: Sjoerd van der Wal/Getty Images

From Popular Mechanics

For the third time, a Tesla has been involved in a fatal crash with its semi-automated Autopilot mode engaged, according to a preliminary report from the U.S. National Transportation Safety Board (NTSB).

At 6:17 a.m. on March 1, 2019, 50-year-old Jeremy Banner was driving a 2018 Tesla Model 3 southbound on U.S. Highway 441 (State Road 7) in Delray Beach, Palm Beach County, Florida. Banner was traveling at 65 mph when his car struck an eastbound 2019 truck-tractor and semitrailer combination that was crossing the highway. Banner was killed; the truck's driver was uninjured.

According to the NTSB report, the Tesla's Autopilot system had been engaged for approximately 10 seconds before the crash, and "from less than 8 seconds before the crash to the time of impact, the vehicle did not detect the driver’s hands on the steering wheel." In other words, Banner turned Autopilot on and almost immediately took his hands off the wheel.

Photo credit: NTSB

Tesla has walked a fine line between warning drivers against such behavior and promoting Autopilot's capabilities. Last year, the company introduced an improved version of what some Tesla owners have dubbed "Autopilot Nag" reminders.

When a car is traveling above 45 mph, as Banner's was, Autopilot issues a "Hold Steering Wheel" alert after 1 minute if there is no car ahead for it to follow. If there is a vehicle in front, the alert comes after 3 minutes.
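That behavior amounts to a simple timer rule. As a rough sketch only (Tesla's actual implementation is not public, and every name and function below is hypothetical), the reported timing could be expressed like this:

```python
# Illustration of the reported "Autopilot Nag" timing rules.
# This is NOT Tesla's code; the thresholds come solely from the
# reporting above, and all identifiers are made up.

def seconds_until_hold_wheel_alert(speed_mph: float,
                                   lead_vehicle_present: bool) -> int:
    """How long Autopilot reportedly waits, in seconds, before issuing
    a "Hold Steering Wheel" alert when no hands are detected."""
    if speed_mph > 45:
        # Above 45 mph: 1 minute with no car ahead to follow,
        # 3 minutes when following a lead vehicle.
        return 180 if lead_vehicle_present else 60
    # Timing below 45 mph was not specified in the reporting.
    raise NotImplementedError("behavior below 45 mph not reported")

# At Banner's reported 65 mph with no lead vehicle to follow,
# the first alert would come after 60 seconds.
print(seconds_until_hold_wheel_alert(65, lead_vehicle_present=False))
```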

At the time, Tesla CEO Elon Musk described the reminders as a balancing act between keeping Autopilot useful and keeping it safe.

But Musk has also made bold claims about Autopilot.

"I think it will become very, very quickly, maybe even towards the end of this year-but I'd say, I'd be shocked if it's not next year at the latest-that having a human intervene will decrease safety," Musk said earlier this year, speaking in an interview with MIT researcher Lex Fridman.

Speaking to The Register after the latest incident, Tesla issued this statement: "Autopilot had not been used at any other time during that drive. We are deeply saddened by this accident and our thoughts are with everyone affected by this tragedy."

"Our data shows that, when used properly by an attentive driver who is prepared to take control at all times, drivers supported by Autopilot are safer than those operating without assistance," the company continued. "For the past three quarters we have released quarterly safety data directly from our vehicles which demonstrates that.”

The first fatal Tesla crash involving Autopilot occurred on May 7, 2016, when a Model S struck a crossing semitrailer near Williston, Florida. The second came on March 23, 2018, when a Model X crashed in Mountain View, California.

David Friedman, acting head of the National Highway Traffic Safety Administration (NHTSA) in 2014 and now vice president of advocacy for Consumer Reports, tells the Washington Post that he was surprised his former agency didn't seek an Autopilot recall after the 2016 Florida crash. The Delray Beach crash, he says, strengthens the case for one.

“Their system cannot literally see the broad side of an 18-wheeler on the highway,” Friedman says. “Tesla has for too long been using human drivers as guinea pigs. This is tragically what happens. There are multiple systems out on the roads right now that take over some level of steering and speed control, but there’s only one of them that we keep hearing about where people are dying or getting into crashes. That kind of stands out.”

Other companies have experienced fatal crashes involving automated driving systems. In March 2018, a self-driving test vehicle operated by Uber struck and killed a pedestrian in Tempe, Arizona.

Source: MIT Technology Review
