Self-Driving Test Cars Should Be Treated Like Inexperienced Teen Drivers, Experts Say

As companies test self-driving vehicles in the real world, these tons of steel guided by cameras, sensors, and software should be treated more like nervous teen drivers prone to mistakes than the sophisticated technological marvels many imagine them to be, several experts told Consumer Reports.

Since an Uber self-driving test vehicle struck and killed a pedestrian last week in Tempe, Ariz., CR has interviewed multiple experts to ask about the safety of test vehicles.

They say the software behind pedestrian detection technology for Uber and other companies might not be advanced enough for public roads—even with so-called safety drivers monitoring the systems.

"We think about these technologies as being highly intelligent. They’re not," says Bryan Reimer, Ph.D., associate director of the New England University Transportation Center at MIT. "What are machines doing here? They’re operating on behavior they have learned from us.”

And just as with teen drivers, there's really no good reason to push them onto the road too soon without real supervision and lots of practice, says Missy Cummings, Ph.D., director of the Humans and Autonomy Lab at Duke University.

Cummings says she's in favor of giving self-driving cars “vision tests” before they’re allowed on the road so that companies can prove the vehicles operate safely and can avoid pedestrians, cyclists, and other cars.

Some of the experts, including one from CR, recommend a graduated licensing process, like the one used for teen drivers, that would force companies to prove their cars are safe. Only after testing at low speeds with a safety driver could vehicles be allowed greater autonomy.

“You want to find out how mature the technology is,” says Dariu Gavrila, Ph.D., head of Intelligent Vehicles and Cognitive Robotics at Delft University of Technology in the Netherlands.

Rate Self-Driving Cars

Cummings and Gavrila also argue for speed limits on public-road testing, specifically 25 mph or slower. Research from AAA and ProPublica shows that a pedestrian struck by a vehicle traveling at 30 mph is about 70 percent more likely to be killed than one struck at 25 mph.

“I’m a big fan of the technology when it’s moving slow because there’s a lot that could be done to mitigate problems,” Cummings says. “But the idea of that middle, between 25 mph and on an interstate, this is a real gray area of operation that’s been untested and unproven.”

Marta Hall, president of Velodyne Lidar, the manufacturer of the Lidar system used by Uber, says she is not in favor of more government regulation of self-driving cars. But she did suggest a star-based ratings system similar to the one the National Highway Traffic Safety Administration (NHTSA) uses to rate vehicles for crash-test performance.

Stars could be awarded after researchers demonstrate how safely their vehicles operate.

“I don’t think that there should be this ‘fake it until you make it’ attitude,” she says. “Which is, not being transparent, pretending that your system is better than it is.”

Testing Suspended

The safety concerns were underscored last week after the Uber vehicle struck and killed 49-year-old Elaine Herzberg, who was walking her bicycle across the road.

Arizona Gov. Doug Ducey suspended Uber’s program on Monday, saying a video he saw of the crash prompted him to take the action. The video shows that the Uber SUV—which had a human safety monitor in the car—didn't slow before striking Herzberg.

Uber also said it was suspending its testing in Tempe, Pittsburgh, Toronto, and San Francisco, the cities where the ride-sharing company has been testing its self-driving vehicles.

Toyota and nuTonomy (the self-driving startup owned by Aptiv, formerly Delphi) also temporarily halted public road tests of self-driving vehicles.

For this story, CR asked several of the companies that compete with Uber’s self-driving car program, including General Motors and Ford, about their safety protocols; the companies pointed CR to written statements or prepared responses.

On Tuesday, the California Department of Motor Vehicles said Uber would not renew its permit to test vehicles in California. Uber said in a letter to the agency that it’s not re-applying because it knows the application would not be approved until investigations into the Tempe incident are concluded.

Despite the pedestrian death, all the researchers CR spoke with say that real-world evaluations should continue but under a set of clear rules.

“We have to be testing these technologies,” says Reimer.

How It Works

Currently, most self-driving cars detect obstacles, road markings, other vehicles, and pedestrians using a mix of radar, cameras, and Lidar, which builds a 3D picture of a vehicle’s surroundings by sweeping a laser around the vehicle multiple times a second.

Armed with data from those sensors, the vehicle processes what the sensors see, interprets what it means, and decides what—if anything—the car should do as a result. That's the software part.
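As a rough illustration of that sense-interpret-decide loop, here is a minimal sketch in Python. Every name, number, and threshold below is hypothetical; a production perception stack is vastly more complex.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One object reported by the fused sensor stack (hypothetical)."""
    kind: str                 # e.g., "pedestrian", "cyclist", "vehicle"
    distance_m: float         # distance ahead, in meters
    closing_speed_mps: float  # how fast the gap is shrinking, in m/s

def time_to_collision(d: Detection) -> float:
    """Seconds until impact if neither party changes course."""
    if d.closing_speed_mps <= 0:
        return float("inf")  # object is holding steady or moving away
    return d.distance_m / d.closing_speed_mps

def decide(detections: list[Detection], brake_threshold_s: float = 2.0) -> str:
    """The 'decide what the car should do' step, reduced to one rule."""
    for d in detections:
        if time_to_collision(d) < brake_threshold_s:
            return "emergency_brake"
    return "continue"

# A pedestrian 20 m ahead with a 12 m/s closing speed gives a
# time-to-collision of about 1.7 s, under the 2 s threshold.
print(decide([Detection("pedestrian", 20.0, 12.0)]))  # emergency_brake
```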

Cummings says self-driving vehicles can decide which data to pay attention to and which to ignore based on location.

“The faster that the data is coming in, the faster it has to be processed, and the faster it has to be associated,” Cummings says. As a result, she says, self-driving cars often rely on maps to determine how dense an area is, then choose how much virtual attention sensors pay to the possibility of pedestrians.
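Here is what that map-driven prioritization might look like, in a deliberately simplified sketch; the zone names and attention weights below are invented for illustration, not drawn from any company’s actual system.

```python
# Hypothetical attention weights: the share of the per-frame perception
# budget devoted to pedestrian detection in each map-labeled zone.
PEDESTRIAN_ATTENTION = {
    "dense_urban": 0.6,   # sidewalks and crosswalks everywhere
    "suburban":    0.3,
    "highway":     0.05,  # pedestrians assumed rare
}

def pedestrian_budget_ms(map_zone: str, frame_budget_ms: float) -> float:
    """Milliseconds per frame spent looking for pedestrians in this zone."""
    share = PEDESTRIAN_ATTENTION.get(map_zone, 0.3)  # default: suburban
    return frame_budget_ms * share

# With 30 ms of perception time per frame, a highway zone would spend
# only 1.5 ms per frame on pedestrians under these made-up weights.
print(pedestrian_budget_ms("highway", 30.0))  # 1.5
```

Under a scheme like this, a map zone that expects no pedestrians could, by design, leave almost no attention for one who is actually there.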

Hall says that “the Lidar itself doesn’t make the decisions in terms of whether to stop, or turn, or swerve.” Those decisions are up to collision avoidance software, which she says lags behind sensor technology.

Gavrila agrees, telling CR that Lidar can be even better than the human eye at detecting pedestrians but that a vehicle’s software must properly interpret the data.

It’s possible that, at the time of the crash in Tempe, the Uber vehicle was not using its Lidar to detect pedestrians. Sometimes self-driving vehicle prototypes use Lidar solely to map their surroundings—known in the industry as localization.

Hall says there is a chance this may have been a factor, although she emphasizes that she doesn't know whether that was the case. “That’s a possibility, that it was being used for localization,” she says, “and it wasn’t being used for object detection or collision avoidance.”
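If the lidar data really were routed only to localization, the distinction might look something like this purely hypothetical configuration sketch:

```python
# Purely hypothetical: the same lidar point cloud can feed different
# consumers depending on how the software stack is configured.
LIDAR_CONSUMERS = {
    "localization": True,       # match the point cloud against a stored map
    "object_detection": False,  # cluster points into obstacles (off here)
}

def route_point_cloud(cloud) -> list[str]:
    """Return which outputs this frame's lidar data actually produces."""
    outputs = []
    if LIDAR_CONSUMERS["localization"]:
        outputs.append("pose_estimate")   # where the car is on the map
    if LIDAR_CONSUMERS["object_detection"]:
        outputs.append("obstacle_list")   # what is around the car
    return outputs

# With object detection disabled, lidar returns from a pedestrian
# never reach the collision avoidance software at all.
print(route_point_cloud(cloud=None))  # ['pose_estimate']
```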

According to experts CR interviewed, it’s likely that investigators will find either that the vehicle’s sensors failed to pick up Herzberg’s presence, or that Uber’s algorithms failed to respond to that information.

Cummings says investigators should examine whether Uber’s software ignored the pedestrian by design because it was traveling on a road where pedestrians weren’t expected to be present. “Was it not looking for pedestrians at all because it thought it was in an area where there weren’t pedestrians?” she asks.

It will ultimately be up to investigators at NHTSA and the National Transportation Safety Board (NTSB) to determine whether the crash was caused by equipment, software, or something else entirely.

Big Plans

Even as Uber and others suspended testing, some self-driving technology companies announced plans to push ahead.

Waymo—Google’s autonomous vehicle company—said Tuesday that it plans to launch the world’s first self-driving transportation service this year, in Arizona. Riders will be able to use a Waymo app to request a vehicle in a designated area, according to CEO John Krafcik, who spoke at a press conference Tuesday in New York City, ahead of the New York Auto Show.

As part of that program, Waymo announced it would buy up to 20,000 Jaguar I-Pace all-electric vehicles over the next two years. Testing would begin this year, with the goal of making them part of the driverless fleet by 2020. Waymo officials noted that the fleet of SUVs would be able to travel as much as a combined 1 million miles a day (roughly 50 miles per vehicle), gathering needed data on how the vehicles and the driverless technology are working.

According to a company statement, Waymo’s vehicles have self-driven more than 5 million miles on public roads across 25 U.S. cities, and 5 billion miles in simulation.

As soon as next week, California will allow researchers to apply for licenses to operate self-driving prototypes without a safety driver present in the vehicle, and the U.S. Senate is considering a bill that would make some standards less strict for self-driving cars than for conventional vehicles.

Jessica Gonzalez, a spokeswoman for the California DMV, told CR last week that interested companies will still need to meet certain safety requirements in order to obtain a permit, and that so far there has been little interest. “No one has applied for a driverless testing and/or deployment permit,” she wrote in an e-mail.

Safety First

Consumers Union, the advocacy division of Consumer Reports, says federal law should set a higher bar for safety than the states have when it comes to autonomous vehicle testing on public roads.

“Most states haven’t held companies accountable for the safety of automated vehicles, and current proposals before Congress would largely replicate this lax approach at the federal level and could block states from doing more,” says William Wallace, senior policy analyst for Consumers Union. “Any federal law for self-driving cars should include stronger safety measures than the ones in these bills today.”

Gavrila says that a lack of transparency in testing, coupled with some states’ desire to attract investment without understanding the technology’s limitations, could mean unsafe autonomous vehicles are already operating on public roads.

He says the key to a safe experience is to make sure that policy keeps pace with technology.

“For me, it seems a bit too quick a step forward by some of these states who want to basically attract jobs and new technology,” Gavrila says. “It seems like Arizona is not doing it gradually.”

Policy Failures

Policymakers have yet to develop a robust framework for safe testing, similar to how the FDA approves drugs, or how the FAA investigates accidents, says Reimer.

“There’s a number of organizations out there who are treating it [as] more of a revolution, something that we can just do quickly,” Reimer says. These organizations are “the companies calling for massive deployments in the next couple years, [and] . . . the states calling for driverless testing without a safety driver.”

For example, the presence of safety drivers in test cars—and the rules governing their working hours and qualifications—are largely left up to state regulations and researchers’ internal protocols. Uber had one safety driver, but Nissan and Toyota told CR that they require two people in their self-driving prototypes at all times.

Ducey, the Arizona governor, issued an executive order on March 1 (several weeks before the Tempe crash) clarifying that self-driving vehicles in Arizona do not need to operate with a human driver as a backup as long as they follow applicable traffic rules.

Last week, Patrick Ptak, a Ducey spokesman, responded to questions from CR with an e-mailed statement that read, in part, “Public safety is our top priority, and the Governor's latest Executive Order provides enhanced enforcement measures and clarity on responsibility in these accidents.”

The U.S. House passed the SELF DRIVE Act in 2017, while the Senate is considering its own bill, the AV START Act, which seeks to speed the rollout of self-driving cars in part by setting up a separate, looser regulatory framework for these vehicles than the one governing traditional cars on the road. Any final bill would need to pass both chambers and be signed by the president to become law.

Data Sharing Critical

Reimer has proposed that testing take place with the supervision of a public-private clearinghouse, where autonomous vehicle developers share safety data, and where highly trained investigators could help them learn from their mistakes.

“It will allow companies a neutral playing field in which data exchanges can happen strategically,” he says.

To learn, self-driving cars must experience novel situations. “In some sense, the evolution of the technology depends on learning from failures,” says Reimer.

The key is to ensure those failures aren’t fatal.

Cummings agrees that the various players in the self-driving car space need to share their findings and methodologies not just with one another but with regulators and the public; otherwise, the safety risks will remain unknown.

“This is one of the reasons the companies need to be more transparent about what they’re doing—because all the companies are doing it differently,” she says.

Although Hall says she is not in favor of more government regulation of self-driving vehicles, she does want greater transparency that would keep unsafe cars off the road.

“If we have transparency,” she says, “I think the industry can regulate itself.”

Despite the publicity surrounding the Arizona crash, Cummings isn’t optimistic that it will lead to a major policy change before another fatality takes place. “The question on my mind is, how many more will have to happen before the big outcry happens?”



Consumer Reports is an independent, nonprofit organization that works side by side with consumers to create a fairer, safer, and healthier world. CR does not endorse products or services, and does not accept advertising. Copyright © 2018, Consumer Reports, Inc.