After Probing Tesla's Deadly Crash, Feds Say Yay to Self-Driving

Tesla's Autopilot system isn't defective, investigators say—and it's stopping crashes.

On May 7, 2016, Joshua Brown made a grim entry in the annals of technological history. The 40-year-old Ohio resident, driving a Tesla Model S, slammed into the side of a tractor trailer turning across his path on a divided highway in central Florida, becoming the first person to die in a partially autonomous car. Brown's Tesla was in "Autopilot" mode at the time of the crash, and neither human nor computer hit the brakes.

The grisly collision peeled the roof off the car and raised concerns about the safety of semi-autonomous systems and the way Tesla had delivered the feature to customers. The National Highway Traffic Safety Administration (NHTSA) opened an investigation into how Autopilot works and its role in the Florida crash, and on Thursday published its findings.

"A safety-related defect trend has not been identified at this time and further examination of this issue does not appear to be warranted," NHTSA's investigators found. In other words: Tesla didn't cause Brown's death.

The verdict should relieve not just the electric car builder, but the industry at large. Semi-autonomous and driver assistance technologies are more than a fresh source of revenue for automakers. They're the most promising way to cut into the more than 30,000 traffic deaths on US roads every year. Today's systems aren't perfect. They demand human oversight and can't handle everything the real world throws at them. But they're already saving lives.

NHTSA's goal wasn't to find the exact cause of the crash (that's up to the National Transportation Safety Board, which is running its own inquiry), but to root out any problems or defects with Tesla's Autopilot hardware and software.

Even before Brown's death, some worried the inevitable first death caused, or not prevented, by autonomous tech could cripple the shift toward a safer, self-driving future. Tesla's perceived hubris in pushing immature technology into consumers' hands heightened fears of regulatory intervention. Had NHTSA come to a different conclusion, it could have decided to recall every Tesla with the tech.

Instead, the agency exculpated Tesla, and then some. It crunched the numbers and found that crash rates in Tesla vehicles dropped almost 40 percent after the installation of Autosteer, the technology that keeps the car within clearly marked lanes.

“It’s very positive for Tesla,” says Bart Selman, who studies AI safety systems at Cornell University. “It puts the whole issue of the Florida accident in the right context.” Meaning, the system isn't flawed. It just isn't fully advanced yet.

The circumstances of the Florida accident, NHTSA found, were outside the capabilities of the Autopilot and Automatic Emergency Braking (AEB) systems. The truck was cutting across the car's path rather than driving directly ahead of it, the scenario the radar is better at detecting, and the camera-based system wasn't trained to recognize the flat slab of a truck's side as a threat.

And, NHTSA noted, Tesla does an adequate job of warning its customers that the Autopilot system demands their supervision: hands on the wheel, eyes on the road.

“We want to promote these technologies,” says Bryan Thomas, communications director at the agency. “They will save lives and cut crashes dramatically, but innovation is a bumpy road.”

That's a less brazen echo of Elon Musk’s argument that it would be "morally wrong" not to roll out driver assistance systems as soon as possible, even if they're in beta. In response to NHTSA's report, a Tesla spokesperson said in a statement, "The safety of our customers comes first, and we appreciate the thoroughness of NHTSA’s report and its conclusion."

That thoroughness involved NHTSA sending a Special Crash Investigations Team to Florida, and to the scene of another, non-fatal, Tesla crash in Pennsylvania. The team did a bunch of track testing in Ohio, putting a 2015 Tesla Model S (the same model Brown drove) through a variety of scenarios to see when it would, and wouldn’t, brake automatically. Investigation crews, including attorneys, also collected data from Tesla on how its systems operate.

Tesla updated its own technology, via an over-the-air software push, after the Florida fatality. The system now relies more on data from the built-in radar, rather than the camera, to spot obstacles. Its newer vehicles come with a whole new suite of sensors designed to push the boundaries again and, eventually, move beyond driver assistance to fully autonomous driving.

A Tesla Model S that crashed while in Autopilot mode, resulting in the death of Joshua Brown on May 7, 2016. (Florida Highway Patrol/AP)

A Cautious Future

For anyone hoping to buy a driverless, crash-proof car anytime soon, this may feel disappointing. What Tesla markets as "Autopilot" is in fact a collection of driver assistance technologies, including adaptive cruise control to manage speed and steering assistance to keep the car in its lane. Other high-end cars offer similar features. They're great for backing up humans whose attention drifts from the road, but not yet good enough to take over full-time.

“It’s a positive finding for the continued development of automated vehicle technology,” says Richard Wallace, who directs research on connected and automated vehicles at the Center for Automotive Research in Michigan. It would be naive to think these early-stage systems could eliminate crashes, but any reduction in the accident rate is a good thing.

Even if everyone on the road isn't driving an Autopilot-equipped car anytime soon, some active safety systems are becoming more prevalent. Twenty automakers have already committed to making automatic emergency braking, which can cut rear-end crashes by 40 percent, standard on nearly all new cars over the next five years.

It's a near guarantee Joshua Brown won't be the only person to die at the wheel of a semi-autonomous car. The systems aren’t perfect, but the statistics show they already outperform the humans who cause 94 percent of crashes. From that perspective, driverless cars can’t come quickly enough.