Last Thursday, Teslas equipped with Full Self-Driving software were deemed defective enough by federal safety regulators to warrant a recall because they’re prone to crashing. On Friday, all of those defective cars remained on the road, with the unreliable software still available to drivers and no firm deadline for a fix.
The Tesla recall raises important and thorny questions not only about Tesla but also about auto safety regulation in the United States.
For starters, why is the National Highway Traffic Safety Administration allowing drivers to continue to use experimental and dangerous software while Tesla tries to repair it? It’s unclear when the software will be fixed. NHTSA has imposed no deadline, and Tesla Chief Executive Elon Musk has a record of making grand promises he doesn’t keep.
Also at issue is what the recall might mean for Tesla’s future if the fixes don’t work. After all, Musk said last year that whether Full Self-Driving succeeds or fails “is the difference between Tesla being worth a lot of money or worth basically zero.”
FSD is an option that Tesla sells for $15,000. In its current state, the technology is deemed by Tesla to be “beta” software, a term computer users might recognize as a warning label on a newly issued, not-ready-for-prime-time software program whose code might still be pocked with bugs. (Important to note: So-called Full Self-Driving Teslas are incapable of fully driving themselves.)
According to NHTSA, the defects that plague FSD can cause a car to suddenly speed up and race through yellow lights, violate speed limits and continue driving straight ahead from turn-only traffic lanes. A search for “FSD” on YouTube will turn up evidence of many other software problems, including a tendency for cars with FSD engaged to cross double yellow lines into oncoming traffic.
Because of the decades-old process by which traffic statistics are collected in the United States, it’s impossible to know how many injuries and deaths FSD and its limited-feature sibling, Autopilot, have caused.
Safety officials are struggling not only with new technology in the auto industry but also with Tesla in particular, an automaker that “thumbs its nose at NHTSA on a regular basis,” said Phil Koopman, a professor and autonomous technology expert at Carnegie Mellon University.
So, in its negotiations with Musk, why did NHTSA not require that FSD or the defective functions be turned off while Tesla attempts a fix? NHTSA won’t say. Koopman, emphasizing that he was only speculating, said it’s possible NHTSA feared being sued by Musk, which would require a huge commitment of resources and would drag out the situation.
“NHTSA would be motivated to get this thing fixed in a way that involves the least trauma and gets it done faster,” Koopman said.
Bryant Walker Smith, a law professor at the University of South Carolina, said regulators are just coming to grips with the recent radical changes in automotive technology.
Even the term “recall” is becoming outmoded: The Tesla fix will be delivered wirelessly to cars wherever they are through what’s called over-the-air software delivery. Smith proposes the term “virtual recall.”
NHTSA’s recall rules have evolved over decades, based mainly on defects in hardware, not software. A recall can be voluntary, usually after a negotiation with NHTSA, or forced, which might happen if negotiations fail. A voluntary recall for a steering problem, for instance, would lead a carmaker to notify owners within a “reasonable” amount of time that a defect exists and that the company will fix it, Smith said. “But federal law does not require the private owner of a noncommercial vehicle to actually complete the recall.”
What’s “obviously different here,” Smith said, “is that Tesla has the ability through over-the-air software updates to immediately disable the entire system in which defects have been identified and then, when an update is ready, to achieve a 100% recall completion rate.” But Tesla’s not doing that, and NHTSA’s not forcing it to.