The tragic death of a woman struck by a self-driving Uber while walking her bicycle on a Tempe, Ariz., road on Sunday night highlights the danger of testing autonomous vehicles on public streets without a national policy for doing so.
That’s not to say that federal regulations would have saved her life, or even that the Uber and its safety operator, who was in the driver’s seat of the autonomous Volvo SUV but did not take the controls, were at fault. Tempe Police Chief Sylvia Moir told the San Francisco Chronicle that, based on the preliminary investigation, it wasn’t clear a human driver could have avoided the collision either.
But what is clear is that federal standards could dictate what the human in the car should or shouldn’t be required to do, or indeed, what makes an autonomous vehicle roadworthy. And consistency is needed to forge those standards. At present, autonomous vehicle testing and operation are governed by a patchwork of state laws, many inconsistent with one another. (A California provision that took effect in January prohibits individuals from participating in testing unless they hold a valid driver’s license for the particular class of vehicle, while a Georgia law exempts a person operating an autonomous vehicle from the license requirement when the automated driving system is engaged.)
If driverless cars still seem a long way off, they may already be in your driveway. In the last two years, the number of car models offering the main semiautonomous features — adaptive cruise control, lane assist, and collision avoidance — has ballooned from a handful to nearly every brand. Those features allow cars to accelerate, decelerate, steer, and brake on their own — in effect, the essentials of driving. The final and more difficult piece is how those vehicles react to the road and everything on it, especially unpredictable humans.
Transportation Secretary Elaine Chao, one of the few grownups in the Trump Cabinet room, has lauded the promise of autonomous vehicles to save thousands of lives, in part because human error causes the overwhelming majority of auto deaths, a toll that has risen alarmingly in recent years. And despite being stereotyped as antiregulation businesses, automakers too have asked for new rules, testifying in Congress in favor of consistent national standards to give them direction for what to build.
Congress has obliged, with the House last September unanimously passing a bill setting national standards for driverless vehicle testing, followed by a nod from the Senate Commerce Committee. But that effort stalled in December, when Massachusetts Senator Ed Markey and Connecticut’s Richard Blumenthal, both Democrats, put a hold on the bill, citing safety and privacy concerns.
Since the accident, Uber has suspended its testing nationwide and Boston officials have asked nuTonomy, the local entry in the driverless sweepstakes, to cease experimental operations in the Seaport District. Both are reasonable actions erring on the side of caution.
Yet the autonomous revolution continues worldwide, undaunted in countries like Singapore and China, where more authoritarian regimes make policy-setting far easier. To be sure, we prefer democracy, and the warnings of Senators Markey and Blumenthal illustrate the value of minority voices against unchecked enthusiasm.
Now it’s time to incorporate their concerns and create a single national standard that encourages both innovation and safety. Or do nothing, and watch the driverless world leave us behind.