Last fall, Missy Cummings sent a document to her colleagues at the National Highway Traffic Safety Administration that revealed a surprising trend: When people using advanced driver-assistance systems die or are injured in a car crash, they are more likely to have been speeding than people driving cars on their own.

The two-page analysis of nearly 400 crashes involving systems such as Tesla’s Autopilot and General Motors’ Super Cruise is far from conclusive. But it raises fresh questions about the technologies that have been installed in hundreds of thousands of cars on U.S. roads. Cummings said the data indicated that drivers were becoming too confident in the systems’ abilities and that automakers and regulators should restrict when and how the technology was used.

People “are overtrusting the technology,” she said. “They are letting the cars speed. And they are getting into accidents that are seriously injuring them or killing them.”

Cummings, an engineering and computer science professor at George Mason University who specializes in autonomous systems, recently returned to academia after more than a year at the safety agency. On Wednesday, she presented some of her findings at the University of Michigan, a short drive from Detroit, the main hub of the U.S. auto industry.

Systems such as Autopilot and Super Cruise, which can steer, brake and accelerate vehicles on their own, are becoming increasingly common as automakers compete to win over car buyers with promises of superior technology. Companies sometimes market these systems as if they made cars autonomous. But their legal fine print requires drivers to stay alert and be ready to take control of the vehicle at any time.

In interviews last week, Cummings said automakers and regulators ought to prevent such systems from operating over the speed limit and require drivers using them to keep their hands on the steering wheel and eyes on the road.

“Car companies — meaning Tesla and others — are marketing this as a hands-free technology,” she said. “That is a nightmare.”

But these are not measures that NHTSA can easily put in place. Any effort to rein in how driver-assistance systems are used will probably be met with criticism and lawsuits from the auto industry, especially from Tesla and its CEO, Elon Musk, who has long chafed at rules he considers antiquated. Safety experts also said the agency was chronically underfunded and lacked enough skilled staff to adequately do its job. The agency has also operated without a permanent leader confirmed by the Senate for much of the past six years.

Cummings acknowledged that putting the rules she was calling for into effect would be difficult. She said she also knew that her comments could again inflame supporters of Musk and Tesla who attacked her on social media and sent her death threats after she was appointed a senior adviser at the safety agency.

But Cummings, 56, one of the first female fighter pilots in the Navy, said she felt compelled to speak out because “the technology is being abused by humans.”

“We need to put in regulations that deal with this,” she said.

The safety agency and Tesla did not respond to requests for comment. GM pointed to studies that it had conducted with the University of Michigan that examined the safety of its technology.

Because Autopilot and other similar systems allow drivers to relinquish active control of the car, many safety experts worry that the technology will lull people into believing the cars are driving themselves. When the technology malfunctions or cannot handle situations such as having to veer quickly to miss stalled vehicles, drivers may be unprepared to take control quickly enough.

The systems use cameras and other sensors to check whether a driver’s hands are on the wheel and his or her eyes are watching the road. And they will disengage if the driver is inattentive for a significant amount of time. But they can operate for stretches while the driver is not focused on the road.

Cummings has long warned that this can be a problem — in academic papers, in interviews and on social media. She was named senior adviser for safety at NHTSA in October 2021, not long after the agency began collecting crash data involving cars using driver-assistance systems.

Musk responded to her appointment in a post on Twitter, accusing her of being “extremely biased against Tesla,” without citing any evidence. This set off an avalanche of similar statements from his supporters on social media and in emails to Cummings.

She said she eventually had to shut down her Twitter account and temporarily leave her home because of the harassment and death threats she was receiving at the time. One threat was serious enough to be investigated by the police in Durham, North Carolina, where she lived.

Many of the claims were nonsensical and false. Some of Musk’s supporters noticed that she was serving as a board member of Veoneer, a Swedish company that sells sensors to Tesla and other automakers, but confused the company with Velodyne, a U.S. company whose laser sensor technology — called lidar — is seen as a competitor to the sensors that Tesla uses for Autopilot.

“We know you own lidar companies and if you accept the NHTSA adviser position, we will kill you and your family,” one email sent to her said.

Jennifer Homendy, who leads the National Transportation Safety Board, the agency that investigates serious automobile crashes, and who has also been attacked by fans of Musk, told CNN Business in 2021 that the false claims about Cummings were a “calculated attempt to distract from the real safety issues.”

Before joining NHTSA, Cummings left Veoneer’s board, sold her shares in the company and recused herself from the agency’s investigations that solely involved Tesla, one of which was announced before her arrival.

The analysis she sent to agency officials in the fall looked at advanced driver-assistance systems from multiple companies, including Tesla, GM and Ford Motor. When cars using these systems were involved in fatal crashes, they were traveling over the speed limit 50% of the time. In crashes with serious injuries, they were speeding 42% of the time.

In crashes that did not involve driver-assistance systems, the comparable figures were 29% and 13%.

The amount of data that the government has collected on crashes involving these systems is still relatively small. Other factors could be skewing the results.