U.S. safety regulators have pressured Tesla into recalling nearly 363,000 vehicles with its “Full Self-Driving” system because it misbehaves around intersections and doesn’t always follow speed limits.

The recall, part of a larger investigation by the National Highway Traffic Safety Administration into Tesla’s automated driving systems, is the most serious action taken yet against the electric vehicle maker.

It raises questions about CEO Elon Musk’s claims that he can prove to regulators that cars equipped with “Full Self-Driving” are safer than humans, and that humans almost never have to touch the controls.

Musk at one point had promised that a fleet of autonomous robotaxis would be in use in 2020. The latest action appears to push that development further into the future.

The safety agency says in documents posted on its website Thursday that Tesla will fix the concerns with an online software update in the coming weeks. The documents say Tesla is doing the recall but does not agree with an agency analysis of the problem.

The system, which is being tested on public roads by as many as 400,000 Tesla owners, performs unsafe maneuvers such as traveling straight through an intersection while in a turn-only lane, failing to come to a complete stop at stop signs, or going through an intersection during a yellow traffic light without proper caution, NHTSA said.

In addition, the system may not adequately respond to changes in posted speed limits, or it may not account for the driver’s adjustments in speed, the documents said.

“FSD beta software that allows a vehicle to exceed speed limits or travel through intersections in an unlawful or unpredictable manner increases the risk of a crash,” the agency said in documents.

Musk complained Thursday on Twitter, which he now owns, that calling an over-the-air software update a recall is “anachronistic and just flat wrong!” A message was left Thursday seeking further comment from Tesla, which has disbanded its media relations department.

Tesla has received 18 warranty claims that could be caused by the software from May 2019 through Sept. 12, 2022, the documents said. The Austin-based electric vehicle maker told the agency it is not aware of any deaths or injuries.

In a statement, NHTSA said it found the problems during tests performed as part of an investigation into Tesla’s “Full Self-Driving” and “Autopilot” software, systems that take on some driving tasks. The investigation remains open.

Despite the names “Full Self-Driving” and “Autopilot,” Tesla says on its website that the cars cannot drive themselves and owners must be ready to intervene at all times.

NHTSA’s testing found that Tesla’s FSD beta testing “led to an unreasonable risk to motor vehicle safety based on insufficient adherence to traffic safety laws.”

Raj Rajkumar, a professor of computer engineering at Carnegie Mellon University, doubts that Tesla can fix the problems cited by NHTSA with a software update. The automaker, he says, relies only on cameras and artificial intelligence to make driving decisions, a system that will make mistakes.

“Cameras can miss a lot of things,” Rajkumar said. “These are not straightforward issues to fix. If they could have fixed it, they would have fixed it a long time back.”

Most other companies with self-driving vehicles use laser sensors and radar in addition to cameras to make sure vehicles see everything.