One of the purported advantages of self-driving car tech is that every car can learn from one vehicle’s mistakes. Here’s how Waymo puts it on its website: “The Waymo Driver learns from the collective experiences gathered across our fleet, including previous hardware generations.”
But in Austin, Waymo’s vehicles struggled for months to learn how to stop for school buses as drivers picked up and dropped off children. An official with the Austin Independent School District (AISD) alleged that the vehicles had, in at least 19 instances, “illegally and dangerously” passed the district’s school buses while their red lights were flashing and their stop arms were extended, rather than coming to complete stops as the law requires.
In early December, Waymo even issued a federal recall related to the incidents, acknowledging at least 12 of them to federal regulators at the National Highway Traffic Safety Administration (NHTSA), which oversees road safety. According to federal filings, engineers with the self-driving vehicle company had “developed software changes to address the behavior” weeks before.
But even after the recall, the school-bus-passing incidents continued, according to school officials and a report from the National Transportation Safety Board (NTSB), an independent federal safety watchdog that’s also investigating the situation.
Now, email and text messages between school officials and Waymo representatives, obtained by WIRED through a public records request, show the lengths to which the Austin public school district and Waymo went to try to solve the problem. AISD even hosted a half-day “data collection” event in a school parking lot in mid-December, the documents show, with several employees pulling together school buses from across the district’s fleet so the self-driving car company could gather information on the vehicles, their stop-arm signals, and their flashing lights.
Still, by mid-January, over a month later, the school district reported that at least four more school-bus-passing incidents had taken place in Austin. “The data we collected from the beginning of the school year to the end of the semester shows that about 98 percent of people that receive one violation do not receive another,” an official with the district’s police department told the local NBC affiliate that month. “That tells us that the person is learning, but it does not appear the Waymo automated driver system is learning through its software updates, its recall, what have you, because we are still having violations.”
The situation raises questions about self-driving technology’s curious blind spots and the industry’s ability to compensate for them even after they’ve been spotted.
Self-driving software has long struggled with recognizing flashing emergency lights and road safety devices with long, thin arms, including gates and stop-arms, says Missy Cummings, who researches autonomous vehicles at George Mason University and served as a safety adviser to the NHTSA during the Biden administration. “If [the company] didn't fix this a few years ago, the more they drive, the more it’s going to be a problem,” she says. “That’s exactly what’s happening here.”
Waymo did not respond to WIRED’s requests for comment. A spokesperson for the Austin Independent School District referred WIRED to the NTSB while the incidents are under investigation. A spokesperson for the NTSB declined to answer WIRED’s questions while its investigation continues.
Illegal Passing
By early December 2025, AISD officials were frustrated. In one of the 19 incidents alleged by a lawyer for the district in a letter later released by federal road safety regulators, a Waymo passed a school bus letting off children “only moments after a student crossed in front of the vehicle, and while the student was still in the road.”
“Alarmingly,” the lawyer wrote, five of the alleged incidents had occurred after Waymo had assured the district that it had updated its software to fix the problem. Federal regulators with the NHTSA had already launched a probe into the behavior. “Austin ISD is evaluating all potential legal remedies at its disposal and intends to take whatever action is necessary to protect the safety of its students, if required,” the lawyer warned.
Just a few days later, on the same early December day that news of Waymo’s recall broke, Waymo’s Emergency Response and Outreach manager, Rob Patrick, called the assistant chief of the Austin district’s police department, according to records obtained by WIRED. Patrick and Waymo had a proposal, according to an email the assistant chief, Travis Pickford, sent to counterparts at the district: The company wanted to collect data on the district’s school buses, presumably so its cars could more readily identify when buses shouldn’t be passed.
“Specifically, they wanted to focus their data collection on the amber and red light signals on each of our school buses, and their cars’ ability to see them at varying distances,” Pickford wrote in the email.
By the following Monday, three days later, school transportation officials had agreed to gather at least seven buses, representing all the models in the district’s more than 550-vehicle fleet, at the district’s athletic complex. Officials even passed Waymo the specifications of the buses’ lighting setups. By mid-afternoon the following Wednesday, the data collection event was over, and Waymo told school officials the company had what it needed.
But Waymo’s school-bus-passing incidents didn’t stop.
A preliminary report by the NTSB published in early March found that one ensuing incident, on January 12, occurred after a Waymo remote assistant, a Michigan-based human tasked with “helping” the software when it struggles on the road, incorrectly told the robotaxi that the school bus ahead of it didn’t have its signals active. Six vehicles passed the school bus while it was stopped, the agency said. The NTSB is still investigating.
Just days after those January incidents, a Waymo struck a child crossing in front of the vehicle near a school in Santa Monica, California. The child was reportedly unharmed, and Waymo later said that its models showed a human driver would have hit the child at a higher speed than its car did.
Robot Learning
Cummings, the autonomous vehicle researcher, says she’s not surprised that Waymo’s data-collection event didn’t solve all its school bus problems. It can take weeks or even months for software developers to use new data to train a self-driving car’s AI-driven systems. What’s more, data collected in an athletic facility’s parking lot, outside of the context in which Waymo’s cars usually drive, might not help much. “Data generated in a parking lot is not going to be sufficient,” she says.
Waymo, Cummings says, “should not be allowed to operate around schools during school pickup and drop-off until they get this problem fixed and can demonstrate it with specific tests.”
Philip Koopman, an autonomous-vehicle software and safety researcher at Carnegie Mellon University, says the school bus incidents might be particularly challenging for Waymo’s software because stop signs mean slightly different things in different contexts. There are stop signs at intersections, stop signs held by construction workers directing traffic, and stop signs attached to school buses. Waymo is trying to teach its software “something very subtle,” he says.
The whole episode points to the complex challenges of operating a software-driven robot on sometimes unpredictably human roads. Teaching software to drive safely 99 percent of the time is the easy part. But “the last 1 percent is really tough, because we’re trying to teach an exception,” Koopman says.
“Waymo is struggling to teach their machine learning the lesson Waymo wants it to learn,” Koopman says. “That’s not a surprise. This was always going to be a problem.”