A Waymo autonomous Jaguar electric vehicle is seen in Tempe, Ariz., on the outskirts of Phoenix, on Sept. 15. The company is recalling software for its robotaxis after reports that some of them failed to stop for school buses.
Charly Triballeau/AFP via Getty Images
The autonomous ride-hailing service Waymo plans to file a voluntary software recall after several reports that its self-driving taxis illegally passed stopped school buses.
The National Highway Traffic Safety Administration (NHTSA) opened an investigation in October in response to “a media report involving a Waymo AV [autonomous vehicle] that failed to remain stopped when approaching a school bus that was stopped with its red lights flashing, stop arm deployed, and crossing control arm deployed.”
WXIA-TV in Atlanta aired video in September that showed a Waymo vehicle driving around a school bus.
The NHTSA website also includes a letter from the Austin Independent School District, saying the district has documented 19 instances of Waymo vehicles “illegally and dangerously” passing the district’s school buses. The letter, signed by the district’s senior counsel, says in one instance the Waymo vehicle drove past the stopped bus “only moments after a student crossed in front of the vehicle, and while the student was still in the road.”
In a statement emailed to NPR, Waymo Chief Safety Officer Mauricio Peña said that while the company is proud of its safety record, “holding the highest safety standards means recognizing when our behavior should be better.” Peña wrote that Waymo plans “to file a voluntary software recall with NHTSA” and that it “will continue analyzing our vehicles’ performance and making necessary fixes.”
The company says it identified a software issue that contributed to the incidents and believes subsequent updates will fix the problem. Waymo says it plans to file the voluntary recall early next week and notes that no injuries have resulted from the issue.
Waymo is a subsidiary of Alphabet, the parent company of Google. The company has emphasized safety in its public statements, saying its driverless cars crash far less often than human drivers. In the cities where it operates, Waymo says its vehicles have been involved in 91% fewer crashes with serious injuries and 92% fewer crashes with pedestrian injuries.
Independent analyses by the technology news site Ars Technica and the newsletter Understanding AI have supported Waymo’s claim that its autonomous vehicles (AVs) are safer than human drivers. But federal regulators are now asking Waymo for more details about these incidents.
NHTSA reported that Waymo’s AVs passed 100 million miles of driving last July and are adding 2 million miles per week. Based on that data and its discussions with Waymo, the agency said “the probability of other similar incidents in the past is substantial.”
As part of the investigation, NHTSA investigators have sent Waymo a list of specific questions about the incidents. The agency asked Waymo to document similar incidents and describe how it responded to each. Waymo has until January 20, 2026, to respond.
Editor’s note: Google is a financial supporter of NPR.

