U.S. expands investigation into Waymo over robotaxis driving around stopped school buses

Federal regulators have expanded their investigation into Waymo, the autonomous vehicle company owned by Alphabet Inc., following reports that its self-driving cars have driven around school buses stopped on the road in Austin, Texas. The National Highway Traffic Safety Administration (NHTSA) has been examining whether Waymo's vehicles comply with traffic safety laws.

According to sources, the local school district has reported 20 incidents this school year in Austin in which a Waymo vehicle illegally passed a school bus that had its red lights flashing and its stop arm deployed. NHTSA has noted that all 50 U.S. states have laws requiring drivers to stop for stopped school buses.

Waymo has acknowledged the issue and says it has identified the software problem that contributed to the incidents. The company rolled out updates on November 17 that it says improved its vehicles' performance. Local officials in Austin remain concerned about road safety, however, as Waymo has now received its 20th citation since the school year began.

The Austin Independent School District had asked Waymo to pause operations during the hours when students are loading and unloading from school buses until the software updates were complete and the company could guarantee compliance with traffic laws. Waymo declined.

In response to the NHTSA probe, Waymo highlighted its overall safety record, stating that its vehicles achieve a fivefold reduction in injury-related crashes compared with human drivers. A separate incident in Atlanta, Georgia, in which a Waymo vehicle drove around a stopped school bus with flashing red lights and a deployed stop arm, has added to the concerns.

Waymo plans to file a voluntary software recall with NHTSA next week as part of its commitment to continuous improvement. The company says it will continue analyzing its vehicles' performance and making fixes as needed to protect public safety.
 
πŸ€” I'm not sure about these self-driving cars getting a free pass just because they're 'sophisticated tech'. If a human driver can't be trusted to stop for a school bus, why should a robot? It's just common sense, right? πŸš—πŸ˜• They've got a software problem and now they're making it sound like everything is fine with some fancy stats on safety. I'm not buying it. Why did they refuse to stop ops during school hours until the issue was fixed? That just screams of hubris. Can't put profit over people, no matter how 'innovative' the tech. πŸ€‘πŸ˜’
 
I mean, can't believe Waymo thought it was cool to pass those school buses πŸ™„πŸš—... I get it, self-driving cars are the future and all that, but come on! Kids are just trying to get to school safely πŸ’•. It's like they're saying, 'Oh, we've got fancy tech, we don't need traffic laws!' πŸ˜’ Waymo needs to step up their game and make sure those vehicles can follow basic rules of the road 🚧. And what's with the 'software problem' excuse? Did they not test it on school buses before launching in a city with lots of families?! πŸ€” Just saying, safety should always be the top priority πŸ™.
 
man that's wild 🀯... i get why they're expanding the investigation, but like 20 incidents is still a big deal πŸ™…β€β™‚οΈ. waymo seems to be taking it seriously though, they acknowledged the issue and implemented updates ASAP πŸ’». it's concerning that local officials in austin are still worried about road safety despite the fixes πŸ€”. i guess it's all about prioritizing public safety over tech progress πŸš—πŸ’Έ. hope they can get it right this time πŸ‘
 
Omg what's going on with these self-driving cars πŸš—πŸ˜±?! I mean I know they're supposed to be safer than human drivers but passing a school bus with flashing red lights and a stop-arm signal is just crazy! πŸ™…β€β™‚οΈ Like how can you not see that? It's like, basic traffic law stuff. πŸ€¦β€β™€οΈ

And what's up with Waymo refusing to halt their ops during hours when kids are loading/unloading from school buses?! 🚫 That's just irresponsible. I get it, they want to test and improve but safety should always come first! 😊

But at the same time, 5-fold reduction in injury-related crashes is still a big win for self-driving tech. Maybe this whole thing is just a learning curve? πŸ€” Anyway, can't wait to see what else these companies come up with πŸ’»
 
I think this is kinda crazy 🀯! These self-driving cars are supposed to be the future, but they're still messing up on school buses? It's like, we need to make sure these robots can follow the rules before they're out on the roads, you know? But I'm also thinking that Waymo is being super proactive about this - they're acknowledging the problem and updating their software already πŸ”„. And let's be real, they've had a fivefold reduction in injury-related crashes compared to human drivers... that's some solid stats! So maybe we can't expect perfection just yet, but it's great that they're taking steps to get there πŸ’».
 
πŸ€” This is so weird... I mean, I get that self-driving cars are still a new thing and all, but can't they just follow the rules like everyone else? πŸš—πŸ‘€ 20 times in one school year?! That's crazy! And what if there had been an accident? What would happen then? πŸ€·β€β™€οΈ

I also don't get why Waymo wouldn't stop operations until the software updates were done... I mean, safety first, right? 😊 Can you imagine being on that bus and thinking "oh great, a self-driving car is just going to zip past us"? 😱 No thanks!

And yeah, I guess all 50 states have laws about stopped school buses... πŸ€¦β€β™‚οΈ It's not like Waymo was trying to be malicious or anything... but still, how did this happen? πŸ™„
 
πŸ€” the more I think about this, the more it makes me wonder - what does 'safety' even mean in our society? Is it just a numbers game where we track incidents but not the underlying causes? πŸš—πŸ’‘ like, why did Waymo's vehicles keep passing those stopped school buses? Was it laziness or was it something deeper? Are we so caught up in the thrill of technology that we forget about human life at the crossroads? 🚨 I mean, what's the value of safety if it just means following rules without really thinking about the consequences...
 
πŸš—πŸ‘€ I'm getting some major deja vu vibes here. We've been talking about self-driving cars for years, and now this is happening... again πŸ€¦β€β™‚οΈ. I mean, I get it, tech companies are pushing the limits of innovation, but come on! πŸ™„ Can't we just take a moment to review the basic rules of traffic safety? 🚧

I'm not saying Waymo's software problem was catastrophic or anything, but a simple recall and some updated code isn't going to cut it this time 🀯. We need concrete changes, like better human oversight or more rigorous testing protocols πŸ”.

And what really gets me is that local officials in Austin are being told to "trust" the company's software updates πŸ™…β€β™‚οΈ. I mean, come on! As a visual thinker, I'd draw a big ol' red flag here 🚨. We need transparency and accountability, not just a pat on the back for good intentions πŸ‘.

Anyway, I'm rooting for Waymo to get this sorted out ASAP ⏱️. We can't afford any more close calls like these πŸš—πŸ’₯.
 
omg can't believe waymo's self-driving cars are passing stopped school buses πŸš—πŸ˜± that's like, so not safe at all?! i mean i get it they wanna revolutionize transportation but come on folks gotta prioritize road safety first! πŸ™ 20 incidents in one city already? that's just crazy talk what if a human driver was behind the wheel? would waymo have gotten away with it too? 😳
 
I'm super concerned about this πŸ€”. Like, I get that self-driving cars are the future and all, but can't they just be more careful? πŸš— These 20 incidents in Austin where a Waymo vehicle passed a stopped school bus with flashing red lights... it's just not right πŸ™…β€β™‚οΈ. What if a kid got hurt or something? 😨 I don't think it's okay for them to just keep going, even if they say they've fixed the software issue πŸ€·β€β™‚οΈ.

And what really gets me is that Waymo refused to stop their operations during school hours until the issue was resolved 🚫. Like, aren't kids more important than some new tech? πŸ€” Not saying Waymo doesn't care about safety or anything, but come on! They need to take this seriously and not just sweep it under the rug πŸ’Έ.

I don't know, maybe I'm just old-fashioned or something πŸ˜‚, but I think we should be more careful when introducing new tech like this πŸ€–. We can't just rush into things without thinking about the consequences 🚨.
 
πŸš—πŸ˜¬ this whole thing is super concerning for me personally... like i get the excitement around self-driving cars but have we really thought through the scenario where a bus just stops on the road with flashing lights? πŸ€¦β€β™‚οΈ it's not exactly rocket science to know that you gotta stop when there are kids around. waymo needs to take full responsibility for this and make sure they're doing everything in their power to prevent these kinds of incidents from happening again... 🚫
 
πŸš—πŸ’‘ I'm low-key worried about these self-driving cars, you know? Like, I get it, they're trying to improve road safety, but this is some concerning stuff. 20 incidents in one school year is a lot, especially when kids are involved. And yeah, Waymo's got that fancy safety record and all, but that doesn't mean squat if their software can be hacked or just plain wrong. πŸ€– They need to take responsibility for their mistakes and not just say "oh, we fixed it now" without putting in the work.

I'm also kinda tired of companies pushing boundaries without thinking about the consequences. Like, what's next? Are they gonna start navigating around stopped ambulances or fire trucks? It's not worth the risk, imo. They need to prioritize public safety over their own interests and get it together before these cars hit the roads full-time.

I don't know, maybe I'm just being paranoid, but this is some shady stuff right here. Can we trust these self-driving cars with our lives? πŸ€”
 
omg this is so worrying 🀯 waymo needs to get their act together ASAP! i mean i'm all for self-driving cars and innovation but not if it puts kids' lives at risk πŸ’” i was shocked to hear that a waymo vehicle drove around a school bus in atlanta, that's just crazy talk 😱 and now they're getting their 20th citation this school year? 🚫 what kind of oversight is happening here?! πŸ€¦β€β™‚οΈ the fact that it took an incident in atlanta for them to acknowledge a software problem is a huge red flag πŸ”” anyway i'm rooting for waymo to fix this issue and make public safety their top priority πŸ’ͺ
 
πŸ’‘ this is a big wake-up call for all of us... we need to be careful what we're rushing into πŸš— think about the consequences of our actions, not just our own convenience πŸ€” Waymo's got some serious work to do to rebuild trust with the public and ensure their self-driving cars are safe on the roads πŸ’―
 
Ugh, this is getting crazy 🀯! How can a self-driving car just pass a school bus like that?! I mean, I know the tech is still new and all, but come on! You'd think they'd have it figured out by now πŸ™„. And to make matters worse, Waymo's all like "oh, we fixed the software" πŸ€ͺ, but what about the other 19 incidents that happened before that update? Are you kidding me?! πŸš—πŸ˜‘
 
I don't think this is a big deal πŸ€”. I mean, school buses are kinda old-school, right? They're not exactly the most futuristic thing on the road. And 20 incidents out of how many total Waymo rides or something? That's like, a drop in the ocean, you know? Plus, they've already fixed the software problem and whatnot. It's not like they intentionally tried to run down kids 🚫. And let's be real, human drivers are way more likely to mess up than self-driving cars. I mean, have you seen those humans on the road sometimes?! πŸ˜‚
 
I'm worried about these self-driving cars still... πŸ€” They're supposed to be safer than human drivers but passing a stopped school bus with flashing red lights is just not right 😬. I get that Waymo's got some good tech going on, but this incident in Austin is major and they need to take responsibility for it πŸ’―. It's like they're saying 'trust us' when they say they've fixed the software issue πŸ€–. But what if they haven't? How many more incidents have there been that we don't know about? 😬 I'm all for innovation, but public safety has gotta be number one πŸ‘₯.
 