Tesla's self-driving cars under fire again

XPJunkie

Well-known member
Tesla's driver-assistance software is under intense scrutiny once again after the US National Highway Traffic Safety Administration (NHTSA) opened a new investigation into reported safety violations. The agency's inquiry covers roughly 2.88 million Tesla vehicles equipped with the "Full Self-Driving" (FSD) feature, citing reports that the software has driven vehicles in ways that break traffic laws and have led to crashes.

According to the reports, some Teslas running FSD have been observed running red lights, drifting into oncoming lanes, and even crashing at intersections. Of the incidents reported to the agency, 23 resulted in injuries, while a smaller number involved actual crashes. NHTSA says the FSD software may have induced vehicle behavior that violates traffic safety laws.

The investigation comes as no surprise to many experts, who have long argued that Autopilot and FSD are not ready for unsupervised use on public roads. Despite the branding, "self-driving" here still means supervised driving, and regulators continue to stress that constant driver oversight is required for safety.

Tesla has been facing mounting pressure from regulators and lawmakers over its handling of Autopilot-related incidents. In one high-profile case, a Florida jury awarded roughly $329 million in damages after an Autopilot-related crash killed a woman. The company is also fighting a false-advertising case brought by California's DMV, which argues that calling the software "Full Self-Driving" is misleading.

As the investigation into Tesla's FSD system unfolds, drivers of affected vehicles are advised to remain vigilant and exercise caution on the roads. With safety regulators circling and lawsuits piling up, it remains to be seen whether Tesla can regain public trust in its driver-assistance technology.

For those considering purchasing a Tesla, the latest developments serve as a sobering reminder that even the most advanced technologies still require human oversight for safe operation. As experts continue to push for improved safety standards, one thing is clear: the future of self-driving cars will not arrive without first addressing the safety concerns of today's vehicles.
 
omg like what's new with tesla right? they're just gonna keep pushing forward no matter how many people get hurt or their company gets sued lol. anyone who thinks autopilot is safe needs to take a chill pill. all this fuss over 2.88 million cars, though...i guess that's a lot of egos bruised 🙄
 
I was thinking about my cat the other day 🐱... I mean, have you ever noticed how they can sleep anywhere? Like, they just curl up in a ball and voilà! Instant nap 😴... anyway, back to this Autopilot thingy... I don't get why people are so surprised that a car can't even stop at a red light 🚫... like, come on, humans have been doing that for centuries... but seriously, 23 injuries from Teslas? That's crazy! 😲 and what's with the whole "Full Self-Driving" label? Sounds like they're trying to sell us something instead of actually making it safe 🤑... guess I'll stick to my old Honda Civic 👍
 
I don’t usually comment but I feel like Tesla has been really reckless with their Autopilot system from the get go 🤔. They're always pushing for more features and faster deployment, without putting enough thought into whether it's actually safe to use on public roads. And now we're seeing the consequences - people getting hurt or even killed because of a car that's supposed to be "self-driving". It's just not right 😕.

And what really gets me is that Tesla has been making these claims about Autopilot being capable of fully autonomous driving, but it's still basically just an advanced form of cruise control 🚗. I mean, how hard can it be to program a system to always follow the speed limit and obey traffic laws? It sounds like common sense, but apparently not for Tesla 😂.

I think we need more regulation on this stuff, especially when it comes to safety standards. We should be pushing companies like Tesla to make sure their technology is reliable and safe before they start selling it to the public 🚨. Until then, I'll stick to driving my own car - one where I'm in control of what's happening!
 
idk why tesla thinks its autopilot system is ready for prime time yet 😂 i mean, have they even seen my cat try to drive a car? it'd be a disaster 🤣 anyway, back to autopilot... what's up with these red light cameras and fines? can't we just have one set of rules that everyone follows? no more exploiting loopholes or "oh, I didn't see the camera" excuses 🙄
 
I'm low-key skeptical about Tesla's Autopilot system 🤔. I mean, 2.88 million vehicles and 23 injuries? That's a lot of red flags 🚨. And to think they're still calling it "Full Self-Driving" 😂 like that's going to fool anyone. We need more transparency from Tesla about the limits of their FSD software. It's one thing to have autonomous tech, but another to rely on it without human oversight.

I'm not saying other manufacturers aren't working on this stuff too 🤷‍♂️, but at least they're acknowledging the need for caution. Not so with Tesla... it feels like they're pushing the boundaries and expecting everyone else to follow 😒. It's time for them (and the rest of us) to take a step back and rethink what "autonomous" really means 🚗💡
 