by computerliker on 2/11/25, 7:09 PM with 219 comments
by MiscIdeaMaker99 on 2/11/25, 7:35 PM
by dexzod on 2/11/25, 7:43 PM
> Thank you @Tesla for engineering the best passive safety in the world. I walked away without a scratch.
"I walked away without a scratch." This could have easily killed an innocent pedestrian or bicyclist. How is this the best safety engineering? If FSD failed, there should have been some secondary system to detect an imminent collision and apply the brakes.
by nashashmi on 2/11/25, 7:27 PM
My head hurts with how oxymoronic this is. My best guess is he wants to critique Tesla without triggering the ego and arrogance of its owner: “Thank you, sir, for doing great work and for fixing this problem in the future.”
by generalizations on 2/11/25, 7:46 PM
> Soooooo my @Tesla @cybertruck crashed into a curb and then a light post on v13.2.4.
> Thank you @Tesla for engineering the best passive safety in the world. I walked away without a scratch.
> It failed to merge out of a lane that was ending (there was no one on my left) and made no attempt to slow down or turn until it had already hit the curb.
> Big fail on my part, obviously. Don't make the same mistake I did. Pay attention. It can happen. I follow Tesla and FSD pretty closely and haven't heard of any accident on V13 at all before this happened. It is easy to get complacent now - don't.
> @Tesla_AI how do I make sure you have the data you need from this incident? Service center etc has been less than responsive on this.
> I do have the dashcam footage. I want to get it out there as a PSA that it can happen, even on v13, but I'm hesitant because I don't want the attention and I don't want to give the bears/haters any material.
> Spread my message and help save others from the same fate or far worse.
by nuancebydefault on 2/11/25, 7:24 PM
by smitelli on 2/11/25, 7:27 PM
Very glad to hear no pedestrians got hit. Really hope the driver takes some kind of lesson away from this experience.
by arghandugh on 2/11/25, 7:42 PM
by smallpipe on 2/11/25, 7:22 PM
by mmastrac on 2/11/25, 7:42 PM
by huijzer on 2/11/25, 8:55 PM
Snowball: "So FSD failed but you still managed to find a way to praise Tesla. You failed too for not taking over in time. But your concern isn't for the lives of third parties that You and FSD endangered. No, you are worried about Tesla getting bad publicity. You have misplaced priorities."
Jonathan Challinger (the driver who crashed): "I am rightly praising Tesla's automotive safety engineers who saved my life and limbs from my own stupidity, which I take responsibility for. [...]"
Fair points from both sides I think.
by godelski on 2/11/25, 11:37 PM
It is worth noting that this picture is a reply to a screenshot of someone saying the following:
> I've lived in 8 different states in my life and most roads I've seen do everything they can to prevent human error (or at least they do once the human has shown them what they did wrong). The FSD should not have been fooled this easily, but the environment was the worst it could have been, also.
- Tweet source: https://x.com/MuscleIQ2/status/1888695047044124989
I point this out because I think probably the biggest takeaway here is how often people will bend over backwards to reach the conclusion they want, rather than update their model to the new data (akin to Bayesian updating, for you math nerds). While this example is egregious, I think we should all take a hard look at ourselves and ask where we do this too. There's not one among us who isn't resistant to changing our beliefs, yet changing them is probably one of the most important things we can do if we want to improve things. If we have any hope of not being easily fooled by hype, of differentiating real innovation from cons, of avoiding Cargo Cults, then this seems to be a necessity.

It's easy to poke fun at this dude, but are we all certain that we're so different? I would like to think so, but I fear making such a claim repeats the same mistake I/we are calling out.
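A toy illustration of the Bayesian updating mentioned above, as a Python sketch: how much a strongly held prior ("v13 doesn't crash") should move after one credible crash report. All probabilities below are invented purely for illustration and are not real crash statistics.

    # Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)
    # H = "FSD v13 is essentially crash-free", E = "a credible crash report appears"

    def posterior(prior, p_evidence_if_h, p_evidence_if_not_h):
        """Probability of H after seeing evidence E (simple two-hypothesis Bayes update)."""
        numerator = p_evidence_if_h * prior
        marginal = numerator + p_evidence_if_not_h * (1.0 - prior)
        return numerator / marginal

    prior = 0.90                   # assumed strong prior belief in H
    p_report_if_crash_free = 0.05  # assumed chance of a mistaken report even if H is true
    p_report_if_crashes = 0.60     # assumed chance of such a report if H is false

    print(posterior(prior, p_report_if_crash_free, p_report_if_crashes))
    # ~0.43 -- belief should drop well below near-certainty after one credible report

The exact numbers don't matter; the point is that the posterior moves toward the evidence, which is the opposite of bending the evidence to fit the prior.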
by UncleEntity on 2/11/25, 7:27 PM
In all fairness, he really should have been paying attention.
You don't get to abdicate your responsibility to Team Elon because reasons. At the end of the day you will be sitting in the defendant's chair while Tesla will just quietly settle out of court.
by stretchwithme on 2/11/25, 7:24 PM
by Animats on 2/11/25, 7:29 PM
by Animats on 2/11/25, 10:38 PM
Be afraid. Be very afraid.
Tesla is in a bind. They've been promising self-driving Real Soon Now since 2016, with occasional fake demos. Meanwhile, Waymo slowly made it work, and is taking over the taxi and car service industry, city by city.
This is a huge problem for Tesla's stock price and Musk's net worth. Now that everybody in automotive makes electric cars, that's not a high-margin business any more. Tesla is having ordinary car company problems - market share, build quality, parts, service, unsold inventory. Tesla pretends they are a special snowflake and deserve a huge P/E ratio, but that's no longer the reality.
Tesla doesn't want to test in California because of "regulation". This is bogus. The California DMV is rather lenient on testing driverless cars, and California was the first state to allow them. There was no new legislation, so the DMV just copied the procedures for human drivers with a few mods. Companies can get a "learner's permit" for testing with a safety driver easily, and quite a few companies have done that. The next step up is the permit for testing without a safety driver, which is comparable to a regular driver's license. It's harder to get, and there are tests. About a half dozen companies have reached that point. No driving for hire at that level. Finally there's the deployment license, which Waymo and Zoox have. That's like a commercial driver's license, and is hard to get and keep. Cruise had one, but it was revoked after a crash where someone was killed.
That's what really scares Tesla. The California DMV can and will revoke or suspend an autonomous driving license just like they'd revoke a human one. Tesla can't just pay off everyone involved and go on.
Waymos are all over San Francisco and Los Angeles, dealing with heavy traffic, working their way around double-parked cars, dodging bikes, skateboarders, and homeless crazies, backing out when faced with an oncoming truck in a one lane street, and doing OK in complex urban settings. Tesla has never demoed that level of performance. Not even close.
[1] https://www.reuters.com/technology/tesla-robotaxis-by-june-m...
by datadrivenangel on 2/11/25, 7:29 PM
by hermitcrab on 2/11/25, 8:46 PM
by throw7 on 2/11/25, 7:54 PM
Hello? Whether it's Full Self-Driving or not, it's always your fault.
by drewg123 on 2/11/25, 9:11 PM
I try it every major release, and am disappointed every time. In situations where I'd be confident, it is overly cautious. In situations where I'd be cautious, it's overly confident and dangerous.
I think its best use is to keep the car in the lane while I'm distracted by something (pulling out a sandwich to eat, etc.). And it seems like newer Teslas have eye tracking, so it might not even be useful for that.
by agumonkey on 2/11/25, 8:09 PM
by andix on 2/11/25, 8:29 PM
I'm not saying it wasn't FSD, but there is a possibility that FSD wasn't even enabled.
by jeffbee on 2/11/25, 8:11 PM
by belter on 2/11/25, 7:16 PM
― George Carlin
by simonjgreen on 2/11/25, 8:42 PM
by josefritzishere on 2/11/25, 7:45 PM
by ilamont on 2/11/25, 7:17 PM
by rsynnott on 2/11/25, 10:55 PM
by layer8 on 2/11/25, 7:47 PM
by lubujackson on 2/11/25, 8:30 PM
by gargalatas on 2/11/25, 8:48 PM
by ck2 on 2/11/25, 7:37 PM
FSD is clearly not even beta quality.
People keep saying it's "trying to commit suicide",
and it's being fixed on the fly at the cost of everyone else's lives.
But now they are removing federal reporting requirements, so buyers will NEVER know.