by stablemap on 10/2/19, 9:52 PM with 43 comments
by Shank on 10/2/19, 11:21 PM
The warnings say you need direct line of sight, that the system isn't perfect, and that it may not detect all obstacles — even ones you'd expect to find in parking lots. Those warnings all make sense, but the people recording the videos don't care. They're just pressing the button and being shocked that it doesn't work.
Ultimately, regulators will step in if they feel that people still get into crashes despite the warnings. If the warnings don't stop people from doing stupid things, they'll require more warnings or kill the feature. Unfortunately for Tesla, the regulators' focus is public safety (and the underlying statistics). Once they get involved, it won't matter how safe the feature is when used correctly; it will matter how safe the feature is when used incorrectly.
That will only hurt Tesla in the long run. And that's a shame, because it will make true self-driving cars that much harder to get to market.
by pdq on 10/2/19, 11:32 PM
IMO if Tesla were really confident in their summon technology, they would cover accidents with it themselves. Relying on outside insurance to cover errors in their technology is reckless.
by powerbroker on 10/3/19, 12:45 AM
by Glawen on 10/3/19, 8:00 AM
by underwater on 10/2/19, 11:28 PM
Your car is not a fucking toy. He could have injured the occupants of the other car while he watched and filmed because he wanted to play with a cool feature without first learning how it works.
If you don't know what's going to happen, don't assume your automated car knows how to avoid a collision.
by whatever1 on 10/3/19, 6:57 PM
Tesla, on the other hand, can freely release half-baked software that kills people, causes accidents while being 100% at fault, and now threatens kids in shopping malls, all without any consequences.
by vkou on 10/2/19, 10:51 PM
1. Legally, just because you're a licensed driver does not mean you're licensed to remote-operate a car while standing on the sideline, in a pedestrian-heavy environment, where your line of sight and your vehicle's path of travel are obstructed by hazards.
2. Teslas seem incapable of reliable collision avoidance, so you can't just expect your car to do the right thing.
by jmpman on 10/3/19, 5:14 AM
by bob33212 on 10/2/19, 10:49 PM
by tibbydudeza on 10/3/19, 4:43 AM
It is one thing to test "beta" software on your PC, but it is quite another thing when it can actually kill people.
They should look at how medical devices are developed and certified.
by dang on 10/2/19, 10:57 PM
by sschueller on 10/3/19, 7:44 PM
Does anyone really think that will become a reality before the car's end of life?