from Hacker News

Troubles with Tesla's automated parking summon safety regulators

by stablemap on 10/2/19, 9:52 PM with 43 comments

  • by Shank on 10/2/19, 11:21 PM

    The real problem here is that Tesla is operating under the assumption that people obey the warnings it gives them, and that they behave rationally around their luxury cars. They do not. In a perfectly rational world, people would read the warnings and understand the risks, and thus not "expect" the Tesla to detect things that it warns it cannot detect.

    The warnings say you need direct line of sight, that the system isn't perfect, and that it may not detect all obstacles, even ones expected to be in parking lots. Those warnings all make sense, but the people recording the videos don't care. They're just pressing the button and being shocked that it doesn't work.

    Ultimately, regulators will step in if they feel that people still get into crashes despite the warnings. If the warnings don't stop people from doing stupid things, regulators will require more warnings or kill the feature. Unfortunately for Tesla, the regulators' focus is public safety (and the underlying statistics). If the tables turn, it doesn't matter how safe the feature is when used correctly. Instead, it matters how safe the feature is when used incorrectly.

    And that will only hurt Tesla in the long run. And that's a shame, because it will make true self driving cars that much harder to get to market.

  • by pdq on 10/2/19, 11:32 PM

    Another point which needs to be brought up is insurance. If accidents are being caused by this technology, insurers are eventually going to either increase rates on cars with FSD or stop offering insurance on cars with FSD altogether.

    IMO if Tesla were really confident in their summon technology, they would cover accidents with it themselves. Relying on errors with their technology to be covered by outside insurance is reckless.

  • by powerbroker on 10/3/19, 12:45 AM

    I tried the V10 SW a few days ago. In sunny, dry weather, with no debris and 0 traffic. In a flat, well-maintained parking lot. A dog with a paper bag over its head would navigate better. I won't trust it until they are 10 iterations into it.

  • by Glawen on 10/3/19, 8:00 AM

    I find it funny that the US requires mirrors to be engraved with "Objects in mirror are closer than they appear", and yet allows the sale of cars with dubious, misleading autonomous driving claims.

  • by underwater on 10/2/19, 11:28 PM

    "I expected the Tesla to 'see' it and stop, however I had to take my finger off the (app) button when I saw that my Tesla wasn't slowing down."

    Your car is not a fucking toy. He could have injured the occupants of the other car while he watched and filmed because he wanted to play with a cool feature without first learning how it works.

    If you don't know what's going to happen, don't assume your automated car knows how to avoid a collision.

  • by whatever1 on 10/3/19, 6:57 PM

    If I drive drunk then I have to face legal consequences, even if I don’t cause an accident. Just because of the higher probability of causing an accident.

    Tesla, on the other hand, can freely release half-baked software that kills people, causes accidents in which it is 100% at fault, and now threatens kids in shopping malls, all without any consequences.

  • by vkou on 10/2/19, 10:51 PM

    As well they should, for a couple of reasons.

    1. Legally, just because you're a licensed driver does not mean you've been licensed to remote-operate a car while standing on the sideline, in a pedestrian-heavy environment, where your line of sight and your vehicle's path of travel are obstructed by hazards.

    2. Teslas seem incapable of reliable collision avoidance, so you can't just expect your car to do the right thing.

  • by jmpman on 10/3/19, 5:14 AM

    Maybe Tesla is coaxing insurers into disallowing claims when using Summon. This would then force Tesla customers to switch to Tesla insurance.

  • by bob33212 on 10/2/19, 10:49 PM

    How much authority do they have on private property? If I buy a farm, who has the authority to stop me from driving remote-controlled or autonomous cars on that farm? If I make that farm accessible to the public to buy my vegetables, how does that change?

  • by tibbydudeza on 10/3/19, 4:43 AM

    Yet another dumb boneheaded Elon move from Tesla ... cars are not browsers.

    It is one thing to test "beta" software on your PC, but it is quite another when it can actually kill people.

    They should look at how medical devices are developed and certified.

  • by dang on 10/2/19, 10:57 PM

  • by sschueller on 10/3/19, 7:44 PM

    So everyone who has this feature purchased the full autonomous driving upgrade for $6000.

    Does anyone really think that will become a reality before the car's end of life?