Tesla knew Autopilot caused death, but didn’t fix it: Software’s alleged inability to handle cross traffic central to court battle after two road deaths

  • dub@lemmy.world · 1 year ago

    A times B times C equals X… I am jacks something something something

    • tool@lemmy.world · 1 year ago

      A times B times C equals X… I am jacks something something something

      Narrator: A new car built by my company leaves somewhere traveling at 60 mph. The rear differential locks up. The car crashes and burns with everyone trapped inside. Now, should we initiate a recall? Take the number of vehicles in the field, A, multiply by the probable rate of failure, B, multiply by the average out-of-court settlement, C. A times B times C equals X. If X is less than the cost of a recall, we don’t do one.

      Woman on Plane: Are there a lot of these kinds of accidents?

      Narrator: You wouldn’t believe.

      Woman on Plane: Which car company do you work for?

      Narrator: A major one.
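The “formula” in the quote above is simple arithmetic. A quick sketch, using entirely made-up numbers for illustration:

```python
# The recall "formula" from the quote: X = A * B * C.
# All figures below are hypothetical, purely to show the calculation.
vehicles_in_field = 1_000_000   # A: number of vehicles in the field
failure_rate = 0.0001           # B: probable rate of failure
settlement_cost = 2_000_000     # C: average out-of-court settlement, in dollars

x = vehicles_in_field * failure_rate * settlement_cost  # X = A * B * C
recall_cost = 300_000_000       # hypothetical cost of a full recall

# "If X is less than the cost of a recall, we don't do one."
initiate_recall = x >= recall_cost
```

With these made-up numbers, X comes to $200M against a $300M recall cost, so the narrator’s company would skip the recall.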

      • droans@lemmy.world · 1 year ago

        When you’re selling a million cars, it’s guaranteed that some of them will have a missed defect, no matter how good your QC is.

        That’s why you have agencies like the NHTSA. You need someone who can decide at what point the issue is a major defect that constitutes a recall.

          • Clent@lemmy.world · 1 year ago

            Correct. They also push updates so they know exactly what software is running.

            They’ve created a chain of liability.

            The big automakers had better be taking notes, because they seem to be trying to follow Tesla down this legal rabbit hole.

  • harold@lemmy.world · edited · 1 year ago

    But I’m saving the planet and making sure Elon gets a cut of my money.

  • Nogami@lemmy.world · edited · 1 year ago

    Calling it Autopilot was always a marketing decision. It’s a driver assistance feature, nothing more. When used “as intended”, it works great. I drove for 14 hours during a road trip using AP and arrived not dead tired and still alert. That’s awesome, and would never have happened in a conventional car.

    I have the “FSD” beta right now. It has potential, but I still always keep a hand on the wheel and am in control of my car.

    At the end of the day, if the car makes a poor choice because of the automation, I’m still responsible as the driver, and I don’t want an accident, injury, or death on my conscience.

  • chakan2@lemmy.world · 1 year ago

    It’s time to give up the Tesla FSD dream. I loved the idea of it when it came out, and believed it would get better over time. FSD simply hasn’t. Worse, Musk has either fired or lost all the engineering talent Tesla had. FSD is only going to get worse from here, and it’s time to put a stop to it.

    • NιƙƙιDιɱҽʂ@lemmy.world · edited · 1 year ago

      The article isn’t talking about FSD, these accidents are from 2019 and 2016 before public availability of FSD. Of course, “Full Self Driving” ain’t great either…

      The whole article is kind of FUD. It’s saying engineers didn’t “fix” the issue, when the issue is that people are using Autopilot, essentially an advanced lane-keeping system, on roads it shouldn’t be used on. It doesn’t give a shit about intersections, stop signs, or stop lights. It just keeps you in your lane and prevents you from rear-ending someone. That’s it. It’s a super useful tool in its element, but it shouldn’t be used outside of freeways or very simple roads at reasonable speeds. That said, it also shouldn’t be fucking called “Autopilot”. That’s purely marketing, and it’s extremely dangerous, as we can see.

  • fne8w2ah@lemmy.world · 1 year ago

    Yet Phoney Stark keeps on whinging about the risks of AI but at the same time slags off humans who actually know their stuff especially regarding safety.

  • _stranger_@lemmy.world · edited · 1 year ago

    There’s like three comments in here talking about the technology; everyone else is arguing about names, as if people are magically absolved of personal responsibility when they believe advertising over common sense.

    • chakan2@lemmy.world · 1 year ago

      Because the tech has inarguably failed. It’s all about the lawyers and how long they can extend Tesla’s irresponsibility.

      • _stranger_@lemmy.world · 1 year ago

        See, I would much rather have this discussion vs another one about advertising and names.

        We’re seeing progress. Ford is expanding features on Blue Cruise (in-lane avoidance maneuvers I believe). I think Mercedes is expanding the area theirs works in. Tesla added off-highway navigation in the last year.

        No one’s reached full autonomy for more than a few minutes or a few miles, but I wouldn’t say there’s no argument there. In fact, I’d say they’re arguably making visible progress.

  • NathanielThomas@lemmy.world · 1 year ago

    I am the last person to defend Elon and his company, but honestly it’s user error. It’s like blaming Microsoft when someone deliberately ignores warnings and downloads viruses. Autopilot should be called driver assist, and people still need to pay attention. These deaths were caused by user negligence.

  • BruinBears@lemmy.world · 1 year ago

    I do agree the name and Tesla’s general advertising of driver assists are a bit misleading.

    But this is really on the driver for not paying attention.

    • Doug7070@lemmy.world · 1 year ago

      “A bit misleading” is, I think, itself a misleading way to describe their marketing. It’s literally called Autopilot, and their marketing material has aggressively pitched it as a “full self driving” feature since the beginning, even without mentioning Musk’s own constant, ridiculous hyperbole when advertising it. It’s software that should never have been tested outside of vehicles run by company employees under controlled conditions, but Tesla chose to push it to the public as a paid feature and significantly downplay the fact that it is a poorly tested, unreliable beta. They did so specifically to profit from the data generated by its widespread use, not to mention the price they charge for it as if it were a normal, ready-to-use consumer feature. Everything about their deployment of the system has been reckless, careless, and actively disdainful of their customers’ safety.