Tesla recalls nearly all vehicles sold in US to fix system that monitors drivers using Autopilot::Tesla is recalling nearly all vehicles sold in the U.S., more than 2 million, to update software and fix a defective system when using Autopilot.

  • givesomefucks@lemmy.world · 1 year ago

    The attempt to address the flaws in Autopilot seemed like a case of too little, too late to Dillon Angulo, who was seriously injured in a 2019 crash involving a Tesla that was using the technology along a rural stretch of Florida highway where the software isn’t supposed to be deployed.

    "This technology is not safe, we have to get it off the road,” said Angulo, who is suing Tesla as he recovers from injuries that included brain trauma and broken bones. “The government has to do something about it. We can’t be experimenting like this.”

    This is the important part: it’s not just Tesla not giving a shit about their customers’ safety, it’s dangerous to literally anyone on the same road as a Tesla.

  • proudblond@lemmy.world · 1 year ago

    I live in a major metropolitan area, drive a model 3, and almost never use autopilot.

    I am lucky enough to rarely be in stop-and-go traffic, but when I am, I don’t even use cruise control, because it’s too reactive to the car in front of me and subsequently too jerky for my preference.

    As for autopilot, I was on a relatively unpopulated freeway in the second lane from the right, when a small truck came around a cloverleaf to merge into the right lane next to me. My car flipped out and slammed on the brakes. The truck wasn’t even coming into my lane; he was just merging. Thankfully there was a large gap behind me, and I was also paying enough attention to immediately jam on the accelerator to counteract it, but it spooked me pretty badly. And this was on a road that it’s designed for.

    Autopilot (much less FSD) can’t really think like our brains can. It can only “see” so far ahead and behind. It can’t look at another driver’s behavior and assess that they might be distracted or drunk. We’re not there yet. We’re FAR from there.

  • ieightpi@lemmy.world · 1 year ago

    Why can’t we just restrict Autopilot to interstate systems for the time being? There’s no reason a person needs to be attentive for miles on end with no stops. And if the tech isn’t there for complicated traffic patterns in suburban and urban environments, make it inaccessible unless you are on the interstate. Seems like an easy enough fix for the time being.