Tesla knew Autopilot caused death, but didn’t fix it
Software’s alleged inability to handle cross traffic central to court battle after two road deaths
Ford Pinto says what?
A times B times C equals X… I am Jack’s something something something
Narrator: A new car built by my company leaves somewhere traveling at 60 mph. The rear differential locks up. The car crashes and burns with everyone trapped inside. Now, should we initiate a recall? Take the number of vehicles in the field, A, multiply by the probable rate of failure, B, multiply by the average out-of-court settlement, C. A times B times C equals X. If X is less than the cost of a recall, we don’t do one.
Woman on Plane: Are there a lot of these kinds of accidents?
Narrator: You wouldn’t believe.
Woman on Plane: Which car company do you work for?
Narrator: A major one.
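For anyone who wants the movie’s recall math spelled out, it’s just an expected-cost comparison. Here’s a quick sketch in Python; every number is made up purely for illustration:

```python
# The Fight Club recall formula: X = A * B * C,
# the expected cost of settling instead of recalling.
# All values below are hypothetical.

A = 1_000_000              # vehicles in the field
B = 0.00005                # probable rate of failure
C = 2_000_000              # average out-of-court settlement, in dollars

recall_cost = 150_000_000  # hypothetical cost of a recall, in dollars

X = A * B * C
print(f"Expected settlement cost X: ${X:,.0f}")            # $100,000,000
print("Initiate recall" if X >= recall_cost else "No recall")  # "No recall"
```

Left to that math alone, the recall never happens, which is the whole point of the quote.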
When you’re selling a million cars, it’s guaranteed that some of them will have a missed defect, no matter how good your QC is.
That’s why you have agencies like the NHTSA. You need someone who can decide at what point an issue is a major defect that warrants a recall.
But when it’s the software, it would more than likely be the same on all vehicles.
Correct. They also push updates so they know exactly what software is running.
They’ve created a chain of liability.
The big automakers had better be taking notes, because they seem to be trying to follow Tesla down this legal rabbit hole.
“One of the major ones”
But I’m saving the planet and making sure Elon gets a cut of my money.
Full Self Driving is such a scam name. The feature is Level 2 advanced cruise control.
Calling it Autopilot was always a marketing decision. It’s a driver assistance feature, nothing more. When used “as intended”, it works great. I drove for 14 hours during a road trip using AP and arrived not dead tired and still alert. That’s awesome, and would never have happened in a conventional car.
I have the “FSD” beta right now. It has potential, but I still always keep a hand on the wheel and am in control of my car.
At the end of the day, if the car makes a poor choice because of the automation, I’m still responsible as the driver, and I don’t want an accident, injury, or death on my conscience.
It’s time to give up the Tesla FSD dream. I loved the idea of it when it came out, and believed it would get better over time. FSD simply hasn’t. Worse, Musk has either fired or lost all the engineering talent Tesla had. FSD is only going to get worse from here, and it’s time to put a stop to it.
The article isn’t talking about FSD; these accidents are from 2016 and 2019, before FSD was publicly available. Of course, “Full Self Driving” ain’t great either…
The whole article is kind of FUD. It’s saying engineers didn’t “fix” the issue, when the issue is people using Autopilot, essentially advanced lane keeping, on roads it shouldn’t be used on. It doesn’t give a shit about intersections, stop signs, or stop lights. It just keeps you in your lane and prevents you from rear-ending someone. That’s it. It’s a super useful tool in its element, but it shouldn’t be used outside of freeways or very simple roads at reasonable speeds. That said, it also shouldn’t be fucking called “Autopilot”. That’s purely marketing, and it’s extremely dangerous, as we can see.
Yet Phoney Stark keeps on whinging about the risks of AI, but at the same time slags off humans who actually know their stuff, especially regarding safety.
There are like three comments in here talking about the technology; everyone else is arguing about names, as if people are magically absolved of personal responsibility when they believe advertising over common sense.
Because the tech has inarguably failed. It’s all about the lawyers and how long they can extend Tesla’s irresponsibility.
See, I would much rather have this discussion vs another one about advertising and names.
We’re seeing progress. Ford is expanding features on BlueCruise (in-lane avoidance maneuvers, I believe). I think Mercedes is expanding the area theirs works in. Tesla added off-highway navigation in the last year.
No one’s reached full autonomy for more than a few minutes or a few miles, but I wouldn’t say there’s no argument there. In fact, I’d say they’re arguably making visible progress.
I am the last person to defend Elon and his company, but honestly it’s user error. It’s like blaming Microsoft when users deliberately ignore logic and download viruses. Autopilot should be called driver assist, and people still need to pay attention. These deaths were caused by user negligence.
I do agree the name and Tesla’s general advertising of driver assists are a bit misleading.
But this is really on the driver for not paying attention.
“A bit misleading” is, I think, a bit of a misleading way to describe their marketing. It’s literally called Autopilot, and their marketing material has very aggressively pitched it as a ‘full self driving’ feature since the beginning, even without mentioning Musk’s own constant and ridiculous hyperbole when advertising it. It’s software that should never have been tested outside of vehicles run by company employees under controlled conditions, but Tesla chose to push it to the public as a paid feature and significantly downplay the fact that it is a poorly tested, unreliable beta, specifically to profit from the data generated by its widespread use, not to mention the price they charge for it as if it were a normal, ready to use consumer feature. Everything about their deployment of the system has been reckless, careless, and actively disdainful of their customers’ safety.