[–] TheGrandNagus@lemmy.world 174 points 10 months ago (1 children)

Oh no, it's even worse than that.

It's the CEO and other staff repeatedly speaking of the system as if it's basically fully capable, with a driver only required for legal reasons. They've even claimed the car could drive from one side of the US to the other without driver interaction (only to never actually do that, of course).

It's the company never correcting people when they call it a self driving system.

It's the company saying they're ready for autonomous taxis, and claiming owners' cars will earn money for them while they aren't driving.

It's calling their software subscription "Full Self-Driving".

It's honestly staggering to me that they're able to get away with this shit.

[–] meleecrits@lemmy.world 87 points 10 months ago (4 children)

I love my Model 3, but everything you said is spot on. Autopilot is a great driver assist, but it is nowhere near autonomous driving. I was using it on the highway and passing a truck on the left. The road veered left and the truck did as well, keeping in its lane the entire time. The car interpreted this as the truck merging into my lane and slammed on the brakes. Fortunately, I was able to figure out what went wrong and quickly accelerated so as not to become a hazard to the cars behind me.

Using Autopilot as anything more than a nice dynamic cruise control setting is putting your life, and other lives, in danger.
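To make that failure mode concrete, here's a minimal, hypothetical sketch (not Tesla's actual code; the function names, geometry, and thresholds are all made up for illustration) of how a lane-assignment check that assumes a straight road can flag a truck correctly holding a curving adjacent lane as a cut-in:

```python
# Hypothetical sketch, not Tesla's code: why assuming a straight road can
# make a truck holding its own lane on a curve look like it's cutting in.
LANE_WIDTH = 3.7  # metres, typical highway lane


def in_ego_lane(lateral_offset_m: float) -> bool:
    """A detection within half a lane width of the lane centre counts as 'in our lane'."""
    return abs(lateral_offset_m) < LANE_WIDTH / 2


def offset_assuming_straight_road(truck_x: float) -> float:
    """Naive model: the ego lane runs straight ahead, so the truck's
    sideways position relative to our heading *is* its lane offset."""
    return truck_x


def offset_using_curvature(truck_x: float, truck_y: float, curvature: float) -> float:
    """Curvature-aware model: at distance y ahead, the lane centreline has
    drifted sideways by roughly curvature * y**2 / 2, so measure the truck
    against where the lane actually is, not where the car is pointing."""
    lane_centre_x = curvature * truck_y ** 2 / 2
    return truck_x - lane_centre_x


# Road bending left with a 500 m radius; +x is to the right of the car.
curvature = -1 / 500
truck_y = 60.0  # truck 60 m ahead, holding the adjacent right-hand lane
truck_x = LANE_WIDTH + curvature * truck_y ** 2 / 2  # ~0.1 m: almost dead ahead of us

print(in_ego_lane(offset_assuming_straight_road(truck_x)))               # True  -> phantom brake
print(in_ego_lane(offset_using_curvature(truck_x, truck_y, curvature)))  # False -> truck is fine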

[–] Neato@kbin.social 53 points 10 months ago (3 children)

Holy shit. If my car did that once, I'd be a nervous wreck just thinking about using it again.

[–] Wrench@lemmy.world 27 points 10 months ago

I give Teslas more room because I've been brake-checked by them on empty roads before. These ghost-braking problems are prevalent.

[–] snooggums@kbin.social 20 points 10 months ago (1 children)

I've had the adaptive cruise control brake on multiple Hondas and Subarus in similar situations. Not slamming on the brakes, but firm enough to confuse the hell out of me.

It was confusing every time, and now I just don't use it unless the road is open and clear.

[–] buran@lemmy.world 22 points 10 months ago* (last edited 10 months ago)

Honda’s sensing system will read shadows from bridges as obstructions in the road that it needs to brake for. It’s easy enough to accelerate out of the slowdown, but I was surprised to find that there is apparently no radar check to see if the obstruction is real.

My current vehicle doesn’t have that issue, so either the programming has been improved or the vendor for the sensing systems is a different one (different vehicle make, so it’s entirely possible).
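The cross-check buran describes as missing would, in its simplest form, look something like the sketch below. This is purely illustrative and assumes hypothetical detection types rather than Honda's actual interfaces: a camera "obstruction" only triggers braking if radar also reports a return at roughly the same range, which a shadow never produces.

```python
# Hypothetical sketch of a camera/radar cross-check; not any manufacturer's code.
from dataclasses import dataclass


@dataclass
class CameraDetection:
    range_m: float      # estimated distance to the obstruction
    confidence: float   # 0..1 from the vision stack


@dataclass
class RadarReturn:
    range_m: float


def should_brake(cam: CameraDetection, radar_returns: list[RadarReturn],
                 match_tolerance_m: float = 5.0) -> bool:
    """Brake only when vision and radar agree something is there.

    A shadow under a bridge produces a camera detection but no radar
    return, so this gate suppresses the phantom braking event.
    """
    if cam.confidence < 0.5:
        return False
    return any(abs(r.range_m - cam.range_m) <= match_tolerance_m
               for r in radar_returns)


# Shadow on the road: vision flags it, radar sees nothing -> no braking.
print(should_brake(CameraDetection(range_m=40.0, confidence=0.9), []))   # False
# Real stopped car: both sensors agree -> brake.
print(should_brake(CameraDetection(range_m=40.0, confidence=0.9),
                   [RadarReturn(range_m=41.2)]))                          # True
```

The trade-off is that gating on radar also suppresses reactions to things radar handles poorly, such as stationary objects that get filtered out of the radar track list, which is presumably one reason manufacturers differ on how much authority each sensor gets.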

[–] burliman@lemm.ee -1 points 10 months ago (3 children)

That's the bar automated driving is held to. It messes up once, you never trust it again, and the news spins the failure far and wide.

Your uncle doing the same thing just triggers you to yell at him; the guy behind flips him off, he apologizes, you're nervous for a while, and you continue your road trip. Even if he killed someone, we'd blame that one uncle, or at worst some might blame his entire demographic. But we wouldn't say no human should drive again until the problem is fixed, the way we do with automated cars.

I do get the difference between those, and I do think they should keep making automated drivers better, but we can at least agree on the premise: automated cars are held to a seriously unreasonable bar. Maybe that's fair, and we'll never accept anything but perfect, but then we may never have automated cars. And as someone who drives with humans every day, that makes me very sad.

[–] maynarkh@feddit.nl 15 points 10 months ago (1 children)

There is a big difference between Autopilot and that hypothetical uncle. If the uncle causes an accident or breaks shit, he or his insurance pays. Autopilot doesn't.

By your analogy, it's like putting a ton of learner drivers on the road with unqualified instructors, without telling the instructors they're supposed to be instructing; they think they're just taking a taxi. Except somehow it's their responsibility. And the company, of course, pockets both the instruction fees and the taxi fares.

The bar for self-driving cars to be accepted isn't incredibly high. The only requirement is that they take the blame when they mess up, like every other driver.

[–] burliman@lemm.ee 2 points 10 months ago

Yeah, for sure. Like I said, I get the difference. But ultimately we're talking about injury prevention. If automated cars caused even one fewer death per mile than human drivers, we'd still think they're terrible, even though they'd be saving lives.

And even if they caused only one death per year, we'd hear about it and might still think they're terrible.

[–] Neato@kbin.social 4 points 10 months ago (1 children)

The difference is that Tesla calls it Autopilot when it's really not. It's also clearly not ready for primetime. And auto regulators have pretty strict requirements around reliability and safety.

While it's true that autonomous cars kill FAR fewer people than human drivers, every human is different. If an autonomous driver is subpar and that AI is rolled out to millions of cars, we've vastly lowered the safety of every one of those cars. We need autonomous cars to be better than the best driver because, frankly, humans are shit drivers.

I'm 100% for autonomous cars taking over entirely. But Tesla isn't really trying to get there. They are trying to sell cars and lying about their capabilities. And because of that, Tesla should be liable for the deaths. We already have them partially liable: this case caused a recall of this feature.

[–] Staiden@lemmy.dbzer0.com 4 points 10 months ago* (last edited 10 months ago)

But the vaporware salesman said fully automatic driving was 1 year away! In 2018, 2019, 2020, 2021... He should be held responsible. The guy once said that to further technology, some people will die, and that's just the price we pay. It was in a comment about going to Mars, but we should take that into account for everything he does. If I owned a business and one of my workers died or killed someone because of gross negligence, I'd be held responsible. Why does he get away with it?

[–] SlopppyEngineer@discuss.tchncs.de 0 points 10 months ago

Except Tesla's uncle has brain damage, doesn't really learn from the situation so he'll do it again, and has clones of himself driving thousands of other cars.

[–] Damage@slrpnk.net 8 points 10 months ago

Something like that happened to me while using adaptive cruise control on a rental Jeep Renegade: it slammed on the brakes twice on the highway for no clear reason. I deactivated it before it could try a third time.

[–] Alchemy@lemmy.world 7 points 10 months ago (1 children)

Your car's actions could kill someone.

[–] Speculater@lemmy.world 5 points 10 months ago

That's only an $11.5k fine, though.

[–] LordKitsuna@lemmy.world 1 points 10 months ago

The auto cruise on the Priuses at work does this a lot. If the freeway curves to the left or something, it panics and thinks I'm about to hit the cars in the next lane that are also going through the curve.
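One common way to soften exactly this class of single-glitch panic braking is to require the threat to persist for several consecutive frames before the brakes may engage. Here's a hypothetical sketch of that idea (not any manufacturer's implementation; the class name and frame count are made up):

```python
# Hypothetical sketch: debounce a braking decision so one noisy frame
# mid-curve doesn't slam the brakes.
class BrakeDebouncer:
    """Trigger braking only after N consecutive threat frames."""

    def __init__(self, frames_required: int = 5):
        self.frames_required = frames_required
        self.streak = 0

    def update(self, threat_detected: bool) -> bool:
        # Extend the streak on a detection, reset it the moment the threat vanishes.
        self.streak = self.streak + 1 if threat_detected else 0
        return self.streak >= self.frames_required


debouncer = BrakeDebouncer(frames_required=5)

# One noisy frame mid-curve: no braking.
for frame in [False, False, True, False, False]:
    assert debouncer.update(frame) is False

# A persistent obstacle: braking engages on the fifth consecutive frame.
print([debouncer.update(True) for _ in range(5)])  # [False, False, False, False, True]
```

The cost is latency: every debounce frame delays the reaction to a real obstacle, which is the fundamental tension running through all of these phantom-braking stories.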