What an odd thing to say...

p03locke@lemmy.dbzer0.com · 10 points · 1 day ago

> There are around a million people dying from cars every year, and we just shrug and normalize it. Human or not, we just have to have cars, and “accidents” are just that.

The difference is accountability. If a human kills another human in a car accident, they are liable, even criminally liable under the right circumstances. If a driverless car kills a human in an accident, you're left with a lose-lose scenario, depending on the legal implementation:

  1. If the car manufacturer says that somebody must be behind the wheel, even though the car is doing all of the driving, that person is suddenly liable for the accident. They are expected to just sit there and watch for a potential accident, but what the AI model will actually do is undefined. Is the model going to stop in front of that pedestrian as expected? How long do they wait before taking back control? It's not like cruise control, a feature that only controls part of the car, where they know exactly how it behaves and when to take over. It's the equivalent of asking a person to watch a panel with a single red light for an hour and push a button as fast as possible when it blinks for half a second. (A rough takeover-time sketch follows this list.)

  2. If the model is truly driverless (like these taxis), then NOBODY is liable for the accident. The company behind it might get sued, or might end up in a class-action lawsuit, but there is no criminal liability, and none of those lawsuits will carry enough financial impact to force change. The companies have no incentive to fix their software, and will keep parroting this shitty line about how it's somehow better than humans at driving, despite these easily hackable scenarios and zero accountability.
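To make point 1 concrete, here's a rough back-of-the-envelope sketch in Python. The speeds and reaction times are assumptions for illustration, not figures from the comment or any study; the point is just how far a car travels before a passive supervisor even begins to intervene.

```python
# Rough sketch of the takeover problem: how far does the car travel
# while a human "supervisor" notices something is wrong and reacts?
# All speeds and reaction times below are assumed for illustration.

MPH_TO_MPS = 0.44704  # 1 mph = 0.44704 metres per second

def distance_during_takeover(speed_mph: float, reaction_s: float) -> float:
    """Metres travelled before the human provides any input at all."""
    return speed_mph * MPH_TO_MPS * reaction_s

# ~1 s is a common figure for an engaged driver; a passive supervisor
# deciding whether the AI will handle it can plausibly take far longer.
for speed_mph in (30, 45, 70):
    for reaction_s in (1.0, 2.5, 4.0):
        d = distance_during_takeover(speed_mph, reaction_s)
        print(f"{speed_mph:>3} mph, {reaction_s:.1f} s takeover -> "
              f"{d:5.1f} m with no human input")
```

At 70 mph, even a 2.5-second takeover means the car covers roughly 78 metres before the human does anything at all, which is why supervising a system with undefined behavior is a very different task from using cruise control.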

Humans have an incentive to not kill people, since nobody wants to have that on their conscience, and nobody wants to go to prison over it.

Corporations don't. In fact, they have an incentive to kill people over profits, if the choice presents itself!