[–] fubarx@lemmy.ml 6 points 3 months ago (2 children)

This topic came up when self-driving cars were first emerging. If a car runs over someone, who is to blame?

  • Person in the driver's seat
  • Dealer
  • Car manufacturer
  • Supplier who provided the driving control system
  • The people who designed the algorithm and did the ML training
  • People who wrote and tested the code
  • Insurer

Most of these parties would likely be indemnified by all kinds of legal and contractual agreements, but the fact would remain that someone died.

[–] Badeendje@lemmy.world 1 point 3 months ago (1 child)

Spread it throughout the entire chain, based on value added. Not to the consumer.

So if a car manufacturer adds a shitty 3rd-party self-driving system to their car, the license etc. is 100 euro per car, the car costs the manufacturer 10k, and the dealer sells it for 20k, then liability splits roughly as:

  • 100/20k for the 3rd party
  • 10k/20k for the manufacturer
  • 10k/20k for the dealer

Hmm, how would this work for private resale... Still the dealer, imho.
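
A minimal sketch of that pro-rata split in Python, assuming the hypothetical numbers above (and netting the 100-euro license out of the manufacturer's share so the fractions sum to 100%):

```python
# Hypothetical value-added liability split, using the numbers from the comment above.
sale_price = 20_000  # euros, dealer's sale price

# Value each party added to the final sale price (euros).
value_added = {
    "3rd-party self-driving vendor": 100,    # license fee per car
    "manufacturer": 10_000 - 100,            # wholesale price net of the license
    "dealer": 20_000 - 10_000,               # dealer margin
}

damages = 1_000_000  # hypothetical damages to apportion (euros)

for party, value in value_added.items():
    share = value / sale_price
    print(f"{party}: {share:.2%} of liability -> {damages * share:,.0f} euros")
```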

[–] conciselyverbose@sh.itjust.works 1 point 3 months ago* (last edited 3 months ago) (1 child)

Dealers don't (and shouldn't have to) validate safety features. If the features are approved by the NHTSA, that's the dealer's responsibility handled.

It's all the manufacturer.

[–] Badeendje@lemmy.world 1 point 3 months ago
[–] HauntedCupcake@lemmy.world 1 point 3 months ago

An insurer is an interesting one, for sure. They'd have the stats on how often that AI model makes mistakes and could charge accordingly. They'd also have the funds and the evidence to go after the big corps if their AI was faulty.

They seem like a good starting point, until negligence elsewhere can be proven.
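
A back-of-the-envelope sketch of how an insurer might "charge accordingly", with entirely made-up incident stats for illustration:

```python
# Hypothetical risk-based pricing for an AI driving system; all numbers are
# assumptions, not real actuarial data.
miles_per_year = 12_000              # assumed annual mileage per car
incidents_per_million_miles = 0.5    # assumed at-fault incident rate for this AI model
avg_claim_cost = 250_000             # assumed average payout per incident (euros)
loading = 1.25                       # overhead and profit margin on expected losses

expected_loss = miles_per_year / 1_000_000 * incidents_per_million_miles * avg_claim_cost
annual_premium = expected_loss * loading
print(f"Expected loss: {expected_loss:,.0f} euros/yr -> premium: {annual_premium:,.0f} euros/yr")
```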