this post was submitted on 05 Aug 2024
90 points (98.9% liked)

Technology

[–] NeoNachtwaechter@lemmy.world 34 points 3 months ago (3 children)

When these AIs make autonomous decisions that inadvertently cause harm – whether financial loss or actual injury – whom do we hold liable?

The person who allowed the AI to make these decisions autonomously.

We should handle it the way Asimov showed us: create "robot laws" that are similar to slavery laws:

In principle, the AI is a non-person and therefore a person must take responsibility.

[–] Nommer@sh.itjust.works 6 points 3 months ago

No, you see, the corporations will just lobby until the courts get enough money to classify AI as its own individual entity, just like with Citizens United.

[–] Nomecks@lemmy.ca 4 points 3 months ago* (last edited 3 months ago)

The whole point of Asimov's three laws was to show that they could never work in reality, because it would be very easy to circumvent them.

[–] RandomVideos@programming.dev 1 points 3 months ago

(At least in Romania) if a child commits a crime, the parents are punished.

The person allowing the AI to make these decisions should be punished until the AI is at least 15 years old (and killing it and replacing it with a clone or a better AI with the same name still resets the age to 0).