[–] ericisshort@lemmy.world 4 points 1 year ago (1 children)

No, I’m not assuming that. It’s not about concluding AIs are human; it’s about having concrete standards on which to design laws. Setting a lower bar for copyright violation by LLMs would be like setting a lower speed limit for a self-driving car, and I don’t think it makes any logical sense. To me, that would be a disappointingly protectionist and Luddite perspective to apply to this new technology.

[–] lemmyvore@feddit.nl 1 points 1 year ago (1 children)

If LLMs are software, then they can’t commit copyright violations; the onus for breaking the law falls on the people who use them. And until someone proves otherwise in a court of law, they are software.

[–] ericisshort@lemmy.world 3 points 1 year ago

No one is saying we should charge a piece of software with a crime. Corporations aren’t human, but they can absolutely be charged with copyright violations, so being human isn’t a requirement for this at all.

Depending on the situation, you would charge the user of the software (if they directed it to violate copyright) and/or the company that makes it (if it negligently released an LLM proven to produce copyright-infringing output).