this post was submitted on 05 Nov 2023
195 points (93.7% liked)

Technology


AI companies have all kinds of arguments against paying for copyrighted content: The companies building generative AI tools like ChatGPT say updated copyright laws could interfere with their ability to train capable AI models. Here are comments from OpenAI, StabilityAI, Meta, Google, Microsoft and more.

[–] realharo@lemm.ee 9 points 1 year ago* (last edited 1 year ago) (2 children)

Scale matters. For example:

  • A bunch of random shops having security cameras, where their employees can review footage

  • Every business in a country having a camera connected to a central surveillance network with facial recognition and search capabilities

Those two things are not the same, even though you could say they're "not much different" - it's just a bunch of cameras after all.

Also, the similarity between human learning and AI training is highly debatable.

[–] ChairmanMeow@programming.dev 4 points 1 year ago (1 children)

Both of your examples are governed by the same set of privacy laws, which talk about consent, purpose and necessity, but not about scale. Legislating around scale opens up the inevitable legal quagmires of "what scale is acceptable" and "should activity X be counted the same as activity Y to meet the scale threshold defined in the law".

Scale makes a difference, but it shouldn't make a legal difference w.r.t. the legality of the activity.

[–] lollow88@lemmy.ml 2 points 1 year ago (2 children)

Scale makes a difference, but it shouldn't make a legal difference w.r.t. the legality of the activity.

What do you think the difference between normal internet traffic and a ddos attack is?

[–] fsmacolyte@lemmy.world 2 points 1 year ago (1 children)

Intent is part of it as well. If you have too many people who want to use your service, you're not being attacked, you have an actual shortage of ability to service requests and need to adjust accordingly.
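The distinction being circled here, volume alone versus intent, shows up in how naive traffic checks work. A purely rate-based rule cannot tell a flash crowd of genuine users from a coordinated flood; it only sees scale. A minimal sketch (hypothetical example code, not any real DDoS mitigation system):

```python
from collections import Counter

def flag_heavy_sources(requests, threshold):
    """Flag any source sending more requests than `threshold`.

    `requests` is a list of source identifiers (hypothetical sample data).
    Note what this rule cannot do: it trips identically for a popular
    launch day and for an attack, because it measures only scale,
    not intent.
    """
    counts = Counter(requests)
    return {src for src, n in counts.items() if n > threshold}

# One source sending a burst of traffic trips the rule whether it is a
# misconfigured client, an eager user, or an attacker.
traffic = ["10.0.0.1"] * 500 + ["10.0.0.2"] * 3
print(flag_heavy_sources(traffic, threshold=100))  # {'10.0.0.1'}
```

Real mitigation systems layer on signals beyond raw counts (request patterns, source distribution, protocol anomalies) precisely because scale by itself is ambiguous, which is the point being argued in this thread.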

[–] lollow88@lemmy.ml 0 points 1 year ago (1 children)

In this context I meant that it was the same person doing a "normal" thing at such a scale that it becomes illegal. Scale absolutely can turn an activity from legal to illegal.

[–] bored_runaway@lemmy.world 2 points 1 year ago (1 children)

But isn't it the intent and not the scale that makes it illegal? Scale is only evidence of the intent.

[–] lollow88@lemmy.ml 3 points 1 year ago

I see what you mean. Perhaps cold calling would be a better example then, where it is illegal if it is automated.

[–] ChairmanMeow@programming.dev 2 points 1 year ago (1 children)

Lack of consent and the intent to cause harm.

[–] lollow88@lemmy.ml 1 points 1 year ago (1 children)

Ok, then how about automated cold calling vs "live" cold calling?

[–] ChairmanMeow@programming.dev 1 points 1 year ago (1 children)

Falls under unwanted calls; you should be able to opt out of both (though I believe both are currently legal in the US).

[–] lollow88@lemmy.ml 1 points 1 year ago

You can opt out of both, but automated cold calling is straight up illegal in the UK (and it's a good thing it is).

[–] ryannathans@aussie.zone 1 points 1 year ago

Legally no difference