this post was submitted on 04 Oct 2023
472 points (96.6% liked)


TikTok ran a deepfake ad of an AI MrBeast hawking iPhones for $2 — and it's the 'tip of the iceberg'

As AI spreads, it brings new challenges for influencers like MrBeast and platforms like TikTok aiming to police unauthorized advertising.

[–] KairuByte@lemmy.dbzer0.com 44 points 1 year ago (1 children)

So, the first reason is that existing law likely already covers most deepfake cases. Using one to sell a product? Fraud. Using one to scam someone? Fraud. Using one to make a person say something they didn't? That likely falls under libel.

The second reason is that the current crop of legislators doesn't even understand how the internet works, is probably amazed that cell phones function without magic, and half of them likely have dementia. Good luck getting them to properly understand the problem, never mind come up with a solution that isn't terrible.

[–] Pxtl@lemmy.ca 7 points 1 year ago (2 children)

The problem is that realistically this kind of tort law is hilariously difficult to enforce.

Like, 25 years ago we were pirating like mad, and it was illegal! But enforcement meant suing individual people one at a time, which doesn't scale, so in practice it was unenforceable.

Then the DMCA came along and spelled out what platforms have to do about infringing content (the notice-and-takedown process) to keep their safe harbor. Now every platform heavily automates copyright enforcement.

Because there, it was big moneybags who were being harmed.

But somebody trying to empty out everybody's Gramma's chequing account with fraud? Nope, no convenient platform enforcement system for that.

[–] Ullallulloo@civilloquy.com 2 points 1 year ago (1 children)

You're saying that the solution would be to hold TikTok liable in this case for failing to prevent fraud on its platform? In that case, we wouldn't even really need a new law. Mostly just repealing or adding exceptions to Section 230 would make platforms responsible. That's not a new solution though. People have been pushing for that for years.

[–] Pxtl@lemmy.ca 3 points 1 year ago

The DMCA wasn't a blanket "you're responsible now"; it defined a specific process: how you demand that something be taken down, and what the provider must do in response.

[–] CosmicCleric@lemmy.world 0 points 1 year ago (1 children)

IANAL, but can't MrBeast sue the ad creator company for damaging his reputation?

[–] Petter1@lemm.ee 4 points 1 year ago

Good luck with that, I guess. A company like this will be gone before misterB can even finish writing his lawsuit, and all the scammed money with it. But I'd hope there is at least some law forcing platforms not to promote scams, at least in some countries.