this post was submitted on 18 Dec 2023
283 points (93.0% liked)


Data poisoning: how artists are sabotaging AI to take revenge on image generators::As AI developers indiscriminately suck up online content to train their models, artists are seeking ways to fight back.

[–] uriel238@lemmy.blahaj.zone 10 points 10 months ago* (last edited 10 months ago)

The general term for this is adversarial input, and we've seen published reports about it since 2011, when it was considered a threat that CSAM could be overlaid with secondary images so they weren't recognized by Google image filters or CSAM image trackers. Had Apple gone through with its plan to scan private iCloud accounts for CSAM, we might have seen this development accelerate.

So far (AFAIK) we've not seen adversarial overlays on CSAM, though in China the technique is used to deter tracking by facial recognition: human rights activists / mischief-makers overlay their social media pics so that they fail to match security footage.

The thing is, like an invisible watermark, these processes are easy to detect (and reverse) once users are aware they exist. So if a generative AI project knows that some images may be poisoned, it's just a matter of adding a detection-and-removal step to the pathway from candidate image to training database.
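As a toy illustration of that removal step (not how any real scraper or poisoning tool works), even a naive low-pass filter can strip a high-frequency pixel-level perturbation while leaving the underlying image mostly intact. Everything here, the box blur, the synthetic "image," the noise overlay, is an assumption for demonstration:

```python
import numpy as np

def box_blur(img, k=3):
    """Naive box blur: average each pixel over its k x k neighborhood.
    Simple input transformations like this are a classic (if weak)
    countermeasure against high-frequency adversarial perturbations."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    h, w = img.shape
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)

# Toy demo: a smooth synthetic "image" plus a high-frequency overlay
# standing in for a poisoning perturbation.
rng = np.random.default_rng(0)
clean = np.linspace(0, 1, 64)[None, :] * np.ones((64, 1))  # smooth gradient
poison = 0.2 * rng.choice([-1.0, 1.0], size=clean.shape)   # pixel-level noise
poisoned = clean + poison

recovered = box_blur(poisoned)
# Averaging cancels much of the zero-mean noise, so the blurred copy
# is closer to the clean image than the poisoned one was.
print(np.abs(recovered - clean).mean() < np.abs(poisoned - clean).mean())  # True
```

Actual poisoning schemes are designed to survive simple transforms like this, so the sketch is just the first move in the arms race the comment describes.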

Similarly, once enough people start poisoning their social media images, the data scrapers will start scanning for and removing overlays even before the datasets are sold to law enforcement and commercial interests.