this post was submitted on 05 Mar 2024
543 points (97.2% liked)

Technology


A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite some limitations in the data, the chatbot showed promise in deterring illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub's UK site, with hopes that similar measures across other platforms will create a safer internet environment.

[–] FraidyBear@lemmy.world 124 points 8 months ago (4 children)

Imagine a porn site telling you to seek help because you're a filthy pervert. That's gotta push some people to get help, I'd think.

[–] John_McMurray@lemmy.world 45 points 8 months ago* (last edited 8 months ago) (4 children)

Imagine how dumb, in addition to deranged, these people would have to be to look for child porn on a basically legitimate website. Misleading headline too: it didn't stop anything, it just told them "not here."

[–] abhibeckert@lemmy.world 19 points 8 months ago (1 children)

We have culturally drawn a line in the sand where one side is legal and the other side of the line is illegal.

Of course the real world isn't like that - there's a range of material available and a lot of it is pretty close to being abusive material, while still being perfectly legal because it falls on the right side of someone's date of birth.

It sounds like this initiative by Pornhub's chatbot successfully pushes people away from borderline content... I'm not sure I buy that... but if it's directing some of those users to support services then that's a good thing. I worry though some people might instead be pushed over to the dark web.

[–] John_McMurray@lemmy.world 13 points 8 months ago (1 children)

Yeah...I forgot that the UK classifies some activities between consenting adults as "abusive", and it seems some people are now using that definition in the real world.

[–] Scirocco@lemm.ee 2 points 8 months ago (1 children)

Facesitting porn (of adults) is illegal in the UK on the grounds that it's potentially dangerous.

[–] Quicky@lemmy.world 5 points 8 months ago* (last edited 8 months ago)

Which led to some amazing protests.

Weirdly, watching facesitting porn in the UK is perfectly fine, as long as it wasn’t filmed in the UK.

I can just imagine trying to defend that in court. “Your honour, it’s clear to me that the muffled moans of the face-sittee are those of a Frenchman”

[–] A_Random_Idiot@lemmy.world 14 points 8 months ago* (last edited 8 months ago) (1 children)

I mean, is it dumb?

Didn't Pornhub face a massive lawsuit or something because of the amount of unmoderated child porn that was hidden in its bowels by uploaders (in addition to rape victims, revenge porn, etc.), to the point that they apparently only allow verified uploaders now and purged a huge swath of their videos?

[–] theherk@lemmy.world 12 points 8 months ago (1 children)

Until a few years ago, when they finally stopped allowing unmoderated, user-uploaded content, they had a ton of very problematic videos. And they were roasted about it in public for years, including by many of the unconsenting, sometimes underage subjects of those videos, and they did nothing. Good that they finally acted, but they trained users for years that it was a place to find that content.

[–] r3df0x@7.62x54r.ru -2 points 8 months ago

Pornhub also knowingly hosted child porn. Ready or Not put them on blast for it in a mission where you raid a company called "Mindjot" for distributing child porn.

[–] squid_slime@lemmy.world 18 points 8 months ago

"Filthy pervert" is downplaying it, but yeah, I definitely hope to see more of this.

[–] Clbull@lemmy.world 11 points 8 months ago (1 children)

IIRC Xhamster started doing this a few years ago, minus the AI chatbot.

[–] Gabu@lemmy.world 8 points 8 months ago (1 children)

Didn't they just block certain search terms (which actually made the site somewhat difficult to use for legitimate/legal content)?

[–] Deceptichum@sh.itjust.works 3 points 8 months ago* (last edited 8 months ago)