this post was submitted on 22 May 2024
296 points (96.8% liked)

News

23287 readers
4563 users here now

Welcome to the News community!

Rules:

1. Be civil


Attack the argument, not the person. No racism/sexism/bigotry. Good faith argumentation only; accusing another user of being a bot or paid actor counts as incivility. Trolling is uncivil and is grounds for removal and/or a community ban. Do not respond to rule-breaking content; report it and move on.


2. All posts should contain a source (url) that is as reliable and unbiased as possible and must only contain one link.


Obvious right- or left-wing sources will be removed at the mods' discretion. We have an actively updated blocklist, which you can see here: https://lemmy.world/post/2246130. If you feel any website is missing, contact the mods. Supporting links can be added in comments or posted separately, but not in the post body.


3. No bots, spam or self-promotion.


Only approved bots, which follow the guidelines for bots set by the instance, are allowed.


4. Post titles should be the same as the article used as source.


Posts whose titles don't match the source won't be removed, but the AutoMod will notify you; if your title misrepresents the original article, the post will be deleted. If the site changed its headline, the bot might still contact you. Just ignore it; we won't delete your post.


5. Only recent news is allowed.


Posts must be news from the most recent 30 days.


6. All posts must be news articles.


No opinion pieces, listicles, editorials, or celebrity gossip are allowed. All posts will be judged on a case-by-case basis.


7. No duplicate posts.


If a source you used was already posted by someone else, the AutoMod will leave a message. Please remove your post if the AutoMod is correct. If the post that matches yours is very old, we refer you to rule 5.


8. Misinformation is prohibited.


Misinformation / propaganda is strictly prohibited. Any comment or post containing or linking to misinformation will be removed. If you feel that your post has been removed in error, credible sources must be provided.


9. No link shorteners.


The AutoMod will contact you if a link shortener is detected; please delete your post if it is right.


10. Don't copy the entire article into your post body.


For copyright reasons, you are not allowed to copy an entire article into your post body. This is an instance-wide rule that is strictly enforced in this community.

founded 1 year ago
MODERATORS
[–] Empricorn@feddit.nl 86 points 5 months ago (5 children)

This is tough. If it was just a sicko who generated the images for himself locally... that is the definition of a victimless crime, no? And it might actually dissuade him from seeking out real CSAM....

BUT, iirc he was actually distributing the material, and even contacted minors, so... yeah he definitely needed to be arrested.

But, I'm still torn on the first scenario...

[–] kromem@lemmy.world 66 points 5 months ago (3 children)

But, I'm still torn on the first scenario...

To me it comes down to a single question:

"Does exposure and availability to CSAM for pedophiles correlate with increased or decreased likelihood of harming a child?"

If there's a reduction effect by providing an outlet for arousal that isn't actually harming anyone - that sounds like a pretty big win.

If there's a force multiplier effect where exposure and availability means it's even more of an obsession and focus such that there's increased likelihood to harm children, then society should make the AI generated version illegal too.

[–] TheDoozer@lemmy.world 52 points 5 months ago (1 children)

Hoooooly hell, good luck getting that study going. No ethical concerns there!

[–] ricecake@sh.itjust.works 13 points 5 months ago

How they've done it in the past is by tracking the criminal history of people caught with csam, arrested for abuse, or some combination thereof, or by tracking the outcomes of people seeking therapy for pedophilia.

It's not perfect due to the sample biases, but the results are also quite inconsistent, even amongst similar populations.

[–] HonoraryMancunian@lemmy.world 19 points 5 months ago

I'm willing to bet it'll differ from person to person, to complicate matters further

I think the general consensus is that availability of CSAM is bad, because it desensitizes and makes harming of actual children more likely. But I must admit that I only remember reading about that and don't have a scientific source.

[–] Dave@lemmy.nz 12 points 5 months ago (2 children)
[–] FaceDeer@fedia.io 53 points 5 months ago (2 children)

Image-generating AI is capable of generating images that are not like anything that was in its training set.

[–] Dave@lemmy.nz 0 points 5 months ago (2 children)

In that case, probably the strongest argument is that if it were legal, many people would get off charges of real CSAM because the prosecutor can't prove that it wasn't AI generated.

[–] FaceDeer@fedia.io 22 points 5 months ago (4 children)

Better a dozen innocent men go to prison than one guilty man go free?

[–] Dave@lemmy.nz 4 points 5 months ago* (last edited 5 months ago) (2 children)

In this case if they know it's illegal, then they knowingly broke the law? Things are still illegal even if you don't agree with it.

Most (many?) Western countries also ban cartoon underage content, what's the justification for that?

[–] FaceDeer@fedia.io 10 points 5 months ago (1 children)

You suggested a situation where "many people would get off charges of real CSAM because the prosecutor can't prove that it wasn't AI generated." That implies that in that situation AI-generated CSAM is legal. If it's not legal, then what does it matter whether it's AI-generated or not?

[–] Dave@lemmy.nz 1 points 5 months ago (1 children)

That's not quite what I was getting at over the course of the comment thread.

In one scenario, AI material is legal. Those with real CSAM use the defense that it's actually AI, and you can't prove otherwise. In this scenario, no innocent men are going to prison, but most guilty men aren't either.

In the second scenario, we make AI material illegal. Now the ones with real CSAM go to prison, and many people with AI material do too, because it's illegal and they broke the law.

[–] FaceDeer@fedia.io 6 points 5 months ago

This comment thread started with you implying that the AI was trained on illegal material, I'm really not sure how it's got to this point from that one.

[–] HubertManne@kbin.social 3 points 5 months ago

I'm completely against restrictions on art depictions and writing. Those don't carry the danger of real material being passed off as fake.

[–] Chainweasel@lemmy.world 0 points 5 months ago

If it's illegal, and they produce the AI CSAM anyway, they've broken the law and are by definition not innocent.

[–] HubertManne@kbin.social 3 points 5 months ago

this is the real problem.

[–] Empricorn@feddit.nl 1 points 5 months ago

Very, very good point. Depending on the answer, I retract the "victimless" narrative.

[–] Corkyskog@sh.itjust.works 7 points 5 months ago

I'm fine with it just being illegal, but realistically you could just ban the transmission and distribution of it and then you cover enforceable scenarios. You can police someone sending or posting that stuff, it's probably next to impossible to police someone generating it at home.

[–] lolrightythen@lemmy.world 3 points 5 months ago

Agreed. And props for making a point that isn't palatable. The first one is complicated. Not many folk I talk to can set aside their revulsion and consider the situation logically. I wish we didn't have to in the first place.

[–] 0110010001100010@lemmy.world 1 points 5 months ago (3 children)

It's interesting you bring this up. Not long ago I was having basically this exact same discussion with my brother. Barring your second point, I honestly don't know how I feel.

On the one hand - if it's strictly images for himself and it DOES dissuade seeking out real CSAM (I'm not convinced of this) then I don't really see the issue.

On the other hand - I feel like it could be a gateway to something more (your second point). Kinda like a drug, right? You need a heavier and heavier hit to keep the same high. Seems like it wouldn't be a stretch to go from AI generated imagery to actual CSAM.

But yeah, I don't know. We live in an odd time for sure.

[–] Fal@yiffit.net 15 points 5 months ago (2 children)

On the other hand - I feel like it could be a gateway to something more

You mean like marijuana and violent video games?

[–] ricecake@sh.itjust.works 6 points 5 months ago

Except in the case of pornography, it's an open question if viewing it has a net increase or decrease in sexual desire.
With legal pornography, it's typically correlated with higher sexual desire. This tracks intuitively, since the existence of pornography does not typically seem to line up with a drop in people looking for romantic partners.

There's little reason to believe it works the other way around for people attracted to children.
What's unknown is if that desire is enough to outweigh the legal consequences they're aware of, or any social or ethical boundaries present.
Studies have been done, but finding people outside of the legal system who abuse children is exceptionally difficult, even before the ethical obligation to report them to the police would trash the study.
So the studies end up focusing either on people actively seeking treatment for unwanted impulses (less likely to show a correlation), or people engaged with the legal system in some capacity (more likely to show correlation).

[–] Empricorn@feddit.nl -5 points 5 months ago

Holy strawman, Batman! Just because someone uses the term "gateway" doesn't mean they think that games and weed are going to turn all people and frogs gay and violent.

[–] agamemnonymous@sh.itjust.works 13 points 5 months ago

First off, this is obviously a sticky topic. Every conversation is controversial and speculative.

Second, I don't really see a lot of legitimacy to the "gateway" concept. The vast majority of people use some variety of drug (caffeine, alcohol, nicotine), and that doesn't reliably predict "harder" drug use. Lots of people use marijuana, and that doesn't reliably predict hard drug use either. Obviously, the people who use heroin and meth have probably used cocaine and ketamine, and weed before that, and alcohol/caffeine/nicotine before that, but that's not really a "gateway" pipeline so much as passing through finer and finer filters. As far as I know, the concept has fallen pretty heavily out of favor with serious researchers.

In light of that perspective, I think you have to consider the goal. Is your goal to punish people, or to reduce the number and severity of victims? Mine is the latter. Personally, I think this sort of thing peels off many more low-level offenders to low-effort outlets than it emboldens to higher-severity outlets. I think this is ultimately a mental-health problem, and zero-tolerance mandatory reporting (while well-meaning) does more harm than good.

I'd rather that those with these kinds of mental issues have (1) the tools to take the edge off in victimless ways, and (2) safe spaces to discuss these inclinations without fear of incarceration. I think blockading those avenues yields a net increase in the number and severity of victims.

This seems like a net benefit, reducing the overall number and severity of actual victims.

[–] Empricorn@feddit.nl 2 points 5 months ago* (last edited 5 months ago)

Thanks for being honest and well-meaning. Sorry you're getting downvoted, we both said pretty much exactly the same thing! A difficult subject, but important to get right...