this post was submitted on 02 Jan 2026

Technology

archive link

content warning: besides the discussion of CSAM, the article contains an example of a Grok-generated image of a child in a bikini. at least it was consensually generated, by the subject of the photo, I guess?

Samantha Smith, a survivor of childhood sexual abuse, tested whether Grok would alter a childhood photo of her. It did. “I thought ‘surely this can’t be real,’” she wrote on X. “So I tested it with a photo from my First Holy Communion. It’s real. And it’s fucking sick.”

oldguycrusty@mastodon.world · 3 weeks ago

@spit_evil_olive_tips

Yes. Thank you. Exactly.

#Grok is not a person. It's a plug-in.
It has no awareness, intelligence, values, ethics, or even standards.

Grok is an appliance. Like a toaster. It is not sentient.

Blaming Grok for 'making porn' is like blaming a browser for 'showing porn'.

Some person used Grok to make porn.

apotheotic@beehaw.org · 3 weeks ago

More importantly, xAI is building and hosting a tool that can easily create this content