riskable

joined 2 years ago
[–] riskable@programming.dev 2 points 1 month ago (3 children)

> If you went to a human illustrator and asked for that, you would (hopefully) get run out of the room or hung up on, because there's a built-in filter for "is this gross / will it harm my reputation to publish."

If there was no filter for the guy that requested the bot create this, what makes you think illustrators will have such a filter? How do you know it's not an illustrator that would make such a thing?

The problem here is human behavior. Not the machine's ability to make such things.

AI is just the latest way to give instructions to a computer. That used to be a difficult problem and required expertise. Now we've given that power to immoral imbeciles. Rather than take the technology away entirely (which is really the only solution, since LLMs are so easy to trick even with a ton of anti-abuse stuff in system prompts), perhaps we should work on taking away the ability of immoral imbeciles to use them instead.

Do I know how to do that without screwing over everyone's right to privacy? No. That, too, may not be possible.

[–] riskable@programming.dev 21 points 1 month ago (1 children)

Correction: Newer versions of ChatGPT (GPT-5.x) are failing in insidious ways. The article has no mention of the other popular services or the dozens of open source coding assist AI models (e.g. Qwen, gpt-oss, etc).

The open source stuff is amazing and gets better just as quickly as the big AI options. But it's boring, so it doesn't make the news.

[–] riskable@programming.dev -1 points 1 month ago (2 children)

Theft is something that happens to physical things. What's actually happening is "copying".

The MPAA/RIAA made the "copying is theft" argument over and over again in the 90s and early 2000s. It was wrong then and it's wrong now.

[–] riskable@programming.dev 2 points 1 month ago (1 children)

Well, the CSAM stuff is unforgivable, but I seriously doubt even the soulless demon that is Elon Musk wants his AI tool generating that. I'm sure they're working on it (it's actually a hard computer science sort of problem: the tool is supposed to generate whatever the user asks for, and there will always be an infinite number of ways to trick it, since LLMs aren't actually intelligent).

Porn itself is not illegal.

[–] riskable@programming.dev 3 points 1 month ago

I don't know, man... Have you even seen Amber? It might be worth an alert 🤷

[–] riskable@programming.dev 2 points 1 month ago

I don't know how to tell you this but... Every body gives a shit. We're born shitters.

[–] riskable@programming.dev 1 points 1 month ago

Good catch!

[–] riskable@programming.dev 95 points 1 month ago (16 children)

The real problem here is that Xitter isn't supposed to be a porn site (even though it's hosted loads of porn since before Musk bought it). They deeply integrated a porn generator into their very publicly accessible "short text posts" website. Anyone can ask it to generate porn inside of any post and it'll happily do so.

It's like showing up at Walmart and seeing everyone naked (and many fucking), all over the store. That's not why you're there (though: Why TF are you still using that shithole of a site‽).

The solution is simple: Everyone everywhere needs to classify Xitter as a porn site. It'll get blocked by businesses and schools and the world will be a better place.

[–] riskable@programming.dev 2 points 1 month ago

"To solve this puzzle, you have to get your dog to poop in the circle..."

[–] riskable@programming.dev 9 points 1 month ago (1 children)

Yep. Stadia also had a feature like this (that no one ever used).

Just another example of why software patents should not exist.

[–] riskable@programming.dev 31 points 1 month ago* (last edited 1 month ago)

It's cold outside all year round and there's abundant geothermal energy. Basically, it's the perfect place to build data centers.
