this post was submitted on 05 Jun 2025
968 points (98.8% liked)

Not The Onion

16546 readers
888 users here now

Welcome

We're not The Onion! Not affiliated with them in any way! Not operated by them in any way! All the news here is real!

The Rules

Posts must be:

  1. Links to news stories from...
  2. ...credible sources, with...
  3. ...their original headlines, that...
  4. ...would make people who see the headline think, “That has got to be a story from The Onion, America’s Finest News Source.”

Please also avoid duplicates.

Comments and post content must abide by the server rules for Lemmy.world and generally abstain from trollish, bigoted, or otherwise disruptive behavior that makes this community less fun for everyone.

And that’s basically it!

founded 2 years ago
MODERATORS
[–] ExtremeDullard@lemmy.sdf.org 213 points 2 days ago (5 children)

Remember: AI chatbots are designed to maximize engagement, not speak the truth. Telling a methhead to do more meth is called customer capture.

[–] floo@retrolemmy.com 69 points 2 days ago (1 children)

Sounds a lot like a drug dealer’s business model. How ironic

[–] thefartographer@lemm.ee 16 points 2 days ago

You don't look so good... Here, try some meth—that always perks you right up. Sobriety? Oh, sure, if you want a solution that takes a long time, but don't you wanna feel better now???

[–] Owlboi@lemm.ee 4 points 1 day ago

I don't think AI chatbots care about engagement. The more you use them, the more expensive it is for them. They just want you on the hook for the subscription and hope you use it as little as possible while still using it enough to stay subscribed, for maximum profit.

[–] morrowind@lemmy.ml 10 points 1 day ago

Not engagement; that's what social media does. They just maximize what they're trained for, which is increasingly math proofs and user preference. People like flattery.

[–] webghost0101@sopuli.xyz 27 points 2 days ago* (last edited 2 days ago) (1 children)

The LLM models aren't; they don't really have focus or discrimination.

The AI chatbots that are built on those models absolutely are, and it's no secret.

What confuses me is that the article points to Llama 3, which is a Meta-owned model, but not to any particular chatbot.

This could be an official Facebook AI (do they have one?), but it could also be: "Bro, I used this self-hosted model to build a therapist, wanna try it for your meth problem?"

Heck, I could even see a dealer pretending to help customers who are trying to kick the habit.

[–] smee@poeng.link 2 points 1 day ago

For all we know, they could have self-hosted "Llama3.1_NightmareExtreme_RPG-StoryHorror8B_Q4_K_M" and instructed it to take on the role of a therapist.
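That kind of "instructed role" is usually nothing more than a system prompt. A minimal sketch of what that request might look like, assuming an Ollama-style local chat API; the endpoint shape, model tag, and prompt wording here are illustrative assumptions, not anything confirmed by the article:

```python
import json

def build_therapist_payload(user_message: str) -> dict:
    """Build a chat request that tells a self-hosted model to play a therapist.

    Hypothetical example: the payload shape follows an Ollama-style
    local chat API ("model", "messages", "stream"); adjust for whatever
    server you actually run.
    """
    return {
        "model": "llama3",  # assumed local model tag
        "messages": [
            # The "role" is just a system prompt, nothing more exotic.
            {
                "role": "system",
                "content": (
                    "You are a supportive addiction-recovery therapist. "
                    "Never encourage drug use."
                ),
            },
            {"role": "user", "content": user_message},
        ],
        "stream": False,
    }

payload = build_therapist_payload("I'm struggling to stay off meth this week.")
print(json.dumps(payload, indent=2))
```

Whether the model actually obeys that system prompt is a different question, which is rather the point of the thread.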

But if the meth head does meth instead of engaging with the AI, that would do the opposite.