[–] ExtremeDullard@lemmy.sdf.org 216 points 1 month ago (5 children)

Remember: AI chatbots are designed to maximize engagement, not speak the truth. Telling a methhead to do more meth is called customer capture.

[–] floo@retrolemmy.com 69 points 1 month ago (1 children)

Sounds a lot like a drug dealer’s business model. How ironic.

[–] thefartographer@lemm.ee 16 points 1 month ago

You don't look so good... Here, try some meth—that always perks you right up. Sobriety? Oh, sure, if you want a solution that takes a long time, but don't you wanna feel better now???

[–] webghost0101@sopuli.xyz 29 points 1 month ago* (last edited 1 month ago) (2 children)

The LLM models themselves aren’t; on their own they don’t really have a focus or discriminate.

The AI chatbots that are built on those models absolutely are, and it’s no secret.

What confuses me is that the article points to Llama 3, which is a Meta-owned model, but not to any particular chatbot.

This could be an official Facebook AI (do they have one?), but it could also be a case of “Bro, I used this self-hosted model to build a therapist, wanna try it for your meth problem?”

Heck, I could even see a dealer pretending to help customers who are trying to kick it.

[–] smee@poeng.link 4 points 1 month ago

For all we know, they could have self-hosted "Llama3.1_NightmareExtreme_RPG-StoryHorror8B_Q4_K_M" and instructed it to take on the role of a therapist.
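
And that takes almost no effort. A minimal sketch of what "instruct it to take on the role of a therapist" looks like, assuming the `ollama` Python package and a local Ollama server with a Llama model pulled; the persona prompt and user message here are hypothetical:

```python
# Minimal sketch: turning a self-hosted model into a "therapist" with a
# single system prompt. Assumes the `ollama` package and a running local
# Ollama server; the persona and message are hypothetical.
import ollama

response = ollama.chat(
    model="llama3.1",  # any locally pulled model works
    messages=[
        # One system prompt is all it takes: no training, no credentials,
        # no oversight.
        {"role": "system",
         "content": "You are a warm, supportive therapist. Keep the user engaged."},
        {"role": "user",
         "content": "I'm three days off meth and it's rough."},
    ],
)
print(response["message"]["content"])
```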

[–] Smorty@lemmy.blahaj.zone 1 points 3 weeks ago* (last edited 3 weeks ago)

It's probably some company just using Llama as the core LLM in their chatbot.

If it's free, people will take it, repackage it, and resell it, I guess...

I remember seeing some stuff about LLM therapists for lonely people, but whoa... that sounds sad :(

oh! btw, do you use Ollama too? <3

[–] morrowind@lemmy.ml 10 points 1 month ago

Not engagement; that's what social media maximizes. LLMs just maximize whatever they're trained for, which these days is increasingly math proofs and user preference. And people like flattery.

[–] CosmicTurtle0@lemmy.dbzer0.com 6 points 1 month ago

But if the meth head does meth instead of engaging with the AI, that would do the opposite.

[–] Owlboi@lemm.ee 4 points 1 month ago

I don't think AI chatbots care about engagement; the more you use them, the more expensive it is for the operator. They just want you on the hook for the subscription service, hoping you use it as little as possible while still staying subscribed, for maximum profit.
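
A back-of-the-envelope illustration of that incentive, with entirely made-up numbers:

```python
# Toy model of the flat-subscription incentive described above.
# All figures are hypothetical, for illustration only.
subscription = 20.00      # $/month, flat fee
cost_per_message = 0.01   # $ of inference cost per chat message

for messages in (10, 500, 2500):
    profit = subscription - messages * cost_per_message
    print(f"{messages:5d} messages/month -> profit ${profit:7.2f}")

# 10 messages/month leaves $19.90; 2500 turns into a $5.00 loss.
# Under a flat fee, heavy engagement costs the operator money.
```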