this post was submitted on 28 Apr 2026
875 points (98.3% liked)

Programmer Humor


I stole this from LinkedIn.

[–] scops@reddthat.com 26 points 1 week ago (6 children)

I support a call center and we're about to implement an AI agent. We're paying for a model that essentially can talk and has "learned how to learn", but is otherwise dumb. It's trained on a very small amount of information, anything we'd give to a real agent, plus the public info on our website.

The result of this should be a bot that says, "I don't know, should I transfer you to a real person?" a lot, but should hopefully never hallucinate or teach someone how to build a bomb or something.

Dunno how others do it though
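The guardrail scops describes (answer only from a small, vetted knowledge base; otherwise say "I don't know" and offer a human handoff) can be sketched roughly like this. Everything below is illustrative, not their actual stack: the knowledge base, topics, and function names are made up.

```python
# Hypothetical sketch of a call-center bot that answers only from a
# small, curated knowledge base and escalates everything else.

# Tiny stand-in for "anything we'd give to a real agent, plus the
# public info on our website".
KNOWLEDGE_BASE = {
    "store hours": "We're open 9am-5pm, Monday through Friday.",
    "return policy": "Returns are accepted within 30 days with a receipt.",
}

FALLBACK = "I don't know, should I transfer you to a real person?"

def answer(question: str) -> str:
    """Return a grounded answer, or the human-handoff fallback."""
    q = question.lower()
    for topic, reply in KNOWLEDGE_BASE.items():
        if topic in q:
            return reply
    # No grounding found: refuse rather than hallucinate.
    return FALLBACK
```

The point of the design is that the fallback is the default path, so out-of-scope requests (including the bomb-building kind) never reach free-form generation.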

[–] TachyonTele@piefed.social 13 points 1 week ago

That's the kind of system setup that makes sense

[–] Steve@startrek.website 8 points 1 week ago (1 children)

I have never seen a chatbot say “i dont know”

[–] cepelinas@sopuli.xyz 2 points 1 week ago

"I don't know"

hopefully never hallucinate or teach someone how to build a bomb or something.

that's so fucking easy, you just lick toads until you find the right one. Who needs to go to the internet for that?

[–] null@piefed.nullspace.lol 3 points 1 week ago

Those kinds of bots work fine these days.

[–] WoodScientist@lemmy.world 3 points 1 week ago

The result of this should be a bot that says, “I don’t know, should I transfer you to a real person?” a lot, but should hopefully never hallucinate or teach someone how to build a bomb or something.

This is in contrast for the AI agent for my company, whose customer service number is 1-800-BLD-A-BMB.

[–] 0_o7@lemmy.dbzer0.com 2 points 1 week ago

The one you're using probably isn't a wrapper around OpenAI or another cloud-based API; the ones that are misconfigured are more prone to these types of abuse.