this post was submitted on 06 May 2026
303 points (99.0% liked)
Technology
Of all the voice LLMs, Gemini is the worst. You'll ask a factual question, "How many Grammys has U2 won?" It will give an answer based on its training data, then ask, "Would you also like to know about blah blah blah?"
No. I would not. Stop asking follow-up questions. I simply was curious about this one thing while listening to the radio.
The fact that people fall in love with or depend on these autocomplete bots is crazy. Not to mention, they're wrong as often as they're right.
I watched someone try to use Gemini through Android Auto to navigate somewhere a few days ago. It was kind of amazing.
He told it to navigate to a place, and it found a match on a different continent, refused to navigate to it, and then rambled for a while about two other irrelevant places it wanted him to go instead, before it finally shut up and he could try again. It didn't work the second time either.
Ye olde Google Assistant, when told "navigate to [place]", will open Maps and search for [place].
I am a person. You are an object. Do as you're fucking told, I'm not interested in listening to you trying to fake having an opinion.
The fake opinion is to steer you towards some marketing crap.
Correct answer: who cares.
A lot of these LLMs heap praise on the user - some more blatantly than others - whether it’s warranted or not.
Those most susceptible tend to be the ones who don’t regularly receive that recognition in their day-to-day lives, so they become infatuated with this “AI” that treats them nicer than they’re accustomed to.
So you're saying the world just needs a little more love ❤️
The world needs a hell of a lot more love, honestly.
Well really it needs a lot more, but yes. That would, unironically, fix a lot of problems.
A certain portion of "dumb" is necessary to fall for this automated crap.
"In 0.2 miles, turn right after Waffle House. Also, you smell nice and have a huge dick."
I mean, some people are already essentially married to their car, so this doesn't seem too far-fetched. We are silly monkeys.