this post was submitted on 30 May 2025
26 points (75.0% liked)

As LLMs become the go-to for quick answers, fewer people are posting questions on forums or social media. This shift could make online searches less fruitful in the future, with fewer discussions and solutions available publicly. Imagine troubleshooting a tech issue and finding nothing online because everyone else asked an LLM instead. You do the same, but the LLM only knows the manual, offering no further help. Stuck, you contact tech support, wait weeks for a reply, and the cycle continues—no new training data for LLMs or new pages for search engines to index. Could this lead to a future where both search results and LLMs are less effective?

[–] psx_crab@lemmy.zip 14 points 4 days ago (3 children)

And where does the LLM get its answers? Forums and social media. And if the LLM doesn't have the actual answer, it blabbers like a redditor, and if someone can't get an accurate answer they'll start asking forums and social media again.

So no, LLMs will not replace human interaction, because LLMs rely on human interaction. An LLM cannot diagnose your car until a human has first diagnosed your car.

[–] leftzero@lemmynsfw.com 9 points 4 days ago* (last edited 4 days ago) (1 children)

And if the LLM doesn't have the actual answer, it blabbers like a redditor, and if someone can't get an accurate answer they'll start asking forums and social media again.

LLMs are completely incapable of giving a correct answer, except by random chance.

They're extremely good at giving what looks like a correct answer, and convincing their users that it's correct, though.

When LLMs are the only option, people won't go elsewhere to look for answers, regardless of how nonsensical or incorrect they are, because the answers will look correct, and we'll have no way of checking them for correctness.

People will get hurt, of course. And die. (But we won't hear about it, because the LLMs won't talk about it.) And civilization will enter a truly dark age of mindless ignorance.

But that doesn't matter, because the company will have already got their money, and the line will go up.

[–] Tar_alcaran@sh.itjust.works 1 points 3 days ago

They're extremely good at giving what looks like a correct answer,

Exactly. Sometimes the thing that looks right IS right, and sometimes it's not. The stochastic parrot doesn't know the difference.

[–] oyo@lemm.ee 4 points 4 days ago (1 children)

The problem is that the LLMs have stolen all that information, repackaged it in ways that are subtly (or blatantly) false or misleading, and then hidden the real information behind a wall of search results consisting of entire domains of AI trash. It's very difficult to even locate the original sources or forums anymore.

[–] chaosCruiser@futurology.today 1 points 4 days ago

I even tried to use Gemini to find a particular YouTube video matching specific criteria. Unsurprisingly, it gave me a bunch of videos, none of which were even close to what I was looking for.

[–] chaosCruiser@futurology.today 1 points 4 days ago

That’s true. There could be a balance of sorts. Who knows. If LLMs become increasingly useful, people will start using them more. As they lose training data, quality goes down, and people shift back to forums etc. It could work that way too.