this post was submitted on 30 Mar 2025
34 points (90.5% liked)

Asklemmy

I can't see it happening tbh, but the USA government did discuss putting restrictions on AI development — I think OpenAI or some other companies actually asked them to!? And there were shorts/reels of high-profile developers hyping up the fact that "we don't know what we're doing", and one of them quit his job. So why all that hype? Is the "Matrix" route actually a possible future?

timmytbt@sh.itjust.works 11 points 3 days ago

Lately, my biggest concern is certain parties feeding LLMs a different version of history.

Search has become so shit that LLMs are often the better path to answering a question. But as everyone knows, they're only as good as what they've been trained on.

Do we, as a society, move past basic search to a preference for AI to answer our questions? If we do, how do we ensure that the history they feed the models is accurate?

nfreak@lemmy.ml 7 points 3 days ago

This is absolutely one of the reasons they're pushing this garbage so hard. It's VERY easy to manipulate as a propaganda tool.

You can already see that most of these tools lean right because their userbase does - leftists don't touch this garbage because of numerous ethical concerns as-is. Add more astroturfing on top of that, and now it's just a straight up automated fascist mouthpiece.

daniskarma@lemmy.dbzer0.com 2 points 3 days ago (last edited 3 days ago)

My country used to be a fascist dictatorship and there's still plenty of people alive who were educated on false information and a different fabricated version of history.

Certainly, misinformation is nothing new.

And people should prevent it and solve it the same way it has always been solved: taking the misinformers out of power.

It's not a tech issue. It's a political issue.

timmytbt@sh.itjust.works 3 points 2 days ago

“It's not a tech issue. It's a political issue.”

It kinda is a tech issue if the output is skewed because nefarious parties are feeding the model shit.

If they control the tech and what's being fed into it, the whole process is ripe for manipulation.