this post was submitted on 30 Aug 2024
33 points (73.9% liked)
Technology
LLMs predict text; they don't have feelings or awareness. Even if a researcher did say that, I'd point to the Google chatbot engineer who thought an LLM had become sentient because it said so in its generated text.
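To make the "predicting text" point concrete, here's a minimal sketch (a toy bigram model with a made-up corpus, not how any production LLM actually works): the model only echoes frequency associations from its training data, so it "says" whatever the text it was trained on says most often.

```python
# Toy sketch: a bigram "language model" that predicts the next word
# purely from counts in its training text. No beliefs, no awareness.
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count how often each word follows each other word."""
    words = text.lower().split()
    follows = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1
    return follows

def predict_next(follows, word):
    """Return the continuation seen most often in training."""
    if not follows[word]:
        return None
    return follows[word].most_common(1)[0][0]

# If the training text repeatedly associates a name with a claim,
# the model repeats the claim -- that's all "the AI says so" means.
corpus = "the reporter is dishonest the reporter is dishonest the reporter is fair"
model = train_bigrams(corpus)
print(predict_next(model, "is"))  # prints "dishonest" (seen 2x vs "fair" 1x)
```

Real LLMs predict tokens with neural networks rather than raw counts, but the principle is the same: output tracks statistical patterns in the training data, not any inner judgment.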
Guys, my paper is sentient, it says so.
If the AI says he's dishonest and sensational, that's because enough people on the internet have said so that the model treats it as true.
It doesn't even take people saying it outright, though; a statistical association between the name and those words is enough, and that association forms around anyone who writes news articles on a contentious topic.