this post was submitted on 16 Apr 2025
320 points (98.2% liked)

[–] Telorand@reddthat.com 102 points 2 months ago (16 children)

The lede is buried deep in this one. Yeah, these dumb LLMs got bad training data that persists to this day, but more concerning is the fact that some scientists are relying upon LLMs to write their papers. This is literally the way scientists communicate their findings to other scientists, lawmakers, and the public, and they're using fucking predictive text like it has cognition and knows anything.

Sure, most (all?) of those papers got retracted, but those are just the ones that got caught. How many more are lurking out there with garbage claims fabricated by a chatbot?

Thankfully, science will inevitably suss those papers out eventually, as it always does, but it's shameful that any scientist would be so fatuous as to put out a paper written by a dumb bot. You're the experts. Write your own goddamn papers.

[–] adespoton@lemmy.ca 40 points 2 months ago (6 children)

In some cases, it’s people who’ve done the research and written the paper who then use an LLM to give it a final polish. Often, it’s people who are writing in a non-native language.

Doesn’t make it good or right, but adds some context.

[–] Telorand@reddthat.com 11 points 2 months ago (4 children)

Sure, and I'm sympathetic to the baffling difficulties of English, but use Google Translate and ask someone who's more fluent for help with the final polish (as one suggestion). Trusting your work, and science itself, to an LLM is lunacy.

[–] Squirrelsdrivemenuts@lemmy.world 7 points 2 months ago (1 children)

It might be hard for them to find someone who is both fluent in English AND knows the field well enough to know vegetative electron microscopy is not a thing. Most universities have one general translation help service, and science has a lot of field-specific weird terms.

[–] moakley@lemmy.world 1 point 2 months ago

That's why he said to start with Google Translate. Google Translate doesn't give you gibberish like "vegetative electron microscopy."
