[–] JoeyJoeJoeJr@lemmy.ml 10 points 10 months ago (1 children)

You are falling into a common trap. LLMs do not have understanding. Asking one to do things like convert dates and put them on a number line may yield correct results sometimes, but since the LLM does not understand what it's doing, it may "hallucinate" dates that look correct but don't actually align with the source.

[–] Byter@lemmy.one 1 points 10 months ago

Thank you for calling that out. I'm well aware, but I appreciate the caution.

I've seen hallucinations from LLMs at home and at work (where I've literally had them transcribe dates like this). They're still absolutely worth it for their ability to handle unstructured data and the speed of iteration you get -- whether they "understand" the task or not.

I know to check my (its) work when it matters, and I can add guard rails and selectively make parts of the process more robust later if need be.
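For what it's worth, here's a minimal sketch of the kind of guard rail I mean, assuming the LLM hands back extracted dates as ISO strings alongside the source text it pulled them from. The function name and the date formats checked are just hypothetical illustrations, not my actual pipeline: the idea is simply that a date has to parse, and some rendering of it has to appear verbatim in the source, before it gets trusted.

```python
from datetime import datetime

def validate_extracted_dates(source_text: str, llm_dates: list[str]) -> list[str]:
    """Keep only LLM-provided dates that parse and can be traced back to
    the source text; everything else gets flagged for manual review."""
    verified = []
    for raw in llm_dates:
        # Guard rail 1: the string must actually parse as a date.
        try:
            parsed = datetime.strptime(raw, "%Y-%m-%d")
        except ValueError:
            print(f"unparseable date from LLM: {raw!r}")
            continue

        # Guard rail 2: some rendering of the date must appear in the source,
        # otherwise it may be a hallucination rather than a transcription.
        candidates = {
            raw,                          # 2024-02-05
            parsed.strftime("%d %b %Y"),  # 05 Feb 2024
            parsed.strftime("%B %d, %Y"), # February 05, 2024
        }
        if any(c in source_text for c in candidates):
            verified.append(raw)
        else:
            print(f"date not found in source, needs review: {raw!r}")
    return verified
```

Nothing fancy, and the format list would need to match whatever the source documents actually use, but even a check this crude catches the "looks plausible, isn't in the document" failure mode.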