this post was submitted on 11 Nov 2024
44 points (92.3% liked)

Technology

[–] Imgonnatrythis@sh.itjust.works 20 points 4 days ago (3 children)

Training AI on "synthetic" data generated from other AIs sounds genius! Seems like a bulletproof way to make AI infinitely smarter just by recursively feeding itself! Great success is on the horizon!

[–] Voroxpete@sh.itjust.works 10 points 4 days ago

It's been shown that even small amounts of synthetic data injected into a training set can quickly lead to a phenomenon termed "model collapse", though I prefer the term "Habsburg AI" (not mine).
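A toy illustration of the mechanism (a hypothetical sketch, not any real training setup): treat a "model" as something that memorizes its corpus and "generates" by resampling it. When each generation trains only on the previous generation's output, rare items drop out and can never come back, so the diversity of the original data collapses:

```python
import random

def train_and_generate(corpus, n, rng):
    """A maximally naive 'model': memorize the corpus, then 'generate'
    by sampling from it with replacement (a bootstrap)."""
    return [rng.choice(corpus) for _ in range(n)]

rng = random.Random(42)
# Generation 0: 1000 distinct "facts" present in the real data.
corpus = list(range(1000))
diversity = [len(set(corpus))]
for _ in range(10):
    # Each generation sees only what the previous generation emitted.
    corpus = train_and_generate(corpus, 1000, rng)
    diversity.append(len(set(corpus)))

# Diversity can only shrink: every generation's vocabulary is a subset
# of the last one's, so the tails of the distribution vanish first.
print(diversity)
```

Real model collapse involves distribution estimation error rather than literal resampling, but the one-way loss of tail diversity is the same failure mode.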

Basically, this is the kind of thing you announce you're doing because it will hopefully get you one more round of investment funding while Sam Altman finishes working out how to fake his death.

[–] FaceDeer@fedia.io 6 points 4 days ago

That's not how synthetic data generation generally works. It uses AI to process existing data sources, producing well-formed training data from material that isn't directly useful as-is; it doesn't generate the data entirely from its own imagination.
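A hedged sketch of that distinction (all names here are made up for illustration; `summarizer_model` stands in for a real LLM call): in this style of pipeline, every synthetic example is derived from and grounded in an existing source document, rather than free-generated:

```python
def summarizer_model(text: str) -> str:
    """Stand-in for an LLM call; a real pipeline would query a model here.
    This stub just extracts the first sentence."""
    return text.split(".")[0].strip()

def make_training_pair(document: str) -> dict:
    """Turn a raw document into a well-formed instruction/response pair.
    The model reformulates grounded content; it does not invent facts."""
    return {
        "instruction": "Summarize the following passage.",
        "input": document,
        "output": summarizer_model(document),
    }

raw_corpus = [
    "Synthetic data can be derived from real documents. "
    "The pipeline reformulates rather than invents.",
]
training_set = [make_training_pair(doc) for doc in raw_corpus]
print(training_set[0]["output"])
```

The key property is that the raw document travels with the generated example, so the training signal stays anchored to real data instead of recursively feeding on model output.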

The comments assuming otherwise are ironic: that misconception is itself misinformation that people keep repeating to each other.

[–] itsathursday@lemmy.world -1 points 4 days ago

I like to call it "saving the red JPEG." One more save will surely make it better.