

Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis

Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.

barsoap@lemm.ee 7 points 8 months ago

Where do you get this? What kind of data requires a T3 system to be representable?

It's not about the type of data but about how the data is organised and what operations are performed on it. I already gave you a link to Nikolic's site; feel free to read it in its entirety. This paper has a short and sweet information-theoretical argument.

I don’t think I’ve made any claims that are related to T2 or T3 systems, and I haven’t defined “memory”, so I’m not sure how you’re trying to put it in my terms.

I'm trying to map your fuzzy terms to something concrete.

I wouldn’t define memory as an adaptable system, so T2 would by my definition be intelligence as well.

My mattress is an adaptable system.
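
To make that concrete, here's a minimal sketch (my own illustration, not from Nikolic's papers; the class, the stiffness parameter, and the update rule are all made up for the example) of a trivially "adaptable" system in the mattress sense: its state adjusts to its input, yet nobody would call that memory or intelligence.

```python
# Illustrative only: a mattress as an "adaptable system".
# Its indentation relaxes toward the applied load, so its state
# adapts to input -- and that's the entire extent of its "adaptation".

class Mattress:
    def __init__(self, stiffness=0.9):
        self.indentation = 0.0      # current state of the surface
        self.stiffness = stiffness  # how slowly it adapts (0..1)

    def update(self, load):
        # State relaxes toward the applied load: adaptation, nothing more.
        self.indentation = (self.stiffness * self.indentation
                            + (1 - self.stiffness) * load)
        return self.indentation

m = Mattress()
for load in [0, 10, 10, 10, 0, 0]:
    print(round(m.update(load), 2))
```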

Where do you see “wild hallucination”?

All of it. Not in the AI sense but in the conventional sense of the term: none of it ever happened, and none of the details make sense. When humans are asked to recall an accident they witnessed, they report something like 10% fact (what they saw) and 90% bullshit (what their brain hallucinates to make sense of what happened). Just like human memory, the AI takes a bit of information and combines it with wild speculation into something that looks plausible, but which, if reasoning is applied, quickly falls apart.