this post was submitted on 13 Aug 2023
954 points (96.2% liked)

Technology


As the AI market continues to balloon, experts are warning that its VC-driven rise is eerily similar to that of the dot com bubble.

[–] Not_Alec_Baldwin@lemmy.world 6 points 1 year ago (1 children)

I've started going down this rabbit hole. The takeaway is that if we define intelligence as "ability to solve problems", we've already created artificial intelligence. It's not flawless, but it's remarkable.

There's the concept of Artificial General Intelligence (AGI) or Artificial Consciousness which people are somewhat obsessed with, that we'll create an artificial mind that thinks like a human mind does.

But that's not really how we build things. Think about how we walk, then look at a bicycle. A car. A train. A plane. The machines we make look and work nothing like we do, yet they do the things we do significantly better than we do them.

I expect AI to be a very similar monster.

If you're curious about this kind of conversation, I'd highly recommend looking for books or podcasts by Joscha Bach; he did three amazing episodes with Lex Fridman.

[–] orphiebaby@lemmy.world -2 points 1 year ago* (last edited 1 year ago) (1 children)

Current "AI" doesn't solve problems. It doesn't understand context. It can't see fingers and say "those are fingers, make sure there's only five". It can't tell the difference between a truth and a lie. It can't say "well that can't be right!" It just regurgitates an amalgamation of things humans have showed it or said, with zero understanding. "Consciousness" and certainly "sapience" aren't really relevant factors here.

[–] minikieff@lemmy.world -1 points 1 year ago (1 children)
[–] orphiebaby@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

No? There's a whole lot more to being human than being able to separate one object from another, recognize it, and say "my database says there should only be two of these in this context". Current "AI" can't even do that much, especially not with art.

Do you know what "sapience" means, by the way?