this post was submitted on 23 Nov 2024
556 points (95.9% liked)

Technology
Writing a 100-word email using ChatGPT (GPT-4, latest model) consumes one 500 ml bottle of water. It uses 140 Wh of energy, enough for 7 full charges of an iPhone Pro Max.
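For scale, the headline figure can be sanity-checked with simple arithmetic. The per-charge battery energy below is an assumed round value, not an official spec (recent Pro Max models are rated at roughly 17-18 Wh; 20 Wh allows for charging losses):

```python
# Sanity check of the headline claim: 140 Wh ~ 7 full iPhone Pro Max charges.
EMAIL_ENERGY_WH = 140     # claimed energy for one 100-word email
PHONE_CHARGE_WH = 20      # assumed energy per full charge, incl. losses

charges = EMAIL_ENERGY_WH / PHONE_CHARGE_WH
print(f"{charges:.0f} full charges")  # → 7 full charges
```

Under these assumptions the two numbers in the claim are at least internally consistent with each other.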

[–] lennivelkant@discuss.tchncs.de 1 points 4 weeks ago (1 children)

AI strong enough would be smarter than a human

General AI might be, but the type of "AI" we have right now isn't general and isn't smarter; it's just a really expensive imitation engine that people keep mistaking for actual intelligence.

And the energy consumption and heat production are really not what our global situation needs right now.

[–] areyouevenreal@lemm.ee 0 points 3 weeks ago (1 children)

AGI and ASI are what I am referring to. Of course we don't actually have that right now, I never claimed we did.

It is hilarious and insulting that you're trying to "erm actually" me when I literally work in this field, doing research on uses of current-gen ML/AI models. Go fuck yourself.

[–] lennivelkant@discuss.tchncs.de 1 points 3 weeks ago

AGI and ASI are what I am referring to. Of course we don't actually have that right now, I never claimed we did.

I was talking about the currently available technology, though: its inefficiency, and the danger of tech illiteracy leading to overreliance on tools that aren't yet "smart" enough to warrant that reliance.

I agree with your sentiment that it may well reach that point someday. If it does, and the energy consumption is no longer an active concern, I do see how it could justifiably be deployed at scale.

But we also agree that "we don't actually have that right now", and with what we do have, I don't think it's reasonable. I'm happy to debate that point civilly, if you're interested in that.

It is hilarious and insulting that you're trying to "erm actually" me when I literally work in this field, doing research on uses of current-gen ML/AI models.

And how would I know that? Everyone on the Internet is an expert; how would I come to assume you're actually one? Given the misunderstanding outlined above, I assumed you were conflating the (topical) current models with the (hypothetical) future ones.

Go fuck yourself

There is no need for such hostility. I meant no insult; I just misunderstood what you were talking about and sought to correct a common misconception. Seeing how the Internet is already full of vitriol, I think we'd all do each other a favour if we tried applying Hanlon's Razor more often and looked for explanations of human error instead of concluding malice.

I hope you have a wonderful week, and good luck with your ongoing research!