this post was submitted on 04 Aug 2023
55 points (100.0% liked)

Technology

Longtermism poses a real threat to humanity

https://www.newstatesman.com/ideas/2023/08/longtermism-threat-humanity

"AI researchers such as Timnit Gebru affirm that longtermism is everywhere in Silicon Valley. The current race to create advanced AI by companies like OpenAI and DeepMind is driven in part by the longtermist ideology. Longtermists believe that if we create a “friendly” AI, it will solve all our problems and usher in a utopia, but if the AI is “misaligned”, it will destroy humanity...."

@technology

[–] wahming@monyet.cc 3 points 1 year ago (3 children)

That's a fair criticism. But how is that a threat to humanity?

[–] laylawashere44@lemmy.blahaj.zone 13 points 1 year ago* (last edited 1 year ago) (2 children)

Because it gives powerful people permission to do whatever they want, everyone else be damned.

Both major longtermist philosophers casually dismiss climate change in their books, for example (I have Toby Ord's book, which is apparently much the same as William MacAskill's book, only earlier and better, supposedly). As if it's something that can just be solved by technology in the near future. But what if it isn't?

What if we don't come up with fusion power or something, and solving climate change requires actual sacrifices that had to be made 50 years before we figure out fusion isn't going to work? What if the biosphere actually collapses and we can't stop it? That's a solid threat to humanity.

[–] wahming@monyet.cc 6 points 1 year ago (1 children)

No, it gives them a justification to do so. But is that actually any different from any other belief system? Powerful assholes have always justified their actions using whatever was convenient, be it religion or otherwise. What makes longtermism worse, to the extent that it's a threat to humanity when everything else isn't?

[–] AnonStoleMyPants@sopuli.xyz 2 points 1 year ago

I don't think so, personally. The only difference might be that tech billionaires probably see it as more "their thing" than religion or whatever. Hence, quite bad.