this post was submitted on 04 Aug 2023
55 points (100.0% liked)

Longtermism poses a real threat to humanity

https://www.newstatesman.com/ideas/2023/08/longtermism-threat-humanity

"AI researchers such as Timnit Gebru affirm that longtermism is everywhere in Silicon Valley. The current race to create advanced AI by companies like OpenAI and DeepMind is driven in part by the longtermist ideology. Longtermists believe that if we create a “friendly” AI, it will solve all our problems and usher in a utopia, but if the AI is “misaligned”, it will destroy humanity...."

@technology

[–] wahming@monyet.cc 21 points 1 year ago (3 children)

Why? The author explains what longtermism is, at least from their own perspective, but then jumps to 'in conclusion, it's bad for humanity' without ever quite touching on WHY it's bad (having a few key followers with questionable ethics is insufficient; you can say that about ANY major belief).

[–] Jummit@lemmy.one 14 points 1 year ago (1 children)

Probably because it ignores issues that are relevant right now in favor of some theoretical distant future which will probably never pan out.

[–] wahming@monyet.cc 2 points 1 year ago (1 children)

How so? To plan for the future requires that you survive the present. I doubt anybody is saying 'screw global warming, I'll be fine in a cpu'.

[–] Jummit@lemmy.one 8 points 1 year ago

I doubt anybody is saying ‘screw global warming, I’ll be fine in a cpu’.

You'd be surprised what the tech billionaires are saying right now. They are definitely not tackling the problems of today, but are creating new ones by the minute.

[–] laylawashere44@lemmy.blahaj.zone 13 points 1 year ago (2 children)

A major problem with longtermism is that it presumes to speak for future people who are entirely theoretical, whose needs are entirely impossible to accurately predict. It also deprioritizes immediate problems.

So Elon Musk is associated with longtermism (self-proclaimed). He might consider that interplanetary travel is in the best interest of mankind in the future (reasonable). As a longtermist he would then feel a moral responsibility to advance interplanetary travel technology. So far, so good.

But the sitch is that he might feel that the moral responsibility to advance space travel via funding his rocket company is far more important than his moral responsibility to safeguard the well-being of his employees by not overworking them.

I mean, after all, yeah, it might ruin the personal lives of a hundred, two hundred, even a thousand people, but what's that compared to the benefit advancing this technology will bring to all mankind? There are going to be billions of people benefiting from this in the future!

But that's not really true. Because we can't be certain that those billions of people will even exist let alone benefit. But the people suffering at his rocket company absolutely do exist and their suffering is not theoretical.

The greatest criticism of this line of thought is that it gives people (or, at the moment, billionaires) permission to do whatever the fuck they want.

Sure, flying on a private jet is ruinous to the environment, but I need to do it so I can manage my company, which will create an AI that will make everything better...

[–] wahming@monyet.cc 3 points 1 year ago (1 children)

That's a fair criticism. But how is that a threat to humanity?

[–] laylawashere44@lemmy.blahaj.zone 13 points 1 year ago* (last edited 1 year ago) (1 children)

Because it gives powerful people permission to do whatever they want, everyone else be damned.

Both of the two major longtermist philosophers casually dismiss climate change in their books, for example (I have Toby Ord's book, which is apparently basically the same as William MacAskill's book, but first and better, supposedly). As if it's something that can just be solved by technology in the near future. But what if it isn't?

What if we don't come up with fusion power or something, and solving climate change requires actual sacrifices that had to be made 50 years before we figure out fusion isn't going to work out? What if the biosphere actually collapses and we can't stop it? That's a solid threat to humanity.

[–] wahming@monyet.cc 6 points 1 year ago (1 children)

No, it gives them a justification to do so. But is that actually any different from any other belief system? Powerful assholes have always justified their actions using whatever was convenient, be it religion or otherwise. What makes longtermism worse, to the extent it's a threat to humanity when everything else isn't?

[–] AnonStoleMyPants@sopuli.xyz 2 points 1 year ago

Don't think so personally. The only reason might be that tech billionaires probably think it is more "their thing" than religion or whatever. Hence, quite bad.

[–] lloram239@feddit.de 2 points 1 year ago

The greatest criticism of this line of thought is that it gives people (or, at the moment, billionaires) permission to do whatever the fuck they want.

Rich people have been doing whatever the fuck they want for thousands of years. Musk at least tries to build a cool big spaceship while doing so. I don't really see the problem with that.

How is giving rich people a reason to do good and have some long-term vision a bad thing? We didn't get climate change because people were looking too far ahead; we got it because fossil fuels were cheap energy and people made a lot of money with them.

This whole article reeks of short term thinking. Do whatever feels good in the moment, don't care about the consequences.

[–] chromatic_churn@lemm.ee 6 points 1 year ago (1 children)

You need to listen to Tech Won't Save Us. Paris Marx interviews several guests who describe in detail the issues with longtermism.

The comparison to "ANY major belief" is wildly flawed and I see you keep doing that in every single response in this thread.

[–] wahming@monyet.cc 5 points 1 year ago

Ok, why IS the comparison wildly flawed?