this post was submitted on 22 Dec 2023
166 points (100.0% liked)

Technology

all 38 comments
[–] frog@beehaw.org 115 points 9 months ago

It is true that removing and demonetising Nazi content wouldn't make the problem of Nazis go away. It would just be moved to dark corners of the internet where the majority of people would never find it, and its presence on dodgy-looking websites combined with its absence on major platforms would contribute to a general sense that being a Nazi isn't something that's accepted in wider society. Even without entirely making the problem go away, the problem is substantially reduced when it isn't normalised.

[–] PotentiallyAnApricot@beehaw.org 75 points 9 months ago (2 children)

I really struggle to take seriously what these tech people say about ‘not wanting to censor’. They made a business calculation, and maybe an ideological one, and decided “we want that nazi money, it’s worth it to us.” which really tells you everything about a company and how it is likely to approach other issues, too.

[–] flumph@programming.dev 32 points 9 months ago (2 children)

It's also disingenuous because they already decline to host sex workers' newsletters. So the anti-censorship argument doesn't hold up: they're already censoring.

[–] PotentiallyAnApricot@beehaw.org 14 points 9 months ago

RIGHT. Thank you for pointing this out.



[–] intensely_human@lemm.ee 1 points 8 months ago

Right, and if the profit motive angle were true, they're already violating it by censoring the sex workers you just mentioned.

So that eliminates profit as the reason for their actions here

[–] janabuggs@beehaw.org 14 points 9 months ago (1 children)

Yes! I love this simplification!

[–] PotentiallyAnApricot@beehaw.org 17 points 9 months ago

I feel like they always try to make it sound more complicated and high-minded. I really don't believe it is!

[–] ursakhiin@beehaw.org 55 points 9 months ago (1 children)

Not gonna lie. I've never heard of Substack but I appreciate their stance of publicly announcing why I would continue to avoid them.

[–] jmp242@sopuli.xyz 5 points 9 months ago

My only interaction with Substack is that one podcast moved there for premium content. I thought it was mostly for written newsletters, and I always wondered how much of a market there actually is for paying for a single newsletter. Then again, newsletters are basically the written version of podcasts, so I guess the market exists. Promoting Nazi content, though, gives me a lot of pause.

[–] Omega_Haxors@lemmy.ml 37 points 9 months ago* (last edited 9 months ago)

Translation: "We support Nazis and would like to offer them passive protection. If you have a problem with them, we will ban you"

[–] dubteedub@beehaw.org 31 points 9 months ago

Any writers still on Substack need to immediately look at alternative options and shift their audiences to other platforms. Sticking around on the site when the founder straight up condones neo-Nazis, and not only gives them a platform but profit-shares with them and their Nazi subscribers, is insane.

[–] some_guy@lemmy.sdf.org 30 points 9 months ago

Reading about this at work the other day, I announced to my coworkers that Substack is officially bad. Profiting off of nazi propaganda is bad. Fuck Substack.

I had recently subscribed to the RSS feed for The Friendly Atheist and was considering monetary support. They accept via Substack or Patreon. I would have opted for Patreon anyway, because that's where I already have subscriptions. But after learning about this, I'll never support anything, no matter what, via Substack. Eat my ass, shitheads.

[–] AaronMaria@lemmy.ml 28 points 9 months ago

What do you mean banning doesn't work? The less reach those Nazis have, the fewer people can see their Nazi posts and get turned into Nazis. It also needs to be clear that being a Nazi is not acceptable, so they don't have the courage to spread their hate. This bullshit needs to stop.

[–] maynarkh@feddit.nl 27 points 9 months ago (1 children)

If they plan to do business in the EU, this is illegal.

[–] JohnDumpling@beehaw.org 2 points 9 months ago

Well, you can create an account from the EU, although mine got locked after I created just one blog post. Support doesn't seem to respond, so I moved to a different platform.

[–] sculd@beehaw.org 18 points 9 months ago

Nope, never supporting anything from Substack again. "Freeze peach" libertarians can go to hell.

[–] garrett@infosec.pub 14 points 9 months ago (1 children)

I always hate policy talk that tries to split hairs between Nazism and “calls for violence”.

Even worse, I just can’t accept allowing monetization. If you truly “hate the views”, stop lining your pockets with their money…

[–] Kichae@lemmy.ca 11 points 9 months ago

The only thing they hate is not taking their money.

[–] ArugulaZ@kbin.social 13 points 9 months ago

There are too many of these goddamned social networks anyway. After Twitter/X exploded, everyone else wanted to grab a piece of that pie, and now we've got a dozen social networks nobody uses.

If you want a progressive social network that doesn't take shit from goosesteppers, Cohost is probably the place to go. It's so neurodivergent and trans-friendly that I can't imagine them blithely accepting Nazi content. It's just not how Cohost works. "Blah blah blah, free speech!" Not here, chumps. We've got standards. Go somewhere else to push that poison.

[–] cupcakezealot@lemmy.blahaj.zone 11 points 9 months ago* (last edited 9 months ago) (1 children)

if you say nazi and white supremacist content is just a "different point of view", you support nazi and white supremacist content. period.

and it's not surprising, given lulu meservey's post on twitter during the whole situation with elon basically abandoning moderation:

"Substack is hiring! If you’re a Twitter employee who’s considering resigning because you’re worried about Elon Musk pushing for less regulated speech… please do not come work here."

https://www.inverse.com/input/culture/substack-hiring-elon-musk-tweet

[–] Drewski@lemmy.sdf.org 4 points 9 months ago (1 children)

The problem is that some people are quick to call things Nazi and white supremacist, when it's actually just something they disagree with.

[–] Powerpoint@lemmy.ca 8 points 9 months ago

That's not the problem at all. If you support fascists, then you support Nazis and white supremacy.

[–] janguv@lemmy.dbzer0.com 10 points 9 months ago (4 children)

There are a lot of empirical claims surrounding this topic, and I'm unaware who really has good evidence for them. The Substack guy, for example, claims that banning or demonetising would not "solve the problem" – how do we really know? At the very least, you'd think that demonetising helps to some extent, because if it's not profitable to spread certain racist ideas, there's simply less of an incentive. On the other hand, plenty of people in this thread are suggesting it does help address the problem, pointing to Reddit and other cases – but I don't think anyone really has a grip on the empirical relationship between banning/demonetising, shifting ideologues to darker corners of the internet, and what impact their ideas ultimately have. And you'd think the relationship wouldn't be straightforward either – there might be some general patterns, but it could vary according to so many contingent and contextual factors.

[–] Lowbird@beehaw.org 11 points 9 months ago (1 children)

I agree it's murky. Though I'd like to note that when you shift hateful ideologues to dark corners of the internet, that also means making space in the main forums for people who would otherwise be forced out by the aforementioned ideologues - women, trans folks, BIPOC folks, anyone who would like to discuss xyz topic but not at the cost of the distress that results from sharing a space with hateful actors.

When the worst of the internet is given free rein to run rampant, it tends to take over the space entirely with hate speech, because everything and everyone else leaves instead of putting up with abuse. Those who do stay get stuck having the same rock-bottom conversations over and over (e.g. ones in which the targets of the hate are repeatedly asked to justify their existence, their presence, or their right to have opinions) with people who aren't really interested in intellectual discussion, solving actual problems, or making art that isn't about hatred.

But yeah, as with anything involving large groups of people, these things get complicated and can be unpredictable.

[–] flora_explora@beehaw.org 1 points 9 months ago (1 children)

Thank you! Even on lemmy I find the atmosphere often oblivious or ignorant to marginalized views. The majority here are cis men (according to the poll earlier this year) and it certainly shows. And the people here are probably mostly left-leaning? So I definitely couldn't imagine sharing a space with anyone more right-leaning than that.

[–] Zworf@beehaw.org 1 points 9 months ago* (last edited 9 months ago) (1 children)

It depends a lot on the instance IMO. I didn't like the attitude at lemmy.ml but I like it here at beehaw. I'm very left-leaning, progressive and LGBTQ+ friendly.

Lemmy.ml and lemmy.world are more right-leaning as far as I can see.

[–] flora_explora@beehaw.org 1 points 8 months ago

Yes sure, beehaw is more progressive. But still I sometimes don't feel so comfortable in its community because it can at times feel very male-centered.

[–] thesmokingman@programming.dev 8 points 9 months ago

What evidence did you find to support Substack’s claims? They didn’t share any.

You can quickly and easily find good evidence for things like Reddit quarantining and the banning of folks like Alex Jones and Milo Yiannopoulos.

Which claims are empirical again?

[–] cupcakezealot@lemmy.blahaj.zone 2 points 9 months ago

we also do know that going after nazis and white supremacists works, since all through the 90s they were relegated to the fringe of the fringe corners of the internet.

[–] Zworf@beehaw.org 2 points 9 months ago* (last edited 9 months ago)

There’s a lot of empirical claims surrounding this topic, and I’m unaware who really has good evidence for them. The Substack guy e.g. is claiming that banning or demonetising would not “solve the problem” – how do we really know?

Well it depends what you define as "the problem".

If you define it as Nazis existing per se, banning them does not "solve the problem" of nazis existing. They will just go elsewhere. A whole world war was not enough to get rid of them.

However, allowing them on mainstream platforms does make their views more visible to mainstream users, and some might fall for their propaganda, similar to the way people get drawn into the QAnon nonsense. So if you define the problem as "Nazis gaining attention", then yeah, sure, it certainly does "solve the problem" to some degree. And I think this is the main problem these days (even in the Netherlands, which is a fairly down-to-earth country, the fascists gained 24% of the votes in the last election!).

But however you define "the problem", making money off Nazi propaganda is just simply very, very bad form. And it will lead to many mainstream users bugging out, and rightly so.

[–] Zworf@beehaw.org 8 points 9 months ago (1 children)

Substack started so well... It was looking like the new Medium (after Medium totally enshittified). But the discovery was never very good there, and now this. Nope. Not going to blog there.

I wonder if Snowden still supports them.

[–] cupcakezealot@lemmy.blahaj.zone 1 points 9 months ago (1 children)

probably since snowden thinks putin is amazing

[–] Zworf@beehaw.org 3 points 9 months ago* (last edited 9 months ago) (1 children)

Does he really? I think it's more like he got stuck there on his way to Ecuador and now he has no alternative but to "like" Putin :P

After all his plan was never to stay in Russia.

[–] cupcakezealot@lemmy.blahaj.zone 1 points 9 months ago (1 children)

i mean "Edward Snowden gets Russian passport after swearing oath of allegiance. Whistleblower is ‘happy and thankful to the Russian Federation’ for his citizenship, lawyer says"

[–] Zworf@beehaw.org 4 points 9 months ago* (last edited 9 months ago)

I know.. But the point is, he's stuck there.

He had 2 choices:

  • Play ball and swear his oath and suck up a little to the godfather and live happily ever after
  • Kick up a stink against Putin and find himself falling out of a closed window (or, best case, being deported and spending his days in a max security prison).

It's not really like "free will" applies here :)

Whatever the lawyer said is just the minimum required decorum IMO. Just politics. The oath is probably simply required to get the passport.

Putin got to get one-over on the US and Snowden got to stay out of prison (well, in reality a really huge prison but still...). It's a marriage of convenience.

[–] autotldr@lemmings.world 3 points 9 months ago

🤖 I'm a bot that provides automatic summaries for articles:

While McKenzie offers no evidence to back these ideas, this tracks with the company’s previous stance on taking a hands-off approach to moderation.

In April, Substack CEO Chris Best appeared on the Decoder podcast and refused to answer moderation questions.

“We’re not going to get into specific ‘would you or won’t you’ content moderation questions” over the issue of overt racism being published on the platform, Best said.

In a 2020 letter from Substack leaders, including Best and McKenzie, the company wrote, “We just disagree with those who would seek to tightly constrain the bounds of acceptable discourse.”

The Atlantic also pointed out an episode of McKenzie’s podcast with a guest, Richard Hanania, who has published racist views under a pseudonym.

McKenzie does, however, cite another Substack author who describes its approach to extremism as one that is “working the best.” What it’s being compared to, or by what measure, is left up to the reader’s interpretation.


Saved 57% of original text.