this post was submitted on 11 Sep 2023
1026 points (96.2% liked)

Technology


cross-posted from: https://lemmy.world/post/3320637

YouTube and Reddit are sued for allegedly enabling the racist mass shooting in Buffalo that left 10 dead

The complementary lawsuits claim that the massacre in 2022 was made possible by tech giants, a local gun shop, and the gunman’s parents.

top 50 comments
[–] HawlSera@lemm.ee 131 points 2 years ago (4 children)

Youtube needs to be punished for their hypocrisy.

Average Joe gets a community guidelines strike for "promoting violence" because he said "Dead" instead of "Unalived", but Penis Prager can advocate for beating your gay kids till they turn straight and YouTube just throws it into everyone's playlists without so much as a "Boys will be boys"

[–] AndreTelevise@lemm.ee 13 points 2 years ago* (last edited 2 years ago)

Penis Prager

I see you're a person of culture as well... I know this reference.

[–] theangryseal@lemmy.world 11 points 2 years ago (3 children)

Who is going to punish them? The leaders who agree with Prager that you can beat the gay out of your kids aren’t gonna get behind that.

A significant portion of our population is hoping for a way to degay their kids.

Man, I’m gonna be all doom and gloom when I go back to bed here in a few.

For me, it seems hopeless. We’ll all be further radicalized by the thing that I thought for most of my life would bring salvation, our access to the Library of Shitexander. A big old library filled with information, Ricky. Information on the workings of electricity. Information on the life and work of Isaac Newton. Information on how to cannibalize your neighbor. Information on how some grifter talks to god and knows exactly what he wants. We can learn useful skills like, how to hate black people and why we should. We can learn how to kickoff Armageddon, and why nuclear weapons are biblical and mutually assured destruction isn’t only a good thing, it’s what we should strive for.

And because we humans create information, we have arguments about who should decide what kind of information is available. Free speech absolutists will say that anything goes and is fair game. Others will say that some speech is dangerous because it influences hatred and bigotry. Each group has representation and has to compromise in order to keep things from escalating, oh but compromise might escalate things too.

Our species was born from chaos looking for a leader who didn’t exist.

I’m just gonna ride the rock until I’m not riding it any more and hope the people of the future don’t destroy each other and can someday figure out that that god ain’t coming back. What else can we do?

You guys have a good morning. I’m heading back to bed.

[–] anthoniix@lemmy.world 115 points 2 years ago* (last edited 2 years ago)

I think the root of the problem is the Republican party. If you look at the language the shooter used in his manifesto, it's very very similar. There are things social media platforms can do to mitigate extremism, but people like this will continue to feel emboldened by the GOP.

[–] radix@lemmy.world 87 points 2 years ago (4 children)

Everytown Law is about to get a lesson on how Section 230 works.

[–] curiousaur@reddthat.com 77 points 2 years ago (20 children)

This is so so stupid. We should also sue the ISPs then, they enabled the use of YouTube and Reddit. And the phone provider for enabling communications. This is such a dangerous slippery slope to put any blame on the platforms.

[–] Pyr_Pressure@lemmy.ca 77 points 2 years ago (4 children)

I think the issue isn't just providing access to the content, but using algorithms that make it more and more likely for deranged people to view content that fuels their motives for hateful acts, instead of trying to reduce how often that content is seen. All because they make more money the more people watch, whether it is harmful or not.

[–] PoliticalAgitator@lemm.ee 26 points 2 years ago

If you were head of a psychiatric ward and had an employee you knew was telling patients "Boy, I sure wish someone would kill as many black people as they could", you would absolutely share responsibility when one of them did exactly that.

If you were deliberately pairing that employee with patients who had shown violent behaviour on the basis of "they both seem to like violence", you would absolutely share responsibility for that violence.

This isn't a matter of "there's just so much content, however can we check it all?".

Reddit has hosted multiple extremist and dangerous communities, claiming "we're just the platform!" while handing over the very predictable post histories of mass shooters week after week.

YouTube has built an algorithm and monetisation system that is deliberately designed to lure people down rabbit holes then done nothing to stop it luring people towards domestic terrorism.

It's a lawsuit against companies worth billions. They're not being executed. There are grounds to accuse them of knowingly profiting from the grooming of terrorists and if they want to prove that's not the case, they can do it in court.

[–] firadin@lemmy.world 22 points 2 years ago

Do ISPs actively encourage you to watch extremist content? Do they push that content toward people who are at risk of radicalization to get extra money?

[–] sour@kbin.social 12 points 2 years ago

the isps don't encourage people to see content that makes them mad

[–] Pratai@lemmy.ca 49 points 2 years ago (10 children)

They should be suing the Conservative Party. That’s the enabler of gun violence.

[–] primbin@lemmy.one 48 points 2 years ago (7 children)

If YouTube is still pushing racist and alt-right content on to people, then they can get fucked. Why should we let some recommender system controlled by a private corporation have this much influence over American culture and politics?

[–] sabogato@lemmy.blahaj.zone 46 points 2 years ago* (last edited 2 years ago) (8 children)

I sub to primarily leftist content and their YouTube shorts algorithm insists on recommending the most vile far right content on the planet. It is to the point that I'm convinced YouTube is intentionally trying to shift people far right

[–] pachrist@lemmy.world 20 points 2 years ago

I primarily watch woodworking or baking content on Youtube. I feel like the far right content is super prevalent with Shorts. I'll watch something like a quick tool review, and the next video will be someone asking folks on the street if it's ok to be white. What color you are isn't your decision, but what you do every day is, and being some dumbass white kid accosting black tourists in Times Square for shitty reaction content is just gross.

It doesn't matter how often I say I dislike the content, block channels or whatever, Youtube has just decided it's going to check in from time to time and see if I want to let loose my inner Boomer and rage with Rogan.

[–] T156@lemmy.world 16 points 2 years ago

It could be that pushing videos on the other side of the political spectrum gets interactions in the form of people sharing/commenting on it. Even if you disagree, going "Why does YouTube recommend this, this is awful" is still a share.

The algorithm prioritises interactions above all else, and fewer things get people interacting more than being wrong, or them disagreeing vehemently.

[–] TruTollTroll@lemmy.world 13 points 2 years ago

This is happening on my FB video feed. I watch a funny chick called Charlotte Dobre and she does funny reaction videos. I honestly love her, but all my algorithm shows me for recommendations are these cop brutality videos with comments praising the cops, and right-wing crap that praises Abbott's wall and DeSantis' dictatorship. It drives me nuts, and no matter how many pages I block I always get more right-wing recommended crap videos.

[–] reagansrottencorpse@lemmy.world 47 points 2 years ago (5 children)

If you look at anything even remotely related to "men's interests" YouTube will begin showing you alt-right fascist bullshit.

[–] DarthBueller@lemmy.world 18 points 2 years ago (2 children)

Seriously. I spend a little too much time watching a short that is clearly designed to get me worked up about stereotypical communication difficulties between men & women from a "women, am I rite?" perspective, suddenly I'm getting Jordan Peterson and Joe Rogan. I spend a little too much time watching a video about certain Ukrainian war equipment or a Slo Mo Guys video involving guns (wood stock hunting guns, I felt like it was the early 80s all over again before everyone decided they needed assault weapons), suddenly I'm getting served tacticool idiots with kitted-out murder machines. Or I watch a Bart Erhman video (secular New Testament scholar with a large lay audience) and suddenly I get served muslim da'wah/apologetics videos and Catholic catechism ads.

[–] BonesOfTheMoon@lemmy.world 44 points 2 years ago (4 children)

They should sue Facebook too. Facebook is rife with Nazis. And they're fine with it.

[–] Duamerthrax@lemmy.world 21 points 2 years ago (4 children)

Considering the Facebook algorithm will introduce you to neo-nazis, that would actually make sense.

[–] Candelestine@lemmy.world 37 points 2 years ago* (last edited 2 years ago) (1 children)

Good. Civil court is where they're most vulnerable, this is called tort law.

In criminal cases, the defendant is innocent until proven guilty beyond a reasonable doubt by a jury of their peers. In a civil lawsuit, the defendant is only liable if a judge or jury thinks they're 51% likely to be responsible, what they call the preponderance of the evidence.

In other words, "probably" is good enough when you sue someone. It is not good enough if the state is trying to throw you in prison. This makes it more efficient to process the 99% of civil court cases, which are usually just dumb shit, like which of these two arguing neighbors needs to pay for having a tree on their property line cut down or something. It also results in our civil system being a very effective weapon though, as a lot of wealthier and more powerful people know pretty well.

edit for italics

edit2: If anyone doubts me you can just google "tort" and read all about our American system on wikipedia, or any number of other places.

edit3: juries in civil too.

[–] roguetrick@kbin.social 15 points 2 years ago (1 children)

I don't really know why you emphasized judge. Jury trials are very common in civil cases. This will be a pretrial dismissal or summary judgement without a jury, however. There's nothing to discover or evidence to review that's contested.

[–] WheeGeetheCat@sh.itjust.works 36 points 2 years ago

Feels good to be reading this somewhere other than reddit

[–] Zengen@lemmy.world 35 points 2 years ago (21 children)

I mean, looking at the details for the basis of the suit: they think they can sue someone for teaching a criminal how to do something. They think they can sue the makers of body armor for selling an unregulated commercial product to a guy who was not a criminal at the time of purchase. They think they can sue YouTube for providing motive for whatever he did.

In the law world there's a word for this. It's called a shakedown. These are grieving families who are vindictive. They don't care who pays, but somebody has to pay in their eyes. Sadly, on the merits this case will die in court pretty fast and nobody is gonna see a dollar unless Alphabet's and spez's lawyers decide they're feeling charitable. Which they won't, because settling would imply guilt in the public eye.

[–] dingleberry@discuss.tchncs.de 28 points 2 years ago* (last edited 2 years ago) (17 children)

You have klan members in Congress, supreme court, churches and every police department, but sure, YT and Reddit are the problem.

[–] shortwavesurfer@monero.town 25 points 2 years ago (1 children)

Will be dismissed on section 230 grounds.

[–] Chetzemoka@kbin.social 28 points 2 years ago (1 children)

Their content promotion algorithms are not protected by section 230. Those algorithms are the real problem, pushing more and more radical content onto vulnerable minds. (The alt-right YouTube pipeline is pretty well documented. Reddit, I think, less so. But they still promote "similar content")

[–] skymtf@lemmy.blahaj.zone 22 points 2 years ago (1 children)

I feel like our problem isn't that social media companies aren't liable, but that they're too big. Imagine this happening on Mastodon. Generally I feel like Mastodon would not allow this unless the instance was specifically fascist, like the KF instance.

[–] Colorcodedresistor@lemm.ee 19 points 2 years ago (1 children)

They blamed books for copycat killers, and movies and video games for shootings; now they want to blame websites...

Now they're trying to sue people because of hindsight? This isn't Minority Report. This is "let's throw a lot of torts and other legal BS at the wall and pray something sticks".

[–] VonCesaw@lemmy.world 37 points 2 years ago (16 children)

Making legal precedent so that they AVOID showing the offending content instead of PROMOTING the offending content is probably the goal

About 30-40 times a day, Youtube shorts shows me videos actively advocating violence, and I know for sure that Google has enough money and resources currently to prevent these videos being shown, considering it AUTOMATICALLY SUBTITLES THEM

[–] derpgon@programming.dev 23 points 2 years ago

I had to manually report a 100k views short showing someone killing a snail with an air gun. It got removed almost instantly.

Sure, it's a snail, and sure, it's an air gun, but exactly this type of video is a breeding ground for sickos. And no, YouTube, the 1mil-sub Minecraft channel that said "kill a creep" is not really violent, and neither is someone who says "fuck" in the first 30 seconds.

Gosh I hate the platform.

[–] SitD@feddit.de 19 points 2 years ago (39 children)

🤔 so if gun violence is a problem... and they've already banned violence... what if one would ban the other thing - oh wait no it's definitely the goofy gamer machinimas 🤭 stop giggling y'all, this is serious. you don't wanna turn into criminals

[–] mob@lemmy.world 18 points 2 years ago* (last edited 2 years ago)

It's weird that this is a link to the exact same 25 day old post on the same community.

[–] sndmn@lemmy.ca 15 points 2 years ago (1 children)

The gun(s) are the most significant enablers of mass shootings.

[–] snausagesinablanket@lemmy.world 26 points 2 years ago (5 children)

Complete lack of accessible mental health counseling enters the room.

[–] FrankTheHealer@lemmy.world 14 points 2 years ago (2 children)

Much as I dislike Reddit, I don't think they are to blame here.

[–] Zithero@lemmy.world 14 points 2 years ago (8 children)

Reddit worked very hard to protect Nazi imagery and stop people from posting anti-Nazi sentiment. I'd like for someone to acknowledge that they silence anyone who posts anti-Nazi shit and who speaks about killing Nazis.

Many are here because of that.

[–] autotldr@lemmings.world 12 points 2 years ago

This is the best summary I could come up with:


YouTube, Reddit and a body armor manufacturer were among the businesses that helped enable the gunman who killed 10 Black people in a racist attack at a Buffalo, New York, supermarket, according to a pair of lawsuits announced Wednesday.

The complementary lawsuits filed by Everytown Law in state court in Buffalo claim that the massacre at Tops supermarket in May 2022 was made possible by a host of companies and individuals, from tech giants to a local gun shop to the gunman’s parents.

The lawsuit claims Mean LLC manufactured an easily removable gun lock, offering a way to circumvent New York laws prohibiting assault weapons and large-capacity magazines.

YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack.

“We aim to change the corporate and individual calculus so that every company and every parent recognizes they have a role to play in preventing future gun violence,” said Eric Tirschwell, executive director of Everytown Law.

Last month, victims’ relatives filed a lawsuit claiming tech and social media giants such as Facebook, Amazon and Google bear responsibility for radicalizing Gendron.


The original article contains 592 words, the summary contains 192 words. Saved 68%. I'm a bot and I'm open source!
