this post was submitted on 11 Sep 2023
Technology
I think the issue isn't just providing access to the content; it's using algorithms that make deranged people more likely to see more and more content that fuels their motives for hateful acts, instead of trying to reduce how often that content is seen. And all of this because the platforms make more money the more content people watch, whether it is harmful or not.
Absolutely. I saw a Google ad the other day, maybe from PragerU, claiming climate change isn't real, while I was searching for an old article that was more optimistic about outcomes. The ad was actually labeled as a suggestion, and thankfully you could report it, which I did immediately. It pissed me off a ton.
A friend recently shared a similar suggested video/ad they got on YouTube, claiming that "Ukrainians are terrorists". It was from PragerU or TPUSA.
I can see the argument for allowing these ads to exist as a freedom-of-speech thing, fine. But actively promoting these ads is very different, and the lawsuit would have merit on that point. I'd prefer if this content were actively minimized, but at the very least it shouldn't be promoted.
What if it isn't algorithms but upvotes? What if Lemmy is next?
and we all know what reddit mods do.