this post was submitted on 24 Oct 2025
334 points (95.6% liked)

Showerthoughts

37966 readers
967 users here now

A "Showerthought" is a simple term used to describe the thoughts that pop into your head while you're doing everyday things like taking a shower, driving, or just daydreaming. The most popular seem to be lighthearted clever little truths, hidden in daily life.


Rules

  1. All posts must be showerthoughts
  2. The entire showerthought must be in the title
  3. No politics
    • If your topic is in a grey area, please phrase it to emphasize the fascinating aspects, not the dramatic aspects. You can do this by avoiding overly politicized terms such as "capitalism" and "communism". If you must make comparisons, you can say something is different without saying something is better/worse.
    • A good place for politics is c/politicaldiscussion
  4. Posts must be original/unique
  5. Adhere to Lemmy's Code of Conduct and the TOS

If you made it this far, Showerthoughts is accepting new mods. This community is generally tame so it's not a lot of work, but having a few more mods would help reports get addressed a little sooner.

What's it like to be a mod? Reports just show up as messages in your Lemmy inbox, and if a different mod has already addressed the report, the message goes away and you never have to worry about it.

founded 2 years ago
[–] stepan@lemmy.cafe 60 points 1 week ago (1 children)

the AI decided ¯_(ツ)_/¯

[–] PillowTalk420@lemmy.world 42 points 1 week ago (1 children)

Excuse me, you dropped this: \

[–] Cort@lemmy.world 18 points 1 week ago (3 children)

No, I'm sorry, the AI has decided you don't need that forearm, but you're welcome to keep the hand.

[–] PillowTalk420@lemmy.world 5 points 1 week ago (1 children)

It was a real struggle to upvote you after seeing your username. That's the name of the furniture rental company that tried to screw me over a few months ago.

[–] Cort@lemmy.world 5 points 1 week ago

Oof, I've heard they're as bad as payday loans, so I've never used them. Though I've also heard that rent-a-center is worse.

[–] shalafi@lemmy.world 2 points 1 week ago (1 children)

Cort

You the guy that trained Roland?

[–] ch00f@lemmy.world 46 points 1 week ago (1 children)

Outside of law enforcement, this is certainly how shitty customer service policies get enforced. In other words, "Computer says no".

[–] Semi_Hemi_Demigod@lemmy.world 32 points 1 week ago (4 children)

The British Post Office rolled out a hugely buggy piece of software that bankrupted small business owners, got some of them sentenced to years in prison, and drove thirteen people to suicide, all because "computers can't be wrong".

[–] ch00f@lemmy.world 24 points 1 week ago (1 children)
[–] Seleni@lemmy.world 14 points 1 week ago

> The case was settled for £58 million, leaving the claimants with £12 million after legal costs.

Ewwww

Here in Australia they rolled out an automated system to calculate welfare overpayments and issue debts. It didn't quite work, of course, and hundreds of thousands of the poorest Australians were issued with false debts; some of them died or committed suicide before they could be repaid. People still keep floating the idea of automation and AI in our welfare systems...

[–] Xaphanos@lemmy.world 4 points 1 week ago* (last edited 1 week ago) (1 children)

Not to be that guy but... Link?

[–] Kyrgizion@lemmy.world 3 points 1 week ago

Not only the UK's fuckup, also Fujitsu's.

[–] over_clox@lemmy.world 27 points 1 week ago (2 children)

Now where does this thought come from?

Do you not know what a computer is? It's literally a digital logical accountant! Yeah yeah, we should probably blame the programmers and engineers instead when shit goes sideways, but now I think we need to also hold CEOs accountable when they decide to inject faulty AI into mission critical systems...

https://lemmy.dbzer0.com/post/55990956

[–] henfredemars@infosec.pub 17 points 1 week ago (1 children)

There's a reason why license agreements often state that there are no warranties, express or implied, no guarantees, and no fitness for any particular purpose.
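
The MIT license's all-caps version, for instance:

> THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.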

[–] Semi_Hemi_Demigod@lemmy.world 19 points 1 week ago

“This software is useless and should not be used by anyone for any purpose” is my favorite part of license agreements.

[–] mogranja@lemmy.world 8 points 1 week ago (1 children)

If a building collapses, do you blame the people who built the walls and poured the concrete, or the ones who chose the materials and approved the project?

In any case, often programmers and engineers retain no rights to the software they worked on. So whoever profits from the software should also shoulder the blame.

[–] Zorsith@lemmy.blahaj.zone 17 points 1 week ago (1 children)

IT disagrees. Misbehaving hardware can be taken out back and shot.

[–] faythofdragons@slrpnk.net 9 points 1 week ago (1 children)

We all remember what happened to the printer in *Office Space*

[–] popekingjoe@lemmy.world 7 points 1 week ago

Damn it feels good to be a gangsta.

[–] ChonkyOwlbear@lemmy.world 16 points 1 week ago

That's why cops love using dogs too. Courts have ruled that dogs can't lie. That means if a dog indicates you have contraband, then a search is warranted, even if nothing is found. This of course ignores that it is entirely possible the dog indicated contraband because the cop trained it to do so on command.

[–] affenlehrer@feddit.org 10 points 1 week ago

You could hold developers of algorithms, logic and even symbolic AI accountable.

However, it's a completely different story for AI based on deep neural networks. After training, they're just a bunch of weights and parameters without individual meaning, and not just a few of them: billions or trillions. Almost none of them are set by hand; they're randomly initialized and then automatically tuned by the training algorithm until the network's behavior and predictions are "good enough".

It's practically impossible to review the network, and when you test it you only get results for the concrete test cases; you can't interpolate, or assume that even slightly different inputs will behave similarly. You also can't fix an individual bug. All you can do is train again, or train some more, and that might fix the problem, but it could also destroy something that worked before (catastrophic forgetting).
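
To make that concrete, here's a minimal PyTorch sketch (toy model, made-up data; nothing here is from any real system, it's just an illustration of what "a bunch of weights" means):

```python
import torch
import torch.nn as nn

# A tiny toy classifier. Every weight below starts as random noise;
# nobody ever sets any of them by hand.
model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 2))
print(sum(p.numel() for p in model.parameters()), "parameters")  # 8578 of them

opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()
x, y = torch.randn(256, 64), torch.randint(0, 2, (256,))  # stand-in training data

for step in range(100):
    opt.zero_grad()
    loss = loss_fn(model(x), y)  # how wrong is the net on this batch?
    loss.backward()              # gradients nudge ALL weights at once
    opt.step()                   # there is no single weight you could "fix"
```

Retraining from here might fix a failing case and silently break one that used to work; there's no line of code to patch.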

[–] QuantumTickle@lemmy.zip 9 points 1 week ago (22 children)

We don't jail the gun for murder.

[–] PeriodicallyPedantic@lemmy.ca 5 points 1 week ago

I feel like you're missing the point.
They're not saying to jail computers; they're saying beware of political leaders using computers to abdicate responsibility.

[–] kibiz0r@midwest.social 3 points 1 week ago (1 children)

We shut down companies for it though, and what AI vendors are doing is basically selling the ability to turn job roles into “accountability sinks”, where your true value is in taking the fall for AI when it gets it wrong (…enough that someone successfully sues).

If you want to put it in gun terms: The AI vendors are selling a gun that automatically shoots at some targets but not others. The targets it recommends are almost always profitable in the short term, but not always legal. You must hire a person to sit next to the gun and stop it from shooting illegal targets. It can shoot 1000 targets per minute.

[–] JoeBigelow@lemmy.ca 1 points 1 week ago

Sounds like a fun job if the acceptable failure rate is like, 50%

[–] snooggums@piefed.world 1 points 1 week ago* (last edited 1 week ago) (3 children)

We also don't give the murderer a free pass because they used a gun.

A tool is a tool, and the person who designed it or used it is responsible depending on why it caused a negative outcome. I know you clarified it later but it is so stupidly obvious I wanted to add to it.

[–] individual@toast.ooo 8 points 1 week ago* (last edited 1 week ago)

this is straight out of the movie 'I, Robot' (loosely based on Asimov's stories, not 'Do Androids Dream of Electric Sheep?')

Spoiler: only humans can be convicted of murder, therefore if a robot kills someone, it's nothing more than a common mechanical hazard.

[–] infinitesunrise@slrpnk.net 6 points 1 week ago* (last edited 1 week ago)

> All technology has the potential to be both liberatory and oppressive, all that ever matters is who wields it and to what end.

Lewis Herber (Murray Bookchin), *Towards a Liberatory Technology*

[–] bryndos@fedia.io 6 points 1 week ago

Law enforcement will seize and use computers and the data they hold as evidence to convict criminals, just like any other tool that they might be warranted to seize.

Courts will examine the evidence of what it did to determine what role it played in the offence and whether it supports the allegation.

Likewise, police complaints authorities could in principle do the same against the police, if someone were to give them a warrant and the power to execute it.

If a thing happens in public that was unwarranted and can be traced back to a police force or how they deployed any equipment, they can be judicially reviewed (in my country, at least) for the decision to deploy that bit of kit. It's more a matter of whether they actually will be, and whether that review will be just and timely.

I don't think it's much different from how they deploy other tech like clubs, pepper spray, tear gas, tasers or firearms. If they have no fear of acting outwith their authority, that's a problem.

In some ways it might be easier to have an 'our word' vs 'their word' defense when they shoot someone, compared to a computer program that might literally document the abuse of power in its code or log files.

"Oops i dropped my notebook", is maybe easier than, "oops i accidentally deleted my local file and then sent a request to IT - that was approved by my manager - asking them to delete instead of restore any onsite or offsite backups".

[–] Kyrgizion@lemmy.world 3 points 1 week ago

This is unironically one of the main drivers of AI. As soon as all crucial social systems are inundated with AI, the built-in bias will be excused as "minor glitches" of the system, but the real reason was always a total lack of accountability.

[–] Damage@slrpnk.net 2 points 1 week ago

Yeah, not like people in power, who are held accountable all the time!

[–] JeSuisUnHombre@lemmy.zip 2 points 1 week ago (1 children)
[–] netvor@lemmy.world 1 points 1 day ago

...or any inanimate objects, really.
