this post was submitted on 12 Mar 2026
1694 points (99.1% liked)

Programmer Humor

[–] mlg@lemmy.world 95 points 4 days ago (5 children)

The modern web is an insult to the idea of efficiency at practically every level.

You cannot convince me that isolation and sandboxing requires a fat 4 GB slice of RAM for a measly 4 tabs.

[–] kalpol@lemmy.ca 20 points 4 days ago

It's crazy that my Core 2 Duo with 8 gigs of RAM struggles to load web pages

[–] GreenShimada@lemmy.world 264 points 5 days ago (3 children)

For anyone unsure: the Jevons paradox is that when there's more of a resource to consume, humans will consume more of it rather than use the gains to do the same with less.

Case in point: AI models could be written to be more efficient in token use (see DeepSeek), but instead AI companies just buy up all the GPUs and shove more compute in.

As for the expanding bloat, the same goes for phones. Our phones are orders of magnitude better than they were 10 years ago, and now they're loaded with bloat because the manufacturer thinks, "Well, there's more compute and memory. Let's shove more bloat in there!"

[–] VibeSurgeon@piefed.social 83 points 5 days ago (1 children)

Case in point: AI models could be written to be more efficient in token use

They are being written to be more efficient in inference, but the gains are being offset by trying to wring more capabilities out of the models by ballooning token use.

Which is indeed a form of the Jevons paradox.

[–] errer@lemmy.world 32 points 5 days ago (3 children)

Costs have been dropping by a factor of 3 per year, but token use increased 40x over the same period. So while the efficiency is contributing a bit to the use, the use is exploding even faster.
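
The arithmetic works out roughly like this (a back-of-the-envelope sketch; the 3x-per-year and 40x figures come from this comment, and the two-year window is an assumption):

```python
# Back-of-the-envelope: per-token cost falls 3x per year while token
# volume grows 40x over the whole period (two years assumed here).
years = 2
cost_drop_per_year = 3
token_growth_total = 40

cost_factor = 1 / cost_drop_per_year ** years    # per-token cost after 2 years
spend_factor = token_growth_total * cost_factor  # total spend multiplier

print(f"per-token cost: x{cost_factor:.3f}, total spend: x{spend_factor:.1f}")
# Even with a ~9x cheaper token, total spend still grows ~4.4x.
```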

[–] GamingChairModel@lemmy.world 32 points 5 days ago (2 children)

Jevons paradox is that when there's more of a resource to consume, humans will consume more resource rather than make the gains to use the resource better.

More specifically, it's when an improvement in efficiency causes the underlying resource to be used more, because the efficiency reduces cost, and using that resource then becomes even more economically attractive.

So when factories got more efficient at using coal in the 19th century, England saw a huge increase in coal demand, despite using less coal for any given task.
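
The mechanism can be sketched numerically (a toy model with made-up numbers; the constant-elasticity demand curve and the elasticity value are illustrative assumptions, not from the comment):

```python
# Toy Jevons-paradox model: doubling efficiency halves the coal needed per
# unit of work, which halves the effective price of work. If demand for
# work more than doubles in response, total coal burned goes UP.
def coal_burned(efficiency: float, demand_elasticity: float) -> float:
    base_work = 100.0                 # work demanded at efficiency 1.0
    coal_per_work = 1.0 / efficiency  # effective "price" of a unit of work
    # Constant-elasticity demand: work demanded rises as its price falls.
    work = base_work * (coal_per_work ** -demand_elasticity)
    return work * coal_per_work

before = coal_burned(efficiency=1.0, demand_elasticity=1.5)
after = coal_burned(efficiency=2.0, demand_elasticity=1.5)
print(before, after)  # with elasticity > 1, the efficient factories burn MORE
```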

[–] bampop@lemmy.world 115 points 5 days ago

My PC is 15 times faster than the one I had 10 years ago. It's the same old PC but I got rid of Windows.

[–] brotato@slrpnk.net 128 points 5 days ago (2 children)

The tech debt problem will keep getting worse as product teams keep promising more in less time. Keep making developers move faster. I’m sure nothing bad will come of it.

Capitalism truly ruins everything good and pure. I used to love writing clean code and now it’s just “prompt this AI to spit out sloppy code that mostly works so you can focus on what really matters… meetings!”

[–] GenderNeutralBro@lemmy.sdf.org 92 points 5 days ago* (last edited 5 days ago) (21 children)

Everything bad people said about web apps 20+ years ago has proved true.

It's like, great, now we have consistent cross-platform software. But it's all bloated, slow, and only "consistent" with itself (if even). The world raced to the bottom, and here we are. Everything is bound to lowest-common-denominator tech. Everything has all the disadvantages of client-server architecture even when it all runs (or should run) locally.

It is completely fucking insane how long I have to wait for lists to populate with data that could already be in memory.

But at least we're not stuck with Windows-only admin consoles anymore, so that's nice.

All the advances in hardware performance have been used to make it faster (more to the point, "cheaper") to develop software, not faster to run it.

[–] udc@lemmy.world 22 points 4 days ago (2 children)

I'm dreading the day poorly optimized vibe coding works its way into mainstream software and creates a glut of technical debt. Performance is gonna plummet over the next 5 years, just wait.

[–] AeonFelis@lemmy.world 49 points 4 days ago (2 children)

Thought leaders spent the last couple of decades propagandizing that features-per-week is the only metric to optimize, and that any bit of efficiency or quality in your software is a clear sign of a missed opportunity to sacrifice it on the altar of code churn.

The result is not "amazing". I'd be more amazed had it turned out differently.

[–] SanicHegehog@lemmy.world 35 points 4 days ago (2 children)

Fucking "features". Can't software just be finished? I bought App. App does exactly what I need it to do. Leave. It. Alone.

[–] Yaky@slrpnk.net 11 points 4 days ago (2 children)

No, never! Tech corps (both devs and app stores) brainwashed people into thinking "no updates = bad".

Recently, I have seen people complain about a lack of updates for: the OS of a handheld emulation device (not the emulator, the OS, which has no glaring issues), and a Gemini protocol browser (the Gemini protocol is simple and has not changed since 2019 or so).

Maybe these people don't use the calculator app because arithmetic was not updated in a few thousand years.

[–] vala@lemmy.dbzer0.com 10 points 4 days ago (1 children)

A big part of this issue is mobile OS APIs. You can't just finish an Android app and be done. It bit-rots so fast. You get maybe 1-2 years with no updates before "this app was built for an older version of Android", then "this app is not compatible with your device".
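
The treadmill is partly policy: Google Play requires apps to target a recent API level (within roughly a year of the latest Android release), so even a "finished" app needs periodic rebuilds. The annual bump looks something like this (a sketch; the application id and version numbers are illustrative, not from any real app):

```gradle
// build.gradle (app module) - the annual ritual for a "finished" app:
// raise targetSdk to satisfy Play's target-API policy, then fix whatever
// behavior changes the new API level introduces.
android {
    compileSdk 35

    defaultConfig {
        applicationId "com.example.finishedapp"  // illustrative
        minSdk 24
        targetSdk 35   // must track recent Android releases to stay listed
    }
}
```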

[–] ChickenLadyLovesLife@lemmy.world 23 points 4 days ago (19 children)

It's kind of funny how eagerly we programmers criticize "premature optimization", when often optimization is not premature at all but truly necessary. A related problem is that programmers often have top-of-the-line gear, so code that works acceptably well on their equipment is hideously slow when running on normal people's machines. When I was managing my team, I would encourage people to develop on out-of-date devices (or at least test their code out on them once in a while).

[–] Michal@programming.dev 17 points 4 days ago (5 children)

PCs aren't faster; they have more cores, so they can do more at a time, but it takes effort to optimize for parallel work. Also, the form factor keeps getting smaller, more people use laptops now, and you can't cheat thermal efficiency.

[–] leftzero@lemmy.dbzer0.com 20 points 3 days ago (2 children)

My first PC ran at 16MHz on turbo.

PCs today are orders of magnitude faster. Way less fun, but faster.

What's even more orders of magnitude slower and infinitely more bloated is software. Which is the point of the post.

It's almost impossible to find any piece of actually optimised software these days (with some exceptions, like SQLite), to the point that 99% of the software currently in use can be considered unintentional (or intentional) malware.

Particularly egregious are web browsers, which seem designed to waste the maximum possible amount of resources and run as inefficiently as possible.

And the fact that most supposedly desktop software these days runs on top of one of those pieces of intentional malware (it's impossible to achieve such levels of inefficiency and bloat unintentionally; it requires active effort) obviously doesn't help.

[–] TheparishofChigwell@sh.itjust.works 3 points 3 days ago (1 children)

Turbo slowed your processor down though

[–] Blue_Morpho@lemmy.world 2 points 1 day ago (1 children)

Only on some, mostly name-brand PCs, which used it for compatibility. For home-built or local-store machines, the turbo would overclock. I remember telling a friend that although their 16 MHz could run at 20, not to do it because it would compromise longevity! Ha! Mind you, CPUs in those days didn't have heat sinks, but still... oh no, your 386 might not work in 20 years from running too hot!

Wow, so it actually did have some turbo behind it. Fascinating, thanks.

[–] ragas@lemmy.ml 4 points 3 days ago* (last edited 3 days ago)

I came from C and C++ and had learned that parallelism is hard. Then I tried parallelism on Rust in a project of mine and it was so insanely easy.
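
That "insanely easy" experience is usually rayon-style data parallelism: swap a sequential map for a parallel one. The same shape exists elsewhere too; a minimal sketch in Python (thread-based, so it mainly helps I/O-bound work, since CPU-bound Python code needs processes because of the GIL):

```python
from concurrent.futures import ThreadPoolExecutor

def slow_square(n: int) -> int:
    # Stand-in for per-item work (a network call, a file read, etc.).
    return n * n

items = list(range(10))

# Sequential map...
sequential = [slow_square(n) for n in items]

# ...becomes a parallel map by swapping in an executor. Executor.map
# preserves input order, so the results line up with the sequential ones.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(slow_square, items))

print(parallel == sequential)  # same results, work spread over threads
```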

[–] EddoWagt@feddit.nl 6 points 3 days ago

What do you mean, PCs aren't faster? Yes, they have more cores, but they also clock higher (mostly) and execute more instructions per clock. Computers now perform way better than ever before in every single metric. Most tasks, even linear ones, could be way faster.

[–] luthis@lemmy.nz 112 points 5 days ago (1 children)

On Linux it really is noticeable

[–] dfyx@lemmy.helios42.de 123 points 5 days ago (2 children)

Well, until you open a browser... or five, because these days nobody wants to build native applications anymore; instead they shove web apps into Electron containers.

Right now, my laptop doesn't have to run much. Just a combination of KDE, browser, emails, music player, a couple of messengers and some background services. In total, that uses about 9.5 GB of RAM. 20 years ago we would have run the same workload with less than 1 GB.
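
Numbers like that 9.5 GB come straight from the kernel; a minimal sketch of reading them on Linux (Linux-only, parsing the /proc/meminfo format; values there are in kB):

```python
# Read overall memory usage from /proc/meminfo (Linux; values in kB).
def meminfo() -> dict[str, int]:
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            info[key] = int(value.split()[0])  # strip the " kB" suffix
    return info

m = meminfo()
used_gib = (m["MemTotal"] - m["MemAvailable"]) / 1024 / 1024
print(f"in use: {used_gib:.1f} GiB of {m['MemTotal'] / 1024 / 1024:.1f} GiB")
```

MemAvailable (rather than MemFree) is the honest number here: it excludes page cache the kernel can drop on demand, which is memory that is "used" but not hoarded.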

[–] kunaltyagi@programming.dev 74 points 5 days ago (20 children)

The same? Try worse. Most devices have seen input latency go up, and most applications have higher post-input latency as well.

Switching from an old system with old UI to a new system sometimes feels like molasses.
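
That molasses feeling is measurable: wrap the operation in a monotonic timer and put a number on it (a minimal sketch; the 30 ms sleep is a stand-in for whatever the UI actually does per click, and the ~100 ms threshold is the usual rule of thumb for "feels instant"):

```python
import time

def handle_click() -> None:
    # Stand-in for the work an app does per input event
    # (layout, IPC, a network round-trip...).
    time.sleep(0.030)

start = time.perf_counter()   # monotonic clock, safe for measuring intervals
handle_click()
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"click handled in {elapsed_ms:.1f} ms")
# Much past ~100 ms per interaction is where a UI starts to feel like molasses.
```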

[–] Korhaka@sopuli.xyz 25 points 5 days ago

I work in support for a SaaS product, and every single click on the platform takes a noticeable amount of time. I don't understand why anyone pays any amount of money for this product. I have the FOSS equivalent of our software in a test VM and it's far more responsive.

[–] OctopusNemeses@lemmy.world 80 points 5 days ago (18 children)

I'm pretty sure the "unused RAM is wasted RAM" thing has caused its share of damage from shit developers who took it to mean use memory with reckless abandon.
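
The sane reading is "allocate on demand", not "hoard up front". In Python terms, that's the difference between materializing everything and streaming it (a toy sketch; note sys.getsizeof measures only the container, which is exactly the hoarding cost at issue):

```python
import sys

n = 1_000_000

# "Unused RAM is wasted RAM", taken badly: materialize everything up front.
eager = [i * i for i in range(n)]

# Taken sanely: a generator produces values on demand, holding O(1) state.
lazy = (i * i for i in range(n))

print(sys.getsizeof(eager), sys.getsizeof(lazy))
# The list's pointer array alone costs megabytes before the first value is
# consumed; the generator costs a few hundred bytes no matter how large n is.
```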

[–] OwOarchist@pawb.social 90 points 5 days ago (8 children)

When you become one with the penguin, though ... then you can begin to feel how much faster modern hardware is.

Hell, I've got a 2016 budget-model chromebook that still feels quick and snappy that way.

[–] Whitebrow@lemmy.world 76 points 5 days ago (7 children)

I still remember playing StarCraft 2 shortly after release on a $300 laptop, and it ran perfectly well on medium settings.

Looked amazing. Felt incredibly responsive. Polished. Optimized.

Nowadays it's RTX this, framegen that, you need an SSD or loading times are abysmal, and don't forget the 40 GB of storage and 32 GB of RAM for a 3-hour-long walking simulator. How about you optimize your goddamn game instead? Don't even get me started on the price tags for these things.

Software and game development is a spectrum, sure, but holy shit, the share of sloppy releases is so disproportionate that it's hard to see the good ones at times.

[–] addie@feddit.uk 34 points 5 days ago (11 children)

StarCraft 2 was released in 2010, and a quick search indicates the most common screen resolution that year was 1024x768. That feels about right, anyway. A bit under a million pixels to render.

A modern 4K monitor has a bit over eight million pixels, slightly more than ten times as many. So you'd expect the textures and models to be about ten times the size. But modern games don't just have colour textures; they're likely to have specular, normal and parallax maps too, so that's another factor of three or so. The voice acting isn't likely to be in a single language any more either, so there'll be several copies of all the sound files.

A clean StarCraft 2 install is a bit over 20 GB. The 'biggest' game I have is Baldur's Gate 3, at about 140 GB, so really only about seven times as big. That's quite good, considering how much game that is!

I do agree with you. I can't think of a single useful feature that's been added to e.g. MS Office since Office 97, and that version is so tiny and fast compared to the modern abomination. (In fact, in a lot of ways the modern one is worse: some functionality has been removed and never replaced.) And modern AAA games do focus too much on shiny and not enough on gameplay, but the fact that they take a lot more resources has more to do with our computers being expected to do a lot more.
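
The pixel arithmetic above checks out (4K here meaning 3840x2160):

```python
# Render-target growth from a 2010-era screen to a 4K monitor.
old = 1024 * 768       # "a bit under a million pixels"
new = 3840 * 2160      # "a bit over eight million"

print(old, new, round(new / old, 1))
# 786432 8294400 10.5 -> slightly more than ten times as many pixels
```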

[–] auraithx@lemmy.dbzer0.com 67 points 5 days ago (3 children)

Websites are probably a better example, as their complexity and bloat have increased faster than the tech.

[–] oyo@lemmy.zip 57 points 5 days ago (12 children)

Windows 11 is the slowest Windows I've ever used, by far. Why do I have to wait 15-45 seconds to see my folders when I open Explorer? If you have a slow or intermittent Internet connection, it's literally unusable.

[–] khanh@lemmy.zip 7 points 3 days ago (2 children)

Actually I can, because I use Linux.

[–] DontRedditMyLemmy@lemmy.world 45 points 5 days ago (4 children)

I hate that our expectations have been lowered.

2016: "oh, that app crashed?? Pick a different one!"

2026: "oh, that app crashed again? They all crash, just start it again and cross your toes."

[–] merc@sh.itjust.works 28 points 4 days ago (4 children)

You do really feel this when you're using old hardware.

I have an iPad that's maybe a decade old at this point. I'm using it for the exact same things I was a decade ago, except that I can barely use the web browser. I don't know if it's the browser or the pages or both, but most web sites are unbearably slow, and some simply don't work: JavaScript hangs and some elements never load. The device is too old to get OS updates, which means I can't update some of the apps. But that's a good thing, because those old apps are still very responsive. The apps I can update are getting slower and slower all the time.

[–] ssfckdt@lemmy.blahaj.zone 27 points 4 days ago (3 children)

It's the pages. It's all the JavaScript. And especially the HTML5 stuff. The amount of code that is executed in a webpage these days is staggering. And JS isn't exactly a computationally modest language.

Of the 200 kB loaded on a typical Wikipedia page, about 85 kB is JS and CSS.

Another 45 kB goes to a single SVG, which in complex cases is a computationally nontrivial image format.
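
Using the figures from this comment, the share left for actual content is surprisingly small (kB throughout):

```python
# Breakdown of a ~200 kB Wikipedia page load, per the figures above.
total_kb = 200
js_css_kb = 85
svg_kb = 45

other_kb = total_kb - js_css_kb - svg_kb
print(f"JS+CSS: {js_css_kb / total_kb:.0%}, SVG: {svg_kb / total_kb:.0%}, "
      f"everything else: {other_kb / total_kb:.0%}")
# Roughly two thirds of the bytes are scripting, styling, and one image.
```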

[–] BlackLaZoR@lemmy.world 6 points 3 days ago

Unreal Engine is one of the biggest offenders in gaming.

[–] erayerdin@programming.dev 43 points 5 days ago (3 children)

why do anime girls have to be right all the time?

[–] ssfckdt@lemmy.blahaj.zone 18 points 4 days ago

The program expands so as to fill the resources available for its execution

-- C.N. Parkinson (if he were alive today)

[–] kamen@lemmy.world 16 points 4 days ago (6 children)

I paid for the whole amount of RAM, I'm gonna use the whole amount of RAM.

/s

Joke aside, the computer I used a little more than a decade ago used to take 1 minute just to display a single raw photo. I'm a liiiittle better off now.

[–] M0oP0o@mander.xyz 22 points 5 days ago

"Let them eat ram"

[–] ZILtoid1991@lemmy.world 27 points 5 days ago (12 children)

They often are worse, because everything needed to be an Electron app so they could hire cheaper web developers, and also boast about "instant cross-platform support" even when they don't release Linux versions.

Qt and GTK could do the cross-platform support, just not the data collection for big-data purposes.
