this post was submitted on 02 Feb 2026
35 points (64.0% liked)


I haven’t thought about it in a while, but the premise of the article rings true. Desktops are, overall, disposable. GPU generations are only really significant alongside new CPU generations, and CPUs are the same: real performance gains need a new chipset and motherboard. At that point you are replacing the whole system.

Is there a platform that challenges that trend?

Edit: Good points were made. There is a lot to disagree with in the article, especially where it focuses on gaming.

Storage: For the love of your data, storage is a WEAR component, especially HDDs. Up until recently, storage was so cheap it was crazy not to get new drives every few years.
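
For anyone who wants to act on this: SMART data gives an early warning before a worn drive fails. A minimal sketch using smartmontools, assuming `smartctl` is installed and `/dev/sda` stands in for your actual drive (run as root):

```python
import subprocess

def smart_health(device: str = "/dev/sda") -> str:
    """Return the drive's overall SMART health assessment."""
    result = subprocess.run(
        ["smartctl", "-H", device],  # -H prints the overall health self-assessment
        capture_output=True, text=True,
    )
    return result.stdout

if __name__ == "__main__":
    report = smart_health()
    print(report)
    # Healthy drives report "PASSED"; anything else means start shopping.
    if "PASSED" not in report:
        print("Warning: drive did not pass its SMART self-assessment")
```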

Power supplies: Just because the computer still boots doesn't mean the power supply is still good. A PSU will continue to shove power into your system long past its ability to provide clean power. Scope and test an older PSU before you put it in a new build.
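
For reference, "clean power" has a concrete pass/fail line: the ATX12V design guide caps ripple at 120 mV peak-to-peak on the +12V rail and 50 mV on +5V and +3.3V. A minimal sketch of the scope-reading comparison; the measured values below are hypothetical placeholders:

```python
# ATX12V design-guide ripple/noise limits, in mV peak-to-peak.
ATX_RIPPLE_LIMITS_MV = {"+12V": 120, "+5V": 50, "+3.3V": 50}

# Hypothetical oscilloscope readings from an aging PSU under load.
measured_mv = {"+12V": 95, "+5V": 38, "+3.3V": 62}

for rail, limit in ATX_RIPPLE_LIMITS_MV.items():
    reading = measured_mv[rail]
    status = "OK" if reading <= limit else "OUT OF SPEC"
    print(f"{rail}: {reading} mVpp (limit {limit} mVpp) -> {status}")
# Here the +3.3V rail exceeds spec: the machine may still boot,
# but that PSU shouldn't go into a new build.
```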

[–] anubis2814@lemmy.today 3 points 10 hours ago

This might be true for Intel, I don't know; I use AMD. I know the limits of my CPU/GPU pairing. I bought the affordable low-end GPU for the CPU, and in 5 years I'll upgrade to the upper-end GPU when it's really cheap. 5 years after that, I'll get a new computer.

[–] Willoughby@piefed.world 2 points 10 hours ago

My last GPU? $300.

One before that? $300.

Next one? $300.

(buy used)

[–] testaccount372920@piefed.zip 5 points 17 hours ago

The title of this article just doesn't match reality. It really only (maybe) applies to very high end systems that are already pushing the limits of all components. Most people don't have the money to waste on that and have plenty of room to upgrade their hardware for a looong time.

If you don't need much (e.g. no gaming, 3D rendering, etc.), especially if you don't need a dedicated GPU, then you can upgrade for at least a decade before running into issues. To be fair, a laptop should last a decade as well in that case, but at a higher price and while being less repairable.

[–] verdi@tarte.nuage-libre.fr 7 points 20 hours ago (1 children)

The manufacturing of consent to move your machine to the cloud has begun. We had a good run, lads.

[–] worhui@lemmy.world -3 points 13 hours ago (1 children)

You are literally the only person saying that out of this whole exchange.

[–] verdi@tarte.nuage-libre.fr 5 points 11 hours ago* (last edited 11 hours ago)

"This persistent narrative in the media trying to talk consumers out of desktops as being viable options kind of sneakily ties into the greater "you will own nothing and you will be happy" narrative being pushed by big tech.

It's really obvious and it needs to be consistently called out for what it is."

Literally the most upvoted comment in the linked article.

I guess some frogs are just too stupid to figure out they're being slow-boiled, and it's up to us to carry the dead weight out of the pan...

[–] fonix232@fedia.io 19 points 1 day ago

This is categorically untrue with the latest generations of chipsets, CPUs and GPUs. Just look at AMD instead of Intel: AM4/5 cross-compatibility, DDR4/DDR5 combined support and so on.

If anything, today more than ever you can upgrade beyond your current-gen hardware component by component.

[–] lightnsfw@reddthat.com 40 points 1 day ago (1 children)

I have been Ship-of-Theseus-ing my desktop and server for 15 years. This article is fucking stupid.

[–] A_norny_mousse@feddit.org 4 points 22 hours ago* (last edited 22 hours ago)

Aye.

And OP is doubling down.

[–] themachinestops@lemmy.dbzer0.com 15 points 1 day ago (5 children)

Honestly, most people just upgrade the GPU and SSD; after 10-15 years they buy a new desktop. Also, one of the biggest reasons to get a desktop is that it is cheaper than a laptop, lasts longer, and you can change any part that breaks. I have had many laptops where one component basically made the entire device useless, for example soldered RAM; if it had been a desktop, it could easily have been fixed.

[–] brucethemoose@lemmy.world 50 points 1 day ago* (last edited 1 day ago) (6 children)

That’s a huge generalization, and it depends what you use your system for. Some people might be on an old Threadripper workstation that works fine, for instance, and can slap in a second GPU. Or maybe someone needs more cores for work; they can just swap their CPU out. Maybe your 4K gaming system can make do with an older CPU.

I upgraded RAM and storage just before the RAMpocalypse, and that’s not possible on many laptops. And I can stuff a whole bunch of SSDs into the body and use them all at once.

I’d also argue that ATX desktops are more protected from anti-consumer behavior, like soldered price-gouged SSDs, planned obsolescence, or a long list of things you see Apple do.

…That being said, there’s a lot of trends going against people, especially for gaming:

  • There’s “initial build FOMO” where buyers max out their platform at the start, even if that’s financially unwise and they miss out on sales/deals.

  • We just went from DDR4 to DDR5, on top of some questionable segmentation from AMD/Intel. So yeah, sockets aren’t the longest lived.

  • Time gaps between generations are growing as silicon gets more expensive to design.

  • …Buyers are collectively stupid and bandwagon. See: the crazy low end Nvidia GPU sales when they have every reason to buy AMD/Intel/used Nvidia instead. So they are rewarding bad behavior from companies.

  • Individual parts are more repairable. If my 3090 or mobo dies, for instance, I can send it to a repairperson and have a good chance of saving it.

  • You can still keep your PSU, case, CPU heating, storage and such. It’s a drop in the bucket cost-wise, but it’s not nothing.

IMO things would be a lot better if GPUs were socketable, with LPCAMM on a motherboard.

[–] A_norny_mousse@feddit.org 4 points 21 hours ago* (last edited 21 hours ago)

You nailed it, except "huge generalization" is actually being generous. The article is simply wrong. The author is speaking esoteric technobabble:

The upgrade death spiral (...) happens because upgrading one component of your computer can unbalance the system.

It's the sort of argument a husband might give his not-tech-savvy wife when she asks why he repeatedly needs to spend so much $$$ on something only he uses.

I think FOMO says it pretty well, or simply consumerism.

Now that hardware is getting more expensive again, this is really sending the wrong message.

And OP keeps doubling & tripling down despite basically every comment disagreeing. I think they wrote that article.

[–] claymore@pawb.social 2 points 20 hours ago* (last edited 20 hours ago)

Don't forget about PCIe expansion. Just yesterday I got a FireWire PCIe card for 20€ to transfer old DV tapes to digital with no quality loss. Plug the card in and you're done. To get the same result on a laptop I'd need a Thunderbolt port and two adapters, one of which isn't manufactured anymore and goes for 150€+ in secondhand stores.

PS. I would remove "CPU heating" from your system if I were you :)

[–] kreskin@lemmy.world 4 points 1 day ago (1 children)

If my 3090 or mobo dies, for instance, I can send it to a repairperson and have a good chance of saving it.

While throwing out working things is terrible, the cost of servicing a motherboard outpaces the cost of replacing it. They can still charge you 200 dollars and tell you the board can't be fixed, right? I think the right balance is to observe the warranty period, try to troubleshoot it yourself, and then call it a day, unless you have a 400+ dollar motherboard.

[–] brucethemoose@lemmy.world 1 points 1 day ago* (last edited 1 day ago)

Yeah, probably. I actually have no idea what they charge, so I’d have to ask.

It'd be worth it for a 3090 though, no question.

[–] sorghum@sh.itjust.works 17 points 1 day ago (1 children)

Disposable my ass. I just did the final upgrades to my AM4 platform to be my main rig for the next 5 years. After that it will get a storage upgrade and become a NAS and do other server stuff. Seven years in, this computer has another 15 left in it.

[–] Lfrith@lemmy.ca 6 points 1 day ago (1 children)

Yeah, it's crazy that someone could have gotten, like, a Ryzen 5 1600 and then upgraded to a 5800X3D around 5 years later without needing to buy a new motherboard, which would usually also mean having to buy a new set of RAM.

For a long time, when Intel was dominant, upgrading to a newer CPU usually meant doing a whole new build.

[–] sorghum@sh.itjust.works 1 points 1 day ago

Yeah, I usually over-spec when I build my main rig because I want it to last, so I can repurpose it later down the road. I finally retired a power supply that I bought back in the mid-2000s. It can't power modern cards anymore, unfortunately. 🫡 PC Power and Cooling single rail, take a break. You've earned it.

[–] mp3@lemmy.ca 27 points 1 day ago (1 children)

Personally, I still prefer the desktop because I can choose exactly where I want performance and where I can make some tradeoffs. Also, parts are easier to replace when they fail, making desktops more sustainable. You don't have that choice with a laptop, since it's all prebuilt.

[–] socphoenix@lemmy.world 15 points 1 day ago

Desktops also offer better heat dissipation and peripheral replacement, extending the life of the unit. Frankly, it can be difficult for most folks to replace a laptop display or even a battery nowadays.

[–] snooggums@piefed.world 11 points 1 day ago

AMD challenges that trend, but the article writer dismisses them because of Intel's market share.

Terrible article.

[–] A_norny_mousse@feddit.org 15 points 1 day ago (1 children)

CPUs are the same: real performance gains need a new chipset and motherboard. At that point you are replacing the whole system.

I find the quoted statement untrue. You still have all peripherals, including the screen, the PSU, and the case.

You can replace components as and when it becomes necessary.

You can add hard drives instead of replacing a smaller one with a larger one.

Desktop mobos are usually more upgradeable with RAM than laptops.

There are probably more arguments that speak against the gist of this article.

[–] klymilark@herbicide.fallcounty.omg.lol 9 points 1 day ago (4 children)

Yes, desktop PCs challenge that trend. If you're not chasing the newest of the new, you can keep using your old stuff till it dies. I've done one CPU upgrade and one GPU upgrade to my desktop in the eight years I've owned it, and it handles all of my games fine.

If you're changing the motherboard, you'll usually need a new CPU, and sometimes new RAM. As long as your mobo has a PCIe slot, you can shove your old graphics card in there. Unless there's a new RAM version you don't need to replace the RAM, and SATA's been the standard storage connector for how long now?

Unless you're going above your current PSU's rating, that thing's good until it's dead.

I just don't see how this argument holds up. If your motherboard is old enough that they no longer make your CPU/RAM socket and you're looking to upgrade, chances are very good that thing's lived far longer than most laptops would be expected to. But like. When I built my current desktop 8 years ago, it had 8 GB of RAM and a... I don't remember the graphics card; I know the processor was a Pentium G something, and like 1 TB of storage. It has an i7 (don't remember the generation offhand), an R9 290, 32 GB of RAM, and 7 TB of storage now. Same motherboard. If I replace it I will need a new processor and new RAM (the RAM is actively dying, so I haven't been using it much), but these parts are all nearly a decade old, with the exception of the RAM. Well. One RAM stick is 8 years old, but that's beside the point.

This just doesn't line up with my own personal experience?

[–] fyrilsol@kbin.melroy.org 16 points 1 day ago (3 children)

Everything is disposable. I don't think you or the author who wrote that article has a clue. It's a matter of getting things that'll last longer than others and making financially wise purchasing decisions based on the needs of the moment.

Like, I'm not spending $5 on a toothbrush when you need to replace it every 30 days; I buy the cheapest toothbrush I can, since they're all equally made. I will spend some more money on a computer component if I feel it will be a positive increment for my entire system. Replacing my entire system would just set me back big, and it would waste the components already inside that are still good. Plus, if I decide to sell the old system, I'm not going to get good value back.

The only thing I've yet to replace is the case. Why? Because it's still serviceable to me.

I just don't get this stupid logic where you have to replace the entire system. For what? Just to be with the in-crowd of current technology trends? No thanks, I'll build my PC based on what I want out of it.

[–] rimu@piefed.social 12 points 1 day ago

Laptop CPUs are crippled garbage compared to desktop CPUs of the same generation. So there's that.

[–] saltesc@lemmy.world 17 points 1 day ago (1 children)

Let's say that you've just significantly upgraded your GPU. If you were getting the most out of your CPU with your previous GPU, there's a good chance that your new GPU will be held back by that older component. So now, you need a new CPU or some percentage of your new GPU's performance is wasted. Except, getting a new CPU that's worth the upgrade usually means getting a new motherboard, which might also require new RAM, and so on.

This guy's friends should keep him away from computers and just give him an iPad to play with.

[–] Tywele@piefed.social 8 points 1 day ago (1 children)

I don't agree with this article. Everyone I know usually upgrades their GPU until the CPU is bottlenecking it heavily and that is only the case after a few GPU upgrades.

[–] Lfrith@lemmy.ca 3 points 1 day ago

Yeah, and when the CPU is the bottleneck, upgrading the CPU, mobo, and RAM but not the GPU.

This time, though, I only upgraded the CPU, since AM4 supported multiple generations of CPUs. One of the best things to happen to PC building.

[–] RaoulDook@lemmy.world 7 points 1 day ago

Everything in this post is wrong, actually. But if you buy shit parts to build your desktop, you'll have a shitty desktop.

The simple answer is at the motherboard level: look at your motherboard's future expansion capability, and if you started with a good foundation you can do years of upgrades. Your computer case also needs to be big enough to fit extra stuff; full ATX motherboard size is great.

For example, I have a VR gaming rig that runs VR games well on DDR3 RAM and a Sandy Bridge CPU, because it has a decent modern GPU and enough CPU cores + RAM.

[–] artyom@piefed.social 10 points 1 day ago* (last edited 14 hours ago)

Let's say that you've just significantly upgraded your GPU. If you were getting the most out of your CPU with your previous GPU, there's a good chance that your new GPU will be held back by that older component. So now, you need a new CPU or some percentage of your new GPU's performance is wasted.

There's always an imbalance. It doesn't mean it's "wasted". CPU and GPU do different things.

except, getting a new CPU that's worth the upgrade usually means getting a new motherboard

Also not true. AM4 came out in 2016 and they are still making modern processors for it.

Generational performance increases are too small

No one should be upgrading every generation...

Ask yourself this: how much of your current desktop computer has components from your PC from five years ago?

Most of it.

They're also ignoring the concept of repairability. If my CPU dies? Buy another CPU. Maybe upgrade at the same time. CPU dies in your PS5? Fuck you, better throw the whole thing away and buy a new one.
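
For what it's worth, the "wasted performance" claim from the article can be sanity-checked with a crude model: frame rate is roughly capped by whichever side finishes its work last, i.e. min(CPU-limited fps, GPU-limited fps). This ignores how real engines behave frame to frame, and the numbers are made up, but it shows the shape of the argument:

```python
def estimated_fps(cpu_limited_fps: float, gpu_limited_fps: float) -> float:
    """Crude model: the slower of the two pipelines caps the frame rate."""
    return min(cpu_limited_fps, gpu_limited_fps)

old = estimated_fps(cpu_limited_fps=140, gpu_limited_fps=90)   # old GPU: 90 fps
new = estimated_fps(cpu_limited_fps=140, gpu_limited_fps=200)  # new GPU: 140 fps
print(f"before: {old} fps, after: {new} fps")
# The GPU upgrade still delivered +50 fps even though the CPU now
# "bottlenecks"; the unused GPU headroom is future headroom, not waste.
```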

[–] Cyv_@lemmy.blahaj.zone 8 points 1 day ago (1 children)

I disagree that you need to upgrade your CPU and GPU in lockstep. I almost always stagger those upgrades. Sure, I might have some degree of bottleneck, but it's pretty minimal tbh.

I also think it's a bit funny the article mentions upgrading every generation. I've never done that, I don't know a single person who does. Maybe I'm just too poor to hang with the rich fucks, but the idea of upgrading every generation was always stupid.

Repairability is a big deal too. It also means that if my GPU dies, I can just replace that one card rather than buy an entire new laptop, since they tend to just solder things down in laptops.

[–] stealth_cookies@lemmy.ca 3 points 1 day ago

I typically build a whole new PC and then do a mid-life GPU upgrade after a couple of generations, e.g. I just upgraded the GPU I bought in late 2020. For most users there just isn't a good reason to be upgrading your CPU that frequently.

I can see why some people would upgrade their GPU every generation. I was surprised at how much even two-generations-old cards are going for on eBay; if you buy a new card and sell your old one every couple of years, the "net cost per year" of usage is pretty constant.
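
That "net cost per year" point is easy to check with back-of-the-envelope math: what you actually pay is the purchase price minus the resale price, divided by the years you held the card. A quick sketch with hypothetical prices:

```python
def net_cost_per_year(purchase: float, resale: float, years: float) -> float:
    """Effective yearly cost when the outgoing card is sold on upgrade."""
    return (purchase - resale) / years

# Upgrade every two generations (~4 years), recouping less on the old card:
print(net_cost_per_year(purchase=600, resale=250, years=4))  # 87.5 per year

# Upgrade every generation (~2 years), with stronger resale value:
print(net_cost_per_year(purchase=600, resale=420, years=2))  # 90.0 per year
```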

[–] SuiXi3D@fedia.io 7 points 1 day ago

Meanwhile I’ve been using an AM4 board and DDR4 for… well, it’s been a while now.

[–] UnspecificGravity@piefed.social 4 points 1 day ago* (last edited 1 day ago) (1 children)

I think the real thing you have learned is that PC upgrades are largely unnecessary. They are only selling new hardware that is better on paper and they need to create compatibility traps to make you upgrade a bunch of other shit to get that incremental upgrade.

I think a lot of people really just fail to analyze whether the thing they are going to get is worth the cost. Like, if you have a perfectly good DDR4 system, is it really worth a thousand dollars to upgrade every component in order to get, what, an extra 5 FPS? People are spending a lot of money on upgrades expecting the kind of improvements you got ten years ago, and it's just not going to happen, because hardware hasn't been improving at that rate for a long time.

Even still, there are a lot of components that are not cheap that you can reuse regardless of CPU socket and memory compatibility changes. I've used the same PSU and case and drives and network card for a decade. That's all shit I would have had to pay for over and over again with a different type of system.

[–] worhui@lemmy.world 1 points 1 day ago (1 children)

I’ve used the same PSU and case and drives and network card for a decade

How many full backups do you have of the data on a decade-old drive?

Storage is one of the most relevant things to be continually replacing. I have decade old drives as well but they live in tandem with their replacements as mirrors. Until recently storage was so insanely cheap there was little reason not to replace it.

[–] UnspecificGravity@piefed.social 2 points 23 hours ago* (last edited 23 hours ago)

Twelve 4 TB drives in mirrored pairs in a ZFS pool on a headless server, supporting our local media server and various other data storage needs. The only stuff on the desktop is game installs and software.

Most of us who survived desktop gaming and media piracy in the early 2000s learned not to keep anything important locally, because you're gonna have to wipe your OS every now and then.
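
As a side note, the capacity math for that mirrored-pairs layout is simple: each two-way mirror stores one drive's worth of data, so usable space is half the raw total (before ZFS metadata overhead, which this sketch ignores):

```python
drives = 12
drive_tb = 4
mirror_width = 2  # two-way mirrors: each pair holds one copy of the data

raw_tb = drives * drive_tb
usable_tb = raw_tb / mirror_width
print(f"raw: {raw_tb} TB, usable: {usable_tb} TB")  # raw: 48 TB, usable: 24.0 TB
```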

[–] yesman@lemmy.world 4 points 1 day ago

This is a weird way to say that PC tech has stagnated and improvements between "generations" are incremental.

[–] masterspace@lemmy.ca 3 points 1 day ago

The main benefit of a desktop is the price/performance ratio, which is higher because you're trading space and portability for easier thermal management and bigger components.

[–] JakoJakoJako13@piefed.social 3 points 1 day ago

It rings true, but it's not. It's highly dependent on your upgrade plan. You can get a new CPU without a new mobo if you aren't changing architecture, like jumping from AM4 to AM5. The idea that only the cheap parts last the longest isn't true either. I've been on the same GPU for nearly 7 years. It's getting long in the tooth, but when I do decide to upgrade I'm not forced to upgrade anything else. The GPU is the bottleneck, but the bottleneck isn't noticeable unless I'm playing some new AAA game that requires everything under the sun to run it.

That last paragraph, about parts that are 5 to 10 years old making up close to 0% of your build, just isn't true for me either. The newest parts in my PC are three years old at this point: the case, the CPU and mobo, RAM, and an NVMe drive. The case was purely for vanity reasons. I've got an old GPU, an old PSU, 1 NVMe drive, 2 SSDs, and 2 HDDs that are 10 years old. All those parts are older than 5 years. The argument that most people are using PCs that are less than 5 years old sounds like some phone-FOMO shit. I don't buy it.

[–] Samskara@sh.itjust.works 3 points 1 day ago (1 children)

It’s been like that since I can remember. Upgrading can extend the lifespan by a few years, but often it’s a good idea to replace the whole system.

It depends on a lot of factors, of course. If you buy a midrange machine now, you can upgrade it in five years to the specs of a high-end machine from today, which will by then be five years old.

Rarely do you get to take advantage of technology shifts like the move from hard drives to SSDs. A couple of years ago, adding more RAM and an SSD made machines with those bottlenecks usable again; it’s still the best thing you can do for an old laptop or desktop.

Over the last decade, performance hasn’t improved that much for most typical use cases. An i7 from ten years ago with 16 GB of RAM, a 1 TB SSD, and an NVIDIA GTX 1080 is still a decent computer today.

What makes PCs great is that you’re more flexible regarding how you configure your machine. Adding more storage, more ports, extension cards, optical drives inside your machine etc. is just nice.

With a laptop you end up with crappy hubs and lots of cables.

[–] worhui@lemmy.world 1 points 1 day ago

From a pure aesthetics standpoint hubs and cables suck. From a functional standpoint they are equivalent except for the GPU.

[–] RIotingPacifist@lemmy.world 3 points 1 day ago

This has been true for a long time: CPU sockets don't last long enough to make upgrades worth it unless you are constantly upgrading. Whenever I've built a "futureproof" desktop with a mid-to-high-end GPU, by the time I hit performance problems I needed a new motherboard to fit the new CPU anyway. The only really upgradable components are storage and RAM, but you can do that in your laptop too.

The main advantage of Desktops is still that you get much more performance for your money and can decide where it goes if you build it yourself.
