this post was submitted on 05 Jan 2026
635 points (99.7% liked)

PC Gaming

[–] RamRabbit@lemmy.world 251 points 4 days ago (5 children)

Yep. Intel sat on their asses for a decade pushing quad cores that you had to pay extra to even overclock.

Then AMD implements chiplets, comes out with affordable 6, 8, 12, and 16 core desktop processors with unlocked multipliers, hyperthreading built into almost every model, and strong performance. All of this while also not sucking down power like Intel's chips still do.

Intel cached in their lead by not investing in themselves and instead pushing the same tired crap year after year onto consumers.

[–] degen@midwest.social 2 points 2 days ago

cached in their lead

There are so many dimensions to this

[–] real_squids@sopuli.xyz 111 points 4 days ago (5 children)

Don't forget the awfully frequent socket changes

[–] nokama@lemmy.world 46 points 4 days ago (2 children)

And all of the failures that plagued the 13th and 14th gens. That was the main reason I switched to AMD. My 13th gen CPU was borked and had to be kept underclocked.

[–] umbrella@lemmy.ml 2 points 2 days ago (1 children)
[–] nokama@lemmy.world 3 points 2 days ago* (last edited 2 days ago)

It would cause system instability (programs/games crashing) when running normally. I had to underclock it through Intel's XTU to make things stable again.

This was after all the BIOS updates from ASUS and with all BIOS settings set to the safe options.

When I originally got it, I did notice it was getting insanely high scores in benchmarks. Then the story broke of how Intel and motherboard manufacturers were letting the CPUs clock as high as possible until they hit the thermal limit. Mine started to fail about a year after I got it, I think.

[–] bufalo1973@piefed.social 1 points 2 days ago

In the 486 era (the '90s) there was an unofficial story about the way Intel speed-rated its CPUs: instead of starting slow and accelerating until failure, they would start as fast as possible and slow down until it didn't fail.
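
If the story is true, the two approaches boil down to something like this toy sketch (Python, purely illustrative; is_stable() stands in for whatever validation test they actually ran, and the numbers are arbitrary):

```python
# Toy illustration of the two speed-binning strategies from the story above.
# is_stable() is a hypothetical stand-in for the fab's real validation test;
# clock values are arbitrary units.

def bin_bottom_up(is_stable, floor=20, ceiling=100, step=5):
    """Start slow and raise the clock until the next step would fail."""
    clock = floor
    while clock + step <= ceiling and is_stable(clock + step):
        clock += step
    return clock

def bin_top_down(is_stable, floor=20, ceiling=100, step=5):
    """Start at the maximum clock and back off until the part passes."""
    clock = ceiling
    while clock > floor and not is_stable(clock):
        clock -= step
    return clock

# Example: a part that is only stable up to 60 gets the same rating either way.
part = lambda clock: clock <= 60
print(bin_bottom_up(part), bin_top_down(part))  # 60 60
```

Either way you land on the same rating; the story is really just about which direction the testing runs.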

[–] Junkers_Klunker@feddit.dk 21 points 4 days ago (1 children)

Even within the same socket family (looking at you, LGA1151) you can run into compatibility problems.

[–] captain_aggravated@sh.itjust.works 37 points 4 days ago (2 children)

I think AMD also did a smart thing by branding their sockets. AM4, AM5, what do you think is going to be next? I bet it's AM6. What came after the Intel LGA1151? It wasn't LGA1152.

[–] Junkers_Klunker@feddit.dk 16 points 4 days ago (1 children)

Yea, for the customer it really doesn't matter how many pins a certain socket has, only whether it's compatible or not.

[–] captain_aggravated@sh.itjust.works 3 points 4 days ago (2 children)
[–] Junkers_Klunker@feddit.dk 3 points 3 days ago

Holy shit, cross-compatibility between manufacturers? We came this close to the almighty above and still ended up where we are today 🤦‍♂️

[–] ripcord@lemmy.world 2 points 4 days ago

I remember Slot 2

[–] 1Fuji2Taka3Nasubi@piefed.zip 6 points 4 days ago (1 children)

AMD tried the Intel thing too, though, by dropping support for past-generation CPUs on later AM4 boards. Only after public outcry did they scrap that. Wouldn't put it past them to try it again on AM5.

[–] captain_aggravated@sh.itjust.works 4 points 3 days ago (1 children)

Are there a lot of people wanting to plug Zen 1 chips into B550 motherboards? Usually it's the other way around, upgrading the chip in an old motherboard.

[–] 1Fuji2Taka3Nasubi@piefed.zip 1 points 3 days ago

It can happen if the old motherboard fails, which was more likely than the CPU failing.

There was also talk of not providing firmware updates for old chipsets to support new-gen CPUs, which is relevant to the cases you mentioned.

[–] kieron115@startrek.website 13 points 3 days ago* (last edited 3 days ago)

I just read the other day that at least one motherboard manufacturer is bringing back AM4 since DDR4 is getting cheaper than DDR5, even with the "this isn't even manufactured anymore" price markup. That's only even possible because of how much long-term support AMD gave that socket.

[–] UnspecificGravity@piefed.social 17 points 4 days ago (1 children)

As a person who generally buys either mid-tier stuff or the flagship products from a couple years ago, it got pretty fucking ridiculous having to figure out which socket made sense for any given Intel chip. The apparently arbitrary naming convention didn't help.

[–] real_squids@sopuli.xyz 6 points 4 days ago (1 children)

It wasn't arbitrary, they named them after the number of pins. Which is fine but kinda confusing for your average consumer

[–] UnspecificGravity@piefed.social 15 points 4 days ago (1 children)

Which is a pretty arbitrary naming convention, since the number of pins in a socket doesn't really tell you anything, especially when that naming convention does NOT get applied to the processors that plug into them.

[–] billwashere@lemmy.world 3 points 3 days ago (1 children)

Or the 1200 different versions of CPUs. We just got some new Dell machines for our DR site last year and the number of CPU options was overwhelming. Is it really necessary to have that many different CPUs?

[–] real_squids@sopuli.xyz 5 points 3 days ago* (last edited 3 days ago) (1 children)

Tbf AMD is also guilty of that, specifically in the laptop/mobile segment. And the whole AI naming thing is just dumb, although there aren't that many of those

[–] billwashere@lemmy.world 2 points 3 days ago

Well this scheme seems much more reasonable and logical to me.

[–] Valmond@lemmy.dbzer0.com 53 points 4 days ago (1 children)

They really segmented that market in the worst possible way: 2 cores or 4 cores only, whether you could use VMs or overclock, and so on. Add windoze eating up every +5%/year gain.

I remember buying the 2600 (maybe the X) and it was so fast.

[–] halcyoncmdr@lemmy.world 25 points 4 days ago (3 children)

The 2600k was exceptionally good and was relevant well past the normal upgrade timeframes.

Really it only got left behind because of its 4C/8T limit as everything started supporting lots of threads instead of just a couple, and just being a 2nd Generation i7.

[–] Valmond@lemmy.dbzer0.com 1 points 1 day ago

Yes, that was a beast! I was poor and had to wait, so I got the generation after, the 3770K, and the segmentation was already there: I got the overclocking but not the VM stuff...

[–] guynamedzero@piefed.zeromedia.vip 7 points 4 days ago (1 children)

Coincidentally, that’s the exact cpu I use in my server! And it runs pretty damn well.

[–] halcyoncmdr@lemmy.world 7 points 4 days ago (1 children)

At this point the only "issue" with it is power usage versus processing capability. Newer chips can do the same with less power.

Yeahhh, iirc it uses slightly less power than my main cpu for significantly less performance

[–] Trainguyrom@reddthat.com 5 points 4 days ago

Really it only got left behind because of its 4C/8T limit as everything started supporting lots of threads instead of just a couple, and just being a 2nd Generation i7.

Past me made the accidentally more financially prudent move of opting for the i7-4790K over the i5-4690K, which ultimately lasted me nearly a decade. At the time the advice was of course "4 cores is all you need, don't waste the money on an i7", but those 4 extra threads made all the difference in the longevity of that PC.

[–] wccrawford@discuss.online 37 points 4 days ago (1 children)

All of the exploits against Intel processors didn't help either. Not only were they a bad look, but the fixes reduced the speed of those processors, making them a noticeably worse deal for the money after all.
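
For what it's worth, on Linux the kernel reports the vulnerability and mitigation status for the installed CPU under a standard sysfs directory; a minimal sketch to dump it (Python, Linux only, wording varies by kernel version):

```python
# Minimal sketch: print the kernel's speculative-execution vulnerability
# report for this machine's CPU (Linux only).
from pathlib import Path

VULN_DIR = Path("/sys/devices/system/cpu/vulnerabilities")

for entry in sorted(VULN_DIR.iterdir()):
    print(f"{entry.name:25s} {entry.read_text().strip()}")
```

Entries like meltdown or spectre_v2 will read "Not affected", "Vulnerable", or describe the mitigation currently in effect.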

[–] MotoAsh@piefed.social 19 points 4 days ago (1 children)

Meltdown and Spectre? Those applied to AMD CPUs as well, just to a lesser degree (or rather, AMD had its own flavor of similar vulnerabilities). I think they even recently found a similar one for ARM chips...

[–] fuckwit_mcbumcrumble@lemmy.dbzer0.com 17 points 4 days ago (1 children)

Only one of them affected AMD, I forget which. But Intel knew about the vulnerabilities and chose not to fix the hardware ahead of their release.

[–] MotoAsh@piefed.social 6 points 4 days ago

Yea, that definitely sounds like Intel... Though it's still worth pointing out that one of them was a novel way to spy on program memory that affects many CPU types and isn't really indicative of a dropped ball (outside of shipping with known vulnerabilities, anyway).

... The power stuff from the 12th/13th gens or whatever though... ouch, massive dropped ball.

[–] brucethemoose@lemmy.world 7 points 4 days ago

Even the 6-core Phenom IIs from 2010 were great value.

But to be fair, Sandy Bridge ended up aging a lot better than those Phenom IIs or Bulldozer/Piledriver.