this post was submitted on 26 Jul 2024
310 points (98.1% liked)

[–] tal@lemmy.today 25 points 3 months ago* (last edited 3 months ago) (2 children)

It can get a whole lot worse.

I bought a $500 13th gen CPU that destroyed itself, replaced it (without keeping the dead CPU) with a $500 14th gen CPU that also destroyed itself, and then spent another ~$500 on related hardware to dump the Intel parts and go AMD just to get a working system. I also spent a lot of time trying to resolve the problem. I'd bet I'm not the person who got burned worst, either: someone could very easily have replaced their motherboard, memory, or power supply unit in the hopes of fixing the issue, since any of those could have looked like potential causes, and there'd be no way to prove to Intel that the CPU was actually responsible even if Intel intended to reimburse for that hardware.

At most, I might get $500 back if Intel reimburses me for the 14th gen CPU; based on what they've been doing so far, I'd assume that at best they'd send out another Intel CPU (which I no longer have a use for, having gone AMD).

And I was mostly using this system for fun. While the CPU was regularly corrupting my root filesystem at boot toward the end, I didn't -- as far as I know -- suffer any serious data loss or expense from the corruption. I didn't miss deadlines or lose critical information.

As Steve Burke has pointed out in earlier coverage of this, there are people who have been hit by these kinds of secondary costs, some of which make my own costs look trivial.

He was talking to video game companies that were both using affected processors and serving customers who had them. Some of those companies had apparently banned customers for cheating: they could see that the internal state of the game was incorrect, and while they couldn't figure out what the customers were doing, they knew the game state was being modified. It apparently wasn't the customers cheating at all, but their CPUs, which had partially destroyed themselves and were now corrupting memory.

Another company had been using the CPUs in video game servers, and those kept dying and taking down their service; yet another company estimated that they'd lost $100k in player business due to the problem.

Apparently these CPUs were also popular with hedge funds that do stock trading, due to their high single-threaded performance. I imagine that a system that suddenly stops working or corrupts data can very quickly become extremely expensive in that context, far in excess of what the CPUs cost.

OEMs that built and sold systems containing these CPUs had apparently been taking back systems and repeatedly replacing parts; they probably incurred substantial costs and hits to their reputation, since customers are upset with them.

Same thing with datacenter providers, who incurred a lot of costs investigating and mitigating problems and swapping parts and CPUs. Burke quoted one as advising customers to use an alternative AMD-based system; if they insisted on the Intel one, the provider would charge an additional $1000 service fee to cover the costs of dealing with systems built around these CPUs. That gives an idea of what they were losing.

God only knows what the impact of having a ton of data corrupted around the world will be. Probably only a tiny fraction of the corruption-related problems will ever actually be attributed to the CPUs themselves.

And I don't know how many systems out there aren't fully tracked -- so they won't get the updates that avoid the problem -- but have these CPUs built into them. Industrial automation hardware? Ship navigation systems? Who knows? All kinds of things might fail in absolutely spectacular ways after working fine for a period of time and then, down the road, corrupting data more and more severely.

I mean, Intel might, at best, provide a cash refund for a dead CPU. But they aren't gonna cover losses from secondary problems, and there's no realistic way that most businesses and people who bought these could prove them, anyway.

Buying the last generation they made before this clusterfuck is maybe the best position you could be in while still being affected at all: you got a reasonably fast system that isn't directly impacted. If I'd known about this in advance -- rather than Intel saying nothing -- I'd happily have bought a 12th gen CPU instead of spending another ~$1k on now-useless hardware and a ton of time trying to resolve my problems. At upgrade time, you'll have the option to go AMD or to 15th gen Intel and LGA 1851, if you want to bet that Intel's 15th gen is more solid than their previous two. It just means a new motherboard and, if you're using DDR4 memory, tossing that and buying DDR5.

[–] systemglitch@lemmy.world 10 points 3 months ago (1 children)

Anyone who knows of this and buys 15th gen over AMD is a fool imo. The risk is just so high and AMD has become so solid in the last decade.

[–] BobGnarley@lemm.ee 3 points 3 months ago

Sure would be nice if you could disable the god damn PSP though. Sigh, companies just CAN NOT not spy on you. It's insane.

[–] cRazi_man@lemm.ee 3 points 3 months ago (1 children)

I would have gone AMD in the first place if this had happened at the time of my purchase.

Oh well. Upgrade time is going to be a long way away. My last gaming PC served me well for almost 10 years before I did an in-socket upgrade.

[–] tal@lemmy.today 4 points 3 months ago

I would have gone AMD in the first place if this had happened at the time of my purchase.

Well, you've got better judgement than me. I'd been running just Intel for ~25 years and was comfortable with them, and even when ordering the replacement, I still wasn't absolutely certain that the CPU was at fault until the replacement (temporarily, for a few months) resolved all the problems.

Moving forward, I expect I'll use AMD, unless they manage to do something like this themselves.

My last gaming PC served me well for almost 10 years before I did an in-socket upgrade.

Yeah, not a lot of annual single-threaded performance improvements since the early 2000s. Can very easily use older CPUs just fine for a long time these days, depending upon workload.