this post was submitted on 08 May 2024
PC Gaming
From the article ...
Anyone know of any details as to why this becomes an issue, why having many cores causes older games to not work properly, requiring Proton to hide extra cores from them?
~Anti~ ~Commercial-AI~ ~license~ ~(CC~ ~BY-NC-SA~ ~4.0)~
I am not aware of any games having a problem with too many cores*. But most of those (from memory) seem like peak Pentium-era games. For the sake of this explanation I'll only focus on Intel, because AMD was kind of a dumpster fire for the pertinent parts of this.
Up until probably the late 00s/early 10s, the basic idea was that a computer processor should be really, really fast and powerful. Intel's Pentium line was basically the peak of this for consumers: one core with little to no threading, but holy crap was it fast, and it had a lot of nice architectural features to make it faster. But once we hit the 4 GHz clock speed range, the technology required to go considerably faster started to get really messy (and started having to care about fundamental laws of physics...). And it was around this time that we started to see the rise of the "Core" line of processors. The idea being that rather than have one really powerful processor, you would have 2 or 4 or 8 "kind of powerful" processors. Think "i4", as it were. And now we are at the point where we have a bunch of really powerful processors and life is great.
But the problem is that games (and most software outside of HPC) were very much written for those single powerful cores. So if Dawn of War ran fast back in the day on a chonky 4 GHz Pentium, it didn't have the logic to split that load across two or three cores of a 3 GHz "i4". So you were effectively taking a game meant to run on one powerful CPU core and putting it on one weaker CPU core that also may have lower bandwidth to memory or be missing instructions that helped speed things up.
To put it in video game (so, really, gun) terms: it is the difference between playing with a high-powered DMR and switching to a machine gun, but still treating it like it is semi-auto.
But the nice thing is that compatibility layers (whether it is settings in Windows or funkiness with Wine/Proton) can increasingly use common tricks to make a few threads of your latest AMD chip behave like a pretty chonky Pentium processor.
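To make the "hide the extra cores" part a bit more concrete, here's a rough C++ sketch of the general idea on Linux: shrink the process's CPU affinity mask so the scheduler only ever runs it on a couple of cores. To be clear, this is my own toy illustration of the concept, not how Proton actually does it (Wine fakes the reported CPU topology instead, and I believe exposes a knob along the lines of WINE_CPU_TOPOLOGY for it, though don't quote me on the exact name).

```cpp
// Toy illustration (not Proton's actual mechanism): restrict this process to
// cores 0 and 1 so everything it runs competes for just those two cores.
// Linux/glibc only; g++ defines _GNU_SOURCE by default, which sched.h needs
// for cpu_set_t and the CPU_* macros.
#include <sched.h>   // sched_getaffinity, sched_setaffinity, CPU_* macros
#include <cstdio>

int main() {
    cpu_set_t mask;

    sched_getaffinity(0, sizeof(mask), &mask);
    std::printf("cores this process may use before: %d\n", CPU_COUNT(&mask));

    CPU_ZERO(&mask);
    CPU_SET(0, &mask);   // pretend we're a dual-core machine
    CPU_SET(1, &mask);
    if (sched_setaffinity(0, sizeof(mask), &mask) != 0) {
        std::perror("sched_setaffinity");
        return 1;
    }

    sched_getaffinity(0, sizeof(mask), &mask);
    std::printf("cores this process may use after:  %d\n", CPU_COUNT(&mask));
    return 0;
}
```

The same idea from the outside is just `taskset -c 0-3 <game>` on Linux, or setting processor affinity in Windows' Task Manager, which has long been one of the old-school workarounds for exactly these games.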
*: Speculation as I am not aware of any games that did this but I have seen a lot of code that did it. A fundamental concept in parallel/multithreaded programming is the "parallel for". Let's say you have ten screws to tighten on your ikea furniture. The serial version of that is that you tighten each one, in order. The parallel version is that you have a second allen key and tell your buddy to do the five on that side while you do the five on this side. But a lot of junior programmers won't constrain that parallel for. So there might be ten screws to tighten... and they have a crew of thirty people fighting over who gets to hold the allen key and who tightens what. So it ends up being a lot slower than if you just did it yourself.
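For anyone who hasn't run into a "parallel for" before, here's roughly what the unconstrained vs. constrained version looks like. This is just a made-up toy in C++/OpenMP, not code from any actual game:

```cpp
// Toy example of an unconstrained vs. constrained "parallel for" in OpenMP.
// Compile with: g++ -fopenmp parallel_for.cpp
#include <omp.h>
#include <cstdio>

int main() {
    const int screws = 10;

    // Unconstrained: the runtime typically spawns one thread per core, so on a
    // 32-core machine you get thirty-odd "helpers" fighting over ten tiny jobs.
    #pragma omp parallel for
    for (int i = 0; i < screws; ++i) {
        std::printf("helper %d tightened screw %d\n", omp_get_thread_num(), i);
    }

    // Constrained: cap the crew at two, so the coordination overhead stays
    // proportional to the actual amount of work.
    #pragma omp parallel for num_threads(2)
    for (int i = 0; i < screws; ++i) {
        std::printf("helper %d tightened screw %d\n", omp_get_thread_num(), i);
    }
    return 0;
}
```

With only ten iterations of trivial work, the thread start-up and scheduling overhead of the first loop can easily cost more than the loop itself, which is the "thirty people and one allen key" situation.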
Upvote for the great allen key analogy in the footnote
Thank you for that reply/write up!
I'm making that assumption, correctly or incorrectly, based on this portion of the article...
It seems to me, based on that description, that there's some kind of quantity issue going on with these games that made a fix from Proton necessary.
Basically, a balancing-act problem that's fixed by just limiting how much balancing you need to do. Using your analogy, it's making sure there are only X wrenches available to assemble that Ikea furniture.
But that's just a guess on my part.
~Anti~ ~Commercial-AI~ ~license~ ~(CC~ ~BY-NC-SA~ ~4.0)~
Far Cry 4 is from 2014; it feels weird that it has the same problem.
DoW Retribution needs to be played on a specific version of Proton (6.1-GE-2), otherwise you are kicked from multiplayer after some time. Hope they fixed that too; more FPS is cool and all, but worthless for multiplayer if I'm kicked.
Doesn't The Witcher 2 support Linux natively anyway?
You're assuming the Linux code base for that game doesn't potentially have the same issue. It depends on how much of the two code bases share common code, etc.
~Anti~ ~Commercial-AI~ ~license~ ~(CC~ ~BY-NC-SA~ ~4.0)~
Sure, but then that would be an issue for CDPR to fix, rather than Proton
Yep, unless it has something to do with how Proton does its emulation/layer work, vis-a-vis quantity of cores, etc.
I personally don't know enough about it to say, either way.
~Anti~ ~Commercial-AI~ ~license~ ~(CC~ ~BY-NC-SA~ ~4.0)~