this post was submitted on 28 Sep 2025
Linux Gaming
you are viewing a single comment's thread
Generated frames are created using a neural network; they have nothing to do with the actual game scripts, the game loop, input polling, and so on. FSR does generate frames to interpolate between real frames, but things like physics and input aren't being generated along with them. It's purely visual. I guess you need some basic knowledge of how a computer program and a game engine work to understand this.
Basically, the CPU steps through the simulation in discrete steps. When you use frame gen, if it lowers the actual frame rate, then the CPU is making fewer loops per second over everything: physics updates, input polling (capturing key presses and mouse events), and so on.
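A minimal sketch of that loop (generic C++, not any particular engine's code; all the names here are made up): input gets polled and the world gets updated once per real frame the CPU prepares, and generated frames never pass through this loop at all.

```cpp
#include <chrono>
#include <cstdio>

struct Input { bool forward = false; };
struct World { float player_x = 0.0f; };

// Stand-ins for real engine work, just to keep the sketch runnable.
Input poll_input() { return Input{true}; }          // read key/mouse events
void update_physics(World& w, const Input& in, float dt) {
    const float speed = 5.0f;                       // units per second
    if (in.forward) w.player_x += speed * dt;       // movement scaled by delta time
}
void render(const World& w) { std::printf("real frame, x=%.3f\n", w.player_x); }

int main() {
    World world;
    auto prev = std::chrono::steady_clock::now();
    for (int frame = 0; frame < 5; ++frame) {       // a real game loops until you quit
        auto now = std::chrono::steady_clock::now();
        float dt = std::chrono::duration<float>(now - prev).count();
        prev = now;

        Input in = poll_input();                    // fewer real frames = fewer polls per second
        update_physics(world, in, dt);              // simulation advances here, on the CPU
        render(world);                              // only *real* frames come out of here
    }
}
```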
Oh yeah, now I remember why there's more input lag with frame interpolation turned on. I'm taking a shot right now and it just popped into my head.
Anyway, it's because while frame interpolation adds more frames per second, the "I-frames" (the real frames) you're seeing lag behind by one I-frame. The interpolator can't start showing you in-between frames until it has two real frames to interpolate between.
So you won't start seeing I-frame N-1 until I-frame N (the latest I-frame) has been rendered, which creates extra input lag.
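A toy sketch of that ordering (just to illustrate the idea, not how FSR actually schedules its frames): the newest real frame gets held back so there's always a pair to interpolate between, and that hold-back is exactly the one-frame lag.

```cpp
#include <cstdio>
#include <optional>

struct Frame { int id; };

int main() {
    std::optional<Frame> held_back;            // newest real frame, not yet shown
    for (int n = 0; n < 4; ++n) {
        Frame current{n};                      // real frame N just finished rendering
        if (held_back) {
            // Only now do we have a pair, so we can finally show the *older*
            // real frame plus the in-between frame. What you see is always
            // about one real frame behind the newest one that exists.
            std::printf("show real frame %d (frame %d already exists)\n",
                        held_back->id, current.id);
            std::printf("show interpolated frame between %d and %d\n",
                        held_back->id, current.id);
        }
        held_back = current;                   // hold the newest real frame back
    }
}
```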
Someone correct me if I'm wrong, I'm supposed to be asleep...
It's more that the actual FPS is lower when using FSR in many cases. The frame rate you see on screen doesn't really matter for input lag and such; it's all about how many times per second the CPU can loop through the game logic.
So basically, when you move 10 steps forward in a game, the CPU is running tons of code that takes the time elapsed since the previous frame and works out where the player should be this frame. That elapsed time is the delta time (the change in time between this frame and the last), and it gets multiplied into anything that moves, which gives fluid movement with a variable frame rate. This is why older games that skipped this would slow down if the frame rate dropped, while newer games still calculate the passage of time correctly even if you only have 15 FPS.
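A small sketch of that difference (made-up numbers, assuming a simple "move forward" update): scaling by delta time covers the same distance per second at any frame rate, while a hard-coded per-frame step slows down as FPS drops, which is the old-game behaviour described above.

```cpp
#include <cstdio>

int main() {
    const float frame_rates[] = {60.0f, 15.0f};
    for (float fps : frame_rates) {
        float dt = 1.0f / fps;                   // seconds elapsed per frame
        float speed = 5.0f;                      // intended units per second
        float pos_dt = 0.0f, pos_fixed = 0.0f;
        int frames = static_cast<int>(fps);      // simulate one second of frames
        for (int i = 0; i < frames; ++i) {
            pos_dt    += speed * dt;             // frame-rate independent (delta time)
            pos_fixed += speed / 60.0f;          // hard-coded step tuned for 60 FPS
        }
        std::printf("%2.0f FPS: delta-time movement %.2f units, fixed-step %.2f units\n",
                    fps, pos_dt, pos_fixed);
    }
}
```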
The fake frames have nothing to do with the game engine or its logic; they're deep-faked frames created with a neural network to fill in between real frames. That does give you something very close to extra frames on the GPU, but there's often a performance hit on the real frames since it's a heavy process. The CPU has to stay synced to the GPU's real frames, since some logic is CPU-bound: physics, creating certain buffers, all kinds of stuff. If the GPU's real frame rate is lower, it holds the CPU back too, since the CPU is also involved, to a smaller degree, in rendering real frames: preparing data, sending it to the GPU, and certain rendering-related operations that are faster on the CPU, maybe using MMX or other CPU extensions.
So basically, the fewer real frames you have, the longer the wait between each time your game engine can detect mouse and keyboard events and update the game world, even if you're getting 2-3 times the frame rate with generated frames.
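With made-up numbers (purely hypothetical, just to show the relationship): the gap between input samples is roughly 1000 / real FPS milliseconds, regardless of how many generated frames end up on screen.

```cpp
#include <cstdio>

int main() {
    struct Case { const char* name; float real_fps; float displayed_fps; };
    const Case cases[] = {
        {"native, no frame gen",      60.0f, 60.0f},
        {"frame gen, real FPS drops", 45.0f, 90.0f},
    };
    for (const Case& c : cases) {
        // Input is only sampled once per *real* frame, so the polling interval
        // follows the real frame rate, not the displayed one.
        std::printf("%-26s shows %3.0f FPS, input sampled every %4.1f ms\n",
                    c.name, c.displayed_fps, 1000.0f / c.real_fps);
    }
}
```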