this post was submitted on 28 Sep 2025
43 points (100.0% liked)
Linux Gaming
Discussions and news about gaming on the GNU/Linux family of operating systems (including the Steam Deck). Potentially a $HOME
away from home for disgruntled /r/linux_gaming denizens of the redditarian demesne.
This page can be subscribed to via RSS.
Original /r/linux_gaming pengwing by uoou.
No memes/shitposts/low-effort posts, please.
you are viewing a single comment's thread
view the rest of the comments
Input lag is caused by frame interpolation, right? Or nah?
It's because game logic is calculated on real frames, and these things lower the real frame rate even though they give you more rendered frames. If you were getting 40 real FPS and then drop to 30 real FPS, you will feel a significant amount of lag even if you are getting 60 FPS in fake frames. Basically the game loop is running slower, and stuff like input polling is happening slower, even though the displayed frame rate is higher.
Framegen is worse the lower your base frame rate is.
The penalty to the speed at which the game runs is much more significant at low frame rates: if you normally run at 40 FPS and framegen gives you 60 (30 real), you have introduced about 8 ms of latency just from that, while the same 25% performance cost going from 180 FPS to 270 (135 real) adds just ~2 ms.
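A quick back-of-the-envelope sketch of that arithmetic (Python, purely illustrative, assuming input latency tracks the real frame time one-to-one):

```python
# Extra input latency from framegen lowering the real frame rate,
# assuming latency scales with the real frame time (a simplification).
def added_latency_ms(base_fps: float, real_fps_with_fg: float) -> float:
    return 1000 / real_fps_with_fg - 1000 / base_fps

print(added_latency_ms(40, 30))    # ~8.3 ms extra at a low base frame rate
print(added_latency_ms(180, 135))  # ~1.9 ms extra at a high base frame rate
```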
The lower your real frame rate, the harder it is to interpolate between frames, because the changes from one frame to the next are much larger, so it will look worse. Also, the lower your frame rate, the longer any mishaps stay on screen, making them more apparent.
Frame generation shouldn't be a bottleneck on the CPU though, should it? That stuff is happening on the GPU. I know I saw a video about this stuff but I can't remember the real reason input lag increases with frame generation/interpolation.
Most games aren't bottlenecked by your CPU at all. The CPU spends a lot of time waiting for the GPU to be done drawing you a picture.
"Why isn't the game doing other stuff meanwhile?" you might ask, and part of the answer is surely, "Why do stuff faster than the player can see?", while another part is likely a need to syncronize the simulation and the rendering so it doesn't show you some half-finished state, and a third part might be that it would be very confusing for the player to decouple the game state from what they see on screen, like you see yourself aiming at the monster, but actually it moved in between frames so your shot will miss even if the crosshair is dead on.
Maybe it's not the CPU, but with FSR the real frame rate drops either way, which is why you get input lag. The game logic/game loop only runs once per real frame, so a 20% drop in real frame rate means each real frame takes about 25% longer, and your input latency goes up accordingly.
It's not. The whole point of FG was to take advantage of high refresh rate monitors, as most games can't render 500 FPS even on the fastest CPU... alas, here we are with games requiring FG to get you to 60 FPS on most systems *looks at Borderlands 4 and Monster Hunter Wilds*
Right, but FG shouldn't be touching the CPU in any way, should it? It should be a local thing on the GPU transparent to the CPU, unless I'm misunderstanding how it works.
It's not using CPU
Then I don't understand how it would affect the game loop negatively. I'll look into it though, will do some research.
Because the "delayed" or real input does not correspond to the image you see on the screen. That's why FG is most useful when you already have high base framerate as the input gets significantly lower and the discrepancy between the felt input and perceived image narrows.
Example:
30 FPS base is 33.3 ms frame-to-frame latency (plus something extra from mouse to displayed image).
With 2x FG you get at most 60 FPS, assuming there's no performance penalty for FG. So you see a new frame every 16.6 ms (+ mouse to display), but input still updates every 33.3 ms (+ mouse to display).
Same from a 60 FPS base (16.6 ms) to 120 FPS with FG: 8.3 ms perceived, but input stays at 16.6 ms.
Same from a 120 FPS base (8.3 ms) to 240 FPS with FG: 4.2 ms perceived, input stays at 8.3 ms.
As you can see, the difference between the base FPS latency and the FG FPS latency gets smaller and smaller as you increase the base framerate.
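Here's the same idea as a tiny illustrative script (Python, assuming ideal 2x frame generation with no overhead):

```python
# With ideal 2x frame generation the perceived frame time halves,
# but input is still sampled at the base rate, so the gap between
# what you see and what you feel shrinks as the base FPS rises.
for base_fps in (30, 60, 120):
    input_ms = 1000 / base_fps            # game loop / input still runs at the base rate
    perceived_ms = 1000 / (base_fps * 2)  # what you see with 2x FG
    gap = input_ms - perceived_ms
    print(f"{base_fps} FPS base: see {perceived_ms:.1f} ms, feel {input_ms:.1f} ms, gap {gap:.1f} ms")
```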
This is however a perfect scenario that does not represent real-world cases. Usually your base FPS fluctuates due to CPU- and GPU-intensive scenes, and during those fluctuations you will get big input delay spikes that can be felt a lot, as they suddenly widen the gap between the perceived image and the real input... Couple that with the fact that FG almost always has a performance penalty, since it puts more strain on the GPU, so your base framerate drops and your input latency goes up even further.
It's kinda the same thing. You get input lag based on the real framerate. Since interpolation requires some extra performance, the base framerate will likely be a bit lower than it would be without interpolation, which will cause an increase in input lag while providing a smoother image.
It seems that the extra input lag is more perceived than actually added, from what I understand. Like if you go from 30 to 120 FPS, you expect the input lag to decrease, but since it stays the same (or gets slightly worse), you perceive it as much more severe.
The frame rate isn't going from 30 to 120 FPS. It's actually going from 30 down to something like 20. The rendered frames are different from the CPU frames, which drive the game loop (physics, input, simulation, etc.).
Not sure we have the same definition of frames here.
Generated frames are created using a neural network; they have nothing to do with the actual game scripts, game loop, input polling, and so on. FSR does generate frames to interpolate between real frames, but things like physics and input are not being generated as well. It's only visual. I guess you have to have some basic knowledge of how a computer program and a game engine work to understand this.
Basically the CPU advances the simulation in steps. When you use frame gen, if it lowers the actual frame rate, then the CPU is making fewer loops per second over everything: the physics updates, input polling (capturing key presses and mouse events), and other stuff like that.
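For anyone who wants the mental model as code, here's a hypothetical, heavily simplified game loop (Python, with stub functions standing in for the real work, not any particular engine's API):

```python
import time

# Hypothetical stand-ins for what a real engine does each frame.
def poll_input():            # read key presses / mouse events
    pass

def update_simulation(dt):   # physics, AI, game logic stepped by dt
    pass

def render_frame():          # submit one *real* frame to the GPU
    time.sleep(1 / 30)       # pretend the GPU needs ~33 ms per real frame

previous = time.perf_counter()
for _ in range(10):          # ten iterations = ten real frames
    now = time.perf_counter()
    dt = now - previous      # time since the last real frame
    previous = now

    poll_input()             # input is only sampled here, once per real frame
    update_simulation(dt)
    render_frame()
    # Frame generation would insert extra displayed frames on the GPU side,
    # but this loop never runs more often because of them.
```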
yes, that’s why FPS in this case is not a good measure of performance
Very much so. The very reason why we want more fps is to have less input lag, that's my personal take anyway. That's the only reason why I have a beefy computer, so the game can respond quicker (and give me feedback quicker as well).