Neat!
I didn't know RDNA2 would work with FSR4 too.
With the INT8 model this should work on older cards as well as on NVIDIA and Intel.
How were you able to get FSR4 on RDNA2? Is it a mod for Cyberpunk or a custom Proton version?
there is a modified .dll you can use to replace the one in a game folder… AMD leaked it accidentally when they were releasing some open source stuff
I can send you a link tomorrow or upload it, I'm not at my PC right now.
edit:
here is link https://gofile.io/d/fiyGuj
you need to rename it to amd_fidelityfx_dx12.dll and replace the one in the game folder, and it should work (in Cyberpunk). I had to use OptiScaler for Hogwarts Legacy, as just replacing the .dll made the game crash on launch and it was necessary to spoof it as DLSS
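For anyone scripting this, the swap amounts to something like the sketch below. The paths and the function name are made up, adjust them to your install; the target filename amd_fidelityfx_dx12.dll is the one from this thread. It keeps a backup of the shipped DLL so you can undo the swap.

```python
import shutil
from pathlib import Path

def swap_fsr_dll(game_bin_dir: str, replacement_dll: str) -> None:
    """Replace the game's amd_fidelityfx_dx12.dll, backing up the original once."""
    target = Path(game_bin_dir) / "amd_fidelityfx_dx12.dll"
    backup = target.with_name(target.name + ".bak")
    if not backup.exists():
        # Only back up the first time, so repeated swaps don't clobber the original.
        shutil.copy2(target, backup)
    shutil.copy2(replacement_dll, target)
```

To undo it, just copy the .bak file back over the target.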
That's awesome. Not a fan of using upscaling tech generally, but since they keep trying to improve it, I might give this a try on my 6950 XT out of curiosity.
I'll try this out tomorrow, thanks for the DLL.
Have you tried any games with no official FSR4 support, like Grounded?
Baldur's Gate 3 AFAIK does not officially support FSR4, and this works with it via OptiScaler (I've tried it on Steam Deck). I wanted to try on PC as well, but the game has updated to the officially Linux-supported version and this does not work with it because it's Vulkan-only now. My internet is slow, so I can't be bothered to redownload almost 100 GB just to downgrade the game version. I'll probably have to check what else is in my library.
you used the INT8 quantized leaked version, right? I thought the FP8 version doesn't run on RDNA2.
also I wondered if FSR4 feels like bigger input lag, can you tell?
If it drops the real frame rate more than FSR2, which it does, then yes, you will have more input lag.
Yes, it's the INT8, not FP8 version.
Why would FSR have anything to do with input lag? The only reason input lag would increase is FSR4 being more demanding to run on RDNA2, which would mean lower FPS, and FPS is directly tied to input lag.
But we are talking about 120 FPS vs 150 FPS here when comparing Quality presets, so I doubt you could even tell. And even if you can, just lower the preset; it will still look better and get you to the same performance.
From the multiple games I've tested so far, my conclusion is that I am almost always CPU-limited, even with a 5800X3D (in CP2077, Hogwarts Legacy, Kingdom Come: Deliverance 2). Most areas are CPU-heavy due to a lot of NPCs, and FPS drops in those areas enough that my GPU is bored; the only benefit of FSR in those areas is that FSR4 looks better, but it won't yield any performance gains.
Input lag is caused by frame interpolation, right? Or nah?
It's because game logic is calculated on real frames, and these things lower the real frame rate even though they give you more rendered frames. If you were getting 40 real FPS and then drop to 30 real FPS, you will feel a significant amount of lag even if the fake frames get you to 60 FPS. Basically the game loop is running slower, and stuff like input polling happens less often even though the displayed frame rate is higher.
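The arithmetic behind this is simple enough to show. A rough sketch, assuming input is sampled once per real (simulated) frame, so the worst-case delay between an input event and the frame that sees it is one real-frame interval; the 40/30/60 numbers are the hypothetical ones from this thread:

```python
def input_delay_ms(real_fps: float) -> float:
    """Worst-case input-sampling delay: one real-frame interval, in ms."""
    return 1000.0 / real_fps

# Without frame generation: 40 real FPS, 40 shown -> 25 ms worst-case delay.
no_framegen = input_delay_ms(40)
# With frame generation: overhead drops real FPS to 30, even if 60 are shown
# -> ~33 ms worst-case delay. More lag, despite the higher displayed FPS.
with_framegen = input_delay_ms(30)
```

So the displayed frame rate went up while the input delay got worse, which is exactly the "perceived vs real" mismatch people describe.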
It's kinda the same thing. You get input lag based on the real framerate. Since interpolation requires some extra performance, the base framerate will likely be a bit lower than the framerate without interpolation, which will cause an increase in input lag while providing a smoother image.
It seems that the input lag is more perceived, rather than actually experienced, from what I understand. Like if you go from 30 to 120 fps, you expect the input lag to decrease, but since it stays the same (or slightly worse), you perceive it to be much more severe.
The frame rate isn't going from 30 to 120 FPS. It's actually going from 30 to something like 20. The rendered frames are different from the CPU frames, which handle the game loop (physics, input, simulation, etc.).
yes, that’s why FPS in this case is not a good measure of performance
Very much so. The very reason why we want more fps is to have less input lag, that's my personal take anyway. That's the only reason why I have a beefy computer, so the game can respond quicker (and give me feedback quicker as well).