this post was submitted on 28 Sep 2025
34 points (100.0% liked)

Linux Gaming


I was trying out FSR4 on my RX 6800 XT on Fedora 42. It works really well and easily beats FSR3 in visuals, even on Performance. It does have a significant performance hit vs FSR3, but it still works out a bit faster than native rendering on Quality.

top 17 comments
[–] Hond@piefed.social 5 points 17 hours ago (1 children)

Neat!

I didn't know RDNA2 would work with FSR4 too.

[–] WereCat@lemmy.world 3 points 11 hours ago

With the INT8 model this should work on older cards as well as on NVIDIA and Intel.

[–] ElectroLisa@lemmy.blahaj.zone 2 points 19 hours ago (1 children)

How were you able to get FSR4 on RDNA2? Is it a mod for Cyberpunk or a custom Proton version?

[–] WereCat@lemmy.world 6 points 19 hours ago* (last edited 19 hours ago) (2 children)

There is a modified .dll you can use to replace the one in the game folder… AMD leaked it accidentally when they were releasing some open source stuff.

I can send you a link tomorrow or upload it, I'm not at my PC right now.

edit:

here is the link: https://gofile.io/d/fiyGuj

You need to rename it to amd_fidelityfx_dx12.dll and replace the one in the game folder, and it should work (in Cyberpunk). I had to use OptiScaler for Hogwarts Legacy, as just replacing the .dll made the game crash on launch and it was necessary to spoof it as DLSS.
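
If you prefer scripting the swap, here's a rough Python sketch. The paths are just example assumptions for a default Steam library install, so adjust them to wherever your game and the downloaded .dll actually live; it also keeps a backup of the original DLL so you can restore it.

```python
# Rough sketch: back up the game's original amd_fidelityfx_dx12.dll and
# drop the (renamed) modded DLL in its place. Paths are example assumptions
# for a default Steam library on Linux - adjust to your install.
from pathlib import Path
import shutil

game_dir = Path.home() / ".local/share/Steam/steamapps/common/Cyberpunk 2077/bin/x64"
modded_dll = Path.home() / "Downloads/amd_fidelityfx_dx12.dll"  # the renamed leaked DLL

target = game_dir / "amd_fidelityfx_dx12.dll"
backup = game_dir / "amd_fidelityfx_dx12.dll.bak"

if target.exists() and not backup.exists():
    shutil.copy2(target, backup)    # keep the original so you can restore it later
shutil.copy2(modded_dll, target)    # overwrite with the modded DLL
print(f"Replaced {target} (backup at {backup})")
```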

[–] juipeltje@lemmy.world 2 points 7 hours ago

That's awesome. Not a fan of using upscaling tech generally, but since they keep trying to improve it, I might give this a try on my 6950 XT out of curiosity.

[–] ElectroLisa@lemmy.blahaj.zone 1 points 18 hours ago (1 children)

I'll try this out tomorrow, thanks for the DLL.

Have you tried any games with no official FSR4 support, like Grounded?

[–] WereCat@lemmy.world 1 points 18 hours ago* (last edited 18 hours ago)

Baldur's Gate 3 AFAIK does not officially support FSR4, and this works with it via OptiScaler (I've tried it on Steam Deck). Wanted to try on PC as well, but the game has updated to the official Linux-supported version and this does not work with it because it's Vulkan-only now. My internet is slow, so I can't be bothered to redownload almost 100GB just to downgrade the game version. Will probably have to check what's in my library.

[–] SitD@lemy.lol 1 points 22 hours ago* (last edited 22 hours ago) (2 children)

You used the INT8 quantized leaked version, right? I thought the FP8 version doesn't run on RDNA2.

Also, I wondered if FSR4 feels like it adds more input lag, can you tell?

[–] DarkAri@lemmy.blahaj.zone 1 points 2 hours ago

If it drops the real frame rate more than FSR2, which it does, then yes, you will have more input lag.

[–] WereCat@lemmy.world 2 points 22 hours ago* (last edited 22 hours ago) (1 children)

Yes, it's the INT8, not FP8 version.

Why would FSR have anything to do with input lag? The only reason input lag would increase is that FSR4 is more demanding to run on RDNA2, which lowers FPS, and FPS is directly tied to input lag.

But we are talking about 120 FPS vs 150 FPS here when comparing Quality presets, so I doubt you could even tell. And even if you can, just lower the preset; it will still look better and get you to the same performance.

From the multiple games I've tested so far, my conclusion is that I am almost always CPU limited even with a 5800X3D (in CP2077, Hogwarts Legacy, Kingdom Come: Deliverance 2). Most areas are CPU-heavy due to a lot of NPCs, and FPS drops enough in those areas that my GPU is bored. The only benefit of FSR there is that FSR4 looks better, but it won't yield any performance benefits.
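
To put some rough numbers on the 120 vs 150 FPS comparison, here's a back-of-the-envelope frame-time calculation. This is only an illustration of the scale of the difference, not measured input lag (actual latency involves more than one frame of delay).

```python
# Back-of-the-envelope frame times for the FPS figures above.
# Input lag is more than one frame time, but this shows the scale of the gap.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (150, 120, 60, 30):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):.1f} ms per frame")

# 150 FPS ~ 6.7 ms and 120 FPS ~ 8.3 ms, so the gap is roughly 1.7 ms,
# compared to 33.3 ms per frame at 30 FPS, where drops are very noticeable.
```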

[–] victorz@lemmy.world 2 points 21 hours ago (2 children)

Input lag is caused by frame interpolation, right? Or nah?

[–] DarkAri@lemmy.blahaj.zone 2 points 2 hours ago* (last edited 2 hours ago)

It's because game logic is calculated on real frames, and these things lower the real frame rate even though they give you more rendered frames. If you were getting 40 real FPS and then drop to 30 real FPS, you will feel a significant amount of lag even if you are getting 60 FPS in fake frames. Basically, the game loop is running slower and stuff like input polling is happening slower even if you have a higher displayed frame rate.
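
A toy sketch of what that means in a game loop (made-up numbers, placeholder functions, just to illustrate that input is only read on real frames while frame generation adds display-only frames in between):

```python
# Toy sketch: input is polled once per *real* (simulated) frame, while frame
# generation only adds extra *displayed* frames in between. Numbers are made up.
import time

REAL_FPS = 30           # what the game loop actually runs at
GENERATED_PER_REAL = 1  # one interpolated frame per real frame -> "60 FPS" on screen

def poll_input():
    pass                # placeholder: read controller/mouse state

def simulate(dt):
    pass                # placeholder: physics, AI, game logic

def present(frame_kind):
    print(frame_kind)   # placeholder: hand a frame to the display

dt = 1.0 / REAL_FPS
for _ in range(3):                      # a few iterations of the loop
    poll_input()                        # input latency is bounded by this rate
    simulate(dt)
    present("real frame")
    for _ in range(GENERATED_PER_REAL):
        present("interpolated frame")   # smoother image, but no new input read here
    time.sleep(dt)
```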

[–] WereCat@lemmy.world 1 points 20 hours ago (1 children)

It's kinda the same thing. You get input lag based on the real framerate. Since interpolation requires some extra performance, the base framerate will likely be a bit lower than without interpolation, which will cause an increase in input lag while providing a smoother image.

[–] victorz@lemmy.world 1 points 18 hours ago (2 children)

It seems that the input lag is more perceived, rather than actually experienced, from what I understand. Like if you go from 30 to 120 FPS, you expect the input lag to decrease, but since it stays the same (or gets slightly worse), you perceive it to be much more severe.

[–] DarkAri@lemmy.blahaj.zone 1 points 2 hours ago

The frame rate isn't going from 30 to 120 FPS. It's actually going from 30 to something like 20. The rendered frames are different from the CPU frames, which handle the game loop (physics, input, simulation, etc.).

[–] WereCat@lemmy.world 1 points 18 hours ago (1 children)

Yes, that's why FPS in this case is not a good measure of performance.

[–] victorz@lemmy.world 2 points 7 hours ago

Very much so. The whole reason we want more FPS is to have less input lag; that's my personal take anyway. That's the only reason I have a beefy computer, so the game can respond quicker (and give me feedback quicker as well).