this post was submitted on 02 Feb 2026
67 points (86.8% liked)
Linux
There were literally huge G-Sync logos on the boxes of the last three TVs I helped people buy. When you plug in a game console and press the settings button on my current display in game mode, it pops up a large HUD element that says "VRR" and shows the type of VRR currently active and the current framerate. Every other option and metric is hidden away in a sub-menu.
Not that this matters, because the point of VRR is that you don't need to know it's there. If it's working, the drivers and the display talk to each other transparently. The end result, if you have a Windows machine with VRR and a Linux machine without it and you plug them both into the same display, is, again, that the Windows game will look smoother, regardless of how many fps it's spitting out.
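To make the smoothness point concrete, here's a minimal sketch (mine, not from the thread) of why a 48 fps game judders on a fixed 60 Hz display but paces evenly with VRR: with vsync on a fixed-refresh panel, a finished frame can only be shown at the next refresh boundary, so on-screen times alternate between one and two refresh intervals; with VRR, the display refreshes whenever the frame is ready.

```python
from fractions import Fraction
from math import ceil

def onscreen_ms_fixed(fps: int, refresh_hz: int, frames: int) -> list[float]:
    """On-screen duration (ms) of each frame on a fixed-refresh vsync display.

    Frame i finishes rendering at i * frame_interval and is scanned out at the
    first refresh boundary at or after that moment. Exact Fractions avoid
    floating-point edge cases at the boundaries.
    """
    refresh = Fraction(1000, refresh_hz)     # refresh interval in ms
    interval = Fraction(1000, fps)           # frame render interval in ms
    # Scan-out time of frames 1..frames+1 (the extra one closes the last span).
    flips = [ceil(i * interval / refresh) * refresh for i in range(1, frames + 2)]
    return [float(b - a) for a, b in zip(flips, flips[1:])]

def onscreen_ms_vrr(fps: int, frames: int) -> list[float]:
    """With VRR the display refreshes as each frame arrives: uniform pacing."""
    return [1000.0 / fps] * frames

print(onscreen_ms_fixed(48, 60, 8))  # alternates ~16.7 ms and ~33.3 ms -> judder
print(onscreen_ms_vrr(48, 8))        # steady ~20.8 ms per frame
```

Same average framerate in both cases, but the fixed-refresh run shows every fourth frame for twice as long, which is exactly the stutter a viewer notices even when the fps counter looks fine.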
And as always, a reminder I've given many, many, many times in my life, both personally and professionally, "it works on my machine" means nothing and doesn't mean there's no bug or that your code isn't crap. Your anecdotal experience and my anecdotal experience aren't the same, because I have a showstopper bug and your seven friends don't, which still means there's a showstopper bug.
Yes, that is a blind spot I've created for myself. I go into a store, I see G-Sync is Nvidia, and assume it won't work. I've spent decades avoiding stuff that I know doesn't work, or suspect won't within the decade. I've been telling friends and family to avoid specific brands and tech buzzwords on the grounds that they'll probably stop working in a few years, when the maker decides to drop support for version 1 and similar scenarios. Or the 'surprise' real-life case: Windows really crossing the line on how shitty it can get away with being, making people want to switch, and them coming to me to ask if this or that Linux distro would work for them.
FFS. I mentioned G-Sync because they have a logo. VRR is so common and ubiquitous that there's a VESA certification for it now, and a default standard for it in both HDMI and DisplayPort, no Nvidia required. It doesn't matter whether you have G-Sync, AMD's FreeSync (an open standard that any brand of GPU can use), or generic VRR.
You having had your head in a hole about what features even an entry-level gaming display ships with in 2026 doesn't mean they aren't common, important, or widely supported. When Nintendo has adopted a universal technology and you haven't, you know you're behind the tech curve.
For the record, plenty of Linux distros have full support for HDR and VRR. Mint just happens to... not.
I have a display with HDR and VRR, but I found out later that VRR is fucked on all OLED panels of the same and older generations, and the brightness flickers at dumb times, so I can't use it regardless. There's a very specific case in which it works, IIRC, but it would require consideration for every game.
But apart from me and one guy several tax brackets above me, everyone I know uses ~2016 basic-model TVs and monitors from fbm, or similar-era gaming displays handed down from friends who upgraded. Some of them have old HDR-capable displays, but I feel like OLED without HDR beats LCD with HDR any day. The average gamer still uses pretty shit-tier hardware, and a GPU is definitely the preferred upgrade over a display when you can get old 1080p panels for free or used 4K for cheap, lack of buzzwords be damned.