[–] Grass@sh.itjust.works 1 points 1 day ago (1 children)

IDK where you find 'normies' who have any idea what VRR and HDR even mean; those types tend not to even notice when they're running 60Hz on a 265Hz monitor with the resolution set wrong. My users are a bit more advanced than that, but I've had 7 more people ask me to help them switch to Bazzite or Arch-based distros since should-be-built-in apps like Notepad started bugging out for them. Apart from things like 'how install repack?' or helping with mods, which they would have asked me for regardless of OS, I haven't had to do any further actual work on their systems, and the one who had an HDR display proudly told me about looking up how GE-Proton and one Steam launch option/environment variable will enable it.
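
For the curious, the launch option in question was along these lines. Treat it as a sketch rather than gospel: it assumes a Wayland session, gamescope installed, and a GE-Proton build with HDR support, and the exact flags vary by game and distro.

```
# hedged example of Steam launch options for HDR through gamescope;
# DXVK_HDR and --hdr-enabled are real knobs, but whether both are needed depends on the setup
DXVK_HDR=1 gamescope --hdr-enabled -f -- %command%
```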

[–] MudMan@fedia.io 1 points 1 day ago (1 children)

There were literally huge G-Sync logos on the boxes of the last three TVs I helped people buy. When you plug in a game console and press the settings button on my current display in game mode, it pops up a large HUD element that says "VRR" and shows the type of VRR currently active and the current framerate. Every other option and metric is hidden away in a sub-menu.

Not that this matters, because the point of VRR is that you don't need to know it's there. If it's working, the drivers and the display should talk to each other transparently. The end result, if you have a Windows machine with VRR and a Linux machine that doesn't support it and you plug them both into the same display, is, again, that the Windows game will look smoother, regardless of how many fps it's spitting out.
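
If you want to sanity-check whether the driver and the display actually agree, the kernel exposes the connector's VRR capability in sysfs. The card/connector names below are placeholders for whatever your machine uses:

```
# prints 1 if the connected display advertises VRR (path varies by GPU and port)
cat /sys/class/drm/card0-DP-1/vrr_capable
```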

And as always, a reminder I've given many, many, many times in my life, both personally and professionally: "it works on my machine" means nothing and doesn't mean there's no bug or that your code isn't crap. Your anecdotal experience and my anecdotal experience aren't the same, because I have a showstopper bug and your seven friends don't, which still means there's a showstopper bug.

[–] Grass@sh.itjust.works 1 points 1 day ago (1 children)

Yes, that is a blind spot I have created for myself. I go into a store, I see G-Sync is Nvidia, and assume it won't work. I have been avoiding stuff that I know doesn't work, or suspect won't within the decade, for decades. I've been recommending that friends and family avoid certain brands/tech buzzwords on the basis that they probably won't work in a few years, when the maker decides to drop support for version 1, and similar scenarios. Then there's the 'surprise' real-life case of Windows really crossing the line on how shitty it can get away with being, making people want to switch and come to me asking whether this or that Linux distro would work for them.

[–] MudMan@fedia.io 1 points 1 day ago (1 children)

FFS. I mentioned G-Sync because they have a logo. VRR is so common and ubiquitous that there is a VESA certification for it now and a default standard for it in both HDMI and DisplayPort, no Nvidia required. It doesn't matter whether you have G-Sync, AMD's FreeSync (which is an open standard and can be used by any brand of GPU), or generic VRR.

You having had your head in a hole about what features the average display, even an entry-level gaming display, ships with in 2026 doesn't mean they aren't common, important or widely supported. When Nintendo has adopted a universal technology and you haven't, you know you're behind the tech curve.

For the record, plenty of Linux distros have full support for HDR and VRR. Mint just happens to... not.
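
On KDE Plasma's Wayland session, for example, VRR can be toggled per output from the command line. The output name here is just an illustration, and the exact kscreen-doctor syntax may differ between Plasma versions:

```
# let the compositor enable VRR automatically on one output (output name is illustrative)
kscreen-doctor output.DP-1.vrrpolicy.automatic
```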

[–] Grass@sh.itjust.works 1 points 1 day ago (1 children)

I have a display with HDR and VRR, but I found out later that VRR is fucked on all OLED panels of the same and older generations, and the brightness flickers at dumb times, so I can't use it regardless. There's a very specific case in which it works, IIRC, but it would require consideration for every game.

But apart from me and one guy several tax brackets above me, everyone I know uses ~2016 basic-model TVs and monitors off FBM, or similar-era gaming displays from friends who upgraded. Some of them have old HDR-capable displays, but I feel like OLED without HDR beats LCD with HDR any day. The average gamer still uses pretty shit-tier hardware, and a GPU is generally the preferred upgrade over a display when you can get old 1080p screens for free or used 4K for cheap, lack of buzzwords be damned.

[–] MudMan@fedia.io 1 points 14 hours ago

I don't know how you walk into a store today and buy any OLED display without HDR. Every OLED panel I know of currently in production hits all the requirements.

For the record, the average gamer uses a Switch or a PS5 and a phone. The Switch 2 is moving fast, so the average gamer has HDR/VRR support across the board, or will very shortly if they're on Nintendo's ecosystem.