If it's a fast-paced action game, 60 is a must. If it's turn-based, or otherwise just slow enough not to matter, I'll sometimes accept a stable 30, but only if it's truly stable; any dips below that are not okay.
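To put a number on "truly stable": dips show up more clearly in frame times than in an FPS average. A quick Python sketch with a hypothetical frame-time log (the numbers are made up for illustration):

```python
# Hypothetical frame-time log in milliseconds. A "stable 30" means every
# frame lands near the 33.3 ms budget, not just that the average works out.
frame_times = [33.1, 33.4, 33.2, 47.9, 33.3, 33.0, 33.5, 33.2]

avg_fps = 1000 / (sum(frame_times) / len(frame_times))
worst_fps = 1000 / max(frame_times)

print(f"average: {avg_fps:.1f} FPS")        # 28.5 - looks close to 30
print(f"worst frame: {worst_fps:.1f} FPS")  # 20.9 - the spike you actually feel
```

The average hides the 47.9 ms spike, and that one hitch is exactly the kind of dip that makes a "30 FPS" experience feel broken.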
So long as the game doesn't lag enough that I have input lag, I'll gladly play through a game at pretty much any FPS.
When I play it's usually solo games, and I've never had an issue with 20fps+. If performance drops below that, I'm visually OK with 16fps, but usually at that range my system is struggling with the game mechanics, and that's the deal breaker for me.
I feel like 20 FPS would be OK for me if I had absolutely no ability to get at least 25. But 15? 16? That's very jittery. I remember that happening in Alan Wake 2, and it was playable, but to be honest I was kind of annoyed with it.
Depends on what I'm playing.
I can comfortably play some games down to 12fps ±3ish, if it isn't something that's fast paced.
I have yet to play anything where I'm skilled enough for higher than 30fps to matter response-wise, and while I can notice the difference between 60fps and 240fps on my monitor, I gotta say it doesn't do much for me.
Maybe I just don't know what to look for, what I'm missing, or how to set up my laptop right, but who knows. My eyes could be stuck on 720p for all I know.
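For what it's worth, there's simple arithmetic behind diminishing returns at high refresh rates: frame time shrinks nonlinearly with FPS, so 60 to 240 saves far fewer milliseconds per frame than 30 to 60 does. A quick sketch:

```python
# Milliseconds per frame at common framerates: each doubling of FPS
# halves the frame time, so absolute gains shrink as you go higher.
for fps in (30, 60, 144, 240):
    print(f"{fps:>3} FPS -> {1000 / fps:.1f} ms per frame")
# 30 -> 33.3 ms, 60 -> 16.7 ms, 144 -> 6.9 ms, 240 -> 4.2 ms
```

Going from 30 to 60 shaves 16.6 ms off every frame; going from 60 to 240 only shaves 12.5 ms, spread across a much bigger jump in refresh rate.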
If it's not 60 or higher, I can't stand it. But it has to be consistent; constant fluctuations between 120 and 140 are felt even if not necessarily seen. I generally just try to get 60 since my display is 60Hz. What's annoying is that I could be doing 1440p at 60 with my specs, but for some reason setting the display to that specific resolution locks it to 30Hz.
The display is 4K, and has 60Hz available at 4K and every other resolution. My PC can't handle 4K @ 60 for most things, though.
Anything under 90 feels a bit wobbly
Anything over about 90 feels great to me, and I can't even notice the difference between 144 and 240.
There's a reason I only upgraded to a 2K monitor and not 4K: I'm not willing to sacrifice that much performance just to play at a higher resolution. 25 fps is way too low for me.
108 fps is what I play Fallout New Vegas at (to avoid physics behaving too weirdly) and I think that's fine. I think I've gone down to 90 and been somewhat ok with that, but anything below that is no bueno.
Non-fps games I'll cap lower, like 72 fps for a civilization game is perfectly fine.
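Capping New Vegas makes sense given how some older engines tie simulation steps to the frame rate. A toy Python sketch of that general bug pattern (not actual engine code, just an illustration of why uncapped framerates can make physics misbehave):

```python
SPEED = 100.0  # intended movement, in units per second

def frame_tied_distance(fps, seconds):
    # Bug pattern: a per-frame step tuned for 60 FPS, applied every frame,
    # so the simulation speeds up or slows down with the framerate.
    step = SPEED / 60.0
    return step * (fps * seconds)

def delta_time_distance(fps, seconds):
    # Correct pattern: scale each step by the real frame duration (dt).
    dt = 1.0 / fps
    return (SPEED * dt) * (fps * seconds)

print(round(frame_tied_distance(60, 1), 1))   # 100.0 at the tuned rate
print(round(frame_tied_distance(120, 1), 1))  # 200.0 - twice as fast as intended
print(round(delta_time_distance(120, 1), 1))  # 100.0 - consistent at any FPS
```

With the frame-tied pattern, everything runs proportionally faster above the tuned rate, which is roughly the kind of weirdness a frame cap avoids.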
But if you want beautiful games like God of War (or do you mean Gears of War?) and are fine with a lower framerate, that makes sense to me.
I like how we humans have totally different likes and dislikes. I 100% understand you and will never judge you. You like what you like, and that's very good. I mean, God of War, yes. It's freaking gorgeous.
Weirdly enough, I actually care more about framerate in "pancake" (non-VR) games than I do in VR games. I can deal with 10fps in VRChat in a crowded instance; I need more like 20-30 for non-VR games.
That said, I get mentally exhausted when the framerate is <30 for an extended period of time in VRChat.
I can't do VR. It scares the shit out of me having a screen 2" away from my eyes.
Anything VR really needs to be 90 or more, but around 60 is good for most things.
I actually think the choppy framerate in Cyberpunk is really immersive, so it's cool all the way down to 30, even with the smearing of DLSS Performance. But most games don't give you progressive brain damage in the first 2 hours like it does.
Depends on the game, but also the context.
Maybe this has changed since I upgraded my gaming specs, but I used to average 14 FPS in Kerbal Space Program and had a great time with it. Docking is a nightmare at that frame rate, but otherwise it's more than playable.
Back in my poverty gaming days I 100%-ed a pirated The Simpsons: Hit & Run with potato graphics at slideshow speeds. I'm talking multiple seconds per frame, with around 80% of frames dropped.
Nowadays I just care that it looks decent and runs smoothly for the games I play, which are mostly Civilization and Stellaris.
I don't really obsess about framerates myself and I've never had the kind of budget to have the latest and greatest parts but from what I've seen, somewhere around 30fps is fine.
And even though you didn't ask, the last setting that I ever sacrifice is draw distance. I'll turn down textures and shadows and reflections and everything else before I sacrifice draw distance. I don't need realistic graphics to be able to immerse myself and have a good time. But things popping in and out of existence in front of your eyes are the ultimate immersion breaker for me.
My personal minimum is a stable 40/s, which is roughly where I start noticing the lower framerate without paying attention to it.
With 30/s I need time to get used to it, and I usually underclock (or, rather, power-limit) my GPU to hit an average of 50, unless the game in question is either highly unstable (e.g. Helldivers 2) or so light I don't have to care (e.g. Selaco).
60 FPS. I can't stand an unstable framerate; I'd rather lower quality/effects if I can't get a constant 60 FPS.
I think I'd feel like a millionaire if I ever got 90 on a high refresh monitor. Lol. I like me poor and not too spoiled.
Am I the only one who just plays games and doesn't know what FPS he's getting? If it plays, I'm good.
Or... maybe I am missing out on something? Lol
@penquin does it have to be first person? If third person is allowed I'd say Warframe. If not, classic Doom with mods
Shit, I knew a comment like this would come up. I was asking specifically about refresh rate, not first-person shooter games. Let me fix the title 😁
@penquin oh! Well in that case I used to be a 1080p 60Hz monitor kinda guy, and about a year ago I had to upgrade to dual 1440p 165Hz monitors.
While I can definitely feel the difference, 60 FPS is barely noticeable, and even 30 FPS is acceptable.
I grew up with slower machines, so sub-30 was fairly normal; even older consoles targeted 30 and faltered below that. So at this point I'll take anything above what's acceptable for film.
So far, all my mentality/generation folks. <3
I just don't care about FPS, as long as it's 25 or higher. Once you get down to the 20s, you start seeing the jitter.
@penquin like, I can tell the difference under 60, and I can tell it gets choppy under like, 40? But I probably don't make a comment about the "lag" or framerate dropping until it's below 20-30
100%. I can absolutely tell, but I just don't care. I'm here for the fun. Playing God of War with my son, fighting all these bosses, getting into it, and yelling is just way too much fun to worry about FPS.
@penquin sometimes it's even more exciting overcoming the FPS drops, especially when I can tell why it's happening and/or if it's only temporary/rare. I've definitely caused my fair share during some overly modded Doom setups