I played games at 60 Hz for most of the time I've been gaming. Once I switched to a high refresh rate display, I was blown away by how fluid games I had previously played on my old 60 Hz display felt. I can't find anyone who thinks a higher frame rate at max settings is a bad thing, but I do know some people who try to max out their frame rate, trading away graphical settings that I would keep at ultra in both competitive and single-player games.
I personally prefer a balanced approach for competitive games like Overwatch 2, Apex Legends, and the like, tuning things so they still look great while putting out as many frames as possible. Usually, dropping shadows and SSAO to low and setting post-processing and effects to medium works for me. Ideally I want to push 120 FPS on average. With my current hardware I don't have to worry as much, but on my old RX 5600 XT I really had to lower settings to get a decent frame rate.
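For anyone curious about the math behind that 120 FPS target, here's a quick frame-time sketch (the FPS values below are just example targets I picked, not tied to any specific game or card):

```python
# Rough frame-time budget math: at a given FPS target,
# each frame has 1000 / FPS milliseconds to finish.
for target_fps in (60, 120, 144, 165):
    budget_ms = 1000 / target_fps
    print(f"{target_fps:>3} FPS -> {budget_ms:.1f} ms per frame")

# Output:
#  60 FPS -> 16.7 ms per frame
# 120 FPS -> 8.3 ms per frame
# 144 FPS -> 6.9 ms per frame
# 165 FPS -> 6.1 ms per frame
```

Doubling the target halves the time the GPU gets per frame, which is why shadows and SSAO usually have to come down before the frame rate can go up.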
For single-player, story-based games I'm looking for the best graphics the game has to offer while staying around 60 FPS. Some games are poorly optimized and just don't run well at higher settings, and that's okay with me.
Which do you prefer? Higher fidelity at 60 FPS, or lower-quality settings so you can match your FPS to your refresh rate?
Is it a balancing act or do you prefer to play with as many frames as possible?
I do both. But I'm stuck on a 72Hz refresh monitor for the time being.
"60 FPS should be enough for everyone" 🙃
I remember when someone said 512 KB is all you need, too.
For shooters, performance (though I do have all settings cranked in Apex and still get 160 FPS at 1440p). For single-player story games, BEAM ME UP, SCOTTY! Crank the settings to Ultra while still aiming for 90+ FPS.
This is the way.
Of note: in Rogue Company at Ultra 1080p, my RX 570 can push 140 FPS. In Halo Infinite on Low... 65-72 FPS depending on the map, and as low as 40.