
Sharpness vs Refresh Rate for an FPS

Posted: 09 Sep 2019, 02:21
by jonhconstantine11
What are your thoughts on sharpness/clarity versus refresh rate for first-person shooter games, Overwatch in particular? I currently play on a 240Hz 27-inch Acer but am leaning towards buying one of the upcoming 4K, 144Hz, HDR, FALD, quantum-dot monitors. I definitely notice a difference between 144Hz and 240Hz, but I've also had positive experiences with greater sharpness and clarity. Also, how will going from a TN panel to an IPS feel, given the difference in pixel response times?

Re: Sharpness vs Refresh Rate for an FPS

Posted: 09 Sep 2019, 12:48
by sharknice
It depends on your gameplay style. I like to do a lot of sniping from very long distances. Sometimes the characters' heads are so small they're only a few pixels wide. Extra resolution helps a lot there. When I went from 1920x1080 to 2560x1440 it definitely helped me in a lot of situations. Beyond that, I think framerate is a lot more important.
For Overwatch, I don't think the extra resolution would be as beneficial as the refresh rate.
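A rough back-of-the-envelope sketch of why extra resolution helps on tiny distant targets (all numbers here are hypothetical, just to illustrate the scaling):

```python
import math

def target_pixels(target_size_m, distance_m, horizontal_fov_deg, horizontal_res):
    """Approximate on-screen width in pixels of a target at a given
    in-game distance, assuming a simple rectilinear projection."""
    # Angular size of the target as seen by the camera, in degrees
    angle_deg = math.degrees(2 * math.atan(target_size_m / (2 * distance_m)))
    # Pixels per degree near the screen centre (small-angle approximation)
    pixels_per_degree = horizontal_res / horizontal_fov_deg
    return angle_deg * pixels_per_degree

# Hypothetical example: a 0.25 m wide head at 60 m with a 103-degree FOV
for res in (1920, 2560, 3840):
    print(res, round(target_pixels(0.25, 60, 103, res), 1))
```

With these made-up numbers the head covers roughly 4-9 pixels depending on resolution, which is exactly the regime where a few extra pixels per target makes a visible difference.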

Re: Sharpness vs Refresh Rate for an FPS

Posted: 09 Sep 2019, 14:25
by Chief Blur Buster
It varies from use case to use case.

We want to have cake and eat it too.
-- Higher resolution reduces frame rate, but makes some sniping tasks better (camping and aiming at static points).
-- Higher frame rates make a lot of fast-twitch situations better (trying to identify and aim while running and turning all over).

The two goals basically fight against each other, and the tradeoff is hard to solve simultaneously.

Sharpness of static images
Solution: Increase resolution, increase detail, etc.

Sharpness of moving images
Solution: Decrease motion blur: raise Hz and frame rate, add strobing instead of raising Hz, reduce detail to raise frame rate, etc.
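The moving-image side of this can be sketched with the usual persistence rule of thumb: for eye-tracked motion on a display, perceived blur trail length is roughly motion speed multiplied by how long each frame stays lit. The speed and pulse-width numbers below are illustrative, not measurements:

```python
def motion_blur_px(speed_px_per_sec, persistence_ms):
    """Perceived motion blur trail length in pixels for eye-tracked motion:
    blur ~= motion speed x frame visibility time (persistence)."""
    return speed_px_per_sec * persistence_ms / 1000.0

# Sample-and-hold display: persistence ~= 1000 / Hz milliseconds
for hz in (60, 144, 240):
    print(f"{hz} Hz sample-and-hold:", round(motion_blur_px(1000, 1000 / hz), 2), "px")

# Strobed backlight with a hypothetical 1 ms pulse at the same 1000 px/s speed
print("1 ms strobe:", motion_blur_px(1000, 1.0), "px")
```

This is why strobing can cut blur without raising Hz: it shortens persistence directly, while raising the refresh rate shortens it only in proportion to 1/Hz.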

Indeed, the goals are often contradictory: optimizing for one affects the other.

It takes a lot of expense and an overkill GPU to improve both simultaneously. It's only in recent years we can do 240fps at 1080p... and only in certain games. But now we've got 4K, we've got higher detail in newer games, we've got VR with wide FOV that makes motion blur easier to see, etc.

It depends on the game, the gameplay style, and the tradeoffs the technology is giving you (e.g. you may prefer a cheaper high-quality newer IPS over a low-quality high-Hz older TN). Witness the compromises that many 240Hz screens seem to be giving some readers even as other readers absolutely adore them. With 240Hz 1ms IPS, we're getting bit by bit closer to having our cake and eating it too. But it will be a long journey.

Regardless of optimization goal, one thing is constant: one does have to throw ever more GPU power at this!