Sharpness vs Refresh Rate for an FPS

jonhconstantine11
Posts: 1
Joined: 09 Sep 2019, 02:14

Sharpness vs Refresh Rate for an FPS

Post by jonhconstantine11 » 09 Sep 2019, 02:21

What are your thoughts on sharpness/clarity versus refresh rate for first-person shooter games, Overwatch in particular? I currently play on a 240Hz 27-inch Acer but am leaning towards buying one of the upcoming 4K, 144Hz, HDR, FALD, quantum-dot monitors. I definitely notice a difference between 144Hz and 240Hz, but I've also had positive experiences with greater sharpness and clarity. Also, how will going from TN to IPS feel, given the difference in pixel response time?

sharknice
Posts: 295
Joined: 23 Dec 2013, 17:16
Location: Minnesota

Re: Sharpness vs Refresh Rate for an FPS

Post by sharknice » 09 Sep 2019, 12:48

It depends on your gameplay style. I like to do a lot of sniping from very long distances. Sometimes the characters' heads are so small they're only a few pixels wide. Extra resolution helps a lot there. When I went from 1920x1080 to 2560x1440 it definitely helped me in a lot of situations. Beyond that I think framerate is a lot more important.
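
As a rough illustration of why that happens (the numbers are just example assumptions: a rectilinear projection, roughly Overwatch's 103-degree horizontal FOV, and a head that subtends about 0.5 degrees at long range), here is a quick Python sketch:

    import math

    def target_width_px(h_res, hfov_deg, target_deg):
        # Rectilinear projection: on-screen width in pixels of a target of
        # angular size target_deg, centered on a display whose horizontal
        # field of view is hfov_deg.
        return h_res * math.tan(math.radians(target_deg) / 2) / math.tan(math.radians(hfov_deg) / 2)

    # Example assumptions: ~103 deg horizontal FOV, head subtending ~0.5 deg.
    for h_res in (1920, 2560, 3840):
        print(f"{h_res} horizontal pixels: target is ~{target_width_px(h_res, 103, 0.5):.1f} px wide")

With those example numbers, the same head goes from roughly 7 pixels wide at 1920x1080 to about 9 at 2560x1440 and 13 at 3840x2160, which is exactly where the extra resolution pays off.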
For Overwatch, I don't think the extra resolution would be very beneficial compared to the refresh rate.

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Sharpness vs Refresh Rate for an FPS

Post by Chief Blur Buster » 09 Sep 2019, 14:25

It varies from use case to use case.

We want to have our cake and eat it too.
-- Higher resolution reduces frame rate, but makes some sniping tasks better (camping and aiming at static points).
-- Higher frame rates make a lot of fast-twitch situations better (trying to identify and aim while running and turning all over).

It's basically a pair of goals that fight against each other and are hard to satisfy simultaneously.

Sharpness of static images
Solution: Increase resolution, increase detail, etc.

Sharpness of moving images
Solution: Decrease motion blur: raise Hz & frame rate, add strobing instead of raising Hz, reduce detail to raise frame rate, etc.
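
A quick sketch of the moving-image side: on a sample-and-hold display, the blur width your tracking eye sees is roughly motion speed multiplied by how long each frame stays visible. The 1000 pixels/second panning speed below is just an example figure.

    def motion_blur_px(speed_px_per_sec, persistence_ms):
        # Sample-and-hold blur: the eye keeps moving while the frame stays
        # static, so the smear width is roughly speed x frame visibility time.
        return speed_px_per_sec * persistence_ms / 1000.0

    speed = 1000  # example: 1000 pixels/second of panning
    for label, persistence_ms in [("60 Hz", 1000 / 60), ("144 Hz", 1000 / 144),
                                  ("240 Hz", 1000 / 240), ("1 ms strobe", 1.0)]:
        print(f"{label}: ~{motion_blur_px(speed, persistence_ms):.1f} px of blur")

That works out to roughly 17 px of blur at 60Hz, 7 px at 144Hz, 4 px at 240Hz, and about 1 px with a 1ms strobe, which is why raising Hz (or strobing) sharpens motion even though the static image hasn't changed.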

Indeed, the goals are often contradictory: optimizing for one affects the other.

It takes a lot of expense and a lot of overkill GPU to try to improve both simultaneously. It's only in recent years that we can do 240fps at 1080p... and only in certain games. But now we've got 4K, we've got higher detail in newer games, we've got wide-FOV VR that makes motion blur easier to see, etc.
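
Raw pixel throughput is only a crude proxy for GPU load (per-pixel shading cost varies wildly between games), but it shows the scaling:

    # Rough pixel-throughput comparison (resolution x frame rate), in gigapixels/second.
    resolutions = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}
    for name, pixels in resolutions.items():
        for fps in (144, 240):
            print(f"{name} @ {fps} fps: {pixels * fps / 1e9:.2f} Gpix/s")

4K at 240fps is roughly four times the pixel throughput of 1080p at 240fps, which is why "both at once" stays expensive.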

It depends on the game, the gameplay style, and the tradeoffs the technology is giving you (e.g. you may prefer a cheaper high-quality newer IPS over a low-quality high-Hz older TN). Witness the compromises that many 240Hz screens seem to be giving many readers, even as other readers absolutely adore them. With 240Hz 1ms IPS, we're getting bit-by-bit closer to having our cake and eating it too. But it will be a long journey.

Regardless of optimization goal, one thing is constant: one does have to throw ever more GPU at this!
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

