Why are refresh rates "off" by a bit?


Why are refresh rates "off" by a bit?

Post by RealNC » 24 Nov 2016, 08:33

This might turn out to be the stupid question of the day, but I'll just ask anyway...

Why are monitors shipped with an EDID that defines modes with refresh rates that aren't as close as possible to the target refresh rate?

For example:

100Hz is actually 99.93Hz.
120Hz is actually 119.982Hz.
60Hz is actually 59.98Hz.

The only mode I've seen so far that's exactly right is 144Hz. It's exactly 144.000Hz. All the others are off.

Yeah, I can change them with CRU and set the pixel clock so that the refresh rates are spot on. But... why not have them spot on to begin with? Is there a reason the monitor vendor engineers choose these "off" timings?


Re: Why are refresh rates "off" by a bit?

Post by Q83Ia7ta » 24 Nov 2016, 10:04

I've seen some products that even show 59Hz in Windows. IIRC, some Japanese products. I guess it's because of NTSC. But NTSC * 2 = 59.94, not 59.98...


Re: Why are refresh rates "off" by a bit?

Post by Chief Blur Buster » 25 Nov 2016, 01:34

Part of the reason is that refresh rates are often calculated from the graphics card's dot clock. The dot clock is exactly equal to (horizontal total) times (vertical total) times (refresh rate).

This can be a very fiddly relationship, especially since the totals can't be too big or too small, and you want the totals to be the same across most refresh rates for consistency and maximum compatibility with the widest variety of monitors, using a common timings formula (such as VESA GTF, the Generalized Timing Formula). Using the same formula, what might be exact at 75Hz might not be exact at 85Hz.

You sometimes have to pick your poison, and/or deviate from a standard timings formula (such as GTF), and/or make it exact at one refresh rate but not another, and/or reduce compatibility with monitors, etc., in order to hit an exact refresh rate. And some graphics cards can't do it at all (e.g. they're forced to choose between 119.999Hz and 120.001Hz) -- older ones had bigger jumps in dot clock granularity due to clockrate divisor limitations, etc.
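To make the relationship concrete, here's a quick Python sketch. The 2080 x 1144 totals and the 0.01 MHz clock step are illustrative assumptions only -- not the timings of any specific monitor, nor the granularity of any specific card:

```python
# Sketch: why a nice round target refresh rate rarely comes out exactly,
# once the dot clock has to snap to the card's granularity.
HTOTAL = 2080            # horizontal total (active + blanking), assumed
VTOTAL = 1144            # vertical total (active + blanking), assumed
CLOCK_STEP_HZ = 10_000   # assume the dot clock moves in 0.01 MHz steps

def nearest_achievable_refresh(target_hz):
    ideal_clock_hz = HTOTAL * VTOTAL * target_hz   # exact dot clock needed
    # Snap to the nearest clock the card can actually generate:
    snapped_clock_hz = round(ideal_clock_hz / CLOCK_STEP_HZ) * CLOCK_STEP_HZ
    return snapped_clock_hz / (HTOTAL * VTOTAL)

for hz in (60, 100, 120, 144):
    print(f"target {hz:>3} Hz -> actual {nearest_achievable_refresh(hz):.4f} Hz")
```

With the same totals reused for every mode, only some targets happen to land on a clock the card can produce exactly; the rest come out a fraction of a hertz off.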

Also, the good old-fashioned NTSC 59.94Hz refresh rate was chosen for color TV (1000/1001 of 60Hz) -- instead of exactly 60Hz -- to solve an image artifact problem caused by the audio carrier overlaid on NTSC video.
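(For reference, 1000/1001 of 60Hz works out to 60000/1001 ≈ 59.9401Hz, which is where the familiar 59.94 figure comes from.)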



Re: Why are refresh rates "off" by a bit?

Post by RealNC » 25 Nov 2016, 09:14

The monitor is using CVT timings (at least as far as I can tell by looking at the timing values), except for 144Hz, for which they have completely non-standard timings (which makes sense, as they want to keep the pixel clock within allowed limits while still having enough blanking for GPUs to be able to adjust their clock speeds and voltages without visible flicker).

With CVT, it's possible to get to 120.002Hz (which is what I've done using CRU). Yet they ship with 119.982Hz. One would think they might want the "120Hz" mode to be close to twice 59.94, but it's not.

So even though your explanation makes sense, it's still not clear why they don't try to get as close as possible to a target rate.

Could it be that they test their modes by measuring the actual refresh rate (with high-speed cameras or other equipment), and it's just that some graphics cards produce slightly different refresh rates with the same timings? So even if the CVT timings would normally give 119.982Hz, the actual measured rate in their labs is 119.88Hz? (Which is 2 * 59.94.)
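(For what it's worth, 2 * 59.94 = 119.88Hz, which is about 0.1Hz, or roughly 850 parts per million, below the 119.982Hz figure, assuming the same timing totals on both ends.)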


Re: Why are refresh rates "off" by a bit?

Post by Chief Blur Buster » 25 Nov 2016, 09:27

Yes, the actual refresh rate from one card might be different from what they measure. That might be part of it -- but it could be an entirely different explanation, such as coming up with timings that work properly on both AMD and NVIDIA. Or they picked a competitor's product and matched its timings. Or another mundane reason, such as a lazy engineer. ;)

Also, a longshot, but 119.998Hz (or another number) measured on a graphics card might actually measure closer to 120.000Hz with atomic-clock-accurate equipment. I'm not sure exactly how accurate the clocks on a graphics card are, but they're obviously not atomic-clock accurate. No pretty LED-illuminated ☢ nuke ☢ logos ☢ on my fancy GPU...
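(As a rough yardstick: if a card's clock were off by 100 ppm -- a ballpark figure for an ordinary crystal oscillator, not a measured GPU spec -- a nominal 120.000Hz mode would land about 120 * 0.0001 = 0.012Hz away from its target.)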


Re: Why are refresh rates "off" by a bit?

Post by ToastyX » 03 Dec 2016, 03:09

The CVT standard actually says to round the pixel clock down to a multiple of 0.25 MHz. I chose not to do that with CRU's automatic options because I want the value I enter to be a minimum, so it picks the next possible multiple of 0.01 MHz.

"Automatic - LCD standard" in CRU calculates 1920x1080 @ 120 Hz to be 120.003 Hz at 285.55 MHz. If you switch to "Manual" and round down the pixel clock, it ends up being 119.982 Hz at 285.50 MHz. If you want exactly 120 Hz, you'll need to set the vertical total to 1125 or 1150, but 1150 will activate strobing on LightBoost monitors. This tool can calculate the totals needed for exact refresh rates: https://www.monitortests.com/pixelclock.php

1920x1080 @ 60 Hz should be exactly 60 Hz when using the CEA standard, which is what most monitors use for TV resolutions.

1920x1080 @ 144 Hz is not exactly 144 Hz either. "Automatic - LCD reduced" uses the same timing parameters that the ASUS VG248QE uses, which actually ends up being around 144.0005 Hz.

Actual refresh rate = pixel clock MHz * 1000000 / horizontal total / vertical total

CRU truncates calculated values to 3 decimal places so I know 144.000 Hz is at least 144 Hz and not something like 143.9995 Hz.
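In Python form, the formula plus that truncation rule looks something like this (the two sample values at the end are just the boundary cases mentioned above, not measured numbers):

```python
import math

def actual_refresh_hz(pixel_clock_mhz, horizontal_total, vertical_total):
    # Actual refresh rate = pixel clock MHz * 1000000 / horizontal total / vertical total
    return pixel_clock_mhz * 1_000_000 / horizontal_total / vertical_total

def displayed_refresh(hz):
    # Truncate (not round) to 3 decimal places, so the displayed value is
    # never higher than the real refresh rate.
    return f"{math.floor(hz * 1000) / 1000:.3f}"

print(displayed_refresh(144.0005))   # -> 144.000 (guaranteed at least 144 Hz)
print(displayed_refresh(143.9995))   # -> 143.999 (clearly below 144 Hz)

# e.g. with the 285.50 MHz / 2080 x 1144 timings assumed in the sketch above:
print(displayed_refresh(actual_refresh_hz(285.50, 2080, 1144)))   # -> 119.982
```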


Re: Why are refresh rates "off" by a bit?

Post by RealNC » 09 Dec 2016, 11:00

Thanks for the explanation. Especially this part:
ToastyX wrote:The CVT standard actually says to round the pixel clock down to a multiple of 0.25 MHz.
That explains a lot about what I see in the default monitor timings.

Mystery solved :)

Edit:
I noticed that some default timings have the exact same values as CRU's "LCD Standard" timings (after rounding the pixel clock down to a multiple of 0.25 MHz), except for the sync polarity. CRU gives "+/-" while the default modes have "+/+".

Do you happen to know if sync polarity is something that matters at all these days? IIRC, it had something to do with CRTs, electron guns and scanlines, so I assume it should be irrelevant with non-CRTs?


Re: Why are refresh rates "off" by a bit?

Post by LagBuster » 04 Dec 2019, 18:41

Sorry for resurrecting this 3 year old thread, but based on what was discussed here, would it be best to leave the refresh rate at the default 59.981 Hz that my monitor comes at, or manually change it to 60 Hz using CRU? Or does it make no difference?


Re: Why are refresh rates "off" by a bit?

Post by RealNC » 05 Dec 2019, 07:24

LagBuster wrote:
04 Dec 2019, 18:41
Sorry for resurrecting this 3 year old thread, but based on what was discussed here, would it be best to leave the refresh rate at the default 59.981 Hz that my monitor comes at, or manually change it to 60 Hz using CRU? Or does it make no difference?
It makes no difference, unless you're going OCD over seeing a whole number.


Re: Why are refresh rates "off" by a bit?

Post by LagBuster » 08 Dec 2019, 10:15

RealNC wrote:
05 Dec 2019, 07:24
It makes no difference, unless you're going OCD over seeing a whole number.
You're right, I didn't notice any difference between those two frequencies in my testing. But for some reason, setting a refresh rate of 62.9 Hz (the max my monitor will go to without frame skipping) eliminated extreme DWM/Aero frame rate drops/stutters on my system (Win 7). I have no idea why; the way I tested was by playing a 4K 60 fps YouTube video. In the past I thought the stutters in the playback were caused by rendering errors, but it turns out Windows DWM hadn't been consistent all along. I noticed it heavily when scrolling in my web browser too, but I'm not sure when that started (probably since I set my DPI scaling to 200%?). I also tried registry tweaks, but those didn't help.

I will report back if I notice any inconsistencies at 62.9 Hz from now on.
