Blur Busters Forums

Who you gonna call? The Blur Busters! For Everything Better Than 60Hz™

Actual 240Hz monitors quality and 960Hz questions

Everything about displays and monitors. 120Hz, 144Hz, 240Hz, 4K, 1440p, input lag, display shopping, monitor purchase decisions, compare, versus, debate, and more. Questions? Just ask!

Re: Actual 240Hz monitors quality and 960Hz questions

Postby lexlazootin » 15 Dec 2017, 01:54

-AOC AGON AG251FZ (240Hz 1080p): 5ms frame time!!!, 200 fps, 14.8ms input lag, 3 frames at 240fps. BUT this is a fake 240Hz: as the pixel transition time is 5ms, you can't have more than 200 fps shown... Cordially!


Google translate the website:
Pixels can not change state more than 200 times per second, so the monitor can not display the promised 240 frames per second.


This is just bogus. I would seriously disregard this entire website, as I really don't think they understand the methodology and/or the results. It's misinformation, and it's straight-up dangerous.

If you want accurate information check out tftcentral.co.uk
User avatar
lexlazootin
 
Posts: 1248
Joined: 16 Dec 2014, 02:57

Re: Actual 240Hz monitors quality and 960Hz questions

Postby Chief Blur Buster » 15 Dec 2017, 09:08

Pixels are continuously changing (GtG, pixel transitions, grey to grey); they just can't always complete transitions fully on an LCD. It may be 75%, 80%, 99%, or 99.9%.

It manifests itself as ghosting and smearing issues, and the slower the GtG, the more incomplete pixel transitions are likely to be by the time the same pixel needs to refresh again in its next refresh cycle.

GtG numbers are rated from the 10 percent transition milestone to the 90 percent transition milestone.
Meaning, specifically for a transition from black to white, the GtG (pixel transition) benchmark is the time period from 10% (a dark grey) to 90% (a near-white). A pixel may take 1ms to do GtG 10%-90% but 10ms to do GtG 99%. However, even an 80% transition is very visible to the human eye.

Being LCD - Liquid Crystal Display - they are a liquid-like film of rotating molecules essentially being used as shutters for light, so the GtG transition curve is inherently "analoglike" on a graph: pixels are changing continuously but only at a finite speed towards another color. If you only need adjacent shades (like a gradient), a TN-type LCD can easily do it 10,000 times a second - but if you need random-access color (any color to any color), that is massively slower and takes milliseconds - between 0.5ms and 2ms for an 80 percent transition, and more than 5ms for a near-100% transition. One begins to see reasonable motion clarity well below the 100% point of the GtG curve.
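As an illustration of why GtG 10%-90% ratings understate near-complete transitions, here is a minimal sketch modelling a pixel transition as a simple first-order exponential (a rough simplification -- real GtG curves vary by colour pair and overdrive; the 0.5ms time constant is an arbitrary example value):

```python
import math

def time_to_reach(fraction, tau_ms):
    """Time (ms) for an exponential pixel transition to reach a given
    fraction of its final value: level(t) = 1 - exp(-t / tau)."""
    return -tau_ms * math.log(1.0 - fraction)

tau = 0.5  # hypothetical time constant for a fast TN transition, in ms

# Industry GtG rating: time from the 10% milestone to the 90% milestone.
gtg_10_90 = time_to_reach(0.90, tau) - time_to_reach(0.10, tau)

# Time to a near-complete (99.9%) transition -- the long analog tail.
gtg_999 = time_to_reach(0.999, tau)

print(f"GtG 10%-90%: {gtg_10_90:.2f} ms")  # ~1.10 ms
print(f"GtG 99.9%:   {gtg_999:.2f} ms")    # ~3.45 ms
```

Under this toy model, a panel "rated" at ~1.1ms GtG still takes over 3ms to essentially finish the transition, which is the tail that shows up as faint ghosting.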

Anyway:

Better overdrive makes a very massive difference.

That said, the AG251FZ has much blurrier motion than the AG251FG. Both are 240Hz. But the wording should not be put in such a simplistic way: more accurate wording is "Pixel transition speeds appear to not be sufficiently complete on the AG251FZ to push the limits of 240Hz motion quality." ....

At this stage, focus on the 240Hz G-SYNC variants of well-rated 240Hz FreeSync models (good overdrive) as that is extremely critical to nearly maxing out 240Hz motion clarity.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter!
To support Blur Busters: Official List of Best Gaming Monitors | G-SYNC | FreeSync | Ultrawide
User avatar
Chief Blur Buster
Site Admin
 
Posts: 4871
Joined: 05 Dec 2013, 15:44

Re: Actual 240Hz monitors quality and 960Hz questions

Postby Chaps2 » 15 Dec 2017, 19:59

Of course they tested it at the highest rated Hz. They would have to be insane and unprofessional to test it at 60Hz only. I know it's in German or French, so you might not all be able to read it.

The input lag difference might come from you not having the same definition of input lag. For them, it is the time between a frame being sent to the monitor and the actual movement of pixels on the screen, filmed with a high speed camera. Of course, if you subtract pixel response time (as you seem to be doing, or something else), then it's not the same result.

Also, your result of 3.7ms input lag does not look logical to me, as a 240Hz panel can not show more than 1 frame every 4.16ms (1000/240 ≈ 4.16) and that is with a pixel response time of 0 (which does not - and will never - exist).

On the websites I mentioned (they are highly trusted websites with a high professional reputation; they're not gamer or generic hardware websites), they measure pixel response time, input lag, and real frames per second displayed.
Chaps2
 
Posts: 7
Joined: 30 Nov 2017, 09:40

Re: Actual 240Hz monitors quality and 960Hz questions

Postby daggertx » 15 Dec 2017, 20:14

There are a few professional sites that disagree with the German site.

Here is a review from one professional review site:


http://www.tftcentral.co.uk/reviews/asu ... 8q.htm#lag
daggertx
 
Posts: 23
Joined: 28 Nov 2017, 22:09

Re: Actual 240Hz monitors quality and 960Hz questions

Postby Chief Blur Buster » 16 Dec 2017, 22:50

Chaps2 wrote:Also, your result of 3.7ms input lag does not look logical to me, as a 240Hz panel can not show more than 1 frame every 4.16ms (1000/240 ≈ 4.16) and that is with a pixel response time of 0 (which does not - and will never - exist).

You have to understand how an LCD scans out. It's not displayed all at once.

VSYNC ON:
- Top edge is lagless + GtG/monitor processing
- Center is half a refresh cycle lag
- Bottom edge has full refresh cycle lag
- Average input lag is usually same as center-screen input lag (aka half a refresh cycle)

VSYNC OFF:
- Top edge of frameslice is lagless + GtG/monitor processing.
- Center of frameslice has half a frametime of lag.
- Bottom edge of frameslice has a full frametime of lag.
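The VSYNC ON lag gradient above is just the scanout time divided across the screen. A minimal sketch, ignoring GtG and monitor processing and assuming scanout spans the full refresh cycle:

```python
def vsync_on_scanout_lag_ms(refresh_hz):
    """Scanout lag added at the top/center/bottom of the screen with
    VSYNC ON. Assumes scanout spans the whole refresh cycle; GtG and
    monitor processing are excluded."""
    refresh_ms = 1000.0 / refresh_hz
    return {"top": 0.0, "center": refresh_ms / 2, "bottom": refresh_ms}

lags = vsync_on_scanout_lag_ms(240)
# top: 0.0 ms, center: ~2.08 ms, bottom: ~4.17 ms at 240Hz
```

This is why a single "input lag" number is meaningless without knowing which screen location was measured.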

Pixels are transmitted from the GPU over the cable sequentially, one pixel at a time (at the dotclock rate, as seen in Custom Resolution Utility), beginning at the upper left, scanning left-to-right, top-to-bottom (calendar-style scan sequence), with padding in between (blanking intervals). On a time basis, VSYNC OFF essentially looks like this:

[image]

[image]

During VSYNC OFF, the frameslices are between the tearlines, and the top edge of a frameslice at 432fps adds only 0ms to the existing subsystem input lag (e.g. monitor processing, GtG, etc). The center of a frameslice at 432fps adds 0.5/432sec of scanout lag to the lag number. The bottom of a frameslice at 432fps adds 1/432sec of scanout lag to the lag number.
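The frameslice arithmetic above can be sketched the same way (the 432fps figure is just the example frame rate from the paragraph above):

```python
def frameslice_scanout_lag_ms(fps):
    """With VSYNC OFF, each frameslice behaves like a miniature refresh
    cycle lasting one frametime: the scanout-lag gradient runs from the
    top to the bottom of the slice."""
    frametime_ms = 1000.0 / fps
    return {"slice_top": 0.0,
            "slice_center": frametime_ms / 2,
            "slice_bottom": frametime_ms}

lag = frameslice_scanout_lag_ms(432)
# slice_center: ~1.16 ms, slice_bottom: ~2.31 ms of added scanout lag
```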

The action of an LCD scanning out can be seen in a high speed video: http://www.blurbusters.com/lightboost/video

[embedded video]


This is a high speed video of testufo.com/flicker running in full screen mode, and it shows the top-to-bottom refreshing action of an LCD panel. In many gaming monitors, this is in sync with the cable output, so each pixel can begin its own transitioning process pretty much immediately upon arrival of the pixel data (especially if the monitor avoids full framebuffering for processing -- many gaming monitors use linebuffered processing, essentially realtime/"Instant Mode" processing).

As soon as the GPU is outputting pixels, the top edge of the monitor immediately begins pixel transitions for that particular area of the screen. The whole screen does not have to be fully displayed before the photons of the first pixels at the top edge of the screen begin hitting your eyeballs. This can occur less than half a refresh cycle later. At ~120Hz with 1ms GtG, the GtG zone is approximately 1/8th of the screen height as it scans downwards, where the pixel-refreshing action lags behind the scanout.

[image]
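The 1/8th figure above is just GtG time as a fraction of the refresh cycle. A quick sketch of that arithmetic, assuming scanout spans the full refresh cycle:

```python
def gtg_zone_fraction(gtg_ms, refresh_hz):
    """Fraction of screen height occupied by the in-transition 'GtG zone'
    trailing the scanout, assuming scanout spans the full refresh cycle."""
    return gtg_ms / (1000.0 / refresh_hz)

zone = gtg_zone_fraction(1.0, 120)
print(zone)  # 0.12 -- roughly 1/8th of the screen height
```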

We have helped a few monitor manufacturers create strobe backlights (see Electronics Hacking: Creating a Strobe Backlight), so we have an understanding of how LCDs work. This is the stuff Blur Busters got started with; we were originally "scanningbacklight.com" before the site renamed to the more mainstream Blur Busters with the advent of strobe backlights such as LightBoost.

Anyway:
Lag methodology will output different values for:
-- Lag from GPU-side to monitor pixels
-- Lag from monitor input to monitor pixels (excludes cable transmission overheads, e.g. +1ms)
-- Lag from mouse to monitor pixels
-- Lag from keyboard to monitor pixels

And screen location:
-- Lag from VBI to monitor top (ala VSYNC ON input lag)
-- Lag from VBI to monitor center (ala VSYNC ON input lag)
-- Lag from VBI to monitor bottom (ala VSYNC ON input lag)
-- Lag of pixel transmitted from GPU to corresponding pixel shown on monitor (more representative of VSYNC OFF input lag)

And other variables to keep in mind:
-- Lag of a specific Hz (varies from Hz to Hz)
-- Leo Bodnar Tester is lag of VSYNC ON 60Hz
-- SMTT 2.0 is lag-differential between two screens.
-- Lag of VSYNC OFF is also very different from lag of VSYNC ON.

What this means is that the numbers on one site are only comparable to other numbers on the same site for the same mode. Numbers are generally not comparable between two different websites, because the input lag measuring methodologies vary quite a bit.

We were also the world's first website to successfully test input lag of G-SYNC, see year 2013 article at http://www.blurbusters.com/gsync/preview2

We're the inventors of a peer-reviewed monitor testing technique that has been validated by NOKIA, KELTEK and NIST.gov researchers. Several websites now use our testing techniques.

[image]

So I think you can now understand we've got plenty of credentials to know what we're talking about. ;)

Input lag measurement is a complicated topic.

It is very important for sites to properly & fully document their input lag measuring methodology.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter!
To support Blur Busters: Official List of Best Gaming Monitors | G-SYNC | FreeSync | Ultrawide
User avatar
Chief Blur Buster
Site Admin
 
Posts: 4871
Joined: 05 Dec 2013, 15:44

Re: Actual 240Hz monitors quality and 960Hz questions

Postby Chaps2 » 09 Jan 2018, 01:14

I don't think this answers the question as to why we have had 1000Hz TVs for a long time and yet no PC monitor has even half of that. Be it interpolated or not, it's still real pixel frequency; saying it's because the tech is not mature seems very wrong.

Kinda reminds me of the whole CPU shrink process: we could have had much smaller CPU transistors a very long time ago, but they made it clear with Moore's Law that they want to draw as many dollars as possible out of it, so they take as much time as possible to shrink the transistors every 2-3 years (and there is no commercialised CPU embedding super-shrunk transistors, even after a few decades).

About the input lag test methodology: both websites stated how they were testing it. Lesnumerique is a website aimed at picture professionals at its core; lots (most?) of what you're saying is stuff panel pros know. What is interesting is that they used the same protocol for every monitor they tested, so the difference between a cheap 240Hz monitor (hello ViewSonic) and a good one is still relevant, since they used the same test.

I know that 10 years ago the first input lag tests for monitors on the internet were bad; some were even extremely bad, e.g. using 1 GPU with 2 outputs to test, etc. I know both of these websites use trusted methodology now; they even re-tested the old monitors with the better methodology. You claiming to have a better test is not the question; if you have something that can be more accurate, that's all good, but it doesn't change that from 15 to 33ms the difference is enormous. Especially for a 240Hz panel. Pixel persistence is a problem for some 240Hz panels.

If I read what you say, it's a bit like all the 240Hz monitors are good and equal, which I know is totally not true. From having owned a lot of panels, I can tell -- from horrid colors, to input lag, to image noise and/or bad anti-ghosting and so on -- that it's not the case.

I am 100% not surprised when they claim that some panels are not even fast enough to show 200fps while promoting 240Hz on the box.

I'm still genuinely interested in seeing someone show up all the big names of this industry and make a monitor running at 960Hz or 1000Hz or even MUCH more.
Chaps2
 
Posts: 7
Joined: 30 Nov 2017, 09:40

Re: Actual 240Hz monitors quality and 960Hz questions

Postby Chief Blur Buster » 09 Jan 2018, 01:27

Chaps2 wrote:If I read what you say, it's a bit like all the 240Hz monitors are good and equal, which I know is totally not true

I've never said that. I have often said that many 240Hz G-SYNC monitors have fairly similar quality, given how NVIDIA calibrates them very similarly, but I've never said that they're exactly equal.

Chaps2 wrote:I don't think this answers the question as to why we have had 1000Hz TVs

1000Hz doesn't mean 1000fps. It's easy to flash a backlight 1000 times a second, but that doesn't add 1000 frames per second.

Even "1000Hz" TVs only do interpolation up to around 120Hz, then use things like scanning backlights that blink at frequencies such as 1000Hz. Our website used to be http://www.scanningbacklight.com in year 2012 (it now redirects to Blur Busters). We have the very old Scanning Backlight FAQ and have written about this often (e.g. Panasonic 1600 Hz scanning backlight).

We already do something similar right now using ULMB, ELMB, DyAc, and Turbo240, at much better quality (clearer motion) than these "1000Hz TVs". See the Motion Blur Reduction FAQ -- we are the website that made LightBoost popular and caused NVIDIA to notice.
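The reason strobing competes with raw Hz here comes down to persistence: during eye tracking, perceived motion blur is roughly pixel visibility time multiplied by on-screen motion speed. A minimal sketch of that rule of thumb (the 960 px/s motion speed is an arbitrary example value):

```python
def motion_blur_px(persistence_ms, speed_px_per_sec):
    """Approximate perceived motion blur width (in pixels) during eye
    tracking: blur ~= pixel visibility time x on-screen motion speed."""
    return persistence_ms / 1000.0 * speed_px_per_sec

speed = 960  # px/s, example panning speed

# Sample-and-hold: persistence equals the full frametime.
blur_240hz_sample_hold = motion_blur_px(1000.0 / 240, speed)  # ~4 px

# Strobed backlight (ULMB-style) with a hypothetical 1ms pulse:
blur_1ms_strobe = motion_blur_px(1.0, speed)  # ~0.96 px
```

So a 1ms strobe at a lower refresh rate can produce clearer motion than 240Hz sample-and-hold, which is the point being made about "1000Hz" TVs versus real frame rate.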

Chaps2 wrote:You claiming to have a better test is not the question; if you have something that can be more accurate, that's all good, but it doesn't change that from 15 to 33ms the difference is enormous. Especially for a 240Hz panel. Pixel persistence is a problem for some 240Hz panels.

Yes, enormous, but lag-testing a 240Hz monitor can vary by that much depending on which lag test you use.

I am not calling out WHICH test to use. I'm calling out the lack of DISCLOSURE of the exact lag test methodology.

Also, I talk to a few websites, and they've already adjusted their lag testing methods to become slightly more industry standard. There are many lag testing variables to keep in mind.

For example, the problem of starting and stopping the lag stopwatch is not always well defined between different websites:

The prad.de measurement for this specific monitor may have used a 60Hz lag-chain measurement or some other input lag measuring methodology that creates bigger numbers. I'm going to try to reach out to Andrea Roth and compare notes as it's important.

For example, in our Blur Busters G-SYNC 101 series written by Jorim, the button-to-pixels test in a real-world game (CS:GO on the XB252Q) measured 12ms for VSYNC OFF at 1000 frames per second (last few bars). That's much less than Prad's monitor-only measurement -- and that was via our high speed video camera.

It's a "first-anywhere-on-screen" reaction methodology for that specific game, as eSports players often play with peripheral vision too -- and this methodology can produce dramatically lower numbers than "first-single-point" measurements or "VBI-to-photons" measurements.

[image]

These numbers are the full chain, from mouse button to pixels, taken via high speed camera, in G-SYNC 101 Part #3.

[image]

Numbers for the full chain would naturally be higher than the monitor-only lag (if that is what prad.de is trying to measure).

I'm not saying their numbers are incorrect, but they need to document their input lag measuring methodology. I suggest that prad.de fully document HOW they measure lag -- it will help us compare notes better.

Lag methodology will output different values for:
-- Lag from GPU-side to monitor pixels
-- Lag from monitor input to monitor pixels (excludes cable transmission overheads, e.g. +1ms)
-- Lag from mouse to monitor pixels
-- Lag from keyboard to monitor pixels

And screen location:
-- Lag from VBI to monitor top (ala VSYNC ON input lag)
-- Lag from VBI to monitor center (ala VSYNC ON input lag)
-- Lag from VBI to monitor bottom (ala VSYNC ON input lag)
-- Lag of pixel transmitted from GPU to corresponding pixel shown on monitor (more representative of VSYNC OFF input lag)

And how the lag tester starts the lag stopwatch:
-- Button press
-- Dongle on cable (VBI detector)
-- Black box (Leo Bodnar, etc)
-- API call (e.g. Direct3D Present() or OpenGL glutSwapBuffers)
-- etc.

And how the lag tester stops the stopwatch:
-- Photodiode on a specific location on screen (e.g. oscilloscope, Leo Bodnar, etc)
-- Differentials between two screens
-- First reaction anywhere on screen (e.g. high speed camera)

And how soon to stop the stopwatch
-- First GtG photons detectable
-- GtG 10% (recommended -- very human-visible by then)
-- GtG 50% (recommended)
-- GtG 90%
-- GtG 100% (artificially long, not recommended)
-- Undocumented (e.g. Leo Bodnar, ugh).

And other variables to keep in mind:
-- Lag of a specific Hz (varies from Hz to Hz)
-- Leo Bodnar Tester is lag of VSYNC ON 60Hz
-- SMTT 2.0 is lag-differential between two screens and runs 1000fps VSYNC OFF
-- Lag of VSYNC OFF is also very different from lag of VSYNC ON.

Also different behaviours:
-- VSYNC ON lag testers will have more lag at bottom edge than top edge for most screens
60Hz vs 240Hz have massive differences
-- VSYNC OFF lag testers (at high frame rates) will equalize lag throughout the screen, since VSYNC OFF is scanout-following
60Hz vs 240Hz have less differences, but due to frameslice lag gradients, MIN/AVG/MAX is tighter at 240Hz
-- VSYNC OFF adds a slight lag randomization of (1/Hz)th of a second. Lag is lowest just below a tearline. Lag is highest just above a tearline. And because the lag jitter spans a full refresh cycle due to the random tearline locations, MIN/AVG/MAX becomes much tighter at higher Hz than at lower Hz when using VSYNC OFF lag testers.
-- Etc.
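The MIN/AVG/MAX tightening described above can be illustrated with a quick Monte Carlo sketch, assuming the tearline lands uniformly at random so a fixed measurement point sees anywhere from 0 to one full refresh cycle of added scanout lag (a simplification of the behaviour, ignoring GtG and processing):

```python
import random

def vsync_off_lag_spread_ms(refresh_hz, samples=100_000, seed=1):
    """Monte Carlo sketch: with VSYNC OFF, random tearline position adds
    a uniform 0..(1/Hz)sec scanout lag at a fixed measurement point.
    Returns (min, avg, max) of the added lag in milliseconds."""
    rng = random.Random(seed)
    refresh_ms = 1000.0 / refresh_hz
    lags = [rng.uniform(0.0, refresh_ms) for _ in range(samples)]
    return min(lags), sum(lags) / samples, max(lags)

lo60, avg60, hi60 = vsync_off_lag_spread_ms(60)     # spread ~0..16.7 ms
lo240, avg240, hi240 = vsync_off_lag_spread_ms(240) # spread ~0..4.2 ms
```

The 240Hz spread is a quarter of the 60Hz spread, which is why VSYNC OFF lag numbers cluster much more tightly on high-Hz monitors.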

240Hz displays currently have bad 60Hz lag numbers (worse 60Hz lag than the best 60Hz monitors) but excellent 240Hz lag numbers. A 60Hz lag measurement is different from a 240Hz lag measurement -- and not everyone even bothers using 60Hz. Depending on methodology, naturally, some results will be better and some will be worse.

And lag numbers are not comparable between different review websites.

That is normal and acceptable, but insufficient disclosure of lag test methodology is a huge problem. During 2018 we will communicate with other websites to standardize this further.

Also, IMHO, prad.de has done a great job on most of their tests and has a stellar reputation -- one of the best in monitor testing -- except I consider their lag numbers likely to come from an old-fashioned test methodology. Their lag tests need an upgrade, and clearer disclosure of the lag-stopwatching methodology. And they need to begin pursuit camera photography for WYSIWYG pictures of motion blur (now a peer-reviewed, proven technique). The instructions are quite easy now for reviewers.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter!
To support Blur Busters: Official List of Best Gaming Monitors | G-SYNC | FreeSync | Ultrawide
User avatar
Chief Blur Buster
Site Admin
 
Posts: 4871
Joined: 05 Dec 2013, 15:44

Re: Actual 240Hz monitors quality and 960Hz questions

Postby Chief Blur Buster » 09 Jan 2018, 01:29

Chaps2 wrote:About the input lag test methodology: both websites stated how they were testing it. Lesnumerique is a website aimed at picture professionals at its core; lots (most?) of what you're saying is stuff panel pros know.

So, can you please tell me what part of the GtG cycle they end their lag stopwatching at? GtG10% or GtG50% or?

This is a very important disclosure item that I am now discussing with a few reputable websites (that also use BlurBusters tests) to standardize an expanded/improved disclosure on.

GtG10%/GtG50% is far closer to the human reaction trigger than GtG100%. GtG10% for a black->white transition is still a dark grey, which is already human-visibly different from total black. If a website wishes to use GtG100% or GtG90% instead of GtG10% or GtG50%, they should at least mention it. Stopwatching at different parts of the GtG cycle can mean 10ms differences. Even a 1ms-GtG panel can sometimes take 10ms to do GtG100%, even though the visible photons hit much sooner. After processing time, when a pixel finally begins its GtG cycle, the pixel "fades" from one color to another (as seen in high speed video). We generally react to the strongest part of the GtG curve (the strong image), not the final part of the GtG curve (the faint afterghost artifact trailing behind an image is often remnants after GtG90%, and can still take >20ms to clear up even on 1ms TN monitors).

We need better disclosure of these kinds of additional details nowadays. I'm talking to a few English-language websites to discuss further improving lag-measurement disclosure.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter!
To support Blur Busters: Official List of Best Gaming Monitors | G-SYNC | FreeSync | Ultrawide
User avatar
Chief Blur Buster
Site Admin
 
Posts: 4871
Joined: 05 Dec 2013, 15:44
