Analog output has higher latency on modern graphics cards?

Everything about latency. Tips, testing methods, mouse lag, display lag, game engine lag, network lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more! Input Lag Articles on Blur Busters.
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Analog output has higher latency on modern graphics card

Post by Chief Blur Buster » 31 May 2018, 09:55

A Solid lad wrote: Thanks for all the answers, guys!
I'm so glad I finally have a place where I can ask questions like these and not get replies like "just learn to play" or "x amount of lag is insignificant, you shouldn't be worried about it" or "why are you asking this? you shouldn't even be using a CRT these days..."
(Well... not getting them for the most part. lol)
Indeed, the policy at Blur Busters is "milliseconds matter" - we keep an open mind because the millisecond surprises us in many ways (e.g. strobe crosstalk position, VR nausea, amount of motion blur, stutter, etc.).

Maybe it won't always matter, but we don't assume -- many undiscovered surprises still lurk within the humble millisecond.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

A Solid lad
Posts: 317
Joined: 17 Feb 2018, 08:07
Location: Slovakia

Re: Analog output has higher latency on modern graphics card

Post by A Solid lad » 31 May 2018, 17:55

"Many undiscovered surprises still lurk within the humble millisecond."

-Mark Rejhon, 2018
Discord | Youtube | Twitch
Steam with a list of monitors & mice I've used/use.

3dfan
Posts: 62
Joined: 26 Jan 2014, 06:09

Re: Analog output has higher latency on modern graphics card

Post by 3dfan » 01 Jun 2018, 09:18

A Solid lad wrote: Thanks for all the answers, guys!
I'm so glad I finally have a place where I can ask questions like these and not get replies like "just learn to play" or "x amount of lag is insignificant, you shouldn't be worried about it" or "why are you asking this? you shouldn't even be using a CRT these days..."
(Well... not getting them for the most part. lol)
Those answers are not only useless but infuriating at the same time... and they make me feel like there are no people like me left on earth - people who are just curious about stuff, want to learn new things... and improve the things they use.

Thankfully, that's not the case here!

One more thing... now that we know it's possible, how likely do you think it is that this was the case, given the circumstances? I mean, the GTX 980 Ti was a top-end card...
What's your experience? Have you come across similar corner-cutting on other high-end graphics cards in the past?
Hi, A Solid lad.
Like you, I am also a curious person. I like PC display technology, and I came to this place since it's the one I found that takes seriously the work of eliminating some nasty issues modern monitors have had, such as that nasty motion blurring. I have learned interesting and useful things here as well, even though modern PC monitor technology has been a big disappointment to me. And it's a funny thought: even while some people laugh at those who still use CRTs, today in 2018 there is still no modern monitor that can match a CRT (especially an FW900 widescreen) in delivering excellent black levels without ugly light bleeding or glow, plus excellent latency, viewing angles, refresh rates, resolutions, clear text, excellent colors and luminance, and excellent motion clarity even at lower refresh rates without evil flicker, with a good enough widescreen size, all in one package :shock:

To contribute to your query: I use an EVGA SC+ Superclocked GTX 980 Ti with a Sony FW900 CRT. I tried an input lag comparison using a second monitor, an AOC LE24H037 LED, working in cloned mode: the CRT connected via a DVI-I to VGA adapter and the LED via HDMI, both at 1920x1080 60Hz, VSYNC OFF. I created the 1080p 60Hz mode on the CRT as a custom resolution with standard GTF timing. (It's not a 144Hz test, since the LED only supports 60Hz, but it may help give you an idea of how the modern digital HDMI output compares with the older analog output in latency on the GTX 980 Ti.)
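For anyone who wants to sanity-check a custom mode like this before trying it, here is a minimal sketch of the vertical half of the VESA GTF timing calculation, used to estimate the horizontal scan rate a mode demands. The GTF constants follow the published algorithm, but the 121 kHz FW900 scan limit is an assumed figure for illustration:

```python
# Sketch: estimate the horizontal scan rate a GTF-timed mode needs,
# then compare it against an assumed CRT scan limit.

MIN_VSYNC_BP = 550e-6   # GTF: minimum vsync + back porch time (seconds)
MIN_PORCH = 1           # GTF: minimum vertical front porch (lines)

def gtf_scan_rate_khz(v_active: int, refresh_hz: float) -> float:
    """Estimate the horizontal scan rate (kHz) a GTF mode requires."""
    # Estimated horizontal period from the target refresh rate.
    h_period = (1.0 / refresh_hz - MIN_VSYNC_BP) / (v_active + MIN_PORCH)
    # Vertical blanking: the sync + back porch lines that fit in 550 us,
    # plus the one-line minimum front porch.
    vsync_bp_lines = round(MIN_VSYNC_BP / h_period)
    v_total = v_active + vsync_bp_lines + MIN_PORCH
    return refresh_hz * v_total / 1000.0

FW900_MAX_KHZ = 121.0  # assumed FW900 horizontal scan limit

for v_lines, hz in [(1080, 60.0), (1080, 96.0), (1200, 85.0)]:
    khz = gtf_scan_rate_khz(v_lines, hz)
    verdict = "OK" if khz <= FW900_MAX_KHZ else "out of range"
    print(f"{v_lines} lines @ {hz:g} Hz -> ~{khz:.1f} kHz horizontal ({verdict})")
```

The 1080-line 60Hz mode used in this test works out to roughly 67 kHz, comfortably inside a high-end CRT's range.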

I uploaded two videos: one testing mouse motion and buttons with the game Alien: Isolation. I don't have other competitive games such as Battlefield, COD, etc. to test, but I guess that doesn't matter, since it's just a general input latency test. Reducing the playback speed to the lowest setting may give a better idea of the input lag.
In the second video I tested with the input lag chronometer test; by pausing the video you can see the latency difference between the two.

I know this is not a professional way to measure input lag and may not be too accurate - I'm no expert on this - but it may help give you an idea of the GTX 980 Ti's latency through its analog vs. digital HDMI ports, side by side.

Also, I recorded with an old Samsung Galaxy S3 smartphone at 30fps (wish I had a 1000fps camera to test :P ).
Sorry for the quality and the lack of detail on the mouse; I could not manage to illuminate it better without interfering with the screens and making them unwatchable. ;) The mouse used is just a normal Genius-brand mouse overclocked to 1000Hz using the Mouse Rate Adjuster setup software. OS: Windows 10 64-bit Pro. Nvidia driver version 391.05.
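As a side note on that 1000Hz overclock, the polling rate's contribution to the lag chain is simple arithmetic (a sketch, not a measurement):

```python
# The mouse report interval bounds how stale a click or movement can be:
# on average you wait half an interval, at worst a full one.

def polling_delay_ms(rate_hz: float) -> tuple[float, float]:
    interval_ms = 1000.0 / rate_hz          # time between reports
    return interval_ms / 2.0, interval_ms   # (average, worst case)

for rate in (125, 500, 1000):
    avg, worst = polling_delay_ms(rate)
    print(f"{rate:>4} Hz: avg +{avg:.2f} ms, worst +{worst:.2f} ms")
# Stock 125 Hz USB polling adds ~4 ms on average; 1000 Hz cuts it to ~0.5 ms.
```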

Here is the game video:

[embedded video]


And the chronometer video:
[embedded video]


If you find it useful and you wish, I could do other tests; I could do a similar 144Hz test on the FW900 and the GTX 980 Ti alone, since this monitor supports up to 160Hz.

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Analog output has higher latency on modern graphics card

Post by Chief Blur Buster » 01 Jun 2018, 09:52

A Solid lad wrote: "Many undiscovered surprises still lurk within the humble millisecond."
-Mark Rejhon, 2018

Very tweetable quote. Permission granted to any media website to quote me on this. ;)
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Analog output has higher latency on modern graphics card

Post by Chief Blur Buster » 01 Jun 2018, 09:57

3dfan wrote: If you find it useful and you wish, I could do other tests; I could do a similar 144Hz test on the FW900 and the GTX 980 Ti alone, since this monitor supports up to 160Hz.
Your test is very welcome - we appreciate end-user creativity, no matter how limited the equipment!

Even a lowly Samsung Galaxy S3 (in a smartphone holder on a sliding rail!) has doubled as a pursuit camera for display-reviewer testing before.

To improve your tests, see if you can download a camera app that allows you to reduce the camera exposure time a lot (e.g. 1/250sec exposure). This helps a lot even if your frame rate is low.

Short exposure time is actually more important than camera frame rate. Even at 30fps, it's possible to get roughly refresh-cycle-accurate lag benchmarking when the camera is custom-configured with a short exposure per frame: I simply freeze-frame the video and inspect the refresh-cycle difference.
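To make the freeze-frame arithmetic concrete, a tiny sketch (the cycle counts are hypothetical, not from any particular test):

```python
# If a freeze-frame shows one display a whole number of refresh cycles
# behind the other, the lag difference is cycles x refresh period.

def lag_difference_ms(cycles_apart: float, refresh_hz: float) -> float:
    return cycles_apart * 1000.0 / refresh_hz

print(lag_difference_ms(1, 60))    # 1 cycle at 60 Hz   -> ~16.7 ms
print(lag_difference_ms(2, 240))   # 2 cycles at 240 Hz -> ~8.3 ms
```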

Unfortunately, these custom-camera Android apps are not supported by all Android phones, but you could try one and see if you can set the exposure that short.

Yes, 1000fps is superior for getting sub-refresh-cycle accuracy.

But a 30fps camera can be milked for refresh-cycle accuracy (e.g. 1/240sec benchmarking accuracy between two 240Hz monitors) via a simple camera-exposure adjustment in a "professional" camera app from Google Play.

Ideally you want the camera exposure to be roughly 1/2 of a refresh cycle (1/240sec for 120Hz benchmarking, 1/500sec for 240Hz benchmarking) to minimize sampling/Nyquist effects from the camera exposure being unsynchronized with refresh cycles -- that is what gains you the necessary accuracy margins. But the bottom line is that a 30fps camera can at least beat its own poor 1/30sec accuracy through a few tricks via a downloadable camera app.
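That half-refresh-cycle rule as a small helper (a sketch; 1/480sec rounds to the 1/500sec setting most camera apps actually offer):

```python
# Recommended camera exposure for lag benchmarking: roughly half of one
# refresh cycle, per the rule of thumb above.

def recommended_exposure(refresh_hz: float) -> str:
    exposure_s = 0.5 / refresh_hz
    return f"1/{round(1.0 / exposure_s)} sec"

for hz in (60, 120, 240):
    print(f"{hz} Hz benchmarking -> exposure around {recommended_exposure(hz)}")
# 60 Hz -> 1/120 sec, 120 Hz -> 1/240 sec, 240 Hz -> 1/480 sec
```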

Or just skip video and take single-exposure photographs instead. That's how TFTCentral's lag testing is done -- single photographs!

Also, be careful about the camera's sensor scan direction interfering with lag results: always hold the camera in landscape mode with the two monitors side by side, and back further away from the monitors to minimize the vertical height of the screens in the photo. Camera sensors often scan top to bottom in 1/60sec or slower, so a quarter of the photo's height is scanned in 1/240sec; backing away keeps the camera's scan latency from interfering too much with the lag benchmarking.
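The rolling-shutter arithmetic behind that advice, as a sketch (the 1/60sec full-frame scan time is the assumed typical figure from the post):

```python
# The sensor scans the frame top to bottom, so the skew across the
# monitors equals the scan time times the fraction of the frame height
# the monitors occupy. Backing away shrinks that fraction.

def scan_skew_ms(frame_scan_s: float, screen_height_fraction: float) -> float:
    return frame_scan_s * screen_height_fraction * 1000.0

print(scan_skew_ms(1 / 60, 1.00))   # screens fill the frame: ~16.7 ms skew
print(scan_skew_ms(1 / 60, 0.25))   # quarter-height in frame: ~4.2 ms (1/240 s)
```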

That said, maximize monitor brightness, since images will be very dark at short exposures.

Yeah, I'll need to make a FAQ/HOWTO on camera benchmarking. :-)
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


A Solid lad
Posts: 317
Joined: 17 Feb 2018, 08:07
Location: Slovakia

Re: Analog output has higher latency on modern graphics card

Post by A Solid lad » 02 Jun 2018, 02:53

Thank you @3dfan for the lengthy post!
Though to be honest, I prefer a fast LCD even over my once-high-end IBM P275.

Even when the CRT in question has barely ever been used and is adjusted near perfectly (like mine), the sharpness of an LCD is just in a different league.
Also, geometry is NOT something you have to fiddle with for half an hour for every custom resolution you make on an LCD.
While on a CRT... even after that long-ass fiddling session, the picture still won't be as perfectly uniform as it is on an LCD.
Warm-up is not a thing either on most modern LCDs.
And even though CRT colors, if set up correctly, are usually more accurate than a high-refresh-rate TN panel's, they aren't as vibrant... for example, on my old XL2411Z, colors at a digital vibrance setting of 65% match the vibrance of the P275's at 100%. And I, as a competitively minded gamer, will take vibrant and slightly inaccurate colors over dull accurate ones any day.

I'm not really against using a CRT, I just find them a pain in the butt to use compared to a good modern LCD.
Discord | Youtube | Twitch
Steam with a list of monitors & mice I've used/use.

RealNC
Site Admin
Posts: 3757
Joined: 24 Dec 2013, 18:32

Re: Analog output has higher latency on modern graphics card

Post by RealNC » 02 Jun 2018, 03:25

Heresy! :P

There's CRTs, and then there's CRTs, though. Meaning, some of them have really good sharpness and geometry; the difference between two CRTs can be quite staggering.
Steam | GitHub | Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

A Solid lad
Posts: 317
Joined: 17 Feb 2018, 08:07
Location: Slovakia

Re: Analog output has higher latency on modern graphics card

Post by A Solid lad » 02 Jun 2018, 06:01

RealNC wrote: There's CRTs, and then there's CRTs, though.
I'm aware. My IBM P275 is one of the better ones when it comes to... anything. Look that model up if you don't believe me...
But even then, sharpness goes downhill once you exceed 1280x960 @ 120Hz.
That's the highest res/refresh combo at which everything is as clear as I'd like... ofc if you go lower in resolution, there's no problem with sharpness... but if you're running 1600x1200 @ 100Hz, for example, sharpness takes a hit.
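One plausible way to see why: the pixel clock, and with it the demand on the CRT's video amplifier, climbs quickly with resolution and refresh. A rough sketch, assuming GTF-like blanking overheads (the ~25% horizontal and ~4% vertical factors are assumptions for illustration):

```python
# Rough pixel-clock comparison for the two modes discussed. A higher
# pixel clock pushes the video amplifier harder, one reason sharpness
# degrades at higher resolution/refresh combos on a CRT.

H_BLANK, V_BLANK = 1.25, 1.04  # assumed blanking overhead factors

def pixel_clock_mhz(width: int, height: int, refresh_hz: float) -> float:
    return width * H_BLANK * height * V_BLANK * refresh_hz / 1e6

print(f"1280x960 @ 120 Hz: ~{pixel_clock_mhz(1280, 960, 120):.0f} MHz")
print(f"1600x1200 @ 100 Hz: ~{pixel_clock_mhz(1600, 1200, 100):.0f} MHz")
```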
Discord | Youtube | Twitch
Steam with a list of monitors & mice I've used/use.

RealNC
Site Admin
Posts: 3757
Joined: 24 Dec 2013, 18:32

Re: Analog output has higher latency on modern graphics card

Post by RealNC » 02 Jun 2018, 06:13

Yes. But at these monitor sizes (19.8" for this one), 1200p is kind of unnecessary. You'd get way better results using DSR instead. I never tried it on a CRT, but with CRU you should be able to get the Nvidia driver to use 1280x960 as the native res and get 2560x1920 (4x DSR with the smoothness slider set to 0%).
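For reference, DSR factors scale the pixel count, so each axis scales by the square root of the factor (a quick sketch):

```python
# Map a DSR factor to the rendered resolution: 4x doubles each axis.
import math

def dsr_resolution(width: int, height: int, factor: float) -> tuple[int, int]:
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

print(dsr_resolution(1280, 960, 4.0))    # -> (2560, 1920), as above
print(dsr_resolution(1280, 960, 2.25))   # -> (1920, 1440)
```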

Now I wish I still had my CRT to try just that... I bet it would look smooth and clear AF.
Steam | GitHub | Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

3dfan
Posts: 62
Joined: 26 Jan 2014, 06:09

Re: Analog output has higher latency on modern graphics card

Post by 3dfan » 02 Jun 2018, 07:58

A Solid lad,
Yes, a CRT is more complex to set up and requires patience - the warm-up too - so it's not for everyone. But once it's properly set up, the results are fantastic, and geometry and sharpness can get very close to an LCD. I personally find CRT sharpness a bonus, since on an LCD text and edges look more pixelated, while on a CRT they look smoother, as if they had some antialiasing applied. And of course it also depends on the CRT's age, health, focus, quality, etc.

RealNC,
I have tested 4x DSR (slider at 0%) at 5120x2880 on the FW900 CRT, and you are not wrong in thinking it looks smooth and clear on a CRT. (I would love to see that on a modern big 4K OLED TV.)

About the main topic: Chief Blur Buster, that's interesting and usefully detailed info for further tests with the camera ;). Now that you mention single photographs, there seem to be some interesting findings in those videos. Taking the couple of screenshots I post here, the CRT monitor appears to be some milliseconds faster than the LCD; in the game test I also captured a frame where the gun fires first on the CRT.

Since the CRT on the analog port seems faster, can that mean the analog port on the GTX 980 Ti does not really have higher latency than the digital one? Or is the LED monitor tested just slower, even with the advantage of the supposedly faster digital HDMI port? What do you think?
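One way to put numbers on such screenshots: read the chronometer values off a single paused frame and treat the camera exposure plus the timer's update step as the error bound. A sketch with placeholder readings (not the values from the actual screenshots):

```python
# Difference between the two on-screen readings estimates the lag gap;
# exposure time plus timer granularity bounds the error.

def chrono_delta_ms(crt_ms: float, lcd_ms: float,
                    exposure_ms: float, timer_step_ms: float) -> str:
    delta = crt_ms - lcd_ms              # positive: CRT shows a newer time
    error = exposure_ms + timer_step_ms
    return f"CRT ahead by {delta:.1f} ms (+/- ~{error:.1f} ms)"

# Hypothetical: 30 fps phone video at ~1/30 sec auto exposure,
# chronometer ticking in 1 ms steps.
print(chrono_delta_ms(512.0, 498.0, 33.3, 1.0))
```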
Attachments
tests.jpg (230.55 KiB)
