A Solid lad wrote: Thanks for all the answers, guys!
I'm so glad I finally have a place where I can ask questions like these without getting replies like "just learn to play", "x amount of lag is insignificant, you shouldn't be worried about it", or "why are you asking this? you shouldn't even be using a CRT these days..."
(Well...not getting them for the most part. lol)
Those answers are not only useless but infuriating, and they make me feel like there are no people like me left on earth: people who are just curious about stuff, want to learn new things, and want to improve the things they use.
Thankfully, that's not the case here!
One more thing: now that we know it's possible, how likely do you think it is that this was actually the case, given the circumstances? I mean, the GTX 980 Ti was a top-end card...
What's your experience? Have you come across similar corner-cutting on other high-end graphics cards in the past?
Hi, A Solid lad.
Like you, I am a curious person who likes PC display technology, and I came to this place since it's the one I found that takes seriously the effort to eliminate some of the nasty issues modern monitors have had, such as that nasty motion blurring. I have learned interesting and useful things here as well, even though modern PC monitor technology has been a big disappointment to me. And the funny thing is, even while some people laugh at those who still use a CRT, today in 2018 there is still no modern monitor that can match a CRT (especially an FW900 widescreen): excellent black levels without ugly light bleeding or light glow, excellent latency, excellent viewing angles, excellent refresh rates, excellent resolutions, clear text, excellent colors and luminance, excellent motion clarity even at lower refresh rates without evil flicker, and a good enough widescreen size, all in one package.
To contribute to your query: I use an EVGA SC+ Superclocked GTX 980 Ti with a Sony FW900 CRT. I tried to do some input lag comparison tests using a second monitor, an AOC LE24H037 LED, working in cloned mode: the CRT connected via a DVI-I to VGA adapter and the LED via HDMI, both at 1920x1080 60 Hz with VSync off. I created the 1080p 60 Hz mode on the CRT as a custom resolution using standard GTF timing. (This is not a 144 Hz test, since the LED only supports 60 Hz, but it may help give you an idea of how a modern digital HDMI output compares to the older analog output on the GTX 980 Ti in terms of latency.)
I uploaded two videos. The first tests mouse motion and buttons in the game Alien: Isolation. I don't have competitive games such as Battlefield or COD to test with, but I guess that doesn't matter, since it's just a general input latency test. Reducing the playback speed of the video to the lowest setting may give a better idea of the input lag.
In the second video I tested with an input lag chronometer; by pausing the video you can see the latency difference between the two displays.
I know this is not a professional way to measure input lag and may not be too accurate (I'm no expert on this), but it may help give you an idea of the GTX 980 Ti's latency through its analog vs. digital HDMI ports, side by side.
Also, I recorded with an old Samsung Galaxy S3 smartphone at 30 fps (wish I had a 1000 fps camera to test with).
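Since the camera's frame rate caps the precision of this method, here's a small Python sketch of the granularity arithmetic (the helper names are just for illustration): at 30 fps each camera frame covers about 33.3 ms, so a one-frame offset between the two screens only bounds the latency difference to within roughly one frame interval either way.

```python
def frame_granularity_ms(camera_fps: float) -> float:
    """Smallest latency step one camera frame can resolve, in ms."""
    return 1000.0 / camera_fps

def latency_range_ms(frame_offset: int, camera_fps: float):
    """Convert a frame offset between two screens into a latency range (ms).

    A chronometer reading that differs by `frame_offset` camera frames
    only tells you the true latency difference lies within about
    +/- one frame interval of (frame_offset * frame time).
    """
    step = frame_granularity_ms(camera_fps)
    center = frame_offset * step
    return (max(0.0, center - step), center + step)

# At 30 fps, one frame is ~33.3 ms; a 1000 fps camera would resolve ~1 ms.
print(frame_granularity_ms(30))
print(frame_granularity_ms(1000))
# A one-frame offset at 30 fps only pins the difference to a wide range.
print(latency_range_ms(1, 30))
```

This is why a 30 fps phone recording can show that one display is clearly behind the other, but can't distinguish, say, a 5 ms difference from a 25 ms one.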
Sorry for the quality and the lack of detail on the mouse; I couldn't find a way to better illuminate it without interfering with the screens and making them unwatchable.
The mouse used is just a normal Genius-brand mouse, overclocked to 1000 Hz using the Mouse Rate Adjuster software. OS: Windows 10 Pro 64-bit. Nvidia driver version 391.05.
Here is the game video,
and the chronometer video:
If you find this useful and you'd like, I could do other tests. For example, I could do a similar 144 Hz test on the FW900 with the GTX 980 Ti alone, since this monitor supports up to 160 Hz.