Updated Statement: HDMI vs DP Input Lag

Everything about input lag. Tips, testing methods, mouse lag, display lag, game engine lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more! Input Lag Articles on Blur Busters.
Post Reply
ayukreizi
Posts: 6
Joined: 03 Apr 2020, 05:59

Updated Statement: HDMI vs DP Input Lag

Post by ayukreizi » 22 May 2020, 04:38

Hello BlurBuster team and especially to Chief,

Your statements are always brought up when discussing panel technology and are valued highly by your followers, of which I am one. Lately, however, there has been a lot of discussion about input lag, and after eliminating all of the obvious factors, people turned to cables.
Your statements keep coming up, but I feel that's not completely reasonable because one is from 2014 and the other from 2018, which is why I'm making this thread. Some people have begun recommending that newbies buy HDMI cables costing $50 to hundreds of dollars on the strength of your statements alone.

What is your opinion on this topic from today's perspective? Is there a difference between the latest implementations of HDMI and DP in regard to input lag, visual clarity, or other factors you view as important?

Thank you for taking the time and sharing your knowledge with us.

flood
Posts: 925
Joined: 21 Dec 2013, 01:25

Re: Updated Statement: HDMI vs DP Input Lag

Post by flood » 22 May 2020, 20:55

passive cables will never add any meaningful amount of input lag

in terms of decoding the signal, i have no reason to believe either is significantly faster or slower than the other.

User avatar
Chief Blur Buster
Site Admin
Posts: 7554
Joined: 05 Dec 2013, 15:44
Location: Toronto, Ontario, Canada
Contact:

Re: Updated Statement: HDMI vs DP Input Lag

Post by Chief Blur Buster » 22 May 2020, 23:10

Many old active conversions had a frame buffer. This happens more often with devices that have onscreen menus (e.g. surround sound receivers). You know, the lag you got when you plugged a computer through a stereo receiver? Yes, that's the lag of active repeatering.

HDMI-to-DP, however, requires a little bit of active adaptoring to convert between the HDMI micropacket format and the DisplayPort micropacket format, but ideally the added latency should be mere microseconds.

HDMI ports are standardized to output a small amount of power (5 volts at ~55 mA), enough to run some simple basic active adaptoring, or to power an optical transceiver (i.e. optical HDMI cables, which are rapidly becoming cheaper -- a lightweight 25-foot optical HDMI cable has fallen below $50, and is much easier to bend than old 25-foot copper HDMI cables).
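As a quick sanity check on that power figure, here is a minimal Python sketch of the available budget (the ~55 mA figure is from the post above; actual figures vary by source device):

```python
# Rough power budget from a standard HDMI port's +5 V pin,
# assuming the ~55 mA guaranteed figure mentioned above.
volts = 5.0
milliamps = 55
watts = volts * (milliamps / 1000)

# A fraction of a watt: enough for simple adaptor logic or an optical
# transceiver, but not for a full framebuffer-based converter box.
print(f"{watts * 1000:.0f} mW available")  # → 275 mW available
```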

The problem is I've seen a gamut ranging from 1 micropacket of latency all the way to 1 full refresh cycle of latency.
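To put that gamut in perspective, here is a small Python sketch (the refresh rates are illustrative) comparing the worst case of a full refresh cycle of buffering against the best case of micropacket-level buffering:

```python
# Worst case: an adaptor that buffers a full refresh cycle.
# Best case: micropacket-level buffering, on the order of microseconds.
for hz in (60, 144, 240):
    worst_ms = 1000 / hz  # one full refresh cycle, in milliseconds
    print(f"{hz:>3} Hz: best ~ microseconds, worst ~ {worst_ms:.2f} ms")
```

At 240 Hz the worst case is about 4.17 ms, which is why a framebuffer-based adaptor is so much laggier than a lightly active one.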

Fortunately, those tiny cheap adaptors tend to be either passive or only lightly active (less memory = cheaper, lower power, smaller).

HDMI-DVI (single link) is mostly passive because single-link DVI signalling is signal-compatible with the older HDMI standards.

HDMI-DP generally has to be done actively (micropacketization differences).

A $10 adaptor sometimes fits the need perfectly -- but sometimes it's impossible. Some adaptors are cheap, while others are expensive. The problem is that a good Dual Link DVI to DisplayPort adaptor costs over $100 if you want to recycle an old BenQ XL2720Z monitor on a new RTX 2080 Ti graphics card. Single Link DVI to DP adaptors are cheap, but won't work for this. When faced with no choice other than $100+ adaptors, sometimes the best route is to upgrade the monitor.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

       To support Blur Busters:
       • Official List of Best Gaming Monitors
       • List of G-SYNC Monitors
       • List of FreeSync Monitors
       • List of Ultrawide Monitors

amezibra
Posts: 29
Joined: 15 Apr 2020, 15:48

Re: Updated Statement: HDMI vs DP Input Lag

Post by amezibra » 23 May 2020, 08:13

Chief Blur Buster wrote:
22 May 2020, 23:10
Many old active conversions had a frame buffer. This happens more often with devices that have onscreen menus (e.g. surround sound receivers). You know, the lag you got when you plugged a computer through a stereo receiver? Yes, that's the lag of active repeatering.

[...]

When faced with no other choice than $100+ adaptors, sometimes the best route is to upgrade the monitor.
I think the OP was asking more about which port has less processing lag when the monitor has both. According to some, HDMI 2.0 is more responsive than DP 1.2.

User avatar
Chief Blur Buster
Site Admin
Posts: 7554
Joined: 05 Dec 2013, 15:44
Location: Toronto, Ontario, Canada
Contact:

Re: Updated Statement: HDMI vs DP Input Lag

Post by Chief Blur Buster » 26 May 2020, 22:47

Port lag differences are usually in the sub-millisecond timescales nowadays.

I think it would be an excellent lag-testing exercise to test different HDMI and DisplayPort standards. However, the latency differences are measured in mere tens to hundreds of microseconds nowadays. In the old days, the buffers were probably unnecessarily big (e.g. a full millisecond or so).

But you can really just stream the pixel rows out of an HDMI port or DisplayPort within a few pixel rows of the true raster scan line number (see our Tearline Jedi experiments in microsecond-accurate VSYNC OFF tearline control).

For a monitor running at 1080p 240Hz, the signal usually runs at an approximately 270 KHz scan rate (the "horizontal refresh rate" found in a Custom Resolution Utility). This means one pixel row takes 1/270,000th of a second, so a 1/270,000-second delay moves a VSYNC OFF tearline downwards by 1 pixel. Now, at 8000 VSYNC OFF tearlines per second, that's a frametime of only 125 microseconds (0.125ms). Present()-to-photons latency was reduced to practically nil, at ~2ms in many cases, and that includes some GtG time. Early tests appear to show that transceiver latency (both ends combined) actually wasn't the most significant overhead in that figure. More testing will be required.
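The arithmetic above can be sketched in a few lines of Python (assuming a hypothetical vertical total of 1125 lines, which yields the ~270 kHz figure; actual signal timings vary by monitor):

```python
refresh_hz = 240
vertical_total = 1125                        # visible 1080 rows + blanking

scan_rate_hz = refresh_hz * vertical_total   # horizontal scan rate: 270,000 Hz
row_time_s = 1 / scan_rate_hz                # one pixel row ≈ 3.7 microseconds

fps = 8000                                   # VSYNC OFF frame rate
frametime_us = 1_000_000 / fps               # 125 microseconds per frame

# Each 1/270,000 s of delay moves a VSYNC OFF tearline down one pixel row,
# so consecutive tearlines at 8000 fps land roughly 34 pixel rows apart.
rows_per_frame = frametime_us / (row_time_s * 1_000_000)
print(f"scan rate: {scan_rate_hz / 1000:.0f} kHz, "
      f"row time: {row_time_s * 1e6:.2f} us, "
      f"tearline spacing: {rows_per_frame:.0f} rows")
```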

What is different about DisplayPort and HDMI is the transceiver processing + micropacketizing they do, so we often observed tearlines move in 2-pixel or 4-pixel granularity steps at a time, thanks to the rolling-window buffering going on in the various layers (GPU, HDMI transceivers or DisplayPort transceivers, etc). That doesn't include all transceiver overheads, but those are extremely minor too. If you properly optimize down to the minimum needed buffering (no more pixel buffering than necessary to get the data into or out of a port) -- a few pixel rows on both ends -- that's not much at all. So, assuming proper programming in the HDMI processing and DisplayPort processing, the two are literally only tens of microseconds apart.

That said, there was more latency in the olden days, when "a bit more buffer just to be safe" was often done to improve reliability -- so millisecond-timescale differences weren't unusual.

However, with today's properly implemented ports, the differences are only multiple tens of microseconds, and low hundreds of microseconds at the extreme. Any bigger differences can sometimes be the fault of the programming of the implementation (more transceiver buffer = more error margin for slower processing).

With the greatly increased importance of latency, some makers of transceiver chips have an increased focus on making it easy to keep latency low for high-performance applications. There can be many buffering weak links (from GPU to monitor motherboard, at both ends), but the push for ultra-low-latency gaming monitors has produced HDMI and DisplayPort implementations with well sub-millisecond latencies.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


Post Reply