Genericness of Variable Refresh Rate (VRR works on any video source including DVI and VGA, even on MultiSync CRT tubes)

Post by Chief Blur Buster » 25 Jul 2021, 00:22

Crosspost from a HardForum thread, because it's "Blur Busters Area 51" worthy.
kasakka wrote: Freesync over HDMI is a proprietary thing for AMD which is why it does not work with Nvidia GPUs.
kasakka wrote: HDMI 2.0 does not support VRR in its spec so AMD's Freesync over HDMI is a proprietary solution that Nvidia was probably not keen to reverse engineer. HDMI 2.1 supports VRR in its spec which is why it works with Nvidia GPUs, even ones with HDMI 2.0 ports. Who knows what the exact difference is.
chameleoneel wrote: There's nothing proprietary about Freesync on HDMI. Freesync is an open standard. Nvidia is just being weird and stubborn about how they support VRR.
Surprisingly enough (other than PnP identification) there is NOTHING proprietary about VRR over HDMI.

These are ALL identical at the signal layer level:
  • VESA Adaptive-Sync (metaphorically the layer zero)
  • HDMI VRR (the HDMI version of VESA Adaptive Sync)
  • FreeSync (merely AMD-certified VESA Adaptive-Sync)
  • G-SYNC Compatible (merely NVIDIA-certified VESA Adaptive-Sync)
All of them are simply the standard varying-Back-Porch technique. They are 100% adaptable to each other, although the problem is discovery (HDMI EDID translated to DisplayPort DisplayID and vice versa).

(The plug-and-play signalling at startup is totally different, and was even originally proprietary, despite the VRR video signal itself being identical and generic on DP and HDMI. That's why VRR plug-and-play identification is usually lost when you use an HDMI-to-DP or DP-to-HDMI adaptor, or an old HDMI spec. But there are workarounds if you have access to ToastyX CRU and an AMD card to force-override things.)


Generic VRR (*all* forms of VESA Adaptive-Sync) is merely a dynamically-sized, real-time-variable Vertical Back Porch. The signal keeps the same Horizontal Scan Rate, but the Vertical Total varies as the graphics card adjusts the VBI size in real time (via the variable-sized Back Porch).
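
To make the arithmetic concrete, here is a minimal sketch (illustrative 1080p-class numbers, not any vendor's actual driver code) of how a fixed horizontal scan rate plus a variable Vertical Total yields a variable refresh rate:

Code:

H_SCAN_RATE_HZ = 135_000    # scanlines per second (stays constant during VRR)
V_ACTIVE = 1080             # visible pixel rows (also constant)
V_BLANK_MIN = 45            # minimum VBI lines (sync + porches) at max Hz

def refresh_hz(extra_vbi_lines: int) -> float:
    """Refresh rate for a given amount of Back Porch padding."""
    v_total = V_ACTIVE + V_BLANK_MIN + extra_vbi_lines
    return H_SCAN_RATE_HZ / v_total

print(refresh_hz(0))       # 120.0 Hz: the smallest Vertical Total (1125 lines)
print(refresh_hz(1125))    # 60.0 Hz: double the Vertical Total, half the rate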

Now, if you're a Gen Xer born in the 1970s, you will remember analog TVs and their "VHOLD" adjustment:

[Image: analog TV with a misadjusted VHOLD, showing the picture rolling with a black bar]

VRR is fundamentally just a dynamic, variable-height VHOLD black bar spacer between refresh cycles! Porches are simply analog overscan for all 4 edges, and sync intervals are simply signals to begin the next pixel row (horizontal sync) or the next refresh cycle (vertical sync).

But signal-timing-wise and signal-sequencing-wise, it's a 1:1 mapping between the analog and digital domains: the same Custom Resolution Utility numbers can be used digitally or in analog for two different displays (one digital, one analog) with compatible overlapping specs and blanking-interval size support.

Sure, HDMI & DP add packetization/compression layers on top of this generic signal, but after demultiplexing, decompressing, and removing the audio or USB packets, and then passing ONLY the display signal to the monitor's motherboard, it is just a signal 1:1 temporally mappable between analog and digital domains. You could have ~3,580,000 color pixels per second matching the NTSC 3.58 MHz colorburst, or double it up (to compensate for various Nyquist factors, or use a different frequency to solve other analog artifacts) and oversample it, but it's still a temporal 1:1 analog-digital or digital-analog conversion between VGA and HDMI (or DP), or vice versa.
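
For instance, here is the kind of back-of-envelope arithmetic involved (a sketch using well-known NTSC constants; the 2x factor is just one common oversampling choice):

Code:

NTSC_COLORBURST_HZ = 3_579_545          # the NTSC 3.58 MHz color subcarrier
pixel_clock = 2 * NTSC_COLORBURST_HZ    # oversample 2x for Nyquist headroom

NTSC_LINE_RATE_HZ = 15_734              # NTSC scanlines per second
print(pixel_clock / NTSC_LINE_RATE_HZ)  # ~455 pixel slots per scanline period

Each digital pixel then owns a fixed slice of analog scanline time, which is what makes the mapping 1:1.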

Yes, color depths may vary (e.g. 8-bit, 10-bit, 12-bit, floating point, HDR, etc.) but at that stage it's mostly just mapping digital values to analog voltage levels, and then merging analog signals via passive electronic components (e.g. to convert RGB to YPbPr or composite or RF, or vice versa). You could do it digitally too, but early adaptors were just passive components, and that's the minimum you really need to adapt between the analog signal formats (of the same timings & resolution, aka "same Custom Resolution Utility numbers"). A compatible passive adaptor stage, or a rudimentary (practically bufferless) digital-to-analog adaptor -- that's how 1:1 temporally mappable digital/analog video signals are.

And this is also true for VRR:

HDMI VRR works on analog CRT Tubes

HDMI VRR -> HDMI to VGA adaptor -> proper MultiSync CRT tube!

Successful VRR tests were done on a Compaq 1210 CRT and a Mitsubishi Diamond Pro 2070B CRT tube, usually with approx 56Hz-100Hz or 56Hz-120Hz VRR ranges (the MultiSync tube's refresh rate range).

Generic VRR is 100% in-spec with a MultiSync CRT tube; it's merely ~100 gentle mode changes per second (unchanged scanrate, unchanged resolution, unchanged horizontal sync, only a variable VBI).

The refresh rate "mode change" is timed more or less to the microsecond between refresh cycles, at an exact point in the VBI (the Vertical Back Porch, right after the Vertical Sync scanlines).
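
In other words, once per refresh cycle the source picks how many padding scanlines to emit so the next scanout begins exactly when the new frame is ready. A hypothetical sketch of that per-cycle decision (same illustrative timing numbers as the earlier snippet; not real driver code):

Code:

H_SCAN_RATE_HZ = 135_000    # fixed scanlines per second
V_ACTIVE = 1080             # visible lines
V_BLANK_MIN = 45            # VBI lines at the maximum refresh rate (120 Hz here)
V_TOTAL_MAX = 2250          # Vertical Total at the minimum refresh rate (60 Hz)

def vbi_padding_for_frametime(frametime_s: float) -> int:
    """Extra Back Porch scanlines so one refresh cycle lasts frametime_s."""
    v_total_wanted = round(frametime_s * H_SCAN_RATE_HZ)
    v_total = max(V_ACTIVE + V_BLANK_MIN, min(v_total_wanted, V_TOTAL_MAX))
    return v_total - (V_ACTIVE + V_BLANK_MIN)

print(vbi_padding_for_frametime(1 / 120))   # 0 extra lines at 120 fps
print(vbi_padding_for_frametime(1 / 60))    # 1125 extra lines at 60 fps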

The VRR signal is clearly designed in a way that is analog-compatible too, and some MultiSync tubes have ranges wide enough to be retroactively, accidentally VRR-compatible -- even the first MultiSync tubes from the 1980s and 1990s.

ToastyX CRU can force VRR without the EDID commanding it. Most VRR problems on both HDMI and DisplayPort are mostly political, plus plug-n-play discovery incompatibilities (intentional & unintentional).

For example, going "by the book", you read a few bytes in the EDID to get the VRR range (a specific custom CEA-861 extension block) -- but you can simply "ignore" that and blindly blast VRR out of the graphics output.

Blind VRR output occurs on Radeon cards when forced to do so via a Windows Registry-based EDID override. Boom. Blind VRR spewing out, which can then be forced into any VGA / RGBHV / SCART / DVI / custom video connector, as long as the display is true multisync without aggressive mode-change-blackout "cop electronics" built into the circuit board.
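
For a sense of what those "few bytes" look like, here is a hedged sketch of reading the standard EDID Display Range Limits descriptor (tag 0xFD), which carries min/max vertical rates and is one of the things a registry-based EDID override can doctor. (The FreeSync-over-HDMI range additionally lives in a vendor-specific CEA-861 extension block, whose layout is not shown here.)

Code:

def read_range_limits(edid: bytes):
    """Find the Display Range Limits descriptor (tag 0xFD) in an EDID base block."""
    for offset in (54, 72, 90, 108):         # the four 18-byte descriptor slots
        d = edid[offset:offset + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]                # min & max vertical rate, in Hz
    return None                              # no range limits descriptor present

# Usage sketch: read_range_limits(open("edid.bin", "rb").read()) -> e.g. (56, 120)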

Even HDMI 1.0 is VRR-compatible when "forced" to be. I even got VRR to work on DVI too! Not all panels are compatible, but a lot were (thanks to laptop power management: invisible Hz changes downward to save battery power), which automatically meant accidental compatibility with VESA Adaptive-Sync.

So, apparently, many 2000s-2010s DVI LCDs worked with a narrow VRR range via this trick, such as a 50Hz-75Hz VRR range on a 2006 DVI LCD. This is documented in a major discussion thread on the Guru3D forums from early experimentation.

It’s a symmetric 1:1 signal-temporal and signal-sequence mapping between the analog and digital domains!

See crosspost below: That’s how generic HDMI VRR is, chrissakes. :)

I also will crosspost something I posted on ArsTechnica about HDMI VRR being so generic that it works on CRT tubes!
“mdrejhon comment on ArsTechnica” wrote:
Dputiger wrote: The first monitor I ever used was an NEC MultiSync II+. I'm sure it wouldn't actually be compatible -- it didn't even have a "modern" 15-pin connector -- but I'm tickled by the idea that CRT technology could pull this off at all.
Who knows -- you might be surprised.

Apparently, unlike newer multisync CRTs, which had more mode-blackout electronics, the older multisync CRTs are "accidentally" more compatible with FreeSync VRR / HDMI VRR / VESA Adaptive-Sync VRR (all identical in signal-timing technique).

You know, the older curved multisync tubes with fewer digital features "to get in the way" of VRR.

Mode-change blackout electronics were actually a later addition to early multisync tubes, because early video sources changed modes in a very non-surgical and unsynchronized manner, creating a lot of artifacts/noise during mode changes: picture rolling, parallelogramming, temporary flicker, and other weird distortions. Later multisync tubes added mode-change blackout electronics to hide the distracting mode-change junk. But VRR is simply surgically-precise mode changes that occur at exact times between refresh cycles, and many CRTs don't even flinch at that type of minor mode change, as if it were not even a mode change.

As long as they were truly multisync, and the monitor was old enough not to have a mode-change blackout firmware cop -- these monitors tend to slew okay with continuously variable intervals between refresh cycles, as long as only the vertical blanking interval is affected (unchanged horizontal blanking, unchanged resolution, unchanged horizontal scan rate).

VRR is cleverly designed to keep those parameters unchanged while varying only the time interval between refresh cycles.

Another way to view this: these are surgically-precise mode changes timed exactly between refresh cycles, keeping most signal parameters unchanged except for adding/removing scanlines in the VBI to adjust the time interval between refresh-cycle scanouts.

Another way to view VRR converted to the analog domain: it's simply a variable-thickness VHOLD black bar:

[Image: variable-thickness VHOLD black bar between refresh cycles]

The horizontal scanrate is unchanged (number of scanlines per second -- aka number of pixel rows per second). It is just the number of scanlines in the VBI (Vertical Blanking Interval -- that hidden tiny pause between refresh cycles) that varies from refresh cycle to refresh cycle. Most mode-change blackout algorithms only black out when the horizontal scanrate changes -- so oftentimes there is no mode-change blackout for variable-height VBIs. But it depends on the analog electronics (or digital firmware behavior).

However, some CRTs will behave a little weirdly, such as vertical size or vertical position changes proportional to the size of the VBI (the interval between refresh cycles). But as long as the refresh rate slews at reasonable speeds, it can be relatively imperceptible, since many of these tubes are designed to automatically compensate for overscan. Some of these "analog VRR artifacts" are fixable by reducing the VRR range, albeit not always.

One of the first people to discover that FreeSync works in the analog domain posted in the Guru3D Forums: CRU Tips and Tricks. The analog VRR trick will also function over BNC connectors, including separate and combined sync (RGBHV, RGBS, RGsB), with the appropriate adaptors, and will adapt okay to a 9-pin connector.

I would guesstimate half of old "true" multisync tubes can be coaxed to function with a limited VRR range, such as 56-72, 60-100, 55-120, etc -- range experimentation is required via ToastyX Custom Resolution Utility, which allows you to create custom Windows registry-based E-EDID overrides with CEA-861 extension blocks, including the FreeSync range identifier for min/max Hz.

The CRT doesn't need to support EDID at all; the only purpose of this Windows registry-based EDID override (for FreeSync ranges) is to trick the GPU into outputting FreeSync anyway, to a non-FreeSync device or a device that doesn't have any EDID at all!

(This works successfully on Radeon GPU cards, but not currently on NVIDIA cards -- they're more picky about outputting VRR to unidentifiable screens).

Then you use a cheap HDMI-to-VGA adaptor (the almost completely unbuffered type that just dumbly does 1:1 timing conversion -- no modifications to blanking intervals -- and is thus more likely to simply convert digital VRR verbatim into analog VRR with the identical signal timings of the original digital VRR).

This resulting VGA VRR can then be appropriately adapted further (9-pin RGB, or 3/4/5-wire RGB BNC), and FreeSync VRR gets preserved through any of those analog domains successfully -- fundamentally, it's simply variable-interval sync signals, rather than the tick-tock metronomic sync signal of traditional fixed-Hz raster.

1950s and 1960s vector tubes (line-drawing tubes) were always variable refresh rate, since they had to dynamically lower the refresh rate to support more lines per refresh cycle. So CRTs have no inherent laws-of-physics restriction on variable refresh rate; the refresh cycle is pretty much completely controlled by the signal timings.

When you played the Star Wars arcade machine in 1983, on a color vector tube -- it was variable refresh! If you played the game often, you noticed it start flickering when it had more lines to draw (like during the Death Star explosion) -- the refresh rate was slowing down in real time while the Death Star exploded!

A 1980s or 1990s raster-based multisync tube has enough flexibility to support a limited VRR range at low resolutions, while technically staying 100% in-spec (despite literally ~100 surgically-precise mode changes per second), as long as its mode-change blackout electronics aren't aggressive.

For the analog VRR experimenter, you want to derive your analog VRR timings from your highest-Hz low-resolution mode (e.g. 800x600 120Hz on a 1600x1200 60Hz CRT tube). Then you have a 56-120Hz 800x600 VRR mode on the CRT. If your max Hz is only 85Hz at a specific resolution, your VRR range may be narrower, such as 56Hz through 85Hz -- the arithmetic is sketched below.
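
A hedged sketch of that derivation, assuming you know the horizontal scan rate your low-resolution mode runs at (the numbers are illustrative, not any specific tube's datasheet):

Code:

H_SCAN_RATE_HZ = 80_000    # e.g. the scan rate of your highest-Hz 800x600 mode
V_ACTIVE = 600             # visible lines of the low-resolution mode
V_BLANK_MIN = 28           # smallest VBI the tube tolerates (experiment!)
MIN_HZ_TARGET = 56         # lowest Hz before flicker/rolling sets in

max_hz = H_SCAN_RATE_HZ / (V_ACTIVE + V_BLANK_MIN)
v_total_at_min_hz = H_SCAN_RATE_HZ / MIN_HZ_TARGET
print(f"Candidate VRR range: ~{MIN_HZ_TARGET}-{max_hz:.0f} Hz "
      f"(Vertical Total {V_ACTIVE + V_BLANK_MIN}..{v_total_at_min_hz:.0f})")
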
Frankly speaking, it's amazing we still use the same signal topology between a 1920s TV broadcast (Baird / Farnsworth) and a 2020s DisplayPort cable. Left-to-right, top-to-bottom, with synchronization markers. What were once CRT control signals (overscan, next scanline, next refresh cycle) are now essentially digital comma separators, but they are 1:1 conversions.

4K signals even have a lot in common with the 1980s Japanese MUSE analog HDTV signal: MUSE HDTV used a Vertical Total of 1125, and 4K just doubles that to 2250. We kept the same Vertical Totals (VBI sizes) when going from analog HD to digital HD. Same numbers in ToastyX Custom Resolution for both analog and digital! So I can drive an NVIDIA card straight into a 1980s Japanese HDTV, even today. Or force an AMD card to do VRR on an NEC MultiSync CRT tube.
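
A quick arithmetic check of that Vertical Total lineage, using the standard CTA-861 timing numbers for 1080p60 and 2160p60 (well-known constants, shown here just to verify the claim):

Code:

muse_v_total = 1125                     # 1980s Japanese analog HDTV (MUSE)
hd_v_total = 1080 + 45                  # digital 1080p: active + blanking = 1125
uhd_v_total = 2160 + 90                 # digital 2160p: active + blanking = 2250

assert hd_v_total == muse_v_total       # analog HD and digital HD share a VT
assert uhd_v_total == 2 * muse_v_total  # 4K simply doubles it

# Cross-check with real pixel clocks: both land on exactly 60.0 Hz.
print(148_500_000 / (2200 * 1125))      # 1080p60: 148.5 MHz / (2200 x 1125)
print(594_000_000 / (4400 * 2250))      # 2160p60: 594 MHz / (4400 x 2250)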

The only truly "proprietary" VRR is the native G-SYNC module, but even that is just a massive enhancement layer on top of VESA Adaptive-Sync (such as really, really good variable-overdrive algorithms). Just enough enhancements to lock it to NVIDIA cards -- but increasingly, even newer NVIDIA G-SYNC-chipped monitors now work with AMD cards via the G-SYNC Compatible mode, so slowly there is some interoperability if you're careful to do your homework on how recent the display is.

The G-SYNC premium is worth it to many, but whether it's worthwhile is an individual decision. Nonetheless, the rest is generic VRR, mainly with the minimum improvements necessary to pass certification (by AMD, NVIDIA, or both).

That said, it is neat that we've stuck to the same signal topology for 100+ years. It's merely serializing a 2D image onto a 1D wire or broadcast, and we've kept a 1:1 mapping from analog to digital.

And VRR is just a "minor" modification: dynamically adding/removing overscan above the top edge of the screen in real time, to temporally delay the next refresh cycle until the frame is ready.

The visible screen doesn't care whether the signal's overscan is 1 pixel row above the top edge or 1,000 pixel rows above the top edge; it's just a clever trick to remove the fixed-frequency nature of the refresh cycle.

It's brilliant how simply and generically VRR is bolted onto the existing display-signal status quo.

Retroactively, in theory, you could use the old PowerStrip API to add VRR support to the first Radeon GPU or the first NVIDIA Riva TNT, via a special Windows 95/98 driver that does real-time porch changes, with specific compatible MultiSync CRT tubes.

In fact, I actually suggested raster VRR back in 2004 in the doom9 forums, as a genlock hack for the unreliably-varying sync of VHS VCR signals input into Hauppauge TV cards: eliminate stutter by varying the VGA refresh rate to match the ~0.01-to-0.1Hz variances of an analog VHS VCR caused by sheer tape-motor speed error margin. This simply underscores how generic VRR is.
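
A hypothetical sketch of that 2004 genlock idea: measure the incoming VCR frame interval, then pad the output's Vertical Total so the display slews along with the tape's wobble. The two callback hooks are assumptions for illustration; no real capture-card or driver API is shown.

Code:

import time

H_SCAN_RATE_HZ = 31_469        # fixed horizontal scan rate (example: ~31.5 kHz VGA)

def vtotal_for_interval(frame_interval_s: float) -> int:
    """Vertical Total whose scanout duration matches the source frame interval."""
    return round(frame_interval_s * H_SCAN_RATE_HZ)

def genlock_loop(wait_for_source_vsync, set_output_vtotal):
    """Track source timing; both parameters are hypothetical hardware hooks."""
    prev = time.perf_counter()
    while True:
        wait_for_source_vsync()            # blocks until the VCR's next frame
        now = time.perf_counter()
        set_output_vtotal(vtotal_for_interval(now - prev))
        prev = now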

P.S. A related note on this incredible chain of mostly uninterrupted but rarely utilized legacy backwards compatibility: I can use Windows ToastyX CRU software, or Linux "Modelines", to doctor VGA to run on practically any legacy display. I can get a semi-old ATI/AMD Radeon VGA-output card to almost direct-drive a 1940s North American TV set (via Custom Resolution Utility to run the scanrate at an NTSC-compatible frequency, plus connecting the VGA green wire to a luma contact point next to one of the vacuum tubes, plus appropriate impedance adjustment via discrete analog components like a potentiometer; in those pre-composite-input days, you had to hack a 1940s/1950s TV set to connect composite-luma directly via the VGA green wire, but it's just a few simple components). Today, HDMI/DP is 1:1 timings-mappable to analog VGA (and VGA's complete superset of possible signals even includes compatibility with 1940s signal frequencies, needing only a homebrew passive adaptor consisting merely of resistors/potentiometers). So you can use simple passive linear components like resistors/capacitors to MacGyver-adaptor the green wire (or one primary color) of VGA into an old monochrome vacuum-tube TV set, assuming the VGA output was ToastyX-CRU-modded to roughly 15,734 pixel rows per second (the ~15.7 kHz scan rate of legacy NTSC).
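
For illustration, a legacy-scanrate mode of that kind might look like the following (hypothetical Modeline-style numbers for a 240p-class mode, not a tested configuration for any specific set), with quick arithmetic confirming the rates:

Code:

# Modeline "320x240@60" 6.40  320 336 368 408  240 244 247 262 -HSync -VSync
#   pixel clock 6.40 MHz; Horizontal Total 408; Vertical Total 262
print(6.40e6 / 408)         # ~15,686 scanlines/sec -- legacy NTSC territory
print(6.40e6 / 408 / 262)   # ~59.9 Hz vertical refresh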

Some MAME arcade CRT builders use this trick to drive an old VGA output into RGB SCART / NTSC-compatible tubes, with appropriately simple, lag-free passive adaptoring, to generate 240p or 480i graphics for compatible games. At least until graphics cards stopped having native VGA outputs.