nimbulan wrote: nVidia doesn't believe that any monitors on the market currently support variable refresh rate
I think there's some truth to both sides:
1.
AMD Side -- I think the key distinction is laptop LCD controllers that already automatically slow down the refresh rate to save power. I had a Lenovo ThinkPad at one of my former jobs, and that's exactly what its panel did. It seems FreeSync is repurposing that mechanism to eliminate stutter and tearing (see the sketch after this list). That makes sense, provided the controllers are flexible enough to allow it.
2.
NVIDIA Side -- I think NVIDIA needed the custom FPGA to flawlessly pull off what they needed on modern 120Hz and 144Hz panels. Flawless, flicker-free operation of variable refresh rate over a wide range (30Hz through 144Hz) is far more difficult than over a narrow range (e.g. 40Hz-60Hz). You've got the massive bandwidth of full HD resolution refreshing 144 times a second (rough numbers below) -- a far cry from a laptop LCD.
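To make point 1 concrete, here's a sketch of the power-saving behavior I mean: the embedded panel controller drops to a lower fixed rate when nothing on screen is changing. The rates and function names are illustrative assumptions of mine, not any vendor's actual values (Python):

```python
# Hypothetical sketch of laptop-style dynamic refresh for power saving.
# IDLE_RATE_HZ / ACTIVE_RATE_HZ are assumed values, not vendor specs.

IDLE_RATE_HZ = 40    # assumed low-power rate for static content
ACTIVE_RATE_HZ = 60  # assumed normal desktop rate

def pick_refresh_rate(updates_in_last_second):
    """Pick a panel refresh rate based on recent screen activity."""
    if updates_in_last_second == 0:
        return IDLE_RATE_HZ    # screen is static: refresh slower to save power
    return ACTIVE_RATE_HZ      # content is changing: run at full rate
```

FreeSync's trick, as I understand it, is to drive that same rate-switching machinery from the GPU's frame timing instead of from an idle detector.

And to put rough numbers on point 2's bandwidth claim, a back-of-the-envelope calculation (the ~20% blanking overhead is my own assumption):

```python
def raw_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=24,
                       blanking_overhead=1.20):
    """Approximate link bandwidth in Gbit/s, including blanking intervals."""
    return width * height * refresh_hz * bits_per_pixel * blanking_overhead / 1e9

print(f"1080p @ 144Hz: {raw_bandwidth_gbps(1920, 1080, 144):.1f} Gbit/s")  # ~8.6
print(f"1080p @  60Hz: {raw_bandwidth_gbps(1920, 1080, 60):.1f} Gbit/s")   # ~3.6
```

Driving roughly 8-9 Gbit/s while also retiming every single refresh on the fly is the kind of job that plausibly justified a custom FPGA.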
Some unanswered questions:
-- I'm wondering how fine-grained AMD's implementation of VRR is in real time. Can it change every single frame, or only a few times a second?
-- I'm wondering what refresh rate range AMD's implementation of VRR supports. 30Hz through 60Hz?
You need single-frame granularity for true VRR. GSYNC is single-frame granular, which means NVIDIA can change the refresh rate over a hundred times a second -- every single frame. If AMD's implementation slews the rate slowly, or creates artifacts (even the faintest noticeable flicker during fast refresh rate changes can be annoying), then it will not be as good as NVIDIA's. However, even if that is a problem now, I don't see any reason why future LCD manufacturing couldn't solve it, giving us better embedded FreeSync/GSYNC implementations without needing an FPGA.
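To spell out what I mean by single-frame granularity, here's a minimal sketch of a per-frame VRR scheduling loop, clamped to an assumed 30Hz-144Hz panel range. This is my mental model only -- `wait_for_frame` and `scan_out` are hypothetical stand-ins, not NVIDIA's or AMD's actual logic:

```python
import time

MIN_INTERVAL = 1.0 / 144  # panel's fastest allowed refresh (144Hz ceiling)
MAX_INTERVAL = 1.0 / 30   # longest the panel can hold a frame before it must
                          # be re-scanned (30Hz floor), or it flickers/decays

def vrr_loop(wait_for_frame, scan_out):
    """wait_for_frame(timeout) returns a finished frame or None on timeout;
    scan_out(frame) drives one panel refresh. Both are hypothetical stand-ins."""
    last_frame = None  # a real controller would hold a blank frame at startup
    last_refresh = time.monotonic()
    while True:
        # Wait for the GPU, but never longer than the panel's 1/30s floor.
        budget = MAX_INTERVAL - (time.monotonic() - last_refresh)
        frame = wait_for_frame(timeout=max(budget, 0.0))
        if frame is None:
            frame = last_frame  # GPU too slow: repeat the previous frame
        # Never refresh faster than the panel's 1/144s ceiling.
        early = MIN_INTERVAL - (time.monotonic() - last_refresh)
        if early > 0:
            time.sleep(early)
        scan_out(frame)          # the refresh fires when THIS frame is ready,
        last_frame = frame       # so the effective rate changes every frame
        last_refresh = time.monotonic()
```

The point is that each refresh is scheduled by the frame that triggers it; an implementation that can only re-target the rate a few times a second would still stutter whenever frame times vary faster than that.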
More monitor makers and HDTV makers need to pay attention to this now. We need to put FreeSync/GSYNC in televisions too...