Solar wrote: $200 actually isn't that bad considering the FPGA used in G-Sync appears to retail at roughly $1000. As long as we have access to the features it might offer, such as being programmable for mods or future updates, it should be well worth the price.
Wow -- which FPGA would that be? I guess they haven't come up with a dedicated circuit (ASIC) for G-SYNC use just yet, at least for 1920x1080 120Hz.
AMD's FreeSync isn't running at 1920x1080 120Hz, so presumably the FPGA is necessary here.
Either way, the motion quality during 40fps-50fps looked like a greater-than-$200 GPU upgrade; it sort of looked like I had massively upgraded the GPU (more than $200 worth!). For recent highly-variable-framerate games such as Crysis 3 and Battlefield 4, G-SYNC really helps, since those games are often too slow to benefit well from LightBoost.
Although I prefer 120fps strobed, my eyes clearly preferred smooth 45fps G-SYNC (fps=Hz) over stuttery 80fps non-G-SYNC (fps!=Hz). G-SYNC gave Crysis 3 a permanent "framerate matching Hertz" look even when the framerate fluctuated. Only rare stutters (caused by disk loading / background computer activity) occurred; all the remaining GPU stutters went away. The animation at
http://www.testufo.com/stutter#demo=gsync shows how G-SYNC can pull off seamless, stutterless framerate transitions. (You still get the regular low-framerate stutter feel at 30fps, but you don't get any erratic stutter during framerate transitions.) If I were using a GPU that frequently limited me to 50fps (e.g. even a Titan gets brought to its knees very often in Crysis 3 at Ultra settings), G-SYNC would improve motion quality more than LightBoost does.
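To show why fixed-Hz displays stutter during framerate transitions while fps=Hz variable refresh doesn't, here's a minimal Python sketch (my own toy numbers, not anything from TestUFO or NVIDIA): on a fixed 60Hz display a finished frame has to wait for the next vblank, so displayed-frame intervals jump erratically between multiples of 16.7ms, while a variable-refresh display refreshes the instant each frame is ready, so intervals ramp smoothly.

```python
import math

# Fixed display refresh interval (60Hz), in seconds.
REFRESH = 1.0 / 60.0

# Toy render times ramping smoothly from 18ms to 25ms per frame
# (roughly a 55fps -> 40fps transition).
frame_times = [0.018 + (0.025 - 0.018) * i / 9 for i in range(10)]

def fixed_hz_intervals(frame_times, refresh=REFRESH):
    """Fixed Hz: each frame appears at the first vblank after it renders."""
    t, shown = 0.0, []
    for ft in frame_times:
        t += ft
        shown.append(math.ceil(t / refresh) * refresh)  # snap to next vblank
    return [b - a for a, b in zip(shown, shown[1:])]

def gsync_intervals(frame_times):
    """Variable refresh: displayed interval equals the render interval."""
    return frame_times[1:]

fixed = fixed_hz_intervals(frame_times)
vrr = gsync_intervals(frame_times)

# Fixed Hz jumps between 16.7ms and 33.3ms (erratic stutter);
# variable refresh ramps smoothly from ~18.8ms to 25ms.
print([round(x * 1000, 1) for x in fixed])
print([round(x * 1000, 1) for x in vrr])
```

The smooth ramp in the second list is exactly the "seamless framerate transition" the animation demonstrates: no interval ever snaps to a refresh boundary, so nothing judders.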