Good idea about 120fps capping (or even 240fps with future true-240Hz monitors) to get the best twitch response most of the time, while still retaining GSYNC benefits.
Now, on the subject of GSYNC, but on a slightly different topic (than this thread) -- in a way that still helps enhance understanding of GSYNC:
Sparky wrote: As for g-sync staying synchronized (with the game engine) to the exact millisecond, I don't think it's that good, because the game engine decides its animation interval based on previous frames; it can't predict exactly how long this frame will take to render.
Exact-millisecond synchronization is actually happening routinely.
This is true -- some games are much better than others at this. In the best games, however, the rendertime error is sub-millisecond and sometimes negligible. The games whose stutters GSYNC fixes most completely are the ones whose frame rendertimes stay most accurately in sync with frame present-times (this is a necessary property of GSYNC stutter improvement). The games with the least stutter are the ones with far more consistent rendertimes. Usually, framerates ramp up/down smoothly, so rendertimes ramp up/down smoothly.
It continuously looked like the permanent "fully capped out VSYNC ON silky motion effect" despite a varying frame rate. As I was turning through 50fps, it looked like perfect 50fps@50Hz. As I was turning through 60fps, it looked like perfect 60fps@60Hz. As I was turning through 70fps, it looked like perfect 70fps@70Hz, and so on. This is one of the bigger benefits of GSYNC: the ability to make a game maintain the "perfect smooth capped-out look" at fluctuating frame rates.
When I did this, it appeared stutter-free (at least random-stutter free; I still see the regular stop-motion effect of low frame rate motion when I go closer to 30fps -- but the erratic stutters are gone). I couldn't see the stutters normally associated with framerate changes; it was a smooth framerate-ramping effect like a CVT (continuously variable transmission), with no stepped effect (erratic stutters). This situation is where GSYNC really shines; otherwise GSYNC isn't really all that useful a feature. Even 40fps GSYNC looked visually better than a floating 50-70fps VSYNC OFF (less stutter, less tearing).
Assuming you take approximately 1 second to move smoothly from 50fps to 100fps, GPU rendertimes vary only gradually, and consecutive frame rendertimes vary by only hundreds of microseconds relative to the photons hitting the eyes. For example, at 50fps going to 51fps on the next frame, the frametime goes from 1/50 sec to 1/51 sec (a ~400 microsecond GPU rendertime variance), and at 99fps smoothly going to 100fps on the next frame, from 1/99 sec to 1/100 sec (a ~100 microsecond GPU rendertime variance). Here, the dis-synchronization of rendertimes away from present-times is a statistically insignificant factor, and GSYNC performs admirably, as advertised.
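For a concrete feel of those numbers, here's a quick back-of-envelope sketch in Python (the one-fps-per-frame ramp is just an illustrative assumption, not a measured profile):

[code]
# Back-of-envelope sketch: consecutive-frame frametime deltas during a
# smooth 50fps -> 100fps ramp, assuming framerate rises roughly 1fps per frame.
def frametime_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (50, 60, 70, 80, 90, 99):
    delta_us = (frametime_ms(fps) - frametime_ms(fps + 1)) * 1000.0
    print(f"{fps}fps -> {fps + 1}fps: frametime shrinks by ~{delta_us:.0f} us")

# Prints roughly 392 us at 50->51fps, shrinking to ~101 us at 99->100fps --
# only hundreds of microseconds of rendertime variance per frame.
[/code]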
Now, when framerate randomizes at single-frame granularity, e.g. 50fps suddenly going to 70fps, suddenly 30fps, suddenly 100fps, there would be enough variance between game times and the light hitting the eyes (due to GPU rendertime variances) to create stutters that show up even through GSYNC. Some game engines still stutter with GSYNC, so I would presume that part of this effect is because of this. Explosions, sudden physics processing, trigger mechanisms, and doors opening can create massively dramatic framerate swings that amplify stutters. But if GPU rendertimes still remain reasonably consistent (e.g. 60fps suddenly changing to 45fps on the next frame is a GPU rendertime change of 1/45 sec minus 1/60 sec, about a 5.6 millisecond change during that single frame), the disturbance is still far smaller than the whole repeated refresh you would get from the same slowdown under fixed-Hz VSYNC ON. GSYNC is capable of smoothing that stutter out, so reasonable sudden changes in framerate can still look noticeably more stutter-free.
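To put that number in perspective, here's a rough sketch (Python) of a single sudden 60fps-to-45fps frame, comparing GSYNC against plain 60Hz VSYNC ON; the VSYNC model is a simplified double-buffered assumption, for illustration only:

[code]
# Rough sketch: timing disturbance of one sudden 60fps -> 45fps frame,
# under GSYNC versus fixed 60Hz VSYNC ON (simplified double-buffered model).
frametime_60_ms = 1000.0 / 60   # ~16.7 ms -- the cadence the game was pacing for
frametime_45_ms = 1000.0 / 45   # ~22.2 ms -- how long the slow frame actually took

# GSYNC: the refresh simply waits for the frame, so it lands ~5.6 ms "late".
gsync_error_ms = frametime_45_ms - frametime_60_ms

# Fixed 60Hz VSYNC ON: the frame misses its refresh slot and waits for the next
# one, so the previous frame is repeated for a whole extra refresh (~16.7 ms).
vsync_error_ms = frametime_60_ms

print(f"GSYNC timing error:    ~{gsync_error_ms:.1f} ms")
print(f"VSYNC ON timing error: ~{vsync_error_ms:.1f} ms")
[/code]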
Obviously, the real error margin is probably bigger than this as all the factors add up (e.g. CPU time variances, mouse poll-time variances), but GSYNC becomes a statistically insignificant cause of microstutter in this situation. I even think the world is now ready for 2000Hz mice (reducing poll-time microstutter variance to 0.5ms), since the Blur Busters audience already knows that LightBoost and GSYNC massively amplify the visibility of the microstutter difference between 500Hz and 1000Hz mice -- a hint that 2000Hz computer mice could become worthwhile (once sensor technology is good enough), especially in a future era of strobed 100Hz+ 4K (since higher resolutions, low persistence and clearer motion make microstutters easier to see).
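As an illustration of why higher poll rates matter here, a tiny sketch (the 1000 pixels/sec turning speed is a hypothetical number, only there to make the arithmetic visible):

[code]
# Illustrative only: how mouse poll-time variance maps to on-screen microstutter,
# assuming a hypothetical turning speed of 1000 pixels/sec.
turn_speed_px_per_sec = 1000.0

for poll_hz in (500, 1000, 2000):
    jitter_ms = 1000.0 / poll_hz                        # worst-case poll-time variance
    jitter_px = turn_speed_px_per_sec * jitter_ms / 1000.0
    print(f"{poll_hz}Hz mouse: up to {jitter_ms:.1f} ms variance "
          f"= ~{jitter_px:.1f} px of positional microstutter")
[/code]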
Theoretically you could "frame-pace" (sort of like what you have to do for SLI) -- e.g. buffer the frames, then time the actual display of each frame correctly relative to game time, compensating for GPU frame rendering time (adding input lag in exchange for further stutter removal at wildly fluctuating frame rates). As far as I know, GSYNC drivers don't currently do that depth of stutter removal (addition of "frame pacing" to GSYNC). Judging by this NVIDIA diagram, NVIDIA isn't currently frame-pacing GSYNC; otherwise, panel display times would stay perfectly in sync with the very _beginning_ of GPU rendertimes (rather than the _end_ of GPU rendertimes).
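For clarity, this is roughly what I mean by frame pacing -- a conceptual sketch only, not how NVIDIA's driver actually works, and every name in it is a hypothetical placeholder:

[code]
# Conceptual sketch of "frame pacing": present each frame at a fixed offset
# after the *start* of its render (i.e. after its game-time sample), instead of
# immediately when the GPU finishes. The fixed offset is added input lag,
# traded for stutter removal. Not NVIDIA's implementation; names are placeholders.
import time

PACE_OFFSET = 0.025   # seconds; must exceed the worst expected rendertime

def frame_paced_loop(sample_input, render_frame, present_frame):
    while True:
        render_start = time.perf_counter()    # game time is sampled here
        frame = render_frame(sample_input())  # variable GPU rendertime happens here

        # Sleep so photons land at render_start + PACE_OFFSET, keeping display
        # intervals equal to game-time intervals even when rendertimes fluctuate.
        delay = (render_start + PACE_OFFSET) - time.perf_counter()
        if delay > 0:
            time.sleep(delay)
        present_frame(frame)                  # variable-refresh display scans out now
[/code]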
As you pointed out -- currently, GSYNC keeps panel display times in sync with the _end_ of GPU rendertimes:
This is often good enough, at least for a lot of game engines (apparently), since stutter-elimination benefits are already immediately observed. I see many situations where GSYNC at 70-100fps looks smoother than triple-buffered 300fps (a microstutter error margin of 3.3ms), so I am not surprised if GSYNC can reach sub-millisecond precision during normal situations. Perhaps this is a situation where Blur Busters would love to borrow a Phantom Flex, to determine how accurately frame times stay in sync with game times.
The ideal perfect-motion scenario would be if GSYNC were modified/optimized further to stay in sync with the _beginning_ of GPU rendertimes, so that game times are perfectly in sync with panel times (in other words: light from a new frame hitting human eyes at a time accurately synchronized with game time). However, this would only help certain problem game engines, not all situations, and would add the slight amount of input lag necessary to achieve frame pacing (roughly equivalent to SLI input lag)... The ideal motion scenario would not be the ideal latency scenario, as frame pacing necessarily adds lag.
In theory, future NVIDIA drivers might add an extra mode ("enhanced frame-paced GSYNC") that synchronizes refresh timings relative to the _beginning_ of GPU rendertimes rather than the _end_, which would smooth out stutters in even more games. But I don't know whether enough "problem" game engines exist for such a further GSYNC enhancement to be worth it.
Either way, good discussion -- and I know NVIDIA is paying attention to these forums; they're probably getting ideas (at least for the distant future) as we speak.
As I said above, 120fps capping (or even 240fps with future true-240Hz monitors) is a good idea for getting the best twitch response most of the time while still retaining GSYNC benefits. However, GSYNC makes a fluctuating framerate no longer matter: fluctuating 50-100fps keeps the "consistent, perfectly capped-out look" despite the fluctuating frame rate. That's the GSYNC benefit -- the luxury of letting the framerate fluctuate while keeping it looking smooth throughout. Games like Battlefield 4 and Crysis 3 really shine here, where you don't even notice the difference between 40fps and 50fps (except roughly 25% more motion blur at 40fps, since persistence blur scales with frametime, 1/40 sec versus 1/50 sec -- but because that blur varies subtly, by fractions of a pixel per frame across subsequent frames, the gradually varying motion blur under GSYNC is far less noticeable than tearing or stutter). GSYNC exists to make variable framerates look perfect, and keeping the frame rate fixed defeats a lot of its maximal benefit.
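For the motion blur difference mentioned above, a quick sketch of sample-and-hold persistence blur (the 2000 pixels/sec pan speed is a hypothetical figure for illustration):

[code]
# Quick sketch: persistence (sample-and-hold) motion blur at 40fps vs 50fps on a
# non-strobed GSYNC display, assuming a hypothetical 2000 pixels/sec pan speed.
pan_speed_px_per_sec = 2000.0

for fps in (40, 50):
    blur_px = pan_speed_px_per_sec / fps   # blur trail length per frame, in pixels
    print(f"{fps}fps: ~{blur_px:.0f} px of motion blur")

# 40fps -> ~50 px, 50fps -> ~40 px: about 25% more blur at 40fps, but during a
# gradual framerate ramp the blur changes by only fractions of a pixel per frame.
[/code]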
I am kind of getting off-topic, I know -- but this discussion is very interesting, from a vision science perspective nonetheless.