Edmond wrote: I don't believe it will ever be possible to have framerate = refresh = strobing. Not in real-world gaming.
Not true -- at least for 1080p 100Hz/120Hz strobing. Many of us routinely get it with our Titan or 780 SLI setups, even in Battlefield 4 (at reduced detail) and Crysis 3 (at slightly reduced detail, e.g. FXAA). We spend extra on our graphics cards to get the framerate == refreshrate == stroberate experience, which is quite stunning and often worth the extra GPU price.
Certainly not consistently in all games, but I enjoyed a near-perfect (95% of the time) 120fps@120Hz in BioShock Infinite with the framerate locked and almost all details at maximum (shadows notched down 1, view distance notched down 2, and using FXAA). For a solo game like that, it's often worth turning VSYNC ON to get the butter-smooth effect during the rail sliding. Also, some of us intentionally wait 1-2 years before buying games, and thus easily hit 120fps on current cards for the true triple-digit framerate experience on 120Hz monitors. I was able to play 100fps@100Hz in Borderlands 2 on a mere GeForce GTX 680, and all my older Source Engine games (CS:GO, Portal 2, etc.) could easily do framerate-refreshrate synchronized motion, which is what I have always loved playing with and wanted to maintain. For competitive play I turn VSYNC off, but when I'm playing solo I use VSYNC ON and slightly notch detail levels down to get the 'perfect motion effect' (TestUFO smooth effect / Nintendo Super Mario Bros. pan smooth) in most of my solo games, since slightly decreasing detail levels actually improves motion detail -- motion clarity is no longer obscured by stutter. So motion clarity nuts striving for the sharpest, most detailed graphics in motion (unobscured by stutter/blurring) strategically find the detail settings that make a big impact (e.g. switching to FXAA). Obviously, this isn't the goal of everyone on these forums.
Also, some monitor models have ultra-flexible strobe rates -- the Z-Series strobes anywhere from 60Hz through 144Hz -- so you can select a strobe rate that isn't beyond your GPU's capacity, while staying at a comfortable strobe threshold that doesn't show visible flicker (e.g. 85Hz).
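The strobe-rate selection logic above can be sketched in a few lines. This is just an illustrative sketch, not any monitor's actual firmware logic; the 60-144Hz panel range and the 85Hz comfort floor are the figures from this post:

```python
def pick_strobe_rate(sustained_fps, panel_min=60, panel_max=144, flicker_floor=85):
    """Pick the highest strobe rate (Hz) the GPU can feed (so that
    framerate == stroberate), clamped to the panel's supported range.
    Also report whether the rate clears the no-visible-flicker floor.

    Hypothetical helper for illustration -- panel_min/panel_max model the
    Z-Series 60-144Hz range, flicker_floor the ~85Hz comfort threshold.
    """
    rate = max(panel_min, min(int(sustained_fps), panel_max))
    comfortable = rate >= flicker_floor
    return rate, comfortable

print(pick_strobe_rate(100))  # GPU sustains 100fps -> strobe at 100Hz, no flicker
print(pick_strobe_rate(200))  # GPU exceeds panel -> clamp to 144Hz
print(pick_strobe_rate(70))   # 70Hz works, but below the 85Hz comfort floor
```

The point is simply that you pick the strobe rate to match what your GPU can sustain, not the other way around.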
As for 1000fps -- never say never. It may take years, decades, or a lifetime. Researchers have tested experimental refresh rates beyond 10,000Hz, and some consumer HDTVs can already display 240 discrete images per second (via interpolation). Work is also being done on onboard-GPU ultra-low-latency interpolation to convert a framerate into a higher framerate, so progress is happening in the lab. Strobing, however, is here to stay for a long time.
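As a toy illustration of what framerate conversion does (real low-latency interpolators estimate motion vectors and warp pixels along them; a plain blend like this is the crudest possible sketch, shown only to make the idea concrete): to double 60fps into 120fps you synthesize one in-between frame per original pair.

```python
def blend_frames(frame_a, frame_b, t=0.5):
    """Naive interpolation: per-pixel linear blend between two frames.
    Produces ghosting on fast motion -- real interpolators track motion."""
    return [round(a * (1 - t) + b * t) for a, b in zip(frame_a, frame_b)]

def double_framerate(frames):
    """Insert one blended frame between each consecutive pair,
    converting e.g. 60 frames/sec into ~120 frames/sec."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(blend_frames(a, b))
    out.append(frames[-1])
    return out

# Two 4-pixel grayscale "frames": a bright pixel moving right.
frames = [[255, 0, 0, 0], [0, 0, 255, 0]]
print(double_framerate(frames))
```

The synthesized middle frame is a 50/50 blend of its neighbors -- which is exactly why cheap TV interpolation ghosts, and why the low-latency GPU-side research mentioned above is harder than it sounds.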
Now, the lag-vs-stutter arguments for VSYNC ON/OFF are obviously valid, but VSYNC OFF FPS shooters aren't "real world gaming" for every single individual in every game -- there are people who play Angry Birds, who like solo RTS games, who play solo titles like the BioShock/Tomb Raider series, etc.
framerate == refreshrate == stroberate (with an ultrasmooth 1000Hz mouse) needs to be seen to be believed, and is worth the price of admission of a $1000 GPU for some of us; some have even spent $4000 on GPUs to get the triple-digit framerates that LightBoost benefits from (see
120fps Crysis 3 on THREE monitors by CallSignVega -- that's almost enough to push 4K 120fps already). The perfect silky stutter-free smooth CRT effect is hard to achieve, but several of us are routinely achieving it.
See
PHOTOS: 60Hz vs 120Hz vs LightBoost (the photographic comparison is only maximized/accurate during framerate = refreshrate = stroberate), as well as the
LightBoost testimonials, where the
"wow" experience during strobing mainly occurs at framerates near the strobe rate; all these real-world wows come mainly from people able to achieve triple-digit framerates. Strobing reduces motion blur by 80-95%+ (the Z-Series can reduce motion blur by up to 97%), but the theoretical maximum motion blur elimination only occurs at framerates matching the strobe rate. The BenQ Z-Series monitors have less motion blur than a typical CRT (which likewise achieves its best possible image quality at framerates matching its flicker rate, aka its refresh rate).
Some of these strobed LCDs have less motion blur than CRT monitors. The BenQ Z-Series is capable of
0.5ms persistence -- and easily passes the
TestUFO Panning Map Test with crystal-sharp, readable map labels at all speeds (right up to the human eye's maximum tracking speed). The panning map is as sharp as a paper map being waved past your eyes, with absolutely no motion blur, so it translates directly into greater image detail during panning/strafing/turning/camera spinning/etc. (as long as the game is running at a framerate matching the refresh rate matching the strobe rate), much like a CRT can. Blur reduction is very sensitive to stutters, since the lack of motion blur makes stutters easier to see. This is why some of us around here pay extra on our GPUs to achieve the super-silky-smooth experience during strobing. It's not important to everyone, but
we have lots of enthusiasts here who actually achieve real-world framerate = refreshrate = stroberate using the various
strobe backlight technologies now available.
That said, GSYNC is much easier and friendlier on GPUs, and definitely dramatically improves motion clarity in non-strobed situations; random-framerate motion looks much better and smoother. Motion blur, however, will always be bottlenecked by the persistence of the current framerate (e.g. 1/144sec of persistence at 144fps), rather than achieving sub-frame persistence the way strobing can.
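To put rough numbers on that persistence bottleneck, here is a small sketch. It uses the common rule of thumb that perceived blur width during eye-tracking is approximately persistence multiplied by on-screen motion speed; the 960 pixels/sec speed is an assumed TestUFO-style test speed, and the 0.5ms strobe figure is the Z-Series number quoted above:

```python
def blur_width_px(persistence_s, speed_px_per_s):
    """Approximate perceived motion blur width (pixels) while eye-tracking:
    how long each frame stays lit, times how fast the image moves."""
    return persistence_s * speed_px_per_s

SPEED = 960.0  # px/sec -- assumed test speed, TestUFO-style

# Sample-and-hold: each frame stays lit for the full refresh period.
print(blur_width_px(1 / 60, SPEED))    # 60Hz sample-and-hold
print(blur_width_px(1 / 144, SPEED))   # 144Hz sample-and-hold
# Strobed backlight: the frame is lit only during a short flash.
print(blur_width_px(0.0005, SPEED))    # 0.5ms strobe
```

At 960 px/sec this works out to about 16 px of blur at 60Hz, ~6.7 px at 144Hz, and under half a pixel with a 0.5ms strobe -- consistent with the ~93%+ blur reduction figures quoted above, and with why GSYNC at 144fps still can't match a strobed backlight for motion clarity.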