Question on 21:9 g-sync possibility

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutter and tearing, and reduces input lag. List of G-SYNC Monitors.
Edmond

Question on 21:9 g-sync possibility

Post by Edmond » 08 May 2014, 22:36

Hi,
I am super excited about G-SYNC, like ecstatic and stuff...

And it seems it will take half a year, maybe longer, for companies to release 1080p TN G-SYNC monitors... we aren't there yet.
Not to mention I've been spoiled by one of them 21:9 monitors.
Yes, it's an IPS blurfest... but that bezel-free FOV in games is so awesome. Anyway, I cannot possibly describe how much I want a 21:9 G-SYNC monitor, but I guess it will take 3-4 years, if anyone even cares enough to make one. A low-persistence OLED panel for that would be the bomb, but that's probably 10^googol years in the future...

ANYWAY
If modding any of the existing 21:9 panels with G-SYNC is possible, I would certainly try. (And I do not care how the panel looks afterwards.)

http://www.panelook.com/modelsearch.php ... on_state=1
http://www.panelook.com/modelsearch.php ... on_state=1

As far as I can tell, these are the four currently made 21:9 panels on the market. The pixel response on the 29" ones is 14ms, which is probably accurate. But the pixel response on the 34" is 5ms, which I think is bullshit - that's that GtG marketing response time, IMO.
The 14ms 29" panels are probably the ones to go with (14ms is below the 16.7ms refresh cycle at 60Hz), and they do say LVDS for the connection type.

I would just like a comment on this... or am I way out of my league here?

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Question on 21:9 g-sync possibility

Post by Chief Blur Buster » 08 May 2014, 22:55

Good first post!
Welcome to Blur Busters Forums.
Edmond wrote: Hi,
I am super excited about G-SYNC, like ecstatic and stuff...
We are too! It may be a year before there are widespread GSYNC choices (engineering, R&D, you know). But the flood of GSYNC is coming eventually.
Edmond wrote: A low-persistence OLED panel for that would be the bomb, but that's probably 10^googol years in the future...
Oculus development kit 2 is a low-persistence OLED. You can get simultaneously blur-free/stutter-free/tear-free motion on it in some games, with great FOV, during VSYNC ON 75fps@75Hz motion (the only compromise is a smidgen of lag, so you have to do a few tricks to reduce VSYNC ON latency).
Edmond wrote: If modding any of the existing 21:9 panels with G-SYNC is possible, I would certainly try. (And I do not care how the panel looks afterwards.)
We've got fun monitor modders here in Area 51. Cirthix created a homebrew 240Hz LCD.

The challenge for a homebrew GSYNC is that it is a 2-way display technology, with the display communicating back to the computer using an NVIDIA proprietary protocol. In the future, one could try transplanting a 2560x1440 GSYNC board (once those arrive) to drive a 2560x1080 monitor, hopefully without too much difficulty.

Two 1440p GSYNC boards will exist -- one for the ASUS ROG PG278Q, and one for the Overlord Tempest X270OC (scribby's working on it). Theoretically, this would be the lowest-hanging fruit -- buying the 1440p GSYNC Tempest and transplanting the GSYNC board (with some mods) to a 21:9 monitor -- or you can contact Scribby and bribe him with a large venture capital "donation" :D :D :D
Edmond wrote: As far as I can tell, these are the four currently made 21:9 panels on the market. The pixel response on the 29" ones is 14ms, which is probably accurate. But the pixel response on the 34" is 5ms, which I think is bullshit - that's that GtG marketing response time, IMO. The 14ms 29" panels are probably the ones to go with (14ms is below the 16.7ms refresh cycle at 60Hz), and they do say LVDS for the connection type.
Response time has nothing to do with refresh rate.
You can have 33ms LCDs doing 60Hz (response slower than the refresh cycle),
or 1ms LCDs doing 60Hz (response faster than the refresh cycle).
A GtG transition is simply pixel inertia that is independent of how frequently voltage hits the pixel...
high speed video of LCD panels being refreshed

In fact, pixel response becomes mostly meaningless below half a refresh cycle (e.g. I can't easily tell apart 2ms and 5ms GtG response during a 16.7ms refresh cycle at 60Hz). Below that, persistence becomes the dominant motion blur factor (I am able to tell apart 0.5ms, 1.0ms, and 2.0ms persistence via TestUFO motion tests).
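The persistence point above can be put into back-of-envelope arithmetic (a rough sketch, not from the post itself; the helper name and the TestUFO-like 960 pixels/second panning speed are illustrative assumptions):

```python
def motion_blur_px(persistence_ms: float, speed_px_per_s: float) -> float:
    """Eye-tracked motion blur in pixels ~= persistence x panning speed."""
    return (persistence_ms / 1000.0) * speed_px_per_s

# Sample-and-hold 60Hz: each pixel stays lit for the full ~16.7ms refresh cycle.
blur_60hz = motion_blur_px(1000.0 / 60.0, 960.0)  # 16.0 px of blur trail
# A 1ms strobe backlight at the same panning speed.
blur_strobed = motion_blur_px(1.0, 960.0)         # 0.96 px of blur trail
```

This is why a 2ms vs 5ms GtG difference vanishes under a ~16.7ms hold time, while halving persistence visibly halves the blur trail.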
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:
1. Rule #1: Be Nice. This is published forum rule #1. Even to newbies & people you disagree with!
2. Please report rule violations. If you see a post that violates forum rules, then report the post.
3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

Edmond

Re: Question on 21:9 g-sync possibility

Post by Edmond » 09 May 2014, 16:02

Chief Blur Buster wrote:Welcome to Blur Busters Forums.
Thank you!
Chief Blur Buster wrote: The challenge for a homebrew GSYNC is that it is a 2-way display technology, with the display communicating back to the computer using an NVIDIA proprietary protocol. In the future, one could try transplanting a 2560x1440 GSYNC board (once those arrive) to drive a 2560x1080 monitor, hopefully without too much difficulty.

Two 1440p GSYNC boards will exist -- one for the ASUS ROG PG278Q, and one for the Overlord Tempest X270OC (scribby's working on it). Theoretically, this would be the lowest-hanging fruit -- buying the 1440p GSYNC Tempest and transplanting the GSYNC board (with some mods) to a 21:9 monitor -- or you can contact Scribby and bribe him with a large venture capital "donation"
That's actually one of the best things I could learn! I now have a plan!

If, by the time the Tempest G-SYNC monitor starts appearing, there have not been any rumors of a legit 21:9 G-SYNC panel, I will look into how much it would cost me to try and transplant the Tempest's guts into a 21:9 panel. I mean, I HAVE to try, otherwise the blind endless waiting will just drive me cray cray.
Chief Blur Buster wrote: Oculus development kit 2 is a low-persistence OLED. You can get simultaneously blur-free/stutter-free/tear-free motion on it in some games, with great FOV, during VSYNC ON 75fps@75Hz motion (the only compromise is a smidgen of lag, so you have to do a few tricks to reduce VSYNC ON latency).
If you mean that "x-1 fps" trick, where x = your refresh rate: I am using that right now to fix tearing with VSYNC and remove most of the lag introduced by VSYNC. Stuttering still remains... but hey, that would be too easy, right?
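The "x-1" cap can be sketched as a minimal frame limiter (a hypothetical illustration, not any real game's API; the idea is that capping just below the refresh rate keeps the VSYNC back-buffer queue drained, which is where most of the VSYNC ON lag comes from):

```python
import time

def frame_budget_s(refresh_hz: float, fps_margin: float = 1.0) -> float:
    """Seconds per frame for a (refresh - margin) fps cap, e.g. 59fps on a 60Hz panel."""
    return 1.0 / (refresh_hz - fps_margin)

def run_capped(frames: int, refresh_hz: float = 60.0) -> float:
    """Simulate a render loop capped just below the refresh rate; returns elapsed seconds."""
    budget = frame_budget_s(refresh_hz)
    start = time.perf_counter()
    deadline = start
    for _ in range(frames):
        # ... render the frame here ...
        deadline += budget
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)  # wait out the rest of the frame budget
    return time.perf_counter() - start
```

With a 60Hz display this paces frames at ~16.95ms apart (59fps), so the GPU never races ahead of the display and queues up pre-rendered frames.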

Also, apparently there is a 4096x2160 30" OLED monitor from Sony coming this year. Can't wait to see how the pixel persistence is on that. But it is Sony, therefore expensive. And I'm guessing it won't have G-SYNC either. Which is a real fucking shame - imagine getting a beautiful sharp monitor and then dealing with tearing/stutter/lag just like before. (As far as gaming goes, anyway.)


Re: Question on 21:9 g-sync possibility

Post by Chief Blur Buster » 09 May 2014, 17:06

Edmond wrote: Also, apparently there is a 4096x2160 30" OLED monitor from Sony coming this year. Can't wait to see how the pixel persistence is on that.
I'm mighty curious to know if Sony will be using a rolling scan on this 30" OLED, like Sony already does with their PVM/BVM series. That would lower persistence significantly.
Edmond wrote: Imagine getting a beautiful sharp monitor and then dealing with tearing/stutter/lag just like before.
If you're able to get framerate == refreshrate == stroberate (despite the herculean resolution), then you can get something better looking than GSYNC. But you would have a bit of lag due to that. And it could flicker a lot, if it's just a 60Hz rolling scan.

Edmond

Re: Question on 21:9 g-sync possibility

Post by Edmond » 09 May 2014, 17:35

Chief Blur Buster wrote: If you're able to get framerate == refreshrate == stroberate (despite the herculean resolution), then you can get something better looking than GSYNC. But you would have a bit of lag due to that. And it could flicker a lot, if it's just a 60Hz rolling scan.
That OLED screen in the latest Oculus headset is flicker-free, right?
Because I see low-persistence OLED as the only thing in existence that can give minimal blur on a flicker-free screen (that isn't 1000Hz).
And if we actually see OLED monitors start appearing this year... well, nirvana might be in reach after all.
(Add a cinematic aspect ratio, 100Hz, and G-SYNC to that = perfect monitor.)

I don't believe it will ever be possible to have framerate = refresh = strobing. Not in real-world gaming. And even if it's doable with an OP PC, the lag and the flicker you get are again a compromise. In my opinion, OLED is our only way to get a no-compromise screen.
I don't understand why there are things like IGZO panels and whatnot being developed when the real solution is low-persistence OLED and G-SYNC (or an open version of it).
And we are not ever going to get 500fps/Hz or 1000fps/Hz gaming for a flicker-free low-persistence experience. Not for the consumer market... (never say "never"), but I highly doubt it, I mean.


Re: Question on 21:9 g-sync possibility

Post by Chief Blur Buster » 09 May 2014, 19:07

Edmond wrote: I don't believe it will ever be possible to have framerate = refresh = strobing. Not in real-world gaming.
Not true -- at least for 1080p 100Hz/120Hz strobing. Many of us are routinely getting it with our Titans or 780 SLIs, even in Battlefield 4 (at reduced detail) and Crysis 3 (at slightly reduced detail, e.g. FXAA). We spend extra on our graphics cards to get the framerate == refreshrate == stroberate experience, which is quite stunning, and often worth the extra price on GPUs.

Certainly not consistently in all games, but I enjoyed a nearly perfect (95% of the time) 120fps@120Hz in Bioshock Infinite, framerate locked with almost all details at maximum (shadows notched down 1, view distance notched down 2, and using FXAA). That's a solo game, often worth turning VSYNC ON for, to get the butter-smooth effect during the rail sliding. Also, some of us intentionally wait 1-2 years before buying games, and thus easily play those at 120fps on current cards, for the true triple-digit framerate experience on 120Hz monitors. I was able to play 100fps@100Hz in Borderlands 2 on a mere GeForce GTX 680, and all my older Source Engine games (CS:GO, Portal 2, etc.) could easily do framerate-refreshrate synchronized motion, which is what I have always loved playing with for years, and wanted to maintain.

For competitive play I do turn VSYNC off, but when I'm playing solo, I use VSYNC ON and slightly notch detail levels down to get the 'perfect motion effect' (the TestUFO smooth effect / Super Mario Bros. pan smoothness) in most of my solo games. Very slightly decreasing detail levels actually improves motion detail, since motion clarity is no longer obscured by stutter. So motion clarity nuts, striving for the sharpest and most detailed graphics in motion (unobscured by stutter/blurring), strategically find certain detail settings that make a big impact (e.g. switching to FXAA). Obviously, this isn't the goal of everyone in these forums.

Also, some models of monitors have ultra-flexible strobe rates, such as the Z-Series from 60Hz through 144Hz, so you can select a strobe rate which isn't beyond your GPU capacity, and at some comfortable strobe threshold that doesn't show visible flicker (e.g. 85Hz).

As for 1000fps -- never say never. It may take years, decades, a lifetime, etc. Researchers have tested experimental refresh rates beyond 10KHz, and some consumer HDTVs are capable of displaying 240 discrete images per second (via interpolation). There is work being done on onboard-GPU ultra-low-latency interpolation, to convert a framerate into a higher framerate, so progress is being made in the lab. Strobing, however, is here to stay for a long time.

Now, the arguments of lag vs stutter for VSYNC ON/OFF obviously are valid, but VSYNC OFF FPS shooters aren't "real world gaming" for every single individual in every game; there are those who play Angry Birds, like solo RTS games, or play solo titles like the Bioshock/Tomb Raider series, etc.

framerate == refreshrate == stroberate (with an ultrasmooth 1000Hz mouse) needs to be seen to be believed, and is worth the price of admission of a $1000 GPU for some of us; some of us spent $4000 on GPUs to get the triple-digit framerates that LightBoost benefits from (see 120fps Crysis 3 on THREE monitors, by CallSignVega -- that's almost enough to push 4K 120fps already). The perfect silky stutter-free smooth CRT effect is hard to achieve, but several of us are routinely achieving it.

See PHOTOS: 60Hz vs 120Hz vs LightBoost; the photographic comparison is only maximized/accurate during framerate = refreshrate = stroberate. The same goes for the LightBoost testimonials, where the "wow" experience during strobing mainly occurs at framerates near stroberates; all these real-world wows mainly come from people able to achieve triple-digit framerates.

Strobing reduces motion blur by 80-95%+ (the Z-Series can reduce motion blur by up to 97%), and the theoretical maximum motion blur elimination only occurs at framerates matching stroberates. The BenQ Z-Series monitors have less motion blur than a typical CRT (which also achieves its best possible image quality at framerates matching its flicker rate, aka its refresh rate). Some of these strobed LCDs have less motion blur than CRT monitors. The BenQ Z-Series is capable of 0.5ms persistence -- and easily passes the TestUFO Panning Map Test with crystal-sharp readable map labels at all speeds (right up to your eye's maximum tracking speed). The panning map is as sharp as a paper map being waved past your eyes, with absolutely no motion blur -- so it translates directly into greater image detail during panning/strafing/turning/camera spinning/etc. (as long as the game is running at a framerate matching the refreshrate matching the stroberate), much like a CRT can.

Blur reduction is very sensitive to stutters (the lack of motion blur makes stutters easier to see). This causes some of us around here to pay extra on our GPUs to achieve the super-silky-smooth experience during strobing. It's not an important thing to everyone, but we have lots of enthusiasts here that actually get real-world framerate = refreshrate = stroberate using the various strobe backlight technologies now available.

That said, GSYNC is much easier and friendlier on GPUs, and definitely dramatically improves motion clarity in non-strobing situations. Random-framerate motion looks so much better and smoother. Motion blur will always be bottlenecked by the persistence of the current framerate (e.g. 1/144sec persistence during 144fps, rather than the sub-frame persistence that strobing is able to achieve).
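That persistence floor can be put in numbers (a rough sketch of the comparison above; the helper names are illustrative, not from any real API):

```python
def sample_and_hold_persistence_ms(fps: float) -> float:
    """Non-strobed (e.g. G-SYNC) persistence: each frame stays lit for its whole frame time."""
    return 1000.0 / fps

def strobed_persistence_ms(pulse_ms: float) -> float:
    """Strobed persistence is set by the backlight pulse width, independent of framerate."""
    return pulse_ms

# 144fps without strobing: ~6.9ms persistence (the 1/144sec floor described above).
# A 1ms strobe pulse: 1ms persistence, even at much lower refresh rates.
```

So a strobe backlight can undercut even a very high non-strobed framerate, but only when the framerate keeps up with the stroberate.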

Edmond

Re: Question on 21:9 g-sync possibility

Post by Edmond » 10 May 2014, 11:24

That was a very interesting read! Much appreciated! Feels like I learned something!

I just can't shake the thought of being at the mercy of the game not dropping below that 120fps. Also, if that's the minimum framerate, you have a lot of GPU power idling away much of the time. But a consistent framerate is good.
I watched the video you linked, and it seems some people have achieved a borderline nirvana, and I'm jelly as hell.

I hope a more mainstream or better experience awaits us in the future, but it will probably be a while before we get a 100+Hz low-persistence flicker-free OLED with G-SYNC.
Then the only thing left is pixel density high enough for AA to be obsolete ^^

But I hear dreaming too much is unhealthy.

Still looking forward to modding a 21:9 panel with a Tempest's G-SYNC board, though.
