Vive & Rift - actual resolution & frame rate demands?

Everything about displays and monitors. 120Hz, 144Hz, 240Hz, 4K, 1440p, input lag, display shopping, monitor purchase decisions, compare, versus, debate, and more. Questions? Just ask!
Post Reply
Manimal 5000
Posts: 53
Joined: 05 Jul 2015, 21:01

Vive & Rift - actual resolution & frame rate demands?

Post by Manimal 5000 » 01 Apr 2016, 19:25

For VR freaks,

If the Vive/Rift run at 90Hz (90fps per eye), does that mean the GPU outputs 180fps? (one viewpoint per eye)
That would explain people getting twice the frame rate in 2D, and the emphasis on GPU over CPU in VR performance tests.

And if the resolution is 1200x1080 per eye, does that mean an actual 2400x2160? (again, each eye being different)
What's the perceived resolution? I guess 1200x1080, unless they "interlaced" the left/right eyes, if you know what I mean.
That would double it? Do they do that? Don't see why not!

2400x2160 @180hz would be whoaaa lots of horsepower.

User avatar
sharknice
Posts: 295
Joined: 23 Dec 2013, 17:16
Location: Minnesota
Contact:

Re: Vive & Rift - actual resolution & frame rate demands?

Post by sharknice » 01 Apr 2016, 22:54

Nope
1200x1080 @ 90Hz x2 equates to 2400x1080 @ 90Hz; stereo doubles one dimension, not both.

aeliusg
Posts: 145
Joined: 08 Sep 2014, 08:03

Re: Vive & Rift - actual resolution & frame rate demands?

Post by aeliusg » 02 Apr 2016, 12:30

In the worst-case scenario, 3D requires the scene to be rendered twice, but 2400x2160 is in reality 4x the rendering, not 2x, and isn't what these headsets ask for. Perceived resolution is still only going to be 1080x1200 per eye (the per-eye figure often gets quoted with its dimensions swapped), which your brain pieces together into something looking like 2160x1200, the actual panel resolution of the Vive. Distortion from the optics, and the panel being an inch or two from your eyes, will make it look much worse than a desktop display of that resolution, however.
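Putting rough numbers on this, using the Vive's actual 1080x1200-per-eye panels at 90Hz (the variable names are mine, just a sanity check on the thread's math):

```python
# Sanity check on the resolution math discussed in this thread.
per_eye_w, per_eye_h, hz = 1080, 1200, 90      # Vive: one eye's panel half, 90 Hz

per_eye_px = per_eye_w * per_eye_h             # 1,296,000 pixels per eye
stereo_px = per_eye_px * 2                     # both eyes: 2,592,000 (2160x1200)
doubled_both = (per_eye_w * 2) * (per_eye_h * 2)  # the "double both dimensions" mistake

print(stereo_px / per_eye_px)     # 2.0 -> stereo is 2x the rendering work
print(doubled_both / per_eye_px)  # 4.0 -> doubling BOTH dimensions quadruples it

# Pixel throughput at 90 Hz, both eyes rendered every frame:
print(stereo_px * hz)             # 233280000 pixels/second
```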

Manimal 5000
Posts: 53
Joined: 05 Jul 2015, 21:01

Re: Vive & Rift - actual resolution & frame rate demands?

Post by Manimal 5000 » 03 Apr 2016, 00:52

Thanks guys, my math was wrong...

RLBURNSIDE
Posts: 104
Joined: 06 Apr 2015, 16:09

Re: Vive & Rift - actual resolution & frame rate demands?

Post by RLBURNSIDE » 03 Apr 2016, 09:34

One really good thing about VR is that SLI will become useful again, without any programmer effort (just tick a checkbox in the driver and it will split the work into two perspectives).

AFR (alternate frame rendering) SLI is really complex work; I've implemented it in some big games, and I'm glad it's going to be fully automated now. I might even build my next PC with SLI in mind.

Manimal 5000
Posts: 53
Joined: 05 Jul 2015, 21:01

Re: Vive & Rift - actual resolution & frame rate demands?

Post by Manimal 5000 » 03 Apr 2016, 09:50

RLBURNSIDE wrote:One really good thing about VR is that SLI will become useful again, without any programmer effort (just tick a checkbox in the driver and it will split the work into two perspectives).

Is VR currently not taking advantage of SLI? It seems like a perfect match, but people running Valve's performance test were saying SLI wasn't helping them; it only used one card or something. I did take the test, btw, and failed miserably with my single 7870, of course.

RLBURNSIDE
Posts: 104
Joined: 06 Apr 2015, 16:09

Re: Vive & Rift - actual resolution & frame rate demands?

Post by RLBURNSIDE » 03 Apr 2016, 10:33

From what I understand, having read some articles about LiquidVR and NVIDIA's work, the auto-SLI/Crossfire path is currently implemented only on the high-end cards and only on OpenGL. But that will surely change.

I think if I do get an SLI setup, it will be a dual-GPU card instead of two cards. That lets me focus on a mini-ITX form factor; I'm no longer a fan of huge cases. Dual GPU just makes so much sense for VR.

Manimal 5000
Posts: 53
Joined: 05 Jul 2015, 21:01

Re: Vive & Rift - actual resolution & frame rate demands?

Post by Manimal 5000 » 03 Apr 2016, 13:14

Sadly, it looks like no SLI / CF yet, since game programmers are the ones not supporting it.

https://forums.oculus.com/community/dis ... /32096/sli

RLBURNSIDE
Posts: 104
Joined: 06 Apr 2015, 16:09

Re: Vive & Rift - actual resolution & frame rate demands?

Post by RLBURNSIDE » 08 Apr 2016, 16:07

The main place SLI or Crossfire could add latency in a VR scenario is recombining the left and right eyes into a single frame for compositing into the Oculus framework. So it depends, I think, on the sync time between GPUs, and it can be optimised at the card level.

What I'm saying is, let's ignore Oculus for a second and talk about SBS 3D stereo rendering (one frame: the left half is the left eye, the right half is the right eye). From what I understand about LiquidVR, it can do stereo 3D with dual GPUs at the driver level: the game internally renders one "middle eye", then the GPUs offset the view-projection matrices and render it twice, and each result is recombined at half horizontal resolution into a single frame buffer.
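A minimal sketch of that "middle eye plus per-eye offset" idea (the 64 mm IPD and all names here are my own illustration, not LiquidVR's API): each eye's view matrix is the middle view translated by half the interpupillary distance along the view-space x axis.

```python
import numpy as np

IPD = 0.064  # interpupillary distance in metres (illustrative value)

def eye_view(middle_view, eye):
    """Derive a per-eye view matrix from a 'middle eye' view matrix.

    eye = -1 for the left eye, +1 for the right eye.
    View matrices transform world -> eye space, so moving the camera
    right by d means translating everything it sees left by d: we put
    the negated offset in the matrix's x-translation slot.
    """
    offset = np.eye(4)
    offset[0, 3] = -eye * IPD / 2.0  # translate along the view-space x axis
    return offset @ middle_view

middle = np.eye(4)  # camera at the origin, looking down -z
left, right = eye_view(middle, -1), eye_view(middle, +1)

# A world-space point straight ahead lands at a different x per eye:
p = np.array([0.0, 0.0, -2.0, 1.0])
print((left @ p)[0], (right @ p)[0])  # 0.032 -0.032
```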

If this can be done without adding much latency, and I believe it can, since SLI bridge chips or even PCI-E transfers are quite speedy (you only have to transfer the right eye from the right GPU into the left GPU's frame buffer, which should be fast), then the Oculus Rift SDK can consume that frame buffer just as well as one from a single GPU, without any code changes. It's possible the Oculus SDK would need to query the system to see if a special "dual-GPU driver-level split rendering" mode is enabled, then call a sync command.

The way the Rift SDK works right now, the game engine literally re-renders the entire scene twice, with slightly different view-projection matrices per eye, and sends that single SBS stereo 3D frame buffer to the compositing engine for the lens-skew + chromatic-aberration shader step. However, there is NO reason this step can't be done in hardware, or as a post-processing step inserted by the graphics card driver (when the Rift is the output display device).
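That lens-skew step is essentially a radial remap of each eye's image. A toy sketch of the idea (the coefficients and function name are made up for illustration, not the Rift's calibrated distortion data):

```python
def barrel_distort(u, v, k1=0.22, k2=0.24):
    """Toy barrel-distortion remap in lens-centred coordinates.

    (u, v) are offsets from the lens centre, roughly in [-1, 1]. The
    compositor samples the rendered eye buffer at the remapped coordinate
    so the image looks straight through the headset's pincushion lenses.
    k1/k2 here are illustrative, not real calibration values.
    """
    r2 = u * u + v * v
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return u * scale, v * scale

# The lens centre is unmoved; samples get pushed outward with radius.
print(barrel_distort(0.0, 0.0))  # (0.0, 0.0)
print(barrel_distort(0.5, 0.0)[0] > 0.5)  # True
```

Chromatic aberration correction is the same remap run per colour channel with slightly different coefficients, which is why the two corrections share one shader pass.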

*Disclaimer: I am a VR developer working on the Rift in my day job, and I have coded SLI in games before. SLI coding is rarely done because it's a small market segment. I hope that changes, but honestly, I think the best way for multi-GPU to take off again is for the driver manufacturers to figure out a way to render the entire scene twice at the driver level and take it out of the hands of developers. Intercepting and overriding the view matrices and altering them for the left/right eyes is not that hard; there is already plenty of precedent for end users overriding game settings like AA or internal resolution.

Post Reply