I've never used SLI with my G-SYNC setup, but pretty much everything I've read has said it does not work very well. That said, I haven't read many complaints lately, so are things better these days? I am eyeing the new Acer XB280HK, but I would need a second 780 before I even think about getting it.
What are your guys' experiences lately?
State of SLI + G-SYNC
Re: State of SLI + G-SYNC
Hi.
I have two GTX 780s in SLI and an Asus Swift, and have had them for about a month. No problems.
- GameLifter
- Posts: 104
- Joined: 25 May 2014, 13:47
Re: State of SLI + G-SYNC
I'm running two GTX 680s, and I had some stuttering issues when I first got my monitor, but since getting the 344.11 drivers the stuttering issues have disappeared. I think you'll be fine.
Re: State of SLI + G-SYNC
now the question is... did they actually improve the stuttering or did they just add more buffering (and input lag)
btw does anyone know how well split-frame rendering works in terms of fps, stuttering, and input lag?
- GameLifter
- Posts: 104
- Joined: 25 May 2014, 13:47
Re: State of SLI + G-SYNC
flood wrote:now the question is... did they actually improve the stuttering or did they just add more buffering (and input lag)
btw does anyone know how well split-frame rendering works in terms of fps, stuttering, and input lag?
Honestly, I couldn't really tell when playing Crysis 3. If there is added input lag, it wasn't very noticeable. Doesn't SLI always add a little input lag?
Re: State of SLI + G-SYNC
well, in an ideal world where fps isn't artificially capped and is completely limited by gpu speed:
100fps single gpu lag = 100fps SFR SLI < 100fps 2xAFR SLI < 50fps single gpu
single gpu / SFR SLI at 100fps would have between 10-20 ms input lag
100fps 2xAFR SLI with no microstuttering would have 20-30 ms input lag
50fps single gpu would have 20-40 ms input lag
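flood's numbers above follow from a simple back-of-envelope model: baseline input lag is roughly 1-2 frame times, and AFR adds about one extra frame of delay because frames are queued across the two GPUs. A small sketch of that arithmetic (the 1-2 frame baseline and the one-frame AFR penalty are assumptions for illustration, not measurements):

```python
def frame_time_ms(fps):
    """Duration of one frame in milliseconds."""
    return 1000.0 / fps

def lag_range_ms(fps, extra_frames=0):
    """Estimated (min, max) input lag in ms: 1-2 frame times of
    baseline lag, plus any extra whole frames of queueing
    (e.g. one extra frame for 2-GPU AFR)."""
    ft = frame_time_ms(fps)
    return ((1 + extra_frames) * ft, (2 + extra_frames) * ft)

print("100fps single GPU / SFR:", lag_range_ms(100))    # (10.0, 20.0)
print("100fps 2xAFR SLI:      ", lag_range_ms(100, 1))  # (20.0, 30.0)
print("50fps single GPU:      ", lag_range_ms(50))      # (20.0, 40.0)
```

This reproduces the ranking in the post: SFR SLI at 100fps matches a single GPU at 100fps, AFR at 100fps sits a frame behind, and a single GPU at 50fps is worst because each frame itself takes 20 ms.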
- GameLifter
- Posts: 104
- Joined: 25 May 2014, 13:47
Re: State of SLI + G-SYNC
I think I get it. So basically a single gpu is the way to go for less input lag at the lower fps ranges?
Re: State of SLI + G-SYNC
GameLifter wrote:I think I get it. So basically single gpu is the way to go to get less input lag at the lower fps ranges?
SLI has lots of positives, like massive performance and ... stuff.
But it's a general rule that a single gpu driving a single-tile, single monitor gives the cleanest gaming performance.
Whenever you add more monitors or gpus, there are always lots of new problems, issues, whatever. ALWAYS.
But it's no sin to deal with all that anyway and build multiple lightboost monitor setups and whatnot...
Re: State of SLI + G-SYNC
note the "ideal world" in my post
in practice sli may be a frame or two more laggy than what i wrote... who knows.
- GameLifter
- Posts: 104
- Joined: 25 May 2014, 13:47
Re: State of SLI + G-SYNC
Edmond wrote:SLI has lots of positives, like massive performance and ... stuff.
But its a general rule to have a single gpu connected to a single tiled, single monitor for the cleanest gaming performance.
Whenever you add more monitors or gpus, there are always lots of new problems, issues, whatever. ALWAYS.
But its no sin to deal with all that anyway and make multiple lightboost monitor setups and what not...
Yeah, SLI is great when used with the right game, but like you said it can cause issues. Some games outright don't support it or don't fully utilize it. I plan to get a factory-overclocked 980 Ti when it's released, and I may even forget about SLI altogether, but we'll see.