Blur Buster's G-SYNC 101 Series Discussion

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutter and tearing, and reduces input lag. List of G-SYNC Monitors.
User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by Chief Blur Buster » 29 Jan 2018, 16:49

KKNDT wrote:One question: Why does G-SYNC have more latency than V-SYNC OFF even within G-SYNC's range, especially when running at a lower refresh rate?
Sparky wrote:Possibly because it's a 240Hz monitor being run at 60Hz, so there may be frame doubling happening. IIRC there really isn't any appreciable difference at 238fps/240Hz.
There's only a slight difference at 240Hz compared to 60Hz, and it's because of scanout latency. VSYNC OFF is scanout-interrupting, while GSYNC gives contiguous top-to-bottom scans. Top-edge lag would be identical, but bottom-edge lag would be less with VSYNC OFF than with GSYNC or VSYNC ON.

This is because VSYNC OFF can interrupt mid-refresh cycle, as follows:

[Image: G-SYNC scanout diagram]

From Page 6 of Jorimt's G-SYNC 101.
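If it helps, here's a minimal Python sketch of that geometry (my illustration, not from the article; the 240Hz scanout time and the mid-screen tear position are assumed for the example):

Code: Select all

# Sketch: why bottom-edge lag differs between G-SYNC and VSYNC OFF.
# G-SYNC always delivers a fresh contiguous top-to-bottom scanout, so a pixel
# near the bottom waits almost a full scanout time. VSYNC OFF can splice the
# new frame into the *current* scanout at the beam position (a tearline),
# so pixels below the tearline get the new frame sooner.

def gsync_pixel_lag_ms(y, scanout_ms):
    # y: vertical position, 0.0 = top edge, 1.0 = bottom edge
    return y * scanout_ms  # fresh scanout always starts at the top

def vsync_off_pixel_lag_ms(y, beam, scanout_ms):
    # beam: where the scanout currently is when the new frame arrives
    if y >= beam:
        return (y - beam) * scanout_ms    # below the tearline: this scanout
    return (1.0 - beam + y) * scanout_ms  # above it: wait for the next pass

scanout = 1000.0 / 240  # ~4.17 ms per scanout at 240Hz
print(gsync_pixel_lag_ms(1.0, scanout))           # bottom edge, G-SYNC: ~4.17 ms
print(vsync_off_pixel_lag_ms(1.0, 0.5, scanout))  # bottom edge, mid-screen tear: ~2.08 ms
print(gsync_pixel_lag_ms(0.0, scanout))           # top edge, G-SYNC: 0.0 ms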
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


KKNDT
Posts: 51
Joined: 01 Jan 2018, 08:56

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by KKNDT » 04 Feb 2018, 07:41

Chief Blur Buster wrote:
KKNDT wrote:One question: Why does G-SYNC have more latency than V-SYNC OFF even within G-SYNC's range, especially when running at a lower refresh rate?
Sparky wrote:Possibly because it's a 240Hz monitor being run at 60Hz, so there may be frame doubling happening. IIRC there really isn't any appreciable difference at 238fps/240Hz.
There's only a slight difference at 240Hz compared to 60Hz, and it's because of scanout latency. VSYNC OFF is scanout-interrupting, while GSYNC gives contiguous top-to-bottom scans. Top-edge lag would be identical, but bottom-edge lag would be less with VSYNC OFF than with GSYNC or VSYNC ON.

This is because VSYNC OFF can interrupt mid-refresh cycle, as follows:

[Image: G-SYNC scanout diagram]

From Page 6 of Jorimt's G-SYNC 101.

Sorry chief, I can't quite understand the G-SYNC scanout diagram you posted here. Can you help me figure out where my understanding is wrong?

1. The scanout time of each frame slice is roughly equal to the rendering time of the next frame.

2. When G-SYNC works perfectly, the rendering time of each frame should be longer than the display's min refresh cycle. In this case, no extra lag is introduced over V-SYNC OFF in general. A new frame may appear from the middle of the screen with V-SYNC OFF, but at the same time it will appear from the top of the screen with G-SYNC. The lag differential of a given area between the two screens is random, depending on the position where the new frame begins to be scanned out.

3. Even if you use an in-game FPS limiter, a few frames' rendering times may be shorter than the display's min refresh cycle, which will cause G-SYNC to have more lag. In this case, the scanout diagram will look like the one you posted here?

User avatar
RealNC
Site Admin
Posts: 3757
Joined: 24 Dec 2013, 18:32
Contact:

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by RealNC » 04 Feb 2018, 08:55

KKNDT wrote:1. The scanout time of each frame slice is roughly equal to the rendering time of the next frame.
Yes. With g-sync off + vsync off you get slices, and the current frame is scanned out until replaced by the next frame.
2. When G-SYNC works perfectly, the rendering time of each frame should be longer than the display's min refresh cycle. In this case, no extra lag is introduced over V-SYNC OFF in general. A new frame may appear from the middle of the screen with V-SYNC OFF, but at the same time it will appear from the top of the screen with G-SYNC. The lag differential of a given area between the two screens is random, depending on the position where the new frame begins to be scanned out.
Correct. In both cases, the frame will start to become visible at the same time, except that with g-sync it will start to appear at the top of the screen, whereas without g-sync it will start to appear at a random position. And that's of course because g-sync is able to force the monitor to go back to the top of the screen and start there, while without g-sync there's no way to do that.
3. Even if you use an in-game FPS limiter, a few frames' rendering times may be shorter than the display's min refresh cycle, which will cause G-SYNC to have more lag. In this case, the scanout diagram will look like the one you posted here?
Yes. It would introduce some lag, equal to the scanout time remaining between the current scanout position and the bottom of the screen.

Fortunately, capping the frame rate has input lag benefits, so this extra lag is usually completely negated by the FPS cap's latency reduction. This means that in many cases, a 142FPS cap + g-sync ends up having less input lag than 200FPS uncapped without g-sync. This depends on the game, but in my experience, the majority of games show this effect, especially with in-game limiters. Keeping the GPU from being completely saturated can result in a latency reduction of 2 or even 3 frames when using an in-game limiter, and 1 or 2 frames when using RTSS.

But I don't know if this also applies to CPU-bound games, like CS:GO. It's difficult to test this game because of the very high frame rates it can achieve, and there's no setting that will increase CPU load so much as to drop into the 40FPS range. When making it GPU-bound (playing at 5K, for example, with everything on highest settings), an FPS limit also lowers latency in CS:GO. But of course, for most people the game is CPU-bound.
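To illustrate with a toy model (my own sketch; the queue depth of 3 and the frame rates are assumptions for illustration, not measurements): a saturated GPU keeps the pre-render queue full, so a sampled input waits several whole frametimes, while a cap keeps the queue empty.

Code: Select all

# Toy pipeline-latency model: a saturated GPU keeps queued_frames pre-rendered
# frames waiting, so a sampled input is roughly (queued_frames + 1) frametimes
# old by the time its frame reaches the display.

def pipeline_latency_ms(fps, queued_frames):
    frametime_ms = 1000.0 / fps
    return (queued_frames + 1) * frametime_ms

print(pipeline_latency_ms(200, 3))  # 200fps uncapped, full queue: 20.0 ms
print(pipeline_latency_ms(142, 0))  # 142fps capped, empty queue: ~7.04 ms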
Steam · GitHub · Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

KKNDT
Posts: 51
Joined: 01 Jan 2018, 08:56

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by KKNDT » 05 Feb 2018, 09:04

RealNC wrote:
3. Even if you use an in-game FPS limiter, a few frames' rendering times may be shorter than the display's min refresh cycle, which will cause G-SYNC to have more lag. In this case, the scanout diagram will look like the one you posted here?
Yes. It would introduce some lag, equal to the scanout time remaining between the current scanout position and the bottom of the screen.
Thank you for the reply. Could you explain to me why G-SYNC+V-SYNC OFF has more lag than V-SYNC OFF?

User avatar
RealNC
Site Admin
Posts: 3757
Joined: 24 Dec 2013, 18:32
Contact:

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by RealNC » 05 Feb 2018, 09:31

KKNDT wrote:
RealNC wrote:
3. Even if you use an in-game FPS limiter, a few frames' rendering times may be shorter than the display's min refresh cycle, which will cause G-SYNC to have more lag. In this case, the scanout diagram will look like the one you posted here?
Yes. It would introduce some lag, equal to the scanout time remaining between the current scanout position and the bottom of the screen.
Thank you for the reply. Could you explain to me why G-SYNC+V-SYNC OFF has more lag than V-SYNC OFF?
Because with g-sync the frame needs to be delayed until the monitor is ready to scan it out. A 144Hz monitor has a minimum cycle of 6.94ms. If a frame renders in, say, 6ms, then it needs to be held back for another 0.94ms before gsync can refresh the monitor again. With gsync off + vsync off, the frame will be scanned out immediately and thus a tear line will appear.
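The hold-back is just the unfinished part of the minimum refresh cycle. A minimal sketch using the numbers above (my illustration):

Code: Select all

# G-SYNC + V-SYNC: a frame that arrives before the monitor's minimum refresh
# cycle has elapsed is held back for the remainder of that cycle.

def holdback_ms(frametime_ms, max_hz):
    min_cycle_ms = 1000.0 / max_hz
    return max(0.0, min_cycle_ms - frametime_ms)

print(holdback_ms(6.0, 144))  # 6 ms frame at 144Hz: held ~0.94 ms
print(holdback_ms(8.0, 144))  # slower than the min cycle: 0.0 ms, no wait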

This is one of the reasons why capping to 2FPS lower than refresh rate is recommended. The game will not try and send frames too fast. But if it does anyway, then g-sync will hold back the frame just long enough to avoid tearing. The latency impact of this is virtually zero, unless the frame limiter is very inaccurate and can't keep a steady pace. Limiters are generally accurate within 2FPS on average, so a 2FPS or 3FPS lower than Hz setting is almost always a good setting.

Very slow frames will actually have the same effect, btw. Because if a frame takes too long, gsync will refresh the screen again at some point, using the previous frame. (It needs to, because otherwise the screen might flicker, or pixels might even get damaged if not refreshed for too long.) So if the new frame arrives just after that, then it also needs to be delayed until the monitor is ready again.

So overall, this frame time compensation method of g-sync does not actually increase input lag. Yes, fast frames, or very slow frames, need to be delayed, but they don't impact the average input lag. On average, you are going to get the input lag that is expected by the target FPS you've set the limiter to. These outlier frames will only have an impact of less than 1ms, so they don't matter.

If you get these kinds of outlier frames so frequently that it would impact the average input lag, then input lag would be the least of your worries anyway. You'd get way too much micro-stutter for the game to be playable.

However, note that you can actually disable this frame time compensation mechanism. It's controlled by the vsync setting. If you enable g-sync but set vsync to off, then gsync will not attempt to do what I just described above. It will simply tear. But given that this mechanism doesn't actually increase input lag, this is a really bad setting to use, IMO.
Steam · GitHub · Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

KKNDT
Posts: 51
Joined: 01 Jan 2018, 08:56

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by KKNDT » 06 Feb 2018, 07:51

RealNC wrote:
KKNDT wrote:
RealNC wrote:
3. Even if you use an in-game FPS limiter, a few frames' rendering times may be shorter than the display's min refresh cycle, which will cause G-SYNC to have more lag. In this case, the scanout diagram will look like the one you posted here?
Yes. It would introduce some lag, equal to the scanout time remaining between the current scanout position and the bottom of the screen.
Thank you for the reply. Could you explain to me why G-SYNC+V-SYNC OFF has more lag than V-SYNC OFF?
Because with g-sync the frame needs to be delayed until the monitor is ready to scan it out. A 144Hz monitor has a minimum cycle of 6.94ms. If a frame renders in, say, 6ms, then it needs to be held back for another 0.94ms before gsync can refresh the monitor again. With gsync off + vsync off, the frame will be scanned out immediately and thus a tear line will appear.

This is one of the reasons why capping to 2FPS lower than refresh rate is recommended. The game will not try and send frames too fast. But if it does anyway, then g-sync will hold back the frame just long enough to avoid tearing. The latency impact of this is virtually zero, unless the frame limiter is very inaccurate and can't keep a steady pace. Limiters are generally accurate within 2FPS on average, so a 2FPS or 3FPS lower than Hz setting is almost always a good setting.

Very slow frames will actually have the same effect, btw. Because if a frame takes too long, gsync will refresh the screen again at some point, using the previous frame. (It needs to, because otherwise the screen might flicker, or pixels might even get damaged if not refreshed for too long.) So if the new frame arrives just after that, then it also needs to be delayed until the monitor is ready again.

So overall, this frame time compensation method of g-sync does not actually increase input lag. Yes, fast frames, or very slow frames, need to be delayed, but they don't impact the average input lag. On average, you are going to get the input lag that is expected by the target FPS you've set the limiter to. These outlier frames will only have an impact of less than 1ms, so they don't matter.

If you get these kinds of outlier frames so frequently that it would impact the average input lag, then input lag would be the least of your worries anyway. You'd get way too much micro-stutter for the game to be playable.

However, note that you can actually disable this frame time compensation mechanism. It's controlled by the vsync setting. If you enable g-sync but set vsync to off, then gsync will not attempt to do what I just described above. It will simply tear. But given that this mechanism doesn't actually increase input lag, this is a really bad setting to use, IMO.
Thank you again for the explanation.

From your post, I understand:
1. Why G-SYNC+V-SYNC tends to have more lag than V-SYNC OFF/(G-SYNC+V-SYNC OFF)
2. How fast/slow frames lead to extra latency
3. The bigger lag numbers from capped G-SYNC+V-SYNC usually do not matter at all

But I'm not complaining that capped G-SYNC brings more latency and influences the gaming experience. I just wonder what, technically, causes the (G-SYNC+V-SYNC OFF) numbers to be bigger than (V-SYNC OFF) in Jorimt's G-SYNC 101 review, since you agree that G-SYNC+V-SYNC OFF allows tearing to appear, which means a frame can be scanned out as soon as it finishes rendering.

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by Chief Blur Buster » 06 Feb 2018, 12:04

KKNDT wrote:From your post, I understand:
1. Why G-SYNC+V-SYNC tends to have more lag than V-SYNC OFF/(G-SYNC+V-SYNC OFF)
If the frame cap software is perfectly accurate, there is absolutely no difference.

As long as all frametimes never become shorter than a refresh cycle, the max Hz is never reached, and VSYNC ON behavior never occurs, nor does VSYNC OFF behavior. Those only happen when a frame tries to begin delivering before the monitor is finished with its previous refresh cycle.

As long as the cap is perfect, always below max Hz, big whoop. VSYNC ON never occurs. VSYNC OFF never occurs. So the lag differences never occur.

But in the real world, caps are imperfect. Game framerates are imperfect. Caps and framerates fluctuate a bit.

That's why we need to cap a few frames per second below max Hz because the frametimes will vary a bit. The more accurate the cap is, the tighter the limit can be. RTSS can do extremely accurate frame rate capping, while in-game frame rate capping (usually) is less accurate. However, in-game capping has less lag.

You have to trade off the extra lag of more accurate capping (e.g. RTSS) against the extra lag of very erratic capping (which creates transient sub-refresh-cycle frametimes that can create lag against max Hz during GSYNC + VSYNC ON).
KKNDT wrote:2. How fast/slow frames lead to extra latency
Only transiently, for those specific 'fast' frames.
KKNDT wrote:3. The bigger lag numbers from capped G-SYNC+V-SYNC usually do not matter at all
There's no difference if you're using an accurate cap that never causes a single frametime to be faster than a monitor refresh cycle. It only happens for those specific frames that render faster than a refresh cycle.

A frame may take 1/143sec, then the next frame may take 1/145sec. One of those two is faster than a 144Hz refresh cycle. It'll be delayed by roughly (1/144) - (1/145) ≈ 48 microseconds, for that "too-fast" frame now being forced to wait for a refresh cycle.

Frame cap inaccuracy encroaching on VRR max Hz
Now if your framerate cap inaccuracy is about 5%, your cap of 140fps may cause frametimes to vary from 0.95/140sec through 1.05/140sec. As frametimes fluctuate, the fastest of those, 0.95/140sec = 6.78 milliseconds, is faster than the monitor's minimum refresh cycle of 1/144sec = 6.94 milliseconds. So you've got a forced wait of (6.94 - 6.78) = 0.16 milliseconds = 160 microseconds for that too-fast frame during VSYNC ON + GSYNC. Those are really tiny numbers!

Frame cap so accurate it never hits VRR max Hz
Now, if your framerate cap inaccuracy was only 1% for a 140fps cap, your frametimes will fluctuate between 0.99/140sec and 1.01/140sec. The fastest frame, 0.99/140sec = 7.07ms, is still slightly slower than the monitor's minimum refresh cycle of 1/144sec (6.94ms), so no frame ever has to wait.

Mathematics theory of determining perfect frame rate caps for a VRR display
In order to do this, you need to experimentally figure out how inaccurate your frame cap is (this is extremely hard to do, but you can output frametime values to an Excel spreadsheet and run math formulas to figure out how much they vary). Assuming your inaccuracy is 1% (it is very, very hard to get frame capping that accurate), your fastest frames are 1% shorter than the cap's frametime, so you want a cap of 144Hz * 0.99 = 142.56fps. But if your frame capping is inaccurate by 5%, your perfect cap might be 144Hz * 0.95 = a frame cap of 136.8fps. So you see, the more accurate your frame capper is, the closer you can get to VRR max without those "fast frames" creating lag. On the other hand, "fast frames" tend to create only microseconds of input lag for properly-implemented VRR algorithms (preferably without a polling granularity).
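As a sketch of that math (my illustration; "inaccuracy" here is the limiter's fractional frametime variance):

Code: Select all

# Highest cap whose fastest frames still never beat the min refresh cycle:
# fastest frametime = (1 - inaccuracy) / cap_fps, require >= 1 / max_hz,
# therefore cap_fps <= max_hz * (1 - inaccuracy).

def safe_cap_fps(max_hz, inaccuracy):
    return max_hz * (1.0 - inaccuracy)

def worst_forced_wait_ms(cap_fps, inaccuracy, max_hz):
    fastest_frame_ms = (1.0 - inaccuracy) * 1000.0 / cap_fps
    min_cycle_ms = 1000.0 / max_hz
    return max(0.0, min_cycle_ms - fastest_frame_ms)

print(safe_cap_fps(144, 0.01))               # 1%-accurate limiter: 142.56 fps
print(safe_cap_fps(144, 0.05))               # 5%-accurate limiter: 136.8 fps
print(worst_forced_wait_ms(140, 0.05, 144))  # 140fps cap, 5% jitter: ~0.16 ms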

Lower refresh rates mean less frame rate capping error percentage -- theoretically accepting tighter frame rate caps
Some software, such as well-written 8-bit emulators, manages to pull off a 60fps frame cap with exactly that kind of frame-pacing accuracy. Also, lower refresh rates have larger refresh cycle times, so frame capping inaccuracy matters much less at lower refresh rates. This can work in your favour. Many emulator users have noticed that VRR monitors reduce the input lag of emulators (instead of using 60Hz, you simply play 60fps at 144Hz or 240Hz -- getting the VSYNC ON experience without the input lag of VSYNC ON). But some FreeSync monitors only go up to 60Hz, and there are 4K G-SYNC monitors that only go up to 60Hz too. For these displays, you could theoretically slightly overclock your 60Hz VRR monitor to approximately 60.5Hz -- and use a 60fps cap to reduce emulator lag without needing a 75Hz+ VRR monitor. As an alternative, one can slow the emulator down by 1% -- e.g. make the emulator run 1% slower at ~59.4fps (sounds will lower in pitch by 1%, game motion slows down by 1%) and use a matching ~59.4fps frame cap with a 60Hz-limited VRR display -- to reduce emulator lag when you are not using a high-Hz VRR display. The low Hz and the high capping accuracy make this doable, if one has a VRR display limited to 60Hz.
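A sketch of that 60Hz-limited VRR workaround (my illustration; the 1% capping accuracy is an assumption):

Code: Select all

# On a VRR display limited to 60Hz, slow the emulator (and its cap) down just
# enough that even its fastest frames never dip below the 60Hz minimum cycle.

def slowed_emulator_fps(vrr_max_hz, cap_inaccuracy):
    return vrr_max_hz * (1.0 - cap_inaccuracy)

fps = slowed_emulator_fps(60, 0.01)
print(fps)                     # run (and cap) the emulator at 59.4 fps
print(100.0 * (1 - fps / 60))  # game speed / audio pitch drop: ~1.0%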

Factors beyond control: Granularity effects
Depending on the VRR drivers and VRR monitor technology, there may be some granularity that forces additional delay above and beyond this. In the best possible implementations of VRR, the frame begins to be delivered immediately upon the new refresh cycle. However, if a polling mechanism is used by a specific VRR tech (e.g. 1ms polling), there might be some lag granularity added. It is the graphics card's responsibility to begin delivering the new refresh cycle as quickly as possible after the last refresh cycle -- preferably within microseconds rather than a millisecond.

Microstutter not completely filtered by VRR
1ms errors in VRR refresh-cycle delivery (relative to frametime/gametime) can create visible microstutter, so monitor manufacturers should not induce any random latency between game-time and refresh-cycle-visibility-time. A 1ms refresh-cycle delivery timing error during an 8000 pixels/sec flick turn is an 8-pixel-amplitude microstutter (8000px/s * 0.001s = 8px), so milliseconds matter in gametime-vs-visibility-time inaccuracies. Verifying this requires a high speed camera of sufficient precision (e.g. Phantom Flex) synchronized against gametime stamps. Also, because of varying GPU rendertime, gametimes are not always perfectly in sync with photons hitting the human eyes. So not all capping techniques may 100% eliminate stutter, depending on how the cap affects the sync of gametime-vs-display-refresh visibility.
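The microstutter arithmetic above, as a one-line sketch:

Code: Select all

# A refresh-cycle delivery timing error displaces a moving image by
# (pan speed) * (timing error) pixels -- the microstutter amplitude.

def microstutter_amplitude_px(pan_speed_px_per_s, timing_error_s):
    return pan_speed_px_per_s * timing_error_s

print(microstutter_amplitude_px(8000, 0.001))  # 8000 px/s flick, 1 ms error: 8 px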
KKNDT wrote:But I'm not complaining that capped G-SYNC brings more latency and influences the gaming experience. I just wonder what, technically, causes the (G-SYNC+V-SYNC OFF) numbers to be bigger than (V-SYNC OFF) in Jorimt's G-SYNC 101 review, since you agree that G-SYNC+V-SYNC OFF allows tearing to appear, which means a frame can be scanned out as soon as it finishes rendering.
Depends on the capping algorithm: in theory, if the frame rate capping is sufficiently accurate and you have enough capping headroom, tearing never appears. In the real world, not everything is perfect, alas. But as you can see, delays on early frames tend to be extremely minor -- often the lesser evil compared to a lower cap or using VSYNC OFF.

Once you add a few frames per second of headroom to the cap, the occasionally delayed early frames become a nonissue. If you see only 1 tearline per second during GSYNC+VSYNC OFF (only one early frame per second), then using VSYNC ON only delays that 1 specific refresh cycle (often by far less than 1ms) for GSYNC+VSYNC ON.

With the 40-run averages in Jorimt's tests, this really shows what a non-issue the rare (occasional) tearlines are -- they often appear only briefly near the bottom edge of the screen.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


KKNDT
Posts: 51
Joined: 01 Jan 2018, 08:56

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by KKNDT » 07 Feb 2018, 10:36

If I cap the FPS to a very low number compared to the monitor's max Hz, say approx. 60fps@144Hz, will the lag numbers be almost the same between V-SYNC OFF and G-SYNC?

Chief Blur Buster wrote:Lower refresh rates mean less frame rate capping error percentage -- theoretically accepting tighter frame rate caps
But chief, from your test, I see V-SYNC OFF shows a very noticeable lag advantage over G-SYNC at 58fps@60Hz. What is to blame? Cap inaccuracy?

What makes me more confused is that the main difference is in the MIN and AVG lag numbers. I was thinking that G-SYNC should have a very similar MIN number to V-SYNC OFF.

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by Chief Blur Buster » 07 Feb 2018, 12:07

You're comparing:
G-SYNC 58fps@60Hz versus VSYNC OFF 58fps@60Hz

I was comparing:
G-SYNC 58fps@60Hz versus uncapped (VSYNC ON 60fps lag) -- a bigger lag difference than above.

These two comparisons have different lag mechanics. There are lots of superimposed, overlapping lag-affecting variables, including GPU rendering time, full-screen scanout latency (for VSYNC ON and VRR -- essentially a full-screen lag gradient) and frameslice scanout latency (for VSYNC OFF -- essentially a frameslice lag gradient). These are horrendously complex interactions, as you can imagine.

Now... I believe you were referring to the image below (which is a different lag-versus topic):

In Jorimt's chart:
[Image: Jorimt's 60Hz input lag chart]

Observe that MAX is little different (46 versus 44), but MIN is dramatically different (30 vs 19), because of the scanout at 60Hz. G-SYNC can't interrupt scanout, so there's always more lag for the bottom part of the screen than with VSYNC OFF. The penalty difference between the two depends on where VSYNC OFF interrupts the scanout (essentially splicing a different, fresher frame mid-scanout).

Now, when using GSYNC instead of VSYNC OFF, that scanout lag penalty is simply mitigated by using a higher Hz instead, which reduces the top/bottom scanout penalty from 1/60sec = 16.7ms to 1/240sec = 4.2ms. That, by itself, is a big lag reduction. So 240Hz VRR solves a lot of the scanout lag penalty imposed by VRR over VSYNC OFF.

The average scanout lag penalty is half a refresh cycle for a random position on the screen (top edge: no difference between perfectly-timed VRR and VSYNC OFF; bottom edge: +1 full refresh cycle; middle of screen: +half a refresh cycle). So that's a roughly +2ms average scanout penalty for 240Hz VRR versus the ability to interrupt scanout with a new frame (aka 240Hz VSYNC OFF) -- and that shows really clearly during the 238fps@240Hz tests.

So the difference between GSYNC 58fps@60Hz and VSYNC OFF 58fps@60Hz is very easily explained by the lack of scanout lag for VSYNC OFF, while G-SYNC has a mandatory scanout lag that correspondingly increases towards the bottom edge of the screen. This affects the MIN numbers more than the MAX numbers -- which is exactly the pattern observed (for this specific versus comparison). The scanout lag variable is one refresh cycle at the current max of the VRR range, so for a 30-240Hz VRR range, the scanout lag between top/bottom edge is 1/240sec whenever tearlines are not involved. (Tearlines behave as interrupted scanouts, lag-wise.)
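A sketch of that scanout arithmetic (my illustration):

Code: Select all

# VRR scans every frame top-to-bottom over one max-Hz refresh cycle, so vs. a
# scanout-interrupting tear (VSYNC OFF) the average extra lag for a random
# screen position is about half that cycle (0 at top edge, full at bottom).

def scanout_penalty_ms(vrr_max_hz):
    cycle_ms = 1000.0 / vrr_max_hz
    return cycle_ms, cycle_ms / 2.0  # (bottom-edge penalty, average penalty)

for hz in (60, 144, 240):
    full, avg = scanout_penalty_ms(hz)
    print(hz, round(full, 2), round(avg, 2))
# 60Hz:  16.67 ms full scan, ~8.33 ms average penalty
# 144Hz:  6.94 ms full scan, ~3.47 ms average penalty
# 240Hz:  4.17 ms full scan, ~2.08 ms average penalty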

Again, lag is a complex topic consisting of many superimposing interacting factors.

Anyway... we're getting off track. We're talking about different versus situations.
My earlier post covers practically a completely different lag-versus topic than this one.

Lessons To Learn:
-- Scanout lag
-- Polling lag (only if a specific VRR tech uses a polling mechanism)
-- Processing lag
-- GPU lag (Frametime lag / lag in frame delivery)
-- Capping cannot fix scanout lag (albeit it can interact, sometimes on microsecond timescales, sometimes on multi-millisecond timescales)
-- Scanout lag is reduced by higher-velocity scanout (scanning refresh cycles faster, top-to-bottom)
-- They all layer/superimpose/interact with each other
-- Some causes of lag overwhelm (outweigh) other causes of lag. Different causes of lag can be of different orders of magnitude.
Etc.

One needs to be very careful and unusually specific when talking about lag-versus situations. What you're trying to talk about is different from what I was trying to say in my previous post. This post and my previous post are still simultaneously correct, because they're essentially covering two different parts of the lag universe. Going further may be like trying to learn every single nuance of quantum mechanics when you're only trying to identify which common molecule a specific molecule is.

It's easy to simply measure lag.
It's much harder to explain what's causing the lag variances
(even if I understand a lot of the nuances already, like advanced VSYNC OFF frameslice lag gradients, and such).
Some of them are really easily explained, but they are still quite complicated interactions at the end of the day.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


xenphor
Posts: 69
Joined: 28 Feb 2018, 11:47

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by xenphor » 28 Feb 2018, 11:51

I've been looking at gsync for a long time because I can't stand tearing or stuttering. I have a few questions:

1. Am I right in thinking that gsync could be useful when paired with a low-end video card such as the 1050 Ti I have? I would be more than happy if I could get a fluid, stable image, even at a low frame rate. That way I don't have to worry about keeping up with increasingly expensive GPU hardware.

2. If true, how low can the framerate go and still receive the benefits of gsync? I see in the article that there is an operating range, but what is it like in practice? I would expect to see sub-30fps frame rates on a lower-end card in high-end games, which I am fine with, as long as the frames are presented uniformly, without tearing.

3. Even though I would mostly be dealing with fps in the ~30 range, I don't think I have a choice but to also buy a monitor with a high refresh rate, because that's how gsync is being sold. How would I set that up, knowing that I would most likely never approach the 100+ fps range in high-end games that those monitors support? Should I buy a 120Hz+ monitor but set the refresh rate to 60Hz, or does it matter?

4. What are peoples' opinions on freesync? Since I already have a 1050 Ti I would not use it, but just curious.

Post Reply