speancer wrote: ↑17 May 2020, 08:15
So I guess I'm best off with Low Latency Mode set to "On".
Typically yes, and Vega pretty much just summed the setting up. That's the gist.
I would only add that LLM settings have no effect in DX12 and Vulkan games, as those APIs currently handle the pre-rendered frames queue themselves. Also, not all games are guaranteed to respect LLM settings, even in supported APIs.
speancer wrote: ↑17 May 2020, 08:15
May I ask where does your advice come from? Would you recommend any good article concerning this subject?
Maybe Vega does, but I personally don't know of any beyond my own material; few cover the subject in any depth.
I've discussed it quite a bit recently on the forums here though, so just search my post history for "LLM" or "pre-rendered" if you're interested:
search.php?author_id=2920&sr=posts
My G-SYNC 101 article also has a couple of paragraphs on it...
"Optimal G-SYNC Settings" page:
https://blurbusters.com/gsync/gsync101- ... ttings/14/
Maximum Pre-rendered Frames*: Depends
*As of Nvidia driver version 436.02, “Maximum pre-rendered frames” is now labeled “Low Latency Mode,” with “On” being equivalent to MPRF at “1.”
A somewhat contentious setting whose effects are elusive and difficult to document consistently, Nvidia Control Panel’s “Maximum pre-rendered frames” dictates how many frames the CPU can prepare before they are sent to the GPU. At best, setting it to the lowest available value of “1” can reduce input lag by 1 frame (and only in certain scenarios); at worst, depending on the power and configuration of the system, the CPU may not be able to keep up, and more frametime spikes will occur.
The effects of this setting are entirely dependent on the given system and game, and many games already have an equivalent internal value of “1” by default. As such, any input latency tests I could have attempted would have only applied to my system, and only to the test game, which is why I ultimately decided to forgo them. All I can recommend is to try a value of “1” per game; if performance doesn’t appear to be impacted and frametime spikes do not increase in frequency, then either the game already has an internal value of “1,” or the setting has done its job and input lag has decreased. User experimentation is required.
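To put rough numbers on that “up to 1 frame”: in GPU-bound situations, each additional frame sitting in the queue can add up to one frametime of input lag. Here's a quick back-of-the-envelope sketch (Python; the values are purely illustrative, not measurements from any real game):

Code: Select all

# Worst-case input lag contributed by the pre-rendered frames queue,
# relative to a queue depth (MPRF) of 1, in a GPU-bound scenario.
# Illustrative arithmetic only; real pipelines rarely sit at the worst case.

def queue_lag_ms(fps: float, queue_depth: int) -> float:
    """Worst-case extra latency (ms) vs. a queue depth of 1."""
    frametime_ms = 1000.0 / fps
    return (queue_depth - 1) * frametime_ms

for depth in (1, 2, 3):
    print(f"MPRF {depth} @60 FPS: up to +{queue_lag_ms(60, depth):.1f} ms vs. MPRF 1")

# Output:
# MPRF 1 @60 FPS: up to +0.0 ms vs. MPRF 1
# MPRF 2 @60 FPS: up to +16.7 ms vs. MPRF 1
# MPRF 3 @60 FPS: up to +33.3 ms vs. MPRF 1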
And "Closing FAQ" entry "What exactly does the “Maximum pre-rendered frames” setting do again? Doesn’t it affect input lag?":
https://blurbusters.com/gsync/gsync101- ... ttings/15/
NOTE: As of Nvidia driver version 436.02, “Maximum pre-rendered frames” is now labeled “Low Latency Mode,” with “On” being equivalent to MPRF “1.”
While this setting was already covered in part 14 “Optimal G-SYNC Settings & Conclusion” under a section titled “Maximum Pre-rendered Frames: Depends,” let’s break it down again…
The pre-rendered frames queue is effectively a CPU-side throttle for average framerate, and the “Maximum pre-rendered frames” setting controls the queue size.
Higher values increase the maximum number of pre-rendered frames that can be generated at once, which, in turn, typically improves frametime performance and allows higher average framerates on weaker CPUs by giving them more time to prepare frames before handing them off to the GPU to be completed.
So while the “Maximum pre-rendered frames” setting does introduce more buffers at higher values, its originally intended function is less about being a direct input lag modifier (as is commonly assumed) and more about allowing weaker systems to run demanding games more smoothly, and to reach higher average framerates than they would otherwise be able to (if at all).
On a system where the power of the CPU and GPU are more evenly matched, a “Maximum pre-rendered frames” value of “1” is typically recommended, and usually causes little to no negative effect; any such effect would be evident as a lower average framerate and/or increased frametime spikes.
Finally, it should be noted that the NVCP’s MPRF setting isn’t respected by every game, and even where it is, MPRF results may vary per system and/or per game, so user experimentation is required.
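If it helps to visualize the tradeoff those excerpts describe, here's a toy pipeline model (Python). To be clear, this is my own simplification with invented numbers, not how the driver actually schedules work; it just shows why a deeper queue can smooth out CPU frametime spikes while adding input lag in GPU-bound situations:

Code: Select all

# Toy model: the CPU prepares frames, the GPU renders them, and the CPU
# may run at most `depth` frames ahead (the pre-rendered frames queue).
# All timings invented for illustration.

def simulate(cpu_ms, gpu_ms, depth):
    """Returns (gpu_frametimes, input_to_display_latencies) in ms."""
    n = len(cpu_ms)
    cpu_start = [0.0] * n
    cpu_done = [0.0] * n
    gpu_start = [0.0] * n
    gpu_done = [0.0] * n
    for i in range(n):
        # A queue slot frees up once frame (i - depth) is handed to the GPU.
        slot_free = gpu_start[i - depth] if i >= depth else 0.0
        cpu_start[i] = max(cpu_done[i - 1] if i else 0.0, slot_free)
        cpu_done[i] = cpu_start[i] + cpu_ms[i]
        gpu_start[i] = max(gpu_done[i - 1] if i else 0.0, cpu_done[i])
        gpu_done[i] = gpu_start[i] + gpu_ms
    frametimes = [gpu_done[i] - (gpu_done[i - 1] if i else 0.0) for i in range(n)]
    latencies = [gpu_done[i] - cpu_start[i] for i in range(n)]
    return frametimes, latencies

# GPU-bound scenario (GPU: 15 ms/frame, CPU: 10 ms/frame) with one
# 25 ms CPU spike on frame 5:
cpu_ms = [10.0] * 5 + [25.0] + [10.0] * 4
for depth in (1, 3):
    ft, lat = simulate(cpu_ms, gpu_ms=15.0, depth=depth)
    print(f"depth {depth}: frametimes {[round(x) for x in ft]}, "
          f"final-frame latency ~{round(lat[-1])} ms")

# depth 1: frametimes [25, 15, 15, 15, 15, 25, 15, 15, 15, 15], final-frame latency ~30 ms
# depth 3: frametimes [25, 15, 15, 15, 15, 15, 15, 15, 15, 15], final-frame latency ~55 ms

With a queue of 1, the CPU spike punches straight through as a 25 ms frametime; with a queue of 3, buffered frames absorb it entirely, but every frame spends noticeably longer between input sampling and display.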
Lastly, regarding LLM usage with G-SYNC in particular:
https://blurbusters.com/gsync/gsync101- ... ttings/14/
Low Latency Mode* Settings:
*This setting is not currently supported in DX12 or Vulkan.
- If an in-game or config file FPS limiter is not available, RTSS is prohibited from running, a manual framerate limit is not required, and framerate exceeds refresh rate:
Set “Low Latency Mode” to “Ultra” in the Nvidia Control Panel. When combined with G-SYNC + V-SYNC, this setting will automatically limit the framerate to ~59 FPS @60Hz, ~97 FPS @100Hz, ~116 FPS @120Hz, ~138 FPS @144Hz, ~224 FPS @240Hz, etc.
- If an in-game or config file FPS limiter, and/or RTSS FPS limiter is available, or Nvidia’s “Max Frame Rate” limiter is in use, and framerate does not always reach or exceed refresh rate:
Set “Low Latency Mode” to “On.” Unlike “Ultra,” this will not automatically limit the framerate, but like “Ultra,” “On” (in supported games that do not already have an internal pre-rendered frames queue of “1”) will reduce the pre-rendered frames queue in GPU-bound situations where the framerate falls below the set (in-game, RTSS, or Nvidia “Max Frame Rate”) FPS limit.
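Side note: the "Ultra" auto-cap values quoted above happen to fit a simple curve, with the margin below refresh scaling up as refresh rate climbs. A quick sketch (Python), with the caveat that this is just my fit to the listed numbers, not anything Nvidia has documented:

Code: Select all

# Inferred fit to the auto-cap values quoted above; NOT an official formula.
# fps_cap ~= hz - hz^2 / 3600

def ultra_auto_cap(hz: float) -> float:
    return hz - (hz * hz) / 3600.0

for hz in (60, 100, 120, 144, 240):
    print(f"{hz}Hz -> ~{ultra_auto_cap(hz):.0f} FPS")

# 60Hz -> ~59 FPS
# 100Hz -> ~97 FPS
# 120Hz -> ~116 FPS
# 144Hz -> ~138 FPS
# 240Hz -> ~224 FPS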
And:
viewtopic.php?f=5&t=5903&start=30#p44825
1. NULL does limit the FPS below the refresh rate with G-SYNC + V-SYNC, and thus does prevent V-SYNC input lag, but, overall, not nearly as much as a good in-game limiter, and as such, it is not a direct substitute for a dedicated limiter (nor, I assume, does it likely have frametime consistency as solid as RTSS).
2. Low Latency Mode (both "On" and "Ultra") does nothing to reduce input lag if you're already using an FPS limiter to keep G-SYNC in its range and prevent your system from being GPU-bound (as I've already stated here and in my existing article/Closing FAQ), as the pre-rendered frames queue does not apply so long as your FPS is being limited by the cap.
3. Low Latency Mode only reduces input lag in GPU-bound situations, and typically only by up to 1 frame (also as I've stated in the past). Keeping the FPS limited by your (preferably in-game) FPS cap at all times is still going to give you the lowest input lag in all situations.
As for whether my optimal G-SYNC settings have changed: overall, no.
G-SYNC + V-SYNC "On" (NVCP optimal in my opinion, with in-game likely safe 99% of the time as well) + a -3 FPS minimum in-game limit, or RTSS if a "good" in-game limiter isn't available.
Regarding Low Latency Mode "Ultra" vs. "On" when used in conjunction with G-SYNC + V-SYNC + -3 minimum FPS limit, I'd currently recommend "On" for two reasons:
1. "On" should have the same effect as "Ultra" in compatible games (that don't already have a MPRF queue of "1") in reducing the pre-rendered frames queue and input lag by up to 1 frame whenever your system's framerate drops below your set FPS limit vs. "Off."
2. Since "Ultra" non-optionally auto-caps the FPS at lower values than you can manually set with an FPS limiter, for the direct purposes of point "1" above, you'd have to set your FPS limiter below that when using "Ultra" to prevent it from being the framerate's limiting factor, and allow the in-game (or RTSS) limiter to take effect. At 144Hz, you would need to cap a couple frames below 138, which isn't a big deal, but at 240Hz, "Ultra" will auto-cap the FPS to 224 FPS, which I find a little excessive, so "On" which doesn't auto-cap, but should still reduce the pre-rendered frames queue by the same amount as "Ultra" in GPU-bound situations (within the G-SYNC range) is more suited to such a setup.