As you know, recommendations for certain settings can differ depending on the source. The same goes for NVIDIA Control Panel (NVCP) V-SYNC vs. in-game V-SYNC (edit: in combination with G-SYNC ON).
While on Blurbusters it says: "While NVCP V-SYNC has no input lag reduction over in-game V-SYNC, and when used with G-SYNC + FPS limit, it will never engage, some in-game V-SYNC solutions may introduce their own frame buffer or frame pacing behaviors, enable triple buffer V-SYNC automatically (not optimal for the native double buffer of G-SYNC), or simply not function at all, and, thus, NVCP V-SYNC is the safest bet.
There are rare occasions, however, where V-SYNC will only function with the in-game option enabled, so if tearing or other anomalous behavior is observed with NVCP V-SYNC (or vice versa), each solution should be tried until said behavior is resolved."
Source: https://blurbusters.com/gsync/gsync101- ... ttings/14/
Chris from Battle(non)sense states in his videos that "the V-SYNC options inside a game might also trigger other optimizations inside the game engine," which is why he recommends using in-game V-SYNC (edit: in combination with G-SYNC ON).
Source: https://youtu.be/OAFuiBTFo5E?t=734; https://youtu.be/YR0vNs0ZdWI?t=289
My questions:
- Are there any methods besides looking for tearing or other anomalous behavior to determine whether in-game V-SYNC introduces its own frame buffer, frame pacing behavior, or triple buffering, or whether it is not working at all? In other words, methods that don't rely solely on eyesight.
- Can anyone here back up Chris's claim about the optimizations inside game engines, or link any other sources supporting it?
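For what it's worth, one non-visual approach (my own suggestion, not from either source above) is to capture per-frame timing with a tool like Intel's PresentMon and compare runs with NVCP V-SYNC vs. in-game V-SYNC. The present mode in the capture shows how frames are actually being flipped, and a jump in present-to-display latency or frametime variance between the two runs can hint at an extra buffered frame or poor frame pacing. The sketch below assumes a PresentMon-style CSV with the columns shown; real captures have many more columns and thousands of rows, and the sample values here are made up for illustration.

```python
import csv
import io
import statistics

# Hypothetical excerpt of a PresentMon-style capture (made-up values).
SAMPLE_CSV = """Application,PresentMode,MsBetweenPresents,MsUntilDisplayed
game.exe,Hardware: Independent Flip,6.9,9.1
game.exe,Hardware: Independent Flip,7.1,9.0
game.exe,Hardware: Independent Flip,6.8,12.3
game.exe,Hardware: Independent Flip,7.2,9.2
"""

def summarize(csv_text):
    """Summarize a capture so two V-SYNC configurations can be compared."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    frametimes = [float(r["MsBetweenPresents"]) for r in rows]
    latencies = [float(r["MsUntilDisplayed"]) for r in rows]
    return {
        # The present mode reveals how frames reach the display
        # (e.g. independent flip vs. going through the compositor).
        "present_modes": sorted({r["PresentMode"] for r in rows}),
        "avg_frametime_ms": statistics.mean(frametimes),
        # High frametime variance suggests poor frame pacing.
        "frametime_stdev_ms": statistics.pstdev(frametimes),
        # A consistent latency increase between two otherwise identical
        # runs can indicate an extra buffered frame.
        "avg_display_latency_ms": statistics.mean(latencies),
    }

print(summarize(SAMPLE_CSV))
```

The idea would be to record two captures under identical conditions (same scene, same FPS limit), differing only in which V-SYNC option is enabled, and compare the summaries rather than trusting your eyes.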