overwatch buffering reduction vs nvidia LLM Ultra ??

blackstorm82
Posts: 23
Joined: 01 Jun 2020, 20:23

overwatch buffering reduction vs nvidia LLM Ultra ??

Post by blackstorm82 » 24 Jun 2020, 00:45

overwatch - Reduce Buffering
"Reduce Buffering I believe is aimed at reducing input lag. It does this by lowering the pre-rendered frames to a lower amount. It's often recommended to turn this feature on as long as your FPS is meeting or exceeding the maximum refresh rate of your monitor."

Isn't this the kind of setting that only matters when GPU usage is at 99%?
The way Overwatch describes it there, it sounds like it kicks in unconditionally whenever the frame rate is higher than the monitor's Hz.

nvidia LLM - ULTRA
Isn't that the same function?

So why does Overwatch's description not consider GPU usage at all?

User avatar
k2viper
Posts: 293
Joined: 23 Jan 2018, 06:30

Re: overwatch buffering reduction vs nvidia LLM Ultra ??

Post by k2viper » 28 Jun 2020, 05:43

It seems that Overwatch's "Reduce Buffering" should work the same way as Nvidia's LLM Ultra. Though it's hard to tell how the two interact when both are enabled.
Battlenonsense has a video on this topic. https://youtu.be/7CKnJ5ujL_Q?t=347

User avatar
axaro1
Posts: 627
Joined: 23 Apr 2020, 12:00
Location: Milan, Italy

Re: overwatch buffering reduction vs nvidia LLM Ultra ??

Post by axaro1 » 28 Jun 2020, 07:25

blackstorm82 wrote:
24 Jun 2020, 00:45
overwatch - Reduce Buffering
"Reduce Buffering I believe is aimed at reducing input lag. It does this by lowering the pre-rendered frames to a lower amount. It's often recommended to turn this feature on as long as your FPS is meeting or exceeding the maximum refresh rate of your monitor."

Isn't this the kind of setting that only matters when GPU usage is at 99%?
The way Overwatch describes it there, it sounds like it kicks in unconditionally whenever the frame rate is higher than the monitor's Hz.

nvidia LLM - ULTRA
Isn't that the same function?

Why is "Overwatch" talking without considering the usage of the graphics card?
Basically it is supposed to reduce input lag by one frame, but there is a massive issue with Reduce Buffering.
Every 5-7 minutes (or even sooner if you happen to alt-tab out of the game multiple times) it bugs out and causes the average fps to drop massively. That's why you see many pros going into the options in the middle of a game to disable and re-enable the feature, since that resets the fps drop (in my experience I have to do this roughly once every two rounds).
Personally I keep it off: one frame at a high refresh rate doesn't make a massive difference in input lag, and when I use it I often find myself in the middle of a teamfight at 180fps because of this bug instead of my usual 250-280.

Not every pro uses Reduce Buffering, many good players (like ryujehong) just keep it disabled.

I recently saw a comment on reddit stating that if you are CPU-bound it may actually increase input lag:
Reduce Buffering syncs your CPU simulation-start to the GPU render-end.

If your game is GPU-bound (so 1/FPS = t_GPU), the total lag is

lag = 1/FPS + (t_GPU - t_CPU) = 2*t_GPU - t_CPU

Enabling "Reduce Buffering" serializes the two stages, forcing the total lag to

lag = 1/FPS = t_GPU + t_CPU

If your game is CPU-bound (so 1/FPS = t_CPU), the total lag is

lag = 1/FPS = t_CPU

Enabling "Reduce Buffering" forces the CPU to start simulating only when the previous frame finishes rendering, which makes the total lag

lag = 1/FPS' = t_GPU + t_CPU

i.e. worse than t_CPU alone.
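Putting the quoted model into a minimal Python sketch (my own illustration of that comment's formulas, not engine code; the millisecond values are made up):

Code: Select all

def total_lag_ms(t_cpu: float, t_gpu: float, reduce_buffering: bool) -> float:
    # Sketch of the quoted lag model; t_cpu / t_gpu are per-frame
    # simulation and render times in milliseconds.
    if reduce_buffering:
        # CPU simulation starts only at GPU render-end: stages run serially.
        return t_cpu + t_gpu
    if t_gpu >= t_cpu:
        # GPU-bound: 1/FPS = t_gpu, plus the queued-frame penalty.
        return 2 * t_gpu - t_cpu
    # CPU-bound: 1/FPS = t_cpu, nothing queues up.
    return t_cpu

print(total_lag_ms(1.0, 5.0, False))  # 9.0ms: GPU-bound, RB off
print(total_lag_ms(1.0, 5.0, True))   # 6.0ms: RB helps when GPU-bound
print(total_lag_ms(5.0, 2.0, False))  # 5.0ms: CPU-bound, RB off
print(total_lag_ms(5.0, 2.0, True))   # 7.0ms: RB hurts when CPU-bound

Under this model RB only pays off when the GPU is the clear bottleneck, which lines up with the 99% GPU usage caveat discussed above.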
XL2566K* | XV252QF* | LG C1* | HP OMEN X 25 | XL2546K | VG259QM | XG2402 | LS24F350[RIP]
*= currently owned



MONITOR: XL2566K custom VT: https://i.imgur.com/ylYkuLf.png
CPU: 5800x3d 102mhz BCLK
GPU: 3080FE undervolted
RAM: https://i.imgur.com/iwmraZB.png
MOUSE: Endgame Gear OP1 8k
KEYBOARD: Wooting 60he

milojr21
Posts: 85
Joined: 23 Jul 2018, 22:46

Re: overwatch buffering reduction vs nvidia LLM Ultra ??

Post by milojr21 » 30 Jun 2020, 20:17

I was playing around in OW and testing out PresentMon (viewtopic.php?f=10&t=5552). It seems like with every setting, Reduce Buffering ON increased latency; maybe someone more knowledgeable could test it out.
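In case anyone wants to crunch their own logs, here's a small sketch of averaging the relevant columns out of a PresentMon CSV capture (the column names are PresentMon's own output; the file names are just placeholders for your two runs):

Code: Select all

import csv

def avg_ms(path: str, column: str) -> float:
    # Average one latency column across all rows of a PresentMon CSV log.
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    return sum(float(row[column]) for row in rows) / len(rows)

for log in ("ow_rb_on.csv", "ow_rb_off.csv"):
    for col in ("MsBetweenPresents", "MsUntilDisplayed"):
        print(log, col, round(avg_ms(log, col), 3))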

outcast
Posts: 1
Joined: 02 Aug 2020, 23:51

Re: overwatch buffering reduction vs nvidia LLM Ultra ??

Post by outcast » 03 Aug 2020, 00:03

milojr21 wrote:
30 Jun 2020, 20:17
I was playing around in OW and testing out PresentMon (viewtopic.php?f=10&t=5552). It seems like with every setting, Reduce Buffering ON increased latency; maybe someone more knowledgeable could test it out.
@jorimt @Chief Blur Buster

My results (Overwatch, 144Hz, 240fps in-game cap, low graphics preset):

Hardware: Legacy Flip

Reduce buffering ON:

MsBetweenPresents: 3.941
MsInPresentAPI: 0.097
MsUntilRenderComplete: 1.868
MsUntilDisplayed: 1.868

Reduce buffering OFF:

MsBetweenPresents: 3.903
MsInPresentAPI: 0.181
MsUntilRenderComplete: 0.651
MsUntilDisplayed: 0.651

Pro players don't use Reduce Buffering (300fps cap, 240Hz, etc.), so does RB only help at low fps and 64Hz?

User avatar
axaro1
Posts: 627
Joined: 23 Apr 2020, 12:00
Location: Milan, Italy

Re: overwatch buffering reduction vs nvidia LLM Ultra ??

Post by axaro1 » 22 Sep 2020, 12:18

This is what I did to make aiming better with this game:
- I disabled Radeon Anti-Lag because it was increasing input lag (I'm CPU/engine bottlenecked; watch Battle(non)sense's videos), and I noticed a decrease in input lag after turning it off.
- I disabled Reduce Buffering since it keeps bugging out, forcing me to disable/re-enable it in the middle of fights to get rid of the random fps drops (down to 160-180fps from an average of 270).
- I switched from 1600 DPI to 3200 and halved my sensitivity (currently at 4000 eDPI), and enabled High Precision Mouse Input. My aim has been the best it's ever been in this game; I highly recommend increasing DPI and enabling HPMI. It truly makes a difference, especially if you have a micro-flick play style.
From Blizzard's explanation of the feature:
First, a quick summary of how aiming and shooting happen in Overwatch. Overwatch simulates (or ticks) every 16 milliseconds, i.e. at 62.5Hz. Each tick, we update the player's aiming direction based on whatever raw inputs we've received from the mouse hardware since the previous tick. If the player has pressed primary fire, we then launch their shot in that direction (assuming they are alive, have ammo, aren't in the middle of some other action, etc.).

Despite this being a conventional way of handling mouse input in a first-person shooter, we wanted to do better. Many gaming mice these days support 500Hz, 1000Hz or even higher polling rates. At 1000Hz, this means the mouse movement between game ticks can be made up of as many as 16 discrete mouse movements delivered from the hardware.

With the new High Precision option enabled, projectiles can now travel down any of the red lines! This means you can also functionally shoot between rendered frames (or, at high fps, on rendered frames that are in between ticks).

Enabling this option does come with a small CPU cost overhead.
Comparison between HPMI ON (red lines) and OFF (white lines); it makes a massive difference depending on the distance of the enemy. I highly recommend you guys try it!
https://bnetcmsus-a.akamaihd.net/cms/ga ... 426861.mp4
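To make the mechanism concrete, here's a toy sketch of the behavior described above (purely my own illustration in Python, not the game's code; one 62.5Hz tick, 1000Hz polling):

Code: Select all

TICK_MS = 16  # one 62.5Hz simulation tick

def aim_at_fire(polls, fire_at_ms, hpmi):
    # polls: list of (timestamp_ms, dx) mouse deltas within one tick.
    # Returns the horizontal aim movement applied to a shot clicked at fire_at_ms.
    if not hpmi:
        # Without HPMI, every delta in the tick is summed and the shot
        # fires along the aim direction at the END of the tick.
        return sum(dx for _, dx in polls)
    # With HPMI, the shot only uses the deltas received BEFORE the click,
    # so it can effectively land between ticks.
    return sum(dx for t, dx in polls if t <= fire_at_ms)

# One tick's worth of 1000Hz polls: 16 deltas of 1 unit (a smooth flick).
polls = [(t, 1.0) for t in range(1, TICK_MS + 1)]

print(aim_at_fire(polls, fire_at_ms=5, hpmi=False))  # 16.0 -> shot uses the whole flick
print(aim_at_fire(polls, fire_at_ms=5, hpmi=True))   # 5.0  -> shot uses aim at click time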

HPMI wasn't making a huge difference at 144Hz, but since I upgraded to 280Hz my aim started feeling floaty and inconsistent. Apparently Overwatch was only reading the correct input once every ~4.5 frames (280Hz divided by the server tick rate of 62.5). If you own a high refresh rate monitor, ENABLE HPMI.

Overwatch servers read 1 input every 3.84 frames at 240Hz
Overwatch servers read 1 input every 2.3 frames at 144Hz
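A quick sanity check of those numbers (the 62.5Hz tick rate is from Blizzard's explanation quoted above):

Code: Select all

TICK_RATE_HZ = 62.5  # Overwatch simulation tick rate

for fps in (280, 240, 144):
    print(f"{fps}fps: 1 aim sample every {fps / TICK_RATE_HZ:.2f} frames without HPMI")

# 280fps -> 4.48 | 240fps -> 3.84 | 144fps -> 2.30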
XL2566K* | XV252QF* | LG C1* | HP OMEN X 25 | XL2546K | VG259QM | XG2402 | LS24F350[RIP]
*= currently owned



MONITOR: XL2566K custom VT: https://i.imgur.com/ylYkuLf.png
CPU: 5800x3d 102mhz BCLK
GPU: 3080FE undervolted
RAM: https://i.imgur.com/iwmraZB.png
MOUSE: Endgame Gear OP1 8k
KEYBOARD: Wooting 60he
