nVidia Reflex - THIS IS HUGE FOR US, BLURBUSTERS!

Breaking news in the gaming monitor industry! Press releases, new monitors, rumors. IPS, OLED, 144Hz, G-SYNC, Ultrawides, etc. Submit news you see online!
Post Reply
User avatar
A Solid lad
Posts: 317
Joined: 17 Feb 2018, 08:07
Location: Slovakia
Contact:

nVidia Reflex - THIS IS HUGE FOR US, BLURBUSTERS!

Post by A Solid lad » 03 Sep 2020, 04:05

(embedded video)
Discord | Youtube | Twitch
Steam with a list of monitors & mice I've used/use.

forii
Posts: 218
Joined: 29 Jan 2020, 18:23

Re: nVidia Reflex - THIS IS HUGE FOR US, BLURBUSTERS!

Post by forii » 03 Sep 2020, 07:47

I'm glad someone made a topic about this.

In my opinion it's marketing, as always... like the claimed 80% performance boost of the RTX 3080 over the 2080, which is probably only a raytracing boost.

As for nVidia Reflex, I can get low system latency by myself:
1) By keeping HPET on in the BIOS but disabling it in Windows, plus using ISLC:
https://www.youtube.com/watch?v=CfPuZ4wH-do
2) By switching my GPU to MSI mode with the MSI Mode Utility (and I bet the 3080 will have it built in already - and that will be the whole secret)
-> how to: https://www.youtube.com/watch?v=gCedPy3Eoh8&t
-> info about it: https://forums.guru3d.com/threads/windo ... ol.378044/

My screenshot: (attached image)

Yeah, I might be wrong, but I'm just saying a low latency system isn't something new. I don't know how it can work with the mouse plugged into the monitor - maybe this will be a good time to get a wired mouse. For now I'm using a wireless G305, because a cable can slow down your moves.

User avatar
A Solid lad
Posts: 317
Joined: 17 Feb 2018, 08:07
Location: Slovakia
Contact:

Re: nVidia Reflex - THIS IS HUGE FOR US, BLURBUSTERS!

Post by A Solid lad » 03 Sep 2020, 13:36

I'm not hyped for Reflex as a way to bring latency down... judging from the video, it will mostly help in GPU bound scenarios, which I already eliminate all the time. (A CPU bound framerate = way lower latency than GPU bound, for anyone else reading this.)

I'm hyped for it as a tool which enables easy, hassle free latency measurements for everyone!

...I could never be bothered to do latency testing with a high speed camera and an LED-modded mouse after every BIOS setting change or Windows registry tweak... however, I can imagine myself bothering to test again and again with Reflex!

So yeah, making latency testing way faster and easier is what I'm hyped about.

forii wrote:
maybe this will be a good time to get a wired mouse
No need to... you can just plug the wireless dongle into the monitor. (With an extension cable if needed.)
Discord | Youtube | Twitch
Steam with a list of monitors & mice I've used/use.

ELK
Posts: 125
Joined: 18 Dec 2015, 02:51

Re: nVidia Reflex - THIS IS HUGE FOR US, BLURBUSTERS!

Post by ELK » 10 Jan 2021, 22:27

I'm about to watch the video, but first I'd like to post that nvidia will not enable MSI (message signaled interrupt) mode by default because it does not work in virtual machines. Nvidia would have set their cards to MSI a long time ago, especially since it increases performance by ~5%, if not for this incompatibility. I've never seen a test of its effect on input lag - it's always been theoretical. If you know of a test that has been done, please let me know.

Wow, I'm really excited that we will be able to test our own latency. This will help a lot.

Let me explain the Reflex queue thing to y'all. You should already be familiar with pre-rendered frames. That's EXACTLY what this is, but please read on. It's been common practice to change pre-rendered frames from 3 to 1. Thanks to AMD inventing Anti-Lag, those of us with Nvidia cards now have the option to set pre-rendered frames to 0 by setting Ultra Low Latency to "Ultra." What most people don't know is that THIS WILL AFFECT ALMOST NO GAMES AT ALL. A long, long time ago, developers started building the queue into their game engines instead of using the system's queue, because Windows defaults to 3. A queue of 3 caused EXTREMELY bad input lag in GPU bottlenecked situations, especially if the CPU was much faster. That great solution has backfired badly since the invention of AMD Anti-Lag and Nvidia Ultra Low Latency, because changing the system queue to just-in-time (0 pre-rendered frames) has absolutely no effect on such a game. Nvidia Reflex is simply implementing Ultra Low Latency into the game engine itself. (Meaning Ultra Low Latency will be enabled and will actually do something this time.)

If you want a taste of what Nvidia Reflex will feel like, set Nvidia Ultra Low Latency to Ultra and load up Oblivion. It is NOTORIOUS for having EXTREMELY bad input lag, as it used the system for its queue. With the setting on, it's even playable at 30fps. Trust me - load up Oblivion and your mind will be absolutely blown by how incredibly low the input lag is. IT IS SO INCREDIBLE. Please load up Oblivion.
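To see why queue depth matters so much, here is a toy Python sketch of the idea (my own illustrative numbers and model, not how the driver actually schedules anything): when you're GPU bound, every pre-rendered frame sitting in the queue adds roughly one GPU frame time of lag, while a CPU bound queue stays empty.

```python
def input_latency_ms(queue_depth, cpu_ms, gpu_ms):
    """Toy model: input is sampled when the CPU starts a frame and
    shown when the GPU finishes it.  When GPU bound, every frame
    sitting in the queue adds one GPU frame time of lag."""
    if gpu_ms > cpu_ms:                       # GPU bound: queue fills up
        return cpu_ms + (queue_depth + 1) * gpu_ms
    return cpu_ms + gpu_ms                    # CPU bound: queue stays empty

# Hypothetical GPU-bound case: CPU 5 ms/frame, GPU 10 ms/frame
for depth in (3, 1, 0):                       # Windows default, common tweak, "Ultra"
    print(f"queue {depth}: {input_latency_ms(depth, 5, 10)} ms")
```

Going from a queue of 3 to 0 in this sketch cuts the lag from 45 ms to 15 ms, while in the CPU bound case the queue depth makes no difference at all - which matches the point above about Ultra Low Latency doing nothing in engine-queued games.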

ELK
Posts: 125
Joined: 18 Dec 2015, 02:51

Re: nVidia Reflex - THIS IS HUGE FOR US, BLURBUSTERS!

Post by ELK » 10 Jan 2021, 23:08

A Solid lad wrote:
03 Sep 2020, 13:36
I'm not hyped for Reflex as a way to bring latency down... judging from the video, it will mostly help in GPU bound scenarios, which I already eliminate all the time. (A CPU bound framerate = way lower latency than GPU bound, for anyone else reading this.)

I'm hyped for it as a tool which enables easy, hassle free latency measurements for everyone!

...I could never be bothered to do latency testing with a high speed camera and an LED-modded mouse after every BIOS setting change or Windows registry tweak... however, I can imagine myself bothering to test again and again with Reflex!

So yeah, making latency testing way faster and easier is what I'm hyped about.

maybe this will be a good time to get a wired mouse
No need to... you can just plug the wireless dongle into the monitor. (With an extension cable if needed.)
It will help in GPU bound, CPU bound, and no-bottleneck situations. Oblivion is hard evidence.

Your GPU cannot begin making a frame until it has some essential data from your CPU - for example, where your enemy is positioned. With a queue size of 1, your CPU is given the task of computing this essential data for the GPU as soon as the GPU has received the previous essential data. It looks something like this:

CPU sends essential data to GPU -> CPU begins computing the next essential data and helps the GPU draw the current frame -> repeat
*Positions, view angle, etc. are calculated at the start of the previous frame

This is the same as Nvidia Ultra Low Latency, which was designed by AMD as Anti-Lag and has now been monopolized as Nvidia Reflex, compatible only with Nvidia cards. I'm assuming game developers don't even know they should change their engines to work like Anti-Lag or Ultra Low Latency. Nvidia is making it easy for developers to add this to their games, so don't be too mad at Nvidia for ripping off AMD's tech - some games would never get the benefit of it otherwise, and let me tell you, it's amazing. The Reflex pipeline looks like this:

CPU sends essential data to GPU -> CPU and GPU work together to complete the frame (unlike before, essential data for the next frame is NOT computed during this time) -> GPU asks for essential data, which is purposefully missing because the CPU was told to wait for this specific moment (this forces the GPU to idle for a very short time, very slightly decreasing fps - which won't matter if you're using an fps limit or some form of syncing)
*Positions, view angle, etc. are calculated at the start of the current frame.
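Putting the two timelines side by side in a quick sketch (the timings are made up for illustration, not measured): in the queued pipeline the input you see was sampled one whole frame earlier, while the just-in-time pipeline samples it right before the GPU needs it.

```python
FRAME_MS = 16.7   # frame time at ~60 fps (assumed)
CPU_MS = 4.0      # simulate + submit one frame (assumed)
GPU_MS = 12.0     # render one frame (assumed)

# Queue of 1: the CPU prepares frame N+1 while the GPU draws frame N,
# so positions/view angle in the frame you see are one frame old.
queued_latency = FRAME_MS + CPU_MS + GPU_MS

# Just-in-time (the Reflex-style pipeline above): the CPU holds the
# essential data until the GPU asks for it, trading a sliver of GPU
# idle time for one whole frame of latency.
jit_latency = CPU_MS + GPU_MS

print(queued_latency - jit_latency)   # the saving is one frame time
```

Whatever numbers you plug in, the saving from going just-in-time is exactly one frame time, which is why it matters most at low framerates.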

EDIT: I'm also really glad Nvidia is pointing out the importance of input lag over fps. For an extreme example that would never happen: 100fps is better than 300fps if the 300fps has 100ms more input lag. I even have a post about Unreal Engine 4 explaining the exact same thing. Unreal Engine 4 actually defaults to a queue size of 2.
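The arithmetic behind that extreme example (made-up numbers, just to show the scale) is simple:

```python
frame_100 = 1000 / 100          # 10.0 ms per frame at 100 fps
frame_300 = 1000 / 300          # ~3.3 ms per frame at 300 fps

# The higher framerate saves ~6.7 ms per frame, but 100 ms of extra
# input lag on the 300 fps setup swamps that saving entirely.
lag_100 = frame_100             # 10.0 ms total
lag_300 = frame_300 + 100.0     # ~103.3 ms total
print(lag_100 < lag_300)        # True: 100 fps "feels" faster here
```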

Here is my post
https://steamcommunity.com/app/548430/d ... 596757168/

quoting my post
"
NOTE: THIS IS CHANGING THE QUEUE FROM 2 TO 1. THIS IS RELATED TO, BUT NOT THE SAME AS, NVIDIA REFLEX OR ULTRA LOW LATENCY.
I will explain the performance impact in more detail.

There are 3 scenarios. You are either GPU bottlenecked, CPU bottlenecked, or neither, because you are using an fps limiter, vsync, ssync (RTSS scanline sync), G-SYNC, FreeSync, adaptive sync, etc... all sync methods except Nvidia Fast Sync or AMD Enhanced Sync, which cause stutter.

(Unrelated, but if you want vsync without input lag, don't use Fast Sync or Enhanced Sync - use RTSS scanline sync. It doesn't have stutter, it doesn't bottleneck you, and since a GPU bottleneck causes input lag, it's that much better.)

SCENARIO 1: GPU BOTTLENECK
If you are GPU bottlenecked and change r.OneFrameThreadLag to 0, it will have NO impact on your fps. Input lag will be reduced by an entire frame. "Perceived" fps will be greater due to the greatly reduced input lag.
-Your GPU will now always grab frame 1 from the queue instead of frame 2, because the queue size was reduced from 2 to 1 thanks to r.OneFrameThreadLag=0.

SCENARIO 2: CPU BOTTLENECK
If you are CPU bottlenecked and change r.OneFrameThreadLag to 0, it will have NO impact on your fps. Input lag will NOT be reduced. "Perceived" fps will not change.
-Your GPU will idle as it waits for the CPU to generate the needed data. This is actually lower input lag than a GPU bottleneck with r.OneFrameThreadLag=0. It is the situation that AMD Anti-Lag or Nvidia Ultra Low Latency forces in games that use driver-level queueing instead of engine-level queueing. Deep Rock Galactic uses engine-level queueing, so changing pre-rendered frames or either of the AMD/Nvidia settings mentioned will have absolutely ZERO effect on Deep Rock.
--However, if there are frames where you are GPU bottlenecked on a mainly CPU bottlenecked machine, changing r.OneFrameThreadLag to 0 will lower your fps on those frames, but it will NOT lower your perceived fps, as the input lag will be reduced on those frames.

SCENARIO 3: NO BOTTLENECK
If you don't have a bottleneck, because you are using an fps limiter or some form of syncing that limits your fps like vsync or scanline sync, and you change r.OneFrameThreadLag to 0, it will have NO impact on your fps. Input lag will be reduced by an entire frame. "Perceived" fps will be greater due to the greatly reduced input lag.
-Your GPU will now always grab frame 1 from the queue instead of frame 2, because the queue size was reduced from 2 to 1 thanks to r.OneFrameThreadLag=0.
--You will have much less input lag than scenario 1, as a GPU bottleneck causes a good deal of input lag - sometimes up to 40% more.

There is no good reason not to change r.OneFrameThreadLag to 0. In some situations leaving it at the default can show higher fps, but that will not increase your perceived fps. Those higher fps numbers do look good, though, and if you're trying to sell a game engine they will help sales - even though your game will run much worse in perceived fps in scenario 1 or 3 or the special case --2. It's all about the money.
"

Post Reply