Nvidia Reflex 2 Frame Warp

Boop
Posts: 152
Joined: 08 Nov 2018, 22:10

Nvidia Reflex 2 Frame Warp

Post by Boop » 07 Jan 2025, 02:43

https://www.nvidia.com/en-us/geforce/ne ... rame-warp/
With Reflex 2, we’ve introduced a different approach to reducing latency. Four years ago, NVIDIA’s esports research team published a study illustrating how players could complete aiming tasks faster when frames are updated after being rendered, based on even more recent mouse input. In the experiment, game frames were updated to reduce 80 milliseconds (ms) of added latency, which resulted in players completing an aiming target test 30% faster.

When a player aims to the right with the mouse, for example, it would normally take some time for that action to be received, and for the new camera perspective to be rendered and eventually displayed. What if instead, an existing frame could be shifted or warped to the right to show the result much sooner?

Reflex 2 Frame Warp takes this concept from research to reality. As a frame is being rendered by the GPU, the CPU calculates the camera position of the next frame in the pipeline, based on the latest mouse or controller input. Frame Warp samples the new camera position from the CPU, and warps the frame just rendered by the GPU to this newer camera position. The warp is conducted as late as possible, just before the rendered frame is sent to the display, ensuring the most recent mouse input is reflected on screen.

When Frame Warp shifts the game pixels, small holes in the image are created where the change in camera position reveals new parts of the game scene. Through our research, NVIDIA has developed a latency-optimized predictive rendering algorithm that uses camera, color and depth data from prior frames to in-paint these holes accurately. Players see the rendered frame with an updated camera perspective and without holes, reducing latency for any actions that shift the in-game camera. This helps players aim better, track enemies more precisely, and hit more shots.
Reflex Low Latency mode is most effective when a PC is GPU bottlenecked. But Reflex 2 with Frame Warp provides significant savings in both CPU and GPU bottlenecked scenarios. In Riot Games’ VALORANT, a CPU-bottlenecked game that runs blazingly fast, at 800+ FPS on the new GeForce RTX 5090, PC latency averages under 3 ms using Reflex 2 Frame Warp - one of the lowest latency figures we’ve measured in a first-person shooter.
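The warp-and-inpaint pipeline the press release describes can be sketched as a toy model. Everything below is my own illustration, not NVIDIA's algorithm: it assumes a pure horizontal camera yaw, a pinhole projection, and no depth-aware reprojection, and the function names (`late_warp`, `fill_holes`) are made up for the example.

```python
import numpy as np

def late_warp(frame, yaw_delta_rad, fov_h_rad):
    """Shift a rendered frame horizontally to approximate a small camera
    yaw received after rendering. A real implementation reprojects per
    pixel (using depth); this toy handles pure horizontal rotation only."""
    h, w = frame.shape[:2]
    focal_px = (w / 2) / np.tan(fov_h_rad / 2)            # pinhole focal length in pixels
    shift = int(round(focal_px * np.tan(yaw_delta_rad)))  # camera turns right -> scene slides left
    shift = max(-w, min(w, shift))
    warped = np.zeros_like(frame)
    holes = np.ones((h, w), dtype=bool)                   # disocclusions to in-paint later
    if shift >= 0:
        warped[:, : w - shift] = frame[:, shift:]
        holes[:, : w - shift] = False
    else:
        warped[:, -shift:] = frame[:, : w + shift]
        holes[:, -shift:] = False
    return warped, holes

def fill_holes(warped, holes, prev_frame):
    """Toy stand-in for the in-painting pass: reuse prior-frame pixels for
    the revealed region. NVIDIA describes a latency-optimized predictive
    algorithm using camera, color and depth history, not this raw copy."""
    out = warped.copy()
    out[holes] = prev_frame[holes]
    return out
```

A real implementation would run the warp as late as possible, per pixel with depth, just before scanout; the point here is only the shape of the idea: shift the finished frame toward the newest camera pose, then fill the revealed sliver from prior-frame data.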

NDUS
Posts: 73
Joined: 12 Aug 2019, 16:05

Re: Nvidia Reflex 2 Frame Warp

Post by NDUS » 07 Jan 2025, 14:48

Isn't this a pretty exact mirror of Blur Busters' 2023 article on how next-gen GPUs should implement frame reprojection?
https://blurbusters.com/frame-generatio ... rojection/

They even use the *exact* same example as in the article (Cyberpunk 2077 with path tracing, reprojected to 240fps)


User avatar
Chief Blur Buster
Site Admin
Posts: 12059
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Nvidia Reflex 2 Frame Warp

Post by Chief Blur Buster » 07 Jan 2025, 18:24

Almost! And the NVIDIA folks have been giant fans of Blur Busters content.

Good on NVIDIA, and the technique can be utilized by NVIDIA, AMD and Intel alike, as long as they can get close to the 10:1 ratio from my framegen-as-blur-reduction article.

I've been REALLY egging GPU vendors on aggressively about using framegen as a motion blur reduction method.

Happy to see 8x framegen arrive!

I totally expected NVIDIA to be the leader in this, as naturally they now know large-ratio framegen is a substitute for flickery ULMB. ULMB is great, but not everyone can stand flicker-based motion blur reduction, so ergonomic framerate-based motion blur reduction is The Way of the big-GPU future. Some DLSS settings are much better looking than awful TAA. It could be better, but you can see how the tech is improving in quality.

FSR + XeSS, get your engines ready! I want competitors onboard too.

Framerate-based motion blur reduction is amazing on OLEDs (8x = almost 90% motion blur reduction with no flicker). Even 2ms MPRT HDR on 480Hz OLED looks better than 1ms MPRT non-HDR on strobed LCD, there's a 2:1 ergonomic powerup factor from gaining flickerless+HDR.

Also, LCD won't get as much motion blur reduction from framegen, since you want GtG=0.000! Not GtG=1ms, not GtG=0.5ms, GtG=0.000ms, because GtG is like a slowly moving camera shutter. You don't want the shutter moving for 1ms before and after a 1/480sec camera exposure; that adds more blur. That's why 240-vs-360 on LCD is only an utterly useless 1.1x difference, while 120Hz-vs-480Hz is MUCH MORE HUMAN VISIBLE than 60Hz-vs-120Hz. The catch is GPU framerate. Now that 8x framegen is here, let's bring it to more GPU technologies.
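The arithmetic behind these ratios can be sketched with a crude model. The additive GtG penalty below is my own simplification for illustration, not display metrology:

```python
def persistence_blur_ms(fps, gtg_ms=0.0):
    """Sample-and-hold motion blur (MPRT) is roughly one frame time of
    persistence. The additive gtg_ms penalty models LCD pixel transitions
    smearing outside the ideal exposure window -- a deliberate
    simplification for illustration, not display metrology."""
    return 1000.0 / fps + gtg_ms

# 8x framegen on an instant-response (GtG ~ 0) display such as an OLED:
base = persistence_blur_ms(60)            # ~16.7 ms of persistence blur
boosted = persistence_blur_ms(60 * 8)     # ~2.1 ms at 480fps
reduction = 1 - boosted / base            # ~0.875, i.e. "almost 90% less blur"

# On LCD, a nonzero GtG floor eats the gains from higher refresh rates:
oled_gain = persistence_blur_ms(240) / persistence_blur_ms(360)            # ~1.5x
lcd_gain = persistence_blur_ms(240, 3.0) / persistence_blur_ms(360, 3.0)   # ~1.24x
```

The exact LCD ratio depends on how GtG is modeled (the post above cites ~1.1x), but the direction is the same: slow pixel response compresses the blur advantage of higher refresh rates, while GtG≈0 displays get the full persistence benefit.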

I love strobing too; users need a choice -- strobe-based blur reduction (that's why I released a CRT simulator, www.blurbusters.com/crt ...) and framerate-based blur reduction.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on: BlueSky | Twitter | Facebook

Image
Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

RonsonPL
Posts: 139
Joined: 26 Aug 2014, 07:12

Re: Nvidia Reflex 2 Frame Warp

Post by RonsonPL » 08 Jan 2025, 12:22

Chief Blur Buster wrote:
07 Jan 2025, 18:24
Almost! And NVIDIA has been giant fans of Blur Busters content.
This whole post relates to the quoted part; please excuse me going off-topic.

While I'm super happy to read that and won't ever doubt your words, I think there may be a difference between the passionate tech geeks working at Nvidia and Nvidia's management (and Jensen himself).

Nvidia has been on a collision course with clear motion for years. They clearly prefer what's more important for their marketing, and that has always been the static image, which looks great in screenshots and is what everyone can see. Meanwhile, the majority of people don't own displays that can handle motion even remotely close to "OK", and after years and years of using smeary displays, they wouldn't even know how to utilize a display capable of clear motion if they were sat down in front of one and told to play for an hour. Their moves would be made with a 500Hz mouse (not 4000Hz or more) and their movements would resemble a chicken's. Movement. Stop. Movement. Stop. I've seen this happen many times. And even those who approach it correctly have to get past the mind trick our brains play on us when we think about motion. What I'm saying is: it's very difficult to make people really understand how important motion is.

Nvidia made a few damaging moves from our (motion purists') perspective:

- Let's start with the least important one, which nobody even here will agree with ;) - they introduced G-SYNC, which pushed strobing away and steered the discussion from "how clear the motion is" to "how fluid the motion is", planting a seed of misconceptions in many, many minds.
- Introduced TAA, which sacrifices the quality of the moving image for antialiasing of static or very slowly moving images.
- Introduced variable rate shading, which basically relies on "nobody can see anything in motion, so let's degrade it even more and move the power to the static image".
- Introduced ray tracing, path tracing and AI upscaling, which again pose serious problems for the popularization of clear motion in gaming. RT especially is bad, as it basically relies on analysis of multiple frames to draw the final frame, and fans of RT (and Nvidia's marketing) praise the realistic motion blur, which may lead to it being turned on without an option to turn it off in the PS6 console generation, and possibly in some PC ports as well.
- Tries to push AI upscaling as default and mandatory. The new norm. At the same time, while they're perfectly capable of enforcing ideas onto game developers, there has been zero effort from Nvidia to push any dev into making their games motion friendly. Games are now targeted at the upscalers (which damage motion quality) during development, resulting in severe undersampling (more on that, plus examples, can be found at https://www.reddit.com/r/FuckTAA/ - pardon the word, but it's part of the link and can't be removed), which means as little as 1 out of 9 pixels is actually rendered, and if you want real native resolution without any motion degradation, you see aliasing much worse than gaming at 320x240. This type of pixelation is not fixable. Even if you ran the game at 16K in 2050 on an RTX 999080Ti, you'd still be forced to suffer either the smeared image OR the awful pixelation. Nvidia has done nothing here for years, and they could easily influence devs into making a set of higher-res assets or even in-game options. There's no reason why Red Dead Redemption has to look awful on a 2080Ti (the best GPU during the development of the PC port) when the same game can run on an Xbox One. I'm pretty sure the undersampling fix wouldn't need a 500GB patch either.
Sadly, almost all games have this issue now, and that's destroying the future of clear motion in gaming even before we touch the other problems, like unfixable 150-200fps framerate caps in game engines. (I do hope frame gen helps here in the future.)

With path tracing getting 21fps on a 4090 and 27 on a 5090, and the PS6 rumored to be focused on RT/PT and AI, I'm worried even more: upscaling with even good frame gen from 30fps will never be good for either clear fast motion or responsiveness.



To conclude:
For all those reasons, plus the fact that we're talking about motion (a matter understood and valued by a very tiny fraction of an already-niche market in the eyes of the newborn AI giant Nvidia, who prefers AI over gaming hardware since that's where almost all of their current profits come from), I think it may take more time than we'd hope before Nvidia actually starts putting real effort into this.
I'm afraid it may be as late as 2040, after they run out of marketing fuel - HDR, 8K and... then there's nothing left to grab onto but motion clarity.

I'm also more concerned than optimistic because of Nvidia's history of priorities and their approach to awesome tech. Let me just remind you of 3D Vision and VR. The management made the most illogical move imaginable by throwing it all into a trashcan.
Releasing 3D Vision to open source projects, or keeping their word about bringing the 3D Vision library to VR, would have been basically free, and they didn't do it anyway.
To point out the similarities to the current situation: I'm pretty sure there were a lot of Nvidia employees who fell in love with 3D Vision or VR tech, but in the end, it sadly didn't matter.
They also did not do what you did - the CRT simulation shaders. Nor did they make any standard for motion clarity like Blur Busters Approved.

That said, despite writing so much in this post... I do hope I'm wrong.

PS: I can see a thread on the mentioned subreddit complaining about the quality of Reflex 2 Frame Warp. I've seen complaints about DLSS 4 too.
I won't form my own opinion before I can see it with my own eyes, but I really hope it's the first really useful and good thing to come from DLSS for people who require perfect motion clarity and prefer HFR gaming. I hope it's closer to perfect native HFR of CRT quality than DLSS 3, which was a huge disappointment for me personally.

Dalek
Posts: 134
Joined: 21 Oct 2022, 10:18

Re: Nvidia Reflex 2 Frame Warp

Post by Dalek » 08 Jan 2025, 20:11

It doesn't change the fact that it still has horrific latency, along with ghosting/smearing. I'd much rather have actual frames. Jensen is appealing to shareholders and investors up on stage, not gamers. Spouting "AI" every 5 seconds, plus out-of-touch corporate cringe mentioning $10,000 PCs and 'a lot' of people being 4090 owners. You can tell he doesn't believe what he's saying.

But there are much bigger issues that are plaguing gaming:

-TAA and other recent anti-aliasing methods look absolutely horrible, and it's asinine that games go out the door in such condition. It's gotten to the point where games are FORCING it and not leaving an option to disable it. I think some of the comments speak for themselves.

-Graphics peaked 10 years ago, yet optimisation is still worse than ever for a number of recent games.

-Ray tracing makes little to no obvious difference when it comes to lighting. It's not worth the lower frame rate for something you don't really notice unless you're purposely comparing it. Ray tracing is also being forced now in games like Indiana Jones and the Great Circle and Alan Wake 2.

A little off-topic, but speaking of visual clarity, I've noticed that certain fonts within Windows 11 (e.g. the clock in the corner, Command Prompt, various fonts in the 'PC settings' menu, etc.) don't scale well at 1080p, perhaps because more developers are working on 4K displays? Could some of the other issues above be due to developers targeting 4K?

User avatar
RealNC
Site Admin
Posts: 4428
Joined: 24 Dec 2013, 18:32
Contact:

Re: Nvidia Reflex 2 Frame Warp

Post by RealNC » 09 Jan 2025, 14:05

Dalek wrote:
08 Jan 2025, 20:11
It doesn't change the fact that it still has horrific latency along with ghosting/smearing to it. I'd much rather have actual frames.
Reflex 2 is independent from DLSS. It can work on its own.
Steam | GitHub | Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

Dalek
Posts: 134
Joined: 21 Oct 2022, 10:18

Re: Nvidia Reflex 2 Frame Warp

Post by Dalek » 10 Jan 2025, 00:54

RealNC wrote:
09 Jan 2025, 14:05
Dalek wrote:
08 Jan 2025, 20:11
It doesn't change the fact that it still has horrific latency along with ghosting/smearing to it. I'd much rather have actual frames.
Reflex 2 is independent from DLSS. It can work on its own.
Oops, I guess my lack of sleep this week is showing :oops:, thank you for the correction.

User avatar
Chief Blur Buster
Site Admin
Posts: 12059
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Nvidia Reflex 2 Frame Warp

Post by Chief Blur Buster » 11 Jan 2025, 17:34

RonsonPL wrote:
08 Jan 2025, 12:22
- Let's start with the least important one, which nobody even here will agree with ;) - they introduced G-SYNC, which pushed strobing away and steered the discussion from "how clear the motion is" to "how fluid the motion is", planting a seed of misconceptions in many, many minds.
- Introduced TAA, which sacrifices the quality of the moving image for antialiasing of static or very slowly moving images.
- Introduced variable rate shading, which basically relies on "nobody can see anything in motion, so let's degrade it even more and move the power to the static image".
- Introduced ray tracing, path tracing and AI upscaling, which again pose serious problems for the popularization of clear motion in gaming.
The nice thing is that at CES 2025 I saw RTX ON graphics at >300fps on a 480Hz OLED, TAA fully disabled, and it looked much better than TAA. There's no way I'm not awarding at least a little kudos to those improved, less-fake frames.

I think NVIDIA is kinda course correcting.

Not perfect, or even good enough yet. But a much better route.

My next piece will be fawning over DLSS 4, so be forewarned. But as a subset within it, I will have a fire-breathing piece again, this time roasting TAA and variable-rate shading, with kudos to DLSS 4 for being a motion-purist improvement over DLSS 3.5. The multi-frame framegen is much better than 99% of TVs' interpolation, so I give them that. The AI parallax infills have improved so much that the 3 fake frames looked better than the 4000-series' single fake frame. That's an accomplishment: fewer artifacts despite more fake frames. (Now... I remind you, triangles and polygons are a different way of 'faking' real life... but I digress.) We're in an era where we're down to ~0.1% of the artifacts of 2010-era Sony Motionflow interpolation. That will continue to improve.

I cheerfully pointed out to NVIDIA that the clearer motion becomes, the more TAA/VRS artifacts show -- and that they should focus on lagless & artifactless framegen instead. Correct path IMHO.

Yes, a year ago, at CES 2024, I literally screamed (literally - loud voice) at two NVIDIA employees for neglecting 60fps. My new reduced-flicker CRT simulator is my tour-de-force micdrop. Seth Schneider can tell you how upset I was at NVIDIA.

One of my rare slips of professionalism, as it may be -- but I'm an Irish ginger, after all, with those redhead stereotypes. Oh, and I'm in Canada because of an 18th century potato famine in my ancestral country. You can imagine. But I robinhood for my Blur Busters fans as much as I can. I run my hobby-turned-business, but I don't forget my fans. At least I try not to (I can't remember a million names & requests).

But I have to keep cordial relations with the beautifully rendered green gorilla that makes Blur Busters possible (LightBoost catapulted Blur Busters to fame). No matter the love-hate relationship, the business itself still has to love them more than hate 'em. I do get GPU samples from 'em from time to time, and I feed a bit of tech suggestions back at them.

Glad to see them 'steal' (with my blessing) a small subset of the lagless framegen concepts into Reflex 2, with between-frame input reads as extra ground truth to make framegen much less fake. Now let's extend that to game objects (e.g. between-frame enemy movements) to make framegen even more lagless. Even the ultra-4K-highdef textures stayed sharp during framegen this time around.

I'm not the only idea generator around here, but I'm helping with a lot of the goading and amplifying since I've got fans and content creator allies that can be sic'd onto them too. We have to team up as a unified battlefront.

Even way back in 2014, with the early ULMB + G-SYNC monitors, "ULMB Pulse Width" existed in your monitor menus because of me (one NVIDIA employee confirmed as much; it was inspired by the LightBoost 10% fans of short-pulse-width strobing, which reduces even more motion blur).

This very forum exists because of a G-SYNC giveaway, one of my first collabs with NVIDIA.

Anyway;

You can see I root for AMD too, to goad them into leapfrogging NVIDIA, at the very top of my infographic:

Image

I play both the strobe-based blur busting game, and the framegen-based blur busting game too -- users should have a choice!

There's no way we're doing 4K 1000fps path tracing without the help of framegen, and framegen needs to improve massively, even much better than DLSS4. But let's put it this way: It's fallen to less than 0.1% of a 2010-era TV interpolation system, as long as your pre-framegen framerates are roughly at least ~70-80fps before framegen stage. The high pre-framegen framerate is VERY important; it pushes oscillating framegen-vs-nonframegen artifacts above flicker fusion threshold, making it much less visible than before.

And with better framegen there are also fewer of those artifacts, but picky motion purists will notice that a pre-framegen framerate above the flicker fusion threshold is very key to a sudden increase in the motion quality of framegen, especially on OLEDs.

True, not all of you will like it, but that's why I am giving people choice -- by releasing the Blur Busters Open Source Display Initiative. I'll have a plasma TV shader open-sourced too, in 2025-2026, as well as other display simulation shaders (like simulating LCD GtG on an OLED for better, less-stuttery 24fps Netflix; and no, I won't worsen the blacks).

Framegen does look amazing *if* you have an OLED, because framerate increases reduce motion blur much more dramatically on OLEDs than on LCDs. The motion blur improvement of yesteryear framegen was too pathetic to overcome the framegen artifacts, but in the OLED era with 4:1 framegen, the motion clarity improvement far outweighs the now much-more-minor detail loss of framegen.

Maybe wait if you have just an LCD, especially if you already have a display model with poor-quality strobing that you never use. Upgrade to 4:1 framegen if you have an OLED. The blur busting benefits of 4:1 to 8:1 framegen are gigantically dramatic on a 480Hz OLED. Some prefer strobe-based blur reduction since it can do 0.5-1ms MPRT, but some of us want the tradeoff of an extremely bright, colorful, HDR-equipped 2-3ms MPRT. And 60fps CRT simulators work more reliably on 240/360/480Hz OLEDs, if you want to blur bust your 60fps material.

Now, I've finally met a MiniLED display that keeps up with the CRT simulator. The new Lenovo 240Hz MiniLED gaming laptop looked amazing with the CRT simulator, so I am very pleased they are fixing MiniLED latencies. LCD is solidly staying in the ballgame; they're having to catch up because of OLED.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on: BlueSky | Twitter | Facebook


lann
Posts: 10
Joined: 16 Jun 2024, 03:02

Re: Nvidia Reflex 2 Frame Warp

Post by lann » 20 Jan 2025, 08:59

A question: Reflex 1 can actually reduce latency by shrinking the frame queue, for example allowing you to see enemies earlier while sniping. Reflex 2 only makes the visual feedback after your input faster; it cannot show enemies any earlier than Reflex 1. So it seems less helpful for competitive games, but it should be very effective in AAA games after frame generation.

User avatar
RealNC
Site Admin
Posts: 4428
Joined: 24 Dec 2013, 18:32
Contact:

Re: Nvidia Reflex 2 Frame Warp

Post by RealNC » 20 Jan 2025, 09:48

lann wrote:
20 Jan 2025, 08:59
A question: Reflex 1 can actually reduce latency by shrinking the frame queue, for example allowing you to see enemies earlier while sniping. Reflex 2 only makes the visual feedback after your input faster; it cannot show enemies any earlier than Reflex 1. So it seems less helpful for competitive games, but it should be very effective in AAA games after frame generation.
It is helpful in competitive games as well, because visual feedback and mouse feel are important. If you add mouse lag to your game -- which won't affect how soon you see enemies, only how soon the camera responds to your mouse movements -- you're going to be playing worse.
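The distinction can be put into numbers with a toy latency budget. All figures below are illustrative assumptions of mine, not measurements of any real game, driver, or GPU:

```python
# Toy latency budget, in milliseconds. All numbers are illustrative
# assumptions, not measurements of any real game, driver, or GPU.
sim_ms, render_ms, queue_ms, scanout_ms = 4.0, 8.0, 2.0, 4.0
warp_ms = 1.0  # assumed cost of the late reprojection pass

# Classic pipeline: the camera pose is baked in at simulation time, so
# mouse-to-photon latency spans the whole pipeline.
mouse_latency_classic = sim_ms + render_ms + queue_ms + scanout_ms  # 18.0

# Late warp re-samples the mouse just before scanout, so camera motion
# pays only the warp cost plus scanout.
mouse_latency_warp = warp_ms + scanout_ms                           # 5.0

# A *world* event (an enemy peeking) still traverses simulation and
# rendering either way: the warp does not show enemies earlier, it only
# tightens camera/aim feel.
enemy_event_latency = sim_ms + render_ms + queue_ms + scanout_ms    # 18.0
```

So both posters are right in a sense: the warp slashes camera-motion latency without changing when world events reach your eyes, and that camera responsiveness is itself worth something competitively.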

Post Reply