NVIDIA introduces DLSS3, interpolates frames, but is not v-sync compatible.

Ask about motion blur reduction in gaming monitors. Includes ULMB (Ultra Low Motion Blur), NVIDIA LightBoost, ASUS ELMB, BenQ/Zowie DyAc, Turbo240, ToastyX Strobelight, etc.
Cellx
Posts: 11
Joined: 24 Feb 2020, 07:56

Re: NVIDIA introduces DLSS3, interpolates frames, but is not v-sync compatible.

Post by Cellx » 30 Sep 2022, 18:52

Geez, I didn't expect the DF hate here of all places.

Anyway, how about doing something about this vsync issue? There's an alternative way to force vsync: using RTSS's scanline sync. It works at 60fps already, but not at 120. It would be great if they could fix that somehow without as much of a CPU cost.
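For context on why 120Hz scanline sync is harder than 60Hz: the tearline has to be steered onto a specific scanline every refresh, and the time window per scanline halves when refresh rate doubles. A back-of-envelope sketch (the 1125-total-line figure is an assumption typical of 1080p video timings including blanking, not RTSS's actual internals):

```python
# Rough illustration (not RTSS's real algorithm): scanline sync times each
# Present() so the tearline lands on a chosen (ideally hidden) scanline.
# The timing budget per scanline shrinks as refresh rate rises.

def scanline_budget_us(refresh_hz, total_scanlines=1125):
    """Microseconds the scanout spends on one scanline.

    Assumes ~1125 total lines including blanking (a common 1080p figure).
    """
    refresh_period_us = 1_000_000 / refresh_hz
    return refresh_period_us / total_scanlines

for hz in (60, 120):
    print(f"{hz}Hz: ~{scanline_budget_us(hz):.1f} us per scanline")
```

So the steering deadline at 120Hz is roughly half that of 60Hz, which plausibly explains the extra CPU cost of hitting it reliably.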

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: NVIDIA introduces DLSS3, interpolates frames, but is not v-sync compatible.

Post by Chief Blur Buster » 30 Sep 2022, 22:45

yuri wrote:
30 Sep 2022, 17:37
I'm the person who raised the strobe effect issue, and yeah, I was hyped about interpolation at high framerates, but DLSS lowers the overall quality of the image, and this looks worse than DLSS 2 :cry:
If you increase quality instead of performance, slowing DLSS 3.0 down to the same performance increase as DLSS 2.0, the newer 3.0 already does better. So for an apples-vs-apples framerate multiplier, you get better quality per pixel.

It's when you really push it to high ratios (4x) that quality can suffer, but that would be a user preference.
yuri wrote:
30 Sep 2022, 17:37
Nvidia really needs to separate the interpolation feature from DLSS.
The latency is crazy AF with DLSS too.
1. It appears there is already a way to disable the interframe feature (I heard it is via the DLSS quality setting)

DLSS 3.0 configured to perform like DLSS 2.0 has better quality than DLSS 2.0. So just back off the settings to the same-ratio framerate amplification, and things improve dramatically; apples-vs-apples, quality favours DLSS 3.0 over DLSS 2.0.

2. Latency is inversely proportional to frame rate.

If you feed DLSS a 100fps feedstock, the latency to amplify 100fps to 200+fps is tiny (milliseconds), an order of magnitude less than television interpolation. This is BIG for some fans here who turn on interpolation with consoles -- because sometimes they have motion blur eyestrain.
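As a back-of-envelope illustration of this point (a sketch of the arithmetic only, not NVIDIA's actual pipeline): one extra buffered frame of lookahead costs 1/feedstock-fps seconds, so a higher feedstock frame rate directly shrinks the interpolation lag:

```python
def lookahead_lag_ms(feedstock_fps):
    """Lag from buffering one extra source frame for interpolation, in ms."""
    return 1000.0 / feedstock_fps

for fps in (25, 60, 100):
    print(f"{fps}fps feedstock -> ~{lookahead_lag_ms(fps):.0f}ms lookahead lag")
# At a 100fps feedstock the one-frame penalty is ~10ms,
# versus roughly ~100ms for typical TV interpolation.
```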
yuri wrote:
30 Sep 2022, 17:37
In fact, I've tested in Bullets Per Minute that if you use the lowest motion blur setting at 240fps (low setting, with the value at 1/10), the strobe effect becomes imperceptible if you don't look for it, and the overall blur is not really disgusting to see.
Unfortunately everybody sees differently.

Some have brightness eyestrain/nausea/motionsickness
Some have blue light eyestrain/nausea/motionsickness
Some have flicker eyestrain/nausea/motionsickness
Some have phantom array (stroboscopics) eyestrain/nausea/motionsickness
Some have high frame rate eyestrain/nausea/motionsickness
Some have low frame rate eyestrain/nausea/motionsickness
Some have motion blur eyestrain/nausea/motionsickness
Some have more than one of above
Etc, etc.

For some of us, DLSS 3.0 is sometimes the lesser evil, alas.

Remember, some fans here turn on (+100ms lag) interpolation on TVs when playing PlayStation games, due to a "motion blur nausea" issue (motion sickness), and we find it solves their motion sickness. Even the Game Mode interpolation feature still adds +40ms, which is still a lot.

Some have nausea and motion sickness from stroboscopics (phantom array effects) -- but I don't. So I love strobing and turning off GPU motion blur effects. However, I recognize the art of Blur Busting is more complex than that.

We know all this because we're Blur Busters, and people often come to us to solve a certain problem -- whether more Hz helps them, how to be less motion sick during gaming, etc. Sometimes Hz helps a hella lot, and sometimes the extra frames solve the problem.

_____________________

Now that being said, DLSS 3.0 can blur motion quite a bit, so you have to fiddle with the quality settings until you get the right tradeoff for your specific game in your specific situation.

Compared to black-box interpolators, DLSS is not so black-box (it has direct access to GPU memory), so it can have fewer artifacts than TV-based interpolation. It has fewer artifacts per pixel of processing power than black-box interpolators that need to collect massive amounts of lookbehind/lookahead frame history data (which requires them to buffer a lot of frames in advance). DLSS only buffers one frame extra, and you can reduce that one-frame-extra penalty with a higher feedstock original frame rate: while 25fps may mean 1/25sec of lookahead buffering lag, 100fps means only 1/100sec. Using DLSS with only a fraction of that latency penalty is HEAVEN to those motion-blur-sickness people who get head-splitting nausea at anything less than a certain frame rate.

In the future, API hooks into frame rate amplifiers could accept translation data (e.g. 1000Hz controller data, movements of characters) to eliminate lookahead latencies, using extrapolation instead of interpolation. If a frame rate amplifier knows the mouse moved a bit, it can simply extrapolate/reproject (which does not require lookahead buffering) instead of interpolating between two adjacent frames (which does).
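The interpolation-vs-extrapolation latency difference can be shown with a toy contrast (this is not NVIDIA's or Oculus's implementation, just the core idea): interpolation must wait for the *next* frame before it can emit an in-between frame, while extrapolation predicts forward from the latest frame plus fresh input, so no lookahead buffer is needed.

```python
# Toy 1-D camera position model. interpolate() requires the future frame
# (lookahead buffering lag); extrapolate() only needs the latest frame plus
# recent motion data, e.g. from a high-rate mouse poll.

def interpolate(prev_pos, next_pos, t=0.5):
    """Blend between two known frames -- requires waiting for next_pos."""
    return prev_pos + (next_pos - prev_pos) * t

def extrapolate(latest_pos, velocity, dt):
    """Predict forward from the latest frame -- no waiting required."""
    return latest_pos + velocity * dt

# Camera x-position across two rendered frames:
print(interpolate(100.0, 110.0))          # 105.0 -- but we had to buffer frame 2
print(extrapolate(110.0, 1000.0, 0.005))  # 115.0 -- predicted, zero lookahead
```

The tradeoff: extrapolation mispredicts when motion changes abruptly (hence reprojection artifacts), but it never adds the buffered-frame lag that interpolation does.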

Eliminating the black-boxness from an interpolator makes it cease to be an interpolator, so there is a future engineering path forward. People who have experience with Oculus ASW know that reprojection can be fantastic if done properly.

Hopefully DLSS 3.0 can be enhanced to be VSYNC-compatible for VR compatibility, but even if not, it could wait till DLSS 4.0. I hope AMD and Intel come up with answers to large-ratio frame rate amplification (4x and greater).

Perspective FTW!
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:
1. Rule #1: Be Nice. This is published forum rule #1. Even to newbies & people you disagree with!
2. Please report rule violations. If you see a post that violates forum rules, report the post.
3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

RonsonPL
Posts: 122
Joined: 26 Aug 2014, 07:12

Re: NVIDIA introduces DLSS3, interpolates frames, but is not v-sync compatible.

Post by RonsonPL » 01 Oct 2022, 09:26

yuri wrote:
30 Sep 2022, 17:37

As for motion blur (Digital Foundry made a video about it), it helps a lot for masking the gaps between each frame, but devs don't always provide a slider to adjust the motion blur. For me, someone who hates the stroboscopic effect at high frame rates, it's better to inject 1 frame of blur with ReShade than to make a 180° turn with double images everywhere.

In fact, I've tested in Bullets Per Minute that if you use the lowest motion blur setting at 240fps (low setting, with the value at 1/10), the strobe effect becomes imperceptible if you don't look for it, and the overall blur is not really disgusting to see.

I'd like to take a guess here that you're talking about online FPP shooters like CS:GO, where you make quick 180° turns. Your play style matters a lot.

I've noticed that people who learned how to play on blurred displays tend to steer differently in games. Naturally, they don't use the moves which simply don't work on a sample-and-hold display, especially if it was a 60Hz display. There are even esports players who always keep their eyes on the centre of the screen. I won't imply that this is worse for effectiveness in game. It might even be better for scoring better K/D ratios and such. But I would argue it's not the only acceptable way of playing.
End of response.
OFFTOPIC only below
For me, it's the complete opposite. I prefer smooth and slow movement even when I play using a mouse. I once posted a video of myself playing Battlefield 4 online, and people were asking if I was playing on a joypad or something.
Thing is, I don't go for "twitch aim" unless I really need to. I prefer to look at the scenery when moving the camera, so this naturally lessens the stroboscopic problem for me. Even at 50Hz you can turn the camera without ever seeing the problem. That's how I played PS2 3D platform games, although that was on a joypad controller, which meant slower pan speeds.
Likewise, when I just move around even in a competitive FPP game like CS:GO, I move my mouse in such a way that I can still keep my eye on something in the scenery while moving. I'm not doing it because I dislike the strobe effect, but just to have more fun looking at the in-game graphics and for better immersion. Well, at least to a degree. In games like Witcher 3 and platform games, I do it more often, sometimes even switching to joypad controls over the mouse, if aiming is rare and I'd rather focus on just travelling, admiring the landscapes and the game world in general. :)
PS. There's no blur which is not disgusting to me. I was poor. I had to use a very crappy monitor from 1992 to 1998. Everyone around me had newer models. Even the 1997 14" CRT monitor my friend bought at that time was miles better than my 14" manufactured around 1990 (an el cheapo brand as well). Blurry, muddy image? Less vivid, less "alive"? I don't like it. I had to work hard to get something better, and it took me years, as I was just a teenager back then and had to spend all my money on PC parts anyway; upgrading from a 386 to a Pentium required years of extreme saving up. So now, if some stupid post-process effect reminds me of that old crappy 1990 14" piece of junk, I immediately hate it. No matter if it's FXAA, TAA, motion blur or whatever. It is disgusting :D

I'd really like to see the addition of a "spoiler" feature, so I can hide the off-topic text in situations like this, where a lot of text is directed only at the one person I'm replying to.

yuri
Posts: 46
Joined: 09 Jun 2022, 14:19

Re: NVIDIA introduces DLSS3, interpolates frames, but is not v-sync compatible.

Post by yuri » 01 Oct 2022, 17:24

RonsonPL wrote:
01 Oct 2022, 09:26
[quoted in full above; snipped]
My play style and sensitivity, I guess.

Actually, the more you flick and move quickly, the more visible the strobe effect is; with 1600 dpi in my case.
But I don't know how people can play on a strobed display with zero motion blur and double images everywhere in FPS games. I guess it's just adaptation, like with old CRTs back then, but we weren't as picky as we are now...

Looking at the landscape like you said: for me, with no motion blur, every high-contrast object is disgusting to see when you move. (If the game is above 180fps, the difference between 144 and 240 is clearly visible in terms of blur and strobe effects.)
I don't like motion blur either; that's why I use ReShade to set the amount myself. I remember Uncharted 4 had PS2-style motion blur :o
In fact, per-object motion blur is good when used correctly (Doom and MW).

Post-processing can be good if used in a good way.

yuri
Posts: 46
Joined: 09 Jun 2022, 14:19

Re: NVIDIA introduces DLSS3, interpolates frames, but is not v-sync compatible.

Post by yuri » 01 Oct 2022, 17:32

Chief Blur Buster wrote:
30 Sep 2022, 22:45
[quoted in full above; snipped]
So I wonder: if you use NVIDIA Reflex+DLSS, or NULL, how does the number of pre-rendered frames impact latency? :?:

1000WATT
Posts: 391
Joined: 22 Jul 2018, 05:44

Re: NVIDIA introduces DLSS3, interpolates frames, but is not v-sync compatible.

Post by 1000WATT » 03 Oct 2022, 13:03

Now on eBay:
3090: $500
3080: $300
Video cards are selling for a dollar a kilogram )
It seems to me the 4090, priced at 2000 euros in Europe for the Founders Edition, is unjustifiably overpriced.
I often do not clearly state my thoughts. google translate is far from perfect. And in addition to the translator, I myself am mistaken. Do not take me seriously.

yuri
Posts: 46
Joined: 09 Jun 2022, 14:19

Re: NVIDIA introduces DLSS3, interpolates frames, but is not v-sync compatible.

Post by yuri » 04 Oct 2022, 07:34

1000WATT wrote:
03 Oct 2022, 13:03
Now on eBay:
3090: $500
3080: $300
Video cards are selling for a dollar a kilogram )
It seems to me the 4090, priced at 2000 euros in Europe for the Founders Edition, is unjustifiably overpriced.
$1000 more than the 3080 for 20fps more :!:

nuninho1980
Posts: 140
Joined: 26 Dec 2013, 09:49

Re: NVIDIA introduces DLSS3, interpolates frames, but is not v-sync compatible.

Post by nuninho1980 » 04 Oct 2022, 07:48

"...but is not v-sync compatible."

FALSE, we want vsync!!!! 😡
CPU: [email protected]
RAM: 2x16GB DDR4@3600MHz
MB: MSI PRO Z690-A DDR4
GPU: Zotac RTX 4090 non-OC new! <3 :D
Opt. disc: LG BD-RE writer BH16NS40
HDD: SATA 1TB
SSDs: OCZ RD400 0.5TB+Crucial MX500 2TB
PSU: AEROCOOL 1kW 80+ Gold
Display: CRT 21" Sony E530 :D

Cellx
Posts: 11
Joined: 24 Feb 2020, 07:56

Re: NVIDIA introduces DLSS3, interpolates frames, but is not v-sync compatible.

Post by Cellx » 11 Oct 2022, 09:13

Looks like you CAN force vsync on in the drivers after all with DLSS3! But according to DF, there are some issues depending on the game.

carlcamper
Posts: 1
Joined: 28 Oct 2019, 18:50

Re: NVIDIA introduces DLSS3, interpolates frames, but is not v-sync compatible.

Post by carlcamper » 12 Oct 2022, 01:44

Cellx wrote:
11 Oct 2022, 09:13
Looks like you CAN force vsync on in the drivers after all with DLSS3! But according to DF, there are some issues depending on the game.
Good to hear! I can't use G-SYNC, as my OLED monitor (FO48U) exhibits VRR flicker, so VSYNC will be the way to go.

Post Reply