yuri wrote: ↑30 Sep 2022, 17:37
I'm the person that chief motion about strobe effect and yeah i was Hype about interpolation at high framerate, but DLSS lower the global quality of the image. and this look worse than DLSS 2
If you raise DLSS 3.0's quality setting instead of its performance setting, so that its frame-rate multiplier matches DLSS 2.0's, the newer 3.0 already does better. So at an apples-vs-apples framerate multiplier, you get better quality per pixel.
It's only when you really push it to high ratios (4x) that quality can suffer, but that becomes a user preference.
yuri wrote: ↑30 Sep 2022, 17:37
Nvidia really need to separate interpolation feature and DLSS.
the latency is crazy AF with DLSS too.
1. It appears there is already a way to disable the interframe feature (I heard it is via the DLSS quality setting).
DLSS 3.0 configured to perform like DLSS 2.0 has better quality than DLSS 2.0, so just back off the settings to the same-ratio framerate amplification. Things improve dramatically, and apples-vs-apples favours DLSS 3.0 quality over DLSS 2.0.
2. Latency is inversely proportional to frame rate.
If you feed DLSS a 100fps feedstock, the latency to amplify 100fps -> 200+fps is tiny (mere milliseconds), an order of magnitude less than television interpolation. This is BIG for some fans here who turn on interpolation with consoles -- because sometimes they have motionblur eyestrain.
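As a back-of-the-envelope sketch (my own illustration, not NVIDIA's published numbers): one-frame interpolation has to hold back one frame, so the minimum added lag is roughly one feedstock frame time.

```python
# Minimum lookahead buffering lag for interpolation that holds back
# exactly one frame: roughly one frame time of the feedstock.
def lookahead_lag_ms(feedstock_fps: float) -> float:
    """One feedstock frame time, in milliseconds."""
    return 1000.0 / feedstock_fps

print(lookahead_lag_ms(25))   # 40.0 ms -- console-like feedstock
print(lookahead_lag_ms(100))  # 10.0 ms -- high-framerate feedstock
```

That is why a higher original frame rate going into the frame rate amplifier shrinks the latency penalty so dramatically.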
yuri wrote: ↑30 Sep 2022, 17:37
in fact i've test on Bullet per minute that if you put the lowest motion blur setting at 240fps (low setting, and the value at 1/10) the strobe effect become not perceptible if you don't look for it, and the overall blur is not really disgusting to see.
Unfortunately everybody sees differently.
Some have brightness eyestrain/nausea/motionsickness
Some have blue light eyestrain/nausea/motionsickness
Some have flicker eyestrain/nausea/motionsickness
Some have phantom array (stroboscopics) eyestrain/nausea/motionsickness
Some have high frame rate eyestrain/nausea/motionsickness
Some have low frame rate eyestrain/nausea/motionsickness
Some have motion blur eyestrain/nausea/motionsickness
Some have more than one of above
Etc, etc.
For some of us, DLSS 3.0 is sometimes the lesser evil, alas.
Remember, some fans here turn on (+100ms lag) interpolation on TVs when playing PlayStation games, due to a "motionblur nausea" issue (motion sickness), and they find it solves the motion sickness. Even the Game Mode interpolation feature still adds +40ms, which is still a lot.
Some have nausea and motion sickness from stroboscopics (phantom array effects) -- but I don't. So I love strobing and turning off GPU motion blur effects. However, I recognize the art of Blur Busting is more complex.
We know all this because we're Blur Busters, and people often come to us to solve a specific problem -- whether more Hz helps them, how to be less motionsick during gaming, etc. Sometimes Hz helps a hella lot, and sometimes the extra frames solve the problem.
_____________________
Now that being said, DLSS 3.0 can blur motion somewhat, so you have to fiddle with the quality settings until you get the right tradeoff for your specific game and your specific situation.
Compared to black-box interpolators, DLSS is not so black-box (it has direct access to GPU memory), so it can have fewer artifacts than TV-based interpolation. It gets fewer artifacts per pixel of processing power than black-box interpolators, which need to collect massive amounts of lookbehind/lookahead frame history (requiring them to buffer many frames in advance). DLSS only buffers one extra frame, and you can shrink that one-frame penalty with a higher feedstock frame rate: while 25fps means 1/25sec of lookahead buffering lag, 100fps means only 1/100sec. Using DLSS with only a fraction of that latency penalty is HEAVEN to those motionblur-sickness people who get headsplitting nausea at anything less than a certain frame rate.
In the future, API hooks into frame rate amplifiers can accept translation data (e.g. 1000Hz controller data, movements of characters) to eliminate lookahead latencies, using extrapolation-related techniques instead of interpolation-related ones. If a frame rate amplifier knows the mouse moved a bit, it can simply extrapolate/reproject (which does not require lookahead buffering) instead of interpolating between two adjacent frames (which does).
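A conceptual sketch of that difference (my own toy illustration, not NVIDIA's or Oculus' actual algorithm -- the function names and 1D "yaw" model are hypothetical):

```python
# Extrapolation: predict the generated frame from the NEWEST controller
# input plus the last rendered frame -- no future frame needed.
def extrapolate_yaw(last_yaw: float, mouse_dx: float, sensitivity: float) -> float:
    """Predict camera yaw from the latest mouse delta (zero lookahead)."""
    return last_yaw + mouse_dx * sensitivity

# Interpolation: blend between two ALREADY-RENDERED frames -- the next
# frame must exist first, which is exactly the lookahead buffering lag.
def interpolate_yaw(prev_yaw: float, next_yaw: float, t: float) -> float:
    """Blend yaw between two rendered frames at blend factor t in [0,1]."""
    return prev_yaw + (next_yaw - prev_yaw) * t

# A 1000Hz mouse reports fresh movement between rendered frames:
print(extrapolate_yaw(90.0, 0.4, 1.0))   # available immediately
print(interpolate_yaw(90.0, 90.8, 0.5))  # only after the next frame is rendered
```

Both paths can produce a similar in-between image; the difference is that the extrapolated one never waits for a future frame, which is why reprojection-style amplifiers can be latency-free.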
Eliminating the blackboxness from an interpolator makes it cease to be an interpolator, so there is a future engineering path forward. People who have experience with Oculus ASW know that reprojection can be fantastic if done properly.
Hopefully DLSS 3.0 can be enhanced to be VSYNC-compatible for VR compatibility, but even if not, it could wait until DLSS 4.0. I hope AMD and Intel come up with answers to large-ratio frame rate amplification (4x and greater).
Perspective FTW!