The Amazing Human Visible Feats Of The Millisecond


The Amazing Human Visible Feats Of The Millisecond

Post by Chief Blur Buster » 08 May 2020, 12:57

Crosspost from the other thread

____________________________

Advantages of high refresh rates?

Someone dared to ask that question on Blur Busters ;)
While the OP knows some of this already, most posters don't realize how many contexts the millisecond is important in.

<Big Rabbit Hole>
I am going to write a shorter version of one of those famous Blur Busters flavored pieces.
This post will be a bit of a boast, justifiably so, because we're the "Everything Better Than 60Hz" website.

There are many contexts where the humble millisecond is important, and some where it isn't. But milliseconds matter in lots of display science -- motion clarity, strobe crosstalk, reaction times, refresh cycles, stutters, frametime differences, latency, etc. Sometimes you optimize a display to have 1ms less latency, and there are occasionally also beneficial (internal engineering) side effects, because so many display factors interact with each other.

Frametime Context / Refresh Rate Context
The most famous Milliseconds Matter example: 144fps vs 240fps is only a 2.78ms frametime difference, YET it is still human-visible as improved motion, including less motion blur on sample-and-hold displays and reduced stroboscopic effects. Likewise, 240Hz vs 360Hz is only a 1.39ms frametime difference, yet it is still human-visible.

[Image: motion comparison at 960 pixels/sec]

The above is simplified because it uses slower motion (960 pixels/sec). The differences become much more visible (and 1000Hz shows its limitations) at higher motion speeds like 3840 pixels/sec rather than the 960 pixels/sec shown above.

That said, we ideally need to go up the curve geometrically (60Hz->120Hz->240Hz->480Hz->960Hz) rather than incrementally (144Hz->165Hz, 240Hz->280Hz). 60Hz-vs-144Hz is a 2.4x improvement in motion clarity (if GtG=0), while 144Hz-vs-240Hz is only a ~1.7x improvement (if GtG=0), whereas 144Hz-vs-360Hz is a 2.5x improvement in motion clarity (if GtG=0). As a result, the jump from 60Hz to 144Hz is more similar to the jump from 144Hz to 360Hz.
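For anyone who wants to sanity-check that arithmetic, here is a minimal sketch (the helper names are mine, and it assumes an idealized sample-and-hold display where GtG=0 and persistence simply equals frametime):

```python
# Back-of-envelope math for the refresh-rate examples above.
# Assumes an idealized sample-and-hold display with GtG = 0,
# where persistence (MPRT) simply equals the frametime.

def frametime_ms(fps: float) -> float:
    """Frame duration in milliseconds at a given frame rate."""
    return 1000.0 / fps

def clarity_improvement(low_hz: float, high_hz: float) -> float:
    """Motion-clarity ratio between two refresh rates (GtG = 0 assumed)."""
    return high_hz / low_hz

print(frametime_ms(144) - frametime_ms(240))   # ~2.78 ms frametime difference
print(frametime_ms(240) - frametime_ms(360))   # ~1.39 ms frametime difference
print(clarity_improvement(60, 144))            # 2.4x
print(clarity_improvement(144, 240))           # ~1.67x
print(clarity_improvement(144, 360))           # 2.5x
```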

Stutter Context
Stutters are caused by gametime:photontime variances. Many causes exist: the game engine, the sync technology, fluctuating frame rates, and so on. Humans can still see frame rate fluctuations that are only a few milliseconds apart in frametime. 100fps vs 200fps is only a 5 millisecond difference in frametime, and it's definitely human-visible on 240Hz displays with fast GtG. Variable refresh rate such as G-SYNC and FreeSync can make stutter less visible by avoiding the fps-vs-Hz aliasing effect of the fixed refresh cycle schedule (animation of variable refresh rate benefits), but it is not completely immune: gametime:photontime can still diverge for other reasons like engine inefficiencies, multi-millisecond-scale system freezes, dramatic rendertime differences between consecutive frames, etc. There is even an article for game developers about how multi-millisecond issues can add stutters to VRR, and it's a simpler bug to fix than many developers realize. Recently Blur Busters helped a game developer fix stutters in VRR, with rave reviews from end users, precisely because of this.
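A minimal sketch of catching gametime:photontime divergence from frame logs (the threshold and log format here are illustrative assumptions, not from any particular engine):

```python
# Minimal sketch of detecting gametime:photontime divergence (stutter).
# The 2 ms threshold and the list-of-timestamps format are illustrative
# assumptions, not from any particular game engine.

def find_stutters(gametimes_ms, photontimes_ms, threshold_ms=2.0):
    """Flag frames where presented-frame spacing diverges from the
    game-simulation spacing by more than threshold_ms."""
    stutters = []
    for i in range(1, len(gametimes_ms)):
        game_delta = gametimes_ms[i] - gametimes_ms[i - 1]
        photon_delta = photontimes_ms[i] - photontimes_ms[i - 1]
        error = photon_delta - game_delta
        if abs(error) > threshold_ms:
            stutters.append((i, error))
    return stutters

# Example: one frame delivered 5 ms late relative to its game time.
game = [0.0, 10.0, 20.0, 30.0, 40.0]
photon = [0.0, 10.0, 25.0, 30.0, 40.0]
print(find_stutters(game, photon))   # [(2, 5.0), (3, -5.0)]
```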

Input Lag Context
For input latency, you don't need to feel the milliseconds to win by the milliseconds. When you're earning $100,000 in esports, milliseconds matter when the top champions are relatively well-matched, like Olympic sprinters at the starting line waiting for the starting pistol.
- The "Olympics finish line effect": Two racers pass the finish line milliseconds apart. Likewise, two esports players go around a corner in an alley or dungeon, see each other simultaneously, draw guns simultaneously, shoot simultaneously. The one with less lag is statistically more likely to win that frag.
- The "I'm suddenly missing my sniping shots" factor: Remember, 1ms of latency equals 1 pixel of offset for every 1000 pixels/sec of motion. Say, 5ms at 2000 pixels/sec (roughly one screen width per second) works out to a 10-pixel offset relative to your trained aim (see the quick calc below). The "Dammit, why does this display make me feel like I'm missing my shots?" effect [ugh] [the player later discovers that the display has more lag than their previous display].

Reaction Time Context
Also, Blur Busters commissioned a paid reaction-time study, called Human Reflex, and it has three sections with some rather interesting findings. There are many kinds of reaction stimuli (visual, aural, and many subtypes such as sudden-appearance stimuli or motion-change stimuli), with different reaction times, and this study looked at a different kind of stimulus that may apparently be faster (<100ms!) than a starting-pistol-type stimulus. More study is needed, but it shows how complex reaction-time stimuli are, and it has only barely scratched the surface.

Eye-Hand-Coordination Context
In virtual reality, you need real life and virtual reality to stay in sync as closely as possible. Extreme lag creates dizziness. If you're swinging a lightsaber in a fast 10 meters/sec whoosh (1000 centimeters/second), a 20ms latency means your virtual hand whoosh will lag behind by 20 centimeters (0.02 sec * 1000 cm/sec = 20 cm). That's nearly 8 inches behind. People can get dizzy from that. Let that sink in. If we are emulating Holodecks, milliseconds matter a hell of a lot if you don't want to get dizzy in VR.
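The same arithmetic as the aim-offset example, just in physical units (a minimal sketch; the function name is mine):

```python
# Physical offset of a tracked hand/controller caused by motion-to-photon latency.
# offset (cm) = hand speed (cm/sec) * latency (seconds)

def vr_offset_cm(latency_ms: float, speed_cm_per_sec: float) -> float:
    return speed_cm_per_sec * (latency_ms / 1000.0)

print(vr_offset_cm(20, 1000))   # 20 cm of lag on a 10 m/s lightsaber swing
```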

Netcode Context
Yes, netcode lag and network jitter apply. But in the era of FTTH and LAN play, even with 128-tick servers, a 4ms advantage means you're roughly 50% more likely to land on that earlier tick (4ms is about half of the ~7.8ms tick interval), and get that frag too. 4ms is one full 1/240sec refresh cycle! And, did you know... Battle(non)sense, the YouTube netcode expert, also wrote a guest article for Blur Busters.
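The tick arithmetic behind that claim (a quick sketch; variable names are mine):

```python
# Server tick spacing vs. a few milliseconds of latency advantage.
tick_interval_ms = 1000.0 / 128         # ~7.81 ms between ticks on a 128-tick server
advantage_ms = 4.0
print(tick_interval_ms)                  # 7.8125
print(advantage_ms / tick_interval_ms)   # ~0.51 -- a 4 ms edge spans about half a tick
print(1000.0 / 240)                      # ~4.17 ms -- one full 240 Hz refresh cycle
```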

MPRT Context
Now, milliseconds also matter in other contexts (motion quality), given that 0.25ms MPRT versus 0.5ms MPRT versus 1.0ms MPRT are now human-visible motion clarity differences in the refresh rate race to retina refresh rates -- especially at 4000 pixels/second. (Just adjust ULMB Pulse Width on an NVIDIA ULMB monitor while viewing TestUFO at 2000 through 4000 pixels/sec to witness clarity differences of sub-millisecond MPRT.) This is thanks to the Vicious Cycle Effect, where bigger displays, higher resolutions, higher refresh rates, wider FOV, and faster motion all combine to amplify the visibility of millisecond-scale flaws.
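The blur-width arithmetic behind those MPRT numbers (a minimal sketch, function name mine):

```python
# Motion blur width from persistence:
# blur (pixels) = MPRT (seconds) * motion speed (pixels/sec)

def blur_px(mprt_ms: float, speed_px_per_sec: float) -> float:
    return speed_px_per_sec * (mprt_ms / 1000.0)

for mprt in (0.25, 0.5, 1.0):
    print(mprt, "ms MPRT ->", blur_px(mprt, 4000), "pixels of blur at 4000 px/sec")
# 0.25 ms -> 1 px, 0.5 ms -> 2 px, 1.0 ms -> 4 px
```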

Strobe Backlight Context
Also for improved strobe backlights -- GtG limitations are why ULMB was disabled above 144Hz. Faster GtG makes it easier to hide the GtG transition in the VBI (vertical blanking interval) to reduce strobe crosstalk. With 0.5ms GtG it is easier to hide LCD pixel response limitations between 240Hz refresh cycles (1/240sec = ~4.17ms), because you have to flash between scanout sweeps (high speed video #1, high speed video #2). Even a 0.5ms mistime can amplify strobe crosstalk by 2x to 10x, depending on whether it starts to encroach on a bad part of the GtG curve.
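A rough, simplified timing-budget model of why this is hard (the scanout time, GtG, and pulse width below are assumed numbers for illustration, not any specific panel):

```python
# Rough strobe-timing budget check (illustrative numbers, not a specific panel).
# For low crosstalk, pixel transitions (GtG) should settle inside the blanking
# interval between refresh-cycle scanouts, before the backlight flash fires.

refresh_hz = 240
refresh_period_ms = 1000.0 / refresh_hz     # ~4.17 ms per refresh cycle
scanout_ms = 3.0                            # assumed accelerated scanout of visible lines
vbi_ms = refresh_period_ms - scanout_ms     # blanking time left over (~1.17 ms)

gtg_ms = 0.5                                # assumed panel pixel response
flash_ms = 0.3                              # assumed strobe pulse width

fits = (gtg_ms + flash_ms) <= vbi_ms
print(f"VBI budget: {vbi_ms:.2f} ms, needed: {gtg_ms + flash_ms:.2f} ms, fits: {fits}")
# A mistimed or slow GtG spills into the next scanout -> visible strobe crosstalk.
```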

Pixel Response FAQ, GtG vs MPRT
Needless to say, Blur Busters also has one of the world's best Pixel Response FAQs, GtG versus MPRT. While 1ms GtG is unimportant on 60Hz displays, it's a giant cliff of a problem at 360Hz, where GtG needs to become faster than 1ms. GtG needs to be a tiny fraction of a refresh cycle to prevent bottlenecking the Hz improvements. Also, strobeless blur reduction requires brute Hz. In the strobeless MPRT context, doubling Hz halves motion blur, and you need approximately ~1000Hz to achieve ULMB-class clarity strobelessly & laglessly: the clarity of low persistence at full duty cycle, with no black periods between short-persistence frames.
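The strobeless math in one line (a sketch under the same sample-and-hold assumption as above):

```python
# Strobeless blur reduction needs brute-force frame rate:
# on a flicker-free sample-and-hold display, MPRT = frametime,
# so matching a 1 ms strobe pulse requires ~1000 fps at ~1000 Hz.

def strobeless_hz_for_mprt(target_mprt_ms: float) -> float:
    return 1000.0 / target_mprt_ms

print(strobeless_hz_for_mprt(1.0))   # 1000 Hz to match 1 ms ULMB-style persistence
print(strobeless_hz_for_mprt(0.5))   # 2000 Hz to match 0.5 ms persistence
```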

Milliseconds Work With Manufacturers
We often work with vendors and manufacturers nowadays (we're more than a website) -- services.blurbusters.com .... We've also got the Blur Busters Strobe Utility, as well as the Blur Busters Approved programme.

Strobe Backlight Precision Context
Did you know 10 microseconds became human-visible in this case? I once helped a manufacturer debug an erratically-flickering strobe backlight. There are 1% more photons in a 1010 microsecond strobe flash versus a 1000 microsecond strobe flash. A 1% brightness change is almost 3 RGB shades apart (similar to greyscale value 252 versus greyscale value 255). If the backlight erratically goes to 1010 microseconds for a few strobe flashes a second, it becomes visible as an erratic, faint candlelight-style flicker when staring into a maximized Windows Notepad window or a bright game scene (e.g. outdoor scene). Yup. 10 microseconds. 0.01 milliseconds. An annoying, human-visible artifact.
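The pulse-width-to-brightness relationship is simple duty-cycle math (sketch; function name mine):

```python
# Percent brightness (photon) change from a strobe pulse-width error.
# Strobe brightness is proportional to flash duration, so a 10 microsecond
# error on a 1000 microsecond pulse is a 1% luminance step.

def brightness_change_pct(nominal_us: float, actual_us: float) -> float:
    return (actual_us / nominal_us - 1.0) * 100.0

print(brightness_change_pct(1000, 1010))   # ~1.0 (% brighter)
```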

Discovery of G-SYNC Frame Capping Trick
Oh, and in 2014 we were also the world's first website to discover how to measure the input lag of G-SYNC. This led to the discovery of the "cap below max Hz" trick -- we're the first website to recommend that. Now "cap 3fps below max Hz" is standard, widely-parroted advice on VRR displays.

Journey to 1000Hz Displays
And if you're enthralled by these articles, you should probably be aware of Blur Busters Law: The Amazing Journey To Future 1000 Hz Displays, as well as other articles like Frame Rate Amplification Technology, which enables higher frame rates on less powerful GPUs (it's already happening with Oculus ASW 2.0 and NVIDIA DLSS 2.0, and will continue to progress until we get 5:1 or 10:1 frame rate amplification ratios). ASUS has already roadmapped 1000Hz displays in about ten years, thanks to a lot of Blur Busters advocacy, as told to us, to PC Magazine, and to other media by ASUS PR.

Also, sometimes improving one millisecond context automatically improves a different millisecond context (lag <-> image quality), though there can be interactions where one worsens the other.

Blur Busters exists because Milliseconds Matter. Blur Busters is all about milliseconds. Motion blur is about milliseconds. It's right there in our name. We're paid to pay attention to the millisecond. :D

We know our milliseconds stuff!

</Big Rabbit Hole>

Re: The Amazing Human Visible Feats Of The Millisecond

Post by Chief Blur Buster » 31 May 2020, 13:45

The latency training context is super-important.

Metaphorically, it's like pre-shooting an archery arrow ahead of a horizontally-moving archery target. That's the latency training effect.

Also known as "muscle memory" or "latency familiarity" or other terms, it's training towards a predictable lag.

Latency Aim Training Effect

Latency Training Context on Sudden Hardware Lag Changes
If you're slewing your aim at 8000 pixels per second, a one-millisecond change (1ms) creates an 8-pixel misaim. So you're aiming predictably, perhaps even trained to aim ahead of the target like pre-shooting an archery arrow towards a moving target (also an artillery tactic, but it applies to simple FPS shooting too). You get familiar with the amount of time you need to pre-aim. But if latency changes suddenly (lower or higher), the amount of pre-aiming you need to do changes! So you miss (overshoot or undershoot) because of the latency change. Even a tiny sudden latency change in your setup sometimes creates a "Why am I not getting my hits?" or "I seem to be scoring crap on this monitor" moment.
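A minimal sketch of how a sudden latency change shifts the pre-aim lead you trained for (the 15ms/14ms total-lag figures are assumed for illustration):

```python
# How a sudden latency change shifts the pre-aim lead a player has trained for.
# lead (pixels) = motion speed (pixels/sec) * latency (seconds)

def preaim_lead_px(latency_ms: float, speed_px_per_sec: float) -> float:
    return speed_px_per_sec * (latency_ms / 1000.0)

old_lead = preaim_lead_px(15, 8000)   # trained on a setup with 15 ms total lag (assumed)
new_lead = preaim_lead_px(14, 8000)   # new setup shaves 1 ms off
print(old_lead - new_lead)            # 8.0 -- trained aim now lands 8 pixels off
```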

Latency Training Context on Sudden Temporary Loss Of Scoring When Switching Monitors (Even Hz Upgrades)
Even settings changes (sync technology, refresh rate, etc) create a situation where latency-retraining is needed. The champion doesn't need to feel the millisecond directly; they just know "why am I scoring crap?" when latency suddenly changes a lot. Then their ability improves again if the latency remains consistent after that. Fixed-lag changes are common when changing or upgrading setups. Scoring problems happen more often during lag increases but can also happen with lag decreases (a sudden temporary loss of scoring ability during a 240Hz upgrade, before the player starts scoring better than on their 144Hz monitor after a week of familiarization). So you sometimes have to fight through a lag-training penalty when you upgrade your rig.
Scientific studies comparing Hz for professional elite players should, for longer-term results, allow extra time to compensate for the latency training effect. (BTW, yoohoo NVIDIA researchers -- want to generate more impressive Hz-vs-Hz graphs? Then compensate for this factor in a scientific way. Quite a few of you follow Blur Busters these days. ;) ...)

Latency Training Context on Network Jitter
Network jitter also interferes badly with latency training. Even a big change in latency jitter (e.g. during Internet peak hours), such as 5ms of TCP/UDP ping jitter, can create hitreg problems, especially on 128-tick servers (since latency jitter can push the hitreg between ticks), and that's a whopping variable hitbox offset at fast motion speeds. YouTubers such as Battle(non)sense cover a lot of this territory. It's easier to train to a predictable 30ms latency than to a random 10-25ms latency. Predictable latency is like pre-aiming an archery arrow ahead of a horizontally-moving archery target, to try to get a latency-compensated hitreg. Now if you shoot players and it doesn't hitreg, that's milliseconds fighting against you in the network (the human-visible lagged/accelerated enemy position is essentially out of sync with the actual invisible hitbox location), alas... I've mentioned this context in the earlier post, but it's a situation of a user having to continually re-train for changing latencies throughout the day, which perpetually keeps a player a bad player. To fight against this problem, some professionals (A) upgrade to a business connection instead, (B) upgrade to a high-performance router and use direct Ethernet, (C) dedicate the connection just to the gaming computer (some pro players with big budgets get two FTTH connections: one only for the gaming PC, the rest for family/streaming/WiFi/etc), (D) switch to a gaming VPN to bypass their bad-jitter backbone, or even (E) all of the above.
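A quick sketch of why predictable latency is trainable but jitter is not (the 2000 px/sec motion speed is an assumed figure):

```python
# Predictable latency can be trained around; random jitter cannot.
# Hitreg/aim offset (pixels) = motion speed (px/sec) * latency (seconds)

def offset_px(latency_ms: float, speed_px_per_sec: float) -> float:
    return speed_px_per_sec * (latency_ms / 1000.0)

speed = 2000                                 # pixels/sec (assumed motion speed)
steady_spread = offset_px(30 - 30, speed)    # steady 30 ms ping -> 0 px of offset spread
jitter_spread = offset_px(25 - 10, speed)    # 10-25 ms ping -> 30 px of offset spread
tick_ms = 1000.0 / 128                       # ~7.8 ms between 128-tick updates
print(steady_spread, jitter_spread, tick_ms)
# A 15 ms ping spread is ~2 ticks of uncertainty -- impossible to pre-aim consistently.
```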

Note: Other Error Margins and Variables
Different equipment feel, different tactility, and other issues can create other training issues. Bear this in mind when building scientific tests that attempt to measure latency training effects, or when creating new blind tests using identical-looking equipment (with different internals / firmware / settings / etc).

Re: The Amazing Human Visible Feats Of The Millisecond

Post by Chief Blur Buster » 31 May 2020, 14:00

We're famous as raw material for new researcher studies, so I'm going to keep this thread up to date.

Different Human Reaction Time Responses To GtG Pixel Response
Those familiar with the Pixel Response FAQ: GtG versus MPRT, as well as LCD Overdrive Artifacts, will know that different humans have very different reaction-time behaviours with different levels of overdrive.
--> Some users prefer no overdrive at all (their vision gets distracted by coronas, slowing them down)
--> Some users want super-excessive, blatant overdrive (BenQ AMA Premium) because it's like a tracer-bullet assist feature
--> Freezing homes (arctic) will slow pixel response, requiring slightly higher overdrive
--> Hot homes (tropics) will speed up pixel response, requiring slightly less overdrive
--> Some users have a preference for faster pixel response with slight coronas

So different humans have different reaction-time responses to different GtG/overdrive settings. An excessively fast 0.5ms GtG may actually slow a player down because the coronas distract them, but may speed up other players because they're trained to treat coronas as a "highlight marker" for movement.

At the time of this writing, the website PROSETTINGS.NET shows that roughly 50% of esports players are using BenQ monitors (famous for AMA with exaggerated overdrive), and many of them are using the AMA feature as a "motion highlight assist" feature, similar to shadow boost and other esportsy features.

For a long time, Blur Busters has been disappointed with fixed overdrive (calibrated at 20C), which is why Blur Busters is an advocate of a 100-level Overdrive Gain slider (like Brightness/Contrast): it should never be locked, and should be a User Defined option in main monitor menus. The same goes for User Defined Overdrive Lookup Tables (since Blur Busters can generate better LUTs than many scaler/TCON vendors), because a full GtG matrix contains over 60,000 transition values on an LCD panel.
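A toy sketch of the overdrive-gain idea (the function, its formula, and the numbers are my own simplified stand-in for a real per-transition lookup table, which stores a tuned value for every previous/target shade pair):

```python
# Simplified overdrive-gain sketch: real panels use a 256x256 GtG lookup table
# (over 60,000 entries) indexed by [previous shade][target shade]; here a single
# user-facing gain scales how far the drive voltage overshoots the target shade.
# Values are illustrative, not from any real panel LUT.

def overdriven_shade(prev: int, target: int, gain: float) -> int:
    """Overshoot the target shade in proportion to the requested transition."""
    overshoot = (target - prev) * gain          # gain 0.0 = off, higher = stronger
    return max(0, min(255, round(target + overshoot)))

print(overdriven_shade(0, 128, 0.25))   # 160 -- rising transition, pushed past target
print(overdriven_shade(255, 128, 0.25)) # 96  -- falling transition, pushed below target
print(overdriven_shade(0, 128, 0.0))    # 128 -- overdrive disabled
```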

In the past, manufacturers didn't want to add extra overdrive controls to monitor menus because it complicates things for laypeople. However, it should at least be a "User Defined" setting hidden in the same area as RGB adjustments, ULMB Pulse Width, or other advanced adjustments (among other needed features such as 60Hz single-strobe for MAME arcade-machine enthusiasts). Monitor manufacturers sadly limit flexibility to keep things easier for some users (but this hurts the market for other users). It is often just a 1-line firmware change to exaggerate overdrive, or to re-add features that expand the market sideways (even features that don't push the refresh rate race upwards).

Anyway, we've noticed how overdrive is an unexpected "esports assist", and why it's very popular on BenQ monitors. People don't believe Blur Busters until researchers test these things out and grudgingly say "Blur Busters Is Right", years after we said it: it appears that people react very differently (lagged reactions & accelerated reactions) to pixel-response behaviours such as overdrive.

*(Thread replies can be added by VIP invitation -- reputable researchers/scientists, send me email mark [at] blurbusters.com)

Re: The Amazing Human Visible Feats Of The Millisecond

Post by Chief Blur Buster » 18 Nov 2022, 17:49

For software developers looking to debug the millisecond, check this out:
Brainlet wrote:
12 Nov 2022, 10:08
FrameView
User Guide for 1.4
Supported games
If you get an error loading these sites, scroll to the bottom and select United States as your region, then load the original URL again.

To Software Developers

At the end of the day, the higher the Hz, the more important gametime:photontime relative-sync becomes.

As we reach 1000fps 1000Hz, or we reach sub-millisecond MPRT, error margins for jitter become less than 1ms!

Ultra-precise sub-millisecond framepacing already has human-visible improvements in virtual reality -- 0.5ms framepacing errors show up as noticeable jitter in virtual reality applications! The Oculus Quest 2 has a 0.3ms MPRT. Jitter bigger than MPRT can be human-visible, especially when it's at a low cyclic/harmonic frequency (e.g. 10 jitters of 0.5ms each per second), since the jitter is no longer hidden by display motion blur.

0.5ms of jitter = 4 pixels of jitter at 8000 pixels/sec head-turning speeds in VR

Math:
8000 pixels/sec x 0.0005 sec = 4 pixels

And it will even be a problem for triple-digit refresh rates on upcoming new 0ms-GtG displays, and for some low-persistence strobe backlight modes. Higher Hz, higher resolutions, and bigger FOV all amplify jitter problems. Displays are getting bigger, clearer, faster, etc.
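A minimal sketch of that jitter-versus-MPRT rule of thumb (helper names are mine; the 8000 px/sec and 0.3ms MPRT figures come from the paragraphs above):

```python
# When is framepacing jitter likely to be human-visible?
# Rule of thumb from above: jitter displacement larger than the display's
# motion blur (MPRT) is no longer hidden by that blur.

def jitter_px(jitter_ms: float, speed_px_per_sec: float) -> float:
    return speed_px_per_sec * (jitter_ms / 1000.0)

def blur_px(mprt_ms: float, speed_px_per_sec: float) -> float:
    return speed_px_per_sec * (mprt_ms / 1000.0)

speed = 8000                        # pixels/sec head-turn speed in VR (from the post)
j = jitter_px(0.5, speed)           # 4 px of positional jitter
b = blur_px(0.3, speed)             # 2.4 px of blur on a 0.3 ms MPRT headset
print(j, b, j > b)                  # jitter exceeds the blur -> likely visible
```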

So please DEBUG your jitter to vastly sub-millisecond levels as much as you can.

gametime:photontime sync perfection FTW!

Every single frame!


Never assume the millisecond is unimportant. It might be unimportant on your bog-standard 1080p 60Hz DELL office monitor (16.7ms MPRT100%), but it's a problem for a 0.3ms MPRT VR headset, and is also a big problem on upcoming ultra-high-Hz desktop OLEDs.

Remember... jitter, in some cases (e.g. VR), is more important than 1ms differences in absolute latency. Both are important, but for different reasons. Sometimes de-jittering may necessitate minor lag (e.g. using a 1ms de-jittering buffer for a 1000fps 1000Hz workflow). Some esports players tolerate jitter if it reduces latency, while other gamers (e.g. sim players, VR players, motion fluidity enthusiasts, etc) do not tolerate jitter.

Re: The Amazing Human Visible Feats Of The Millisecond

Post by Chief Blur Buster » 16 Mar 2023, 20:20

Crossposting some findings about power management problems I am witnessing.

Even with lots of power management disabled (CPU at 100%, Performance Mode, P-states disabled in many Device Manager items), my RTX 3080-powered system is having more problems beam-racing than my GTX 1080 Ti-powered system, and I'm worried that the growing focus on power management (even in "Performance Mode") could directly impact the future 1000fps 1000Hz ecosystem.

<Advanced Programmer-Tech Speak>

Skip this if you're not technically literate; but it's part of the Milliseconds Matters series.

Even in my Tearline Jedi beam racing experiments, even at triple-digit frame rates, there were sometimes power management issues creating tearline jitter (wide enough amplitude to be roughly ~0.5ms worth, going by the horizontal scanrate and the number of scanlines that the tearline jittered). Not all pixels on a display refresh at the same time (high speed videos: www.blurbusters.com/scanout), because the video output is a serialization of 2D data through a 1D medium (broadcast, wire), as we've been doing for the better part of a century, whether displaying on a 1920s TV or a 2020s DisplayPort monitor. Tearlines are raster splices into that serialization, and any time a tearline is delayed (moved downwards), that is latency, even if tiny. Raster jitter became 10x or 100x worse depending on power management settings, and the way you could visually see sub-1/100,000sec timing errors was quite fun to watch. If a display signal has a 100 KHz scan rate (the number of pixel rows per second spewed out of the GPU output), a tearline moved down by 1 pixel is a 1/100,000sec lag, and a tearline moved down by 100 pixels is a 1ms lag. Power management was really wreaking havoc on raster-interrupt-style beam racing feats. Even the different APIs used to measure time -- timer events, QueryPerformanceCounter, RDTSC, other high-precision events versus busylooping, etc. -- had a major impact on calming down the tearlines.

Here's me controlling VSYNC OFF tearing with a computer mouse:

[embedded video]


At 1080p/60Hz, this requires a precision of 1/67000sec (67KHz horizontal scanrate), and even a 2-pixel raster jitter is a 2/67000sec error margin.

Now if I turned on power management, this jittered like crazy, often by a 100-pixel amplitude (100/67000 sec, roughly 1.5ms of latency jitter).
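The scanline-to-time conversion used above, as a quick sketch (function name mine):

```python
# Converting tearline displacement (in scanlines) into a timing error.
# Horizontal scanrate = pixel rows transmitted per second over the video signal.

def scanlines_to_time_us(scanlines: float, hscan_khz: float) -> float:
    return scanlines / hscan_khz * 1000.0      # result in microseconds

print(scanlines_to_time_us(1, 100))     # 10 us  -- 1 line at a 100 kHz scanrate
print(scanlines_to_time_us(100, 100))   # 1000 us (1 ms) -- 100 lines at 100 kHz
print(scanlines_to_time_us(2, 67))      # ~30 us -- 2-line raster jitter at 1080p/60Hz
print(scanlines_to_time_us(100, 67))    # ~1493 us -- the power-management jitter above
```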

Nonetheless, precision beam-raced tearline-positioning engineering opened my eyes to how 0.5ms-to-2ms league power management interferences were all over the system, even affecting mundane games.

Once people realized "VSYNC OFF tearlines are just rasters", some programmers took notice. An emulator, WinUAE, added lagless VSYNC (syncing the emulated raster to the real raster). And a technology called RTSS Scanline Sync (and Special K Latent Sync) was created because its developers were impressed by Tearline Jedi; it is now in use by tweakers who love glass-floor frametimes being delivered perfectly to refresh cycles without the need for VRR or laggy VSYNC ON sync technologies. Though there are compromises to using an external frame rate capper (an internal one could be less laggy), it is one of the lower-lag external third-party sync technologies available that doesn't require game awareness to utilize (unlike NVIDIA Reflex, a custom in-game cap, etc).
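A heavily simplified, self-contained sketch of the scanline-sync idea (the raster position below is simulated from the clock and the scanline numbers are assumed; a real implementation would query the GPU's actual raster position through a platform call instead):

```python
# Scanline-sync concept sketch: busy-wait until the (simulated) raster sweep
# reaches a target scanline near vertical blanking, then present with VSYNC OFF
# so the single tearline lands off-screen. Not a real RTSS implementation.

import time

REFRESH_HZ = 60
TOTAL_SCANLINES = 1125            # assumed 1080p signal including blanking lines
TARGET_SCANLINE = 1100            # park the tearline inside vertical blanking

def current_scanline(t0: float) -> int:
    """Simulated raster position: which scanline the signal is on right now."""
    elapsed = time.perf_counter() - t0
    frame_pos = (elapsed * REFRESH_HZ) % 1.0          # 0..1 through the refresh cycle
    return int(frame_pos * TOTAL_SCANLINES)

def wait_for_scanline(t0: float, target: int) -> None:
    """Busy-wait (not sleep) for sub-0.1 ms precision before presenting."""
    while current_scanline(t0) < target:
        pass

t0 = time.perf_counter()
wait_for_scanline(t0, TARGET_SCANLINE)
print("present frame now; tearline sits at scanline >=", TARGET_SCANLINE)
```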

This type of precision is ignored by game programmers at this time, though we may need to revisit it when we're doing the 4K 1000fps 1000Hz reprojected-UE5-raytraced future (via reprojection technology -- a frame generation technology that can reduce latency), which is now feasible on 4090-class GPUs. (LinusTechTips has a great video of VR reprojection being brought to PC.)

[embedded video]


Technologies like these portend the esports of the 2030s, when we will need UE5 quality at 1000fps esports-class latencies. Reprojection can even rewind frametime latencies (the 10ms of a 100fps base frame), because frame generation using reprojection instead of interpolation reduces latency by using the latest input reads (6dof translation coordinates) to modify the last UE5-raytraced-rendered frame almost instantly (within 1ms) into a reprojected frame; and you can do that at 10:1 ratios. Obviously there are reprojection artifacts, but they mostly disappear if you reproject from a base framerate of 100fps to a higher frame rate (e.g. 360fps or 1000fps).
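The ratio-and-latency arithmetic behind that (a sketch; the 1ms reprojection cost is the assumption stated above):

```python
# Frame-rate amplification via reprojection: a low base render rate is
# reprojected to a much higher display rate using fresh input reads,
# so displayed-frame latency tracks the reprojection step, not the base render.

base_fps = 100
display_fps = 1000
amplification_ratio = display_fps / base_fps          # 10:1
base_frametime_ms = 1000.0 / base_fps                 # 10 ms per fully-rendered frame
reprojection_time_ms = 1.0                            # assumed per-frame reprojection cost
print(amplification_ratio, base_frametime_ms, reprojection_time_ms)
# Each displayed frame reflects head/mouse input ~1 ms old instead of ~10 ms old.
```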

While the game development world won't usually be directly controlling tearlines except for niche purposes, tearlines were an amazing visual-timing debugging tool (since 1/10,000sec timing errors became human-visible -- which made beam racing software fun!). It revealed timing issues that will potentially become a problem in the refresh rate race to retina refresh rates, so it is a potential canary in the coal mine.

Some of these precisions are not yet critical today, but... there are imprecisions from power management that are going to semi-bottleneck the 1000fps 1000Hz raytraced UE5 future, unless they can be temporarily turned off in a kind of Ultra Performance Mode.

Now, that being said, not all systems have major problems from power management. I've seen many systems hold glass-floor frametimes even during Balanced power management, and other systems just go... fritzy. Attempting to run Tearline Jedi, my RTX 3080 system has more power-management latency than my old GTX 1080 Ti system, which is really a disappointment. We need an Ultra Timing-Precise Performance Mode.

</Advanced Programmer-Tech Speak>