How much do ram timings affect input lag?

Separate area for niche lag issues, including unexpected causes and/or electromagnetic interference (ECC = retransmits = lag). Interference (EMI, EMF) of all kinds (wired, wireless, external, internal, environment, bad component) can cause error-correction latencies, like a bad modem connection. Troubleshooting may require a university degree. Your lag issue is likely not EMI. Please read this before entering the sub-forum.
deama
Posts: 368
Joined: 07 Aug 2019, 12:00

Re: How much do ram timings affect input lag?

Post by deama » 23 Jun 2020, 03:28

schizobeyondpills wrote:
22 Jun 2020, 16:43
deama wrote:
22 Jun 2020, 16:26
Ok, I think I've noticed a difference now; my mouse feels a bit smoother, or perhaps the better word would be more "consistent"?

I basically kept my tFAW at 16, and went ahead and changed my tRRDS from 6 to 4, and tRRDL from 8 to 4.
You won't notice much difference adjusting secondary or tertiary timings WITHOUT first disabling the RAM power-down mode/power-saving feature. It won't be as visible, since RAM power-down adds significant latency to everything related to RAM, so small changes that affect things below 1ns won't be as easily noticeable.

Also, use Intel Memory Latency Checker to measure.

First things first, you should set command rate to 1, try increasing tREFI, and lower tRFC.
Power management mode was disabled by default, and I think command rate was already set to 1 (it was on auto previously), because when I measured before and after there wasn't really any difference.

I'm only really after input lag at this point, not even high fps, as I lock my fps with scanline sync (120Hz) in order to get a more consistent experience and no tearlines. So I'm only after raw input lag enhancements, since higher fps won't do me any good.

On that note, do RAM timings affect frame latency? The ms delay that PresentMon measures?
viewtopic.php?t=5552&start=10

ptuga
Posts: 44
Joined: 20 May 2020, 02:06

Re: How much do ram timings affect input lag?

Post by ptuga » 23 Jun 2020, 04:02

axaro1 wrote:
23 Jun 2020, 02:49
Krizak wrote:
22 Jun 2020, 19:08
axaro1 wrote:
22 Jun 2020, 16:59

I can agree that there is misinformation. I prefer something that is measurable rather than "I changed XXX setting and it 'feels' faster/smoother/etc."

Here are some tools that I use to actually measure results, including memory overclocking:

LatencyMon
https://resplendence.com/latencymon

Aida64
https://www.aida64.com/downloads

MouseTester
https://www.overclock.net/forum/375-mic ... oaded.html
axaro1 wrote:
22 Jun 2020, 16:59

Timings DO NOT CHANGE INPUT LAG
axaro1 wrote:
22 Jun 2020, 16:59

This thread is a placebo fiesta
Have to disagree with these statements if I understand the context they are directed at (when someone says they "feel" something improved). There is evidence that memory bandwidth, latency, etc. all have a direct correlation to improving max FPS in addition to improving 1% and 0.1% FPS lows. That means less input lag in and of itself. Sometimes it depends on the game, but there are tools out there that actually can measure this (a rough sketch of how those lows can be computed is included below). One such tool that one can test their own memory overclocks and results with is CapFrameX. Here is a link to that tool:

CapFrameX
https://github.com/CXWorld/CapFrameX

Also if you add in all the overclocking you can do (CPU, memory, uncore) and optimizations and tweaks with an operating system, it all adds up to a much more efficient gaming system.
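
For anyone who wants numbers instead of feel, here is a minimal sketch of that computation (assuming a plain list of frame times in milliseconds, e.g. exported from CapFrameX or PresentMon; the percentile-based definition of the lows is just one common convention, not necessarily the exact formula CapFrameX uses):

# Minimal sketch: summarise a frame-time capture (milliseconds per frame)
# so two RAM timing profiles can be compared on the same benchmark run.
# The percentile-based "1% / 0.1% low" definition here is one common
# convention; tools like CapFrameX document their own exact formulas.
import numpy as np

def summarize(frametimes_ms):
    ft = np.asarray(frametimes_ms, dtype=float)
    p99 = np.percentile(ft, 99)     # 99% of frames completed faster than this
    p999 = np.percentile(ft, 99.9)  # 99.9% of frames completed faster than this
    return {
        "avg_fps": 1000.0 / ft.mean(),
        "1%_low_fps": 1000.0 / p99,
        "0.1%_low_fps": 1000.0 / p999,
    }

# Usage: run the same benchmark with loose and tight timings, then compare:
# print(summarize(loose_run_ms))
# print(summarize(tight_run_ms))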
The only argument here that makes sense is tight timings = more fps = lower frame times; the rest is just repurposing driver latency tests and core latency tests to prove a weak point.

The only thing that could make sense is unstable RAM producing errors/error correction, causing asynchronous behaviour in time-sensitive applications.
It's not a problem with framerate either. It's the feel of the mouse itself, even in Windows. I think it changes how the system handles input, because in KovaaK's I had a 300fps lock and that doesn't move AT ALL, no matter the RAM settings. Yet my scores are better when the RAM is tweaked.

I do believe the error correction part, as when I tweak certain settings the mouse starts to get inconsistent and sometimes heavy. For example, if tWR is not twice tRTP I literally can't aim; it's super inconsistent, it's that big a problem, yet it doesn't show any errors in stress tests.
Other things, when tweaked to the minimum (like the tRRDs and tFAW), can affect frame times and make the mouse feel less smooth in game. This is amplified when you have a 1000Hz mouse at a constant high frame rate, but that usually is caught by a stress test, even if it takes a lot of time in my experience.

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: How much do ram timings affect input lag?

Post by Chief Blur Buster » 23 Jun 2020, 16:47

axaro1 wrote:
22 Jun 2020, 16:59
This thread is a placebo fiesta
Placebo is correct. Much of what you see is placebo.

However, there's also the saying "death by a thousand cuts".
In other words, a million nanoseconds equals one millisecond. It has happened before in some contexts -- see The Amazing Human Visible Feats Of The Millisecond thread. There are amazing ways things can domino (Rube Goldberg style) into legitimate issues, but it's definitely true that there's a lot of placebo.

Nanoseconds can jitter en masse, noisily, into random milliseconds, which then domino into missed polls, missed input reads, or VSYNC misses (a one-microsecond miss = a big stutter). Race conditions where a perfectly timed event now gets nanosecond-shifted into a missed-the-subway-train scenario (aka an event miss). You'd be shocked. We've discovered a 10-microsecond strobe-backlight issue that domino'd into visible candlelight flicker... 99% of it is garbage, but 1% is a jawdropper. The jawdropper is how Blur Busters exists.
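
To make the domino concrete, here's a toy simulation (hypothetical numbers, purely to show the mechanism, not a measurement) of how a couple hundred microseconds of jitter near a 120Hz VSYNC deadline slips whole refreshes:

# Toy simulation (illustrative only): frames that normally finish just under a
# 120 Hz VSYNC deadline. Add a little timing jitter and some frames miss the
# deadline, slipping a whole refresh (~8.3 ms) even though the jitter itself
# is tiny. The numbers are made up to show the mechanism, not measured.
import random

REFRESH_MS = 1000.0 / 120.0          # ~8.333 ms per refresh
frame_work_ms = 8.2                  # frame normally finishes ~0.13 ms early
jitter_ms = 0.2                      # +/- 0.2 ms of random timing noise

misses = 0
frames = 10_000
for _ in range(frames):
    finish = frame_work_ms + random.uniform(-jitter_ms, jitter_ms)
    if finish > REFRESH_MS:          # missed the deadline: frame slips a refresh
        misses += 1

print(f"{misses} of {frames} frames slipped a full refresh (~{REFRESH_MS:.1f} ms each)")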

We need more accredited / trusted researchers to properly study these kinds of topics.

Blur Busters exists because people disbelieved LCD could achieve zero motion blur (like a CRT). We've become sort of famous for keeping an open mind on super-outlier topics. We generally don't close threads like these, but I agree a big disclaimer is warranted. Topics considered whacko on Blur Busters 5-7 years ago are in research papers today. Blur Busters, a pioneer, continues to confound the industry by being Einstein when everyone else is being Isaac Newton. I tend to ignore traditional scientific assumptions by reading smartly between the lines. Consider my history with the "Humans Can't Tell 30fps vs 60fps" myth, which I've been busting since 1993 (screenshot!), long before Blur Busters.

I usually prefer disclaimers, since there are definitely lots of understudied areas. A great read is the thread The Amazing Human Visible Feats Of The Millisecond, which most scientists agree with (at least on some of the items). Sometimes I've made fantastic callouts years before big corporate researchers.

So without further ado, adding a disclaimer tag to this thread:

DISCLAIMER: This Thread Topic Tends To Cover Mostly Wild Goose Chases After Red Herrings. Be Wary of Misinformation / Placebo Effects.

Good enough, axaro1?
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

User avatar
axaro1
Posts: 627
Joined: 23 Apr 2020, 12:00
Location: Milan, Italy

Re: How much do ram timings affect input lag?

Post by axaro1 » 23 Jun 2020, 16:49

Chief Blur Buster wrote:
23 Jun 2020, 16:47

Good enough, axaro1?
Yes, you have a point.
XL2566K* | XV252QF* | LG C1* | HP OMEN X 25 | XL2546K | VG259QM | XG2402 | LS24F350[RIP]
*= currently owned



MONITOR: XL2566K custom VT: https://i.imgur.com/ylYkuLf.png
CPU: 5800x3d 102mhz BCLK
GPU: 3080FE undervolted
RAM: https://i.imgur.com/iwmraZB.png
MOUSE: Endgame Gear OP1 8k
KEYBOARD: Wooting 60he

User avatar
schizobeyondpills
Posts: 103
Joined: 06 Jun 2020, 04:00

Re: How much do ram timings affect input lag?

Post by schizobeyondpills » 23 Jun 2020, 17:22

Chief Blur Buster wrote:
23 Jun 2020, 16:47
We need more accredited / trusted researchers to properly study these kinds of topics.
Whenever you or anyone else is ready to sponsor equipment I will prove all of my "schizo" placebo facts to be true.

On a more serious note, what everyone reading this fails to realize is how they are perceiving that measured RAM latency, which is where you are on the right track but not quite at the end of the line.

Yes, RAM latency is measured in nanoseconds. But a RAM read/write is the most primitive and fundamental operation your CPU does, only trumped by raw computation (which the CPU core clock affects). They both affect each other a lot. I don't understand how you folks still believe that the raw measure of RAM latency doesn't affect anything and that nanoseconds don't matter. Every single power saving feature in the past 20 years on any component, be it hardware or software, was caused by ONE THING. And that thing is the RAM latency bottleneck.

Let me give you some hints:
- every single CPU in the past 20 years, on both the AMD and Intel side, has 3 levels of caching to reduce the delay of reading/writing RAM (I wonder why multi-billion-dollar research shows RAM latency matters, backed with proof anyone can look up on chipwiki, yet people on forums say nanoseconds don't matter and think they are correct?)
- your 5GHz CPU cannot fetch enough data, since data isn't laid out in a way that suits current RAM design (high latency, optimized for raw bandwidth), due to a lot of things such as control-flow jumps/if checks, data access patterns, pointer chasing across different pages, multiple different combinations of reads and writes, etc. It's not doing the linear reads or writes that DDR3/DDR4 RAM is optimized for -- bandwidth, which is the exact opposite of latency: latency = time, bandwidth = space (see the pointer-chasing sketch at the end of this post)
- your CPU goes to sleep because its clock rate is oversaturated due to RAM latency
- your OS goes to sleep and puts the CPU to sleep literally hundreds of thousands of times a second because it has nothing to do
- your monitor/GPU has G-SYNC because the CPU is unable to render at or above the monitor refresh rate; it's unable to do so because its oversaturated clock is affected by RAM latency (notice the chain? all of this due to RAM latency)
- your 1000Hz mouse cannot be made 8000Hz because it takes at least 8 * 1000 RAM writes/reads per second to get those packets to the game engine, and not even that 1000Hz is stable/consistent
And all of the things I said above are not even my claims; everything I state is objective truth anyone can look up online on any trusted source they like, be it chipwiki/wikipedia/books.
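
Here is a rough pointer-chasing sketch of the latency point (Python/numpy, purely illustrative: interpreter overhead inflates the absolute numbers, and a proper C version with clock_gettime would give real nanosecond figures), showing how a chain of dependent loads exposes memory latency once the working set no longer fits in cache:

# Rough pointer-chasing sketch: every load depends on the previous one, so the
# loop is paced by memory latency rather than bandwidth. Python's interpreter
# and numpy scalar-indexing overhead inflate the absolute numbers; only the
# relative gap between a cache-sized and a DRAM-sized chase is meaningful.
import time
import numpy as np

def ns_per_hop(n_elements, hops=500_000):
    # Build one big random cycle so the prefetcher can't guess the next address.
    perm = np.random.permutation(n_elements)
    nxt = np.empty(n_elements, dtype=np.int64)
    nxt[perm] = np.roll(perm, -1)     # perm[k] -> perm[k+1], closing the cycle
    i = 0
    t0 = time.perf_counter()
    for _ in range(hops):
        i = nxt[i]                    # dependent load: cannot be overlapped
    t1 = time.perf_counter()
    return (t1 - t0) / hops * 1e9

print("cache-resident (~32 KiB):", round(ns_per_hop(4_096)), "ns/hop")
print("DRAM-resident (~128 MiB):", round(ns_per_hop(16_000_000)), "ns/hop")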

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: How much do ram timings affect input lag?

Post by Chief Blur Buster » 23 Jun 2020, 17:46

schizobeyondpills wrote:
23 Jun 2020, 17:22
Whenever you or anyone else is ready to sponsor equipment I will prove all of my "schizo" placebo facts to be true.
Only the big companies can afford sufficient equipment to audit the entire Rube Goldberg contraption -- e.g. Intel, which expends a lot of effort trying to reduce various latencies.

There are already tools/equipment that measure tiny fractions of the chain -- including those mentioned in these forums -- but measuring how all those billions of transistors cascade nanoseconds into milliseconds takes some really serious time & effort. Currently, this is not Blur Busters' speciality, nor do we have the budget to expend testing efforts in this area -- however, we encourage innovation by end users to find ways to reliably generate proof.

Now... some good points. For example, I've seen how power management (at low frame rates) and thermal throttling (at high % utilization) generate random input lag, when a CPU/GPU is underutilized or overutilized and downclocks to save power or to protect itself. I've even seen it leak into TestUFO frametime volatility analysis as web browsers power-managed themselves -- especially on laptops. Watching those sub-milliseconds sometimes cascade into TestUFO stutters in realtime with this colorcoded visualization is quite mesmerizing on about 1 in 5 systems that had weird power-management behaviours with certain web browsers, since realtime CPU upclocks/downclocks become hugely visible in this TestUFO chart, and we already know frametime is one part of the lag chain. I am totally not surprised that this happens to game engines to a lesser extent, and that it is also one of the many causes of VRR stutter leakage (gametime:photontime divergences). This testing area IS more of Blur Busters' purview -- watching the Rube Goldberg machination in these particular, more easily testable areas where I am able to innovate on various tests.

Also, rather than reviewing displays directly (except for special editions like G-SYNC 101), Blur Busters is a display laboratory covering a lot of display temporal topics (latency, VRR, refresh rate, frame rate, GtG, MPRT, strobing, pixel response, etc.) and inventing test methods for other display reviewers.

Hats off to those researchers (usually proprietary, hired by the big companies) who actually do this, like Intel or Sony (PS5) or others. One can use a simple tool to measure tiny glimpses and hints, but for a complete audit trail of proof one has to go thousands of times more detailed (scholar-worthy), which really takes a lot of time and effort. And those big-money endeavours are often proprietary research that is not typically shared on a reviewer/blogger site -- and sometimes you have to fight a paywall to access academic papers that don't even go sufficiently detailed on these matters.

CPU/RAM latency testing just simply isn't a Blur Busters speciality at this time, nor do we have the resources for it. However, this topic is forum-worthy because it's about computer hardware & latency, despite being understudied for direct esports corroborations (e.g. realtime RAM latency analysis mid-esports-game, to monitor how it all cascades up).
schizobeyondpills wrote:
23 Jun 2020, 17:22
- your 1000Hz mouse cannot be made 8000Hz because it takes at least 8 * 1000 RAM writes/reads per second to get those packets to the game engine, and not even that 1000Hz is stable/consistent
However... indirectly, Blur Busters may help in some ways. For example, a TestUFO Mouse Benchmark is being built, thanks to new HTML5 APIs for full-pollrate raw input modes, which now make possible webpages that measure a gaming mouse (as long as the ...). One click away from a new mouse benchmark relevant to the refresh rate race to retina refresh rates (unlike those existing simple numbers-outputting mouse benchmarkers).

We're big-time advocates of increasing mouse poll rates for the refresh rate race to retina refresh rates -- and we realize some of the timing/latency error margins like what you describe can Rube-Goldberg themselves into visible problems, including for improved poll rates, so at some point the universes may overlap, forcing manufacturers to improve hardware (CPU / mobo / RAM / etc.) to make poll benchmarks look good. Browsers do have real-world limitations (they're not reliable reproductions of a game engine's influence on latencies), but the more public mouse data, the merrier.

In tomorrow's world, an example debate could theoretically be "Why does a 2000 Hz poll look so jittery/dirty?" type benchmarks, where changing various system configuration parameters might suddenly improve or worsen the look of polling in realtime, down to real-world mouse microstutters and such. Those users can say, "Look at how my two TestUFO Mouse Poll Tests look, before/after my tweaks! And it actually made my CS:GO game feel better too!" It may be a simple domino, but we'll try out this domino and see how it cascades into industry change (in theory).

...And that's the typical pioneer role Blur Busters plays in lifting all boats in a trustworthy way. Manufacturers notice and they optimize, just as they already do for displays, thanks to all the tests Blur Busters has invented...
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


Sparky
Posts: 682
Joined: 15 Jan 2014, 02:29

Re: How much do ram timings affect input lag?

Post by Sparky » 23 Jun 2020, 19:50

schizobeyondpills wrote:
23 Jun 2020, 17:22
Whenever you or anyone else is ready to sponsor equipment I will prove all of my "schizo" placebo facts to be true.
It is not difficult to build your own end to end latency test device: viewtopic.php?p=22634#p22634


RAM latency can increase overall latency, but not invisibly, and not by more than the frame interval. RAM latency directly impacts the throughput of the CPU; this type of latency isn't sitting between two different stages of the display pipeline, where it would cause latency without impacting framerate. It's more like just having a slower CPU, which shows up as lower framerates.

As for USB, you should probably split that into OS input processing vs actual USB hardware, because polling rates are extremely consistent in the latter, and what applies to one OS might not apply to all of them.

Also, there are software performance analysis tools, you might want to check how many cache misses your game engine is actually generating.

User avatar
schizobeyondpills
Posts: 103
Joined: 06 Jun 2020, 04:00

Re: How much do ram timings affect input lag?

Post by schizobeyondpills » 23 Jun 2020, 20:36

Sparky wrote:
23 Jun 2020, 19:50
schizobeyondpills wrote:
23 Jun 2020, 17:22
Whenever you or anyone else is ready to sponsor equipment I will prove all of my "schizo" placebo facts to be true.
It is not difficult to build your own end to end latency test device: viewtopic.php?p=22634#p22634


RAM latency can increase overall latency, but not invisibly, and not by more than the frame interval. RAM latency directly impacts the throughput of the CPU; this type of latency isn't sitting between two different stages of the display pipeline, where it would cause latency without impacting framerate. It's more like just having a slower CPU, which shows up as lower framerates.

As for USB, you should probably split that into OS input processing vs actual USB hardware, because polling rates are extremely consistent in the latter, and what applies to one OS might not apply to all of them.

Also, there are software performance analysis tools, you might want to check how many cache misses your game engine is actually generating.
Time-consuming to build my own. And the quality won't be anywhere near what actual commercial solutions can do.
Your arguments only see it as a single latency-to-photon action, which is wrong, and do not look into microstutter improvement from lower latency, aka through time (multiple frames) and in terms of input responsiveness/consistency. Yes, there is very likely reduced latency to photon from lower RAM latency, but that's the wrong way to properly measure RAM latency impact.

We are discussing RAM latency not monitor latency.

As for your suggestion about software perf analysis tools, they are close to useless for very hard evidence (apart from measuring RAM latency through Intel Memory Latency Checker), especially what you recommended regarding cache misses. Caches are very small, and everything has to go through them; any game uses at least 6GB of RAM (I'm aware some of it is static data). Yes, some things do get cached, but RAM latency is the 2nd most important time measure of a PC (the first being CPU clock), and cache misses are just a small indicator of how well the game is engineered/optimized. RAM latency sits at the base layer of every single RAM write/read your entire PC does, not just the game.

Telling me to check how many cache misses the game engine actually generates is like telling me to break my finger to finally reach something when I could just increase the height of the ladder I'm on; in this inverse analogy, that is what RAM latency reduction does: it's the base of everything.

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: How much do ram timings affect input lag?

Post by Chief Blur Buster » 23 Jun 2020, 22:22

Easy. Sparky is a well-respected forum member.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


Sparky
Posts: 682
Joined: 15 Jan 2014, 02:29

Re: How much do ram timings affect input lag?

Post by Sparky » 24 Jun 2020, 06:44

schizobeyondpills wrote:
23 Jun 2020, 20:36
Sparky wrote:
23 Jun 2020, 19:50
schizobeyondpills wrote:
23 Jun 2020, 17:22
Whenever you or anyone else is ready to sponsor equipment I will prove all of my "schizo" placebo facts to be true.
It is not difficult to build your own end to end latency test device: viewtopic.php?p=22634#p22634


RAM latency can increase overall latency, but not invisibly, and not by more than the frame interval. RAM latency directly impacts the throughput of the CPU; this type of latency isn't sitting between two different stages of the display pipeline, where it would cause latency without impacting framerate. It's more like just having a slower CPU, which shows up as lower framerates.

As for USB, you should probably split that into OS input processing vs actual USB hardware, because polling rates are extremely consistent in the latter, and what applies to one OS might not apply to all of them.

Also, there are software performance analysis tools, you might want to check how many cache misses your game engine is actually generating.
Time-consuming to build my own. And the quality won't be anywhere near what actual commercial solutions can do.
Less time than you've spent in this thread; all the hard work is already done for you. Also, please show me a commercial solution that measures whatever you think is a problem.
Your arguments only see it as a single latency-to-photon action, which is wrong, and do not look into microstutter improvement from lower latency, aka through time (multiple frames) and in terms of input responsiveness/consistency. Yes, there is very likely reduced latency to photon from lower RAM latency, but that's the wrong way to properly measure RAM latency impact.

We are discussing RAM latency not monitor latency.
An end to end latency test counts everything. If you want to see how much of an impact RAM timings make, you run one test, change your timings, then run another test, with everything else kept the same. That testing method gets thousands of independent latency samples in a few minutes. Plotting out a histogram lets you see any difference pretty clearly. If there's a flat amount of latency added in series, the whole graph gets shifted over. If there's 5ms added to 10% of frames, then you see a small peak 5ms to the right of the original peak. About the only thing it can't measure is your mouse/keyboard, but you can use lagbox for that.
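
A minimal sketch of that before/after comparison (Python; the file names and the one-sample-per-line format are hypothetical, so adapt it to whatever your latency tester actually logs):

# Minimal sketch: overlay histograms of end-to-end latency samples (ms) from
# two runs of the same test, one per RAM timing profile. A flat latency change
# shifts the whole distribution sideways; latency added to only some frames
# shows up as a second, smaller peak to the right of the main one.
import numpy as np
import matplotlib.pyplot as plt

before = np.loadtxt("latency_loose_timings.txt")  # hypothetical: one ms sample per line
after = np.loadtxt("latency_tight_timings.txt")

bins = np.arange(0.0, max(before.max(), after.max()) + 1.0, 0.5)  # 0.5 ms bins
plt.hist(before, bins=bins, alpha=0.5, label="loose timings")
plt.hist(after, bins=bins, alpha=0.5, label="tight timings")
plt.xlabel("end-to-end latency (ms)")
plt.ylabel("samples")
plt.legend()
plt.show()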

As for your suggestion about software perf analysis tools, they are close to useless for very hard evidence (apart from measuring RAM latency through Intel Memory Latency Checker), especially what you recommended regarding cache misses. Caches are very small, and everything has to go through them; any game uses at least 6GB of RAM (I'm aware some of it is static data). Yes, some things do get cached, but RAM latency is the 2nd most important time measure of a PC (the first being CPU clock), and cache misses are just a small indicator of how well the game is engineered/optimized. RAM latency sits at the base layer of every single RAM write/read your entire PC does, not just the game.

Telling me to check how many cache misses the game engine actually generates is like telling me to break my finger to finally reach something when I could just increase the height of the ladder I'm on; in this inverse analogy, that is what RAM latency reduction does: it's the base of everything.
If there are no cache misses, your processor isn't waiting on RAM at all, because the data it needs is already in the much faster cache. It's like telling you to look up and see if the object you're trying to reach is actually on the shelf.

Post Reply