How much do ram timings affect input lag?

Separate area for niche lag issues, including unexpected causes and electromagnetic interference (ECC = retransmits = lag). Interference of all kinds (EMI, EMF; wired, wireless, external, internal, environmental, or from a bad component) can cause error-correction latencies, much like a bad modem connection. Troubleshooting may require a university degree. Your lag issue is likely not EMI. Please read this before entering the sub-forum.
ptuga
Posts: 44
Joined: 20 May 2020, 02:06

Re: How much do ram timings affect input lag?

Post by ptuga » 24 Jun 2020, 09:20

For those who still don't believe RAM affects mouse feel, install Kovaak's and do Close Long Strafes Invincible. It's so obvious in that scenario, maybe because it's a bot moving at a constant high speed that doesn't change direction very often, so you can get consistent results, assuming you have decent aim.

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: How much do ram timings affect input lag?

Post by Chief Blur Buster » 24 Jun 2020, 09:54

Sparky wrote:
24 Jun 2020, 06:44
An end to end latency test counts everything. If you want to see how much of an impact RAM timings make, you run one test, change your timings, then run another test, with everything else kept the same. That testing method gets thousands of independent latency samples in a few minutes. Plotting out a histogram lets you see any difference pretty clearly. If there's a flat amount of latency added in series, the whole graph gets shifted over. If there's 5ms added to 10% of frames, then you see a small peak 5ms to the right of the original peak. About the only thing it can't measure is your mouse/keyboard, but you can use lagbox for that.
Yep!

That’s how I figured out how to lag-test G-SYNC, and discovered it was a co-operative latency behaviour between software and display. A button-to-photons test will measure the difference from a single configuration change, like a software-based frame rate cap.

If I had measured the display only, I would never have discovered the software's influence on the different lag of max framerate versus below-max framerate.

Whether an ultra-slow-motion camera or a photodiode, the same thing applies to changing any config. One can change a driver, a motherboard setting, even RAM latency, and watch how the resulting lag changes. Do it accurately enough (e.g. microsecond-accurate GPIO via a PCI-X card, bypassing USB jitter) and a beautiful trove of data emerges.

These devices can also simulate aspects of a mouse with appropriate modifications, creating a button-to-photons tester.

This only measures the end result (easy!) rather than a full lag audit trail (hard!), but it can be a great starting point for determining lag behaviours, or for homing in on the right part of the chain to measure. A full audit trail is a lot of data that is hard to record without impacting performance.
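To make the histogram approach Sparky describes concrete, here is a minimal sketch, assuming you already have two files of button-to-photon latency samples in milliseconds (one value per line), e.g. from a photodiode plus GPIO logger. The file names and format are made up for illustration:

Code:

# Minimal sketch: compare two end-to-end latency runs (e.g. RAM timing A vs B)
# by overlaying their histograms. Input files are assumed to contain one
# latency sample in milliseconds per line; file names are hypothetical.
import numpy as np
import matplotlib.pyplot as plt

def load_samples(path):
    return np.loadtxt(path)  # one latency value (ms) per line

a = load_samples("latency_ram_timing_A.txt")
b = load_samples("latency_ram_timing_B.txt")

bins = np.arange(0, max(a.max(), b.max()) + 0.25, 0.25)  # 0.25 ms buckets
plt.hist(a, bins=bins, alpha=0.5, label=f"Timing A (median {np.median(a):.2f} ms)")
plt.hist(b, bins=bins, alpha=0.5, label=f"Timing B (median {np.median(b):.2f} ms)")
plt.xlabel("Button-to-photon latency (ms)")
plt.ylabel("Samples")
plt.legend()
plt.show()

A flat serial latency difference shows up as the whole second histogram shifting sideways; latency added to only some frames shows up as a secondary bump beside the main peak, exactly as Sparky describes.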
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


ptuga
Posts: 44
Joined: 20 May 2020, 02:06

Re: How much do ram timings affect input lag?

Post by ptuga » 24 Jun 2020, 12:18

It would be nice if someone could test this. One important thing is to disable all the timing optimisations, as that changes certain timings on memory training.

Again, I'm pretty sure it's how input is processed rather than input lag, since the mouse feels smoother in those tracking scenarios. The higher-speed tracking ones, like Close Long Strafes, are the ones where I notice it the most. This translates to OS mouse feel too, so it's not a frametime issue.

I'm very good and smooth at tracking, with over 60% accuracy on Ascended Tracking and over 70% on Close Long Strafes. This is on 144 Hz with a 1000 Hz GPW, at a locked 288 fps.

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: How much do ram timings affect input lag?

Post by jorimt » 24 Jun 2020, 12:25

ptuga wrote:
24 Jun 2020, 12:18
Again, I'm pretty sure it's how input is processed rather than input lag, since the mouse feels smoother in those tracking scenarios. The higher-speed tracking ones, like Close Long Strafes, are the ones where I notice it the most. This translates to OS mouse feel too, so it's not a frametime issue.
Tighter RAM/better CPU performance can result in higher minimum framerates, so any improvements you might be feeling in games could be due to improved frametime performance, even at the same average framerate.

As for an improvement in the OS, as long as you have DWM enabled, you are getting V-SYNC, and visual updates are still being delivered via frames, so improved frametime performance for the OS isn't entirely out of the question there either.

That said, short of subjective observation, RAM timing adjustments/latency improvements would be incredibly difficult to test for (let alone fully isolate from other factors) in an objective manner where input lag levels are concerned.
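As a toy illustration of the frametime point above (not a claim about what RAM tuning actually delivers): two captures can share the same average framerate while one has much worse 1% lows, and that difference is what a frametime log exposes. A minimal sketch, with made-up frametime lists:

Code:

# Toy illustration: same average fps, different 1% lows / minimum framerate.
# The frametime lists (in ms) are made up for demonstration purposes.
import numpy as np

def summarize(frametimes_ms):
    ft = np.sort(np.asarray(frametimes_ms, dtype=float))[::-1]  # worst first
    avg_fps = 1000.0 / ft.mean()
    worst_1pct = ft[: max(1, len(ft) // 100)]   # slowest 1% of frames
    low_1pct_fps = 1000.0 / worst_1pct.mean()
    return avg_fps, low_1pct_fps

smooth = [3.47] * 1000                          # perfectly even pacing
spiky = [3.30] * 990 + [20.0] * 10              # same average, with stutters
for name, ft in (("smooth", smooth), ("spiky", spiky)):
    avg, low = summarize(ft)
    print(f"{name:6s}: avg {avg:6.1f} fps | 1% low {low:6.1f} fps")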
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: ASUS PG27AQN, LG 48CX VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)

schizobeyondpills
Posts: 103
Joined: 06 Jun 2020, 04:00

Re: How much do ram timings affect input lag?

Post by schizobeyondpills » 24 Jun 2020, 15:27

It's sad you folks are still constrained to one level of thinking, which is measuring frame latency and frames. We are not discussing measuring displays here. The same way you don't test your CPU overclock by measuring photon latency, you shouldn't do it with a RAM latency overclock either. Yes, there would be a noticeable difference from both a CPU OC and a RAM OC in a photon latency test, but it's a tenth of what RAM latency/CPU OC actually affect.

Intel Memory Latency Checker is a great tool to measure a RAM latency OC; aim for lower numbers (with command rate 1, not 2, and power-down mode disabled).
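For what it's worth, what an idle-latency test like MLC's essentially does is a dependent pointer chase. A very rough Python sketch of the idea follows; Python/numpy overhead dominates the absolute numbers, so only the relative growth as the buffer outgrows the CPU caches is meaningful, and the real MLC tool should be used for actual measurements:

Code:

# Rough sketch of a dependent pointer chase, the idea behind idle-latency tools
# like Intel MLC. Interpreter overhead inflates the absolute numbers badly;
# only the trend from cache-sized to RAM-sized buffers is illustrative.
import time
import numpy as np

def chase_ns_per_hop(size_bytes, hops=200_000):
    n = size_bytes // 8
    order = np.random.permutation(n).astype(np.int64)
    chain = np.empty(n, dtype=np.int64)
    chain[order] = np.roll(order, -1)   # one big random cycle: each load depends on the last
    idx = 0
    t0 = time.perf_counter_ns()
    for _ in range(hops):
        idx = chain[idx]
    return (time.perf_counter_ns() - t0) / hops

for mib in (1, 32, 256):
    print(f"{mib:4d} MiB buffer: ~{chase_ns_per_hop(mib * 2**20):.0f} ns per dependent access")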

If you reduced the latency of your brain's input and output processing, yes, you would process visual input faster, but as you all know we have five senses, which would all be affected, and not just the senses but raw thinking speed as well. It's the same with RAM latency.

RAM latency is such a fundamental quantity that in the calculation of just one frame it is incurred millions of times during memory reads and writes.
There's no point in using a display/photon latency test for RAM latency: if your CPU/GPU is limited to sending out a frame every 5 ms, then no matter how much latency you remove from earlier parts of the chain, it won't be visible in a photon latency test. Why is this so hard to see?

All NVIDIA GPUs by default use write-combined physical allocation of their memory, so the CPU doesn't send frames out instantly. Every single NVIDIA driver below 450.xx has this undocumented feature. Feel free to create the registry key below and restart to verify (NVIDIA GPUs only, of course). So now you realize all the photon latency tests everyone did are flawed, since they didn't know about this undocumented setting inside the driver, and all those minuscule changes didn't show up in them, because the CPU didn't write back to GPU memory instantly; instead it used write combining to wait for more data before sending it. Unfortunate :lol:

Code:

PATH:  Computer\HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\nvlddmkm
NAME:  DisableWriteCombining
TYPE:  DWORD
VALUE: 1
Info:  Controls the flag for memory-mapping the GPU's IO space with caching/write combining disabled (https://docs.microsoft.com/en-us/windows-hardware/drivers/ddi/wdm/ne-wdm-_memory_caching_type)
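For anyone who prefers to script the key above rather than edit it in regedit, a minimal sketch using Python's winreg module (requires an elevated prompt and a reboot afterwards; whether the driver actually honours the value is exactly the claim being debated in this thread, so treat the effect as unverified):

Code:

# Minimal sketch: create the DisableWriteCombining value described above.
# Requires an elevated (administrator) Python prompt; reboot afterwards.
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Services\nvlddmkm"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_WRITE) as key:
    winreg.SetValueEx(key, "DisableWriteCombining", 0, winreg.REG_DWORD, 1)

print("DisableWriteCombining set to 1 under HKLM\\" + KEY_PATH)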

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: How much do ram timings affect input lag?

Post by jorimt » 24 Jun 2020, 16:10

schizobeyondpills wrote:
24 Jun 2020, 15:27
It's sad you folks are still constrained to one level of thinking, which is measuring frame latency and frames. We are not discussing measuring displays here.
But the only way we see and process any aspect of system output/feedback is ultimately via frames. All the information is contained in frames. The only way we can "feel" any change is seeing it reflect in frames over a period of frames.

Where visual stimuli on a personal computer is concerned, there are only frames, which are seen through displays.

As for the idea that frames (or what's in them) aren't ultimately the point where feedback for user interaction is concerned, it's honestly beyond my interest to try to "unconstrain" my thinking that far.

However, if we're just talking about benchmark numbers that may or may not have direct visual bearing on an actual user experience in real-world situations (e.g. direct visual feedback from frame information during gaming), then sure, but I consider that another subject.

schizobeyondpills
Posts: 103
Joined: 06 Jun 2020, 04:00

Re: How much do ram timings affect input lag?

Post by schizobeyondpills » 24 Jun 2020, 16:40

jorimt wrote:
24 Jun 2020, 16:10
schizobeyondpills wrote:
24 Jun 2020, 15:27
It's sad you folks are still constrained to one level of thinking, which is measuring frame latency and frames. We are not discussing measuring displays here.
But the only way we see and process any aspect of system output/feedback is ultimately via frames. All the information is contained in frames. The only way we can "feel" any change is seeing it reflect in frames over a period of frames.

Where visual stimuli on a personal computer is concerned, there are only frames, which are seen through displays.

As for the idea that frames (or what's in them) aren't ultimately the point where feedback for user interaction is concerned, it's honestly beyond my interest to try to "unconstrain" my thinking that far.

However, if we're just talking about benchmark numbers that may or may not have direct visual bearing on an actual user experience in real-world situations (e.g. direct visual feedback from frame information during gaming), then sure, but I consider that another subject.
Frames contain much more info than their photon latency. Trying to measure the effects of RAM latency visually is like trying to see the temperature of your CPU only once it's glowing red. There is too much pipeline uncertainty in CPU -> GPU -> display to measure something that affects the CPU and its internals the most. RAM latency can, and probably does, reduce photon latency, but what is shown on that pixel might be a player object 0.5 or 0.3 points in a different position, existing closer to real-time thanks to the lower RAM latency that lets the game engine keep up with real-time instead of chasing it. See?

For now I suggest folks stick to measuring the raw base property (RAM latency, using IMLC, the Intel Memory Latency Checker) rather than trying to do it visually, because many people in this thread "claimed" their mouse feels more responsive/smoother, which is clear evidence that there's more to RAM latency than its effect on photon latency, evident from RAM latency being the most primitive/basic building block of PCs.

>The only way we can "feel" any change is seeing it reflect in frames over a period of frames.
There is also audio =)

Also, visual testing would be great if you managed to insert so-called probes throughout the entire pipeline and combine them with the visual output, since then the decimals of frame photon latency would actually be backed by software-captured statistics of things such as engine latency, DPC latency, and input-to-engine time, rather than being attributable to possible error/variation.

I just fail to see how an Arduino project would measure something so fundamental in great detail or hold up as hard evidence, unless you work around it and back it up with 30+ probing variables across the entire pipeline.

If we had an entire latency-chain analysis setup, we could see via an HDMI protocol analyzer that GPU-to-output takes xx.x5 ms less time with lower RAM latency, then check that the user-mode driver library sitting behind the DirectX API also has xx.x3 ms less latency, and then check the probe we placed at the submission of command/render buffers for Present() calls and see the same latency reduction there (due to lower RAM latency). Then we would have much harder, academic-tier evidence than just a homemade Arduino photon detector.
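As a toy sketch of that probe idea (the stage names, log format, and the load_events() helper are all hypothetical, not from any real tool), combining per-stage timestamps would let you attribute where a latency reduction actually happens:

Code:

# Toy sketch of "probes along the pipeline": each probe logs
# (frame_id, stage, timestamp_us); per-stage deltas are compared between runs.
from collections import defaultdict
from statistics import median

STAGES = ["input", "engine", "present", "photon"]  # hypothetical probe points

def stage_deltas(events):
    """events: iterable of (frame_id, stage, t_us) -> {stage pair: [delta_us, ...]}"""
    frames = defaultdict(dict)
    for frame_id, stage, t_us in events:
        frames[frame_id][stage] = t_us
    deltas = defaultdict(list)
    for stamps in frames.values():
        for a, b in zip(STAGES, STAGES[1:]):
            if a in stamps and b in stamps:
                deltas[f"{a}->{b}"].append(stamps[b] - stamps[a])
    return deltas

def compare(run_a, run_b):
    da, db = stage_deltas(run_a), stage_deltas(run_b)
    for pair, values in da.items():
        other = db.get(pair, [])
        b_med = median(other) if other else float("nan")
        print(f"{pair:18s} median A {median(values):8.1f} us | median B {b_med:8.1f} us")

# Usage (hypothetical loader): compare(load_events("ram_cl16.log"), load_events("ram_cl14.log"))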

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: How much do ram timings affect input lag?

Post by jorimt » 24 Jun 2020, 17:27

schizobeyondpills wrote:
24 Jun 2020, 16:40
but what is shown on that pixel might be a player object 0.5 or 0.3 points in a different position, existing closer to real-time thanks to the lower RAM latency that lets the game engine keep up with real-time instead of chasing it. See?
But that's my point...

Whatever amount RAM adjustments reduce input lag by is still ultimately going to be reflected in the frame information scanned into the display. If RAM tuning reduces input lag, it will cause that information to appear in the previous frame scan instead of the current one (or in an earlier part of it), etc. This is the only way users are going to visually "feel" the difference. There isn't some second sense that will make their mouse feel more responsive unless they see it reflected on the display.
schizobeyondpills wrote:
24 Jun 2020, 16:40
I just fail to see how an arduino project would measure something so fundamental with great detail or hold up as hard evidence, unless you work around it and back it up with 30+ probing variables of entire pipeline.
Again, fully isolating RAM tuning differences from the remaining factors with button-to-pixel methods would be difficult, but measuring overall differences in averaged input lag between two scenarios, aka RAM tuning A and then RAM tuning B, while leaving all other factors identical (especially with the camera method I used in my G-SYNC tests, which captures the entire screen; example: viewtopic.php?f=5&t=3441&start=510#p52233), may still capture results if they are actually in the measurable realm of such methods (if not, we're talking sub-1 ms input lag reductions solely via RAM tuning, at most).
schizobeyondpills wrote:
24 Jun 2020, 16:40
There is also audio =)
Indeed, but I did say "visual," and then we're getting into audio/video sync mismatch, which is another (often overlooked) issue.

1000WATT
Posts: 391
Joined: 22 Jul 2018, 05:44

Re: How much do ram timings affect input lag?

Post by 1000WATT » 24 Jun 2020, 17:50

I don’t understand what you are trying to achieve.
Yes, timings affect the whole system, just like extra MHz on the CPU. But what is the use of going to forums and saying that it is "very important"? The testing tools that we offer do not suit you. There is no money for more accurate equipment.
schizobeyondpills wrote:
24 Jun 2020, 16:40
For now I suggest folks stick to measuring the raw base property (RAM latency, using IMLC, the Intel Memory Latency Checker)
Well we did this test, but what next? What do we do with the results?

I imagine it like this: a comet is flying toward the Earth. You rush from city to city and shout: it will fall soon, we will all die, this is the punishment of the gods. Even without you, we know this will happen. But why scream like that?
I often do not clearly state my thoughts. google translate is far from perfect. And in addition to the translator, I myself am mistaken. Do not take me seriously.

schizobeyondpills
Posts: 103
Joined: 06 Jun 2020, 04:00

Re: How much do ram timings affect input lag?

Post by schizobeyondpills » 24 Jun 2020, 20:21

@jorimt
Again, your point is correct: there will be visible differences, yes, but those differences won't be accurate at all or represent what's possible with a RAM latency OC. How is this so hard for you people to understand? Did you disable write combining on your NVIDIA GPU before doing your G-SYNC tests? Did you even know about it? I guess not, since there was not a single result on Google about it until I shared it with the world, yes, me. So it will show a 0.3 pixel/ms difference, but the real 1 ms difference or more hides behind the write-combined bottleneck of CPU->GPU memory, so that RAM timing OC is now shadowed by a later stage of your photon latency pipeline.

I will skip your second point and answer it all in one go. You can, no problem, push your car engine to 20% more horsepower and it will go faster (notice the FPS/photon latency analogy here), yes, no doubt. But if you tune the tires, use better fuel, adjust the car body, do ECU tuning and the exhaust, and adjust the aerodynamics to fit the new engine, it will give you far more than 20%. No sane person would put F1 tires on his family car; it makes no difference (duh). However, on an F1 car those same tires are what make the biggest difference after the engine.

RAM latency is an abstract term/power. If you just measure how that abstract power translates into visible photon latency, you miss out on a lot of possible optimizations, because the abstract power needs to be combined with other components optimized for it before you can take full advantage of it.

If you want to take full advantage of your RAM latency OC, you have to adjust your OS/system/hardware accordingly; otherwise it falls apart and you don't notice any difference. This is why all the folks in the threads so far claim it doesn't matter: they try the change but don't notice anything. It doesn't matter if you have a Lambo/Ferrari in your garage if you have no keys for it and can't drive it.

So one must disable every power-saving feature in the entire pipeline to see how much of a difference RAM latency actually makes; it's like walking compared to flying once everything is adjusted for it.

In your specific case, I noticed you mentioned that you run a "balanced" power plan, and your OS is Win10, which has a lot of bloat (especially in newer versions) as well as Meltdown/Spectre mitigations and many under-the-hood performance tweaks. No RAM OC will give you less latency if you don't adjust the entire system for it. It's like OC'ing your 8700K to 5 GHz while running a balanced/power-saving OS plan; it makes no sense, because one limits or unlocks/boosts the other.

If you have time, please do try to disable all power-saving features in your BIOS, debloat and put your OS in High Performance mode, then change your RAM timings to CR1 and you will notice the huge difference. (Also, please don't use G-SYNC.)
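For the Windows power-plan part of this, a minimal sketch, assuming the stock powercfg utility and its built-in SCHEME_MIN (High performance) alias; BIOS power-saving options and RAM timings still have to be changed by hand:

Code:

# Minimal sketch: switch Windows to the built-in High performance power plan
# via powercfg. Run from an elevated prompt; BIOS/RAM settings are not touched.
import subprocess

def set_high_performance():
    # SCHEME_MIN is powercfg's alias for the High performance plan.
    subprocess.run(["powercfg", "/setactive", "SCHEME_MIN"], check=True)
    # Print the active scheme so the change can be verified.
    subprocess.run(["powercfg", "/getactivescheme"], check=True)

if __name__ == "__main__":
    set_high_performance()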




@1000WATT

The testing tools I am being offered are for measuring photon latency, which is very well suited to testing displays. They both fit into the same basket; it's not an abstract thing like RAM latency, which is far harder to measure and notice (for the reasons I presented above).

I don't ask for money; all I ask is that you not deny it's real. My goal is to raise awareness: someone who has connections and/or works in a field where they are capable of measuring this might pick it up from these threads and provide hard evidence. The best I can do is show you links/info from RAM/CPU architecture books and the real world (chipwiki).

And there are also very vocal users who throw around labels such as "wild goose chase" and claim timings don't affect anything, which I find very rude and dishonest, since I have provided nothing but objective proof from books/chipwiki.


P.S. Wish this site had a multi quote button.
