How much do ram timings affect input lag?

Separate area for niche lag issues, including unexpected causes and electromagnetic interference (ECC = retransmits = lag). Interference (EMI, EMF) of all kinds (wired, wireless, external, internal, environmental, bad component) can cause error-correction latencies, like a bad modem connection. Troubleshooting may require a university degree. Your lag issue is likely not EMI. Please read this before entering the sub-forum.
Forum rules
IMPORTANT:
This subforum is for advanced users only: niche or unexpected lag issues such as electromagnetic interference (EMI, EMF, electrical, radiofrequency, etc.). Interference of all kinds can cause error-correction (ECC) latencies like a bad modem connection, except internally in a circuit. ECC = retransmits = lag. Troubleshooting may require a university degree. Your lag issue is likely not EMI.
🠚 You Must Read This First Before Submitting a Post or Reply
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: How much do ram timings affect input lag?

Post by Chief Blur Buster » 25 Jun 2020, 14:28

Although a valuable piece of the jigsaw puzzle, it is well outside the domain of Blur Busters at this time, at this specific stage in the journey of the refresh rate race to retina refresh rates.

There are actually many other businesses that have developed new tools, new visualizations, and new measurement methodologies -- all because of posts/articles/tests they found on Blur Busters.

This thread may very much be useful for alerting vendors to that.

But for end-user debate and end-user testing, at least in the near term -- I think this thread's purpose is complete for now, and it simply remains archived material to come up in future googling. This thread is now moved to the Lounge forum (with a temporary redirect in the Input Lag forum).
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1 -- even to newbies and people you disagree with!
  2. Please report rule violations. If you see a post that violates forum rules, report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

Sparky
Posts: 682
Joined: 15 Jan 2014, 02:29

Re: How much do ram timings affect input lag?

Post by Sparky » 25 Jun 2020, 14:58

schizobeyondpills wrote:
24 Jun 2020, 20:46
Sparky wrote:
24 Jun 2020, 20:43
Schizo, let's say your RAM is magically running at the same latency and bandwidth as your L3 cache, and in this particular game, framerate stays the same. What specific benefits do you expect to see from having the faster RAM?
I'd much rather operate in the real world, limited by physics (including time and space constraints), than insult it by going into a magic world that doesn't exist. Any questions regarding the real world I'm free to answer.
If you have any coherent idea about how the system works, you should be able to make a prediction about what performance metrics will actually change when you make RAM faster. I literally offered you the best case scenario for whatever argument it is you've been trying to make, and you can't even come up with a plausible benefit, let alone a measurable one. If you want to avoid eating your foot in the future, measure whatever difference in performance you think exists BEFORE posting about it.

schizobeyondpills
Posts: 103
Joined: 06 Jun 2020, 04:00

Re: How much do ram timings affect input lag?

Post by schizobeyondpills » 25 Jun 2020, 23:04

Again, I apologize for poor formatting; it's very painful, time-consuming, and repetitive to format properly using the current features. I hope multi-quote and quote-selection buttons can be added (so that only the selected text is inserted into the reply field). Quoting someone who quoted you results in a double quote of your original post showing in your new post, which is rather pointless, makes everything appear three times, and breaks focus and immersion.

@jorimt
You do realize that what you're saying basically suggests this can't be practically tested, so we just have to take your (and a single page of a single technical article's) word for it?
Oh, it can; I never said it can't, or claimed anything close to that. I did, however, say that measuring an abstract thing such as a time factor (RAM latency) through a single instance of frame-to-photon latency is useless and measures 1/100th of what RAM latency does. There are custom Linux kernels which CPU side-channel researchers use for clock-cycle timing analysis of the CPU/system, for example; Travis Downs on his blog can measure the number of hidden/mask registers inside CPUs. It's possible to measure everything; however, a good measuring method must be created so as not to miss any effect of what you are measuring. That's what I am saying: a $20 Arduino photon-latency rig is a joke of a measuring tool for something as fundamental as RAM latency, which exists in nanoseconds and is exercised millions of times per second -- at least on its own, without 50+ OS hardware/software measuring probes, and even then the measurements would lack resolution and proper depth. Can this be practically tested? Absolutely yes. Can it be done with $20 Arduino photon capture? Absolutely not -- 1/10th yes, for a high-school project, but not hard academic-research-tier evidence.
and what you're suggesting would require the majority of PC gamers (many of which just want to play the darn game) to go far more out of their way than navigating the inconveniences of basic forum post formatting.
Input lag is not a topic for the majority of PC gamers (especially those who just want to play the game). It's for those who have a reason to go deeper, be it the overclocking hobby, eSports, or just a huge intolerance to input lag. I fail to see why you would mention this, since by that point the entire Input Lag section of this forum is useless; they just want to play the darn game, not waste hours researching, tweaking, adjusting, overclocking, or reading forums.
And further still, you're not even bothering to offer explicit steps or instructions, since that, as opposed to pure talk, would make you accountable and vulnerable to actual critique.
This is a discussion, not a lecture or a guide by me. However, going through my posts, I did provide info about what to tune to easily notice a difference (high-performance power plan, debloat the OS, set RAM command rate to 1, increase tREFI, lower tRFC, disable self-refresh and power-down mode -- https://github.com/integralfx/MemTestHe ... 20Guide.md is a good starting point for general RAM overclocking). One must learn about one's own system and how to overclock it to get the most out of it; no guide will do the thinking for you.
vulnerable to actual critique.
I have been open to critique this whole time; however, my facts are backed by actual computer architecture, while so far everyone in this thread has only been able to throw out weak "placebo", "wild goose hunt", and "it doesn't matter" as arguments.
but then you didn't even lay out the steps to properly use it when other posters ended up using it incorrectly, and I and RealNC had to ultimately step in to provide them.
It was my impression that folks chasing input lag don't just want to play the darn game and will read the manual that comes with it. Weird.
That, and Intel MLC is yet another benchmark that does nothing to prove real world benefits in end user gaming.
It's a tool to measure RAM latency under different scenarios; it's not meant to prove anything, only to measure raw RAM latency. Those who do decide to overclock their RAM for better input/responsiveness/smoothness can verify the "placebo" claims afterwards by measuring the RAM latency they tuned.
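For readers without Intel MLC handy, the kind of raw-latency measurement being discussed can be sketched even in plain Python: a pointer-chase through a random single-cycle permutation serializes every access, so per-hop time rises as the working set outgrows the CPU caches. This is only a rough illustration, not MLC's actual method -- interpreter overhead adds a large constant, and the array sizes and hop count below are arbitrary choices:

```python
import random
import time

def single_cycle(n, seed=0):
    """Sattolo's algorithm: a random permutation that forms one single
    n-cycle, so chasing it never falls into a short cached loop."""
    rng = random.Random(seed)
    p = list(range(n))
    for i in range(n - 1, 0, -1):
        j = rng.randrange(i)  # j strictly less than i
        p[i], p[j] = p[j], p[i]
    return p

def ns_per_hop(n, hops=200_000):
    """Average time per dependent access while chasing the cycle.
    Each hop must finish before the next can start, so memory
    latency (cache vs DRAM) shows through despite interpreter cost."""
    p = single_cycle(n)
    idx = 0
    t0 = time.perf_counter()
    for _ in range(hops):
        idx = p[idx]
    return (time.perf_counter() - t0) / hops * 1e9

small = ns_per_hop(1 << 10)   # a few KB: cache-resident working set
large = ns_per_hop(1 << 22)   # ~100+ MB incl. int objects: mostly DRAM
print(f"cache-resident: {small:.0f} ns/hop, DRAM-bound: {large:.0f} ns/hop")
```

The absolute numbers are dominated by CPython overhead; what the sketch shows is the *relative* jump once accesses start missing cache, which is the quantity RAM timing tuning moves.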
You do realize this is the case for almost every supplemental hardware and software solution in existence (including RAM tuning parameters)? They are all ultimately compensatory mechanics for imperfect hardware performance, which no amount of tuning by the end user will 100% correct.

Some of this is limitations of modern PCs, and some of it is limitation of physics. So long as perfection isn't achievable (hint: it never will be), these solutions will remain necessary.
Not all solutions have the same compensation factor in how much they affect bleeding-edge performance.
You forgot to mention that some of these solutions are due to very poor engineering and lack of knowledge (knowledge being things which are already possible and published/taught but not applied, rather than things not yet known or researched), as well as management of project funds going to other areas (such as marketing/art/design) rather than optimization.
And so does the cable that brings power to the PC. Even your greatest skeptics here will admit that RAM capability has an impact on overall performance, but what's your point beyond that? What do you want everyone to do, and HOW do they do it to your satisfaction?
They will admit it matters; however, they will deny -- with very strong, empty arguments and personal attachment -- how much it matters, claiming it is very small/impossible to notice, while the exact opposite is true (which I defend with arguments and evidence from computer engineering pioneers rather than personal claims -- you cannot notice something if your OS/PC/CPU goes to sleep 1000 times a second on the Balanced power plan).
And? No one is claiming RAM latency doesn't matter here, but that single page of one article isn't providing any actionable instructions or insight to the average end user.
People are claiming it doesn't matter because it's in nanoseconds, using their own perception of time (seconds) rather than that of a CPU operating at 5 billion cycles a second (a 0.2 ns tick -- the average human has about one thought a second, so a nanosecond doesn't feel like much in your own head).
That single page of one article is by a computer engineering pioneer with a PhD under Donald Knuth (https://sites.google.com/site/dicksites/ ); it's a bit insulting to call it a "single page of one article".
Again, an assumption about the end user: this forum, and especially the Input Lag forum, is not meant for the end user who just wants to play their games, is it? Forums are for discussion, and that's what I am here for. As for actionable instructions, I have provided, as said above, hints and things to change; however, I cannot write out every possible BIOS of every possible motherboard as step-by-step instructions.


@sparky
If you have any coherent idea about how the system works, you should be able to make a prediction about what performance metrics will actually change when you make RAM faster. I literally offered you the best case scenario for whatever argument it is you've been trying to make, and you can't even come up with a plausible benefit, let alone a measurable one. If you want to avoid eating your foot in the future, measure whatever difference in performance you think exists BEFORE posting about it.
Sparky wrote:
24 Jun 2020, 20:43
Schizo, let's say your RAM is magically running
As I said, we live in a real world backed by actual physical constraints and laws, as well as the results of actual engineering and the quality of that engineering, both at the hardware and software levels, rather than your magical world, which
1. is not real
2. breaks laws of physics
3. has no physical constraints or properties
4. runs on a "particular" game which you defined no properties of
5. does not have any hardware such as mouse/keyboard
6. has no one using those peripherals or looking at the monitor
7. runs on magic (your own words)
Sparky wrote:
24 Jun 2020, 20:43
Schizo, let's say your RAM is magically running at the same latency and bandwidth as your L3 cache, and in this particular game, framerate stays the same. What specific benefits do you expect to see from having the faster RAM?
Attempting to get a view into your magical world is impossible, even with the pills I just took; it has no definition of anything except that it's magically running. In the real world, however, frames per second is just a measure of the amount of visual/scene computation (frames) the engine can do per second of time; it sits somewhere in the middle of the end-to-end latency. Meaning: reducing the latency of anything through various means (CPU OC, uncore OC, GPU clock OC, GPU memory OC, RAM latency OC) will still reduce end-to-end input latency -- and not just that, some things will affect time through time, meaning consistency, rather than a single instance of latency (click-to-photon) of one frame.
The same way, increasing GPU clock/GPU memory clock reduces the time/latency of frames, even if you are already at a maxed/consistent framerate.
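The claim above -- that shaving latency off any pipeline stage shortens the end-to-end chain even when the framerate is capped -- can be illustrated with back-of-envelope arithmetic. The stage names and millisecond figures below are hypothetical, chosen only to show the additive structure:

```python
# End-to-end latency as a sum of pipeline stages (hypothetical numbers, ms).
# Even with the framerate capped, shaving time off any stage other than
# the cap interval still shortens click-to-photon latency.
stages_ms = {
    "usb_poll": 0.5,   # average wait for the next 1000 Hz mouse report
    "game_sim": 3.0,   # simulation + input processing
    "render":   4.0,   # CPU + GPU frame work
    "scanout":  8.0,   # display scanout/refresh
}
before = sum(stages_ms.values())

stages_ms["game_sim"] -= 0.5   # e.g. faster memory shaves simulation time
after = sum(stages_ms.values())

print(f"before: {before:.1f} ms, after: {after:.1f} ms")
```

The framerate (and any FPS counter) is unchanged in this toy model, yet end-to-end latency dropped by the full amount saved, which is why "framerate stayed the same" does not by itself rule out a latency improvement.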

schizobeyondpills
Posts: 103
Joined: 06 Jun 2020, 04:00

Re: How much do ram timings affect input lag?

Post by schizobeyondpills » 26 Jun 2020, 00:02

https://en.wikipedia.org/wiki/Streetlight_effect


The streetlight effect, or the drunkard's search principle, is a type of observational bias that occurs when people only search for something where it is easiest to look.[1][2][3][4] Both names refer to a well-known joke:

A policeman sees a drunk man searching for something under a streetlight and asks what the drunk has lost. He says he lost his keys and they both look under the streetlight together. After a few minutes the policeman asks if he is sure he lost them here, and the drunk replies, no, and that he lost them in the park. The policeman asks why he is searching here, and the drunk replies, "this is where the light is".[2]
Ironic, isn't it?

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: How much do ram timings affect input lag?

Post by jorimt » 26 Jun 2020, 07:35

schizobeyondpills wrote:
25 Jun 2020, 23:04
quoting someone who quoted you results in double quote of your original post showing in your new posts which is rather pointless and makes everything appear 3 times as well as break focus and immersion.
You delete all the nested quotes (and their content) except the outermost one per reply. Then you reuse (copy/paste) that outer quote markup for further quotes, closing each with a "[/quote]".
schizobeyondpills wrote:
25 Jun 2020, 23:04
$20 arduino photon latency is a joke measuring tool for something as fundamental as RAM latency which exists in nanoseconds and is used millions of times per second.
I never said anything about a $20 arduino.

I'll also note here that such configurations, no matter how capable, only measure a single point on the screen, so they are only viable for testing synced methods. For V-SYNC off, you'd need something to capture the entire screen in one go, and to get a noise threshold/error margin low enough to start picking up any differences RAM tuning itself would bring, you'd need a test camera well above 1000 FPS to reliably capture such differences (as they'd likely be very small, and accumulated over a vast number of frames).
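To put rough numbers on why a ~1000 FPS capture rig struggles here: camera quantization alone adds noise with a standard deviation of about 0.29 ms, so resolving a small average difference requires averaging many trials. A minimal statistics sketch, assuming a hypothetical 50 µs effect size and ignoring the (much larger) jitter of the game pipeline itself:

```python
import math

# How many click-to-photon samples to resolve a tiny average latency
# difference through coarse camera quantization? (Illustrative only.)
camera_fps = 1000
quantum_ms = 1000 / camera_fps         # 1 ms of timing resolution per frame
sigma_ms = quantum_ms / math.sqrt(12)  # std dev of uniform quantization noise
effect_ms = 0.05                       # hypothetical 50 us RAM-tuning effect

# Samples per condition so the standard error of the mean sits well
# below the effect we hope to detect (z ~= 2 for ~95% confidence).
z = 2.0
n = math.ceil((z * sigma_ms / (effect_ms / 2)) ** 2)
print(f"quantization sigma ~{sigma_ms:.3f} ms -> need ~{n} samples per condition")
```

In practice the game's own frame-to-frame jitter is far larger than the quantization term, pushing the required sample count higher still, which is the point being made above.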
schizobeyondpills wrote:
25 Jun 2020, 23:04
input lag is not a topic for majority of PC gamers(especially those who just want to play the game). Its for those who have a reason to go deeper, be it overclocking hobby, eSports, or just huge intolerance to input lag. I fail to see why you would mention this since with your point entire Input Lag section is useless on this forum, they just want to play the darn game, not waste hours researching anything, tweaking, adjusting, overclocking or reading forums.
Some do, some don't. You give the impression that everyone should be tweaking every aspect of their system to the nth degree, or they're scum of the earth (you said much worse in some of your first posts, many of which only us moderators had the pleasure of seeing), so that's all I was targeting when I said "play the darn game."
schizobeyondpills wrote:
25 Jun 2020, 23:04
This is a discussion, not a lecture or a guide by me.
Then you need to be a little less aggressive in your communication style, because you come across as someone who thinks they have all the answers and know best for everyone, yet you, again, provide very little material for practical application, and zero real-world numbers to back anything up.

I'll discuss theory (and probably many others here will) with you all day, but to call every bit of this objective fact (whether it is or isn't) is where you're getting the unwanted skepticism. Dogmatism and zealotry don't get any of their users very far (unless of course, it's by force, at which point it's no longer a "discussion").

Also, "theory" isn't a dirty word. In science it often denotes a well-substantiated explanation that hasn't yet been directly tested in every respect. See Einstein.
schizobeyondpills wrote:
25 Jun 2020, 23:04
however going through my posts I did provide info about what to tune to easily notice difference (high perf power plan,debloat OS, set ram CR to 1, increase tREFI, lower tRFC, disable self-refresh and power down mode - https://github.com/integralfx/MemTestHe ... 20Guide.md is a good starting point for general overclocking of RAM ).
I personally know about most of this. You also seem to assume since I have Windows 10, that I haven't attempted to de-bloat it. Sure, there are aspects of the OS that will always be heavier than previous iterations, but as for extra services, processes, or programs, if they aren't running, they aren't actively bloating, just taking up extra space on the hard drive.
schizobeyondpills wrote:
25 Jun 2020, 23:04
one must learn about his own system and how to overclock it to get the most of it, no guide will do the thinking for you.
But a guide will give you a start. I'm a self-teacher, but I (and everyone else) still need the material.
schizobeyondpills wrote:
25 Jun 2020, 23:04
it was my impression that folks chasing input lag dont just want to play the darn game and will read the manual that comes with it. weird.
It was my impression that formatting the quotes here was easy, yet, shock of shocks, other people are different than me, and find certain things more trouble than I do! Apply the same to the above, and you've got your outcome in that thread.

Also, the steps I provided in that thread were my own. If you've never used a command line to run a program, then the instructions Intel provide on the download page for that program were useless.
schizobeyondpills wrote:
25 Jun 2020, 23:04
its a tool to measure RAM latency under different scenarios, its not meant to prove anything, only measure raw ram latency. those who do decide to overclock their RAM for better input/responsivness/smoothness can verify their "placebo" claims afterwards by measuring RAM latency which they tuned.
Again, I don't think anyone is arguing that an optimized and highly tuned system is better than not. Their argument is whether RAM tuning, all by itself, really has a measurable difference that can be felt over all other changes. Possibly, but beyond subjective impressions, we don't have any real world data to back that up.
schizobeyondpills wrote:
25 Jun 2020, 23:04
you forgot to mention some of these solutions are due to very poor engineering and lack of knowledge(knowledge being things which are already possible and published/taught but not applied rather than things not yet known or researched), as well as management of project funds in other areas (such as marketing/art/design) rather than optimization.
I didn't forget to mention that at all. I was assuming we were talking specifically about the end user experience, and what the end user can do with the hardware and software already available to them. The end user can do nothing about base engineering of modern day computers.

In the mainstream, we currently have a choice of Intel and AMD (and soon to be Apple, though who knows if it will be viable for gaming purposes) for CPU, Nvidia and AMD (and to a lesser degree Intel) for GPU, and Windows, OSX, and Linux for OS. That's about it. Beggars can't be choosers, and the end user isn't far off from a beggar in this respect.
schizobeyondpills wrote:
25 Jun 2020, 23:04
They will admit it matters, however they will deny with very strong, empty argument, and personal attachment that how much it matters is very small/impossible to notice while exact opposite is true (which I defend with arguments and evidence from computer engineering pioneers rather than personal claims - you cannot notice something if your OS/PC/CPU goes to sleep 1000 times a second on Balanced power plan)
If you're so confident in your knowledge, who cares what they think? My advice is to let it go and find like-minded people where able, and ignore the rest, because you're never going to get everyone to think like you, let alone agree with you.

As for the balanced power plan, you do know there is virtually no difference between it and the high performance plan when using an intensive application, right? The difference is mainly during desktop use, where there is a mix of activity and inactivity. I'd also add here that while I use the balanced power plan, I also unpark my cores and disable all sleep/power saving options in the profile, so the only thing that really scales is my CPU frequency (can go as low as 5%) when idle.
schizobeyondpills wrote:
25 Jun 2020, 23:04
People are claiming it doesnt matter because its in nanoseconds, using their perception of time (seconds) rather than that of CPU operating at 5 billion cycles a second ( 0.2ns tick - average human 1 thought a second so nanosecond is not much in your own head).
Yes, the internet sucks, what's new? I'd say this forum is tame compared to the rest. Try Guru3D or ResetEra.
schizobeyondpills wrote:
25 Jun 2020, 23:04
That single page of one article is from a computer engineering pioneer with a PhD under Donald Knuth (https://sites.google.com/site/dicksites/ ), its a bit insulting to call it "single page of one article".
I'm not insulting the article (which in layman's terms accurately states that RAM throughput is important to overall system performance), I'm "insulting" (that's a bit strong, I'd say "questioning") your particular usage of it. We get it, but again, there is no actionable info for the end user there, just basic facts about computing.

Anyway, I'm running out of things to say. Again, I think we agree that an optimized and highly tuned system is better than not, so I'm not sure what's left to discuss (let alone disagree about).
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: ASUS PG27AQN, LG 48CX VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)

Tell
Posts: 21
Joined: 09 May 2018, 07:27

Re: How much do ram timings affect input lag?

Post by Tell » 23 Jan 2022, 11:09

The tFAW timing absolutely changes the mouse response. Going from 6-9-36 tFAW to 4-6-16 tFAW produces a very noticeable change. This is obvious at the desktop, so it has nothing to do with fps or frametimes.

Maybe this shouldn't be happening. Maybe it only happens on certain processor architectures, or maybe it only happens on defective motherboards with buggy BIOSes, but it's definitely not placebo on my AMD system.

I know this is an old thread but as far as I can tell there isn't anywhere else on the internet talking about this. Why is it off topic?

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: How much do ram timings affect input lag?

Post by Chief Blur Buster » 06 Jul 2022, 18:51

This old thread was created long ago, before the Rare/EMI Issues forum existed.

Since this is an issue in that sphere, I'm going to move it there.

RAM timings are one of those things that merit further investigation by professionals more qualified to diagnose this.

There could theoretically be unexpected left-field causes -- like certain RAM timings generating EMI on cheap motherboards, creating a trick latency effect (where you thought it was the RAM latency itself, when it was really a domino effect causing a different latency, such as USB error-correction latency).

Because of this theory, I've moved this obscure issue to the Rare Lag Issues forum. This discussion deserves more incubation by people more experienced in this matter, but I would very much love scientists, researchers, and university Ph.D.s to determine why RAM latencies may cascade to human-noticeable mouse latency and game latency issues. After all, as the sayings go: death by a thousand paper cuts; a million nanoseconds is a millisecond; and so on.

There are many obscure theories as to why changing RAM timings may cascade to visible lag, for unexpected reasons:
- Bugs in firmware, software, drivers, or elsewhere, that specific memory latencies might trigger;
- Memory-critical inner loops in some driver (e.g. bigger CPU spikes in a driver when memory has high latency);
- Massive error-correction storms occurring somewhere else (on a different bus/circuit, since consumer RAM is not error-corrected);
- Resonant RFI effects (specific memory frequencies start making other circuits underperform because of tight S/N margins);
- Wild-goose and red-herring effects (the mere *action* of changing RAM settings created an accidental latency change, e.g. fresh computer reboots can improve/worsen lag);
- Pure placebo (definitely common, but not the guaranteed answer for every single gaming rig);
- Chipset bugs (certain RAM timings amplify into something more inefficient);
- Flow-control latency behaviors (certain memory timing changes trigger more inefficient flow control in some firmware/driver/etc.);
- Coincidental effects (something you did at the same time, such as rebooting -- rebooting a computer will often reset error-correction algorithms and temporarily restore faster bus/wire/circuit protocols, and the Windows operating system has behaviors that perform better on a freshly rebooted system);
- Etc.

With mouse reports arriving only 1000 times per second, it really takes a LOT of memory accesses per mouse report for RAM timing changes to cascade into visible mouse latency behaviors -- and many latencies are definitely not human-feelable, while others are, as seen in The Amazing Human Visible Feats Of The Millisecond. There are probably lots of placebo reports and lots of legitimate reports in this sphere; the problem is scientifically determining which, for a specific out-in-the-field computer system on the consumer's premises.
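"A million nanoseconds is a millisecond" can be made concrete: at a 1000 Hz mouse report rate, a hypothetical per-access timing improvement must sit on the critical path of an enormous number of serialized DRAM accesses before it accumulates to even one report interval. A back-of-envelope sketch (all figures illustrative, not measurements):

```python
# How many dependent DRAM accesses per mouse report would a RAM-timing
# change need to touch before it adds up to something humanly noticeable?
report_interval_ns = 1_000_000   # 1000 Hz mouse -> 1 ms between reports
latency_saving_ns = 10           # hypothetical per-access improvement
perceptible_ms = 1               # roughly the scale people argue about

accesses_needed = perceptible_ms * 1_000_000 // latency_saving_ns
print(f"{accesses_needed:,} serialized DRAM accesses to save {perceptible_ms} ms")
```

A hundred thousand *serialized* (non-overlapped, cache-missing) accesses on the critical path of every single report is a lot, which is why cascades, bugs, and domino effects like those listed above are more plausible explanations for human-feelable differences than the raw nanoseconds alone.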

Blur Busters exists because we did not dismiss 240 Hz displays or 8000 Hz mice, so it is right up our alley to permit this discussion to take place. Sometimes there is a lot of speculation in situations where we are unable to troubleshoot the precise cause of certain lag issues.

It would be fantastic if some professional could trace some of this out.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

