How much do ram timings affect input lag?

Brainlet
Posts: 100
Joined: 30 May 2020, 12:39

Re: How much do ram timings affect input lag?

Post by Brainlet » 24 Jun 2020, 20:25

1000WATT wrote:
24 Jun 2020, 17:50
I imagine it like this. A comet flies to the earth. You rush from city to city and shout: it will fall soon, we will all die, this is the punishment of the gods. Even without you, we know that this will happen. But why scream like that?
Because potentially irreversible damage is being done by hardware and software engineers, severely impacting the quality of mouse input. A gigantic portion of PC players use controllers nowadays, which aren't as susceptible to input lag, and engineers and developers focus on delivering more and more frames at the cost of input lag.
There is something about RAM that severely impacts mousefeel, and the more information people gather, the sooner the puzzle of optimization can be solved, removing a major latency bottleneck in current systems. I don't think there are affordable tools today that can beat human perception (even allowing for some placebo here and there); 1000 fps cameras simply aren't enough. And as previously mentioned, there can be a lot of variation in the pipeline leading to undesired and inconsistent behavior even when framerates are stable. For example, a 240 Hz monitor without sync: at a very basic level you'd have variation between monitor refresh and frame generation, between frame generation and the moment the mouse is polled, and between the poll and the real-world mouse movement. All these variations severely impact mouse input, and there are many more of them.
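To make that concrete, here is a minimal sketch (all rates, frame times, and jitter values are illustrative assumptions, not measurements) of how those three free-running clocks alone spread out input-to-photon delay, even when frame times are perfectly stable:

Code: Select all

import random

POLL_MS = 1.0            # 1000 Hz mouse
FRAME_MS = 4.2           # stable ~240 fps render loop
SCANOUT_MS = 1000 / 240  # 240 Hz refresh, no sync

def input_to_photon(event_ms):
    # input waits for the next frame to sample it...
    frame_start = (event_ms // FRAME_MS + 1) * FRAME_MS
    frame_done = frame_start + FRAME_MS
    # ...then the finished frame waits for the next scanout
    scanout = (frame_done // SCANOUT_MS + 1) * SCANOUT_MS
    return scanout - event_ms

delays = [input_to_photon(i * POLL_MS + random.random()) for i in range(10_000)]
print(f"mean {sum(delays) / len(delays):.2f} ms, "
      f"spread {max(delays) - min(delays):.2f} ms")

Even this toy model shows several milliseconds of spread from phase alignment alone, before any hardware variation is added.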
Starting point for beginners: PC Optimization Hub

Sparky
Posts: 682
Joined: 15 Jan 2014, 02:29

Re: How much do ram timings affect input lag?

Post by Sparky » 24 Jun 2020, 20:43

Schizo, let's say your RAM is magically running at the same latency and bandwidth as your L3 cache, and in this particular game, framerate stays the same. What specific benefits do you expect to see from having the faster RAM?

schizobeyondpills
Posts: 103
Joined: 06 Jun 2020, 04:00

Re: How much do ram timings affect input lag?

Post by schizobeyondpills » 24 Jun 2020, 20:46

Sparky wrote:
24 Jun 2020, 20:43
Schizo, let's say your RAM is magically running at the same latency and bandwidth as your L3 cache, and in this particular game, framerate stays the same. What specific benefits do you expect to see from having the faster RAM?
I'd much rather operate in the real world, limited by physics (including time and space constraints), than insult it by retreating into a magic world that doesn't exist. Any questions regarding the real world I'm free to answer.

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: How much do ram timings affect input lag?

Post by jorimt » 24 Jun 2020, 21:21

schizobeyondpills wrote:
24 Jun 2020, 20:21
P.S. Wish this site had a multi quote button.
There's a roundabout way you can do this by hitting the quote button on my message, opening another user's message in another tab, and then manually copy/pasting the quote from one tab to the other (which is what I do), but yeah, it's not currently an intuitive method.
schizobeyondpills wrote:
24 Jun 2020, 20:21
@jorimt
again, your point is correct, there will be visible differences, yes, but those differences won't be accurate at all or represent what's possible with RAM latency OC, how is this so hard for you people to understand?
It's not hard for me to understand, as input lag is cumulative, and thus, with enough samples, button-to-pixel testing methods (when properly executed) should reflect all contributing factors as a whole in the final results.
schizobeyondpills wrote:
24 Jun 2020, 20:21
So it will show a 0.3 pixel/ms difference, but the real 1 ms (or more) difference hides behind the write-combined bottleneck of CPU->GPU memory, so that RAM timing OC is now shadowed by a later stage of your photon latency pipeline.
Again, if done properly, and so long as it is outside the noise threshold of the testing method, it should show as part of the whole in the results. But yes, if it does slip under the error margin/noise threshold of such methods, then, if it does indeed exist, the reduction may be very small, even over a period of frames.
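For a sense of scale, a back-of-envelope sketch (the 4 ms noise figure is an assumption, purely for illustration): the standard error of a mean shrinks as sd/sqrt(n), so the number of samples needed to separate two configurations by roughly two standard errors grows with the square of how small the difference is:

Code: Select all

import math

sd_ms = 4.0  # assumed per-sample jitter of the button-to-pixel test chain
for diff_ms in (2.0, 1.0, 0.5, 0.1):
    # need 2 * sd / sqrt(n) <= diff  ->  n >= (2 * sd / diff)^2
    n = math.ceil((2 * sd_ms / diff_ms) ** 2)
    print(f"to resolve a {diff_ms:4.1f} ms mean difference: ~{n} samples per config")

A 2 ms effect needs only a handful of samples; a 0.1 ms effect needs thousands, which is why tiny RAM-side gains are so hard to attribute.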
schizobeyondpills wrote:
24 Jun 2020, 20:21
RAM latency is an abstract term/power. If you just measure how that abstract power translates into visible photon latency, you miss out on a lot of possible optimizations, because the abstract power needs to be combined with more components optimized for it to take full advantage of it.
I'm aware that any possible RAM performance improvements via tuning/overclocking are cumulative and dependent on the health and optimization of your whole system across the board, software and hardware. I still don't see how this exempts any possible latency reductions via RAM tuning from being detectable by button-to-pixel testing methods (unless they were incredibly tiny, which admittedly wouldn't be great for your position on the matter). Again, it would simply be difficult to confirm attribution of it directly to the RAM in all cases.
schizobeyondpills wrote:
24 Jun 2020, 20:21
In your specific case, I noticed you mentioned that you run a "balanced" power plan. Your OS is also Win10, which has a lot of bloat (especially in newer versions), as well as Meltdown/Spectre mitigations and many under-the-hood performance tweaks; no RAM OC will give you less latency if you don't adjust the entire system for it. It's like OC'ing your 8700K to 5 GHz while running a balanced/power-saving OS plan: it makes no sense, as one limits or unlocks/boosts the other.

If you have time, please do try disabling all power-saving features in your BIOS, debloating your OS and placing it in high performance, then changing your RAM command rate to CR1, and you will notice the huge difference. (Also, please don't use G-SYNC.)
You're preaching to the choir. I had phases where I fixated on all these aspects, and (subjectively) tested extensively, only to find, for me personally, it was a case of diminishing returns.

As for CR1, in my experience, this can reduce CPU overclock stability and sometimes greatly increase CPU temps when compared to stock CR2 (especially when you're overclocking CPU and RAM at the same time), and really isn't worth risking unless you know what you're doing (and most don't).

That said, I'm not knocking your perspective or your claims. None of us here ("us" being everyone other than you) can 100% definitively know how much (or if) RAM tuning impacts input lag to a measurable degree over the whole latency chain. Many of us are just saying that it's too niche, inaccessible, and unisolatable to prioritize vs. the more immediate and obvious input lag reduction other avenues create, such as disabling V-SYNC, or to a lesser degree, reducing the pre-rendered frames queue, or even more simply, lowering settings and/or upping system specs to increase the average framerate.
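As a rough illustration of that last avenue (a deliberately simplified model that ignores everything outside the render queue): average render-side latency scales with frame time, and each queued pre-rendered frame adds roughly one more frame time:

Code: Select all

for fps in (60, 144, 240):
    frame_ms = 1000 / fps
    for queue in (3, 1):  # pre-rendered frames queue depth
        print(f"{fps:3d} fps, queue {queue}: "
              f"~{frame_ms * (queue + 1):5.1f} ms render-side latency")

Going from 60 fps with a queue of 3 to 240 fps with a queue of 1 cuts this term from ~67 ms to ~8 ms, which dwarfs most memory-side effects.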

I think what you should ask yourself at this point is, what's your end game? What are you going to do about it that you haven't already done?

For instance, four years ago, Nvidia simply wasn't providing enough information on G-SYNC, so instead of yelling at them or shaking my fist at people who were spreading misinformation and/or ignorance about G-SYNC functionality, I simply shut up, did the work, and provided the information myself.

As its advocate, the onus is on you, not "us" (that being the "us" you think don't believe in it), to do the convincing. If this is something you truly care about pursuing, go for it (and I genuinely mean that).
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: ASUS PG27AQN, LG 48CX VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)

schizobeyondpills
Posts: 103
Joined: 06 Jun 2020, 04:00

Re: How much do ram timings affect input lag?

Post by schizobeyondpills » 24 Jun 2020, 21:56

jorimt wrote:
24 Jun 2020, 21:21
>It's not hard for me to understand, as input lag is cumulative, and thus, with enough samples, button-to-pixel testing methods (when properly executed) should reflect all contributing factors as a whole in the final results.

Not only is it cumulative, it can be and is bottlenecked by multiple factors in the entire chain. The number of samples doesn't matter if the method of testing is flawed and/or lacks the necessary resolution, not to mention an understanding of the entire chain. Benchmarking is hard; making sure you are benchmarking correctly is even harder.

>Again, if done properly, and so long as it is outside the noise threshold of the testing method, it should show as part of the whole in the results.
It will show, but as you said, it needs to be measured properly (good equipment, good resolution, accurate measures), and on a proper setup for that specific measurement (a tuned/optimized F1 racing car, not a power-saving family car).

>I'm aware that any possible RAM performance improvements via tuning/overclocking are cumulative and dependent on the health and optimization of your whole system across the board, software and hardware. I still don't see how this exempts any possible latency reductions via RAM tuning from being detectable by button-to-pixel testing methods (unless they were incredibly tiny, which admittedly wouldn't be great for your position on the matter). Again, it would simply be difficult to confirm attribution of it directly to the RAM in all cases.

In this entire thread I have never said it's not visible. I have, however, said that with the wrong method on the wrong (unoptimized) subject, the results will be flawed and inaccurate, although still there; with a very bad testing method they will be fully inaccurate and wrong. RAM performance improvements from lower RAM latency are not merely cumulative, they are chrono-exponential: they scale with their space counterpart exponentially by their time factor (latency), like the speed of light and the Lorentz factor ( https://en.wikipedia.org/wiki/Lorentz_factor ). Meaning that RAM latency scales with how RAM reads/writes are used, together with how much load there is on the RAM, what types of commands are scheduled (reads/writes), what the access patterns are, what the CPU caches hold, and what the OS is doing with interrupts/DPCs, thread scheduling, power saving, and many other things that would take an entire page to write down. So saying that RAM latency is "cumulative" is a heavy understatement of what RAM latency affects.
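A crude way to see the access-pattern dependence for yourself (a sketch, not a rigorous benchmark; absolute numbers include Python/NumPy overhead, so only the trend across working-set sizes is meaningful):

Code: Select all

import time
import numpy as np

# random gathers from working sets that fit in cache vs. spill to RAM
for n_bytes in (32 * 1024, 4 * 1024**2, 512 * 1024**2):
    a = np.zeros(n_bytes // 8, dtype=np.int64)
    idx = np.random.randint(0, len(a), 2_000_000)
    t0 = time.perf_counter()
    a[idx].sum()  # the gather forces the (mostly) random loads
    dt = time.perf_counter() - t0
    print(f"{n_bytes / 1024**2:8.2f} MiB working set: "
          f"{dt * 1e9 / len(idx):6.1f} ns per access")

On a typical machine the per-access cost climbs sharply once the working set no longer fits in the last-level cache, which is the whole reason the cache hierarchy exists.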

>You're preaching to the choir. I had phases where I fixated on all these aspects, and (subjectively) tested extensively, only to find, for me personally, it was a case of diminishing returns.
I am defending the truth and helping people lower their input lag, as that is the goal of this subforum. I fail to see how tuning your system once, which even in the most extreme cases should take a week, is a diminishing return if you will no doubt use it 3+ hours a day for at least a year.

>As for CR1, in my experience, this can reduce CPU overclock stability and sometimes greatly increase CPU temps when compared to stock CR2 (especially when you're overclocking CPU and RAM at the same time), and really isn't worth risking unless you know what you're doing (and most don't).
Well, yeah: if you want to drive an F1 car and go fast, you need to know what you are doing. CR1 having effects on CPU overclocking and other things is to be expected, since your RAM now works the CPU's IMC far harder.

>That said, I'm not knocking your perspective or your claims. None of us here ("us" being everyone other than you) can 100% definitively know how much (or if) RAM tuning impacts input lag to a measurable degree over the whole latency chain. Many of us are just saying that it's too niche, inaccessible, and unisolatable to prioritize vs. the more immediate and obvious input lag reduction other avenues create, such as disabling V-SYNC, or to a lesser degree, reducing the pre-rendered frames queue, or even more simply, lowering settings and/or upping system specs to increase the average framerate.

It's not too niche, since all the things you mentioned in that sentence are themselves affected by RAM latency; it sits at the base layer of how computers work. It's the exact opposite of too niche: it's too important not to do, since it affects EVERYTHING. There is no point chasing lower latency through things that sit on layers 2/3/4 rather than the primary layer. It's like smashing your head harder and harder against the door instead of using the door knob.

>I think what you should ask yourself at this point is, what's your end game? What are you going to do about it that you haven't already done?
My endgame is to raise awareness of the ongoing low-quality engineering (for monetary or any other gain) around RAM latency affecting input lag, and to help people get the most out of their systems. To do lab work and collect evidence I need someone to present it to; however, being labeled mentally insane with terms such as "placebo" is very insulting when I provide hard evidence from computer engineering pioneers rather than my own thoughts.

I don't really care about G-SYNC; it's a power-saving feature of monitors, born of a serious lack of quality software architecture/game development, as well as consumer hardware power-saving features and other profit-maximizing things.

I am not advocating anything; so far, all I have provided has been cold, hard evidence from computer architecture. Nothing I said was my opinion, and yet people still refuse to accept the objective raw truth presented, whether from lack of knowledge or the social fears it may trigger. Although the few people in this thread who did do the timings, however imperfectly, and reported back claiming the difference is amazing show there is hope for people listening.

Brainlet
Posts: 100
Joined: 30 May 2020, 12:39
Contact:

Re: How much do ram timings affect input lag?

Post by Brainlet » 24 Jun 2020, 22:04

One major problem with button-to-screen measurements is that mouse movement is continuous motion broken down into many tiny steps. One would have to measure the behavior of each of those packets sent every 1 ms and how the pipeline, with its buffers, affects their outcome. How do they get grouped into the final frame? This is the stage where strong RAM can smooth things out to be more consistent and eliminate variations to a certain degree.
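A toy simulation of just that grouping stage (the frame time and jitter are assumed values): even modest frame-time wobble changes how many 1 ms packets get folded into each frame, which shows up as uneven on-screen motion:

Code: Select all

import random

FRAME_MS, JITTER_MS = 4.2, 0.8  # assumed ~240 fps loop with some wobble
t, last_packet, counts = 0.0, 0, []
while t < 1000.0:  # simulate one second
    t += FRAME_MS + random.uniform(-JITTER_MS, JITTER_MS)
    packets_now = int(t)                      # one mouse packet per ms
    counts.append(packets_now - last_packet)  # packets folded into this frame
    last_packet = packets_now
print(f"packets per frame: min {min(counts)}, max {max(counts)}")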

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: How much do ram timings affect input lag?

Post by Chief Blur Buster » 24 Jun 2020, 22:18

Latency volatility effects are quite important -- e.g. mouse microstutter that is amplified during ultrasmooth or ultraclear motion (i.e. VRR, strobing, 240 Hz, 360 Hz, 480 Hz, etc).
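One toy example of where that volatility can come from (rates chosen purely for illustration): a 1000 Hz mouse feeding a 360 Hz display delivers an alternating 2 or 3 counts per refresh, so on-screen step sizes vary even when the hand moves perfectly evenly:

Code: Select all

POLL_HZ, REFRESH_HZ = 1000, 360
steps = []
for frame in range(1, 21):
    delivered = frame * POLL_HZ // REFRESH_HZ   # whole polls delivered so far
    prev = (frame - 1) * POLL_HZ // REFRESH_HZ
    steps.append(delivered - prev)              # counts shown this refresh
print(steps)  # alternates 2 and 3 -> uneven per-frame motion steps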
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: How much do ram timings affect input lag?

Post by jorimt » 24 Jun 2020, 22:39

schizobeyondpills wrote:
24 Jun 2020, 21:56
So saying that RAM latency is "cumulative" is a heavy understatement of what RAM latency affects.
Yet whatever improvement it yields is still ultimately going to be visible in a frame to the end user; there's really no getting around that, and in this respect it doesn't matter how it comes about, only where and when it shows up.
schizobeyondpills wrote:
24 Jun 2020, 21:56
I fail to see how tuning your system once, which even in the most extreme cases should take a week, is a diminishing return if you will no doubt use it 3+ hours a day for at least a year.
First of all, overclocking is no simple matter; done right, it takes knowledge, experience, and tons of time, including stability testing. It is also heavily dependent on capable and reliable cooling, whose importance few laymen understand, let alone go to the lengths to properly maintain.

As for the remainder, you have yet to lay out these steps once. I suggest you do so if able. Feel free to create a new thread containing a guide. It will be much more helpful than vague inferences.
schizobeyondpills wrote:
24 Jun 2020, 21:56
It's not too niche
And I'm still suggesting it is, due to the apparent reaction you get from experts and novices alike, not only in this forum but in others (as you've implied in previous posts).
schizobeyondpills wrote:
24 Jun 2020, 21:56
terms such as "placebo" is very insulting when I provide hard evidence from computer engineering pioneers rather than my own thoughts.
I haven't once claimed your stance on this subject is placebo, but due to your "tell, don't show" approach, it does tend to draw that crowd, along with skeptics, which I'm sure is unwanted on both counts.
schizobeyondpills wrote:
24 Jun 2020, 21:56
I don't really care about G-SYNC; it's a power-saving feature of monitors, born of a serious lack of quality software architecture/game development, as well as consumer hardware power-saving features and other profit-maximizing things.
I'm sorry, but that's an ignorant statement, and a hypocritical one at that; if you can't respect another person's expertise on a subject, it is unreasonable for you to expect others to respect yours.

Simply put, G-SYNC isn't a power saving feature, it's a "lagless" sync method that matches the display rate to the render rate to prevent the added input lag and stutter caused by traditional syncing methods that instead force the render rate to match the display rate to prevent tearing. I invite you to read the article if you're interested in learning more (which by the dismissive tone of your statement, I doubt; but fair enough, to each his own).
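A simplified model of that difference (assumed 60 Hz refresh and uniformly random frame completion times; real pipelines are messier): with fixed-rate V-SYNC a finished frame sits waiting for the next scanout, while with VRR the scanout begins as soon as the frame is ready:

Code: Select all

import random

REFRESH_MS = 1000 / 60
# V-SYNC: a frame finishing at a random point waits out the rest of the interval
waits = [REFRESH_MS - random.uniform(0, REFRESH_MS) for _ in range(10_000)]
print(f"V-SYNC added wait: avg {sum(waits) / len(waits):.1f} ms, "
      f"worst {max(waits):.1f} ms")
print("VRR (G-SYNC) added wait: ~0 ms (scanout starts when the frame is done)")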
schizobeyondpills wrote:
24 Jun 2020, 21:56
I am not advocating anything
ad·vo·cate /ˈadvəkət/
noun (plural: advocates): a person who publicly supports or recommends a particular cause or policy.


--------

^ If that's not what you're doing, I don't know what you are...

Nothing wrong with that, but you have to do something. Suggesting you don't have someone to test for you, or equipment to test with, or the like, isn't good enough. You have to become the tester. You have to provide evidence, and you have yet to. I'm happy to see it, in any verifiable form.

Regedits, test programs, and (albeit result-less) technical articles are great, but they don't actually prove the difference in real world usage.
schizobeyondpills wrote:
24 Jun 2020, 21:56
Although the few people in this thread who did do the timings, however imperfectly, and reported back claiming the difference is amazing show there is hope for people listening.
Again, such subjective reactions lean toward the placebo group, which you want to avoid if you want your stance to gain traction.

Finally, I say all this with complete sincerity, and no ill will or hostility. If this is something you care about, pursue it; otherwise, don't be surprised or offended that anyone outside the placebo crowd isn't taking you as seriously as you'd like.

schizobeyondpills
Posts: 103
Joined: 06 Jun 2020, 04:00

Re: How much do ram timings affect input lag?

Post by schizobeyondpills » 24 Jun 2020, 23:34

>Yet whatever improvement it yields is still ultimately going to be visible in a frame to the end user; there's really no getting around that, and in this respect it doesn't matter how it comes about, only where and when it shows up.

They will be visible, just not to the full extent, because RAM latency sits at the base of everything while the display sits at the very end. I have never denied that they will be visible, just that they will be heavily tainted by everything in the pipeline to photon if the pipes aren't clean all the way through.

>First of all, overclocking is no simple matter; done right, it takes knowledge, experience, and tons of time, including stability testing. It is also heavily dependent on capable and reliable cooling, whose importance few laymen understand, let alone go to the lengths to properly maintain.

All of the requirements you mentioned should already be present in the minds of individuals who seek to reduce their input lag. After all, this isn't a place for my grandmother to discuss her Candy Crush scores, is it? It's for individuals looking for input lag reduction.

Also, those who have powerful setups should probably invest in proper cooling rather than using whatever the stock packaging contains.

A pizza you make yourself from $20 worth of ingredients tastes, and is, much better than a $20 fast-food pizza, for obvious economic/market/for-profit reasons.

>As for the remainder, you have yet to lay out these steps once. I suggest you do so if able. Feel free to create a new thread containing a guide. It will be much more helpful than vague inferences.
To teach someone how to fish, they must first be hungry, and I don't see any hunger; just people denying facts pulled straight from computer engineering books. Sad and disappointing.

>And I'm still suggesting it is, due to the apparent reaction you get from experts and novices alike, not only in this forum but in others (as you've implied in previous posts).
Ignorance is the hardest thing to accept for someone who believes they are an expert on the matter or has a personal attachment to their purchases.

G-SYNC only exists as a band-aid solution; there is no reason for it to exist at all. The reason it does is the market pushing fancy graphics and spending money on marketing/promotion rather than on proper, optimized engineering, compounded by the low-development-time "if it works, ship it!" mentality of the game industry, which has saturated the quality of game engineering to the point that games can't even hold a constant 60 FPS without drops.

>a person who publicly supports or recommends a particular cause or policy.
Raw objective truth is not a cause or a policy. RAM latency affects everything; that's what I'm saying here. The OP asked if it does, I gave hard evidence why it does, and some don't like the sound of that due to attachment to their purchases/hardware.

>Nothing wrong with that, but you have to do something. Suggesting you don't have someone to test for you, or equipment to test with, or the like, isn't good enough. You have to become the tester. You have to provide evidence, and you have yet to.

A few others and I have provided evidence in the other thread about Intel vs Ryzen; saying there is no evidence is insane.
[Image: scanned page from a computer architecture text on RAM latency and the cache hierarchy]

This one page, literally the first page of a book about RAM, explains why RAM latency matters and why there are 3 levels of caches; yet somehow, 20 years later, with 5 GHz 8-core CPUs, people refuse to accept it.

Do CPUs have 3 levels of caching? Is that my claim? Or is it the truth, backed by WikiChip? So why are multibillion-dollar corporations (Intel/AMD) spending so much money and die space on caches if RAM latency doesn't matter? You and the others in your placebo group of RAM latency not mattering much should contact them and tell them to remove the caches for something better.
https://en.wikichip.org/wiki/intel/core_i7/i7-9700k
Why are there 3 caches? RAM latency has barely improved at all, and look at what we have: hundreds of power-saving/downclocking/turn-off features inside every component of every part of the input-to-photon pipeline. I wonder why? Could it be memory latency? :OOOOOOOOO

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: How much do ram timings affect input lag?

Post by jorimt » 25 Jun 2020, 07:05

schizobeyondpills wrote:
24 Jun 2020, 23:34
They will be visible, just not to the full extent, because RAM latency sits at the base of everything while the display sits at the very end. I have never denied that they will be visible, just that they will be heavily tainted by everything in the pipeline to photon if the pipes aren't clean all the way through.
You do realize that what you're saying basically suggests this can't be practically tested, so we just have to take your (and a single page of a single technical article's) word for it?
schizobeyondpills wrote:
24 Jun 2020, 23:34
All of the requirements you mentioned should already be present in the minds of individuals who seek to reduce their input lag. After all, this isn't a place for my grandmother to discuss her Candy Crush scores, is it? It's for individuals looking for input lag reduction.

Also, those who have powerful setups should probably invest in proper cooling rather than using whatever the stock packaging contains.
Yet you don't even find it worth the trouble of properly wrapping my comments in a quote. Not everyone has the same priorities or tolerances, and what you're suggesting would require the majority of PC gamers (many of whom just want to play the darn game) to go far further out of their way than navigating the inconveniences of basic forum post formatting.

And further still, you're not even bothering to offer explicit steps or instructions, since that, as opposed to pure talk, would make you accountable and vulnerable to actual critique.

The closest you've come is discussing the Intel MLC tool (in this thread: viewtopic.php?f=10&t=7033, and I don't even think you were the original poster to suggest it?), but then you didn't even lay out the steps to use it properly when other posters ended up using it incorrectly, and RealNC and I ultimately had to step in to provide them.

That, and Intel MLC is yet another benchmark that does nothing to prove real-world benefits in end-user gaming.
schizobeyondpills wrote:
24 Jun 2020, 23:34
To teach someone how to fish, they must first be hungry, and I don't see any hunger; just people denying facts pulled straight from computer engineering books. Sad and disappointing.
Build it and they will come.
schizobeyondpills wrote:
24 Jun 2020, 23:34
Ignorance is the hardest thing to accept for someone who believes they are an expert on the matter or has a personal attachment to their purchases.
So how are you going to convince them? Because this sort of argument isn't going to achieve it.
schizobeyondpills wrote:
24 Jun 2020, 23:34
G-SYNC only exists as a band-aid solution
You do realize this is the case for almost every supplemental hardware and software solution in existence (including RAM tuning parameters)? They are all ultimately compensatory mechanisms for imperfect hardware performance, which no amount of tuning by the end user will 100% correct.

Some of this is a limitation of modern PCs, and some of it is a limitation of physics. So long as perfection isn't achievable (hint: it never will be), these solutions will remain necessary.
schizobeyondpills wrote:
24 Jun 2020, 23:34
Raw objective truth is not a cause or a policy. RAM latency affects everything; that's what I'm saying here.
And so does the cable that brings power to the PC. Even your greatest skeptics here will admit that RAM capability has an impact on overall performance, but what's your point beyond that? What do you want everyone to do, and HOW do they do it to your satisfaction?
schizobeyondpills wrote:
24 Jun 2020, 23:34
A few others and I have provided evidence in the other thread about Intel vs Ryzen; saying there is no evidence is insane.
[quoted image]
I've seen that one image twice now. And? No one here is claiming RAM latency doesn't matter, but that single page of one article isn't providing any actionable instructions or insight to the average end user.

I think we're at an impasse, and attempting any further conversation with you would be equivalent to a snake eating its own tail, but I wish you luck on your mission (whatever it ends up being) regardless.
