Can more cores cause lower input lag? And more cpu questions.

SKILLEN
Posts: 93
Joined: 01 Sep 2020, 23:08

Re: Can more cores cause lower input lag? And more cpu questions.

Post by SKILLEN » 29 Oct 2022, 11:54

Worstylolzs wrote:
29 Oct 2022, 11:32
InputLagger wrote:
28 Oct 2022, 20:46
Worstylolzs wrote:
27 Oct 2022, 04:09

Actually I play CPU-bound games like Apex Legends and WoW, and sometimes Battlefield 4.

Hi, do you have input lag or some "lag" problems (like many people on this forum) with your current configuration? Are games buttery smooth and fluid?
In my opinion, yes. When I play Apex, I feel like I have lag all the time in firing actions, meaning stress situations, when in a 1v1 or sometimes when I'm "tracking" fire on someone. It's not as smooth as when I watch streamers. I tried checking my CPU utilization and I get between 70-100%, so I think that's where the problem is. I use Process Lasso, which helps a little, but after half an hour of play my graph still shows some cores hitting 100%, even after the Process Lasso optimization. But as I said, it's only my feeling and one graph. I don't have any good testing methods; if you know of one, I can try it. On the other hand, Apex Legends has a really bad engine, and when I try Minecraft, it's smooth as can be. But in Minecraft my CPU is bored at 30%... so...

WoW is smooth, but only at certain FPS. If I'm near 100 FPS the screen isn't smooth, but I don't know right now what CPU utilization I have at 100 FPS.


But you describe the same problems as other people on this forum. I've experienced how much difference a PSU cable can make. I bought maybe 6 cables from different brands, and with some of them it felt like my eyes hurt more and things weren't as smooth as with the one I use now. So I don't know if that means my cables are bad or my PSU is broken. But again, it's only feel-based testing; I don't have any proof to show you.

The whole point of this topic is to open a debate on whether more cores can help with a smoother/lower input lag experience, and to help me decide which CPU to buy. If this is the wrong subforum, I'll ask the moderators to move the topic.



In CS:GO it's the same in stress situations or while tracking (mouse input lag); no FPS drops, CPU utilization 30%.

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Can more cores cause lower input lag? And more cpu questions.

Post by Chief Blur Buster » 29 Oct 2022, 19:51

triplese wrote:
29 Oct 2022, 10:58
Tuhin Lavania wrote:
28 Oct 2022, 20:56
Please leave the 59XX series out of your choices if you are looking for low input lag. I have a 5950X and I have to disable 8 cores (1 CCD) to get acceptable latency.
Never buying AMD again. Intel will always have lower latency averages than AMD. I have a 13900K and 4090 ordered for my new PC; let's see what it brings in terms of input lag.
What latencies did you measure?
I had a 5600X and now a 5950X; all the popular benchmarks demonstrate roughly the same latency.
I don't believe you can perceive a ~50 nanosecond difference while playing on 2 CCDs vs 1 CCD. The internet can cause far bigger lag than nanoseconds of intercore latency.
I wouldn't dismiss this out of hand. We're Blur Busters, after all; we keep an open mind about the unexpected consequences of the refresh rate race to retina refresh rates.

Remember, it's death by a thousand paper cuts (nanoseconds). There are situations where tiny latencies can cascade into visible latencies. A million nanoseconds is a millisecond.

The right AMD processor is still good, but on a suboptimal system (a suboptimal combo) with suboptimal software optimization, it's possible for intercore latencies to build up massively if there are a lot of locks/semaphores/mutexes/etc. in the processing. I've seen developers report multi-millisecond differences in software development.

If the software and OS were designed perfectly, it would stay in the nanoseconds and microseconds, but that is not always the case. There are "death by a thousand cuts per instant" situations where latency builds up.

Also, again, there can be placebo issues, red herrings, and wild goose chases. That does happen. But I wouldn't dismiss intercore latencies entirely -- they have a nasty habit of domino effects. For example, a multithreaded app where two threads communicate well over 10,000x per second between themselves -- THAT is where nanoseconds start to build up in an ugly way.

50 nanoseconds multiplied by 30,000 equals 1.5 milliseconds. If it's clumped in a way that prevents smooth event processing, your mouse input might be delayed by a bit.

Even a 0.5ms delay to a mouse poll creates a 5-pixel mouse position jump during a 10,000 pixels/sec flick turn -- enough to miss a faraway enemy if it's a latency divergence away from your prior aim training.

Now 1.5 milliseconds? That's 15 pixels of mouse-position erraticness during a 10,000 pixels/sec flick turn.
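The arithmetic is just flick speed times delay. A trivial sketch, if you want to sanity-check the numbers yourself:

```cpp
#include <cstdio>
#include <initializer_list>

// Sanity check of the numbers above: pixel error = flick speed * input delay.
int main() {
    const double flickPxPerSec = 10000.0;  // 10,000 pixels/sec flick turn
    for (double delayMs : {0.5, 1.5}) {
        double errorPx = flickPxPerSec * delayMs / 1000.0;
        std::printf("%.1f ms delay -> %.0f px of aim error\n", delayMs, errorPx);
    }
}
```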

"...I wonder why my aim feels wrong..." -- 'nuff said.

I am a software developer, and I'm here to tell you that this can be real. In your C++ application, if you've mutex'd or semaphore'd between two threads and accidentally thrash the communication between them (where each thread asks the other to finish something first before resuming, and it goes back and forth thousands of times per second by accident), lots of latency problems happen from major orders-of-magnitude domino-effect multipliers!
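To make that concrete, here's a minimal, self-contained sketch of the ping-pong anti-pattern itself (my own illustration, not code from any real game engine or driver): two threads hand a baton back and forth through a mutex and condition variable, so every unit of work pays a full cross-thread handoff. On a dual-CCD CPU, if the scheduler lands the two threads on different CCDs, each handoff also pays the intercore latency from the charts -- which is exactly how nanoseconds multiply into milliseconds.

```cpp
#include <chrono>
#include <condition_variable>
#include <cstdio>
#include <mutex>
#include <thread>

int main() {
    std::mutex m;
    std::condition_variable cv;
    bool ping = true;                 // whose turn it is
    long long roundTrips = 0;
    const long long kIterations = 100000;

    // Each thread does no useful work per iteration -- it just waits for its
    // turn, flips the flag, and wakes the other thread. Pure handoff cost.
    auto worker = [&](bool isPing) {
        for (long long i = 0; i < kIterations; ++i) {
            std::unique_lock<std::mutex> lock(m);
            cv.wait(lock, [&] { return ping == isPing; }); // block until our turn
            ping = !isPing;                                // hand the baton over
            if (isPing) ++roundTrips;                      // count under the lock
            cv.notify_one();
        }
    };

    auto t0 = std::chrono::steady_clock::now();
    std::thread a(worker, true), b(worker, false);
    a.join();
    b.join();
    auto t1 = std::chrono::steady_clock::now();

    double totalMs = std::chrono::duration<double, std::milli>(t1 - t0).count();
    std::printf("%lld round trips in %.1f ms -> %.0f ns per handoff\n",
                roundTrips, totalMs, 1e6 * totalMs / (2.0 * kIterations));
}
```

Run it pinned to one CCX, then spread across CCDs (e.g. with Process Lasso or start /affinity), and the per-handoff number moves in line with those intercore latency charts.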

If this never happened, AMD vs Intel would be largely moot, but that ain't the case... Since the software works anyway, such accidental efficiency bugs (from rapid thread-communication thrash) often stay hidden and unfixed by developers.

As the Temporal Myth Busters (zapping all those descendants of "Humans Can't Tell Apart 30fps vs 60fps" in a mic-dropping factory), I am the messenger telling you that sometimes 50 nanoseconds is dangerous if it creates a clumped blocking condition -- de facto non-maskable-interrupt behavior (e.g. CPU pegged, delays to DPC processing, delays to the mouse event loop, etc). A thread-communication thrash at high priority in a Ring 0 driver is deadly to lower-priority drivers, or to the game's event loop -- even at 50 nanoseconds.

Even if it's 90-99% chance of placebos, we don't dismiss the 1%-10%.
We Are The Blur Busters. Who Ya Gonna Call? :D
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


Tuhin Lavania
Posts: 59
Joined: 21 Feb 2022, 09:26

Re: Can more cores cause lower input lag? And more cpu questions.

Post by Tuhin Lavania » 29 Oct 2022, 21:13

triplese wrote:
29 Oct 2022, 10:58
Tuhin Lavania wrote:
28 Oct 2022, 20:56
Please leave the 59XX series out of your choices if you are looking for low input lag. I have a 5950X and I have to disable 8 cores (1 CCD) to get acceptable latency.
Never buying AMD again. Intel will always have lower latency averages than AMD. I have a 13900K and 4090 ordered for my new PC; let's see what it brings in terms of input lag.
What latencies did you measure?
I had a 5600X and now a 5950X; all the popular benchmarks demonstrate roughly the same latency.
I don't believe you can perceive a ~50 nanosecond difference while playing on 2 CCDs vs 1 CCD. The internet can cause far bigger lag than nanoseconds of intercore latency.
I usually watch LatencyMon numbers, and while anyone would say they're not accurate, they still give a fair idea of how responsive your system is. Disabling 1 CCD clearly lowers DPC latency a lot, and it shows in-game. Now, I know not many people would be able to notice it, but I can, and that's what led me down into this input lag problem. The people I play with don't even notice that something is wrong with their systems, and they treat network latency as the holy grail, which I simply refuse to accept is the case.

Let me give you another example. I have a 7.2.2 home theatre system in a dedicated black velvet room. My PC (5950X and a 3080 Ti) used to be in that room, connected to a 20A socket. Now, I know this isn't the place for that discussion, but let me write it anyway.
I wanted my PC gear out of the HT room, since I don't want much clutter there. So I brought in an electrician and had him install a dedicated 20A socket in my other room, since I wanted to shift my PC setup there. Did that, connected my PC, and the first thing I did was check my DPC latency. I was shocked to find it much lower (from 0.65-0.67 down to 0.50-0.52). Now, this can't be a placebo. I've been playing Halo since then and the game feels more "grounded" and less "floaty".

Coming back to the CCD topic: it clearly makes a noticeable difference and is not a placebo. I will not play any MP game with dual CCDs anymore. Try it yourself and tell us what you feel. Dual CCDs aren't needed anyway; 8 cores are enough for any game.

Another shocker I found is that you have to enable AMD Cool'n'Quiet (PSS in BIOS); earlier I had it disabled. It's even said you should enable Global C-States, but I haven't done that yet.
Getting a 13900K and 4090 setup for gaming. The AMD system will go into the HT room to function as a madVR HTPC.

greenenemy
Posts: 35
Joined: 11 Mar 2015, 04:45

Re: Can more cores cause lower input lag? And more cpu questions.

Post by greenenemy » 30 Oct 2022, 07:48

Chief Blur Buster wrote:
29 Oct 2022, 19:51
[…full reply quoted above…]
Wouldn't that just be visible as stutter or low FPS?

I believe that when comparing two CPUs' input lag in any particular game, the easiest way is to check which one performs better in that game, and that's it, regardless of intercore latencies, because if the game is affected by "death by a thousand paper cuts," it would be reflected in the performance results.

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Can more cores cause lower input lag? And more cpu questions.

Post by Chief Blur Buster » 30 Oct 2022, 17:43

greenenemy wrote:
30 Oct 2022, 07:48
Wouldn't that just be visible as stutter or low FPS?
Yes and no.

It depends on the mathematical variables.

It depends on how big the clumped thread-thrash chained freezes are (e.g. 1.5ms freezes in a driver) and how frequent they are.

IF (the microfreezes are regular/cyclic) AND (they are sufficiently big in milliseconds), THEN it can become visible stutter (or extra blur).

Don't forget that there's a stutter-to-blur continuum (www.testufo.com/eyetracking#speed=-1 ...stare at the 2nd UFO for 30 seconds to understand). Stutter and persistence blur are essentially the same thing! Like slow-vibrating music strings (shaky) versus fast-vibrating music strings (blurry).

Rapid jitter = extra persistence motion blur above and beyond the display's lowest persistence capability, the way a fast-vibrating guitar string looks blurry.

Microstutters are only easily visible when the stutter jump (stutter error in milliseconds) exceeds the motion blur (persistence in milliseconds). If they're just occasional one-off minor microstutters, they may not be visible within the motion blur of low frame rates (e.g. a 3ms stutter inside 20ms MPRT); they may not show as visible stutter but can still create an "It Feels Off" effect. Now, a 3ms stutter in 3ms MPRT is much more visible: a single frame drop at 360fps on a 360Hz monitor is more human-visible than a 3ms microstutter at 30fps on a 60Hz LCD with its 16.7ms refresh cycles.
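Expressed as a rule of thumb (my paraphrase of the paragraph above, not a formal psychophysics model), the visibility test fits in a couple of lines:

```cpp
#include <cstdio>

// Rule-of-thumb assumption: a stutter jump tends to poke out of the motion
// blur once its timing error is at least as large as the display's
// persistence (MPRT). Inside the blur, it hides -- but may still "feel off".
bool stutterLikelyVisible(double stutterErrorMs, double mprtMs) {
    return stutterErrorMs >= mprtMs;
}

int main() {
    // A 3ms stutter hides inside 20ms of persistence blur (60Hz-class)...
    std::printf("3ms stutter @ 20ms MPRT: %s\n",
                stutterLikelyVisible(3.0, 20.0) ? "visible" : "hidden in blur");
    // ...but the same 3ms stutter pokes out of 3ms MPRT (360Hz-class).
    std::printf("3ms stutter @  3ms MPRT: %s\n",
                stutterLikelyVisible(3.0, 3.0) ? "visible" : "hidden in blur");
}
```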

Oftentimes the refresh rate race, with its higher frame rates, massively amplifies the visibility of small problems. As the refresh rate race pushes the Hz, the gradually unpeeled stutter onion (of multiple stutter/microstutter causes) reveals ever-tinier human-visible problems.

Other times, it's an "I feel like I'm performing worse but I don't know why" situation. Allow me to explain why:

Just like in an Olympic 100-meter sprint, two racers may finish almost simultaneously (1ms apart) and not know who won until they see the electronic scoreboard. If your gunshot is delayed by 1.5ms and both of you have identical human reaction times (e.g. both averaging 175ms), then the one with less delay is more likely to get the frag. You don't always notice, except that your scores are dropping. It could simply be an aim-stabilization failure (e.g. overshoot/undershoot by 1.5ms) or button-press erraticness (the button registering 1.5ms later than usual) that you don't really feel, but it feels like you're underperforming.

Likewise, the same situation happens when two FPS players suddenly see each other simultaneously and shoot at each other simultaneously -- with the same human reaction time. In this situation, that millisecond is an advantage even if you don't feel it until after the frag has already happened. If you suddenly start playing badly and don't know why, it's common for an esports player to suspect latency-change effects -- with legitimate reason, since it's a race-to-finish situation. Not all studies successfully eliminate error margins to test these factors surgically.

See Millisecond Matters: The Amazing Human Visible Feats of the Millisecond for the mic-dropping factory. That's why we are snobs toward temporal-dismissing luddites around here, especially since nanoseconds can cascade into microseconds, which can cascade into milliseconds, in things like mutex/semaphore thread-thrash situations in a buggy, unfixed Ring 0 device driver that can jitter other things elsewhere in the system...

Yes, yes, 128-tick servers have a granularity that obscures milliseconds, but even acting a random 1ms earlier still gives roughly a 1/8 chance of making an earlier 1/128sec tick and getting that frag. It all adds up over time!
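The arithmetic behind that 1/8 figure, assuming your action lands at a uniformly random phase within the tick interval:

```cpp
#include <cstdio>

// Worked version of the 128-tick arithmetic above (uniform-phase assumption).
int main() {
    const double tickMs = 1000.0 / 128.0;  // ~7.8125 ms per server tick
    const double savedMs = 1.0;            // acting 1 ms earlier
    const double pEarlierTick = savedMs / tickMs;
    std::printf("Tick interval: %.4f ms\n", tickMs);
    std::printf("Chance of catching an earlier tick: %.3f (~1 in %.0f)\n",
                pEarlierTick, 1.0 / pEarlierTick);
}
```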

Sure, sure, it could be 90%-99% placebo, but not always, and we don't dismiss anything with more than a 0% chance. We're Blur Busters, the Temporal Mythbusters: Hz, GtG, MPRT, lag, etc.

Sure, sure, network issues (lag compensation, etc.) dominate the performance picture, but a lot of people do play LAN games, or play against other players on FTTH-only servers (matchmaking filtered to the lowest-lag players) in certain very precise game engines, or in other situations where the millisecond may be amplified further.

Either way:

We're cited in over 25 peer-reviewed research papers now, and we think outside the box about unexpected situations where tiny temporals start to have human-visible effects in some use cases.

</🎤>
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Can more cores cause lower input lag? And more cpu questions.

Post by Chief Blur Buster » 31 Oct 2022, 23:41

Also, stutter = a sudden lag change. A 1.5ms stutter-behind is ALSO a 1.5ms lag-behind for that frame.

Some people don't realise "stutter = lag variance problem".

Bad stutter (even high-frequency erratic stutter that vibrates so fast it blends into motion blur) can still hurt aiming.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


greenenemy
Posts: 35
Joined: 11 Mar 2015, 04:45

Re: Can more cores cause lower input lag? And more cpu questions.

Post by greenenemy » 01 Nov 2022, 04:04

Chief Blur Buster wrote:
30 Oct 2022, 17:43
greenenemy wrote:
30 Oct 2022, 07:48
Wouldn't that just be visible as stutter or low FPS?
Just like in an Olympic 100-meter sprint, two racers may finish almost simultaneously (1ms apart) and not know who won until they see the electronic scoreboard. If your gunshot is delayed by 1.5ms and both of you have identical human reaction times (e.g. both averaging 175ms), then the one with less delay is more likely to get the frag. You don't always notice, except that your scores are dropping. It could simply be an aim-stabilization failure (e.g. overshoot/undershoot by 1.5ms) or button-press erraticness (the button registering 1.5ms later than usual) that you don't really feel, but it feels like you're underperforming.

Likewise, the same situation happens when two FPS players suddenly see each other simultaneously and shoot at each other simultaneously -- with the same human reaction time. In this situation, that millisecond is an advantage even if you don't feel it until after the frag has already happened. If you suddenly start playing badly and don't know why, it's common for an esports player to suspect latency-change effects -- with legitimate reason, since it's a race-to-finish situation. Not all studies successfully eliminate error margins to test these factors surgically.

</🎤>
I agree that even 1ms can make a difference in some situations, but I'm afraid people are going to simplify this into something like:
"I'm an elite player stuck in bronze because of input lag, and AMD CPUs have higher latency, so don't buy their CPUs, they'll make you bad."
:?
When having a browser open in the background will probably make a bigger difference.

Also when we compare framerate charts for any game like this:
[framerate comparison chart]
It is clear that there are other shortcomings in processors that a game can expose. If we take, for example, the i5-7600K and compare it to the Ryzen 3 3300X in Battlefield V, the Ryzen not only has a better average framerate (= lower input latency) but also a much better 1% low, so less stutter.
So statements like that:
Tuhin Lavania wrote:
28 Oct 2022, 20:56
Intel will always have lower latency averages than AMD.
are simply not true.
Last edited by greenenemy on 02 Nov 2022, 01:23, edited 1 time in total.

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Can more cores cause lower input lag? And more cpu questions.

Post by Chief Blur Buster » 01 Nov 2022, 19:55

greenenemy wrote:
01 Nov 2022, 04:04
So statements like that:
Tuhin Lavania wrote:
28 Oct 2022, 20:56
Intel will always have lower latency averages than AMD.
are simply not true.
I agree too. AMD can still produce superior results if you know what you're doing -- if you've properly controlled the variables and aren't doing anything that's hobbled by AMD quirks.

After all, the system is a package (motherboard, etc). I've seen crappy Intel motherboards create more lag than an excellent AMD motherboard with an excellent AMD CPU.

There's always exceptions.

It's a shame that so many systems are hobbled by hidden bottlenecks, never debugged by their vendors because the inefficiency was never big enough to affect a large number of users (including outside esports).
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


Kyouki
Posts: 193
Joined: 20 Jul 2022, 04:52

Re: Can more cores cause lower input lag? And more cpu questions.

Post by Kyouki » 02 Nov 2022, 02:54

I can personally verify on my 3950X that the chart below is very accurate:
https://images.anandtech.com/doci/16214 ... _575px.png

Planning to move to a 5950X for easier management and better CCX-to-CCX latency:
https://images.anandtech.com/doci/16214 ... _575px.png

And that's the only 'penalty' you'll get.
Unfortunately, EAC-protected games do not allow you to set core affinity :x
I hope someone has advice for this, or perhaps other secret tweaks for the 3950X/5950X to improve gaming performance.
I already own a copy of Process Lasso to try to mitigate the CCX-to-CCX penalty.

Source: https://www.anandtech.com/show/16214/am ... x-tested/5
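For what it's worth, the core of what an affinity rule does (outside EAC-protected games, which block it, as noted above) is a single Win32 call. A hedged sketch: the 0xFFFF mask assumes the first CCD of a 5950X enumerates as logical processors 0-15 (8 cores x 2 SMT threads) -- verify your own topology with a tool like Coreinfo before relying on that mapping.

```cpp
#include <windows.h>
#include <cstdio>

int main() {
    // Bits 0-15 = first CCD (assumed mapping; check your CPU's enumeration).
    const DWORD_PTR firstCcdMask = 0xFFFF;
    if (!SetProcessAffinityMask(GetCurrentProcess(), firstCcdMask)) {
        std::printf("SetProcessAffinityMask failed: %lu\n", GetLastError());
        return 1;
    }
    std::printf("Pinned to first CCD (mask 0x%llX)\n",
                (unsigned long long)firstCcdMask);
    return 0;
}
```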
CPU: AMD R7 5800x3D ~ PBO2Tuner -30 ~ no C states
RAM: Gskill Bdie 2x16gb TridentZ Neo ~ CL16-16-16-36 1T ~ fine tuned latency
GPU: ASUS TUF 3080 10G OC Edition(v1/non-LHR) ~ disabled Pstates ~ max oced
OS: Fine tuned Windows 10 Pro, manual tuned.
Monitor: Alienware AW2521H ~ mix of ULMB/Gsync @ 240hz/360hz
More specs: https://kit.co/Kyouki/the-pc-that-stomps-you

daviddave1
Posts: 408
Joined: 04 Aug 2017, 17:43

Re: Can more cores cause lower input lag? And more cpu questions.

Post by daviddave1 » 02 Nov 2022, 09:31

Worstylolzs wrote:
27 Oct 2022, 04:09
Hello,

Story: I want to upgrade my CPU to the Ryzen 5000 series, but I can't decide which CPU to buy.
I can buy the 5600X, which is really low-cost but has 6/12 cores. Then there's the 5700X, which has 8/16. That's a really good budget choice, but I'm scared about CPU utilization, which can cause horrible lag in GPU-bound situations. So I don't know whether to buy the 5600X/5700X, spend more money on the 5900X/5950X, or just buy the 5800X3D.

My setup is: CPU: Intel i7-6700K, MB: 270 Tomahawk, PSU: 550W (I think Bronze... some low-cost one, I must buy a new one too), GTX 1080, 360Hz monitor.
Actually I play CPU-bound games like Apex Legends and WoW, and sometimes Battlefield 4.

1) Can more cores result in lower input lag, in terms of better stability, because the CPU isn't at 100% utilization all the time?

2) For FPS gaming, is it more stable to have a CPU with 12 cores than the 5700X/5800X3D?

3) Does CPU utilization really matter?

4) Can I get 100% utilization on Ryzen CPUs (5700X, 5800X/X3D) when I play on low settings with a GTX 1080 and a 360Hz 1920x1080 monitor?

Thanks for your future answers!
Back in the day (4 years ago), Fr33thy advised playing old games like CS:GO with multithreading off, and binding your GPU, sound card, network card, and USB ports to specific cores. After some intensive testing, he let go of that advice. He's now a firm believer in multithreaded gameplay and no longer believes in binding specific cores to, for example, your GPU. He's also an AMD fanboy now, GPU- and CPU-wise. Plus, he let go of his belief in LatencyMon.
| Now: ASUS PG248QP 540Hz. | Past : VG259QM with the Qisda panel/PG27AQN/XL2566K
