Screen tearing fix with GSYNC OFF

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutter and tearing and reduces input lag.
Eonds
Posts: 262
Joined: 29 Oct 2020, 10:34

Re: Screen tearing fix with GSYNC OFF

Post by Eonds » 09 Oct 2021, 05:44

jorimt wrote:
05 Oct 2021, 09:07
Eonds wrote:
05 Oct 2021, 07:49
G-sync adds lag; it's not virtually zero (it sounds nitpicky, but we don't say that when referring to CPU cache latencies).
The only lag G-SYNC "adds" is directly from hiding the tearline, nothing more, and it's technically not an "addition," since it's adhering to the native scanout time of the display, which is what single, tear-free frame delivery speed is limited by.

Tearing is a form of input lag reduction as well, one that many prefer to opt out of due to the very appearance of the artifact that allows the reduction, and something the OP was specifically asking about: avoiding screen tearing.

If you've heard of Scanline Sync, which is a no sync form of "V-SYNC," so to speak, G-SYNC is the superior (dynamic) version of that. It steers the tearline off-screen without forcing the GPU to deliver in fixed intervals of the display like traditional V-SYNC does, which is the only thing that causes V-SYNC input lag and stutter in the first place; the GPU output syncing to the display.

At 360Hz, crosshair-level "lag" is indeed virtually identical with G-SYNC on/off (assuming all other settings in both scenarios are the same but for that) at the same framerate within the refresh rate.
Eonds wrote:
05 Oct 2021, 07:49
Just use a stable system that has lower latency. Stop circle jerking about FPS & refresh rates.
I find it odd you're suggesting we ignore the most obvious and easy-to-fix forms of latency and focus solely on the most elusive and difficult to objectively measure and achieve, and only call the latter worth pursuing. I believe we should instead address the former first and work our way down. I.E. start with macro, end with micro.

For instance, going by some of your recent posts on this forum, are you suggesting it's perfectly fine to ignore substantial latency increases due directly to things such as average FPS, max refresh rate, the render queue, and traditional syncing methods (such as double and triple buffer V-SYNC, which can add frames of delay if not properly configured), and limit ourselves to something like 60Hz with uncapped FPS (V-SYNC on or off) so long as we're running certain legacy CPU models with an OS, bios, and (DDR3) RAM that are tuned to the nth degree?

Even no sync with an uncapped FPS can have 2 full frames more delay (that's 33.2ms at 60 FPS, for instance) if the system is GPU-limited at any point. And just going from 60Hz with a 60 FPS average to 120Hz with a 120 FPS average (not considering any form of sync or GPU-limitation-related delay) reduces average latency by 8.3ms, and that's just from the render time and scanout cycle time reduction.
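To spell out that arithmetic, here's a rough sketch (frame and scanout times are idealized, and the ~8.3ms figure is reproduced one plausible way, as half a frame of render time plus half a scanout cycle saved on average):

```python
# Back-of-the-envelope latency arithmetic (idealized, illustrative only).
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps          # average time to produce one frame

def scanout_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz   # time to scan out one full refresh cycle

# Two full frames of extra delay when GPU-limited at 60 FPS:
print(2 * frame_time_ms(60))     # ~33.3 ms (the post rounds 16.6 ms x 2 = 33.2 ms)

# One plausible way to reproduce the ~8.3 ms figure: half a frame of render time
# plus half a scanout cycle saved, on average, going from 60 Hz/FPS to 120 Hz/FPS.
saved = (frame_time_ms(60) - frame_time_ms(120)) / 2 \
      + (scanout_time_ms(60) - scanout_time_ms(120)) / 2
print(saved)                     # ~8.3 ms
```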

Achievable average FPS and max refresh rate are kind of a big deal where latency is concerned. In fact, they're the primary path to guaranteed native latency reduction.

The higher the refresh rate, the less visible tearing artifacts become as well, regardless of framerate, which means less "need" for syncing methods, even G-SYNC. In fact, at 1000Hz, syncing methods will effectively no longer be needed for tearing prevention due to that refresh rate's sheer scanout speed.
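A quick way to see why: the time one refresh takes to scan out shrinks with refresh rate, so any tearline is replaced that much sooner (an idealized calculation that ignores blanking intervals):

```python
# Time for one full refresh to scan out, top to bottom (idealized; ignores blanking).
for hz in (60, 144, 240, 360, 1000):
    print(f"{hz:>5} Hz -> {1000 / hz:6.2f} ms per scanout")
# At 1000 Hz a tearline only persists for ~1 ms before the next refresh replaces it.
```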

Most of your suggested (micro) tweaks (which, don't get me wrong, I'm not knocking) may reduce latency by a millisecond or less on average (and thus tend to provide an increased feeling of consistency, another important metric, more than a notable latency reduction), while those like mine (macro) reduce it by dozens of milliseconds or more, and with much less effort required from the end-user.

I try to avoid any unnecessarily confrontational interactions like this, but if you're going to address me directly with such dismissive comments, perspective please; there's a place for both micro and macro. It's all part of one bigger latency picture.




Reducing latency jitter (max & min latency) is far more important than just the average. Using g-sync will inherently increase latency (obviously), but I don't consider tearing a form of latency. I'm not saying don't use g-sync btw, I can see its usefulness. I'm more interested in the nitty-gritty of systems. I want to know why this jitter is present. What is changing within the system's operation that causes a lack of consistency? You know, I won't say this as a fact but simply an observation: my USB 2.0 port is much more responsive than my 3.0 ports (probably due to it being closer, plus the lack of power-saving features on 2.0). Keep in mind, I own two 8000Hz polling rate peripherals connected to the same "USB hub". These are the types of latencies which are really hard to measure with clicks, as when you're playing you're moving your mouse over larger distances, and your eyes can recognize changes which aren't exactly testable with click-to-photon or other methodologies. DRAM accounts for at LEAST 10% of system latency at any given time, but people see the word nanoseconds and automatically think that tuning RAM will only result in nanosecond-scale latency improvements, which just shows a lack of understanding.
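For context on the polling figures mentioned here, the worst-case gap between mouse reports works out as follows (an idealized calculation; real USB stacks add host-side scheduling overhead on top):

```python
# Worst-case gap between mouse reports at common USB polling rates
# (idealized; ignores host controller and driver scheduling overhead).
for hz in (125, 500, 1000, 8000):
    print(f"{hz:>5} Hz polling -> up to {1000 / hz:.3f} ms between reports")
```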


I forgot the exact name of whoever said this, but the quote goes something like: "I'd rather have 144 FPS on an i7-3770k & DDR3 (assuming it's tuned to the gills) than 144 FPS on a modern system." I don't think you're suggesting humans CAN'T perceive sub-millisecond changes in latency (assuming it's something directly affecting the mouse's input)?


The comment there is saying you're either forgetting or not understanding that a frame is simply a frame, encapsulating your system's output at any given moment. FPS is not the only latency metric, and because it's such a popular benchmark, people start misunderstanding that an increase in FPS (from a tweak/hardware change/upgrade etc.) may COST you more latency and/or jitter than that specific tweak or change in hardware gains you in FPS. It's like people forget how computers work. They weren't meant to be toys, or to be operated by people who don't understand them. Which is fine, I suppose, but now this is the product: we get a massively uneducated "audience", or market, that is easily manipulated by FPS benchmarks etc. I'm not saying it's the end of the world at all; after all, we're talking about milliseconds here.


The thing about g-sync is the latency penalty. Sure, you can consider tearing a form of latency, but you're essentially making a trade-off that will directly affect your perception (and so on down the chain of how humans perceive latency), ultimately affecting the player. I personally dislike anything that comes at a relatively significant cost of latency / latency jitter. I also don't believe g-sync is necessary, although it's useful for many scenarios which don't require peak system performance. You're making assumptions that a system can always and accurately deliver a frame at the theoretical minimum. Simple things like tile-based rasterization inherently add delay to when you finally get the frame; the trade-off is less power & more FPS at the expense of latency. There's a reason companies have spent millions upon millions of dollars on this topic. Real-time systems do still have a place in our world, and if you look at how they're built/tuned you'll understand my perspective much more. If you've ever played on a relatively low latency setup, nothing compares. I'm not saying forego refresh rates either. Let's take a scenario (280Hz LCD with 1.7ms of "lag" vs a 100Hz CRT): you get more visual updates of what's going on, OR you go for a clearer & lower latency display. In this scenario it's really preference, and some competitive high-level players have completely different opinions/choices when it comes to these things.



TLDR: Latency can be a preference and traded off for certain improvements. Windows sucks; systems are complex, not simple by any means, and easily affected by thousands of variables.

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Screen tearing fix with GSYNC OFF

Post by jorimt » 09 Oct 2021, 09:22

Eonds wrote:
09 Oct 2021, 05:44
Reducing latency jitter (max & min latency) is far more important than just the average. Using g-sync will inherently increase latency (obviously), but I don't consider tearing a form of latency.
The very fact that you're still stating the emboldened part means you don't understand G-SYNC well enough. And tearing is a form of latency "reduction," not "latency." It cheats the scanout by showing parts of more than one frame in a single refresh cycle.

I'm cross-posting another post I made recently on the Guru3D forums about this for someone that had the same misconception:
https://forums.guru3d.com/threads/shoul ... st-5953288

Read it if you want to fully understand why G-SYNC can be considered "neutral" in relation to the given max refresh rate latency-wise. Again, it does not "increase" latency. G-SYNC syncs frames, it does not render them; that's already done once it receives them.
Eonds wrote:
09 Oct 2021, 05:44
I'm not saying don't use g-sync btw, I can see its usefulness. I'm more interested in the nitty-gritty of systems. I want to know why this jitter is present. What is changing within the system's operation that causes a lack of consistency?
Okay, but you're still effectively comparing apples and oranges, and then stating "Only consider apples; everyone should ignore the oranges because I'm not interested in oranges personally."

Again, I'm not discounting your perspective, I'm saying it encompasses only one aspect of the latency chain. There's more to consider.

Also, the irony of your "What is changing within the system's operation that causes a lack of consistency" statement is that G-SYNC actually increases the consistency of frame delivery over no sync, but at the cost of "not reducing latency further." I.E. increased consistency can sometimes = more absolute latency, be that system or display side.
Eonds wrote:
09 Oct 2021, 05:44
You know, I won't say this as a fact but simply an observation: my USB 2.0 port is much more responsive than my 3.0 ports (probably due to it being closer, plus the lack of power-saving features on 2.0). Keep in mind, I own two 8000Hz polling rate peripherals connected to the same "USB hub". These are the types of latencies which are really hard to measure with clicks, as when you're playing you're moving your mouse over larger distances, and your eyes can recognize changes which aren't exactly testable with click-to-photon or other methodologies.
But as you say further down, everything is encapsulated in the frame. CORRECT. Click-to-photon is capturing those very frames that encapsulate all that information. The only reason you're "feeling" ANY differences is because you're seeing them in the frame. This does not invalidate click-to-photon tests; they emulate what we end up seeing on-screen, and that's all that ultimately matters.

As for click-to-photon vs, say, actually moving the mouse, it doesn't matter much for what we're currently talking about, which is any added delay from syncing methods, something that will be the same whether we're clicking or dragging our mice. I.E. V-SYNC lag will add the same amount in both scenarios.

And whether or not sub-millisecond differences from things such as DRAM tuning or polling rates get lost in the "noise" threshold or from other limitations of such tests is another question, but then it's possible they get lost on all but the most sensitive users experiencing such reductions as well.

That doesn't mean things such as DRAM latency aren't important; they are, and also have a role, along with the CPU, in increasing minimum framerates, which improves frametime performance and reduces certain forms of stutter.
Eonds wrote:
09 Oct 2021, 05:44
I forgot the exact name of whoever said this, but the quote goes something like: "I'd rather have 144 FPS on an i7-3770k & DDR3 (assuming it's tuned to the gills) than 144 FPS on a modern system."
Legacy hardware tuned to the nth degree having lower latency than equivalent modern hardware is all well and good...until it prevents you from playing modern games at an acceptable refresh/framerate and resolution. Sure, you can upgrade your GPU in that scenario, but it will quickly become limited by the aging CPU regardless.

Unless you're happy sticking solely to retro emulated titles and the oldest of comp games, it is not a viable long term solution; you eventually have to keep up with the times to play the latest titles using the latest methods, trade-offs and all.

Heck, even my 8700k is limiting my 3080 at 1440p 240Hz in many games, especially older ones, some of which primarily rely on increases in CPU speed (single core in particular) to achieve higher average framerates (CS:GO being one of them), and only newer CPUs will (very gradually) continue to increase that metric.
Eonds wrote:
09 Oct 2021, 05:44
The comment there is saying you're either forgetting or not understanding that a frame is simply a frame, encapsulating your system's output at any given moment. FPS is not the only latency metric, and because it's such a popular benchmark, people start misunderstanding that an increase in FPS (from a tweak/hardware change/upgrade etc.) may COST you more latency and/or jitter than that specific tweak or change in hardware gains you in FPS. It's like people forget how computers work. They weren't meant to be toys, or to be operated by people who don't understand them. Which is fine, I suppose, but now this is the product: we get a massively uneducated "audience", or market, that is easily manipulated by FPS benchmarks etc. I'm not saying it's the end of the world at all; after all, we're talking about milliseconds here.
Short of any input device delay, framerate is the primary latency metric where gaming is concerned. All any system-side tuning does is increase the achievable minimum and maximum framerate, in an attempt to bring min/avg/max framerates as close as possible to each other in order to increase consistency and reduce maximum render time.

And the truth is, even for the most well-informed and capable end-user, the ability to tweak the performance and efficiency of the PC (once assembled), at the level you are speaking, is still extremely limited. We are ultimately stuck with what they give us based on the manufacturing capabilities (and trends, good or bad) of the time we're in.

PC components are made by people, and people are heavily limited and heavily flawed. Everything we make is also ultimately limited by physics.

Don't get me wrong, I believe the end-user should feel free to tweak and tune all they want, but the law of diminishing returns is very real; there comes a point where PC tuning becomes a fixation about tuning the PC and talking about tuning the PC, and not actually using it.
Eonds wrote:
09 Oct 2021, 05:44
The thing about g-sync is the latency penalty. Sure, you can consider tearing a form of latency, but you're essentially making a trade-off that will directly affect your perception (and so on down the chain of how humans perceive latency), ultimately affecting the player. I personally dislike anything that comes at a relatively significant cost of latency / latency jitter. I also don't believe g-sync is necessary, although it's useful for many scenarios which don't require peak system performance. You're making assumptions that a system can always and accurately deliver a frame at the theoretical minimum.
Again, what you're saying about G-SYNC means you don't understand it well enough. And regarding "You're making assumptions that a system can always and accurately deliver a frame at the theoretical minimum," no, I'm not (I'm not even sure how you got that from anything I said).

Firstly, the system has already rendered the frame before G-SYNC receives it. G-SYNC's only responsibility at that point is to align the frame with the native scanout time of the display to prevent tearing.

Secondly, in the majority of my past latency testing (with a high speed camera and the Nvidia LDAT device) I've included the minimum, average, and maximum readings in the results, so of course the system doesn't produce the same minimum latency response from moment-to-moment, that's why we have to take multiple readings and do multiple runs per. That's what the "average" consists of; an averaging of the total min and max values we capture during latency testing.
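As a generic illustration of how such runs get summarized (hypothetical sample values, not actual LDAT output):

```python
import statistics

# Hypothetical end-to-end latency readings (ms) from repeated runs of one scenario.
samples_ms = [14.8, 16.2, 15.1, 17.9, 16.4, 15.7, 18.3, 16.0, 15.3, 16.9]

print(f"min: {min(samples_ms):.1f} ms")
print(f"avg: {statistics.mean(samples_ms):.1f} ms")
print(f"max: {max(samples_ms):.1f} ms")
print(f"run-to-run jitter (stdev): {statistics.stdev(samples_ms):.2f} ms")
```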
Eonds wrote:
09 Oct 2021, 05:44
Real-time systems do still have a place in our world, and if you look at how they're built/tuned you'll understand my perspective much more. If you've ever played on a relatively low latency setup, nothing compares.
I never said otherwise, and I have played on a "relatively low latency setup"; my own.

At 240Hz, I'm getting 16ms average click-to-display latency in games such as Overwatch with G-SYNC, for instance. If we remove the scanout time of the display (4.2ms), which is the minimum time it takes to display a complete frame at said refresh rate, that just leaves an 11.8ms average from the latency my input devices, any further game engine processing, and my display introduce.
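Spelled out, that decomposition is simply (same numbers as above):

```python
refresh_hz = 240
click_to_display_avg_ms = 16.0       # measured average from the post
scanout_ms = 1000 / refresh_hz       # ~4.17 ms to scan out one full frame at 240 Hz
print(f"scanout: {scanout_ms:.1f} ms, "
      f"remainder: {click_to_display_avg_ms - scanout_ms:.1f} ms")
# -> scanout: 4.2 ms, remainder: 11.8 ms (input devices, game/engine, display processing)
```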

It's a far cry from early-to-mid 2000s consoles on 60Hz TVs at the time, which typically had 200ms+ latency. We're now in an era where 3D games and non-CRT displays have never been lower latency, be that on current-gen consoles or PCs.
Eonds wrote:
09 Oct 2021, 05:44
I'm not saying forego refresh rates either
Good to hear, because it's counterproductive from a pure overall minimum achievable latency perspective to have a perfectly tuned system and limit yourself to 60Hz when you can otherwise play at, say, 390Hz, the former of which is akin to trying to reach top speed in a Ferrari with the parking brake still engaged.
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: ASUS PG27AQN, LG 48CX VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)

Eonds
Posts: 262
Joined: 29 Oct 2020, 10:34

Re: Screen tearing fix with GSYNC OFF

Post by Eonds » 10 Oct 2021, 05:03

jorimt wrote:
09 Oct 2021, 09:22
[full post quoted above]

I'm far from a g-sync expert, but any sort of syncing requires X amount of time to happen, delaying the frame's delivery, right?

I mean, I don't disagree with you, I just think it's highly overlooked and disregarded. Still, click-to-photon isn't 100% accurate. Low latency is possible on modern hardware, for sure. I'm glad you're on a relatively low latency setup. Computers are far too complex and variable to simply use those types of testing methods. They're good enough to usually measure millisecond changes. Even high-speed cameras aren't a way to measure true system latency. You'd need to use scopes & probes & other equipment. I'm glad there's still millions of dollars being spent on measuring real-world latency, and hopefully we can start by contacting motherboard manufacturers to start properly designing boards for latency, better thermals, cleaner power delivery & unlocking the BIOS completely. There's just much more going on behind the scenes of a frame, is what I'm talking about. My main focus wasn't g-sync though; I believe there are other important aspects that can really make a massive difference. Here's a scenario: a basic RAM kit & g-sync VS the best RAM kit and no g-sync. For me, I believe the RAM kit to be a lot more beneficial to the overall experience. The reason for that scenario is to highlight its importance and not push it to the side. It's one of the most important things you can do for a system. I believe at one point you talked about focusing on the things that make a larger difference. My response to that is simply: not all latency is perceptually the same, therefore people will prefer certain types of latency reductions over others. Some people may choose to buy a really high-end RAM kit over a 240Hz g-sync display because, for them, that latency reduction feels more impactful than a higher refresh rate.


Delivering frames consistently at a specific interval is nice, but not if it means that frame is delayed by X amount of time. Same with tile-based rasterization: you'll get that FPS boost, and save energy, but at the cost of a delayed frame delivery.


TLDR: I don't disagree with you, I just have a preference of latency, it seems, which may be insignificant for some. G-sync is usable competitively for some, and some say it isn't. It may truly be one of those preference things.

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Screen tearing fix with GSYNC OFF

Post by jorimt » 10 Oct 2021, 08:58

Eonds wrote:
10 Oct 2021, 05:03
I'm far from a g-sync expert, but any sort of syncing requires X amount of time to happen, delaying the frame's delivery, right?
Not with G-SYNC. It, unlike every other syncing method, syncs the display directly to the GPU output, which means it displays frames, within its range, at the exact same intervals the GPU renders them in.

I.E. G(PU)-SYNC makes the display wait on the GPU, instead of making the GPU wait on the display like most traditional syncing methods do.

It's as close to a 1:1 GPU output to display method as possible.
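As a rough, idealized sketch of that scheduling difference (hypothetical frame completion times; ignores scanout time and the VRR range's minimum refresh interval):

```python
import math

refresh_interval_ms = 1000 / 144                  # fixed 144 Hz refresh cadence
frame_done_ms = [3.0, 11.5, 18.2, 30.9, 36.4]     # hypothetical frame completion times

for t in frame_done_ms:
    # Traditional V-SYNC: the finished frame waits for the next fixed refresh boundary.
    vsync_shown = math.ceil(t / refresh_interval_ms) * refresh_interval_ms
    # VRR (G-SYNC style): the display refreshes when the frame arrives (idealized).
    vrr_shown = t
    print(f"frame done {t:5.1f} ms | v-sync shows at {vsync_shown:5.1f} ms "
          f"(+{vsync_shown - t:4.1f} ms) | vrr shows at {vrr_shown:5.1f} ms (+0.0 ms)")
```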

I won't bother posting any more detailed a breakdown, because I already have, and you've apparently not taken the time to assimilate it. My G-SYNC 101 article is always available as well, if you care to read it.
Eonds wrote:
10 Oct 2021, 05:03
Still, click-to-photon isn't 100% accurate.

[...]

Computers are far too complex and variable to simply use those types of testing methods. They're good enough to usually measure millisecond changes. Even high-speed cameras aren't a way to measure true system latency. You'd need to use scopes & probes & other equipment.
What do you think the majority of "scopes and probes" are measuring? I'll tell you; rapid changes in light output of the display on certain parts of the screen.

Again, I've used both high speed camera methods and photodiode devices that read changes in light levels. The latter is the Nvidia LDAT I mentioned in my previous post (which doesn't even need a mouse).
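Conceptually, that kind of photodiode-based measurement boils down to timestamping an input and waiting for the screen's light level to cross a threshold; a toy sketch (a generic illustration, not LDAT's actual implementation):

```python
def measure_latency_ms(brightness_samples, sample_rate_hz, threshold):
    """Return ms from the input event (sample 0) until brightness crosses threshold."""
    for i, level in enumerate(brightness_samples):
        if level >= threshold:
            return i * 1000.0 / sample_rate_hz
    return None  # the screen never changed within the capture window

# Hypothetical 10 kHz capture: dark until the on-screen flash appears after ~15 ms.
samples = [0.02] * 150 + [0.85] * 50
print(measure_latency_ms(samples, sample_rate_hz=10_000, threshold=0.5))  # 15.0
```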
Eonds wrote:
10 Oct 2021, 05:03
I'm glad there's still millions of dollars being spent on measuring real-world latency, and hopefully we can start by contacting motherboard manufacturers to start properly designing boards for latency, better thermals, cleaner power delivery & unlocking the BIOS completely. There's just much more going on behind the scenes of a frame, is what I'm talking about.
Translation: "I'm keeping it vague so no one can pin me on any one thing I'm talking about."

I'm trying to tell you that everything you're merely assuming does have specifics, and many of them can already be quantified and measured in many areas. There's no reason to say "there's too much to test, and too many variables, so no one can know anything 100% for sure."

There's a lot of stuff we do know, and the information is readily available. One of those things is how much delay various syncing methods (and no sync) do or don't incur.
Eonds wrote:
10 Oct 2021, 05:03
My main focus wasn't g-sync though; I believe there are other important aspects that can really make a massive difference.
You originally posted a dismissive and uninformed comment containing an inaccurate statement about G-SYNC in a thread directly about G-SYNC. This is how all this started.
Eonds wrote:
10 Oct 2021, 05:03
Here's a scenario: a basic RAM kit & g-sync VS the best RAM kit and no g-sync.
That's not how latency testing works; when testing G-SYNC "latency," you're only testing it directly against other syncing methods and no sync.

In other words, you would tune the system FIRST (including RAM), and THEN, with all those variables being identical, you would only change the syncing method being used to see how much more or less latency each incurs against other syncing methods (or no sync) in like-for-like scenarios.

When testing the latency of a single metric, you have to isolate it down to said metric and compare it against its direct alternatives (or itself in some cases), and on the same system (including any input and display devices), else everything is entirely relative and any data produced is useless.

For instance, if you were testing RAM tuning, you would change nothing but the RAM params (voltage, timings, clock speed, etc) per test scenario.
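In code terms, that like-for-like methodology looks something like this (a hypothetical test matrix; run_latency_trial and its numbers are made-up placeholders for whatever capture tool is actually used):

```python
import random
import statistics

def run_latency_trial(sync_mode: str) -> float:
    # Placeholder for the real capture (LDAT, high-speed camera, etc.);
    # the numbers here are made up purely to show the structure of the test.
    base = {"no sync": 14.0, "g-sync": 14.5, "v-sync": 22.0}[sync_mode]
    return base + random.uniform(-1.5, 1.5)

TRIALS = 20
for sync_mode in ("no sync", "g-sync", "v-sync"):   # the ONLY variable that changes
    results = [run_latency_trial(sync_mode) for _ in range(TRIALS)]
    print(f"{sync_mode:8s} min {min(results):4.1f}  avg {statistics.mean(results):4.1f}  "
          f"max {max(results):4.1f} ms")
```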
Eonds wrote:
10 Oct 2021, 05:03
For me, I believe the RAM kit to be a lot more beneficial to the overall experience. The reason for that scenario is to highlight its importance and not push it to the side. It's one of the most important things you can do for a system. I believe at one point you talked about focusing on the things that make a larger difference.
I'm curious: beyond it helping increase minimum framerates (which is indeed what it primarily does in this respect), what other benefits are you imagining it allows?

And I had yet to mention this, but while my current DDR4 RAM is only running at 3200MHz, I prioritized the 14-14-14-34 timings it came with out of the box, since timings can typically make a bigger difference (relatively) than sheer clock speed in many cases.
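A quick way to compare kits on paper is converting CAS latency to nanoseconds; true access latency involves more than CAS, so treat this as a rough first-order comparison:

```python
# First-order DRAM comparison: absolute CAS latency in nanoseconds.
# ns = CL cycles * 2000 / transfer rate in MT/s (DDR transfers twice per clock).
def cas_latency_ns(cl: int, mt_per_s: int) -> float:
    return cl * 2000 / mt_per_s

print(cas_latency_ns(14, 3200))  # DDR4-3200 CL14 -> 8.75 ns
print(cas_latency_ns(18, 3600))  # DDR4-3600 CL18 -> 10.0 ns
print(cas_latency_ns(32, 6400))  # DDR5-6400 CL32 -> 10.0 ns
```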

I.E. I already understand and appreciate the benefits of DRAM optimization. I just stop before the point of fixation, because once diminishing returns kick in, all I would find myself doing is tuning RAM and talking about RAM all day, every day, and not actually using my system for the very reason I implemented the lower latency RAM; gaming.
Eonds wrote:
10 Oct 2021, 05:03
TLDR: I don't disagree with you, I just have a preference of latency, it seems, which may be insignificant for some. G-sync is usable competitively for some, and some say it isn't. It may truly be one of those preference things.
To be clear, I'm not even defending G-SYNC. As I stated to someone else here in another post you may have not noticed, while I'm a G-SYNC "expert," I view it as entirely optional (as it literally is an "option"), and I don't recommend it so much as I recommend how to configure it if an end-user chooses to use it, simple as that.

The only reason I took issue with your comment, is because it wasn't factual in regards to G-SYNC operation. Misinformation is never good information.

Eonds
Posts: 262
Joined: 29 Oct 2020, 10:34

Re: Screen tearing fix with GSYNC OFF

Post by Eonds » 12 Oct 2021, 07:08

jorimt wrote:
10 Oct 2021, 08:58
[full post quoted above]

The only issue I have with this reply is when you said I'm "being vague". There are thousands of variables; I'm not going to bother listing them all. That's common sense, and trying to even come at me about that is in itself stupid. Voltage fluctuation is just one complicated variable, for example, that can easily affect latency. The types of testing big companies do for measuring latency, or any sort of nanosecond-scale change, don't involve waiting for the output on the screen and/or measuring changes in light. They use equipment I've never even heard of and can't pronounce. Like I said, I don't care about displays as much as I do what's happening behind the scenes. G-sync is fine, you won lol. I told you I was wrong about what I said. G-Sync does add latency, maybe not on a display level. You can measure your latency how you please, but it's not accurate and/or precise. The question you asked me about RAM is such a loaded question. You're trying to pin me down, as you said before, but I know I'm right about it. RAM is at least 10% of total system latency and is fundamental to the system's operation. Any change to its operation adversely affects performance. It's not about frame rates, and that's another massive mistake you're making. In a real-time machine we're not fixating on frames per second. We're simply optimizing LATENCY and consistency. Like I said, I won't bother to explain how complex RAM is and how that correlates with system latency, not only because it's far above my understanding, but because it's incredibly complex and I don't think anyone here could do it. One thing I do know is that it DOES indeed make a massive difference. Again, I don't care about FPS as much as everyone else does. I think it's a massive mistake; most of your latency doesn't come from your FPS. Network/everything behind your frame matters 100000x more (fake number, for exaggeration).

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Screen tearing fix with GSYNC OFF

Post by jorimt » 12 Oct 2021, 07:49

Eonds wrote:
12 Oct 2021, 07:08
G-Sync does add latency, maybe not on a display level.
Wha? G-SYNC is AT the display-level. It's impossible for it to add latency during the render-process, because it doesn't operate at the render-level. As I've stated repeatedly, G-SYNC receives the frames AFTER render.
Eonds wrote:
12 Oct 2021, 07:08
You can measure your latency how you please, but it's not accurate and/or precise.
Yes, and we never landed on the moon, and the earth is flat. Got it.
Eonds wrote:
12 Oct 2021, 07:08
It's not about frame rates, and that's another massive mistake you're making. In a real-time machine we're not fixating on frames per second. We're simply optimizing LATENCY and consistency.
What in the (apparently flat) earth are you talking about?

Not directly counting things like server farms, folding, or cloud computing, EVERYTHING system-side revolves around rendering frame information, especially in the context of gaming. Achievable render time and its direct effect on min/avg/max framerate IS the point. The reduction of latency and increase in consistency directly affects the RENDERING of...frames. I'm not sure how that still isn't clear to you.
Eonds wrote:
12 Oct 2021, 07:08
I won't bother to explain how complex RAM is and how that correlates with system latency, not only because it's far above my understanding, but because it's incredibly complex and I don't think anyone here could do it. One thing I do know is that it DOES indeed make a massive difference.
DRAM tuning helps increase minimum framerates, which can improve frametime consistency. That's pretty much it where gaming is concerned. You're complicating it to the extreme.

Do you even know what DRAM (Dynamic Random-Access Memory) does? No? I'll tell you...

In layman's terms, certain files from actively running programs (aka games) are temporarily copied from the HDD/SSD into DRAM so that the CPU can repeatedly access them more quickly, since DRAM transfer rates are typically much faster than HDD and even current SSD rates.

Thus, RAM in this particular context is basically an ultra fast temporary storage medium for CPU access of the more vital program (aka game) files.

(on a side note, this is why dipping into the pagefile of your hard drive is considered a bad thing; it's only resorted to when usable DRAM runs out, forcing the HDD or SSD to act as surrogate virtual RAM, and since HDD and SSD transfer rates are slower, this can create noticeable performance issues whenever the same data that would otherwise be swapped between the DRAM and CPU is instead swapped between the CPU and the storage device).

With that considered, again, reducing the timings and increasing the clock of your DRAM can make the transfer of data between it and the CPU faster, which, in turn, where gaming is directly concerned, will help RENDER a FRAME faster.

And the faster each frame is rendered in sequence, the higher the average framerate, and the higher the average framerate, the more frequently new information is being displayed on-screen, and the more frequently new information is being displayed on-screen, the less LATENCY there is between user input and what happens on the display at any given moment.

Further, the higher the minimum framerate (which, again, the DRAM and CPU primarily contribute to, since the CPU is responsible for generating the initial information necessary for the GPU to finish render of each frame), the more CONSISTENCY there is in frame delivery, as the spread between minimum and maximum framerate is now reduced.
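Tying the framerate and consistency halves of that chain together numerically (hypothetical frametime values):

```python
import statistics

# Hypothetical frametime log (ms). A tighter spread = more consistent delivery.
frametimes_ms = [6.9, 7.1, 7.0, 7.3, 12.5, 7.2, 7.0, 9.8, 7.1, 7.4]

avg_fps = 1000 / statistics.mean(frametimes_ms)
min_fps = 1000 / max(frametimes_ms)      # the slowest frame sets the minimum framerate
max_fps = 1000 / min(frametimes_ms)
print(f"avg {avg_fps:.0f} FPS, min {min_fps:.0f} FPS, max {max_fps:.0f} FPS")
print(f"frametime spread: {max(frametimes_ms) - min(frametimes_ms):.1f} ms")
# Raising the minimum framerate (e.g. via faster DRAM/CPU) shrinks the worst
# frametimes, narrowing the spread and making frame delivery more consistent.
```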

There, I did it.

All I can say is, if you truly care about this subject, and want to achieve any authority on it, you need to drop everything you currently think you know about it and re-learn, starting with the fundamentals.

Beyond that, I, for one, give up on this exchange. Think what you will.

Eonds
Posts: 262
Joined: 29 Oct 2020, 10:34

Re: Screen tearing fix with GSYNC OFF

Post by Eonds » 13 Oct 2021, 18:03

jorimt wrote:
12 Oct 2021, 07:49
[full post quoted above]

Drivers don't exist and/or interact with g-sync & you're 100% right :?:


You still believe it's about frames per second lmao, you haven't learned anything. :|


I never said DRAM doesn't affect frames per second :lol:


I really don't need to say much more. Your ignorance on the subject is astounding considering you have access to the internet.


Keep circle jerking about FPS ;) , & make sure to upgrade to DDR5 when it's released for those sick FPS gains bro :mrgreen:

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Screen tearing fix with GSYNC OFF

Post by jorimt » 13 Oct 2021, 18:53

Eonds wrote:
13 Oct 2021, 18:03
Drivers don't exist and/or interact with g-sync & you're 100% right :?:
I can't keep having that conversation.
Eonds wrote:
13 Oct 2021, 18:03
You still believe it's about frames per second lmao, you haven't learned anything. :|
"Rendered" frames (frametime duration, consistency, and distribution, specifically), not FPS. For the 1000th time, how do you think you're seeing the reduction in latency or increase in consistency? Frames. You don't play Rocket League with a blindfold solely through haptic feedback (I hope), right?
Eonds wrote:
13 Oct 2021, 18:03
I really don't need to say much more. Your ignorance on the subject is astounding considering you have access to the internet.
You do realize I'm the G-SYNC 101 author, built the current iteration of the Blur Busters website, help update and maintain the forum, and am a moderator here, right?

If you do, why are you even hanging around a forum that's run, in part, by ignorant, know-nothing morons? Any of your regular Discord channels are, I'm sure, just a window away...
Eonds wrote:
13 Oct 2021, 18:03
Keep circle jerking about FPS ;) , & make sure to upgrade to DDR5 when it's released for those sick FPS gains bro :mrgreen:
Yes, that was entirely my point (hint: sarcasm).

Eonds
Posts: 262
Joined: 29 Oct 2020, 10:34

Re: Screen tearing fix with GSYNC OFF

Post by Eonds » 14 Oct 2021, 07:49

jorimt wrote:
13 Oct 2021, 18:53
[full post quoted above]


I was talking about your lack of knowledge about DRAM, not g-sync.



I don't think the forum is "ran" by "ignorant, know nothing morons".


Your problem is you circle jerk about FPS & click latency. You'll eventually learn click latency is a meme & so is the fixation on FPS.

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Screen tearing fix with GSYNC OFF

Post by jorimt » 14 Oct 2021, 09:07

Eonds wrote:
14 Oct 2021, 07:49
I was talking about your lack of knowledge about DRAM, not g-sync.



I don't think the forum is "ran" by "ignorant, know nothing morons".


Your problem is you circle jerk about FPS & click latency. You'll eventually learn click latency is a meme & so is the fixation on FPS.
If you think how long it takes for your inputs to show up on the display doesn't matter, okay then?

And if anyone is "circle jerking," (you've said that at least three times now) it's you with DRAM. Going by one of your first posts on this forum, it sounds like you had bad RAM holding back your system, and when you upgraded to better RAM, it no longer held the system back, making you correlate everything with RAM from then on out.

If you had instead had a bad CPU and good RAM, and then upgraded the CPU, you probably would have been only proselytizing CPUs instead of DRAM now.

I've seen many of your posts in this forum and could have said any number of things, but I, for one, let people have their opinion.
You, on the other hand, directly targeted a moderator with an insulting and off-topic comment. Again, that's how this all started, else I wouldn't have bothered to converse with you at all.

Anyway, look, any other moderator would have probably soft banned you, or, at the very least, locked this thread by now. I won't. I'll let anyone who sees this exchange judge for themselves.

Finally, unless you reply here with ban-worthy content (at which point I will consult with the Chief for possible actions), this will be my last comment in this thread to you.
