I'M SO CLOSE TO FINISHING MY INPUT LAG!

Everything about latency. Tips, testing methods, mouse lag, display lag, game engine lag, network lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more! Input Lag Articles on Blur Busters.
Unixko
Posts: 212
Joined: 04 Jul 2020, 08:28

Re: I'M SO CLOSE TO FINISHING MY INPUT LAG!

Post by Unixko » 07 Dec 2020, 09:58

empleat wrote:
07 Dec 2020, 09:38
Unixko wrote:
07 Dec 2020, 03:04
empleat wrote:
06 Dec 2020, 21:09
Unixko wrote:
05 Dec 2020, 23:16
stop reading after user x7007
Wut? This user found out about the English (Philippines) display language, it has much lower input lag. Check the Blur Busters input lag A/B test to see how much input lag you can tell apart, for me even 6 ms. I can tell, and it makes all the difference in the world actually! Maybe don't post if you don't have anything relevant to say :oops: :x
changing the keyboard language has nothing to do with input lag
i read all the delusional posts from x7007 in OC
LMAO i tested myself with Blur Busters' input lag A/B test and i could tell even a 6 ms difference. Also i was Supreme Master class in CS:GO and i am very sensitive to input lag! I can tell the difference after changing this! How would you know that changing the language has nothing to do with input lag?! Also it wasn't the keyboard language but the display language! Unless you know exactly how languages work in Windows, you don't know whether it could cause input lag! There are a lot of bugs and weird things in Windows which affect input lag!

Also, just because one user is wrong about some things doesn't make him automatically wrong about everything...

You know there are problems: bad grounding in your house can literally cause mouse jitter and weird input lag! So maybe that's why he was getting a UPS. Also i don't see anything delusional in this!
he is right about EMI inside his house but nothing more

empleat
Posts: 149
Joined: 28 Feb 2020, 21:06

Re: I'M SO CLOSE TO FINISHING MY INPUT LAG!

Post by empleat » 07 Dec 2020, 12:26

He is right about the display language; since it's the display language, it has to be processed graphically. Who knows what it does, with bugdows 10 you never know! Unless you have proof that this doesn't work, how do you know? Multiple people said it helped! Also i can tell even a 6 ms difference in input lag, i am extremely sensitive to that, and i can tell a difference after changing it!!!

Unixko
Posts: 212
Joined: 04 Jul 2020, 08:28

Re: I'M SO CLOSE TO FINISHING MY INPUT LAG!

Post by Unixko » 07 Dec 2020, 13:47

empleat wrote:
07 Dec 2020, 12:26
He is right about the display language; since it's the display language, it has to be processed graphically. Who knows what it does, with bugdows 10 you never know! Unless you have proof that this doesn't work, how do you know? Multiple people said it helped! Also i can tell even a 6 ms difference in input lag, i am extremely sensitive to that, and i can tell a difference after changing it!!!
ok buddy, if you feel it then you feel it, it's ok

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: I'M SO CLOSE TO FINISHING MY INPUT LAG!

Post by Chief Blur Buster » 21 Dec 2020, 14:33

empleat wrote:
07 Dec 2020, 09:38
Also, just because one user is wrong about some things doesn't make him automatically wrong about everything...

You know there are problems: bad grounding in your house can literally cause mouse jitter and weird input lag! So maybe that's why he was getting a UPS. Also i don't see anything delusional in this!
<Aside>
Not addressed to this poster, but the whole thread in general...

I have to chime in because there's been a few attempted posts by a few members that posted some less diplomatic language, that I had to Disapprove from the moderator queue.

We know that some people spend a lot of money -- and sometimes those people are mean to you -- often much of that spending is wasted, with some of it occasionally paying off. Could the money be put to better use? Perhaps. But occasionally I see such users accidentally fix things with such a shotgunning approach. We can dismiss some of the silly tweaks, but there are some sensible tweaks mixed in with the nonsense. A person's genuine frustration with technology's Murphy's Laws compels them to do things inefficiently (like spending extra money instead of taking more troubleshooting-efficient routes), but we don't allow namecalling such as "idiot" in these forums. So, those who got posts "Disapproved": please nuance your pannings/dismissals of crazy-tweaking with a bit more diplomacy.

We can still dismiss and criticize people, but let's do it tactfully. We are well known as being a discussion safe-place (more-or-less, best-effort) for these frontiers of technology. We aren't the mostly-unmoderated realms of Facebook/Twitter/4chan/whatever around here. So, no name-calling when criticizing, OK? Just because some people are mean to you in a different forum doesn't mean we use this forum to retaliate against them. We can still criticize, but new forum members, keep the name-calls out of those posts, OK?

It's kind of like the EMI stuff (which even NVIDIA acknowledges) -- lots of wild goose chases after red herrings -- but some truths there too, given the increased vulnerability to EMI in certain electronics, thanks to transistors becoming tinier (pushing miniaturization further) and more subject to interference.

Blur Busters exists because we didn't dismiss the crazy strobing and refresh rate thing, the crazy "sub-millisecond MPRT" thing (which is actually human visible), and other nutty display technology things that -- a decade later, after people stopped laughing -- are now textbook reading material from Gaming Monitor Engineering to Virtual Reality Engineering 101. Sure, some things are certainly useless, while other things really ended up important.

Sometimes we have to throw lots of things at the wall, as a trial, and see what sticks as the good stuff.

Anyway, back to normal discussions. In the noise/FUD, there are sometimes new discoveries of "Milliseconds Matter" tidbits, even mixed in with the useless stuff. :D

</Aside>
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

Brainlet
Posts: 100
Joined: 30 May 2020, 12:39
Contact:

Re: I'M SO CLOSE TO FINISHING MY INPUT LAG!

Post by Brainlet » 21 Dec 2020, 16:45

Chief Blur Buster wrote:
21 Dec 2020, 14:33
It's kind of like the EMI stuff that (even NVIDIA acknowledges)
Got a link to that? (NVIDIA claiming it can be an issue)
Starting point for beginners: PC Optimization Hub

User avatar
MaxTendency
Posts: 59
Joined: 22 Jun 2020, 01:47

Re: I'M SO CLOSE TO FINISHING MY INPUT LAG!

Post by MaxTendency » 25 Dec 2020, 08:41

Chief Blur Buster wrote:
21 Dec 2020, 14:33
It's kind of like the EMI stuff that (even NVIDIA acknowledges)
Brainlet wrote:
21 Dec 2020, 16:45
Got a link to that? (NVIDIA claiming it can be an issue)
I'm also interested in this. I don't remember NVIDIA mentioning this in their input-lag-related articles, at least.

timecard
Posts: 65
Joined: 25 Jan 2020, 01:10

Re: I'M SO CLOSE TO FINISHING MY INPUT LAG!

Post by timecard » 26 Dec 2020, 23:23

Brainlet wrote:
21 Dec 2020, 16:45
Chief Blur Buster wrote:
21 Dec 2020, 14:33
It's kind of like the EMI stuff that (even NVIDIA acknowledges)
Got a link to that? (NVIDIA claiming it can be an issue)
I assume it's this; I read it a while back. Lots of diagrams regarding current flow, coupling, and EMI. Enjoy.

Thesis: EMI analysis of DVI link connectors, Abhishek Patnaik, 2015
Also associated with Dr. YaoJiang Zhang at Missouri University of Science and Technology and Chen Wang and Chuck Jackson at NVIDIA, Inc.
https://scholarsmine.mst.edu/cgi/viewco ... ers_theses

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: I'M SO CLOSE TO FINISHING MY INPUT LAG!

Post by Chief Blur Buster » 02 Jan 2021, 01:13

<PandoraBox state="OPEN">
timecard wrote:
26 Dec 2020, 23:23
I assume it's this; I read it a while back. Lots of diagrams regarding current flow, coupling, and EMI. Enjoy.

Thesis: EMI analysis of DVI link connectors, Abhishek Patnaik, 2015
Also associated with Dr. YaoJiang Zhang at Missouri University of Science and Technology and Chen Wang and Chuck Jackson at NVIDIA, Inc.
https://scholarsmine.mst.edu/cgi/viewco ... ers_theses
It's not the only one; such research is a needle in a haystack, since it isn't always acknowledged to end users.

But many EMI papers exist that NVIDIA co-authored. Here is a cherrypick of a few. It takes time to dig up good examples, but they're there if you know where to look ("NVIDIA + EMI" or "NVIDIA" + "electromagnetic interference" and various other Google-Fu in academic search engines and patent search engines). Or just broaden your net and slap in a common name such as Samsung or RCA (of yesteryear big-company fame; EMI was a big problem even back in the 1940s and 1950s -- like analog ghosts on a TV and interference on radio). Anyway, it is almost High School 101 stuff to advanced NVIDIA engineers -- the engineers know EMI is a huge problem these days. These have been issues for a long time (many of these papers are from around 2010).

Research Paper Example 1: Virtual-EMI lab: Removing mysteries from black-magic to a successful front-end design
EMI engineers are struggling everyday with complex radiation problems that fail critical products to pass EMI certification and causes big loss of profit. Advances in EMI engineering are following a similar trend like Signal-Integrity engineering 10-years ago when simulation tools became capable of providing accurate predictive simulations in a reasonable amount of time. With careful engineering utilizing cutting-edge full-wave field-solver software: Momentum (MOM), EMpro (FDTD) along with a hardware boost with heterogeneous massive CPU/GPU parallel processing (CUDA) technology, we can move the EMI teams from the back-end black-magic to a successful cost-effective front-end design. This paper presents an innovative process (Virtual-EMI lab) for pre- and post-tape-out providing the designers with an early stage EMI-suppression matrix (on-chip and onboard enablers) to find the optimum trade-off between performance and cost.

Hany Fahmy (NVIDIA Corporation, Santa Clara, CA, USA)
Chen Wang (NVIDIA Corporation, Santa Clara, CA, USA)
Davy Pissoort (Department IW&T, FMEC-KHBO, Oostende, Belgium)
Amolak Badesha (Agilent Technologies, Inc., Santa Clara, CA, USA)
Research Paper Example 2: Implementation of a Virtual EMI Lab to Cost-Effectively Tackle Multi-Gigahertz EMI Challenges
Due to the increasing overall complexity and integration, electronic engineers are faced every day with ever more complicated ElectroMagnetic Interference (EMI) issues. As a result many first prototypes fail to pass all EMI certification tests causing a big loss of time and profit. Up to now, debugging EMI issues has mostly been done in costly EMI chambers. Moreover these tests are done rather late in the design cycle when there is not much flexibility left to implement the optimal and most cost-efficient EMI mitigation methods. Simulations offer a lot a flexibility when estimating the EMI impact of different elements in the electronic system and can really help to find the real sources for possible EMI issues. By having this knowledge very early in the design stage, one can implement cheap, yet effective EMI mitigation methods without resorting to more costly EMI suppressors like shielded connectors, chokes, or specially-designed enclosures. Unfortunately, due to the complexity, most of EMI problems require excessive computer resources both in terms of simulation time as computer memory. However, thanks to recent advances in the adoption of GPU parallel processing technology to modern EM simulation tools, it becomes feasible to accurately solve complex EMI problems within a reasonable amount of time. This paper gives an overview of some efficient methodologies that are currently used by within NVIDIA for cost-effective EMI suppression techniques by means of a virtual EMI lab and this both early in the design process or after physically testing a first prototype . The challenges that were successfully tackled include (i) the optimal routing of on a multi-layered Printed Circuit Board (PCB), (ii) the use of on-board shielding, (iii) the influence of grounding to a connector-to-PCB transition, (iv) simultaneous switching output (SSO) noise emission reduction, and (v) estimating the influence of the exact location of an ESD diode.
Example #3: Patent by NVIDIA cites an HP EMI patent
Patent US8169789B1 by NVIDIA cites HP patent US6219239B1 (scroll down to "Cited By", where 108 citing patents are listed, including NVIDIA's)
EMI reduction device and assembly
Abstract: An EMI reduction device is coupled between a printed circuit board (PCB) assembly and a heat sink. The PCB assembly includes a processor core that is the source of unintentional electromagnetic interference (EMI). The EMI reduction device is attached to a heat sink which is positioned over the processor core such that it capacitively couples emissions from the processor core to a grounding plane resident in the PCB assembly, thereby reducing the unintentional EMI. Simultaneously, the EMI reduction device is able to maintain thermal contact with the heat sink.
.
.
.
.
Example 1,000+ (skipped)

(...I'm jaded about EMI, which I first learned about in the late 1980s. This is only a <1% textdump; these are just small examples from 15 minutes of Chief Blur Buster Google-Fu. Most of the mainstream don't realize just how much EMI-solving is part of highly paid jobs across multiple positions within NVIDIA. But alas, they cannot test for the infinite untestable EMI combinations; many businesses try their best, but complete-coverage debugging of EMI can be harder than debugging an entire operating system -- since EMI is such a massive universe of infinitely different kinds of EMI.)
Brainlet wrote:
21 Dec 2020, 16:45
Got a link to that? (NVIDIA claiming it can be an issue)
MaxTendency wrote:
25 Dec 2020, 08:41
I'm also interested in this. I don't remember nvidia mentioning this in their input lag related articles atleast.
1. All big companies such as NVIDIA do a lot of work on EMI, but you have to look for it in other channels (research, patents, etc.)
2. In articles aimed at end users, they don't really link EMI to input lag. We just assume EMI is not a factor (which, in most situations, it isn't). But EMI leads to random pauses, which is random lag.

NVIDIA acknowledges it internally, and it's all over those research papers/patents. That's how important it is.

NVIDIA does not generally call it out directly to end users. But it is already indirectly alluded to -- overclocking also increases EMI sensitivity and creates various error-correction latencies, as covered in this earlier post.

A major reason for adding error correction to electronics is EMI resistance, among many other reasons -- error correction is now part of PCIe buses, NVMe buses, SATA buses, memory buses, GDDR6, USB buses, DisplayPort connections, etc.

Error correction was also invented partially to resist crashing on EMI events -- whether overclock-related or not.
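
For the programmers reading along, here is a tiny illustrative sketch (in Python, using the classic textbook Hamming(7,4) code -- not the actual coding scheme of any real bus, which is far more elaborate) of how error correction silently repairs a single EMI-flipped bit. The repair is invisible to the user, but it is extra work, and extra work is extra latency.

Code: Select all
# Hamming(7,4): encodes 4 data bits into 7 bits and can repair any single flipped bit.
def hamming74_encode(d):
    """4 data bits -> 7-bit codeword (bit positions 1..7)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4              # parity over positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4              # parity over positions 2,3,6,7
    p4 = d2 ^ d3 ^ d4              # parity over positions 4,5,6,7
    return [p1, p2, d1, p4, d2, d3, d4]

def hamming74_decode(c):
    """7-bit codeword -> (4 data bits, position of repaired bit, or 0 if none)."""
    c = c[:]                                   # work on a copy
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    error_pos = s1 + 2 * s2 + 4 * s4           # syndrome points at the bad bit
    if error_pos:
        c[error_pos - 1] ^= 1                  # silently repair it
    return [c[2], c[4], c[5], c[6]], error_pos

data = [1, 0, 1, 1]
received = hamming74_encode(data)
received[5] ^= 1                               # simulate one EMI-induced bit flip in transit
recovered, fixed_at = hamming74_decode(received)
print(recovered == data, "repaired bit position:", fixed_at)   # True, position 6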

Punching those bits up against the Shannon-Hartley theorem. On some of those mediums nowadays, data bits are often far below the noise floor, and advanced algorithms bring them back above it -- this is made possible by advanced modulation/demodulation techniques, which is how USB sped up, Ethernet sped up, PCIe sped up, etc. -- the ability to transmit binary data far below the noise floor, sometimes so vanishingly close to the theoretical limit of the Shannon-Hartley theorem that any tiny amount of EMI just nullifies the signal. More data, more EMI sensitivity. Ouch.
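
If you want to play with the Shannon-Hartley math yourself, here is a minimal sketch (hypothetical link numbers, purely for illustration -- not the specs of any real bus) of C = B * log2(1 + S/N). Design a link close to its theoretical capacity, and even a modest rise in the noise floor from EMI makes the target bit rate unreachable without retries or slowdowns.

Code: Select all
from math import log2

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley limit: maximum error-free bit rate C = B * log2(1 + S/N)."""
    return bandwidth_hz * log2(1 + snr_linear)

bandwidth = 1e9            # hypothetical 1 GHz of channel bandwidth
target_rate = 4.0e9        # hypothetical 4 Gbit/s the designers want to push through

for snr_db in (20, 15, 12, 10):       # EMI progressively raising the noise floor
    snr = 10 ** (snr_db / 10)         # convert dB to a linear power ratio
    c = shannon_capacity(bandwidth, snr)
    verdict = "OK" if c >= target_rate else "unreachable -> retries/slowdown"
    print(f"SNR {snr_db:>2} dB -> capacity {c / 1e9:.2f} Gbit/s : {verdict}")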

Winning the EMI battle while staying as tight as possible to Shannon-Hartley margins is a polar-opposites goal -- there are many situations where you can have one (faster) or the other (more resistant), but not both. Do you know why the NASA Mars rovers use heavily underclocked, radiation-hardened chips such as the IBM RAD6000 CPU -- at speeds in the territory of 20 megahertz (0.02 gigahertz)? Bingo! EMI (in cosmic-radiation form) in outer space is harsher than on planet Earth. See the battle?

Anyway, needless to say, EMI-resistant engineering is a major part of a lot of electronics/chip/circuitboard/powersupply/system/etc. design. But there are an infinite number of EMI needles (that punch through the anti-EMI measures) found across all the locations of a population of nearly 8 billion -- more combinations than the number of lines of code in Linux or Windows -- and it is hard to eliminate all possible EMI problems. Engineers can only debug so much.

Anyway, those who haven't studied electronics engineering can read Post #1, Post #2, and Post #3 to educate themselves about EMI as well as EMI-derived latency issues.

Also, just because (cherrypicking a simplified number), say, 99% or 99.9% of people don't have major EMI-related lag/glitches, that can still mean 10,000 (1%) out of 1,000,000 have some kind of relatively significant EMI glitch with their computer -- because they live next to high-voltage transmission lines, or live in a very EMI-noisy old apartment building with a malfunctioning fridge compressor in the next tenant's wall right behind the computer, or whatever. Lots of near-EMP-league EMI edge cases punch through a lot of the anti-EMI measures. EMI is an infinitely vast universe with a spectrum wider than the audio spectrum and the visible light spectrum, of all kinds of signal noise (broadband, narrowband, pure noise at below-light frequencies, random noise at above-light frequencies, over-the-air, over-the-wire, or trillions/quadrillions/quintillions/infinite other patterns more numerous than the number of atoms in this universe). It's a lot of FUD and wild goose chases after red herrings, very hard stuff for end users to debug. There is little hope of a full EMI debug, though one can always do their best and increase their EMI-resistance success percentage -- but not for every location of the entire population...

Now, if any of you are old enough to remember analog TV or radio -- you had interference. Ghosts. Etc. Whether over the air or over the wire (bad cable). Indirectly, interference was one of the reasons for the Communications Act of 1934 that created the Federal Communications Commission, and related problems kept happening over the years, such as the Freeze of 1948 caused by radio interference (a classic form of EMI, as radio waves are part of the electromagnetic spectrum).

Fast Forward >> To Today:

Electronic circuits running very tight to the Shannon-Hartley limit (weakest possible signal at the highest possible bit rate) are nowadays the equivalent of a very weak HDTV broadcast signal that is almost ready to black out. There's often no room to increase power, and often no room to transmit faster. Now the link becomes sensitive to the weakest EMI. Those dropouts you see on OTA HDTV broadcasts from rabbit-ear antennas connected to a flat panel? When that kind of digital signal dropout happens inside a computer (e.g. inside a USB cable, a PCIe lane, or an interrupt-signal circuit line), it becomes a microfreeze, whether it lasts a nanosecond, a microsecond, or a millisecond. Those are latency pauses on a computer caused by EMI.

Yesterday's computers crashed from EMI spikes (old IDE bus, old memory, old ISA bus, old serial bus)...
...But thanks to error-correcting layers throughout a modern PC, a computer now simply microfreezes under most ordinary EMI (error-correcting SATA/NVMe bus, error-correcting GDDR6, error-correcting USB, error-correcting PCIe).

Even if each pause is just nanoseconds, there can be millions of microfreezes throughout a whole system -- and you still get "death by a billion nanoseconds", much like losing 50% of packets on an Internet connection, but at bus-speed scales (a modern PCIe bus is literally packetized nowadays, at millions of packets per second, with ever-increasing layers of error correction). It's a complex quagmire.
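
As a back-of-napkin illustration of "death by a billion nanoseconds" (every number below is made up purely for the arithmetic, not a measurement of any real system), here is how rare per-packet retries add up once a bus is moving millions of packets per second:

Code: Select all
# Toy model of accumulated stall time from tiny, rare error-correction replays.
packets_per_second = 5_000_000   # hypothetical busy packetized bus
retry_probability  = 0.0001      # hypothetical 0.01% of packets need a replay
stall_per_retry_ns = 800         # hypothetical cost of one detect-and-replay cycle

retries_per_second = packets_per_second * retry_probability
stall_ns_per_second = retries_per_second * stall_per_retry_ns

print(f"{retries_per_second:,.0f} retries/s -> "
      f"{stall_ns_per_second / 1e6:.2f} ms of accumulated stall per second")
# With these made-up numbers: 500 retries/s -> 0.40 ms of stall per second.
# Harmless on average, but a burst landing inside one frame's render window
# is felt as a microstutter / lag spike.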

The net result is we just merrily compute along -- not caring what that little stutter or lag spike was -- at least the computer did not crash. Perhaps 99% of the time it was software and 1% of the time EMI-related. Or 0.1%, or 10% -- who knows? Very hard to troubleshoot. That said, we thank our lucky stars that the computer doesn't crash. We focus on lower-hanging apples like troubleshooting software first, and most issues are Windows or software, even if EMI is one of the many problems that needs to be battled against.

Telemetry/Statistics:
....This post opens the Pandora Box by: Less Than 0.0125%
....Type of this forum post: Basic ELI5 League
....Experience Modifier to understand Pandora Box by 1-to-10%: Ph.D Degree (and I'm not even at that level)

For most end users, I recommend not bothering to troubleshoot for EMI; it is often a waste of time and resources even when it is a major problem. Most of the time one needs to just focus on troubleshooting the troubleshootable, as that is much easier, because playing rabbit-ears roulette with a computer's innards is horrendously hard and hard to trace.

Hope this is an educational read!

Cheers,

</PandoraBox>
ERROR 1001 at Line 1: Failed to close the <PandoraBox> tag
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

Orbxtal
Posts: 1
Joined: 21 Jan 2021, 19:46

Re: I'M SO CLOSE TO FINISHING MY INPUT LAG!

Post by Orbxtal » 21 Jan 2021, 19:50

Hey, after applying all the reg keys, my latency feels better for sure but I can't get some of my games to run in exclusive fullscreen anymore, any idea why?

f1ndus
Posts: 165
Joined: 30 Dec 2020, 10:38

Re: I'M SO CLOSE TO FINISHING MY INPUT LAG!

Post by f1ndus » 22 Jan 2021, 09:33

Orbxtal wrote:
21 Jan 2021, 19:50
Hey, after applying all the reg keys, my latency feels better for sure but I can't get some of my games to run in exclusive fullscreen anymore, any idea why?
key regs? can you explain please?
