AMD Radeon AntiLag

Everything about latency. Tips, testing methods, mouse lag, display lag, game engine lag, network lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more! Input Lag Articles on Blur Busters.
User avatar
RealNC
Site Admin
Posts: 3741
Joined: 24 Dec 2013, 18:32
Contact:

Re: AMD Radeon AntiLag

Post by RealNC » 12 Jun 2019, 02:40

This looks like nvidia just trying to downplay AMD, and they're contradicting themselves while doing so: they say they don't know what AMD's tech actually is, yet in the same statement claim nvidia has already had it for a decade?

They should hire someone at nvidia to proofread statements before sending them out...
Steam | GitHub | Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

MatrixQW
Posts: 278
Joined: 07 Jan 2019, 10:01

Re: AMD Radeon AntiLag

Post by MatrixQW » 12 Jun 2019, 03:04

Is it possible that Intel HD Graphics has less input lag than Nvidia or AMD?
I'm thinking that since it's integrated into the CPU, maybe it takes less time to process things.
Has this ever been measured with real tests?
Also, there is no pre-rendered frames option with Intel, so how many is it using?

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: AMD Radeon AntiLag

Post by Chief Blur Buster » 12 Jun 2019, 16:06

MatrixQW wrote:Is it possible that Intel HD Graphics has less input lag than Nvidia or AMD?
I'm thinking that since it's integrated into the CPU, maybe it takes less time to process things.
Short answer: No

Proof: My Tearline Jedi research and my understanding of Present()-to-photons science.

However, there are latency differences at the software driver level, and those have nothing to do with whether the GPU is integrated or not -- integration itself only produces microsecond differences, which are meaningless.

The granularity of the horizontal scanrate is all that matters in practical considerations: single-scanline movements of a tearline are the smallest possible latency granularity, i.e. one unit of horizontal scanrate, which is 1/160,000 sec for a 160 kHz horizontal scanrate. Anything less than that is a rounding error -- including the signal-propagation difference between in-CPU and PCI Express, or memory access/latency differences.
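
To put a number on that granularity, here's a quick back-of-the-envelope check (a standalone illustrative C++ snippet, with the 160 kHz figure from above):

```cpp
#include <cstdio>

int main() {
    // Illustrative figure from the post above: a 160 kHz horizontal scanrate.
    const double hscan_hz = 160000.0;

    // One scanline = the smallest unit a tearline can move by,
    // i.e. the finest latency granularity that matters in practice.
    const double scanline_us = 1e6 / hscan_hz;  // = 6.25 microseconds
    std::printf("one scanline = %.2f us\n", scanline_us);

    // A PCI Express hop is on the order of a microsecond or less,
    // i.e. below one scanline -- a rounding error at this granularity.
    return 0;
}
```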

It's how the software does it (driver level, frame buffers, etc.) that makes the latency differences significant enough to matter (and/or to move the location of tearlines) -- even simple nuances such as power management can affect latency, lengthening the time from input read to photons.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

Q83Ia7ta
Posts: 761
Joined: 18 Dec 2013, 09:29

Re: AMD Radeon AntiLag

Post by Q83Ia7ta » 12 Jun 2019, 17:11

I'm sure it's just a gimmick in most use cases; maybe it will help/work for some games or in some situations. Btw, the pre-rendered frames / flip queue size value is ignored/overridden in most competitive games -- it's simply set by the game engine.
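
For illustration, a DXGI-based engine typically pins its own queue depth via IDXGIDevice1::SetMaximumFrameLatency -- a minimal sketch (the helper name is mine), which shows why a driver control-panel value may simply never take effect in such games:

```cpp
#include <d3d11.h>

// Minimal sketch: a DXGI-based engine pinning its own frame queue depth.
// 'device' is assumed to be an already-created ID3D11Device.
void PinFrameLatency(ID3D11Device* device)
{
    IDXGIDevice1* dxgiDevice = nullptr;
    if (SUCCEEDED(device->QueryInterface(__uuidof(IDXGIDevice1),
                                         reinterpret_cast<void**>(&dxgiDevice))))
    {
        // 1 queued frame = the lowest this knob allows. Because the engine
        // sets this itself, a control-panel override may not beat it.
        dxgiDevice->SetMaximumFrameLatency(1);
        dxgiDevice->Release();
    }
}
```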

Stitch7
Posts: 86
Joined: 27 Mar 2019, 08:26

Re: AMD Radeon AntiLag

Post by Stitch7 » 12 Jun 2019, 17:49

jorimt wrote:For what it's worth:
https://www.pcgamesn.com/nvidia/nvidia- ... lternative
Tony Tamasi, VP of technical marketing, says [Nvidia] has an alternative to [Radeon Anti-lag] too… and has done for over a decade [...] That’s the setting ‘maximum pre-rendered frames’ in the GeForce control panel, which controls the size of the Flip Queue. The higher the number here, the higher the lag between your input and the displayed frame, while lower queue lengths result in less lag but increased strain on your CPU.
1-3 frames at 60fps is more than 15ms.
I think this is something different.
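
The quick arithmetic behind that (illustrative C++): one frame at 60fps is ~16.7 ms, so a 1-3 frame queue is ~16.7-50 ms of added lag:

```cpp
#include <cstdio>

int main() {
    const double frame_ms = 1000.0 / 60.0;  // ~16.7 ms per frame at 60 fps

    // A pre-render queue of 1-3 frames adds roughly 1-3 frame times of lag:
    for (int queued = 1; queued <= 3; ++queued)
        std::printf("%d queued frame(s) ~= %.1f ms\n", queued, queued * frame_ms);
    return 0;
}
```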

User avatar
jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: AMD Radeon AntiLag

Post by jorimt » 12 Jun 2019, 22:28

^ Sure, let's hope; if it ends up being something entirely new, obviously none of us can know until they tell us. Just best (educated) guesses at this point.
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: ASUS PG27AQN, LG 48CX VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: AMD Radeon AntiLag

Post by Chief Blur Buster » 13 Jun 2019, 13:06

That said...

I can confirm there are inefficiencies in current graphics workflows (including VSYNC ON, power-management-induced latency increases, and other factors) that can create additional latencies specific to driver software versions, computer system setups, and graphics vendors.

Virtual reality development has borne this out in its push for lower latency, including building raster-based beam-raced rendering workflows and/or even simply VBI-beamraced pageflipping (like RTSS Scanline Sync). It's theoretically possible that AntiLag is simply a variant of driver-based scanline sync, but I'm not sure. If so, it would be great news that AMD has adopted a "Blur Busters suggested invention" under popular demand. ;) Probably not -- I am just speculating. The Present()-to-photons science is a Pandora's box that is very hard to optimize (reducing lag and stutters can often make things difficult for game developers).
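
For the curious, a minimal sketch of the VBI-beamraced pageflip idea (RTSS Scanline Sync style) using the classic Direct3D 9 raster query -- purely illustrative, and not a claim about how AntiLag actually works:

```cpp
#include <d3d9.h>

// Illustrative only: with VSYNC OFF, poll the raster position and flip once
// the beam is inside the vertical blanking interval, so the tearline lands
// off-screen. 'device' is an existing IDirect3DDevice9 created with
// D3DPRESENT_INTERVAL_IMMEDIATE.
void PresentInVBlank(IDirect3DDevice9* device)
{
    D3DRASTER_STATUS raster = {};
    do {
        device->GetRasterStatus(0, &raster);  // swap chain 0
    } while (!raster.InVBlank);               // busy-wait for the VBI

    // Flip now: the "tearline" lands inside the blanking interval, invisible.
    device->Present(nullptr, nullptr, nullptr, nullptr);
}
```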

Maybe that's not it -- who knows what is being done behind the scenes?

I would not be surprised if there are latency reductions in certain graphics drivers, but actual testing will need to bear this out -- actual hard data.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

mello
Posts: 251
Joined: 31 Jan 2014, 04:24

Re: AMD Radeon AntiLag

Post by mello » 13 Jun 2019, 14:25

This is what AMD rep says about AntiLag feature: https://youtu.be/uPpdtXFx7gQ?t=232

User avatar
ko4
Posts: 126
Joined: 06 Jul 2018, 16:14

Re: AMD Radeon AntiLag

Post by ko4 » 14 Jun 2019, 09:06

mello wrote:This is what AMD rep says about AntiLag feature: https://youtu.be/uPpdtXFx7gQ?t=232
Like I said, this will only work when GPU-limited/bottlenecked.

User avatar
jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: AMD Radeon AntiLag

Post by jorimt » 14 Jun 2019, 09:08

mello wrote:This is what AMD rep says about AntiLag feature: https://youtu.be/uPpdtXFx7gQ?t=232
^ Yeah, hmm, he still makes it sound like a reduced pre-rendered frames queue for GPU-limited and/or usage-maxed scenarios (which does indeed affect both synced and unsynced), at least under specific (still-to-be-specified) conditions, and possibly only at lower/certain framerates.

That's one of the very reasons we recommend having the framerate limited by an FPS cap (and sustained there) at all times where possible (regardless of synced/unsynced), if you want the lowest input lag at all times.
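
A bare-bones illustration of such a cap (a hypothetical helper, nothing vendor-specific): call it once per frame, and the render thread never runs ahead of the pace you set, so the GPU can't queue extra frames of input lag:

```cpp
#include <chrono>
#include <thread>

// Minimal FPS-cap sketch: paces the calling thread to 'target_fps'.
void FrameLimiter(double target_fps)
{
    using clock = std::chrono::steady_clock;
    static clock::time_point next = clock::now();

    const auto frame_time = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / target_fps));

    next += frame_time;
    if (next < clock::now())
        next = clock::now();              // missed a frame: don't build up debt
    std::this_thread::sleep_until(next);  // coarse; RTSS-class tools time finer
}
```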

Seems like one of the only potential differences from Nvidia here is that AMD may have found a way to guarantee an override (and/or auto-regulation, tuned to balance lowest lag with highest framerate) of each game's internal pre-rendered queue value at the driver level -- something that can't currently be said for Nvidia's MPRF setting, which has no guarantee of overriding a given game's behavior.

That said, I'm still hoping it turns out to be something more revolutionary; I just haven't seen anything that tells me it is (yet...).
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Post Reply