Re: AMD Radeon AntiLag
Posted: 14 Jun 2019, 12:01
Input lag is additive. It isn't solely about the time between frames, but about the window in which your actions can be invalidated. For instance, take the popular serpentine movement in an FPS: when someone jukes back in the opposite direction with A/D strafing, you're going to overshoot. This is why buffering is really bad, and input delay is a buffer. It doesn't matter how much FPS you get, because your actions are still buffered by the input delay.
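To make the "additive" point concrete, here's a toy back-of-the-envelope model (my own sketch, not AMD's math): total lag is roughly the input-device delay plus the number of frames queued ahead times the frametime. Raising FPS shrinks the frametime term but never removes the queue term.

```python
# Toy model of additive input lag. The function name and the
# 1 ms device delay are illustrative assumptions, not measured values.
def total_input_lag_ms(fps, queued_frames, device_delay_ms=1.0):
    frametime_ms = 1000.0 / fps
    # Each frame queued ahead of display adds one full frametime of delay.
    return device_delay_ms + queued_frames * frametime_ms

# 144 fps with 3 frames buffered is still laggier than
# 60 fps with only 1 frame buffered:
print(total_input_lag_ms(144, 3))  # ~21.8 ms
print(total_input_lag_ms(60, 1))   # ~17.7 ms
```

That's the whole argument in two lines of arithmetic: high FPS can't buy back the latency a deep buffer adds.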
FPS and input delay both matter immensely in their own right. Unfortunately, very few people have talked about input delay outside of monitor testing.
The new tech looks highly promising and the demo appears genuine. Something to note: the Nvidia system runs about 4 fps lower than the AMD system. It's sitting at 55-56 while the AMD system sits at 59-60. You can tell they're using their own internal version of the mouse light tester, with the sync at the top of the screen. It would've been more helpful and more understandable, both to their audience (derpy gamers) and to people watching with a technical background, if they had actually shown their testing setup for context. When they turn on their Input Delay Reduction (I refuse to call it anti-lag), you can see the screen doing really weird and janky stuff with the timer, probably due to some sort of time-syncing issue. This is actually a good sign; it means something weird and alpha is genuinely happening under the hood.
This is first-gen technology; it's only going to get better the more they focus on it. Right now it's basically a proof of concept, and it's already showing 30% gains... That's very impressive.
Nvidia has not done anything like this or even mentioned anything similar. They're just playing 'me too' with something they don't understand. AFAIK AMD has always rendered 1 frame ahead, which is why you can't change it. There used to be a registry way of changing it, but it's no longer needed.
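For anyone unclear on why pre-rendered frames matter here, a quick simulation of a frame queue shows the effect (the queue depths below are illustrative, not AMD's or Nvidia's actual driver internals): input sampled when a frame enters the queue isn't visible until that frame leaves it.

```python
# Hedged sketch: how many frames elapse before a click sampled on
# frame 0 reaches the screen, given a pre-rendered frame queue.
from collections import deque

def frames_until_visible(queue_depth):
    queue = deque()
    frame = 0
    clicked_frame = 0  # input is sampled when frame 0 is queued
    while True:
        queue.append(frame)
        if len(queue) > queue_depth:
            shown = queue.popleft()  # oldest frame reaches the display
            if shown == clicked_frame:
                return frame  # frames elapsed before the click appeared
        frame += 1

print(frames_until_visible(1))  # click visible after 1 frame
print(frames_until_visible(3))  # click visible after 3 frames
```

Each extra frame of queue depth is a full frametime of added delay, which is exactly what a "reduce pre-rendered frames" setting trades against GPU utilization.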