iceboy wrote: ↑12 Apr 2021, 17:31
I have seen in multiple game communities - new players play with default settings, medium (think of gold/diamond rank of league of legends) players play with maximum pre-rendered frames set to 1 (or low-latency mode on/ultra) and very high FPS, and top/pro players play with default settings (maximum pre-rendered frames set to 3, I've seen some set to 4 or 8) again with very low FPS, and with secret FPS limiting/frame pacing techniques.
I should probably start by addressing pre-rendered frames. I once switched to the 560 driver to try 0 pre-rendered frames, and maybe there was a difference; I don't quite remember. Though according to NVIDIA support, the GPU needs at least 1 pre-rendered frame, otherwise it would have nothing to work on. 1 pre-rendered frame generally causes a delay of 1 frame. Though I heard the Chief Blur Buster say (if I'm not mistaken) that not all of 3 pre-rendered frames will always be used, some can be discarded. Still, for lower input lag you should set it to 1!
LOOOOL, can you show me, please? E.g. link some stream? I have never heard that pros play on low FPS with secret FPS-limiting/frame-pacing techniques; interesting! Logically you want more FPS, and I don't see any advantage to lower FPS, even with secret frame-pacing techniques. G-Sync already reduces tearing drastically. What could you possibly gain from this at low FPS, I wonder?
What exactly do you mean by the low FPS that pros play on, BTW?
iceboy wrote: ↑12 Apr 2021, 17:31
The pre-render queue is just like other producer-consumer queues. When the producer is faster than the consumer, some frames will be stored in the queue causing delay, by limiting the queue size to 1, the producer will block on the enqueue operation when the previous frame has not been dequeued
I don't think this is relevant, because the CPU is significantly slower than the GPU, by roughly 50% in my case. In BF1 (DX11) and Rainbow Six Siege (Vulkan) my GPU frame time is around 3-4 ms at most all the time, while the CPU frame time never goes under 6.7 ms in BF1. I have never seen the GPU frame time spike, not even once, and I was monitoring it extensively, because I had CPU frame time spikes in BF1.
Whereas if I had 3 pre-rendered frames, I would have a delay of up to 3 frames! I could always tell the difference between 1 and 3-4 pre-rendered frames; 3-4 introduces significant delay.
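As a rough sanity check of the "up to 3 frames" claim (a sketch with my assumed numbers, not measured data): each frame sitting finished in the pre-render queue can add up to one CPU frame time of latency, so with my ~6.7 ms CPU frame time:

```python
# Worst-case added latency from the pre-render queue: every slot holds a
# finished frame waiting its turn. 6.7 ms is my BF1 CPU frame time.
def queue_latency_ms(cpu_frame_time_ms: float, queue_depth: int) -> float:
    return cpu_frame_time_ms * queue_depth

print(queue_latency_ms(6.7, 1))  # 6.7 ms with max pre-rendered frames = 1
print(queue_latency_ms(6.7, 3))  # ~20 ms with the default of 3
```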
Interesting that if the producer is faster (as you say), frames get stored in the queue; I didn't know that. Maybe anyone with a weak GPU should be concerned about this then.
But it makes logical sense: since the pre-render queue is 1, the CPU waits before it starts preparing a new frame for the GPU. That also means it draws a more recent frame, but later than if it had started right away. Not sure which would be better.
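To illustrate the producer/consumer point in generic terms (a toy sketch only, nothing to do with actual driver internals): with a queue of size 1, a fast producer blocks in the enqueue until the consumer has taken the previous item, so at most one finished frame is ever waiting:

```python
import queue
import threading

# Toy model of the pre-render queue: maxsize=1 means the producer (the CPU)
# blocks in put() until the consumer (the GPU) has taken the previous frame.
frames = queue.Queue(maxsize=1)
consumed = []

def producer():
    for frame_id in range(5):
        frames.put(frame_id)   # blocks while the previous frame is still queued
    frames.put(None)           # sentinel: no more frames

def consumer():
    while True:
        frame = frames.get()
        if frame is None:
            break
        consumed.append(frame)

p = threading.Thread(target=producer)
c = threading.Thread(target=consumer)
p.start(); c.start()
p.join(); c.join()
print(consumed)  # [0, 1, 2, 3, 4] - in order, never more than 1 frame queued
```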
iceboy wrote: ↑12 Apr 2021, 17:31
I still remember how low the latency is with NVidia 285/295 drivers with G92 (9800GTX/GTS250) GPU, because it allows to set the maximum pre-rendered frames to 0. I think that turns off the queue.
NVIDIA staff said that you can't actually have fewer than 1 pre-rendered frame; the GPU needs at least 1 frame's worth of work from the CPU in order to do anything. I don't know if this is true, and NVIDIA staff couldn't disclose anything that wasn't meant for the public anyway. So what the 0 setting actually meant for the GPU, I don't know. Maybe, like you said, it was 1 pre-rendered frame but without waiting on the GPU, so if the CPU was faster it would start the next frame right away. Or maybe it sent less than 1 frame at a time and NVIDIA lied to us.
Or it could be trying to reduce driver latency, which is precisely what Ultra mode is for. But you have to have high GPU usage: someone did a 1000 fps camera test, and if you don't have ~99% GPU usage, Ultra is worse than Off, I think. I could tell the difference in a low-GPU-usage game: the On setting felt much better than Ultra or Off.
iceboy wrote: ↑12 Apr 2021, 17:31
Hardware and software developers seem to be ignorant - they only solve the requirements while that don't necessarily create a better system. Thanks to AR/VR applications that have low latency requirements
Don't even get me started on VR.
TELL ME THIS: how is it possible that 60 fps on the Index with some reprojection, i.e. 110-120 interpolated frames (not fixed), on high/ultra settings, feels like zero input lag? I have never seen anything more responsive in my life! I was afraid VR would be laggy 60 Hz gameplay, but this is amazing! Even 120 fps with G-Sync feels bad to me by comparison; VR at 60 fps feels maybe even better than capped 144 Hz with G-Sync. Yes, it can update the controller position into the interpolated frame, but still, you are looking at interpolated frames only! It also doesn't have tearing, or at least I didn't notice any. How is this even possible??? It is the best thing I have ever experienced in terms of input lag and smoothness!
iceboy wrote: ↑12 Apr 2021, 17:31
After struggling for many years, I wrote my own frame rate limiter, because none of the existing ones (e.g. RTSS) can do the job without adding latency. It started with a negative-feedback frame pacing with the windowed VSync - the windowed VSync is so stable, it must be a masterpiece, and it don't have any lag compared to full screen VSync which seems not implemented correctly. I didn't find a way to create a timer source with similar stability without a busy loop, comparing to the windowed VSync so I pace with it. However the frame pacing needs to add ~2ms lag for stability and I noticed - I can use busy loop, and I can create an accurate delay if I use busy loop in the last 1ms. When I limit the game to 59.94fps with a 59.94Hz screen, I can see a stable tearline in the middle of the screen - this is achieved without any scanline sync. The pixels just above the scanline is just computed so they have lowest latency. Now I play 59.94fps on 239.76Hz screen for faster scanout.
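The quoted limiter technique, a coarse wait followed by a busy loop for the last ~1 ms, can be sketched generically like this (my own rough version with assumed parameters, not iceboy's actual code, and using `perf_counter` rather than a windowed-VSync timer source):

```python
import time

def limit_frame(next_deadline, frame_time, spin_window=0.001):
    """Sleep coarsely until ~spin_window before the deadline, then busy-wait.

    Generic sketch of the hybrid sleep + busy-loop technique, not iceboy's
    actual limiter. Returns the deadline for the following frame.
    """
    # Coarse phase: OS sleep is cheap but can overshoot by a scheduler quantum.
    while True:
        remaining = next_deadline - time.perf_counter()
        if remaining <= spin_window:
            break
        time.sleep(remaining - spin_window)
    # Precise phase: burn CPU for the last ~1 ms for sub-millisecond accuracy.
    while time.perf_counter() < next_deadline:
        pass
    return next_deadline + frame_time

frame_time = 1 / 59.94          # target: 59.94 fps
deadline = time.perf_counter() + frame_time
for _ in range(5):
    # render_frame() would go here
    deadline = limit_frame(deadline, frame_time)
```

Because the deadlines are absolute rather than "sleep one frame time from now", a late frame doesn't shift the whole cadence; the busy loop only has to absorb the sleep's scheduler jitter in the final millisecond.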
V-Sync? Wait, what? Even triple-buffered V-Sync causes lag. There is also multi-buffering, which is somewhat better, but I'm not sure it beats G-Sync. Oh, I know why it might not: it adds an extreme amount of VRAM usage (and maybe decreases FPS slightly), even something like 12 GB I heard. My RTX 3070 has only 8 GB, LOL!
Also, even if you had 0 tearing at 60 FPS, 60 FPS is such a low value:
1. it increases input lag drastically
2. it increases motion blur (someone claims at least 180 FPS is needed to see an enemy character while you are rapidly looking around)
For me, even on a 144 Hz monitor with low pixel response time, looking around quickly was blurred, and it was still blurred at 144 Hz with G-Sync.
Pros were already playing at 200 FPS on CRTs in CS 1.6. I can't believe anyone would play at 60!
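There is simple math behind the blur point (the standard sample-and-hold estimate; the 1920 px/s panning speed is my assumed example): on a sample-and-hold display, eye-tracked motion smears across roughly one frame of travel, so the blur width is motion speed divided by frame rate:

```python
# Sample-and-hold motion blur: tracked motion smears ~1 frame of travel.
# Assumed example speed: 1920 px/s, a full 1080p screen width per second.
def blur_px(speed_px_per_s: float, fps: float) -> float:
    return speed_px_per_s / fps

for fps in (60, 144, 240):
    print(f"{fps} fps -> {blur_px(1920, fps):.1f} px of smear")
# 60 fps -> 32 px, 144 fps -> ~13.3 px, 240 fps -> 8 px
```

Which matches the complaint: at 60 fps a fast flick smears an enemy across dozens of pixels, and even 144 Hz only cuts that to roughly 13 px at this speed.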
I have a G-Sync monitor, but I would still prefer a fixed refresh rate for competitive play. Even though it causes tearing, it ultimately has the lowest latency, and at 144+ fps tearing is less of a concern.
Yeah, windowed mode has higher input lag than exclusive fullscreen. Also, since driver 460.09 there is the new MPO overlay if you switch to borderless windowed; it has very low delay, tested in Rainbow Six Siege on Vulkan.
500 Hz monitors are coming in 1-2 years, and 1000 Hz maybe in 3-4. Can't believe anyone still plays at 60!