andrelip wrote:The mouse feels very fast and that is not a placebo but I seem to play way worse with it and that could be a placebo.
It's very system-dependent and person-dependent, in terms of how you perform.
Could be unknown lag or playfeel factors
There are unknown lag factors, such as scaling (though that might not be it; I can't tell), or you might be trained to play with slightly more lag than you currently have. Strange interactions with mouse poll rate may also occur, or RTSS may be adding extra lag that cancels out the lag savings. We'd have to use a button-to-pixels high-speed video of the game to figure out what is going on -- though that is time/labour intensive.
Could be lag re-training effect
Lag changes (sudden decreases or sudden increases) can affect your game. Remember that when aiming at something while panning 4000 pixels/sec (mid-mouse-turn), a 1ms change in lag means a 4 pixel overshoot/undershoot (4000 pixels/sec ÷ 1000ms = 4 pixels per millisecond). So a 5ms change in lag, whether a sudden 5ms increase or a sudden 5ms decrease, means a 20 pixel overshoot/undershoot in this situation. This happens with pretty much any display change; your game may be briefly affected until you're used to the new lagfeel -- especially when aiming at fast-moving targets or aiming without stopping turns, etc.
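The overshoot arithmetic above can be sketched in a few lines. This is purely illustrative; the pan speed and lag deltas are the example values from the text, not measurements:

```python
# How many pixels a sudden lag change translates to while panning.
# Example values only: 4000 px/s pan speed, 1ms and 5ms lag changes.

def overshoot_pixels(pan_speed_px_per_sec: float, lag_change_ms: float) -> float:
    """Pixels of overshoot/undershoot caused by a lag change mid-pan."""
    return pan_speed_px_per_sec * (lag_change_ms / 1000.0)

print(overshoot_pixels(4000, 1))  # 1 ms change at 4000 px/s -> 4.0 px
print(overshoot_pixels(4000, 5))  # 5 ms change -> 20.0 px
```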
andrelip wrote:1) How could they buffer something from a higher scanrate to display at a lower one? They just discard the excessive blank interval? They discard the entire frame?
No, no, no.
Metaphorically, imagine a big pipe between GPU and monitor motherboard. And a tiny pipe between monitor motherboard and panel. If the pixels are arriving faster than the pixels can be refreshed onto the screen, the "tank" (buffer) fills up. The monitor is still refreshing at its maximum velocity, but that maximum bitrate to the panel (from monitor motherboard to monitor panel) may be less than the cable delivery bitrate (from the GPU to the monitor motherboard). The buffer is simply a temporary "holding tank".
A different metaphor is like Netflix streaming. If your Internet connection is too slow to stream in realtime, Netflix has to buffer before displaying video. The GPU-to-monitor-motherboard is like that "internet connection" and the "monitor-motherboard-to-monitor-panel" is like the video playback beginning.
The buffer may hold only up to 1/240sec of data (per refresh cycle), so it is so brief as to be imperceptible to the human eye -- just a few milliseconds of lag per refresh cycle. All that's happening is 1/360sec delivery being lagged-down to 1/240sec delivery, so 1/360sec and 1/240sec delivery should have equal latency on a panel that scans out a whole refresh cycle in 1/240sec. So QFT faster than a panel's scanout velocity doesn't necessarily worsen things. In fact, in one situation -- a display that requires the whole refresh cycle to be transmitted BEFORE beginning scanout -- faster delivery can cause a refresh cycle to begin scanning out (from the monitor motherboard's buffer) sooner. But most gaming monitors begin scanout even before the refresh cycle has been fully delivered, so it's just a rolling-window line buffer that may time the start of scanout against the delivery, and begin refreshing at the right moment.
A buffer is simply the only way to convert a slow-delivery (transmission of pixels over cable) into a fast-scanout (playback of pixels onto panel). Fortunately, the way it works is that once enough buffer has been built up, it can begin refreshing the top part of the display even while waiting for the bottom part of the display to finish delivering over the video cable.
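The "holding tank" behaviour above can be modelled as a toy simulation: a cable delivering a frame in 1/360sec into a panel that scans out in 1/240sec. All numbers here (1080 visible lines, the delivery/scanout rates) are illustrative assumptions, not the specs of any real monitor:

```python
# Toy rolling-buffer model: cable delivery faster than panel scanout.
# The buffer only ever holds part of a frame, never the whole frame,
# and scanout still finishes in the panel's normal 1/240 sec.

LINES = 1080                 # visible lines per refresh cycle (assumed)
DELIVERY_TIME = 1 / 360      # cable delivers the whole frame this fast
SCANOUT_TIME = 1 / 240       # panel needs this long to refresh

deliver_rate = LINES / DELIVERY_TIME   # lines/sec over the cable
scan_rate = LINES / SCANOUT_TIME       # lines/sec onto the panel

dt = 1e-6                    # simulation step: 1 microsecond
delivered = scanned = 0.0
peak_buffer = 0.0
t = 0.0
while scanned < LINES:
    delivered = min(LINES, delivered + deliver_rate * dt)
    # Panel can only scan lines that have already arrived in the buffer
    scanned = min(delivered, scanned + scan_rate * dt)
    peak_buffer = max(peak_buffer, delivered - scanned)
    t += dt

print(f"scanout finished at ~{t * 1000:.2f} ms")   # ~4.17 ms, i.e. 1/240 sec
print(f"peak buffer: ~{peak_buffer:.0f} of {LINES} lines")
```

Note how the buffer peaks at roughly a third of a frame (the cable finishes delivering while the panel is only two-thirds done scanning), which matches the "brief holding tank" description rather than a full extra frame of latency.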
[Questions on this is somewhat offtopic, please post future questions about display engineering in the Area 51 Display Engineering Forum]
2) 360hz seems to be the limit for me. Even at 640x480 it didn't pass that value. More than that the display just goes black with osd popping the preferred resolution.
Interesting. That said, I assume you're referring to 1/360sec frame delivery (a lower Hz, but with frames delivered in 1/360sec), rather than "360Hz" (360 refresh cycles per second). Can you post a screenshot of the best 1/360sec mode you created?
3) We should consider the theory that AW2518HF's panel (AU Optronics M250HTN01.0) could achieve more than 1/240hz but since more than that could reach pixel clock limit of dp 1.2 they decided to limit at that rate.
From a cable perspective, that might very well be right. All of these are undocumented quirks, since the monitors are typically not designed for faster than 1/240sec delivery -- we're essentially using out-of-spec video signals when we force a quick frame transport via a custom resolution.
4) I could actually use 1024x768@240hz with 1/360 scanrate but the screen have massive overshoot.
So maybe you're not having a buffered refresh cycle after all... (not 100% sure, but...)
Major ghosting worsening is possible evidence the panel is actually scanning out in 1/360sec! Massive ghosting, however, can actually worsen reaction time in games -- the ~1ms time savings of 1/360sec versus 1/240sec is massively outweighed by a more-than-1ms slowdown in LCD GtG. LCD pixels can't refresh completely when refreshed too briefly (not enough time sending voltage to the pixels), so pixel response (GtG) slows down during this overclocked scanrate, and you get more lag because of slower pixel response. Trying to gain 1ms less lag from 1/360sec QFT, then finding you're getting more lag from slower LCD GtG. Ha.
Also, remember QFT tricks (large Vertical Totals) don't work with unsynchronized VSYNC OFF -- it only reveals benefits when used with top-edge-of-visible-refresh-cycle framebuffer flipping (rather than Microsoft's usual method of bottom-edge-of-visible-refresh framebuffer flipping)
(I've seen this behaviour before in monitor overclocking -- pixel response can often suffer)
Totally off-topic but since you asked: I read more than a few hundred different topics about performance tweaks and games with input lag, negative acceleration and micro-stuttering -- DPC latency, Timer Resolution and all those bullshits. None of that gave me any real advantage, and the game was always inconsistent, with good and bad days, even with high fps and relatively good PC specs. After I discovered RTSS I skyrocketed in the current league that I play here in Brazil for CSGO.
Excellent to know! RTSS is a very valuable tool when used correctly.
I'm Global, but that's something easy to achieve in my region; in this league I was 16th of 20 total, and my k/d for the year fluctuated between 0.8 and 1.1. Now I have 1.48 and I'm 19th of 20, with more than 30 games on record playing at 160Hz @ 160fps. I'm not sure if I prefer to cap the fps for perfect stability or set the scanrate to 350, but the game feels way faster, with perfect recoil and mouse control. It could also be software-related, since the calculations and interpolation are better with a perfect frametime.
It's tough to improve even further, because we've got a complex soup of unknown interactions that is hard to test for:
--> RTSS latency overheads
--> Panel scaling latency (sometimes nonexistent, as scaling can be made lagless, albeit not always)
--> LCD GtG response speed changes from overclocked scanrates (from QFT)
--> Tradeoffs made (e.g. trading VSYNC OFF with RTSS-capped tearingless VSYNC OFF)
--> Mouse poll beat-frequency / harmonic effects against the refresh rate / framerate.
--> Game design interactions. (Too-high/too-low caps can go wonky; aimfeel can go strange at 1000fps CS:GO)
My recommendation is to back off the QFT margins a little bit and aim at 1/240sec QFT of a lower Hz -- for example, 1/240sec delivery of 160Hz or 200Hz. (Btw, the frame delivery time is calculated as (Active Vertical Resolution) / (Horizontal Scan Rate); that's your frame delivery time.) If your overdrive problems disappear, then that solves it; see if your aimfeel is better.
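The delivery-time formula above can be turned around to find the Vertical Total needed for a target delivery time. A quick sketch; the 1080-line active resolution and 160Hz refresh are example values, not a recommendation for any particular monitor:

```python
# Frame delivery time = Active Vertical Resolution / Horizontal Scan Rate,
# where Horizontal Scan Rate = refresh rate * Vertical Total.
# Solving for VTotal gives the QFT-style "large Vertical Total" trick.

def delivery_time(v_active: int, h_scan_rate_hz: float) -> float:
    """Frame delivery time in seconds."""
    return v_active / h_scan_rate_hz

def vtotal_for_target(v_active: int, refresh_hz: float, target_sec: float) -> float:
    """Vertical Total that makes a refresh_hz signal deliver each frame in target_sec.
    target = v_active / (refresh_hz * VTotal)  =>  VTotal = v_active / (refresh_hz * target)
    """
    return v_active / (refresh_hz * target_sec)

vt = vtotal_for_target(1080, 160, 1 / 240)   # roughly 1620 total lines
h_scan = 160 * vt                            # resulting horizontal scan rate, Hz
print(f"VTotal ~{vt:.0f}, delivery ~{delivery_time(1080, h_scan) * 1000:.3f} ms")
```

In other words, a 160Hz signal with a Vertical Total of about 1620 (instead of the usual ~1111 for 1080p) spends the extra lines as blanking, so the visible 1080 lines arrive in 1/240sec.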
Also, one possible issue is that a monitor may have bugs in its overdrive processing, using the wrong overdrive formula -- Quick Frame Transport signals require overdrive to be done differently, e.g. 160Hz with 1/240sec QFT should use a 240Hz overdrive formula, not a 160Hz overdrive formula. So ghosting can still look worse, which is frustrating if the overdrive setting (e.g. "OD GAIN") is not adjustable.
No guarantees obviously, but aren't we at the extreme bleeding edge of tweaking here on Blur Busters?
We essentially implemented mainstream prosumer-tweakable Quick Frame Transport before the HDMI Forum did (there are no shipping HDMI 2.1 consoles or computers that officially use Quick Frame Transport in an easy, user-friendly manner -- yet it's part of the HDMI 2.1 specification, waiting for vendors to implement it fully, including properly into Windows drivers / Microsoft Windows). Pushing the limits of our monitors...