flood wrote:well if it's just scaling down every frame, that should only add as much input lag as it takes to scale down a frame. which isn't that long (~1ms maybe)
Yes, normally this is true, but if your GPU is extremely busy doing 3D rendering, then scaling could theoretically slow things down. Adding 1ms to render times can turn 10ms frame cycles (100fps) into 11ms frame cycles (~91fps). So you see, 1ms of extra work per frame can slow your framerate by nearly 10%. That is, unless the GPU has a scaling pipeline independent of the rendering pipeline (which may not be true for all GPUs).
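A quick back-of-the-envelope sketch of that arithmetic (plain Python, using the same illustrative numbers as above, not measured values):

```python
# How much framerate does 1ms of per-frame scaling cost
# when the GPU is already saturated by rendering?

base_frame_ms = 10.0   # 10 ms per frame = 100 fps (GPU fully busy)
scaling_ms = 1.0       # assumed extra time to scale each frame

base_fps = 1000.0 / base_frame_ms
new_fps = 1000.0 / (base_frame_ms + scaling_ms)

print(f"Before: {base_fps:.0f} fps")    # 100 fps
print(f"After:  {new_fps:.1f} fps")     # ~90.9 fps
print(f"Framerate drop: {100 * (1 - new_fps / base_fps):.1f}%")  # ~9.1%
```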
Not all forms of AA use scaling methods (supersampling); there are many kinds of AA, such as FSAA, FXAA, and MSAA, and each can affect input lag differently. The internal implementation of AA is often a company trade secret, so it is frequently an input lag black box that needs to be measured end-to-end, such as with the high-speed-video LED mouse button test that Blur Busters pioneered (for GSYNC Preview #2).
Try cycling to a different AA mode and see how your input lag behaves.