Hi, I've had the XL2540 for almost a year now. I use two main refresh rates: 240Hz with the default VT (blur reduction ON), and 180Hz with 1500 VT. Until now I've been playing CS:GO at 240Hz, and today Battalion 1944 came out; its engine is UE4 (which caps at 200 FPS). My question is this: will I get frame stutter or something like that when the FPS the game produces is lower than my refresh rate?
(e.g. keep using 240Hz with the 200 FPS in this game), because I don't want to switch every time just for this game.
And also, a quick question on a slightly different subject: I've heard on this forum and other sites online that GPU scaling adds input lag? (I also heard it was related to SPECIFIC driver versions, but I'm not quite sure.) And is display scaling actually superior to GPU scaling?
It's just that only with GPU scaling are the black bars in CS:GO scaled correctly, so the game requires me to use GPU scaling if I want 240Hz in this game.
Is it still relevant and true to this day?
A question concerning FPS & refresh rate relation
Re: A question concerning FPS & refresh rate relation
b0t wrote:And also, a quick question on a slightly different subject: I've heard on this forum and other sites online that GPU scaling adds input lag? (I also heard it was related to SPECIFIC driver versions, but I'm not quite sure.) And is display scaling actually superior to GPU scaling? It's just that only with GPU scaling are the black bars in CS:GO scaled correctly, so the game requires me to use GPU scaling if I want 240Hz in this game. Is it still relevant and true to this day?
Most online guides recommend using display scaling.
It was relevant at least ~2 years ago, but now, playing at 300fps, I'm not sure I would be able to tell the difference.
Two years ago I had the feeling that GPU scaling added around a full frame of input delay. Maybe it's possible to create an environment (e.g. FPS-limited) where the difference, if it still exists, would become more obvious.
Personally I still set my scaling to "No scaling - Display", just because I don't want any scaling.
Maybe you should ask yourself a question: what's the point of scaling for me? Is your PC not powerful enough to drive CS:GO at 240fps at 1080p? Or is it just a habit of playing 1024x768?
Honestly, I think it's sometimes necessary to review your old habits.
Re: A question concerning FPS & refresh rate relation
k2viper wrote:Most online guides recommend using display scaling. It was relevant at least ~2 years ago, but now, playing at 300fps, I'm not sure I would be able to tell the difference. Personally I still set my scaling to "No scaling - Display", just because I don't want any scaling. Maybe you should ask yourself a question: what's the point of scaling for me? Is your PC not powerful enough to drive CS:GO at 240fps at 1080p? Or is it just a habit of playing 1024x768?
Appreciate the time you took to post this reply, but I already wrote that I need GPU scaling (aspect ratio) to get black bars at 240Hz in CS:GO (4:3 resolution). I asked whether GPU scaling adding input lag IS STILL RELEVANT TO THIS DAY (with current NV drivers and such)?
Re: A question concerning FPS & refresh rate relation
I think it's realistic to create an environment to test it.
Switch to 60Hz (to make any possible difference more obvious), but it's better to use an identical full-screen 1080p game in both the GPU-scaling and the display/no-scaling scenarios. Reboot between tests just to make sure.
An experienced CS:GO player should be able to feel an additional frame of delay at 60Hz, if it still exists.
Sorry for the inconvenience, but does 4:3 give any competitive advantage over 16:9? AFAIK CS:GO fully supports the 16:9 aspect ratio; correct me if I'm wrong.
Last edited by k2viper on 02 Feb 2018, 08:48, edited 1 time in total.
Re: A question concerning FPS & refresh rate relation
https://www.reddit.com/r/nvidia/comment ... input_lag/
The theory in this thread makes a statement. I think that guy argues it right: speaking of frame buffers at low and high resolutions, the scaling really should be done at the driver level, and it looks like it really should add latency.
Any computational task adds latency. It doesn't matter if it's a post-process or not: if it increases rendering time, it adds latency.
If GPU scaling copies from one (lower-resolution) frame buffer to another (higher-resolution) frame buffer while scaling the image, that would add one refresh of latency (not a frame of latency; don't mix the two up).
Display scaling can be done in real time by scaling as the image scans out to the display (as lines are output); it might need just a one-scanline lookahead buffer.
1080p 120Hz is a 135 kHz horizontal scan rate, so one row of pixels is 1/135,000th of a second: less than 10 microseconds. GPU scaling would be slower than that, not that it would be easy to notice the difference.
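The scan-rate arithmetic above can be sketched in a few lines (the 1125-line vertical total is my assumption for a standard 1080p timing; it is not stated in the post):

```python
# Reproduce the post's numbers: horizontal scan rate and per-line time
# for 1080p at 120 Hz, assuming a typical 1125-line vertical total
# (1080 active lines + blanking).
vertical_total = 1125
refresh_hz = 120

scan_rate_hz = vertical_total * refresh_hz   # lines scanned per second
line_time_us = 1e6 / scan_rate_hz            # microseconds per scanline

print(scan_rate_hz)              # 135000 -> the 135 kHz quoted above
print(round(line_time_us, 2))    # 7.41 -> well under 10 microseconds
```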
Re: A question concerning FPS & refresh rate relation
k2viper wrote:https://www.reddit.com/r/nvidia/comment ... input_lag/
The theory in this thread makes a statement. I think that guy argues it right: speaking of frame buffers at low and high resolutions, the scaling really should be done at the driver level, and it looks like it really should add latency.
Any computational task adds latency. It doesn't matter if it's a post-process or not: if it increases rendering time, it adds latency.
If GPU scaling copies from one (lower-resolution) frame buffer to another (higher-resolution) frame buffer while scaling the image, that would add one refresh of latency (not a frame of latency; don't mix the two up).
Display scaling can be done in real time by scaling as the image scans out to the display (as lines are output); it might need just a one-scanline lookahead buffer.
1080p 120Hz is a 135 kHz horizontal scan rate, so one row of pixels is 1/135,000th of a second: less than 10 microseconds. GPU scaling would be slower than that, not that it would be easy to notice the difference.
OK, but I don't get the desired scaling in-game if I DON'T use GPU scaling, so...
My bigger question was about using 240Hz when the max FPS in Battalion 1944 is 200.
Tearing? Stutter? Any apparent drawbacks?
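A quick sketch of why 200 FPS on a 240Hz display paces unevenly (my own illustration, assuming perfectly regular 1/200s frame times and synchronized delivery; real frame times jitter):

```python
# Which frame is on screen at each 240 Hz refresh when the game renders
# a new frame exactly every 1/200 s. The newest frame finished before
# refresh j starts is floor(j * FPS / HZ).
FPS, HZ = 200, 240

def frame_shown(refresh_index, fps=FPS, hz=HZ):
    """Index of the newest completed frame at this refresh."""
    return (refresh_index * fps) // hz

shown = [frame_shown(j) for j in range(12)]   # first 12 refreshes (50 ms)
repeats = sum(1 for a, b in zip(shown, shown[1:]) if a == b)

print(shown)     # [0, 0, 1, 2, 3, 4, 5, 5, 6, 7, 8, 9]
print(repeats)   # 2 -> one duplicated frame every 6 refreshes
```

So with FPS below the refresh rate, one refresh in every six re-shows the previous frame, which is the micro-stutter the question is about; with blur reduction ON the repeated frame is more visible than it would be on a sample-and-hold display.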
- Chief Blur Buster
Re: A question concerning FPS & refresh rate relation
This was true before.
But...
I've now seen it happen both ways.
GPU scaling can now be more lagless than monitor scaling.
The best GPUs can scale faster than the slower-scaling monitors.
Scaling can be done in a scanline-buffered way; no framebuffer delay needed.
In theory, either side (GPU or monitor) can do line-buffered scaling equally well.
It's a matter of how the manufacturer implements scaling.
Right now, if you're using a 1000-series GPU, it's okay to go with GPU scaling.
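To make "scanline-buffered, no framebuffer needed" concrete, here is a minimal nearest-neighbor sketch (my own toy illustration, not how any particular GPU or scaler actually implements it): each output line is derived from exactly one input line, so only a single scanline ever needs buffering.

```python
# Line-buffered upscaling: produce output scanlines one at a time.
# Only one source scanline is held in memory per output line, so no
# full-frame buffer (and no frame of added latency) is required.
def scale_line(src_line, out_width):
    """Nearest-neighbor horizontal scale of a single scanline."""
    in_width = len(src_line)
    return [src_line[x * in_width // out_width] for x in range(out_width)]

def scale_frame_streaming(src_lines, out_width, out_height):
    in_height = len(src_lines)
    for y in range(out_height):
        # The single line buffer: the one source line this output needs.
        line_buffer = src_lines[y * in_height // out_height]
        yield scale_line(line_buffer, out_width)

# A 2x3 "image" streamed up to 4x6, one output line at a time:
small = [[1, 2], [3, 4], [5, 6]]
big = list(scale_frame_streaming(small, out_width=4, out_height=6))
print(big[0])   # [1, 1, 2, 2]
```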
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter
Re: A question concerning FPS & refresh rate relation
Chief Blur Buster wrote:This was true before. But... I've now seen it happen both ways. GPU scaling can now be more lagless than monitor scaling. The best GPUs can scale faster than the slower-scaling monitors. Scaling can be done in a scanline-buffered way; no framebuffer delay needed. In theory, either side (GPU or monitor) can do line-buffered scaling equally well. It's a matter of how the manufacturer implements scaling. Right now, if you're using a 1000-series GPU, it's okay to go with GPU scaling.
Thanks for replying, Chief. What about a GTX 980 alongside an XL2540 (v2 firmware)? What's better to scale with in theory? The display?
Re: A question concerning FPS & refresh rate relation
k2viper wrote:Most online guides recommend using display scaling.
Most online guides don't have a clue what they're talking about.
They just tell people to enable display scaling in the driver control panel to get display scaling. Except it doesn't work that way. The GPU driver will not allow display scaling for modes which are not listed in the display's EDID. And guess what? Displays don't list high-refresh-rate versions of 800x600 or 1024x768 or anything like that in their EDID. So you get GPU scaling *anyway*.
If you want display scaling, you need to override your EDID using CRU.
The internet is full of clueless guides. Furthermore, there aren't even any latency tests out there to suggest GPU scaling has a latency problem.
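Since the argument hinges on what the EDID actually lists, here is a hedged sketch of decoding one 18-byte EDID Detailed Timing Descriptor (the sample bytes are a hand-built 1920x1080@60 descriptor for illustration, not a dump from a real XL2540; tools like CRU do this decoding for you):

```python
# Decode one EDID Detailed Timing Descriptor (18 bytes) to see which
# mode a display advertises: resolution and refresh rate.
def parse_dtd(d):
    pixel_clock_hz = int.from_bytes(d[0:2], "little") * 10_000
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    h_blank  = d[3] | ((d[4] & 0x0F) << 8)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    v_blank  = d[6] | ((d[7] & 0x0F) << 8)
    refresh = pixel_clock_hz / ((h_active + h_blank) * (v_active + v_blank))
    return h_active, v_active, round(refresh, 1)

# 148.5 MHz pixel clock, 1920+280 x 1080+45 totals -> 1080p60.
dtd = bytes([0x02, 0x3A, 0x80, 0x18, 0x71, 0x38, 0x2D, 0x40] + [0] * 10)
print(parse_dtd(dtd))   # (1920, 1080, 60.0)
```

If a mode like 1024x768@240 never appears among these descriptors, the driver falls back to GPU scaling regardless of the control-panel setting, which is exactly the point made above.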
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.
Re: A question concerning FPS & refresh rate relation
RealNC wrote:Most online guides don't have a clue what they're talking about. They just tell people to enable display scaling in the driver control panel to get display scaling. Except it doesn't work that way. The GPU driver will not allow display scaling for modes which are not listed in the display's EDID. So you get GPU scaling *anyway*. If you want display scaling, you need to override your EDID using CRU. The internet is full of clueless guides.
Clueless is the best word for it.