RTSS Scanline Sync HOWTO

Everything about latency. Tips, testing methods, mouse lag, display lag, game engine lag, network lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more! Input Lag Articles on Blur Busters.
furfoot
Posts: 1
Joined: 01 Feb 2020, 05:00

Re: RTSS Scanline Sync HOWTO

Post by furfoot » 01 Feb 2020, 05:05

I think I'm missing something. These are my settings:

App detection low
Stealth off
Custom direct 3d off
Framerate limit 0
Scanline Sync value of 1
OSD On (Raster, Framebuffer)

My display is set to 120 Hz 1080p, with no frame limit or VSYNC in place.

I start Overwatch and my framerate doesn't seem to be limited to my refresh rate. I was under the impression that the Scanline Sync option actually limits frames to your refresh rate. What am I doing wrong?

Aldagar
Posts: 33
Joined: 12 Mar 2020, 14:27

Re: RTSS Scanline Sync HOWTO

Post by Aldagar » 30 Mar 2020, 15:47

Is it a good idea to use Scanline Sync in combination with VSync? I haven't found any answer about it.

Using Scanline Sync in demanding games that require high GPU usage makes the tearline jump erratically, but activating VSync solves the problem. I suppose this reintroduces input lag and stuttering when the frame rate falls below the monitor's refresh rate. However, I noticed something while experimenting with Afterburner and FRAPS (obsolete, but the only program I know of that measures frame times at every single frame and lets you make graphs): in most games, standalone VSync without any frame rate cap makes the frame times fluctuate heavily, but with Scanline Sync activated the frame times stabilize, and both Afterburner and FRAPS show a flat line graph.

This is true for all the games I tested. I haven't found any downside yet, and the image feels smooth with no stuttering or tearing, but I'm uncertain whether this finding is just a result of the method these programs use to measure frame times and whether it really provides any benefit.

Knowing this, what would be the optimal configuration to achieve perfect frame pacing with consistent input lag (no matter how high)?

Chief Blur Buster
Site Admin
Posts: 11648
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: RTSS Scanline Sync HOWTO

Post by Chief Blur Buster » 30 Mar 2020, 19:25

Aldagar wrote:
30 Mar 2020, 15:47
Is it a good idea to use Scanline Sync in combination with VSync? I haven't found any answer about it.
Yes, there are ways to make RTSS kind of work with VSYNC if you're careful about calibrating it.
It doesn't always work or interact well, but there are some combinations that do.

See first page of this thread.
viewtopic.php?f=10&t=4916
Aldagar wrote:
30 Mar 2020, 15:47
Using Scanline Sync in demanding games that require high GPU usage makes the tearline jump erratically, but activating VSync solves the problem. I suppose this reintroduces input lag and stuttering when the frame rate falls below the monitor's refresh rate. However, I noticed something while experimenting with Afterburner and FRAPS (obsolete, but the only program I know of that measures frame times at every single frame and lets you make graphs): in most games, standalone VSync without any frame rate cap makes the frame times fluctuate heavily, but with Scanline Sync activated the frame times stabilize, and both Afterburner and FRAPS show a flat line graph.
Yes, RTSS Scanline Sync can make inputread-to-presentation timing more consistent.

While RTSS Scanline Sync is designed for VSYNC OFF, you can follow the instructions on the first page of this thread to calibrate RTSS Scanline Sync with VSYNC ON / Fast Sync / Enhanced Sync / Triple Buffering / etc. Some weird interactions can happen; however, it can provide a "stutter-instead-of-a-tearline" experience for those momentary GPU spikes, if you prefer that.

What you need to do is calibrate RTSS Scanline Sync to put your tearline jitter margin just before the VBI, rather than inside the VBI. This eliminates your ability to gain "Quick Frame Transport" benefits (since VSYNC ON usually triggers a frame flip at the beginning of the VBI rather than at the end of the VBI), but it allows you to have a "stutter-instead-of-tearline" experience (only for refresh cycles that would have contained a tearline) while not sacrificing input lag for tear-free refresh cycles.
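
To put numbers on that margin, here's a minimal sketch, assuming 1080 visible lines of an 1125-line Vertical Total at 60Hz (the constants and the helper name are illustrative assumptions, not RTSS internals):

# Illustrative sketch (not RTSS internals): place the tearline jitter
# margin just before the VBI, and estimate what that margin costs in lag.
# Assumed timing: 1080 visible lines of an 1125-line Vertical Total at 60Hz.

VISIBLE = 1080                    # active scanlines
VTOTAL = 1125                     # visible + VBI scanlines
HZ = 60.0
LINE_MS = 1000.0 / HZ / VTOTAL    # scanout time per scanline (~0.0148 ms)

def pre_vbi_margin(jitter_lines, safety=2.0):
    margin = int(jitter_lines * safety)   # jitter band with safety factor
    sync_index = VISIBLE - margin         # park the tearline here
    return sync_index, margin * LINE_MS   # target index, added lag (ms)

index, lag_ms = pre_vbi_margin(jitter_lines=108)   # jitter ~10% of 1080
print(index, round(lag_ms, 1))                     # -> 864, ~3.2 ms

Here the VBI spans lines 1081-1125 (~0.7 ms of scanout time); parking the tearline at line 864 keeps the whole jitter band just short of the VBI, trading roughly 3 ms of lag for the stutter-instead-of-tearline behaviour described above.
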
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


Aldagar
Posts: 33
Joined: 12 Mar 2020, 14:27

Re: RTSS Scanline Sync HOWTO

Post by Aldagar » 31 Mar 2020, 10:23

Thanks for your detailed responses, Chief. I've carefully read this thread, but I still have some doubts. I think I understand how Scanline Sync and Quick Frame Transport operate, but I can't quite grasp how Scanline Sync interferes or works alongside VSync and Enhanced Sync.

If I'm not mistaken, VSync buffers frames and sends them when it receives a VBI signal, and Enhanced Sync is a triple-buffering method that doesn't limit frame rate. Interestingly, Enhanced Sync does not work with a 60 fps cap on my 60 Hz monitor (actually 59.949 Hz), but it does with Scanline Sync. The point is, in demanding (or maybe unoptimized) games in which the tearline tends to jump, Enhanced Sync causes REALLY BAD stuttering the closer the Scanline Sync index is to the VBI range. If I understand this correctly, it's because when the tearline falls inside the VBI it adds an extra frame of delay, hence the erratic stuttering. Would setting the scanline in the middle of the screen ensure stability at the cost of slightly more input lag?

With VSync I can't seem to reproduce this behaviour. The image feels smooth with no stuttering no matter the scanline index, but the frame times stop fluctuating with Scanline Sync ON. Is it because it buffers frames, so it has padding? So, considering that I prioritize stability and frame pacing, but a reduction in input lag is welcome, which method would you recommend: VSync or Enhanced Sync? And what would be the optimal scanline index? For context, my monitor has a large Vertical Total of 1481 at a resolution of 1440p.

Sorry if I'm asking too many questions. I've been quarantined for more than two weeks now due to COVID-19, and I decided to investigate sync technologies.

Chief Blur Buster
Site Admin
Posts: 11648
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: RTSS Scanline Sync HOWTO

Post by Chief Blur Buster » 31 Mar 2020, 17:50

Aldagar wrote:
31 Mar 2020, 10:23
If I'm not mistaken, VSync buffers frames and sends them when it receives a VBI signal
Correct.

Default VSYNC ON behaviour for most graphics drivers is that the frame buffer flip occurs right at the end of the current refresh cycle's scanout, at the beginning of the VBI.

During VSYNC ON at full frame rates, most game software will "block" (the software pauses for a few milliseconds = input lag) when the software presents a frame.

Note: Computer programmers use commands like Present() or glXSwapBuffers() to transfer the frame from the GPU's memory to the front buffer (the buffer to be delivered to the GPU output).

The behaviour of VSYNC ON is just like pause-inputread-render-display-pause-inputread-render-display-pause-etc.

So the game is effectively pausing 60 times a second at 60fps. You don't see those micro-pauses (input lag) caused by the blocking behaviour of frame presentation, but they can add up to a frame of input latency. Knowing this helps you understand the concept of "inputdelay", which is what end-of-VBI presentation aims to address.

In most game engine workflows, this is kind of what it does, but some game engines may behave differently or not co-operate well with this.
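
Here's a toy sketch of that pause-inputread-render-display cycle (hypothetical function names; present() stands in for Present()/glXSwapBuffers() and is assumed to block under VSYNC ON):

import time

def vsync_on_loop(read_input, render_frame, present):
    # Toy model of the pause-inputread-render-display cycle. 'present'
    # is assumed to block (sleep) until the next VBI whenever the flip
    # queue is full -- that hidden sleep is the micro-pause (input lag)
    # described above.
    while True:
        t0 = time.perf_counter()
        inputs = read_input()            # inputread
        frame = render_frame(inputs)     # render time varies per frame
        present(frame)                   # may BLOCK until the next VBI
        frametime_ms = (time.perf_counter() - t0) * 1000.0
        # Roughly what an overlay stopwatches as "frametime":
        # inputread-to-presentation, not input-to-photons.
        print(f"{frametime_ms:.1f} ms")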

In some statistics displays, the stopwatch starts when the inputread occurs and ends when the frame presents. That's your "frametime", because it's dominantly render time, and render time VARIES a lot. Sometimes the stopwatch stops right after the frame presents. VSYNC ON can sometimes block, sometimes block longer, and sometimes not block at all (no pause), depending on how much the frame buffer queue has filled up. If there's a queue of frames waiting to be delivered to the display, the software is forced to do a micro-pause (which can add input lag) -- often between the input read and the display of the frame. So there can be a variable-length micro-pause there.
Aldagar wrote:
31 Mar 2020, 10:23
Enhanced Sync is a triple-buffering method that doesn't limit frame rate. Interestingly, Enhanced Sync does not work with a 60 fps cap on my 60 Hz monitor (actually 59.949 Hz), but it does with Scanline Sync.
Enhanced Sync has algorithms that sometimes have nasty interactions unless you pre-calibrate with VSYNC OFF before you re-enable Enhanced Sync. Even so, it's not always perfect.

VSYNC OFF and VSYNC ON are easier to explain, while algorithms like Enhanced Sync are sometimes difficult to explain.
Aldagar wrote:
31 Mar 2020, 10:23
The point is, in demanding (or maybe unoptimized) games in which the tearline tends to jump, Enhanced Sync causes REALLY BAD stuttering the closer the Scanline Sync index is to the VBI range. If I understand this correctly, it's because when the tearline falls inside the VBI it adds an extra frame of delay, hence the erratic stuttering.
Yes, because if the figurative "tearline" is inside the VBI, it is already too late for the existing driver VSYNC ON behaviour to flip, because it always flips at the beginning of the VBI. This is an annoying behaviour of graphics drivers, and fewer than 10% of people at NVIDIA and AMD understand this problem (think of interns and great engineers who understand a lot but aren't familiar with the "raster interrupt" or "beam racing" science the way Blur Busters is). People like Blur Busters sometimes have to explain to them how Quick Frame Transport works.
Aldagar wrote:
31 Mar 2020, 10:23
Would setting the scanline in the middle of the screen ensure stability at the cost of slightly more input lag?
Yup. About half a refresh cycle of added input lag.
But you can custom-optimize this, although optimizing can sometimes be game-dependent (GPU-load-dependent):

1. Use VSYNC OFF initially.
2. Determine how big your tearline jitter amplitude is.
3. Suppose your tearline jitter amplitude is about 10% of screen height.
4. Set your RTSS Scanline Sync number to roughly twice the jitter amplitude above the bottom edge of the screen (20%).
5. Now re-enable your preferred non-VSYNC-OFF sync technology (such as VSYNC ON, Enhanced Sync, or Fast Sync).
6. Your stutters SHOULD be gone.
7. Your latency will be that margin (approximately 20% of a refresh cycle -- i.e. about 3ms of added lag at 60Hz, but still at least 13ms less lag than ordinary VSYNC ON). See the back-of-envelope sketch after this list.
8. Your safety jitter-margin is your latency: bigger jitter margin, slightly higher latency.
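
As a quick sanity check of steps 3-7 (a back-of-envelope sketch; 60Hz assumed, numbers illustrative):

# Back-of-envelope for steps 3-7 above (60Hz assumed; illustrative only).
HZ = 60.0
refresh_ms = 1000.0 / HZ              # ~16.7 ms per refresh cycle

jitter = 0.10                         # step 3: jitter ~10% of screen height
margin = 2 * jitter                   # step 4: margin = twice the jitter
margin_lag_ms = margin * refresh_ms   # step 7: lag paid for the margin

# Ordinary VSYNC ON can queue roughly a full refresh cycle of lag;
# the margin approach only pays for the margin itself.
print(round(margin_lag_ms, 1), round(refresh_ms - margin_lag_ms, 1))
# -> ~3.3 ms added, ~13.3 ms less than a full queued refresh
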
Aldagar wrote:
31 Mar 2020, 10:23
With VSync I can't seem to reproduce this behaviour. The image feels smooth with no stuttering no matter the scanline index, but the frame times stop fluctuating with Scanline Sync ON. Is it because it buffers frames, so it has padding?
Glassfloor frametimes with Scanline Sync are caused by creating a fixed time between inputread and pageflip, regardless of GPU rendertime.

VSYNC ON:
Some games manage to achieve glassfloor frametimes with VSYNC ON, but not all of them. The varying rendertime creates varying latency between inputread and frame presentation, occasionally creating inconsistent latency despite smooth motion. Other times, the latency is actually consistent to the human but the logging is out of sync (not recording numbers that are close to input-to-photons). Even RTSS is unable to accurately record inputread-to-photontime without a photodiode sensor. There's only so much knowledge RTSS can pull in; it can't "see" beyond the confines of the computer.

RTSS Scanline Sync:
Frame presentation becomes re-timed to RTSS timing, and RTSS has 100% full knowledge of frame presentation time during Scanline Sync (unlike the black-box nature of a graphics driver's VSYNC ON algorithm, which can add lag not measurable by RTSS), so you get the potential of a more exact time between input read and frame presentation. Other times, it is just an artifact of how RTSS stopwatching behaves during Scanline Sync versus letting the graphics driver use its own sync technology.

The bottom line is that RTSS frametimes are not always equal to:
-- frametime:photontime relative sync
-- inputread:photontime relative sync
These can be three completely independent volatilities, unfortunately, because RTSS can't always see deeper into the graphics driver pipeline, nor always know when the game did its own inputread. So RTSS glassfloorness may still hide inputread non-glassfloorness, and RTSS non-glassfloorness may still mean inputread glassfloorness. This is because some sync technologies de-jitter for you (e.g. VSYNC ON), other sync technologies amplify jitters (e.g. VSYNC OFF), and changing modes on the display (VRR, strobing) may reduce/amplify the dissonances between these three volatilities.

But, Wait! There's One More Thing (Apple Style)
-- Latency can feel different for the top edge of the screen than the bottom edge -- or vice versa -- because of gametime:photontime differences on the top edge versus the bottom edge -- see High Speed Videos of Scanout. It's VERY weird.
-- This is sometimes noticed during strobed VSYNC OFF at lower refresh rates, which is why I prefer strobed VSYNC ON (glassfloor top-to-bottom), or strobed with RTSS Scanline Sync.
-- A single-number RTSS reading will not tell you the input lag of each individual pixel on the screen and/or their lag offsets relative to screen center.

The input lag of different pixels in different parts of the screen can differ because of scanout -- see www.blurbusters.com/scanout
The latency gradients of sync technology combinations are as follows (modeled in the toy sketch after this list):
VSYNC ON non-strobed creates TOP > CENTER > BOTTOM
VSYNC OFF non-strobed creates TOP = CENTER = BOTTOM (easy glassfloor input-to-photons)
VSYNC ON strobed creates TOP = CENTER = BOTTOM (easy glassfloor input-to-photons, but with fixed higher lag)
VSYNC OFF strobed creates TOP < CENTER < BOTTOM (but highly volatile; requires ~100 measurement samples to notice)
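
A toy way to model those four gradients, reading "X > Y" as "X is fresher (lower lag) than Y" (that reading, the mode names, and the constants are assumptions for illustration; real panels add fixed processing offsets):

# Toy model of the four latency gradients above. y = screen height
# fraction (0.0 = top, 1.0 = bottom); result = relative lag in units
# of one refresh cycle. Illustrative only -- not measured data.

def relative_lag(y, mode):
    if mode == "vsync_on":            # one inputread, sequential scanout
        return y                      # top freshest
    if mode == "vsync_off":           # frameslices delivered just-in-time
        return 0.0                    # flat, easy glassfloor
    if mode == "vsync_on_strobed":    # photons wait for the strobe flash
        return 1.0                    # flat, but a fixed refresh of lag
    if mode == "vsync_off_strobed":   # newest frameslices near the bottom
        return 1.0 - y                # bottom freshest
    raise ValueError(mode)

for mode in ("vsync_on", "vsync_off", "vsync_on_strobed", "vsync_off_strobed"):
    print(mode, [relative_lag(y, mode) for y in (0.0, 0.5, 1.0)])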

Also, on certain panels (like BenQ ZOWIE monitors), if you adjust the strobe phase to screen middle, you can "split" the latency gradient, as seen in this forum thread:
BenQ Strobe Calibration & Input Lag Gradient Behaviours
This allows you to zero-out the lag of the screen middle during strobed modes, at the cost of higher lag for certain parts of the screen, and a bit of a crosstalk bar underneath your low-lag area.
Aldagar wrote:
31 Mar 2020, 10:23
Sorry if I'm asking too many questions. I've been quarantined for more than two weeks now due to COVID-19, and I decided to investigate sync technologies.
It's quite a fascinating science.

I may actually split this thread to cover advanced "Area 51: Display Science, Research and Engineering" aspects of inputreadtime:photontime dissonances not measurable by RTSS.

Even things like power management can interact with VRR to create volatility, and we've seen 0.5ms stutter (dissonances in gametime:photontime) become human-visible; see this thread: 0.5ms Stutters Are Human Visible In Some Situations In The Refresh Rate Race To Retina Refresh Rates

In the Blur Busters "Milliseconds Matter" science, we're fascinated by how sub-milliseconds slowly begin to reveal themselves unexpectedly as refresh rates go up (144Hz -> 240Hz -> 360Hz -> 480Hz), in the Vicious Cycle Effect. This is why ASUS now has a roadmap to 1000Hz displays.

P.S. We're veering into advanced concepts, so we might split off this thread.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


Aldagar
Posts: 33
Joined: 12 Mar 2020, 14:27

Re: RTSS Scanline Sync HOWTO

Post by Aldagar » 01 Apr 2020, 14:18

Very insightful read. I'm learning a lot, but now I have even more things to worry about! The concept of inputreadtime:photontime divergence is really interesting. I had read about it before, but now I understand its importance.

On another note, I managed to make Enhanced Sync work with a 60 fps cap. I just had to enable it in the per-game settings instead of the global settings. However, I realized it's not a good idea, since an fps cap close to the monitor's refresh rate generates a rolling tearline, and that causes a stuttering effect in combination with Enhanced Sync, similar to what imprecise Scanline Sync does.

Regarding Scanline Sync, I'm wondering if an imprecise scanline with high GPU load causes a variation in input lag in combination with other sync methods. I guess it's not a big deal if it does, since it would mean variations of a few milliseconds, but I want to understand exactly how it works. I also still have trouble understanding how it works alongside VSYNC and the effect it has on input lag, since, if I'm not mistaken, Scanline Sync has to buffer at least one frame, but traditional VSYNC already has a frame buffer queue, so I'm not sure if it interferes.

For the moment I'm using Scanline Sync + Enhanced Sync, and I think it's the best method for games in which you can't hold a stable 60 fps 100% of the time even with high-end hardware (think unoptimized game engines or bad console ports), since both features have a timeout parameter to disengage when the frame rate falls below the monitor's refresh rate, so you get tearing instead of huge stutters as you would with VSYNC ON.

I wonder how consoles get around these problems. From what I've seen from Digital Foundry, many games fluctuate below 60 fps but suffer from neither tearing nor stuttering. I'm guessing they use some kind of triple-buffering method.

P.S. Regarding the split of this thread, please do as you consider appropriate.

Chief Blur Buster
Site Admin
Posts: 11648
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: RTSS Scanline Sync HOWTO

Post by Chief Blur Buster » 01 Apr 2020, 16:00

Aldagar wrote:
01 Apr 2020, 14:18
I wonder how consoles get around these problems. From what I've seen from Digital Foundry, many games fluctuate below 60 fps but suffer from neither tearing nor stuttering. I'm guessing they use some kind of triple-buffering method.
They still stuttered with frame drops! I've seen every single frame-drop stutter since the 1990s, even with triple buffering. It helped; it just didn't eliminate them. Not everyone was as picky about the less-annoying stutters.

Stutters less visible at lower resolutions
But it was much less visible at 320x200 resolution, and at only 30fps.

Single frame-drop stutters less visible at 30fps than 60fps
Also, 30fps-dropping-to-29fps is less visible than 60fps-dropping-to-59fps on a CRT tube, because you get a sudden brief doublestrobe-to-triplestrobe (less jarring) instead of a sudden brief singlestrobe-to-doublestrobe (jarring) -- see the diagram of duplicate images on impulse displays.
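
The strobe arithmetic behind that (a sketch; a 60Hz CRT-style impulse display assumed):

# Strobes per unique frame on a 60Hz impulse display (CRT assumed):
# a dropped frame repeats one image, adding one extra flash of it.
HZ = 60
for fps in (60, 30):
    normal = HZ // fps        # flashes per unique frame
    on_drop = normal + 1      # the repeated frame gets one extra flash
    print(f"{fps}fps: {normal} -> {on_drop} flashes "
          f"({on_drop / normal:.1f}x step)")
# 60fps: 1 -> 2 (2.0x, jarring); 30fps: 2 -> 3 (1.5x, less jarring)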

30fps-vs-60fps stutter is much more visible today on bigger, higher-resolution screens than on yesterday's 14-inch CRTs.


<Advanced Science Talk>
One part of the problem is the Vicious Cycle Effect that is currently amplifying the visibility of stutters. Higher resolutions mean a single stutter jumps over more pixels. Bigger displays mean a wider FOV, which amplifies the visibility of stutter-jumps. Brighter HDR amplifies stutter visibility. Increasingly clear motion (higher refresh rates or better strobing) amplifies stutter visibility even further.

They all amplify each other in a Vicious Cycle Effect. Retina-FOV (180+ degree), retina-resolution displays (16K+ if using 180+ degree FOV) can require quintuple-digit refresh rates (>10,000Hz) to reach retina refresh rates (Hz maxed out to human perception), which is why refresh rate limitations are hugely amplified in virtual reality -- see Stroboscopic Effect of Finite Frame Rate Displays. Motion blur reduction modes help (like LightBoost and ULMB strobing), but they amplify the visibility of stutters no longer hidden by motion blur. They also add unnatural flicker and stroboscopic effects. But fundamentally, strobe backlights are a humankind band-aid, because real life doesn't strobe. You have to emulate analog motion using ultrafine refresh cycles (ultra-Hz) to eliminate motion blur strobelessly, so that 100% of all motion blur is human-natural and zero is induced by the display itself. Emulating a Star Trek Holodeck successfully enough to pass a blind test (can't tell real life apart from VR) is very hampered by the Vicious Cycle Effect, too.

Blur or stutters become fainter again. Then something changes (such as a FOV increase, a resolution increase, a refresh rate increase, or an HDR increase), and the artifacts or stutters become more visible again. You keep playing whac-a-mole by improving the other variables, but they still Vicious-Cycle into each other. See the conundrum?
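
To see the resolution part of that cycle concretely, here's a tiny sketch with assumed numbers (a pan speed of one screen width per second; purely illustrative):

# Assumed example: a horizontal pan crossing the screen width in one
# second at 60fps. One dropped frame doubles the motion step.
FPS = 60
for width_px in (1920, 3840, 7680):
    step = width_px / FPS            # normal per-frame motion step
    print(f"{width_px}px wide: {step:.0f}px normal step, "
          f"{2 * step:.0f}px jump on a dropped frame")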

Researchers keep writing very useful papers, whether the great Color MPRT paper (which still sometimes neglects human visibility beyond the below-10% / above-90% GtG Measurement Cutoff Threshold, as we've noticed artifacts completely missed in GtG/MPRT tests -- see Red Phosphor Interferes With Strobing) or the commonly redditted 255Hz Fighter Pilot Test (apples vs oranges -- it was a limited-brightness single-frame identification test, not a continuous-blur test, nor a phantom-array test).

However, we back up and focus on the whole picture: what are ALL the weak links in a refresh rate? We think of refresh rates differently, like geometry versus physics. The great academic writers are focussed on physics, but I think of refresh rates as a temporal geometry problem in my human brain (much like a photographic memory, but for successfully predicting refresh rate artifacts), and come up with concepts missed by many.

There are many situations where you tell me a refreshing algorithm and I correctly predict the resulting artifacts long before a display prototype is built. This has happened repeatedly since 2012, all the way back to the "LCD can't get CRT motion clarity" days -- until I mic-dropped with all the LightBoost talk back in the day, and with our good understanding of strobe-backlight artifacts (Electronics Hacking: Creating a Strobe Backlight (2012), Strobe Crosstalk FAQ, High Speed Videos of LCD Refreshing, and Red Phosphor).

That's why we have Blur Busters Law: The Amazing Journey To Future 1000 Hz Displays, which has converged into practically unanimous agreement among many researchers and has been acclaimed by reputable organizations (including NVIDIA) who enjoy using some of my articles as easy Coles Notes equivalents of complex technical scientific papers. Simple Blur Busters 101 stuff to my "Refresh Rate Einstein" brain matter -- but difficult concepts for a lot of people to grasp ("Why the hell do we need 1000 Hz?"). We have even taught new things to some display engineers who were 90% correct but missed the 10% easily explained by Blur Busters, and they have used our skills to properly complete their papers. ;)

Not everyone is picky about stutter, but others are, and when stutters are amplified, that can be a big problem. There are people who get nausea playing any FPS unless I've custom-tweaked a gaming monitor to their specific vision sensitivity (one they didn't know about -- some have motion blur eyestrain, others have stutter nausea, more have blue-light sensitivity, etc. Everybody sees differently). But there are also many who will never be able to play an FPS game or VR game nausea-free until we're further along in the refresh rate race. It won't be five-sigma population comfort for a long time. We'll have a century (or more) of whac-a-mole.
</Advanced Science Talk>


Now you know why I'm known as the Refresh Rate Einstein!
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


whitestar
Posts: 112
Joined: 12 May 2014, 06:11

Re: RTSS Scanline Sync HOWTO

Post by whitestar » 12 Apr 2020, 15:20

Hi all :)
I'm trying to use this feature in Assetto Corsa Competizione (ACC), the GT3 simulator that uses Unreal Engine 4.

However, there is a problem. The hotkeys for changing view height in that game are the same ones Scanline Sync uses, i.e. CTRL+SHIFT+up/down arrows. I believe they are hard-coded and can't be changed, and they seem to override the ones in RTSS.

So is there any way around this? Is there some way I can move the tearline without those hotkeys, or alternatively change the hotkeys?

whitestar
Posts: 112
Joined: 12 May 2014, 06:11

Re: RTSS Scanline Sync HOWTO

Post by whitestar » 13 Apr 2020, 10:44

Never mind, I got it to work now. The hotkeys weren't actually hard-coded into ACC.

This is brilliant! Works really well.

But... one problem: when I exit the game, the position of the scanline isn't saved. This means I have to repeat the procedure of hiding the tearline every time I start the game.

Is this expected behaviour? If so, this isn't exactly an ideal solution.

EDIT: I should give some info here:
I'm using 3x BenQ XL2720Z in surround mode with Blur Reduction (strobing) active @ 60Hz. I'm using RTSS version 7.2.3. I set the framerate limit to 0 and Scanline Sync to 60. I'm also using a Vertical Total of 1350.
In-game I get a tear in the upper part of the screen. Using CTRL+SHIFT+up arrow, I hold the keys down until the line goes off the top of the screen and appears again at the bottom. I don't mind it staying there (at the bottom) because that's where the inside of the car cockpit is, which means it hardly moves at all and any tear is almost unnoticeable.

But like I said, I have to do it every time I enter the game, which makes it pretty much useless.

Appreciate if anyone can help. :)

Chief Blur Buster
Site Admin
Posts: 11648
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: RTSS Scanline Sync HOWTO

Post by Chief Blur Buster » 13 Apr 2020, 12:15

whitestar wrote:
13 Apr 2020, 10:44
Never mind, I got it to work now. The hotkeys weren't actually hard-coded into ACC.

This is brilliant! Works really well.

But... one problem: when I exit the game, the position of the scanline isn't saved. This means I have to repeat the procedure of hiding the tearline every time I start the game.

Is this expected behaviour? If so, this isn't exactly an ideal solution.
You can save a default scanline offset into the configuration. After adjusting to your favourite tearline-hiding position, see what the scanline offset became, then type that number into the RTSS configuration.
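
For example, something along these lines in the game's profile file under the RTSS Profiles folder (a hypothetical snippet -- the section and key names here are NOT verified and vary by RTSS version; copy whatever your own RTSS writes after a hotkey adjustment rather than trusting the names below):

; Hypothetical RTSS profile snippet (key names not verified --
; check the file RTSS itself saves after a hotkey adjustment).
[Framerate]
Limit=0
SyncScanline=1340   ; example value: tearline parked near the bottom edge
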
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

