My current high-speed setup can't test for frametime performance, but I can get an input lag baseline with relatively few samples. I recently did a couple of (very) casual tests comparing RTSS against Nvidia's NULL auto-limiter in Apex, and they appeared to have the same input lag increase over the in-game limiter in that game:
The scenario in the above link isolates the input lag difference between limiters by using G-SYNC + V-SYNC + an FPS limit just within the G-SYNC range. Since every frame update first appears at the top of the scanout, I capture the top-left of the display with my high-speed camera and watch for the change in a given vertical on-screen element (in this case, a pole) after a strafe input (mapped to left click).
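To make the counting step concrete, here's a minimal sketch of the idea in Python. This is purely illustrative (I don't use a script for this; the function name, pixel threshold, and synthetic data are all assumptions): given high-speed camera frames as brightness arrays and the camera frame where the click registers, find the first frame where the watched region changes and convert the gap to milliseconds.

```python
# Hypothetical sketch of frame-counting input lag measurement.
# All names, thresholds, and data here are illustrative assumptions,
# not the actual test setup.

def measure_input_lag(frames, click_frame, region, camera_fps=1000, threshold=10):
    """Return input lag in ms between the click and the first visible change.

    frames      -- list of 2D lists of pixel brightness (one per camera frame)
    click_frame -- camera frame index where the input was registered
    region      -- (row, col) of the pixel being watched (e.g. edge of the pole)
    """
    r, c = region
    baseline = frames[click_frame][r][c]
    for i in range(click_frame + 1, len(frames)):
        # First camera frame where the watched pixel deviates from baseline
        if abs(frames[i][r][c] - baseline) > threshold:
            return (i - click_frame) * 1000.0 / camera_fps
    return None  # no change observed within the recording

# Synthetic example: a 1000 fps recording where the watched pixel
# flips from dark (20) to bright (200) at camera frame 25,
# with the click registered at camera frame 5.
frames = [[[20]] if i < 25 else [[200]] for i in range(40)]
lag_ms = measure_input_lag(frames, click_frame=5, region=(0, 0))
print(lag_ms)  # 20 camera frames at 1000 fps -> 20.0 ms
```

The real work is obviously in the capture and in repeating this across samples, but the per-sample math is just (frames between click and change) divided by camera FPS.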
I only did 10 samples per limiter for these (compared to the 40 per scenario I did for my article), but that's enough to tell how many more frames of input lag an external limiter adds over the internal limiter (without an insane amount of work on my part).
Honestly, if your limiter's method is comparable to RTSS's (I haven't looked into it myself), then it will likely perform about the same: up to 1 frame more input lag than an in-game limiter.
External limiters typically add anywhere up to 1-2 frames of input lag over an in-game limiter, with the worst I've tested for G-SYNC (where input lag is concerned) being the legacy Nvidia Inspector driver-level limiter, at up to 3 1/2 frames more than the in-game limiter.
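Since these figures are in frames, the actual millisecond cost depends on the FPS cap in use. A quick back-of-the-envelope conversion (the 141 FPS cap here is just an example, as would be used on a 144Hz display):

```python
# Convert "frames of added input lag" to milliseconds at a given FPS cap.
# The caps and frame counts below are illustrative, not measured values
# for any specific limiter version.

def frames_to_ms(frames_of_lag, fps_cap):
    # One frame at the cap lasts 1000 / fps_cap milliseconds
    return frames_of_lag * 1000.0 / fps_cap

for limiter, added_frames in [("external (RTSS-style)", 1.0),
                              ("driver-level (legacy)", 3.5)]:
    print(f"{limiter}: +{frames_to_ms(added_frames, 141):.1f} ms at 141 FPS")
# external (RTSS-style): +7.1 ms at 141 FPS
# driver-level (legacy): +24.8 ms at 141 FPS
```

So at a 141 FPS cap, 1 frame of added lag is roughly 7 ms, and 3 1/2 frames is roughly 25 ms.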
Anyway, if I get the chance (no guarantee when that will be), I'd just do a quick 10 sample test in Apex and add it to the existing charts in the link I posted above. If you have any specific recommendations for optimal FPS limiter parameters in your tool for this purpose, feel free to share them (no rush).