Dooberknob wrote: ↑17 Aug 2025, 11:28
If I'm playing a game that is locked at 60fps, would there be an advantage to using a monitor with a refresh rate that is a multiple of 60 (e.g. 480Hz) compared to a refresh rate that is similar, or even slightly higher, but not a multiple of 60 (500Hz, for example)? Also, are there certain configurations where this might change, like VSYNC on/off, G-SYNC on/off, etc.?
 
Let's look at this from a different perspective.
The goal of every computer display is to emulate analog reality using digital electronics.
What does this mean?
This means that "ultra" frame rates at "ultra" high sample rates (also referred to as refresh rates) are required for motion to look the way our eyes expect.
Developers, due to hardware constraints & a lack of awareness that pixel visibility time is the #1 graphical fidelity upgrade, have forsaken the end-user by developing these... 60fps games.
In a perfect world, we'd be playing 500+fps games on 480-540Hz OLEDs.
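To put a rough number on that claim, here's a simplified sketch of my own: it assumes a sample-and-hold display with instant pixel response, and the 960 px/s tracking speed is just an example value.

# Rough eye-tracked motion blur on a sample-and-hold display:
# blur width (px) ~ tracking speed (px/s) * pixel visibility time (s).
def blur_px(speed_px_per_sec, visibility_ms):
    return speed_px_per_sec * (visibility_ms / 1000.0)

for fps in (60, 120, 240, 480):
    visibility_ms = 1000.0 / fps  # each frame is held for one full frame time
    print(fps, "fps ->", round(blur_px(960, visibility_ms), 1), "px of blur at 960 px/s")
# 60fps -> 16 px of blur; 480fps -> 2 px. That 8x reduction is the whole point.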
However, the reality of the situation is completely different. 
Games with fixed, locked frame rates have plagued the industry and continue to do so even in 2025.
As an end-user, what can you do to try and get some kind of visual quality increase?
I'll list a few options, with their pros & cons (see the persistence sketch after the list):
• #1 — Run some form of frame generation (LSFG fixed mode, NVIDIA Smooth Motion, ...) at 480Hz.
+severe reduction in pixel visibility time (persistence), thus better visual performance (much lower eye-tracked motion blur)
+high sample rate (480Hz) leads to better fluidity, thus better overall visual performance
-possible visual artifacts at higher scaling factors, depending on the game type & FG solution chosen
-minor "tearing" (pieces of multiple frames visible in one scan-out, distorting the image) may still be visible
-input latency penalty (I don't know the exact values)
-higher total GPU frame time, which could lead to a GPU bottleneck depending on the game/system
• #2 — Run G-SYNC at 480-500Hz
+no "tearing"
+no (theoretical) input latency penalty, plus a severe reduction in scan-out time at these refresh rates
-severe eye-tracked motion blur due to high pixel visibility time (referred to as "persistence blur")
-no fluidity increase (the game still only presents 60 unique frames per second)
• #3 — (I personally avoid this) Run some form of backlight strobing / BFI (60fps @ 60Hz)
+severe reduction in pixel visibility time, thus much lower eye-tracked motion blur
-added input latency (I haven't seen a "lagless" implementation of backlight strobing or BFI yet)
-"tearing" present
-no fluidity added
-SEVERE eye strain possible; visible 60Hz flicker can also cause headaches, nausea, or disorientation in some people
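Here's the persistence sketch I mentioned: a minimal comparison of approximate pixel visibility time per option. It's my own simplification (idealized display, instant pixel response, perfect frame pacing), and the 1ms strobe pulse width is an assumed value.

# Approximate pixel visibility time (persistence) for each option above.
# Sample-and-hold persistence is one presented-frame time; strobing
# caps it at the strobe pulse width instead.
def persistence_ms(presented_fps, strobe_pulse_ms=None):
    hold_ms = 1000.0 / presented_fps
    return strobe_pulse_ms if strobe_pulse_ms is not None else hold_ms

print("#1 FG 60->480fps @ 480Hz :", round(persistence_ms(480), 1), "ms")  # ~2.1 ms
print("#2 G-SYNC 60fps @ 480Hz  :", round(persistence_ms(60), 1), "ms")   # ~16.7 ms
print("#3 60fps @ 60Hz, strobed :", persistence_ms(60, strobe_pulse_ms=1.0), "ms")  # 1.0 ms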
You could also use a combination of #1 & #2, but I definitely recommend these two paths instead of #3.
Another personal favorite is to run FG at a smaller factor (fixed 2x in LSFG for a 60fps game, fixed 4x in LSFG for a 30fps game), which drastically lowers total visual artifacts, and then use some form of ~1ms-or-lower-MPRT backlight strobing / BFI at the resulting higher refresh rate (120-240Hz) to avoid the severe eye strain of 60Hz strobing. A quick sanity check of the factor math is sketched below.
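(The helper below is just my illustration; the fixed factors and refresh rates come from the paragraph above.)

# Pick the FG factor that maps a locked base frame rate onto a
# strobe-friendly refresh rate; an exact integer multiple keeps cadence even.
def fg_factor(base_fps, target_hz):
    assert target_hz % base_fps == 0, "target should be a multiple of base fps"
    return target_hz // base_fps

print(fg_factor(60, 120))  # 2x for a 60fps game strobed at 120Hz
print(fg_factor(30, 120))  # 4x for a 30fps game strobed at 120Hz
print(fg_factor(60, 240))  # 4x if you would rather strobe at 240Hz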
As for the write-up from @mango87 above, he used a slight misnomer.
What he refers to as "stuttering" is actually called "juddering". 
Here's more information about it: 
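In the meantime, here's a quick cadence sketch that also answers the original 480Hz-vs-500Hz question. It assumes VSYNC on and no VRR; with G-SYNC enabled the display refreshes whenever a frame arrives, so the multiple-of-60 distinction mostly stops mattering.

# How many refresh cycles each 60fps game frame stays on screen
# at a fixed refresh rate (VSYNC on, no VRR).
def cadence(refresh_hz, fps=60, n=6):
    edges = [round(i * refresh_hz / fps) for i in range(n + 1)]
    return [edges[i + 1] - edges[i] for i in range(n)]

print(cadence(480))  # [8, 8, 8, 8, 8, 8] -> even cadence, no judder
print(cadence(500))  # [8, 9, 8, 8, 9, 8] -> uneven cadence = judder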
Hope this helps.