120Hz is almost always better, of course. I'm just saying there have been exceptions in the past, where the microstutter harmonics are less objectionable at a specific refresh rate. In fact, you pointed out in another thread that lowering the monitor refresh rate to 100Hz solved mouse microstutters for a 500Hz/1000Hz mouse, since the mouse poll rate is an exact multiple of 100Hz.
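To illustrate the divisibility point (a toy sketch of my own, not from that thread): count how many mouse polls land inside each refresh interval. At 100Hz a 500Hz mouse delivers exactly 5 polls every frame, while at 120Hz the per-frame count is uneven (a repeating 5,4,4,4,4,4 pattern), which is one source of mouse microstutter.

```python
def polls_per_refresh(refresh_hz, mouse_hz, frames=6):
    """Count mouse polls falling inside each refresh interval (toy model)."""
    counts = []
    for f in range(frames):
        start, end = f / refresh_hz, (f + 1) / refresh_hz
        # poll k fires at time k / mouse_hz; count polls in [start, end)
        counts.append(sum(1 for k in range(mouse_hz)
                          if start <= k / mouse_hz < end))
    return counts

print(polls_per_refresh(100, 500))   # [5, 5, 5, 5, 5, 5] -- constant
print(polls_per_refresh(120, 500))   # [5, 4, 4, 4, 4, 4] -- uneven
```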
There's the strobing use case, where you need one-strobe-per-frame for the best effect, so 60Hz impulse-driving has clearer motion than 120Hz impulse-driving for a 60fps-limited game.
There's also movie playback, as several movie players have had problems playing 60fps material smoothly during 120Hz operation, because they aren't perfectly synchronizing to every other VSYNC (e.g. variable bitrate where some 60fps frames take slightly more than 1/120sec to decode, and other frames take slightly less than 1/120sec to decode).
The microstutter harmonics effect happens in certain games where some frames render slightly faster than 1/120sec and others slightly slower than 1/120sec (e.g. your GPU utilization is hovering around roughly the 50% level), causing the occasional early or missed VSYNC. It happens with just about any 60fps-limited game in the "exact 50% GPU workload situation": falling to 49% one frame, rising to 51% the next, 48% the next, 51% the next, causes the 1:3:1:3 microstutter effect. It does not depend on which game, but on the specific scenes that hit that midpoint of GPU workload. It happens when your GPU frametime is hovering around the 1/120sec interval (two 1/120sec intervals make a single 60fps frametime, so a frame alternates between missing a VSYNC and making it on time). I'm obviously talking about the VSYNC ON situation -- whether double buffered or triple buffered -- because triple buffering with a 60fps framecap, at either 60Hz or 120Hz, actually ends up looking almost like double-buffered motion, with the same stutter harmonics.
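The 1:3:1:3 pattern above can be reproduced with a minimal toy model (my own sketch, not from this post): double-buffered VSYNC ON at 120Hz, with a 60fps-capped game whose frametimes alternate slightly over and slightly under the two-vsync budget. Each frame is shown at the next vsync boundary after it finishes, so frames alternate between being displayed for 1 and for 3 vsync intervals instead of a steady 2:2:2.

```python
from fractions import Fraction
import math

VSYNC = Fraction(1, 120)   # 120Hz vsync interval (exact rational to avoid float edge cases)
EPS = Fraction(1, 2000)    # 0.5 ms of frametime jitter

def present_vsyncs(frametimes):
    """Vsync index each frame is presented on (double-buffered VSYNC ON model)."""
    presents, t = [], Fraction(0)
    for ft in frametimes:
        t += ft                                # frame finishes rendering
        presents.append(math.ceil(t / VSYNC))  # shown at the next vsync (or this one, if exactly on time)
    return presents

# 60fps cap = a 2-vsync budget; frametimes alternate slightly over
# (misses a vsync) and slightly under (makes it on time)
frametimes = [2 * VSYNC + (EPS if i % 2 == 0 else -EPS) for i in range(8)]
p = present_vsyncs(frametimes)
print([b - a for a, b in zip(p, p[1:])])   # [1, 3, 1, 3, 1, 3, 1] -- the 1:3 microstutter
```

With zero jitter the same model prints a steady run of 2s, which is the smooth 60fps-at-120Hz case.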
Input lag is a different story altogether, as 120Hz can reduce input lag relative to 60Hz, and many 120Hz monitors have faster pixel response (which also reduces effective input lag), along with lower monitor processing latency.
Again, you would agree with me that 120Hz is almost always better -- just not "unanimously always", as there are exceptions.
That said, any 120Hz monitor can easily be run in 60Hz mode, so it's no excuse not to buy a 120Hz monitor.
We're getting off topic again, so back to the OP's original question.