They still stuttered with frame drops! I've noticed every single frame-drop stutter since the 1990s. Even with triple buffering. It helped; it just didn't eliminate them. Not everyone was as picky about the less-annoying stutters.
Stutters less visible at lower resolutions
But stutter was much less visible at 320x200 resolution, given that our CRT tubes were small and many games ran at only 30fps.
On impulse-driven CRT tubes, single frame-drop stutters are less visible at 30fps than at 60fps
Also, 30fps dropping to 29.5fps is sometimes less visible than 60fps dropping to 59fps on a CRT tube. At 60fps on a 60Hz tube, a dropped frame is a sudden brief single-strobe-to-double-strobe transition (jarring), whereas at 30fps it's a sudden brief double-strobe-to-triple-strobe transition (less jarring). See the diagram of duplicate images on impulse displays.
30fps-vs-60fps is much more visible today on bigger, higher-resolution screens than on yesterday's 14-inch CRTs.
Normally, frame drops are less visible at higher Hz, but the multi-strobe behavior of a fixed-Hz impulse-driven display can amplify their visibility: fps=Hz creates perfectly zero motion blur (which amplifies the visibility of frame drops), while the double images of 30fps hide the frame drops better.
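The strobe-count arithmetic above can be sketched in a few lines. This is a hypothetical illustration (the function name and frame timings are mine, not from any Blur Busters tool): on a fixed 60Hz impulse display, count how many refresh cycles (strobes) each unique frame stays on screen, with one frame held an extra refresh to simulate a single frame drop.

```python
# Hypothetical sketch: strobes (impulse flashes) per unique frame on a
# fixed 60 Hz impulse display such as a CRT. A frame held for an extra
# refresh cycle simulates a single frame drop.

def strobes_per_frame(refresh_hz, frame_durations_sec):
    """How many refresh cycles each unique frame is displayed for."""
    return [round(d * refresh_hz) for d in frame_durations_sec]

HZ = 60.0

# Steady 60fps with one dropped frame (previous frame shown twice as long):
sixty_fps = [1/60, 1/60, 2/60, 1/60]
print(strobes_per_frame(HZ, sixty_fps))   # [1, 1, 2, 1]

# Steady 30fps with one frame held for an extra refresh cycle:
thirty_fps = [2/60, 2/60, 3/60, 2/60]
print(strobes_per_frame(HZ, thirty_fps))  # [2, 2, 3, 2]
```

The 60fps drop doubles the frame's visible duration (1 strobe to 2, a 100% jump), while the 30fps hiccup only lengthens it by half (2 strobes to 3), which is one way to see why the 60fps drop is the more jarring transition.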
<Advanced Science Talk>
One part of the problem is the Vicious Cycle Effect that is currently amplifying the visibility of stutters. Higher resolutions mean a single stutter jumps over more pixels. Bigger displays mean a wider FOV, which amplifies the visibility of stutter-jumps. Brighter HDR amplifies stutter visibility. Increasingly clearer motion (higher refresh rates or better strobing) amplifies stutter visibility even further.
They all amplify each other in a Vicious Cycle Effect. Retina-FOV (180+ degree), retina-resolution displays (16K+ if using a 180+ degree FOV) can require quintuple-digit refresh rates (>10,000Hz) to reach retina refresh rates (Hz maxed out to human perception), which is why refresh rate limitations are hugely amplified in virtual reality; see Stroboscopic Effect of Finite Frame Rate Displays. Motion blur reduction modes help (like LightBoost and ULMB strobing), but they amplify the visibility of stutters that are no longer hidden by motion blur. They also add unnatural flicker and stroboscopic effects. Fundamentally, strobe backlights are a band-aid for humankind, because real life doesn't strobe. You have to emulate analog motion using ultrafine refresh cycles (ultra-Hz) to eliminate motion blur strobelessly, so that 100% of all motion blur is human-natural and zero is induced by the display itself. Emulating a Star Trek Holodeck successfully enough to pass a blind test (can't tell real life apart from VR) is very hampered by the Vicious Cycle Effect, too.
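The strobeless-blur-elimination argument rests on simple sample-and-hold arithmetic, which can be sketched as follows. This is a minimal illustration of the relationship popularized by Blur Busters Law (the function name and the 4000 px/sec motion speed are my own illustrative choices): perceived blur trail width is roughly frame visibility time multiplied by eye-tracking speed, so halving persistence halves the blur.

```python
# Minimal sketch of sample-and-hold display motion blur:
# blur trail width (pixels) ~= frame persistence (sec) * tracking speed (px/sec).
# On a non-strobed display, persistence is a full refresh cycle.

def sample_and_hold_blur_px(refresh_hz, motion_px_per_sec):
    """Approximate blur trail width for eye-tracked motion, no strobing."""
    persistence_sec = 1.0 / refresh_hz
    return persistence_sec * motion_px_per_sec

speed = 4000.0  # px/sec, a fast pan on a high-resolution screen (illustrative)
for hz in (60, 120, 1000):
    print(f"{hz:>5} Hz -> {sample_and_hold_blur_px(hz, speed):.1f} px of blur")
```

At 4000 px/sec, 60Hz sample-and-hold smears motion across roughly 67 pixels, 120Hz across 33, and only quadruple-digit refresh rates push the display-induced blur down toward the imperceptible, without resorting to strobing.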
Blur or stutters become fainter again. Then something changes (such as a FOV increase, a resolution increase, a refresh rate increase, or an HDR brightness increase), and the artifacts or stutters become more visible again. We keep playing whac-a-mole by improving the other variables, but they still Vicious-Cycle into each other. See the conundrum?
Researchers keep writing very useful papers, whether the great Color MPRT paper (which still sometimes neglects human visibility beyond the below-10% / above-90% GtG measurement cutoff threshold; we've noticed artifacts completely missed by GtG/MPRT tests, such as Red Phosphor Interferes With Strobing) or the commonly redditted 255Hz Fighter Pilot Test (apples vs oranges: it was a limited-brightness single-frame identification test, not a continuous-blur test, nor a phantom-array test).
However, we back up and focus on the whole picture: what are ALL the weak links in a refresh rate? We think of refresh rates differently, like geometry versus physics. The great academic writers are focused on physics, but I treat refresh rates as a temporal-geometry problem in my human brain (much like a photographic memory, but for successfully predicting refresh rate artifacts), and come up with concepts missed by many.
In many situations, you can describe a refreshing algorithm to me, and I will correctly predict the resulting artifacts long before a display prototype is built. This has happened repeatedly since 2012, all the way back to the "LCDs can't get CRT motion clarity" days, until I mic-dropped with all the LightBoost talk back in the day, and with our good understanding of strobe-backlight artifacts (Electronics Hacking: Creating a Strobe Backlight (2012), Strobe Crosstalk FAQ, High Speed Videos of LCD Refreshing, and Red Phosphor).
That's why we have Blur Busters Law: The Amazing Journey To Future 1000 Hz Displays, which has converged into practically unanimous agreement among many researchers and has been acclaimed by reputable organizations (including NVIDIA) who enjoy using some of my articles as easy Coles Notes equivalents of complex technical scientific papers. Simple Blur Busters 101 stuff to my "Refresh Rate Einstein" brain matter, but difficult concepts for a lot of people to grasp ("Why the hell do we need 1000 Hz?"). We've even taught new things to some display engineers who are 90% correct but miss the 10% easily explained by Blur Busters, and they have used our skills to properly complete their papers.
Not everyone is picky about stutter, but others are, and when stutters are amplified, that can be a big problem. There are people who get nausea playing any FPS unless I've custom-tweaked a gaming monitor to their specific vision sensitivity that they didn't even know about (some have motion blur eyestrain, others have stutter nausea, more have blue-light sensitivity, etc. Everybody sees differently). But there are also many who will never be able to play an FPS game or VR game nausea-free until we're further along in the refresh rate race. It won't reach five-sigma population comfort for a long time. We'll have a century (or more) of whac-a-mole.
</Advanced Science Talk>
Now you know why I'm known as the Refresh Rate Einstein!