How come many older games didn't have any vsync options?

Everything about displays and monitors. 120Hz, 144Hz, 240Hz, 4K, 1440p, input lag, display shopping, monitor purchase decisions, compare, versus, debate, and more. Questions? Just ask!
xenphor
Posts: 69
Joined: 28 Feb 2018, 11:47

How come many older games didn't have any vsync options?

Post by xenphor » 11 Dec 2018, 20:53

I built a Windows 98 retro pc recently and have been installing quite a few 90s era games and realized that I wasn't just imagining all the screen tearing that occurred back then with no real way to fix it.

Some games like Quake and Quake 2 don't even have vsync options in the menu and require you to edit config files (which of course back then I didn't know anything about). Unreal also doesn't have a vsync option I believe unless you access the hidden preferences menu or edit the config files. Then other games like Diablo, Starcraft, and WarCraft II just have screen tearing by default with no options that I know about to fix it. I suppose depending on the video card and driver version it might be possible to force it but so far that has not been very reliable in my experience.

Why would developers back then not allow you to easily change, or even include at all, such a basic (and to me actually essential) option? I know vsync had some sort of stigma attached to it (and maybe still does), but did that many people, even developers, just not care about screen tearing back then? I mean I was just a dumb kid and even I found it extremely annoying, although I didn't know what was wrong, only that it was a graphic artifact seemingly exclusive to the pc that I never noticed on consoles (which unfortunately isn't the case anymore lol).

edit: Also, is there a technical reason why something similar to freesync couldn't have been implemented much earlier?

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: How come many older games didn't have any vsync options?

Post by Chief Blur Buster » 11 Dec 2018, 21:50

Tearing in the pre-GPU/pre-3D era was often completely unavoidable because of the laws of physics:

(1) VGA / SVGA cards often did not expose reliable registers for synchronizing to the blanking interval;
(2) VGA / SVGA cards often could not hold back buffers; you could only write to the front buffer;
(3) Computers weren't fast enough to copy all the pixels of a bitmapped framebuffer to the front buffer within the blanking interval between refresh cycles;
(4) Only dedicated chips could "bank" between framebuffers, and character modes were faster (Commodore, Sega Master System, original Nintendo, etc.) since there were often only 500 or 1000 character blocks (bytes) to copy, so you could copy everything at once straight to the front buffer.

In some cases, it just wasn't physically or mathematically possible to blit a whole bitmap onto the screen fast enough -- like trying to copy 64 kilobytes of graphics data to slow video memory in less than 1 millisecond on a low-end 286 with a slow VGA card. It simply wasn't possible.
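The arithmetic here is easy to check with a back-of-the-envelope sketch. The numbers below are illustrative assumptions, not measurements: roughly 1 MB/s for writes to a slow ISA-era VGA card, and standard VGA mode 13h timings (320x200, 256 color, ~31.47 kHz horizontal scan, 449 total scanlines of which 400 are visible):

```python
# Back-of-the-envelope: could an ISA-era PC copy a full mode 13h
# frame (64,000 bytes) to VRAM inside the vertical blanking interval?
VRAM_WRITE_RATE = 1_000_000      # bytes/s to a slow ISA VGA card (assumption)
FRAME_BYTES = 320 * 200          # mode 13h framebuffer = 64,000 bytes
LINE_PERIOD = 1 / 31_469         # standard VGA horizontal scan, ~31.47 kHz
BLANK_LINES = 449 - 400          # mode 13h: ~49 blanked scanlines per refresh

vblank_s = BLANK_LINES * LINE_PERIOD       # blanking window, ~1.6 ms
copy_s = FRAME_BYTES / VRAM_WRITE_RATE     # full-frame copy time, ~64 ms

print(f"vblank window:   {vblank_s * 1e3:.2f} ms")
print(f"full-frame copy: {copy_s * 1e3:.2f} ms")
print("fits in vblank?", copy_s <= vblank_s)
```

Even if the assumed write rate is off by several times in either direction, the copy overshoots the blanking window by more than an order of magnitude, which is why tear-free full-screen bitmap updates were simply out of reach on that hardware.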

Even if it were possible, it was still very hard. A few games kind of managed to pull it off with some clever tricks -- but old graphics cards often did not make it easy, especially since you had the soup of CGA, EGA, VGA, SVGA graphics cards manufactured by different vendors -- many with no access to blanking interval synchronization or memory-bank-switching abilities.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

xenphor
Posts: 69
Joined: 28 Feb 2018, 11:47

Re: How come many older games didn't have any vsync options?

Post by xenphor » 11 Dec 2018, 22:12

But why did these issues persist well into the 3D era, with vsync not being easy to tweak, or not tweakable at all? Something like resolution was usually adjustable, and you would think vsync would be just as essential an option, since it affects the entire presentation of the game.

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: How come many older games didn't have any vsync options?

Post by Chief Blur Buster » 11 Dec 2018, 23:03

It wasn't a problem that affected me once I got a 3D accelerator...

Probably because I was on the 3Dfx Voodoo bandwagon; they were famous for tear-free visuals. I never had tearing on my 3Dfx Voodoo graphics cards, nor on the first GeForce 256 cards. Tearing was often a graphics-driver problem, so you might have tearing with the Radeon drivers but not with the NVIDIA drivers -- that was really annoying to me. You'd buy Graphics Card "X" and it would tear by default, while Graphics Card "Y" wouldn't. Is that the problem you're encountering while setting up your retro system? Back then I hated tearing even when playing online, as it really distracted from my gaming experience.

Now we all know better, that the option is important -- VSYNC OFF is lower lag.

Many graphics vendors, game developers and drivers didn't realize how important it was to give the user the option of lower lag (VSYNC OFF) versus better motion (VSYNC ON).

Tearlines were just viewed as evil with no beneficial side effects (many developers didn't realize tearing meant a lower-lag mode, or didn't think that 16-to-33 ms actually mattered). But VSYNC OFF is now an essential mode in paid competitive esports play.
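The 16-to-33 ms figure follows directly from the refresh period. A quick sketch, assuming a 60 Hz display and classic double buffering (a finished frame can wait up to one full refresh for the next vblank, and one more completed frame can sit queued behind it):

```python
# Rough sketch of where VSYNC ON's 16-33 ms of added lag comes from at 60 Hz.
REFRESH_HZ = 60
frame_ms = 1000 / REFRESH_HZ      # one refresh period: ~16.7 ms

wait_for_vblank = frame_ms        # worst case: frame finishes just after a vblank
queued_frame = frame_ms           # double buffering can queue one more full frame

print(f"added lag: {wait_for_vblank:.1f} to {wait_for_vblank + queued_frame:.1f} ms")
```

With VSYNC OFF, the freshly rendered frame is scanned out immediately from wherever the raster happens to be -- the tearline is the visible seam, but the pixels below it are up to a frame newer than they would be with VSYNC ON.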

xenphor
Posts: 69
Joined: 28 Feb 2018, 11:47

Re: How come many older games didn't have any vsync options?

Post by xenphor » 11 Dec 2018, 23:27

Well, back in the day I only had a Voodoo 2 card and don't remember noticing any screen tearing. Then I finally got an NVIDIA Riva TNT because it was supposedly better, with support for 32-bit color and higher resolutions, but the first thing I remember was playing Quake 2 and getting screen tearing; I had no idea what was going on. I'm not even sure I could have searched the internet for it properly, because there was no Google to guide your search results to the actual problem. I think I eventually learned what vsync was supposed to do, but everyone always said to leave it off because it hurt performance.

I'm not sure how the NVIDIA drivers for the TNT card progressed, or whether they ever got driver-level vsync options. Apparently the Voodoo cards did at some point, but as I said, I either didn't notice tearing on the Voodoo card, or it was forced on by default like you said. I also had an S3 Savage 2000 card for a while, but that had far more issues than vsync. I believe I ended up still using the Voodoo 2 for most games, even though the image quality and performance were worse, because gameplay in motion felt more fluid.

On my retro machine I have the luxury of using specific known driver versions and overpowered hardware: a GeForce4 Ti 4800 SE and a Pentium Dual-Core E5800. However, even with this hardware combo, supposedly one of the best for Windows 98, it is very hard to achieve a completely locked, vsync'd 60 fps in all games, because the driver vsync doesn't really work properly (at least for Direct3D), and many games' own vsync is weird -- Need for Speed III/IV/Porsche Unleashed, for example, run at some odd refresh rate and stutter a lot.

So my dream of playing old games without the tearing and stuttering I encountered on real hardware of the era (or newer) seems to be a lost cause. I suppose using PCem with G-SYNC could approximate a best-case scenario, but that kind of defeats the purpose.
