CRT and Stuttering - False memories?

Talrivian
Posts: 14
Joined: 04 May 2019, 09:12

CRT and Stuttering - False memories?

Post by Talrivian » 18 Aug 2019, 15:30

I have been getting interested in CRTs lately because motion blur has been bothering me a lot. I found two really nice CRTs online for $40 and snatched them up.

Last night, I got everything hooked up and ready to rock and roll. My GeForce 980 Ti has a DVI-I port, so I bought a DVI-I to VGA adapter and hooked up the two monitors to try them out.

The first game I played was Terraria. I used Scanline Sync in RivaTuner (RTSS) and set it to 60 fps, which the game is locked at, and it was the smoothest, most amazing experience I've ever had. I played for hours; it was like playing my SNES side-scrollers as a child. Terraria will NEVER look that good on an LCD. Period.

Then my nostalgia took over and I loaded up Half-Life 1 and Diablo 2 on my CRT. Half-Life 1, with 60 FPS Scanline Sync, brought me back to my high school days. It was amazing.

Then I loaded up Diablo 2...

Diablo 2 runs natively at 25 FPS. All internal game ticks occur at 25 FPS. All animations occur at 25 FPS. It is a 2D game. If you play multiplayer, the game runs with vsync and raises the FPS, but I assume the animations are just repeated; the mouse feels smoother. However, the stuttering from DOUBLE IMAGES is unbearable. The game is effectively showing 25 FPS at 50Hz or 60Hz, and since my monitor is a CRT, which is impulse-driven, the double images are very pronounced.
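To illustrate what I mean, here's a toy Python sketch (my own model, not anything from Diablo 2's actual code) of how 25 game frames map onto 60 refreshes. Every refresh re-flashes whichever game frame is current, so frames get flashed in an uneven 3,2,3,2,2 cadence that the eye tracks as double/triple images:

Code: Select all

# Toy model: which game frame does each refresh of an impulse display flash?
def repeat_cadence(fps, hz, refreshes):
    shown = [int(r * fps / hz) for r in range(refreshes)]  # game frame per refresh
    return [shown.count(f) for f in sorted(set(shown))]    # flashes per game frame

print("25 fps @ 60 Hz:", repeat_cadence(25, 60, 12))  # [3, 2, 3, 2, 2] -- uneven = judder
print("30 fps @ 60 Hz:", repeat_cadence(30, 60, 12))  # [2, 2, 2, 2, 2, 2] -- steady doubles
print("60 fps @ 60 Hz:", repeat_cadence(60, 60, 12))  # [1, 1, ...] -- one flash per frame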

I then loaded up Half-Life 1 and locked it to 30 FPS at 60Hz, because back in the day, that's all my old computer could handle. It was UNBEARABLY stuttery, with double images too!

As a child, almost all of my games played at this framerate, because I never had fast, powerful computers, and I used nothing but CRTs until 2011. I was SO excited about playing Diablo 2 on my CRT, because the motion blur on my LCD was literally giving me headaches and making me motion sick to the point of nausea.

My big question is this:

How did I play these games all through middle school and high school on a CRT? My entire life, I grew up playing the classics... Quake 2, Half-Life 1, Diablo 1 and 2, Close Combat 3, Hellbender and Fury 3, Shogo... and always at sub-60 FPS, sometimes as low as 25-30 FPS, which I remember being considered the old benchmark. If the double-image stutter is SLAUGHTERING my eyes now, how come it never bothered me as a teenager? What changed? Why couldn't I see it back then? It is unplayable, it bothers me so much.

Thoughts? I really want to play Diablo 2 very badly, and I can't do it on an LCD or a CRT. It's breaking my heart. Was there some kind of magical black frame insertion that old video cards did or something? I had a Voodoo 3 at one point, until I upgraded to an ATI Radeon 9800 Pro, which was my card for years. Did something change along the way?

Thanks!

Jason38
Posts: 102
Joined: 24 May 2019, 10:23

Re: CRT and Stuttering - False memories?

Post by Jason38 » 18 Aug 2019, 16:11

You sound like me. Get a gaming monitor with great overdrive; that will give you the least flicker and eye strain/nausea. Also control the lighting in your game room and make sure it isn't contributing to your eye issues. As for the reason: your eyes can handle almost anything as a child/teenager. Even if the screen was doing crazy things, flicker, double images and all, you recovered quickly.

Talrivian
Posts: 14
Joined: 04 May 2019, 09:12

Re: CRT and Stuttering - False memories?

Post by Talrivian » 18 Aug 2019, 20:24

Any suggestions? I bought the Acer Predator XB271HU. I loved it at first, but the 4ms GtG is so noticeable to me. I feel like I wasted $500, and I couldn't return it. I even used it side by side with my old 144Hz TN ASUS. :cry: :cry:

What would be the best possible monitor currently available for motion blur reduction if I wanted the following:

24"
240Hz
TN
1920x1080 native
1ms GtG or faster
G-SYNC

No expense spared.

masneb
Posts: 239
Joined: 15 Apr 2019, 03:04

Re: CRT and Stuttering - False memories?

Post by masneb » 19 Aug 2019, 13:40

Computers change over time, and how games operate has changed as well. Some games have dependencies that get patched or changed, and then break over time. You aren't going to have the same experience on new hardware with W10 or even W7. That's why, when people do throwback comparisons like this, they use old hardware and XP. You can't install all the newest updates either, since stuff broke over time and MS was busy just trying to keep major security flaws from slipping through.

D2 is still a supported game; I'm not sure why you'd cap the FPS in it. Ideally you shouldn't be capping any of these games, regardless of what's considered the 'optimal' FPS, unless a game doesn't function without it. What is the purpose of capping the FPS in the first place? People cap FPS to reduce thermals (which also reduces clock speeds and can itself cause stuttering, working against the other common goal, consistency, since newer hardware can clock down and stutter unless you lock the clocks), or to stay inside the G-SYNC/FreeSync range so adaptive sync is always active.

I can only imagine how stuttery HL1 is running at 30fps, when your CPU cores, or your GPU, are borderline going to sleep.

Putting that aside: escalation. Gaming when you're younger (myself included) was a much cruder experience, and since you didn't have anything better to compare it to, you just dealt with what was in front of you, so it didn't bug you. Everyone had the same problems, or they weren't networked enough to discuss them. Benchmarking barely existed in the 90s, and neither did comparisons. YouTube didn't exist, let alone active hardware discussion channels on it.

I've also thought about getting a CRT, but I know how much eye fatigue they cause, and they're also very blurry compared to an LCD. OLED is the answer to all of this, once someone finally makes a gaming display and covers burn-in under warranty.

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: CRT and Stuttering - False memories?

Post by Chief Blur Buster » 19 Aug 2019, 13:45

For CRTs, try this:

Use framerate=Hz with CRTs
Try VSYNC ON.
(there are low-lag tricks)

Some games were designed to run at exactly 30fps or 60fps back in those days, so framerates were consistent. Especially old 8-bit games (Nintendo etc.), which were designed to run fps=Hz to prevent stuttering. Those Nintendo slowdowns were annoying, though! Anytime a 60fps console game slowed down to 30fps, it was annoying, remember?

Today, we have wildly varying framerates due to 3D render complexity. Games don't run at framerates as consistent as they did on 8-bit consoles of the CRT era: the realtime graphics techniques of that period (raster interrupts, beam racing, etc.) gave way to an asynchronous framebuffer workflow that decouples rendertimes from refreshtimes, and that made stutters way worse. We just never noticed, because we switched to LCDs, and LCD motion blur hides stutters better.

It's just that old 8-bit games were highly optimized to run at a perfect 60fps, because they didn't have to render the whole screen from scratch like today's 3D games. 8-bit Nintendos were permanently VSYNC ON... and VSYNC ON added no input lag back then, because there were no 3D framebuffers.

Also, even as we transitioned to framebuffers, many PC games back in the CRT days were framerate locked, such as Tomb Raider at 30 frames per second on 3dfx Voodoos. The framerate consistency kept things looking good on a CRT. And back in the 90s, the original DOOM was capped at 35 frames per second (half of 70Hz VGA) to reduce the annoyance of wildly fluctuating framerates.

Stutters are annoying on impulsed/strobed displays (LightBoost, ULMB, CRT, plasma).

The less motion blur, the more visible stutters are.

To fix stutters on impulsed/strobed displays:
(A) Framerate = Hz
(B) Do whatever you have to do to make (A) happen: lower refresh rate, lower resolution, lower game detail, upgrade GPU, VSYNC ON or RTSS Scanline Sync, etc.
(C) For smoothing mouse microstutters, raise mouse DPI, lower in-game sensitivity, and use a clean mousepad & clean mouse feet.
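Here's a minimal sketch of the general idea behind (B): a hypothetical timer-based fps=Hz pacer in Python (real tools like VSYNC ON and RTSS Scanline Sync lock to the actual raster, with far better precision than this toy):

Code: Select all

import time

def run_capped(render_frame, hz=60.0, frames=600):
    """Toy fps=Hz pacer: deliver exactly one frame per refresh interval."""
    frame_time = 1.0 / hz
    deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()                       # draw the next frame
        deadline += frame_time               # absolute deadlines avoid drift
        sleep_for = deadline - time.perf_counter() - 0.002
        if sleep_for > 0:
            time.sleep(sleep_for)            # coarse wait
        while time.perf_counter() < deadline:
            pass                             # spin the last ~2 ms for tight pacing

run_capped(lambda: None, hz=60.0, frames=60)  # swap the lambda for a real frame render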

TL;DR: Doing fps = Hz always looks better when using impulse/strobed display tech
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


Talrivian
Posts: 14
Joined: 04 May 2019, 09:12

Re: CRT and Stuttering - False memories?

Post by Talrivian » 19 Aug 2019, 14:49

Masneb, I think you misunderstood me.

1) Diablo 2 runs natively at 25 FPS; you don't cap it yourself. The game was simply programmed to run at that speed, which strikes me as odd, because back when Diablo 2 came out, all monitors were 60Hz CRTs. It is literally impossible to play Diablo 2 on a CRT without horrible judder/stutter/double images, or on an LCD without horrible motion blur. I don't know how it never bothered me as a child.

2) As for Half-Life 1, I purposely capped my FPS to 30 to emulate how the game ran on my old PC in the 90s. I wanted to experience what it was like to play it as a teenager. I was surprised at how blurry and unbearable the double images and judder were on the 60Hz CRT. I used a 60Hz CRT to play Half-Life 1 when it first came out in 1998.

I was basically recreating the conditions under which I originally played these games to see how they looked. I'm just astonished at how bad they looked, and I can't believe it never bothered me at all as a child.

Diablo 2 is the biggest surprise. In singleplayer, Diablo 2 runs at exactly 25 FPS, and you can't change this; it's hardcoded. It's also a completely 2D, sprite-based game. In multiplayer on battle.net, the FPS can go to any value, but it doesn't change anything except mouse responsiveness. The animations still play at 25 FPS, the screen still scrolls at 25 FPS, and the characters still run across the screen at 25 FPS. The entire timing of the game is hardcoded at that rate. All attack and casting speeds in the game work off these 25 animation frames to determine attack-speed breakpoints, etc.

I'm guessing when the game processes at higher FPS, it must just duplicate frames, which still causes the same issues with the CRT's strobed method of displaying frames.

Essentially, Diablo 2, and older games such as Doom with its 35 FPS, had double images from the get-go. As horrible as it is on the eyes, I can't believe no one talked about it or really noticed it back then. Maybe after being spoiled by newer monitor technology, our brains and eyes are more trained to look for it.

It makes me very sad, because I'll never be able to play Diablo 2, one of my favorite games of all time, again. On my CRT, it has unbearable double images, which cause judder for everything on the screen. On my LCD, it has god-awful motion blur: 40ms of persistence from the horrible hardcoded 25 FPS.
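(Back-of-envelope math for that 40ms figure, using the rule of thumb that sample-and-hold blur = pixel speed × frame visibility time; the scroll speed below is just an assumed example:)

Code: Select all

# Back-of-envelope sample-and-hold blur for a 25 FPS game on an LCD.
fps = 25
frame_time_ms = 1000 / fps        # 40 ms of persistence per game frame
scroll_px_per_s = 1000            # assumed playfield scroll speed
blur_px = scroll_px_per_s * (frame_time_ms / 1000)
print(f"{frame_time_ms:.0f} ms persistence -> ~{blur_px:.0f} px of blur at {scroll_px_per_s} px/s")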

I guess the main point of my original post was to ask whether older operating systems or video card drivers added black frames or something, in anticipation of the low FPS and the knowledge that people were using CRTs, with the technology's limitations. Getting 25 to 30 FPS was pretty much the norm in the early days of PC gaming.

I mean, software developers had to be aware of this phenomenon when they were programming their games to run at those framerates, right? Right??? There's no way Blizzard spent all that money making Diablo 2, hardcoded it to run at 25 FPS, playtested it, and went, "yeah, this is okay, I enjoy throwing up every time I play this and our customers will, too".

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: CRT and Stuttering - False memories?

Post by Chief Blur Buster » 19 Aug 2019, 15:30

Talrivian wrote:I guess the main point of my original post was to ask whether older operating systems or video card drivers added black frames or something, in anticipation of the low FPS and the knowledge that people were using CRTs, with the technology's limitations. Getting 25 to 30 FPS was pretty much the norm in the early days of PC gaming.
No, they did not.

Are you playing Diablo natively or through virtualization? Make sure to play natively, since emulators & other virtualization can add stutter that wasn't there before. For example, 70Hz VGA is sometimes emulated at 60Hz, so you get stutter even with emulated framerate=Hz. Eliminate those stutter weak links and make sure your emulator is stutterless at the video mode you plan to run your game at. Then you're left with only the game's native stutter.
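Here's a quick sketch of that weak link: a toy model of a 70Hz source resampled onto a 60Hz output (illustrative numbers only, not any particular emulator):

Code: Select all

# Toy model: 70 fps source frames resampled onto a 60 Hz output.
# Each output refresh shows the newest source frame, so some source
# frames are never displayed at all -- visible stutter ~10x per second.
src_hz, out_hz = 70, 60
shown = {int(r * src_hz / out_hz) for r in range(out_hz)}   # one second of output
dropped = [f for f in range(src_hz) if f not in shown]
print(f"{len(dropped)} of {src_hz} source frames dropped/sec:", dropped)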

Also, there's an additional psychological factor: It's a case of getting used to something.

-- Just like when we switch from 60Hz to 144Hz, some people now find it unbearable to switch back to 60Hz.
-- Likewise, people were used to CRT stutter, and then got used to LCD blur hiding stutters, then switching back makes it look unbearable.

So there may be an element of 'false memory', but it's grounded simply in getting used to something better (in certain ways). LCDs were worse for motion blur, but better at hiding stutters. We just got used to that.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


Jason38
Posts: 102
Joined: 24 May 2019, 10:23

Re: CRT and Stuttering - False memories?

Post by Jason38 » 19 Aug 2019, 16:12

Chief Blur Buster wrote:TL;DR: Doing fps = Hz always looks better when using impulse/strobed display tech
When I play my Retro AVS, which is an FPGA-based Nintendo, on my 24" LG UltraGear 24GL600F-B, it's awesome. To me, this is the most flicker-free, comfortable way to play retro games. I have the Mega SG and Super NT, FPGA Genesis and SNES consoles respectively, and both play great on this monitor. I find them easier on my eyes on this screen than on my plasma TV.

Interestingly, I was at a garage sale where someone was giving away a NEC CRT, so I took it. I tried hooking it up to a Windows 10 machine and thought I was going to die from the flicker; 5 minutes made me feel terrible. It's strange, though, because when I hook up my Sony Wega KV-20FS120 and play N64, it doesn't feel terrible like the NEC hooked up to Windows 10. Makes me wonder if I would get a different result with Windows XP and older hardware?

I can't notice the motion blur with the LG UltraGear, and that's something that almost always bothers me and causes me eyestrain. I play Sonic on it and don't notice it causing me any strain.

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: CRT and Stuttering - False memories?

Post by Chief Blur Buster » 19 Aug 2019, 17:53

Jason38 wrote:Interestingly, I was at a garage sale where someone was giving away a NEC CRT, so I took it. I tried hooking it up to a Windows 10 machine and thought I was going to die from the flicker; 5 minutes made me feel terrible. It's strange, though, because when I hook up my Sony Wega KV-20FS120 and play N64, it doesn't feel terrible like the NEC hooked up to Windows 10.
That's to be expected, especially if you used the TV in a living-room manner
(brightness set to a comfortable level, a room with sufficient lighting, etc.)
Jason38 wrote:Makes me wonder if I would get a different result with Windows XP and older hardware?
Nope. Same problem happens to other people.

The explanation is simple. Elementary, in Sherlock Holmes terms! The reasons are multifold:

(1) Brightness of Windows themes. Windows has a lot of white-colored windows. Bright = harsh.
(2) Loss of acclimation. You're no longer used to it.
(3) The default refresh rate in Windows 10 is 60Hz.
(4) Viewing distance. Sitting too close to a 60Hz CRT.

You must use NVIDIA Control Panel, AMD Catalyst Control Center, or ToastyX CRU to properly configure a higher Hz such as 85Hz. That partially compensates for the factors above. Back in the day, 75Hz and 85Hz options were easily available; today they're configured through a different mechanism. Did you raise the refresh rate from the default 60Hz setting?
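If you want to sanity-check which refresh rates your driver actually exposes before reaching for CRU, here's a rough Python/ctypes sketch around the Win32 EnumDisplaySettingsW call (the DEVMODEW layout is hand-transcribed here, so treat it as a starting point, not gospel):

Code: Select all

import ctypes
from ctypes import wintypes

class DEVMODEW(ctypes.Structure):
    # Hand-transcribed subset of the Win32 DEVMODEW struct; the two
    # padding blobs stand in for fields we don't read here.
    _fields_ = [
        ("dmDeviceName",       wintypes.WCHAR * 32),
        ("dmSpecVersion",      wintypes.WORD),
        ("dmDriverVersion",    wintypes.WORD),
        ("dmSize",             wintypes.WORD),
        ("dmDriverExtra",      wintypes.WORD),
        ("dmFields",           wintypes.DWORD),
        ("_posUnion",          ctypes.c_byte * 16),   # position/orientation union
        ("dmColor",            ctypes.c_short),
        ("dmDuplex",           ctypes.c_short),
        ("dmYResolution",      ctypes.c_short),
        ("dmTTOption",         ctypes.c_short),
        ("dmCollate",          ctypes.c_short),
        ("dmFormName",         wintypes.WCHAR * 32),
        ("dmLogPixels",        wintypes.WORD),
        ("dmBitsPerPel",       wintypes.DWORD),
        ("dmPelsWidth",        wintypes.DWORD),
        ("dmPelsHeight",       wintypes.DWORD),
        ("dmDisplayFlags",     wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("_tail",              wintypes.DWORD * 8),   # ICM/panning fields, unused
    ]

user32 = ctypes.windll.user32
mode = DEVMODEW()
mode.dmSize = ctypes.sizeof(DEVMODEW)

# Enumerate every mode the driver exposes for the primary display.
rates, i = {}, 0
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(mode)):
    res = (mode.dmPelsWidth, mode.dmPelsHeight)
    rates.setdefault(res, set()).add(mode.dmDisplayFrequency)
    i += 1

for res, hz in sorted(rates.items()):
    print(f"{res[0]}x{res[1]}: {sorted(hz)} Hz")

If your CRT's 85Hz mode doesn't show up in that list, that's exactly the gap ToastyX CRU fills.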

I've heard this over and over, and it essentially boils down to (1)/(2)/(3)/(4) and similar factors.
Jason38 wrote:I can't notice the motion blur with the LG ultragear and this is something that almost always bothers me and causes me eyestrain. I play Sonic on it and can't notice it causing me any strain.
Use a higher refresh rate for PC games, but stick to games that allow framerate = Hz.
Too low Hz = too much flicker.
Too high Hz = too hard for the framerate to match the Hz.

Find your goldilocks Hz. I recommend at least 100Hz for impulse-driven displays viewed at computer-viewing distances.
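The goldilocks logic is simple enough to write down. A toy Python picker (the numbers below are assumptions; plug in the modes your CRT actually accepts and the worst-case fps your GPU actually sustains):

Code: Select all

# Highest Hz that is (a) comfortable flicker-wise and (b) matchable with fps=Hz.
candidate_hz = [60, 75, 85, 100, 120]   # modes your CRT accepts (assumed)
sustainable_fps = 110                   # assumed worst-case fps in your game
MIN_COMFORT_HZ = 100                    # my recommendation for close-up impulse displays

usable = [hz for hz in candidate_hz if MIN_COMFORT_HZ <= hz <= sustainable_fps]
print("Goldilocks Hz:", max(usable) if usable else "none -- lower settings or upgrade GPU")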

We're familiar with this at Blur Busters...
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


masneb
Posts: 239
Joined: 15 Apr 2019, 03:04

Re: CRT and Stuttering - False memories?

Post by masneb » 19 Aug 2019, 19:03

Talrivian wrote:Masneb, I think you misunderstood me.

1) Diablo 2 runs natively at 25 FPS; you don't cap it yourself. [...]

2) As for Half-Life 1, I purposely capped my FPS to 30 to emulate how the game ran on my old PC in the 90s. [...]

I'm guessing when the game processes at higher FPS, it must just duplicate frames, which still causes the same issues with the CRT's strobed method of displaying frames.
I didn't misunderstand you. I specifically said that you shouldn't cap your FPS unless there is a very specific reason to, as it leads to a lot of other anomalous behavior, especially at very low CPU utilization. I also said that, due to the extreme amount of patching over the years, especially to dependencies, you can't test something like this and have it be the same as when you first played these games: not only were the games not made for hardware anything like what you're playing on, but their software dependencies may have completely changed over the years.

That's why I said: when people do stuff like this, they test it on old hardware and old builds of software from when these games came out, or they run into all sorts of weird bugs.


Yes, 60Hz is hard on your eyes (just like ULMB). Some monitors are worse than others. That isn't specific to CRTs; the coating on the monitor and the brightness of the monitor itself matter too.
