Overclocking VGA Input chips

Talk about overclocking displays at a higher refresh rate. This includes homebrew, 165Hz, QNIX, Catleap, Overlord Tempest, SEIKI displays, certain HDTVs, and other overclockable displays.
RLBURNSIDE
Posts: 104
Joined: 06 Apr 2015, 16:09

Overclocking VGA Input chips

Post by RLBURNSIDE » 06 Nov 2015, 13:53

Hello fellow Blurbusters!

So I've been playing around with my BenQ w1070 projector to max out its capabilities, and I found out that I can reach 70Hz at 1080p by using the VGA port. Since VGA is analog, this has the added benefit of opening up 10-bit color support, which the projector handles (it supports 1.07 billion colors, probably through FRC or something, but who cares). If I use HDMI, I have to drop to 4:2:2 YCbCr, and that blurs text, so I'd rather not switch back and forth. Of course, because I have an NVidia card, I don't really get 10-bit desktop use (what a bunch of doofuses over there), but I can get it in full-screen exclusive D3D11 apps like games or movie players, which benefit from 10-bit color support. Also, when UHD Blu-rays and 4K sources become more common, 10-bit color and P3 colorspace support will become more widely available, further adding to my incentive to activate 10-bit on my projector through the VGA port.
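As a sanity check on that "1.07 billion colors" spec: 10 bits per channel across three channels gives 2^30 combinations, versus 2^24 at 8-bit. A quick sketch of the arithmetic (nothing projector-specific here, just the bit-depth math):

```python
# Number of representable colors at a given bit depth per channel,
# assuming three color channels (R, G, B).
def colors(bits_per_channel):
    return (2 ** bits_per_channel) ** 3

print(colors(8))   # -> 16777216   (~16.7 million, standard 8-bit)
print(colors(10))  # -> 1073741824 (~1.07 billion, matching the spec sheet)
```

So the spec-sheet figure lines up exactly with true 10-bit per channel, whether it's achieved natively or via FRC dithering.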

My big question is: has anyone ever soldered anything onto their VGA chips or ADCs to accept higher refresh rates? I'd love to get 72Hz at 10-bit going instead of 70Hz, or even higher. 72Hz is a multiple of 24, so it has obvious benefits for movie playback, even when you use interpolation like I do, since only two out of every three frames will be interpolated and things should look a bit sharper.
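For a sense of what the VGA ADC actually has to sustain: the pixel clock scales linearly with refresh rate, so the jump from 70Hz to 72Hz is only about a 3% ask. A rough sketch (the blanking figures below are assumed reduced-blanking-style values, not the w1070's actual timings):

```python
# Rough pixel-clock estimate: total pixels per frame * refresh rate.
# The blanking intervals (160 px horizontal, 31 lines vertical) are
# assumed reduced-blanking-style values, NOT measured from the projector.

def pixel_clock_mhz(h_active, v_active, h_blank, v_blank, refresh_hz):
    """Pixel clock in MHz for the given active area, blanking, and refresh rate."""
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

for hz in (60, 70, 72):
    print(f"{hz} Hz -> {pixel_clock_mhz(1920, 1080, 160, 31, hz):.1f} MHz")
# 60 Hz -> 138.7 MHz
# 70 Hz -> 161.8 MHz
# 72 Hz -> 166.4 MHz
```

If the ADC gives out right between those last two numbers, a small clock or voltage bump might be all it takes; if not, the gap is bigger than it looks.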

What's the best way to boost from 70 to 72Hz? I'm handy with a soldering iron and have completely taken apart my projector and rebuilt it several times over the years, plus I'm going to buy a new one soon anyway, so I'd like to try it.

If I get this working, I'm strongly considering buying a yellow notch filter to boost my colours from Rec. 709 to DCI-P3, although so far the cheapest one I've found is 428 bucks. Or I might try to score some red and green quantum dots plus a blue laser and hack my projector into a silent laser projector. Then I could boost the light output by quite a bit by adding more lasers, and cut out the light every 8ms to reduce image persistence and motion blur. I'm also thinking of using a UV laser, with all three primaries in quantum dots, and then of course an IR and UV filter before the light hits the DMD. Thing is, blue lasers are dirt cheap now thanks to Blu-ray, so I think I might start experimenting with that first.

My question to this forum is: has anyone ever opened up an old CRT and soldered in a resistor or something to change the clock speed of their VGA ports? I used to do this with my old "Turbo" button to overclock my Pentium from 75MHz to 100MHz back in the day. Ah, those were the days.

AustinClark
Posts: 42
Joined: 14 Mar 2015, 00:03

Re: Overclocking VGA Input chips

Post by AustinClark » 06 Nov 2015, 15:38

It depends on what's causing the bottleneck: either the clock rate won't increase further, or the ICs used can't handle the increased clock rate.

If the ICs can't handle the higher clock rate, slightly increasing the voltage may provide just enough extra headroom (I've been wanting to apply this concept to overclockable monitors as well).

However, if the clock rate won't increase further, then you're going to have a really hard time finding a simple fix. Everything has to be in sync, with tight timing constraints (especially if you're running at a higher-than-intended rate).
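One way to see that constraint: if the ADC's sampling clock is capped, the achievable refresh rate is just that cap divided by total pixels per frame, so the only software-side lever is tightening the blanking intervals. A sketch (the 165 MHz cap and the blanking totals are illustrative assumptions, not measured values):

```python
# If the pixel clock is capped, max refresh = clock cap / total pixels per frame.
# The 165 MHz cap and the blanking totals below are illustrative assumptions.

def max_refresh_hz(pclk_limit_hz, h_total, v_total):
    """Highest refresh rate a capped pixel clock can drive at the given frame size."""
    return pclk_limit_hz / (h_total * v_total)

cap = 165e6  # assumed ADC pixel-clock limit
print(f"Stock blanking (2080x1111 total):   {max_refresh_hz(cap, 2080, 1111):.1f} Hz")
print(f"Tighter blanking (2000x1100 total): {max_refresh_hz(cap, 2000, 1100):.1f} Hz")
# Stock blanking (2080x1111 total):   71.4 Hz
# Tighter blanking (2000x1100 total): 75.0 Hz
```

With numbers like these, shaving blanking in a custom resolution utility can clear 72Hz without touching the hardware at all; the soldering iron only comes out if the clock itself is the hard limit.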
