flood wrote: i'm in the process of (re)building my photodiode circuit
this time it will have bandwidth above 1MHz to allow for true microsecond measurements
@op there's a lot of stuff i can share with you if you'd like. depends on how accurate/precise you want things to be in the end. i'm aiming for <1us for my new setup and sensitivity such that 1 scanline from my crt is detectable.
but to start with:
1. ideally you want a detector that looks at the entire screen, not just one small part. because unless you have vsync, any part of the screen could be the first part to be updated
2. photoresistors tend to be slow, but if ~1ms is enough for you they might be ok. you still want to know how much their response time contributes to systematic error though. i'm not sure what the easiest way to measure that would be.
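One way to quantify a slow sensor's contribution, as point 2 suggests, is to record its step response and fit a time constant to it. Below is a minimal sketch (not from the original posts) assuming the sensor behaves like a first-order RC system, `v(t) = v_final * (1 - exp(-t/tau))`; the function name and the synthetic data are illustrative only.

```python
import math

def estimate_time_constant(times, values, v_final):
    """Estimate the RC-style time constant of a slow sensor (e.g. an LDR)
    from its step response, assuming v(t) = v_final * (1 - exp(-t/tau)).
    Log-linearize: ln(1 - v/v_final) = -t/tau, then fit the slope."""
    xs, ys = [], []
    for t, v in zip(times, values):
        frac = 1.0 - v / v_final
        if frac > 1e-6:              # skip samples at/near saturation (log(0))
            xs.append(t)
            ys.append(math.log(frac))
    # least-squares slope of a line through the origin: sum(x*y) / sum(x*x)
    slope = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
    return -1.0 / slope              # tau, in the same units as `times`

# synthetic step response with tau = 2.0 ms, sampled every 0.1 ms
tau = 2.0
ts = [0.1 * i for i in range(1, 100)]
vs = [1.0 - math.exp(-t / tau) for t in ts]
print(estimate_time_constant(ts, vs, 1.0))  # recovers ~2.0
```

Once tau is known, its contribution can either be subtracted out as a fixed offset or at least bounded as part of the error budget.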
Thanks, flood, for your offer. We can join forces, put Leo Bodnar to shame, and build another measurement standard. My approach, as I already mentioned, is to take the USB input lag into consideration; my device already has better accuracy (0.1ms) than the LB. Regarding vsync off: I've managed to draw everything during vblank, so the very top of the screen gives me consistent results and there is no need to scan the whole screen.
The main issue I'm struggling with now is the 50% threshold of pixel brightness. As you probably know, display brightness output is not linear; that's what the 2.2 gamma compensation is for. On my monitor I measure 4.5ms for the 10%-50% transition and 7.4ms for 50%-90%.
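The gamma point matters for where the threshold sits: a photodiode measures linear light, but "50% brightness" is usually meant in encoded (perceptual) signal terms. Assuming a pure 2.2 power-law display (a simplification; real panels only approximate this), the mapping is a one-liner:

```python
GAMMA = 2.2  # assumed pure power-law display transfer function

def linear_threshold(signal_fraction, gamma=GAMMA):
    """Map a fraction of the encoded (framebuffer) signal to the fraction
    of linear light a photodiode actually sees."""
    return signal_fraction ** gamma

# "50% grey" in the framebuffer is only ~21.8% of peak luminance:
print(linear_threshold(0.5))  # ~0.2176
```

So triggering at half of the photodiode's final voltage is not the same event as the pixel reaching half of its encoded value; the linear-light crossing for "50% grey" happens much earlier in the rise.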
What do you think I should do? Apply a curve to compensate so the transition times come out equal, or scale to the respective 50% brightness by looking at the high-speed recording?
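Whichever threshold convention is chosen, the timestamp itself shouldn't be limited to the sample grid. A minimal sketch (my own illustration, not from the thread) that finds the first threshold crossing of a sampled photodiode waveform with linear interpolation between the two bracketing samples:

```python
def crossing_time(times, values, threshold):
    """Return the time at which a rising sampled waveform first crosses
    `threshold`, linearly interpolating between bracketing samples.
    Returns None if the threshold is never crossed."""
    for t0, v0, t1, v1 in zip(times, values, times[1:], values[1:]):
        if v0 < threshold <= v1:
            return t0 + (threshold - v0) / (v1 - v0) * (t1 - t0)
    return None

# toy rising waveform, times in ms
ts = [0.0, 1.0, 2.0, 3.0]
vs = [0.0, 0.1, 0.4, 0.9]
print(crossing_time(ts, vs, 0.25))  # 1.5
```

The interpolation keeps the timing resolution below the sampling interval, which matters once the target accuracy is 0.1ms or better.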