New Easy Free Pursuit Camera: Smartphone. No rail needed.
Posted: 30 Dec 2018, 20:09
Easy Pursuit Camera 101
Hand-Waved iPhone/Android Without A Rail
First, let's explain the invention.
This is a Blur Busters display testing invention of mine -- some of you may already be familiar with it from the Pursuit Camera Instructions. Background info: NOKIA, NIST.gov, and KELTEK researchers co-authored a conference paper here:
With a rail, one can capture really good pursuit camera photographs like this:
Even though this is much less expensive than commercial equipment, not all readers can afford a camera rail.
But, in case you didn't know yet... display motion blur photography can now be done without a rail, for end-user/amateur purposes.
Now if you look at TestUFO Ghosting, you'll see my temporal test pattern invention (the tickmarks that form a horizontal ladder when eye-tracking the UFOs):
1. Stationary gaze: The sync track is disjointed/broken.
2. Eye-tracking the UFOs: The sync track aligns into what looks like a horizontal ladder.
If you photograph it with a stationary camera, the photograph is inaccurate for the eye-tracking situation (display motion blur behaves differently while eye-tracking, see www.testufo.com/eyetracking ... it's the sample-and-hold effect).
Instructions for easy pursuit camera with smartphones. No rail needed:
1. Go to www.testufo.com/ghosting and put it in full screen mode.
2. Start your camera and then tap-hold the screen to lock your autofocus to close range (easier to do on the stationary text at the top of TestUFO)
3. Start recording video with your smartphone
4. Hand-pan the camera sideways a few times, and you'll see the sync track
Just like this video:
(You can freezeframe this video and use the , and . hotkeys to single-step to clearest freeze frames.)
Many can do much better than this low-quality video (a random "first try" on an old iPad mini); however, it demonstrates how much more accurate even a bad pursuit camera (end-user motion blur photography) can be than a static camera.
You can observe how WYSIWYG the horizontal ladder is -- and ghosting streaks are. This is just an iPad camera at default settings, not even using the "Pro Camera" app that lets you set camera exposure length. Full camera adjustment (exposure, white balance, etc) is preferred for more scientific results. However, in a pinch, this is great for remote troubleshooting -- aka end-user ghosting-artifact troubleshooting.
Certainly, these are not perfect (e.g. smartphone focus issues, shaky hands, etc), but it is a clear demonstration of how a moving camera (tracking the motion) is a better equivalent of a moving eyeball (tracking the motion). Ideal exposure for a pursuit camera is approximately 1/30sec (or four refresh cycles at 120Hz) -- roughly matching human vision integration times -- but there's some leeway for deviations from this, with increasing error margins.
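To illustrate the exposure guideline above, here's a minimal sketch (my own illustrative helper, not an official Blur Busters tool) that converts the ~1/30sec target into a whole number of refresh cycles for a given refresh rate:

```python
# Sketch: pick a pursuit-camera exposure as a whole number of refresh cycles,
# closest to the ~1/30 sec guideline (four refresh cycles at 120 Hz = 1/30 sec).
# The function name and defaults are illustrative assumptions.

def pursuit_exposure(refresh_hz, target_s=1/30):
    """Return (cycles, exposure_seconds) nearest the target exposure."""
    cycles = max(1, round(target_s * refresh_hz))
    return cycles, cycles / refresh_hz

for hz in (60, 120, 144, 240):
    cycles, exposure = pursuit_exposure(hz)
    print(f"{hz} Hz -> {cycles} refresh cycles = {exposure*1000:.1f} ms")
```

For example, at 120Hz this yields 4 refresh cycles (exactly 1/30sec), while at 144Hz it yields 5 cycles (~34.7ms), within the stated leeway.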
Now you understand pursuit camera! And it's easy to learn.
This can also be done with other TestUFO patterns such as www.testufo.com/eyetracking and www.testufo.com/persistence, although without a sync track it's harder to match the camera's panning speed to the motion.
But for troubleshooting simple motion blur behaviours (e.g. helping a user), it can still have enough visual data to be useful. Here's an example of 120fps having roughly half the motion blur of 60fps, and 240fps having roughly half the motion blur of 120fps. Exactly as you saw in person. (Note: GtG limitations start limiting differences, as refresh durations get shorter, so faster GtG will amplify differences between 120fps and 240fps).
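The halving behaviour above follows from sample-and-hold math: on a full-persistence display, the eye-tracked blur trail width is roughly motion speed multiplied by frame visibility time (1/framerate). A small sketch (the speed value is an assumed illustrative number, not a measurement):

```python
# Sketch of sample-and-hold motion blur: perceived blur width in pixels is
# approximately motion speed (px/sec) x frame visibility time (1/fps),
# assuming full persistence and perfect eye-tracking. Illustrative numbers.

def blur_width_px(speed_px_per_sec, fps):
    """Approximate eye-tracked blur trail width in pixels."""
    return speed_px_per_sec / fps

speed = 960  # assumed motion speed in pixels/second
for fps in (60, 120, 240):
    print(f"{fps} fps -> ~{blur_width_px(speed, fps):.0f} px of blur")
# Doubling the framerate halves the blur width, matching the comparison above.
```

This is why 120fps shows roughly half the blur of 60fps, until GtG limitations begin to dominate at shorter refresh durations.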
While not as accurate as a rail, it's very clear that the different framerates have different motion blurs, in an uncannily accurate match to what your human eye saw during eye-tracking.
Further Advanced Reading
For more reading about pursuit camera photography:
- See pursuit camera thread
- See Display Testers/Reviewers using our pursuit camera technique
- See Full Instructions For Pursuit Camera
- See My Pursuit Camera Paper (peer reviewed) -- co-authored with NOKIA, NIST.gov, and KELTEK; the conference paper is on ResearchGate by yours truly, Mark Rejhon (Chief Blur Buster).
This test is currently adopted by RTINGS.com, HDTVtest.co.uk, HDTV Poland, TFTCentral.co.uk, and other sites.
But even you can practice pursuit photography to capture a relatively accurate WYSIWYG approximation of human-perceived display motion blur! Most of these sites use a rail, but some now do it raillessly and single-step through the video to find the most accurate freezeframe -- free pursuit camera photography using just your smartphone and your hand!
(If you run a commercial site, and decide to use a pursuit camera -- just remember to credit Blur Busters for the invention and link to us -- and please consider using a rail for improved accuracy. That said, it is indisputable that even hand pursuits can produce more accurate WYSIWYG display motion blur effects than stationary photography.)