I don't know if they're paid or not. Honestly, I don't care.
You do know, because I spoon-fed you the evidence; denying it at this point is just being intentionally obtuse. Knowing that a reviewer is being paid is an extremely important tool for staying skeptical. Purposely ignoring this information is neither chivalrous nor endearing; it just makes you seem gullible and naive. Not only did I lay it out for you, the description of the exact video you shared even says they are being paid for it:
"Review unit provided free of charge by ASUS. This video is sponsored by Be Quiet. As per Hardware Canucks guidelines, no review direction was received from manufacturer. As an Amazon Associate we earn from qualifying purchases."
Since you are basing your entire view solely on images from these sources, you need to know these things: Are they just pushing product? How did they create these images? What equipment did they use for a given test? Openly admitting to not watching any of the tests, refusing to look at the charts or other data, and then claiming you know exactly how all of this works is just ridiculous. Is your last name Trump, by chance?
When I look at the pictures, let's take the spaceship one you pointed out, I easily prefer the ELMB-Sync version, without a doubt. It looks like a vast improvement to me.
I agree, let's review them again. We can start with how TechSpot's review shows DRASTICALLY more ghosting at "TraceFree 80" than RTINGS' version of the same example. Why are they so different? I have a hard time believing TechSpot labeled this correctly, because it looks like their example has zero overdrive applied. Either that, or they did not use the standard 960 pixels per second as the baseline test speed. One thing they DO have in common, though, is that neither of these ELMB-free examples shows the horrible artifacting of the Overwatch comparison image you showed us.
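For anyone following along, here's a rough sanity check on why the baseline speed matters. This is just my back-of-envelope sketch, not from either review: it assumes the usual approximation that perceived motion blur is roughly pursuit speed times pixel persistence, with full-refresh persistence for a sample-and-hold panel.

```python
# Approximate motion-blur trail length for a pursuit-camera test.
# Assumption (mine): blur in pixels ~= pursuit speed (px/s) * persistence (s).

def blur_px(pursuit_speed_px_s: float, persistence_ms: float) -> float:
    """Blur trail length in pixels: speed times persistence."""
    return pursuit_speed_px_s * persistence_ms / 1000.0

# Full-persistence 144 Hz panel at the standard 960 px/s baseline:
print(round(blur_px(960, 1000 / 144), 1))  # ~6.7 px trail
# A hypothetical ~1 ms strobed backlight (ELMB-style) at the same speed:
print(round(blur_px(960, 1.0), 2))         # ~0.96 px trail
```

So if one site ran the pursuit at a different speed than 960 px/s, the trail lengths simply aren't comparable, which could explain part of the discrepancy.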
So we need to ask ourselves: why is there, again, such a difference in visual quality? If I were a betting man, I would assume it's because they are not using the pursuit camera correctly, which results in a false representation of what you would actually see in person; the multiple images on the 'ELMB off' side come from the camera's shutter, not from the monitor itself. Luckily, Marc was smart enough to find a solution to this and created the pixel trackers, so we can answer this very quickly: https://youtu.be/LRTFZMdn714?t=342
clearly shows the recording to be out of sync, supporting my theory. Now compare that to when they actually had it in sync here: https://youtu.be/LRTFZMdn714?t=347
and you can see a huge difference. I believe this is the same reason the 'ELMB off' Overwatch comparison photo you provided looks so bad.
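To make the camera-shutter theory concrete, here's a hypothetical toy model (my own simplification, not anyone's test methodology): if the camera's exposure spans several refresh cycles while the camera is NOT tracking at exactly the monitor's scroll speed, each refresh leaves its own offset copy of the object, so the photo shows duplicate images the eye would never see in person.

```python
import math

# Toy model: number of offset duplicate images captured when the camera
# drifts relative to the scrolling content. Assumption (mine): one offset
# copy is left per refresh cycle that falls inside the exposure window.

def duplicate_images(exposure_ms: float, refresh_hz: float) -> int:
    """Refresh cycles that fit inside the exposure window."""
    return math.ceil(exposure_ms / 1000 * refresh_hz)

print(duplicate_images(1000 / 30, 120))  # 1/30 s exposure at 120 Hz -> 4 copies
```

A properly synced pursuit camera cancels that drift, which is exactly what the pixel-tracker check in the video is for.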
This is not even remotely close to what it would look like in person. It is either intentional, to push product, or unintentional, due to lack of experience. Either way, it shows that you should not take data points from them to make an educated decision.
But I do get you: you say that's not your experience with ELMB-Sync. Then I asked whether it could be because your model was different, and thankfully the site admin answered that part.
No, this is another attempt to validate your original agenda. Marc said nothing confirming that a different model would not have the issues you asked about. Your exact words were: "Also could this "2/3 of the screen is bad, 1/3 is perfect" thing be related to the XG version or your particular monitor?" That has nothing to do with what he was addressing. The '2/3 bad, 1/3 good' issue is NOT tied only to multi-strobing. In fact, every single BFI mode available right now has this issue, including his own calibrated and endorsed version, as shown in the link I provided. All he was saying is that we do not know what is causing the issues on ALL of the ELMB-Sync sets, and we should not lump them into one category while slapping a 'multi-frame strobing sucks' sticker on them as the cause; they should all be tested individually, ideally by the same person. There are multiple testimonies on this very forum, easy to find, where people have reported these same issues over and over.
At this point this is becoming a bit silly, and this topic should probably just be deleted to stop future buyers from seeing your "evidence" and thinking it's legitimate because it's on this website. There should instead be an informative sticky post where we can discuss it as a group.