The Official *MSI Optix MAG251RX* Owners Thread

Everything about displays and monitors. 120Hz, 144Hz, 240Hz, 4K, 1440p, input lag, display shopping, monitor purchase decisions, compare, versus, debate, and more. Questions? Just ask!
RLCSContender*
Posts: 541
Joined: 13 Jan 2021, 22:49
Contact:

Re: The Official *MSI Optix MAG251RX* Owners Thread

Post by RLCSContender* » 22 Sep 2020, 11:36

Joel D wrote:
22 Sep 2020, 05:09
Thanks for clearing that up, guys.

Besides the lag though, I didn't hear any elaboration on this statement, which IMO is the biggest concern.
RLCScontender wrote:
21 Sep 2020, 03:44
scaling via GPU which adds slight input lag and slight blur due to scaling
Is this indeed true? Where does this come from, technically speaking?

Personally, I never downscale. But I do want the most efficient settings, for obvious reasons. I have an RTX 2080 Ti, so I also want to do what's best for my personal setup.

RLCScontender, I see you mentioned in Step 5 to "switch to 10 bit color". Well, I've got my settings set like you said, but I can never change the color bit depth. It's grayed out, and when I do manage to un-gray it, the only choice is 8-bit anyway. So what am I doing wrong, and how do I make that setting offer 10-bit?

BTW, I have an iPhone; which app does the white balance readings on iPhone? And what are you using to adjust the R/G/B? Nvidia CP? Or something else?

Thanks !
i mean, if you want black bars, micro stutter, input lag, and a slightly blurry image, that's on you.

Without CRU, any slight deviation from the 240Hz refresh cadence will cause micro stutter. To keep it stable, the frame rate needs to sit at 240fps constantly, and the best way to do that without introducing input lag is to lower the resolution slightly (that way, I still get to keep some settings that would otherwise lower FPS, such as image quality).
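To put rough numbers on that (a quick sketch, assuming a fixed 240Hz refresh with no VRR engaged; the fps values are just examples):

Code: Select all

# Why frame rates that drift off the 240Hz cadence microstutter.
# Sketch assumes a fixed 240Hz refresh (no VRR) with VSYNC-style slot alignment.
REFRESH_HZ = 240
slot = 1.0 / REFRESH_HZ  # one refresh slot ~= 4.167 ms

for fps in (240, 235, 230):
    frame_time = 1.0 / fps
    # A frame that overshoots its slot is held until the next refresh -> judder.
    slots_needed = -(-frame_time // slot)  # ceiling division
    held_ms = slots_needed * slot * 1000
    print(f"{fps} fps: frame takes {frame_time * 1000:.3f} ms, "
          f"held for {held_ms:.3f} ms ({int(slots_needed)} slot(s))")

The moment a frame overshoots its ~4.167ms slot, it gets held a whole extra refresh (8.333ms), and that alternation is the microstutter.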

Via GPU scaling, if you try to lower the resolution (say you want stable FPS, or want to hit a 360fps frame rate because you bought a 360Hz monitor), black bars will show, or it will automatically stretch the image for you. In my opinion that makes the image extremely ugly (blurry), since the GPU is scaling from a different resolution/aspect ratio.
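The geometry behind that, as a sketch (illustrative only; drivers typically label these options "No scaling" / "Aspect ratio" / "Full-screen"):

Code: Select all

# 1:1 (centered) vs stretched scaling of a lower-than-native mode
# on a 1920x1080 panel. Illustrative sketch, not tied to any driver API.
PANEL_W, PANEL_H = 1920, 1080

def one_to_one(w, h):
    # Image stays pixel-perfect; the unused panel area becomes black bars.
    return {"image": (w, h),
            "bars_left_right": (PANEL_W - w) // 2,
            "bars_top_bottom": (PANEL_H - h) // 2}

def stretched(w, h):
    # Non-integer scale factors force interpolation -> the blur in question.
    return {"scale_x": PANEL_W / w, "scale_y": PANEL_H / h}

print(one_to_one(1600, 900))  # 160 px bars left/right, 90 px top/bottom
print(stretched(1600, 900))   # 1.2x resample in both axes -> softer image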

Latency: again, for most people who own modern GPUs, the differences in latency are negligible; however, I can easily tell the difference since I play a dexterity-heavy game. For most people it's very difficult to notice, but for me it can mean getting fired and losing my sponsorship. I need as little input lag as possible (especially now that sponsored esports tournaments such as DreamHack, RLCS, etc. are being run online due to the coronavirus pandemic).

As for 10-bit color depth: even at 240Hz, I can easily change that option (from 8-bit to 10-bit) in Nvidia Control Panel. It's unusual that you don't have that option. Are you on "RGB"? Also, have you tried changing the DP cable? I've owned FOUR MSI MAG251RXs, and 3 of the cables had missing pins/pin connectors, which caused derpy things to happen (random black screens, random display shutdowns, purple-looking jaggies, etc.).
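A rough bandwidth sanity check shows why this exact mode is so cable-sensitive (a sketch; the blanking overhead is an assumed CVT-R2-style figure, and real timings vary):

Code: Select all

# Approximate DisplayPort bandwidth needed for 1080p 240Hz at 8 vs 10 bpc.
# The 12% blanking overhead is an assumption (CVT-R2-ish); exact timings vary.
ACTIVE_PIXELS = 1920 * 1080
BLANKING = 1.12  # total-vs-active pixel ratio (assumed)
HZ = 240

def gbit_per_s(bpc):
    return ACTIVE_PIXELS * BLANKING * HZ * (bpc * 3) / 1e9  # RGB = 3 channels

print(f" 8 bpc: ~{gbit_per_s(8):.1f} Gbit/s")   # ~13.4
print(f"10 bpc: ~{gbit_per_s(10):.1f} Gbit/s")  # ~16.7
# DP 1.2 (HBR2) carries ~17.28 Gbit/s after 8b/10b coding; DP 1.4 (HBR3) ~25.92.
# 10 bpc at 240Hz sits right at the HBR2 ceiling, so a marginal cable with
# damaged pins can plausibly break exactly this mode first.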

tl;dr


guys, it's a TWO minute fix. Just watch this video and you are set for life (you can even lower the resolution if you wish, to get more FPS with no lag, black bars, or blurry scaling).

[embedded video]

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: The Official *MSI Optix MAG251RX* Owners Thread

Post by Chief Blur Buster » 22 Sep 2020, 13:41

RLCScontender wrote:
22 Sep 2020, 11:36
i mean, if you want black bars, micro stutter, input lag, and a slightly blurry image, that's on you.
RLCScontender,

*** Conflation Alarm ***
*** Conflation Alarm ***
*** Strawman Argument Alarm ***
*** Strawman Argument Alarm ***


Black bars are beneficial for some esports players because of peripheral vision.

Many professional players only stare at crosshairs in CS:GO and use peripheral vision to see the whole screen.

They don't move their eyes away from the crosshairs. When the screen is too big, they can't use their peripheral vision to see the whole screen reliably.

Scores go down.

Therefore, some of them shrink the image slightly by adding black bars around the image. Now they see everything with peripheral vision because the image is shrunk slightly.

Scores go up.

Not everyone operates their eyeballs the same way; everyone's eye-tracking behaviour differs. Some CS:GO players keep their eyes fixed only on the crosshairs, even when spotting enemies to the left and right, relying 100% on peripheral vision.

...Yes, Rocket League has no crosshairs. But other games do!
...Yes, not everyone plays games as a career. But some here do!
...Yes, you can sit further back from bigger screens. But not all desks/rooms are big enough, and some people are nearsighted!
...Yes, most want 24" because of peripheral vision. But sometimes people want a bigger screen (27", 32") for Windows use while keeping a smaller effective screen for CS:GO. And if you prefer 22", you still need black bars because there's no good 240Hz 22" monitor.

Black bars on all 4 sides allow you to have your cake and eat it too: one screen that can emulate either a bigger or a smaller display, on a small desk, without needing 2 monitors.

Black bars (1:1 scaling) sometimes reduce input lag when both the monitor's scaler and the GPU's scaler are laggy -- and the lower resolution means higher frame rates, which means less GPU rendering time, which means less lag, while producing a sharper 1:1 pixel-mapped image. Whether a higher resolution costs you lag depends on the game, but it can happen with lots of newer games.
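Back-of-envelope version of that render-time argument (a sketch assuming render time scales roughly with pixel count, which only holds for fill-rate-bound games; the 4ms baseline is a made-up example):

Code: Select all

# How dropping resolution can shave GPU render time (and therefore lag)
# in a fill-rate-bound game. Assumption: render time proportional to pixels.
def render_ms(base_ms_at_1080p, w, h):
    return base_ms_at_1080p * (w * h) / (1920 * 1080)

BASE = 4.0  # hypothetical 4 ms GPU frame time at 1920x1080
for w, h in ((1920, 1080), (1600, 900), (1280, 720)):
    print(f"{w}x{h}: ~{render_ms(BASE, w, h):.2f} ms per frame")
# 1920x1080: ~4.00 ms | 1600x900: ~2.78 ms | 1280x720: ~1.78 ms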

Don't tell career CS:GO professionals to lose money by playing on bigger monitors on shallow-depth desks without black bars, RLCScontender!

/micdrop
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


axaro1
Posts: 627
Joined: 23 Apr 2020, 12:00
Location: Milan, Italy

Re: The Official *MSI Optix MAG251RX* Owners Thread

Post by axaro1 » 22 Sep 2020, 13:49

RealNC wrote:
22 Sep 2020, 09:25
but when it comes to scaling latency, everyone just assumes the display's scaler is better, even though there's a huge variation of display scalers out there...
Can you elaborate? Is it possible to compare GPU scaling to display scaling?
XL2566K* | XV252QF* | LG C1* | HP OMEN X 25 | XL2546K | VG259QM | XG2402 | LS24F350[RIP]
*= currently owned



MONITOR: XL2566K custom VT: https://i.imgur.com/ylYkuLf.png
CPU: 5800x3d 102mhz BCLK
GPU: 3080FE undervolted
RAM: https://i.imgur.com/iwmraZB.png
MOUSE: Endgame Gear OP1 8k
KEYBOARD: Wooting 60he

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: The Official *MSI Optix MAG251RX* Owners Thread

Post by Chief Blur Buster » 22 Sep 2020, 13:53

axaro1 wrote:
22 Sep 2020, 13:49
Can you elaborate? Is it possible to compare GPU scaling to display scaling?
AFAIK, the NVIDIA latency analyzer can do that if you have the photodiode device.

I think right now it's below the noise floor (~0.1ms) on the supported screens, so you'll probably see identical results.

My internal Blur Busters latency tester (used for Blur Busters Approved) can also test scaling-lag differences.

axaro1
Posts: 627
Joined: 23 Apr 2020, 12:00
Location: Milan, Italy

Re: The Official *MSI Optix MAG251RX* Owners Thread

Post by axaro1 » 22 Sep 2020, 13:59

Chief Blur Buster wrote:
22 Sep 2020, 13:53
AFAIK, the NVIDIA latency analyzer can do that if you have the photodiode device.

I think right now it's below the noise floor (~0.1ms) on the supported screens, so you'll probably see identical results.

My internal Blur Busters latency tester (used for Blur Busters Approved) can also test scaling-lag differences.
Have you seen a noticeable discrepancy between high-refresh-rate display scalers and Nvidia/AMD GPU scalers?

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: The Official *MSI Optix MAG251RX* Owners Thread

Post by Chief Blur Buster » 22 Sep 2020, 14:03

axaro1 wrote:
22 Sep 2020, 13:59
Have you seen a noticeable discrepancy between high-refresh-rate display scalers and Nvidia/AMD GPU scalers?
Generally, display scalers vary a lot more (0-16.7ms) than recent GPU scalers (0 to a few ms). You get situations of "adds 1ms vs 1ms"... "adds 1ms vs 0ms"... "adds 0ms vs 1ms"... "adds 16.7ms vs 0ms", etc.

At the extremes, during VSYNC OFF, the laggiest display scalers are much laggier than the laggiest post-GTX-1080 GPU scaler.

All in all, it's really a coin flip these days for a random display connected to a random GPU. Thousands of displays/televisions and dozens of GPUs means it's hard to test every combination.

Now, USB-connected Arduino photodiode analyzers often have USB latency jitter, hiding those 0ms-vs-1ms differences below the noise floor of a Present()-to-photons latency analysis tool. They'll still catch the big offenders, though.
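A toy illustration of that noise floor (the jitter magnitude is an assumption for demonstration, not a measurement of any real device):

Code: Select all

# USB polling jitter vs a small scaler difference in a
# Present()-to-photons latency test. Jitter figure is assumed.
import random
import statistics

random.seed(1)

def measure(true_lag_ms, usb_jitter_ms=1.0, samples=500):
    readings = [true_lag_ms + random.uniform(0.0, usb_jitter_ms)
                for _ in range(samples)]
    return statistics.mean(readings), statistics.stdev(readings)

for lag in (10.0, 11.0):  # two hypothetical scalers, 1 ms apart
    mean, sd = measure(lag)
    print(f"true {lag:.1f} ms -> measured {mean:.2f} +/- {sd:.2f} ms")
# Single-shot readings of the two overlap heavily; only averaging many
# samples separates them, while a 16.7 ms offender stands out immediately.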

Joel D
Posts: 158
Joined: 25 Apr 2020, 19:06

Re: The Official *MSI Optix MAG251RX* Owners Thread

Post by Joel D » 22 Sep 2020, 15:05

RLCScontender wrote:
22 Sep 2020, 11:36

Without CRU, any slight deviation from the 240Hz refresh cadence will cause micro stutter. To keep it stable, the frame rate needs to sit at 240fps constantly, and the best way to do that without introducing input lag is to lower the resolution slightly (that way, I still get to keep some settings that would otherwise lower FPS, such as image quality).

Via GPU scaling, if you try to lower the resolution (say you want stable FPS, or want to hit a 360fps frame rate because you bought a 360Hz monitor), black bars will show, or it will automatically stretch the image for you. In my opinion that makes the image extremely ugly (blurry), since the GPU is scaling from a different resolution/aspect ratio.
Ahh, OK, got it. So the possible blurriness only happens if you downscale via GPU, because it will sometimes stretch the screen. Thank you for this clarification.
RLCScontender wrote:
22 Sep 2020, 11:36
As for 10-bit color depth: even at 240Hz, I can easily change that option (from 8-bit to 10-bit) in Nvidia Control Panel. It's unusual that you don't have that option. Are you on "RGB"? Also, have you tried changing the DP cable? I've owned FOUR MSI MAG251RXs, and 3 of the cables had missing pins/pin connectors, which caused derpy things to happen (random black screens, random display shutdowns, purple-looking jaggies, etc.).
OK, let me go back and investigate again and see what the heck I've got going on. I'm almost positive I'm on RGB; I don't think I've touched that setting. My cable is a very high-end DP 1.4 cable, so I guess I could try the DP 1.2 one it came with, but... Jesus, why? lol

I will report back.

Joel D
Posts: 158
Joined: 25 Apr 2020, 19:06

Re: The Official *MSI Optix MAG251RX* Owners Thread

Post by Joel D » 22 Sep 2020, 17:20

Good news! I finally got my bit depth to do 10-bit while still at 240Hz 1920x1080 native resolution. The issue was that I did the CRU thing wrong when I first tried it a couple of months ago. I got it to do Native PC 240Hz, but something was holding up the color depth. Super weird.

I have to thank you for the new video showing how to do it, because it's a little different from the original instructions from months ago. This time I did it exactly as the video says, and when I went back to Nvidia CP the 240Hz Native PC was the same as before... BUT the difference is I was able to switch to 10-bit color! Go figure. I have no clue what I did before that would cause that.

Step-wise, what I did originally was skip the "Extension Blocks" box entirely; that's where your new video differs from the original instructions. Originally it said to delete and add in the "Detailed Resolutions" box, which is what I did, and it does work, but for some reason it locks out the 10-bit option at 240Hz. To be clear, the 10-bit option becomes available at any refresh rate below 240Hz.

Now, thanks to your new video, I just reset everything, started over, and this time ONLY went to the "Extension Blocks" box, deleted what was there, and added the resolution as instructed. This is the way that works right. Hopefully that helps someone else. USE THE VIDEO! Not the older instructions.
[screenshot: NCP1.jpg - "Yea Baby!"]
NEXT -
RLCScontender wrote:
21 Sep 2020, 03:44
Try to get a 6500K white balance. Adjust your monitor's red/green/blue to get to 6500K; if it's not 6500K, just keep adjusting the red/green/blue (most of the time, the red is too high).
Where is this R/G/B adjustment you use for this step? The Gaming OSD doesn't have it from what I can tell, and I do have the optional download you linked to. See pics. If I use Nvidia's settings for this, then I have to activate Nvidia color settings, and you advised against that since you lose something when using them. And the ControlMyMonitor app doesn't really work for this, or for anything really (read the next part below).
[screenshot: Settings.jpg]
Furthermore, what's odd is that I have HDR mode engaged, but my monitor's overlay always says "HDR: No".
RLCScontender wrote:
21 Sep 2020, 03:44
For backlight strobing (AMB), they lock you out of the brightness setting. In my opinion it's too dim, so I use the "ControlMyMonitor" app, or you can use Nvidia Control Panel (on Nvidia Control Panel, I never go higher than 65-70). The crosstalk is average, and since black frame insertion is dominant, it's extremely difficult to see the crosstalk.

https://www.portablefreeware.com/index.php?id=2911

Open the app, then go to the "Brightness" item and choose your brightness level (from 0 to 100); that way you won't lose color accuracy, unlike with Nvidia Control Panel.
This isn't working for me. What am I doing wrong here? It tells me brightness is at 100 and max brightness is 100, so I can't do anything with brightness. lol. It's reading the OSD setting, I think. See pic:
[screenshot: CMM2.jpg]
I'm clicking on the exe. Is that the right one?
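For anyone who would rather script this than fight the GUI, the same DDC/CI brightness knob can be driven from Python with the third-party monitorcontrol package; a minimal sketch, assuming the MAG251RX answers the standard VCP luminance code (0x10), which strobing modes may lock out:

Code: Select all

# pip install monitorcontrol
# Minimal DDC/CI sketch; assumes the monitor exposes VCP 0x10 (luminance).
from monitorcontrol import get_monitors

for monitor in get_monitors():
    with monitor:  # opens the DDC/CI handle
        print("current luminance:", monitor.get_luminance())
        monitor.set_luminance(70)  # 0-100, same scale as the OSD slider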

RLCScontender wrote:
21 Sep 2020, 03:44
There's an optional Gaming OSD app where you can adjust everything (including RGB), set up macros, or use custom aiming crosshairs for first-person shooters. I use this for useful macros to get an edge over my opponents. There's a lag-switch tactic I like to use with this app.

https://us.msi.com/Monitor/support/Opti ... wn-utility
I got this, and nowhere in there do I see any R/G/B adjustments.

Sorry, one more question. I see there's an HDR setting in Windows display settings. I tried it, but it looks god-awful. What's the correct way to enable HDR?

Joel D
Posts: 158
Joined: 25 Apr 2020, 19:06

Re: The Official *MSI Optix MAG251RX* Owners Thread

Post by Joel D » 25 Sep 2020, 15:45

Any help here appreciated -

RLCSContender*
Posts: 541
Joined: 13 Jan 2021, 22:49
Contact:

Re: The Official *MSI Optix MAG251RX* Owners Thread

Post by RLCSContender* » 25 Sep 2020, 22:35

Ditch the ControlMyMonitor app. After a few weeks of not using the MAG251RX, I was checking the white point balance, and the color temperature changes at around 70 brightness (Nvidia Control Panel), which is MORE THAN ENOUGH if you have AMB (backlight strobing) on. So as long as you don't go past 70 brightness in Nvidia Control Panel, the white balance and color accuracy shouldn't be affected.

I only use it for special occasions (I want to keep my MAG251RXs in mint condition since I won the panel lottery). My workhorse monitor is the Alienware AW2521HFL. I rarely use AMB on the MAG251RX unless I'm sniping or want to experiment with something. It adds about 4.5-5ms of input lag, which in my opinion isn't worth it, considering the MSI MAG251RX already has among the best motion clarity at 240Hz.

Also, after a gazillion attempts at the click-when-green reaction test on the Human Benchmark website, I never got below 4ms on the MAG251RX. The only two monitors where I got 2ms (at 1000Hz polling rate, with a PS4 controller) were the Predator XB253Q GX (3 times) and the AW2521HFL (once).


But hopefully you have the basics down, at least:

1. OD Fast (don't touch the other overdrive settings; the faster setting has inverse ghosting and slightly worse motion clarity).
2. 6500K white balance (any smartphone camera app will tell you the WB if you point it directly at a white background).
3. You made PC "native" for 1080p resolution under CRU (I posted a video about this). You can also adjust the resolution to your liking if the frame rate isn't stable at 240fps. Remember, this is a FreeSync monitor, and anything that deviates from the 1/240s refresh cadence will microstutter. You can also overclock the monitor if you lower the resolution enough (be careful).
4. If you use AMB (backlight strobing), use Nvidia Control Panel and do not go over 70 brightness. 70 on my panel is the maximum before the white balance changes temperature.
5. Use the alarm clock. It's an underrated feature, and this is the only monitor I've used that has one.
