FreeSync isn't working properly on my Gigabyte M28U

Talk about AMD's FreeSync and VESA AdaptiveSync, which are variable refresh rate technologies. They eliminate stutters and tearing. List of FreeSync Monitors.
Segundo
Posts: 12
Joined: 17 Jan 2023, 01:59

Re: FreeSync isn't working properly on my Gigabyte M28U

Post by Segundo » 21 Jan 2023, 20:38

To be clear, you don't need to manually end all processes in the Task Manager, you just need to ensure you don't have other generally heavy windows/programs and/or background processes (in the taskbar) running, and only IF you're experiencing sync error messages in TestUFO. If not, you can leave whatever you have open, open.
Cool, I'll try that again under those conditions then. Thank you!
If your system "POSTs" (i.e., completes its power-on self-test and boots) and you can use it for hours on end without it BSODing, then it's probably fine. Severe enough static damage would likely prevent the system from starting.
It does indeed POST. I've left it on for up to 24-48 hours, since I've sometimes left it running overnight for tests and benchmarks like MemTest86. I want to run stress tests like FurMark overnight too, but that'd likely be disastrous if the temps got too high without me there to stop the test.
I don't know of any modern games with zero stutter on PC. Games with minimal stutter usually pre-compile shaders though, like Horizon Zero Dawn, for instance.
That's good to know. I don't currently have Horizon, but I'll buy it to test that out then, especially since it's 67% off right now on Steam. Do any older games run with minimal stutter on PC? I thought that a game from 2013 (Arkham Origins) would work fine even at 4K 140-141 FPS, but perhaps I'm overestimating the 6800XT's capabilities. Should a 6800XT be able to run Arkham Origins and other games that are 9-10 years old at 4K 140-141 FPS, or at least 1440p 140-141 FPS upscaled to 4K with RSR or FSR? How far back do I have to go to get to games that have minimal stutter at 4K 140-141 FPS? I suppose it depends on how well-optimized the game is, and that you can't go too far back to stuff like Deus Ex from 2000 or F.E.A.R. from 2005, since those aren't optimized for modern systems. What is the sweet spot for minimal stutter then? Would stuff like MGSV from 2015 and Minecraft Bedrock work? I do want to test open-world games since they're obviously giving me the most trouble, so open-world games in general would be good.

Neon White is from last year and works great for the most part, but that is not intensive at all, perhaps by design since it's meant for speedrunning. Valorant also has minimal stutter, though I have to wonder if Riot Vanguard is causing issues since it's kernel-level... also just anticheat programs in general, such as BattlEye, which I also have through Rainbow Six Siege. I'll try to uninstall Vanguard, Valorant, Siege, BattlEye, and any other anticheat services I have.

I do notice that Neon White, Valorant, and Siege as well as some other games drop to 139 FPS when something "new" happens. By "new," I mean something that I haven't seen yet in that particular play session. For example, it happens if I see a new enemy in Neon White or use a new gun card. If I replay that same level, it runs nearly flawlessly at a completely consistent 140 FPS. I wonder if this is related to the shader precompilation matter you mentioned. Since it's already compiled, maybe it works flawlessly the second time? I don't know though. That doesn't happen in every game, just a few games.
Just load times.
That's good to hear. Save times don't really matter anyway except in games with autosaves that take place during gameplay. If I can reduce my stuttering to just happen rarely and then happen every time an autosave happens, I'd be perfectly fine with that. If I see the autosave symbol, I can just stop playing for a few seconds. Additionally, that's an actual reason for stuttering to happen as opposed to who knows what.
I found this capture of what is supposedly your monitor model's OSD menu. Is what is highlighted in green below the same as the fluctuating readout on your display, or is it separate?

Image
Yes, that is indeed the same as the fluctuating readout on my display. When my refresh rate changes, it also changes there to whatever the current refresh rate is. For example, I just opened Arkham Origins. For some reason, the intro sequence only displays at 30-40 FPS even though it displayed at 140 FPS before. If I open the OSD, it changes between 70-85 Hz. This intro sequence was also another reason I was concerned about FreeSync though. If the FPS drops to 30-40, shouldn't my refresh rate drop to 48 Hz, since I believe my FreeSync refresh rate range is 48-144 Hz according to my monitor's specs and AMD Adrenalin (if I remember correctly)? Or should it reset back to 144 Hz? Instead, it stays in the 70-85 Hz range, which just seems like a completely random number. Additionally, is the refresh rate displayed in the OSD slightly behind? It sure seems like it's a second or two behind since, say, a drop in FPS to 130 is often accompanied by a change in my refresh rate to about 134 Hz two seconds later, but I don't know for sure.
Monitor firmware versions aren't reverted when you do a clean install of Windows or swap out PC hardware components, since the firmware is stored on the monitor itself.

As for downgrading to the previous version, that's model-dependent, but 99% of the time the answer is no.
Nice. I won't go through that process again then. Gigabyte's update process is notoriously buggy. It took me 2-3 hours to download OSD Sidekick and then the monitor firmware, since the update kept failing for no reason... glad I don't have to do that again and infect my PC with bloatware I can't delete. It's good that it's independent from my PC too. I'll use my terrible laptop then if any more firmware updates release in the future. It is terrible, but that's why I'm willing to break it even more haha.
If you go that route, post here:
viewforum.php?f=24

Fair warning, I have 0 interest in exploring that subject.
Thanks, I'll post there then! Frankly, I also have 0 interest in exploring this subject, but I'll do what I must. Maybe I'll learn more about how electricity works and all...
Yes, but there's typically (though not always) less because 1) consoles don't have shader-compilation stutter, since consoles are a fixed platform and can ship with all shaders pre-compiled, whereas shader compilation on PC depends entirely on what combination of hardware you have, so shaders have to be compiled on demand unless the devs implement pre-compilation (either on the title menu when you first launch the game, or asynchronously as you are playing), and 2) the PS5 specifically, unlike PC currently, utilizes the full capability of higher SSD speeds to reduce asset-streaming-related stutter.

PC is still stuck utilizing legacy IO HDD speeds, even on SSDs. This will hopefully be remedied soon by DirectStorage, but it remains to be seen how effective it will be.
I was wondering why consoles have more consistent performance even though they're weaker. That makes sense. I guess the modular nature of PCs has its disadvantages too. And it's interesting that PS5s can use SSDs better than PCs. Is that also because PS5s are a fixed platform while PCs are not? As for DirectStorage, will all SSDs be compatible with that, or do I need an SSD specifically designed with DirectStorage in mind? I currently have a Samsung 970 Evo Pro 2TB and a Crucial P5 Plus 2TB. I'd imagine PCIE 3.0 SSDs would not be compatible, but perhaps PCIE 4.0 SSDs would? The P5 Plus was only $20 more including a heatsink even though it has almost double the read and write speeds and is PCIE 4.0 instead of 3.0, so I decided to get it. The only problem is that it runs hotter... indeed, I'm pretty sure I melted part of the damn thing after only four days, so that is an issue. I assumed the mobo's included M2 heatsink would be good enough, but I'll buy another P5 Plus and actually use the included heatsink to see if it's a me problem or a Crucial problem.

jorimt
Posts: 2065
Joined: 04 Nov 2016, 10:44
Location: USA

Re: FreeSync isn't working properly on my Gigabyte M28U

Post by jorimt » 21 Jan 2023, 21:22

Segundo wrote:
21 Jan 2023, 20:38
I thought that a game from 2013 (Arkham Origins) would work fine even at 4K 140-141 FPS, but perhaps I'm overestimating the 6800XT's capabilities.
The Arkham series is infamous for open-world stutter on PC. It runs on the UE3 engine, which was prone to it, so any pseudo-open-world or open-world games on UE3 have the issue in varying degrees. Better specs and a more powerful system won't fix it, since it stems from the engine.

UE3 basically isn't great with real-time asset-streaming and texture management.
Segundo wrote:
21 Jan 2023, 20:38
I do notice that Neon White, Valorant, and Siege as well as some other games drop to 139 FPS when something "new" happens. By "new," I mean something that I haven't seen yet in that particular play session. For example, it happens if I see a new enemy in Neon White or use a new gun card. If I replay that same level, it runs nearly flawlessly at a completely consistent 140 FPS. I wonder if this is related to the shader precompilation matter you mentioned. Since it's already compiled, maybe it works flawlessly the second time? I don't know though. That doesn't happen in every game, just a few games.
Likely real-time shader compilation in some of those cases, yes.
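The "works flawlessly the second time" pattern is exactly what compile-on-first-use caching looks like. Here's a rough sketch of the idea (not any engine's actual code; the function names and the fake compile delay are made up purely for illustration):

```python
import time

_shader_cache = {}  # hypothetical in-memory cache, keyed by shader source

def compile_shader(source):
    """Stand-in for a driver compile; the real cost is tens of ms."""
    time.sleep(0.05)
    return f"compiled({source})"

def get_shader(source):
    """Compile on first use, then serve from the cache.

    The first request for a given shader pays the full compile cost
    (the in-game hitch when something "new" appears); every later
    request is a cheap dictionary lookup, which is why a replayed
    level runs at a steady framerate.
    """
    if source not in _shader_cache:
        _shader_cache[source] = compile_shader(source)
    return _shader_cache[source]
```

Pre-compilation (the loading screen some games show on first launch) just walks this same path for every shader up front, so the hitches happen before gameplay instead of during it.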
Segundo wrote:
21 Jan 2023, 20:38
Yes, that is indeed the same as the fluctuating readout on my display. When my refresh rate changes, it also changes there to whatever the current refresh rate is.
Okay, then it is indeed a two-in-one on your particular model.
Segundo wrote:
21 Jan 2023, 20:38
For some reason, the intro sequence only displays at 30-40 FPS even though it displayed at 140 FPS before. If I open the OSD, it changes between 70-85 Hz. This intro sequence was also another reason I was concerned about FreeSync though. If the FPS drops to 30-40 FPS, shouldn't my refresh rate drop to 48Hz since I believe my FreeSync refresh rate range is 48-144 FPS according to my monitor's specs and AMD Adrenalin (if I remember correctly)? Or should it reset back to 144Hz? Instead, it stays in the 70-85 Hz range which just seems like a completely random number. Additionally, is the refresh rate displayed in the OS slightly behind? It sure seems like it's a second or two behind since, say, a drop in FPS to 130 is often accompanied by a change in my refresh rate to about 134 FPS two seconds later, but I don't know for sure.
The real-time variable refresh rate meters on FreeSync displays are approximated and aren't always exactly 1:1 from moment to moment; they also include LFC multiples in the number. As long as the readout is fluctuating, FreeSync is engaged and working as intended.
Segundo wrote:
21 Jan 2023, 20:38
Is that also because PS5s are a fixed platform while PCs are not?
See:
viewtopic.php?f=10&t=7603&p=78594&hilit ... age#p78594
Segundo wrote:
21 Jan 2023, 20:38
As for DirectStorage, will all SSDs be compatible with that or do I need an SSD specifically designed with DirectStorage in mind?
As far as I'm aware, any relatively recent NVMe SSD should be compatible, but DirectStorage is in its early days and has to be implemented at the engine-level per game directly by the devs to function optimally as far as I understand it.

Forspoken will be the first PC game to implement it:
https://www.windowscentral.com/gaming/p ... ge-support

Refer to the below for official information on DirectStorage:
https://devblogs.microsoft.com/directx/ ... ing-to-pc/
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: ASUS PG279QM, LG 48CX VR: Valve Index, HP Reverb G2 OS: Windows 11 Pro MB: ASUS Z690 Extreme CPU: Intel i9-12900k GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 @5600MHz SSD: 1TB Samsung 980 Pro (OS), 2TB WD_BLACK SN850 / 4TB Samsung 870 EVO (Games)

Segundo
Posts: 12
Joined: 17 Jan 2023, 01:59

Re: FreeSync isn't working properly on my Gigabyte M28U

Post by Segundo » 21 Jan 2023, 23:24

The Arkham series is infamous for open-world stutter on PC. It runs on the UE3 engine, which was prone to it, so any pseudo-open-world or open-world games on UE3 have the issue in varying degrees. Better specs and a more powerful system won't fix it, since it stems from the engine.

UE3 basically isn't great with real-time asset-streaming and texture management.
Ah okay, so even Origins, City, and Asylum aren't all that good. I ran benchmarks in City and Asylum, and I did see stuttering even in the benchmarks, although my FPS was quite high. I played them for like 10 minutes on PC, and the performance seemed better than it was back in the PS3 era... though I played them like 5-10 years ago, so I don't know if I remember them that well. I mean, they looked much prettier at least, obviously. I thought more power would fix everything, but more power can't fix terrible optimization, I guess. I'll avoid UE3 games in general then, at least for testing purposes. Would you recommend games with newer engines like UE4? I suppose they can't be too new either though, so I'll have to find the middle ground in terms of old vs. new.
Likely real-time shader compilation in some of those cases, yes.
That makes sense. It's good that's working as expected then.

The real-time variable refresh rate meters on FreeSync displays are approximated and aren't always exactly 1:1 from moment to moment; they also include LFC multiples in the number. As long as the readout is fluctuating, FreeSync is engaged and working as intended.
That's good to hear. What exactly do LFC multiples do, and how do they affect that number? Does that mean the number displayed on FreeSync displays is slightly different from the actual refresh rate? I'll be sure not to match the refresh rate in the meter exactly to my FPS then, just make sure they're similar enough and that the refresh rate isn't sitting at a constant 144 Hz and not actually acting as VRR for some reason. It's good that it's engaged and working as intended though.
Thanks! That's an interesting post for sure. So even upgrading from an HDD to an SSD won't really fix issues. Wow, I suppose PC devs really need to improve this system somehow, but that's definitely difficult since there are so many different PC configurations and some people are still using HDDs. I was considering getting the SN850X or SN850 too, so how is that? I started with a PCIE 3.0 Samsung 970 Evo Pro. I then returned that and upgraded to a PCIE 4.0 Crucial P5 Plus with almost double the read and write speeds. The silicone oil in the thermal pad seemed to have leaked out in only 4 days, so I returned that for a PCIE 4.0 Corsair MP600 Core. The read speeds are marginally better than my 970 Evo Pro, the write speeds are marginally worse, and it's only 1TB, so I'll probably return that too and get a new Crucial P5 Plus; the silicone oil thing probably happened because I thought the mobo M2 heatsink would be good enough. Next time, I'll use the heatsink that actually comes with the P5 Plus and is presumably designed with it in mind. I don't really need 6000/5000 read/write speeds or PCIE 4.0, but the P5 Plus is only $20 more than the 970 Evo Pro even with the included heatsink. Might as well... I don't want to go over $200 for my SSD; the P5 Plus is about $180 and the Samsung is about $160, and the SN850 and SN850X are unfortunately both above $200. I found the SN850X for $183 with a heatsink on Best Buy, but that's unfortunately sold out.

Would a better SSD be necessary for anything? I'd imagine it's useful for transferring files locally. My download and upload speeds are likely limited by my Wi-Fi rather than my SSD, and I don't plan on upgrading my Wi-Fi anytime soon. My plan is only 300/300, but I practically only get 120/110 on my PC and 100/100 on other devices. I don't know why my Wi-Fi speed is so much lower, but I suppose it's probably spread across all the devices in my household. I suppose it's also useful for video/game/photo/etc. rendering and editing, and for streaming/recording videos. I do plan on recording and streaming from time to time as well as video editing, and I plan on making a game at some point, so I think a top-of-the-line SSD might be beneficial in that regard.
As far as I'm aware, any relatively recent NVMe SSD should be compatible, but DirectStorage is in its early days and has to be implemented at the engine-level per game directly by the devs to function optimally as far as I understand it.

Forspoken will be the first PC game to implement it:
https://www.windowscentral.com/gaming/p ... ge-support

Refer to the below for official information on DirectStorage:
https://devblogs.microsoft.com/directx/ ... ing-to-pc/
I see, that makes sense. I'm glad DirectStorage is coming out relatively soon! I wasn't aware it has to be implemented at the engine level, so that's a shame. Still, I hope more games come out that support DirectStorage like Forspoken. I might give that a go and see how good DirectStorage really is!

jorimt
Posts: 2065
Joined: 04 Nov 2016, 10:44
Location: USA

Re: FreeSync isn't working properly on my Gigabyte M28U

Post by jorimt » 22 Jan 2023, 11:33

Segundo wrote:
21 Jan 2023, 23:24
Would you recommend games with newer engines like UE4?
Not for testing stutter, because most games based on that engine are infamous for shader-compilation stutter.

The general rule is, if there's one or more games on your system where you experience minimal to no stutter, and other games where you do, then it's the games themselves, not your system, causing the majority of that stutter.
Segundo wrote:
21 Jan 2023, 23:24
What exactly do LFC multiples do, and how do they affect that number? Does that mean the number displayed on the FreeSync displays are slightly different than the actual refresh rates?
As per my article:
https://blurbusters.com/gsync/gsync101- ... ettings/2/
Minimum Refresh Range

Once the framerate reaches the approximate 36 and below mark, the G-SYNC module begins inserting duplicate refreshes per frame to maintain the panel’s minimum physical refresh rate, keep the display active, and smooth motion perception. If the framerate is at 36, the refresh rate will double to 72 Hz, at 18 frames, it will triple to 54 Hz, and so on. This behavior will continue down to 1 frame per second.

Regardless of the reported framerate and variable refresh rate of the display, the scanout speed will always be a match to the display’s current maximum refresh rate; 16.6ms @60Hz, 10ms @100 Hz, 6.9ms @144 Hz, and so on. G-SYNC’s ability to detach framerate and refresh rate from the scanout speed can have benefits such as faster frame delivery and reduced input lag on high refresh rate displays at lower fixed framerates (see G-SYNC 101: Hidden Benefits of High Refresh Rate G-SYNC).
Two notes on the above, however, 1) I called this behavior "Minimum Refresh Range," or MRR at the time, because it didn't have an official name. Only some time after I released my article AMD named it "LFC" (low framerate compensation), and 2) the "approximate 36 and below mark" only applied to the monitor I was testing (and some other native G-SYNC displays) at the time; the range at which LFC kicks in varies from monitor to monitor.

As for how LFC multiples displayed in the monitor OSD's refresh rate meter "affect" anything, they don't where the readout is concerned; it's just reflecting that if your system is currently running at 30 FPS, LFC may be refreshing at 60 Hz (x2), etc.
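The doubling/tripling arithmetic above can be sketched in a few lines. This is a simplified model that assumes LFC engages right at the VRR floor (as noted, the exact engagement point varies per monitor); the 48 Hz floor is taken from Segundo's monitor specs, and the function names are made up for illustration:

```python
def lfc_refresh(fps, vrr_min=48):
    """Return (multiple, effective refresh rate in Hz) for a framerate.

    Below the VRR floor, LFC repeats each frame enough times that the
    panel's refresh rate lands back inside its supported range.
    """
    if fps >= vrr_min:
        return 1, fps  # in range: refresh tracks framerate 1:1
    multiple = 1
    while fps * multiple < vrr_min:
        multiple += 1
    return multiple, fps * multiple

def scanout_ms(max_hz):
    """Scanout time is fixed by the panel's maximum refresh rate."""
    return 1000 / max_hz

# Matches the article's examples: 36 FPS -> 72 Hz (x2), 18 FPS -> 54 Hz (x3).
# A 30-40 FPS intro on a 48-144 Hz panel doubles to 60-80 Hz, close to
# the 70-85 Hz readout Segundo saw on the OSD.
print(lfc_refresh(36))            # (2, 72)
print(lfc_refresh(18))            # (3, 54)
print(round(scanout_ms(144), 1))  # 6.9
```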
Segundo wrote:
21 Jan 2023, 23:24
I'll be sure to not match my refresh rates in the meters exactly to my FPS then
I'm not sure what you mean by this? The whole point of VRR is it automatically matches the variable refresh rate to the current framerate.

I.E. you shouldn't be trying to limit your framerate based on what your monitor OSD meter shows, but instead on your system-side average FPS readout (Afterburner, etc). The monitor OSD meter is, again, only really useful for ensuring VRR is active; static = not active, fluctuating = active.
Segundo wrote:
21 Jan 2023, 23:24
Would a better SSD be necessary for anything?
For gaming specifically? Not really.

For non-DirectStorage games (aka 100% of games before Forspoken), any middling SSD, even SATA, will provide 99% of the experience of a premium NVMe SSD where gaming performance is concerned, with marginal differences in load time speeds.

Segundo
Posts: 12
Joined: 17 Jan 2023, 01:59

Re: FreeSync isn't working properly on my Gigabyte M28U

Post by Segundo » 22 Jan 2023, 19:33

Not for testing stutter, because most games based on that engine are infamous for shader-compilation stutter.

The general rule is, if there's one or more games on your system where you experience minimal to no stutter, and other games where you do, then it's the games themselves, not your system, causing the majority of that stutter.
Oh okay. So even UE4 games have issues too... it seems Unreal games in general have issues then haha. That makes sense though. So far, I've been playing a few games at max settings. Valorant and Siege are working very smoothly with minimal stutter for the most part, though they're both esports games and they're obviously not open-world or anything. Arkham Origins is typically fine except right after I disable Detective Mode, when it microstutters, and in the open world, where I get massive framerate spikes, though that's expected since you said Origins is infamous for poor optimization. I'd say Arkham Knight works surprisingly well given how poorly optimized it is, which I guess shows something is right about my system. It's bad in the open world, but I typically get a consistent 140 FPS otherwise. Minecraft Java is pretty consistent except when it's loading new chunks, though I know that game is horribly optimized since OpenGL is apparently not well optimized for AMD and Java is just not good for games. I do have to massively lower my render distance, so I can't max out the settings; it seems Minecraft Java is CPU-limited though, since my GPU is barely used. I'll try Minecraft Windows to compare. Neon White is also working very smoothly apart from the shader precompilation thing I mentioned, so I think I'll just load up a previous level and play that to get rid of any stutters (if that works anyway).

I foolishly assumed that one game stuttering means every game would stutter, but I see now that some games are just poorly optimized. I'll have to test even more games to see how they work. I did try all the games with built-in benchmarks that I own, which was about 30-40 games. The average FPSes matched up with or were slightly higher than what they should be for a 6800XT based on sites like OpenBenchmarking, except for a few games like Rise of the Tomb Raider and The Talos Principle. I'm not sure why those two are so much worse, but maybe they're CPU-limited or something, or maybe OpenBenchmarking used different settings than me. I'll try to install Ubuntu (again, and on a different SSD this time) to run all these benchmark suites automatically and see how they compare, since most automatic benchmarks are not available on Windows.
As per my article:
https://blurbusters.com/gsync/gsync101- ... ettings/2/
Minimum Refresh Range

Once the framerate reaches the approximate 36 and below mark, the G-SYNC module begins inserting duplicate refreshes per frame to maintain the panel’s minimum physical refresh rate, keep the display active, and smooth motion perception. If the framerate is at 36, the refresh rate will double to 72 Hz, at 18 frames, it will triple to 54 Hz, and so on. This behavior will continue down to 1 frame per second.

Regardless of the reported framerate and variable refresh rate of the display, the scanout speed will always be a match to the display’s current maximum refresh rate; 16.6ms @60Hz, 10ms @100 Hz, 6.9ms @144 Hz, and so on. G-SYNC’s ability to detach framerate and refresh rate from the scanout speed can have benefits such as faster frame delivery and reduced input lag on high refresh rate displays at lower fixed framerates (see G-SYNC 101: Hidden Benefits of High Refresh Rate G-SYNC).
Two notes on the above, however, 1) I called this behavior "Minimum Refresh Range," or MRR at the time, because it didn't have an official name. Only some time after I released my article AMD named it "LFC" (low framerate compensation), and 2) the "approximate 36 and below mark" only applied to the monitor I was testing (and some other native G-SYNC displays) at the time; the range at which LFC kicks in varies from monitor to monitor.

As for how LFC multiples displayed in the monitor OSD's refresh rate meter "affect" anything, they don't where the readout is concerned; it's just reflecting that if your system is currently running at 30 FPS, LFC may be refreshing at 60 Hz (x2), etc.
Oh okay, that makes sense! I'll read through the rest of the article to get a deeper understanding of G-Sync and FreeSync, since I think FreeSync is pretty similar to G-Sync for the most part. So it's actually expected for my VRR to be 70-80 Hz if my FPS is 30-40. I'm glad that's working as intended then. I believe my minimum is 48 since that's what it says in AMD Adrenalin and in my monitor's specs, so it makes sense that LFC kicks in at 30-40 FPS. It's good that the LFC multiples match up with the OSD refresh rate meter as expected though.
I'm not sure what you mean by this? The whole point of VRR is it automatically matches the variable refresh rate to the current framerate.

I.E. you shouldn't be trying to limit your framerate based on what your monitor OSD meter shows, but instead on your system-side average FPS readout (Afterburner, etc). The monitor OSD meter is, again, only really useful for ensuring VRR is active; static = not active, fluctuating = active.
That's my bad. I misspoke there. Actually, I was very tired yesterday, so I'm not even sure what I was trying to say to be honest haha. I think I was saying that I'll stop caring if the refresh rates in the meters exactly match up to my FPS since that's not gonna happen and that's not supposed to happen.
For gaming specifically? Not really.

For non-DirectStorage games (aka 100% of games before Forspoken), any middling SSD, even SATA, will provide 99% of the experience of a premium NVMe SSD where gaming performance is concerned, with marginal differences in load time speeds.
That makes sense. I hope DirectStorage becomes supported by a lot more games though!

jorimt
Posts: 2065
Joined: 04 Nov 2016, 10:44
Location: USA

Re: FreeSync isn't working properly on my Gigabyte M28U

Post by jorimt » 22 Jan 2023, 21:00

Segundo wrote:
22 Jan 2023, 19:33
The average FPSes matched up with or were slightly higher than what they should be for a 6800XT based on sites like OpenBenchmarking except for a few games like Rise of the Tomb Raider and The Talos Principle. I'm not sure why those two are so much worse, but maybe they're CPU-limited or something or maybe OpenBenchmarking used different settings than me.
Rise of the Tomb Raider has two APIs (DX11 and DX12), and The Talos Principle has three (DX11, DX12, Vulkan). Perhaps you weren't benching them on the same APIs as the sites were.
Segundo wrote:
22 Jan 2023, 19:33
I'll read through the rest of the article to get a deeper understanding of G-Sync and FreeSync since I think FreeSync is pretty similar to G-Sync for the most part.
But for labeling differences here and there, fundamental VRR behavior is effectively the same across G-SYNC, FreeSync, adaptive sync, etc.

Segundo
Posts: 12
Joined: 17 Jan 2023, 01:59

Re: FreeSync isn't working properly on my Gigabyte M28U

Post by Segundo » 23 Jan 2023, 09:45

Rise of the Tomb Raider has two APIs (DX11 and DX12), and The Talos Principle has three (DX11, DX12, Vulkan). Perhaps you weren't benching them on the same APIs as the sites were.
I’ll go back to those games and see if the APIs are different. I also did get them for free on Epic Games, so maybe their performance on Epic Games is different somehow than on Steam? I don’t see why it would be any different since they should be the same game, and I also benchmarked other games on Epic and they seemed fine.
But for labeling differences here and there, fundamental VRR behavior is effectively the same across G-SYNC, FreeSync, adaptive sync, etc.

That’s good that they’re not too different then. I’ll definitely read through that article. Thank you for all the help!

jorimt
Posts: 2065
Joined: 04 Nov 2016, 10:44
Location: USA

Re: FreeSync isn't working properly on my Gigabyte M28U

Post by jorimt » 23 Jan 2023, 11:15

Segundo wrote:
23 Jan 2023, 09:45
I’ll go back to those games and see if the APIs are different. I also did get them for free on Epic Games so maybe their performance on Epic Games is different somehow than Steam? I don’t see why it would be any different since they should be the same game, and I also benchmarked other games on Epic and they seemed fine.
1) To be clear, I mean those APIs are selectable in each game, and you may be testing the wrong ones compared to those sites, creating different results (AMD GPUs and drivers typically perform better with DX12 vs. DX11, for instance), and 2) no, the performance shouldn't be different on Epic vs. Steam (same game files in 99% of cases).

Segundo
Posts: 12
Joined: 17 Jan 2023, 01:59

Re: FreeSync isn't working properly on my Gigabyte M28U

Post by Segundo » 23 Jan 2023, 12:13

jorimt wrote:
23 Jan 2023, 11:15
Segundo wrote:
23 Jan 2023, 09:45
I’ll go back to those games and see if the APIs are different. I also did get them for free on Epic Games so maybe their performance on Epic Games is different somehow than Steam? I don’t see why it would be any different since they should be the same game, and I also benchmarked other games on Epic and they seemed fine.
1) To be clear, I mean those APIs are selectable in each game, and you may be testing the wrong ones compared to those sites, creating different results (AMD GPUs and drivers typically perform better with DX12 vs. DX11, for instance), and 2) no, the performance shouldn't be different on Epic vs. Steam (same game files in 99% of cases).
Oh okay. I’ll go back to them and choose the API that matches the one they used. It’s good that the Epic and Steam versions are the same; I can still use my results to compare against the Steam benchmarks then.
