Need for Speed: Rivals capped @30fps on pc

Everything about displays and monitors. 120Hz, 144Hz, 240Hz, 4K, 1440p, input lag, display shopping, monitor purchase decisions, compare, versus, debate, and more. Questions? Just ask!
nimbulan
Posts: 323
Joined: 29 Dec 2013, 23:32
Location: Oregon

Re: Need for Speed: Rivals capped @30fps on pc

Post by nimbulan » 02 Jan 2014, 19:10

Ahigh wrote:Your maximum framerate in any racing game should be the framerate that you can hit 99.9% of the time. And when you do miss framerate, that is when you want to do something like "Adaptive Sync." Going from 60fps to 30fps and back to 60fps is a horrible experience. G-Sync solves this and so does Adaptive Sync to a lesser extent. But locking at 30Hz for a driving game where you can basically drive anywhere is not a surprising solution to a very difficult problem.
Are jarring 30<->60 fps transitions really still an issue? The last game I played that I can remember not supporting triple buffering was FEAR back in 2005 and I play a LOT of games. There are probably a few others that aren't graphically intensive enough to ever drop below 60 fps even on moderate hardware.

I'm 99% sure they locked the framerate at 30 for simulation consistency rather than visual smoothness, which is why the game's simulation speed is tied to the framerate. A predictable, constant framerate is much easier to manage from a physics simulation standpoint than a variable one. Since the console versions would not be able to maintain 60 fps at all times, 30 fps is the next logical choice.

That said, there is a partial solution for this game. You can increase the framerate cap along with the simulation framerate, but the game speed will still change if the framerate ever deviates from that value. See PC Gaming Wiki.
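To illustrate why tying simulation speed to framerate behaves this way, here is a minimal sketch (not Rivals' actual code; `SIM_DT` and the function name are made up for illustration). If the physics advances by a fixed step per rendered frame, any change in how many frames are rendered per real second directly changes the game's speed:

```python
# Hypothetical sketch: a fixed simulation step per rendered frame means
# the game's speed scales with the actual framerate.

SIM_DT = 1.0 / 30.0  # fixed timestep the physics assumes, in seconds

def simulated_time_elapsed(frames_rendered):
    """Each rendered frame advances the simulation by exactly SIM_DT."""
    return frames_rendered * SIM_DT

# At the intended 30 fps, one real second renders 30 frames,
# so the simulation advances 1.0 s of game time.
assert abs(simulated_time_elapsed(30) - 1.0) < 1e-9

# Raise the cap to 60 fps without changing SIM_DT: one real second now
# renders 60 frames, so the simulation advances 2.0 s -- double speed.
assert abs(simulated_time_elapsed(60) - 2.0) < 1e-9
```

This is why the wiki fix has to raise the framerate cap and the simulation rate together, and why the game still speeds up or slows down whenever the real framerate drifts from that value.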

trey31
Posts: 146
Joined: 23 Dec 2013, 19:17

Re: Need for Speed: Rivals capped @30fps on pc

Post by trey31 » 02 Jan 2014, 19:57

Ahigh wrote:I put so much work into making Rush 2049 coin-op be able to run at a solid 60hz in solo mode with no drones (phantom mode). And when you play it in linked mode, it locks to 30hz too. So there ya go!
Rush?! Wow, I remember that series. Wasn't the first one supposed to debut on Jaguar? Fond memories of that system. Tempest, AvP, Pitfall Mayan Adventure were all wonderful. Everything on Jaguar just looked better than the others did. I miss mine. Lost in a move years ago. :(
Ahigh wrote:My point about framerates, though, is that a racing game that stays on rails can target framerate more easily than a roam-anywhere style of game like GTA. It's apples and oranges. No matter what the framerate ends up being, when you are roaming the world and especially if you are going to have fires, explosions, and lots of stuff going on _sometimes_ and other times not much going on, you're talking about variable framerate by game design.
After first seeing this issue elsewhere, I immediately thought of Burnout Paradise which is 6 years old and runs at 60fps. How it does that probably has no relation to Rivals I'm sure, but it did have a similar drive anywhere/do anything style like Rivals.
Ahigh wrote:I wish the world were as simple as some people see it when it comes to dealing with these issues as a designer.
I have a friend that works for an indie developer. He laughs at me when I make suggestions or critiques. :roll:

Ahigh
Posts: 95
Joined: 17 Dec 2013, 19:22

Re: Need for Speed: Rivals capped @30fps on pc

Post by Ahigh » 02 Jan 2014, 21:50

nimbulan wrote:Are jarring 30<->60 fps transitions really still an issue? The last game I played that I can remember not supporting triple buffering was FEAR back in 2005 and I play a LOT of games. There are probably a few others that aren't graphically intensive enough to ever drop below 60 fps even on moderate hardware.
If they weren't, there would be zero market for G-Sync. The issue was first addressed by free-running (not sync'ing to the vertical refresh), then later by what NVidia calls adaptive V-sync, and now by G-Sync.

The first game I ever saw that didn't sync to vertical retrace was Hydro Thunder. I went down to see that game in San Diego before Rush 2049 was released, and I saw the tearing. When I saw it, I asked the lead programmer at the time, Stephen Ranck, what framerate he was running at, and he admitted it was variable, around 45fps. He then explained that he didn't even lock to the vertical retrace. My instant reaction was "I never thought that was an option for production."

That was the moment I began working on what NVidia now calls adaptive v-sync for Rush 2049.

But the 30/60 jarring effect only happens when you sync to vertical refresh. The way it has generally been handled by PC game developers is to just push the framerate issue off onto the player: let them choose whether to sync or not, and let them upgrade their machine until they get a framerate they are happy with.
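The adaptive v-sync idea described above comes down to one decision per frame. This is only a conceptual sketch (the function and constant names are invented, and real implementations work at the driver level): sync to the refresh when the frame finished in time, and tear rather than wait a whole extra refresh when it didn't.

```python
# Conceptual sketch of adaptive v-sync on a 60 Hz display: wait for the
# vertical refresh only when the GPU kept up; otherwise present immediately
# (accepting a tear) instead of snapping down to 30 fps.

REFRESH_INTERVAL = 1.0 / 60.0  # seconds per refresh at 60 Hz

def should_wait_for_vsync(frame_render_time):
    """Sync only if this frame was rendered within one refresh interval."""
    return frame_render_time <= REFRESH_INTERVAL

assert should_wait_for_vsync(1.0 / 90.0)      # fast frame: sync, no tearing
assert not should_wait_for_vsync(1.0 / 45.0)  # slow frame: tear, keep ~45 fps
```

The payoff is that a game hovering around 45 fps keeps presenting at ~45 fps with occasional tearing, instead of oscillating between 60 and 30 as plain double-buffered v-sync forces.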

But for coin-op games, it's my opinion that a solid, locked, sync'd 60 frames per second, certainly in 1999 and earlier, is your goal. There are no options for the player to tweak, and all your hardware is the same.

Even console games at that time were not really powerful enough for this to be much of an option.

Very few 3d console games could muster 60 frames per second back then.

The short answer is that this (the temporal accuracy of the data being presented to the player at the right time) has always been a problem before G-Sync.

nimbulan
Posts: 323
Joined: 29 Dec 2013, 23:32
Location: Oregon

Re: Need for Speed: Rivals capped @30fps on pc

Post by nimbulan » 02 Jan 2014, 23:55

Ahigh wrote:If they weren't there would be zero market for G-Sync. The issue was first addressed by free-running (not sync'ing to the vertical refresh) then later by what NVidia calls adaptive V-sync, and now with G-Sync.

...

But the 30/60 jarring effect only happens when you sync to vertical refresh. The way it has generally been handled by PC game developers is to just push the framerate issue off onto the player. Let them choose whether to sync or not and let them upgrade their machine until they get a framerate that they are happy with.
This is only an issue with double-buffered vsync, which is very rare these days. Like I said before, I can't remember running into any games where this was an issue since 2005. Triple buffering eliminates the jarring transition and replaces it with microstutter, which varies in severity depending on the game. I've run into many games that handle this surprisingly smoothly (Source engine games, most Unreal Engine games, recently Tomb Raider) and some that handle it very poorly (Far Cry 3, Guild Wars 2, the Serious Sam games). It's this microstuttering that G-Sync aims to solve, since the current choice is between screen tearing and microstutter.
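The double-buffered snapping being discussed can be shown in a few lines. This is a simplified model, not any engine's real presentation code: with double-buffered v-sync on a 60 Hz display, a frame's effective duration is its render time rounded up to a whole number of refresh intervals, so missing 16.7 ms even slightly drops you straight to 30 fps.

```python
import math

# Simplified model of double-buffered v-sync on a 60 Hz display: the swap
# must wait for the next refresh boundary, so the effective frame time is
# the render time rounded UP to a whole number of refresh intervals.

REFRESH = 1.0 / 60.0  # one refresh interval at 60 Hz, in seconds

def effective_fps_double_buffered(render_time):
    """Frame rate the player actually sees for a given render time."""
    intervals = math.ceil(render_time / REFRESH)
    return 1.0 / (intervals * REFRESH)

assert round(effective_fps_double_buffered(0.014)) == 60  # 14 ms: holds 60 fps
assert round(effective_fps_double_buffered(0.017)) == 30  # 17 ms: snaps to 30
```

Triple buffering avoids that hard snap by letting the GPU keep rendering into a third buffer, which is exactly where the 30/60 oscillation turns into the microstutter described here.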

Ahigh
Posts: 95
Joined: 17 Dec 2013, 19:22

Re: Need for Speed: Rivals capped @30fps on pc

Post by Ahigh » 03 Jan 2014, 00:14

nimbulan wrote:
Ahigh wrote:If they weren't there would be zero market for G-Sync. The issue was first addressed by free-running (not sync'ing to the vertical refresh) then later by what NVidia calls adaptive V-sync, and now with G-Sync.

...

But the 30/60 jarring effect only happens when you sync to vertical refresh. The way it has generally been handled by PC game developers is to just push the framerate issue off onto the player. Let them choose whether to sync or not and let them upgrade their machine until they get a framerate that they are happy with.
This is only an issue with double buffered vsync, which is very rare these days. Like I said before I can't remember running into any games where this was an issue since 2005. Triple buffering eliminates the jarring transition and replaces it with microstutter which varies in severity depending on the game. I've run into many games that handle this surprisingly smoothly (Source engine games, most Unreal engine games, recently Tomb Raider) and some that handle it very poorly (Far Cry 3, Guild Wars 2, Serious Sam games). It's this microstuttering that G-sync aims to solve since the current choice is between screen tearing and microstutter.
I had to look up microstuttering, as I wasn't already familiar with the term. But yeah, it appears to be an oscillation effect: the game's performance changes to meet framerate in a triple-buffered system, and you get slightly different frame-dt's in your game update loop or graphics engine. I've never felt like triple buffering was anything but a patch that adds more latency to the problem, which I'm not sure I would want for the type of games that I play anyway. But I could imagine it being good for a 30fps game.

In any case, thanks for the modern summary, as I've been out of the video game programming circuit for almost 5 years now, if you can believe that. I've been doing 3d programming, but all double-buffered v-sync stuff, and very simple by video game standards, as I've been working for the largest slot machine manufacturer for the last 4 years.

Now I feel old. Anyway, thanks for the update on micro-stuttering. I've never messed with Crossfire or SLI. But back in the early 90's I had one of the very first dual-raster-manager RE-II's (early days of the same concept as SLI and Crossfire) with the MCO option, able to drive 6 displays at a time. That machine cost about a cool million bucks, and wasn't even as powerful as today's $1000 cards. LOL.

nimbulan
Posts: 323
Joined: 29 Dec 2013, 23:32
Location: Oregon

Re: Need for Speed: Rivals capped @30fps on pc

Post by nimbulan » 03 Jan 2014, 01:45

Ahigh wrote:I had to look up microstuttering as I wasn't already familiar with this term. But yeah, it appears that this is an oscillation effect of changing the performance of the game to meet framerate in a triple-buffered system and getting slightly different frame-dt's in your game update loop or graphics engine. I've never felt like triple buffering was anything but a patch that adds more latency to the problems, which I'm not sure I would want for the type of games that I play anyway. But I could imagine it good for a 30fps game.

In any case, thanks for the modern summary as I've been out of the video game programming circuit for almost 5 years now if you can believe that. I've been doing 3d programming, but all double buffered v-sync stuff, and very simple by video game standards as I've been working for the largest slot machine manufacturer for the last 4 years.

Now I feel old. Anyway, thanks for the update on micro-stuttering. I've never messed with crossfire or SLI. But back in the early 90's I had one of the very first dual-raster-manager RE-II's (early days of the same concept as SLI and crossfire) with the MCO option able to drive 6 displays at a time. That machine cost about a cool million bucks, and wasn't even as powerful as today's $1000 cards. LOL.
That's basically what triple-buffer microstuttering is: a quick oscillation between 30 and 60 fps. When a game engine handles it well, it compensates for rapid changes in frame times by adjusting the simulation time accordingly, so in the best case you should hardly notice the stuttering. Also, when well-implemented, the additional input lag is pretty minor, though this varies widely depending on the PC being used. I've been lucky enough to have had significant input lag with triple-buffered vsync in only 2 or 3 games in the past 10 years. Some people are not so lucky and have very bad input lag in most games. There's also the problem of games that don't handle triple buffering well and stutter like there's no tomorrow.
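The compensation described here (adjusting simulation time to the measured frame times) is often done by smoothing the raw dt before feeding it to the update loop. A minimal sketch, with an invented function name and an assumed exponential-moving-average scheme rather than any particular engine's filter:

```python
# Sketch of frame-time smoothing: instead of stepping the simulation by the
# raw, oscillating frame time, step it by a smoothed estimate so a rapid
# 16.7 ms / 33.3 ms microstutter doesn't show up as jerky motion.

def smoothed_dt(raw_dts, alpha=0.2):
    """Exponential moving average over measured frame times (seconds)."""
    avg = raw_dts[0]
    for dt in raw_dts[1:]:
        avg = alpha * dt + (1 - alpha) * avg
    return avg

# Frames alternating between 60 fps and 30 fps settle on a dt strictly
# between the two extremes, i.e. roughly constant-speed motion on screen.
dts = [1 / 60, 1 / 30] * 10
assert 1 / 60 < smoothed_dt(dts) < 1 / 30
```

The trade-off is the one noted above: the simulation slightly lags the true frame timing, which is part of why heavily smoothed triple buffering can feel laggier than a raw free-running loop.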

Triple buffering is still exactly what you said: a patch to address the classic double-buffering problem of jarring 30<->60 fps transitions. Hopefully it will be obsolete soon.

I have never messed with SLI or Crossfire either, though I have heard that using multiple video cards tends to make microstuttering worse. After hearing about all the problems one of my friends has had with SLI over the years, I don't think it's worth the trouble.
