Chief Blur Buster wrote: ↑23 Jan 2025, 21:09
Not yet.
Future versions of GPUs are currently predicted (by the ~7000 series, possibly the ~6000 series RTX -- and any AMD equivalent) to have scene-editing capabilities (AI and non-AI, framegen and non-framegen), since the ginormous compute demand of rendering one ray-traced photorealistic frame is extreme.
The art of modifying a pre-existing rendered frame, to update positionals, will eventually become cheaper than having to re-render (by 2030) a 4K 1000fps 1000Hz frame or, heaven forbid (by 2040) an 8K 2000fps 2000Hz frame.
Scene editing is going to get very interesting. One option I haven't seen mentioned yet is already partially implemented as RTX Remix:
https://www.nvidia.com/en-us/geforce/rtx-remix/
Everyone is talking about fake frames, but RTX Remix shows another option: Nvidia stepping in between the game and the hardware and simply replacing the rendering pipeline with whatever they want. Currently RTX Remix lets old games without programmable shaders be path-traced, use DLSS and framegen, and have their assets replaced wholesale.
A very plausible near future could allow such a platform to interpolate or extrapolate not the rendered image but the 3D assets and objects in the game world themselves, entirely after and independent of the CPU game loop. It could even incorporate late user input (like a change in camera position) well after all CPU tasks are completed, and use that to render a "real" frame at far lower latency, and potentially a higher framerate, than could otherwise be achieved.
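To make that concrete, here is a minimal sketch of what such an extrapolation step might look like, assuming the runtime can see the last two snapshots of object transforms plus a fresh camera sample. Every name and structure below is hypothetical; none of this is actual RTX Remix or driver API.

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

// Hypothetical snapshot of world state the runtime could capture per game tick.
struct ObjectState {
    float px, py, pz;   // position
    float yaw;          // simplified orientation
};

struct Snapshot {
    double time;                      // game-loop timestamp (seconds)
    std::vector<ObjectState> objects; // one entry per tracked object
};

// Extrapolate object transforms past the last CPU tick, so an extra frame can
// be rendered from "real" 3D geometry instead of warping a finished 2D image.
std::vector<ObjectState> extrapolate(const Snapshot& prev,
                                     const Snapshot& curr,
                                     double renderTime)
{
    const double dt = curr.time - prev.time;
    const double t  = dt > 0.0 ? (renderTime - curr.time) / dt : 0.0;

    std::vector<ObjectState> out(curr.objects.size());
    for (std::size_t i = 0; i < curr.objects.size(); ++i) {
        const ObjectState& a = prev.objects[i];
        const ObjectState& b = curr.objects[i];
        out[i].px  = b.px  + (b.px  - a.px)  * static_cast<float>(t);
        out[i].py  = b.py  + (b.py  - a.py)  * static_cast<float>(t);
        out[i].pz  = b.pz  + (b.pz  - a.pz)  * static_cast<float>(t);
        out[i].yaw = b.yaw + (b.yaw - a.yaw) * static_cast<float>(t);
    }
    return out;
}

int main() {
    Snapshot prev{0.000, {{0.0f, 0.0f, 0.0f, 0.0f}}};
    Snapshot curr{0.033, {{1.0f, 0.0f, 0.0f, 0.1f}}};

    // Render an extra frame halfway to the *next* expected CPU tick.
    std::vector<ObjectState> extra = extrapolate(prev, curr, 0.0495);
    std::printf("extrapolated x = %.2f\n", extra[0].px); // ~1.50

    // The camera, by contrast, would not be extrapolated: it would be
    // re-sampled from the latest mouse/controller input right before
    // rendering, which is where the latency win comes from.
}
```

A real implementation would obviously use quaternions, velocities from the physics state, disocclusion handling and so on, but the principle is the point: generate additional frames from real 3D geometry on the GPU's schedule, not the CPU's.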
While there are mountains of difficulties in creating such a system, Nvidia has good engineers.
This approach has the potential to sidestep many of the issues of current framegen and rendering pipelines and to offload far more work onto the GPU. The beauty is that it essentially replicates the interpolation games have used to smooth animation for decades. It could eliminate most of the artifacts that plague framegen today while requiring no change to the game itself.
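For reference, the decades-old pattern I mean is the classic fixed-timestep loop that blends the two most recent simulation states at render time. A self-contained toy version, with made-up rates:

```cpp
#include <cstdio>

// Toy state: one object falling under gravity.
struct State { double y = 100.0, vy = 0.0; };

// Fixed-rate physics tick (semi-implicit Euler).
State simulate(State s, double dt) {
    s.vy -= 9.81 * dt;
    s.y  += s.vy * dt;
    return s;
}

// Blend two simulation states for rendering in between ticks.
State lerp(const State& a, const State& b, double t) {
    return { a.y + (b.y - a.y) * t, a.vy + (b.vy - a.vy) * t };
}

int main() {
    const double simDt    = 1.0 / 30.0;   // simulation at 30 Hz
    const double renderDt = 1.0 / 240.0;  // rendering at 240 Hz
    double accumulator = 0.0;

    State prev, curr;
    for (int frame = 0; frame < 240; ++frame) {   // one simulated second
        accumulator += renderDt;                  // time since last render
        while (accumulator >= simDt) {            // run the fixed ticks owed
            prev = curr;
            curr = simulate(curr, simDt);
            accumulator -= simDt;
        }
        const double alpha = accumulator / simDt; // 0..1 between ticks
        const State blended = lerp(prev, curr, alpha);
        std::printf("frame %3d  y = %.3f\n", frame, blended.y);
    }
}
```

The conceptual leap would be doing this blending (or extrapolation) at the driver/platform level, on full 3D state, rather than inside each individual game.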
We're already seeing (still rather single-threaded) CPU performance in games cause numerous problems in simply reaching 60 FPS, let alone genuinely acceptable framerates. This approach could sidestep that bottleneck.
My second remark concerns temporal fidelity and benchmarking/objective measurement.
I have issues with the way performance is being discussed, especially in light of framegen. I want to start by saying that framegen and interpolation have a significant place in the future of graphics rendering. I use it every day, and I'm working on the technology myself. I think it can bring tremendous value and produce far better results when the native input framerate cannot be easily or effectively improved. It is, however, in my opinion, always and inherently inferior to native high framerates. If it's only a little worse, that's still a win, but it can never be as good; it can only get close.
That is to say, generated frames can only mitigate the problems of low framerates; they cannot solve them, nor can they replace the need for native high framerates.
While framegen will often produce a result far preferable to the native low framerate, and the latency penalties are mitigated quite effectively, it seems disingenuous to describe the framegen output as a comparable "framerate" value. Framegen comes with an asterisk, and a large one at that, which should be acknowledged.
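As a rough back-of-the-envelope illustration of why the asterisk matters, assuming interpolation-style framegen has to hold back roughly one rendered source frame before it can blend (all numbers made up for illustration):

```cpp
#include <cstdio>

int main() {
    // Assumed numbers, purely for illustration.
    const double nativeFps  = 120.0;  // true 120 fps
    const double sourceFps  = 60.0;   // 60 fps interpolated up to a presented 120
    const double holdBackMs = 1000.0 / sourceFps; // ~one source frame buffered for blending

    std::printf("native 120 fps   : new simulation/input sample every %.1f ms\n",
                1000.0 / nativeFps);
    std::printf("framegen 60->120 : new simulation/input sample every %.1f ms,"
                " plus roughly %.1f ms of hold-back\n",
                1000.0 / sourceFps, holdBackMs);
    // Both present 120 frames per second, but only one of them samples input
    // and game state 120 times per second -- hence the asterisk.
}
```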
Temporal performance is getting more and more complex, and the corollary is that it's getting harder and harder to describe and measure it accurately and adequately. Even ignoring framegen, simple framerates or frametimes are woefully inadequate for describing the temporal performance of modern games. For decades now, games have run things like world physics, cloth physics, even the animation of characters or environmental objects at a lower rate than the actual rendering of the game. In Halo Infinite, for example, aiming down the sight of a gun triggers a non-interpolated, low-framerate animation that differs in smoothness by an order of magnitude from the rendering of the game itself. Such disparities will only grow as the upper bounds of rendering performance climb.
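To put made-up but plausible numbers on that kind of disparity (I don't know Halo Infinite's actual animation tick rate):

```cpp
#include <cstdio>

int main() {
    // Illustrative numbers only; the real animation tick rate isn't published.
    const double renderFps    = 150.0; // rendering / camera update rate
    const double animationFps = 15.0;  // non-interpolated animation tick rate

    std::printf("each animation pose is held for %.0f rendered frames\n",
                renderFps / animationFps);
    // The camera moves every ~6.7 ms while the animation only steps every
    // ~66.7 ms, so one image mixes two very different temporal resolutions.
}
```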
We often see games like Quantum Break with 24 or 30 FPS cutscenes, more understandable when filmed in live action or prerendered, that entirely disrupt the smoothness of the otherwise high-framerate presentation. The worst offenses are when those cutscenes are rendered in realtime and arbitrarily capped. I remember Quantum Break had one such cutscene that "smoothly" transitioned into user-controlled gameplay, but the framerate remained capped at that terribly low rate for a couple of seconds into the gameplay. Obviously that's a bug, but it emphasizes the arbitrary degradation developers inflict on the temporal fidelity of the presentation.
I don't know how, but for temporal fidelity to improve, we need buy-in from content creators themselves. We can only do so much to mitigate, after the fact, the harm a bad developer or filmmaker can inflict on temporal fidelity. Part of the problem is that it's a lot easier to advertise a game's static visuals than its fidelity in motion, but the issues are systemic.