RealNC wrote:I really don't care much about how a game looks. The most important thing is how a game plays. And with a lower frame rate, many games don't play well.
Jensen Huang can say "wow" all he wants, but when it comes down to it, if enabling RTX in a game cuts its frame rate in half, it is completely useless to me.
The bolded part is fine, but it's just one preference: yours. I love high FPS too; who doesn't? The real question is "how much FPS would you give up to get something that looks real?" So don't use it if you don't like the tradeoff. Personally, I'd say it depends on the type of game. A competitive shooter will have players running with barely any textures at low resolution; Counter-Strike and, more recently, PUBG look very basic for a reason.
Lots of people prefer higher quality at a lower FPS; in fact, I'd say the majority do. There's a threshold below which most PC players consider a game "unplayable", often 60 fps, but these early implementations will improve.
Ray tracing optimizations are being iterated on every day to reduce the number of rays cast per pixel while keeping the same (or better) image quality. In the past year alone I've read several peer-reviewed papers showing 2-10x speedups in various ray-tracing-related algorithms (e.g. photon mapping) on the same hardware. The beauty of combining RTX and machine learning is that things will surely improve a great deal. Imagine comparing an early GeForce driver against one several years newer: you'd most likely see a massive FPS boost. The same thing is going to happen with ray tracing, except many of those optimizations will live inside the engine rather than the driver, and they're being added to Unreal and Unity all the time with help from NVIDIA and Microsoft engineers.
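To make the "fewer rays per pixel" point concrete, here's a minimal Python sketch (a toy Monte Carlo model, not any vendor's actual pipeline): when you cut the sample count you trade away noise, not correctness, and the error only shrinks with the square root of the samples per pixel. That square-root slack is exactly what a learned denoiser exploits to make 1 spp viable.

```python
import random
import statistics

def pixel_estimate(true_radiance, noise_scale, spp, rng):
    """Monte Carlo estimate of a pixel's radiance: the average of `spp`
    noisy ray samples. RMS error shrinks roughly as 1/sqrt(spp), so
    halving the ray count only adds ~40% noise; noise a denoiser can
    often remove."""
    samples = [true_radiance + rng.gauss(0.0, noise_scale) for _ in range(spp)]
    return statistics.mean(samples)

rng = random.Random(42)

def rms_error(spp, trials=2000):
    """RMS error of the estimator over many independent pixels."""
    errs = [(pixel_estimate(0.5, 0.2, spp, rng) - 0.5) ** 2 for _ in range(trials)]
    return (sum(errs) / trials) ** 0.5

err_1 = rms_error(1)    # 1 sample per pixel: very noisy
err_16 = rms_error(16)  # 16x the rays gives only ~4x less noise
print(err_1, err_16)
```

The takeaway: brute-forcing quality with more rays has steeply diminishing returns, which is why the effort is going into denoisers and smarter sample placement instead.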
I'd encourage people to read this link, which addresses several misconceptions common among gamers who aren't developers and so only see part of what happens behind the scenes:
http://www.codercorner.com/blog/?p=2013
I've literally seen all of these "points" repeated ad nauseam on various forums. It's almost not worth correcting them, because the industry is moving hard into ray tracing even with consoles five years behind (they always were, and always will be). But hybrid rasterization + ray tracing games will hit a minimum of 60 fps at 4K (using temporal upscalers and other tricks) sooner rather than later, mark my words. Reducing the ray count below 1 sample per pixel (spp) is certainly possible through frequency analysis of the frame: you cast more rays where they count. And there are other "super secret" solutions coming down the pike that will greatly amplify framerates even with full ray-traced global illumination enabled.
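The sub-1-spp idea can be sketched in a few lines of Python. This is a hypothetical toy of my own (the `allocate_rays` function and its contrast metric are illustrations, not a shipping algorithm): use a cheap measure of local image variation as the "frequency" signal and spend a fixed ray budget where that signal is strongest, keeping the average below one sample per pixel.

```python
def allocate_rays(pixels, budget):
    """Toy adaptive sampler. `pixels` is a 1D list of luminance values
    from a cheap pre-pass; returns per-pixel ray counts whose total
    does not exceed `budget`. Importance is local contrast with the
    right-hand neighbour, a crude stand-in for frequency analysis."""
    importance = [abs(pixels[i] - pixels[min(i + 1, len(pixels) - 1)]) + 1e-6
                  for i in range(len(pixels))]
    total = sum(importance)
    return [int(budget * w / total) for w in importance]

# A mostly flat image with one sharp edge: the edge pixel should win
# nearly the whole budget, while flat regions get no extra rays.
image = [0.2] * 8 + [0.9] * 8
counts = allocate_rays(image, budget=8)  # 8 rays over 16 pixels: 0.5 spp average
print(counts)
```

A real renderer would use variance estimates and temporal history rather than a single-frame gradient, but the budgeting idea is the same: flat areas are cheap to reconstruct, so rays go to edges and high-frequency detail.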
Trust me, this is a huge deal. The image quality boost from global illumination is something we haven't seen in ages; it's far more important than HDR, and that alone was a big leap in visual fidelity. I agree that in some games image quality takes a back seat to gameplay, but above 60 fps, most people really don't care. Sites like this one, full of high-framerate nuts like us, aren't representative of the norm.
All I'm saying is, you're free to use these new technologies, or not, as you wish. I don't see Skyrim benefiting much from ultra-high FPS except in VR, for instance, and many adventure games are slower paced. But eventually you'll be able to get ray-traced indirect light, shadows, and reflections without dropping below 60 fps much, if at all. A new golden age of rendering is coming.