Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20th

lossofmercy
Posts: 76
Joined: 06 Feb 2018, 18:00

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Post by lossofmercy » 21 Aug 2018, 17:02

darzo wrote:
lossofmercy wrote:Let's ignore for a bit that no serious developer is going to be developing exclusively for this technology until a console is created for it. And let's ignore for a bit that the games and effects will need to be developed around this tech to be impressive. Where would this tech be the most useful?
I think people are getting this part pretty wrong. One of the key things he mentioned is that this technology actually saves resources. As long as you're designing the physics of your game properly, presumably as you already should be, ray tracing is a technology that will unfold organically or largely on its own.
Yes, ray tracing is a lot easier for developers. But it still won't be implemented and taken advantage of in AAA games until it's readily available on consoles. Unless people really think the PC market alone can fund a 100-million-dollar game.

open
Posts: 223
Joined: 02 Jul 2017, 20:46

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Post by open » 21 Aug 2018, 18:59

The advantage here is that it is an Nvidia technology, and they have developed an SDK for it that will make it pretty easy for developers to tap into. This is pretty much the main reason I like Nvidia. They will undoubtedly reach out to game developers as well and offer assistance to get their tech into big games. They tried a similar strategy when they amped up their tessellation shader performance, and you could see how it could be used for things like HairWorks, but I think the potential of tessellation shaders still hasn't really been tapped, and it requires some creativity.

Ray tracing should offer more straightforward applications as well as a generally easier integration process for modern games, and the motivation for developers to use it should be a lot easier to come by. Where special tech using tessellation shaders would involve developing two branches of rendering and might not be worth the time, ray tracing could easily plug into existing games without simultaneously developing complex alternatives for non-supporting hardware. I think we will see it in limited amounts, but some of the uses should be very impressive and won't detract from performance too much.

darzo
Posts: 211
Joined: 12 Aug 2017, 12:26

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Post by darzo » 21 Aug 2018, 22:48

What's your background, open?
open wrote:I think we will see it in limited amounts, but some of the uses should be very impressive and won't detract from performance too much.
So what does it mean that a demo of a game is showing twice the fps?

darzo
Posts: 211
Joined: 12 Aug 2017, 12:26

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Post by darzo » 21 Aug 2018, 23:40

https://www.techradar.com/reviews/nvidi ... tx-2080-ti
Nvidia GeForce RTX 2080 Ti is a whopper of a graphics card that’s finally able to push 60fps 4K gaming with a single GPU.
I'm numb to this sort of reviewer stupidity at this point. In my opinion some of these people should pay rather than be paid for their employment.
This GPU also features two additional types of cores its predecessor never had in the form of RT and Tensor cores. The RT cores power ray tracing and, though Nvidia has yet to reveal how many of these RT cores are actually in the RTX 2080 Ti, they will supposedly allow this graphics card to render much more complex lighting scenarios and natural shadows than the 1080 Ti ever could.

Meanwhile, Tensor cores bring artificial intelligence to consumer graphics cards. Nvidia demoed how the neural network inside the RTX 2080 Ti can automatically colorize images and add detail to create super resolution images.
I just read another mention of these different types of cores on a different site, except there it was in the context that these cores take the place of additional CUDA cores that could otherwise have been added to improve performance. Needless to say, at this point I'm considering picking up a used 1080 Ti rather than rewarding Nvidia with more money, and then waiting for 7nm.
Although we haven’t had the chance to benchmark the card thoroughly, we did get to play multiple PC games at 4K and in excess of 60 frames per second (fps) with the RTX 2080 Ti at Nvidia’s GeForce Gaming Celebration event at Gamescom 2018.

Unfortunately, due to multiple non-disclosure agreements we signed, we can only tell you that Shadow of the Tomb Raider looks stunning with ray tracing turned on. Thanks to Nvidia’s RTX technology, shadows do indeed look more realistic – with different intensities everywhere depicted in a stony ruin in the rainforest. We also just gawked at the walls, watching them shimmer as light reflected and refracted off of them.

In terms of frame rate, Shadow of the Tomb Raider ran at a mostly consistent 50-57 fps, which is impressive given the game is running on a single GPU and in such an early state – on top of all the new ray tracing techniques.
This rather strongly suggests an improvement of <=20% rather than 40%-50%, and despite the dedicated cores ray tracing might be taking a toll on the entire graphics card.
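To put rough numbers on that, here is the back-of-the-envelope I'm doing; the 1080 Ti baseline is my own guess for a comparable 4K scene, not anything from the article:

```python
# Back-of-the-envelope for the point above. The 1080 Ti baseline is an
# assumed figure for illustration, not a number from the TechRadar piece.
baseline_1080ti_fps = 45.0        # hypothetical 1080 Ti result at 4K in a similar scene
rtx_2080ti_fps = (50 + 57) / 2    # midpoint of the 50-57 fps quoted above

uplift = rtx_2080ti_fps / baseline_1080ti_fps - 1
print(f"Implied uplift over the assumed baseline: {uplift:.0%}")  # roughly 19%
```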
We also played a variety of other PC games that shall not be named, and saw performance run in excess of 100 fps at 4K and Ultra settings. Unfortunately, we also don’t know how much power these GPUs had to draw to reach this level of performance.
Does Fortnite have ultra settings? People who have 1080 Tis could really help put these numbers in context. Hell, I can hit around 170 fps and play at 144 fps at 4K, 100% render scale, low settings in Overwatch with a 1080!

http://www.pcgameshardware.de/Grafikkar ... r-1263244/
https://www.pcgamesn.com/nvidia-rtx-2080-ti-hands-on

This keeps getting better. I can't believe that second article. Apparently multiple sources claim well under 60 fps at 1920x1080 with ray tracing. What a debacle.

mello
Posts: 251
Joined: 31 Jan 2014, 04:24

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Post by mello » 22 Aug 2018, 02:39

darzo wrote:https://www.techradar.com/reviews/nvidi ... tx-2080-ti
Nvidia GeForce RTX 2080 Ti is a whopper of a graphics card that’s finally able to push 60fps 4K gaming with a single GPU.
I'm numb to this sort of reviewer stupidity at this point. In my opinion some of these people should pay rather than be paid for their employment.
NDA. They simply can't give you a proper review or say anything at this point. They can tease a little tho.
darzo wrote:
This GPU also features two additional types of cores its predecessor never had in the form of RT and Tensor cores. The RT cores power ray tracing and, though Nvidia has yet to reveal how many of these RT cores are actually in the RTX 2080 Ti, they will supposedly allow this graphics card to render much more complex lighting scenarios and natural shadows than the 1080 Ti ever could.

Meanwhile, Tensor cores bring artificial intelligence to consumer graphics cards. Nvidia demoed how the neural network inside the RTX 2080 Ti can automatically colorize images and add detail to create super resolution images.
I just read another mention of these different types of cores on a different site, except there it was in the context that these cores take the place of additional CUDA cores that could otherwise have been added to improve performance. Needless to say, at this point I'm considering picking up a used 1080 Ti rather than rewarding Nvidia with more money, and then waiting for 7nm.
Keep in mind that these chips are much larger than Pascal, so when you look at it that way you are not really overpaying for Turing, as long as the performance is on point. For me it depends on how good the 20x0 cards are WITHOUT RT compared to the 10x0 series. But it might be wise to wait for the 7nm process, both from Nvidia and AMD, and the RT implementation should also be much better when 7nm rolls out on the market. My personal opinion: **wait for Cyberpunk 2077** (rumored to be released in mid 2019 or 2020) and then buy the strongest GPU available.
darzo wrote:
Although we haven’t had the chance to benchmark the card thoroughly, we did get to play multiple PC games at 4K and in excess of 60 frames per second (fps) with the RTX 2080 Ti at Nvidia’s GeForce Gaming Celebration event at Gamescom 2018.

Unfortunately, due to multiple non-disclosure agreements we signed, we can only tell you that Shadow of the Tomb Raider looks stunning with ray tracing turned on. Thanks to Nvidia’s RTX technology, shadows do indeed look more realistic – with different intensities everywhere depicted in a stony ruin in the rainforest. We also just gawked at the walls, watching them shimmer as light reflected and refracted off of them.

In terms of frame rate, Shadow of the Tomb Raider ran at a mostly consistent 50-57 fps, which is impressive given the game is running on a single GPU and in such an early state – on top of all the new ray tracing techniques.
This rather strongly suggests an improvement of <=20% rather than 40%-50%, and despite the dedicated cores ray tracing might be taking a toll on the entire graphics card.
All this confirms at this point is poor optimization in SotTR; better to wait for the final release, patches, and drivers instead of jumping to conclusions based solely on this.
darzo wrote:
We also played a variety of other PC games that shall not be named, and saw performance run in excess of 100 fps at 4K and Ultra settings. Unfortunately, we also don’t know how much power these GPUs had to draw to reach this level of performance.
Does Fortnite have ultra settings? People who have 1080 Tis could really help put these numbers in context. Hell, I can hit around 170 fps and play at 144 fps at 4K, 100% render scale, low settings in Overwatch with a 1080!

http://www.pcgameshardware.de/Grafikkar ... r-1263244/
https://www.pcgamesn.com/nvidia-rtx-2080-ti-hands-on

This keeps getting better. I can't believe that second article. Apparently multiple sources claim well under 60 fps at 1920x1080 with ray tracing. What a debacle.
At this point ray tracing is completely irrelevant and will remain so for some time. You will be able to turn it off in every game that supports it. What matters is performance in non-RT scenarios compared to the 10x0 series.

open
Posts: 223
Joined: 02 Jul 2017, 20:46

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Post by open » 22 Aug 2018, 08:44

We just have to wait until trusted sources can do thorough benchmarks. Unfortunately, we don't really know yet what the best option would be. If I were buying, I would just get a 1080 Ti. They are overstocked right now, and that has finally made the prices fall. On top of that, the 2080 Ti is a gamble: when you buy the card you are paying a premium for the silicon used for the RT and Tensor cores. It is likely that even with some architecture improvements, the 20 series will offer a worse performance-to-price ratio.
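To illustrate the kind of ratio I mean, here is a toy comparison; every number below is a placeholder assumption, not a benchmark result or a confirmed price:

```python
# Toy perf-per-dollar comparison. All figures are placeholder assumptions
# for illustration, not benchmarks or confirmed street prices.
cards = {
    "GTX 1080 Ti": {"price": 650.0,  "relative_perf": 1.00},  # assumed sale price, baseline perf
    "RTX 2080 Ti": {"price": 1199.0, "relative_perf": 1.35},  # assumed price and assumed uplift
}

for name, c in cards.items():
    value = c["relative_perf"] / c["price"] * 1000  # relative performance per $1000
    print(f"{name}: {value:.2f} relative perf per $1000")
```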

But these are just my theories. When the cards come out or the NDAs are lifted, we will find out.

mello
Posts: 251
Joined: 31 Jan 2014, 04:24

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Post by mello » 22 Aug 2018, 09:16

Also, Black Friday is not that far away, which is another reason someone might want to wait; you might get a great deal on a 1080 Ti, for example.

BlurZapper
Posts: 15
Joined: 12 Sep 2017, 21:31

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Post by BlurZapper » 22 Aug 2018, 18:08

So there is a graph Nvidia presented showing the performance increase of the 2080 vs the 1080. Of the six or so games benchmarked, the 2080 shows at least 1.2x the performance, 1.3x in about 90% of cases, and up to 1.5x in some. The catch, though, is that there is a second set of results for the same games with the 2080 using DLSS (Deep Learning Super Sampling) for anti-aliasing. With DLSS, all the games get about 2x the performance of the 1080. This is probably due to the AI Tensor cores handling anti-aliasing instead of the shader cores. This is nice to see.
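Just to show how those multipliers stack up against an example baseline (the 60 fps starting point is made up, and the multipliers come straight from Nvidia's slide, so treat them as marketing numbers):

```python
# Illustrative only: applies the multipliers from Nvidia's 2080-vs-1080 slide
# to an example baseline. The baseline fps is made up; the multipliers are
# Nvidia's marketing figures, not independent benchmarks.
gtx_1080_fps = 60.0  # example baseline, not a measured result

scenarios = {
    "RTX 2080, worst case": 1.2,
    "RTX 2080, typical":    1.3,
    "RTX 2080, best case":  1.5,
    "RTX 2080 + DLSS":      2.0,
}

for label, multiplier in scenarios.items():
    print(f"{label}: ~{gtx_1080_fps * multiplier:.0f} fps")
```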

One question I have, though, is how real-time ray tracing (RTRT) will affect these benchmarks, since it requires the Tensor cores that DLSS would be hogging. I can imagine a compromise where DLSS and RTRT together still give a net performance gain over the prior-series GPU, not to mention better-looking images and lighting/reflections.

On a side note, I love this new PG279Q IPS monitor. It may be three years old, but I am impressed, especially that ULMB doesn't hamper the image quality at all. My previous monitor didn't have ULMB or G-SYNC; both are great.

Got a pre-order for the 2080, will be exciting times soon.

(TL;DR: benchmarks of pre-RTX GPUs don't really mean anything anymore, since anti-aliasing can be offloaded to the Tensor cores. Earlier people in the thread said RTX cards "only have ~20% more CUDA cores, i.e. less than 20% fps gain", but this doesn't matter since anti-aliasing is no longer a tax on the CUDA cores. Make sense?)

darzo
Posts: 211
Joined: 12 Aug 2017, 12:26

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Post by darzo » 22 Aug 2018, 22:31

It does make sense. But if you take anti-aliasing out of the equation, it appears to be slower than a 1080 Ti. Do you have any sources for further information?

darzo
Posts: 211
Joined: 12 Aug 2017, 12:26

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Post by darzo » 23 Aug 2018, 14:24

https://www.extremetech.com/gaming/2759 ... rong-cards

This person argues against the value of the new cards but ironically presents them as significantly better than I expected, DLSS and ray tracing aside (the latter being in absolute gimmick territory given the pathetic Tomb Raider demo, although that didn't stop those clowns from making it the focus even of the performance claims). A number of sources also report lower figures than the ones you cite. The 2080 may on average be 40% faster than the 1080. Furthermore, and this surprises me, according to actual tests the 1080 Ti turns out to be only 27% rather than 37% faster than a 1080, which contradicts what Nvidia has claimed and may also cast doubt on how they're presenting this new generation. To be honest, that seems too low in general; I haven't heard people complaining about upgrading to the 1080 Ti.
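Chaining those rough percentages together gives a sense of where the 2080 would land against a 1080 Ti; these are the approximate figures discussed above, not measurements of mine:

```python
# Chains the rough percentages above to see what they imply for 2080 vs 1080 Ti.
# The 40% and 27% figures are the approximate claims discussed above, not test results of mine.
rtx_2080_vs_1080   = 1.40  # 2080 said to be ~40% faster than a 1080
gtx_1080ti_vs_1080 = 1.27  # 1080 Ti reportedly ~27% faster than a 1080

rtx_2080_vs_1080ti = rtx_2080_vs_1080 / gtx_1080ti_vs_1080 - 1
print(f"Implied 2080 advantage over the 1080 Ti: ~{rtx_2080_vs_1080ti:.0%}")  # ~10%
```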
