Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20th

Breaking news in the gaming monitor industry! Press releases, new monitors, rumors. IPS, OLED, 144Hz, G-SYNC, Ultrawides, etc. Submit news you see online!
RealNC
Site Admin
Posts: 2821
Joined: 24 Dec 2013, 18:32

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Post by RealNC » 21 Aug 2018, 08:44

darzo wrote:That CEO of theirs is a sleazy guy. He lied about the release, kept pulling that Titan trick, and spoke exclusively about ray tracing, even branding the cards as 6x faster in reference to it. I'm hopeful the recent rumors won't be wildly wrong but given I'm short on money I'll be waiting a little. He likes to portray himself as your common guy but I'm starting to dislike that man.
You forgot "wow, wow. Wow. Wow. Wow. Wow. Wow."

Yeah, he makes it very, very difficult to like him.

EDIT:

That came off stronger than I intended. Basically I should instead say "he makes it very difficult to care much about what is being said."
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

Chief Blur Buster
Site Admin
Posts: 6480
Joined: 05 Dec 2013, 15:44

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Post by Chief Blur Buster » 21 Aug 2018, 12:31

While we respect members' rights to their opinions -- we don't want to burn Blur Busters' good relationship with NVIDIA and AMD, who help pay the bills indirectly through G-SYNC and FreeSync as well as blur reduction innovations.

So on that note, let's give due kudos to the hard-working employees of these companies, the programmers, the developers, the tweakers, who try to milk the most out of the technology. They may not be executives or board members or whatnot, but many of those people are very appreciated by Blur Busters for pushing the state of the art wherever possible. Kudos where it counts.

That said, I'm happy to see slightly more sane prices after the Bitcoin mess. Hopefully that means much cheaper 1080 Ti's. I need another AORUS GeForce 1080 Ti Extreme 11GB card to go SLI for better VR performance for upcoming VR reviews.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

       To support Blur Busters:
       • Official List of Best Gaming Monitors
       • List of G-SYNC Monitors
       • List of FreeSync Monitors
       • List of Ultrawide Monitors

lossofmercy
Posts: 61
Joined: 06 Feb 2018, 18:00

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Post by lossofmercy » 21 Aug 2018, 13:30

Let's ignore for a bit that no serious developer is going to be developing exclusively for this technology until a console is created for it. And let's ignore for a bit that the games and effects will need to be developed around this tech to be impressive. Where would this tech be the most useful?

This really needs to get into a game with a lot of construction and destruction, like Rust, Battlefield V, and Siege. Overall, the demos/trailers were very mediocre. The only impressive one was the Battlefield V demo, due to the amount of reflective surfaces, the urban environment, and everything being in motion. The Tomb Raider one was like whatever... the kid isn't even running around.

In the future, this could be really impressive with some really interesting sequences in a night club etc.

open
Posts: 223
Joined: 02 Jul 2017, 20:46

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Post by open » 21 Aug 2018, 14:14

I think some of it depends on how it is implemented on the GPU and in the drivers. I remember one of the GPU Gems books showed off a technique where you could have the pixel shader do a ray tracing operation against 3D data mapped to a 2D texture in-game. So as long as the polygon in question didn't get too close to perpendicular to the screen plane, you could fit a lot more geometry data onto something that was completely flat before. GPUs have a history of becoming highly programmable and lending themselves to original and unintended techniques for both rendering and computation. I assume that the implementation of the ray tracing hardware will lend itself to these kinds of techniques, and we will see at least a few creative uses of them down the road a bit. But it could also just be a highly specialized set of hardware that is more difficult to adapt to new uses.
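For anyone who hasn't seen that GPU Gems trick: the core idea is ray-marching against geometry packed into a 2D texture (a heightfield), so a flat polygon looks like it has real depth. Here's a rough sketch of that idea in plain Python rather than shader code; this is my own illustrative reconstruction, not the book's actual implementation:

```python
def march_heightfield(height, u, v, view_du, view_dv, view_dz, steps=64):
    """Ray-march a heightfield stored as a 2D grid. Start at texture
    coordinate (u, v) on the polygon surface (depth 0.0) and step along
    the view direction until the ray dips below the stored height.
    The returned (u, v) is where you'd sample color/normal maps,
    giving the illusion of geometry on a completely flat polygon."""
    w, h = len(height[0]), len(height)
    depth = 0.0
    step = 1.0 / steps
    for _ in range(steps):
        # advance along the view ray in texture space
        u += view_du * step
        v += view_dv * step
        depth += view_dz * step
        # sample the heightfield (nearest-neighbor for simplicity)
        ix = min(max(int(u * (w - 1)), 0), w - 1)
        iy = min(max(int(v * (h - 1)), 0), h - 1)
        if depth >= height[iy][ix]:
            return (u, v)  # ray has entered the surface
    return (u, v)  # no hit within the march; return the far sample
```

Note the failure mode open mentions falls right out of this: as the view ray goes toward perpendicular to the screen plane (large view_du/view_dv per unit of depth), the fixed step count skips past thin features and the illusion breaks down.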

I don't think we can say the CEO is really a sleazy guy over the whole thing. Even if the cards are not a huge step up for current games, it just represents Nvidia's philosophy as it has changed in recent years. As a company they have clearly diversified their focus, investing heavily in areas like self-driving cars and AI. With this architecture they have gone with a heavy focus on ray tracing, and the performance increase is substantial in that regard. The clear advantage for them here is that they can expand into the non-realtime rendering market. It's an area that will be very important for their Quadro cards, since those use the same architecture.

Looking at them as a company from the longer term until now, they have distinguished themselves most as an aggressive company that is always looking for a way to keep advancing even when they are already ahead. This is just what we are seeing. Vega didn't challenge their position at the top in gaming, so they looked in other directions. But if you compare them to companies like Intel, they still look pretty good from a consumer's standpoint. Intel continues to offer top-tier products, but there has been significant stagnation on their part. A lot of Intel's products have drawn criticism from the gaming and enthusiast market over decisions that seemed to limit lifespan, performance, and utility, and those decisions were probably made intentionally. Nvidia on the other hand, while they ARE trying to dominate the market, are doing it by offering more incentives, rather than trying to control it by hobbling their own products as Intel has chosen to do.

This time around we MAY have seen the traditional gaming market somewhat overlooked, which MAY be a bad thing for Nvidia given the possible timing of significant new tech from AMD. AMD have proven that they can make very effective use of their R&D budget and should not be discounted from the market. This time around their strategy is coming full circle to hit many markets with a chain of Zen, Infinity Fabric, and Navi technologies all working together. AMD have distinguished themselves as a smart company and Nvidia have distinguished themselves as an aggressive company. Looking at what other companies are doing, not just in gaming-oriented markets but in the world in general, I don't think we can really say that either company is doing something bad. Rest assured that if Nvidia have diversified their interests this time around, it does not mean that they intend to stagnate in the gaming sector. From a revenue standpoint, gaming is still a massive part of their business. And from a technology standpoint, it has really been the foundation that let them diversify so well. I think we are just seeing part of a cycle where they focus on core technologies, then diversify and pioneer new tech, and then go back to the core.
Last edited by open on 21 Aug 2018, 14:18, edited 2 times in total.

darzo
Posts: 204
Joined: 12 Aug 2017, 12:26

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Post by darzo » 21 Aug 2018, 14:16

Looking at the prices of Nvidia's partners, the speculation that most cards will be cheaper than the Founders Edition is of course wrong (Nvidia is finally overclocking and cooling their own cards significantly). The Tis will be around $1,200, the 2080 around $800 ($850 in a number of cases). The big surprise is the 2070: the Founders Edition is $600! I think the Tis and the 80s might be priced the same as the previous generation (Ti = Titan), but I recall buying a 970 for $350, and I think the 1070s were in that ballpark unless I'm mistaken, around $400-450 maybe? Mining does appear to have passed, so at least these prices shouldn't be inflated.

This guy doesn't mince words:

https://www.forbes.com/sites/jasonevang ... ype-train/
The reality of the RTX 20 Series that releases next month is this: it's a money-grab designed to get early adopters on the ray tracing hype train for the 20 or so games that will ship with the feature. It's a stopgap to next year's 7nm cards which will offer substantial performance gains and power efficiency improvements. And as for the price tag, Nvidia can charge whatever it desires due to lack of competition in the high-end space.

Seriously, glance at the clock speeds for the 20 Series. Check out the unimpressive CUDA core increase over the 10 Series. Realize that the memory configuration is the same as the 10 Series (albeit with GDDR6 instead of GDDR5). Take a hard look at what the performance increase will be for . Most in the tech media are putting it at maybe 10% to 15% over the 10 Series when it comes to the majority of games out there. But you'll pay 40% to 50% higher prices for this generation's replacements based on MSRP. And you know we won't be paying MSRP. . .
I'm not sure who he's referring to in the performance increase estimations (I've read that the GDDR6 specs are a jump) and I think he's plain wrong about the prices, except in the case of the 2070.
If you really want an RTX 20 series card, there's a strong chance that Nvidia will release 7nm versions in 2019 (RTX uses the 12nm process) with substantial performance gains, improved power efficiency and likely double the GDDR6 memory capacity. Why? Because that's likely the point when AMD will have some kind of competitor at the high end, and Nvidia will want to leapfrog them. It's also when we'll know many more details about Intel's upcoming Arctic Sound gaming GPUs slated for 2020.
I do keep reading about 7nm AMD cards late next year, and if those really are a big deal then Nvidia will be compelled to respond. We'll see. I would be really surprised if it's true that the 20xxs are only up to 20% faster than the 10xxs; I'm still expecting 40-50% per the rumors. That's going to be the key. The logic of profiting off early pre-orders doesn't come off as smart: Nvidia's reputation at this point is much more valuable than the little bit of money they'd make in a limited early sale of cards, and I've read that the available pre-order units are actually pretty few. If it in fact turns out these cards are only <=20% faster, I'll be one of many to remember what happened here. Of course, AMD would actually need to produce a card that isn't trying to launch into orbit while performing worse, and to support it with drivers that aren't forget-about-it (uhm, my monitors are G-SYNC... they might've gotten us here). And even then Nvidia may beat them by a substantial margin. At that point I'd really like to see that guy off the stage, though.
Ask yourself why Nvidia showed approximately zero gaming benchmarks without ray tracing. There were no performance comparisons between, say, the GTX 1080 Ti and the RTX 2080 Ti for the most popular games already out there.

When the GTX 1080 launched, Nvidia repeatedly reminded us how its performance dwarfed the GTX 980 in everything from AAA gaming to VR. When the GTX 1070 arrived, Nvidia boasted that it matched or beat the Titan X. That's a compelling upgrade argument.
This is what concerns me. That guy up there strikes me as more sleazy than a goof. Hopefully he just really likes ray tracing and didn't feel he needed to mention actual performance (I still hate the 6x-faster marketing, especially after reading the exact wording of it). Maybe we'll like it a lot too once we see it, although I'm also worried about what it would do to fps. Keep in mind monitors have been outpacing graphics cards, and not just the new 4K 144Hz releases. I wasn't hitting either 200 fps at 1080p (the game's cap) or 165 fps at 1440p on low settings in Destiny 2 with my Asus Strix 1080, and the other components of my computer have so far held up (5GHz 7700K, 3600 RAM).
lossofmercy wrote:Let's ignore for a bit that no serious developer is going to be developing exclusively for this technology until a console is created for it. And let's ignore for a bit that the games and effects will need to be developed around this tech to be impressive. Where would this tech be the most useful?
I think people are getting this part pretty wrong. One of the key things he mentioned is that this technology actually saves resources. As long as you're designing the physics of your game properly, presumably as you already should be, ray tracing is a technology that will unfold organically, largely on its own. I couldn't tell anything from Twitch, but that doesn't mean you wouldn't on your own monitor. Consider the demo of the room with the window: completely lit without ray tracing, largely dark with it. It's possible that ray tracing will both become standard and significantly improve what we see. The games listed were just games that will be out relatively soon. You'd expect future games to continue with ray tracing, dispensing with inferior and resource-consuming lighting techniques (although ironically this might be unrealistic given how few cards would support ray tracing; they'd have to support both).

This is similar to people discounting HDR monitors. Just because few games support HDR right now doesn't mean upcoming games won't have it. Interestingly, I seem to be getting a better picture with Wide Gamut in SDR vs HDR in Destiny 2, but that's for another thread.

Please stop with this philosophy-change speculation. Diversifying has nothing to do with performance in gaming, unless you'd like to claim they don't care much about gaming as a result, which I find baseless. Them just being "tone deaf" with respect to catering to gamers is what would make the guy a goof. It's sleazy or goof; pick one. Again, I hope he knew better, and that given the cards will be an adequate performance improvement (40-50%) he didn't feel the need to mention it. I also have no idea how you can call AMD smart when their product releases have been disappointment after disappointment, which may be to the detriment of all of us except Nvidia.

open
Posts: 223
Joined: 02 Jul 2017, 20:46

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Post by open » 21 Aug 2018, 14:31

darzo wrote:I'm not sure who he's referring to in the performance increase estimations (I've read that the GDDR6 specs are a jump) and I think he's plain wrong about the prices, except in the case of the 2070.
I think you are right here. You cannot just take a core*clock comparison when techs like the new unified cache architecture are in play, especially given that it is likely to synergize well with the GDDR6 upgrade. There will be SOME extra improvement in performance. What we will ultimately see is a middle ground between the hyped-up early benchmarks and the ultra-conservative core counters. Where that ends up may well be in the range you predicted.
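Just to put numbers on what the "core counters" are doing versus what the memory subsystem suggests, here's the naive math written out. The spec figures below are the launch-day numbers as reported (boost clocks in MHz, bandwidth in GB/s), so treat this as an illustration of the estimation method, not a performance prediction:

```python
def naive_scaling(cores_new, clock_new, cores_old, clock_old):
    """Naive throughput ratio from shader count x clock alone --
    the 'core counter' estimate that ignores architectural changes
    like the unified cache or improved scheduling."""
    return (cores_new * clock_new) / (cores_old * clock_old)

# RTX 2080 Ti vs GTX 1080 Ti, per reported spec sheets (illustrative):
# 4352 cores @ 1545 MHz boost vs 3584 cores @ 1582 MHz boost,
# 616 GB/s (14 Gbps GDDR6) vs 484 GB/s (11 Gbps GDDR5X).
ratio_compute = naive_scaling(4352, 1545, 3584, 1582)
ratio_bandwidth = 616 / 484

print(f"core x clock ratio: {ratio_compute:.2f}")   # ~1.19
print(f"bandwidth ratio:    {ratio_bandwidth:.2f}") # ~1.27
```

The gap between those two ratios is exactly why the pessimistic and optimistic camps disagree: games limited by shading throughput look like a ~19% uplift on paper, while bandwidth-bound cases leave more headroom, and architectural changes can land anywhere above the naive floor.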

As for 7nm, the manufacturing companies are not quoting that big of a total performance gain from going that small, and it is likely that yield rates will be an issue that affects performance per price. It makes sense for AMD to go to 7nm first given the die size differences and how die sizes affect yield rates; they are the more ideal client for this tech. All things considered, neither AMD nor Nvidia will see a lot of differentiation through manufacturing.

darzo
Posts: 204
Joined: 12 Aug 2017, 12:26

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Post by darzo » 21 Aug 2018, 14:35

Most in the tech media are putting it at maybe 10% to 15% over the 10 Series
This is what really caught my eye. One, is it true that most in the tech media think this, two, on what basis?

So you're claiming AMD might hit 7nm first because they won't sell many cards? :?

open
Posts: 223
Joined: 02 Jul 2017, 20:46

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Post by open » 21 Aug 2018, 15:01

I'm saying that AMD will hit 7nm first because they are a more ideal client for the tech. Manufacturers sell multiple processes at once. The 7nm process is newer, and yield rates on it will be lower, especially at the beginning of its use. Since AMD's dies will be smaller, their yield rates will inherently be better, which helps counterbalance the disadvantages of the new process. Nvidia's dies are bigger, so they want to go with a slightly older process whose better yield rates counterbalance the worse yields their large dies would otherwise have.
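To make the die-size argument concrete, here's the textbook Poisson yield model. The defect densities below are made-up numbers purely to show the trend (real fabs use more refined models, like negative binomial, and don't publish their defect rates):

```python
import math

def poisson_yield(die_area_mm2, defect_density_per_cm2):
    """Classic Poisson yield model: the fraction of good dies falls
    off exponentially with die area times process defect density."""
    area_cm2 = die_area_mm2 / 100.0
    return math.exp(-area_cm2 * defect_density_per_cm2)

# Illustrative only: a mature process vs a new node with higher
# early-life defect density, for a small die vs a big GPU die.
for area in (230, 750):
    mature = poisson_yield(area, 0.1)    # assumed mature-node defects
    new_node = poisson_yield(area, 0.5)  # assumed new-node defects
    print(f"{area} mm^2: mature {mature:.0%}, new node {new_node:.0%}")
```

With these assumed numbers, a small die loses maybe half its yield moving to the new node, while a huge die gets almost completely wiped out, which is the whole reason a big-die designer waits for the process to mature.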

But the advantages of 7nm are not as big as they might seem at first. With each shrink we see smaller gains in performance, efficiency, and size. Usually in a performance-driven market like gaming, all gains are used to offer more performance, and they still won't amount to that much. Furthermore, it is likely that some of the gains from the smaller process will be eaten by the price charged by the manufacturer.

Both nvidia and amd are going with the process that makes the most sense for them and will in the end allow them to potentially offer the consumer the best deal as well.

darzo
Posts: 204
Joined: 12 Aug 2017, 12:26

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Post by darzo » 21 Aug 2018, 15:23

Why are their die sizes smaller? You also seem to be assuming that Nvidia won't go to 7nm with AMD whereas most others seem to believe Nvidia will do so.

darzo
Posts: 204
Joined: 12 Aug 2017, 12:26

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Post by darzo » 21 Aug 2018, 15:39

So about that claim I quoted yesterday about the 2080ti getting twice the fps of a 1080ti in a demo:

https://twitter.com/NVIDIAGeForce/statu ... 6100091904

Nvidia makes this claim themselves, but the crux is in the first response:
*Comparing a game with ''RTX'' settings

Or in other words: Turing cards outperform Pascal cards on games with RTX-settings enabled.

Yes. no shit.
Can ray tracing be enabled with non-RTX cards? I know that HDR can't be with non-HDR monitors.
Twice the performance on games supporting ray tracing and roughly 24% on games not supporting it from the pure numbers.

-------------------------------

Don't believe that bullshit. Those words refer only to ray tracing capabilities, not real game performance. It will more likely have a 20-30% performance gain.

-------------------------------

Max. 20% more performance. The new Ti have only 17% more CUDA Cores.
Like we've covered, I'm just hoping these people are talking out of their asses, looking at a particular spec without understanding what's ultimately involved.
