Looking at the prices from Nvidia's partners, the speculation that most cards will be cheaper than the Founders Edition is of course wrong (Nvidia is finally shipping significant factory overclocks and better cooling on those). The 2080 Tis will be around $1,200 and the 2080s around $800 ($850 in a number of cases). The big surprise is the 2070: the Founders Edition is $600! I think the Tis and the 80s might be priced like the previous generation (with the Ti effectively taking the Titan's price slot), but I recall buying a 970 for $350, and I think the 1070s were in that ballpark unless I'm mistaken, around $400-450 maybe? The mining craze does appear to have passed, so at least these prices shouldn't be inflated.
This guy doesn't mince words:
https://www.forbes.com/sites/jasonevang ... ype-train/
The reality of the RTX 20 Series that releases next month is this: it's a money-grab designed to get early adopters on the ray tracing hype train for the 20 or so games that will ship with the feature. It's a stopgap to next year's 7nm cards which will offer substantial performance gains and power efficiency improvements. And as for the price tag, Nvidia can charge whatever it desires due to lack of competition in the high-end space.
Seriously, glance at the clock speeds for the 20 Series. Check out the unimpressive CUDA core increase over the 10 Series. Realize that the memory configuration is the same as the 10 Series (albeit with GDDR6 instead of GDDR5). Take a hard look at what the performance increase will be for existing games. Most in the tech media are putting it at maybe 10% to 15% over the 10 Series when it comes to the majority of games out there. But you'll pay 40% to 50% higher prices for this generation's replacements based on MSRP. And you know we won't be paying MSRP...
I'm not sure who he's referring to with those performance estimates (I've read that the GDDR6 specs are a real jump), and I think he's plain wrong about the prices, except in the case of the 2070.
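A quick back-of-the-envelope check on the pricing claim, using launch MSRPs as I remember them (treat the exact figures as approximate, and note the Founders Editions run higher):

```python
# Generation-over-generation launch MSRP jumps, from memory --
# double-check the figures before quoting them anywhere.
launch_msrp = {
    "GTX 1070": 379, "GTX 1080": 599, "GTX 1080 Ti": 699,  # 10 series
    "RTX 2070": 499, "RTX 2080": 699, "RTX 2080 Ti": 999,  # 20 series
}
for old, new in [("GTX 1070", "RTX 2070"),
                 ("GTX 1080", "RTX 2080"),
                 ("GTX 1080 Ti", "RTX 2080 Ti")]:
    jump = (launch_msrp[new] - launch_msrp[old]) / launch_msrp[old]
    print(f"{old} -> {new}: {jump:+.0%}")
# GTX 1070 -> RTX 2070: +32%
# GTX 1080 -> RTX 2080: +17%
# GTX 1080 Ti -> RTX 2080 Ti: +43%
```

If those figures are right, the size of the jump varies a lot by tier rather than being a flat 40%-50%.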
If you really want an RTX 20 series card, there's a strong chance that Nvidia will release 7nm versions in 2019 (RTX uses the 12nm process) with substantial performance gains, improved power efficiency and likely double the GDDR6 memory capacity. Why? Because that's likely the point when AMD will have some kind of competitor at the high end, and Nvidia will want to leapfrog them. It's also when we'll know many more details about Intel's upcoming Arctic Sound gaming GPUs slated for 2020.
I do keep reading about 7nm AMD cards late next year, and if those really are a big deal then Nvidia will be compelled to respond. We'll see. I would be really surprised if it's true that the 20xxs are only up to 20% faster than the 10xxs; I'm still expecting 40%-50% per the rumors. That's going to be the key. The logic of cashing in on early pre-orders doesn't come off as smart: Nvidia's reputation at this point is worth much more than the little bit of money they'd make on a limited early run of cards, and I've read that the available pre-order units are actually pretty few. If it in fact turns out these cards are only <=20% faster, I'll be one of many who remember what happened here. Of course, AMD would actually need to produce a card that doesn't sound like it's trying to launch into orbit while performing worse, and to support it with set-and-forget drivers (uhm, my monitors are G-Sync... that lock-in might've gotten us here). And even then Nvidia may beat them by a substantial margin. At that point I'd really like to see that guy off the stage, though.
Ask yourself why Nvidia showed approximately zero gaming benchmarks without ray tracing. There were no performance comparisons between, say, the GTX 1080 Ti and the RTX 2080 Ti for the most popular games already out there.
When the GTX 1080 launched, Nvidia repeatedly reminded us how its performance dwarfed the GTX 980 in everything from AAA gaming to VR. When the GTX 1070 arrived, Nvidia boasted that it matched or beat the Titan X. That's a compelling upgrade argument.
This is what concerns me. That guy up there strikes me as more sleazy than a goof. Hopefully he just really likes ray tracing and didn't feel he needed to mention actual performance (I still hate the 6x-faster marketing, especially after reading the exact wording of it). Maybe we'll like it a lot too once we see it, although I'm also worried about what it will do to fps. Keep in mind that monitors have been outpacing graphics cards, and not just the new 4K 144Hz releases. I couldn't hit either 200 fps at 1080p (the game's cap) or 165 fps at 1440p on low settings in Destiny 2 with my Asus Strix 1080, and the rest of my computer hasn't been the bottleneck so far (5GHz 7700K, 3600MHz RAM).
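To put the fps worry in concrete terms: refresh rate translates directly into a per-frame time budget, and whatever ray tracing costs has to fit inside it (the RT cost below is a made-up placeholder, not a measured number):

```python
# Frame-time budget at common refresh rates, and what a hypothetical
# ray-tracing cost would leave for the rest of the frame.
for hz in (60, 144, 165, 200):
    print(f"{hz:3d} Hz -> {1000 / hz:5.2f} ms per frame")

rt_cost_ms = 4.0  # illustrative placeholder, not a measured RTX figure
for hz in (144, 165):
    remaining = 1000 / hz - rt_cost_ms
    print(f"at {hz} Hz, {remaining:.2f} ms would remain for everything else")
```

At 165Hz you only have about 6ms per frame to begin with, so even a few milliseconds of added cost would hurt.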
lossofmercy wrote: Let's ignore for a bit that no serious developer is going to be developing exclusively for this technology until a console is created for it. And let's ignore for a bit that the games and effects will need to be developed around this tech to be impressive. Where would this tech be the most useful?
I think people are getting this part pretty wrong. One of the key things he mentioned is that this technology actually saves resources. As long as you're designing the physics of your game properly, as presumably you already should be, ray tracing is a technology that unfolds organically, largely on its own. I couldn't tell much from Twitch, but that doesn't mean you wouldn't on your own monitor. Consider the demo of the room with the window: completely lit without ray tracing, largely dark with it. It's possible that ray tracing will both become standard and significantly improve what we see. The games listed were just the ones shipping relatively soon; you'd expect future games to continue with ray tracing and dispense with inferior, resource-hungry lighting techniques (although ironically this might be unrealistic given how few cards would support ray tracing; developers would have to support both).
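To illustrate what I mean by "unfolds on its own" (a toy sketch of the general idea, nothing to do with Nvidia's actual RTX pipeline): with ray tracing, whether a point is in shadow falls out of the scene geometry itself, with no baked light maps or hand-tuned ambient terms.

```python
# Toy sketch: a surface point is lit only if the ray from it to the light
# is unobstructed. Occlusion comes from scene geometry alone -- no baked
# lighting data. (Ignores occluders beyond the light, for brevity.)
import math

def ray_hits_sphere(origin, direction, center, radius):
    """True if origin + t * direction (t > 0, unit direction) hits the sphere."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c  # quadratic with a == 1 since direction is unit length
    return disc >= 0 and (-b - math.sqrt(disc)) / 2 > 1e-6

def lit(point, light, occluders):
    """Cast a shadow ray from the surface point toward the light."""
    to_light = [l - p for l, p in zip(light, point)]
    dist = math.sqrt(sum(x * x for x in to_light))
    direction = [x / dist for x in to_light]
    return not any(ray_hits_sphere(point, direction, c, r) for c, r in occluders)

light = (0.0, 5.0, 0.0)               # say, sun coming through a window
occluders = [((0.0, 2.5, 0.0), 1.0)]  # a sphere hanging between floor and light

print(lit((0.0, 0.0, 0.0), light, occluders))  # False -- directly under the sphere
print(lit((3.0, 0.0, 0.0), light, occluders))  # True -- the shadow ray misses it
```

A real renderer traces far more rays per pixel and denoises the result, but the principle is the same: the lighting comes from the geometry you already have.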
This is similar to people discounting HDR monitors. Just because few games support HDR right now doesn't mean upcoming games won't have it. Interestingly, I seem to be getting a better picture with wide gamut in SDR than with HDR in Destiny 2, but that's for another thread.
Please stop with this philosophy-change speculation. Diversifying has nothing to do with gaming performance, unless you'd like to claim they don't care much about gaming as a result, which I find baseless. Their merely being "tone deaf" about catering to gamers is what would make the guy a goof. It's sleazy or goof; pick one. Again, I hope it's that he knew the cards will be an adequate performance improvement (40%-50%) and so didn't feel the need to mention it. I also have no idea how you can call AMD smart when their product releases have been disappointment after disappointment, which may end up being to the detriment of all of us except Nvidia.