Blur Busters Forums

Who you gonna call? The Blur Busters! For Everything Better Than 60Hz™

Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20th

Breaking news in the gaming monitor industry! Press releases, new monitors, rumors. IPS, OLED, 144Hz, G-SYNC, Ultrawides, etc. Submit news you see on the Net!

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Postby BattleAxeVR » 13 Sep 2018, 16:34

I think I'll wait for the reviews to come in; I want the absolute best Ti model possible, for eventual SLI'ing (if the RAM is doubled, of course).

One thing I found annoying while looking at all the vendors' options is that the only one with two HDMI 2.0b outputs is the Asus 2080 non-Ti.

So I ordered a DP 1.4 to HDMI 2.0b adapter, given that all the rest have three DisplayPort outputs and DP 1.4 is superior to HDMI 2.0b in terms of bandwidth. What worries me is a lag differential between the output ports, say, for VR headsets like the StarVR One that use dual HDMI connectors, one per screen; I need those to be perfectly synced. I recall the non-commercial version of the StarBreeze headset having dual HDMI 1.4 connectors and needing a DP-to-HDMI adapter, which was still annoying.

I don't see myself ever needing a triple-monitor setup, and I actually think DisplayPort might die as a standard once HDMI 2.1 comes out, which is far superior. Time will tell, I guess. I just think a 35-inch 21:9 4K monitor is enough desktop real estate for coding on, and I don't really see myself running triple monitors in that setup. Dual monitors, maybe, but I'd rather have two HDMI 2.0b and two DisplayPort 1.4 outputs than one and three. I guess I'm in the minority here. If VirtualLink takes off, none of this matters, really.
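For a rough sense of why DP 1.4 has the bandwidth edge, here's a minimal back-of-envelope sketch. The effective data rates (after 8b/10b line coding: ~14.4 Gbit/s of HDMI 2.0b's 18 Gbit/s link, ~25.92 Gbit/s of DP 1.4 HBR3's 32.4 Gbit/s) and the ~8% reduced-blanking overhead are my own ballpark assumptions, not figures from this thread:

    # Approximate link-budget check: which uncompressed RGB modes fit on which link?
    def needed_gbps(width, height, hz, bits_per_pixel=24, blanking=1.08):
        """Approximate bandwidth for an uncompressed 8-bpc RGB mode, in Gbit/s."""
        return width * height * hz * bits_per_pixel * blanking / 1e9

    LINKS = {"HDMI 2.0b": 14.4, "DP 1.4 (HBR3)": 25.92}  # effective Gbit/s

    for mode in [(3840, 2160, 60), (3840, 2160, 120)]:
        need = needed_gbps(*mode)
        fits = [name for name, cap in LINKS.items() if need <= cap]
        print(f"{mode[0]}x{mode[1]} @ {mode[2]} Hz: ~{need:.1f} Gbit/s -> "
              f"{', '.join(fits) if fits else 'needs compression or chroma subsampling'}")

Under those assumptions, 4K60 fits on both links (~12.9 Gbit/s), while 4K120 (~25.8 Gbit/s) only squeaks through on DP 1.4.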

I also got a Gigabyte motherboard for my new AMD Threadripper 2 workstation, so I kind of like the idea of sticking with Gigabyte for the video cards too. They have that cool up/down/up fan config, which makes sense to me as an engineer:

https://www.gigabyte.com/Graphics-Card/ ... NG-OC-11GC

One thing I regret is only buying a 1200-watt power supply. I think I'd need 2000 watts to be safe with SLI'd 2080 Tis plus the 800 watts that an AMD 32-core workstation overclocked to 4.2 GHz draws.
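As a quick sanity check on that sizing, here's a back-of-envelope sketch. The ~300 W per 2080 Ti is my assumed ballpark for a heavily loaded card, not a figure from the post; the 800 W system draw is the number quoted above:

    # Back-of-envelope PSU sizing under the stated assumptions.
    gpu_watts = 300          # assumed per-card draw under load (ballpark)
    cpu_system_watts = 800   # figure quoted in the post
    headroom = 1.3           # ~30% margin for transients / PSU efficiency sweet spot

    total = 2 * gpu_watts + cpu_system_watts
    print(f"Estimated load: {total} W; with headroom: {total * headroom:.0f} W")
    # -> Estimated load: 1400 W; with headroom: 1820 W, so ~2000 W is a sane target
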
BattleAxeVR
 
Posts: 40
Joined: 14 Dec 2017, 11:38

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Postby lossofmercy » 15 Sep 2018, 13:12

You are absolutely insane, man, but more power to you.
lossofmercy
 
Posts: 43
Joined: 06 Feb 2018, 18:00

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Postby darzo » 15 Sep 2018, 16:40

It's in his line of work.
darzo
 
Posts: 178
Joined: 12 Aug 2017, 12:26

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Postby mello » 16 Sep 2018, 05:42

There are some leaks already:

https://videocardz.com/77983/nvidia-gef ... e-unveiled
https://www.youtube.com/channel/UCIdTXc ... v3A/videos (6 videos so far)

Nothing special or mind-blowing... just steady progress, as expected. Still, I'm more interested in how the performance is at 1080p and 1440p. 4K gaming at high fps (90+, or 144-200 fps) is still at least 1-2 gens away.
mello
 
Posts: 160
Joined: 31 Jan 2014, 04:24

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Postby darzo » 16 Sep 2018, 17:51

https://wccftech.com/nvidia-geforce-rtx ... -official/

A ~50% uplift, without DLSS, is not what some were expecting.
darzo
 
Posts: 178
Joined: 12 Aug 2017, 12:26

Re: Nvidia GPU GTX 20XX Series likely revealed @ Gamescom 20

Postby open » 16 Sep 2018, 18:58

I would wait. The benchmarks from that video show anywhere from a couple percent increase to a 263% increase, with most results falling into either the 50% or the 20% region. The results from wccftech seem to be from an official Nvidia reviewer's guide, according to the text at the top. They all sit around a 50% increase for the 2080 Ti over the 1080 Ti.

If you watch or read any one of the Turing "deep dive" breakdowns posted all over the internet at this point, they go into more detail on the architecture changes between Pascal and Turing, and it starts to look optimistic that the cards will pull ahead of their raw core*clock ratio. This will be especially true with driver optimization, and for games that use heavy integer operations on the GPU.

But still, the video is just all over the place. The benchmarks that were only a few percent better were likely bottlenecked by the CPU, and the benchmarks that were 260% better, well, I don't even know what to say about that. Good job, Nvidia, if it's true.
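To make that CPU-bottleneck point concrete, here is a toy model; every number in it is hypothetical, not a benchmark. The frame rate you observe is capped by whichever side is slower, so the same GPU uplift can show up as anything from its full value down to zero:

    # Toy model: observed fps is capped by the slower of GPU and CPU.
    def observed_fps(gpu_fps, cpu_fps):
        return min(gpu_fps, cpu_fps)

    CPU_FPS = 150  # hypothetical CPU-side frame-rate ceiling for some game/scene
    for old_gpu, new_gpu in [(100, 150), (140, 210), (200, 300)]:  # +50% GPU each
        old = observed_fps(old_gpu, CPU_FPS)
        new = observed_fps(new_gpu, CPU_FPS)
        print(f"GPU {old_gpu} -> {new_gpu} fps: observed uplift {100 * (new / old - 1):+.0f}%")
    # The same +50% GPU shows up as +50%, +7%, or +0% depending on the CPU ceiling.
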

And we should refrain from quoting the Nvidia reviewer's guide, even if it is the most likely to be accurate. There is a reason we want to go with multiple third-party benchmarks. Also, waiting for the last-minute driver updates would be wise.

In 4-5 days we should start to get the final picture. These results are still interesting, though.

Good job though, darzo: it looks like the first guess you posted, of about a 40-50% increase, was very close to what we are seeing here.
open
 
Posts: 117
Joined: 02 Jul 2017, 20:46

