
Laiza

Member
Oct 25, 2017
2,170
Ah, sarcasm... Sure, maybe if you delay it to 2022 or 2023. 🙄

The 2080 Ti is a 12nm card that has been around for two years, with performance that's about to be considered mid-range this fall. AMD claims their RDNA2 architecture is 50% more efficient per watt than the first iteration in the 40-CU 5700 XT, which beats a 2070. So saying the XSX's GPU with 52 CUs is close to a stock 2080 Ti, which again is going to be mid-range soon, isn't some far-out claim.
The 5700 XT is neck-and-neck with a 2070... in games without ray tracing of any kind. We don't know how much of the new RDNA2 GPU's silicon is dedicated to ray tracing cores; if it's anything like what Nvidia did with Turing, you're going to get ray tracing performance roughly equivalent to a 2070's, with maybe 10-15% higher rasterization performance (putting it in the realm of a regular 2080).
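For rough context, here's the back-of-the-envelope math behind these comparisons (a sketch in Python; the CU/SM counts and clocks are the publicly quoted figures, and peak TFLOPS is only a crude proxy for real gaming performance across different architectures):

```python
# Peak FP32 throughput: units x 64 shader lanes x 2 ops/clock (FMA) x clock.
# Both RDNA CUs and Turing SMs have 64 FP32 lanes, so one formula covers both.
def peak_tflops(units: int, clock_ghz: float) -> float:
    return units * 64 * 2 * clock_ghz / 1000

cards = {
    "RX 5700 XT (40 CU @ ~1.905 GHz boost)":  (40, 1.905),
    "XSX GPU (52 CU @ 1.825 GHz)":            (52, 1.825),
    "RTX 2080 Ti (68 SM @ ~1.545 GHz boost)": (68, 1.545),
}
for name, (units, clock) in cards.items():
    print(f"{name}: {peak_tflops(units, clock):.2f} TFLOPS")
# ~9.75, ~12.15 and ~13.45 TFLOPS respectively -- close-ish on paper,
# but this says nothing about how much die area ray tracing hardware eats.
```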

Remember, the whole reason Turing was such a small jump over Pascal was ray tracing. If they had dedicated the entirety of Turing's die to rasterization, the cards would have easily been 30-50% more powerful than their Pascal equivalents. Ray tracing is heavy, and the fact that it's present in the next-gen consoles means AMD is dedicating a significant chunk of the new GPU's silicon budget specifically to it. You can't just assume it's an added freebie.

I'll be wildly impressed if AMD can put out something with power on par with even the mid-range Ampere offerings. Given their track record, that would be a major leap. I'm not crossing my fingers.
 

Kalik

Banned
Nov 1, 2017
4,523
I haven't been following the specs as far as Ampere vs Big Navi, but based on the rumors, which GPU looks more impressive? Or will both cards have their individual strengths and weaknesses, with no single GPU that does everything well? And how does Big Navi's ray-tracing implementation look compared to Ampere's?
 

ColdSun

Together, we are strangers
Administrator
Oct 25, 2017
3,290
I haven't been following the specs as far as Ampere vs Big Navi, but based on the rumors, which GPU looks more impressive? Or will both cards have their individual strengths and weaknesses, with no single GPU that does everything well? And how does Big Navi's ray-tracing implementation look compared to Ampere's?
No one knows any of that tbh.
 

Isee

Avenger
Oct 25, 2017
6,235
Preparations for RTX 3000 completed; transition to a PCIe 4.0 board accomplished. I know, not necessary, but also something I planned to do for NVMe anyway. Now let's hope that RTX 3080 - 3090 prices aren't completely bonkers.
 

MrKlaw

Member
Oct 25, 2017
33,038
20xx seemed to stall a little on classic rendering performance to fit in the Turing stuff, so it wasn't as big a performance jump as you'd hope if you aren't using RT. Will 30xx bring back that 'normal' shading jump, or are they pushing even more into RT, considering how power hungry that is? This is the second gen, so it's an opportunity to correct any mismatch in the balance between RT cores, Tensor cores for denoising/DLSS, etc.
 

Birbos

Alt Account
Banned
May 15, 2020
1,354
Preparations for RTX 3000 completed; transition to a PCIe 4.0 board accomplished. I know, not necessary, but also something I planned to do for NVMe anyway. Now let's hope that RTX 3080 - 3090 prices aren't completely bonkers.
When does Intel get in the PCIe 4.0 game? Scared to buy an AMD CPU.
 

Xx 720

Member
Nov 3, 2017
3,920
Dumb question, but... of course you pay more for a better/higher-spec graphics card, but does it cost any more for the manufacturer? Does Nvidia have to spend more to make a 2080 than a 2070, or is it just priced higher for the performance?
 

MrKlaw

Member
Oct 25, 2017
33,038
The 3080 should definitely outperform the 2080 Ti by a decent chunk, or Nvidia is shooting themselves in the foot.

If the 3080 Ti is a little ways off, then it makes sense for the 3080 to only slightly outperform the 2080 Ti. That way they get the extreme buyers to double dip on the 3080 and then the 3080 Ti.
 

Terbinator

Member
Oct 29, 2017
10,218
Dumb question, but... of course you pay more for a better/higher-spec graphics card, but does it cost any more for the manufacturer? Does Nvidia have to spend more to make a 2080 than a 2070, or is it just priced higher for the performance?
For the specific example you've used, yes.

The RTX 2080 and 2080 Ti are made from larger chips than the one used in the 2070.

I think it depends on the exact contracts Nvidia has with its fab partners, but typically Nvidia will buy a wafer of GPUs from its suppliers for an agreed price, and chips that don't hit certain power/performance requirements but can still be salvaged will be 'binned' and rebranded as a lower-tier card.
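To illustrate why the bigger chip costs more per die (a rough sketch; the wafer price and yield below are invented for illustration, and only the die sizes are the known TU102/TU106 figures):

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Classic approximation for gross dies on a round wafer."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_price: float, die_area_mm2: float, yield_rate: float) -> float:
    return wafer_price / (dies_per_wafer(die_area_mm2) * yield_rate)

# TU106 (2070) is ~445 mm^2; TU102 (2080 Ti) is ~754 mm^2.
# $6000/wafer and 70% yield are made-up numbers; big dies also yield worse
# in practice, so a flat yield actually understates the cost gap.
for name, area in [("TU106 / 2070", 445), ("TU102 / 2080 Ti", 754)]:
    print(f"{name}: ~${cost_per_good_die(6000, area, 0.70):.0f} per good die")
```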
 

Isee

Avenger
Oct 25, 2017
6,235
When does Intel get in the PCIe 4.0 game? Scared to buy an AMD CPU.

When their next CPU lineup gets released - by the end of 2020 or early 2021.

No need to fear AMD. They're pretty awesome CPUs now that all the launch BIOS problems have been figured out, lol.
The early months were a hell of a ride, though. And to my surprise, some Z490 BIOSes have their own problems too.
 

kami_sama

Member
Oct 26, 2017
6,998
Dumb question, but... of course you pay more for a better/higher-spec graphics card, but does it cost any more for the manufacturer? Does Nvidia have to spend more to make a 2080 than a 2070, or is it just priced higher for the performance?
It depends. A lot of the time different cards use different silicon, so less powerful cards have smaller dies.
Sometimes two cards share the same silicon, but part of the chip is disabled. Maybe some of the cores have defects, so they get disabled, and the company can still recoup costs.
Sometimes they disable those parts just because they need a card in that segment and designing new silicon is too costly.
Also, the PCBs, memory, power delivery, and heatsink/fans might be different because the cards have different power needs and specs.
 

dgrdsv

Member
Oct 25, 2017
11,846
1080 launched with doom
The 1080 launched a couple of months after Doom, and that was just a coincidence, as with pretty much everything else you can think of.
And Doom was never even in NV's devrel program, which is also why its Vulkan renderer never really got the same level of optimization that AMD got.
You could just as easily say that it was AMD who launched the RX 480 with Doom - and that wouldn't be true either.

2080 with shadow of the tomb raider for RTX
The 2080 launched in Sep 2018; the RTX patch for SOTTR came out in Jan 2019.
So again, no relation whatsoever.
The same will be true for Ampere.
 
Oct 27, 2017
4,107
I feel like I've hit the sweet spot with a 2080 Ti and 1440p ultrawide -- going to do my best to skip this next gen if possible; at the very least I will see what the next Ti has to offer.
 

EldarMu

Member
May 7, 2020
50
I'm probably going to upgrade from my 2080 Ti if there's a card that's 50% or more powerful.

Can't wait for next-gen Ryzen so I can upgrade from my 6700K. It's good, but not great anymore.
 

Isee

Avenger
Oct 25, 2017
6,235
I feel like I've hit the sweet spot with a 2080 Ti and 1440p ultrawide -- going to do my best to skip this next gen if possible; at the very least I will see what the next Ti has to offer.

Sameish. It all depends on the price, and a bit on the uplift in ray tracing performance for me. An RTX 3080 with just +20% performance isn't tempting enough, but add +50-60% additional RTX performance to the mix and I'd do it for 700€-800€.

A 3080 Ti/3090 with 50-60% more performance and an even higher RTX uplift? I might even go and spend another 1300€. But that's already the uncomfortable category. More than 1500€ and I'm out.

For now, I need to wait for all RTX 3000 variants to release, wait for the now-typical AMD/Nvidia responses and bait-and-switches, wait for GPUs to become widely available, and then decide.
As you said, it's not like 2080 Tis will suddenly become irrelevant from a performance perspective. They are powerhouses and were the best that could be made in 2018.
 

liquidmetal14

Banned
Oct 25, 2017
2,094
Florida
"Render more quicker"?

How about we, I dunno, just load all the relevant data into the absolutely massive piles of RAM and VRAM we have (which, for powerful gaming PCs, can be triple what the "next-gen" consoles are bringing in totality) so we don't have to have this silly conversation about magical SSDs?

I really tire of this whole charade.
Listen, I have no stake in any game, nor am I drinking any platform's Kool-Aid more than another's, but even I, as a PC enthusiast, can admit some great things are happening on consoles, especially the PS5. I'm not interested in a charade so much as just discussing the facts, and I also want the new, higher baseline the next-gen consoles bring to lift all the boats with the rising tide.

I'm not looking to be antagonistic, nor am I really sure how my statement would elicit heat. I will add that the devs I've spoken to, who are PC people btw, are pretty excited about the stuff in the PS5 that has been discussed a lot already, and that's OK.

Our shiny, super-fast machines can still run circles around these platforms, but you shouldn't underestimate the closed nature and ease of optimization that consoles allow vs PC. Couple that with some neat advancements, which are actually being bred by console design, trickling down and making PC ports use more of that powerful hardware, and I'm even more excited as a gaming PC owner.
 

Alexious

Executive Editor for Games at Wccftech
Verified
Oct 26, 2017
909
[Image: KatCorgi-RTX-3080-RTX-3090-Jun19.png]


Rumor, of course. Still, 21 Gbps would be insane if true.
 

Alexious

Executive Editor for Games at Wccftech
Verified
Oct 26, 2017
909
Yeah, saw this on the VideoCardz website. Wouldn't a 3090/80 Ti at 21 Gbps be on par with or better than the rumored Titan at 17 Gbps? Not complaining, but it seems the only advantage the Titan will have is total VRAM amount.

The TITAN would also have slightly more CUDA cores. But yes, unless a game requires more than 12 GB of VRAM, the 3090/80 Ti could be even better.
 

tokkun

Member
Oct 27, 2017
5,400
I don't get why people here are so focused on what the relative performance change is going to be for a given model number. What matters is the change in performance within a price class. That was the lesson we were supposed to learn from the last generation of cards, when all the models got moved to different price classes.
 

dgrdsv

Member
Oct 25, 2017
11,846
Yeah, saw this on the VideoCardz website. Wouldn't a 3090/80 Ti at 21 Gbps be on par with or better than the rumored Titan at 17 Gbps? Not complaining, but it seems the only advantage the Titan will have is total VRAM amount.
Titan is a prosumer card now which targets production and research more than gaming, so it kinda makes sense for it to have twice the VRAM even if it's slower. It's also likely down to power budget, meaning you'd probably be able to OC it to 21 Gbps if you're okay with hitting some 400W of consumption.

Also kinda bonkers that we might be looking at over a TB/s of bandwidth. Did we get over that with AMD HBM cards?
Radeon VII had 1024 GB/s of memory bandwidth.
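The bandwidth arithmetic is just per-pin speed times bus width (a quick sketch; the 384-bit bus on the top Ampere card is itself part of the rumor):

```python
def bandwidth_gb_s(pin_speed_gbps: float, bus_width_bits: int) -> float:
    """Total memory bandwidth in GB/s: per-pin Gbps x bus width / 8 bits per byte."""
    return pin_speed_gbps * bus_width_bits / 8

print(bandwidth_gb_s(21.0, 384))   # rumored 21 Gbps GDDR6 on 384-bit -> 1008 GB/s
print(bandwidth_gb_s(2.0, 4096))   # Radeon VII HBM2, 2 Gbps on 4096-bit -> 1024 GB/s
```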

I don't get why people here are so focused on what the relative performance change is going to be for a given model number. What matters is the change in performance within a price class. That was the lesson we were supposed to learn from the last generation of cards, when all the models got moved to different price classes.
Exactly. The 3090 can be +50% or +350% over the 2080 Ti - it won't matter to any current 20-series owner if it costs some $3,000.
 

Nikokuno

Unshakable Resolve
Member
Jul 22, 2019
761
Yeah, at the end of the day, most consumers will just look at their budgets and pick what's in their ballpark. 600€ is what I'm willing to pay atm to replace my old friend, the RX 480.
 

Alexious

Executive Editor for Games at Wccftech
Verified
Oct 26, 2017
909
Titan is a prosumer card now which targets production and research more than gaming, so it kinda makes sense for it to have twice the VRAM even if it's slower. It's also likely down to power budget, meaning you'd probably be able to OC it to 21 Gbps if you're okay with hitting some 400W of consumption.

Exactly. The 3090 can be +50% or +350% over the 2080 Ti - it won't matter to any current 20-series owner if it costs some $3,000.

Realistically, though, that's never going to happen. Maybe the Titan could approach that, but I don't see even a 3090 costing more than €1500.
 

Sabin

Member
Oct 25, 2017
4,609
A 3080 should be a pretty substantial upgrade over a 2080 Ti with these specs.

Drop it for around 800-900€, Nvidia, and a custom-design 3080 will be mine day one.
 

Smashed_Hulk

Member
Jun 16, 2018
401
Does this mean I'll need to upgrade to a motherboard that supports PCIe 4.0? Since PCIe 3.0 x16 supports a max of 15.75 GBps?


Edit: Nvm, I confused Gbps and GBps lol.
No, the speeds listed are for the RAM on the GPU; it has nothing to do with PCIe.
That speed is per pin on the GDDR6 RAM chip.

A single card isn't going to max out a PCIe 3.0 x16 link yet, not even the new cards. A 2080 Ti just barely maxes a 3.0 x8 link; going to x16 only gets about 3% more performance out of it.
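To spell out the units mix-up: memory speeds are quoted in gigabits per second per pin, while PCIe link bandwidth is gigabytes per second across the whole link (a sketch; this ignores protocol overhead beyond the line encoding):

```python
def pcie3_bandwidth_gb_s(lanes: int) -> float:
    """PCIe 3.0: 8 GT/s per lane with 128b/130b encoding, in GB/s."""
    return lanes * 8 * (128 / 130) / 8

print(f"x16: {pcie3_bandwidth_gb_s(16):.2f} GB/s")  # ~15.75 GB/s
print(f" x8: {pcie3_bandwidth_gb_s(8):.2f} GB/s")   # ~7.88 GB/s
```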
 

Edgar

User requested ban
Banned
Oct 29, 2017
7,180
So is the 3090 gonna be the new Ti? Or are we getting a 3080 Ti later down the line too? What about a 3090 Ti? lol. I usually like to get the highest-end GPU in the series and not worry about it for the next few years.
 

stoke1863

Member
Oct 29, 2017
383
It worries me greatly that people are now so comfortable dropping £1k+ on a GPU. When Nvidia started to gear towards that end, no one should have stumped up the cash; now they know they can charge these prices, and they're making them seem "normal". It's mental, because it just shifts prices up across the entire range.


I remember buying a GTX 970 for just £260!
 

Deleted member 56306

User-requested account closure
Banned
Apr 26, 2019
2,383
It worries me greatly that people are now so comfortable dropping £1k+ on a GPU. When Nvidia started to gear towards that end, no one should have stumped up the cash; now they know they can charge these prices, and they're making them seem "normal". It's mental, because it just shifts prices up across the entire range.


I remember buying a GTX 970 for just £260!

It sucks tbh.

I was going to ask how Nvidia could justify the price of some of these cards when the next-gen consoles will be a lot closer in performance to their mid-to-high-end options, but I feel like they don't really need to justify it.

Plus, we don't know how much they will cost yet, though I doubt they will be approaching anything above 600.
 

Kalik

Banned
Nov 1, 2017
4,523
If the 350W TDP rumor is true, what kind of PSU is going to be needed to run it alongside something like a 10600K?
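For a rough sense of the sizing, here's the usual back-of-the-envelope budget (a sketch with ballpark draws, not measurements; only the 350W GPU figure comes from the rumor):

```python
# Hypothetical power budget for a rumored 350W GPU plus a 10600K build.
components_w = {
    "GPU (rumored TDP)": 350,
    "CPU (10600K, 125W TDP; boost spikes go higher)": 125,
    "Motherboard, RAM, SSDs, fans (ballpark)": 75,
}
load_w = sum(components_w.values())  # 550 W estimated load
headroom = 1.4                       # margin for transients and PSU efficiency
print(f"Estimated load: {load_w} W -> ~{load_w * headroom:.0f} W PSU suggested")
```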
 

Nothing

Member
Oct 30, 2017
2,095
Generally, the NVIDIA product stack moves down one tier per generation (Ti to non-Ti), give or take a few percentage points, plus some gen-to-gen improvements a la improved ray tracing (supposedly; we'll see what happens with Ampere soon enough), so the 3080 will be around the level of a 2080 Ti.
Except this time, previous leaks have pointed towards the RTX 3070 being only ~5% slower than a 2080 Ti, and nearly matching it with overclocks.

We could be getting (near) 2080 Ti levels of performance for only $500.
 

UF_C

Member
Oct 25, 2017
3,347
At this point, I think my 8700K will be viable for another few years. Paired with a 3090, it should be a great upgrade from my 1080 Ti. My only concern is that the new consoles will be running 8 cores (I believe) and the 8700K is 6. But hopefully that won't become a problem for another year or so.
 

scabobbs

Member
Oct 28, 2017
2,103
So is the 3090 gonna be the new Ti? Or are we getting a 3080 Ti later down the line too? What about a 3090 Ti? lol. I usually like to get the highest-end GPU in the series and not worry about it for the next few years.
I think they're ditching the Ti and just going with the 3090 naming scheme to simplify things.
How much VRAM does 4k use? Articles I'm seeing point to around 8-10 GB,
You'll probably want at least 8 GB, but I don't know of any current games that use that much.
 

Nothing

Member
Oct 30, 2017
2,095
Not at all. I said "could".

But prices should be in line with the previous RTX lineups, and definitely not increasing, when you consider what AMD is putting out and that the powerful new consoles are releasing. I fully expect to be able to pre-order a nice RTX 3070 for $499, possibly even less.
 

Metroidvania

Member
Oct 25, 2017
6,768
Except this time, previous leaks have pointed towards the RTX 3070 being only ~5% slower than a 2080 Ti, and nearly matching it with overclocks.

We could be getting (near) 2080 Ti levels of performance for only $500.

At first I missed the Ti, and was like... 'da fuck'.

But yeah, that'd be pretty obscene IF they keep prices reasonable. It would also make me wonder what kind of power the 3080 Ti/3090 has.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,203
Dark Space
Generally, the NVIDIA product stack moves down one tier per generation (Ti to non-Ti), give or take a few percentage points, plus some gen-to-gen improvements a la improved ray tracing (supposedly; we'll see what happens with Ampere soon enough), so the 3080 will be around the level of a 2080 Ti.
No, the GTX 970 was on par with the 780 Ti, and the GTX 1070 clean smoked the 980 Ti at all resolutions.

We should be expecting the 3070 to be on par with the 2080 Ti.
 