
Playboi Carti

Member
Jan 1, 2018
1,272
Portugal
I meant RDNA2 in general. Let's see if these 80 CUs can destroy the 3080 Ti with ease; that's the more apt comparison here.
I see. If we take RDNA2's 50% perf-per-watt improvement over RDNA1 into account, then theoretically this card could be anywhere from 30% to 50% more powerful than the 2080 Ti. So yeah, if the math is right, this can trade blows with high-end Ampere.
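Rough napkin math, assuming RDNA2 keeps RDNA1's 64 shaders per CU and roughly the 5700 XT's ~1.9 GHz game clock (both assumptions, not confirmed specs):

    # Hypothetical 80 CU RDNA2 card, FP32 throughput estimate
    cus = 80
    shaders = cus * 64                        # 64 shaders per CU, as on RDNA1
    clock_ghz = 1.9                           # assumed ~5700 XT game clock
    tflops = shaders * 2 * clock_ghz / 1000   # 2 FLOPs per clock (FMA)
    print(f"~{tflops:.1f} TFLOPs")            # ~19.5

Against the 2080 Ti's official ~13.4 TFLOPs, that's ~45% more on paper, right in that 30-50% window.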
 

Deleted member 18161

user requested account closure
Banned
Oct 27, 2017
4,805
ILikeFeet

How many TFLOPs?

If they deliver that with more power than a 3080 and for less money, I don't care about DLSS.

I don't see them having an answer to DLSS 2.0 until the generation after these GPUs, personally (AMD's image upscaling is mostly sharpening, according to a lot of tech people).

If a lot of AAA games start supporting DLSS 2.0, then a 3070 is going to last the whole of the next console gen and provide much better RT performance than even a 2080 Ti.
 

Tagyhag

Member
Oct 27, 2017
12,597
I'd be glad if AMD also started hitting NVIDIA on the GPU front to push them into giving us cheaper cards, but unlike their CPUs, it's hard for me to justify an AMD card in the future without ray tracing and DLSS being part of the equation.

It'll be interesting to see if they come up with some features of their own. Even if this card were hypothetically stronger than the 3080 Ti, it wouldn't matter to me if a 3070 outmuscles it with DLSS, all while looking prettier thanks to RT.
 

Siresly

Prophet of Regret
Member
Oct 27, 2017
6,590
Think of the power of this GPU like taking two 5700 XTs and putting them into a tube. You'd end up with a very large tube, probably extending twice the size, and all of a sudden it would collapse and they become.
 

Bosch

Banned
May 15, 2019
3,680
But DLSS is essentially a free 50% performance improvement for the 3080. Maybe not across all games, but I'm willing to bet it will be true of a lot of demanding AAA games.
Six games support DLSS. We have no idea if most games will use the tech.

And if the whole market adopts it to prolong the life of your GPU, I doubt Nvidia won't start charging a subscription for the DLSS service...
 

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223
Close to 19.5 TFLOPs, I think (assuming the same clocks as the 5700 XT). The 2080 Ti is 14 TFLOPs.
The 2080 Ti is around 17 TFLOPs.

Nvidia always calculates TFLOPs using lower clock speeds than the cards actually reach in games.

Edit: might be closer to 16 TFLOPs. Most cards do around 1905 MHz stock from what I observe.
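Spelled out, with the 2080 Ti's 4352 shaders (1545 MHz is Nvidia's official boost; 1905 MHz is what I observe in games):

    # 2080 Ti FP32 TFLOPs at official boost vs. observed game clocks
    shaders = 4352
    for label, mhz in [("official boost", 1545), ("observed", 1905)]:
        tflops = shaders * 2 * mhz / 1e6   # 2 FLOPs per clock (FMA)
        print(f"{label}: {tflops:.1f} TFLOPs")
    # official boost: 13.4 TFLOPs, observed: 16.6 TFLOPs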
 
Last edited:
OP

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
Six games support DLSS. We have no idea if most games will use the tech.

And if the whole market adopts it to prolong the life of your GPU, I doubt Nvidia won't start charging a subscription for the DLSS service...
As long as Nvidia keeps sponsoring games, those will use DLSS. And they'll definitely keep opening the wallet for the big games.
 

Deleted member 18161

user requested account closure
Banned
Oct 27, 2017
4,805
Close to 19.5 TFLOPs, I think (assuming the same clocks as the 5700 XT). The 2080 Ti is 14 TFLOPs. EDIT: 17 TFLOPs as per dampflokfreund

Holy shit... Series X's GPU out-specced inside the year lol. I heard Moore's Law Is Dead talk about a 24 TFLOP AMD GPU a couple of months ago; is that BS, or a more powerful version of the one you're talking about being 17 TFLOPs?
 
OP

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
Holy shit... Series X's GPU out-specced inside the year lol. I heard Moore's Law Is Dead talk about a 24 TFLOP AMD GPU a couple of months ago; is that BS, or a more powerful version of the one you're talking about being 17 TFLOPs?
The 2080 Ti is 17 TFLOPs. As for a 24 TFLOP AMD card, maybe that was a server part? Arcturus will have 8192 cores (128 CUs) max, but that competes with the A100 and isn't meant for gaming.
 

BeI

Member
Dec 9, 2017
5,999
Hopefully AMD brings some heat so I can get a cheaper Nvidia card with DLSS. I guess I'd be open to AMD if they have some cool tech of their own, but I'm not holding out too much hope.
 

asmith906

Member
Oct 27, 2017
27,496
So like, how many GameCubes?
Super Saiyan Blue GameCube


 

AmFreak

Member
Oct 26, 2017
2,513
Holy shit... Series X's GPU out-specced inside the year lol. I heard Moore's Law Is Dead talk about a 24 TFLOP AMD GPU a couple of months ago; is that BS, or a more powerful version of the one you're talking about being 17 TFLOPs?
If you clocked this one at PS5 clocks, you'd get ~23 TF.
Probably a little too much, but who knows.
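Napkin math, using 64 shaders per CU and the PS5's announced ~2.23 GHz peak clock:

    # Hypothetical 80 CU RDNA2 part at PS5-like clocks
    tflops = 80 * 64 * 2 * 2.23 / 1000   # shaders * 2 FLOPs per clock (FMA)
    print(f"~{tflops:.1f} TFLOPs")       # ~22.8, i.e. the ~23 TF above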
 

shark97

Banned
Nov 7, 2017
5,327
2x 5700 XT, plus some efficiency gains for RDNA 2, plus some clock increases = maybe 2.5x 5700 XT performance? Yikes, Nvidia has their hands full.

Of course, it's all speculation at this point.
 

Water

The Retro Archivist
Member
Oct 30, 2017
813
Too much memory for games, not enough support from the 3D applications that could actually utilize it. AMD GPUs are always in a weird spot.
 

Simuly

Alt-Account
Banned
Jul 8, 2019
1,281
No one cares about 1.0; it's defunct. When people say DLSS, they mean 2.0.

As for games supporting DLSS 2.0, how many games support AMD's current upscaling implementation?

This is about the future. DLSS means an essentially free 50% improvement in frame rates. That's going to be hard to overcome in raw hardware performance.

Oh please, don't start with these marketing posts for a technology that Nvidia is using exclusively... they don't need your help (their market cap overtook Intel's) and the market needs competition.

DLSS 2.0 is great, but it's not without flaws. The main problem is that it's proprietary; very few games use it or will use it.
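For context on the quoted "free 50%" claim: it's mostly a pixel-count argument. A minimal sketch, assuming DLSS quality mode's internal resolution of roughly 67% per axis (1440p reconstructed to 4K):

    # Fraction of output pixels actually shaded at 1440p -> 4K
    out_px = 3840 * 2160
    in_px = 2560 * 1440
    print(f"{in_px / out_px:.0%}")   # ~44% of the pixels, hence the large FPS gains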
 

Deleted member 25042

User requested account closure
Banned
Oct 29, 2017
2,077
AMD's lack of research time and money

How is that on NV though?

Edit: for clarification, and because you're not the poster who complained about DLSS being proprietary: AMD not having an ML implementation of their own isn't in any way, shape, or form NV's fault.
Seeing DLSS as some sort of exclusive-feature BS when NV developed it and actually added hardware to speed it up makes no sense.
They could just use DirectML for their ML implementation; no proprietary issue anywhere.
 
Last edited:
OP

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
Too much memory for games, not enough support from the 3D applications that could actually utilize it. AMD GPUs are always in a weird spot.
From a professional standpoint or a hobbyist standpoint? 'Cause there's a separate line of cards for that.

ILikeFeet

Yeah, I think those were his calculations. If you can clock a GPU at 2 GHz in a console-sized box (bigger than usual, but still), then I don't see why you can't go beyond that in a big PC tower with proper power and cooling.
Yeah, it's possible and likely, but we'll have to wait and see. This could cause power draw to go crazy.
 

Doc Holliday

Member
Oct 27, 2017
5,820
If they just make a card that's equivalent to a 2080, runs cool and quiet, has stable drivers, and comes at a good price, I'd be happy.
 

BreakAtmo

Member
Nov 12, 2017
12,909
Australia
How is that on NV though?

Edit: for clarification, and because you're not the poster who complained about DLSS being proprietary: AMD not having an ML implementation of their own isn't in any way, shape, or form NV's fault.
Seeing DLSS as some sort of exclusive-feature BS when NV developed it and actually added hardware to speed it up makes no sense.
They could just use DirectML for their ML implementation; no proprietary issue anywhere.

Technically, I don't think Nvidia added tensor cores to speed up DLSS; more like the tensor cores were added for professional users, and they came up with a way to make them useful in games.
 

GillianSeed79

Member
Oct 27, 2017
2,372
What's everyone's thoughts on price? I want to finish building a new PC, but I want to wait until the new AMD Big Navi cards drop so I can at least match next-gen performance at 4K. I know AMD is all about the best performance-per-dollar value and they are competing with Nvidia, but do you think we are looking at the $600-$700 range? Or the $500-or-less range?
 

Deleted member 25042

User requested account closure
Banned
Oct 29, 2017
2,077
Technically, I don't think Nvidia added tensor cores to speed up DLSS; more like the tensor cores were added for professional users, and they came up with a way to make them useful in games.

Yeah, I didn't mean that tensor cores were added specifically for DLSS (even if my post can read like that), just that they added hardware that can help speed it up.
Finding fault with Nvidia for advancing tech and framing it as proprietary BS is just weird to me.
Again, DirectML has been a thing for a while; AMD can use it as they want.
Them being way behind in ML is solely on them.
 

Herne

Member
Dec 10, 2017
5,331
I'm very happy with my 5700 XT for now, but this is all sounding very good. If those rumours from a while back are true, that AMD has fixed their power draw issues and that the new chips clock really high, then even ignoring the CU count and architectural improvements going from RDNA to RDNA 2.0, these cards could be competitive as fuck.

I don't care about ray tracing yet; it's still going to be a few years before even a good percentage of games use it as a matter of course. By the time I'm looking for an upgrade, probably three years from now, I'll start to care.

Edit -

Also, as a note, they seem to have resolved the vast majority of their driver issues at this point. I got my 5700 XT in late January; the drivers from around then gave me crashes, forcing me to use 19.12.1, which was stable. The January, February, and March drivers all gave me crashes. 20.5.1 was the first driver of the new year to be perfectly stable, and now on the new PC, 20.7.1 is again 100% stable, no issues at all.

What's everyone's thoughts on price? I want to finish building a new PC, but I want to wait until the new AMD Big Navi cards drop so I can at least match next-gen performance at 4K. I know AMD is all about the best performance-per-dollar value and they are competing with Nvidia, but do you think we are looking at the $600-$700 range? Or the $500-or-less range?

Undercutting nVidia by about €/$100 for the mid-range and up is probably a good bet. Sadly, AMD announced some time ago that they will no longer be the "value alternative", so as long as nVidia feels comfortable hiking their prices up, AMD will take advantage to make money, too. Thankfully, unless something like HBM memory forces their hand (unlikely), they will probably keep undercutting nVidia, and we benefit (at least a little) as a result.
 
Last edited:

Uhtred

Alt Account
Banned
May 4, 2020
1,340
Oh please, don't start with these marketing posts for a technology that Nvidia is using exclusively... they don't need your help (their market cap overtook Intel's) and the market needs competition.

DLSS 2.0 is great, but it's not without flaws. The main problem is that it's proprietary; very few games use it or will use it.

Lol, I'm talking as a consumer, not a shill, please.

It's not on Nvidia that they saw ray tracing coming around the corner, made hardware for it, and found a use for that dedicated hardware outside of ray tracing as well.

If AMD doesn't have an answer for DLSS, no doubt they are starting off on the wrong foot. I'm hoping they include a good amount of ML capability and find an algorithm that at least gets close to DLSS.
 

Pipyakas

Member
Jul 20, 2018
549
I like the idea of using the CUs as RT accelerators. Sure, your card is slower overall due to splitting power between RT and traditional rasterization, but you get the full value out of your GPU 100% of the time, instead of letting die space sit wasted when playing unsupported games.
But Radeon software is nowhere near the GeForce driver and software stack atm, so anything above $200 is a very questionable buy for me. And they're not releasing a 5500-tier card this year.
 

Water

The Retro Archivist
Member
Oct 30, 2017
813
From a professional standpoint or a hobbyist standpoint? 'Cause there's a separate line of cards for that.

I'd say both. I don't buy AMD just because a lot of render acceleration right now is CUDA-based, with packages like Redshift.

Even if the specs on their pro-tier cards are good, they just don't get the same support from the actual software that Nvidia does, so they're less valuable even if they're cheaper.

I feel like lots of indie artists and pro devs use the same rationale when picking up a new graphics card, because this stuff makes a huge difference. Since more devs are using Nvidia, more games work better with it, never mind anything else that's 3D accelerated.
 

ThatNerdGUI

Prophet of Truth
Member
Mar 19, 2020
4,552
Hopefully they can make proper drivers for it and create legit competition, because AMD drivers in the last decade have been garbage.
 

Herne

Member
Dec 10, 2017
5,331
Hopefully they can make proper drivers for it and create legit competition, because AMD drivers in the last decade have been garbage.

The recent horrendous RX 5xxx black screen crashing event aside, that hasn't really been true at all. A few years ago they started putting out big end-of-year updates, asking the community what features they want and (mostly) delivering, and half a year back they launched a redesigned driver control panel that really makes nVidia's look antiquated by comparison.

Both companies have had awful drivers; a few years back you had nVidia drivers apparently actually killing cards. AMD has had a bad reputation for their drivers, but they have been steadily improving them for some time now. Granted, the recent black screen crash issue was awful, absolutely terrible PR, but that mostly seems to have been resolved now, thankfully.

I guess my friends and I are the lucky ones, as we've been on mostly AMD cards for about a decade now and we've never really had many issues with them. The most recent, and most egregious, would be the issue mentioned above, but of my friends only I had it, and 19.12.1 was luckily solid as a rock for me.
 

Kuosi

Member
Oct 30, 2017
2,368
Finland
When was the last time AMD leaks/predictions turned out to be true and not overblown? On the GPU side, maybe the 290?
 

Ojli

Chicken Chaser
Member
Oct 28, 2017
2,652
Sweden
Guess we'll know soon. I read somewhere that consumer cards were to be released in August. Might be in tandem with that rumored Xboxing day.
 

dgrdsv

Member
Oct 25, 2017
11,929
I hope we get more than 12 GB of VRAM for next-gen games.
Geez, relax, people. Next-gen consoles will have 16-20 GB of RAM in total, of which some 2-4 GB will be used by the OS and some 2-4 GB or so by the game's logic, which leaves us with 8-16 GB of VRAM requirements, and 12 is exactly in the middle.
Worth remembering that most PCs aren't targeting 4K either, which means that the requirements on PC will be even lower.
I'd wager that even 8 GB VRAM cards will be completely fine for the first 2-4 years of next-gen releases. 12 is more than enough for 2021-22, really.
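Spelling that budget out (all ranges are the rumored figures above, not confirmed):

    # Next-gen console RAM budget -> implied VRAM requirement range, in GB
    total = (16, 20)   # rumored total console RAM
    os_res = (2, 4)    # reserved for the OS
    logic = (2, 4)     # game logic / CPU-side data
    low = total[0] - os_res[1] - logic[1]    # worst case: 16 - 4 - 4 = 8
    high = total[1] - os_res[0] - logic[0]   # best case:  20 - 2 - 2 = 16
    print(f"{low}-{high} GB for graphics")   # 8-16 GB; 12 sits in the middle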

When was the last time AMD leaks/predictions turned out to be true and not overblown? On the GPU side, maybe the 290?
Navi 21 should be a good product, the first one from AMD since 2013's Hawaii capable of competing with NV's high-end cards.
 
Oct 30, 2017
250
When was the last time AMD leaks/predictions turned out to be true and not overblown? On the GPU side, maybe the 290?

The R9 290X was indeed the last proper high-end AMD GPU launch. And even that was marred by AMD putting the most godawful, useless pile-of-shit cooler on it, one that thermal throttled at stock so hard it was effectively a 290.
 

MrKlaw

Member
Oct 25, 2017
33,151
XSX was suggested to be about half the speed of a 2060 in TOPs for ML (relevant to DLSS or similar solutions). So a 72-80 CU PC card would be much closer.
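Taking that suggestion at face value and scaling purely by CU count (assuming similar clocks, which is a big assumption):

    # If the XSX's 52 CUs land at ~0.5x a 2060 in ML TOPs, scale to 80 CUs
    ratio = 0.5 * 80 / 52
    print(f"~{ratio:.2f}x a 2060")   # ~0.77x, i.e. "much closer"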