
TheNerdyOne

Member
Oct 28, 2017
521
Will cross-post the link to this here, since it's relevant to the discussion of whether RDNA2 could be a match for Ampere.




It seems that Ampere RT core performance hasn't improved at all vs Turing. RT is faster because the GPU is faster, in the same proportion... so NVIDIA's "2nd gen" RT cores are just... 1st gen RT cores with more memory bandwidth to play with. The percentage drop in performance from enabling RT is exactly the same as on Turing in every single game tested. This means RDNA2 has a lot less of a challenge on its hands matching NVIDIA RT performance than anyone would dare admit...
 

dgrdsv

Member
Oct 25, 2017
11,846
It seems that Ampere RT core performance hasn't improved at all vs Turing
It did. This video makes incorrect statements about RT h/w based on the performance of games with hybrid RT renderers - which were never limited by RT h/w even on Turing.

RT is faster because the GPU is faster, in the same proportion...
Which is also false, as the same video shows several cases where in-game RT is actually a lot faster than the average gain of Ampere over Turing cards.

so NVIDIA's "2nd gen" RT cores are just... 1st gen RT cores
Nope.

the percentage drop in performance from enabling RT is exactly the same as on Turing
Because the drop is attributable to the shading complexity increases needed for RT - and it's completely unsurprising that games where RT was limited by shading in the first place show the same performance drops on Ampere.

This means RDNA2 has a lot less of a challenge on its hands matching NVIDIA RT performance than anyone would dare admit...
RDNA2 has a sizeable challenge beating even Turing's RT tbh.
 

Uhtred

Alt Account
Banned
May 4, 2020
1,340
Will cross-post the link to this here, since it's relevant to the discussion of whether RDNA2 could be a match for Ampere.




It seems that Ampere RT core performance hasn't improved at all vs Turing. RT is faster because the GPU is faster, in the same proportion... so NVIDIA's "2nd gen" RT cores are just... 1st gen RT cores with more memory bandwidth to play with. The percentage drop in performance from enabling RT is exactly the same as on Turing in every single game tested. This means RDNA2 has a lot less of a challenge on its hands matching NVIDIA RT performance than anyone would dare admit...


That seems to be the case in some of the examples, but they point out that the difference becomes significant in, for instance, the latest patch for Control.

I think the differences in architecture are more forward-looking, and you need to take them into account to get the most out of the newer architecture.

And dgrdsv is probably right too.
 

LCGeek

Member
Oct 28, 2017
5,856
Will cross-post the link to this here, since it's relevant to the discussion of whether RDNA2 could be a match for Ampere.




It seems that Ampere RT core performance hasn't improved at all vs Turing. RT is faster because the GPU is faster, in the same proportion... so NVIDIA's "2nd gen" RT cores are just... 1st gen RT cores with more memory bandwidth to play with. The percentage drop in performance from enabling RT is exactly the same as on Turing in every single game tested. This means RDNA2 has a lot less of a challenge on its hands matching NVIDIA RT performance than anyone would dare admit...


Since Ampere came out I've learned not to speculate or base its performance gains on comparisons vs the 2080 Ti, because you know it's Ti vs non-Ti. Comparing against the absolute top card is fine in the extreme case, but for the exact case you're making it's bad. They aren't proportional once you introduce that factor.

dgrdsv already made these points, but I figured I'd say something too.

Not only that, why are you basing such a comment on games that don't have the ability to flex Ampere? Turing didn't hit its stride until drivers, overclocking, and features matured, at least a year in; I'm not going to judge Ampere less than a month after release, get real. The guy also uses the word speculation in the video; it's not conclusive considering how early things are.
 

Caz

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
13,055
Canada

In terms of specifications, Navi 21 is Big Navi, integrating up to 80 CUs (5,120 stream processors) paired with a 256-bit memory bus, and is expected to be named the RX 6900 series.
 

TheNerdyOne

Member
Oct 28, 2017
521
videocardz.com

AMD Navy Flounder to feature 40 CUs and 192-bit memory bus - VideoCardz.com

AMD Sienna Cichlid and Navy Flounder specifications AMD RDNA2-based graphics cards (Rumored Radeon RX 6700/6800 – left, Rumored RX 6900 series – right) The next-generation desktop graphics processors based on AMD RDNA2 architecture are expected to debut on October 28th. AMD did not confirm which...


Spec leak: 80 CUs at 2.2 GHz for the big one (22.5 TFLOPs) and 40 CUs at 2.5 GHz (12.8 TFLOPs) for the midrange, with 32 CUs at an unknown clock speed for the lower-end one. Assuming the same perf/flop as RDNA1, and assuming it scales out, you're looking at ~2.3x 5700 XT performance, which is, uh, yeah, faster than a 3080 and also faster than a 3090 at 4K (the 2080 Ti is 50% faster than a 5700 XT at 4K, and only 34% faster at 1080p, on average; anything above 2x 5700 XT performance puts AMD ahead of NVIDIA here, and 2.3x gives them a definitive lead). We'll soon see if that works in practice, but on paper, NVIDIA's got a real problem here.


These leaks aren't a rumor; they were dug out of the latest ROCm release (version 3.8), so yep, we have official specs. ~250 W for the 80 CU part at 2.2 GHz, 170 W for the 40 CU part at 2.5 GHz. Those are probably power figures for the GPU core only, so ~300 W for the big boy and ~225 W for the midrange is my best guess, which would, uh, give them precisely the claimed 50% more perf/watt vs RDNA1 as well. Looks like they weren't making that up (unlike NVIDIA).



Navi 21 (the big one) can apparently use GDDR6 OR HBM2, so the flagship is probably HBM2 (else they wouldn't have bothered putting HBM memory controllers in it).
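For anyone who wants to sanity-check where those TFLOP figures and the ~2.3x multiplier come from, here's a rough back-of-envelope sketch in Python (assuming RDNA2 keeps RDNA1's 64 stream processors per CU and 2 FP32 ops per SP per clock; the clocks and CU counts are the leaked ones above):

# paper FP32 throughput: CUs * 64 SPs/CU * 2 ops/clock * clock
def tflops(cus, clock_ghz, sp_per_cu=64, ops_per_clock=2):
    return cus * sp_per_cu * ops_per_clock * clock_ghz / 1000

big_navi = tflops(80, 2.2)    # ~22.5 TFLOPs
midrange = tflops(40, 2.5)    # ~12.8 TFLOPs
rx5700xt = tflops(40, 1.905)  # ~9.75 TFLOPs at its rated boost clock

print(big_navi, midrange, big_navi / rx5700xt)  # that last ratio is the ~2.3x figure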
 
Last edited:

TheNerdyOne

Member
Oct 28, 2017
521
I don't know why we keep assuming all these numbers scale linearly?

They don't, but it can fall 20% short of linear and still match or beat NVIDIA's $1500-2000 flagship and their $700+ high end that's having major crashing issues due to faulty hardware atm, even if RDNA2 doesn't see a perf/flop increase vs RDNA1. And these numbers are straight out of files AMD put on the internet themselves, so they're real. They also match up perfectly with AMD's previous perf/watt increase claims, so there's that. We'll soon see what improvements RDNA2 has over RDNA1, as well as what AMD decides to price these things at, but with the now-known specs, they're at the very least right in NVIDIA's face in terms of raster performance, which is something loads of people said was impossible, and still continue to say. It's funny: loads of people on this forum and elsewhere still say big Navi MIGHT be competitive with the 3070, which is supposedly roughly 2080 Ti performance... but that would mean that with a 300 MHz clock increase and a straight doubling of shaders, big Navi would only be 30-40% faster than a bloody 40 CU RDNA1 part. It's quite absurd when you really think about what people were saying with that; that would be an even bigger perf/flop regression than Ampere saw, and we're not actually expecting a regression for AMD here ;)
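To put numbers on the "fall 20% short of linear" point, here's a quick hedged sketch; the 2080 Ti figure is the 4K number quoted earlier, and the ~30% gap between the 3080 and 2080 Ti at 4K is my own rough assumption from reviews, not something from the leak:

paper_ratio = 22.5 / 9.75            # big Navi vs 5700 XT paper TFLOPs, ~2.3x
effective = paper_ratio * 0.80       # assume scaling falls 20% short of linear -> ~1.85x

rtx_2080ti_4k = 1.50                 # 2080 Ti is ~50% faster than a 5700 XT at 4K (figure above)
rtx_3080_4k = rtx_2080ti_4k * 1.30   # assume the 3080 is ~30% faster than a 2080 Ti at 4K -> ~1.95x

print(effective, rtx_3080_4k)        # ~1.85 vs ~1.95: even with sloppy scaling it lands in 3080 territory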
 
Last edited:

Pulp

Member
Nov 4, 2017
3,023
Would be crazy if big navi actually outperforms ampere, at least on non RT numbers. I am happy with a card that can compete with a 3080
 

TheNerdyOne

Member
Oct 28, 2017
521
Would be crazy if big navi actually outperforms ampere, at least on non RT numbers. I am happy with a card that can compete with a 3080


Doesn't seem so crazy now that we have specs that were dug straight out of AMD's files ;) As for RT performance... on paper, the BVH intersection-test rate of this GPU is, uh, 704 billion intersect tests/second, or about 85% faster than the Series X. Where that will stack up vs Ampere in practice, we don't really know yet for sure. The midrange 40 CU part has RT performance about 5% faster than the Series X, if anyone was curious, and about 5% more raster performance as well.

A 32 CU part at the same 2500 MHz the 40 CU part is confirmed to have would be, uh, almost EXACTLY the same as the PS5 (10.24 TFLOPs vs up to 10.28 on the PS5).
Funny how that works out: they have a PS5-equivalent and an XSX-equivalent SKU in their stack, and then something about 85% faster than them at the top.
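Where the 704 billion and ~85% figures come from, roughly: RDNA2's RT hardware is rated at 4 box-intersection tests per CU per clock (that's the figure Microsoft disclosed for the Series X; applying it to the leaked desktop clocks is my own extrapolation):

def box_tests_gps(cus, clock_ghz, tests_per_cu_per_clock=4):
    return cus * tests_per_cu_per_clock * clock_ghz  # billions of box tests per second

big_navi = box_tests_gps(80, 2.2)     # ~704 G/s
midrange = box_tests_gps(40, 2.5)     # ~400 G/s
series_x = box_tests_gps(52, 1.825)   # ~380 G/s, matching Microsoft's quoted figure

print(big_navi / series_x, midrange / series_x)  # ~1.85x and ~1.05x, i.e. the "85% faster" and "5% faster" claims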
 

Kieli

Self-requested ban
Banned
Oct 28, 2017
3,736
AMD must be hella confident if they're announcing a product 1 month after NVIDIA already released theirs.
 

BeI

Member
Dec 9, 2017
5,974
It would be interesting seeing AMD take the lead for clock speed this gen.
 

TheZynster

Member
Oct 26, 2017
13,285
holy shit..........I might be running a full AMD setup for the first time since the.......1090T and the 7000 series.
 
Oct 25, 2017
41,368
Miami, FL
videocardz.com

AMD Navy Flounder to feature 40 CUs and 192-bit memory bus - VideoCardz.com

AMD Sienna Cichlid and Navy Flounder specifications AMD RDNA2-based graphics cards (Rumored Radeon RX 6700/6800 – left, Rumored RX 6900 series – right) The next-generation desktop graphics processors based on AMD RDNA2 architecture are expected to debut on October 28th. AMD did not confirm which...


Spec leak: 80 CUs at 2.2 GHz for the big one (22.5 TFLOPs) and 40 CUs at 2.5 GHz (12.8 TFLOPs) for the midrange, with 32 CUs at an unknown clock speed for the lower-end one. Assuming the same perf/flop as RDNA1, and assuming it scales out, you're looking at ~2.3x 5700 XT performance, which is, uh, yeah, faster than a 3080 and also faster than a 3090 at 4K (the 2080 Ti is 50% faster than a 5700 XT at 4K, and only 34% faster at 1080p, on average; anything above 2x 5700 XT performance puts AMD ahead of NVIDIA here, and 2.3x gives them a definitive lead). We'll soon see if that works in practice, but on paper, NVIDIA's got a real problem here.


These leaks aren't a rumor; they were dug out of the latest ROCm release (version 3.8), so yep, we have official specs. ~250 W for the 80 CU part at 2.2 GHz, 170 W for the 40 CU part at 2.5 GHz. Those are probably power figures for the GPU core only, so ~300 W for the big boy and ~225 W for the midrange is my best guess, which would, uh, give them precisely the claimed 50% more perf/watt vs RDNA1 as well. Looks like they weren't making that up (unlike NVIDIA).



Navi 21 (the big one) can apparently use GDDR6 OR HBM2, so the flagship is probably HBM2 (else they wouldn't have bothered putting HBM memory controllers in it).
Will def strongly consider AMD depending on their additional software solutions.

Issue is I have a gsync monitor that doesn't do freesync.
 

mordecaii83

Avenger
Oct 28, 2017
6,858
They don't, but it can fall 20% short of linear and still match or beat NVIDIA's $1500-2000 flagship and their $700+ high end that's having major crashing issues due to faulty hardware atm, even if RDNA2 doesn't see a perf/flop increase vs RDNA1. And these numbers are straight out of files AMD put on the internet themselves, so they're real. They also match up perfectly with AMD's previous perf/watt increase claims, so there's that. We'll soon see what improvements RDNA2 has over RDNA1, as well as what AMD decides to price these things at, but with the now-known specs, they're at the very least right in NVIDIA's face in terms of raster performance, which is something loads of people said was impossible, and still continue to say. It's funny: loads of people on this forum and elsewhere still say big Navi MIGHT be competitive with the 3070, which is supposedly roughly 2080 Ti performance... but that would mean that with a 300 MHz clock increase and a straight doubling of shaders, big Navi would only be 30-40% faster than a bloody 40 CU RDNA1 part. It's quite absurd when you really think about what people were saying with that; that would be an even bigger perf/flop regression than Ampere saw, and we're not actually expecting a regression for AMD here ;)
Uh... That's quite the exaggeration.
 

Optmst

Member
Apr 9, 2020
471
videocardz.com

AMD Navy Flounder to feature 40 CUs and 192-bit memory bus - VideoCardz.com

AMD Sienna Cichlid and Navy Flounder specifications AMD RDNA2-based graphics cards (Rumored Radeon RX 6700/6800 – left, Rumored RX 6900 series – right) The next-generation desktop graphics processors based on AMD RDNA2 architecture are expected to debut on October 28th. AMD did not confirm which...


Spec leak: 80 CUs at 2.2 GHz for the big one (22.5 TFLOPs) and 40 CUs at 2.5 GHz (12.8 TFLOPs) for the midrange, with 32 CUs at an unknown clock speed for the lower-end one. Assuming the same perf/flop as RDNA1, and assuming it scales out, you're looking at ~2.3x 5700 XT performance, which is, uh, yeah, faster than a 3080 and also faster than a 3090 at 4K (the 2080 Ti is 50% faster than a 5700 XT at 4K, and only 34% faster at 1080p, on average; anything above 2x 5700 XT performance puts AMD ahead of NVIDIA here, and 2.3x gives them a definitive lead). We'll soon see if that works in practice, but on paper, NVIDIA's got a real problem here.
Why is AMD overclocking their GPUs like crazy? They seem like they want to hit certain TF numbers.
/s
 

Slick Butter

Member
Oct 25, 2017
3,500
has the time finally come
is amd finally delivering with good enthusiast tier cards

I have a Vega 64, previously had a Fury, and before that a 290X, and their top-end cards have only gotten more disappointing (for gaming; Vega would have made a dope Hackintosh GPU if FCPX worked with it). I still don't have high hopes, but these leaks are sort of getting to me.
 

renx

Member
Jan 3, 2020
330
Given that the 5700 XT is pretty close to a 2070S, it isn't too crazy to think that the 40 CU / 2500 MHz part may be on par with a 2080 Ti, if not better.
Even with a minor IPC improvement, these cards will be beasts.
 
Oct 27, 2017
5,264
Will def strongly consider AMD depending on their additional software solutions.

Issue is I have a gsync monitor that doesn't do freesync.
Yeah, AMD has always known how to do hardware, hence why I've always used them. But now seems like a good time to switch, because AMD's software has never been great and NVIDIA is making some big claims with their stuff.
 

HarryDemeanor

Member
Oct 25, 2017
5,422
Always wishing for a nice competitor card from AMD. One of these days they will hit that mark but we'll wait and see at the end of October.
Will def strongly consider AMD depending on their additional software solutions.

Issue is I have a gsync monitor that doesn't do freesync.
Same here. I actually have two G-Sync monitors, an Alienware AW341DW and an Acer Predator XB271HU. That's going to make it hard for me to switch over unless AMD has some solution for that.
 
Last edited:

TheNerdyOne

Member
Oct 28, 2017
521
How do you figure? To me it looks like they're going to compete with the 3070 and lower only, since that's the one that comes in October.

lol, the 3070, if it actually manages to be equivalent to a 2080 Ti, is only 34% faster than a 5700 XT at 1080p. You really think an 80 CU part clocked at 2.2 GHz, vs the 5700 XT's 40 CUs at 1900 MHz, is going to be only 34% faster? C'mon man, this isn't Ampere; AMD isn't losing like 50% of their perf/flop overnight here like NVIDIA did. In fact they should be gaining perf/flop, and even if they didn't, the 40 CU part (the midrange) should still be faster than a 2080 Ti / 3070 (assuming NVIDIA is actually telling the truth for the first time on an Ampere product, and the 3070 actually matches the 2080 Ti on average).

Matter of fact, do some math yourself on it. A 12.8 TFLOP RDNA2 part, assuming no perf/flop gains at all vs RDNA1, is still ~35% faster than a ~9.5 TFLOP RDNA1 part, which makes it 2080 Ti / RTX 3070 level... and that's the midrange part, not the big one... so come the fuck on, mate.
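Spelling that out (same no-IPC-gain assumption as above):

midrange_rdna2 = 12.8   # TFLOPs, 40 CUs at 2.5 GHz from the leak
rx5700xt = 9.5          # TFLOPs, roughly what a 5700 XT delivers in practice
print(midrange_rdna2 / rx5700xt)  # ~1.35, i.e. ~35% faster on paper -> right at that 2080 Ti / 3070 1080p gap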


Oh yeah, 238 W for 3080+ performance and 170 W for 3070+ performance based on the leaked specs, which again are not a rumor; they're directly from AMD-provided files. So they're competitive with NVIDIA while drawing 100 W less power, it seems. Guess Samsung 8nm really, really fucked NVIDIA pretty badly here.
 
Last edited:

Bosch

Banned
May 15, 2019
3,680
How do you figure? To me it looks like they're going to compete with the 3070 and lower only, since that's the one that comes in October.
Lol no. Look at the numbers: Navi 21 will be neck and neck with a 3080. Navi 22 will be really close to a 3070 but probably costing $100 less. If I need to guess, big Navi $599 and mid Navi $399.
 

TheNerdyOne

Member
Oct 28, 2017
521
Uh... That's quite the exaggeration.



www.igorslab.de

The possible reason for crashes and instabilities of the NVIDIA GeForce RTX 3080 and RTX 3090 | Investigative | igor´sLAB

Not only the editors and testers were surprised by sudden instabilities of the new GeForce RTX 3080 and RTX 3090, but also the first customers who were able to get board partner cards from the first…


Is it really? Seems to be a pretty widespread problem to me, certainly a big enough issue for major outlets to cover it and do deep dives to find out why... and oh yeah, capacitors that don't meet spec are the cause, aka FAULTY HARDWARE, good job!
 

Tovarisc

Member
Oct 25, 2017
24,404
FIN


www.igorslab.de

The possible reason for crashes and instabilities of the NVIDIA GeForce RTX 3080 and RTX 3090 | Investigative | igor´sLAB

Not only the editors and testers were surprised by sudden instabilities of the new GeForce RTX 3080 and RTX 3090, but also the first customers who were able to get board partner cards from the first…

Is it really? Seems to be a pretty widespread problem to me, certainly a big enough issue for major outlets to cover it and do deep dives to find out why... and oh yeah, capacitors that don't meet spec are the cause, aka FAULTY HARDWARE, good job!


Or you could just wait for more data and investigation into the issue to see what the real cause is here?

There have been reports of crashing cards that supposedly have "safe capacitor designs", so Igor's article that everyone is grabbing with two hands to hammer NV with isn't the full answer to what is going on. It's still a real possibility that it has nothing to do with capacitor designs; we need more data.
 

TheNerdyOne

Member
Oct 28, 2017
521
The other way around. One who's not confident will wait until the competitor has played its cards before unveiling anything.

Seems to me like AMD has nothing to worry about now that we know the RDNA2 specs, and NVIDIA was clearly concerned, else they wouldn't have released the big Ampere die at $700. The 3090 is a Titan replacement and the 3080 is a 2080 Ti replacement, which is why it's on the 102 die and not the 104 die like the 2080 was. NVIDIA literally can't ship anything bigger to counter Navi 21, meanwhile AMD can very easily ship something a couple hundred mm² bigger if they actually felt the need... ~500 mm² is nowhere near the limit for TSMC N7P. The simple fact that AMD isn't even attempting to ship a 600-700 mm²+ die like they COULD do, and like NVIDIA did do, means they're definitely not lacking for confidence here, sir.
 

TheNerdyOne

Member
Oct 28, 2017
521
Or you could just wait for more data and investigation into the issue to see what the real cause is here?

There have been reports of crashing cards that supposedly have "safe capacitor designs", so Igor's article that everyone is grabbing with two hands to hammer NV with isn't the full answer to what is going on. It's still a real possibility that it has nothing to do with capacitor designs; we need more data.

If the problem is even more widespread than just the cards with junk caps, then that's even worse...
 

Caz

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
13,055
Canada
Would be crazy if big navi actually outperforms ampere, at least on non RT numbers. I am happy with a card that can compete with a 3080
FWIW AMD has taken the performance crown from NVIDIA a few times...just not on the high-end and not for roughly a decade. That said, it's not so unbelievable that they got their act together given how much of an improvement the 5700XT was compared to the disastrous Radeon VII in gaming. Hoping this is one of those times where we get a price war between the two on the high-end as a result of said raw performance.
 

Roytheone

Member
Oct 25, 2017
5,139
Lol no. Look at the numbers: Navi 21 will be neck and neck with a 3080. Navi 22 will be really close to a 3070 but probably costing $100 less. If I need to guess, big Navi $599 and mid Navi $399.

A $399 card that trades blows with a 3070 will hopefully cause NVIDIA to price the 3060 Ti lower than that.
 

mordecaii83

Avenger
Oct 28, 2017
6,858


www.igorslab.de

The possible reason for crashes and instabilities of the NVIDIA GeForce RTX 3080 and RTX 3090 | Investigative | igor´sLAB

Not only the editors and testers were surprised by sudden instabilities of the new GeForce RTX 3080 and RTX 3090, but also the first customers who were able to get board partner cards from the first…


Is it really? Seems to be a pretty widespread problem to me, certainly a big enough issue for major outlets to cover it and do deep dives to find out why... and oh yeah, capacitors that don't meet spec are the cause, aka FAULTY HARDWARE, good job!

First of all, I own a 3080 FE and have likely been following news about the 3080 closer than you, and I've had zero crashes.

Second, that is one possible cause for occasional crashes some people have been having, usually only in certain games, and a small downclock has fixed it for every person I've seen have the issue.

Third, calling it "widespread" is hyperbole considering we have no idea how many people are affected at this point.

Fourth, multiple people have mentioned it likely only needs a BIOS flash to correct the issue in the worst case, or just a driver update in the best case.

Edit: Anyway, this is a thread about AMD GPU's, so moving on...
 
Last edited:

TheNerdyOne

Member
Oct 28, 2017
521
FWIW AMD has taken the performance crown from NVIDIA a few times...just not on the high-end and not for roughly a decade. That said, it's not so unbelievable that they got their act together given how much of an improvement the 5700XT was compared to the disastrous Radeon VII in gaming. Hoping this is one of those times where we get a price war between the two on the high-end as a result of said raw performance.

The 5700 XT and Radeon VII are almost exactly dead-even with each other for gaming; the difference is the Radeon VII has ~40-50% more FLOPs on paper (depending on which clocks you compare) and is a much, much larger, hungrier, more expensive chip.
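Rough paper numbers behind that comparison, using the cards' rated boost clocks (my figures from the public specs; sustained clocks vary):

radeon_vii = 3840 * 2 * 1.75 / 1000   # 60 CUs / 3840 SPs at ~1.75 GHz -> ~13.4 TFLOPs
rx5700xt = 2560 * 2 * 1.905 / 1000    # 40 CUs / 2560 SPs at ~1.9 GHz -> ~9.75 TFLOPs
print(radeon_vii / rx5700xt)          # ~1.4x the FLOPs for roughly the same gaming performance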
 

Brodo Baggins

Member
Oct 27, 2017
3,916
My 5700 XT started shitting the bed after I'd owned it for only 2 months. I started getting constant AMD driver crashes when playing games. Sent it in for RMA and put in my old 1070, and it hasn't crashed in over a week.

Looking around, those cards seem to have soooo many lemons out in the wild. I don't have great faith in these new cards faring better.
 

Sedated

Member
Apr 13, 2018
2,598
Pretty promising specs.

AMD announcing their event after the 3080's public release means either
1) they can't compete, so they chose not to, by putting their event date after the competitor's product was out in the market,

or
2) they are gonna put that 3080 GPU in their slides and show it getting crushed by their own.

Let's see
 

NaDannMaGoGo

Member
Oct 25, 2017
5,963
AMD must be hella confident if they're announcing a product 1 month after NVIDIA already released theirs.

You know, or they just aren't goddamn ready any earlier?

Why do so many people assume it's just a given that GPU manufacturers' production times line up closely, and that someone releasing earlier or later than the other is some 4D chess marketing move?
 

TheNerdyOne

Member
Oct 28, 2017
521
Pretty promising specs.

AMD announcing their event after the 3080's public release means either
1) they can't compete, so they chose not to, by putting their event date after the competitor's product was out in the market,

or
2) they are gonna put that 3080 GPU in their slides and show it getting crushed by their own.

Let's see

The specs say they're as fast as or faster than the 3080 on the high end, and the middle of the stack is 2080 Ti tier as well... My bet is they wanted a 3080 to put in their slides, and they're also stockpiling as many cards as they can so they have a decent launch supply. Still gonna sell out 100% of everything they can ship, mind you. They've also got chips to supply Sony and Microsoft for their launches, as well as wafer supply going to the Zen 3 rollout... They're using a metric fuckton of 7nm wafers all in a similar period here, and that by necessity means they have to stagger their product launches. Zen 3 is the more important product for AMD, so CPUs are launching first. Not really a shocker.