
horkrux

Member
Oct 27, 2017
4,739
Yes it was very late and a bit of a power hog. Kinda funny though as now people seem fine buying 3x8 pin 450w monsters from Nvidia.

It's held up quite well is all I'm saying. It's really not much slower than a 5700xt/2070.

youtu.be

RX 5700XT vs RX VEGA 64 || NEW DRIVER || PC GAMES TEST |

RX 5700XT vs RX VEGA 64 || NEW DRIVER || RYZEN 5 3600XT || PC GAMES TEST || 1080P | 1440P ||DRIVER- Adrenalin 2020 Edition 20.8.1 SystemOS ...

That is indeed funny. These cards are hyper power hungry and people don't seem to bat an eye
But tbf we also have giga coolers that keep them relatively cool and quiet for the most part, so there's that
 

dgrdsv

Member
Oct 25, 2017
11,885
Welp, that's 8/16GB alright. I'm still weirded out by this narrower memory controller + big chunk of cache approach. I guess it could still be 256-bit for a lower-tier (N22?) card, but that package definitely looks too big to be in the 200mm² range.
Navi 22 is supposedly 192-bit.
The cache choice will be an interesting one for sure; it could lead to some unexpected changes in scaling and could also result in PC RDNA2 requiring some additional optimizations to optimally run RDNA2 games from PS5/XSX.
I wonder if this approach makes sense in the long run, though, considering the limits of scaling and the realities of memory vs logic production on advanced processes.
A rather bold choice on AMD's part, can't wait to see how it will play out.
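To make that trade-off concrete, here is a minimal back-of-the-envelope sketch of effective bandwidth with a large on-die cache in front of a narrower GDDR6 bus. The hit rates, cache bandwidth, and the 256-bit / 16 Gbps configuration are all assumptions for illustration, not leaked specs:

```python
# Back-of-the-envelope: how a big on-die cache can compensate for a narrower bus.
# Every number here is an assumption for illustration, not a confirmed spec.

def gddr6_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Raw DRAM bandwidth in GB/s for a given bus width and per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

def effective_bandwidth(dram_gbs: float, cache_gbs: float, hit_rate: float) -> float:
    """Crude blend: traffic that hits the cache is served at cache bandwidth,
    the rest goes out to DRAM."""
    return hit_rate * cache_gbs + (1.0 - hit_rate) * dram_gbs

dram = gddr6_bandwidth_gbs(256, 16.0)  # assumed 256-bit @ 16 Gbps -> 512 GB/s
for hit_rate in (0.0, 0.3, 0.5, 0.7):
    eff = effective_bandwidth(dram, cache_gbs=2000.0, hit_rate=hit_rate)
    print(f"hit rate {hit_rate:.0%}: ~{eff:.0f} GB/s effective")
```

The point is only that a high enough hit rate can make a narrower bus behave like a much wider one; what the hit rate actually ends up being at 4K is exactly the open question here.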
 

Waaghals

Member
Oct 27, 2017
859
That is indeed funny. These cards are hyper power hungry and people don't seem to bat an eye
But tbf we also have giga coolers that keep them relatively cool and quiet for the most part, so there's that
The truth is that high power consumption can be forgiven if you can bring unparalleled performance to the table.

At the moment, the 3080 does that.

If the next-gen Navi cards can provide competitive or better performance with lower power consumption, that will be a strike against the 3080.
(Personally I think they will give Nvidia a run for their money at least in non-RT workloads).

The Vega 64 got trashed because it was slower in most games than the GTX 1080. It was only close to a 1080 Ti in one of the Dirt Rally games (I forget which one). If it had been the fastest card of its generation it would have gotten a pass on power consumption. Stating otherwise is historical revisionism.
 

Caz

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
13,055
Canada

Why would anyone already paying $700-$850 for a GPU buy the "off brand" with similar rasterization performance, likely inferior RT/AIS performance, and a reputation for poor driver support to save $50? Nobody with the means to pay that much gives a fuck about $50; that's why most AIB 3080 models are $800+. If AMD has any aspirations of genuinely capturing market share (not just Nvidia's shortage crumbs), they're going to have to undercut them by at least $100, and even that would do very little. $200 would be required for genuine market disruption.
1. NVIDIA has had their share of driver issues to work out after a new architecture launch, including the recent (and in my opinion, disastrous) Ampere launch, where withholding drivers from developers was part of the reason the launch was such a mess. This is not an AMD-exclusive issue.
2. It's probably not going to be just $50, even before considering regional pricing and currency conversion; I'm from Canada, where the difference, assuming it's closer to $100 USD, could be anywhere from $125 to $160 depending on how aggressive AMD is with their MSRP. That's a fair amount of money on what will be the most expensive component of most people's systems.
3. Setting aside AMD having or lacking a DLSS equivalent, DLSS and raytracing are not things I care about; both are poorly supported at the moment, and while the latter may see an uptick due to hardware raytracing on consoles, the former is only available in a dozen or so games, with widely varying performance uplifts, and the results aren't true renderings at the resolutions they're displayed at.
4. All leaks have suggested that the 6900XT will boast more memory than the 3080, which makes it a more appealing purchase for 4K and potentially 1440p depending on how things develop over the next few years.
5. The 3080 is not a great overclocker. While the jury is still out on how much one can squeeze out of a 6800/6900 XT, if it's anything like the 5700XT, there should be a notable amount of headroom to tinker with.
6. Between the efficiency improvements from RDNA 1 and Ampere's noticeable uptick in TDP, the 3080 competitor may end up costing less for the same relative performance once you factor in what it costs to run the GPU at stock settings (a rough running-cost sketch follows this post). We still don't know this for sure, but we'll find out soon enough.

I'm going for whichever of the two gives me the best bang-for-your-buck on the high-end.
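On point 6, a rough sketch of the running-cost math, with placeholder prices, board powers, usage hours and electricity rates (none of these are real figures for either card):

```python
# Rough cost-of-ownership comparison: purchase price plus electricity over time.
# MSRPs, board powers, hours/day, years and $/kWh below are all placeholders.

def energy_cost(watts: float, hours_per_day: float, years: float, usd_per_kwh: float) -> float:
    kwh = watts / 1000.0 * hours_per_day * 365 * years
    return kwh * usd_per_kwh

def total_cost(msrp: float, watts: float, hours_per_day: float = 3.0,
               years: float = 3.0, usd_per_kwh: float = 0.15) -> float:
    return msrp + energy_cost(watts, hours_per_day, years, usd_per_kwh)

card_a = total_cost(msrp=700.0, watts=320.0)  # hypothetical higher-power card
card_b = total_cost(msrp=650.0, watts=250.0)  # hypothetical lower-power card
print(f"card A ~${card_a:.0f}, card B ~${card_b:.0f}, difference ~${card_a - card_b:.0f}")
```

With these made-up numbers the efficiency gap is worth roughly $35 over three years of light daily use, on top of the $50 sticker difference, so it narrows the price gap rather than erasing it.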
 

Tallshortman

Member
Oct 29, 2017
1,634
Why would anyone already paying $700-$850 for a GPU buy the "off brand" with similar rasterization performance, likely inferior RT/AIS performance, and a reputation for poor driver support to save $50? Nobody with the means to pay that much gives a fuck about $50; that's why most AIB 3080 models are $800+. If AMD has any aspirations of genuinely capturing market share (not just Nvidia's shortage crumbs), they're going to have to undercut them by at least $100, and even that would do very little. $200 would be required for genuine market disruption.

Did you even read my post, or did you just have something you wanted to get off your chest? I said that's what I expected, not that it would suddenly put AMD over the top. Besides, AMD in the past had 50%+ market share with slightly lower prices and similar or higher performance. The 3080-level card is almost negligible in terms of market share; it's all about the lower to mid range. It simply provides better brand awareness.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360


1. NVIDIA has had their share of driver issues to work out after a new architecture launch, including the recent (and in my opinion, disastrous) Ampere launch, where withholding drivers from developers was part of the reason the launch was such a mess. This is not an AMD-exclusive issue.
2. It's probably not going to be just $50, even before considering regional pricing and currency conversion; I'm from Canada, where the difference, assuming it's closer to $100 USD, could be anywhere from $125 to $160 depending on how aggressive AMD is with their MSRP. That's a fair amount of money on what will be the most expensive component of most people's systems.
3. Setting aside AMD having or lacking a DLSS equivalent, DLSS and raytracing are not things I care about; both are poorly supported at the moment, and while the latter may see an uptick due to hardware raytracing on consoles, the former is only available in a dozen or so games, with widely varying performance uplifts, and the results aren't true renderings at the resolutions they're displayed at.
4. All leaks have suggested that the 6900XT will boast more memory than the 3080, which makes it a more appealing purchase for 4K and potentially 1440p depending on how things develop over the next few years.
5. The 3080 is not a great overclocker. While the jury is still out on how much one can squeeze out of a 6800/6900 XT, if it's anything like the 5700XT, there should be a notable amount of headroom to tinker with.
6. Between the efficiency improvements from RDNA 1 and Ampere's noticeable uptick in TDP, the 3080 competitor may end up costing less for the same relative performance once you factor in what it costs to run the GPU at stock settings. We still don't know this for sure, but we'll find out soon enough.

I'm going for whichever of the two gives me the best bang-for-your-buck on the high-end.


That FS score is very impressive. In between 3080 and 3090.
 

dgrdsv

Member
Oct 25, 2017
11,885
www.igorslab.de

3DMark in Ultra-HD - Benchmarks of the RX 6800XT with and without Raytracing appeared | igor´sLAB

As always, you have to be careful with such benchmarks, even if the material I received yesterday seems quite plausible. Two sources, very different approaches or settings and yet in the end a certain…

01-Percent.png
 

Linus815

Member
Oct 29, 2017
19,797


1. NVIDIA has had their share of driver issues to work out after a new architecture launch, including the recent (and in my opinion, disastrous) Ampere launch, where withholding drivers from developers was part of the reason the launch was such a mess. This is not an AMD-exclusive issue.
2. It's probably not going to be just $50, even before considering regional pricing and currency conversion; I'm from Canada, where the difference, assuming it's closer to $100 USD, could be anywhere from $125 to $160 depending on how aggressive AMD is with their MSRP. That's a fair amount of money on what will be the most expensive component of most people's systems.
3. Setting aside AMD having or lacking a DLSS equivalent, DLSS and raytracing are not things I care about; both are poorly supported at the moment, and while the latter may see an uptick due to hardware raytracing on consoles, the former is only available in a dozen or so games, with widely varying performance uplifts, and the results aren't true renderings at the resolutions they're displayed at.
4. All leaks have suggested that the 6900XT will boast more memory than the 3080, which makes it a more appealing purchase for 4K and potentially 1440p depending on how things develop over the next few years.
5. The 3080 is not a great overclocker. While the jury is still out on how much one can squeeze out of a 6800/6900 XT, if it's anything like the 5700XT, there should be a notable amount of headroom to tinker with.
6. Between the efficiency improvements from RDNA 1 and Ampere's noticeable uptick in TDP, the 3080 competitor may end up costing less for the same relative performance once you factor in what it costs to run the GPU at stock settings. We still don't know this for sure, but we'll find out soon enough.

I'm going for whichever of the two gives me the best bang-for-your-buck on the high-end.


I wouldn't compare Nvidia's driver issue that was fixed within a week to the over-half-a-year-long stream of issues the 5700 series had after launch.
Like, yeah man, all vendors have problems with drivers. The difference is that Nvidia typically fixes them relatively quickly, whereas AMD has a history of taking weeks or months to properly address them.

I had the misfortune of experiencing the black-screen issue and bizarre underperformance in certain (generally older) games on my friend's build, and he just gave up. He returned the card for a slower 2070, but at least it worked fine.

When even fucking AdoredTV makes a video calling out AMD's driver team, you know something's wrong



The way AMD handled the 5700 driver situation is the biggest reason why I'm not getting immediately hyped at seeing Big Navi perform so well.
 

tusharngf

Member
Oct 29, 2017
2,288
Lordran


1. NVIDIA has had their share of driver issues to work out after a new architecture launch, including the recent (and in my opinion, disastrous) Ampere launch, where withholding drivers from developers was part of the reason the launch was such a mess. This is not an AMD-exclusive issue.
2. It's probably not going to be just $50, even before considering regional pricing and currency conversion; I'm from Canada, where the difference, assuming it's closer to $100 USD, could be anywhere from $125 to $160 depending on how aggressive AMD is with their MSRP. That's a fair amount of money on what will be the most expensive component of most people's systems.
3. Setting aside AMD having or lacking a DLSS equivalent, DLSS and raytracing are not things I care about; both are poorly supported at the moment, and while the latter may see an uptick due to hardware raytracing on consoles, the former is only available in a dozen or so games, with widely varying performance uplifts, and the results aren't true renderings at the resolutions they're displayed at.
4. All leaks have suggested that the 6900XT will boast more memory than the 3080, which makes it a more appealing purchase for 4K and potentially 1440p depending on how things develop over the next few years.
5. The 3080 is not a great overclocker. While the jury is still out on how much one can squeeze out of a 6800/6900 XT, if it's anything like the 5700XT, there should be a notable amount of headroom to tinker with.
6. Between the efficiency improvements from RDNA 1 and Ampere's noticeable uptick in TDP, the 3080 competitor may end up costing less for the same relative performance once you factor in what it costs to run the GPU at stock settings. We still don't know this for sure, but we'll find out soon enough.

I'm going for whichever of the two gives me the best bang-for-your-buck on the high-end.



That's insane... better than a 3080.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
www.igorslab.de

3DMark in Ultra-HD - Benchmarks of the RX 6800XT with and without Raytracing appeared | igor´sLAB

As always, you have to be careful with such benchmarks, even if the material I received yesterday seems quite plausible. Two sources, very different approaches or settings and yet in the end a certain…

01-Percent.png

That's actually a great result for a first attempt at RT from AMD. Better than a 2080 Ti.

Also, it seems Igor is confirming that the FS score from yesterday was from a 6800 XT.

So the AMD 6900 XT might actually be as good as the 3080/3090 at RT.
 
Last edited:

BeI

Member
Dec 9, 2017
5,983
That's actually a great result for a first attempt at RT from AMD. Better than a 2080 Ti.

Looks pretty close, though. Isn't the 6800 expected to have 4608 shaders, double that of the PS5? I wonder where that will put the consoles in RT performance. Still, general performance looks really good, so I may have to consider one of the budget cards if they're a good price.
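For the shader-count point, the arithmetic is simply 64 stream processors per CU. A quick sketch below uses the PS5's officially stated 36 CUs at 2.23 GHz and an assumed (not confirmed) clock for the rumored 72-CU card:

```python
# Shader-count and peak-FLOPs arithmetic for RDNA-style GPUs:
# 64 stream processors per CU, 2 FLOPs per SP per clock (FP32 FMA).
# The 2.0 GHz desktop clock is an assumption; the PS5 figures are official.

def shaders(cus: int) -> int:
    return cus * 64

def peak_tflops(cus: int, clock_ghz: float) -> float:
    return shaders(cus) * 2 * clock_ghz / 1000.0

ps5 = peak_tflops(36, 2.23)   # 2304 shaders, ~10.3 TFLOPS
navi = peak_tflops(72, 2.0)   # 4608 shaders at an assumed 2.0 GHz, ~18.4 TFLOPS
print(f"PS5: {shaders(36)} shaders, ~{ps5:.1f} TFLOPS")
print(f"72-CU card: {shaders(72)} shaders, ~{navi:.1f} TFLOPS")
```

Of course peak flops don't map one-to-one onto RT throughput, which is why it's hard to place the consoles from this alone.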
 

Serious Sam

Banned
Oct 27, 2017
4,354
www.igorslab.de

3DMark in Ultra-HD - Benchmarks of the RX 6800XT with and without Raytracing appeared | igor´sLAB

As always, you have to be careful with such benchmarks, even if the material I received yesterday seems quite plausible. Two sources, very different approaches or settings and yet in the end a certain…

01-Percent.png
I learned to never believe AMD leaks and rumors. They always tend to be too optimistic heh. We'll see how it turns out soon enough.
 

Dekim

Member
Oct 28, 2017
4,301
This is a chance for AMD to do for their GPU division what they did for their CPU division with Ryzen in 2017. Let's see if AMD can stick the landing or bungle it again.
 

Bashteee

Member
Oct 27, 2017
1,193
After the price hike with the CPUs, I have no doubt the Navi 21 will cost more than the 3080.

I expect it to land slightly below the 3080. It took AMD a couple of iterations before they were able to outplay Intel, and they need to win more market share from Nvidia before they can charge more.
 

Serpens007

Well, Tosca isn't for everyone
Moderator
Oct 31, 2017
8,131
Chile
Woah, so we may actually see AMD pulling ahead this time?

Mid-range battle of prices will be so good for those of us aiming there
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Woah, so we may actually see AMD pulling ahead this time?

Mid-range battle of prices will be so good for those of us aiming there

I think we have good indications they will be able to at least compete with the 3080/3090, and yeah, since RDNA2 is a great arch they should have good cards across the whole range.

But as always wait for 3rd party gaming benchmarks.
 

Paganmoon

Member
Oct 26, 2017
5,586
After the price hike with the CPUs, I have no doubt the Navi 21 will cost more than the 3080.
They're in good competition with Intel and have gotten quite a bit of mindshare on the CPU side. They don't have that at all on the GPU side; I don't think they're so self-destructive as to price themselves above Nvidia's offerings.
 

Serpens007

Well, Tosca isn't for everyone
Moderator
Oct 31, 2017
8,131
Chile
I think we have good indications they will be able to at least compete with the 3080/3090, and yeah, since RDNA2 is a great arch they should have good cards across the whole range.

But as always wait for 3rd party gaming benchmarks.

Absolutely. It's good that we are gonna have competition this time. AMD is bringing the big guns in both CPU and GPU
 

dgrdsv

Member
Oct 25, 2017
11,885
That's actually a great result for a first attempt at RT from AMD. Better than a 2080 Ti.
It's a good result in absolute numbers for sure, but it's factually worse than the 2080 Ti, not better: according to these results, the performance drop from enabling RT is higher on RDNA2 than it was even on Turing.
Still, take these with a huge grain of salt. They are from different systems, and the 3DMark score folds the CPU into itself. I'm quite wary of drawing far-reaching conclusions from such results so far.
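For reference, the "drop from enabling RT" comparison is just one minus the ratio of a card's score with RT on to its score with RT off, which is why a card can post the higher absolute RT score and still take the bigger relative hit. A tiny sketch with placeholder scores (not the leaked numbers):

```python
# Relative cost of enabling RT: 1 - (score with RT) / (score without RT).
# The scores below are placeholders to show the calculation, not leaked results.

def rt_hit(score_no_rt: float, score_rt: float) -> float:
    return 1.0 - score_rt / score_no_rt

cards = {
    "hypothetical card A": (10_000, 6_000),  # higher RT score, but a 40% drop
    "hypothetical card B": (8_000, 5_500),   # lower RT score, only a ~31% drop
}
for name, (no_rt, with_rt) in cards.items():
    print(f"{name}: {rt_hit(no_rt, with_rt):.0%} drop with RT enabled")
```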

Also, it seems Igor is confirming that the FS score from yesterday was from a 6800 XT.

So the AMD 6900 XT might actually be as good as the 3080/3090 at RT.
That will only happen if the "6900XT" is +30% over the "6800XT", and that seems like an unrealistically high number.
From the specs leaked this week, the top-end 6000 card will be just +12.5% over the one below it in pure flops, and this will likely mean <10% actual performance gain in practice.
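And a minimal sketch of that scaling argument: the +12.5% flops figure is taken from the post above, while the scaling efficiency is an assumed number showing how a flops uplift translates into an expected in-game gain:

```python
# Translating a peak-flops uplift into an expected real-world gain, assuming
# games don't scale perfectly with flops. The 0.75 efficiency factor is a guess.

def expected_gain(flops_uplift: float, scaling_efficiency: float = 0.75) -> float:
    """flops_uplift of 0.125 means +12.5% peak flops; returns estimated fps gain."""
    return flops_uplift * scaling_efficiency

uplift = 0.125  # top card vs the one below it, per the leaked specs
print(f"+{uplift:.1%} flops -> roughly +{expected_gain(uplift):.1%} in games")
print(f"a +30% result would need about +{0.30 / 0.75:.0%} more flops at the same efficiency")
```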
 

rckvla

Member
Oct 25, 2017
2,737

Typhest

Member
Oct 26, 2017
43
Those 3DMark results are a bit odd. A 3080 should be closer to 30% faster than a 2080 Ti in TSE, but this pegs it at 13%. The Port Royal results are a little more believable, but looking at Guru3D's benches, the 3080 is about 43% faster than the 2080 Ti, and in this chart it's 26% or so.

EDIT: These are all 4K, please disregard
 
Last edited:

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
It's a good result in absolute numbers for sure, but it's factually worse than the 2080 Ti, not better: according to these results, the performance drop from enabling RT is higher on RDNA2 than it was even on Turing.
Still, take these with a huge grain of salt. They are from different systems, and the 3DMark score folds the CPU into itself. I'm quite wary of drawing far-reaching conclusions from such results so far.


That will only happen if the "6900XT" is +30% over the "6800XT", and that seems like an unrealistically high number.
From the specs leaked this week, the top-end 6000 card will be just +12.5% over the one below it in pure flops, and this will likely mean <10% actual performance gain in practice.

I guess I don't see why it matters that the performance drop is bigger when it is still beating a 2080 Ti in RT performance.

That's a good result for AMD.

But yeah, I'd also say I wouldn't necessarily draw conclusions about AMD's actual gaming performance in RT based on a single synthetic benchmark.

It could be that Port Royal isn't very platform-agnostic, since AMD is brand new to the RT scene, so we can't really draw too many conclusions from this, I suppose.
 

dgrdsv

Member
Oct 25, 2017
11,885
I guess I don't see why it matters that the performance drop is bigger when it is still beating a 2080 Ti in RT performance.
Because it will slow down further RT adoption? If you lose two-thirds of your performance when enabling RT, most people won't use it and devs won't include it in their games.

That's a good result for AMD.
It's a good result in the sense that they have h/w RT now, yes.

But yeah, I'd also say I wouldn't necessarily draw conclusions about AMD's actual gaming performance in RT based on a single synthetic benchmark.

It could be that Port Royal isn't very platform-agnostic, since AMD is brand new to the RT scene, so we can't really draw too many conclusions from this, I suppose.
Absolutely. We need to see proper benchmarks to draw any conclusions. It may well be a lot better than the 3DM results imply, it may further improve with drivers, it may need some AMD RT h/w-specific optimizations to run at max performance, etc.
 

Raydonn

One Winged Slayer
Member
Oct 25, 2017
919
I fully expect a copy of this thread for Big Navi once the AMD event is over.

www.resetera.com

Nvidia was just brutal with Ampere showing; how can AMD compete with Big Navi?

From the new Ampere line with massive improvements and REALLY good pricing compared to the previous RTX series, an improved raytracing solution, DLSS 2.0 and on top of that they even made RTX I/O in collaboration with MS and their DirectStorage. Can AMD *realistically* bring something new to...

They'd both be fundamentally flawed polls, but at least there will be some crow to be eaten.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Those 3DMark results are a bit odd. A 3080 should be closer to 30% faster than a 2080 Ti in TSE, but this pegs it at 13%. The Port Royal results are a little more believable, but looking at Guru3D's benches, the 3080 is about 43% faster than the 2080 Ti, and in this chart it's 26% or so.

Here it is 18% faster in TSE, and Igor has it at 15%. Seems about right.

techgage.com

NVIDIA GeForce RTX 3080 Gaming At 4K, Ultrawide & With RTX On

We were left impressed with our look at NVIDIA's GeForce RTX 3080 in creative applications, so now it's time to turn our attention to gaming! In this article, we're going to take a look at 4K and ultrawide performance in a selection of current games, as well as explore some RTX titles, including...

For Port Royal, all the comparisons I can find are at 1440p, not 4K, but the 1440p difference in Port Royal is around 30%, which is close to Igor's 27%.
 

Typhest

Member
Oct 26, 2017
43
Here it is 18% faster in TSE, and Igor has it at 15%. Seems about right.

techgage.com

NVIDIA GeForce RTX 3080 Gaming At 4K, Ultrawide & With RTX On

We were left impressed with our look at NVIDIA's GeForce RTX 3080 in creative applications, so now it's time to turn our attention to gaming! In this article, we're going to take a look at 4K and ultrawide performance in a selection of current games, as well as explore some RTX titles, including...

For Port Royal, all the comparisons I can find are at 1440p, not 4K, but the 1440p difference in Port Royal is around 30%, which is close to Igor's 27%.

Yeah - I just noticed that these are all 4K benches. Disregard my post!
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
Can't move to AMD due to things like Blender supporting Nvidia cards better, but I hope these results force lower prices on the green side.
 
Jun 18, 2018
1,100
Because it will slow down further RT adoption? If you lose two-thirds of your performance when enabling RT, most people won't use it and devs won't include it in their games.

I think you're missing the bigger picture: RT is standard on next-gen consoles. It will be used there and it will get support on PC. It also helps that Intel's and PowerVR's upcoming GPUs will have RT as well.

We've seen similar jumps in adoption when other GPU technologies have come online in the console space, such as pixel and vertex shaders. We've also seen how tessellation and virtual texturing haven't become commonly used approaches because of their limitations in the past two generations of consoles.

[EDIT] Side note, here's hoping the Navi 2 laptop dGPUs can trade performance blows with their Ampere counterparts at better prices and with lower power consumption. And if we get PCIe 4 support on 5xxxH series CPUs, it could usher in a revolution for gaming laptops.
 

1-D_FE

Member
Oct 27, 2017
8,261
Is there any hint at all that AMD is updating their GPU video encoder? I was surprised to learn that AMD is way behind Nvidia's NVENC, and this is a deal breaker for me unless they're updating it.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Is there any hint at all that AMD is updating their GPU video encoder? I was surprised to learn that AMD is way behind Nvidia's NVENC, and this is a deal breaker for me unless they're updating it.

Does this help you?

videocardz.com

AMD Navi 2X GPUs (RDNA2) to support AV1 decoding - VideoCardz.com

AMD RDNA2 gets AV1 decoding, Sienna Cichlid IDs listed AMD has submitted a Linux kernel patch with AV1 codec registers. This should confirm that the upcoming RDNA2 architecture will support AV1 decoding on the desktop Radeon series (RX and Pro). Both Navi 21 (Sienna Cichlid) and Navi 22 (Navy...
 

SolidSnakeUS

Member
Oct 25, 2017
9,616
I think it would be interesting to see how RT is actually implemented with Big Navi. Is this going to be like FreeSync, where it can work on almost anything but doesn't work as well as the dedicated RTX/Tensor core stuff, or will it need to be implemented the same way as RTX (where it basically needs to be a very specific setting specifically programmed into the game)?
 

1-D_FE

Member
Oct 27, 2017
8,261
Does this help you?

videocardz.com

AMD Navi 2X GPUs (RDNA2) to support AV1 decoding - VideoCardz.com

AMD RDNA2 gets AV1 decoding, Sienna Cichlid IDs listed AMD has submitted a Linux kernel patch with AV1 codec registers. This should confirm that the upcoming RDNA2 architecture will support AV1 decoding on the desktop Radeon series (RX and Pro). Both Navi 21 (Sienna Cichlid) and Navi 22 (Navy...

I'm talking about the encoder, though. NVENC encodes video on the GPU. This has lots of uses, although I'm mainly interested in it for VR purposes. Virtual Desktop, for example, is significantly worse on AMD cards than on Nvidia cards. Even my 1070's encoder, which is better than anything AMD has, doesn't perform as well as the 11xx/2xxx/3xxx NVENC blocks. It's something that's playing a role in me wanting to upgrade.
 

TSM

Member
Oct 27, 2017
5,823
I'm talking about the encoder, though. NVENC encodes video on the GPU. This has lots of uses, although I'm mainly interested in it for VR purposes. Virtual Desktop, for example, is significantly worse on AMD cards than on Nvidia cards. Even my 1070's encoder, which is better than anything AMD has, doesn't perform as well as the 11xx/2xxx/3xxx NVENC blocks. It's something that's playing a role in me wanting to upgrade.

We'll probably have to wait until the official announcement for details like that.
 

Lagspike_exe

Banned
Dec 15, 2017
1,974
The 6800XT can be a seriously great buy if they price it right. With 16GB of VRAM and better raster performance than the 3080, I could see a lot of people picking it up over the 10GB 3080 if they're not interested in RTX/DLSS stuff. We'll see the final benchmarks, but AMD seems to have an actual high-end competitor.