
OdjnRyu

Member
Nov 8, 2017
775
Haven't been able to get a 3080 at all, skipping the 3070; looks to me like AMD's 6800XT is my new best friend. Hope I can actually get one.
 

NaDannMaGoGo

Member
Oct 25, 2017
5,963
Oh, I was watching Buildzoid's impressions and only just realized that the 6900 XT vs RTX 3090 comparison had both "Rage Mode" and Smart Access Memory enabled. That 8-9% performance boost comes from 8 extra CUs + those modes...

Lmao, just seems so fucking not worth it for another $350. The RTX 3090 and 6900 XT both have a terrible bang-for-buck ratio for gaming.

The RTX 3070 made a very good impression on me yesterday. Price-performance, power efficiency, and cooling all seem on point, with minor caveats pointed out by Igor'sLAB. Considering DLSS, RTX, and the lower MSRP (if that ends up mattering this time), it sounds more attractive than AMD's new cards, even with lower raw power.
 

MatrixMan.exe

Member
Oct 25, 2017
9,499
6800 (16G, $579) vs 3070 (8G, $499)
6800XT (16G, $649) vs 3080 (10G, $699)
6900XT (16G, $999) vs 3090 (24G, $1499)

I don't know if I wish the 6800 was cheaper or if the 3070 had more memory.

I wish the 3070 had more memory. The 6800 having twice as much is real tempting. Really though, I need to see what the partners have cooking.

Only two of the 3070 cards fit in my case (FE and Asus Dual), so I need to know which 6800 or XT cards, if any, fit in there as well.
 

bic

Member
Oct 28, 2017
432
I'm most interested to see what stock is like. If the 6800XT is readily available, I'll go with that.
 

dgrdsv

Member
Oct 25, 2017
11,850
It'll be interesting to see if the industry moves fully away from proprietary RTX implementation due to the consoles. At the very least I'm guessing they'll all have DXR implementation if they include RT.
There's nothing proprietary on the s/w side of the RTX implementation. It runs through DXR and Vulkan RT.
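To illustrate the point (a minimal sketch, not from any particular engine): the only thing an app has to check for is the cross-vendor Khronos extensions, so something like

```cpp
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

// Returns true if the physical device exposes the cross-vendor Khronos
// ray tracing extensions. The same check passes on any vendor's GPU that
// implements them; no NVIDIA-specific query is involved.
bool SupportsKhrRayTracing(VkPhysicalDevice gpu) {
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, exts.data());

    bool pipeline = false, accel = false;
    for (const auto& e : exts) {
        if (!std::strcmp(e.extensionName, VK_KHR_RAY_TRACING_PIPELINE_EXTENSION_NAME)) pipeline = true;
        if (!std::strcmp(e.extensionName, VK_KHR_ACCELERATION_STRUCTURE_EXTENSION_NAME)) accel = true;
    }
    return pipeline && accel;
}
```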
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
6800 (16G, $579) vs 3070 (8G, $499)
6800XT (16G, $649) vs 3080 (10G, $699)
6900XT (16G, $999) vs 3090 (24G, $1499)

I don't know if I wish the 6800 was cheaper or if the 3070 had more memory.
I wonder if DirectStorage would minimize VRAM overflow issues. Will be interesting to see when devs get their hands on it.
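For anyone curious what that would look like in practice, here's a rough sketch of a DirectStorage file-to-GPU-buffer request going by the public PC dstorage.h (error handling and fence synchronization omitted; treat the details as approximate). The relevance to VRAM pressure is that assets can be pulled in on demand at NVMe speed instead of being kept resident "just in case":

```cpp
#include <dstorage.h>   // DirectStorage for Windows
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Rough sketch: stream a chunk of an asset file straight into a GPU buffer.
void StreamChunk(IDStorageFactory* factory, ID3D12Device* device,
                 ID3D12Resource* gpuBuffer, uint64_t fileOffset, uint32_t size) {
    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"assets.pak", IID_PPV_ARGS(&file));  // hypothetical pack file

    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Device     = device;
    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    DSTORAGE_REQUEST req{};
    req.Options.SourceType      = DSTORAGE_REQUEST_SOURCE_FILE;
    req.Options.DestinationType = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    req.Source.File.Source      = file.Get();
    req.Source.File.Offset      = fileOffset;
    req.Source.File.Size        = size;
    req.Destination.Buffer.Resource = gpuBuffer;
    req.Destination.Buffer.Offset   = 0;
    req.Destination.Buffer.Size     = size;
    req.UncompressedSize            = size;

    queue->EnqueueRequest(&req);
    queue->Submit();    // fire off the batch; completion is signaled via a fence (omitted)
}
```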
 

Menchi

Member
Oct 28, 2017
3,140
UK
I'm currently awaiting my "pre-order" of a 3080, but having moved forward only 6 places in a 135+ queue over 5-6 weeks, I'm really past caring. If I don't get it by the time the 6800 XT releases, I'll try to get one of those instead, and hope AMD isn't useless on stock either.
 

Theswweet

RPG Site
Verified
Oct 25, 2017
6,405
California
So... we don't know which card it actually was, but it seems like, worst case, the 6900 XT is ~20% faster than the 2080 Ti in RT performance, going off of the benchmark.

That's honestly not that bad. It sounds like RDNA2 should be at least comparable to high-end Turing.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,203
Dark Space
It'll be interesting to see if the industry moves fully away from proprietary RTX implementation due to the consoles. At the very least I'm guessing they'll all have DXR implementation if they include RT.
Nvidia hardware is put to use through Microsoft's DXR APIs though. You peeps really need to come up to speed on how ray tracing is implemented.

There is nothing proprietary in play.
 

FPS murderer

Member
Oct 27, 2017
363
SLC, UT
I've been using ATI/AMD cards for over a decade, and in all that time I've been hearing about bad drivers but have yet to see a failure on that side. Granted, I'm not a heavy player (a few to several hours a week), but I've never experienced a single in-game crash due to drivers.
My experience is quite the opposite: I was always having issues, and since switching to Nvidia, no more issues. My friends still have AMD cards and are always suffering.

I do hope AMD puts more $$$ into their driver team, as competition is good for us consumers.
 

Reinhard

Member
Oct 27, 2017
6,592
The only thing Nvidia really has going for it is DLSS via tensor cores; everything else is also being done on RDNA2, like the low-latency stuff and accelerated DirectStorage, and of course RT is platform-agnostic, being implemented via DXR or Vulkan. AMD did confirm on Twitter that their version of DLSS is coming, it just won't be ready for launch. Their iteration will be open platform and could potentially be implemented in consoles too. Of course, I have to wonder how good the fidelity will be, since it will be software-based instead of using tensor/matrix cores.
 

bruhaha

Banned
Jun 13, 2018
4,122
Primitive shaders are 100% not Mesh shaders which is the whole point in the creation of Mesh shaders.

Do you have access to the PS5 API? If not, what part of Cerny's description of primitive shader does not match the functionality of mesh shaders?

Also, unlike sampler feedback, which is patented by MS, mesh shaders and VRS are not MS-specific ideas, and Sony would have no reason to remove them from stock RDNA2.
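(For what it's worth, both show up as plain D3D12 caps rather than vendor add-ons; a minimal sketch, assuming a valid device:)

```cpp
#include <d3d12.h>

// Minimal sketch: mesh shaders and VRS are standard D3D12 capability bits.
// Any driver/hardware combo can report them; nothing vendor-specific here.
bool SupportsMeshShadersAndVrs(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6{};
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7{};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &opts6, sizeof(opts6));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &opts7, sizeof(opts7));
    return opts6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2
        && opts7.MeshShaderTier          >= D3D12_MESH_SHADER_TIER_1;
}
```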
 

dgrdsv

Member
Oct 25, 2017
11,850
So just one synthetic here right? Actually shows better performance than the previous port royal score. Not bad really.
It's a very simple test though, with hardly any shading, meaning it compares RDNA2 CUs+RAs against NV's RT cores only.
N21 is about 50% slower than GA102 here. Makes you wonder how it will do in games, where the CUs have to perform shading too and where rays may diverge a lot more often than in this SDK example.
 
Oct 28, 2017
1,715
RT performance between 2080 Ti and 3080 with equal or better rasterization perf of 3080 and 6GB more VRAM. At $50 cheaper it still makes sense as a product.

6GB extra memory which isn't even needed, plus it's only GDDR6. We'll see about the rasterisation performance from neutral outlets. No DLSS and severely diminished RT performance. Not sure that's a bargain at $50 less.
 

ShinUltramanJ

Member
Oct 27, 2017
12,949
I keep hearing no DLSS, but AMD has their answer in the works right now.

Wouldn't surprise me at all if AMD's version doesn't require per-game integration, but can be used with any game.
 

Readler

Member
Oct 6, 2018
1,972
Not a gaming workflow, a business one.

Absolutely not, they are equal wastes of money for gaming xD. One is 50% more expensive than the other one.
Well, NV is just suited more for business workflows because of the CUDA lock-in.

And well, that was poorly worded then. I don't mean they're equal wastes of money in absolute terms, just equally poor value. Doesn't matter if you pay 33% less, you're still getting a shit deal haha

I keep hearing no DLSS, but AMD has their answer in the works right now.

Wouldn't surprise me at all if AMD's version doesn't require per-game integration, but can be used with any game.
Isn't AMD working together with MS on this?
 
Oct 25, 2017
4,644
I keep hearing no DLSS, but AMD has their answer in the works right now.

Wouldn't surprise me at all if AMD's version doesn't require per-game integration, but can be used with any game.
I'm also unsure why people are making a big deal about it on cards that are generally going to be playing native up to 4K. On a 3070/6800 it might be nice in some games over the next few years, but on a 3080 or 6800 XT you are probably playing native, unless you want to do something like 4K 144Hz DLSS.

Or I guess 4K 60Hz with DLSS and ray tracing? But I'm still of the opinion that RT on the 30 series isn't worth it due to the performance hit.
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,930
Berlin, 'SCHLAND
I keep hearing no DLSS, but AMD has their answer in the works right now.

Wouldn't surprise me at all if AMD's version doesn't require per-game integration, but can be used with any game.
I do not think that is possible at all, as there is no way to do that on a technical level, and because they would be releasing it through GPUOpen, which is a suite of techniques and post-process effects devs can use to integrate into their individual games should they so wish.

There won't be a universal game reconstruction technique that works without specific game integration. It just does not work that way, even though it would be amazing.
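To make the "specific game integration" point concrete, here's a hypothetical sketch of the per-frame data a DLSS-style pass has to be handed by the engine (all names made up for illustration); none of it can be recovered from the final swapchain image a driver sees:

```cpp
// Hypothetical sketch of the inputs a DLSS-style temporal reconstruction
// pass consumes every frame. The engine has to supply all of this, which
// is why a universal, zero-integration version is implausible.
struct ReconstructionInputs {
    const void* jitteredColor;   // low-res color, rendered with subpixel jitter
    const void* depthBuffer;     // per-pixel depth
    const void* motionVectors;   // engine-computed, including animated objects
    float       jitterOffsetX;   // the camera jitter applied this frame
    float       jitterOffsetY;
    float       exposure;        // so history can be compared in linear light
    bool        resetHistory;    // camera cut? discard stale samples
};
```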
 

Reinhard

Member
Oct 27, 2017
6,592
I'm also unsure why people are making a big deal about it on cards that are generally going to be playing native up to 4K. On a 3070/6800 it might be nice in some games over the next few years, but on a 3080 or 6800 XT you are probably playing native, unless you want to do something like 4K 144Hz DLSS.

Or I guess 4K 60Hz with DLSS and ray tracing? But I'm still of the opinion that RT on the 30 series isn't worth it due to the performance hit.
The thing is that with DLSS enabled in performance mode, there is a negligible performance hit for RT. And if you want the better picture quality of balanced or quality mode, RT no longer tanks performance with DLSS enabled, as you can stay within usable G-Sync ranges at high resolutions. Sure, you might not want RT in e-sports titles where frame rate is king, but for everything else RT is a great addition. RT really shined in Control.
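For context on those modes: going by the commonly cited DLSS 2.x per-axis scale factors (treat them as approximate), the internal render resolutions at 4K work out like this quick sketch computes:

```cpp
#include <cstdio>

// Rough sketch: internal render resolution for DLSS 2.x modes at 4K output.
// Commonly cited per-axis factors: Quality ~0.667, Balanced ~0.58,
// Performance 0.5 (i.e. 4K Performance mode renders at 1920x1080).
int main() {
    const int outW = 3840, outH = 2160;
    struct { const char* name; double scale; } modes[] = {
        {"Quality",     0.667},
        {"Balanced",    0.580},
        {"Performance", 0.500},
    };
    for (const auto& m : modes) {
        int w = static_cast<int>(outW * m.scale);
        int h = static_cast<int>(outH * m.scale);
        double pixelRatio = (double)(w * h) / ((double)outW * outH);
        std::printf("%-12s %4dx%-4d (%.0f%% of native pixels)\n",
                    m.name, w, h, pixelRatio * 100.0);
    }
}
```

Even "Quality" mode shades well under half the pixels of native 4K, which is where the RT headroom comes from.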
 

Youngfossil

Member
Oct 27, 2017
3,668
It's crazy how people are getting so hard over this card and saying they're sold with no real-world benchmarks...

Be smart consumers, people.
 

Nooblet

Member
Oct 25, 2017
13,628
RT performance between 2080 Ti and 3080 with equal or better rasterization perf of 3080 and 6GB more VRAM. At $50 cheaper it still makes sense as a product.
I wouldn't say it's equal or better; it's basically the same.
Of the graphs they showed, it beat the 3080 in 2 games but also lost to it in 2 games, while being equal in the rest. So it's basically the same as the 3080, but without DLSS and whatever RT advantage Ampere may have over it.

So for $50 extra you get better performance in RT titles (which is where you need all the performance you can get), possibly due to Ampere's potentially more advanced RT cores, and that gets increased further by DLSS. For all the other games it doesn't really matter which one one-ups the other, as the delta will not be significant enough to really make a difference while playing.
 

Eblo

Member
Oct 25, 2017
1,643
I don't suppose a non-RTX version of Minecraft with ray tracing will happen, will it? It's a bit annoying that it's currently hard-locked to RTX cards.
 

Reinhard

Member
Oct 27, 2017
6,592
I don't suppose a non-RTX version of Minecraft with ray tracing will happen, will it? It's a bit annoying that it's currently hard-locked to RTX cards.
The actual game should run on AMD RDNA2 cards, as there is nothing proprietary about RTX ray tracing; it uses either DXR or Vulkan RT. No idea if access to the beta will stay restricted to the Nvidia signup process.
 

bruhaha

Banned
Jun 13, 2018
4,122
I wouldn't say it's equal or better; it's basically the same.
Of the graphs they showed, it beat the 3080 in 2 games but also lost to it in 2 games, while being equal in the rest. So it's basically the same as the 3080, but without DLSS and whatever RT advantage Ampere may have over it.

So for $50 extra you get better performance in RT titles (which is where you need all the performance you can get), possibly due to Ampere's potentially more advanced RT cores, and that gets increased further by DLSS. For all the other games it doesn't really matter which one one-ups the other, as the delta will not be significant enough to really make a difference while playing.

$50 more for better RT and DLSS, but you lose 6 GB of VRAM. I think games will run without major disadvantages on the 6800 XT for at least 3-4 years, since games will be designed for consoles with the same feature set, targeting 1440p or 4K with weaker hardware than the 6800 XT.
 

Nooblet

Member
Oct 25, 2017
13,628
$50 more for better RT and DLSS, but you lose 6 GB of VRAM. I think games will run without major disadvantages on the 6800 XT for at least 3-4 years, since games will be designed for consoles with the same feature set, targeting 1440p or 4K with weaker hardware than the 6800 XT.
Yeah, but it's also only GDDR6 vs GDDR6X, so depending on how games are made and how they use the SSD, the higher bandwidth can make up for it or more. I fully believe that 2-3 years from now, memory bandwidth will be more important than memory capacity, as there'll be a paradigm shift in how memory is managed, and that's inevitable due to the SSD.
 

Eblo

Member
Oct 25, 2017
1,643
The actual game should run on AMD RDNA2 cards, as there is nothing proprietary about RTX ray tracing; it uses either DXR or Vulkan RT. No idea if access to the beta will stay restricted to the Nvidia signup process.
Is it? Isn't it running on pure D3D12 DXR?
I don't know how it works exactly, but I have a GTX 1070 which supports RTX through software (albeit with crap performance). The Minecraft RTX beta wouldn't so much as let me boot it up, saying I must have an RTX card. That was back in August, if that's changed at all since then. Conversely, I could play Quake II RTX at 15 FPS, but at least I could play it.
 

tusharngf

Member
Oct 29, 2017
2,288
Lordran
I wouldn't say it's equal or better; it's basically the same.
Of the graphs they showed, it beat the 3080 in 2 games but also lost to it in 2 games, while being equal in the rest. So it's basically the same as the 3080, but without DLSS and whatever RT advantage Ampere may have over it.

So for $50 extra you get better performance in RT titles (which is where you need all the performance you can get), possibly due to Ampere's potentially more advanced RT cores, and that gets increased further by DLSS. For all the other games it doesn't really matter which one one-ups the other, as the delta will not be significant enough to really make a difference while playing.


DLSS is something that very few games use, and that's how it's been since its launch. Only a few titles get that feature.
 

dgrdsv

Member
Oct 25, 2017
11,850
I don't know how it works exactly, but I have a GTX 1070 which supports RTX through software (albeit with crap performance). The Minecraft RTX beta wouldn't so much as let me boot it up, saying I must have an RTX card.
It's a renderer-side device check to disallow GPUs without h/w RT support. I'm sure they'll add RDNA2 cards to the supported list too. Nothing to do with RTX cards specifically.
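Something along these lines, presumably; a sketch against the standard D3D12 caps (not Minecraft's actual code):

```cpp
#include <d3d12.h>

// Sketch of a renderer-side hardware-RT gate: queries the standard D3D12
// raytracing tier. Any DXR-capable GPU (Turing, Ampere, RDNA2, ...) passes;
// there is no vendor check anywhere.
bool HasHardwareRaytracing(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5{};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```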
 
Nov 8, 2017
13,099
6GB extra memory which isn't even needed, plus it's only GDDR6. We'll see about the rasterisation performance from neutral outlets. No DLSS and severely diminished RT performance. Not sure that's a bargain at $50 less.

It's competitive - moderately better value depending on use case, but it doesn't curb-stomp on value the way people hyped themselves up to believe it would.

The 6900 XT is obviously why Nvidia is accelerating the release of the "3080 Ti" (a 3090 with half the VRAM) to compete at that exact price point. If you need 24GB the 3090 is still the value pick, but most people don't, so the same performance with 12GB of VRAM @ $999 will be their answer.
 

Nooblet

Member
Oct 25, 2017
13,628
DLSS is something that very few games use, and that's how it's been since its launch. Only a few titles get that feature.
It doesn't matter if it's in only a few games; those few games are where it's needed the most.

You don't need it in every game.
These cards are powerful enough on their own to provide high framerates in the majority of games; it's the handful of games that push the tech, where you actually need the performance, that DLSS helps with. Those games also happen to be the ones with ray tracing, and happen to be the big-name games people anticipate when buying a new GPU.

It doesn't really matter if the 3080 gets 5% extra performance in 4 games and the 6800 XT gets 5% extra in 4 games, as those differences become imperceptible while playing at 60+ FPS. But it will absolutely be noticeable if the 3080 performs 30-40% better in 2 games due to DLSS, and those 2 games happen to be heavyweights like Cyberpunk and COD and end up being showcase titles for ray tracing. The narrative then becomes "the Nvidia card can play Cyberpunk at 60 FPS with ray tracing, but the AMD card can't", and that sort of narrative carries weight even if it's only one game, because of how important that release is.
 

dynamitejim

Member
Oct 25, 2017
883
I do not think that is possible at all, as there is no way to do that on a technical level, and because they would be releasing it through GPUOpen, which is a suite of techniques and post-process effects devs can use to integrate into their individual games should they so wish.

There won't be a universal game reconstruction technique that works without specific game integration. It just does not work that way, even though it would be amazing.

How does the AI upscaling on the Nvidia Shield (which is real-time), or even in Topaz's Video Enhance AI, work then? I don't understand the point of needing ML cores, since all the learning is done offline anyway. Do the ML cores apply the results from the trained model that much faster than the shader cores?

TBH, we don't even know how much better non-AI reconstruction techniques can get when targeting the increased power of the next-gen consoles either.
 

Reinhard

Member
Oct 27, 2017
6,592
Probably a stupid question but are stores taking preorders for this? Or do you just have to try to buy it on launch day?
Most likely not. With the prior Nvidia launch, no stores in the US took pre-orders, and for the few stores in Europe that tried, things didn't go well: little stock was sent and the queue lines were long as hell.
 

tusharngf

Member
Oct 29, 2017
2,288
Lordran


RGT says DLSS-like supersampling tech might come to AMD cards soon via a driver update. He was the first to leak Infinity Cache.
 

dgrdsv

Member
Oct 25, 2017
11,850
Do the ML cores apply the results from the trained model that much faster than the shader cores?
Yes. AI inferencing is still a neural network; the difference is that it's doing its job instead of being trained. Inferencing can also be performed at lower precision, which means you can tap into INTs here, which run at 2x the FP16 rate on the same tensor h/w; that's partially why it's faster than training.

TBH, we don't even know how much better non-AI reconstruction techniques can get when targeting the increased power of the next-gen consoles either.
This is true, but the next-gen consoles don't have anything that hasn't been available in the PC GPU space since 2016 or so, and there were no breakthroughs in math-based reconstruction on PC during that period.
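To make the training-vs-inference distinction concrete, a toy forward pass with frozen, made-up weights (nothing DLSS-specific): inference just runs the net forward, with no gradients or weight updates, which is also what makes low-precision math viable.

```cpp
#include <algorithm>
#include <array>

// Toy illustration: "inference" = running a frozen network forward.
// No backprop, no weight updates, so weights/activations can be stored
// at low precision (fp16/int8), which is where the extra tensor-core
// INT throughput pays off. Weights here are arbitrary placeholders.
std::array<float, 2> InferToyMlp(const std::array<float, 3>& x) {
    static const float w1[4][3] = {{.1f, -.2f, .3f}, {.4f, .1f, -.1f},
                                   {-.3f, .2f, .2f}, {.05f, -.4f, .1f}};
    static const float w2[2][4] = {{.2f, -.1f, .3f, .1f},
                                   {-.2f, .4f, .1f, -.3f}};
    std::array<float, 4> h{};
    for (int i = 0; i < 4; ++i) {
        float acc = 0.f;
        for (int j = 0; j < 3; ++j) acc += w1[i][j] * x[j];
        h[i] = std::max(acc, 0.f);               // ReLU
    }
    std::array<float, 2> y{};
    for (int i = 0; i < 2; ++i)
        for (int j = 0; j < 4; ++j) y[i] += w2[i][j] * h[j];
    return y;                                    // forward pass only
}
```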
 

jett

Community Resettler
Member
Oct 25, 2017
44,653
Not sure I'd be too excited for whatever DLSS-like solution AMD is cooking. Doesn't Nvidia's DLSS use the machine-learning-oriented tensor cores for that? What does AMD have on their GPUs that's comparable?
 

shinken

Member
Oct 27, 2017
1,917
Have been working all day and haven't really had the time to read about the event. So what is the consensus about the new AMD cards compared to Nvidia's offering?
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Have been working all day and haven't really had the time to read about the event. So what is the consensus about the new AMD cards compared to Nvidia's offering?

In regular raster performance the 6800 seems to beat the 3070 fairly comfortably (it costs $579, but you also get 16GB of VRAM).

The 6800 XT is $650 and roughly matches the 3080.

The 6900 XT roughly matches the 3090 at $1k.

RT performance is unknown, and they have an upcoming super resolution feature to try to compete with DLSS.

Wait for reviews as always, but overall it seems like they're finally competitive again in the enthusiast/high-end segment.
 

Eternia

Member
Oct 25, 2017
490
Have been working all day and haven't really had the time to read about the event. So what is the consensus about the new AMD cards compared to Nvidia's offering?
Pretty competitive performance in terms of rasterization, albeit per AMD's own benchmarks. Priced near Nvidia, outside of the big 3090 undercut with the 6900 XT. If you're hoping for good performance, you're in luck. But if you were hoping for something significantly cheaper, that did not happen.
 

Caz

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
13,055
Canada
So what are the chances that Sapphire, XFX, or another one of the high-end 6800 XTs has three 8-pins for power draw? Not sure what the 6800 XT's average draw will be under load, but with two 8-pins it can be fed at most 375W: 150W per connector plus the 75W supplied through the motherboard slot.
 

Gyroscope

Member
Oct 25, 2017
786
The lack of RT showings definitely makes the independent benchmarks something to look forward to, and those are sadly up to 3 weeks away. Decent showing, though. Glad there's a real sense of competition back, across a potential full stack and in lockstep. Hopefully Intel can shake things up further.

But RT is definitely the topic at hand. I do wonder, though. RTX has been out since Turing, and developers have been able to optimize and profile against it; we all should remember how it was at launch. RDNA2 is AMD's first attempt at hardware functionality for it. Whether games are done with DXR 1.0 or 1.1 matters too, and RT still requires stages of integration into pipelines and optimization against the hardware. Nvidia is ahead here, no doubt, because of their amazing efforts with Turing. Also, despite the shared APIs, there is still a driver to talk to.

With articles like this: https://videocardz.com/newz/amd-ray...dia-rt-core-in-this-dxr-ray-tracing-benchmark
It needs to be noted that it sits squarely in its own specialized, synthetic environment.

And this fuels the fire that RDNA2 RT performance lands somewhere between Turing and Ampere.

3070 reviews came out: basically 1:1 performance with the 2080 Ti.
Yet despite Nvidia's claims of RT improvements from Turing to Ampere, it's not bearing fruit in real-world games yet. Obviously not an ironclad argument, but the tech itself is very young. The improvements seen on the 3080 and 3090 could be coming from the higher power budget and the FP32+FP32 capability. And the gains in offline rendering and DCC apps are muddied by the fact that those workloads don't rely solely on the RT and tensor cores.

Though I'll say it again: AMD not showing anything strictly RT is suspect, and I'm not expecting much because of it.



Anyway, more importantly, I'm glad Raytracing is going full speed ahead. It's only going to get better from here now that it's going to be potentially everywhere for games.

Edit: Also curious to see how far AIBs can push the architecture.

Edit: Bottom-of-the-page drudgery. The binning from 80 > 72 > 60 CUs is also weird. I wonder how it works out. Really looking forward to the whitepaper as well.
 