6800 (16G, $579) vs 3070 (8G, $499)
6800XT (16G, $649) vs 3080 (10G, $699)
6900XT (16G, $999) vs 3090 (24G, $1499)
I don't know if I wish the 6800 was cheaper or if the 3070 had more memory.
> It'll be interesting to see if the industry moves fully away from proprietary RTX implementation due to the consoles. At the very least I'm guessing they'll all have DXR implementation if they include RT.

There's nothing proprietary on the software side of the RTX implementation. It runs through DXR and VKRT.
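As a concrete illustration of that point, here is a minimal sketch of how an engine might detect ray tracing support on the Vulkan side without caring about the vendor at all. The extension macros and vkEnumerateDeviceExtensionProperties are the standard Khronos API; the supportsVulkanRT helper and the surrounding structure are just illustrative:

```cpp
// Sketch only: vendor-neutral Vulkan ray tracing detection.
// Assumes a VkPhysicalDevice obtained during normal instance setup.
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

bool supportsVulkanRT(VkPhysicalDevice gpu) {
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, exts.data());

    bool pipeline = false, accel = false;
    for (const VkExtensionProperties& e : exts) {
        // These KHR extensions are cross-vendor; any GPU whose driver
        // exposes them can run the same ray tracing code path.
        if (!std::strcmp(e.extensionName, VK_KHR_RAY_TRACING_PIPELINE_EXTENSION_NAME))
            pipeline = true;
        if (!std::strcmp(e.extensionName, VK_KHR_ACCELERATION_STRUCTURE_EXTENSION_NAME))
            accel = true;
    }
    return pipeline && accel;
}
```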
I wonder if DirectStorage would minimize VRAM overflow issues. It will be interesting to see when devs get their hands on it.
AMD Radeon RX 6000 "RDNA 2 Big Navi" GPU Ray Tracing Performance Detailed - NVIDIA's RTX 3080 With RT Cores 33% Faster Than AMD's Ray Accelerator Cores
AMD has provided the first ray tracing performance numbers of its next-gen RDNA 2 GPU based Radeon RX 6000 series graphics cards. (wccftech.com)
> It'll be interesting to see if the industry moves fully away from proprietary RTX implementation due to the consoles. At the very least I'm guessing they'll all have DXR implementation if they include RT.

Nvidia hardware is put to use through Microsoft's DXR APIs though. You peeps really need to come up to speed on how ray tracing is implemented.
> I've been using ATI/AMD cards for over a decade, and since then I've been hearing about bad drivers but I have yet to see a fail on that side. Granted I'm not a huge player (a few to several hours a week) but I never experienced a single crash in game due to drivers.

My experience is quite the opposite: I was always having issues, and since switching to Nvidia, no more issues. My friends still have AMD cards and are always suffering.
> Sounds like the "DLSS"-like tech AMD is working on. Hopefully this partnership will allow them to catch up in that area.

Without tensor cores I don't see how. Maybe a workaround for not having them, but without them I don't see how they could have the same performance benefits.
Primitive shaders are 100% not mesh shaders; that distinction is the whole reason mesh shaders were created.
> AMD Radeon RX 6000 "RDNA 2 Big Navi" GPU Ray Tracing Performance Detailed - NVIDIA's RTX 3080 With RT Cores 33% Faster Than AMD's Ray Accelerator Cores (wccftech.com)

Didn't expect it to be better than the 2080 Ti.
> So just one synthetic here, right? Actually shows better performance than the previous Port Royal score. Not bad really.

It's a very simple test though, with hardly any shading, meaning that it compares RDNA2 CUs+RAs with NV's RT cores only.
Ouch, now we know why they didn't throw up any performance graphs in the presentation.
RT performance between the 2080 Ti and the 3080, with equal or better rasterization perf than the 3080, and 6 GB more VRAM. At $50 cheaper it still makes sense as a product.
> Not a gaming workflow, a business one.

Well, NV is just suited more for business workflows because of the CUDA lock-in.
Absolutely not, they are equal wastes of money for gaming xD. One is 50% more expensive than the other.
> I keep hearing no DLSS, but AMD has their answer in the works right now.
> Wouldn't surprise me at all if AMD's version doesn't require a per game basis, but can be used with any game.

Isn't AMD working together with MS on this?
> I keep hearing no DLSS, but AMD has their answer in the works right now.
> Wouldn't surprise me at all if AMD's version doesn't require a per game basis, but can be used with any game.

I'm also unsure why people are making a big deal about it on cards that are generally going to be playing native up to 4K. On a 3070/6800 it might be nice in some games over the next few years, but on a 3080 or 6800XT you are probably playing native, unless you want to do something like 4K 144Hz with DLSS.
> I keep hearing no DLSS, but AMD has their answer in the works right now.
> Wouldn't surprise me at all if AMD's version doesn't require a per game basis, but can be used with any game.

I do not think that is possible at all, as there is no way to do that on a technical level, and they would be releasing it through GPUOpen, which is a suite of techniques and post-process effects devs can integrate into their individual games should they so wish. There won't be a universal reconstruction technique that works without specific game integration; it just does not work that way, even though it would be amazing.
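To make the integration point concrete, here is a sketch of the kind of per-frame inputs a DLSS-style temporal upscaler consumes. This is a hypothetical interface, not AMD's or Nvidia's actual API; the point is that depth, motion vectors, and camera jitter only exist inside the engine, which is why such techniques need per-game integration while a post-process sharpening filter does not:

```cpp
// Hypothetical interface, for illustration only (no vendor's real API):
// the per-frame engine data a temporal reconstruction technique needs.
#include <cstdint>

struct UpscalerFrameInputs {
    const void* colorBuffer;    // low-resolution rendered frame
    const void* depthBuffer;    // per-pixel depth, engine-provided
    const void* motionVectors;  // per-pixel motion, engine-provided
    float jitterX, jitterY;     // sub-pixel camera jitter for this frame
    uint32_t renderWidth, renderHeight; // internal render resolution
    uint32_t outputWidth, outputHeight; // display resolution
};

// A driver-level filter only ever sees colorBuffer (the final image),
// which is why a universal, no-integration version of this tech is hard:
// the other inputs never leave the game engine.
```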
> I keep hearing no DLSS, but AMD has their answer in the works right now.

Which means that they have nothing against DLSS at the moment. It also remains to be seen what this answer is. Their previous "answer" to DLSS was a post-processing sharpening filter.
No.
> I'm also unsure why people are making a big deal about it on cards that are generally going to be playing native up to 4K. On a 3070/6800 it might be nice in some games over the next few years, but on a 3080 or 6800XT you are probably playing native, unless you want to do something like 4K 144Hz with DLSS.
> Or I guess 4K 60Hz with DLSS and ray tracing? But I'm still of the opinion that RT on the 30 series still isn't worth it due to the performance hit.

The thing is that with DLSS enabled and using performance mode, there is a negligible performance hit for RT. And if you want the better picture quality of balanced and quality mode, RT no longer tanks performance with DLSS enabled, as you can stay within usable G-Sync ranges at high resolutions. Sure, you might not want RT in e-sports titles as frame rate is king, but for everything else RT is a great addition. RT really shined in Control.
> I don't suppose a non-RTX version of Minecraft with raytracing will happen, will it? It's a bit annoying that it's currently hard locked to RTX cards.

The actual game should run on AMD RDNA2 cards, as there is nothing proprietary about RTX ray tracing; it uses either DXR or Vulkan RT. No idea if access to the beta will stay restricted to the Nvidia signup process.

> RT performance between the 2080 Ti and the 3080, with equal or better rasterization perf than the 3080, and 6 GB more VRAM. At $50 cheaper it still makes sense as a product.

I wouldn't say it's equal or better; it's basically the same.
Of the graphs they showed, it beat the 3080 in 2 games but also lost to it in 2 games, while being equal to it in the rest. So it's basically the same as the 3080, but without DLSS and whatever RT advantage Ampere may have over it.
So for $50 extra you get better performance in RT titles (which is where you need all the performance you can get), possibly due to Ampere's potentially more advanced RT cores, and that gets increased further by DLSS. For all the other games it doesn't really matter which one one-ups the other, as the delta will not be significant enough to really make a difference while playing.
> $50 more for better RT and DLSS, but you lose 6 GB VRAM. I think games will run without major disadvantages on the 6800XT for at least 3-4 years, since games will be designed for consoles with the same feature set and targeting 1440p or 4K with weaker hardware than the 6800XT.

Yeah, but it's also only GDDR6 vs GDDR6X, so depending on how games are made and how they use the SSD, the higher bandwidth can make up for it or more. I fully believe that 2-3 years from now memory bandwidth will be more important than memory capacity, as there'll be a paradigm shift in how memory is managed, and that's inevitable due to SSDs.
> I don't suppose a non-RTX version of Minecraft with raytracing will happen, will it? It's a bit annoying that it's currently hard locked to RTX cards.

Is it? Isn't it running on pure D3D12 DXR?
> The actual game should run on AMD RDNA2 cards, as there is nothing proprietary about RTX ray tracing; it uses either DXR or Vulkan RT. No idea if access to the beta will stay restricted to the Nvidia signup process.
I don't know how it works exactly, but I have a GTX 1070, which supports RTX through software (albeit with crap performance). The Minecraft RTX beta wouldn't so much as let me boot it up, saying I must have an RTX card. That was back in August, so it may have changed since then. Conversely, I could play Quake II RTX at 15 FPS, but at least I could play it.
> I don't suppose a non-RTX version of Minecraft with raytracing will happen, will it? It's a bit annoying that it's currently hard locked to RTX cards.

Once it releases it probably won't be RTX-locked. ATM I think that restriction exists just because Nvidia did a lot of the background work with Microsoft.
> I don't know how it works exactly, but I have a GTX 1070, which supports RTX through software (albeit with crap performance). The Minecraft RTX beta wouldn't so much as let me boot it up, saying I must have an RTX card.

It's a renderer-side device check to disallow GPUs without h/w RT support. I'm sure they'll include RDNA2 cards in the supported list too. Nothing to do with RTX cards specifically.
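For reference, that kind of device check is a capability query against the D3D12 API rather than a brand check. A minimal sketch (the CheckFeatureSupport call and tier enum are the standard D3D12 API; error handling is trimmed and the program structure is just illustrative):

```cpp
// Sketch: vendor-neutral DXR capability check in D3D12.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("no D3D12 device");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                &opts5, sizeof(opts5));

    // Any GPU reporting tier 1.0 or higher can run DXR, whether it's an
    // RTX card, an RDNA2 card, or anything else with conformant drivers.
    std::puts(opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0
                  ? "hardware DXR supported" : "DXR not supported");
    return 0;
}
```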
6 GB of extra memory, which isn't even needed, plus it's only GDDR6. We'll see about the rasterisation performance from neutral outlets. No DLSS and severely diminished RT performance. Not sure that's a bargain at $50 less.
> DLSS is something that very few games use, and that's how it's been since its launch. Only a few titles get that feature.

It doesn't matter if it's in only a few games; those few games are where it's needed the most.
You don't need it in every game.
These cards are powerful enough on their own to provide high framerates in the majority of games; it's those handful of games that push the tech, where you actually need the performance, that DLSS helps with. Those games also happen to be the ones with ray tracing, and they happen to be big-name games that people anticipate when buying a new GPU.
It doesn't really matter if the 3080 gets 5% extra performance in 4 games and the 6800XT gets 5% extra in 4 games, as those differences become imperceptible while playing at 60+ FPS. But it will absolutely be noticeable if the 3080 performs 30-40% better in 2 games due to DLSS, and those 2 games happen to be heavyweights like Cyberpunk and COD and end up being showcase titles for ray tracing. The narrative then becomes "the Nvidia card can play Cyberpunk at 60 FPS with ray tracing, but the AMD card can't", and that sort of narrative is perceivable even if it's only one game, because of how important that release is.
> Probably a stupid question, but are stores taking preorders for this? Or do you just have to try to buy it on launch day?

Most likely not. With the prior Nvidia launch, no stores in the US took pre-orders, and for the few stores in Europe that tried, I think things didn't go well, as little stock was sent and the queue lines were long as hell.
> Do the ML cores apply the results from the trained model that much faster than the shader cores?

Yes. AI inferencing is still a neural network; the difference is that it does its job instead of being trained. Inferencing can be performed at a lower precision, which means you can tap into INTs here, which run at 2x the FP16 rate on the same tensor h/w, and that is partially why it's faster than training.
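A toy sketch of the lower-precision point (this assumes nothing about any particular tensor core design; the values and the symmetric quantization scheme are just illustrative): weights quantized to INT8 preserve a dot product closely, and since the same tensor hardware issues INT8 ops at roughly twice the FP16 rate, inference can cash that in while training generally cannot:

```cpp
// Toy sketch of why inference tolerates lower precision: symmetric INT8
// weight quantization barely changes the output of a dot product.
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <vector>

int main() {
    std::vector<float> weights = {0.12f, -0.83f, 0.47f, -0.05f};
    std::vector<float> inputs  = {1.0f,   0.5f, -2.0f,  0.25f};

    // Map [-max|w|, +max|w|] onto the INT8 range [-127, 127].
    float wmax = 0.0f;
    for (float w : weights) wmax = std::max(wmax, std::fabs(w));
    const float scale = wmax / 127.0f;

    float exact = 0.0f, approx = 0.0f;
    for (size_t i = 0; i < weights.size(); ++i) {
        const int8_t q = static_cast<int8_t>(std::lround(weights[i] / scale));
        exact  += weights[i] * inputs[i];
        approx += (q * scale) * inputs[i]; // dequantized INT8 weight
    }
    std::printf("fp32 dot: %.5f  int8-quantized dot: %.5f\n", exact, approx);
    return 0;
}
```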
> TBH, we don't even know how much better non-AI reconstruction techniques can be when targeting the increased power of the next-gen consoles either.

This is true, but next-gen consoles don't have anything that wasn't available in the PC GPU space since 2016 or so, and there were no breakthroughs in math-based reconstruction on PC during that period.
Have been working all day and haven't really had the time to read about the event. So what is the consensus about the new AMD cards compared to Nvidia's offering?
Pretty competitive performance in terms of rasterization, albeit by AMD's own benchmarks. Priced near Nvidia, outside of the big 3090 undercut with the 6900XT. If you're hoping for good performance, you're in luck. But if you were hoping for something significantly cheaper, that did not happen.