
nekomix

Member
Oct 30, 2017
474
So the 6800s are a pretty nice alternative to the 3070/80 if you don't care about RT. I hope both will be available in two weeks when I enter the RX/RTX wars. And it's good to know the 6800 is sized well enough for ITX cases.
 
Oct 27, 2017
5,329
Isn't the 3070 aimed more at 1080p/1440p though?
In my personal experience with the RTX 3070 FE, it's a really good card at 2560x1440, but even the jump to 3440x1440 makes it a bit questionable in terms of future-proofing. It all depends on your in-game settings and your monitor/refresh rate, but I think at anything above 2560x1440 you'll want an RTX 3080, especially for upcoming games (and some current ones as well).
 

kostacurtas

Member
Oct 27, 2017
9,067

 
Oct 25, 2017
1,170
Wakayama
This question doesn't feel worthy of its own thread, but since we're comparing cards I figured I'd ask here. One comparison I don't see any sites/videos checking is these cards' performance in high-end VR. I have an HTC Vive, and while I can run games on it, I can't supersample with my 980 Ti without running into framerate drops, which is an absolute no-no in VR (this especially sucks for Elite Dangerous, because the text is basically unreadable without supersampling). Between the RTX 3080 and RX 6800 XT, my gut tells me the 16GB of VRAM would be better for VR, both for higher framerates and for having the framebuffer headroom to render each frame twice while supersampling at/close to 4K.

Does anyone here know if I'm thinking about this correctly, or would it really not matter? Are there any sites or videos that have compared (or might in the future compare) VR performance?
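As a rough sanity check on the VRAM side (a sketch of my own; the Vive's panels are 1080x1200 per eye, everything else is an assumed number for illustration), the supersampled stereo render targets themselves turn out to be a small slice of memory:

```cpp
// Rough VRAM math for supersampled stereo render targets.
#include <cstdio>

int main()
{
    const double ss   = 2.0;        // assumed 2.0x supersampling per axis
    const double w    = 1080 * ss;  // per-eye render width (Vive panel: 1080)
    const double h    = 1200 * ss;  // per-eye render height (Vive panel: 1200)
    const double bpp  = 8.0;        // assumed bytes/pixel (HDR color + depth)
    const double eyes = 2.0;        // one render target per eye

    const double mb = w * h * bpp * eyes / (1024.0 * 1024.0);
    std::printf("Supersampled eye buffers: ~%.0f MB\n", mb);  // ~79 MB
    return 0;
}
```

Even heavily supersampled eye buffers come to tens of megabytes; it's textures, geometry, and intermediate targets that actually fill a 10GB vs 16GB card, so the VRAM question is more about asset residency than the framebuffers themselves.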
 

Buggy Loop

Member
Oct 27, 2017
1,232


Coreteks says DXR will get better with upcoming games.


I'm not sure I follow what he means.

"... so my guess is that it's more representative of what we can expect from the DXR implementations that developers are likely to be adopting from here on out, with the exceptions of titles where Nvidia specifically pays developers to include their own implementations like what we see with watchdog.."

I'm sorry, but the guy is clueless (like most YouTubers). Nvidia does not have "their own implementations": Quake II RTX uses Vulkan RT, and the rest are DXR. How else do you think AMD cards can even be benchmarked? Console-to-PC games will use DXR. The same DXR calls that have been used since Turing will be used for Ampere and for RDNA2; the API is hardware agnostic. It's presumptuous to even try to pass that off as information in a review, holy shit. Did Nvidia kill everyone's dogs? Or are they just that ignorant?

And the "it's more representative" is a game with only RT shadows, the bare minimum. Are we hoping now that all console ports to PC will do bare minimum so that AMD cards "drop less" in performances? When has that ever happened? They go all in on PC and then drop down for consoles, for most devs anyway.
 

Acrano

Member
Nov 2, 2017
1,141
Germany
Now the scalpers get supplied directly by the manufacturers.
The only two German retailers that got them wanted 820 euros for the 6800 XT and 740 for the 6800.
Fuck this...
Yeah, the prices are ridiculous. I also tried to get my hands on a 6800 or 6800 XT at Mindfactory and Alternate, but not at that cost. AMD didn't sell any on their own site, and we already have people on eBay and eBay Kleinanzeigen selling cards that aren't even in their hands yet.

It's a worse launch than the RTX cards here in Germany.

In short: absolutely overpriced. I won't be ordering anything from these shops anymore. I'll just wait until January/February; my GTX 1070 Ti will have to hold out until then. I won't support something like this.
 

Buggy Loop

Member
Oct 27, 2017
1,232
People hypothesized that since consoles basically have the same hardware, games are going to take advantage of AMD stuff in the future.

Yea... but no. There's no "AMD stuff" for ray tracing.

A DXR call is hardware agnostic; after that, it comes down to drivers and how RT is implemented in hardware. Nvidia ain't lagging on the driver side, and as we can see, it's not even a fair fight on the hardware RT implementations. (For what a vendor-agnostic DXR dispatch actually looks like, see the sketch below.)

The only thing that can happen is that we see shoddy PC ports that bring console-level RT (roughly RTX 2060 tier) and cut-down features to PC, so that the RT "drop" is smaller for AMD than in a fully ray traced game à la Minecraft, because the port supports only bare-minimum effects like ray-traced shadows (which are almost worthless), lower-resolution reflections, no bounces, roughness cutoffs, etc.

But by that logic, we would not have Cyberpunk 2077 supporting so many RT features, because it would have been held back by the PS4/Xbox One, the main console targets for most of the game's development. Devs are going to push things beyond consoles on PC, at least most of them, like they always have.
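To make that concrete, here's a minimal sketch (my own illustration, not code from any shipping game) of checking for and dispatching DXR through plain D3D12; nothing in it is vendor-specific, and the same calls run on Turing, Ampere, and RDNA2:

```cpp
// Minimal DXR sketch: feature check plus dispatch through the vendor-agnostic
// D3D12 API. The driver maps these calls onto whatever RT hardware exists.
#include <d3d12.h>

bool SupportsDXR(ID3D12Device5* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts, sizeof(opts))))
        return false;
    return opts.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

void TraceFrame(ID3D12GraphicsCommandList4* cmdList,
                ID3D12StateObject* rtPipeline,         // built from the same HLSL on any vendor
                const D3D12_DISPATCH_RAYS_DESC& desc)  // shader tables + launch dimensions
{
    cmdList->SetPipelineState1(rtPipeline);
    cmdList->DispatchRays(&desc);                      // identical call on Nvidia and AMD
}
```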
 

GhostofWar

Member
Apr 5, 2019
512
I've seen different sources confirming the Watch Dogs Legion and Horizon RT bugs (if that's what they are). Has anyone else confirmed what Linus was saying here?

youtu.be: AMD did NOT disappoint me - RX 6800 Series Review

Edit: I probably should have said what it was: productivity software using RT acceleration is putting out weird results.
 

Eiji

Member
Oct 28, 2017
145
Now we all know why raytracing performance wasn't shown at the 6000 series reveal.
 

Vamphuntr

Member
Oct 25, 2017
3,301
Soundly outperformed in 4K raster loads, annihilated in RT and DLSS, why would anyone buy these to save $50?

In Canada the 6800 XT is much cheaper than a 3080. All of the 3080s are above $1,000 CAD on Newegg, and the reference 6800 XT was $860 CAD. According to the rumors posted earlier in the thread, AMD is sending all of its supply to AIBs for their custom cards instead, so there might be a lot of stock next week. So if you cannot get a 3080, you might be tempted by one of these. The 6800 XT also has more VRAM than the 3080 (16GB vs 10GB), and so does the 6800 vs the 3070 (twice as much). And the 6800 pretty much mops the floor with the 3070. If you don't care about RT, they are still a decent alternative to the 3000 series if you can find them. AMD is also working on a DLSS alternative, just like it answered G-Sync with FreeSync.

This is a pretty decent try by AMD imo. They came really close this time. They must be doing something right, because NVIDIA and Intel want to do their own Smart Access Memory for their GPUs too. NVIDIA is also coming up with emergency cards with more VRAM, like the 3060 Ti 12GB and 3080 Ti 20GB, so this is for the best too. Competition is good.

I still cannot get a 3080 (I want the Strix), but if supply of the 6800 XT is good I will definitely try my luck.
 

Buggy Loop

Member
Oct 27, 2017
1,232
In Canada the 6800 XT is much cheaper than a 3080. All of the 3080s are above $1,000 CAD on Newegg, and the reference 6800 XT was $860 CAD. According to the rumors posted earlier in the thread, AMD is sending all of its supply to AIBs for their custom cards instead, so there might be a lot of stock next week. So if you cannot get a 3080, you might be tempted by one of these. The 6800 XT also has more VRAM than the 3080 (16GB vs 10GB), and so does the 6800 vs the 3070 (twice as much). And the 6800 pretty much mops the floor with the 3070. If you don't care about RT, they are still a decent alternative to the 3000 series if you can find them. AMD is also working on a DLSS alternative, just like it answered G-Sync with FreeSync.

This is a pretty decent try by AMD imo. They came really close this time. They must be doing something right, because NVIDIA and Intel want to do their own Smart Access Memory for their GPUs too. NVIDIA is also coming up with emergency cards with more VRAM, like the 3060 Ti 12GB and 3080 Ti 20GB, so this is for the best too. Competition is good.

I still cannot get a 3080 (I want the Strix), but if supply of the 6800 XT is good I will definitely try my luck.

But then in Canada, the 3080 FE is $899 CAD, a $40 difference. When the 6800 XT AIB cards release in a week, their prices will go up too.

If you did not get a reference 6800 XT this morning, then we're back to square one: whether 3080 FE or reference 6800 XT, we're talking about unicorns that don't exist in Canada.
 

Vamphuntr

Member
Oct 25, 2017
3,301
But then in Canada, the 3080 FE is $899 CAD, a $40 difference. When the 6800 XT AIB cards release in a week, their prices will go up too.

If you did not get a reference 6800 XT this morning, then we're back to square one: whether 3080 FE or reference 6800 XT, we're talking about unicorns that don't exist in Canada.

Where is the FE available in Canada? No one is carrying it AFAIK. It was only on NVIDIA's own site, and they've stopped selling them there because of bots or something. At least the 6800 XT listings are still up on Newegg, which implies there will be more.
 

0VERBYTE

Banned
Nov 1, 2017
5,555
I guess this is all for the best, since I haven't finished updating my rig yet anyway.

It is normal for CPU prices to go up this time of year, yeah?
 

djplaeskool

Member
Oct 26, 2017
19,796
I guess this is all for the best, since I haven't finished updating my rig yet anyway.

It is normal for CPU prices to go up this time of year, yeah?

You may see a slight tick upwards as retailers wind down sales and shift stock back towards MSRP. This year may be an outlier, though: new GPUs lead to more builds, which leads to lower overall stock of components. You also have folks, frustrated with the low stock of new GPUs, snapping up older ones. Anecdotally, my local Micro Center can't keep RTX 2000-series cards on the shelf.

Mercifully, the PSU drought has ended, though the prices haven't fully normalized quite yet.
 

JahIthBer

Member
Jan 27, 2018
10,395
It didn't happen last gen.
It did: AMD benefited when games went heavy on compute, like RDR2. Nvidia has since taken away any advantage AMD had.
Still, AMD should be happy with their rasterization gains relative to Nvidia, but ray tracing is something they will need to work on; ironically, the console focus is probably going to benefit Nvidia more, because devs will be using RT more.
 

Readler

Member
Oct 6, 2018
1,972
I'm not sure I follow what he means.

"... so my guess is that it's more representative of what we can expect from the DXR implementations that developers are likely to be adopting from here on out, with the exceptions of titles where Nvidia specifically pays developers to include their own implementations like what we see with watchdog.."

I'm sorry, but the guy is clueless (like most YouTubers). Nvidia does not have "their own implementations": Quake II RTX uses Vulkan RT, and the rest are DXR. How else do you think AMD cards can even be benchmarked? Console-to-PC games will use DXR. The same DXR calls that have been used since Turing will be used for Ampere and for RDNA2; the API is hardware agnostic. It's presumptuous to even try to pass that off as information in a review, holy shit. Did Nvidia kill everyone's dogs? Or are they just that ignorant?

And the "it's more representative" is a game with only RT shadows, the bare minimum. Are we hoping now that all console ports to PC will do bare minimum so that AMD cards "drop less" in performances? When has that ever happened? They go all in on PC and then drop down for consoles, for most devs anyway.
Yes and no.

I genuinely don't listen to YouTubers (lol @ "pays devs for their own implementation"...this is why no one takes you seriously), but CB speculates the same.
Ampere and RDNA2 take two very different approaches to RT.
Ampere does all of its ray tracing work (BVH traversal and intersection tests) on dedicated RT cores, which is effective, as the BVH is accelerated directly in hardware, but those cores also take up a lot of die space.

RDNA2 doesn't do BVH traversal on dedicated hardware; it runs on the normal CUs, which have been extended with a Ray Accelerator for the intersection tests, so RT performance scales directly with the number of CUs (80 CUs vs 68 RT cores). The idea is that the Infinity Cache can partly offset the lack of dedicated traversal hardware, since it is large enough to keep already-built BVH structures in memory. As of now, this approach is clearly not working as well: despite the higher number of units working on rays, performance is worse.
Right now, the Infinity Cache is managed at the driver level, though AMD hopes to give devs more control over it, something that might well happen via the consoles. As of now, the cache hit rate averages 58% at 4K, leaving a lot of room for improvement if you opt to optimise for it.

Now, I'm not saying anyone should base their decision on future potential, that's dumb, but it doesn't sound too far-fetched that RT performance at least could age a bit better. While it could be a fluke, the 6800 XT performed better than the 3080 in 5/7 of the new current-gen games, compared to 3/17 older games, with a reduced gap in RT performance (with RDNA2 performing better than Ampere in Watch Dogs... which is arguably just a piece of shit on PC anyway, haha).
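For a sense of why that hit rate matters, here's a back-of-the-envelope sketch (my own numbers: 512 GB/s is Navi 21's 256-bit GDDR6 at 16 Gbps, 58% is the hit rate quoted above, and the Infinity Cache bandwidth is an assumed figure):

```cpp
// Effective bandwidth as a weighted mix of cache hits and VRAM accesses.
#include <cstdio>

int main()
{
    const double vramBW  = 512.0;   // GB/s, 256-bit GDDR6 @ 16 Gbps (Navi 21)
    const double cacheBW = 1600.0;  // GB/s, assumed Infinity Cache figure
    const double hitRate = 0.58;    // AMD's quoted 4K average

    const double effective = hitRate * cacheBW + (1.0 - hitRate) * vramBW;
    std::printf("Effective bandwidth: ~%.0f GB/s\n", effective);  // ~1143 GB/s
    // Each extra point of hit rate is worth (1600 - 512) / 100, about
    // 11 GB/s here, which is why optimising BVH layout for the cache
    // could pay off if AMD exposes it to devs.
    return 0;
}
```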
 

Classy Tomato

Member
Jun 2, 2019
2,523
It seems AMD will hold a virtual event tomorrow (20th November) in my country, which is quite rare. I don't think Nvidia held any event when Ampere was released here. Announcing and releasing the cards close to the MSRP would be dope though.
 

Irminsul

Member
Oct 25, 2017
3,041
Now, I'm not saying anyone should base their decision on future potential, that's dumb, but it doesn't sound too far-fetched that RT performance at least could age a bit better. While it could be a fluke, the 6800 XT performed better than the 3080 in 5/7 of the new current-gen games, compared to 3/17 older games, with a reduced gap in RT performance (with RDNA2 performing better than Ampere in Watch Dogs... which is arguably just a piece of shit on PC anyway, haha).
This can just as easily be explained by the fact that those newer games are actually doing less with ray tracing than the older ones. The ordering of the fps gap, Minecraft > Control > Watch Dogs, aligns perfectly with how much ray tracing each of those games implements.
 

Readler

Member
Oct 6, 2018
1,972
This can just as easily be explained by the fact that those newer games are actually doing less with ray tracing than the older ones. The ordering of the fps gap, Minecraft > Control > Watch Dogs, aligns perfectly with how much ray tracing each of those games implements.
Very very fair point.

The new CoD also performs significantly worse on RDNA2 with RT on, though I don't know anything about its RT implementation. Then again, Dirt 5 performs a lot better on RDNA2.

Pay for what? There is no "RTX implementation" the way there was with PhysX-enabled games; it's a generic interface. Nvidia literally worked with Microsoft on this as part of DXR.
They pay for advertising, branding, and cross-promotion, but that's no different from when Sony or Microsoft gets the marketing rights to an upcoming multiplat game. Unless a game uses OptiX (which is not meant for games, btw), no game with RT will run exclusively on NV.
 

Kalik

Banned
Nov 1, 2017
4,523
very impressive performance overall from Big Navi... ray tracing is not that great, but that was to be expected... but for 1st-gen RT they did better than Nvidia's 1st-gen RTX... Smart Access Memory seems decent, but since Nvidia also announced support for it, it's basically a wash... rasterization performance is where Big Navi really shines... pretty much neck and neck with the 3080 at 1440p, with Nvidia widening the gap at 4K

the Hardware Canucks review mentioned loud noise levels under load, but I haven't seen any other reviews mention it... if I didn't have a G-Sync monitor and didn't care about ray tracing, I would choose the 6800 XT without a second thought
 

Thretau

Member
Oct 31, 2017
43
Some interesting tradeoffs with these cards. The RX 6800 is ~14% faster than the 3070, against the 3070's DLSS and RT performance, and SAM will widen that gap even further in some games. I would totally take the 14% performance increase over DLSS/RTX in a few random games I might not even play. Others might weight Nvidia's features more.
 

icecold1983

Banned
Nov 3, 2017
4,243
Very very fair point.

The new CoD also performs significantly worse on RDNA2 with RT on, though I don't know anything about its RT implementation. Then again, Dirt 5 performs a lot better on RDNA2.


Pay for what? There is no "RTX implementation" the way there was with PhysX-enabled games; it's a generic interface. Nvidia literally worked with Microsoft on this as part of DXR.
They pay for advertising, branding, and cross-promotion, but that's no different from when Sony or Microsoft gets the marketing rights to an upcoming multiplat game. Unless a game uses OptiX (which is not meant for games, btw), no game with RT will run exclusively on NV.
To use code written by Nvidia engineers.
 

Simuly

Alt-Account
Banned
Jul 8, 2019
1,281
The 6800 XT's gaming power draw is so much lower than the 3080's (by about 100W) that you could overclock the 6800 XT for roughly +9% performance and still draw less power than a stock 3080. The 3080 itself only gains around 2-3% from overclocking, which I wouldn't recommend anyway, as that pushes its power draw closer to 400W!
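Spelling out that arithmetic with assumed stock figures (320W is the 3080's official board power; the ~100W gap is the claim above, and the overclock's power cost is a guess):

```cpp
// Power comparison sketch: overclocked 6800 XT vs stock 3080.
#include <cstdio>

int main()
{
    const double stock3080   = 320.0;              // W, official board power
    const double stock6800XT = stock3080 - 100.0;  // W, per the ~100W gap above
    const double ocCost      = 70.0;               // W, assumed for a +9% OC

    std::printf("6800 XT OC: ~%.0f W vs 3080 stock: %.0f W\n",
                stock6800XT + ocCost, stock3080);  // ~290 W vs 320 W
    return 0;
}
```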
 

tokkun

Member
Oct 27, 2017
5,419
Pay for what? There is no "RTX implementation" the way there was with PhysX-enabled games; it's a generic interface. Nvidia literally worked with Microsoft on this as part of DXR.
They pay for advertising, branding, and cross-promotion, but that's no different from when Sony or Microsoft gets the marketing rights to an upcoming multiplat game. Unless a game uses OptiX (which is not meant for games, btw), no game with RT will run exclusively on NV.

It's not about exclusivity, it's about optimization.

One way the GPU vendors subsidize games is by providing engineers to help optimize them. Naturally, the optimizations will tend to favor their products - both because the engineers have more expertise and because they are incentivized to make the game run better on their product than the competitor's. You don't need a proprietary API to do this - just knowledge about the strengths and weaknesses of the underlying architecture. We see the same thing play out with DirectX and Vulkan, even though they are common APIs.
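As a trivial illustration (a sketch of my own, not from any real engine), this kind of vendor-specific tuning doesn't need a proprietary API; branching on the adapter's PCI vendor ID is enough, and both paths issue the same DXR calls afterwards:

```cpp
// Picking vendor-tuned RT defaults by querying the DXGI adapter.
#include <dxgi.h>

constexpr UINT kVendorNvidia = 0x10DE;  // Nvidia's PCI vendor ID
constexpr UINT kVendorAMD    = 0x1002;  // AMD's PCI vendor ID

void PickRayTracingDefaults(IDXGIAdapter1* adapter)
{
    DXGI_ADAPTER_DESC1 desc = {};
    adapter->GetDesc1(&desc);

    if (desc.VendorId == kVendorNvidia) {
        // e.g. allow more bounces; dedicated traversal hardware copes well
    } else if (desc.VendorId == kVendorAMD) {
        // e.g. fewer bounces, lower-res reflections, cache-friendly BVHs
    }
}
```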
 

GhostofWar

Member
Apr 5, 2019
512

TC McQueen

Member
Oct 27, 2017
2,592
From what I've heard, AMD's ray tracing performance suffers in some titles because those games use particularly computationally heavy light bounces in their RT calculations, while the AMD-optimized RT games use cheaper bounces that don't hit performance as hard.