
sweetmini

Member
Jun 12, 2019
3,921
You forgot Watch Dogs: Legion, and every future big title.
The next Far Cry has it
The next AC has it
Cyberpunk has it
Hitman 3 has it
etc etc
DLSS 2.0+ will now be a standard feature, the benefits being too valuable to ignore.

This feature and AMD's response need to be wrapped in a DX12 feature, so that everybody can benefit from the new advanced supersampling techniques without having to code for two different solutions. One toggle and that's it.
They are and will be fantastic.
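Purely to illustrate that "one toggle" idea, here is a minimal sketch of what a vendor-agnostic DX12 upscaler wrapper could look like. No such API exists as of this thread; IUpscaler, UpscalerDesc and CreateUpscaler are invented names, only the D3D12 types are real.

```cpp
// Hypothetical sketch only - no such DX12 interface exists as of this thread.
// The game codes against one interface and the installed driver supplies its
// vendor's upscaler (a DLSS-style neural one or AMD's Super Resolution).
#include <d3d12.h>
#include <cstdint>
#include <memory>

struct UpscalerDesc {
    uint32_t renderWidth, renderHeight;   // internal (lower) render resolution
    uint32_t outputWidth, outputHeight;   // display resolution
};

class IUpscaler {
public:
    virtual ~IUpscaler() = default;
    // Consumes the usual inputs these techniques need (color, depth, motion
    // vectors) and writes the upscaled frame into `output`.
    virtual void Upscale(ID3D12GraphicsCommandList* cmd,
                         ID3D12Resource* color,
                         ID3D12Resource* depth,
                         ID3D12Resource* motionVectors,
                         ID3D12Resource* output) = 0;
};

// Returns whatever implementation the installed driver exposes, or nullptr if
// none - the single point where "the toggle" would live.
std::unique_ptr<IUpscaler> CreateUpscaler(ID3D12Device* device, const UpscalerDesc& desc);
```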
 

Simuly

Alt-Account
Banned
Jul 8, 2019
1,281
No joke I literally cancelled my 3080 preorder. Unless I hear some post-launch talk about crappy drivers and crashes etc I'm going to give AMD a try this time. It's about time, haven't had an AMD/ATI card since the ATI All in Wonder!

It's definitely worth waiting for reviews before buying anything. Ampere doesn't have any meaningful overclocking headroom, whereas these are meant to overclock really well...can't wait to see the full picture.
 

Hasney

One Winged Slayer
The Fallen
Oct 25, 2017
18,601
It's definitely worth waiting for reviews before buying anything. Ampere doesn't have any meaningful overclocking headroom, whereas these are meant to overclock really well...can't wait to see the full picture.

Fuck RAGE MODE. I want to see MANUAL STEROID USER BEING TOLD TO WEAR A MASK IN A STORE MODE overclocks.

KAREN MODE
 

sweetmini

Member
Jun 12, 2019
3,921
Fuck RAGE MODE. I want to see MANUAL STEROID USER BEING TOLD TO WEAR A MASK IN A STORE MODE overclocks.

KAREN MODE

Nope, the next mode is FURY, I'm calling it, and the one after will be TNT mode.
[Image: Radeon Fury]
 

Simuly

Alt-Account
Banned
Jul 8, 2019
1,281
RAGE mode, ThreadRipper, Fury... someone at AMD marketing has been playing DOOM or something.

For those that haven't seen, compilation of the bench results using that website AMD released, SAM on but RAGE mode off:

[Image: compiled benchmark results]
 

minimalism

Member
Jan 9, 2018
1,129
I'll wait until we start seeing RT comparisons and whenever AMD decides to roll out whatever DLSS equivalent they have. Raw rasterization performance means nothing for now.
 

Shroki

Member
Oct 27, 2017
5,911
I just want a 4K60 for like two years. I'm more willing to make some settings sacrifices than I am to sit on my hands until 2021 for an Nvidia card tbh.

So I'm pretty sure I'm just grabbing a 6800xt.
 

brain_stew

Member
Oct 30, 2017
4,727
For those that haven't seen, compilation of the bench results using that website AMD released, SAM on but RAGE mode off:

[Image: compiled benchmark results]

I'm really surprised that AMD didn't make a bigger fuss about their 1440p performance. For the majority of the market those are the results that matter and the 6800xt is beating a 3090 at that resolution. That's hugely significant, and better than I had initially realised.

It also puts the 6800 in a better light, getting within 5% of 3080 performance and a clear tier above the 2080 Ti. The 6800 doesn't show quite the same lead over the 2080 Ti at 4K, but the 3070 also falls off a little against the 2080 Ti at 4K, so once the 2080 Ti is swapped out for the 3070, the gap should widen at 4K.

Need to understand how much impact SAM is having on those results, as I'm on a B450 motherboard with a 3700X, so it may be 5+ years before I can benefit from it.
 

Readler

Member
Oct 6, 2018
1,972
You forgot Watch Dogs: Legion, and every future big title.
The next Far Cry has it
The next AC has it
Cyberpunk has it
Hitman 3 has it
etc etc
DLSS 2.0+ will now be a standard feature, the benefits being too valuable to ignore.

This feature and AMD's response need to be wrapped in a DX12 feature, so that everybody can benefit from the new advanced supersampling techniques without having to code for two different solutions. One toggle and that's it.
They are and will be fantastic.
None of those games have confirmed DLSS support for now, fwiw. At this point it also seems unlikely for Valhalla to have it, as they probably would have advertised it by now, like Legion and Cyberpunk have.

Arguably some of *the* biggest games will most likely not have it: Rockstar usually doesn't give a shit, and I doubt that Bethesda/Microsoft games will support it either, opting instead for a DX solution (see Doom)

I'll wait until we start seeing RT comparisons and whenever AMD decides to roll out whatever DLSS equivalent they have. Raw rasterization performance means nothing for now.
Lol
 
Oct 25, 2017
7,141
RAGE mode, ThreadRipper, Fury... someone at AMD marketing has been playing DOOM or something.

For those that haven't seen, compilation of the bench results using that website AMD released, SAM on but RAGE mode off:

[Image: compiled benchmark results]
This chart makes a pretty good case for both the 6800 cards. Does this mean they're both stomping the 3070?
 

Theswweet

RPG Site
Verified
Oct 25, 2017
6,404
California


Potential RT performance leak for the RX 6800? The CPU might be bottlenecking the 1440p results. Just below RTX 2080 Ti performance - the 6800 XT would likely be above 2080 Ti RT performance.

Not bad if real!
 

Readler

Member
Oct 6, 2018
1,972
This chart makes a pretty good case for both the 6800 cards. Does this mean they're both stomping the 3070?
These charts make an even stronger case:
[Image: RX 6900 XT / 6800 XT / 6800 vs RTX 3090 / 3080 / 2080 Ti benchmarks at 1440p]

Source: https://videocardz.com/newz/amd-dis...900xt-rx-6800xt-and-rx-6800-gaming-benchmarks

The 6800 is pretty much consistently better than the 3070/2080 Ti, even edging out the 3080/3090 at 1440p in certain games. Since the 3070 doesn't use GDDR6X like the 3080, the only reasons to get it are DLSS and RT (which arguably are not trivial reasons, but on a *hardware* level and in terms of rasterisation performance the 6800 series is better in every regard).
 

BeI

Member
Dec 9, 2017
5,974
RAGE mode, ThreadRipper, Fury... someone at AMD marketing has been playing DOOM or something.

For those that haven't seen, compilation of the bench results using that website AMD released, SAM on but RAGE mode off:

[Image: compiled benchmark results]

I would have thought that the Infinity Cache increasing the effective bandwidth of the 6800 (XT) so much would make them better at 4K.
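A rough way to see why it doesn't quite work out that way: the effective bandwidth depends on the cache hit rate, and the hit rate falls at 4K because the working set outgrows the 128 MB cache. A toy model, not AMD's published math - the 512 GB/s is the 6800 XT's actual GDDR6 bandwidth (256-bit at 16 Gbps), while the cache bandwidth and hit rates below are placeholder assumptions:

```cpp
// Toy model: traffic that hits the Infinity Cache is served at on-die speed,
// the rest falls through to GDDR6. As the hit rate drops at 4K, the blended
// figure sinks back toward the raw 512 GB/s.
#include <cstdio>

int main() {
    const double gddr6_bw = 512.0;   // GB/s: 256-bit @ 16 Gbps (RX 6800 XT, real spec)
    const double cache_bw = 2000.0;  // GB/s: assumed on-die cache bandwidth, placeholder

    const double hit_rates[] = {0.70, 0.55, 0.40};  // hypothetical 1440p / 4K / worst case
    for (double hit : hit_rates) {
        double effective = hit * cache_bw + (1.0 - hit) * gddr6_bw;
        std::printf("hit rate %2.0f%% -> ~%4.0f GB/s effective\n", hit * 100.0, effective);
    }
    return 0;
}
```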
 

Readler

Member
Oct 6, 2018
1,972


Potential RT performance leak for the RX 6800? The CPU might be bottlenecking the 1440p results. Just below RTX 2080 Ti performance - the 6800 XT would likely be above 2080 Ti RT performance.

Not bad if real!

[Image: leaked Shadow of the Tomb Raider RT benchmark]

That tweet seems fishy lol.
At 4K without DLSS, with RT, in SOTTR, CB reports 33 FPS for the 3070 and 45 FPS for the 3080 - that benchmark of 46 FPS for the 6800 would be better than both, which seems very unlikely imo
www.computerbase.de

Nvidia GeForce RTX 3070 FE review: clock rates, benchmarks in Full HD, WQHD, UHD / test system and test methodology
 

Theswweet

RPG Site
Verified
Oct 25, 2017
6,404
California
[Image: leaked Shadow of the Tomb Raider RT benchmark]

That tweet seems fishy lol.
At 4K without DLSS, with RT, in SOTTR, CB reports 33 FPS for the 3070 and 45 FPS for the 3080 - that benchmark of 46 FPS for the 6800 would be better than both, which seems very unlikely imo
www.computerbase.de

Nvidia GeForce RTX 3070 FE review: clock rates, benchmarks in Full HD, WQHD, UHD / test system and test methodology

TPU's numbers tell a much different story:



4k numbers:

 

sweetmini

Member
Jun 12, 2019
3,921
None of those games have confirmed DLSS support for now, fwiw. At this point it also seems unlikely for Valhalla to have it, as they probably would have advertised it by now, like Legion and Cyberpunk have.

Arguably some of *the* biggest games will most likely not have it: Rockstar usually doesn't give a shit, and I doubt that Bethesda/Microsoft games will support it either, opting instead for a DX solution (see Doom)


Lol

Damn, I thought the Ubi games all had the same engine now... indeed I am wrong.
And for Hitman 3... this is the source of my confusion:
www.ign.com

50 Games with RTX and DLSS - IGN

Games with enhanced ray tracing and increased frame rates are on the rise, see the full list of RTX enabled games.

They listed... Hitman 2 as a coming-soon game with DLSS

Anyway, it still stands that for me quality supersampling is the key to the future, and I wish for all to benefit from it.
 

LordRuyn

Member
Oct 29, 2017
3,909
AMD is really making a good case for me to switch from my 1080 Ti; the only thing I am waiting on is the quality of the drivers.
 

Fredrik

Member
Oct 27, 2017
9,003
These charts make an even stronger case:
[Image: RX 6900 XT / 6800 XT / 6800 vs RTX 3090 / 3080 / 2080 Ti benchmarks at 1440p]

Source: https://videocardz.com/newz/amd-dis...900xt-rx-6800xt-and-rx-6800-gaming-benchmarks

The 6800 is pretty much consistently better than the 3070/2080 Ti, even edging out the 3080/3090 at 1440p in certain games. Since the 3070 doesn't use GDDR6X like the 3080, the only reasons to get it are DLSS and RT (which arguably are not trivial reasons, but on a *hardware* level and in terms of rasterisation performance the 6800 series is better in every regard).
So as it seems right now, ray tracing and DLSS are the wins for Nvidia this time? The rest are AMD wins?
I wonder if Nvidia will react to this; since the 6900 XT beats the 3090 on raw performance while being $400 cheaper, they can't do a quick fix with the rumored December 20GB "Super" cards.
 

Readler

Member
Oct 6, 2018
1,972
TPU's numbers tell a much different story:



4k numbers:


Dang you're right.
[Image: GameStar RTX 3070 ray tracing benchmarks]

Source: https://www.gamestar.de/artikel/gef...ite3.html#raytracing-performance-der-rtx-3070
Strange.

All of CB's scores are lower, though, so I guess it's something to do with their methodology?

Damn, I thought the Ubi games all had the same engine now... indeed I am wrong.
And for Hitman 3... this is the source of my confusion:
www.ign.com

50 Games with RTX and DLSS - IGN

Games with enhanced ray tracing and increased frame rates are on the rise, see the full list of RTX enabled games.

They listed... Hitman 2 as a coming-soon game with DLSS

Anyway, it still stands that for me quality supersampling is the key to the future, and I wish for all to benefit from it.
Agree. MS should get on it. They're investing lots into AI too so I wouldn't be surprised. In the end, it's hopefully the consumers who benefit :)

And it is reasonable to assume that Hitman 3's gonna get it - all I'm saying is that we shouldn't rely on "coulda shoulda"s. I guess all of us have been burnt by hoped-for driver-magic improvements, so just judge performance by what we currently know and have.

Ok, what is lol about the statement? Raw rasterization is inferior if I can just enable DLSS and get superior performance. It is not a misnomer to say it's meaningless unless AMD comes out with an equivalently good DLSS option....
Because only a handful of games support it? I don't care about DLSS if the games I'm currently most interested in don't have it. I wanna play RDR2 in ultrawide with my new rig and be ready for Starfield, for instance. The only DLSS game I'm interested in atm is Control, and I'm not gonna lose out on safe raster performance for potentially faster DLSS performance.

So it most certainly is a misnomer, unless you're only playing those DLSS supported games. It's like saying a PC is useless because you can't play Bloodborne on it.

So as it seems right now, ray tracing and DLSS are the wins for Nvidia this time? The rest are AMD wins?
I wonder if Nvidia will react to this; since the 6900 XT beats the 3090 on raw performance while being $400 cheaper, they can't do a quick fix with the rumored December 20GB "Super" cards.
As always: wait for third-party benchmarks. There will be cases where the 3090 is faster than the 6900XT.
From what we seem to know NOW they seem pretty good. If it wasn't for DLSS and RT I'd see literally no reason to get the 3080 over a 6800XT for instance. Like, not a single one.
 

Simuly

Alt-Account
Banned
Jul 8, 2019
1,281
I'm really surprised that AMD didn't make a bigger fuss about their 1440p performance. For the majority of the market those are the results that matter and the 6800xt is beating a 3090 at that resolution. That's hugely significant, and better than I had initially realised.

It also puts the 6800 in a better light, getting within 5% of 3080 performance and a clear tier above the 2080 Ti. The 6800 doesn't show quite the same lead over the 2080 Ti at 4K, but the 3070 also falls off a little against the 2080 Ti at 4K, so once the 2080 Ti is swapped out for the 3070, the gap should widen at 4K.

Need to understand how much impact SAM is having on those results, as I'm on a B450 motherboard with a 3700X, so it may be 5+ years before I can benefit from it.

Yep precisely, they said SAM improves performance 'up to 11%' but then the detail: 'Smart Access Memory will have a degree of automatic support to it. As things stand, many AAA games can use it today automatically and get some degree of a performance gain. But to get the best results, developers will still want to program games with the technology in mind.'

So I'd assume none of those games have been optimised for it. Maybe those benches are boosted by 5% then? Hard to gauge.

This chart makes a pretty good case for both the 6800 cards. Does this mean they're both stomping the 3070?

Yes, basically replace the 2080 Ti with the 3070 in that chart. Both 6800s are significantly faster using AMD's numbers.
 

JahIthBer

Member
Jan 27, 2018
10,376
Seeing how demanding these games with RT are, DLSS is sorely needed, but between the Ryzen/RDNA2 synergy and RDNA2 having much more memory, this is a hard choice.
Might end up getting a 6800 XT; hope the price in my country isn't bonkers. $950 for the 3070 is a goddamn joke, 8GB of VRAM in 2020 for that price.
 

brain_stew

Member
Oct 30, 2017
4,727
These charts make an even stronger case:
[Image: RX 6900 XT / 6800 XT / 6800 vs RTX 3090 / 3080 / 2080 Ti benchmarks at 1440p]

Source: https://videocardz.com/newz/amd-dis...900xt-rx-6800xt-and-rx-6800-gaming-benchmarks

The 6800 is pretty much consistently better than the 3070/2080 Ti, even edging out the 3080/3090 at 1440p in certain games. Since the 3070 doesn't use GDDR6X like the 3080, the only reasons to get it are DLSS and RT (which arguably are not trivial reasons, but on a *hardware* level and in terms of rasterisation performance the 6800 series is better in every regard).

The 6800 looks really good there, and given it has much lower clocks it has the biggest potential for overclocks, as long as AMD doesn't put artificial limits on it. We know the 3070 has practically zero overclocking headroom, so it will be interesting to see what the gap between an overclocked 3070 and 6800 turns out to be. It could be quite significant at 1440p.

After initial disappointment with the 6800 meaning AMD doesn't have a card in the $500 price segment this year, I am starting to warm up to it. I was set on a 3070, but the fact that I couldn't secure one at launch is making me reconsider the options from AMD.

Really need to know UK pricing.
 

brain_stew

Member
Oct 30, 2017
4,727
From what we seem to know NOW they seem pretty good. If it wasn't for DLSS and RT I'd see literally no reason to get the 3080 over a 6800XT for instance. Like, not a single one.

Drivers are a hugely significant one that can't be overlooked. I hate that buying an AMD GPU means I get an effective and significant CPU downgrade in all DX11 titles, and there are still plenty of those that require significant single-core performance to maintain a truly locked 60fps. The idea that upgrading one component downgrades another has never sat well with me. While DX12 and Vulkan are becoming increasingly common, DX11 still matters and will continue to matter.

We can't just pretend that Navi drivers weren't a complete mess for a long period after launch either. I don't expect the same to happen with RDNA2, but it's a legitimate concern to have.
 

Readler

Member
Oct 6, 2018
1,972
The 6800 looks really good there, and given it has much lower clocks it has the biggest potential for overclocks, as long as AMD doesn't put artificial limits on it. We know the 3070 has practically zero overclocking headroom, so it will be interesting to see what the gap between an overclocked 3070 and 6800 turns out to be. It could be quite significant at 1440p.

After initial disappointment with the 6800 meaning AMD doesn't have a card in the $500 price segment this year, I am starting to warm up to it. I was set on a 3070, but the fact that I couldn't secure one at launch is making me reconsider the options from AMD.

Really need to know UK pricing.
Big Navi also consuming less energy might mean more OC headroom.

Drivers are a hugely significant one that can't be overlooked. I hate that buying an AMD GPU means I get an effective and significant CPU downgrade in all DX11 titles, and there are still plenty of those that require significant single-core performance to maintain a truly locked 60fps. The idea that upgrading one component downgrades another has never sat well with me. While DX12 and Vulkan are becoming increasingly common, DX11 still matters and will continue to matter.

We can't just pretend that Navi drivers weren't a complete mess for a long period after launch either. I don't expect the same to happen with RDNA2, but it's a legitimate concern to have.
I feel like this is always overblown. Turing notoriously had issues as well, and there was also that one driver that bricked older cards.
FWIW, while Nvidia does have the more rounded overall software experience, I never had any problems with AMD drivers myself. Hey, you could even argue that poorer optimisation leads to better aging :P
I remember the 7970 leaving the 680 in the dust after a year or two.
 

Bosch

Banned
May 15, 2019
3,680
The 6800 XT with a theoretical 23 TFLOPs should offer 2x the performance of the Xbox Series X, right?

When the PS4 came out, the 7970 GHz Edition offered double its performance.
 

RealSamFisher

Member
Oct 6, 2018
26
After initial disappointment with the 6800 meaning AMD doesn't have a card in the $500 price segment this year, I am starting to warm up to it. I was set on a 3070, but the fact that I couldn't secure one at launch is making me reconsider the options from AMD.
I seriously don't understand this notion here about the 6800. It's pretty great value compared to the 3070 in my opinion:
  • According to the benchmarks it's around 10-15% better than a 2080 Ti/3070 in raster performance.
  • It has 16GB of VRAM.
  • RT performance seems to be between the 2080 and 2080 Ti, looking pretty good actually.
  • There is overclocking headroom for the 6800. The 3070 is pretty much at its limit (even AIBs don't add much).
  • AMD announced Super Resolution.
Sure, we need to wait for the real benchmarks to come. But if we look at all the Zen & RDNA events... the performance numbers were pretty reliable. So I don't think it will change that much. That said, we still need to wait for benchmarks from Gamers Nexus, Hardware Unboxed, Computerbase etc.
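For a rough sense of that value claim, here is a quick perf-per-dollar calculation. The US MSRPs are real; the 12.5% performance figure is just the midpoint of the 10-15% range above:

```cpp
// Perf-per-dollar sanity check on the list above.
#include <cstdio>

int main() {
    const double rx6800 = 579.0, rtx3070 = 499.0;  // US MSRPs
    const double perf_ratio = 1.125;               // midpoint of the quoted 10-15% lead
    const double price_ratio = rx6800 / rtx3070;   // ~1.16x

    std::printf("price %.2fx, perf %.2fx -> raster perf/$ ratio %.2f\n",
                price_ratio, perf_ratio, perf_ratio / price_ratio);
    // ~0.97: roughly a wash on raster perf/$ alone, so the 16 GB of VRAM and
    // OC headroom are what tip the value argument.
    return 0;
}
```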

I also don't understand why people expected AMD to work miracles and beat Nvidia in every segment. Being around Turing level when it comes to RT performance is a pretty good start for a first hardware RT implementation. We will have to see how AMD implements Super Resolution. However, AMD is in a pretty good spot in my opinion, better than I expected. This is good for us consumers. And I expect the Radeon team to keep advancing the way the Ryzen team has.

Looking at the architecture of RDNA2, I really wouldn't be surprised to see MCMs in RDNA3. The Radeon team has done a lot of preparation in RDNA2 to make this work. Potentially we might be able to see 160 CUs (4 x 40 CUs chiplets) in 5nm.
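Napkin math on that 160 CU speculation, using the standard RDNA relation of 64 stream processors per CU and 2 FLOPs per clock per SP (the clock speed here is pure assumption):

```cpp
// TFLOPs = stream processors x 2 ops/clock x clock.
#include <cstdio>

int main() {
    const int cus = 4 * 40;           // four hypothetical 40-CU chiplets
    const int sps = cus * 64;         // 64 stream processors per RDNA CU -> 10240
    const double clock_ghz = 2.0;     // assumed boost clock, pure guesswork
    const double tflops = sps * 2.0 * clock_ghz / 1000.0;
    std::printf("%d CUs -> %d SPs -> ~%.1f TFLOPs at %.1f GHz\n",
                cus, sps, tflops, clock_ghz);
    return 0;
}
```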
 

Readler

Member
Oct 6, 2018
1,972
I seriously don't understand this notion here about the 6800. It's pretty great value compared to the 3070 in my opinion:
  • According to the benchmarks it's around 10-15% better than a 2080 Ti/3070 in raster performance.
  • It has 16GB of VRAM.
  • RT performance seems to be between the 2080 and 2080 Ti, looking pretty good actually.
  • There is overclocking headroom for the 6800. The 3070 is pretty much at its limit (even AIBs don't add much).
  • AMD announced Super Resolution.
Sure, we need to wait for the real benchmarks to come. But if we look at all the Zen & RDNA events... the performance numbers were pretty reliable. So I don't think it will change that much. That said, we still need to wait for benchmarks from Gamers Nexus, Hardware Unboxed, Computerbase etc.

I also don't understand why people expected AMD to work miracles and beat Nvidia in every segment. Being around Turing level when it comes to RT performance is a pretty good start for a first hardware RT implementation. We will have to see how AMD implements Super Resolution. However, AMD is in a pretty good spot in my opinion, better than I expected. This is good for us consumers. And I expect the Radeon team to keep advancing the way the Ryzen team has.

Looking at the architecture of RDNA2, I really wouldn't be surprised to see MCMs in RDNA3. The Radeon team has done a lot of preparation in RDNA2 to make this work. Potentially we might be able to see 160 CUs (4 x 40 CU chiplets) in 5nm.
I guess calling this their Zen moment makes sense, as they have caught up now and are a viable alternative; RDNA3, with hopefully a useful Super Resolution feature and a better RT implementation, is where things could get spicy.

I would be very surprised to see MCMs in RDNA3 (that is, if the 7000 series is indeed RDNA3), particularly if they stay on the same node. That would be huge, as latency, especially for gaming, seems to be somewhat of a problem.
 
Nov 2, 2017
2,275
The 6800 XT with a theoretical 23 TFLOPs should offer 2x the performance of the Xbox Series X, right?

When the PS4 came out, the 7970 GHz Edition offered double its performance.
No, because the XSX is 12 TFLOPs, so you'd need 24 TFLOPs, but there's never 1:1 scaling with TFLOPs, so effective performance is going to be less than 2x. If we put the XSX around a 2080 and the 6800 XT around a 3080, then the 6800 XT is about 60-70% faster. There's no card out right now that's twice as powerful as the XSX. We'll have to wait until Hopper & RDNA3.

Also, the 7970 GHz was 2.3 times faster in raw TFLOPs. I think it had fewer async compute engines. No clue how much faster it was in actual games, though.
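The ratios behind this reply, sticking to the numbers already cited in the thread (raw TFLOPs only; as noted above, they never map 1:1 onto frame rates):

```cpp
#include <cstdio>

int main() {
    const double xsx = 12.0;        // Series X, as cited above
    const double rx6800xt = 23.0;   // the "theoretical 23 TFLOPs" from the question
    std::printf("6800 XT / XSX  = %.2fx raw TFLOPs (short of 2x)\n", rx6800xt / xsx);

    const double ps4 = 1.84;        // 18 CUs @ 800 MHz
    const double hd7970ghz = 4.3;   // 2048 SPs x 2 x 1.05 GHz / 1000
    std::printf("7970 GHz / PS4 = %.2fx raw TFLOPs (the '2.3 times' above)\n",
                hd7970ghz / ps4);
    return 0;
}
```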
 
OP
Raydonn

One Winged Slayer
Member
Oct 25, 2017
919
videocardz.com

Alleged AMD Radeon RX 6800 Time Spy and Tomb Raider (with DXR) performance leaks out - VideoCardz.com

The editor from Uniko's Hardware shared alleged AMD Radeon RX 6800 graphics card performance. AMD Radeon RX 6800 faster than GeForce RTX 3070 in Time Spy The results are not confirmed to be Radeon RX 6800, but we can assume that Uniko's Hardware editor is not sharing fabricated information, as...

More rumours, not threadmarking because I only want to mark credible news, but something to read and speculate on is always welcome.
 

RealSamFisher

Member
Oct 6, 2018
26
I would be very surprised to see MCMs in RDNA3 (that is, if the 7000 series is indeed RDNA3), particularly if they stay on the same node. That would be huge, as latency, especially for gaming, seems to be somewhat of a problem.
Look at this block diagram from TechPowerUp:

[Image: RDNA2 block diagram]


Now, this isn't official; however, if this block diagram is true, I could see how they could create an MCM for RDNA3: Infinity Fabric & cache controller sitting between the IO die and the GPU chiplets, the IO die having the massive Infinity Cache and all the other stuff, while a GPU chiplet is a single shader engine with some L2 cache. I know it's not that easy, just throwing out some ideas.
 

Readler

Member
Oct 6, 2018
1,972
Look at this block diagram from TechPowerUp:

[Image: RDNA2 block diagram]


Now, this isn't official; however, if this block diagram is true, I could see how they could create an MCM for RDNA3: Infinity Fabric & cache controller sitting between the IO die and the GPU chiplets, the IO die having the massive Infinity Cache and all the other stuff, while a GPU chiplet is a single shader engine with some L2 cache. I know it's not that easy, just throwing out some ideas.
You know what really grinds my gears? When you criticise something and people go "oh yeah, you go do it better then!"
That being said, if it's that easy, why haven't you done it yourself RealSamFisher, HUH?

I kid of course, but yeah, I could see that. It's clear that AMD is laying the groundwork with Infinity Fabric and things like SAM (which I could imagine being carried over to an MCM design); I'm just not seeing it for next year. RDNA2 was an evolution of RDNA, as Ampere was of Turing - an MCM design would be pretty revolutionary and a pretty heavy shift in architecture design, I guess. Maaaybe in a Titan-esque card to just show off.
It's clear they are working on it though, just like Nvidia announced with Hopper.
 

RealSamFisher

Member
Oct 6, 2018
26
@Readler: I don't think we will see it next year either. More like Q1/Q2 2022. Maybe it's too early and we will see another monolithic die with RDNA3 and the 7000 series. I wouldn't rule it out.
 

Readler

Member
Oct 6, 2018
1,972
@Readler: I don't think we will see it next year either. More like Q1/Q2 2022. Maybe it's too early and we will see another monolithic die with RDNA3 and the 7000 series. I wouldn't rule it out.
I suppose you're right. At this point, who the hell knows haha. I personally think they'd make the 7000 series just a strong contender to reach feature parity with Nvidia. Although Hopper is rumoured to have an MCM design, I'd be very surprised to see it anywhere outside of the absolute flagship until the generation after that. I do hope to be proven wrong though lol
Either way, exciting times!
 

Nachtmaer

Member
Oct 27, 2017
347
Splitting off the I/O from the cores makes a lot of sense for CPUs. That's basically going back to the days of having the memory controller on the north bridge, just on the CPU package now. I'm not sure if you could do something similar for GPUs, where you're dealing with close to 1TB/s vs 100GB/s on Zen 2, roughly speaking. Afaik, the two biggest challenges are splitting things up without messing up the pipeline by adding a bunch of latency, and stitching everything together on package with something that has enough bandwidth.
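To put a rough number on that bandwidth gap, using the two figures cited above:

```cpp
// A GPU chiplet link would need to carry roughly an order of magnitude more
// traffic than a Zen 2 Infinity Fabric link does.
#include <cstdio>

int main() {
    const double gpu_bw  = 1000.0;  // GB/s-ish: "close to 1TB/s" cited above
    const double zen2_if = 100.0;   // GB/s-ish: Zen 2 fabric figure cited above
    std::printf("GPU chiplet fabric would need ~%.0fx Zen 2's link bandwidth\n",
                gpu_bw / zen2_if);
    return 0;
}
```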
 

tokkun

Member
Oct 27, 2017
5,400
From what we seem to know NOW they seem pretty good. If it wasn't for DLSS and RT I'd see literally no reason to get the 3080 over a 6800XT for instance. Like, not a single one.

Hardware lock-in is still a thing. Nvidia only added vanilla adaptive sync support to Gsync hardware a year ago. Going with an AMD card either means giving up VRR support or replacing my two Gsync monitors.
 

Readler

Member
Oct 6, 2018
1,972
Hardware lock-in is still a thing. Nvidia only added vanilla adaptive sync support to Gsync hardware a year ago. Going with an AMD card either means giving up VRR support or replacing my two Gsync monitors.
I mean, yeah, but that's kinda on you if you did that (I really sound like a dick here, and that really isn't my intention, sorry).
I just didn't support that shit precisely because of this, but this is not the first time Nvidia has been a massive cunt. GeForce Partner Program, anyone?
 

j^aws

Member
Oct 31, 2017
1,569
UK
Yep precisely, they said SAM improves performance 'up to 11%' but then the detail: 'Smart Access Memory will have a degree of automatic support to it. As things stand, many AAA games can use it today automatically and get some degree of a performance gain. But to get the best results, developers will still want to program games with the technology in mind.'

So I'd assume none of those games have been optimised for it. Maybe those benches are boosted by 5% then? Hard to gauge.
There are great opportunities for optimisation with SAM: since the CPU has full access to GDDR6 (VRAM), much like the unified memory in the XSX and PS5 APUs, ports could really benefit where the CPU is used in the rendering pipeline.
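For the curious, this is roughly what SAM (i.e. Resizable BAR) looks like from the programmer's side in Vulkan terms: device-local memory that is also host-visible, no longer capped at a ~256 MB window. The calls below are real Vulkan API; treating such a heap as "SAM present" is my own simplification:

```cpp
// Minimal sketch: with SAM/Resizable BAR the CPU can write straight into VRAM
// instead of staging everything through a small 256 MB window.
#include <vulkan/vulkan.h>
#include <cstdio>

bool HasSamStyleHeap(VkPhysicalDevice gpu) {
    VkPhysicalDeviceMemoryProperties props;
    vkGetPhysicalDeviceMemoryProperties(gpu, &props);
    for (uint32_t i = 0; i < props.memoryTypeCount; ++i) {
        const VkMemoryType& t = props.memoryTypes[i];
        const bool deviceLocal = t.propertyFlags & VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT;
        const bool hostVisible = t.propertyFlags & VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT;
        // Without resizable BAR this combination is typically capped at ~256 MB;
        // with SAM it can span the whole VRAM.
        if (deviceLocal && hostVisible &&
            props.memoryHeaps[t.heapIndex].size > 512ull * 1024 * 1024) {
            std::printf("CPU-writable VRAM heap: %llu MB\n",
                        (unsigned long long)(props.memoryHeaps[t.heapIndex].size >> 20));
            return true;
        }
    }
    return false;
}
```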
 

tokkun

Member
Oct 27, 2017
5,400
I mean, yeah, but that's kinda on you if you did that (I really sound like a dick here, and that really isn't my intention, sorry).
I just didn't support that shit precisely because of this, but this is not the first time Nvidia has been a massive cunt. GeForce Partner Program, anyone?

It made sense at the time.

Nvidia was not supporting DP adaptive sync output from their cards either, so the choice was either getting locked in to Nvidia or getting locked in to AMD. This was around the Maxwell release, when Nvidia GPUs were really superior. Gsync had more significant technical advantages back then too; AMD hadn't introduced FreeSync 2 and didn't have LFC. And because Gsync was seen as the premium technology and FreeSync was relegated to budget models, if you wanted a top-end monitor you had to go Gsync.

I'd love to get out from under Nvidia's thumb on this, but this seems like a pretty bad time to do a monitor upgrade, since it seems like MiniLED is going to go mainstream in the next year or two and HDMI 2.1 / DP 2.0 support is coming as well.
 

RealSamFisher

Member
Oct 6, 2018
26
videocardz.com

Alleged AMD Radeon RX 6800 Time Spy and Tomb Raider (with DXR) performance leaks out - VideoCardz.com

The editor from Uniko's Hardware shared alleged AMD Radeon RX 6800 graphics card performance. AMD Radeon RX 6800 faster than GeForce RTX 3070 in Time Spy The results are not confirmed to be Radeon RX 6800, but we can assume that Uniko's Hardware editor is not sharing fabricated information, as...

More rumours, not threadmarking because I only want to mark credible news, but something to read and speculate on is always welcome.
If AMD is better at ray tracing, why wouldn't they show it at the event? Are they waiting until they are ready with their Super Resolution technology?
 
OP
Raydonn

One Winged Slayer
Member
Oct 25, 2017
919
The 6800 isn't competing against the 3070, hence the price difference. It's for the future 3070 Ti.
The 6700 XT that is yet to be released will be comparable to the 3070 and will be around that price point.
If AMD is better at ray tracing, why wouldn't they show it at the event? Are they waiting until they are ready with their Super Resolution technology?
If I had to guess... "Drivers not ready" or "Let's wait until release date to drop even more news so Nvidia can't counter us".

Just like how people were complaining that AMD weren't showing anything in September, so they must not have had anything that could compete with Ampere.
AMD was just not ready to release it yet and wanted Nvidia to blow its load so they couldn't react to AMD's response.
 

PHOENIXZERO

Member
Oct 29, 2017
12,069
So as it seems right now, ray tracing and DLSS are the wins for Nvidia this time? The rest are AMD wins?
I wonder if Nvidia will react to this; since the 6900 XT beats the 3090 on raw performance while being $400 cheaper, they can't do a quick fix with the rumored December 20GB "Super" cards.
It's probably why the 16GB and 20GB non-Ti versions were supposedly cancelled.
 