
mario_O

Member
Nov 15, 2017
2,755
Well, they didn't show any RT benchmarks in their presentation, unlike Nvidia, so I don't think it's too much of a stretch to assume they lag behind in that aspect.
If the rumors are right, they are one gen behind, on par with Turing, and their "DLSS" solution is faster but not as good. I personally don't care that much about raytracing. If it's as good as on consoles, that's good enough.
 

Iron Eddie

Banned
Nov 25, 2019
9,812
You have no valid issue. I've explained this. Drawing logical conclusions is fair game on a discussion board.


The 3090 is now completely out of pocket. Nvidia can't even drop the price this soon as they are actively fulfilling orders.

BUT, the RT performance comparison between the 6900 XT and RTX 3090 could tell an entirely different story. What if the 3090 just bulldozes the 6900 by some ridiculous percentage?
True, but then you have to weigh whether the extra $500 is worth the limited number of games that use it, and whether AMD will get their own solution up and running soon.

I found this presentation rather weak; they hardly touched on the games, so now we have to wait for real-world performance differences.
 
Oct 25, 2017
1,957
Germany
The thing is that these cards are likely to have better RT performance than the consoles, for which developers will optimize RT. The games will look great and optimal, you might just not get that "ray tracing just blew my mind"-moment.

No hardware-supported DLSS alternative has me actually worried. The UE5 demo ran at 1440p/30fps on a PS5, meaning that some sort of upscaling will be necessary for high-end early UE5 games running at 60fps+. Now, we know that Microsoft is working on their own software AI super-resolution upscaler, but will that be on PC? Maybe... probably, even, I'd say. What about Sony, though?
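Some rough pixel-throughput math on why upscaling looks unavoidable (my own back-of-the-envelope, assuming rendering cost scales roughly linearly with pixels per second):

```python
# Back-of-the-envelope only: assumes cost scales ~linearly with pixels/s.
ue5_demo = 2560 * 1440 * 30      # 1440p30, ~110.6M pixels/s
native_4k60 = 3840 * 2160 * 60   # 4K60, ~497.7M pixels/s

print(native_4k60 / ue5_demo)    # => 4.5, i.e. 4.5x the demo's pixel rate
```

So a native 4K60 target is roughly 4.5x the pixel rate of that demo, which is exactly the gap an upscaler has to paper over.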
Really don't like that AMD doesn't seem to have an in-house AI upscaling solution implemented in RDNA2.
 

Deleted member 35478

User-requested account closure
Banned
Dec 6, 2017
1,788
I was dead set on a 3080, now I want a 6900 XT. If stock isn't an issue and the 6900 XT price really is $999, that's hard to pass up. Heck, most AIB 3080s are the OC models that are $800+ anyways. I've never owned an AMD card before, NVIDIA / EVGA owner since the 5-series cards lol. Now my PC might be full team red, 3700X in there now.
 
Nov 2, 2017
2,275
In regard to the 3070 vs 6800, I think having less RT performance is going to hit performance harder and faster than having less VRAM, at least at 1440p. In 1-2 years I expect pretty much every single AAA game to be using RT, given that consoles support it.

This also applies to the 6800 XT, where the rumours have it at 2080 Ti performance for raytracing, so if that's true then these cards won't age well at all.
 

molnizzle

Banned
Oct 25, 2017
17,695

I'd be worried if they weren't... but they really needed that tech to be ready for reveal today. DLSS is becoming increasingly common with the biggest triple A releases. I'm mainly buying a new GPU right now for Cyberpunk and Nvidia DLSS will likely give me +25% frame rate (or more) with no discernible downside.

If AMD can match or exceed that in the future (and I'm optimistic after seeing what they did with FreeSync) then we can talk, but for now... these cards just aren't compelling to me. DLSS is absolutely critical to those trying to meet or exceed 120fps in the latest games.
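Quick sanity check on what that uplift buys, with my own arithmetic and the +25% figure taken as an assumption rather than a benchmark:

```python
# Hypothetical: if DLSS adds +25% frame rate, what native frame rate
# do you need to land at 120fps? Real uplift varies per game and mode.
target_fps = 120
dlss_gain = 1.25  # assumed uplift, not a measured number

print(target_fps / dlss_gain)  # => 96.0 fps native required
```

In other words, that kind of uplift turns a 96fps card into a 120fps one, which is why it matters so much at high refresh rates.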
 

bevishead

Member
Jan 9, 2018
885
I actively avoid it. I want as close to 144fps as possible with no dips below 60, ever. Raytracing is too intensive for that.

I have a 2080, and when Fortnite added DLSS and raytracing I turned both of them on. DLSS helps a lot. Raytracing frame rates are okay as long as nothing much is going on, but as soon as a big battle along with building starts happening, the framerate tanks no matter which RTX quality setting is used.
 

TSM

Member
Oct 27, 2017
5,821
I never said RT wasn't a valid topic. I'm not sure why you think I said it wasn't something worth discussing. I said the fact he proclaimed it was going to be poor with no evidence was my issue.

It was poor enough that AMD decided not to talk about it during their hardware unveiling even though it was the elephant in the room.
 

Simuly

Alt-Account
Banned
Jul 8, 2019
1,281
For those that missed it:

[Image: Radeon RX 6900 XT 4K benchmark slide]


I predict that with Rage Mode and SAM off, the 6900 XT will be ~5% slower than the 3090. But the rumours are that the 6900 XT overclocks really well. The folks who leaked most of the performance values and info for the 6000 series have been on the money so far, and the rumour that the 60-72 CU cards at least can clock up to 2400-2500MHz is real imo.
 

dgrdsv

Member
Oct 25, 2017
11,846
AFAIK 16GB of RAM is completely useless at 1440p, so I don't know why AMD didn't go with 8 or 10GB for the 6800 and make it cheaper.
You can be sure as hell that it's coming.
$580 price on 6800-16 hints that there will inevitably be 6800-8 at some $530 down the line.
They haven't announced it today because it would perform exactly the same as 6800-16 and this would put a big question mark over the need for 16GB RAM buffers for all of these cards. No point in destroying your competitive advantage with your own hand.

and the rumour that the 60-72 CU cards at least can clock up to 2400-2500MHz is real imo
AIB models with >350W power limits and 3-slot coolers sure.
 

nitewulf

Member
Nov 29, 2017
7,195
I thought RT was shown in Godfall, as well as during the developer highlight where one of the devs demoed RT shadows? I liked the presentation. It wasn't flashy like NV's, but these are performant cards.
 

Siresly

Prophet of Regret
Member
Oct 27, 2017
6,570
They didn't show raytracing graphs, so that performance is probably less impressive. I didn't hear anything about a DLSS alternative, but I may have missed it.
Otherwise the XT appears to be at least on the same level as the 3080, while drawing a little less power and being somewhat cheaper. We'll see if independent testing confirms that.

BFV does particularly well on RDNA2 for some reason. Looks like the XT is about +15% over the 3080.

I'll probably keep my 3080 order, because DLSS seems pretty good, and it's been over five weeks since I placed the order; I'm not keen on potentially hitting reset on that waiting period, especially when it should end this week. But RDNA2 does seem competitive, so that's good, especially with Nvidia having such stock issues. Hopefully AMD will do better on that front.

Interesting.

Can't really base a purchase on "we're working on it", but yeah, with consoles onboard it seems to be only a matter of time before AMD catches up.
Certainly at least in support; most console games will eventually be using it.
 

ZeroMaverick

Member
Mar 5, 2018
4,433
I have an ASRock AB350 motherboard. Do I need to upgrade that in order to use one of these cards? PC stuff is still so confusing sometimes. Thanks in advance to anyone offering help.
 

Caz

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
13,055
Canada
I'd be worried if they weren't... but they really needed that tech to be ready for reveal today. DLSS is becoming increasingly common with the biggest triple A releases. I'm mainly buying a new GPU right now for Cyberpunk and Nvidia DLSS will likely give me +25% frame rate (or more) with no discernible downside.
The word "common" is doing some legwork.
 

BobLoblaw

This Guy Helps
Member
Oct 27, 2017
8,288
My 3090 is scheduled for delivery today. Luckily, I have until January to return it because Amazon. The next 6 weeks should be bonkers.
 

SolarPowered

Member
Oct 28, 2017
2,211
If they manage to make that feature universal and relatively easy to implement, it might just blow the NV DLSS away.
I'm expecting a lot of enthusiasm from Sony and MS on this subject since they'll definitely want that DLSS on their own consoles.
AFAIK 16GB of RAM is completely useless at 1440p, so I don't know why AMD didn't go with 8 or 10GB for the 6800 and make it cheaper.
That'll be for a 48 or 50CU 12GB variant next year at $450 or $500.
It certainly does that. As a consumer with money literally waiting to be spent on the best sub-£500 graphics on offer, I'd far rather they released a product that was competitive instead. There's much bigger volume in the $500-and-below price bracket, but they've left it completely uncontested.
You're right, but I also put 50% of the blame on the consumer for this decision. The 5700 XT came out at a good price, eventually solved its driver issues, and now trades blows with the 2070 Super, but few people bought it. If average-income PC gamers aren't biting, AMD will inevitably shift to more premium tiers for the more discerning power users who'll pay through the nose for those frames per second. It's great for those guys and gals, but not so much for the lower-tier PC gamers living that covid struggle life lol. We gotta show AMD we actually want these products next time.
 

Nooblet

Member
Oct 25, 2017
13,625
So they aren't using compute for RT but have dedicated RT hardware called RT accelerators?
That's news to me then. I wonder how it performs vs RT cores, but seeing as they didn't talk about it at all, I'm betting it doesn't perform as well, maybe on the same level as Turing or a bit worse.
 

shanew21

Member
Nov 7, 2017
516
No RT benchmarks kinda says it all, really.

These cards look great at pure raster, but no DLSS and no RT performance benchmarks means that Nvidia will be king there. For something like Cyberpunk, that's going to matter.

The VRAM is a big deal though. 16GB vs 10GB is substantial.
 

molnizzle

Banned
Oct 25, 2017
17,695
The word "common" is doing some legwork.
Perennial PC gaming giants Fortnite and Minecraft both support DLSS now. The 2 biggest AAA releases this holiday (Cyberpunk 2077 and CoD Cold War) will both support it. Watch Dogs Legion will support it.

Yes, DLSS support is becoming increasingly common. Demand for it will only increase as more people experience it too.
 

SolarPowered

Member
Oct 28, 2017
2,211
You can be sure as hell that it's coming.
$580 price on 6800-16 hints that there will inevitably be 6800-8 at some $530 down the line.
They haven't announced it today because it would perform exactly the same as 6800-16 and this would put a big question mark over the need for 16GB RAM buffers for all of these cards. No point in destroying your competitive advantage with your own hand.
No way anyone is releasing another 8GB card for more than $500 ever again. I expect AMD's 12GB 5700 XT successor to be $500 max. I'm glad we're finally moving past the 8GB era. Feels a little like the end of the 4-core era.
 

bruhaha

Banned
Jun 13, 2018
4,122
This right here is the reason - the only reason they are going for this much memory is to keep the bus fatttt

You shouldn't need the same bandwidth if there are fewer CUs and you're targeting a lower resolution. Costing $80 more than the competition and just $70 less than your other card basically makes the 6800 pointless.
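For anyone who wants the bus math, the standard GDDR6 bandwidth formula is below; the example configs are my own assumptions for illustration, not confirmed specs:

```python
# GDDR6 bandwidth in GB/s = (bus width in bits / 8) * data rate in Gbps.
def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

# A 256-bit bus means 8 chips (32 bits each); with 2GB GDDR6 modules
# that locks you into 16GB, so capacity follows the bus width.
print(bandwidth_gbs(256, 16))  # 512.0 GB/s on a 256-bit / 16Gbps config
print(bandwidth_gbs(192, 14))  # 336.0 GB/s on a cut-down 192-bit card
```

Which is the point: once you commit to a fat bus for bandwidth, the capacity comes along for the ride.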
 