My guess is that ray tracing was an add-on that MS, or Sony, or both, asked for well into the first-gen Navi design, so they couldn't add it to these cards. That met my expectations. It's just a shame they didn't talk about their ray-tracing plans.
Oh, I know... I am trying to console AegonSnake. It doesn't matter what Navi version they're using. They can scale the CU count as they desire.
People willfully ignored the math and helpful guides people constructed with tables. They set themselves up for disappointment. I don't understand the meltdowns. It's been known for weeks the 5700 was 2070-level. Furthermore, this was not a console reveal.
Personally, I am still expecting 56CU GPUs in next-gen consoles. With 48CU active.
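Quick back-of-the-envelope math for that 48-active-CU expectation. The 64 ALUs per CU and 2 FLOPs per clock (FMA) match Navi 10; the clock speeds below are pure guesses for illustration, not leaked specs.

```python
# Rough FP32 TFLOPS for an RDNA-style GPU: CUs x 64 ALUs x 2 FLOPs/clock.
# Clock speeds here are assumptions, not known console specs.

def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

for clock in (1.6, 1.8, 2.0):
    print(f"48 CUs @ {clock} GHz -> {tflops(48, clock):.2f} TF")
# 48 CUs lands in the ~9.8-12.3 TF range depending on clocks
```

So 48 active CUs only clears 10 TF if the clock holds around 1.7 GHz or higher.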
Anaconda Polar: comes with fridge-based cooling that you insert your console into.
It can - you're overestimating the amount of total data accessed during rendering of a frame. Can it hold data needed for the next frame instead of it being in memory? No, it can't.
I was mainly asserting that if a cost trade-off was made for the kind of high-speed SSDs rumored in next-gen machines, cutting some RAM would be more than worth it. Obviously none of us have the BOM to speculate on at this point. I agree with you there, but there's no HDD option in either PS5 or Xb4, and 12GB of RAM is definitely too low for a next-gen machine.
Well, if it makes you feel any better:
an RTX 2070 OC is roughly just 3-4 fps lower than a GTX 1080 Ti, tested on 9 games.
And as we saw earlier, an RX 5700 XT is slightly better than an RTX 2070, therefore on par with a GTX 1080 Ti.
And that's on a 2019 RDNA hybrid card; imagine what the pure RDNA cards of 2020, which will definitely be in the next-gen consoles, will do.
Next-gen consoles with the GPU power of a GTX 1080 Ti don't seem like a pipedream tbh.
Not a big deal IMO. That just tells you that MS has a newer or more customized version of Navi, which can only be a good thing.
Given the 5700 XT is already 180W TDP, no you're not. Yeah, you can reduce clock speeds, but then adding the extra CUs doesn't get you much other than fewer usable chips. People willfully ignored the math and helpful guides people constructed with tables. They set themselves up for disappointment.
I also expect at least 48 CU in one console and at least one console with 10+ TF.
I kinda doubt that consoles will be using an even bigger GPU than 5700, especially considering that they will get RDNA2 with RT h/w and probably other new features, which will add even more transistors to each flop. Maybe this is Navi 12... and the next-gen consoles will be using Navi 10?
The sucky part is now we have to wade through a year and a half of insanity and pastebin leaks.
I'd wager an account ban that one console will hit 10TF. Given the 5700 XT is already 180W TDP, no you're not. Yeah, you can reduce clock speeds, but then adding the extra CUs doesn't get you much other than fewer usable chips.
My guess is RDNA + RT tbh; I think RDNA 2 is 2021. I kinda doubt that consoles will be using an even bigger GPU than 5700, especially considering that they will get RDNA2 with RT h/w and probably other new features, which will add even more transistors to each flop.
Have you played RE7 in VR? Doesn't get more immersive than that, AAA or not :P
In any case, I'm really glad immersive experiences will drive next gen, as I'm a fan of both VR and SP narrative-focused games.
If we are counting based on Nvidia's TFlops measurement, you are definitely getting an account ban.
The console warriors are already pretty unbearable. This could go on till both systems' specs are openly known. And according to Matt, both are playing chicken.
That's not really a problem this next gen. We saw stuff like FFXV, Watch Dogs, Star Wars 1313 & Deep Down that needed more powerful hardware, & when PS4/XO didn't achieve that we saw a big downgrade. Luckily, now I think devs know much more about what they are getting & they are keeping more demos close to their chest. Some of you folks and your fatalistic attitudes. I really hope it dies down within a week because this won't be good for the health of anyone reading and contributing.
Personally, I had a similar breakdown when folks predicted back in late 2012 that next gen would need to have at least 2.4TF to bring groundbreaking visuals, and then it fell through when the highest-spec console was rated at 1.84TF. "Game over man, it's game over" was the prevalent attitude among many, including myself, and it was honestly a toxic circle jerk that went on for days.
Now, over 7 years later, I feel it was so silly in retrospect, and that the lesson to be learned included the fact that targeted and established hard specs are not the be-all and end-all when it comes to software. Then the mid-gen refresh introduced another paradigm shift that no one saw coming.
So, I hope the older folks here can take a longer view of things.
That feels like an excuse for RT not being ready yet. Felt like the guy's messaging around "we don't want to add extra features that will reduce performance" was a direct statement to explain the absence of ray tracing.
I appreciate this approach. It's all too easy to get caught up in the hype and disappointment assembly line.
Codenames don't matter. For all we know now RDNA2 may be exactly RDNA+RT and nothing else. It will still be more complex than Navi's RDNA.
It was a cheap stab, considering they're barely edging out NV's RTX cards on a next-gen production process without any RT, which showcases that it doesn't, in fact, slow anything down.
Wasn't the flagship card 40 CUs? I also expect at least 48 CU in one console and at least one console with 10+ TF.
We still don't know; it could just be ray tracing via compute. If both Scarlett and PS5 have hardware RT, why are people so sure they will be based off these cards, which don't?
If anything that supports the Navi 10 stuff for me. They are bigger better cards, this is just the entree.
Clocks are not final yet; that's the reason they won't say anything. If Sony is clocked at 1.8GHz, Microsoft might try to push to 1.9GHz; we will see. ITT folks are learning that an architecture can't be measured by a single number alone. This is the reason MS didn't reveal their TF number: it isn't a good descriptor of how the new console will perform. AMD has made a more efficient arch and we will see good things down the line.
The game clock is the average you get when playing most games.
Not the maximum boost clock.
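The distinction matters for the TF numbers being thrown around. A sketch using the 5700 XT's 40 CUs and its announced game and boost clocks (1755 MHz and 1905 MHz per AMD's reveal; treat them as approximate):

```python
# FP32 TFLOPS for the 40-CU 5700 XT at game clock vs. boost clock.
# Clock figures are the announced values, not measured sustained clocks.

def tflops(cus: int, clock_mhz: int) -> float:
    return cus * 64 * 2 * clock_mhz / 1e6

print(f"game clock:  {tflops(40, 1755):.2f} TF")   # ~8.99 TF
print(f"boost clock: {tflops(40, 1905):.2f} TF")   # ~9.75 TF
```

The marketing "9.75 TF" figure comes from the boost clock; what you sustain while gaming sits closer to the game-clock number.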
OK guys, so what now? The 5700 XT seems to be 220W board power, so 220W for the GPU and VRAM, right?
Then the APU can't handle that if you add 30W for the Zen 2 and extra features like RT, or more RAM? How can it be possible without downclocking the GPU, and maybe adding the extra CUs on top?
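Putting rough numbers on that question. Every figure below is an assumption for illustration (the memory/VRM share of board power, the CPU draw, the APU budget), not a spec from any leak:

```python
# Rough console power-budget sketch; all numbers are illustrative guesses.

BOARD_POWER_W = 220        # 5700 XT-class total board power, from the post above
GDDR6_AND_VRM_W = 50       # assumed share for memory + power delivery
GPU_CORE_W = BOARD_POWER_W - GDDR6_AND_VRM_W   # what's left for the GPU die

ZEN2_CPU_W = 30            # assumed 8-core Zen 2 draw at console clocks
CONSOLE_APU_LIMIT_W = 180  # assumed thermal budget for the whole APU

total = GPU_CORE_W + ZEN2_CPU_W
print(f"GPU core {GPU_CORE_W}W + CPU {ZEN2_CPU_W}W = {total}W "
      f"vs ~{CONSOLE_APU_LIMIT_W}W budget; downclock needed: {total > CONSOLE_APU_LIMIT_W}")
```

Under these guesses the desktop-clocked GPU plus CPU overshoots a plausible console budget, which is exactly why wider-but-slower (more CUs, lower clocks) configurations keep coming up.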
5700 XT is not flagship. There's clearly room for a 58xx and 59xx series of cards there.
So next gen consoles won't have VRS (Variable Rate Shading) support?
That seems like an important feature to have; it can easily gain performance with visually no graphical downgrade.
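The performance win is simple arithmetic: if part of the screen is shaded at a coarse 2x2 rate (one pixel-shader invocation covering 4 pixels), shading work drops proportionally. The 40% figure below is just an illustrative assumption, not a measured game workload:

```python
# How coarse shading (VRS) cuts pixel-shader work.

def shading_work(coarse_fraction: float, rate: int = 4) -> float:
    """Pixel-shader invocations as a fraction of full-rate shading,
    when coarse_fraction of the screen shades 1 invocation per `rate` pixels."""
    return (1 - coarse_fraction) + coarse_fraction / rate

work = shading_work(0.4)            # assume 40% of pixels at 2x2 rate
print(f"shading work: {work:.0%}")  # 70% of full rate, i.e. 30% saved
```

That's why it's attractive for peripheral regions and low-detail surfaces, where the coarser rate is hard to notice.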
Can't see Sony ignoring it; even a Naughty Dog dev said PS5 had hardware ray tracing. Imagine if PS5 is 12.9 teraflops Navi without HW RT.
But Anaconda is ~10 teraflops Navi with HW RT.
The battles would be biblical.
Scarlett reveal mentioned VRS.
MS said VRR, variable refresh rate, IIRC.
Good. I reckon PS5 should too; it's too good to miss out on.
Always good to take a break from this thread, I had to do so myself during the height of the Zen 3/Arcturus nonsense.
With a single exception, there also aren't any new graphics features. Navi does not include any hardware ray tracing support, nor does it support variable rate pixel shading. AMD is aware of the demands for these, and hardware support for ray tracing is in their roadmap for RDNA 2 (the architecture formally known as "Next Gen"). But none of that is present here.
The one exception to all of this is the primitive shader. Vega's most infamous feature is back, and better still it's enabled this time. The primitive shader is compiler controlled, and thanks to some hardware changes to make it more useful, it now makes sense for AMD to turn it on for gaming. Vega's primitive shader, though fully hardware functional, was difficult to get a real-world performance boost from, and as a result AMD never exposed it on Vega.