No, it's not.
16% vs close to 40%
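A quick sanity check on those percentages, using the publicly quoted peak figures (assumed here: PS5 10.28 TF vs XSX 12.15 TF, and last gen PS4 1.84 TF vs Xbox One 1.31 TF). Note the number changes depending on which direction you frame it:

```python
# Compare peak TFLOPS gaps two ways: "X% weaker" vs "Y% stronger".
# Figures are the publicly quoted peaks, not measured performance.

def deficit_percent(weaker, stronger):
    """How much weaker the lower-TF console is, relative to the stronger one."""
    return (1 - weaker / stronger) * 100

def advantage_percent(stronger, weaker):
    """How much stronger the higher-TF console is, relative to the weaker one."""
    return (stronger / weaker - 1) * 100

print(f"PS5 vs XSX:      {deficit_percent(10.28, 12.15):.1f}% weaker")   # ~15.4%
print(f"XSX vs PS5:      {advantage_percent(12.15, 10.28):.1f}% stronger")  # ~18.2%
print(f"Xbox One vs PS4: {deficit_percent(1.31, 1.84):.1f}% weaker")     # ~28.8%
print(f"PS4 vs Xbox One: {advantage_percent(1.84, 1.31):.1f}% stronger")    # ~40.5%
```

So "16%" and "close to 40%" are both roughly right, but only if you use the "weaker" framing for this gen and the "stronger" framing for last gen.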
And this time around the 16% weaker console has the better memory architecture, unlike the PS4, which had both the better memory and the higher TFLOPS.
According to Cerny, it will run at that speed the vast majority of the time, only dropping a "couple of percent" on rare occasions.
10.28TF is the absolute base/default for the GPU.
Don't forget that the GPU clock affects rasterization speed and some other parameters. 10.3 TFLOPS at 2.23 GHz can be as efficient as 12 TFLOPS at 1.8 GHz. Plus, Sony seems to have very good memory management, which could affect final performance. I think the macro-level optimisation in the PS5 can be as effective as the raw power of the XSX.
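For context on why clock and TF aren't interchangeable: the TFLOPS figure is just a product of shader count and clock, so it only measures peak shader math. A minimal sketch using the standard RDNA formula (64 shaders per CU, 2 FP32 ops per clock via FMA) and the publicly stated configs (PS5: 36 CUs at 2.23 GHz; XSX: 52 CUs at 1.825 GHz, both assumed from the official reveals):

```python
# Peak FP32 TFLOPS = CUs * 64 shaders/CU * 2 ops/clock (FMA) * clock in GHz / 1000.
# This counts only shader throughput; fixed-function units (rasterizers, caches,
# command processor) scale with clock alone, which is the PS5's argument.

def peak_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

ps5 = peak_tflops(36, 2.23)    # ~10.28
xsx = peak_tflops(52, 1.825)   # ~12.15
print(f"PS5: {ps5:.2f} TF, XSX: {xsx:.2f} TF")
```

The PS5 gets its number from a ~22% higher clock on fewer CUs, so the parts of the chip that run at clock speed rather than shader-count speed close some of the gap. Whether that makes the two "equally efficient" is the open question.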
Guys and gals, I understand that we're all a little riled up about things not going as expected for whatever your preferred system is, but please stop spreading misinformation to fit your narrative.
10.28 is the max for the GPU when it is at full boost. We don't know how long it can stay there, or the percentage of performance lost when it drops, since we don't know the base TF or frequency. It might be 90% of the time, it might be 60%, or it might vary depending on the air you feed your system.
Next, 10.28 RDNA 2 TF does not equal 12.15 RDNA 2 TF in any world, so please stop saying this. The Xbox GPU is better; whether it will be taken advantage of, who knows, but it is what it is. Let's see what shakes out, just like the PS5 SSD is superior to the Xbox Series X's. Will it be taken advantage of? We'll see.
Lastly, we don't know which memory configuration is better, and it will likely come down to the size of a game, the type of game, and dev preferences. If a dev only needs 10 GB of RAM, the Xbox offers the faster solution; if a dev needs more, maybe they prefer the PS5's setup of it all being the same speed rather than two speeds. Wait until games release and we hear from devs.
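To make the memory tradeoff concrete, here is a crude blended-bandwidth estimate, assuming the publicly stated configs (XSX: 10 GB at 560 GB/s plus 6 GB at 336 GB/s; PS5: 16 GB at a uniform 448 GB/s). The `slow_fraction` model below is a simplification I'm introducing for illustration, not how the hardware actually schedules traffic:

```python
# XSX split memory: effective bandwidth depends on what fraction of traffic
# hits the slow 6 GB pool. Modeled as a weighted harmonic mean (time-based mix).

XSX_FAST_GBPS = 560.0  # 10 GB pool
XSX_SLOW_GBPS = 336.0  # 6 GB pool
PS5_GBPS = 448.0       # uniform 16 GB

def xsx_effective_bandwidth(slow_fraction):
    """Blended bandwidth when slow_fraction of accesses go to the slow pool."""
    fast_fraction = 1.0 - slow_fraction
    return 1.0 / (slow_fraction / XSX_SLOW_GBPS + fast_fraction / XSX_FAST_GBPS)

# If a game mostly stays in the fast 10 GB, XSX wins on bandwidth;
# the more it leans on the slow pool, the closer it drops toward PS5's figure.
for f in (0.0, 0.2, 0.4):
    print(f"slow_fraction={f:.1f}: {xsx_effective_bandwidth(f):.0f} GB/s vs PS5 {PS5_GBPS:.0f} GB/s")
```

Under this toy model the crossover sits somewhere around a third of traffic going to the slow pool, which is why "it depends on the game" is the honest answer.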
Accept the facts. Argue as much as you'd like about what's better, but please stop making things up to make either side look better or worse; it doesn't help anyone.