> It is helpful, but this system doesn't eliminate fast max; you still can't go above 2.23 GHz. It just shifts the ceiling to a clock speed that's only obtainable by reducing power elsewhere, no?

No. The point is that it makes "fast max" and "heavy max" the same power draw, by dynamically reducing clockspeed in the former, where it doesn't matter. So no matter what power/cooling system you use, all of it provisions gameplay. Unlike fixed clocks, where significant wattage must be reserved for basically no benefit.
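The budget-capped behaviour can be sketched in a few lines. Everything here (the wattage budget, the cubic power model, the fit constant) is an illustrative assumption, not Sony's actual algorithm; the only real figure is the 2.23 GHz ceiling.

```python
# Sketch of power-capped frequency scaling. All constants are
# illustrative, not actual PS5 parameters. Dynamic power scales
# roughly with activity * frequency * voltage^2; voltage is folded
# into a simple cubic-in-frequency model for brevity.

POWER_BUDGET_W = 200.0   # hypothetical fixed SoC power budget
F_MAX_GHZ = 2.23         # the real frequency ceiling
FIT_K = 18.0             # arbitrary fit constant for the model

def clock_for_budget(activity, budget=POWER_BUDGET_W, f_max=F_MAX_GHZ):
    """Highest clock (capped at f_max) that keeps power <= budget,
    given workload activity, under power ~ activity * FIT_K * f^3."""
    f = (budget / (activity * FIT_K)) ** (1 / 3)
    return min(f, f_max)

# Light "fast max" workload: the ceiling clock fits inside the budget.
light = clock_for_budget(activity=0.7)
# Heavy "power virus" workload: the clock dips instead of the power rising.
heavy = clock_for_budget(activity=1.2)
print(light, heavy)
```

The point of the sketch: no headroom is ever reserved for a workload that never arrives; the heavy case simply trades a little frequency for staying inside the same envelope.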
> So this is what I don't get: if these drops are so unlikely, why do you need room to drop the clocks at all? Is that last 2% of performance really that important?

The benefit is more than 2%, which is just the raw TF gap. You're also speeding up the cache, the scheduler, texel and pixel fillrates, etc. The overall gain is still not huge, but why avoid grasping it? In edge cases, even small percentages matter: no tearing instead of minimal tearing; middle-distance enemies at full instead of half framerate; lots of particles instead of few.
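The raw-TF side of that gap can be checked with the standard FP32 throughput formula (CUs × 64 shader lanes × 2 ops per clock × frequency). The 2.18 GHz figure below is a hypothetical ~2% downclock for comparison, not a documented fallback clock:

```python
# FP32 throughput: CUs * 64 shader lanes * 2 ops/clock (FMA) * clock.
def teraflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000.0

full = teraflops(36, 2.23)     # PS5 at its frequency ceiling
reduced = teraflops(36, 2.18)  # hypothetical ~2% downclock
print(full, reduced)           # ~10.28 vs ~10.05 TF
```

Note that fillrates, cache, and the scheduler run off the same clock, which is why the real-world loss is broader than this single number suggests.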
My rough estimates appear to show that all elements are smaller in RDNA2, not just CUs (PHY, I/O, cache, etc.). If you take the difference between Arden and Navi 10 and scale it by CU count, it would suggest denser CUs, and a 40-CU chip could be as low as 293 mm^2.

But this isn't certain, and I expect Sony's may be a little bigger than 300 mm^2. It's a good ballpark, though.
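The scaling method above can be sketched as a simple linear area model. The split between per-CU area and "everything else" below is an illustrative guess chosen to land on the quoted low end, not a die-shot measurement:

```python
# Hypothetical die-area model: area = non_cu_area + cus * per_cu_area.
# Both mm^2 inputs below are assumed values to illustrate the method,
# not measured figures for any real chip.

def estimate_area(cus, non_cu_mm2, per_cu_mm2):
    """Linear area estimate: fixed uncore plus a per-CU footprint."""
    return non_cu_mm2 + cus * per_cu_mm2

# Suppose comparing Arden (56 CUs) against Navi 10 (40 CUs) implied
# roughly 3.0 mm^2 per denser CU and ~173 mm^2 of uncore (assumed):
ps5_guess = estimate_area(40, non_cu_mm2=173.0, per_cu_mm2=3.0)
print(ps5_guess)  # 293.0 mm^2, the low end of the estimate
```

Small changes to either assumed input push the result past 300 mm^2, which is why the estimate is only a ballpark.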
> I don't buy that 36 CUs was necessary for BC at all; it's not like the PS4 Pro itself deactivates 16 CUs to do it.

Yes it does (though the number of deactivated CUs is 18, not 16).
Actually, even Boost Mode doesn't turn on the other CUs. It just causes the active 18 to run at full Pro clock, rather than PS4 clock. This is why the framerate bumps are relatively modest.
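Those modest bumps track the clock ratio directly; a quick check using the known PS4 (800 MHz) and PS4 Pro (911 MHz) GPU clocks:

```python
base_clock_mhz = 800   # original PS4 GPU clock
boost_clock_mhz = 911  # PS4 Pro GPU clock, applied to the same 18 CUs

gpu_uplift = boost_clock_mhz / base_clock_mhz - 1
print(gpu_uplift)  # ~0.139, i.e. roughly 13.9% more GPU throughput
```

A ~14% GPU uplift (plus the faster CPU clock) is enough to firm up framerates in unlocked games, but nowhere near the doubling that enabling all 36 CUs would suggest.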
> I don't think so. Microsoft said absolutely everything about the specs of the XSX, but they never mentioned SmartShift. If they said everything else, why keep that to themselves? It's very likely the XSX doesn't have it.

Just because a platform holder hasn't mentioned something doesn't mean they don't have it. But in this case you're right; they might not. Of course they can't balance CPU and GPU load by changing clocks. But I was thinking they could do so based on draw: if Zen 2 is running cool, send more voltage to Navi to ensure more stable performance. But I forgot about Microsoft's Hovis Method of power matching. Their chips should already be fed the optimal voltage.
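For reference, the SmartShift-style scheme being discussed amounts to granting the GPU whatever share of a fixed budget the CPU isn't actually using. A minimal sketch with made-up wattages (none of these numbers come from either console's spec):

```python
# SmartShift-style power reallocation (all wattages illustrative).
TOTAL_BUDGET_W = 200.0
CPU_FLOOR_W = 35.0   # minimum power always reserved for the CPU

def gpu_budget(cpu_draw_w):
    """Power available to the GPU once actual CPU draw is known."""
    reserved = max(cpu_draw_w, CPU_FLOOR_W)
    return TOTAL_BUDGET_W - reserved

# CPU running cool -> more headroom shifted to the GPU:
print(gpu_budget(40.0))  # 160.0 W
print(gpu_budget(70.0))  # 130.0 W under a heavy CPU load
```

The design point is that the reallocation responds to measured draw, not to workload type, which is why it can help even when clocks themselves are fixed targets.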
> Seems like the easier way to fix that would be to use a better cooling solution.

Easier perhaps, but apparently not affordable. Microsoft went all-out on cooling the XSX, with a novel airflow design, a huge fan and exhaust surface, and a vapor chamber with a massive heatsink. And their clockspeed is notably slower anyway. Their SoC is bigger than Sony's, yes, but by a smaller percentage than the clock gap, even though Sony is necessarily also farther up the inefficiency curve.
You're forgetting Xbox 360, the biggest overheating disaster consoles have ever seen.