> Okay, so genuine question. I'm not understanding the comments being made regarding CPU/GPU peaks and clock speeds. They're called variable, as in they can fluctuate and aren't fixed, but as I understand it Cerny has confirmed both can run at max simultaneously? So why are they labelled variable? And don't just say "because they run at different clocks depending on how intensive the game is," because don't all consoles/games do that? And then I've seen comments saying "if it's a more intensive game and there isn't enough power, one or the other will lower its clocks accordingly." How does that square with both being able to run at peak performance? Is both running at peak only for very short bursts? Educate me.

The way I understood it, and people are welcome to correct me, is this: most variable frequencies in GPUs and CPUs vary based on thermals. If there's enough thermal headroom, they clock higher. If the chip gets hot, it clocks down, or "thermal throttles."
The variable frequency in the PS5 is not based on thermals, but rather power draw. Different workloads consume different amounts of power. And it's not as straightforward as "this bit looks insane graphically therefore it consumes more power." But that's above my paygrade.
Anyway, for the most part, because most workloads don't require a lot of power, both the CPU and GPU will run at max clocks. However, when a workload pushes one part harder, electrically speaking, the system will underclock the other part to free up the power that's needed.
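To make the idea concrete, here's a minimal sketch of a power-budget clock controller. This is **not** Sony's actual algorithm: the 200 W budget and the linear power-to-clock scaling are made-up assumptions purely for illustration. The only real numbers are the PS5's advertised max clocks (3.5 GHz CPU, 2.23 GHz GPU).

```python
# Illustrative sketch only, NOT the real PS5 algorithm.
# Assumption: a fixed total power budget shared by CPU and GPU, with
# clocks driven by estimated power draw of the workload, not by temperature.

POWER_BUDGET_W = 200.0  # hypothetical total SoC budget

def clocks_for(cpu_power_w: float, gpu_power_w: float,
               cpu_max: float = 3.5, gpu_max: float = 2.23):
    """Return (cpu_ghz, gpu_ghz) given the estimated watts each part
    would draw at max clocks for the current workload."""
    total = cpu_power_w + gpu_power_w
    if total <= POWER_BUDGET_W:
        # Typical case: the workload fits the budget, both run at max.
        return cpu_max, gpu_max
    # Over budget: scale back the hungrier part just enough to fit.
    # (Linear scaling is a simplification; in reality a small clock drop
    # yields a disproportionately large power drop.)
    over = total - POWER_BUDGET_W
    if gpu_power_w >= cpu_power_w:
        return cpu_max, gpu_max * (gpu_power_w - over) / gpu_power_w
    return cpu_max * (cpu_power_w - over) / cpu_power_w, gpu_max
```

For example, a workload estimated at 60 W CPU + 100 W GPU fits the budget, so both parts stay at max; a 120 W + 150 W workload is 70 W over, so the GPU (the hungrier part here) gets scaled down while the CPU keeps its full clock. The key contrast with thermal throttling is that this decision is deterministic for a given workload, regardless of room temperature.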
It also helps with cooling. Because the power limit is fixed, they know exactly how much heat the cooling system has to handle, so they avoid the weird spikes that result in terrible fan noise.