
M.Bluth

Member
Oct 25, 2017
4,240
Okay so genuine question. I'm not understanding the comments being made regarding CPU/GPU peaks and clock speeds. They're called variable, as in can fluctuate/are not fixed, but then as I understand it Cerny has confirmed both can run at max simultaneously? So why are they labelled variable? And don't just say "cause they run at different clocks depending on how intensive the game is" cause, like, don't all consoles/games do that? And then I've seen comments saying "if it's a more intensive game and there isn't enough power, one or the other will lower its clocks accordingly". How does that pair up with both being able to run at peak performance? Is both running at peak only for very short bursts? Educate me.
The way I understood it, and people are welcome to correct me, is this: most variable frequencies in GPUs and CPUs vary based on thermals. If there's enough thermal headroom, they clock higher. If the chip gets hot, it clocks down, or "thermal throttles."
The variable frequency in the PS5 is not based on thermals, but rather on power draw. Different workloads consume different amounts of power, and it's not as straightforward as "this bit looks insane graphically, therefore it consumes more power." But that's above my paygrade.
Anyway, for the most part, because the workloads don't require a lot of power, both CPU and GPU will run at max clocks. However, when a workload pushes one part harder, electrically speaking, the system will underclock the other part to push the needed current into the one being stressed.

And it all helps with their cooling, too. They know they always have to cool whatever power limit they've set, so they're also avoiding the weird spikes that result in terrible fan noise.
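
To make that concrete, here's a toy sketch of the general idea in Python. It is emphatically not Sony's actual algorithm; the budget, wattages, and the trim-the-GPU policy are all invented purely for illustration:

```python
# Toy sketch of fixed-power-budget clock arbitration. NOT Sony's actual
# algorithm; every number and the trim policy are invented.

TOTAL_BUDGET_W = 200.0  # assumed combined CPU+GPU power budget

def estimated_power(peak_w, activity):
    """Power estimated from workload activity counters, not from
    temperature (the key difference from thermal throttling)."""
    return peak_w * activity

def arbitrate(cpu_activity, gpu_activity,
              cpu_peak_w=60.0, gpu_peak_w=180.0):
    """Return (cpu_clock_scale, gpu_clock_scale); 1.0 means max clock."""
    cpu_w = estimated_power(cpu_peak_w, cpu_activity)
    gpu_w = estimated_power(gpu_peak_w, gpu_activity)
    if cpu_w + gpu_w <= TOTAL_BUDGET_W:
        return 1.0, 1.0  # typical case: both parts hold max clocks
    # Over budget: trim the GPU just enough to fit (one policy of many;
    # the real system could trim either side). Assumes power ~ clock^2,
    # so a small clock drop frees a lot of power.
    over_w = cpu_w + gpu_w - TOTAL_BUDGET_W
    gpu_scale = (max(0.0, gpu_w - over_w) / gpu_w) ** 0.5
    return 1.0, gpu_scale

print(arbitrate(0.6, 0.7))  # (1.0, 1.0): both at max clocks
print(arbitrate(1.0, 1.0))  # GPU drops to ~0.88 of max clock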
 

褲蓋Calo

Alt-Account
Banned
Jan 1, 2020
781
Shenzhen, China
Okay so genuine question. I'm not understanding the comments being made regarding CPU/GPU peaks and clock speeds. They're called variable, as in can fluctuate/are not fixed, but then as I understand it Cerny has confirmed both can run at max simultaneously? So why are they labelled variable? And don't just say "cause they run at different clocks depending on how intensive the game is" cause, like, don't all consoles/games do that? And then I've seen comments saying "if it's a more intensive game and there isn't enough power, one or the other will lower its clocks accordingly". How does that pair up with both being able to run at peak performance? Is both running at peak only for very short bursts? Educate me.
Maybe both can execute NOP at peak frequency.
 

Phellps

Member
Oct 25, 2017
10,798
One thing I'm still not clear on... does the power limit on the PS5 allow for both the CPU and GPU to run at their highest/capped frequency? I'm guessing no, or otherwise there would be no need to divert power.

If not, then what I'd be curious to know is: what is the highest possible CPU frequency when the GPU is running at max frequency, and what is the highest possible GPU frequency when the CPU is running at max frequency?
Well, my guess is as good as anyone's, but I'm thinking there is a power budget based on the workload and that the system dynamically sends power to where it's most needed? Cerny said there is enough power output to sustain both CPU and GPU at their max clocks, so I'm thinking this is the "max power budget" situation.

I don't have a lot of knowledge on the subject, but it's kind of what I could gather from this.
 

Brees2Thomas

Member
Dec 27, 2019
1,525
Shame Cerny doesn't comment on the DF findings or give any specific answers.

No answer on the lower clock bounds, no follow-up on his claim that a low number of CUs at a higher clock beats a higher number of slower CUs, which isn't seen in practice. No follow-up on how RDNA 1 (and basically all GPUs) doesn't scale very well with frequency when you don't also increase the bandwidth (and for a 5700, going over 1.9GHz is already non-linear).
Correct me if I misheard him, but in this video, Paul says he thinks the only thing that would cause either the CPU or GPU to drop from their respective clock speeds is a heavy workload that would never occur during gaming.
 

the-pi-guy

Member
Oct 29, 2017
6,269
Okay so genuine question. I'm not understanding the comments being made regarding CPU/GPU peaks and clock speeds. They're called variable, as in can fluctuate/are not fixed, but then as I understand it Cerny has confirmed both can run at max simultaneously? So why are they labelled variable? And don't just say "cause they run at different clocks depending on how intensive the game is" cause, like, don't all consoles/games do that? And then I've seen comments saying "if it's a more intensive game and there isn't enough power, one or the other will lower its clocks accordingly". How does that pair up with both being able to run at peak performance? Is both running at peak only for very short bursts? Educate me.
The Xbox Series X is clocked at a particular frequency. It always runs at the frequency it is set at.
 

mikehaggar

Developer at Pixel Arc Studios
Verified
Oct 26, 2017
1,379
Harrisburg, Pa
Alright, so now I'm under the impression that the power shifting and down-clocking has mainly been implemented as a means of controlling the temperature(s) inside the box.
 

Axel Stone

Member
Jan 10, 2020
2,771
I've got to say, I am still struggling a bit to understand the variable frequencies thing. I sometimes think I've understood it, but then it gets hazy again.

I think I've just reached a point with all of this where I'm willing to take it on faith that it isn't going to be causing issues and that it isn't some attempt to make the PS5 seem more powerful than it is. I think that's the best I can hope for.
 

Lukas Taves

Banned
Oct 28, 2017
5,713
Brazil
Correct me if I misheard him, but in this video, Paul says he thinks the only thing that would cause either the CPU or GPU to drop from their respective clock speeds is a heavy workload that would never occur during gaming.

I think that any game that aims for stable framerates is going to leave power on the table, so that when the game is under stress it can still maintain them.

Dynamic resolution, VRS, and more non-graphical workloads running on the GPU might push overall utilization higher, though. So it's hard to say how this will actually work in practical terms.
 

mikehaggar

Developer at Pixel Arc Studios
Verified
Oct 26, 2017
1,379
Harrisburg, Pa
That seems to be the case.
That is why looking at their cooling system and console design is going to be very interesting.

Yeah, so I feel like spending so much time talking about it really added to a lot of the misunderstanding, all of the "well, how powerful is it really?" and all of the speculation about how much weaker it is than the Series X.
 

low-G

Member
Oct 25, 2017
8,144
Alright, so now I'm under the impression that the power shifting and down-clocking has mainly been implemented as a means of controlling the temperature(s) inside the box.

Ultimately it's literally not for anything else, and that's the reason all devices have dynamic frequencies.

From the start, the difference with the PS5 is that they are setting a constant heat output level which will not vary. Only dissipation (fan speed, environment, dust) will vary. That's different from other devices, which may get extremely hot or overheat if truly maxed out.
 

Hey Please

Avenger
Oct 31, 2017
22,824
Not America
Alright, so now I'm under the impression that the power shifting and down-clocking has mainly been implemented as a means of controlling the temperature(s) inside the box.

This is one of those things that confuses me. I thought the point of providing constant power was to be able to predict the heat output of the machine on a consistent basis and build a cooling solution around it. The downclocking occurs when processing complex chunks of information which, without the aforementioned reduction in frequency, would mean exceeding the power budget (so the power ceiling is artificially capped) and producing more heat than the cooling solution was designed to contain.
 
OP

Wollan

Mostly Positive
Member
Oct 25, 2017
8,808
Norway but living in France
This is not 20 GB/s of bandwidth from the GPU or the Onion bus linking CPU and GPU. This is 20 GB/s from the unified memory.
I still don't understand why it's out of the question that the Tempest CU could be one of the four (or however many) disabled CUs utilizing the equivalent of the Onion bus (to bypass caches and to access RAM at 20GB/s+). I want to understand though. :)

Cerny PS5 DF article:
"The Tempest engine itself is, as Cerny explained in his presentation, a revamped AMD compute unit, which runs at the GPU's frequency...
...GPUs process hundreds or even thousands of wavefronts; the Tempest engine supports two," explains Mark Cerny. "One wavefront is for the 3D audio and other system functionality, and one is for the game. Bandwidth-wise, the Tempest engine can use over 20GB/s, but we have to be a little careful because we don't want the audio to take a notch out of the graphics processing. If the audio processing uses too much bandwidth, that can have a deleterious effect if the graphics processing happens to want to saturate the system bandwidth at the same time."
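
For a sense of scale, some quick arithmetic of my own (using the PS5's published 448GB/s GDDR6 system bandwidth, which isn't in the quote above):

```python
# Quick arithmetic (mine, not from the article), using the published
# 448GB/s GDDR6 system bandwidth, to put "over 20GB/s" in perspective.

total_bw_gbs = 448
tempest_bw_gbs = 20  # treating "over 20GB/s" as roughly 20

print(f"{tempest_bw_gbs / total_bw_gbs:.1%} of system bandwidth")  # ~4.5%
```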
 

Axel Stone

Member
Jan 10, 2020
2,771
But Cerny said frequency wasn't based on thermals but power draw, did he not?

Yes, but, heat is surely the thing that can damage the console, no?

There's no particular reason that I can think of that you would need to keep your power draw under a certain level as long as your system is built to handle it. Too much heat, on the other hand, can start melting components.

I'm sure I'm missing something here, I'm just not sure what.
 

androvsky

Member
Oct 27, 2017
3,500
I still don't understand why it's out of the question that the Tempest CU could be one of the four (or however many) idle CUs utilizing the equivalent of the Onion bus (to bypass caches and to access RAM at 20GB/s+). I want to understand though. :)

Cerny PS5 DF article:
"GPUs process hundreds or even thousands of wavefronts; the Tempest engine supports two," explains Mark Cerny. "One wavefront is for the 3D audio and other system functionality, and one is for the game. Bandwidth-wise, the Tempest engine can use over 20GB/s, but we have to be a little careful because we don't want the audio to take a notch out of the graphics processing. If the audio processing uses too much bandwidth, that can have a deleterious effect if the graphics processing happens to want to saturate the system bandwidth at the same time."
What idle CU? You mean the disabled ones? Those are disabled at random if there's a defect in one, so it'd mean every CU would have to be designed to be able to act like the audio engine.
 

androvsky

Member
Oct 27, 2017
3,500
Yes, but, heat is surely the thing that can damage the console, no?

There's no particular reason that I can think of that you would need to keep your power draw under a certain level as long as your system is built to handle it. Too much heat, on the other hand, can start melting components.

I'm sure I'm missing something here, I'm just not sure what.
Pretend the PS5 is fanless and they're building towards a portable for the Slim revision.
 
OP

Wollan

Mostly Positive
Member
Oct 25, 2017
8,808
Norway but living in France
What idle CU? You mean the disabled ones? Those are disabled at random if there's a defect in one, so it'd mean every CU would have to be designed to be able to act like the audio engine.
That's my point really. Maybe there isn't that much extra custom silicon (as in none) required for the Tempest CU since it's essentially just another CU (chosen amongst the disabled ones) that solely uses the Onion bus equivalent (being locked to this data path). It would be cost beneficial if it's technically possible.
 

lukeskymac

Banned
Oct 30, 2017
992
The PS5 will be 60% smaller (compared to the XSX) and only 10-14% less powerful.

I don't think it will be much smaller, if at all. Going small and fast on silicon means they'll need plenty of space for cooling. This is rather a cost tradeoff, I think: it's cheaper to deal with the heat than to use a larger chip. The savings are either being used entirely to pay for the SSD, or also to get to a $50 price difference, IMO.
 

Axel Stone

Member
Jan 10, 2020
2,771
Power draw is related to thermals.
Except with the caveat that the temperature of the room doesn't affect the power draw.

It's like it uses the power draw to estimate the ideal PS5.

But the temperature of the room does affect the heat inside the console, and if there's too much heat inside the console, that could potentially cause issues, couldn't it?
 

Brees2Thomas

Member
Dec 27, 2019
1,525
Yes, but, heat is surely the thing that can damage the console, no?

There's no particular reason that I can think of that you would need to keep your power draw under a certain level as long as your system is built to handle it. Too much heat, on the other hand, can start melting components.

I'm sure I'm missing something here, I'm just not sure what.
That's why Cerny said the power is capped. Frequencies would down-clock to prevent that "melting of components" from happening.
 

androvsky

Member
Oct 27, 2017
3,500
But the temperature of the room does affect the heat inside the console, and if there's too much heat inside the console, that could potentially cause issues, couldn't it?
It'll just run hotter, like any console in a hot room. Cerny said it won't change clocks based on measured temperature.

If it gets too hot, it'll shut down like any modern console that gets too hot.
 

anexanhume

Member
Oct 25, 2017
12,912
Maryland
I prefer the sound produced by the Mega Drive's Yamaha chip over the SPC-700 in Super NES. Both can sound great in the right hands but Sega has the edge.

Also, the 68000? That's one heck of a CPU.

I don't think people realize just how pervasive the 68000 was in consoles and PCs of yore. It was everywhere.
Well, the article clearly disproved the idea that the CPU and GPU would have to trade off performance to run at the higher clocks, which was something that seemed to be the case from the presentation.

I also thought the test normalizing the TFLOPS between the two GPUs, one with 36 CUs and one with 40 CUs, was pretty interesting. I would not have thought you would see any material performance change when the FLOPS were the same, but that was an interesting result.

Also nice to clear up what appeared to be conflicting information where both sides turned out to be right (hearing from developers about the need to "lock" clocks being a feature of the devkit vs. final HW).

It looks like they have done something super interesting here which I admit I'm still wrapping my head around. Nice to get a little more detail and clarity on some of these points from DF.
It is an interesting point and kind of counter to what Cerny suggests. However, we don't know if that holds true for RDNA 2, or if it was a characteristic of the architecture that would have been known at the time of conception. Perhaps they made some changes (e.g. cache scrubbers) to tip the balance.
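
For anyone wondering what "normalizing the TFLOPS" looks like on paper: the FP32 formula below (CUs x 64 lanes x 2 ops per clock x frequency) is the standard RDNA one, but the exact 2.007GHz pairing is my own arithmetic, not necessarily DF's test setup:

```python
# Back-of-envelope TFLOPS normalization across CU counts.

def tflops(cus, clock_ghz):
    # FP32 TFLOPS = CUs * 64 lanes * 2 ops/clock * frequency (GHz) / 1000
    return cus * 64 * 2 * clock_ghz / 1000

print(tflops(36, 2.230))  # ~10.28 TF: fewer CUs at a higher clock (PS5-like)
print(tflops(40, 2.007))  # ~10.28 TF: more CUs at a lower clock
```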
 

Brees2Thomas

Member
Dec 27, 2019
1,525
It'll just run hotter, like any console in a hot room. Cerny said it won't change clocks based on measured temperature.

If it gets too hot, it'll shut down like any modern console that gets too hot.
But as far as I understand it, it would only overheat IF the room temperature was too hot, due to power draw being capped.
 

Axel Stone

Member
Jan 10, 2020
2,771
It'll just run hotter, like any console in a hot room. Cerny said it won't change clocks based on measured temperature.

If it gets too hot, it'll shut down like any modern console that gets too hot.

Fair enough, that makes sense.

I mean, I'm still confused more generally, but I think I simply don't know enough about computer science to have this explained to me in a way that won't have me feeling confused.
 

androvsky

Member
Oct 27, 2017
3,500
So how do you control that if you're a developer?
Control what, the clocks? They only change based on workload, so it'll be like any GPU that's good at some things but bad at others. It's just that, deep down, one of the things it's bad at is "getting hot".

But as far as I understand it, it would only overheat IF the room temperature was too hot, due to power draw being capped.
Yes, the context was a room that's too hot.
 

rokkerkory

Banned
Jun 14, 2018
14,128
Wait, so if the XSX GPU also output 10.28TF like the PS5, the XSX GPU would achieve a tiny bit more? Based on the DF test?
 

Draper

The Fallen
Oct 28, 2017
4,280
Harrisburg, PA
I'm on board with whatever Sony does because of their first-party line-up, but god does this current approach to everything have me a bit worried.

Whatever, I guess. Just give me God of War.
 

Goron2000

Banned
Oct 27, 2017
542
User Banned (1 Week) - Conspiracy Theories
It sounds like they have boost clocks and are trying to rewrite the narrative to make people talk about the boosted frequency as the baseline rather than the standard running frequency.

"No, no, no. This isn't a 3.5GHz cpu that boosts to 4.6GHZ. This is a brand-new innovation, it's a 4.6GHz cpu that can lower itself to 3.5GHz"
 

Penny Royal

The Fallen
Oct 25, 2017
4,158
QLD, Australia
One thing I'm still not clear on... does the power limit on the PS5 allow for both the CPU and GPU to run at their highest/capped frequency? I'm guessing no, or otherwise there would be no need to divert power.

If not, then what I'd be curious to know is: what is the highest possible CPU frequency when the GPU is running at max frequency, and what is the highest possible GPU frequency when the CPU is running at max frequency?

Literally the first post in the thread has the answer to this.
 

Bluelote

Member
Oct 27, 2017
2,024
Do we know if the CPUs of these consoles have the same amount of L3 cache as the PC Zen 2 CPUs?

If they removed the L3 it would save a lot of space and some power; it would be a lot slower, but still "Zen 2 cores".
 

Hawk269

Member
Oct 26, 2017
6,043
Sorry, I didn't mean you yourself were worried. I was just responding to people saying the components could melt or the machine could overheat and shut down.
I kind of doubt that they would allow a dev to push it hard enough to make it shut down. If it does shut down, it would be because of the environmental temps where the unit is placed. I do a ton of overclocking on my PC for both the CPU and the GPU, and I have both on a custom water loop. Temps alone don't hold back the clocks; it's the voltage limitation built into the hardware that prevents them from going even faster.

I am sure Sony did a ton of testing with whatever cooling solution they have and then locked the clocks at what Cerny stated, as that is the highest threshold they could hit while still being stable. Stability is crucial: if they had not done this, we would be playing graphically intensive games with graphic/visual oddities. This happens a lot on PC with an improperly tested GPU overclock... the game runs, but at some point you start to get visual/graphic corruption because the OC is not stable. I am sure Sony pushed it that far and also ensured it was stable.

Regardless of whether a game hits max CPU and GPU at the same time, I believe Cerny and his team planned this stuff out. But the big thing is that WE WILL NEVER KNOW whether the GPU or the CPU is down-clocking. There are no tools available to anyone outside of a game dev that show what speed a console game's CPU/GPU is hitting or what temp it is running at. So for all those worried about it... how would you know if it is hitting 10.3TF or 9.2 or anything in between?
 

Illusion

Banned
Oct 27, 2017
8,407
Alright, so now I'm under the impression that the power shifting and down-clocking has mainly been implemented as a means of controlling the temperature(s) inside the box.
Cerny's clarification was that Sony wants the highest quality fans, without worrying about what variables to set the fan at. So Sony doesn't want to compromise the console's parts with cheap solutions, but they are willing to compromise the console's power with power-shifting and downclocking?

If I'm understanding correctly, please correct me if I'm wrong.
 

McScroggz

The Fallen
Jan 11, 2018
5,971
So, as somebody who isn't a tech guy, does this ridiculously dumbed-down explanation make some sense?

You have two tasks for the GPU (or CPU) and both use the max allowable frequency to perform the task. One task is simple, so it can be done quickly and efficiently with down time, which means there are times when the PS5 is not hitting the overall power consumption cap. The other task is complex, so the PS5 is constantly working on it, which hits the power consumption cap; at times, if the CPU were also to hit its max, that would exceed the threshold, so one of them lowers its power consumption (which reduces frequency).

I know there are very technical examples and explanations, but I'm trying to wrap my head around the concept because that's about as far as I'll get. If what I'm describing (reductively) is accurate, then how the console is presented makes sense, as does why they say it has variable frequencies despite being able to run the GPU and CPU at max frequencies simultaneously.
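
That's roughly how I read it too. Here's a toy per-frame trace of that idea (every wattage invented for illustration):

```python
# Toy per-frame trace (all wattages invented): the cap only bites in
# the rare frames where both parts spike at once.

BUDGET_W = 200.0
frames = [  # (cpu_demand_w, gpu_demand_w) at full clocks
    (45.0, 120.0),
    (50.0, 150.0),
    (60.0, 170.0),  # both spike together
    (40.0, 130.0),
]

for cpu_w, gpu_w in frames:
    demand = cpu_w + gpu_w
    if demand <= BUDGET_W:
        print(f"{demand:5.1f} W demanded: both stay at max clocks")
    else:
        print(f"{demand:5.1f} W demanded: {demand - BUDGET_W:.0f} W over "
              "budget, one part sheds a few percent of clock")
```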
 

Nostremitus

Member
Nov 15, 2017
7,772
Alabama
So, the Tempest Engine has the equivalent capability of all 8 Jaguar cores in the PS4 working on audio and nothing else?
 

mikehaggar

Developer at Pixel Arc Studios
Verified
Oct 26, 2017
1,379
Harrisburg, Pa
Cerny's clarification was that Sony wants the highest quality fans, without worrying about what variables to set the fan at. So Sony doesn't want to compromise the console's parts with cheap solutions, but they are willing to compromise the console's power with power-shifting and downclocking?

If I'm understanding correctly, please correct me if I'm wrong.

It's not compromising the console's power. It's a different method of managing power consumption and thermals. And it has the additional benefit of being able to send some extra power to either the CPU or the GPU if the other is not using all of its allocated power budget.
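
A tiny illustration of that last point, with invented numbers:

```python
# Tiny illustration (invented numbers): budget the CPU leaves unused
# can be handed to the GPU under the same total cap.

BUDGET_W = 200.0
GPU_NOMINAL_W = 140.0  # assumed nominal GPU share of the budget

cpu_draw_w = 40.0                      # CPU well under its share
gpu_ceiling_w = BUDGET_W - cpu_draw_w  # 160 W available to the GPU
print(f"GPU may draw up to {gpu_ceiling_w:.0f} W "
      f"({gpu_ceiling_w - GPU_NOMINAL_W:+.0f} W vs nominal)")
```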