Status
Not open for further replies.

the-pi-guy

Member
Oct 29, 2017
6,301
Honestly, I think it's more likely between 2.1 and 2.23 based on the scale of a 10% power reduction. I'm not sure of the ratio, but even a 10% power reduction doesn't mean a 10% loss in clock speed. I don't think it will ever hit 2.



Less than that. Cerny said a 10% power drop = a couple percent reduction; not sure of the exact math, however.
I'm kind of expecting something like:
90% of the time: 2.23 GHz
9.9%: 2.1 GHz
0.1%: 2 GHz

I expect that it'll happen, but it'll be so rare that it's not even worth pointing out.
 

AegonSnake

Banned
Oct 25, 2017
9,566
Not to mention some people are acting like we're going to hit that worst case right out of the gate. And even in the worst case, we'd be looking at, what, a 10% drop?
He said 2% for the worst case.

That's 44 MHz, so it would drop from 2,230 MHz to 2,186 MHz.

The only issue I have with this is his statement that 2 GHz was looking impossible, and that 3.5 GHz was looking impossible. Well, the Xbox has 3.6 GHz and fixed clocks, so what did they do that made it possible with fixed clocks?

For the GPU, is he saying that some instructions run hotter than others, so they won't be running those instructions at 2 GHz or above? How would that impact performance?

We need to see the overall TDP budget of this thing. If this is a 150 W GPU, there is zero cause for concern.

EDIT: I just realized that DF said third-party devs were already running the GPU at 2.23 GHz, albeit while downclocking the CPU. But the CPU power savings are so small that I'm pretty sure the PS5 cooling system can handle the GPU running at 2.23 GHz regardless of what instructions are being fed to it. Cerny did tell Richard that the final retail unit will handle everything gracefully, without needing clock profiles or input from devs to downclock the CPU to feed the GPU enough power. It should just work.
 

the-pi-guy

Member
Oct 29, 2017
6,301
He said 2% for the worst case.

That's 44 MHz, so it would drop from 2,230 MHz to 2,186 MHz.

The only issue I have with this is his statement that 2 GHz was looking impossible, and that 3.5 GHz was looking impossible. Well, the Xbox has 3.6 GHz and fixed clocks, so what did they do that made it possible with fixed clocks?

For the GPU, is he saying that some instructions run hotter than others, so they won't be running those instructions at 2 GHz or above? How would that impact performance?

We need to see the overall TDP budget of this thing. If this is a 150 W GPU, there is zero cause for concern.
I don't recall him saying that was the worst case, just him giving an example that we shouldn't expect major drops, because frequency drops cause roughly cubic drops in power.
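For intuition on that "near cubic" claim, here's a quick sketch assuming dynamic power scales as roughly P ∝ f³ (a common approximation when voltage scales with frequency; the exact exponent for the PS5 SoC isn't public, so treat the numbers as illustrative):

```python
# Sketch of the "near cubic" power/frequency relationship discussed above,
# assuming dynamic power scales roughly as P ~ f^3 (voltage tracking frequency).
# The exact exponent for the PS5 SoC is not public; this is an approximation.

F_MAX = 2230.0  # PS5 max GPU clock in MHz

def clock_after_power_cut(power_fraction: float, exponent: float = 3.0) -> float:
    """Clock (MHz) that fits within a reduced power budget."""
    return F_MAX * power_fraction ** (1.0 / exponent)

# A 10% power reduction costs only a few percent of clock:
f = clock_after_power_cut(0.90)
drop_pct = 100.0 * (1.0 - f / F_MAX)
print(f"{f:.0f} MHz ({drop_pct:.1f}% drop)")  # ~2153 MHz, ~3.5% drop

# Conversely, the "2% worst case" mentioned above:
print(f"{F_MAX * 0.98:.0f} MHz")  # ~2185 MHz, in line with the 2,186 figure above
```

So under this approximation, shaving a couple percent off the clock buys back roughly 10% of the power, which matches the spirit of Cerny's example.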
 

Transistor

Outer Wilds Ventures Test Pilot
Administrator
Oct 25, 2017
37,343
Washington, D.C.
The only issue I have with this is his statement that 2 GHz was looking impossible, and that 3.5 GHz was looking impossible. Well, the Xbox has 3.6 GHz and fixed clocks, so what did they do that made it possible with fixed clocks?
Cerny is kind of weird with his statements sometimes, so maybe he meant that when they were originally designing the console, 2 GHz seemed impossible, but it turned out that wasn't true.
 

platocplx

2020 Member Elect
Member
Oct 30, 2017
36,084
He said 2% for the worst case.

That's 44 MHz, so it would drop from 2,230 MHz to 2,186 MHz.

The only issue I have with this is his statement that 2 GHz was looking impossible, and that 3.5 GHz was looking impossible. Well, the Xbox has 3.6 GHz and fixed clocks, so what did they do that made it possible with fixed clocks?

For the GPU, is he saying that some instructions run hotter than others, so they won't be running those instructions at 2 GHz or above? How would that impact performance?

We need to see the overall TDP budget of this thing. If this is a 150 W GPU, there is zero cause for concern.
The combination of the two, vs. just having to choose one or the other.
 

AegonSnake

Banned
Oct 25, 2017
9,566
Cerny is kind of weird with his statements sometimes, so maybe he meant that when they were originally designing the console, 2 GHz seemed impossible, but it turned out that wasn't true.
Yeah, it's the only thing that's a bit confusing in his 50-minute talk.

And I just added this as an edit to my original post, in case you missed it.

EDIT: I just realized that DF said third-party devs were already running the GPU at 2.23 GHz, albeit while downclocking the CPU. But the CPU power savings are so small that I'm pretty sure the PS5 cooling system can handle the GPU running at 2.23 GHz regardless of what instructions are being fed to it. Cerny did tell Richard that the final retail unit will handle everything gracefully, without needing clock profiles or input from devs to downclock the CPU to feed the GPU enough power. It should just work.
 

platocplx

2020 Member Elect
Member
Oct 30, 2017
36,084
I don't recall him saying that was the worst case, just him giving an example that we shouldn't expect major drops, because frequency drops cause roughly cubic drops in power.
Yeah, he didn't really put a number on what the percentage would be.
The Road to PS5 (youtu.be)

PS5 lead system architect Mark Cerny provides a deep dive into PS5's system architecture and how it will shape the future of games.

I think the clock/percentage-of-time split you posted makes sense as a minimum expectation.
 

GhostTrick

Member
Oct 25, 2017
11,463
Again, the fraction of a single core is for the Xbox Series X, which has dedicated decompression hardware. Any system without this dedicated hardware will definitely utilize more than a tenth of a CPU core.


And the point is to bring it to Windows. So no, the point isn't for the PC to "brute-force it," or to push a narrative that "thanks to secret sauce, those 8-core Zen 2 CPUs in the consoles will perform the same as 16-core Zen 2 CPUs."
As for your second sentence, nothing indicates that. In fact, in the Microsoft article, those two aspects (the hardware decompression block and the DirectStorage API) were under two different sections.
 

Transistor

Outer Wilds Ventures Test Pilot
Administrator
Oct 25, 2017
37,343
Washington, D.C.
Yeah, it's the only thing that's a bit confusing in his 50-minute talk.

And I just added this as an edit to my original post, in case you missed it.

EDIT: I just realized that DF said third-party devs were already running the GPU at 2.23 GHz, albeit while downclocking the CPU. But the CPU power savings are so small that I'm pretty sure the PS5 cooling system can handle the GPU running at 2.23 GHz regardless of what instructions are being fed to it. Cerny did tell Richard that the final retail unit will handle everything gracefully, without needing clock profiles or input from devs to downclock the CPU to feed the GPU enough power. It should just work.
Yeah, the dev units probably have the profiles built in so that devs can manually trigger "what if" scenarios and work around them.
 

platocplx

2020 Member Elect
Member
Oct 30, 2017
36,084
And the point is to bring it to Windows. So no, the point isn't for the PC to "brute-force it," or to push a narrative that "thanks to secret sauce, those 8-core Zen 2 CPUs in the consoles will perform the same as 16-core Zen 2 CPUs."
As for your second sentence, nothing indicates that. In fact, in the Microsoft article, those two aspects (the hardware decompression block and the DirectStorage API) were under two different sections.
What do you think it takes to make use of an API? Processor time.
 

AegonSnake

Banned
Oct 25, 2017
9,566
And the point is to bring it to Windows. So no, the point isn't for the PC to "brute-force it," or to push a narrative that "thanks to secret sauce, those 8-core Zen 2 CPUs in the consoles will perform the same as 16-core Zen 2 CPUs."
As for your second sentence, nothing indicates that. In fact, in the Microsoft article, those two aspects (the hardware decompression block and the DirectStorage API) were under two different sections.
Here is the full quote. I'm not creating this narrative.

The final component in the triumvirate is an extension to DirectX - DirectStorage - a necessary upgrade bearing in mind that existing file I/O protocols are knocking on for 30 years old, and in their current form would require two Zen CPU cores simply to cover the overhead, which DirectStorage reduces to just one tenth of a single core.

"Plus it has other benefits," enthuses Andrew Goossen. "It's less latent and it saves a ton of CPU. With the best competitive solution, we found doing decompression software to match the SSD rate would have consumed three Zen 2 CPU cores. When you add in the IO CPU overhead, that's another two cores. So the resulting workload would have completely consumed five Zen 2 CPU cores when now it only takes a tenth of a CPU core. So in other words, to equal the performance of a Series X at its full IO rate, you would need to build a PC with 13 Zen 2 cores. That's seven cores dedicated for the game: one for Windows and shell and five for the IO and decompression overhead."
It's saving a "ton of CPU" because of this additional hardware, not just because they updated a software API. It's basically what Epic did with UE5 when they said they rewrote the entire I/O logic to take advantage of the PS5's I/O. Yes, it's software, but it relies on dedicated PS5 I/O hardware to save on CPU processing time.

Otherwise, you could technically have DirectStorage or the PS5's Unreal Engine 5 I/O logic run on the Xbox One and PS4 while taking only a fraction of a Jaguar core. Hell, why stop there? Port DirectStorage to the Switch and have that become a 2.4 GB/s SSD.
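For reference, the core counts in the Goossen quote add up like this (a minimal sketch; the numbers are all taken from the quote itself, not independent measurements):

```python
# Arithmetic behind the Digital Foundry quote above: what a PC would need
# to match Series X I/O in software, vs. with the hardware block + DirectStorage.

SW_DECOMPRESS_CORES = 3.0   # Zen 2 cores for software decompression at full SSD rate
SW_IO_OVERHEAD_CORES = 2.0  # legacy file-I/O overhead ("another two cores")
GAME_CORES = 7.0            # cores dedicated to the game on Series X
OS_CORES = 1.0              # Windows and shell

# Software path: 5 cores of I/O work on top of game + OS.
pc_equivalent = GAME_CORES + OS_CORES + SW_DECOMPRESS_CORES + SW_IO_OVERHEAD_CORES
print(pc_equivalent)  # 13.0 -> the "13 Zen 2 cores" figure in the quote

# Hardware path: the decompression block + DirectStorage reduce that same
# I/O work to roughly a tenth of one core, per the quote.
HW_IO_CORES = 0.1
print(SW_DECOMPRESS_CORES + SW_IO_OVERHEAD_CORES - HW_IO_CORES)  # ~4.9 cores saved
```

Which is exactly the disagreement in this thread: the ~4.9-core saving comes from hardware plus API together, not from the API alone.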
 

platocplx

2020 Member Elect
Member
Oct 30, 2017
36,084
Here is the full quote. I'm not creating this narrative.


It's saving a "ton of CPU" because of this additional hardware, not just because they updated a software API. It's basically what Epic did with UE5 when they said they rewrote the entire I/O logic to take advantage of the PS5's I/O. Yes, it's software, but it relies on dedicated PS5 I/O hardware to save on CPU processing time.

Otherwise, you could technically have DirectStorage or the PS5's Unreal Engine 5 I/O logic run on the Xbox One and PS4 while taking only a fraction of a Jaguar core. Hell, why stop there? Port DirectStorage to the Switch and have that become a 2.4 GB/s SSD.
And they totally missed "The final component in the triumvirate is an extension to DirectX - DirectStorage."

it still will take some CPU time.
 

GhostTrick

Member
Oct 25, 2017
11,463
What do you think it takes to make use of an API? Processor time.


I know. The point is to reduce the CPU time.

Here is the full quote. I'm not creating this narrative.


It's saving a "ton of CPU" because of this additional hardware, not just because they updated a software API. It's basically what Epic did with UE5 when they said they rewrote the entire I/O logic to take advantage of the PS5's I/O. Yes, it's software, but it relies on dedicated PS5 I/O hardware to save on CPU processing time.

Otherwise, you could technically have DirectStorage or the PS5's Unreal Engine 5 I/O logic run on the Xbox One and PS4 while taking only a fraction of a Jaguar core. Hell, why stop there? Port DirectStorage to the Switch and have that become a 2.4 GB/s SSD.


Yes, you are. Hence your previous post. Then again, I guess we'll see in a few months.
 
1 gift from Transistor

GiftBot

Official Giveaway Bot
Verified
Mar 7, 2018
12,020

Giveaway

Restrictions:
Hello, I am bot! I come bearing 1 gift from Transistor!

This is a two-day raffle that will expire in 48 hours. The winner will be drawn at random! Any prizes leftover after the deadline will become available on a first-come first-serve basis.

Transistor said:
To all you disgustingly beautiful avatars out there, thanks for putting up with me over the past few months. I love y'all so damn much. Let's keep the next few months fun and let's not bicker about teraflops or I/Os, mmkay? Next-gen gonna be great for everyone.

P.S. if you haven't changed your avatar yet (I will be checking soon), I'd suggest you do it. It would be a shame if I had to have an admin do it and we accidentally left it on there forever :)

These are our awesome prizes:

  • PlayStation 4 (US): $50 PSN bucks - Won by Calabi (85 entries)
 

chris 1515

Member
Oct 27, 2017
7,075
Barcelona Spain
He said 2% for the worst case.

That's 44 MHz, so it would drop from 2,230 MHz to 2,186 MHz.

The only issue I have with this is his statement that 2 GHz was looking impossible, and that 3.5 GHz was looking impossible. Well, the Xbox has 3.6 GHz and fixed clocks, so what did they do that made it possible with fixed clocks?

For the GPU, is he saying that some instructions run hotter than others, so they won't be running those instructions at 2 GHz or above? How would that impact performance?

We need to see the overall TDP budget of this thing. If this is a 150 W GPU, there is zero cause for concern.

EDIT: I just realized that DF said third-party devs were already running the GPU at 2.23 GHz, albeit while downclocking the CPU. But the CPU power savings are so small that I'm pretty sure the PS5 cooling system can handle the GPU running at 2.23 GHz regardless of what instructions are being fed to it. Cerny did tell Richard that the final retail unit will handle everything gracefully, without needing clock profiles or input from devs to downclock the CPU to feed the GPU enough power. It should just work.

Again, they have a power envelope to reach, and I think he means 2 GHz wasn't possible at the power envelope/yield they wanted to reach. He said the GPU clock is capped; maybe if the power envelope were higher they could reach a higher clock. We need to wait for RDNA GPUs to be sure of it.
 

EagleClaw

Member
Dec 31, 2018
10,881
Thanks for the great giveaway, Transistor.
Already got "lucky" with the avatar, but who knows, maybe I will have more luck.
 

ShapeGSX

Member
Nov 13, 2017
5,263
Looks like MS will be repurposing their black crush POP! engine for the third time. They just love that pop.

???

Dismissing Microsoft's BC HDR technique without even looking into what they are doing is... well, it's your prerogative. But it's pretty closed-minded.

And they're working with 11-bit or 16-bit source data, not 8-bit like a TV would have access to.

All Xbox Games May Have HDR on Series X, Select XB1 Games to Get Resolution Enhancements Through BC (No Work Needed From Devs + Testing Unlocked FPS) (www.resetera.com)

I've been speaking to them about it. It's super sophisticated and extends the work done by the Coalition on Gears 5. With Gears they trained an AI to look at SDR and HDR output frames of various games and understand what the difference in colour space transform was: So HDR became the ground...

"the advantage with doing it at a source level is that it is possible to have access to the original 11-bit or 16-bit data, rather than having to apply it as a post-process to an 8-bit image - which doesn't contain enough data to avoid banding."
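A quick illustration of the banding point in that quote: the number of distinct gradations per channel grows exponentially with bit depth, which is why an 8-bit post-process has far less headroom than 11-bit or 16-bit source data when stretched across an HDR brightness range.

```python
# Distinct code values per channel at a given bit depth. Stretching 256 levels
# across an HDR range leaves visible steps (banding); 2048 or 65536 levels do not.
def levels(bits: int) -> int:
    """Distinct code values per channel at a given bit depth."""
    return 2 ** bits

for bits in (8, 11, 16):
    print(f"{bits}-bit: {levels(bits)} levels per channel")
# 8-bit: 256, 11-bit: 2048, 16-bit: 65536
```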
 

AntiMacro

Member
Oct 27, 2017
3,152
Alberta
Not to mention some people are acting like we're going to hit that worst case right out of the gate. And even in the worst case, we'd be looking at, what, a 10% drop?
You're more likely to hit the "worst case" at two points in the console's lifespan: right at the start and right at the end. At the start because documentation is spotty, there are kludged-together workarounds to get things to behave the way you think they're supposed to (or need to for the rest of the code), and it's the "whole new world" of figuring out how everything works best. At the end because you've figured all that stuff out (for the most part; there's still a lot of "that just works, leave it alone" commented blocks) and are pushing the limits, or past the limits, of what the hardware can handle.
 

disco_potato

Member
Nov 16, 2017
3,145
You're more likely to hit the "worst case" at two points in the console's lifespan: right at the start and right at the end. At the start because documentation is spotty, there are kludged-together workarounds to get things to behave the way you think they're supposed to (or need to for the rest of the code), and it's the "whole new world" of figuring out how everything works best. At the end because you've figured all that stuff out (for the most part; there's still a lot of "that just works, leave it alone" commented blocks) and are pushing the limits, or past the limits, of what the hardware can handle.
Or at any time if you're Bethesda. I assume engines like theirs, or even the RDR/GTA engine, will have issues staying under the power envelope.
 

Calabi

Member
Oct 26, 2017
3,503
He said 2% for the worst case.

That's 44 MHz, so it would drop from 2,230 MHz to 2,186 MHz.

The only issue I have with this is his statement that 2 GHz was looking impossible, and that 3.5 GHz was looking impossible. Well, the Xbox has 3.6 GHz and fixed clocks, so what did they do that made it possible with fixed clocks?

For the GPU, is he saying that some instructions run hotter than others, so they won't be running those instructions at 2 GHz or above? How would that impact performance?

We need to see the overall TDP budget of this thing. If this is a 150 W GPU, there is zero cause for concern.

EDIT: I just realized that DF said third-party devs were already running the GPU at 2.23 GHz, albeit while downclocking the CPU. But the CPU power savings are so small that I'm pretty sure the PS5 cooling system can handle the GPU running at 2.23 GHz regardless of what instructions are being fed to it. Cerny did tell Richard that the final retail unit will handle everything gracefully, without needing clock profiles or input from devs to downclock the CPU to feed the GPU enough power. It should just work.

It's the overall power budget, in my opinion, that makes the difference versus the Xbox's fixed frequencies. They have to get the whole system within a specific power envelope for the consumer PSU, heat budget, etc., and all that extra stuff (the Tempest engine, the specialised I/O interfaces) must take up a lot of that power. CPUs aren't generally using all their threads at full clocks all the time; in general it would be really difficult to saturate all of a CPU's threads all of the time. It could be that within a single millisecond the CPU finishes its jobs, downclocks, and hands its power over to the GPU, and they go backwards and forwards like that, each one downclocking and upclocking, all within the time it takes to create a single frame. But in general the CPU won't be doing much (threads idle, long milliseconds spent doing nothing), so the GPU will likely be spending that power budget to keep its max clock speed the majority of the time.
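The back-and-forth described above can be sketched as a toy shared-budget model. To be clear, every number and the scaling rule below are made up for illustration; this is not Sony's actual SmartShift behavior, which hasn't been published:

```python
# Toy model of a shared CPU/GPU power budget: an idle CPU frees budget that
# lets the GPU hold its max clock. All wattages here are hypothetical, and
# the cube-root scaling is the same P ~ f^3 approximation used earlier.

TOTAL_BUDGET_W = 200.0   # hypothetical SoC power envelope
CPU_MAX_W = 60.0         # hypothetical full-load CPU draw
GPU_MAX_W = 180.0        # hypothetical GPU draw at the 2,230 MHz cap

def gpu_clock_mhz(cpu_load: float) -> float:
    """GPU clock given CPU load in [0, 1], with clock scaling as the
    cube root of the power left over for the GPU (capped at its max)."""
    cpu_draw = CPU_MAX_W * cpu_load
    gpu_budget = min(GPU_MAX_W, TOTAL_BUDGET_W - cpu_draw)
    return 2230.0 * (gpu_budget / GPU_MAX_W) ** (1.0 / 3.0)

for load in (0.1, 0.5, 1.0):
    print(f"CPU load {load:.0%}: GPU ~{gpu_clock_mhz(load):.0f} MHz")
```

In this toy model a lightly loaded CPU leaves the GPU pegged at its cap, and even a fully loaded CPU only costs the GPU a modest clock reduction, which is the shape of behavior Cerny described.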
 

Graefellsom

Avenger
Oct 28, 2017
1,659
Transistor is more amazing than Cerny's voice.

edit: how much do I have to participate in this thread to enter the raffle?

double edit: reading this page more answered my question :D
 

androvsky

Member
Oct 27, 2017
3,531
Cerny is kind of weird with his statements sometimes, so maybe he meant that when they were originally designing the console, 2 GHz seemed impossible, but it turned out that wasn't true.
I took it to mean that the XSX simply has a larger power/thermal budget; since he mentioned the CPU, the thermals there were probably more of a known quantity early in development. From the context I did get the distinct impression he was talking about the variable clocks making the difference between hitting those speeds and not.
 

Praedyth

Member
Feb 25, 2020
6,648
Brazil
Cerny is kind of weird with his statements sometimes, so maybe he meant that when they were originally designing the console, 2 GHz seemed impossible, but it turned out that wasn't true.
2 GHz was impossible because it would have been a fixed clock, so the console would have had to run at 2 GHz under high workloads and would probably shut down.

Thanks for the giveaway, Transistor, da real MVP.

I'm kind of expecting something like:
90% of the time: 2.23 GHz
9.9%: 2.1 GHz
0.1%: 2 GHz

I expect that it'll happen, but it'll be so rare that it's not even worth pointing out.
I don't think 2 GHz is the base clock; I believe there's no base clock at all. If the code consumes so much power that it needs to run at 1.6 GHz (or whatever number), then it will run at 1.6 GHz. The catch is that devs will know when that worst-case scenario happens, and they have the choice to write better code if they really need the compute power.
 
Last edited: