
Dakhil

Member
Mar 26, 2019
4,459
Orange County, CA
This week at ISSCC (the International Solid-State Circuits Conference), Microsoft presented a talk titled 'Xbox Series X SoC: A Next Generation Gaming Console', delivered by hardware engineer Paul Paternoster. The 30-minute presentation covered a lot about Microsoft's latest console processor, most of which was a repeat of what we saw at Hot Chips last August. However, there was a new element this time: a discussion of how the console design team balanced acoustics, power, thermal performance, and processor yield, covering where the hotspots in the design originate and where the performance/power targets of the final silicon were optimized.
 
Oct 27, 2017
4,927
Really interesting article, still reading it. To my uneducated ass, it's surprising that the hottest spot was actually on the CPU.

ISSCC2021-3_1-page-031_575px.jpg
 

DukeBlueBall

Banned
Oct 27, 2017
9,059
Seattle, WA
The XSX cloud GPUs are running at least 1.97 GHz, since they have a minimum of 48 CUs. Would be interesting to see the power consumption there.

Also interesting is that the XSX will use 210 watts max, based on the 15% increase over the 1X. That makes sense with the 315W PSU.
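A quick back-of-envelope in Python for where those numbers come from (RDNA 2 does 2 FP32 ops per shader per clock with 64 shaders per CU; the ~183 W One X peak I'm using as a baseline is my own assumption, not from the slides):

```python
# Back-of-envelope math for the clock and power figures above.
def tflops(cus: int, clock_ghz: float) -> float:
    # 64 shaders per CU, 2 FP32 ops per shader per clock (RDNA 2)
    return cus * 64 * 2 * clock_ghz / 1000

retail = tflops(52, 1.825)                   # ~12.15 TF, the advertised figure
cloud_clock = retail / (48 * 64 * 2 / 1000)  # clock a 48-CU part needs to match

print(f"retail: {retail:.2f} TF")                  # 12.15 TF
print(f"48-CU part needs ~{cloud_clock:.3f} GHz")  # ~1.977 GHz
print(f"power: ~{183 * 1.15:.0f} W")               # ~210 W, assuming ~183 W for the 1X
```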
 

bsigg

Member
Oct 25, 2017
22,556
Console XSX SoCs were going to be either 52 or 56 CUs to allow for higher yields, just clocked differently to meet the same performance. They ultimately decided on 52 to help with yields, disabling 2 WGPs even on chips that came back with all 28 WGPs good.

Cloud XSX SoCs can be as low as 48 CUs, like DukeBlueBall mentioned, just clocked much higher, since thermals/power are less of a constraint in a data center.

ISSCC2021-3_1-page-033.jpg



ISSCC2021-3_1-page-034.jpg
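If the binning is easier to follow as code, here's a toy sketch of the policy described above (the WGP counts and thresholds are from the slides; the function itself is just illustrative):

```python
# Toy model of the XSX die binning described above (illustrative only).
CONSOLE_WGPS = 26    # retail chips always ship with 26 WGPs (52 CUs) enabled
CLOUD_MIN_WGPS = 24  # cloud chips may run as few as 24 WGPs (48 CUs)

def bin_chip(good_wgps: int) -> str:
    """Decide where a die with `good_wgps` working WGPs (out of 28) can go."""
    if good_wgps >= CONSOLE_WGPS:
        # Even a fully working 28/28 die has 2 WGPs fused off for retail.
        return "console (26 WGPs enabled)"
    if good_wgps >= CLOUD_MIN_WGPS:
        # Salvageable for the data center by raising the clock instead.
        return "cloud (clocked higher to hit the same 12 TF)"
    return "scrap"

for wgps in (28, 27, 26, 25, 24, 23):
    print(wgps, "->", bin_chip(wgps))
```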
 
Last edited:

ekim

Member
Oct 26, 2017
3,405
Is there any way to find out the GPU clock of the console? I'd like to know (just for the sake of it) whether my Series X has 52 or 56 CUs.
 

canderous

Prophet of Truth
Member
Jun 12, 2020
8,692
Huh. Pretty cool how they're maximizing the yields by overclocking some data center chips that wouldn't make the cut for home console thermal/noise requirements.
 

ArchedThunder

Uncle Beerus
Member
Oct 25, 2017
19,068
The XSX cloud GPUs are running at least 1.97 GHz, since they have a minimum of 48 CUs. Would be interesting to see the power consumption there.

Also interesting is that the XSX will use 210 watts max, based on the 15% increase over the 1X. That makes sense with the 315W PSU.
Console XSX SoCs are either 52 or 56 CUs to allow for a higher yield, just clocked differently to meet the same performance.

Cloud XSX SoCs can be as low as 48 CUs, like DukeBlueBall mentioned, just clocked much higher, since thermals/power are less of a constraint in a data center.

ISSCC2021-3_1-page-033.jpg



ISSCC2021-3_1-page-034.jpg
Could this cause small variations in game performance?
 

bsigg

Member
Oct 25, 2017
22,556
Could this cause small variations in game performance?

No, I had to make an edit to add that they ultimately decided to roll with 26 WGPs (52 CUs) regardless of whether a chip was good on all 28 WGPs, to help increase yields.

The only place we might see a difference in performance would be the cloud, since those chips are allowed to run with as few as 24 WGPs (48 CUs).
 

The Lord of Cereal

#REFANTAZIO SWEEP
Member
Jan 9, 2020
9,652
I'm gonna be honest, I thought the whole thing about the Series X being able to run 4 Xbox One S instances in a server environment was like the coolest thing about the tech, but the fact that they have tolerances for fewer/more CUs in the server environment by adjusting clocks is also just rad as hell. I swear, every time I hear about the design considerations of the Series X (both the consumer console and the Anaconda APU for the server clusters) I'm just more and more impressed.

I also have to wonder if the flexibility in how the 12TF is derived (in the server environment, and likely also for development on the GDK) is something that helps them with backwards compatibility. I have very limited knowledge on everything, but it's all so damn interesting.
 

dgrdsv

Member
Oct 25, 2017
11,885
Not sure why people are surprised at the CPU being hotter than the GPU.
Temperature is a result of wattage, which is directly linked to clocks.
It wasn't a "surprise" that MS stumbled onto; it was their decision to clock the CPU like this that resulted in such hot spots.
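The clock-to-power link in rough form (this is just textbook dynamic-power scaling, not anything from the presentation, and the numbers are made up):

```python
# Dynamic power scales roughly as P ~ C * V^2 * f, and pushing the clock (f)
# usually also means pushing the voltage (V), so power, and therefore heat,
# grows faster than linearly with frequency.
def scaled_power(p0_watts: float, f_ratio: float, v_ratio: float) -> float:
    return p0_watts * f_ratio * v_ratio ** 2

# e.g. a +20% clock that needs a +10% voltage bump costs ~45% more power:
print(scaled_power(100.0, 1.20, 1.10))  # ~145.2 W from a 100 W baseline
```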
 

Scottoest

Member
Feb 4, 2020
11,356
I'm gonna be honest, I thought the whole thing about the Series X being able to run 4 Xbox One S instances in a server environment was like the coolest thing about the tech, but the fact that they have tolerances for fewer/more CUs in the server environment by adjusting clocks is also just rad as hell. I swear, every time I hear about the design considerations of the Series X (both the consumer console and the Anaconda APU for the server clusters) I'm just more and more impressed.

I also have to wonder if the flexibility in how the 12TF is derived (in the server environment, and likely also for development on the GDK) is something that helps them with backwards compatibility. I have very limited knowledge on everything, but it's all so damn interesting.

It's almost funny to go back and look at the OG Xbox One by contrast. It was basically just a bunch of PC components in a box, with a TON of negative space inside.

t1GJ1RXuMmxj4YMZ.medium


And they somehow couldn't manage to fit the power brick inside that!
 
Oct 27, 2017
4,927
Not sure why people are surprised at the CPU being hotter than the GPU.
Temperature is a result of wattage, which is directly linked to clocks.
It wasn't a "surprise" that MS stumbled onto; it was their decision to clock the CPU like this that resulted in such hot spots.
Yeah, it makes sense when you consider clock speed and relative die size. But at first glance it feels counterintuitive, because a Ryzen 3700X has a TDP of 65W while an RX 6800 is around 250W, and also when you think about how much bigger the Series X is than the S.
 

Karateka

Member
Oct 28, 2017
6,940
Out of curiosity, do we know the R&D costs for the XB1 vs. the XBSX? I bet the SX had a much higher budget, looking at that XB1 render.
 

Lagspike_exe

Banned
Dec 15, 2017
1,974
It's almost funny to go back and look at the OG Xbox One by contrast. It was basically just a bunch of PC components in a box, with a TON of negative space inside.

t1GJ1RXuMmxj4YMZ.medium


And they somehow couldn't manage to fit the power brick inside that!

The OG Xbox One was a rush job. Basically, they packed a laptop inside a black box and that was it. They've come a long way since then.
 

dgrdsv

Member
Oct 25, 2017
11,885
Yeah, it makes sense when you consider clock speed and relative die size. But at first glance it feels counterintuitive, because a Ryzen 3700X has a TDP of 65W while an RX 6800 is around 250W, and also when you think about how much bigger the Series X is than the S.
The 3700X is also some 100 mm^2 while the 6800 is 520 mm^2. It's hardly surprising that the CPU is the hot spot when you consider die area against wattage: the smaller the die, the hotter it gets at the same TDP.
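Putting rough numbers on that (die sizes and TDPs as quoted above, all approximate):

```python
# Back-of-envelope power density using the figures quoted in-thread.
parts = {
    "Ryzen 3700X (CPU)": (65, 100),   # (watts, die area in mm^2)
    "RX 6800 (GPU)":     (250, 520),
}
for name, (watts, area_mm2) in parts.items():
    print(f"{name}: {watts / area_mm2:.2f} W/mm^2")

# CPU: ~0.65 W/mm^2 vs GPU: ~0.48 W/mm^2 -- the smaller die runs at a higher
# power density, which is why the CPU cluster can be the hot spot on the SoC.
```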