
SunBroDave

Member
Oct 25, 2017
13,164
 

Convasse

Member
Oct 26, 2017
3,820
Atlanta, GA, USA
The discussion about the SSD added a few details to my understanding. Now I understand why developers would be excited about such an innovation. This pleases me greatly :3
 
Sep 19, 2019
2,282
Hamburg, Germany
Where I'm confused with this is why SmartShift would need to be used at all to move power from one to the other ("the unused portion of the budget goes to the GPU"), if there's enough power for both to run at their peaks simultaneously anyway.

That exact question came to my mind as well.

Even if the CPU doesn't need all of its power, what benefit would the GPU get from the extra power when it's already at its peak frequency?? 😳
 

bcatwilly

Member
Oct 27, 2017
2,483
They designed a good system that pushes clocks on their 36 CUs (a count no doubt chosen for BC) much higher than expected with this variable frequency approach. But realistically, real-world games that push the limits of what developers want to do next generation are going to drop below the max GPU clock at times. Cerny himself has used the general "at or near" wording here, and of course any platform holder will always put its best foot forward when presenting its hardware.
 

pswii60

Member
Oct 27, 2017
26,677
The Milky Way
You're missing the 'workload' piece of the puzzle.

Power consumption varies with workload as well as clock.

Cerny expects that in 'most' workloads, 'most of the time', clocks could be sustained by both chips at or near their peak from the system's power budget.

But sometimes that won't be the case. Some workloads are power-intensive enough that, when they're running on both the CPU and GPU at once, there isn't enough power budget to run both at their max. That's where the power management unit kicks in to manage clocks, and where SmartShift can be used to shift excess power to the GPU.
So, I suppose the question is: as we progress further into the next generation and games push the GPU and CPU much harder than they currently do, with intensive workloads of advanced graphical effects and ray-tracing, and more CPU utilisation for physics and AI etc., will "most workloads" still sustain peak clocks? Or as developers move away from cross-gen development to exclusively developing for next-gen, are they going to have to make choices more often between prioritising CPU or GPU workloads?
 

Patitoloco

Member
Oct 27, 2017
23,712
I think the boost clock idea is brilliant - using a closed environment like a console to reinvent the concept in favour of his design. I don't know how the PS5 will turn out, but Cerny (and his team) is really smart.
 

gofreak

Member
Oct 26, 2017
7,736
Developers simply need to specify the ID, the start location and end location and a few milliseconds later, the data is delivered. Two command lists are sent to the hardware - one with the list of IDs, the other centring on memory allocation and deallocation - i.e. making sure that the memory is freed up for the new data.

With latency of just a few milliseconds, data can be requested and delivered within the processing time of a single frame, or at worst for the next frame. This is in stark contrast to a hard drive, where the same process can typically take up to 250ms.

So this is a little bit new. An order of milliseconds to get data through the I/O stack. That's pretty exciting because it does open up the possibility of on-demand, intra-frame requesting of some kinds of data.

I wonder how much you could get into memory within, say, 10ms...
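For a rough sense of scale, here's a back-of-the-envelope sketch of that 10ms question, using the throughput figures Sony has quoted publicly (~5.5GB/s raw, ~8-9GB/s typical compressed); it ignores the few milliseconds of request latency itself, so treat the numbers as upper bounds:

```python
# Back-of-the-envelope: how much data a sustained stream delivers in a
# 10ms window, using Sony's publicly quoted SSD figures (assumptions:
# ~5.5GB/s raw, ~8-9GB/s typical with Kraken compression; the few-ms
# request latency itself is ignored).
RAW_GBPS = 5.5
COMPRESSED_GBPS = 8.5  # midpoint of the quoted "typical" 8-9GB/s range

def megabytes_in(window_ms: float, rate_gbps: float) -> float:
    """MB delivered by a sustained rate_gbps stream over window_ms."""
    return rate_gbps * 1000.0 * (window_ms / 1000.0)

print(f"Raw:        {megabytes_in(10, RAW_GBPS):.0f} MB in 10ms")         # ~55 MB
print(f"Compressed: {megabytes_in(10, COMPRESSED_GBPS):.0f} MB in 10ms")  # ~85 MB
```

So even half a frame's worth of latency budget could, in principle, swap in a non-trivial slice of a scene's assets.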
 

StrykerIsland

Member
Oct 25, 2017
2,160
Also, PS5 will not have VRS... or did I misinterpret the article?

Quote:

And at the nuts and bolts level, there are still some lingering question marks. Both Sony and AMD have confirmed that PlayStation 5 uses a custom RDNA 2-based graphics core, but the recent DirectX 12 Ultimate reveal saw AMD confirm features that Sony has not, including variable rate shading.
Just means Sony haven't said anything about that particular piece yet. Which is fair, there's still plenty we don't know about the machine.
 

Dierce

Member
Oct 27, 2017
3,993
I do think that variable frequency is a mistake, but if it helps the PS cinco hit $400 then that's great. I have a feeling the max potential will be very limited in use, similar to the Switch's enhanced 1.7GHz boost mode, which increases the CPU frequency during loading screens for some Nintendo-developed games and can't be used for anything else.
 

CatAssTrophy

Member
Dec 4, 2017
7,632
Texas
That just says it's not confirmed. Not that it's confirmed it does not have it.

Yeah I think there is some confusion there that I hope gets cleared up soon.

A lot of folks seem to be pushing a "VRS is an MS thing" narrative, but it's not quite that simple, IIRC. It may end up being a situation where the same tech is in PS5 but they call it something else, if all we're worrying about is the "VRS" moniker.
 

Loudninja

Member
Oct 27, 2017
42,216
About how BC works
PlayStation 4 Pro was built to deliver higher performance than its base counterpart in order to open the door to 4K display support, but compatibility was key. A 'butterfly' GPU configuration was deployed which essentially doubled up on the graphics core, but clock speeds aside, the CPU had to remain the same - the Zen core was not an option. For PS5, extra logic is added to the RDNA 2 GPU to ensure compatibility with PS4 and PS4 Pro, but how about the CPU side of the equation?

"All of the game logic created for Jaguar CPUs works properly on Zen 2 CPUs, but the timing of execution of instructions can be substantially different," Mark Cerny tells us. "We worked to AMD to customise our particular Zen 2 cores; they have modes in which they can more closely approximate Jaguar timing. We're keeping that in our back pocket, so to speak, as we proceed with the backwards compatibility work."
 

pswii60

Member
Oct 27, 2017
26,677
The Milky Way
Yeah I think there is some confusion there that I hope gets cleared up soon.

A lot of folks seem to be pushing a "VRS is an MS thing" narrative, but it's not quite that simple, IIRC. It may end up being a situation where the same tech is in PS5 but they call it something else, if all we're worrying about is the "VRS" moniker.
VRS isn't a MS thing.

VRS Tier 2 has a MS patent but only relating to DXR. I've no doubt Sony will have the same for PS5 as it'll be a standard part of RDNA 2.0.
 

Fezan

Member
Oct 26, 2017
3,274
"So, when I made the statement that the GPU will spend most of its time at or near its top frequency, that is with 'race to idle' taken out of the equation - we were looking at PlayStation 5 games in situations where the whole frame was being used productively. The same is true for the CPU, based on examination of situations where it has high utilisation throughout the frame, we have concluded that the CPU will spend most of its time at its peak frequency."

This may be a better clue that games will have the CPU running at near-peak frequency more of the time than the GPU when heavy graphical scenes are displayed
No, he already stated that the GPU will mostly run at max clock. Here he's giving an example that the CPU will also run at max frequency when "race to idle" is taken out of the equation.
 

GymWolf86

Banned
Nov 10, 2018
4,663
Before I watch the video: is this still based on the old Cerny presentation, or have they talked with him/Sony recently to get more accurate details?
 

CatAssTrophy

Member
Dec 4, 2017
7,632
Texas
That exact question came to my mind as well.

Even if the CPU doesn't need all of its power, what benefit would the GPU get from the extra power when it's already at its peak frequency?? 😳

My understanding is that if you have, for example, a GPU running at 2GHz and it isn't actually doing anything or processing any games, it will only consume X watts; but as soon as you make it process a game or a benchmark or a workload of some sort, it will start consuming more watts, especially if it needs to maintain that clock speed and not drop it.

Since the PS5 power supply can only offer so many watts, it's possible that if the CPU and GPU are both at max frequency and running heavy enough workloads in a game, they will require more power than the supply can give them, so one of them (or both) can either slightly decrease clocks, or one can say "hey, I don't need as many watts as you, so here, have some of mine".

Does that sound close to being accurate, folks that understand this stuff? lol
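That's roughly the model the article describes. A toy sketch of the idea - every number here is invented for illustration, and the allocation rule is a guess, not Sony's actual algorithm:

```python
# Toy model of a shared power budget with SmartShift-style reallocation.
# Every number here is invented for illustration; Sony hasn't published
# the real figures, and the actual allocation logic is surely smarter.
TOTAL_BUDGET_W = 200.0   # hypothetical total SoC power budget
CPU_PEAK_DRAW_W = 60.0   # hypothetical CPU draw at peak clock, full load

def allocate(cpu_demand_w: float, gpu_demand_w: float) -> tuple[float, float]:
    """CPU gets what it asks for (capped at its peak draw); any unused
    headroom in the budget is shifted to the GPU."""
    cpu_w = min(cpu_demand_w, CPU_PEAK_DRAW_W)
    gpu_w = min(gpu_demand_w, TOTAL_BUDGET_W - cpu_w)
    return cpu_w, gpu_w

# Light CPU frame: the GPU soaks up the leftover headroom.
print(allocate(cpu_demand_w=35.0, gpu_demand_w=170.0))  # (35.0, 165.0)

# Both chips hammered: combined demand exceeds the budget, so the GPU
# gets less than it asked for and must drop clocks slightly to fit.
print(allocate(cpu_demand_w=60.0, gpu_demand_w=170.0))  # (60.0, 140.0)
```

And to the earlier question about what extra watts do for a GPU already at peak clock: they don't push it any faster, but they determine whether that peak clock can be held at all once the workload gets heavier.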
 

Phellps

Member
Oct 25, 2017
10,811
That was a great read. Sounds like the PS5 is prioritizing efficiency as a way to get the best out of its raw power, which could pay off. Looking forward to finding out more; hopefully Sony doesn't wait too long to show up again.
 

Loudninja

Member
Oct 27, 2017
42,216

gofreak

Member
Oct 26, 2017
7,736
So, I suppose the question is: as we progress further into the next generation and games push the GPU and CPU much harder than they currently do, with intensive workloads of advanced graphical effects and ray-tracing, and more CPU utilisation for physics and AI etc., will "most workloads" still sustain peak clocks? Or as developers move away from cross-gen development to exclusively developing for next-gen, are they going to have to make choices more often between prioritising CPU or GPU workloads?

That's the big question - although one thing Sony/Cerny may have in mind here is that I'm not sure how often games stress the CPU and GPU equally. Some are maxing out their frametimes on both, perhaps with 'power intensive' workloads, but I've had the impression that a game is often bound by one more than the other. In that case, where a game is bound by one to some degree more than the other, the machine could be throttling one in favour of the other and you'd never notice it in the framerate.

But of course, I'm sure you'll have games at some point which more or less never hit peak clocks on either chip because of the power intensity of their workloads. Cerny would presumably cast doubt on how common that will be, but of course he could be wrong.
 

CatAssTrophy

Member
Dec 4, 2017
7,632
Texas
VRS isn't a MS thing.

VRS Tier 2 has a MS patent but only relating to DXR. I've no doubt Sony will have the same for PS5 as it'll be a standard part of RDNA 2.0.

Cool, thank you! Things can get confusing when different companies start using their own labels for stuff that might be openly available for all to use.

i.e. variable refresh rate/G-Sync/FreeSync etc.
 

Bearly_There

Member
Mar 16, 2020
30
There's nothing at all new here; DF are currently starved for content. I'm not criticising them - they have to make a living. They'll have more interesting things to say once there's more news to actually report on. I'm already so bored of people speculating inexpertly over the minutiae of how devs will optimise for PS5. The basic CPU and GPU performance picture is clear: both PS5 and XSX are clearly much better than even the "pro" models of last gen, and XSX has a little more CPU and GPU oomph than PS5, which is unlikely to translate into any significant perceptible difference for gamers. The basic questions, in my opinion, seem to be:

1. Is either box going to have trouble with cooling or fan noise?
2. Is one going to be meaningfully more capable of ray tracing than the other (probably the XSX, if so), and will this difference translate into meaningfully different gaming experiences?
3. Will the PS5's investments in data bandwidth translate into meaningfully better gaming experiences compared to XSX or high-specced PCs?
4. Will audio in next-gen gaming actually not suck? Because I do feel like it is rather underwhelming on PS4 (maybe not true for xbox, I couldn't say).
5. Is Half-Life Alyx coming to PSVR? Because if so I'll actually buy that kit!!! As a satisfied PS4 owner, I expect to buy a PS5, but Half Life Alyx would seal the deal, with a presumed PSVR2 thrown in as well.
 

Pryme

Member
Aug 23, 2018
8,164
Cerny's use of the word 'potentially' is concerning here. But we'll see in due time.

Nice article; it does a lot to detail the advantages the Tempest engine brings to 3D sound compared to Dolby's efforts, for example. Unfortunately it seems it's geared to headphones only at launch, so you don't get the full experience if you're gaming with spectators. Impressive nonetheless, and I hope it works well with my HyperX Cloud.
 

Decarb

Member
Oct 27, 2017
8,643
No, it's not.

Alex said what is elaborated on in the article - that some developers DF spoke to said they were using locked profiles in order to keep the GPU at 2.23GHz all the time. Cerny makes a slightly different point here - that he expects processing to run most of the time 'at or near' peak clocks when the chip is busy.

If devs want a 100% locked GPU clock even when not busy, they have a debug profile that lets them do that. And the only way to guarantee it would be to use one of those profiles. But it's a debug profile rather than a release profile it seems.

So Alex was right, and Cerny was right, but they're talking in slightly different contexts and with different degrees of precision I think re. how tightly the clock stays pinned to peak.
It's such a simple thing, and yet it caused so much warring in the other thread, and headaches for the DF guys, who were called out and labelled as biased PR for one company.
 

gundamkyoukai

Member
Oct 25, 2017
21,148
So this is a little bit new. An order of milliseconds to get data through the I/O stack. That's pretty exciting because it does open up the possibility of on-demand, intra-frame requesting of some kinds of data.

I wonder how much you could get into memory within, say, 10ms...

Yeah, that was an interesting point about keeping things like sound - and the bag example from Spidey - in RAM.
Makes you wonder just how many different types of data they can pull on the fly instead of keeping it in RAM, per se.
 

True_fan

Banned
Mar 19, 2020
391
Too much "secret sauce" and people theorizing capabilities. Unless Sony is going to magically redesign the system, they may as well showcase what they have. It's amazing how much they have bumbled communication and sony have been masters at such.
 

space_nut

Member
Oct 28, 2017
3,306
NJ
"Developers don't need to optimise in any way; if necessary, the frequency will adjust to whatever actions the CPU and GPU are performing, - Mark Cerny..." DF "Right now, it's still difficult to get a grip on boost and the extent to which clocks may vary. "

By next GDC we'll probably have a good idea of how variable the frequencies are in next-gen games
 
Last edited:

WhtR88t

Member
May 14, 2018
4,588
I'm wondering if the PS5 will be a much smaller box than the Series X.

They keep talking about the cooling solution being really unique; pair that with the variable clock rates, and maybe the console itself ends up being much smaller and running cooler than the Series X or previous PlayStations?

Maybe if it's smaller it's cheaper to produce too? Or that makes up for the cost of their SSD?
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
Given that devs can set max GPU clocks with dialed-down CPU clocks, I'm guessing that's what we'll see the majority of the time. We might not see a "traditional" game be CPU-bound despite those slightly lower clocks, though the games that do push the CPU will be interesting - strategy games and those with a lot of simulation.
 

Deleted member 2441

User requested account closure
Banned
Oct 25, 2017
655
On Richard's point re: a games-focused reveal, I really wonder when that's going to happen considering the situation with the virus. Will be interesting to see what format that's in. It certainly won't be a big press event...
 

nelsonroyale

Member
Oct 28, 2017
12,128
Too much "secret sauce" and people theorizing capabilities. Unless Sony is going to magically redesign the system, they may as well showcase what they have. It's amazing how much they have bumbled communication and sony have been masters at such.

Not really. It's fairly clear what Cerny is claiming; we just need to see how it actually works with software.
 

gundamkyoukai

Member
Oct 25, 2017
21,148
"Developers don't need to optimise in any way; if necessary, the frequency will adjust to whatever actions the CPU and GPU are performing,...Right now, it's still difficult to get a grip on boost and the extent to which clocks may vary. "

By next GDC we'll probably get a good idea how variable frequencies are on nextgen games

Nah, it would be the GDC after that one, since next year is more cross-gen games than anything else, which really won't push the system.
 

PLASTICA-MAN

Member
Oct 26, 2017
23,624
Dark1x Can you confirm this?

So PS5's variable GPU and CPU speeds mean that running a very light game or app will make both run at much lower clocks - so less risk of overheating when running small games, areas, or apps - while XSX will run everything at fixed clocks all the time regardless of the power needed?
Isn't it kinda risky for the XSX to run at such heat all the time, even if the cooling system is good enough? That may wear out components much faster.
 
Dec 31, 2017
1,430
VRS isn't a MS thing.

VRS Tier 2 has a MS patent but only relating to DXR. I've no doubt Sony will have the same for PS5 as it'll be a standard part of RDNA 2.0.
Except both Sony and MS are using "custom" RDNA 2, so there are no guarantees here. VRS is a big feature when it comes to optimization and performance for future games; MS, Nvidia and AMD aren't pushing it that hard in the PC space for no reason. You'd think Sony would have mentioned it in their tech reveal considering how important it is.
 

WhtR88t

Member
May 14, 2018
4,588
Too much "secret sauce" and people theorizing capabilities. Unless Sony is going to magically redesign the system, they may as well showcase what they have. It's amazing how much they have bumbled communication and sony have been masters at such.
I think the bumbling has a lot to do with marketing The Last of Us 2 and Ghost of Tsushima for later this year. They need those games to look fantastic, but showing PS5 exclusives that blow them out of the water graphically now would possibly take some hype away from them.
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
On Richard's point re: a games-focused reveal, I really wonder when that's going to happen considering the situation with the virus. Will be interesting to see what format that's in. It certainly won't be a big press event...
I wouldn't want that regardless of the virus. Big-stage shows are becoming passé, I feel. A video and a journalist venue for hands-on is good enough for me.
 

True_fan

Banned
Mar 19, 2020
391
I think the bumbling has a lot to do with marketing The Last of Us 2 and Ghosts of Tsushima for later this year. They need those games to look fantastic, but showing PS5 exclusives that blow them out of the water graphically now would possibly take some hype away from them.
That's perfectly OK though. Fans would buy those games and be super hyped about how PS5 games will look.
 

Bearly_There

Member
Mar 16, 2020
30
Cool, thank you! Things can get confusing when different companies start using their own labels for stuff that might be openly available for all to use.

i.e. variable refresh rate/G-Sync/FreeSync etc.

Variable rate shading, or its functional equivalent, is an important optimisation method for virtual reality uses. It might even be in place already for PSVR. Either way, assuming Sony is still committed to VR, I'd be very surprised if the PS5 doesn't facilitate some version of the same basic principle, regardless of whether or not PS5 and XSX share the exact same implementation in silicon.

VRS is massively overhyped for non-VR purposes anyway. It's a handy optimisation that can deliver decent incremental performance improvements in many situations, but compared to checkerboarding, temporal interpolation, machine learning interpolation etc., it's not that significant. I thought it was weird as hell that when MS revealed the XSX specs they chose to make VRS the second bullet point after "12 TF". I mean, MS gave VRS hype priority over the CPU, SSD, or raytracing. That's just random as hell IMHO.
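For anyone unfamiliar with what VRS actually does: the hardware shades some screen tiles at a reduced rate (one shading result reused across a block of pixels) where full detail won't be noticed, e.g. in the periphery for the foveated VR case mentioned above. A conceptual sketch - the thresholds and rates here are invented for illustration, not any console's actual implementation:

```python
import math

# Conceptual sketch of variable rate shading: shade peripheral screen
# tiles at a coarser rate than tiles near the focal point. Thresholds
# and rates are invented for illustration only.
RATES = {"1x1": 1.0, "2x2": 0.25, "4x4": 0.0625}  # fraction of pixels shaded

def tile_rate(tile_center, gaze, inner_radius=0.2, outer_radius=0.5):
    """Full-rate shading near the gaze point, coarser toward the periphery.
    Coordinates are normalized to [0, 1] screen space."""
    d = math.dist(tile_center, gaze)
    if d < inner_radius:
        return "1x1"
    return "2x2" if d < outer_radius else "4x4"

print(tile_rate((0.50, 0.50), gaze=(0.5, 0.5)))  # 1x1 - full detail
print(tile_rate((0.95, 0.90), gaze=(0.5, 0.5)))  # 4x4 - 1/16th the shading work
```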
 

jroc74

Member
Oct 27, 2017
28,999

Basically, lol
So this is a little bit new. An order of milliseconds to get data through the I/O stack. That's pretty exciting because it does open up the possibility of on-demand, intra-frame requesting of some kinds of data.

I wonder how much you could get into memory within, say, 10ms...
Yeah, this was interesting and new info.

The part about the file system - is it going to be different for the entire console, or just for game development? Trying to wrap my head around what that actually means.
 

Dark1x

Digital Foundry
Verified
Oct 26, 2017
3,530
Dark1x Can you confirm this?

So PS5's variable GPU and CPU speeds mean that running a very light game or app will make both run at much lower clocks - so less risk of overheating when running small games, areas, or apps - while XSX will run everything at fixed clocks all the time regardless of the power needed?
Isn't it kinda risky for the XSX to run at such heat all the time, even if the cooling system is good enough? That may wear out components much faster.
I don't see how this would wear out components any faster. This really isn't an issue with modern processors. It's not something that could or should be used as a point of argument in any discussion, I'd say.

Of current machines, only Switch is a concern for me as it won't boot without some charge in the battery which means long-term usage will always require a functional battery.
 

chipperrip

Member
Jan 29, 2019
434
Where did DF get all this new info?

You could have at least read the article's preamble.

"A couple of days prior to the talk going live, Digital Foundry spoke in depth with Cerny on the topics covered. Some of that discussion informed our initial coverage, but we have more information. A lot more."

"A few times throughout the conversation, Cerny suggested further research, one of the reasons we didn't (indeed, couldn't) go live straight away after the event"
 
OP

Wollan

Mostly Positive
Member
Oct 25, 2017
8,816
Norway but living in France
"GPUs process hundreds or even thousands of wavefronts; the Tempest engine supports two," explains Mark Cerny. "One wavefront is for the 3D audio and other system functionality, and one is for the game. Bandwidth-wise, the Tempest engine can use over 20GB/s, but we have to be a little careful because we don't want the audio to take a notch out of the graphics processing. If the audio processing uses too much bandwidth, that can have a deleterious effect if the graphics processing happens to want to saturate the system bandwidth at the same time."

I was theorizing earlier today that the Tempest engine was actually one of the 4 idle CUs on the APU, utilizing the second 20GB/s Onion bus (same as on PS4) to bypass the L1 and L2 caches and avoid sharing bandwidth with the other CUs. Maybe this is the case?
The Onion bus (as it is called on PS4) is utilized for asynchronous compute needs, hence the "take a notch out of the graphics processing" mention by Cerny, I'm sure.
edit: Or maybe that's just the overall power budget he's referring to.

From the other thread:
OK, I was just curious whether there was a secondary bus somewhere on the APU so that bandwidth wouldn't need to be shared with the other CUs (hence the L1 cache not being needed), and so they could potentially use one of the idle CUs as the Tempest CU. The PS4 APU has a secondary 20GB/s bus so that the L2 and L1 caches can be bypassed (which the PS5 APU likely has as well, due to BC concerns).
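For context on Cerny's bandwidth concern, here's the quoted Tempest figure set against the PS5's published 448GB/s memory bandwidth (simple arithmetic on the publicly stated numbers):

```python
# Putting Cerny's "over 20GB/s" Tempest figure in context against the
# PS5's published 448GB/s total GDDR6 bandwidth.
TOTAL_BW_GBPS = 448.0   # published PS5 memory bandwidth
TEMPEST_BW_GBPS = 20.0  # Cerny's quoted audio figure

print(f"Audio share of total bandwidth: {TEMPEST_BW_GBPS / TOTAL_BW_GBPS:.1%}")
# ~4.5% - small in isolation, but it's contended bandwidth, which is why
# Cerny warns about audio "taking a notch out of" graphics when the GPU
# wants to saturate the bus at the same time.
```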
 
Last edited:

xem

Member
Oct 31, 2017
2,043
It's such a simple thing, and yet it caused so much warring in the other thread, and headaches for the DF guys, who were called out and labelled as biased PR for one company.
And in the video, Richard immediately follows that by saying, "The fixed profiles that devs are talking about are only used in PS5 dev kits, and in retail games their code will tap into the variable clocks for extra performance."
It's important to have the entire context.