Status
Not open for further replies.

Wereroku

Member
Oct 27, 2017
6,256
Wait, IIRC there were reports that PS5 was the easiest PS console to develop for thus far (Mr. Cerny's Road to PS5 mentioned PS5 having the shortest time to triangle- of course that alone is not indicative of overall ease of development or otherwise).

Furthermore, isn't the APU's design supposed to be deterministic? So performance analysis tools will function the same way (to ascertain a game's performance) on PS5 as they do on PS4.
I mean, if you want to give him the benefit of the doubt: devs will learn which workloads cause downclocking, and to get the most performance possible they can manage their use of those. However, I don't think that is really required.
 

Drandony

Alt Account
Banned
Jun 22, 2020
37
Wait, IIRC there were reports that PS5 was the easiest PS console to develop for thus far (Mr. Cerny's Road to PS5 mentioned PS5 having the shortest time to triangle- of course that alone is not indicative of overall ease of development or otherwise).

Furthermore, isn't the APU's design supposed to be deterministic? So performance analysis tools will function the same way (to ascertain a game's performance) on PS5 as they do on PS4.

Develop, maybe, but optimize? No. It's way harder to optimize for PS5 than for PS4 or Xbox Series X.
This is due to the variable clocks. Cerny even said that devs need to optimize their engine around those variable clocks. They would need to do this for a single platform, which is highly unlikely for a multiplatform game.
Mark Cerny sees a time where developers will begin to optimise their game engines in a different way - to achieve optimal performance for the given power level. "Power plays a role when optimising. If you optimise and keep the power the same you see all of the benefit of the optimisation. If you optimise and increase the power then you're giving a bit of the performance back. What's most interesting here is optimisation for power consumption, if you can modify your code so that it has the same absolute performance but reduced power then that is a win. "

In short, the idea is that developers may learn to optimise in a different way, by achieving identical results from the GPU but doing it faster via increased clocks delivered by optimising for power consumption. "The CPU and GPU each have a power budget, of course the GPU power budget is the larger of the two," adds Cerny. "

With Xbox Series X they won't need to do this, because the Series X's clocks are sustained under any circumstances; this does not apply to PS5. Devs would need to do something different to optimize for PS5, and no, this is NOT simple or easy.
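Cerny's quoted point above is that a power optimization keeps the result identical while drawing less power. A minimal, hypothetical sketch of that idea (not anything from an actual console SDK): a busy-wait and a blocking wait both observe the same signal, but the busy-wait flips transistors constantly while the blocking wait lets the core idle and downclock.

```python
import threading
import time

def wait_busy(flag):
    # Busy-wait: maximum activity while doing no useful work.
    while not flag["done"]:
        pass

def wait_blocking(event):
    # Blocking wait: the core can idle (and downclock) until signalled.
    event.wait()

# Both approaches see the signal; only their power profiles differ.
event = threading.Event()
flag = {"done": False}

def fire():
    time.sleep(0.05)
    flag["done"] = True
    event.set()

threading.Thread(target=fire).start()
wait_blocking(event)   # returns once the event fires
wait_busy(flag)        # returns once the flag flips
print("both waits completed; same result, different power cost")
```

The observable outcome is identical either way, which is exactly what "same absolute performance but reduced power" means in the quote.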
 

Magio

Member
Apr 14, 2020
647
Before we knew the specs of either console, Tom Warren was on Twitter saying that the Xbox would be optimized better because "Microsoft has decades of experience on DirectX". I don't get why anyone would care about anything he has to say past that point; dude really thought Microsoft's experience with APIs for game development would differentiate them from the company that made the PlayStation consoles. Clearly those guys don't know how to make an API!
 

jroc74

Member
Oct 27, 2017
29,003
Develop, maybe, but optimize? No. It's way harder to optimize for PS5 than for PS4 or Xbox Series X.
This is due to the variable clocks. Cerny even said that devs need to optimize their engine around those variable clocks. They would need to do this for a single platform, which is highly unlikely for a multiplatform game.


With Xbox Series X they won't need to do this, because the Series X's clocks are sustained under any circumstances; this does not apply to PS5. Devs would need to do something different to optimize for PS5, and no, this is NOT simple or easy.
Oh lord, here we go again....
 

Unkindled

Member
Nov 27, 2018
3,247
Develop, maybe, but optimize? No. It's way harder to optimize for PS5 than for PS4 or Xbox Series X.
This is due to the variable clocks. Cerny even said that devs need to optimize their engine around those variable clocks. They would need to do this for a single platform, which is highly unlikely for a multiplatform game.


With Xbox Series X they won't need to do this, because the Series X's clocks are sustained under any circumstances; this does not apply to PS5. Devs would need to do something different to optimize for PS5, and no, this is NOT simple or easy.
You forgot about the follow-up interview where he clarified his statement.
Developers don't need to optimise in any way; if necessary, the frequency will adjust to whatever actions the CPU and GPU are performing. I think you're asking what happens if there is a piece of code intentionally written so that every transistor (or the maximum number of transistors possible) in the CPU and GPU flip on every cycle. That's a pretty abstract question, games aren't anywhere near that amount of power consumption. In fact, if such a piece of code were to run on existing consoles, the power consumption would be well out of the intended operating range and it's even possible that the console would go into thermal shutdown. PS5 would handle such an unrealistic piece of code more gracefully.
 

jroc74

Member
Oct 27, 2017
29,003
You forgot about the follow-up interview where he clarified his statement.
See, I was thinking about this... but even reading the quote in the other post, Cerny explains what he meant right after that bolded part. I'm still re-reading it to see what other ways it could be interpreted, because he mentions power consumption in that quote too.

This does clarify it tho.
 

Shambala

Banned
Oct 26, 2017
1,537
User Banned (Permanent): Modwhining and doubling down on a ban for hostility; long history of modwhining and hostility
So Tom Warren is not to be questioned?
Yup, I got banned for calling him out on his "journalism" with baseless research, spouting nothing but FUD. I got banned like 15 mins after. Don't question "verified" members, I guess...
 

behOemoth

Member
Oct 27, 2017
5,629
Develop, maybe, but optimize? No. It's way harder to optimize for PS5 than for PS4 or Xbox Series X.
This is due to the variable clocks. Cerny even said that devs need to optimize their engine around those variable clocks. They would need to do this for a single platform, which is highly unlikely for a multiplatform game.
That's not true at all. The PS5 devkits have different profiles to sustain the clock rate of the processors to ease the optimization of the code (if needed at all). You can use the same workflow, using CPU or GPU profilers for optimization, as for the PS4 or XBone. The variable clock rate doesn't have to be considered by the developers at all. Since the whole SoC (i.e. CPU and GPU) knows when code is running (flipping bits and consuming more power), it will clock the core to its set maximum. If no activity is detected, it will decrease the clock rate to save power.
Thus, the varying of the clock speed is deterministic, because it is based on the activity of the code. Intel, for instance, also uses boost clocks and throttling, but there it's based solely on whether the cooling system can handle the current power output.
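To make the determinism point concrete, here's a toy model (my own illustration, not Sony's actual algorithm): if the clock is a pure function of workload activity against a fixed power budget, the same code gets the same frequency on every console, every run. The constants and the P ∝ f³ approximation are assumptions for the sketch.

```python
MAX_GHZ = 2.23  # PS5 GPU frequency ceiling (from Cerny's Road to PS5 talk)

def clock_for_activity(activity):
    """Return the GPU clock (GHz) for a given workload activity level.

    `activity` is the power the workload would draw at max clock,
    normalised so 1.0 exactly fills the power budget. Dynamic power
    is approximated as proportional to f^3.
    """
    if activity <= 1.0:
        return MAX_GHZ                              # within budget: full clock
    return MAX_GHZ * (1.0 / activity) ** (1 / 3)    # shave clock to fit budget

# Deterministic: identical workloads always get identical clocks,
# regardless of room temperature or cooling -- unlike thermal boost.
light = clock_for_activity(0.7)
heavy = clock_for_activity(1.2)
print(f"light workload: {light:.3f} GHz, pathological workload: {heavy:.3f} GHz")
```

A tool profiling a game against a model like this would report the same numbers on any unit, which is the contrast with Intel-style thermally driven boost.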
 

•79•

Banned
Sep 22, 2018
608
South West London, UK
Yup, I got banned for calling him out on his "journalism" with baseless research, spouting nothing but FUD. I got banned like 15 mins after. Don't question "verified" members, I guess...
Judging by the bans, it would seem so. Mods don't allow persistent targeting of any specific member. I'll be honest and say that I don't know (or necessarily agree with) where the line is on that here given the relevance of his commentary to the subject of the thread.
Some clarification would be great.
 

AegonSnake

Banned
Oct 25, 2017
9,566
So Tom Warren is not to be questioned?
Judging by the bans, it would seem so. Mods don't allow persistent targeting of any specific member. I'll be honest and say that I don't know (or necessarily agree with) where the line is on that here given the relevance of his commentary to the subject of the thread.

Well, it seems you can discuss his tweets, but you can't call him a warrior. I checked all four bans and they were all for accusing him of being a console warrior.

The rest of the posts discussing the contents of his posts didn't get banned. Seems fair enough.

Tbh, I've been called a warrior before and told I'm stupid, dumb, and that I should piss off from this very thread, and no one got banned for antagonizing me.
 

jroc74

Member
Oct 27, 2017
29,003
Yup, I got banned for calling him out on his "journalism" with baseless research, spouting nothing but FUD. I got banned like 15 mins after. Don't question "verified" members, I guess...
That's hilarious...

Anyway, the other part that was left out in that quote was this:

"If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU. That's what AMD calls SmartShift. There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."

That came directly after.

That's not true at all. The PS5 devkits have different profiles to sustain the clock rate of the processors to ease the optimization of the code (if needed at all). You can use the same workflow, using CPU or GPU profilers for optimization, as for the PS4 or XBone. The variable clock rate doesn't have to be considered by the developers at all. Since the whole SoC (i.e. CPU and GPU) knows when code is running (flipping bits and consuming more power), it will clock the core to its set maximum. If no activity is detected, it will decrease the clock rate to save power.
Thus, the varying of the clock speed is deterministic, because it is based on the activity of the code. Intel, for instance, also uses boost clocks and throttling, but there it's based solely on whether the cooling system can handle the current power output.
This too. This is all in the same DF article.

"Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power," explains Cerny.

Know what came right before all this in the article?

Feedback from developers highlighted two areas of concern. The first was the idea that not all PS5s will run in the same way, something that the Model SoC concept addresses. The second was the nature of the boost: would frequencies hit a peak for a set amount of time before throttling back? That is how smartphone boost tends to operate.

PlayStation 5 uncovered: the Mark Cerny tech deep dive (www.eurogamer.net)

Feedback from developers...questions about how it all works...

I know we like discussing things in articles, but sometimes it would be good to include stuff before and after the point you want to discuss. It helps the discussion IMO.
 

gundamkyoukai

Member
Oct 25, 2017
21,154
Well, it seems you can discuss his tweets, but you can't call him a warrior. I checked all four bans and they were all for accusing him of being a console warrior.

The rest of the posts discussing the contents of his posts didn't get banned. Seems fair enough.

Tbh, I've been called a warrior before and told I'm stupid, dumb, and that I should piss off from this very thread, and no one got banned for antagonizing me.

Aegon, I see you're playing TLOU 2. Did you notice how impressive the clothing work was in that game?
It's crazy to think what ND will do on next gen.
 

gundamkyoukai

Member
Oct 25, 2017
21,154
Lol, actually I didn't. I was so blown away by the foliage, the rain and the lighting that I ignored everything else.

Fog + fire + volumetric lighting = best visuals ever.

It's funny, because it's so impressive that people didn't notice how impressive it is; it just looks normal.
When characters take off clothes and things like that, games don't normally show it, because it's hard to do without clipping, janky animation, etc.
ND made it look completely natural.
 

AegonSnake

Banned
Oct 25, 2017
9,566
It's funny, because it's so impressive that people didn't notice how impressive it is; it just looks normal.
When characters take off clothes and things like that, games don't normally show it, because it's hard to do without clipping, janky animation, etc.
ND made it look completely natural.
Yeah, I saw you guys talking about the cutscene where she takes her shirt off. I didn't even think about how it had no clipping.
Try posting some FUD on Twitter. May get you a different result.

*shrug*
lol
brb writing a poem.
 

MrKlaw

Member
Oct 25, 2017
33,076
Just because it's deterministic doesn't mean it is clear at the outset. Yes, when a developer sees the results, those will be the same on any console. But they may find some workloads resulting in lower clocks and therefore lower performance, and may not know that until they try.

However, if Cerny's description is accurate this will be rare, and most workloads should run at full clocks.

So Warren's comments aren't necessarily inaccurate, but I do feel the 'variable clocks' are often presented as though they behave like regular PC clocks, which they don't. It would be nice to see more balanced reporting on that, versus what often appears to be fear-mongering.
 

Deleted member 10747

User requested account closure
Banned
Oct 27, 2017
1,259
The question was already answered and didn't require your input.
One, I was being sarcastic and agreeing with you.
Two, I wasn't that far along with reading this thread.
Three, sorry that I responded and quoted you. Good to know where my place is. Thank you, my master.

This place is becoming more horrible and vile by the minute.
 

Lady Gaia

Member
Oct 27, 2017
2,480
Seattle
Just because it's deterministic doesn't mean it is clear at the outset. Yes, when a developer sees the results, those will be the same on any console. But they may find some workloads resulting in lower clocks and therefore lower performance, and may not know that until they try.

This myth that code normally executes in trivially predictable amounts of time keeps getting perpetuated by these discussions. It might be true on a DSP, or something as ancient as a 6502, but on modern high-performance processors and operating systems you always have to test code to determine how cache misses, branch mispredictions, context switching, virtual memory interactions, and many other factors impact your code's performance. There's a good reason why measurement is the gold standard for figuring out where and why your code performs the way it does. This isn't new to even moderately skilled developers. Stop this madness and listen to what developers are focused on, not casual observers.
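The "measurement is the gold standard" point above can be shown with a tiny, generic example (my own, not from any console toolchain): two functionally identical routines whose relative speed depends on the interpreter, allocator and cache behaviour, which is exactly why you profile instead of guessing.

```python
import timeit

# Two ways to build the same string. Which is faster depends on runtime
# internals (string reallocation, generator overhead), not on reading
# the source -- so the only honest answer comes from measuring.

def build_concat(n):
    s = ""
    for i in range(n):
        s += str(i)   # repeated reallocation (usually optimised in CPython)
    return s

def build_join(n):
    return "".join(str(i) for i in range(n))

assert build_concat(1000) == build_join(1000)   # identical results

t_concat = timeit.timeit(lambda: build_concat(1000), number=200)
t_join = timeit.timeit(lambda: build_join(1000), number=200)
print(f"concat: {t_concat:.4f}s  join: {t_join:.4f}s")
```

The same discipline applies on any processor: the code's observable output is deterministic, but its cost has to be measured under realistic conditions.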
 

jroc74

Member
Oct 27, 2017
29,003
Just because it's deterministic doesn't mean it is clear at the outset. Yes, when a developer sees the results, those will be the same on any console. But they may find some workloads resulting in lower clocks and therefore lower performance, and may not know that until they try.

However, if Cerny's description is accurate this will be rare, and most workloads should run at full clocks.

So Warren's comments aren't necessarily inaccurate, but I do feel the 'variable clocks' are often presented as though they behave like regular PC clocks, which they don't. It would be nice to see more balanced reporting on that, versus what often appears to be fear-mongering.
Knowing where the 9TF and 9.2TF figures come from... c'mon. That person is directly refuting what Cerny said.

Why those specific numbers and not, say, 10.0?
 

M.Bluth

Member
Oct 25, 2017
4,260
If a 2% reduction in frequency can bring down the power load by 10%, described by Cerny as "the worst case game," how often, then, would any realistic game code require an even bigger reduction in power?

If it's so rare, then describing that scenario as the "real performance" is nothing but baseless nonsense.
Now, if there's a credibly sourced write up saying otherwise then I'd love to read it.
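The "2% frequency for 10% power" trade the post refers to can be sanity-checked with the standard dynamic-power model P ∝ C·V²·f, with voltage scaled alongside frequency (so P roughly ∝ f³). This is a back-of-the-envelope sketch under that assumed model; real silicon can beat it, since backing off peak clock also sheds voltage margin, which is consistent with Cerny quoting a figure better than the cubic estimate.

```python
def relative_power(freq_scale, exponent=3):
    """Power relative to baseline after scaling frequency by freq_scale.

    Uses P ~ f^exponent; exponent=3 corresponds to voltage scaling
    linearly with frequency in P ~ C * V^2 * f.
    """
    return freq_scale ** exponent

saving = 1 - relative_power(0.98)   # a 2% frequency reduction
print(f"2% slower clock -> ~{saving:.1%} less power under P ~ f^3")
```

Even the conservative cubic model gives a several-times-larger power saving than the frequency cost, which supports the post's argument that realistic code should rarely need big clock reductions.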
 

behOemoth

Member
Oct 27, 2017
5,629
Just because it's deterministic doesn't mean it is clear at the outset. Yes, when a developer sees the results, those will be the same on any console. But they may find some workloads resulting in lower clocks and therefore lower performance, and may not know that until they try.

However, if Cerny's description is accurate this will be rare, and most workloads should run at full clocks.

So Warren's comments aren't necessarily inaccurate, but I do feel the 'variable clocks' are often presented as though they behave like regular PC clocks, which they don't. It would be nice to see more balanced reporting on that, versus what often appears to be fear-mongering.
Such circumstances should be even rarer than you think, especially for game code. Modern computers already outsource such tasks to "accelerator" processors: DSPs for sound, or for the modulated data signals of your WLAN, Ethernet or Bluetooth. The other juggernaut, using very high activity for a long sustained time, would be encryption, and the PS5 uses a custom unit for that to free up resources for the CPU. The PS4, on the other hand, had to sacrifice some of its CPU power for encryption. So that shouldn't be a real problem to begin with.
Plus, the PS5 has 8 CPU cores and 36 GPU CUs. I can't think of any code that would use everything to its fullest all the time, though I'm not a real coder.
 

terawatt

Member
Oct 27, 2017
336
The whole variable-clock concern is complete nonsense. Lowering the power limit by 10%, even on a less efficient Pascal GPU, results in only a 76MHz drop. Even with RTSS up I would struggle to see a performance drop on my system. The whole thing is totally overblown.
 

BreakAtmo

Member
Nov 12, 2017
12,839
Australia
If a 2% reduction in frequency can bring down the power load by 10%, described by Cerny as "the worst case game," how often, then, would any realistic game code require an even bigger reduction in power?

If it's so rare, then describing that scenario as the "real performance" is nothing but baseless nonsense.
Now, if there's a credibly sourced write up saying otherwise then I'd love to read it.

I could possibly see the system going below 2GHz when it gets a really stressful menu/map screen like what Cerny was talking about, which of course would go unnoticed. But during actual meaningful gameplay? I don't think we'll see drops of more than maybe 5%, and even those would be brief outliers.
 

Jedi2016

Member
Oct 27, 2017
15,729
Not to mention I can't really imagine a situation where both the CPU and GPU are running breakneck at 100% for an extended period.
 

Fafalada

Member
Oct 27, 2017
3,068
Yes, when a developer sees the results
That's literally how it's always worked.
You can't call something an optimization until after you've profiled/measured it. And that really goes for all the code we write: you don't really know how something performs until it's been tested.

Different hardware will of course have different performance characteristics, but the optimization methodology doesn't change here.
 
Oct 27, 2017
7,139
Somewhere South
And it's good to remember that the time frames Cerny was talking about mean that CPU and GPU might be shifting frequencies multiple times within a single frame - juice up as needed, chill when not critical.
 

disco_potato

Member
Nov 16, 2017
3,145
I could possibly see the system going below 2GHz when it gets a really stressful menu/map screen like what Cerny was talking about, which of course would go unnoticed. But during actual meaningful gameplay? I don't think we'll see drops of more than maybe 5%, and even those would be brief outliers.
That would be a >10% clock drop and likely a >30% power draw drop. I'd be asking questions if you're a dev and you've overshot your goals by 30%.
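The percentages in the post check out under the same cubic power approximation used elsewhere in the thread (an assumed model, not a published PS5 figure): dropping from the 2.23GHz ceiling to 2.0GHz is just over a 10% clock cut, and the implied power reduction is close to 30%.

```python
MAX_GHZ, DROPPED_GHZ = 2.23, 2.0

scale = DROPPED_GHZ / MAX_GHZ
clock_drop = 1 - scale            # fractional frequency reduction
power_drop = 1 - scale ** 3       # under the P ~ f^3 approximation

print(f"clock drop: {clock_drop:.1%}, estimated power drop: {power_drop:.1%}")
```

So a sub-2GHz scenario really would mean a game whose power draw came in dramatically under budget, which is the questioning-your-targets point being made.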
 

BreakAtmo

Member
Nov 12, 2017
12,839
Australia
That would be a >10% clock drop and likely a >30% power draw drop. I'd be asking questions if you're a dev and you've overshot your goals by 30%.

Like I said, I was specifically talking about the stuff Cerny mentioned, where menu screens, map screens and the like, due to their simplicity, tend to pointlessly run the GPU flat out and cause the power draw and heat dissipation to go crazy. Cerny did also say that running the PS5 GPU at a fixed 2GHz was looking infeasible, and I'm guessing this is the reason why, since regular gameplay really should not need such large drops.
 

Sia

Attempted to circumvent ban with alt account
Banned
Jun 9, 2020
825
Canada
I'm just surprised at how there's still a drought of concrete information. I thought the June presentation would have kicked off a trickle of news: maybe a teardown, showing off the UI, or new features. There's been nothing, unless they plan to cram everything in all at once in August.
 

Lady Gaia

Member
Oct 27, 2017
2,480
Seattle
Hey, my game requires a lot of 256 bit instructions!

(But really I'm not sure this is the right way to think about it.).

It certainly doesn't tell the whole story. What does a typical instruction mix look like? What's the mix of integer vs. floating point vector work? What's your cache hit rate like? Is this typical of the instruction mix on all threads, or is the task single-threaded? Are the relevant threads scheduled across all cores, or is there affinity for a subset (and indeed can you express affinity with the API you have to work with?)
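One of the factors listed above, cache hit rate, is easy to demonstrate generically (my own example, and the effect is far smaller in Python than in C, since lists hold pointers, but the methodology is the same): identical arithmetic with different memory-access order can cost different amounts, and only measurement reveals it.

```python
import time

N = 500
# N x N grid; rows are contiguous lists, so row-order traversal is the
# access pattern the memory hierarchy likes best.
grid = [[i * N + j for j in range(N)] for i in range(N)]

def sum_rows(g):
    # Row-major: walk each inner list in order (cache-friendly).
    return sum(g[i][j] for i in range(N) for j in range(N))

def sum_cols(g):
    # Column-major: hop between inner lists on every access.
    return sum(g[i][j] for j in range(N) for i in range(N))

t0 = time.perf_counter()
row_total = sum_rows(grid)
t1 = time.perf_counter()
col_total = sum_cols(grid)
t2 = time.perf_counter()

assert row_total == col_total   # identical work, different access pattern
print(f"rows: {t1 - t0:.4f}s  cols: {t2 - t1:.4f}s")
```

The instruction mix, threading and affinity questions in the post are the same kind of thing: properties you characterise by profiling the actual workload, not by inspection.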
 

AegonSnake

Banned
Oct 25, 2017
9,566
I'm just surprised at how there's still a drought of concrete information. I thought the June presentation would have kicked off a trickle of news: maybe a teardown, showing off the UI, or new features. There's been nothing, unless they plan to cram everything in all at once in August.
lmao, I just came to post this. I thought the floodgates would've opened by now.

I really need to see next-gen gameplay. No offense to Ratchet and GT7, but I need more than what they showed off.
 

gundamkyoukai

Member
Oct 25, 2017
21,154
I think TLOU2 is going to fuck up next gen for me, since most of next gen's games are cross-gen.
I was watching some of the new AC game and all I was thinking was, what the fuck is this animation?
 

RoboPlato

Member
Oct 25, 2017
6,811
I'm just surprised at how there's still a drought of concrete information. I thought the June presentation would have kicked off a trickle of news: maybe a teardown, showing off the UI, or new features. There's been nothing, unless they plan to cram everything in all at once in August.
I think we could end up seeing that teardown sometime soon. The console is in production.
 
Oct 26, 2017
6,151
United Kingdom
I'm just surprised at how there's still a drought of concrete information. I thought the June presentation would have kicked off a trickle of news: maybe a teardown, showing off the UI, or new features. There's been nothing, unless they plan to cram everything in all at once in August.

Drought of information? On the console?

Outside of the PS5 price and confirmed list of launch games, we pretty much know everything important about the PS5.

What kind of info are you specifically looking for?
 