> Even better. It's practically a non-issue at that point. Would probably mean some minor tweaks when that worst case is eventually hit.

Yeah, and even then the worst case may only mean a couple of scenes in the game, etc.
> Not to mention some people are acting like we're going to hit that worst case right out of the gate. And even in the worst case, we'd be looking at, what, a 10% drop?

Yup, around that.
> I'm kind of expecting something like:

Honestly, I think it's more likely between 2.1 and 2.23, based on the scale of a 10% power reduction. I'm not sure of the exact ratio, but even a 10% power reduction doesn't mean a 10% loss in clock speed. I don't think it will ever hit 2.
Less than that. Cerny said a 10% power drop only means a couple percent of clock reduction. Not sure of the exact math, however.
> Not to mention some people are acting like we're going to hit that worst case right out of the gate. And even in the worst case, we'd be looking at, what, a 10% drop?

he said 2% for worst case.
> he said 2% for worst case.

I don't recall him saying that was the worst case, just him giving an example that we shouldn't expect major drops, because frequency drops cause basically near-cubic drops in power.
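For what it's worth, that near-cubic relationship can be sanity-checked in a few lines of Python. This assumes a simplified P ∝ f³ model (dynamic power scaling with frequency times voltage squared, with voltage roughly tracking frequency), which is a back-of-the-envelope guess, not Cerny's actual curves:

```python
# Sketch of the "near cubic" power/frequency relationship, assuming P scales
# with f^3. A simplification for illustration, not Sony's real power model.

def clock_for_power_budget(base_clock_mhz: float, power_fraction: float) -> float:
    """Highest clock that fits within a fraction of the original power budget."""
    return base_clock_mhz * power_fraction ** (1 / 3)

base = 2230.0  # PS5 GPU clock cap in MHz
for power_cut in (0.10, 0.20):
    clock = clock_for_power_budget(base, 1 - power_cut)
    drop = 100 * (1 - clock / base)
    print(f"{power_cut:.0%} power cut -> {clock:.0f} MHz ({drop:.1f}% clock drop)")
```

Under that model, a 10% power cut only costs about 3.5% of clock, which lines up with the "couple percent" figure being thrown around.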
That's about 45MHz, so it would drop from 2,230MHz to roughly 2,185MHz.

The only issue I have with this is his statement that 2GHz was looking impossible, and that 3.5GHz was looking impossible. Well, the Xbox has 3.6GHz and fixed clocks, so what did they do that made it so possible with fixed clocks?

For the GPU, is he saying that some instructions run hotter than others, so they won't be running those instructions at 2 or 2+ GHz? How would that impact performance?

We need to see the overall TDP budget of this thing. If this is a 150W GPU, then there is zero cause for concern.
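For anyone who wants to check the 2% figure against the 2,230MHz cap, the arithmetic is trivial (note the 2% worst-case number is the posters' figure, not an official spec):

```python
# Checking the arithmetic: a 2% worst-case drop from the 2,230MHz cap.
cap_mhz = 2230
worst_case = 0.02
delta = cap_mhz * worst_case            # MHz shaved off
floor_mhz = cap_mhz * (1 - worst_case)  # resulting clock
print(f"-{delta:.1f} MHz -> {floor_mhz:.1f} MHz")  # -44.6 MHz -> 2185.4 MHz
```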
> the only issue i have with this is his statement saying 2 ghz was looking impossible. and that 3.5 ghz was looking impossible. well, xbox has 3.6 ghz and fixed clocks so what did they do that made it so possible with fixed clocks?

Cerny is kinda weird with his statements sometimes, so maybe he meant that when they were originally designing the console 2GHz seemed impossible, but it turned out that wasn't true.
> he said 2% for worst case.
> thats 44 mhz. so it would drop from 2,230 mhz to 2,186 mhz.
> the only issue i have with this is his statement saying 2 ghz was looking impossible. and that 3.5 ghz was looking impossible. well, xbox has 3.6 ghz and fixed clocks so what did they do that made it so possible with fixed clocks?
> for gpu, is he saying that some instructions were hotter than others? so they wont be running those instructions at 2 or 2+ ghz? how would that impact performance?
> we need to see the overall TDP budget of this thing. if this thing is a 150w gpu then there is zero cause for concern.

The combination of the two, versus just having to choose one or the other.
> Cerny is kinda weird with his statements sometimes, so maybe he meant when they were originally designing the console 2ghz seemed impossible, but it turned out that wasn't true.

Yeah, it's the only thing that's a bit confusing in his 50-minute talk.
> I don't recall him saying that was the worst case. Just him giving an example that we don't expect major drops, because frequency drops cause basically near cubic drops in power.

Yeah, he didn't really put a number on what the percentage points would be.
Almost missed this.
Again, the fraction of a single core is for the Xbox Series X, which has dedicated decompression hardware. Any system without this dedicated hardware will definitely use more than a tenth of a CPU core.
> yeah, its the only thing thats a bit confusing in his 50 minute talk.

Yeah, the dev units probably have the profiles built in so that devs can manually trigger "what if" and work around it.
And I just added this as an edit in my original post, in case you missed it.

EDIT: I just realized that DF said third-party devs were already running the GPU at 2.23GHz, albeit while downclocking the CPU. But CPU power savings are so small, I'm pretty sure the PS5 cooling system can handle the GPU running at 2.23GHz regardless of what instructions are being fed. Cerny did tell Richard that the final retail unit will handle everything gracefully, without needing clock profiles or input from devs to downclock the CPU to feed the GPU enough power. It should just work.
> Yeah, the dev units probably have the profiles built in so that devs can manually trigger "what if" and work around it.

Yeah, that makes sense. That's actually pretty cool.
> And the point is to bring it to Windows. So no, the point isn't for the PC to "bruteforce it", bringing a narrative that "thanks to secret sauce, those 8-core Zen 2 CPUs on consoles will perform the same as 16-core Zen 2 CPUs".
> As for your second sentence, nothing indicates that. In fact, in the Microsoft article, both of those aspects (the hardware decompression block and DirectStorage API) were under two different sections.

What do you think needs to make use of an API? Processor time.
> And the point is to bring it to Windows. So no, the point isn't for the PC to "bruteforce it", bringing a narrative that "thanks to secret sauce, those 8-core Zen 2 CPUs on consoles will perform the same as 16-core Zen 2 CPUs".
> As for your second sentence, nothing indicates that. In fact, in the Microsoft article, both of those aspects (the hardware decompression block and DirectStorage API) were under two different sections.

Here is the full quote; I'm not creating this narrative.

It's saving a 'ton of CPU' because of this additional hardware, not just because they updated a software API. It's basically what Epic did with UE5 when they said they rewrote the entire I/O logic to take advantage of the PS5 I/O: yes, it's software, but it's relying on dedicated PS5 I/O hardware to save on CPU processing time.

> The final component in the triumvirate is an extension to DirectX - DirectStorage - a necessary upgrade bearing in mind that existing file I/O protocols are knocking on for 30 years old, and in their current form would require two Zen CPU cores simply to cover the overhead, which DirectStorage reduces to just one tenth of single core.
> "Plus it has other benefits," enthuses Andrew Goossen. "It's less latent and it saves a ton of CPU. With the best competitive solution, we found doing decompression software to match the SSD rate would have consumed three Zen 2 CPU cores. When you add in the IO CPU overhead, that's another two cores. So the resulting workload would have completely consumed five Zen 2 CPU cores when now it only takes a tenth of a CPU core. So in other words, to equal the performance of a Series X at its full IO rate, you would need to build a PC with 13 Zen 2 cores. That's seven cores dedicated for the game: one for Windows and shell and five for the IO and decompression overhead."
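Tallying up the core counts from that Goossen quote (all figures are his; this just adds them together):

```python
# Adding up the core counts from the Goossen quote (all numbers are from it).
sw_decompression_cores = 3   # software decompression matching the SSD rate
io_overhead_cores = 2        # legacy file I/O overhead
hw_path_cores = 0.1          # with the hardware block + DirectStorage

saved_cores = sw_decompression_cores + io_overhead_cores - hw_path_cores
print(f"cores freed up: {saved_cores}")  # 4.9

game_cores, os_cores = 7, 1
pc_equivalent_cores = game_cores + os_cores + sw_decompression_cores + io_overhead_cores
print(f"PC Zen 2 cores needed to match: {pc_equivalent_cores}")  # 13
```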
> here is the full quote. im not creating this narrative.
> its saving a 'ton of cpu' because of this additional hardware. not just because they updated a software API. its basically what Epic did with UE5 when they say they rewrote the entire i/o logic to take advantage of the ps5 i/o. yes its software but its relying on dedicated ps5 i/o hardware to save on the cpu processing time.
> otherwise, you could technically have direct storage or ps5's unreal engine 5 i/o logic run on the xbox one and ps4 while taking only a fraction of a jaguar core. hell, why stop there. port direct storage to the switch and have that become a 2.4 gbps ssd.

And they totally missed "The final component in the triumvirate is an extension to DirectX - DirectStorage".
This is a two-day raffle that will expire in 48 hours. The winner will be drawn at random! Any prizes left over after the deadline will become available on a first-come, first-served basis.
Transistor said:
> To all you disgustingly beautiful avatars out there, thanks for putting up with me over the past few months. I love y'all so damn much. Let's keep the next few months fun and let's not bicker about teraflops or I/Os, mmkay? Next-gen gonna be great for everyone.
> P.S. if you haven't changed your avatar yet (I will be checking soon), I'd suggest you do it. It would be a shame if I had to have an admin do it and we accidentally left it on there forever :)
Oh, nice, very generous.
Looks like MS will be repurposing their black crush POP! engine for the third time. They just love that pop.
> haha I never knew you could put a "no lurkers" restriction on a giveaway.

Just make a few more posts in here and you'll be good. We do it to discourage people from taking advantage of community giveaways, but since you have an avatar of mine, it's all good :P
I feel attacked.
> Thanks for the great giveaway, Transistor.

Mine isn't that bad either. Still wish I had gotten the fuck you mars avatar, haha.
Already got "lucky" with the avatar, but who knows, maybe I will have more luck.
> Not to mention some people are acting like we're going to hit that worst case right out of the gate. And even in the worst case, we'd be looking at, what, a 10% drop?

You're more likely to hit the 'worst case' at two points in the console's lifespan: right at the start and right at the end. At the start because documentation is spotty, there are kludged-together workarounds to get things to work the way you think they're supposed to (or need to for the rest of the code), and it's the 'whole new world' of figuring out how everything works best. At the end because you've figured out all that stuff, for the most part (there's still a lot of 'that just works, leave it alone' commented blocks), and are just pushing the limits, or past the limits, of what the hardware can handle.
> You're more likely to hit the 'worst case' at two points in the console's lifespan - right at the start and right at the end. At the start because documentation is spotty, there are kludged-together workarounds to get things to work the way you think they're supposed to - or need to for the rest of the code, and it's the 'whole new world' of figuring out how everything works best. At the end because you've figured out all that stuff - for the most part, there's still a lot of 'that just works, leave it alone' commented blocks - and are just pushing the limits or past the limits of what the hardware can handle.

Or at any time if you're Bethesda. I assume engines like theirs, or even the RDR/GTA engine, will have issues staying under the power envelope.
Whoa, that's awesome! Thanks!
> Just make a few more posts in here and you'll be good. We do it to discourage people from taking advantage of community giveaways, but since you have an avatar of mine, it's all good :P

Is it only for those that got an avatar from you? Lol. I'll have to cancel my entry.
> Cerny is kinda weird with his statements sometimes, so maybe he meant when they were originally designing the console 2ghz seemed impossible, but it turned out that wasn't true.

I took it to mean that the XSX simply has a larger power/thermal budget, since he mentioned the CPU, and the thermals there were probably more of a known quantity early in development. From the context I did get the distinct impression he was talking about the variable clocks making the difference between hitting those speeds and not.
> Cerny is kinda weird with his statements sometimes, so maybe he meant when they were originally designing the console 2ghz seemed impossible, but it turned out that wasn't true.

2GHz was impossible because it was a fixed clock, so the console would have had to run at 2GHz under high workloads and would probably shut down.
> I'm kind of expecting something like:
> 90% 2.23
> 9.9% 2.1
> 0.1% 2
> I expect that it'll happen but it'll be so rare that it's not even worth pointing out.

I don't think 2GHz is the base clock; I believe there's no base clock at all. If the code consumes so much power that it needs to run at 1.6GHz (or whatever number), then it will run at 1.6GHz. The catch is that devs will know when that worst-case scenario will happen and have the choice to write better code if they really need the compute power.
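Taking that guessed distribution at face value (the percentages are one poster's guess, not measured data), the time-weighted average clock works out like this:

```python
# Time-weighted average clock under the guessed distribution above.
# (The percentages are a forum guess, not measured data.)
distribution = {2.23: 0.90, 2.10: 0.099, 2.00: 0.001}  # GHz -> share of time
avg_ghz = sum(clock * share for clock, share in distribution.items())
print(f"average clock: {avg_ghz:.3f} GHz")  # average clock: 2.217 GHz
```

Even with those occasional dips, the average would sit within about 0.6% of the 2.23GHz cap, which is why nobody expects it to matter in practice.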