
Phoenix15

Banned
Oct 23, 2019
598
Again, PS4's RAM change was a last minute change. Unless PS5 gets delayed, it's too late to change.

RAM is an easier thing to change, that's true, but it still requires a lot of setup.

It wasn't really a last-minute change, in fact.

They always wanted the 8GB of GDDR5, but it takes time to negotiate a good price, so they prepared a "Plan B" in case the negotiation failed.
And while they talked to devs about "Plan B" just in case, they worked on both scenarios at the same time.
In the end, the negotiation finally succeeded, so they switched everything over to "Plan A" and put in the RAM they had always wanted.
 

the-pi-guy

Member
Oct 29, 2017
6,276
It wasn't really a last-minute change, in fact.

They always wanted the 8GB of GDDR5, but it takes time to negotiate a good price, so they prepared a "Plan B" in case the negotiation failed.
And while they talked to devs about "Plan B" just in case, they worked on both scenarios at the same time.
In the end, the negotiation finally succeeded, so they switched everything over to "Plan A" and put in the RAM they had always wanted.
It was something they wanted to do, but it was a last minute push.

www.fraghero.com

Mark Cerny "PS4 8GB GDDR5 RAM Was Decided in Very Final Meeting"

During an interview with Game Informer, PS4 lead architect Mark Cerny revealed the decision of the PlayStation 4 receiving 8GB of GDDR5 RAM was decided at the very final meeting. At first, it was going to only be limited to 4GB but Cerny knew that GDDR5 is far superior and made for the...

"On the decision to have 8GB of RAM…I'm supposed to look at the hardware and talk to the developers. I am not a business guy. There is an organization Andy runs, which is the business organization.

So you don't want to speak up too soon. If you speak up too soon it's like, "That's a billion dollars we are talking about. Mark is good with the bits and the bytes but he doesn't understand this whole business thing." I very intentionally waited. They asked, "What do you think Mark?" I said, "Well, let's get feedback from the developers." I did not speak up until the very final meeting on the topic, and then I said, "I think we have to do this."
 

褲蓋Calo

Alt-Account
Banned
Jan 1, 2020
781
Shenzhen, China
How big is the cache? You've brought it up a few times, but never mentioned what size this cache actually is, which is seemingly a very important factor.

Microsoft themselves state that it's "typically easy for games to more than fill up their standard memory quota with CPU, audio data, stack data, executable data, and script data" with respect to the 3.5GB of slower-bandwidth RAM, and given that this cache you speak of is probably comparatively tiny, I find the notion that this RAM won't be accessed frequently hard to believe.
I know it's hard to believe, but you'll just have to believe it : )

The gap between memory speed and CPU speed is getting larger and larger, and to programmers RAM is nothing but a gigantic cache for the disk. If you turned off all the caches on your multi-thousand-dollar system altogether, it would feel like a 486.
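
To make that concrete, here's a toy sketch of the effect (my own illustration, not from the thread; NumPy, the array size, and the access patterns are arbitrary choices, and exact timings vary by machine):

```python
# Toy illustration of the point above: the same array, summed in a
# cache/prefetcher-friendly order vs. a cache-hostile random order.
# Exact timings vary by machine; the gap is what matters.
import time
import numpy as np

N = 2**24                              # ~128 MB of int64, far bigger than any CPU cache
data = np.arange(N, dtype=np.int64)

def timed_gather(label, order):
    t0 = time.perf_counter()
    total = data[order].sum()          # gather elements in the given order, then reduce
    print(f"{label}: {time.perf_counter() - t0:.3f}s (total={total})")

timed_gather("sequential", np.arange(N))              # streams through memory
timed_gather("random    ", np.random.permutation(N))  # defeats caches and prefetching
```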
 

褲蓋Calo

Alt-Account
Banned
Jan 1, 2020
781
Shenzhen, China
Never said anything about it being significant... or even that it's a bad thing. It's just a by-product of their RAM layout choice. And I would imagine that they did a lot of testing and concluded that the bandwidth hit was acceptable.
All 10 GDDR6 interfaces operate simultaneously. At any given moment, say the CPU of the XSX requests data residing on 2 chips; the GPU then gets to use 320 - 64 = 256 bits of bus width. In the same scenario on PS5, the GPU gets 256 - 64 = 192 bits.
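
For anyone who wants to check that arithmetic, a quick sketch (assuming 14 Gbps GDDR6 on both consoles, which is what the published 560 and 448 GB/s peak figures imply; the helper function is just for illustration):

```python
# Quick check of the arithmetic above, assuming 14 Gbps GDDR6 on both
# consoles (XSX's published spec; PS5's follows from 448 GB/s over 256 bits).
GB_PER_S_PER_PIN = 14 / 8              # 14 Gbps per pin -> 1.75 GB/s per pin

def gpu_share(total_bits, cpu_chips, bits_per_chip=32):
    """Bus width (and bandwidth) left to the GPU while the CPU occupies some chips."""
    gpu_bits = total_bits - cpu_chips * bits_per_chip
    return gpu_bits, gpu_bits * GB_PER_S_PER_PIN

for name, bus_bits in (("XSX", 320), ("PS5", 256)):
    bits, gbs = gpu_share(bus_bits, cpu_chips=2)
    print(f"{name}: GPU keeps {bits} bits ~ {gbs:.0f} GB/s")
# XSX: GPU keeps 256 bits ~ 448 GB/s
# PS5: GPU keeps 192 bits ~ 336 GB/s
```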
 
Oct 27, 2017
7,139
Somewhere South

Lady Gaia put my thoughts here better than I could ever do it.

It's also easy to overstate the impact of cache. It has been a part of every high-performance CPU design for decades, and yet we still provision computers today with substantial amounts of DDR4 bandwidth because the CPU still makes good use of it. Estimating exactly how much bandwidth CPU tasks will require is going to be difficult, as current-generation titles running on the newer hardware are going to represent a much lighter CPU load than games designed from the ground up for it.
 

Gully Bully

Member
Aug 19, 2019
145
Is that Tempest audio engine thing a discrete chip or something, or is it taking up silicon space in the main APU by sharing resources of the GPU or whatever?

In either case, I find it concerning that Sony would waste R&D resources and/or wafer acreage on audio matters, when that isn't even what the mainstream audience is going to care about as much as extra graphics power.

Could this audio engine investment be Sony's equivalent to Xbox One's Kinect camera and eSRAM follies/blunder? Are they not jumping through all these other hoops now, like variable frequency, just to contain costs and power budget? Sounds like something that could've been avoidable.

It would seem that Cerny's only ace in the hole really is this super fast SSD that makes it a Super PlayStation, and it's the only thing keeping me in the PlayStation camp; but I feel like this is going to be dangerously close to PS3 all over again where it will be an exclusives machine again and the exclusives are never even all that good because they don't have Zelda nor Mario.

So they have an ace in the hole, but their saving grace will be to not fuck up the marketing (by continuing to focus on games, games, games). Let's hope Sony can pull it off by just maintaining their current course. Just let Cerny continue to do all the talking, and lock down that Activision/Call-of-Duty marketing.
 

Wonderbrah

Banned
Nov 7, 2017
278
It wasn't really a last-minute change, in fact.

They always wanted the 8GB of GDDR5, but it takes time to negotiate a good price, so they prepared a "Plan B" in case the negotiation failed.
And while they talked to devs about "Plan B" just in case, they worked on both scenarios at the same time.
In the end, the negotiation finally succeeded, so they switched everything over to "Plan A" and put in the RAM they had always wanted.

I thought it was the opposite. From the outset, 4GB was planned. Plan B was 8GB, if the price allowed. At the last minute, prices dropped and they pushed for 8GB.
 

褲蓋Calo

Alt-Account
Banned
Jan 1, 2020
781
Shenzhen, China
It's also easy to overstate the impact of cache. It has been a part of every high-performance CPU design for decades, and yet we still provision computers today with substantial amounts of DDR4 bandwidth because the CPU still makes good use of it. Estimating exactly how much bandwidth CPU tasks will require is going to be difficult, as current-generation titles running on the newer hardware are going to represent a much lighter CPU load than games designed from the ground up for it.
I don't think he's overstating the impact of cache. Compared to the GPU's constant stream of data, the CPU's memory accesses still look few and far between. Even in bandwidth terms, a high-end quad-channel DDR4 setup can achieve somewhere around 70GB/s; with a CPU running at 3.6 GHz, that gives you about a quarter of a typical cache line per cycle. Doesn't sound like a lot, but it should already be overkill for consoles, as I don't think anyone will be rendering raw 4K video on their gaming console.
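
A quick back-of-the-envelope check of that claim, using the figures from the post (the 64-byte cache line is the usual x86 value):

```python
# Back-of-the-envelope check: DRAM bytes available per CPU cycle,
# expressed as a fraction of a 64-byte cache line.
DDR4_BANDWIDTH = 70e9     # bytes/s, high-end quad-channel DDR4 (post's figure)
CPU_CLOCK = 3.6e9         # Hz
CACHE_LINE = 64           # bytes, typical x86 cache line

bytes_per_cycle = DDR4_BANDWIDTH / CPU_CLOCK
fraction = bytes_per_cycle / CACHE_LINE
print(f"{bytes_per_cycle:.1f} bytes/cycle = {fraction:.2f} of a cache line per cycle")
# 19.4 bytes/cycle = 0.30 of a cache line per cycle -- roughly the quarter cited
```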
 

Phoenix15

Banned
Oct 23, 2019
598
It was something they wanted to do, but it was a last minute push.

www.fraghero.com

Mark Cerny "PS4 8GB GDDR5 RAM Was Decided in Very Final Meeting"

During an interview with Game Informer, PS4 lead architect Mark Cerny revealed the decision of the PlayStation 4 receiving 8GB of GDDR5 RAM was decided at the very final meeting. At first, it was going to only be limited to 4GB but Cerny knew that GDDR5 is far superior and made for the...

I know they were finally able to do that at the last minute, but you do understand that it's completely different to hope for 8GB from the beginning, while also planning for 4GB just in case and working on both possibilities throughout the process, than to totally change the final specs without any prior preparation, after those specs have been officially announced.

I thought it was the opposite. From the onset 4GB was planned. Plan B was 8GB if price allowed. At the last minute, prices dropped and they pushed for 8GB

Just a point of view, but "Plan A" is supposed to be the better one.

The key here is that they worked on both possibilities throughout the whole process, from the beginning.
 

androvsky

Member
Oct 27, 2017
3,507
All 10 GDDR6 interfaces operate simultaneously. At any given moment, say the CPU of the XSX requests data residing on 2 chips; the GPU then gets to use 320 - 64 = 256 bits of bus width. In the same scenario on PS5, the GPU gets 256 - 64 = 192 bits.
That sounds like an unusual memory setup. When you say GDDR6 interfaces, do you mean every GDDR6 memory chip is addressed separately, like they each have their own memory controller? That's a bit surprising, and if you've got a link that details that I'd like to see it.


Is that Tempest audio engine thing a discrete chip or something, or is it taking up silicon space in the main APU by sharing resources of the GPU or whatever?

In either case, I find it concerning that Sony would waste R&D resources and/or wafer acreage on audio matters, when that isn't even what the mainstream audience is going to care about as much as extra graphics power.

Could this audio engine investment be Sony's equivalent to Xbox One's Kinect camera and eSRAM follies/blunder? Are they not jumping through all these other hoops now, like variable frequency, just to contain costs and power budget? Sounds like something that could've been avoidable.

It would seem that Cerny's only ace in the hole really is this super fast SSD that makes it a Super PlayStation, and it's the only thing keeping me in the PlayStation camp; but I feel like this is going to be dangerously close to PS3 all over again where it will be an exclusives machine again and the exclusives are never even all that good because they don't have Zelda nor Mario.

So they have an ace in the hole, but their saving grace will be to not fuck up the marketing (by continuing to focus on games, games, games). Let's hope Sony can pull it off by just maintaining their current course. Just let Cerny continue to do all the talking, and lock down that Activision/Call-of-Duty marketing.
The Tempest Audio Engine is basically one extra compute unit; I doubt the customization added a lot of area. So it's less than 1/36 of the GPU area. MS has put R&D and silicon wafer area into audio on the XSX too, from what I understand; and if not, it'll detract from their GPU/CPU advantage.
 

disco_potato

Member
Nov 16, 2017
3,145
Is that Tempest audio engine thing a discrete chip or something, or is it taking up silicon space in the main APU by sharing resources of the GPU or whatever?

In either case, I find it concerning that Sony would waste R&D resources and/or wafer acreage on audio matters, when that isn't even what the mainstream audience is going to care about as much as extra graphics power.

Could this audio engine investment be Sony's equivalent to Xbox One's Kinect camera and eSRAM follies/blunder? Are they not jumping through all these other hoops now, like variable frequency, just to contain costs and power budget? Sounds like something that could've been avoidable.

It would seem that Cerny's only ace in the hole really is this super fast SSD that makes it a Super PlayStation, and it's the only thing keeping me in the PlayStation camp; but I feel like this is going to be dangerously close to PS3 all over again where it will be an exclusives machine again and the exclusives are never even all that good because they don't have Zelda nor Mario.

So they have an ace in the hole, but their saving grace will be to not fuck up the marketing (by continuing to focus on games, games, games). Let's hope Sony can pull it off by just maintaining their current course. Just let Cerny continue to do all the talking, and lock down that Activision/Call-of-Duty marketing.


Read towards the bottom of the article and come to your own conclusions.
www.eurogamer.net

Inside PlayStation 5: the specs and the tech that deliver Sony's next-gen vision

Sony has broken its silence. PlayStation 5 specifications are now out in the open with system architect Mark Cerny deli…
 

褲蓋Calo

Alt-Account
Banned
Jan 1, 2020
781
Shenzhen, China
That sounds like an unusual memory setup. When you say GDDR6 interfaces, do you mean every GDDR6 memory chip is addressed separately, like they each have their own memory controller? That's a bit surprising, and if you've got a link that details that I'd like to see it.
I don't know about the controller situation, but at least the number of GDDR6 PHYs (physical layer interfaces) on these SoCs and GPUs matches the number of GDDR6 memory chips they have.

edit: That seems to be expected. I imagine the memory controller will hold a queue of requests that can be dispatched to every physical interface in parallel, but I'll need to check on that.
 
Mar 22, 2020
87
Ideally, I think the two consoles would be better off with 16 Gbps GDDR6 modules, for 512 GB/s of bandwidth for Sony and 640 GB/s for Microsoft, but they want to reach a certain MSRP. And only SK Hynix offers a 16 Gbps 2 GB module, which is still only in the sample phase at Samsung.
I don't recall SK Hynix announcing 14Gbps 16Gb ICs, and definitely not anything over 14Gbps; for 16Gbps and up you would only find Samsung ICs, in both 8Gb and 16Gb. In fact, Digital Foundry showed the PCB memory layout for those wondering. However, both Micron and Samsung can equip the PS5, since it appears to be 8 ICs of 16Gb at 14Gbps.
On the XSX, the top 4 chips are 16Gb; in each side column of 3, the top and bottom chips are 8Gb and the middle one is 16Gb. So it does redirect some of the higher-bandwidth ICs towards the GPU, and the lower pool is mostly over the CPU controllers.

The reason why 48GB/s of bus traffic for the CPU on Series X costs you 80GB/s of the theoretical peak GPU bandwidth is because it's tying up the whole 320-bit bus to transmit only 192 bits of data per cycle from the slower portion of RAM we've been told will be typically used by the CPU. 48GB / 192 bits * 320 bits = 80GB of effective bandwidth used to make that 48GB/s available. This is because only six of the ten RAM chips can contribute to that additional 6GB over and above the 10GB of RAM that can be accessed more quickly when all ten are used in parallel.
I agree with you that the GPU accessing all 10 ICs at 560GB/s will never happen (or at least 99.99999% of the time when the CPU uses any memory). However, I still don't know about the overall cost of CPU memory access: are we sure they use 32-bit buses for all six 2GB ICs? There is a 2-channel (16-bit wide) mode on GDDR6; wouldn't it make more sense to have the GPU still access the 2GB ICs, just at limited width?
Something like:
  • 1.75 GB/s per pin * ((32 bits * 4) + (16 bits * 6)) ~ 392 GB/s for the GPU, even while the CPU is using memory,
  • while 1.75 GB/s per pin * (16 bits * 6) ~ 168 GB/s for the CPU bandwidth.
I gather MS quoted bandwidth figures for both pools at full width (336 GB/s for the slow pool), using per-die peak bandwidth. But in that case they also assumed no CPU impact on the faster 560 GB/s pool. Just tell me what you think.
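
To put the two models in this exchange side by side, here's a sketch (neither split is confirmed by Microsoft; Model 1 is the quoted post's accounting, Model 2 is the hypothetical half-channel mode above):

```python
# The two competing models from this exchange, side by side. Neither is
# confirmed by Microsoft; both are inferences from the published bus layout.
PIN = 14 / 8                           # GB/s per pin at 14 Gbps

# Model 1 (the quoted post): CPU traffic to the slow pool ties up the whole
# 320-bit bus while only 192 bits carry data.
cpu_traffic = 48                       # GB/s of CPU reads from the 6GB pool
peak_cost = cpu_traffic * 320 / 192
print(f"Model 1: {cpu_traffic} GB/s of CPU traffic costs {peak_cost:.0f} GB/s of peak")

# Model 2 (hypothetical): the six 2GB chips split into 16-bit half-channels,
# CPU on one half, GPU keeping the rest of the bus.
gpu_bits = 32 * 4 + 16 * 6             # four 1GB chips at full width + six half-width
cpu_bits = 16 * 6
print(f"Model 2: GPU ~ {gpu_bits * PIN:.0f} GB/s, CPU ~ {cpu_bits * PIN:.0f} GB/s")
# Model 1: 48 GB/s of CPU traffic costs 80 GB/s of peak
# Model 2: GPU ~ 392 GB/s, CPU ~ 168 GB/s
```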

Is that Tempest audio engine thing a discrete chip or something, or is it taking up silicon space in the main APU by sharing resources of the GPU or whatever?
The Tempest engine is an audio ASIC; from what I gather it must be part of the APU, much like all the dedicated I/O ASICs are embedded into the SoCs of both consoles. I don't think it's part of the GPU, even if it apparently is a repurposed RDNA2 CU.
In either case, I find it concerning that Sony would waste R&D resources and/or wafer acreage on audio matters, when that isn't even what the mainstream audience is going to care about as much as extra graphics power.
Could this audio engine investment be Sony's equivalent to Xbox One's Kinect camera and eSRAM follies/blunder? Are they not jumping through all these other hoops now, like variable frequency, just to contain costs and power budget? Sounds like something that could've been avoidable.
I think the goal of all that dedicated hardware is to save as much CPU and GPU power as possible. The ASICs should be a worthwhile investment once developers start making use of all the new functionality. While the CPU is freed from handling I/O and compression/decompression, and the audio doesn't need any GPU resources, all that power goes into holding >2.1GHz clocks. It's practically a guarantee that a lot of thought went into being as power-efficient as possible, given that ~300mm² of silicon and a few ICs will probably burn ~250W.

That sounds like an unusual memory setup. When you say GDDR6 interfaces, do you mean every GDDR6 memory chip is addressed separately, like they each have their own memory controller? That's a bit surprising, and if you've got a link that details that I'd like to see it.
GDDR6 ICs use 32-bit data buses to the memory controllers, so the SoCs on the two consoles have ten and eight 32-bit GDDR6 controllers respectively to handle them separately. They can also run in 2-channel operation, for 16-bit wide transfers, with little performance loss.
Scorpio did use something similar:
en.wikichip.org

Scorpio Engine - Microsoft - WikiChip

Scorpio Engine is a 64-bit octa-core x86 SoC designed by AMD and Microsoft for their Xbox One X. The chip features eight Enhanced Jaguar cores operating at 2.3 GHz and a custom Arctic Islands-based GPU operating at 1.172 GHz. Fabricated on TSMC's 16FF+, this chip supports 12 (24 for Dev) GiB of...
I made a post on this in another thread because DF recently gave us enough time to check out who made the ICs and what the layout actually was. It's unclear how the underlying separation exactly works though.
 

Gully Bully

Member
Aug 19, 2019
145
The Tempest Audio Engine is basically one extra compute unit; I doubt the customization added a lot of area. So it's less than 1/36 of the GPU area. MS has put R&D and silicon wafer area into audio on the XSX too, from what I understand; and if not, it'll detract from their GPU/CPU advantage.

Ah, ok. In that case, it is not a devastating investment at all. I am squarely back in the camp of PlayStation, then.

The Tempest engine is an audio ASIC; from what I gather it must be part of the APU, much like all the dedicated I/O ASICs are embedded into the SoCs of both consoles. I don't think it's part of the GPU, even if it apparently is a repurposed RDNA2 CU.

I think the goal of all that dedicated hardware is to save as much CPU and GPU power as possible. The ASICs should be a worthwhile investment once developers start making use of all the new functionality. While the CPU is freed from handling I/O and compression/decompression, and the audio doesn't need any GPU resources, all that power goes into holding >2.1GHz clocks. It's practically a guarantee that a lot of thought went into being as power-efficient as possible, given that ~300mm² of silicon and a few ICs will probably burn ~250W.

I like the sound of this even better.

Cerny remains GOAT.

Read towards the bottom of the article and come to your own conclusions.
www.eurogamer.net

Inside PlayStation 5: the specs and the tech that deliver Sony's next-gen vision

Sony has broken its silence. PlayStation 5 specifications are now out in the open with system architect Mark Cerny deli…

Thank you for this read. I made a big fuss about nothing. I can honestly say to myself that their SSD tech is worth the sacrifice of a couple extra teraflops.
 

Gemüsepizza

Member
Oct 26, 2017
2,541
Just saw this tweet by Brad Sams:



Pretty disappointing tbh. First he made up some conspiracy stuff about PS5 actually being 9.2 TFLOPS, and now he is implying that Cerny lied and that the raw SSD speed is actually lower. I always thought that he is one of the more reputable journalists, but he went full tinfoil hat console warrior with this crap. Shame.
 

褲蓋Calo

Alt-Account
Banned
Jan 1, 2020
781
Shenzhen, China
GDDR6 ICs use 32-bit data buses to the memory controllers, so the SoCs on the two consoles have ten and eight 32-bit GDDR6 controllers respectively to handle them separately. They can also run in 2-channel operation, for 16-bit wide transfers, with little performance loss.
A small correction - Each memory chip is paired with a PHY (physical interface), not necessarily a memory controller. This is true for Scorpio Engine:
scorpio_engine_block_diagram.png

But I imagine the MCT should be able to dispatch requests in parallel.



Wait, it seems you're right:
Each pair of Render Boxes is wired to one Memory Controller Cluster (MCC). Each MCC consists of two dedicated memory controllers and two more that are shared between two pairs of MCCs. In total there are four clusters, each having two dedicated controllers plus two more pairs (4 controllers) of channels shared between a pair of MCCs, for a total of 12 channels.
 

Kyoufu

Member
Oct 26, 2017
16,582
Just saw this tweet by Brad Sams:



Pretty disappointing tbh. First he made up some conspiracy stuff about PS5 actually being 9.2 TFLOPS, and now he is implying that Cerny lied and that the raw SSD speed is actually lower. I always thought that he is one of the more reputable journalists, but he went full tinfoil hat console warrior with this crap. Shame.


Brad Sams has been going full fanboy for a while now. Not that it's surprising.
 

lukeskymac

Banned
Oct 30, 2017
992
Is that Tempest audio engine thing a discrete chip or something, or is it taking up silicon space in the main APU by sharing resources of the GPU or whatever?

In either case, I find it concerning that Sony would waste R&D resources and/or wafer acreage on audio matters, when that isn't even what the mainstream audience is going to care about as much as extra graphics power.

Could this audio engine investment be Sony's equivalent to Xbox One's Kinect camera and eSRAM follies/blunder? Are they not jumping through all these other hoops now, like variable frequency, just to contain costs and power budget? Sounds like something that could've been avoidable.

The notion that dedicating less than 1/40th of your APU to an audio ASIC that'll get its own API so that every game can have quality positional audio is "a waste" should be offensive to anyone with a set of working ears, and comparing it to the band-aid of strategically compromised design that was DDR3+eSRAM is downright trolling.
 

Gully Bully

Member
Aug 19, 2019
145
Just saw this tweet by Brad Sams:



Pretty disappointing tbh. First he made up some conspiracy stuff about PS5 actually being 9.2 TFLOPS, and now he is implying that Cerny lied and that the raw SSD speed is actually lower. I always thought that he is one of the more reputable journalists, but he went full tinfoil hat console warrior with this crap. Shame.


Imagine having to stan a console that has "XBsX" as its acronym and you type it out shillingly and unironically. Tsk. Couldn't be me.

The notion that dedicating less than 1/40th of your APU to an audio ASIC that'll get its own API so that every game can have quality positional audio is "a waste" should be offensive to anyone with a set of working ears, and comparing it to the band-aid of strategically compromised design that was DDR3+eSRAM is downright trolling.

My bad, breh. I wasn't really keeping up on talking points until now. I feel much more reassured and secure in my Day 1 purchase of PS5 now.
 
Mar 22, 2020
87
Pretty disappointing tbh. First he made up some conspiracy stuff about PS5 actually being 9.2 TFLOPS, and now he is implying that Cerny lied and that the raw SSD speed is actually lower. I always thought that he is one of the more reputable journalists, but he went full tinfoil hat console warrior with this crap. Shame.
idk who this person is or what his previous tweets were. But is it wrong to assume performance figures should probably be based on sustainable clocks or realistic scenarios? Just like MS put forward 560 GB/s of memory bandwidth but can realistically only guarantee at least 224 GB/s to the GPU 100% of the time, and at least ~392 GB/s in CPU-intensive workloads?
 

lukeskymac

Banned
Oct 30, 2017
992
It's not like only a few select audiophiles will get to enjoy it, either. Anyone with a pair of headphones is getting support on day 1. Any decent binaural demo on YouTube should give you an idea of what it's like, though without the selectable HRTF, which is frankly one of the most novel things about it.
 

Gully Bully

Member
Aug 19, 2019
145
It's not like only a few select audiophiles will get to enjoy it, either. Anyone with a pair of headphones is getting support on day 1. Any decent binaural demo on YouTube should give you an idea of what it's like, though without the selectable HRTF, which is frankly one of the most novel things about it.

Yeah, I read about that in the link disco_potato shared, and I was thoroughly impressed and pleased. I will finally join the ranks of audiophiles with my headphones. Cerny has made it possible for me, while also giving me the fastest SSD money can buy at a console price. Win-win like no other, forreal.
 

Cyborg

Banned
Oct 30, 2017
1,955
Just saw this tweet by Brad Sams:



Pretty disappointing tbh. First he made up some conspiracy stuff about PS5 actually being 9.2 TFLOPS, and now he is implying that Cerny lied and that the raw SSD speed is actually lower. I always thought that he is one of the more reputable journalists, but he went full tinfoil hat console warrior with this crap. Shame.

Why would he make such a claim? Is he saying that the given information is wrong?
 

Expy

Member
Oct 26, 2017
9,865
Just saw this tweet by Brad Sams:



Pretty disappointing tbh. First he made up some conspiracy stuff about PS5 actually being 9.2 TFLOPS, and now he is implying that Cerny lied and that the raw SSD speed is actually lower. I always thought that he is one of the more reputable journalists, but he went full tinfoil hat console warrior with this crap. Shame.

So he's basically calling Cerny a liar.
 

jroc74

Member
Oct 27, 2017
28,996
I wonder why Sony has not shared how many GB of RAM will be used for games. Maybe it's not yet decided?
I'm curious about this too. I think it's info that will come with a proper reveal. That GDC stream was just that: for GDC.

A proper consumer reveal should be coming sometime before launch. With features, UI, etc.

Interesting discussion about the split memory, That was one of the first things that caught my eye with the specs reveal.

Like it's been said, these consoles will have compromises. It's normal.

About that Sams tweet and that previous article. War never changes.
 

Sprat

Member
Oct 27, 2017
4,684
England
Is that Tempest audio engine thing a discrete chip or something, or is it taking up silicon space in the main APU by sharing resources of the GPU or whatever?

In either case, I find it concerning that Sony would waste R&D resources and/or wafer acreage on audio matters, when that isn't even what the mainstream audience is going to care about as much as extra graphics power.

Could this audio engine investment be Sony's equivalent to Xbox One's Kinect camera and eSRAM follies/blunder? Are they not jumping through all these other hoops now, like variable frequency, just to contain costs and power budget? Sounds like something that could've been avoidable.

It would seem that Cerny's only ace in the hole really is this super fast SSD that makes it a Super PlayStation, and it's the only thing keeping me in the PlayStation camp; but I feel like this is going to be dangerously close to PS3 all over again where it will be an exclusives machine again and the exclusives are never even all that good because they don't have Zelda nor Mario.

So they have an ace in the hole, but their saving grace will be to not fuck up the marketing (by continuing to focus on games, games, games). Let's hope Sony can pull it off by just maintaining their current course. Just let Cerny continue to do all the talking, and lock down that Activision/Call-of-Duty marketing.
The audio stuff is the only thing interesting to me about the PS5 so far. It uses a modified CU that works similarly to a Cell SPU.
 

Dimajjio

Member
Oct 13, 2019
782
I thought the reason for the Xbox memory setup was signalling issues? They couldn't get the full 16GB working at full bandwidth or something.
 

Lady Gaia

Member
Oct 27, 2017
2,479
Seattle
I still don't know about the overall cost of CPU memory access: are we sure they use 32-bit buses for all six 2GB ICs?

I think that's clear for both consoles, as it lines up with the published specifications. On PS5 it's an 8 * 32 = 256-bit wide data bus; on Series X it's 10 * 32 = 320 bits wide for the 10GB, and 6 * 32 = 192 bits wide for the remaining 6GB. There's no ambiguity there that I've seen.

PS4 was a RAM config and motherboard change, not a simple chip swap-out.

I'm fairly certain that wasn't the case. The dev kits used the same motherboard layout with 8GB; it was just a question of volume economics that made the decision for retail units difficult to commit to. Cerny's description of how it played out suggests it wasn't some breakthrough in production or availability, but a matter of arguing that it was necessary to accept the higher BOM to have a compelling product.
 

platocplx

2020 Member Elect
Member
Oct 30, 2017
36,072
idk who this person is or what his previous tweets were. But is it wrong to assume performance figures should probably be based on sustainable clocks or realistic scenarios? Just like MS put forward 560 GB/s of memory bandwidth but can realistically only guarantee at least 224 GB/s to the GPU 100% of the time, and at least ~392 GB/s in CPU-intensive workloads?
It is wrong to assume. Based on Cerny's demo, the PS5 will perform closer to 10.28 TF than to 9.2, and he even said he has liked GPUs at higher frequencies for a while; if I remember correctly, they even had to cap the GPU because it was going to too high a frequency. Anyone who claims the 9.2 number doesn't really understand what's going on.
 

Proven

Banned
Oct 29, 2017
5,841
Era wonders why there are very few actual sources that come onto this site, but then will quickly call anyone that isn't agreeing with them or is defending their favorite company's honor biased, astroturfing, etc.
 

Deleted member 45460

User requested account closure
Banned
Jun 27, 2018
1,492
I don't think that Sony, Microsoft, or Nintendo pay any journalists under the table to run PR for them, and that's a really serious accusation to make (and it's been done repeatedly in this thread).
 

Expy

Member
Oct 26, 2017
9,865
Era wonders why there are very few actual sources that come onto this site but then will quickly call anyone that isn't agreeing with them or defending their favorite companies honor biased, astroturfing, etc etc.
So you're saying Brad Sams knows more than Cerny about the PS5?
 

jroc74

Member
Oct 27, 2017
28,996
Is that Tempest audio engine thing a discrete chip or something, or is it taking up silicon space in the main APU by sharing resources of the GPU or whatever?

In either case, I find it concerning that Sony would waste R&D resources and/or wafer acreage on audio matters, when that isn't even what the mainstream audience is going to care about as much as extra graphics power.

Could this audio engine investment be Sony's equivalent to Xbox One's Kinect camera and eSRAM follies/blunder? Are they not jumping through all these other hoops now, like variable frequency, just to contain costs and power budget? Sounds like something that could've been avoidable.

It would seem that Cerny's only ace in the hole really is this super fast SSD that makes it a Super PlayStation, and it's the only thing keeping me in the PlayStation camp; but I feel like this is going to be dangerously close to PS3 all over again where it will be an exclusives machine again and the exclusives are never even all that good because they don't have Zelda nor Mario.

So they have an ace in the hole, but their saving grace will be to not fuck up the marketing (by continuing to focus on games, games, games). Let's hope Sony can pull it off by just maintaining their current course. Just let Cerny continue to do all the talking, and lock down that Activision/Call-of-Duty marketing.
I'm more convinced Sony is trying to target $399, or at least something cheaper than $499.

Comparing it to Kinect would mean no dev would use it. Like someone tried comparing the SSD to Kinect.

All because teraflops.

I do think they made some decisions because of price and BC. We won't be able to compare anything to Kinect for the XBO until at least 2 years after launch.
 