
SeanMN

Member
Oct 28, 2017
2,185
Early December I fully expect Scarlett specs alongside PS5.
Keep in mind that other than the initial pre-E3 2016 leak of a ~6 TF Xbox console, no additional details on Scorpio/1X leaked beyond what was in the E3 2016 announcement. The full specs were disclosed by DF in April 2017. Then again, that wasn't a full new generation, so perhaps there didn't need to be as much information provided, or devkits weren't delivered until spring 2017. Any additional info you can share on this, Albert Penello?

Tom Warren said Xbox was being clever about preventing leaks with Scarlett. Last gen, other than the mostly accurate but at-the-time dismissed pastebin (1.2 TF GPU), Durango didn't leak until January 2013. And if Scarlett's SOC follows a similar timeline to Scorpio's, keep in mind they didn't even get that chip back and booted for the first time until December 2016 (according to Albert), with devkits based on it arriving in the spring of 2017. It's likely that Xbox won't need to give devs any additional info on Scarlett (beyond what is currently out there) until those near-final kits ship out.

I think we'll be leak free till the new year.
 

Albert Penello

Verified
Nov 2, 2017
320
Redmond, WA
It's a different scenario with a full generation change than with Scorpio. There was really no downside to sharing, because they knew the numbers (specs) were good and the price was going to be controversial, but it was never the intent for Scorpio to be anything other than a premium console.

I'm not sure either side (MS or Sony) has a clear idea what the other is doing, so I think both will be more cautious about what they share this time. That's my guess.
 

JediTimeBoy

Member
Oct 27, 2017
6,810
Sorry if this sounds stupid, but when people say next-gen will have SSDs, will it be something like a 2.5" drive, or more likely NVMe? I'm guessing 2.5" due to cost.
 

gofreak

Member
Oct 26, 2017
7,734
They already said it would be NVME

Who did?

For Sony, they say it's a custom SSD. It could be anything - maybe NVMe, maybe a totally custom bus and interface.

For Scarlett, I'm not sure they confirmed what tech their SSD is using - they said in the E3 video 'we've created a new generation of SSD', and Digital Foundry says they've heard it's also a custom unit. (Of course a custom unit could use NVMe, but... did MS confirm that?)
 
Last edited:

Deleted member 1589

User requested account closure
Banned
Oct 25, 2017
8,576
Who did?

For Sony, they say it's a custom SSD. It could be anything, might even be a totally custom bus and interface.

For Scarlett, I'm not sure they confirmed what tech their SSD is using - they said in the E3 video 'we've built a next generation of SSD', and Digital Foundry says they've heard it's also a custom unit. (Of course a custom unit could use NVMe, but... did MS confirm that?)
You're right. Officially we don't know. Only rumors from Klobrille and... that other insider. Hsmigg?
 

AegonSnake

Banned
Oct 25, 2017
9,566
I've said this before, but perhaps it's worth repeating: for the specifications that people are speculating about, I can see where $499 makes sense.

However, I believe some of these to be optimistic. And there is actually nothing that has been disclosed officially by either Sony or Microsoft that precludes $399. We do not know the TFLOPS. We don't know the RAM (amount) and we don't know the size of the SSD. Each of these would contribute greatly to the overall BOM of the console.

So just simply based on the official statements from both Sony and Microsoft, $399 is not out of the question.

As for the Xbox One X - I happen to know a little about that. Part of what drove $499 for the X was simply the knowledge that it would be a smaller part of the overall mix. Meaning, what drove the cost was that it wasn't going to be the volume leader. Knowing that, it allowed MS to spend more on increasing the amount of memory, better cooling solution, etc. So the extra $100 was used to make the device even more premium. It would have been possible to do a version of the X at $399, but it would have meant compromises in other ways. So you could have gotten something close to the X for $399 if you were willing to give up a few things. This is what Sony decided to do and how they landed the Pro at $399.

I'm not saying that I know anything about what Sony is doing for sure. If they launch at $499 I would be surprised but I'm not suggesting it's impossible or crazy or bad or the rumors are wrong. It's just my guess that they will make some tradeoffs in order to get to a $399 sales price because it's been very successful for them and I happen to know it's not out of the question.

So to answer your question directly: Yes I do. It's certainly possible. I guess we will have to wait and see.
While I think $399 is unlikely, I think it might be possible if we assume a few things:

- A 1 TB SSD costs the same as a 500 GB HDD did in 2013.
- 16 GB of GDDR6 costs the same as 8 GB of GDDR5 did in 2013.
- A UHD drive costs the same as a Blu-ray drive did in 2013.

This leaves us with a $381 BOM, including a weakish $100 budget for the SOC. I think this would be your ~8 TFLOPS console.

[Image: IHS Xbox One teardown cost breakdown]
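
A minimal sketch of that back-of-the-envelope sum (the individual line items below are my own illustrative guesses, not IHS teardown figures; only the ~$381 total and the $100 SOC budget follow the post):

```python
# Illustrative next-gen BOM under the assumptions above. Every line item is
# a guess; only the ~$381 total and the $100 SOC budget track the post.
parts = {
    "SOC (the 'weakish' $100 budget)": 100,
    "1TB SSD (priced like a 500GB HDD in 2013)": 37,
    "16GB GDDR6 (priced like 8GB GDDR5 in 2013)": 88,
    "UHD drive (priced like a Blu-ray drive in 2013)": 28,
    "PSU, cooling, board, enclosure, controller, assembly": 128,
}
print(sum(parts.values()))  # 381 -> the ~8 TFLOPS scenario described above
```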


Where I disagree with you with regard to performance matching 2013 levels is that Sony and MS are now competing at the same price and performance. They will both be willing to take a loss in order to have the more powerful console. If that means going with a $150 APU, then fine: sell for a $30 loss that you can make up with a few first-party game sales and services. If they go with a $150 APU, then they will have to invest in a premium cooling solution like the vapor-chamber cooling you guys used in the X1X. I don't think that's going to cost much more than $20 on top of the standard cooling the PS4 used in 2013.

So that gives us a $450 BOM for a console whose APU costs 50% more. With 7nm costs being much higher than the 28nm fab process, it's possible that $150 gets us a ~350 mm² die, the same size as the PS4's. That is still a 50-60 CU GPU, >10 TFLOPS.

Again, that's assuming the SSD, VRAM and optical drive prices remain the same, which is very unlikely. 1 TB HDDs were selling for $50-60 back then and Sony went with a 0.5 TB HDD for $37 a pop. A 1 TB SSD is going for over $110 right now.
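
To make the arithmetic above concrete, here is a rough sketch: the TFLOPS numbers use the standard peak-FP32 formula (CUs x 64 lanes x 2 ops per clock), and the CU counts and clocks are purely hypothetical.

```python
def gpu_tflops(cus: int, clock_ghz: float) -> float:
    """Peak FP32 throughput: CUs x 64 shader lanes x 2 ops (FMA) per clock."""
    return cus * 64 * 2 * clock_ghz / 1000.0

# Hypothetical configurations for a ~350 mm^2-class APU:
for cus, clk in [(50, 1.6), (56, 1.7), (60, 1.8)]:
    print(f"{cus} CUs @ {clk} GHz -> {gpu_tflops(cus, clk):.2f} TFLOPS")
    # 10.24, 12.19 and 13.82 TFLOPS respectively

# The upgraded BOM from the post: the $381 base, minus the $100 SOC,
# plus a $150 APU and ~$20 of vapor-chamber cooling.
print(381 - 100 + 150 + 20)  # 451 -> the "~$450 BOM", >10 TFLOPS scenario
```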
 
Jun 23, 2019
6,446
It's a different scenario with a full generation change than with Scorpio. There was really no downside to sharing, because they knew the numbers (specs) were good and the price was going to be controversial, but it was never the intent for Scorpio to be anything other than a premium console.

I'm not sure either side (MS or Sony) has a clear idea what the other is doing, so I think both will be more cautious about what they share this time. That's my guess.

A pretty smart way to look at it. MS and Sony aren't dumb, and you yourself can attest to that, Albert, from your time at MS. These companies are not ignorant of social media and the news. They are reading these forums, seeing rumors, etc., the same way we are. Sony knew exactly what they were doing announcing the PS5 in Wired. It's all orchestrated to raise the hype levels. It would take some catastrophic level of leaks to get any company to respond at this point in time.
 

Deleted member 61179

User requested account closure
Banned
Nov 7, 2019
121
So I stole this post from beyond3d and it may be interesting for some:

"I am reposting this again here so it can have a better exposure.

According to an obscure discussion thread on Anandtech's forum, a member who is known to be a developer made some posts about the hardware RT in consoles. He said that consoles will support a 32-bit snorm format and have a more programmable traversal stage, which he claims current DXR hardware lacks, and so it will be at a disadvantage moving forward, requiring new DXR hardware. His points are contested in the thread, though, and he never explained the usefulness of the snorm format, so I don't know about the accuracy of his argument.

There are three problems with the actual implementation.
- It doesn't support the 32-bit snorm format, because the fixed-function hardware is not designed around it. This is a huge limitation that can cost too much performance and memory, and the supported 32-bit float format doesn't really give better results.
- The acceleration structures used are not public, so performance can vary extremely wildly depending on the scene. This needs to be solved.
- The ray traversal stage is extremely limited. It should be programmable.
He also claims that Xbox and PS5 will have different pipelines for RT.
The PS5 has a very different RT solution compared to what is implemented in DXR. Even the Xbox has an upgraded pipeline
https://forums.anandtech.com/threads/ray-tracing-is-in-all-next-gen-consoles.2571546/#post-39954331

He also says that consoles will do custom BVHs."

I can't say too much about this, but the next step will be the custom BVHs.
 

amstradcpc

Member
Oct 27, 2017
1,768
So I stole this post from beyond3d and it may be interesting for some:

"I am reposting this again here so it can have a better exposure.

According to an obscure discussion thread on Anandtech's forum, a member who is known to be a developer made some posts about the hardware RT in consoles. He said that consoles will support a 32-bit snorm format and have a more programmable traversal stage, which he claims current DXR hardware lacks, and so it will be at a disadvantage moving forward, requiring new DXR hardware. His points are contested in the thread, though, and he never explained the usefulness of the snorm format, so I don't know about the accuracy of his argument.


He also claims that Xbox and PS5 will have different pipelines for RT.

https://forums.anandtech.com/threads/ray-tracing-is-in-all-next-gen-consoles.2571546/#post-39954331

He also says that consoles will do custom BVHs."
This sounds like Sony went with PowerVR Wizard.
 

PLASTICA-MAN

Member
Oct 26, 2017
23,573
There won't be a 400-buck PS5, just forget it. The best you can hope for is a 450-buck 1 TB model and another 500-buck 2 TB model; if not, then a 450-buck 500 GB model and a 500-buck 1 TB model.
But most likely we will just get one 500-buck 1 TB model.
 

Chamon

Member
Feb 26, 2019
1,221
For me, the biggest problem with Scarlett being 400€ is what happens to the Xbox One X. It would have to drop its price by a huge amount if they are to coexist in the market.
 

VX1

Member
Oct 28, 2017
7,000
Europe
So I stole this post from beyond3d and it may be interesting for some:

"I am reposting this again here so it can have a better exposure.

According to an obscure discussion thread on Anandtech's forum, a member who is known to be a developer made some posts about the hardware RT in consoles. He said that consoles will support a 32-bit snorm format and have a more programmable traversal stage, which he claims current DXR hardware lacks, and so it will be at a disadvantage moving forward, requiring new DXR hardware. His points are contested in the thread, though, and he never explained the usefulness of the snorm format, so I don't know about the accuracy of his argument.


He also claims that Xbox and PS5 will have different pipelines for RT.

https://forums.anandtech.com/threads/ray-tracing-is-in-all-next-gen-consoles.2571546/#post-39954331

He also says that consoles will do custom BVHs."
anexanhume what does all this mean regarding RT...?
 

AegonSnake

Banned
Oct 25, 2017
9,566
So I stole this post from beyond3d and it may be interesting for some:

"I am reposting this again here so it can have a better exposure.

According to an obscure discussion thread on Anandtech's forum, a member who is known to be a developer made some posts about the hardware RT in consoles. He said that consoles will support a 32-bit snorm format and have a more programmable traversal stage, which he claims current DXR hardware lacks, and so it will be at a disadvantage moving forward, requiring new DXR hardware. His points are contested in the thread, though, and he never explained the usefulness of the snorm format, so I don't know about the accuracy of his argument.


He also claims that Xbox and PS5 will have different pipelines for RT.

https://forums.anandtech.com/threads/ray-tracing-is-in-all-next-gen-consoles.2571546/#post-39954331

He also says that consoles will do custom BVHs."
Eh, this is not the first time we have heard about Sony's RT implementation being different. I am sure the first-party devs will take full advantage of it, but I don't want another Cell fiasco with poorly performing early-gen multiplats.
 

SublimeAnarky

Member
Oct 27, 2017
811
Copenhagen, Denmark
Albert Penello Now that we're reaching the end of the current gen - are you able to share any more insight on what the design process was during the X1 rollout?

I'm specifically curious about when in the process you got a picture of what the PS4 was going to look like, and whether the RAM change Sony allegedly made came to your notice.

I'm also curious whether you think we should place any stock in what we're hearing about the state of current devkits, given both consoles are nearing holiday 2020 releases.
 

disco_potato

Member
Nov 16, 2017
3,145
For me, the biggest problem with Scarlett being 400€ is what happens to the Xbox One X. It would have to drop its price by a huge amount if they are to coexist in the market.

Even if it's $499, the One X needs a lower price. Also, keep in mind the One X has been "readily" available for $350 from early on in its life.
 

PLASTICA-MAN

Member
Oct 26, 2017
23,573
So I stole this post from beyond3d and it may be interesting for some:

"I am reposting this again here so it can have a better exposure.

According to an obscure discussion thread on Anandtech's forum, a member who is known to be a developer made some posts about the hardware RT in consoles. He said that consoles will support a 32-bit snorm format and have a more programmable traversal stage, which he claims current DXR hardware lacks, and so it will be at a disadvantage moving forward, requiring new DXR hardware. His points are contested in the thread, though, and he never explained the usefulness of the snorm format, so I don't know about the accuracy of his argument.


He also claims that Xbox and PS5 will have different pipelines for RT.

https://forums.anandtech.com/threads/ray-tracing-is-in-all-next-gen-consoles.2571546/#post-39954331

He also says that consoles will do custom BVHs."

And you forget this:

So you are saying that PS5 will have better RT performance than an RTX 2080 Ti?
I wouldn't compare these, because the PS5 does this with a different graphics pipeline. If DXR allowed that pipeline, PC hardware could probably get the same speed.

From that wording, regardless of the problems the PS5's RT will have for not supporting DXR, it seems its RT capabilities are more advanced than those of the RTX 2080 Ti. Does that mean the RT tech in the new AMD GPUs is better? It remains to be seen, and it's possibly not out of this world, because the current RTX tech seems to be just a first stab at this feature, and obviously the next RTX 3000 series and AMD's RDNA 2 will be better in that domain. I'm not talking about the overall graphical performance of the RTX 2080 Ti, though, just the RT capabilities; after all, an RTX 2060 performs better in RT than a GTX 1080 Ti because it packs the right hardware, but when it comes to overall graphical performance it's another story. That is what I mean and what could be understood from that wording.
 

Deleted member 1589

User requested account closure
Banned
Oct 25, 2017
8,576
So I stole this post from beyond3d and it may be interesting for some:

"I am reposting this again here so it can have a better exposure.

According to an obscure discussion thread on Anandtech's forum, a member who is known to be a developer made some posts about the hardware RT in consoles. He said that consoles will support a 32-bit snorm format and have a more programmable traversal stage, which he claims current DXR hardware lacks, and so it will be at a disadvantage moving forward, requiring new DXR hardware. His points are contested in the thread, though, and he never explained the usefulness of the snorm format, so I don't know about the accuracy of his argument.


He also claims that Xbox and PS5 will have different pipelines for RT.

https://forums.anandtech.com/threads/ray-tracing-is-in-all-next-gen-consoles.2571546/#post-39954331

He also says that consoles will do custom BVHs."
What do you think, chris 1515?

Seems like something you've theorised.
 

chris 1515

Member
Oct 27, 2017
7,074
Barcelona Spain
anexanhume what does all this mean regarding RT...?


Snorm definition:
Signed normalized integer, meaning that for an n-bit 2's complement number, the maximum value means 1.0f (e.g. the 5-bit value 01111 maps to 1.0f), and the minimum value means -1.0f (e.g. the 5-bit value 10000 maps to -1.0f). In addition, the second-minimum number maps to -1.0f (e.g. the 5-bit value 10001 maps to -1.0f). There are thus two integer representations for -1.0f. There is a single representation for 0.0f, and a single representation for 1.0f. This results in a set of integer representations for evenly spaced floating point values in the range (-1.0f...0.0f), and also a complementary set of representations for numbers in the range (0.0f...1.0f).

32-bit float is one format, and 32-bit snorm is another. If I understand what the guy said, it seems DXR/RTX does not currently support this format, and I suppose the snorm format is better for memory bandwidth and/or more cache-friendly than 32-bit float. This is my own conclusion; maybe it is false. Maybe the guy is not legit at all.
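
As a concrete illustration of the quoted snorm definition, here is a minimal decode sketch (the 5-bit width is used only because the definition's examples are 5-bit; this is generic snorm behaviour, not any console's or DXR's actual code path):

```python
def snorm_decode(raw: int, bits: int) -> float:
    """Decode an n-bit two's-complement SNORM value to float, per the
    definition quoted above: both the minimum and second-minimum codes
    map to -1.0, and the maximum code maps to +1.0."""
    scale = (1 << (bits - 1)) - 1          # e.g. 15 for a 5-bit snorm
    if raw >= 1 << (bits - 1):             # interpret as signed two's complement
        raw -= 1 << bits
    return max(raw / scale, -1.0)

# 5-bit examples straight from the quoted definition:
assert snorm_decode(0b01111, 5) == 1.0    # max value -> 1.0
assert snorm_decode(0b10000, 5) == -1.0   # min value -> -1.0 (clamped)
assert snorm_decode(0b10001, 5) == -1.0   # second-minimum also -> -1.0
assert snorm_decode(0b00000, 5) == 0.0    # single representation of 0.0
```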

And developers not being happy about the BVH being a black box is not new at all, but this is more an RTX problem than a DXR one. Still, RTX is the only RT hardware architecture available on PC or console.

For traversal this is true; devs want more programmability/flexibility, and Intel proposes a solution too:



EDIT: If it is different, I think Sony either chose Imgtec technology or developed its own ray tracing technology.
 
Last edited:

PLASTICA-MAN

Member
Oct 26, 2017
23,573
32-bit float is one format, and 32-bit snorm is another. If I understand what the guy said, it seems DXR does not currently support this format, and I suppose the snorm format is better for memory bandwidth than 32-bit float. This is my own conclusion; maybe it is false. Maybe the guy is not legit at all.

And developers not being happy about the BVH being a black box is not new at all, but this is more an RTX problem than a DXR one. Still, RTX is the only RT hardware architecture available on PC or console.

For traversal this is true; devs want more programmability/flexibility, and Intel proposes a solution too:



EDIT: If it is different, I think Sony either chose Imgtec technology or developed its own ray tracing technology.


Thanks for explaining this. From what I understood, this is why the RT implementation in PS5 and Scarlett seems to even surpass the RTX 2080 Ti, at least until Nvidia and Microsoft update their DXR/RTX implementation. I don't think this will happen software-wise; it could even be related to hardware at this point (like FP16 and RPM), so it can't be circumvented to gain performance. Hence RDNA 2, the next-gen consoles and the next RTX 3000 series will most likely be more optimised and have better performance when running RT, thanks to a better hardware implementation.
 
Last edited:

VX1

Member
Oct 28, 2017
7,000
Europe
32-bit float is one format, and 32-bit snorm is another. If I understand what the guy said, it seems DXR does not currently support this format, and I suppose the snorm format is better for memory bandwidth than 32-bit float. This is my own conclusion; maybe it is false. Maybe the guy is not legit at all.

And developers not being happy about the BVH being a black box is not new at all, but this is more an RTX problem than a DXR one. Still, RTX is the only RT hardware architecture available on PC or console.

For traversal this is true; devs want more programmability/flexibility, and Intel proposes a solution too:



EDIT: If it is different, I think Sony either chose Imgtec technology or developed its own ray tracing technology.

I see.Thanks!
 

Albert Penello

Verified
Nov 2, 2017
320
Redmond, WA
Albert Penello Now that we're reaching the end of the current gen - are you able to share any more insight on what the design process was during the X1 rollout?

I'm specifically curious about when in the process you got a picture of what the PS4 was going to look like, and whether the RAM change Sony allegedly made came to your notice.

I'm also curious whether you think we should place any stock in what we're hearing about the state of current devkits, given both consoles are nearing holiday 2020 releases.

I'm sure there have been lots of interviews about the X1 rollout that cover anything I could say.

Regarding RAM, it's likely I'm going to take a ton of guff for this, but the RAM story is interesting, so here goes (it's very similar to what happened on X360, but in reverse). The Gen 8 spec was going to be 4 GB. That would have been 4x the amount of memory from Gen 7, so that was the target for both consoles based on what we knew. With all the Kinect and media stuff, the team knew that would eat too much memory for games and proposed 8 GB: 2-3 for media, and 5-6 for games. Once the Xbox team had knowledge there was a gap in GPU with PS4, the feeling was that the incremental 1-2 GB for games, while slower, would make up some of the difference. So having, say, 5-6 GB for games in DDR3 could offset having only 3.5 GB of GDDR5 (having half the memory would have meant half the memory bandwidth, too).

Once word got out that X1 had 8 GB (and of course Sony would not have known the type of RAM we were using or for what), we heard they matched to 8. I think that part is pretty undisputed at this point. So that was a bummer for X1 but a super smart call on Sony's part. Of course, the exact opposite happened on X360.

It would be interesting to think what a 4gb PS4 vs. an 8GB X1 battle would have looked like in games, given their advantage in GPU.
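
For context on the bandwidth side of that trade-off, peak bandwidth is just bus width times data rate. A quick sketch plugging in the shipped consoles' public figures, plus a hypothetical half-width GDDR5 bus as a what-if:

```python
def peak_bandwidth_gbps(bus_width_bits: int, data_rate_gtps: float) -> float:
    """Peak memory bandwidth in GB/s = (bus width in bytes) x transfers per second."""
    return bus_width_bits / 8 * data_rate_gtps

print(peak_bandwidth_gbps(256, 5.5))    # PS4: 8GB GDDR5  -> 176 GB/s
print(peak_bandwidth_gbps(256, 2.133))  # X1:  8GB DDR3   -> ~68.3 GB/s (plus eSRAM)
print(peak_bandwidth_gbps(128, 5.5))    # hypothetical half-width GDDR5 bus -> 88 GB/s
```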
 
Jun 23, 2019
6,446
Interesting. No wonder Sony devs were caught off guard at the PS Meeting in 2013. It was a very late call to increase it to 8 GB of GDDR5.
 

gofreak

Member
Oct 26, 2017
7,734
I'm sure there have been lots of interviews about the X1 rollout that cover anything I could say.

Regarding RAM, it's likely I'm going to take a ton of guff for this, but the RAM story is interesting, so here goes (it's very similar to what happened on X360, but in reverse). The Gen 8 spec was going to be 4 GB. That would have been 4x the amount of memory from Gen 7, so that was the target for both consoles based on what we knew. With all the Kinect and media stuff, the team knew that would eat too much memory for games and proposed 8 GB: 2-3 for media, and 5-6 for games. Once the Xbox team had knowledge there was a gap in GPU with PS4, the feeling was that the incremental 1-2 GB for games, while slower, would make up some of the difference. So having, say, 5-6 GB for games in DDR3 could offset having only 3.5 GB of GDDR5 (having half the memory would have meant half the memory bandwidth, too).

Once word got out that X1 had 8 GB (and of course Sony would not have known the type of RAM we were using or for what), we heard they matched to 8. I think that part is pretty undisputed at this point. So that was a bummer for X1 but a super smart call on Sony's part. Of course, the exact opposite happened on X360.

It would be interesting to think what a 4gb PS4 vs. an 8GB X1 battle would have looked like in games, given their advantage in GPU.

Here is what I'm deathly curious about... was that initial 4GB spec going to be GDDR5, or also DDR3?

It was a theory that the decision to go to 8GB for 'other things' motivated the choice of DDR3 rather than GDDR5, which then motivated the choice of eSRAM, which then ate into the GPU silicon budget, which then...

Or was it always going to be DDR3, even when 4GB was the target? That would be kind of puzzling, other than for cost reasons I guess.
 

chris 1515

Member
Oct 27, 2017
7,074
Barcelona Spain

They talk about another technique, BVH refitting, but they compare it to a custom-built BVH.

The last of these items is the most interesting one. To investigate it further, we have also taken several poses for each of the models, and have generated one BVH for each of these poses, as well as the best BVH over time. Each of these BVHs has then been deformed to each of the sampled poses, and the best and worst frame rate have been recorded for each pose. As can be seen from Figure 9, except for BART deforming works well for any initial pose encountered in our experiments: the best BVH over time can avoid BVH deterioration to a certain degree, but even the worst BVH generated by any of the time steps usually is less than 20-30% slower than a custom built BVH. As the scenes deform significantly over time, this small impact of BVH deformation on runtime at first was quite surprising.

I found this paper, and I think a custom-built BVH means a BVH built every frame. Only one technology currently does that: Imgtec's RT technology.


Bounding Volume Hierarchy (BVH)
is a popular ray tracing acceleration technique that uses a tree-based "acceleration structure" that contains multiple hierarchically-arranged bounding boxes (bounding volumes) that encompass or surround different amounts of scene geometry or primitives. Testing each ray against every primitive intersection in the scene is inefficient and computationally expensive, and BVH is one of many techniques and optimizations that can be used to accelerate it. The BVH can be organized in different types of tree structures and each ray only needs to be tested against the BVH using a depth-first tree traversal process instead of against every primitive in the scene. Prior to rendering a scene for the first time, a BVH structure must be created (called BVH building) from source geometry. The next frame will require either a new BVH build operation or a BVH refitting based on scene changes.
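
As a rough sketch of the refit-vs-rebuild distinction described above (a toy example, not any console's or vendor's actual implementation):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class AABB:
    lo: tuple
    hi: tuple

    @staticmethod
    def union(a: "AABB", b: "AABB") -> "AABB":
        return AABB(tuple(map(min, a.lo, b.lo)), tuple(map(max, a.hi, b.hi)))

@dataclass
class Node:
    box: AABB
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    prim: Optional[int] = None        # leaf: index of the primitive it holds

def refit(node: Node, prim_boxes: List[AABB]) -> AABB:
    """BVH refit: keep the tree topology and just recompute the bounds
    bottom-up from the deformed primitives. Cheap per frame, but quality
    degrades as the geometry deforms away from the pose the tree was built
    for (the 20-30% slowdown the quoted paper measures)."""
    if node.prim is not None:
        node.box = prim_boxes[node.prim]
    else:
        node.box = AABB.union(refit(node.left, prim_boxes),
                              refit(node.right, prim_boxes))
    return node.box

# A full per-frame rebuild (the "custom-built BVH" case) would instead rerun
# a construction algorithm (e.g. an SAH or LBVH builder) on prim_boxes every
# frame, restoring tree quality at a much higher per-frame cost.
```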
 

Albert Penello

Verified
Nov 2, 2017
320
Redmond, WA
Here is what I'm deathly curious about... was that initial 4GB spec going to be GDDR5, or also DDR3?

It was a theory that the decision to go to 8GB for 'other things' motivated the choice of DDR3 rather than GDDR5, which then motivated the choice of eSRAM, which then ate into the GPU silicon budget, which then...

Or was it always going to be DDR3, even when 4GB was the target? That would be kind of puzzling, other than for cost reasons I guess.

Memory was really expensive at the time - I can't remember the circumstances, but X360 memory was dirt cheap by the end, so there was a big shock to the system in the cost of DDR3. GDDR5 was CRAZY expensive (like 2x the cost). So for MS at least it was always DDR3. The team believed that 8 GB of DDR3 would be the same cost as 4 GB of GDDR5, so it was a risk, but they felt it would provide better games and all the media stuff would not come at a cost to the developers.

Of course, it didn't pan out that way. A Hynix fire right before launch drove the cost of DDR3 through the roof (eroding the cost advantage), and all the major graphics cards moved quickly to GDDR5 so prices dropped faster than expected. Worked out great for Sony, not so great for Xbox.

I can only speculate what Sony's original plan was for RAM, but my guess is it was always GDDR5.
 

gundamkyoukai

Member
Oct 25, 2017
21,097
Thanks for explaining this. From what I understood, this is why the RT implementation in PS5 and Scarlett seems to even surpass the RTX 2080 Ti, at least until Nvidia and Microsoft update their DXR/RTX implementation. I don't think this will happen software-wise; it could even be related to hardware at this point (like FP16 and RPM), so it can't be circumvented to gain performance. Hence RDNA 2, the next-gen consoles and the next RTX 3000 series will most likely be more optimised and have better performance when running RT, thanks to a better hardware implementation.


It's going to be interesting times ahead for both hardware and software when it comes to RT.
Looking forward to seeing what devs come up with to get as much as possible from the current RT hardware.
 
Feb 10, 2018
17,534
Memory was really expensive at the time - I can't remember the circumstances, but X360 memory was dirt cheap by the end, so there was a big shock to the system in the cost of DDR3. GDDR5 was CRAZY expensive (like 2x the cost). So for MS at least it was always DDR3. The team believed that 8 GB of DDR3 would be the same cost as 4 GB of GDDR5, so it was a risk, but they felt it would provide better games and all the media stuff would not come at a cost to the developers.

Of course, it didn't pan out that way. A Hynix fire right before launch drove the cost of DDR3 through the roof (eroding the cost advantage), and all the major graphics cards moved quickly to GDDR5 so prices dropped faster than expected. Worked out great for Sony, not so great for Xbox.

I can only speculate what Sony's original plan was for RAM, but my guess is it was always GDDR5.

In the end it was luck which greatly affected how things panned out. I wonder what history would be like if DDR3 had retained its price advantage and the PS4 had 4 GB of GDDR5.
 

PLASTICA-MAN

Member
Oct 26, 2017
23,573
It's going to be interesting times ahead for both hardware and software when it comes to RT.
Looking forward to seeing what devs come up with to get as much as possible from the current RT hardware.

Indeed. In fact, I think every company will come up with its own implementation of RT in its games and its own way of doing it. After all, even PBR had different workflows.

Here is World of Tanks with RT, and it doesn't require RT hardware either. Not bad at all:



The game is already sublime on mid-gen consoles with high resolution and 4K textures. It will most likely get updated for next-gen with RT too.
It seems F2P games are more prone to embrace technological advances in this domain, since they are everlasting products compared to closed, shipped SP games.
 
Oct 26, 2017
6,151
United Kingdom
Memory was really expensive at the time - I can't remember the circumstances, but X360 memory was dirt cheap by the end, so there was a big shock to the system in the cost of DDR3. GDDR5 was CRAZY expensive (like 2x the cost). So for MS at least it was always DDR3. The team believed that 8 GB of DDR3 would be the same cost as 4 GB of GDDR5, so it was a risk, but they felt it would provide better games and all the media stuff would not come at a cost to the developers.

Of course, it didn't pan out that way. A Hynix fire right before launch drove the cost of DDR3 through the roof (eroding the cost advantage), and all the major graphics cards moved quickly to GDDR5 so prices dropped faster than expected. Worked out great for Sony, not so great for Xbox.

I can only speculate what Sony's original plan was for RAM, but my guess is it was always GDDR5.

I'm guessing you're right, because the APU has the GDDR5 memory controller on-die; they would have designed it this way from the outset.

Based on what you said about the prices for GDDR5 being 2x, it makes sense that Sony was initially targeting half the capacity.
 

Albert Penello

Verified
Nov 2, 2017
320
Redmond, WA
In the end it was luck which greatly affected how things panned out. I wonder what history would be like if DDR3 had retained its price advantage and the PS4 had 4 GB of GDDR5.

Sony made a fundamentally better decision to start with GDDR5 so I don't want to discount that. All things equal, that was a smarter call.

Some luck played into it (good and bad) for both parties which has happened in every console cycle.
 

Albert Penello

Verified
Nov 2, 2017
320
Redmond, WA
I'm guessing you're right, because the APU has the GDDR5 memory controller on-die; they would have designed it this way from the outset.

Based on what you said about the prices for GDDR5 being 2x, it makes sense that Sony was initially targeting half the capacity.

yeah that makes sense.

I don't want to suggest Sony was targeting half capacity - both consoles were targeting 4gb in the early days.

I think what's more interesting is how there is certain information that gets around (like Xbox had 8 and Sony had 4), but the context around it was missing; I'm not sure they knew we were using DDR3 and that Xbox was allocating much of the extra memory to non-gaming applications.
 

dgrdsv

Member
Oct 25, 2017
11,846
Thanks for explaining this. From what I understood, this is why the RT implementation in PS5 and Scarlett seems to even surpass the RTX 2080 Ti, at least until Nvidia and Microsoft update their DXR/RTX implementation. I don't think this will happen software-wise; it could even be related to hardware at this point (like FP16 and RPM), so it can't be circumvented to gain performance. Hence RDNA 2, the next-gen consoles and the next RTX 3000 series will most likely be more optimised and have better performance when running RT, thanks to a better hardware implementation.
We know nothing about the RT implementations in PS5 and Scarlett, so we can't make any kind of claim about how they compare to a 2080 Ti. But I seriously doubt that either console will have faster RT hardware than even a 2080, let alone a 2080 Ti.
BVH traversal programmability would in fact trade performance for flexibility (allowing you to bypass things which you may just as well bypass with more performance), and thus far it isn't clear that this flexibility would even net you a win in the end over "raw power".
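
To illustrate what "programmable traversal" means here, a purely conceptual sketch (not any vendor's API): the application supplies a per-node callback that can prune the walk, which is exactly the flexibility a fixed-function traversal unit trades away for throughput.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

Box = Tuple[float, float]  # 1D interval (lo, hi) keeps the sketch tiny

@dataclass
class Node:
    box: Box
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    prim: Optional[int] = None        # set on leaves

def traverse(node: Optional[Node], ray: Box,
             visit: Callable[[Node], bool], hits: List[int]) -> None:
    """Depth-first BVH traversal with a programmable per-node 'visit' hook.
    The hook can cull subtrees (by LOD, instance mask, any custom rule);
    fixed-function traversal hardware gives up that hook for raw speed."""
    if node is None or node.box[1] < ray[0] or node.box[0] > ray[1]:
        return                        # no overlap with the ray's interval
    if not visit(node):               # programmable stage: skip this subtree?
        return
    if node.prim is not None:
        hits.append(node.prim)        # leaf: candidate primitive for hit testing
        return
    traverse(node.left, ray, visit, hits)
    traverse(node.right, ray, visit, hits)

# Usage: visit everything (a trivial hook) over a two-leaf toy tree.
leaves = [Node((0, 1), prim=0), Node((2, 3), prim=1)]
root = Node((0, 3), left=leaves[0], right=leaves[1])
hits: List[int] = []
traverse(root, (0.5, 2.5), lambda n: True, hits)
print(hits)  # [0, 1]
```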
 
Oct 26, 2017
6,151
United Kingdom
Sony made a fundamentally better decision to start with GDDR5 so I don't want to discount that. All things equal, that was a smarter call.

Some luck played into it (good and bad) for both parties which has happened in every console cycle.

It was only smarter with the benefit of hindsight.

At the time I can totally see why MS prioritised 8GB memory capacity to begin with. However, I've always wondered why on-die eSRAM was the option chosen to close the bandwidth gap, when as per Mark Cerny's comments at the PS4 reveal, a possible eDRAM solution with >1TB/s bandwidth was available?

Was an eDRAM daughter die, like the 360 had, ever considered? Or were the manufacturing limitations on eDRAM suppliers the nail in the coffin?
 

AegonSnake

Banned
Oct 25, 2017
9,566
Sony made a fundamentally better decision to start with GDDR5 so I don't want to discount that. All things equal, that was a smarter call.

Some luck played into it (good and bad) for both parties which has happened in every console cycle.
Do you think Sony might be going with HBM instead of GDDR6 this time around?

After all, it's more expensive than GDDR6, but they seem to be willing to take risks in hopes of reduced costs further down the line.
 

Albert Penello

Verified
Nov 2, 2017
320
Redmond, WA
Do you think Sony might be going with HBM instead of GDDR6 this time around?

After all, it's more expensive than GDDR6, but they seem to be willing to take risks in hopes of reduced costs further down the line.

Haven't they already confirmed GDDR6? I thought they had. HBM came with a host of other constraints beyond just cost, from what I recall. I'm not going to claim a lot of expertise here.
 

AegonSnake

Banned
Oct 25, 2017
9,566
Haven't they already confirmed GDDR6? I thought they had. HBM came with a host of other constraints beyond just cost, from what I recall. I'm not going to claim a lot of expertise here.
That's the one thing they haven't confirmed, which is very odd. MS was the one that confirmed GDDR6. Sony has confirmed a custom SSD, hardware RT, a Zen 2 8-core/16-thread CPU and a Navi GPU, but won't even talk about RAM, which gives me pause.
 

Albert Penello

Verified
Nov 2, 2017
320
Redmond, WA
It was only smarter with the benefit of hindsight.

At the time I can totally see why MS prioritised 8GB memory capacity to begin with. However, I've always wondered why on-die eSRAM was the option chosen to close the bandwidth gap, when as per Mark Cerny's comments at the PS4 reveal, a possible eDRAM solution with >1TB/s bandwidth was available?

Was an eDRAM daughter die, like the 360 had, ever considered? Or were the manufacturing limitations on eDRAM suppliers the nail in the coffin?

Out of my element on this - I can tell you that I was told it was a natural evolution of the eDRAM which developers liked on X360. People seem to mischaracterize it as a response or a band-aid to using DDR3 but that wasn't the case - it was always in the plan as an evolution of how X360 worked. My understanding is that, when used, it really helped.

As to the architecture differences between eSRAM and eDRAM, talk to the Technical Fellows :)
 