Overall maximum teraflops for next-gen launch consoles?

  • 8 teraflops

    Votes: 43 1.9%
  • 9 teraflops

    Votes: 56 2.4%
  • 12 teraflops

    Votes: 978 42.5%
  • 14 teraflops

    Votes: 525 22.8%
  • Team ALL THE WAY UP +14 teraflops

    Votes: 491 21.3%
  • 10 teraflops (because for some reason I put 9 instead of 10)

    Votes: 208 9.0%

  • Total voters
    2,301
Status
Not open for further replies.

Lagspike_exe

Banned
Dec 15, 2017
1,974
Bit of a dumb question, do we know the PS5's codename?

Ariel is the rumored Sony codename, while AMD appears to be using Gonzalo for the APU, or whatever they're building for Sony.

I think that MS would have zero problem starting with GDDR6 and moving their mid-gen iteration to HBM3 if needed

This is much easier said than done. Those are veeeery different memory setups that work very differently. It's highly unlikely it would allow straight BC for a same-gen console.
 

thuway

Member
Oct 27, 2017
5,168
Imagine how this system will cost reduce over time compared to a GDDR6 system.

Double-stack HBM2 and quad-channel DDR4 become single-stack HBM3 with dual-channel DDR5.
As was stated before, it's either legit or a meticulously crafted fake.

Digitimes' track record is not 100%, but if that rumor regarding ASE providing packaging for PS5 chips is true, it completely kills the rumor. Even if they didn't use InFO_MS, they could use TSMC's regular packaging services. So to claim that's explicitly not the case implies Digitimes has some knowledge and is not merely guessing.
Pardon my ignorance, but what rumor are you guys talking about?
 

BreakAtmo

Member
Nov 12, 2017
12,824
Australia
1) We are getting to the limits of my knowledge of how modern RAM works - 25 years ago I learned something about it during my computer science university endeavor LOL - but how does that affect memory bandwidth or performance? Fewer cycles to move data from memory to cache or execution units? Less memory bandwidth overhead for read/write operations? Bandwidth consistency?

From what I understand, the RAM is split into banks. When the CPU or GPU uses the RAM for something, it'll use a bank. When they're using all the RAM, it's more likely that the CPU will take over a bank for a small task when the GPU really needs that bank for a larger task. HBM has more banks, so in that same scenario, the CPU and GPU could have one bank each. Someone please correct me if I'm wrong though.
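The contention idea described in this post can be illustrated with a toy simulation. The bank counts and request counts below are made up purely for illustration; real memory controllers schedule accesses far more cleverly than random picks:

```python
import random

def contention_rate(num_banks, cpu_requests, gpu_requests, trials=10_000):
    """Toy model: each agent touches a few random banks per cycle; count
    how often a CPU request lands on a bank the GPU also wants."""
    collisions = 0
    for _ in range(trials):
        cpu_banks = {random.randrange(num_banks) for _ in range(cpu_requests)}
        gpu_banks = {random.randrange(num_banks) for _ in range(gpu_requests)}
        if cpu_banks & gpu_banks:
            collisions += 1
    return collisions / trials

random.seed(0)  # make the toy run repeatable
few = contention_rate(num_banks=16, cpu_requests=2, gpu_requests=4)
many = contention_rate(num_banks=64, cpu_requests=2, gpu_requests=4)
# More banks -> the CPU and GPU step on each other less often.
print(f"16 banks: {few:.2%} collisions, 64 banks: {many:.2%} collisions")
```

Under this (very simplified) model, quadrupling the bank count cuts the collision rate severalfold, which is the intuition behind "HBM has more banks, so the CPU and GPU could have one bank each."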
 

Deleted member 12635

User requested account closure
Banned
Oct 27, 2017
6,198
Germany
From what I understand, the RAM is split into banks. When the CPU or GPU uses the RAM for something, it'll use a bank. When they're using all the RAM, it's more likely that the CPU will take over a bank for a small task when the GPU really needs that bank for a larger task. HBM has more banks, so in that same scenario, the CPU and GPU could have one bank each. Someone please correct me if I'm wrong though.
This is what Wikipedia says about "memory banks".
A memory bank is a logical unit of storage in electronics, which is hardware-dependent. In a computer, the memory bank may be determined by the memory controller along with physical organization of the hardware memory slots. In a typical synchronous dynamic random-access memory (SDRAM) or double data rate synchronous dynamic random-access memory (DDR SDRAM), a bank consists of multiple rows and columns of storage units, and is usually spread out across several chips. In a single read or write operation, only one bank is accessed, therefore the number of bits in a column or a row, per bank and per chip, equals the memory bus width in bits (single channel). The size of a bank is further determined by the number of bits in a column and a row, per chip, multiplied by the number of chips in a bank.

Some computers have several identical memory banks of RAM, and use bank switching to switch between them. Harvard architecture computers have (at least) two very different banks of memory, one for program storage and the other for data storage.

If we further assume that the HBM2 memory is exclusive to the GPU, the structure you suggest doesn't make sense, as the CPU would then only have access to the DDR4 memory and not to the HBM2 memory. If we further add HBCC, you add another layer of complexity, as the memory you are now using with the GPU (exclusively) consists of different-sized memory banks (HBM2 and DDR4 sizes).

That's why I am so skeptical about that specific rumor (it's not about HBM2 as such, it's about the memory setup and split). The fundamentals of how all the components are supposed to work together just don't fit very well, or make things more complex than they need to be.
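For what it's worth, the Wikipedia definition quoted above turns into simple arithmetic. A quick sketch with made-up geometry (these numbers are illustrative only, not any real DRAM part):

```python
def bank_size_bytes(rows, cols, bits_per_location, chips):
    """Per the quoted definition: bank size = bits in a row x column grid
    per chip, multiplied by the number of chips spanning the bank."""
    return rows * cols * bits_per_location * chips // 8  # bits -> bytes

# Hypothetical bank: 32768 rows x 1024 columns, 8 bits per location,
# spread across 8 chips (so the bus width here would be 8 x 8 = 64 bits).
size = bank_size_bytes(rows=32768, cols=1024, bits_per_location=8, chips=8)
print(size // 2**20, "MiB")  # 256 MiB
```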
 
Last edited:

SublimeAnarky

Member
Oct 27, 2017
811
Copenhagen, Denmark
[attached image: 4ydy0Ad.png]

This got a response from hmqgg in the Xbox Studios thread when it came up. Take from it what you will.
 

Gamer17

Banned
Oct 30, 2017
9,399
This got a response from hmqgg in the Xbox Studios thread when it came up. Take from it what you will.
Eh, no one knows exactly what both are doing at the moment, so laughing it off or confirming it doesn't make sense. (Maybe a very few devs who got both dev kits, but those won't post on Reddit for sure, and those dev kits are not finalized.)

Where is the link to this thread?
 
Oct 27, 2017
442
They won't help if they're just repackaged PC hardware.

From the wired article:

The devkit, an early "low-speed" version, is concealed in a big silver tower, with no visible componentry.

Not definitive, but a tower certainly would point to "PC hardware." That said, they may still be using an HBM solution, which would itself be telling.

I am very intrigued by the HBM rumors, and it seems like a very real possibility. But I still think a unified pool is more likely based on Cerny's design history. The speed advantage of HBM2 over GDDR6 also doesn't seem all that dramatic considering the added complexity. Seems in the neighborhood of 570GB/s vs 650GB/s? Maybe there is a variable I am missing here. https://www.anandtech.com/show/12338/samsung-starts-mass-production-of-gddr6-memory
 

kungfuian

Banned
Jan 24, 2018
278
Is it outside the realm of possibility that Sony or Microsoft release their dev kits with specs scaled back to conceal final specs/features from their competitors? Is there a history of such a thing? Would they purposely withhold features to prevent leaks, or as a way to create false leaks?
 
Last edited:
Nov 12, 2017
2,877
This is what Wikipedia says about "memory banks".


If we further assume that the HBM2 memory is exclusive to the GPU, the structure you suggest doesn't make sense, as the CPU would then only have access to the DDR4 memory and not to the HBM2 memory. If we further add HBCC, you add another layer of complexity, as the memory you are now using with the GPU (exclusively) consists of different-sized memory banks (HBM2 and DDR4 sizes).

That's why I am so skeptical about that specific rumor (it's not about HBM2 as such, it's about the memory setup and split). The fundamentals of how all the components are supposed to work together just don't fit very well, or make things more complex than they need to be.
I don't get it either... HBCC has been tested multiple times, and it only starts to give results when the GPU is memory-starved (basically using a fast SSD as cache). Sure, it could work if they're going with just 8GB for the GPU, but I don't think that's enough for next gen... so why not go directly with 16GB of GDDR6? I think that's overall a better solution. That setup adds complexity, no better performance in games that aren't using the whole memory pool, and higher initial cost.
 

Gamer17

Banned
Oct 30, 2017
9,399
Is it outside the realm of possibility that Sony or Microsoft release their dev kits with specs scaled back to conceal final specs/features from their competitors? Is there a history of such a thing? Would they purposely withhold features to prevent leaks, or as a way to create false leaks?
Didn't Sony do exactly that with the PS4? Even first parties didn't know the PS4 had 8GB GDDR5, as the dev kits had 4GB.
 
Feb 1, 2018
5,240
Europe
Didn't Sony do exactly that with the PS4? Even first parties didn't know the PS4 had 8GB GDDR5, as the dev kits had 4GB.
Well, 4GB was the original plan; they just changed it last minute because 8GB was becoming "affordable" (and that kinda won them the generation). So yeah, some changes can be made last minute - clock rates and memory, but nothing really low-level. Usually SDKs are pretty accurate; they need to be, because your partners really rely on them to create their (expensive) software.
 

NippleViking

Member
May 2, 2018
4,481
Are either of those worth giving the time to? Unless the former is a confirmed dev and the latter, whether they have been labelled an insider or not, has actually contributed anything worthwhile...
Always best to take anything next-gen with a handful of salt - and that's even if it comes from the horse's/developer's mouth.

I believe hmqgg has been verified, though their knowledge is solely of the Xbox side, and mainly in hardware. They reaffirmed ray tracing, SSD/NVMe, and Lockhart as the cheaper SKU, and hinted at Project Acoustics and XCloud (and Halo) coming to other platforms beyond just Windows, quite a while before these rumours were widespread.

They've also dropped some hints about Gears 5 and Switch ports (Ori 2, Gears Tactics, Super Lucky's Tale).
 

gofreak

Member
Oct 26, 2017
7,734
I don't get it either... HBCC has been tested multiple times, and it only starts to give results when the GPU is memory-starved (basically using a fast SSD as cache). Sure, it could work if they're going with just 8GB for the GPU, but I don't think that's enough for next gen... so why not go directly with 16GB of GDDR6? I think that's overall a better solution. That setup adds complexity, no better performance in games that aren't using the whole memory pool, and higher initial cost.

The main reason to go with it would be if you don't think 16GB total (12GB excluding the OS) is enough capacity compared to 24GB (20GB ex OS). There may be other advantages in bandwidth behavior - depending on how things are linked together at least - and latency would be better.

The idea of hbm2+ddr4 would be more expensive than 16GB of gddr6, so it would only make sense with some tangible technical benefits, which I think are there, even if it's not necessarily an unequivocal advantage in absolutely all use cases. Now if you were to compare to an alternative of 24GB of GDDR6, it becomes muddier - I think in that case the hbm option depends more on the theory of better cost scaling over time in order to look more attractive.

IMO 16GB GDDR6 seems like a good place to set expectations. The HBM2 setup is fun to speculate on, but there's not really all that much behind the rumour yet.
 

SublimeAnarky

Member
Oct 27, 2017
811
Copenhagen, Denmark
Eh, no one knows exactly what both are doing at the moment, so laughing it off or confirming it doesn't make sense. (Maybe a very few devs who got both dev kits, but those won't post on Reddit for sure, and those dev kits are not finalized.)

Where is the link to this thread?

If you click hmqgg's username in that quote, you should get taken to the specific post in the thread I referenced. If you're having difficulty with that, here it is
 

Deleted member 1589

User requested account closure
Banned
Oct 25, 2017
8,576
hmqgg hinting at Gears Tactics and other MS first-party games coming to Switch is great news. They probably won't make a dent with their console in Japan again, but some of their smaller titles could very well be hits there.

It's something I've been wishing ever since insiders hinted at some sort of partnership.
 
Feb 10, 2018
17,534
Wouldn't 8GB of HBM2 have 484GB/s, but 24GB of GDDR6 have around 864GB/s?

So while HBM2 would have some advantages and GDDR6 would have others, wouldn't they equal each other out?
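Those two figures do fall out of plain bus-width arithmetic, assuming (hypothetically) two 1024-bit HBM2 stacks at ~1.9 Gbps per pin, and a 384-bit GDDR6 bus populated with 18 Gbps chips:

```python
def peak_bandwidth_gbs(bus_width_bits, pin_rate_gbps):
    """Peak bandwidth in GB/s = bus width (bits) x per-pin rate (Gbit/s) / 8."""
    return bus_width_bits * pin_rate_gbps / 8

hbm2 = peak_bandwidth_gbs(bus_width_bits=2 * 1024, pin_rate_gbps=1.89)
gddr6 = peak_bandwidth_gbs(bus_width_bits=384, pin_rate_gbps=18.0)
print(round(hbm2), round(gddr6))  # 484 864
```

So the 864 figure presumes both a wide 384-bit bus and top-speed 18 Gbps chips; a narrower bus or slower chips lowers it proportionally.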
 
Nov 12, 2017
2,877
The main reason to go with it would be if you don't think 16GB total (12GB excluding the OS) is enough capacity compared to 24GB (20GB ex OS). There may be other advantages in bandwidth behavior - depending on how things are linked together at least - and latency would be better.

The idea of hbm2+ddr4 would be more expensive than 16GB of gddr6, so it would only make sense with some tangible technical benefits, which I think are there, even if it's not necessarily an unequivocal advantage in absolutely all use cases. Now if you were to compare to an alternative of 24GB of GDDR6, it becomes muddier - I think in that case the hbm option depends more on the theory of better cost scaling over time in order to look more attractive.

IMO 16GB GDDR6 seems like a good place to set expectations. The HBM2 setup is fun to speculate on, but there's not really all that much behind the rumour yet.
Well, exactly. I'll set my expectations for both systems at 16GB GDDR6 and maybe some DDR4 for the OS.
Would 16GB GDDR6 + 8GB DDR4 add too much to the TDP?
 
Feb 1, 2018
5,240
Europe
An "exotic" memory architecture might be a real problem for BC. Will be interesting to see how Sony solves this. DF videos are going to be very interesting the first year.
 
Last edited:
Oct 26, 2017
6,151
United Kingdom
This got a response from hmqgg in the Xbox Studios thread when it came up. Take from it what you will.

Tbf, hmqgg is an MS insider who admittedly knows very little about PS5. In which case, I would take any post from him/her that is not purely about MS/Xbox with a hefty heap of salt.

An "exotic" memory architecture might be a real problem for BC. Will be interesting to see how Sony solves this. DF videos are going to be very interesting the first year.

Why would it be down to Sony to care about how games run on PC?

That's a problem for developers to work out. I doubt it would be too much of an issue, however.

The custom SSD solution, however, would be far more impactful in determining how games made also to run on PC will operate.
 
Feb 1, 2018
5,240
Europe
Tbf, hmqgg is an MS insider who admittedly knows very little about PS5. In which case, I would take any post from him/her that is not purely about MS/Xbox with a hefty heap of salt.



Why would it be down to Sony to care about how games run on PC?

That's a problem for developers to work out. I doubt it would be too much of an issue, however.

The custom SSD solution, however, would be far more impactful in determining how games made also to run on PC will operate.
BC not PC ;)
 

dgrdsv

Member
Oct 25, 2017
11,843
An "exotic" memory architecture might be a real problem for BC. Will be interesting to see how Sony solves this. DF videos are going to be very interesting the first year.
There's nothing "exotic" about using a fast SSD as a caching step between RAM and HDD/BD storage. If it's done through GPU/APU virtual memory, aka HBCC, then it's basically transparent to software, although in the case of a gaming console I'd imagine that having some degree of control over this would be preferable.

The bigger question is what gain this will actually bring Sony, as thus far such SSD caches are only useful in HPC applications that need to work with very big data sets which can't fit into (V)RAM even on top-end 32/64 GB HPC cards. I'm not sure this will provide anything of value for a typical gaming workload - while it's very clear that the inclusion of such an SSD in a console will affect its pricing in a big way. Do I need to remind anyone how badly the ESRAM idea worked out with the XBO? Weird memory solutions tend to drive costs up while not providing many actual benefits in the gaming space.
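As a sketch of the "caching step" idea: treat the SSD as another tier between RAM and the HDD, promoting hot blocks upward. This is a toy LRU model with invented latency numbers, purely to show the tiering, not how a real HBCC-style controller works:

```python
from collections import OrderedDict

class TieredStore:
    """Toy RAM -> SSD -> HDD hierarchy with LRU spill from RAM to SSD.
    Latencies are illustrative orders of magnitude only."""
    LATENCY_US = {"ram": 0.1, "ssd": 100.0, "hdd": 10_000.0}

    def __init__(self, ram_slots, ssd_slots):
        self.ram = OrderedDict()  # most-recently-used keys at the end
        self.ssd = OrderedDict()
        self.ram_slots = ram_slots
        self.ssd_slots = ssd_slots

    def read(self, key):
        """Return the simulated access latency in microseconds."""
        if key in self.ram:
            self.ram.move_to_end(key)
            return self.LATENCY_US["ram"]
        tier = "ssd" if key in self.ssd else "hdd"
        self._promote(key)
        return self.LATENCY_US[tier]

    def _promote(self, key):
        # Pull the block into RAM; spill the least-recently-used one to SSD.
        self.ssd.pop(key, None)
        if len(self.ram) >= self.ram_slots:
            evicted, _ = self.ram.popitem(last=False)
            self.ssd[evicted] = None
            while len(self.ssd) > self.ssd_slots:
                self.ssd.popitem(last=False)
        self.ram[key] = None

store = TieredStore(ram_slots=2, ssd_slots=4)
cold = store.read("texture_a")   # first touch: HDD latency
warm = store.read("texture_a")   # now resident in RAM
print(cold, warm)  # 10000.0 0.1
```

The point of the tier is exactly the second read: anything that gets re-touched stops paying HDD latency. Whether that matters for game workloads (versus the HPC data sets mentioned above) is the open question.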
 
Feb 1, 2018
5,240
Europe
There's nothing "exotic" about using a fast SSD as a caching step between RAM and HDD/BD storage. If it's done through GPU/APU virtual memory, aka HBCC, then it's basically transparent to software, although in the case of a gaming console I'd imagine that having some degree of control over this would be preferable.
If only software were this simple :). My experience with emulation is that every layer adds the potential for wrong timings and/or small errors. Anyway... like you said, if the "exotic" part is limited, there won't be any big issues.
 