> So he says RDNA2 runs a lot hotter than AMD claimed, hence one of the reasons the XSX/PS5 are significantly larger than most other consoles. Does that claim hold any water?
Not that we know of now, at least.
So he says RDNA2 runs a lot hotter than AMD claimed, hence one of the reasons the XSX/PS5 are significantly larger than most other consoles. Does that claim hold any water?
Is this some fucking pixelated comic sans you've just burnt into my retinas, vestan? HOW DARE YOU! :D
Love the header in the OT btw - weirdly, looking at it reminded me of that PlayStation game "Drawn to Death".
Yea, I went digging at the USB-IF website, and the only reference I can find to a USB charger-only port was this one with the battery icon: Logo Usage Guideline PDF
I think the "confusion" stems from the lack of the USB trident, which itself isn't required to show data-transfer ability. Someone claimed the lightning bolt inside a battery icon is a charge-only port, and since this has a lightning bolt as well, it's the same port.
That too. They said PS5 will be compatible with some PS4 accessories, and those use regular USB Type-A ports. So that port is definitely not power only. Could it be a design decision where those 2 ports are USB SS and power delivery, hence the ports are sandwiched between those two symbols? It also means it's more juicy (provides more current than standard).
Some cool stuff about the adaptive triggers talked about in this.
Hair triggers etc seem to be possible, which is very very exciting.
> he suggests both RDNA2 and the PS5 SOC will have some sort of 3D memory stacking
Why would the PS5 need 3D memory stacking?
> So he says RDNA2 runs a lot hotter than AMD claimed, hence one of the reasons the XSX/PS5 are significantly larger than most other consoles. Does that claim hold any water?
Who knows, they need to release them first.
Isn't the PS5 going to run a lot hotter regardless because of the high clock frequency?
Something to chew on: PS5 has a higher pixel fillrate than an RTX 2080Ti (extrapolated from RDNA1).
PS5:
64 ROPs x 2.23 GHz - 142.72 Giga pixels per second
RTX 2080Ti:
88 ROPs x 1.545 GHz - 135.96 Giga pixels per second
RTX Titan:
96 ROPs x 1.770 GHz - 169.92 Giga pixels per second
PS5's bandwidth at 448 GB/s seems too low, so does RDNA2 have any clever compression tech to mitigate this for PS5's framebuffer? High clocks mean high internal cache bandwidth to feed these ROPs, and cache scrubbers/coherency engines minimise stalls and bandwidth usage, but bandwidth to the framebuffer looks like a bottleneck...
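For anyone who wants to sanity-check those figures, the fillrate math is just ROPs times clock; a quick Python sketch using the clocks quoted above:

```python
# Peak pixel fillrate = ROPs x core clock (GHz) -> gigapixels/second.
# ROP counts and clocks are the ones quoted in the post above.
def fillrate_gpix(rops, clock_ghz):
    return rops * clock_ghz

gpus = {
    "PS5": (64, 2.23),
    "RTX 2080 Ti": (88, 1.545),
    "RTX Titan": (96, 1.770),
}
for name, (rops, clk) in gpus.items():
    print(f"{name}: {fillrate_gpix(rops, clk):.2f} Gpix/s")
```

Peak numbers only, of course; sustained fillrate depends on what the ROPs are actually doing.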
> Something to chew on: PS5 has a higher pixel fillrate than an RTX 2080Ti (extrapolated from RDNA1). […]
The 2080 Ti doesn't run at 1.5 GHz. It's likely around 2.0 GHz in game. Nvidia under-reports their clocks.
I've acknowledged those; still, bandwidth looks bottlenecked...
> 2080ti doesn't run at 1.5 ghz. It's likely around 2.0 ghz in game. Nvidia under reports their clocks.
1.545 GHz is the non-Founders Edition 2080 Ti boost clock reported on TechPowerUp... I'm aware of many tweaked SKUs from various manufacturers.
> Yea I went digging at USB IF website and the only thing i can find reference to USB charger only port was this with battery icon. Logo Usage Guideline PDF
> That too. They said PS5 will be compatible with some PS4 accessories and those use regular USB type A ports. So that port is definitely not power only. Could be a case of design decision where those 2 ports are USB SS and power delivery hence the ports are sandwiched between those two symbols?
I too wondered the same thing about the sandwiching icons, but really it's not consistent with standard USB labeling.
> What the heck. Small world. My family is from there. I was there in December. Planning on getting my citizenship there based on birthright.
No shit... finding it hard to believe I have met someone on here that knows of St Kitts, much less comes from here.
> Something to chew on: PS5 has a higher pixel fillrate than an RTX 2080Ti (extrapolated from RDNA1). […]
Unless I am mistaken, doesn't pixel fill rate just mean how many pixels a GPU can output per second? Kinda like saying what is the max resolution that GPU would support at a given framerate. And if you have higher-speed caches, which inherently reduce your need to pull directly from RAM, I believe that also means you can actually make do with a less-than-ideal bandwidth. As Cerny even pointed out, RAM is 33% further away.
> So he says RDNA2 runs a lot hotter than AMD claimed, hence one of the reasons the XSX/PS5 are significantly larger than most other consoles. Does that claim hold any water?
Where did AMD say RDNA2 runs cooler? They said it's more efficient, which in this case could just mean that it can hit certain clocks using less power than RDNA1. But power is power; if you push power into a chip, it's gonna generate heat.
> Has SMT not been confirmed for the CPU? That doesn't make sense. If the Series X has it, it's got to be a standard hardware feature. PS5 not having it makes no sense. Unless they purposely disabled it for some reason? Does SMT cause overheating of the CPU? I wonder what's going on here.
> EDIT: Or perhaps SMT doesn't really improve performance by much when your CPU already has 8 cores? So Sony decided to disable the feature, maybe.
Where are you getting that it's not been confirmed for the PS5? It has.
> Has SMT not been confirmed for the CPU? […]
SMT was confirmed a long time ago. Moreover, it's in the official specs posted on the PS blog right after Cerny's GDC presentation.
> Unless I am mistaken doesn't pixel fill rate just mean how many pixels a GPU can output/second. […]
Pixel fillrate is how many pixels you can write to a framebuffer, which is then displayed. The ROPs responsible for this do more than write pixels, such as blending and anti-aliasing. This all consumes bandwidth. I've acknowledged the higher internal cache bandwidth that feeds these ROPs, but the fillrate is disproportionate to the memory bandwidth. So I'm wondering if RDNA2 has any clever compression tech for its ROPs...
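To put a rough number on that disproportion: assuming plain 4-byte (RGBA8) pixel writes, no blending reads, and zero compression savings (all simplifying assumptions; real workloads differ a lot), peak fillrate alone would want more than the whole 448 GB/s:

```python
# Naive framebuffer write bandwidth needed to sustain peak fillrate.
# Assumes 4 bytes per pixel (RGBA8) and no compression - an upper-bound
# illustration, not how real render workloads behave.
fillrate_gpix = 142.72        # PS5 peak, from the numbers above
bytes_per_pixel = 4           # assumed RGBA8 framebuffer
needed_gbs = fillrate_gpix * bytes_per_pixel
print(f"{needed_gbs:.2f} GB/s needed vs 448 GB/s available")
```

Which is exactly why some form of framebuffer compression (as in delta color compression) would matter here.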
> No shit... findin it hard to believe I have met someone on here that knows of st Kitts such less comes from here
Same feeling lmao! Don't get me started. I was gonna be back for the music festival before all the COVID. Stay safe!
> Of that 16GB GDDR6, I am most curious to see whether PS5 can match XSX's 2.5GB OS footprint. I wonder if Sony would employ the same strategy they did with PS4 Pro - integrate a cache of LPDDR4/DDR4 RAM for offloading non-gaming functions whilst still being relatively brisk for accessing between games and said apps.
Honestly, I think they can, based on them not having to deal with anything Windows-based. It should be about the same footprint. Or they could use the SSD to help with the footprint. If I'm not mistaken, their OS has been Linux-based.
> Has Coreteks ever been correct in a prediction? His Ampere video predicting that Nvidia has moved all of the RT functionality to a bespoke chip on the opposite side of the card seems strange to me. I would have assumed that if that method was preferable, they wouldn't have bundled the RT cores with the SMs for Turing. Even AMD is following the Turing implementation of RT.
> His theory for the PS5 using stacked memory seems strange to me as well. If that was a fundamental design of RDNA2, why wouldn't the Series X use it? If memory latency drives the performance efficiency gains of RDNA2, it just doesn't make sense why there would be any RDNA2 designs that wouldn't use stacked memory.
I thought he was suggesting 3D stacked cache, not the memory (GDDR6) itself? Maybe I wasn't paying enough attention.
> And if you have higher speed caches which inherently reduce your need to pull directly from RAM, I believe that also mean you can actually make do with a less than ideal bandwidth. As Cerny even pointed out, RAM is 33% away
Because RAM is effectively further away with higher clocks, cache misses hurt more than on a GPU with slower clocks. Combined with the lower bandwidth, which makes RAM even further away, that's probably part of the reasoning for the cache scrubbers.
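The "further away" point is easy to make concrete: a miss with a fixed wall-clock latency costs more GPU cycles the faster the chip runs. The latency figure and the slower comparison clock below are made-up illustrative values, not real PS5/RDNA numbers:

```python
# Cycles stalled per cache miss = miss latency (ns) x GPU clock (GHz).
# 250 ns and the 1.67 GHz comparison clock are hypothetical values,
# chosen only to illustrate why higher clocks make misses costlier.
def stall_cycles(latency_ns, clock_ghz):
    return latency_ns * clock_ghz

print(stall_cycles(250, 2.23))  # cycles lost at PS5's 2.23 GHz
print(stall_cycles(250, 1.67))  # cycles lost at a slower clock
```

Same nanoseconds of latency, roughly a third more cycles wasted at the higher clock, which is where the "33%" framing comes from.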
> Honestly I think they can based on them not having to deal with anything windows based. […]
And yet the X1 and PS4 had the exact same amount reserved for the OS - 3GB - and the X1 had a Frankenstein of an OS. I wouldn't presume anything regarding OS footprint; we will just have to wait and see. My assumption is that if the footprint were super small and a big advantage, we would have heard about it by now. We already got a PS5 deep-dive in March, and Sony already knew MS's footprint is 2.5GB.
> Thanks.
The funny thing is, the PS5 SSD is almost as fast as the slowest DDR3 RAM out there (around 6GB/s). I have no doubt in my mind that Sony intends to use their SSD to drastically reduce their OS RAM footprint. I would be shocked if the OS RAM reserve is anything more than 2GB, and I only see it being that much because that would be Sony trying to be safe.
The notion of exploiting the SSD to offload portions of the OS has crossed my mind, and I can see it for menus that are buried within other menus at best. I think it may well have contributed to XSX's OS footprint being 2.5GB instead of the One's 3GB. Either that, or XSX is also employing a lower-speed DRAM cache for non-gaming-related programs.
Pertaining to Windows or FreeBSD, lest it be forgotten, the X360 had a footprint of a measly 32MB when PS3's was 120MB at launch, which was eventually reduced to 50MB. And for the current gen, both platforms dedicated 3GB of RAM to OS functions. Credit where it is due, MS's engineers did a fantastic job and pushed Sony's to iterate for the better. My fingers are crossed for PS5 being able to match XSX's OS footprint.
Thank you! Wasn't sure if it was or not. Being Unix-based would suggest they have a lot more granularity in what they can do. I see both the PS3 and PS4 use it.
> And yet X1 and PS4 had the exact same amount reserved for the OS - 3GB […]
Yeah, so at minimum they would be equal.
> Pertaining to Windows or FreeBSD, lest it be forgotten X360 had a footprint of measly 32MB […]
Yeah, I see the Xbox One is Windows 10-based; the 360 was based on Windows 2000.
> Because RAM is further away with higher clocks, cache misses hurt more than a GPU with slower clocks. […]
Yes, that was what I was insinuating when I said faster caches would mean you could make do with a less-than-ideal bandwidth. I just forgot to mention the cache scrubbers.
Where are you getting that it's not been confirmed for the PS5? It has.
Inside PlayStation 5: the specs and the tech that deliver Sony's next-gen vision (www.eurogamer.net)
> SMT has been confirmed a long time ago. Moreover it's in the official specs posted on the PS blog right after Cerny's GDC presentation.
Oh. Good to know. Thanks for the replies.
> And as for the OS footprint, honestly if all we have to go on is what has or hasn't been announced, then that's not really anything, considering how tight-lipped Sony has been about everything of late.
Yeah, we may not know until the UI reveal.
The funny thing is, the PS5 SSD is almost as fast as the slowest DDR3 RAM out there (around 6GB/s). I have no doubt in my mind that Sony intends to use their SSD to drastically reduce their OS RAM footprint. I would be shocked if the OS RAM reserve is anything more than 2GB, and I only see it being that much because that would be Sony trying to be safe.
And at least with the PS5, whose SSD is almost as fast as DDR3, they shouldn't need extra DDR3 RAM to offload non-essential programs. They can just move all that stuff to the SSD when not in use.
> Yes that was what I was insinuating when I said faster caches would mean you could make do with a less than ideal bandwidth. I just forgot to mention the cache scrubbers.
These reveals are led by marketing people; usually, when one of them has something to brag about, they do. When Sony talks a lot about the SSD and Tempest while MS talks a lot about BC and TF, there is a reason for that. Each of them knows their strengths and weaknesses. It's not a coincidence that we know a lot about MS's BC and very little about Sony's, while we know a lot about Sony's I/O block and not much about MS's. Emphasize your advantages and hide your disadvantages; that's marketing 101.
And as for the OS footprint, honestly, if all we have to go on is what has or hasn't been announced, then that's not really anything, considering how tight-lipped Sony has been about everything of late.
> It should be noted that NAND chips include some out-of-band (addressable by controller only) additional memory to deal with failing cells. Some Micron 64 GiB chips are actually 67.2 GiB, for example. This means that the whole 825 GB capacity will be available to the PS5 operating system (which may however reserve some bits for its own use).
All NAND modules have that extra headroom, but still, every SSD reserves ~10% additional space for failing blocks, yields, and overprovisioning. If 100% of the 64GiB per module will be available to the OS, that will be a first for me. I don't remember any SSDs that have 100% of the module (ignoring the extra space) available to the OS. Do you have examples?
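For what it's worth, the 825 GB figure does fall straight out of the module math, assuming the 12-channel flash controller Cerny described and the 64 GiB modules mentioned above:

```python
# Raw flash capacity: 12 channels x 64 GiB per module, converted from
# binary GiB to the decimal GB used on spec sheets. The 12-channel
# figure is from Cerny's presentation; 64 GiB per module is assumed
# from the discussion above.
channels = 12
gib_per_module = 64
raw_bytes = channels * gib_per_module * 2**30
print(raw_bytes / 1e9)  # decimal gigabytes
```

That comes out to roughly 824.6 decimal GB, marketed as 825 GB, i.e. the full 768 GiB of raw flash exposed with no ~7% GiB-vs-GB "free" overprovision like a 800GB-labeled drive would have.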
> I thought he was suggesting 3d stacked cache, not the memory(gddr6) itself? Maybe I wasn't paying enough attention.
I'm pretty sure he said NAND flash as a GPU cache to reduce latency, which is obviously bonkers. The tidbits he was basing it on might point to an M.2 slot for an SSD on the video card, but that would still have nothing to do with memory latency.
Either way, I think he's off his rocker in the video.
> I hope this turns out to be true, but I wonder if latency (SSDs operating in microseconds as opposed to the nanoseconds of DRAM) is the bigger factor at play when it comes to switching between gaming, ancillary and non-gaming functions. Plus, given how Sony's patent calls for data fetched from the SSD to be in MB rather than KB (purpose-built for gaming after all, and potentially using a DRAM-less controller):
> https://www.resetera.com/threads/pl...ve-ot-secret-agent-cerny.175780/post-37047595
> It may end up being more efficient to have RAM be in charge of the majority of OS functions.
Naaaa... I'm not saying that the SSD would be used as RAM. That's generally just a BS statement for anyone to make, because of latency, as you correctly pointed out. But it can be used in such a way that it accentuates your RAM usage.
> Naaaa... I'm not saying that the SSD would be used as RAM. […] But it can be used in such a way that it accentuates your RAM usage.
You can't do much with a 5.5GB/s drive that you can't do with a 0.5GB/s drive in an OS scenario; the difference in reserved space will come down to the fact that they are two distinct OSs developed by two different companies with different goals in mind. Also, dumping some of the OS for a game and then dumping some of the game for the OS, while you can hot-swap back and forth between them, sounds like one hell of a complicated system. What do you even dump out of the game? And who chooses what data gets dumped?
With a 2GB OS reserve, the PS5 can keep in RAM essential portions of the OS, so basically system programs and UI and a kinda snapshot of the OS state are all in that 2GB. Then when switching from a game to the OS proper, just like it does with games, it can move in the rest of the OS into RAM. This is kinda a big deal.
With the PS4, at any given point in time, as long as a game was running, the OS could never use anything more than its 3GB reserve. Any app that required more than that to run would force you to close the game app. On the PS5, the OS reserve is just that, an OS reserve, and when switching between a game (where the OS just sits in the reserve) to the full OS, a larger portion of the RAM could be utilized for the OS, say as much as 6GB+ if needed.
MS is probably doing the same thing. So their 2.4GB/s SSD allows them to have an OS reserve that is only 2.5GB. But in the second it takes you to switch from a game to the home screen (or during the transition) the system would reload the rest of the OS. Now, what does having a 5.5GB/s SSD mean in this case?
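A back-of-envelope answer to that last question: the SSD speed just sets how long the game-to-OS transition could take. The 4 GB "rest of the OS" size below is a made-up example, not a known footprint:

```python
# Time to stream a non-resident OS portion back into RAM during the
# game -> home screen transition. The 4 GB portion size is a
# hypothetical example; only the SSD speeds are from the spec sheets.
def reload_seconds(size_gb, ssd_gbs):
    return size_gb / ssd_gbs

print(f"PS5 (5.5 GB/s): {reload_seconds(4, 5.5):.2f} s")
print(f"XSX (2.4 GB/s): {reload_seconds(4, 2.4):.2f} s")
```

So under a second versus nearly two for the same hypothetical chunk, which is a nicer transition but not a different capability, as the post above argues.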
Honestly, I just want a proper full-blown chrome browser or something on the PS5.
Nice to see a continuation of the topic. Not long until the first hints of PS6...
> I'm going to miss these threads. They've been my favourite since the very first one.
Honestly, I learned a lot in the short time.