Mar 22, 2020
87
Unless you're severely CPU bottlenecked, a 2080 Ti is more like 35-40% faster than a 2070 SUPER.
https://www.computerbase.de/thema/grafikkarte/rangliste/#diagramm-performancerating-3840-2160
It depends on the resolution you're playing at. A 25% gap between a 2080 and a 2080 Ti at 1080p/1440p becomes a larger 35% gap at 4K. I also think it's wrong to use anything below a 2080 (or at the very least a 2070 SUPER) as a point of comparison for PS5 performance. RDNA2 adds quite a lot, and a 400MHz bump is a very significant one. I wouldn't be surprised if 52 CUs ended up closer to a 2080 Ti, because the clock frequency barely drops compared to the RX 5700 XT. But I agree that CU scaling at the higher end of the scale is completely unknown. If, like GCN (AMD's previous GPU architecture), it scales poorly, then I don't see why there would be massive consequences for enjoying either console.

Missed your other questions, sorry. I think ports will just be a res difference and FPS stability. RT in 3rd party games should also perform better on XBSX: DXR 1.1, VRS, more CUs, etc.
I'm not sure RTRT performance can be confirmed yet. We still have no idea if VRS is unique to the XSX, and we have no idea what Sony will use as an RT API on the PS5, so I couldn't tell whether DX12 Ultimate gives them an edge. Some games show great performance on Vulkan, some don't. VulkanRT hasn't even been benchmarked against other APIs as of today. I'm surprised none of you are mentioning VR; it does make use of more pixels and has a specific requirement on latency (ideally >90fps).
 

Mubrik_

Member
Dec 7, 2017
2,723
Developers already profile code to find the most efficient way to get work done by successive refinement. They already have to deal with the reality that instructions don't execute in trivially predictable amounts of time, as that's just a reality of modern processor architecture and has been for a very long time. Nothing changes in their workflow. Whether it makes you nervous or not is inconsequential, because the people doing this work aren't going to be negatively impacted. It doesn't make their job any more complicated than it is already, because it's axiomatic: you gauge performance by testing your code running in real-world conditions, not based on a simplified theoretical model of how it ought to behave.

Again: I've been doing this kind of work for decades, have worked on substantially similar systems, and have led teams that built profiling tools used to build software you may already use every day. You're creating imaginary problems, and at some point the only possible conclusion is that it's your objective.

He's already convinced himself.

I'd just ignore him tbh
 
Mar 22, 2020
87
[...] because the people doing this work aren't going to be negatively impacted. It doesn't make their job any more complicated than it is already, because it's axiomatic: you gauge performance by testing your code running in real-world conditions, not based on a simplified theoretical model of how it ought to behave.
Two very good points I had to highlight: in both cases it's clear that neither community of developers will suffer more because of the choices made here.
 

M3rcy

Member
Oct 27, 2017
702
I don't know man... just doesn't make sense to me. From its inception, my understanding of a boost clock has been pretty simple. Using examples:
Chip has a minimum clock = 500 MHz
Chip has a normal clock = 1500 MHz
Chip has a boost clock = 1800 MHz

There's your problem. That's not how it works. When running a game, GPUs always try to hit their max boost clock and only drop when power/thermals force them to, and then only by enough to pull those back within limits.
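In pseudo-code, the behaviour being described is roughly this (a toy sketch with made-up numbers, not any vendor's actual algorithm):

```python
# Toy model of modern GPU boost behaviour: the GPU runs at its maximum boost
# clock by default and only backs off when a power/thermal limit is exceeded,
# and then only by as much as needed. All numbers are made up for illustration.

MAX_BOOST_MHZ = 1800
MIN_CLOCK_MHZ = 500
POWER_LIMIT_W = 200
STEP_MHZ = 25

def next_clock(current_mhz: int, measured_power_w: float) -> int:
    """Pick the next clock step: climb toward max boost, drop only on a limit."""
    if measured_power_w > POWER_LIMIT_W:
        # Over budget: step down just enough to get back under the limit.
        return max(current_mhz - STEP_MHZ, MIN_CLOCK_MHZ)
    # Under budget: opportunistically climb back toward max boost.
    return min(current_mhz + STEP_MHZ, MAX_BOOST_MHZ)

# A heavy scene pushes power over the limit; the clock dips slightly, then recovers.
clock = MAX_BOOST_MHZ
for power_w in (180, 210, 205, 190, 170):
    clock = next_clock(clock, power_w)
    print(clock)  # 1800, 1775, 1750, 1775, 1800
```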
 

gozu

Member
Oct 27, 2017
10,315
America
Developers already profile code to find the most efficient way to get work done by successive refinement. They already have to deal with the reality that instructions don't execute in trivially predictable amounts of time, as that's just a reality of modern processor architecture and has been for a very long time. Nothing changes in devs' workflow. Whether it makes you nervous or not is inconsequential, because the people doing this work aren't going to be negatively impacted. It doesn't make their job any more complicated than it is already, because it's axiomatic: you gauge performance by testing your code running in real-world conditions, not based on a simplified theoretical model of how it ought to behave.

Again: I've been doing this kind of work for decades, have worked on substantially similar systems, and have led teams that built profiling tools used to build software you may already use every day. You're creating imaginary problems, and at some point the only possible conclusion is that it's your objective.

AegonSnake has been told this already and ignored it. My hypothesis is that he is reluctant to admit to the hotness of his take. Maybe if I make some of your words giant he will be forced to read them?
 

AegonSnake

Banned
Oct 25, 2017
9,566
AegonSnake has been told this already and ignored it. My hypothesis is that he is reluctant to admit to the hotness of his take. Maybe if I make some of your words giant he will be forced to read them?
lol what's up with this realtime commentary in this thread? Clearly I am having a 1-on-1 discussion with Lady Gaia, and gofreak before that, but the rest of you are like those ex-players sitting in the announcer booth taking cheap potshots at two people having a discussion. It's not like I have ignored anything he's said. I have replied to all of his points and presented my concerns. You don't have to buy them, but this kind of passive-aggressive commentary is no different from what Colbert and his ilk used to do in the other thread.
 

AegonSnake

Banned
Oct 25, 2017
9,566
He's already convinced himself.

I'd just ignore him tbh
On the contrary, I am not convinced of anything. Go back and read my posts, and you will see me concede that we will find out what this variable frequency stuff amounts to come launch.

In the meantime, I had no idea there was a memorandum sent out saying we were not allowed to discuss the potential drawbacks of this new (to consoles, at least) tech. A tech that was described to us in such great detail that we literally only have one sentence from Cerny showing how it works before he moves on to AMD SmartShift. I have watched the GPU portion of that talk many times now, and all I can find is one line from him; the rest is him talking about why they didn't go with fixed clocks and why they went with higher clocks. There is a lot that we simply don't know. A few demos here and there, some performance charts, and some benchmarks would've helped. AMD did it at Computex; they had them in their press release. This is literally all speculation.

It's bizarre because even Cerny talks about the worst case scenario, but sure let's all turn on AegonSnake for daring to talk about worst case scenarios. Apparently, my "objectives" have changed after nearly two years in these threads. I was a double agent working for the other side all along. Colbert is my partner and secret lover, and I took part in the github hack with Anthony Hopkins at my side. I ditched him when he got caught, but was able to escape just in time to report my findings to our boss Albert Penello.
 

Pheonix

Banned
Dec 14, 2018
5,990
St Kitts
That profile is exactly what I am talking about. The profile will be 2.0GHz or whatever the most consistent clocks are that don't cause the GPU to downgrade clocks.

I know devs already playtest, but how many times have we seen games this gen, the gen before, and every gen before that, where devs don't do that kind of optimization and simply let frames drop? We just saw it in games like Control, where the base consoles were dipping down to 10 fps, and that was on fixed clocks. They knew the limits of the system, they must have seen the performance issues during playtests, and they did not bother optimizing.

What Cerny is asking devs to do now is to tackle it not just on a scene-by-scene basis, but frame by frame, and if they don't do that, the consequences are dire. The fixed clocks they relied on before are gone, and now they will have no choice but to account for that, unless, like you said, they create a profile with lower clocks and stick to that.
Look at it this way.

The PS5 has a general chip profiler that keeps real-time track of the load on both the CPU and the GPU at all times. It uses this to keep track of how much power they are drawing and how much they need. So if the profiler detects that the GPU is pulling 120W and starting a workload that could take its power draw to 160W, it takes power from the CPU, which doesn't have as much of a load to do, and shunts it over to the GPU. That way the entire chip never exceeds its allowed power budget.

Now, where devs come in, and why they say this is deterministic, is that since this is something that can occur even with a perfectly tuned engine, devs can isolate the instances where it would happen and set instructions for the profiler on exactly how much the clocks should be dropped here or there to keep things within budget. In these cases, dropping the clock doesn't mean there would be a resulting performance hit. This is evident in the example Cerny gave using that Horizon map screen; things like that tend to happen when the task is small and not complex but sends the APU into overdrive.
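Roughly, the shifting part of that amounts to something like this (a toy sketch with invented numbers and a hypothetical shift_power function, not Sony's actual implementation):

```python
# Rough sketch of a fixed total power budget shared by the CPU and GPU, in the
# spirit of what Cerny described (SmartShift-style shifting). The numbers and
# names are invented for illustration only.

TOTAL_BUDGET_W = 200   # the whole APU never exceeds this
CPU_FLOOR_W = 30       # the CPU always keeps at least this much

def shift_power(cpu_need_w: float, gpu_need_w: float) -> tuple[float, float]:
    """Give the GPU extra headroom by taking it from a lightly loaded CPU."""
    if cpu_need_w + gpu_need_w <= TOTAL_BUDGET_W:
        return cpu_need_w, gpu_need_w           # everything fits, nothing to shift
    cpu_grant = max(CPU_FLOOR_W, TOTAL_BUDGET_W - gpu_need_w)
    gpu_grant = TOTAL_BUDGET_W - cpu_grant      # GPU gets whatever is left over
    return cpu_grant, gpu_grant                 # the sum never exceeds the budget

# GPU load ramps from 120 W toward 160 W while the CPU only needs 50 W:
print(shift_power(50, 160))   # -> (40, 160): CPU power is shunted to the GPU
# If the GPU asked for even more, it would hit the budget and have to drop clocks:
print(shift_power(50, 190))   # -> (30, 170)
```

The "deterministic" part is that the same workload always produces the same shift, which is what would let devs profile it in advance and decide where a small clock drop is acceptable.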
 

Adder7806

Member
Dec 16, 2018
4,122
Apparently, my "objectives" have changed after nearly two years in these threads. I was a double agent working for the other side all along. Colbert is my partner and secret lover, and I took part in the github hack with Anthony Hopkins at my side. I ditched him when he got caught, but was able to escape just in time to report my findings to our boss Albert Penello.

This is more plausible than most of the concerns you've come up with. (smile emoji) (I don't actually know how to insert emojis on ERA).


Look, we're all glad you post. You have an enthusiasm that is enviable. It's just that most of your worries are equivalent to the time I looked up a mild sore throat on WebMD and came out convinced I had seven different rare diseases.

The PS5 is going to be fine. It's going to run incredible games. Sony's engineers are very smart and they've built what will be an amazing system. I'll even go on record saying it's better than the XBX (not knocking the XBX, it looks amazing too). First party games will reach higher highs. 3rd party games will be equal or better. (Unless all you care about is maxing out the number of indiscernible pixels on the TV screen). The PS5 will be stronger because it's forward thinking. This next generation of games is going to be fantastic. (For all consoles.)
 
Mar 22, 2020
87
The controller itself needs to run hot and is designed to do so; 80°C+. You can put a thermal pad on it to keep it where it needs to be. NAND chips themselves don't need cooling. Those fancy heatsinks you see on most NVMe drives aren't necessary.
That's not exactly true: the controller will maintain peak transfer rates and normal behavior while under 105°C, and probably starts throttling before that. On the other hand, it's actually important that the NAND on the SSD stays at a rather high temperature (~80°C), though not near throttle territory (105°C), because operating below a certain temperature kills the cells' endurance over time. So if you see a heatsink, it's useless on the NAND but not on the controller, unless the NAND is overheating even more.
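For illustration, the thresholds being described boil down to something like this (values taken from the post above, not from any datasheet):

```python
# Toy version of the thresholds being discussed above: the controller just has
# to stay below its throttle point, while the NAND actually prefers to run warm.

CONTROLLER_THROTTLE_C = 105   # controller loses peak transfer speed around here
NAND_THROTTLE_C = 105         # hard ceiling for the flash as well
NAND_IDEAL_MIN_C = 80         # writing much colder than this wears the cells faster

def check_ssd(controller_c: float, nand_c: float) -> list[str]:
    notes = []
    if controller_c >= CONTROLLER_THROTTLE_C:
        notes.append("controller throttling: cool it (thermal pad / heatsink helps)")
    if nand_c >= NAND_THROTTLE_C:
        notes.append("NAND too hot: throttling")
    elif nand_c < NAND_IDEAL_MIN_C:
        notes.append("NAND running cold: long-term endurance suffers")
    return notes or ["all good"]

print(check_ssd(95, 85))    # -> ['all good']
print(check_ssd(110, 60))   # controller throttles and the NAND is running too cold
```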
 

Dizastah

Member
Oct 25, 2017
6,124
This is more plausible than most of the concerns you've come up with. (smile emoji) (I don't actually know how to insert emojis on ERA).

Look, we're all glad you post. You have an enthusiasm that is enviable. It's just that most of your worries are equivalent to the time I looked up a mild sore throat on WebMD and came out convinced I had seven different rare diseases.

The PS5 is going to be fine. It's going to run incredible games. Sony's engineers are very smart and they've built what will be an amazing system. I'll even go on record saying it's better than the XBX (not knocking the XBX, it looks amazing too). First party games will reach higher highs. 3rd party games will be equal or better. (Unless all you care about is maxing out the number of indiscernible pixels on the TV screen). The PS5 will be stronger because it's forward thinking. This next generation of games is going to be fantastic. (For all consoles.)

Chill Aegon, everything's gonna be alright. :)
 

Mubrik_

Member
Dec 7, 2017
2,723
On the contrary, I am not convinced of anything. Go back and read my posts, and you will see me concede that we will find out what this variable frequency stuff amounts to come launch.

In the meantime, I had no idea there was a memorandum sent out saying we were not allowed to discuss the potential drawbacks of this new (to consoles, at least) tech. A tech that was described to us in such great detail that we literally only have one sentence from Cerny showing how it works before he moves on to AMD SmartShift. I have watched the GPU portion of that talk many times now, and all I can find is one line from him; the rest is him talking about why they didn't go with fixed clocks and why they went with higher clocks. There is a lot that we simply don't know. A few demos here and there, some performance charts, and some benchmarks would've helped. AMD did it at Computex; they had them in their press release. This is literally all speculation.

It's bizarre because even Cerny talks about the worst case scenario, but sure let's all turn on AegonSnake for daring to talk about worst case scenarios. Apparently, my "objectives" have changed after nearly two years in these threads. I was a double agent working for the other side all along. Colbert is my partner and secret lover, and I took part in the github hack with Anthony Hopkins at my side. I ditched him when he got caught, but was able to escape just in time to report my findings to our boss Albert Penello.


I'm not trying to paint you in a bad light or anything; apologies if it seemed so.
It just seemed like you weren't taking into consideration the points of other people who might be more knowledgeable on the subject.
You've had multiple people reply to your point and try to clarify your argument about developers having a harder time due to the design of the console.

You stuck to the point that devs have to choose some 'profile' to work with.

I understand you're skeptical, but it's not like it's the 'worst case' scenario all the time either.
 

foamdino

Banned
Oct 28, 2017
491
AegonSnake it's good to discuss this variable clock solution as it is new in the console space - and it's interesting to me how Microsoft went out of their way to promote fixed clocks to DF just before the Sony presentation - do you think they knew about the PS5 architecture? <hmm>

The reality is that:
  1. Cerny designed the entire system - he didn't just do the APU
  2. his whole focus, from when he was brought in to do the PS4, was making things easier for devs
  3. the GPU has 36 CUs partly to allow code developed to take advantage of that many CUs (i.e. defined as 36 parallel work streams) to be lifted and shifted to the PS5
  4. the SSD is designed to literally get out of the way of developers - they issue a command and the packed data on the SSD appears unpacked in memory with no further input (unless I misinterpreted the presentation)
  5. he pushed time-to-triangle down from 1-2 months on PS4 to less than 1 month on PS5
  6. all the devs are super excited (why? because their jobs became significantly easier)
Against this background, it's hard to conclude that Cerny decided to use variable clocks if it would make life harder for devs.

His whole design ethos is about ease of development; in all areas that have been publicly disclosed, ease of development is at the forefront.

I wouldn't worry that this variable clock decision is going to cause grief - on the contrary, I suspect it will be net neutral in terms of development effort, but could be a huge net positive in terms of enclosure design, power consumption, form factor, etc.

It wouldn't surprise me if the PS5 has very, very low power draw (much lower than expected) and an extremely customised power supply, etc.

There's an awful lot that hasn't been revealed yet, judging from the dev talk on Twitter and the fact that there are still big NDAs in place.
 

TheZynster

Member
Oct 26, 2017
13,285
The controller itself needs to run hot and is designed to do so; 80°C+. You can put a thermal pad on it to keep it where it needs to be. NAND chips themselves don't need cooling. Those fancy heatsinks you see on most NVMe drives aren't necessary.

Reminds me of my old Radeon 6950, I think it was... the card was designed to run consistently at 80 degrees Celsius. It was just the way it was designed, but man, I did not like that thing come summer time lol
 

Lady Gaia

Member
Oct 27, 2017
2,477
Seattle
Clearly I am having a 1-on-1 discussion with Lady Gaia, and gofreak before that, but the rest of you are like those ex-players sitting in the announcer booth taking cheap potshots at two people having a discussion. It's not like I have ignored anything he's said. I have replied to all of his points and presented my concerns.

When posting in a public forum you're not having a 1-on-1 discussion. That's what the whole "Conversations" feature of the site is for. Posts are part of a public discussion that anyone is welcome to join. You should also note that your assumption that everyone here is male is as unfounded and inaccurate as many of your technical assumptions.
 

AegonSnake

Banned
Oct 25, 2017
9,566
AegonSnake it's good to discuss this variable clock solution as it is new in the console space - and it's interesting to me how Microsoft went out of their way to promote fixed clocks to DF just before the Sony presentation - do you think they knew about the PS5 architecture? <hmm>

The reality is that:
  1. Cerny designed the entire system - he didn't just do the APU
  2. his whole focus, from when he was brought in to do the PS4, was making things easier for devs
  3. the GPU has 36 CUs partly to allow code developed to take advantage of that many CUs (i.e. defined as 36 parallel work streams) to be lifted and shifted to the PS5
  4. the SSD is designed to literally get out of the way of developers - they issue a command and the packed data on the SSD appears unpacked in memory with no further input (unless I misinterpreted the presentation)
  5. he pushed time-to-triangle down from 1-2 months on PS4 to less than 1 month on PS5
  6. all the devs are super excited (why? because their jobs became significantly easier)
Against this background, it's hard to conclude that Cerny decided to use variable clocks if it would make life harder for devs.

His whole design ethos is about ease of development; in all areas that have been publicly disclosed, ease of development is at the forefront.

I wouldn't worry that this variable clock decision is going to cause grief - on the contrary, I suspect it will be net neutral in terms of development effort, but could be a huge net positive in terms of enclosure design, power consumption, form factor, etc.

It wouldn't surprise me if the PS5 has very, very low power draw (much lower than expected) and an extremely customised power supply, etc.

There's an awful lot that hasn't been revealed yet, judging from the dev talk on Twitter and the fact that there are still big NDAs in place.

Yeah, I'm glad I am not the only one who caught MS putting so much emphasis on fixed clocks. There is so much in the github leaks that no one seems to be talking about or willing to investigate. You could make an HBO documentary out of this stuff. Penello knew way back in December 2018. It's also odd how MS cancelled Lockhart immediately after E3, when rumors of the PS5 being more powerful started circulating thanks to Andrew Reiner and that banned guy. Then it was curiously revived six months later around X019. I would love to know what happened here.

I'm sure the PS5 is a beast, I just need to see it in action. I know Cerny is a genius, but he's a genius with a budget. The Pro's bandwidth issues could've been foreseen by anyone, but at the end of the day he was limited by his budget. I remember several of us were convinced that Sony would never go with a narrow and fast GPU, going as far as to say Cerny would've fucked up if he had done that, but that's the reality we live in now. I want to trust him, but I need to see demos stat.

The good thing is that both systems are different enough that we will see some tangible benefits on each platform. MS might have better ray tracing, Sony might have better character models and draw detail. I hope Sony goes back to the days of the PS3 and heavily invests in exclusives again, especially from third-party studios.
 
Oct 25, 2017
1,844
On the contrary, I am not convinced of anything. Go back and read my posts, and you will see me concede that we will find out what this variable frequency stuff amounts to come launch.
I don't think it will be cleared up come launch. I think the variable clock is going to be ammo for toxic Xbox fans for a long time.

Launch games rarely push new systems to their limits, and launch timelines often mean optimisation will be limited to the simplest approach, which will be to drop resolution.
 

AegonSnake

Banned
Oct 25, 2017
9,566
I don't think it will be cleared up come launch. I think the variable clock is going to be ammo for toxic Xbox fans for a long time.

Launch games rarely push new systems to their limits, and launch timelines often mean optimisation will be limited to the simplest approach, which will be to drop resolution.
Yeah, it's definitely given them an easy win for third party games, especially cross-gen ones, though it might still not translate into sales if Sony can get some killer exclusives at launch, since MS seems to be settling for cross-gen games for some bizarre reason.

That said, I no longer care about sales as much as I used to. I just hope Sony has Horizon 2 ready to go up against Halo. I don't think a Demon's Souls remake would appeal to the masses as much.
 

Pheonix

Banned
Dec 14, 2018
5,990
St Kitts
Yeah, I'm glad I am not the only one who caught MS putting so much emphasis on fixed clocks. There is so much in the github leaks that no one seems to be talking about or willing to investigate. You could make an HBO documentary out of this stuff. Penello knew way back in December 2018. It's also odd how MS cancelled Lockhart immediately after E3, when rumors of the PS5 being more powerful started circulating thanks to Andrew Reiner and that banned guy. Then it was curiously revived six months later around X019. I would love to know what happened here.

I'm sure the PS5 is a beast, I just need to see it in action. I know Cerny is a genius, but he's a genius with a budget. The Pro's bandwidth issues could've been foreseen by anyone, but at the end of the day he was limited by his budget. I remember several of us were convinced that Sony would never go with a narrow and fast GPU, going as far as to say Cerny would've fucked up if he had done that, but that's the reality we live in now. I want to trust him, but I need to see demos stat.

The good thing is that both systems are different enough that we will see some tangible benefits on each platform. MS might have better ray tracing, Sony might have better character models and draw detail. I hope Sony goes back to the days of the PS3 and heavily invests in exclusives again, especially from third-party studios.
I was one of the people that said, and to quote myself, "there is no chance in hell Sony goes with a 36CU GPU".

In all fairness, I also didn't believe at the time that it would be possible to get a GPU running at 2GHz, and as such I felt the best we would get with a 36CU GPU was something running at 1.8GHz for 8.2TF.

I did, however, believe the XSX would be more powerful, and had the PS5 pegged at 10TF using a 44CU GPU running at 1.8GHz.

I don't know what Sony did or how, but them being able to take that 36CU GPU and clock it as high as 2.2GHz is something neither I nor anyone else would have believed possible.

I really don't care about them being a 10TF console... I would even take games at 1440p and be OK with that. The one thing I wish they had done, though, was to use 16Gbps RAM chips as opposed to 14Gbps.
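For reference, the TF figures being thrown around all come from the same standard formula for AMD GPUs, CUs × 64 shaders per CU × 2 ops per clock × clock speed; a quick check:

```python
# All the TF figures in this thread come from the same standard formula for
# AMD GPUs: CUs x 64 shaders per CU x 2 ops per clock x clock speed (GHz).

def teraflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

print(round(teraflops(36, 1.8), 2))    # 8.29  -> the "8.2TF at 1.8GHz" guess
print(round(teraflops(44, 1.8), 2))    # 10.14 -> the "10TF with 44CUs" guess
print(round(teraflops(36, 2.23), 2))   # 10.28 -> the announced PS5 peak
```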
 

BreakAtmo

Member
Nov 12, 2017
12,828
Australia
I was one of the people that said, and to quote myself, "there is no chance in hell Sony goes with a 36CU GPU".

In all fairness, I also didn't believe at the time that it would be possible to get a GPU running at 2GHz, and as such I felt the best we would get with a 36CU GPU was something running at 1.8GHz for 8.2TF.

I did, however, believe the XSX would be more powerful, and had the PS5 pegged at 10TF using a 44CU GPU running at 1.8GHz.

I don't know what Sony did or how, but them being able to take that 36CU GPU and clock it as high as 2.2GHz is something neither I nor anyone else would have believed possible.

I really don't care about them being a 10TF console... I would even take games at 1440p and be OK with that. The one thing I wish they had done, though, was to use 16Gbps RAM chips as opposed to 14Gbps.

Well in a way you were right, given what Cerny said about how they couldn't even hit 2GHz with the traditional fixed clock system. Your (our) mistake was not foreseeing this innovative fixed power draw system with variable clocks and more specialised cooling.

On that note, do you think the mysterious cooling system really is the "both sides" setup from the Sony patent? Or will it be something else we don't expect?
 

Deleted member 56752

Attempted to circumvent ban with alt account
Banned
May 15, 2019
8,699
Curious to see how this is revealed, comes out, etc. I mean, this has got to be the most interesting new generation of ALL time, seriously. I have no idea what to expect. None. I don't even really want a new console even though I probably need one given how the X runs the modern warfare BR that just came out
 

Pheonix

Banned
Dec 14, 2018
5,990
St Kitts
AMD's engineers did all the work, not Sony. Well, and TSMC.
AMD and TSMC did it; Sony enabled them.
Well in a way you were right, given what Cerny said about how they couldn't even hit 2GHz with the traditional fixed clock system. Your (our) mistake was not foreseeing this innovative fixed power draw system with variable clocks and more specialised cooling.

On that note, do you think the mysterious cooling system really is the "both sides" setup from the Sony patent? Or will it be something else we don't expect?
I wouldn't be surprised one bit at this point. That honestly is the only way this makes sense. So far nearly every next-gen related patent they have put out has come to pass. I mean, Cerny literally shouted out the cooling solution the engineers designed in his talk... he shouted out the god damn cooling team!!!
 

Bunzy

Banned
Nov 1, 2018
2,205
I was one of the people that said, and to quote myself, "there is no chance in hell Sony goes with a 36CU GPU".

In all fairness, I also didn't believe at the time that it would be possible to get a GPU running at 2GHz, and as such I felt the best we would get with a 36CU GPU was something running at 1.8GHz for 8.2TF.

I did, however, believe the XSX would be more powerful, and had the PS5 pegged at 10TF using a 44CU GPU running at 1.8GHz.

I don't know what Sony did or how, but them being able to take that 36CU GPU and clock it as high as 2.2GHz is something neither I nor anyone else would have believed possible.

I really don't care about them being a 10TF console... I would even take games at 1440p and be OK with that. The one thing I wish they had done, though, was to use 16Gbps RAM chips as opposed to 14Gbps.


It really is nuts that they got over 2GHz. I think last gen they had the PS4 GPU clocked at 800MHz and we thought that was fast. It's literally insane to have that speed in a console.
 

ShapeGSX

Member
Nov 13, 2017
5,212

modiz

Member
Oct 8, 2018
17,831
The people who ultimately got it to run at 2.2GHz (that's what I was replying to) were the excellent engineers at AMD and at TSMC with the new process. It's difficult and rewarding work (in my opinion). Sony wasn't in there day to day pounding on the critical paths to make it run faster. Give credit where it's due.
That is not exactly accurate; Sony decides which cooling system to use and sets the power budget.
 

androvsky

Member
Oct 27, 2017
3,503
The people who ultimately got it to run at 2.2GHz (that's what I was replying to) were the excellent engineers at AMD and at TSMC with the new process. It's difficult and rewarding work (in my opinion...maybe there's something wrong with me :) ). Sony wasn't in there day to day pounding on the critical paths to make it run faster. Give credit where it's due.
Cerny went out of his way to talk about how Sony helps AMD design and engineer features that can end up in PC GPUs. Without looking at individual engineer task lists it's impossible to know for sure, but I doubt it's fair to not give Sony any credit.
 

BreakAtmo

Member
Nov 12, 2017
12,828
Australia
AMD and TSMC did it; Sony enabled them.

I wouldn't be surprised one bit at this point. That honestly is the only way this makes sense. So far nearly every next-gen related patent they have put out has come to pass. I mean, Cerny literally shouted out the cooling solution the engineers designed in his talk... he shouted out the god damn cooling team!!!

It'll be the only time that "both sides" was ever cool.
 
Nov 2, 2017
2,275
It depends on the resolution you're playing at. A 25% gap between a 2080 and a 2080 Ti at 1080p/1440p becomes a larger 35% gap at 4K. I also think it's wrong to use anything below a 2080 (or at the very least a 2070 SUPER) as a point of comparison for PS5 performance. RDNA2 adds quite a lot, and a 400MHz bump is a very significant one. I wouldn't be surprised if 52 CUs ended up closer to a 2080 Ti, because the clock frequency barely drops compared to the RX 5700 XT. But I agree that CU scaling at the higher end of the scale is completely unknown. If, like GCN (AMD's previous GPU architecture), it scales poorly, then I don't see why there would be massive consequences for enjoying either console.
Like I said: "unless you're CPU bottlenecked". At 1080p/1440p the 2080 Ti is obviously bottlenecked to a degree. Why else do you think the gap increases that much at 4K?
RDNA2 does add a lot, but the improvements we know about are in Turing as well; it's just that they haven't really been used in games so far. Next gen, Turing is probably going to gain a lot compared to RDNA1, as RDNA1 lacks VRS and mesh shaders.
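A crude way to see the CPU-bottleneck effect being described (made-up frame times, just to show why a GPU gap compresses at lower resolutions):

```python
# Crude illustration of why a faster GPU shows a smaller lead at low resolutions:
# the frame can't finish before the CPU work does, so the frame time is roughly
# max(cpu_time, gpu_time). All the millisecond figures below are made up.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 7.0   # same CPU work per frame regardless of resolution

# 1080p: both GPUs spend part of the frame waiting on the CPU, so the gap shrinks.
print(round(fps(CPU_MS, 6.0), 1), round(fps(CPU_MS, 8.0), 1))    # 142.9 125.0 (~14%)
# 4K: GPU time dominates, so the full ~33% GPU difference shows up.
print(round(fps(CPU_MS, 21.0), 1), round(fps(CPU_MS, 28.0), 1))  # 47.6 35.7 (~33%)
```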
 
Mar 22, 2020
87
Just found a pretty interesting video about how AMD is leading the future through consoles. Quite a lengthy video though.

I really wish you wouldn't watch this youtuber's videos. He's well known for making up rumors along with other youtubers on r/AMD. Pretty much every time, those rumors are proven wrong, except the most obvious ones. The titles he chooses for his videos often lead me to believe he's not offering an unbiased opinion.

Like I said: "unless you're CPU bottlenecked". At 1080p/1440p the 2080 Ti is obviously bottlenecked to a degree. Why else do you think the gap increases that much at 4K?
At lower resolutions, there is a smaller workload to get through every frame, which means the GPU catches up with the CPU more regularly. The available VRAM isn't fully utilized then, so performance tends to favor high clock frequencies over CU counts. However, even though a larger chip with more CUs will show underutilization or "poorer efficiency" of its CUs, it's also still faster. That said, 1.25x of 120fps at 1080p or 1440p is ~150fps, and the RX 5700 XT often offers above 100/120/140fps in the most recent games, with older games being the outlier.

RDNA2 does add a lot, but the improvements we know about are in Turing as well; it's just that they haven't really been used in games so far. Next gen, Turing is probably going to gain a lot compared to RDNA1, as RDNA1 lacks VRS and mesh shaders.
Console manufacturers and developers often tune game quality and resolution quite a lot, but it definitely seems like both consoles can pretty much keep up out of the box. I'm looking forward to seeing what AMD adds to RDNA2, but I hate to think that benchmarking both solutions won't be easy, or might not happen at all. In the case of the PS5, I wish upcoming presentations would showcase the need for the dedicated I/O ASICs, and those cache scrubbers should contribute quite a lot.

Anyway, it might be a while before we see these consoles or even the new AMD GPUs release, and I hope both manufacturers make an effort to reduce the amount of speculation needed to gauge their hardware. I see both communities going at each other (with the help of community managers, at times...).
If this were PC hardware, there would be a very viable use case for both solutions, and an argument to be made on price to performance. Being console hardware, there's even less of a point in comparing the two.
 

modiz

Member
Oct 8, 2018
17,831
I really wish you wouldn't watch this youtuber's videos. He's well known for making up rumors along with other youtubers on r/AMD. Pretty much every time, those rumors are proven wrong, except the most obvious ones. The titles he chooses for his videos often lead me to believe he's not offering an unbiased opinion.
I don't know who this is, but this video isn't about any rumors; it's offering speculation on the choices behind these systems and RDNA2 and what it means for the future of gaming, and it all seems very logical from what I can tell. To be fair, I am not a game developer or a graphics engineer, but for everything he talks about here he provides quotes and a basis.
 

nujabeans

Member
Dec 2, 2017
961
Just found a pretty interesting video about how AMD is leading the future through consoles. Quite a lengthy video though.


Someone should make a thread about this (I can't create threads for some reason). The video is really about AMD's rise in recent years and their relation to Nvidia's leadership position in the market.

This channel Coreteks focuses on PC gaming/GPU industry and doesn't have any sort of "allegiance" to Microsoft or Sony. He argues that Sony's focus on I/O and bandwidth has won the console architecture design "war" and that he sees this as the biggest differentiator in the upcoming console generation. He quotes Nvidia's position on this, which is that communication is much more expensive than a compute operation.

"Arithmetic is free (low precession), communication is prohibitively expensive (in energy cost)" - Nvidia Slide

"Accessing even a small array costs way more than doing an operation." - William Dally Nvidia, Chief Scientist

It's a bit long at 45 minutes but I highly suggest watching the whole thing.

More tidbits:

- he believes there will be PS5 games not possible on PC (due to Sony's high-speed architecture and PC games having to target the lowest common denominator)

- he thinks the PS5 will be more expensive to manufacture because of its I/O design and 12-channel custom controller

- he believes both consoles will perform about the same as RTX 2080 GPU + R7 3700X CPU, highlights PS5's potential in particular

- Nvidia will have a hard time marketing a $700 GPU when consoles are able to perform that well

- Super fast asset streaming is a game changer and will be PS5's advantage over both PC and XSX. Argues that things like Sony's Spider-Man demo will not be possible on XSX because MS didn't focus on high performance I/O

- AMD's RDNA reveal and demo at Computex last year may have hinted at the possibilities of a high bandwidth architecture (a la PS5). Lisa Su quoted Mark Cerny and how Cerny wants to revolutionize gaming in the next decade

- Project Awakening by CyGames shown back in 2018 may have actually been the first footage of a PS5. Also points out that the developer is aiming for a highly seamless open-world. Coreteks thinks this could be the game that Cerny alluded was using ray-tracing at a high level.
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,930
Berlin, 'SCHLAND
Someone should make a thread about this (I can't create threads for some reason). The video is really about AMD's rise in recent years and their relation to Nvidia's leadership position in the market.

This channel Coreteks focuses on PC gaming/GPU industry and doesn't have any sort of "allegiance" to Microsoft or Sony. He argues that Sony's focus on I/O and bandwidth has won the console architecture design "war" and that he sees this as the biggest differentiator in the upcoming console generation. He quotes Nvidia's position on this, which is that communication is much more expensive than a compute operation.

"Arithmetic is free (low precession), communication is prohibitively expensive (in energy cost)" - Nvidia Slide

"Accessing even a small array costs way more than doing an operation." - William Dally Nvidia, Chief Scientist

It's a bit long at 45 minutes but I highly suggest watching the whole thing.

More tidbits:

- he believes there will be PS5 games not possible on PC (due to Sony's high-speed architecture and PC games having to target the lowest common denominator)

- he thinks the PS5 will be more expensive to manufacture because of its I/O design and 12-channel custom controller

- he believes both consoles will perform about the same as RTX 2080 GPU + R7 3700X CPU, highlights PS5's potential in particular

- Nvidia will have a hard time marketing a $700 GPU when consoles are able to perform that well

- Super fast asset streaming is a game changer and will be PS5's advantage over both PC and XSX. Argues that things like Sony's Spider-Man demo will not be possible on XSX because MS didn't focus on high performance I/O

- AMD's RDNA reveal and demo at Computex last year may have hinted at the possibilities of a high bandwidth architecture (a la PS5). Lisa Su quoted Mark Cerny and how Cerny wants to revolutionize gaming in the next decade

- Project Awakening by CyGames shown back in 2018 may have actually been the first footage of a PS5. Also points out that the developer is aiming for a highly seamless open-world. Coreteks thinks this could be the game that Cerny alluded was using ray-tracing at a high level.
They won't be at a 3700X level, due to cache and... clock speed. And given how the XSX GPU fares, the PS5 will not be at 2080 level. Having a hard time agreeing with any of the premises there.
 

Fredrik

Member
Oct 27, 2017
9,003
Someone should make a thread about this (I can't create threads for some reason). The video is really about AMD's rise in recent years and their relation to Nvidia's leadership position in the market.

This channel Coreteks focuses on PC gaming/GPU industry and doesn't have any sort of "allegiance" to Microsoft or Sony. He argues that Sony's focus on I/O and bandwidth has won the console architecture design "war" and that he sees this as the biggest differentiator in the upcoming console generation. He quotes Nvidia's position on this, which is that communication is much more expensive than a compute operation.

"Arithmetic is free (low precession), communication is prohibitively expensive (in energy cost)" - Nvidia Slide

"Accessing even a small array costs way more than doing an operation." - William Dally Nvidia, Chief Scientist

It's a bit long at 45 minutes but I highly suggest watching the whole thing.

More tidbits:

- he believes there will be PS5 games not possible on PC (due to Sony's high-speed architecture and PC games having to target the lowest common denominator)

- he thinks the PS5 will be more expensive to manufacture because of its I/O design and 12-channel custom controller

- he believes both consoles will perform about the same as RTX 2080 GPU + R7 3700X CPU, highlights PS5's potential in particular

- Nvidia will have a hard time marketing a $700 GPU when consoles are able to perform that well

- Super fast asset streaming is a game changer and will be PS5's advantage over both PC and XSX. Argues that things like Sony's Spider-Man demo will not be possible on XSX because MS didn't focus on high performance I/O

- AMD's RDNA reveal and demo at Computex last year may have hinted at the possibilities of a high bandwidth architecture (a la PS5). Lisa Su quoted Mark Cerny and how Cerny wants to revolutionize gaming in the next decade

- Project Awakening by CyGames shown back in 2018 may have actually been the first footage of a PS5. Also points out that the developer is aiming for a highly seamless open-world. Coreteks thinks this could be the game that Cerny alluded was using ray-tracing at a high level.
Lots of interesting bits there! If fast I/O becomes the target for the future, then the PS5 will be in a good position.

But I feel like they're putting all their eggs in one basket with that SSD. What if a high CU count is where PC GPUs are heading? Then we'll see lots of PC ports struggling on the PS5.

And just a reminder: PS games not being possible on PC has already been the case for quite some time now, for obvious reasons, so is that really something people expect? If we're lucky we'll start getting 3-year-old PS games, though who knows where PC gaming will be in 3 years. A not-so-bold take would be that PC gaming will be ahead of the PS5 in 3 years. But I'd say let's see what happens.


I think it's more interesting how down he was on next-gen console ray-tracing performance, which you didn't mention. He thought the XSX would have an advantage over the PS5 because of the higher CU count, but it would still only be at the level of an RTX 2080 or lower. A super depressing scenario after all the ray-tracing hype. Yet another 30fps generation incoming?
 

marecki

Member
Aug 2, 2018
251
They won't be at a 3700X level, due to cache and... clock speed. And given how the XSX GPU fares, the PS5 will not be at 2080 level. Having a hard time agreeing with any of the premises there.
I sense another wc article incoming and endorsement from timdog and his crowd.

Joking aside, I thought that after that debacle you'd be cautious about making such definitive statements about as-yet-unreleased hardware and software without backing it up with evidence like benchmarks, access to next-gen games, or developer feedback. If you actually do have access to these, then say that's the basis for your comments.
 

Andromeda

Member
Oct 27, 2017
4,845
They won't be at a 3700X level, due to cache and... clock speed. And given how the XSX GPU fares, the PS5 will not be at 2080 level. Having a hard time agreeing with any of the premises there.
You don't know that. Until we have the first benchmarks we can't say for sure; you and I are only speculating. Look at how many Pro titles run better than on the XBX despite the Xbox having 40% more GPU and 50% more bandwidth (so roughly twice the gap we have here). In some cases the res is close or even the same.
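For what it's worth, the rough arithmetic behind that comparison, using the public spec-sheet figures (so treat the exact percentages loosely):

```python
# Quick check on the gaps being compared, using the public spec-sheet numbers.
# Mid-gen refresh: Xbox One X vs PS4 Pro. Next gen: XSX vs PS5 (peak figures).

specs = {
    "PS4 Pro": {"tf": 4.2,   "bw_gbs": 218},
    "One X":   {"tf": 6.0,   "bw_gbs": 326},
    "PS5":     {"tf": 10.28, "bw_gbs": 448},
    "XSX":     {"tf": 12.15, "bw_gbs": 560},  # 560 GB/s is the fast 10 GB pool
}

def gap_pct(slower: str, faster: str, key: str) -> int:
    """Percentage advantage of `faster` over `slower` for a given metric."""
    return round((specs[faster][key] / specs[slower][key] - 1) * 100)

print(gap_pct("PS4 Pro", "One X", "tf"))       # ~43% more compute
print(gap_pct("PS4 Pro", "One X", "bw_gbs"))   # ~50% more bandwidth
print(gap_pct("PS5", "XSX", "tf"))             # ~18% more compute next gen
```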
 

Deleted member 10847

User requested account closure
Banned
Oct 27, 2017
1,343
You don't know that. Until we have the first benchmarks we can't say for sure; you and I are only speculating. Look at how many Pro titles run better than on the XBX despite the Xbox having 40% more GPU and 50% more bandwidth (so roughly twice the gap we have here). In some cases the res is close or even the same.

Care to say which games using the same resolution are running better on the PS4 Pro?