Status
Not open for further replies.

E.T.

Member
Oct 25, 2017
4,039
This bodes well for both sets of fans. Xbox and Playstation gamers are eating.
 

Vector

Member
Feb 28, 2018
7,364
As a software engineer without first-hand experience in game development, I like how convenient and straightforward Sony has made developing for their platform, with its system architecture and quick access to resources. If I were a game dev, I'd rather focus on things like game logic than on figuring out how to use a split memory configuration efficiently. It's a far cry from the PS3 days, and I'm glad Sony is taking this route.

I'm also not entirely sure how many 3rd Party studios will effectively leverage the extra juice the Series X has.
 

jroc74

Member
Oct 27, 2017
31,335
I don't have anything to say about the OP or the developer comments, but to chime in about the GTX 970: it is widely considered a 3.5 GB card because once VRAM usage was pushed past that limit, the card was forced onto the slower pool of memory and performance would chug. A class-action lawsuit was filed against Nvidia, and they had to pay out over it. So yeah, this isn't a split pool of memory, but it's not ideal either: if a game needs the bandwidth/memory and is forced onto the slower pool, there will definitely be a performance hit, as we've seen in the past.
Sources:
www.pcgamer.com

Why Nvidia's GTX 970 slows down when using more than 3.5GB VRAM

At launch, Nvidia inaccurately stated that the GTX 970 has the same ROPs and L2 cache as the 980.
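To put rough numbers on that performance cliff, here's a toy Python model of average bandwidth once an allocation spills into the slow segment. The 196 GB/s and 28 GB/s figures are the commonly reported GTX 970 segment speeds, and the even-spread access pattern is a simplifying assumption, so treat this as a sketch, not a benchmark.

```python
# Toy model: average bandwidth when a VRAM allocation spills from the
# fast 3.5 GB segment into the slow 0.5 GB segment (GTX 970 case).
FAST_GB, FAST_BW = 3.5, 196.0   # GB, GB/s (commonly reported figures)
SLOW_GB, SLOW_BW = 0.5, 28.0

def effective_bandwidth(used_gb):
    """Average GB/s to stream the whole allocation once, assuming
    accesses are spread evenly over the memory actually in use."""
    if used_gb <= FAST_GB:
        return FAST_BW
    slow_used = min(used_gb - FAST_GB, SLOW_GB)
    time_s = FAST_GB / FAST_BW + slow_used / SLOW_BW  # seconds per full pass
    return used_gb / time_s

print(round(effective_bandwidth(3.4), 1))  # 196.0 -> full speed
print(round(effective_bandwidth(4.0), 1))  # 112.0 -> big hit past 3.5 GB
```

Real behavior depends on the driver's placement heuristics; the point is just that a small slow segment drags the average down sharply once it's touched.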
Yup.

I remember the controversy about this.

Total size of ram is nice and better than that situation. But there is probably a con to the different speeds still.

Both consoles will have pros n cons beyond TF vs SSD.

Too many people here convinced themselves there were no benefits to a lower CU count. You'll just have to get used to developers telling you there are. There are benefits and drawbacks to all tech; if you believe there are only benefits or only drawbacks to any of the tech used in either console, you're the one with the bias.
This.
 

c0c0suma

Banned
Jan 20, 2018
79
I can only speak to his claim. I can't speak to the veracity.



I don't get the feeling that he thinks XSX sucks but his enthusiasm stops just short. I'm not going to vilify his other tweets but he has a preference and that's fine.



This is treading slightly higher than my level of understanding but I appreciate the effort. I understand most of it but some of it is at odds with what I thought I knew. But to your point I haven't been actively trying to learn about this since before RDNA was even announced so I guess I have more reading to do and videos to watch. Thanks again.
Most RDNA changes are aimed at better scalability. AMD knew GCN scaled poorly in games (while doing great on compute tasks), so they made quite a few changes to make games scale more easily, including:

1. Wave32 instead of Wave64.
Wave32 mode cuts the worst-case latency by 2x. If code is well optimized for Wave64 it can still run efficiently, but when it isn't, Wave32 mode greatly reduces the performance penalty.
Generally it's harder to make full use of Wave64 in games, which is why Nvidia has stuck with its 32-thread warps too. They're simply a better fit for current games.

In layman's terms, Wave32 is almost as efficient as Nvidia's 32-thread warps, and Wave64 mode was what kept GCN from reaching high gaming performance.
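A toy way to see this: in lockstep SIMD, every wave runs as long as its slowest lane, so wider waves waste more lane-cycles on uneven (branchy) work. The workload numbers below are made up purely for illustration.

```python
import random

def simd_utilization(work, wave_size):
    """Fraction of lane-cycles doing useful work when threads execute in
    lockstep waves: a wave occupies all its lanes until the slowest lane
    finishes, so the short lanes sit idle in the meantime."""
    useful = sum(work)
    occupied = sum(wave_size * max(work[i:i + wave_size])
                   for i in range(0, len(work), wave_size))
    return useful / occupied

random.seed(0)
# Divergent workload: per-thread cost varies a lot, as in branchy shaders
work = [random.randint(1, 100) for _ in range(1024)]

u32 = simd_utilization(work, 32)
u64 = simd_utilization(work, 64)
print(f"Wave32 utilization: {u32:.2f}")
print(f"Wave64 utilization: {u64:.2f}")  # lower: one slow lane stalls 64 threads
```

With uniform per-thread cost the two come out identical; the gap only opens up as the work gets more divergent, which matches the "harder to make full use of Wave64 in games" point.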

2. Optimized data path
RDNA added L1 caches to improve data traffic within the shader arrays.
The cache hierarchy of RDNA looks roughly like this:

2x CUs --- Work Group (WGP/DCU) --- Shader Array --- Shader Engine --- GDDR PHY (VRAM)

Registers / L0 cache ---------------- L1 cache --------------------------- L2 cache

So as you can see, without the L1 cache the trip from L0 to L2 is very long, and that's exactly where GCN choked. The Vega series used HBM to partially compensate, but it wasn't nearly enough: in games, both the computations and the data accesses are far more random, so every missed data request had to travel that long path to complete.
The newly added L1 cache can satisfy many of those requests locally, which greatly improves the scalability of the shader arrays.
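An average-memory-access-time sketch of why the inserted L1 helps. Every latency and hit rate here is a made-up illustration number, not a real RDNA figure.

```python
# Hypothetical latencies in cycles, for illustration only
L0_HIT, L1_HIT, L2_HIT = 20, 100, 250

def amat_no_l1(l0_rate):
    """Average access time when every L0 miss travels all the way to L2."""
    return l0_rate * L0_HIT + (1 - l0_rate) * L2_HIT

def amat_with_l1(l0_rate, l1_rate):
    """L0 misses that hit the new mid-level L1 stop there instead."""
    miss = 1 - l0_rate
    return (l0_rate * L0_HIT
            + miss * l1_rate * L1_HIT
            + miss * (1 - l1_rate) * L2_HIT)

# Random, game-like access pattern: modest hit rates
print(round(amat_no_l1(0.6), 1))         # 112.0 cycles on average
print(round(amat_with_l1(0.6, 0.5), 1))  # 82.0 cycles: shorter average trip
```

The more random the access pattern (lower L0 hit rate), the more of the traffic the new mid-level cache gets to intercept.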

This basically means scalability isn't an issue in normal games unless the devs are tanking performance on purpose.
So why do larger chips show better performance in real game tests, even at lower teraflops?
Why did the 40 CU, <9.7TF card beat the 36 CU, ~9.95TF card?

Well, let's imagine we're running the same production task in two factories.

We have a factory with 40 or 36 production lines, a local raw-material storage with 20 or 18 rooms, and a supply chain that restocks the storage on a weekly basis.

The 40-line factory operates slightly slower, working 5 days a week to deplete the material storage. But thanks to its extra lines, each time it depletes the storage it produces 40 products.

The 36-line factory operates much faster and depletes the material very quickly. But the supply chain can't feed in raw materials fast enough, so effectively it works 4 days a week.

So guess what: each week the 40-line factory produces 40 products, while the faster 36-line factory produces just 36.

This is an extremely simplified model meant only to describe the main issue RDNA has right now: it's still pretty memory-bandwidth hungry in 4K gaming.
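The factory arithmetic above can be sketched in a few lines. The numbers are just the analogy's values (restocks per week standing in for memory bandwidth), not real GPU figures.

```python
def weekly_output(lines, loads_per_week, restocks_per_week=1):
    """Toy version of the factory analogy: one storage-load of material
    yields one product per line. A faster factory could burn through more
    loads, but it can't outpace the supply chain (memory bandwidth), so
    once storage is empty it simply idles until the next restock."""
    depletions = min(loads_per_week, restocks_per_week)
    return lines * depletions

# 40 slower lines vs 36 faster lines, fed by the same weekly supply:
print(weekly_output(40, loads_per_week=1.25))  # 40 products per week
print(weekly_output(36, loads_per_week=1.4))   # 36: extra speed can't help
```

Once the supply chain (bandwidth) is the binding constraint, output scales with the number of lines (CUs), not with how fast each line runs.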
 
Last edited:
Oct 28, 2017
2,666
Both consoles will be beasts, and both will excel in certain areas. And both consoles will only be as good as the studios working the silicon. Being number one in hardware has never made a huge difference, and the gap has never been smaller, you guys.
 

eathdemon

Banned
Oct 27, 2017
9,690
Yup.

I remember the controversy about this.

Total size of ram is nice and better than that situation. But there is probably a con to the different speeds still.

Both consoles will have pros n cons beyond TF vs SSD.
Honestly, even MS's SSD is no weakling either. As far as the RAM setup on XSX goes, it's very PC-like in nature. Most devs should be fine.
 

LCGeek

Member
Oct 28, 2017
6,097
As a software engineer without first-hand experience in game development, I like how convenient and straightforward Sony has made developing for their platform, with its system architecture and quick access to resources. If I were a game dev, I'd rather focus on things like game logic than on figuring out how to use a split memory configuration efficiently. It's a far cry from the PS3 days, and I'm glad Sony is taking this route.

I'm also not entirely sure how many 3rd Party studios will effectively leverage the extra juice the Series X has.

There's a difference between raw utilization and optimization.

GPU-wise you'll see it automatically in game performance or native res; eventually devs can exploit it for more.

It only gets better.
 

tapedeck

Member
Oct 28, 2017
8,202
Soooo has this guy actually personally worked on the PS5 or not?

That seems like a fairly important qualifier for this opinion lol.
 

BradGrenz

Banned
Oct 27, 2017
1,507
B3D can be just as bad as this thread. For the past couple of days they've been arguing about the PS5 clock speed, and now it's moved on to how the PS5 SSD is going to throttle.

Yeah. The moderation is not very balanced, either. I was finally banned because I was not polite enough in my interactions with the shitposting fanboys who have been tolerated for years as they spam FUD.
 

KingBae

Member
Oct 28, 2017
735
Seems like the website will have an Instagram live with the dev at some point to clarify some of the questions people asked in the comments. There might be more interesting info if anyone can speak Farsi. I just used google translate to check out the article and saw that mentioned in the comments.
 

chris 1515

Member
Oct 27, 2017
7,084
Barcelona Spain
That was my thought. Also loads of talk about how DirectX and Windows are slow. It's clearly not running full Windows.

I dunno. I work with people who hate Windows and spend all their time trying to get Linux working properly while the rest of us just get on and do our jobs. He could be completely right in his summary and also biased against MS.

Then again he does say that towards the end of the gen:



Which seems totally at odds to everything else he's saying. Weird.

Well whatever, it's great to have a detailed dev view.

That's at the end of the generation, though. For most of the generation the PS5 will be right there, which fits with it being easier to extract performance from the PS5. The resolution will probably be 15 to 20% higher on XSX by the dawn of the next generation of consoles.
 

rahzel

Member
Oct 27, 2017
458
Yeah. The moderation is not very balanced, either. I was finally banned because I was not polite enough in my interactions with the shitposting fanboys who have been tolerated for years as they spam FUD.
Been there since 2007 under a different name but haven't posted there much at all in the past few years. Shifty Geezer is the only mod I find to be reasonable and neutral. I think I have a good hunch who banned you.
 

RestEerie

Banned
Aug 20, 2018
13,618
 

cyrribrae

Chicken Chaser
Member
Jan 21, 2019
12,723
It's a mistranslation; he said below a second, which is what Cerny already said the target is.
Mm, no, I'm referring to a Quick Resume-like feature. Correct me if I'm wrong, but I don't recall Sony announcing anything like that (not that I imagine they haven't implemented it, or couldn't implement it now if they wanted to).
 

big_z

Member
Nov 2, 2017
7,991
We've got negative loading and negative latency... I think Sony negative-developed The Last of Us 2 and we've already played it.
 

gofreak

Member
Oct 26, 2017
8,048
It sounds like he thinks it's 'better' as a developer - the tools, the libs, the adaptability of the power draw to different bounds etc - which reflects what Jason Schreier said devs were expressing to him elsewhere, re the tools/APIs at least. APIs that make it easier to reach its full performance would tally with what I believe was often the experience this gen too.

What he's not quite saying is that PS5 performs better overall, before anyone runs away with themselves, except with regard to the SSD.

I think the comments might help square off why there was so much apparent positive buzz from developers around PS5 even if it's not as powerful in all respects as the XSX though.
 
Last edited:

c0c0suma

Banned
Jan 20, 2018
79
This is news to me. What leakers and 3rd party devs have been saying things will get ugly for the PS5 due to throttling?
image0.png

Basically it's saying that Navi 10/14 at 2GHz is already barely attainable; it's quite possibly the limit of Navi. That means 2.23GHz is very likely beyond the sweet spot of Navi's frequency/voltage curve, so throttling is a must. Of course, that's why Sony implemented variable frequency.

So, about variable frequency: why does it affect game development so much?

Because the shared power budget tends to favor the GPU over the CPU, and Sony didn't give devs the ability to control the power balance.

So, imagine you're playing a game which is both CPU and GPU hungry.
At first the CPU might be running at 45W and 3.5GHz; the game is smooth at the beginning, and the CPU wants to push more frames per second.
Then the GPU works hard drawing those frames and soon becomes a bottleneck, so SmartShift moves more of the power budget to the GPU.
Meanwhile, the CPU is still calculating the next frame. Guess what? It just became slower!
This can cause a serious variable frame-time issue, which is especially bad on 60Hz displays.

That's where they need v-sync.
So let's say we have a 16.66ms time budget for each frame. Now, for each frame, the power needed and the power provided are out of sync. Why? Because the CPU is always working on the next frame while the GPU is drawing the current one. This pipelined approach is widely used in game rendering to improve fps.

E.g. a frame needs 16.66ms of CPU time and 16.66ms of GPU time. Done serially, each frame takes 33.33ms, which makes it a 30fps game. Done in parallel, 100 frames take only about 1666 + 17 ms, which means the game runs at 60fps.
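The serial-vs-pipelined arithmetic can be checked with a short sketch, assuming an idealized two-stage pipeline (CPU prepares frame N+1 while the GPU draws frame N) and no other stalls.

```python
def render_time_ms(frames, cpu_ms, gpu_ms, pipelined=True):
    """Wall time to render `frames` frames.
    Serial: each frame pays CPU + GPU time back to back.
    Pipelined: after the first CPU pass fills the pipe, each frame costs
    only the slower of the two stages."""
    if not pipelined:
        return frames * (cpu_ms + gpu_ms)
    return cpu_ms + frames * max(cpu_ms, gpu_ms)

CPU, GPU = 16.66, 16.66
print(round(render_time_ms(100, CPU, GPU, pipelined=False)))  # ~3332 ms -> ~30 fps
print(round(render_time_ms(100, CPU, GPU, pipelined=True)))   # ~1683 ms -> ~60 fps
```

It also shows the frame-pacing worry: in the pipelined case, total time tracks max(cpu_ms, gpu_ms), so slowing either stage mid-stream (e.g. via a power shift) immediately moves the whole frame time.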

So imagine a scene where the current frame needs GPU power while the next frame needs CPU power. How do you distribute that power? SmartShift can still be a bottleneck here and cause serious frame-pacing issues again.

With full control, devs could balance the power shift better when certain things happen; e.g. a smoke grenade usually tanks performance, but they don't want the fps to swing too much, so they could cap the power balance.

Again, variable-frequency technology came out many years ago, and no game console has ever used it in games, at least not without giving developers full control. There's a reason for that.
 

spidye

Member
Oct 28, 2017
1,024
PlayStation SSD speeds reach 8-9 GB in peak mode. Now that we've reached this speed, what else will happen apart from loading games and more details?

The first thing to do is remove the loading page from the games. Microsoft also showed the ability to stop and run new games, which can run multiple games simultaneously and move between each in less than 5-6 seconds. This time will be below zero in PlayStation

:)
I've read parts of the original article and he says below one second not zero.
 

Simuly

Alt-Account
Banned
Jul 8, 2019
1,281
It seems logical to me that fewer, faster CUs (PS5) will be easier to fill with work than a higher number of slower CUs (Series X).

If there was an advantage for PS5 I didn't think it would be the API. So it's very interesting what he says about Direct X. This brings to mind that funny phrase 'coding to the metal' from last gen.
 

Nintendo

Prophet of Regret
Member
Oct 27, 2017
13,718
I'm also not entirely sure how many 3rd Party studios will effectively leverage the extra juice the Series X has.

The same way they leverage the extra juice the X1X has, except there isn't that much extra juice next gen. The difference is much smaller between the PS5 and the XSX.
 

GhostofWar

Member
Apr 5, 2019
512
Can't wait to see these machines in action! Would love to see Crytek make a resurgence....

Most of the talent at Crytek left when they weren't being paid years back. id benefited from that nicely, and I think some went to Star Citizen as well, but the jury's out on that, I guess. The guy in this interview has only been there since Dec 2018, for example.
 

c0c0suma

Banned
Jan 20, 2018
79
As a software engineer without first-hand experience in game development, I like how convenient and straightforward Sony has made developing for their platform, with its system architecture and quick access to resources. If I were a game dev, I'd rather focus on things like game logic than on figuring out how to use a split memory configuration efficiently. It's a far cry from the PS3 days, and I'm glad Sony is taking this route.

I'm also not entirely sure how many 3rd Party studios will effectively leverage the extra juice the Series X has.
I wonder, if you don't have first-hand experience in game development, why would you think Sony made developing for their platform more convenient and straightforward? And how exactly does split memory make things more complicated? PCs have used split video memory since the beginning of time, and UMA has already simplified a lot of that. 10GB of 560GB/s memory meant for textures and models... is this even worth mentioning when you're trying to argue that MS is doing a worse job at developer support?

Variable frequency without dev control is one of the worst ideas in console history, if not the worst. It should only be used in linear games, not in every game. Many unpredictable things happen in non-linear games that can make SmartShift behave much worse than expected. Just google how many laptop users try to disable turbo boost to get a more stable gaming experience...

In short, variable frequency runs pretty much counter to the whole dev-friendly narrative.
 
Last edited:

Ex Libris

Attempted to circumvent ban with alt account
Banned
Nov 1, 2017
287
image0.png

Basically it's saying that Navi 10/14 at 2GHz is already barely attainable; it's quite possibly the limit of Navi. That means 2.23GHz is very likely beyond the sweet spot of Navi's frequency/voltage curve, so throttling is a must. Of course, that's why Sony implemented variable frequency.

So, about variable frequency: why does it affect game development so much?

Because the shared power budget tends to favor the GPU over the CPU, and Sony didn't give devs the ability to control the power balance.

So, imagine you're playing a game which is both CPU and GPU hungry.
At first the CPU might be running at 45W and 3.5GHz; the game is smooth at the beginning, and the CPU wants to push more frames per second.
Then the GPU works hard drawing those frames and soon becomes a bottleneck, so SmartShift moves more of the power budget to the GPU.
Meanwhile, the CPU is still calculating the next frame. Guess what? It just became slower!
This can cause a serious variable frame-time issue, which is especially bad on 60Hz displays.

That's where they need v-sync.
So let's say we have a 16.66ms time budget for each frame. Now, for each frame, the power needed and the power provided are out of sync. Why? Because the CPU is always working on the next frame while the GPU is drawing the current one. This pipelined approach is widely used in game rendering to improve fps.

E.g. a frame needs 16.66ms of CPU time and 16.66ms of GPU time. Done serially, each frame takes 33.33ms, which makes it a 30fps game. Done in parallel, 100 frames take only about 1666 + 17 ms, which means the game runs at 60fps.

So imagine a scene where the current frame needs GPU power while the next frame needs CPU power. How do you distribute that power? SmartShift can still be a bottleneck here and cause serious frame-pacing issues again.

With full control, devs could balance the power shift better when certain things happen; e.g. a smoke grenade usually tanks performance, but they don't want the fps to swing too much, so they could cap the power balance.

Again, variable-frequency technology came out many years ago, and no game console has ever used it in games, at least not without giving developers full control. There's a reason for that.
What is this source?
 

elenarie

Game Developer
Verified
Jun 10, 2018
10,950
The funny thing is that people who actually know right now are NDA'd ;) But yeah, it seems watching videos gives you a degree in CS. Video games only do everything!

And usually, the people talking details online haven't signed one, meaning they don't have access to the hardware; otherwise they'd be burned alive by Sony and MS.

Good luck to you, mate.
 

Nintendo

Prophet of Regret
Member
Oct 27, 2017
13,718
That's pretty brutal. Aren't the Digital Foundry guys pretty well educated and informed? For instance I know Alex is/was a programmer. I don't think it's fair to make blanket statements like that, DF has shown again and again their credibility and expertise.

Yeah and John is a software engineer IIRC. The guys at DF know their stuff pretty well.
 

Marble

Banned
Nov 27, 2017
3,819
It's a pretty bold statement that the Series X will only reach 12TF in ideal situations. This is a complete 180 from the earlier consensus regarding PS5's 10.28TF number.

Than XSX. This will make the console work mostly on the 10.28 Tflops. But in XSX, since the other parts of the gpu work slower due to the lower clock speed, it actually works a lot at lower Tflops most often and reaches 12 only at ideal situations.
 

Deleted member 20986

Oct 28, 2017
4,911
Every next gen thread turns into a mass grave sheesh...

Yeah, it seems inevitable, but back then I only noticed it around E3, before and after the unveiling of new consoles.

Seems this time around it's been going on for way too long. Geez, how can you argue about this shit for months on end?
 