Thanks for the detailed and constructive reply, always good.
So, re the X's improvements beyond just resolution: yes, but much of that came from its significant memory increase (roughly 3.5GB more available to games, ~63%, plus ~85% greater bandwidth). That gave developers much greater scope to play with beyond the TFLOP increase alone, which at 1.8TFLOPs is almost the same gap as between PS5 and XSX.
It enabled us to enjoy greater texture quality, higher LoD, better performance and increased effects. I said all this before the console launched, as it was a significant boost over the Pro.
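Those figures are easy to sanity-check. A minimal sketch, assuming the commonly cited spec-sheet numbers (~9GB usable game memory on the X1X vs ~5.5GB on the Pro; 6.0 vs 4.2 TFLOPs for X1X vs Pro; 12.15 vs 10.28 TFLOPs for XSX vs PS5 — these specific values are my assumption, not from the post above):

```python
# Sanity-checking the memory and TFLOP gaps, using commonly cited specs.
x1x_mem, pro_mem = 9.0, 5.5          # GB available to games (assumed values)
extra_mem = x1x_mem - pro_mem        # absolute memory advantage
mem_gain = extra_mem / pro_mem       # relative memory advantage

x1x_tf, pro_tf = 6.0, 4.2            # X1X vs PS4 Pro compute
xsx_tf, ps5_tf = 12.15, 10.28        # XSX vs PS5 compute

print(f"Extra game memory: {extra_mem:.1f} GB ({mem_gain:.1%})")
print(f"X1X - Pro TFLOP gap: {x1x_tf - pro_tf:.2f}")
print(f"XSX - PS5 TFLOP gap: {xsx_tf - ps5_tf:.2f}")
```

That works out to 3.5GB (~63.6%) extra memory and TFLOP gaps of 1.80 and 1.87 respectively, which is why I call the two generational gaps "almost the same".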
The SSD is not just for storage; it supplements the RAM and opens up game-design choices, enabling higher asset density (helped by the Mesh Shaders, which improve detail and reduce bandwidth demands). Using this, the PS5 (as can the XSX) can shift game design and streaming to reduce pop-in, delayed mip-map chain loads, stutter and all those other things we often see in the current gen, and the CPU will greatly empower teams to make bigger and bolder choices alongside performance, as you say.
The piece on scaling was to demonstrate that a ~18-20% gap in GPU performance can be mitigated (all other effects and throughput being equal) by enabling dynamic scaling on the PS5 at 10% per axis while leaving the XSX at native 4K. This is where teams can make the easiest choices to use the power without adding a great deal to development. 1st party will have the choice to make more use of that and of the ray-tracing functions, which again the XSX should be slightly better at, though that will likely be an even smaller gap. BUT I need to stress this is just talk for now, and we will have to wait for actual games and more info, as this is getting deep into speculation — as was my comment on the SSD/RAM use. Just thoughts.
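The arithmetic behind that 10%-per-axis claim is worth spelling out: shrinking both axes by 10% cuts the total pixel count by ~19%, which lines up with the ~18-20% GPU gap. A quick illustration (purely my own sketch, not from any engine):

```python
# Dynamic resolution scaling arithmetic: 10% off each axis of a 4K frame.
NATIVE_4K = (3840, 2160)
AXIS_SCALE = 0.9  # 10% reduction per axis

scaled = (int(NATIVE_4K[0] * AXIS_SCALE), int(NATIVE_4K[1] * AXIS_SCALE))
native_pixels = NATIVE_4K[0] * NATIVE_4K[1]
scaled_pixels = scaled[0] * scaled[1]

saving = 1 - scaled_pixels / native_pixels  # fraction of pixels no longer shaded
print(f"{scaled[0]}x{scaled[1]}: {saving:.1%} fewer pixels than native 4K")
```

So a PS5 frame at 3456x1944 shades roughly a fifth fewer pixels than a native 4K frame, which is the sense in which the GPU gap can be "mitigated" by scaling.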
Thanks
Not to drag this on, but I think it's an interesting conversation ;). Once again, I appreciate the well-considered replies, and I apologise if I am nitpicking.
With regard to visual improvements on the X1X, my experience and understanding is that when PC/cross-platform devs build a game, all of those settings exist within the general codebase as tweakable variables (in a well-constructed game) and are tweaked constantly through development on all platforms. As we can see from the two-week port of a game like Gears 5 to Series X, it was quick and easy for them to dial up any number of settings. That is how I expect developers to use the extra GPU power (and I am assuming here that with the 12TF vs 10.2TF difference and the wide CU difference, we are looking at essentially the next-gen RDNA 2 video cards in both, with MS quite literally going one step up — like a moderately overclocked RTX 2070 vs a standard RTX 2080).
I expect 3rd party devs to act like they always have: make decisions based on whether they want the versions consistent, or want each to look the best it individually can. There is no making up for the GPU power difference. If they have a dynamic resolution scaler, I imagine they will use it in both versions and dial up settings in whatever way makes the game look best.
Also, while I agree that extra VRAM can be a big visual differentiator, its greatest advantage is higher-resolution assets, which many games on the X1X still didn't take advantage of. Many games instead dialed up other non-memory-related GPU tasks, such as:
- Character, world, foliage level-of-detail
- Shadow/lighting quality
- Post-processing quality
- Anti-aliasing quality
- Screen-space reflections
- Refractions
- Particles
- Texture filtering
- Or opted for improved framerates
Choosing to dial these up, if a resolution boost proved negligible, would be an incredibly smart thing to do. I also assume that lighting quality could make the biggest GPU difference, especially with ray tracing now a big focus.
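The "tweakable variables" point above can be sketched concretely. This is a hypothetical config pattern of my own (not from any real engine — the names and values are invented) showing why dialing up the stronger console is cheap: it's just a per-platform override table.

```python
# Hypothetical per-platform quality settings, as a base config plus overrides.
BASE = {
    "shadow_quality": 2,
    "foliage_lod_bias": 0,
    "ssr_enabled": True,      # screen-space reflections
    "particle_density": 1.0,
}

# The stronger GPU simply gets higher values; no new code paths needed.
OVERRIDES = {
    "ps5":      {"shadow_quality": 3, "particle_density": 1.25},
    "series_x": {"shadow_quality": 4, "foliage_lod_bias": 1, "particle_density": 1.5},
}

def settings_for(platform: str) -> dict:
    """Merge the base config with any platform-specific overrides."""
    merged = dict(BASE)
    merged.update(OVERRIDES.get(platform, {}))
    return merged

print(settings_for("series_x")["shadow_quality"])  # higher tier on the wider GPU
```

This is the sense in which a two-week port can "dial up any number of settings": the knobs already exist for the PC build, so the per-platform work is mostly picking new values.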
I suppose my point is that, just as they always have when consoles had a power difference, many devs will absolutely opt to use that difference; in the case of almost any PC dev it shouldn't take a huge amount of extra work, and I am quite sure it will be noticeable, just as it always has been in the past. I will say again, though, that while I am skeptical Sony will be able to create much of an asset-quality divide, if they can somehow manage to stream in textures and assets at higher quality than the Series X, that could certainly give the system a very nice visual-quality boost. I really can't wait to see how (not if) the incredible NVMe drives in these systems help create some extraordinary things.
My hunch is that the GPU overclock for the PS5 was a bit of a last-minute change / hail mary, and that the Series X has a better balance of hardware vs cost, but I could certainly be wrong if Sony releases at a significantly cheaper price. I think you were bang-on when, I believe, you suggested doubting that the price difference would be any greater than $50 at most, if there is any difference at all.