Again, that comes down to blunt power, not definitive "superiority" for development or performance.
The PS5 GPU has a bunch of unique built-in instruments that may vastly improve performance, which the XSX doesn't have (nor anyone else, for that matter). For example, an entire chip dedicated to on-the-fly geometry optimisations (e.g. culling all polys on back-facing surfaces, all polys that are edge-on to the camera, merging polys that are near-identical, etc.). There was a whole other chip dedicated to similar optimisations, but I forget what it was. The Xbox Series X has no such in-built GPU systems, IIRC. It has impressive instruments too, but they're more about providing options for users and developers than optimising the actual workload (e.g. working HDR on BC games (definitely cool), plus a raytracing tool).
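To make the culling idea concrete, here's a minimal software sketch of the two geometry rejections described above (back-facing and edge-on polygons). This is purely an illustration of the concept in Python; it is not how the PS5's hardware actually does it, and the function names are made up for this example.

```python
# Illustrative sketch of backface + edge-on triangle culling.
# NOT the PS5's actual hardware pipeline; just the general idea.

def cross(u, v):
    """3D cross product."""
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

def dot(u, v):
    return sum(a*b for a, b in zip(u, v))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def cull_triangles(triangles, view_dir, eps=1e-6):
    """Keep only triangles that face the camera and aren't edge-on.

    triangles: list of (v0, v1, v2) vertex tuples in view space,
               counter-clockwise winding = front-facing.
    view_dir:  direction the camera looks along, e.g. (0, 0, -1).
    """
    kept = []
    for v0, v1, v2 in triangles:
        n = cross(sub(v1, v0), sub(v2, v0))  # face normal (unnormalised)
        facing = dot(n, view_dir)
        if facing < -eps:        # normal points back at the camera: keep
            kept.append((v0, v1, v2))
        # facing ~ 0  -> edge-on (zero projected area): culled
        # facing > 0  -> back-facing: culled
    return kept
```

Triangles that never reach shading are pure savings, which is why doing this early and in dedicated silicon (rather than per-pixel later in the pipeline) is such a big deal.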
Indeed, it's still up in the air whether anyone beyond first-party exclusive developers will make use of these really cool GPU instruments Sony developed. But from how Cerny described them, it sounds almost like a "switch" developers can flip, as long as their assets/architecture can be read by the GPU. And if it works, and really is that easy, it'll immediately apply a ton of graphical-load and performance optimisations automatically.
Perhaps that'll make up for that ~15% raw GPU power difference; perhaps not. We'll have to wait and see.
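For reference, that ~15% figure follows from the publicly quoted peak FP32 compute numbers for each console (marketing peak figures, not sustained real-world throughput):

```python
# Publicly quoted peak FP32 compute (theoretical peaks, not
# sustained real-world throughput):
ps5_tflops = 10.28   # 36 CUs at up to 2.23 GHz (variable clock)
xsx_tflops = 12.15   # 52 CUs at a fixed 1.825 GHz

gap = 1 - ps5_tflops / xsx_tflops
print(f"PS5 peak compute is ~{gap:.0%} below XSX")  # prints ~15%
```

Note the gap looks bigger the other way round (XSX is ~18% above PS5); "15%" is measuring PS5 relative to XSX.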
But again, the clear message from tech and development circles is that "raw power" does not mean "superior", be it GPU, CPU, or what have you.