This is SUCH a massive thread, and kudos for starting it with a record-straightening statement. But here's my thinking, for anyone still reading at this point:
If both consoles have SSD architecture by default, that's great news... BUT if developers target content for both platforms, the *slower* SSD standard is going to win. Unless Sony pays for PS5-exclusive DLC for open-world games, with levels that include XXX more geometry or YYY longer view distances, why would Ubisoft design core systems around streaming that would make one platform stutter and pause to load? The whole point of the SSD revolution (pun intended) is to increase the pool of memory/RAM/assets that the GPU/CPU can draw on at any given moment. That's entirely different from varying levels of CPU/GPU performance, which engines can account for by slashing effects or pixel resolution.
In other words: you can't drop your Xbox Series X game's resolution to 1440p and expect that to fix the seek-time difference against a 4K PS5 game. You could MAYBE change the LOD settings and have XSX games look fuzzier or blobbier when someone rapidly turns the camera. But even that's not an obviously "easy" fix. Targeting the platform with the slower SSD will be the much cheaper path for third-party multi-plat games in the first 1-2 years.
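To make the asymmetry concrete, here's a toy back-of-the-envelope sketch. The bandwidth figures are the publicly stated *raw* numbers for each drive (PS5 ~5.5 GB/s, Series X ~2.4 GB/s); the quarter-second "fast camera turn" window, the function name, and ignoring compression/seek overhead are all my own assumptions for illustration, not anything from a real engine:

```python
# Toy estimate: how much fresh asset data (geometry, textures) each
# console's SSD can pull in during a fast camera turn, using the
# publicly stated raw throughput numbers. Real-world rates vary with
# compression, queue depth, and engine overhead.

RAW_BANDWIDTH_GBPS = {"PS5": 5.5, "XSX": 2.4}  # GB/s, raw (uncompressed)

def streamable_mb(platform: str, window_seconds: float) -> float:
    """MB of new asset data loadable within the given time window."""
    return RAW_BANDWIDTH_GBPS[platform] * 1000 * window_seconds

for p in RAW_BANDWIDTH_GBPS:
    print(p, round(streamable_mb(p, 0.25)), "MB in a quarter-second turn")
# PS5 1375 MB in a quarter-second turn
# XSX 600 MB in a quarter-second turn
```

The gap is more than 2x, and it's a gap in *how much world can exist around the player*, not in how pretty it looks, which is exactly why a resolution slider can't paper over it.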