More like 900p, assuming resolution scales linearly with teraflop count and that the Series S is exactly 4tf rather than a little higher. This is a naive assumption, since we don't know the full technical specs of every system, and we don't know the requirements of the demo (which doesn't mean the result could only be lower - it could also come out a bit higher than 900p). For example, the Series S presumably has more memory bandwidth per FLOP than PS5 and Series X do: while it has only about a third of the tflops of Series X, it would have much more than a third of the memory bandwidth. In some games that won't make any difference; in others it might mean the console "punches above its weight" in terms of resolution. Memory bandwidth is far from the only unknown spec, however, and every game engine responds differently.
If you want to do similar naive calculations, the process is:
10.2 (PS5 peak) / 4 (assumed Series S) = 2.55
Square root of 2.55 = 1.597, which is the scaling factor. (We take the square root because resolution scales in two dimensions, while the "p" figure is only the vertical axis. For example, 2160p has 4x the pixels of 1080p despite the number only being "2x higher"; 2, the square root of 4, gives the correct per-axis factor.)
Divide the single-axis resolution (1440) by this factor to estimate approximately what a "PS5 game on Series S" would scale to. In this case, a 1440p PS5 game, scaling linearly with teraflops, gives about 901 vertical lines. Multiply by the scaling factor to go the other way, e.g. 1080p on Series S gives 1724p on PS5.
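The steps above can be sketched in Python (the 10.2 and 4 TF figures are from this post; results are truncated to whole vertical lines, matching the numbers here):

```python
import math

def scale_factor(tf_a, tf_b):
    """Per-axis resolution scaling factor between two GPUs, assuming
    pixel throughput scales linearly with teraflops. The square root
    converts the pixel-count ratio into a single-axis ratio."""
    return math.sqrt(tf_a / tf_b)

factor = scale_factor(10.2, 4)   # PS5 peak vs. assumed Series S, ~1.597

# 1440p on PS5 scaled down to Series S
print(int(1440 / factor))        # 901 vertical lines

# Inverse: 1080p on Series S scaled up to PS5
print(int(1080 * factor))        # 1724
```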
The scaling factor between Series X and Series S is 1.74 (example: 1080p on S = 1879p on X)
The scaling factor between Series X and PS5 is 1.091 (example: 1440p on PS5 = 1571p on X)
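Those two factors can be reproduced the same way, assuming the commonly cited 12.15 TF figure for Series X (which this post doesn't state itself):

```python
import math

# Assumed peak teraflops; the 12.15 for Series X is an assumption here
SERIES_X_TF, PS5_TF, SERIES_S_TF = 12.15, 10.2, 4.0

x_to_s = math.sqrt(SERIES_X_TF / SERIES_S_TF)    # ~1.74
x_to_ps5 = math.sqrt(SERIES_X_TF / PS5_TF)       # ~1.091

print(round(x_to_s, 2), round(x_to_ps5, 3))      # 1.74 1.091
print(int(1080 * 1.74))                          # 1879, the X-vs-S example
print(int(1440 * 1.091))                         # 1571, the X-vs-PS5 example
```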