Wow at all the salt here... It's very clear that, as long as there's enough VRAM, PC graphics cards perform the same as their console equivalents, and that's normal... because it's literally the same hardware running the same code! The whole "consoles somehow get magically better than PCs as the generation goes along" idea is usually nonsense, because people compare apples to oranges all the time.
With that said, the next-generation consoles are crazy powerful for the price, especially if ray-traced effects are kept to a reasonable level. Nvidia cards obviously have a huge performance advantage in ray-traced effects compared to RDNA 2, but with low sample counts and similar compromises you can still get very good results.
While I agree that the PC's alpha effects running at a higher resolution mean this isn't a 1:1 comparison, strictly based on the benchmark stats in the video, "running between a 2070 Super and a 3070" isn't really accurate. PS5 performance averages closer to the 3070.
I thought the 2080 comparison was with respect to rasterization only. With Nvidia cards readily outperforming AMD ones in RT scenarios, I figured the PS5 would fall somewhere in the 2060 or 2070 ballpark.
In games that are heavier on RT effects, for instance Watch Dogs Legion, the new consoles were performing at the level of a 2060 Super, IIRC. There was another Digital Foundry video about that.