I have heard many times before that Ultra settings tend to have a serious performance penalty for what is arguably a minimal visual improvement - though of course that'll vary from game to game.
Frostbite is a nasty culprit for diminishing returns at Ultra settings. Good PC ports now often show you screenshots of what is being improved at each settings tier, and oftentimes Ultra is immediately and clearly better: stupidly long draw distances, more and higher-fidelity lighting, denser foliage and, of course, uncompressed textures.
Older software definitely has a worse problem with Ultra being far too demanding relative to what the player gets in return. Ultra shadows in BioShock Infinite come to mind: dropping to High gains back substantial performance while showing virtually zero loss in quality, even under scrupulous examination.
It's why I wait to read user reviews and watch tech breakdowns like those from Digital Foundry; if there is some bizarre anomaly, they'll cover it. I game on a 4K monitor with a 1080ti, and I usually just let things run wide open. I only disable smeary AA and screen effects as standard.
I also welcome the new consoles; they are the main thing that will force NVIDIA to really reinvigorate their product line. Some games, even at low settings, have outstanding IQ and framerates on consoles (and often at 4K as well!). NVIDIA's new cards need to be dramatically more performant than the consoles to serve multiple markets. I think when we finally get benchmarks, the 3080ti will be close to twice as potent as a 1080ti (which is still a powerhouse).
That excites me greatly. It also kind of makes me sad. Even with double the performance, a lot of current and not-so-old titles will still be VERY difficult to run at 4K. Like, I could probably run Jedi: Fallen Order at only around 80 fps with everything cranked. I can't imagine how underpowered these cards will feel by the time next-gen games get PC ports and the graphics demands become insane.