This forum has a warped perception of the average upgrade cycle. 3-4 years for a GPU upgrade is normal, not some unrealistic expectation. You get barely any value upgrading every generation and the leaps are tiny.
The 3080 and 3070 only being viable for high-end PC gaming for 2 years would be a complete disaster, and is no sort of endorsement.
That's the unfortunate reality though when you buy hardware at the start of a new gen. None of the new cards are all that much faster than the consoles, so by the standards of people who buy a GPU for $500+, I'd say these cards are not going to last much longer than 1-2 years if you want to match consoles at 60fps. You'd need at minimum 2x the power for that, and none of the new cards offer it. No amount of VRAM is going to make up for that. 3-4 year upgrade cycles are only realistic a couple of years into a console gen, when GPUs are already much faster than the GPU baseline of the consoles.
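To spell out where the 2x comes from (rough arithmetic, assuming frame time scales roughly linearly with GPU throughput): consoles target 30fps in their demanding modes, so matching them at 60fps at the same resolution and settings means rendering 60/30 = 2x the frames per second, i.e. roughly 2x the GPU throughput, before you raise a single setting.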
If you're uncomfortable with spending $500+ on a GPU that will only last for 2 years, then I suggest you cut your budget and buy a console or a $200-300 GPU to last through the next 2 years.
I do think the 10GB 3080 is in a much better spot than the 3070 due to the Series X memory allocation. We may end up with a scenario where the 3070 needs to use Series S assets, which wouldn't be a good look for a $500-$600 GPU.
Once next gen starts you're going to have to drop settings or resolution anyway to get a decent framerate. Also, consoles have shared memory, so the 3080 and 3070 each have more effectively available memory than the XSX and XSS respectively. Not that it will help them.
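Back-of-envelope, using Microsoft's published memory figures (the OS reservations are approximate): the Series X has 16GB shared between CPU and GPU, with roughly 2.5GB reserved for the OS, leaving ~13.5GB for games, of which 10GB sits in the fast "GPU-optimal" pool; a 3080 has 10GB of dedicated VRAM with system RAM on top of that. The Series S has 10GB shared with roughly 2GB reserved, leaving ~8GB for the entire game; a 3070 gets 8GB of dedicated VRAM plus system RAM.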
Either people overestimate the new GPUs or they underestimate the power of next gen. I had this same discussion last gen, when people were contemplating whether to buy the 2GB or 4GB version of the 770/780. I argued that the 4GB version was a waste of money, as none of those cards would last long enough for it to matter once next gen hit, and I was right: the 4GB versions were trash 1-2 years into the gen too. Now, Kepler did age spectacularly badly, but the gap between high-end GPUs and consoles was much, much larger then. Even a 7970 was more than twice as fast as the consoles back then. The 6900 XT isn't even that.
Are you being deliberately obtuse? I, and many others, have been gaming in native 4K on PC for years. Although in very demanding games you may need to cap FPS to 30, you still need the same amount of VRAM at 4K; it doesn't really matter whether you run at 30, 60, or 120fps.
Hitting 4K for the past few years in current gen games and hitting 4K in next gen games are two different things. How do you think a 3080 is going to fare at 4K in a next gen 30fps game with a dynamic res that drops to 1440p? It's not going to do well at all, not even if you're targeting 30fps. It'll be okay for 60fps 4K games and games that actually do target native 4K, but it's naive to think there won't be games that target 30fps at sub-4K resolutions on consoles.
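To put rough numbers on that (assuming GPU load scales roughly with pixel count): 4K is 3840x2160 ≈ 8.3M pixels and 1440p is 2560x1440 ≈ 3.7M pixels, a 2.25x difference. So if a console game already has to drop to 1440p to hold 30fps, running it at native 4K30 takes roughly 2.25x the console's GPU throughput at matched settings, and that's a bigger margin over the XSX than any of the new cards have.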