"What GPU should I get to guarantee I can run every game in 4K ultra for the entire generation?"
All this talk of whether 10GB of VRAM is or isn't enough has triggered some nostalgia. Anyone who's PC gamed for any serious length of time will know that at some point the temptation to future-proof your setup will hit you and, oh, will you waste a ton of time and money trying to make it happen. You'll spend a lot of money on something bleeding edge, only to watch it get completely outclassed by cheaper mainstream offerings within a couple of years.
I have a friend who read in PC Gamer that multi-core was the future of gaming, so in 2002 he rushed out and spent a fortune on a dual-socket workstation motherboard and two Northwood-based Xeons (ordinary Pentium 4s couldn't run in pairs), presuming he'd be future-proofed "for at least ten years" against any dual-core CPUs Intel put out. Alas, his CPUs were massively outclassed by the Prescott range (thanks to doubled cache, 64-bit support and SSE3) that launched just two years later, and it only got worse from there. I don't think any game he played ever troubled that second CPU and, my god, was it loud. By the time Intel released the Core 2 Duo range, he'd long since swapped his pointless workstation for an Alienware prebuilt.
I have nothing quite as ridiculous as that, but I still look at some of my PC purchasing decisions and shudder. In 1998 I decided to go all in on Zip disks on the basis that they were becoming a "universal standard". What was making them "universal"? One of the PCs in the lab at college had a drive, and the news that Sega were planning a Zip drive for the Dreamcast... Of course, I'd replace it with a CD-RW drive a couple of years later. The maker?
Oh, and there was that time in 2010 I decided to put 32GB of DDR3 into a build at great expense. Don't think I ever touched half of that before my motherboard fried itself. For reference, my current build has 16GB of DDR4 and I'm perfectly happy with that.
So what have you done to future-proof your builds? Don't worry, we won't judge. Much.