
Burai

Member
Oct 27, 2017
2,082
"What GPU should I get to guarantee I can run every game in 4K ultra for the entire generation?"

All this talk of whether 10GB of VRAM is or isn't enough has triggered some nostalgia. Anyone that's PC gamed for any serious length of time will know that at some point in your life, the temptation to try to future-proof your setup will hit you and, oh, will you waste a ton of time and money trying to make that happen. You'll spend a lot of money buying something bleeding edge, which is completely outclassed by cheaper mainstream offerings within a couple of years.

I have a friend who read in PC Gamer that multi-core was the future of gaming so he rushed out and spent a fortune on a workstation motherboard and two Northwood Pentium 4s in 2002, presuming he'd be future proofed "for at least ten years" against any dual-core CPUs Intel would put out. Alas, his CPUs would be massively outclassed by the Prescott range (thanks to doubled cache, 64 bit support and SSE3) that launched just two years later and it only got worse from there. I don't think any game he played ever troubled that second CPU and, my god, was it loud. By the time Intel released the Core Duo range, he'd long gotten rid of his pointless workstation for an Alienware prebuild.

I have nothing quite as ridiculous as that but I still look at some of my PC purchasing decisions and shudder. In 1998 I decided to go all in on Zip discs on the basis that it was becoming a "universal standard". What was making it "universal"? One of the PCs in the lab at college had a drive and the news that Sega were planning a Zip drive for the Dreamcast... Of course, I'd replace it with a CD-RW drive a couple of years later. The maker?

[image]


Oh, and there was that time in 2010 I decided to put 32GB of DDR3 into a build at great expense. Don't think I ever touched half of that before my motherboard fried itself. For reference, my current build has 16GB of DDR4 and I'm perfectly happy with that.

So what have you done to future-proof your builds? Don't worry. We won't judge. Much.
 
Jul 1, 2020
6,521
Putting an ATI Radeon X800 Pro in the first PC I ever built. Within a year or so games started requiring shader model 3 which the X800 line didn't support.
 

Niosai

One Winged Slayer
Member
Oct 28, 2017
4,919
I, uh...upgraded from an i3 to an i5 last month. Totally good for 4K 144 FPS Cyberpunk, right?
Jk. I'm not in a place financially to "future proof"
 

Blade30

Member
Oct 26, 2017
4,610
The thing that comes to my mind is my monitor, which I bought in 2009. It's a 16:10 monitor with a 1680x1050 resolution. At the time it was fine and supported by pretty much all games, but a few years later there were games I encountered that didn't support that resolution (especially Japanese games), so I get black bars and have to lower the resolution. I should have gone with a 16:9 monitor instead 🤦
 

sir_crocodile

Member
Oct 25, 2017
23,479
I have nothing quite as ridiculous as that but I still look at some of my PC purchasing decisions and shudder. In 1998 I decided to go all in on Zip discs on the basis that it was becoming a "universal standard". What was making it "universal"? One of the PCs in the lab at college had a drive and the news that Sega were planning a Zip drive for the Dreamcast... Of course, I'd replace it with a CD-RW drive a couple of years later. The maker?

[image]

Zip drives were fantastic for transferring data though. I used them up till 2005ish, when USB drives became reasonably priced for 64MB of storage.

I was only using CDs for long-term data storage until HDDs became cheap. RWs were a waste of time imo.
 

Isee

Avenger
Oct 25, 2017
6,235
Bought a Blu-ray drive when the PS3 released.
Thought PC games would start shipping on Blu-rays too.

Other than that, I'm fully aware that every kind of hardware will be obsolete one day.
There is no "performance" future proofing imo.
 
Sep 12, 2018
19,846
Upgrading from an i5 3570K with DDR3 RAM to an i5 4670K with DDR3 RAM on my second build lol. I might have overcompensated on my third go, getting an i9 9900K.
 

Sanctuary

Member
Oct 27, 2017
14,198
I can't really think of anything aside from buying a Radeon 5850 in the fall of 2010 only to turn right around and buy a GTX 580 about a year later. Normally I just do a full system build every 3-5 years and buy a new GPU every 3-4 years as well, but more recently I started having two PCs at any given time. My previous gaming PC becomes my "old game/storage" PC whenever I make a new one, and my current storage PC is from the end of 2013 and still playing modern games just fine. Of course, it had a hand-me-down GPU upgrade via GTX 1080, so maybe that's cheating.
 

Jazzem

Member
Feb 2, 2018
2,680
Not the most disastrous, but I almost immediately regretted building my PC in 2016 with a GTX 960 instead of a 970 D: Especially seeing how many recommended specs soon after started at the GTX 970. Somewhat similar story too with going i5 6600 instead of i5 6600k. Updated both since thankfully
 

oRuin

Member
Oct 25, 2017
718
Friend bought a VERY early TFT monitor to replace his CRT. The pixel response and blur made it impossible to play pretty much anything. Terrible.
 

Serule

Member
Oct 25, 2017
1,766
I updated from a 960 to a 1660 last year, but the 1660 gets so loud that I run it at lower settings all the time. (Both were blower coolers)
 
Oct 26, 2017
3,913
I bought a blu-ray RW drive back in 2009 thinking that PC media would be moving over to BDs. Didn't really think everything would go digital.

That said I've still got it, it still works and on occasion I have put a blu-ray in my PC. Definitely a waste of money though.
 

Deleted member 24021

User requested account closure
Banned
Oct 29, 2017
4,772
Buying an i5 because everyone told me that it's great for gaming and that I don't need an i7.

That turned out to be bullshit, I can't play open world games without the framerate tanking.
 

Andri

Member
Mar 20, 2018
6,017
Switzerland
Def did the upgrade to more DDR3 RAM, then had to get a new CPU/mobo with DDR4 way before games ever needed more than I had before the upgrade.
 
Oct 28, 2017
1,916
Probably the AGP version of the Nvidia 7300 GT, which I purchased for running games with shader model 3, but then several games wouldn't even start, complaining about my PC's lack of shader model 3 support.
I remember beating Serious Sam 2 for the first time on my uncle's computer, which I only had access to on Sundays because of this, LOL
 

Theswweet

RPG Site
Verified
Oct 25, 2017
6,402
California
Bought an HD 7850 1GB which I had to swap out within a year for a card with more VRAM. My R9 290 Tri-X + i5 3570k build worked great until they died within 24 hours of each other in 2018 at least!
 

kami_sama

Member
Oct 26, 2017
6,998
Bought an HD 7850 1GB which I had to swap out within a year for a card with more VRAM. My R9 290 Tri-X + i5 3570k build worked great until they died within 24 hours of each other in 2018 at least!
That sounds like a PSU failure that took them both to the grave. RIP
I don't think I've had things get outdated soon after buying them. I've regretted some things though.
 

Symphony

Member
Oct 27, 2017
4,361
Constantly buying problematic motherboards.

Oh, I'll get a 2500K i5, I can overclock it later on... except the motherboard doesn't support overclocking despite every other model in the range doing so. Oh, I'll get this Ryzen motherboard, it ticks all the boxes and I can upgrade to 32GB of RAM later on... except the motherboard can't support more than 2 sticks at the higher frequency, and the extra 2 I bought basically bricked the whole PC for a while when I put them in.

Not a problem I think I'll ever avoid since motherboards are the one thing that continue to confuse me after all these years, so many ridiculous little things you have to look for.
 

Deleted member 20471

User requested account closure
Banned
Oct 28, 2017
1,109
I bought a GTX 770 at the beginning of 2014. I thought that 2GB of VRAM was enough to play PS4/Xbox One games... oh boy, I was wrong D:
 

kaputt

Member
Oct 27, 2017
1,204
I always thought about it. When the time comes to splash my money on more expensive parts, I have second thoughts and end up buying cheaper ones.

This is a very interesting topic. I've always felt that there's a sweet spot in price x performance, and that I shouldn't spend more than a certain amount. When you look backwards at PC parts, that does seem to be the case.

I think the best way to future-proof nowadays is to wait for the release of better parts after new consoles arrive. Then you will have parts that are a step above the consoles, and games won't demand much more than that, if you're happy to play with console-equivalent settings. I own a GTX 970 and can play most of the games released in a way better fashion than on my PS4, sometimes locking them to 30 FPS but with more stable performance and fancier graphics (Control, for example).

So future-proofing on PC is much more a decision of settling your standards. If you always want to run games on ultra at 60 FPS or above, you'll need to constantly spend big on PC with frequent upgrades; I believe there's no other way.

The other day I watched a Linus Tech Tips video that compared a wide variety of builds based on their price range, and it's interesting to see how much pricier the high-end builds are. I don't think the benefits are worth it, especially considering how much you need to spend to keep that level of fidelity in future years:

youtu.be

Cheap vs. Expensive Gaming!?
 

Terbinator

Member
Oct 29, 2017
10,206
I've done so many side-grades it isn't even funny.

680 -> original Titan or BD-RW drive is up there, though.
 

Theswweet

RPG Site
Verified
Oct 25, 2017
6,402
California
That sounds like a PSU failure that took them both to the grave. RIP
I don't think I've had things get outdated soon after buying them. I've regretted some things though.

Maybe. I had hard reboots happening on that old rig 6 months prior that I (correctly) diagnosed as the PSU going south, but I wouldn't be shocked that it went undiagnosed for long enough that it did some lasting damage, even if the PSU I swapped it out with fixed the immediate problem.
 

Yibby

Member
Nov 10, 2017
1,777
I have nothing quite as ridiculous as that but I still look at some of my PC purchasing decisions and shudder. In 1998 I decided to go all in on Zip discs on the basis that it was becoming a "universal standard". What was making it "universal"? One of the PCs in the lab at college had a drive and the news that Sega were planning a Zip drive for the Dreamcast... Of course, I'd replace it with a CD-RW drive a couple of years later. The maker?
Yeah, I had a Zip drive, and so did one other student at my school. So we were the only ones who could share some useless data lol.

shoutout to everyone who bought a physx card
I miss PhysX effects in games... like the fog in the Batman games. I wish they would add PhysX effects to games again.
 

Linus815

Member
Oct 29, 2017
19,704
*sees PC magazine article about how beastly the upcoming GeForce 2 GTS/Ti/Ultra are*
Hey dad, can you buy me a GeForce 2?
Sure son.
*Dad comes home with a GeForce 2 MX 200 a week later*
Bless his soul... but that thing was almost the same speed as my Voodoo 3 lol

I also remember getting a Radeon HD 6990, thinking how it's gonna serve me for years because dual GPU is the future. Replaced it after 4 or 5 months IIRC: constant problems, insane heat and noise, and the CrossFire support was just garbage
 

kami_sama

Member
Oct 26, 2017
6,998
Maybe. I had hard reboots happening on that old rig 6 months prior that I (correctly) diagnosed as the PSU going south, but I wouldn't be shocked that it went undiagnosed for long enough that it did some lasting damage, even if the PSU I swapped it out with fixed the immediate problem.
What might have happened is that some overvoltage protection was kicking in and that's why you got reboots.
Until one day it didn't kick in and fried something.
 

Shadow

One Winged Slayer
Member
Oct 28, 2017
4,102
Bought a 2009 iMac in April 2010. Could have waited 3 months and got the 2010 version with a superior GPU and a better CPU for around the same price. Not the worst, but I still could kick myself.
 

Theswweet

RPG Site
Verified
Oct 25, 2017
6,402
California
What might have happened is that some overvoltage protection was kicking in and that's why you got reboots.
Until one day it didn't kick in and fried something.

Just to be clear - the PSU I swapped in 100% wasn't the problem. I used that one for well over a year in my current rig with no issues, only swapping it out for some extra headroom when I upgraded to a 3900x.

My theory is that the old PSU did some unknown damage to my aging motherboard, which did some damage to my GPU. Oddly enough, that GPU still *works*... on Linux, somehow. But even on a completely stock Windows 10 install it will black screen, refusing to boot to desktop. It actually kicked the bucket (and later, the motherboard/CPU) during a LAN party, and I was able to confirm that odd issue with the GPU on a friend's rig, where it exhibited the same exact behavior.

What actually happened is that the GPU exhibited the problem partway through the LAN party, so I had to grab a replacement, and then while *that* worked far enough to install the drivers, upon reboot the system refused to post entirely, and *nothing* I did could fix it.

Needless to say, that was a very bad weekend...
 

matrix-cat

Member
Oct 27, 2017
10,284
I bought a 144Hz monitor right before G-Sync came around. They started making G-Sync conversion kits for some monitors (basically all the circuitry for an entire new monitor that you were supposed to fit into your old monitor's shell), but they never did my model, so it was just obsolete almost immediately.
 

Dyno

The Fallen
Oct 25, 2017
13,242
I bought a 580 3GB years ago with a 2600K and it lasted me like 6 years. When I finally got a 970 after the death of the 580, it performed slightly better. That MSI Lightning Xtreme OC 580 was fucking ridiculous tbh.
 

zoggy

Banned
Oct 28, 2017
1,203
quarantine

Started gaming again and then realized how much I missed PC compared to playing on consoles. Didn't have a PC in 4 or so years, so I had to build a new one.

Thought why not, I have money, let's get the best card! i
Well, the 2080 Ti isn't outdated at all, it will run every game you throw at it more than fine at every resolution. It's still faster than 99% of all graphics cards :D

literally unplayable
 

CreepingFear

Banned
Oct 27, 2017
16,766
Future proofing for mass storage by buying 4 and 6 TB hard drives when I should have bought a NAS years earlier.
 

dodo667418

Knights of Favonius World Tour '21
Member
Oct 28, 2017
1,694
Bought a 2070 last year, weeks before the Super variants came out at basically the same price. I felt pretty stupid; since it was the first generation of cards with tensor cores, there was bound to be some card with better value and improved performance right around the corner. In hindsight I would have waited for the RTX 30XX series, but all in all I'm not too fussed. My 2070 still delivers some great performance, and my CPU is the bottleneck anyway (i5 6600K), so I will upgrade that first. Currently waiting for the next Zen release, and I'll either get the newest gaming-value CPU or the Ryzen 7 3700X if it gets cheap enough. I just want some more cores so I can play open world games without all these hiccups.
 

Xiofire

Prophet of Regret
Member
Oct 27, 2017
4,133
Bought a 780Ti thinking it'd see me through a few generations.

The 960 handily beat it for much less money not long after.
 

Plidex

Member
Oct 30, 2017
1,153
In terms of PC hardware I always go for the performance/price sweetspot.

I guess a fail could be buying an i9 MacBook Pro and, a couple of months later, Apple announcing they are moving to ARM. But I wouldn't be able to play Flight Simulator on an ARM laptop, so I don't feel it's a fail. It will probably hurt resale value though.
 

Iztok

Member
Oct 27, 2017
6,133
I mean, this is a stealth jab at recent 2080 Ti buyers, isn't it?

The closest I got was my 2080S/i9 9900k build from last November.
Not too bad, though, quite pleased with it actually.