Jul 18, 2018
5,861
Battlefield 4 killed my GPU (GTX 560 Ti) because of that one indoor snow/mountain map (Locker?). It was just 50 billion grenades, RPGs and EOD bots all being spammed. Saw the block artifacts and that was it.
 

Chackan

Member
Oct 31, 2017
5,097
I never had a high-end PC, so gaming at the highest res possible with max settings was never a thing.

But I remember playing Quake 2 (or was it Unreal?) and the game performing great, or so I thought (ignorance is bliss...).

Bought a Voodoo Banshee, and that was when I first tried the Q2 benchmark with OpenGL. Man, those graphics and that framerate!

Only started noticing real issues with framerate in the PS3 gen, though.
 

ray_caster

Member
Nov 7, 2017
664
That leaked alpha of Doom 3 crushed my Geforce 4 Ti 4600.
Definitely. I remember running this at, like, 10 or so frames per second on a Geforce 3 in 2002.

Growing up with PCs in the '90s and early '00s, any time a year had passed since you bought your computer, it was outdated, because the tech was moving so fast. I remember Doom putting quite a strain on my computer at the time, which I believe was a 386. The same goes for Quake on a 486 (minimum requirements were Pentium, but fuck it), and Doom 3 on my top-of-the-line computer in 2004. Then we have Crysis, of course, on my brand-new computer in 2007. That said, plenty of games in between those releases were maxing out my computer at the time.
 

Tankshell

Member
Nov 1, 2017
2,117
OG Far Cry, 2004. My PC had never seen such lush, dense jungle environments. I still remember the day it came out and how badly it ran on my rig, but that first beach level was soooooo awesome!

 
Last edited:

Iztok

Member
Oct 27, 2017
6,136
The one that bothers me the most is Ghost Recon: Wildlands.

This game is just poorly optimized.

Played it first on my 980 Ti, and I had to make so many sacrifices to get close to 60, but could never really lock it in.
Now with a 2080S, I'm at least playing at 1440p rather than 1080p, but performance is still terrible.
Breakpoint works a lot better, but it's a terrible game in comparison.

Wildlands is probably my favorite coop game of the generation, I still play it today.
 

Yataran

Member
Jul 17, 2018
438
Copenhagen, DK
Recently, Forza Horizon 3. I bought it after it was announced it was going to be delisted, and then installed its 50+ GB. I wasn't expecting anything amazing from my very humble PC, but definitely a bit better than what I got, based on some YouTube videos with hardware I assumed was similar to mine (GT 1030). Extremely poor framerate, which I think was caused mainly by the CPU (i3 3240). I've been wondering whether I should upgrade the PC or buy a Series S... Most of the PC games I have run well on that setup; FH3 is the newest and most demanding one.

However, that experience doesn't compare with trying to run Quake in my 486 DX4 back in the day. The only way to have a minimally decent framerate was to play it in the smallest screen size possible.
 

SexRanger

Member
Nov 13, 2017
177
Finland
Shiny Entertainment's Sacrifice.

Made my Celeron 466 MHz + Voodoo 3 2000 AGP + 128 MB RAM machine weep.

The minimum required for this game was 300 MHz + 64 MB RAM, so I was sure it was going to run well. Oh boy, with max settings it sure didn't.

Next year, Max Payne forced me to upgrade.
 

Winstano

Editor-in-chief at nextgenbase.com
Verified
Oct 28, 2017
1,828
R5 3600 + RTX 3080... Flight Sim and Cyberpunk maxed out make it scream. On the plus side, I've not needed to buy a space heater for my room this winter.
 

.exe

Member
Oct 25, 2017
22,229
I'm wary of playing Yakuza Kiwami 2 or Shadow of the Tomb Raider for too long, because they seem to stress every single component of my dated PC. Very noisy, very hot.
 

defaltoption

The Fallen
Oct 27, 2017
11,486
Austin
Skyrim is what made me build my first gaming PC. I tried it on my father's PC after upgrading the RAM and got like 8 to 15 FPS.

Crysis 3 and The Witcher 3 both made me upgrade my entire setup again.

Besides that I can't remember my PC really "struggling", but I haven't been on PC for the most part in about 2 years, so I haven't played C2077 or Flight Sim on PC.
 

Deleted member 9241

Oct 26, 2017
10,416
Tomb Raider 1 really pushed my Monster 3D card back in the day. The original Unreal was quite the test for the time too if I recall.
 

Crowh

Member
Nov 20, 2017
333
Cyberpunk was the first game that made my 9900K cry, and my 3090 wasn't having a really good time either lol
 

Uzzy

Gabe’s little helper
Member
Oct 25, 2017
27,172
Hull, UK
DOOM, then Tiberian Sun; had to buy a new hard drive for that!

This millennium, though, my 980 Ti is starting to show its age, struggling on a chunk of the recently released AAA games.
 
May 17, 2018
126
Excluding unoptimized games like WD Legion and Cyberpunk, MS Flight Sim and Fortnite with full RTX bring my 3080 to its knees. But in Flight Sim's case it's also a big CPU strain, and I'm still on an 8700k for now.

I'm pleased it's not just me who was shocked that Fortnite with RTX and DLSS didn't run well. Mind, I feel the game doesn't really benefit from RTX (40 fps at 4K on a 2080 Ti).
 

exodus

Member
Oct 25, 2017
9,949
Microsoft Flight Simulator is the new stress test for GPUs. Wipes the floor with my overclocked 3090 without breaking a sweat.
It makes my 2080Super run like most games on the PS3 did. It is the undisputed king.

Really? With a 6700K and 2070S, I'm still mostly CPU limited at 1600p. Is your GPU actually at 100% utilization or are you just assuming you're not CPU limited since your threads aren't maxed?
 

snausages

Member
Feb 12, 2018
10,352
It's Flight Sim, but Cyberpunk is also the first game I've played where it seems like my GPU and CPU get equally kicked about the place (9700K, RTX 3070).

First time I actually heard my power supply fan as well.
 

DPB

Member
Nov 1, 2017
1,853
Ultima IX. I couldn't get it to display correctly the first time I tried it (everything was blue and untextured). Even after upgrading my graphics card it still ran horribly. I tried it years later, when it ran comfortably, and it's a pretty poor game.
 

Atolm

Member
Oct 25, 2017
5,828
Detroit: Become Human is pushing my 6700K to 80-90% usage and the RTX 3070 to around 95-100%.
 

Zissou

Member
Oct 26, 2017
1,889
I try to time my PC build/upgrade for after the new console generation gets going, when a graphics card comes out that 1) doesn't cost a fortune and 2) can play console ports at max settings without batting an eye. I ended up with a regular 1080, which fulfilled those requirements. Control was the first game where it clearly struggled, and Cyberpunk (though probably partly for optimization reasons) was even worse. I play a lot of indie stuff, which tends to be less demanding, so I'll upgrade in a year or two.
 

Yopis

Banned
Oct 25, 2017
1,767
East Coast
Control and Cyberpunk. Control only because the overclocked GPU hits its highest temps there. Red Dead also gives it a workout at 4K; it chews through anything else.

3080 Strix OC edition, with an additional +700 memory and +120 core on top of the factory OC, stable.

Control hits 99% usage, same for the others at times.
 

HBK

Member
Oct 30, 2017
7,978
The X series. Latest incarnation: X4: Foundations.

A hybrid empire management/space sim that, even when decently optimized, can eat CPUs for breakfast.
 

Buddy

Member
Oct 25, 2017
1,295
Germany
My i7 6700K/GTX 1080 combo is a fighter... It plays every game I throw at it at good quality settings and >60 fps.

But I haven't tried the newest ones like Flight Simulator or CP2077.
 

Anddo

Member
Oct 28, 2017
2,856
The Witcher 2. It brought my Radeon-based laptop to its knees; the only playable resolution was 720p at sub-30 frames.
 

AmirMoosavi

Member
Dec 10, 2018
2,023
Outcast's requirements were utterly insane for 1999. A 500 MHz CPU? 1 GIGABYTE of hard drive space?? At the time our family PC had a 100 MHz CPU with about 1 GB of total hard drive space; I had to wait another year or so before I could play it.
 

AmirMoosavi

Member
Dec 10, 2018
2,023
I'm wary of playing Yakuza Kiwami 2 or Shadow of the Tomb Raider for too long, because they seem to stress every single component of my dated PC. Very noisy, very hot.

I have Yakuza Kiwami 2 and Rise of the Tomb Raider. I can't get 1080p60 with High on Kiwami 2, but my bigger frustration is how poor hair looks in the game unless you absolutely max out SSAA. With RoTR the AA options that suit my laptop don't look great, and it's crashed to desktop a couple of times with a D3D error so I've put it aside now. Makes a good benchmarking tool, though.
 

.exe

Member
Oct 25, 2017
22,229
I have Yakuza Kiwami 2 and Rise of the Tomb Raider. I can't get 1080p60 with High on Kiwami 2, but my bigger frustration is how poor hair looks in the game unless you absolutely max out SSAA. With RoTR the AA options that suit my laptop don't look great, and it's crashed to desktop a couple of times with a D3D error so I've put it aside now. Makes a good benchmarking tool, though.

Yeah, the AA in both of those is really lacking. Lots of subpixel details that resolve in a jagged mess + not great AA :(

Injecting SSAA helps a little with Kiwami 2, but it's still very noisy overall.
 

oneils

Member
Oct 25, 2017
3,089
Ottawa Canada
Skyrim is what made me build my first gaming PC. I tried it on my father's PC after upgrading the RAM and got like 8 to 15 FPS.

Crysis 3 and The Witcher 3 both made me upgrade my entire setup again.

Besides that I can't remember my PC really "struggling", but I haven't been on PC for the most part in about 2 years, so I haven't played C2077 or Flight Sim on PC.

For me it was partly due to another BGS game: Morrowind.


I tried to play it, in 2001, on a Pentium 4 pre-built machine (from FutureShop) that had 128 MB of Rambus RAM and an nVidia Riva TNT2 64 MB graphics card. I think that card is from 1999, and I got like 4 frames per second in Morrowind with it (weirdly, I specifically remember NVIDIA having a lower-case "n" back then...).

Before I ever built a PC, though, I bought the original Xbox to play Morrowind. Then I built a rig in 2003 or 2004 and played Morrowind all over again.
 

Menchin

Member
Apr 1, 2019
5,174
RDR2, before the performance patches, destroyed my 2080 Ti and 9700K.

Even Cyberpunk 1.0 ran better than that.
 

MachRc

Member
Oct 6, 2020
44
[Box art: Links: The Challenge of Golf (DOS)]

I tried copy con'ing the autoexec.bat and the config.sys.
I tried to free up as much conventional and expanded memory as possible.

I probably became the person who I am, because of this roadblock on my 286.
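For anyone who never fought that fight, the usual DOS 5/6 tuning looked something like the sketch below (the C:\DOS paths and the MOUSE.COM driver are stand-ins, and EMM386 needed a 386; on a 286 like mine, expanded memory meant a hardware EMS board instead):

```
REM --- CONFIG.SYS: load DOS high, open upper memory blocks, enable EMS ---
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE RAM
DOS=HIGH,UMB
FILES=30
BUFFERS=20

REM --- AUTOEXEC.BAT: LOADHIGH (LH) moves TSRs out of the lower 640 KB ---
LH C:\DOS\MOUSE.COM
```

MEMMAKER on DOS 6 eventually automated most of this.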
 
Nov 1, 2017
1,624
Yakuza Kiwami 2 and Avengers, on a GTX 1060 3GB.
2560x1080 was too much for it; I think I hovered around 25-40 fps after tweaking the settings (a combination of medium and high). Last summer I thought how great it would be to have a new GPU so I could play at high settings at 60+ fps. Bought a 2060 Super and was satisfied with its performance in the game... I could also play games like Rise of the Tomb Raider at max settings. Unfortunately, the Super died and it took two months to get a replacement from Nvidia. It was around that time that the 3000 series was announced. I knew there had been rumors, but the talk I'd been hearing about the performance gains seemed too good to be true at the time.

While waiting for Nvidia to send me another 2060S, I bought into the hype and purchased an Avengers key from someone on here. Avengers was the first game that felt genuinely unplayable on my GPU. Part of it may have been how broken the game was at launch, but I also knew the 1060 was showing its age.

edit: wasn't expecting to see so many mentions of Kiwami 2 lol
 
Last edited: