
Tygre

Member
Oct 25, 2017
11,193
Cheshire, UK
Yes, of course they will, just like with every other console transition. All PCs will become worthless junk overnight, that's just how it works.


Honestly, every fucking console generation, the exact same shit. Some people are either credulous rubes or just have zero historical perspective.
 

scabobbs

Member
Oct 28, 2017
2,109
No, unless you're gaming at 4K on PC. You'll undoubtedly have to upgrade many parts to achieve 4K ray tracing. As someone staying at 1440p/144 Hz, I'm certain I'll be fine with my 1080 Ti and 3900X.
 

Monster Zero

Member
Nov 5, 2017
5,612
Southern California
So you're basically basing this obsolescence claim on a hypothetical scenario where RDNA2 on consoles will have "significantly improved" ray tracing performance over a 2080 Ti. I wouldn't get my hopes up about AMD in the GPU space right now, even with the RDNA2 GPUs they're set to announce next month.

(gif: Lindsay Lohan spits out her drink)

Prepare for another Radeon VII moment.
 

Pargon

Member
Oct 27, 2017
12,109
Every generation this happens... PC GPU owners will of course be fine.
Back when the PlayStation 4 launched, it was a 1.84 TFLOPS machine while PCs had 5 TFLOPS GPUs available (780 Ti).
The XSX is going to launch with a 12 TFLOPS APU while the current fastest GPU you can buy today is 13 TFLOPS - for $1200+.
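To put rough numbers on the GPU side (nominal TFLOPS only, which aren't comparable across vendors - just a back-of-envelope sketch):

```python
# Rough launch-window gap between the fastest consumer GPU and the console,
# in nominal TFLOPS. Illustrative only - TFLOPS don't compare across vendors.
ps4, gtx_780_ti = 1.84, 5.0        # 2013 launch window
xsx, rtx_2080_ti = 12.0, 13.45     # 2020 launch window (rumoured XSX figure)

print(f"2013: GPU is {gtx_780_ti / ps4:.1f}x the console")   # ~2.7x
print(f"2020: GPU is {rtx_2080_ti / xsx:.2f}x the console")  # ~1.12x
```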

The PlayStation 4 launched with a weak 1.6 GHz netbook-class CPU, which was dwarfed by even older CPUs that had much higher IPC and 4.5–5.0 GHz clocks.
The XSX is going to launch with a modern 3.5 GHz 8c16t Zen 2 CPU, while mainstream PCs today often have… 4.0–4.2 GHz 8c16t Zen 2 CPUs.

It's not exactly the same situation.
Yes, there are going to be differences. The TDP is likely tuned lower than you'd get with an equally-clocked Ryzen CPU, and there are probably other cost-saving measures like using less cache.
TFLOPS are a bad way to measure gaming performance, with NVIDIA having higher game performance than their raw compute performance implies when compared against AMD.
We have no idea what AMD's RT implementation is going to be like compared to NVIDIA's, and it will probably be going up against NVIDIA's second-generation RT implementation by the time the consoles launch.

But the pace of hardware improvements has slowed considerably, and the performance gap seems narrower than it's ever been.
By the time the consoles launch, we'll probably have Zen 3 CPUs, and a new series of GPUs from NVIDIA. With the shrink to 7nm, those new GPUs will hopefully have a big leap in performance compared to what is currently available.
So the sky is not falling, but I think a lot of people with current PC hardware, expecting it to last for much of next generation, are going to be disappointed.

Even though you cannot compare AMD TFLOPS with NVIDIA TFLOPS, my GTX 1070 with its 6.5 TFLOPS is going to fall quite short of next-gen with 12 TFLOPS and RT hardware. At best, my 4.0 GHz Ryzen 1700X is probably going to match next-gen CPUs.
I'm not expecting there to be PC hardware—except maybe at the very highest-end of what is available by launch—which will be capable of running games built for 30 FPS on next-gen hardware at ≥60 FPS with equivalent or better settings. Maybe you'll be fine with cross-gen games, but not "next-gen" games that are really pushing the hardware.
Storage could get pretty interesting once Sony announce details of the PS5 hardware too.

For me, it's not a question of whether the consoles are going to be faster than my existing PC or not—I'm sure they will be.
It's whether it will be worth it for me to spend what it takes to build a better PC than these consoles.
There is a lot that I value about the experience of gaming on PC, but I spent >$1000 on hardware upgrades this generation only to be left disappointed with how poorly many games ran.
Last generation, a mid-gen upgrade of similar cost blew consoles out of the water and would run games stutter-free at 60 FPS or higher with ease. This generation the games run at higher frame rates but rarely smoothly. I went into more detail about it with posts in this topic recently.

I really don't see how Microsoft and Sony aren't going to be taking a sizable loss on each console even at $499, regardless of how good their relationship with AMD is
It's because AMD and NVIDIA realized they could screw over PC gamers once NVIDIA released the Titan at $999 and it wasn't a complete failure, and people kept buying hardware even though prices were inflated due to GPU mining.
What Sony and Microsoft are paying will be far closer to the "real" cost of that hardware.

Yeah I notice 0 difference in gaming on my SATA SSD limited to 500 MB/s versus my NVMe drive that has 3200 MB/s read speed.
That's because game loading isn't optimized for it.
To quote an older post of mine:
Without making things too complicated, let's just say that there are only two things that have to happen when loading a game: getting data off the drive, and decompressing it into memory.

With an HDD being slow, the time it takes to decompress the data into memory might only be 5% of the loading time - so a developer may not see the point in optimizing it, and it only runs on a single CPU thread because that is easier to build.
With a SATA SSD the data access is significantly faster, so it speeds things up a lot - to the point that most of the load time is now waiting on decompression to happen.
With an NVMe SSD the data might be loaded in 10x faster than the SATA SSD, but because most of the loading time is still spent waiting for decompression, the difference is negligible.

With a game optimized for loading off an SSD, the developer now focuses on speeding up the data decompression. Instead of only using one CPU core they build a loading system which scales to as many CPU cores as the system has.
Now that the loading uses all 8 cores/16 threads on a Ryzen CPU (and more in higher-end PCs) instead of only one of them, loading is significantly faster - and we start to see that difference between SATA and NVMe emerge.
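Here's a minimal sketch of the idea in Python - the zlib codec, asset counts and sizes are stand-ins for illustration, not how any real engine packages its data:

```python
# Toy model of game loading: decompress assets either on one thread or
# across all cores. zlib releases the GIL while decompressing, so a thread
# pool really does scale - this is the "use all 8 cores" loader.
import os
import time
import zlib
from concurrent.futures import ThreadPoolExecutor

def make_fake_assets(count=32, size=2 * 1024 * 1024):
    """Compressed blobs standing in for assets already read off the drive."""
    return [zlib.compress(os.urandom(size) + b"\x00" * size) for _ in range(count)]

def load_single_threaded(assets):
    return [zlib.decompress(blob) for blob in assets]

def load_multi_threaded(assets, workers=os.cpu_count()):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(zlib.decompress, assets))

if __name__ == "__main__":
    assets = make_fake_assets()
    for name, loader in (("1 thread", load_single_threaded),
                         ("all cores", load_multi_threaded)):
        start = time.perf_counter()
        loader(assets)
        print(f"{name}: {time.perf_counter() - start:.2f}s")
```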

I mean, even an RTX 2080 can't run RDR2 today at 4K/60...
It can. Just not on "ultra" settings.
People really need to stop judging performance by how games run when maxed-out.
 
Nov 8, 2017
957
This thread is pure madness. On one hand we're comparing the next gen consoles to a 5700 XT, and on the other we're saying today's GPUs will be obsolete. Which is it? Those two statements don't jibe.

Every generation, folks that don't understand PC tech fall for console hardware marketing. And every generation they get burned. I would have thought the Xbox One X recently missing the 4K/60, GTX 1070 mark that these boards had hyped (a contradiction in itself) would have taught folks a lesson. Yet here we are...
 
OP

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223
I think it is preposterous that Microsoft would go to market with a game console that has the power of a ~$1200 GPU, considering they still have to put a CPU, storage, PSU, RAM, and motherboard in that thing. Do you really think Microsoft is going to sell a $1500 console? Even at that price it might still sell at a loss if we assume it is as powerful as the GPUs you are referencing.
I know, 12 TFLOPs RDNA really sounds so unrealistic. That, plus you'd have to share the TDP with the Ryzen CPU as well since it's an APU and it's still a console, meaning even less space to cool. But so many people are bringing it up, saying 12 TFLOPs is basically confirmed now, by many insiders too... So I just decided to go with the flow here.
 

GhostTrick

Member
Oct 25, 2017
11,417
Back when the PlayStation 4 launched, it was a 1.84 TFLOPS machine while PCs had 5 TFLOPS GPUs available (780 Ti).
The XSX is going to launch with a 12 TFLOPS APU while the current fastest GPU you can buy today is 13 TFLOPS - for $1200+.


Bullshit.
A 2080 Ti can reach clocks of up to 2 GHz, which puts it at around 17 TFLOPS. People should stop basing the TFLOPS rating of Nvidia GPUs on the conservative advertised boost clocks - boost clocks aren't limited to the numbers Nvidia claims. That's not how boost clocks work. And no, it's not a "limited boost" either.
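Quick back-of-envelope sketch (FP32 TFLOPS = shaders × 2 ops per clock × clock in GHz; 4352 is the 2080 Ti's shader count, and the clocks below are the reference boost, a typical factory-OC boost, and the ~2 GHz figure above - sustained clocks vary from card to card):

```python
# FP32 TFLOPS = shader count x 2 ops per clock (FMA) x clock (GHz) / 1000
SHADERS_2080_TI = 4352

def tflops(clock_ghz, shaders=SHADERS_2080_TI):
    return shaders * 2 * clock_ghz / 1000

for clock_ghz in (1.545, 1.755, 2.0):
    print(f"{clock_ghz:.3f} GHz -> {tflops(clock_ghz):.1f} TFLOPS")
# 1.545 GHz -> 13.4 TFLOPS, 1.755 GHz -> 15.3 TFLOPS, 2.000 GHz -> 17.4 TFLOPS
```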
 

pswii60

Member
Oct 27, 2017
26,719
The Milky Way
If games can scale down to Lockhart's GPU then you should be fine. Older CPUs, on the other hand, may struggle; we'll see RAM requirements increase, and we could see SSDs being mandatory for some games.
Back when the PlayStation 4 launched, it was a 1.84 TFLOPS machine while PCs had 5 TFLOPS GPUs available (780 Ti).
The XSX is going to launch with a 12 TFLOPS APU while the current fastest GPU you can buy today is 13 TFLOPS - for $1200+.
As others have mentioned, the 2080 Ti can definitely push much more than 13 TF in the real world due to the high boosts available on most recent cards, not to mention its per-flop performance advantage over RDNA at 4K.

Regardless, the 30xx series will launch before the next gen consoles hit, and given it'll be Nvidia's move to 7nm combined with it being 4 years since we saw the last "generational" leap from them, I think it's safe to say the new range will be significantly ahead of what we have today, and indeed of the next gen consoles too.

Although 12tf in the next gen consoles is definitely impressive.
 

snesiscool

Member
Feb 15, 2018
299
Hopefully my RX 570 is still good next gen. That thing is weaker than an Xbox One X but more powerful than the rumored Lockhart specs.

What I will need is a new CPU. My Ryzen 3 2200G is a beast of a quad-core, but it's still just a quad-core.
 

Minsc

Member
Oct 28, 2017
4,160
If games can scale down to Lockhart's GPU then you should be fine. Older CPUs, on the other hand, may struggle; we'll see RAM requirements increase, and we could see SSDs being mandatory for some games.

It's still so mysterious to me, imagining a game which wouldn't run without an SSD on PC.

They'd have to do some sort of bench test, I guess? And since external drives are pretty popular, make it not start if you don't score over 400 MB/s (as most external drives don't do more than 500 MB/s)?
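Something like this, I guess (purely illustrative - the asset path and the 400 MB/s cutoff are made up, and a real check would have to bypass the OS page cache so it measures the drive rather than RAM):

```python
# Sketch of a drive-speed gate: time a large sequential read and refuse to
# launch below a threshold.
import time

CHUNK = 8 * 1024 * 1024            # 8 MiB reads
MAX_BYTES = 512 * 1024 * 1024      # sample at most 512 MiB
THRESHOLD_MBPS = 400

def sequential_read_mbps(path):
    read = 0
    start = time.perf_counter()
    with open(path, "rb", buffering=0) as f:
        while read < MAX_BYTES:
            chunk = f.read(CHUNK)
            if not chunk:
                break
            read += len(chunk)
    return (read / (1024 * 1024)) / (time.perf_counter() - start)

if __name__ == "__main__":
    speed = sequential_read_mbps("assets/world.pak")   # hypothetical asset file
    if speed < THRESHOLD_MBPS:
        raise SystemExit(f"Drive too slow ({speed:.0f} MB/s) - SSD required.")
    print(f"Drive OK: {speed:.0f} MB/s")
```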

Or I guess they could just make it so you can't play on external drives at all, and require that 2 GB/s internal speed? That would also rule your game out of supporting external storage on the new PS5/Series X systems, since it's not gonna run any faster there - unless it's a Vita memory card situation all over again, and even Vita memory cards were slow as fuck.

But why? What could be so critical that a normal drive loading in to RAM a little slower would result in the game not running?

In all my years of gaming I've never once encountered a game which wouldn't run on a drive unless it was an SSD.
 

GameAddict411

Member
Oct 26, 2017
8,577
We haven't seen anything yet except a lousy and unimportant TFLOP figure. People need to fucking understand that TFLOPs are not comparable between Nvidia and AMD. They're not even comparable among AMD's own GPU generations. We also have zero clue what they mean by ray tracing hardware, and I SERIOUSLY doubt it's anywhere near as good as Nvidia's solution considering how much they invested in developing this technology for the Turing architecture.
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,947
Berlin, 'SCHLAND
Back when the PlayStation 4 launched, it was a 1.84 TFLOPS machine while PCs had 5 TFLOPS GPUs available (780 Ti).
The XSX is going to launch with a 12 TFLOPS APU while the current fastest GPU you can buy today is 13 TFLOPS - for $1200+.
Nvidia and AMD are going to be releasing new GPUs before the next gen consoles come out, as you mention - which will put into perspective the economies of scale that allow consoles to launch with such GPUs.

The same goes for the economies of scale on the CPU side by the time the consoles launch.
 

Atolm

Member
Oct 25, 2017
5,844
You're delusional if you think you're not going to have to brute-force new games, as has always been the case.

I have a 6700K at 4.6 GHz and a 1080, and it already struggles with some games, even at 1440p. It will be a paperweight for AAA gaming when the PS5 launches. There's no better time to buy a console in price/performance terms than when it launches, and there isn't a worse moment to build a PC than in the same year as a console launch.

I've been in both markets since the Pentium 2/3 era and it has always worked like this. The PC I bought around early 2000 couldn't run any PS2 or Xbox title; specs jumped out the window in mere months and we went from a Pentium 3 at 450 MHz with 64 MB of RAM to 1.4 GHz and 256/512 MB configs in about a year. The one from 2006 started fine, but as the generation dragged on it did the same. It could run the first AssCreed and Batman AA well but not their sequels.

Doesn't matter, anyway. PC gamers have their own pace with upgrades and that's fine; I won't do a full upgrade until late 2021 or early 2022.
 
Oct 27, 2017
6,902
This thread is pure madness. On one hand we're comparing the next gen consoles to a 5700 XT, and on the other we're saying today's GPUs will be obsolete. Which is it? Those two statements don't jibe.

Every generation, folks that don't understand PC tech fall for console hardware marketing. And every generation they get burned. I would have thought the Xbox One X recently missing the 4K/60, GTX 1070 mark that these boards had hyped (a contradiction in itself) would have taught folks a lesson. Yet here we are...

People buy in to console marketing buzzwords too easily.

Fastest SSD on the market
Hardware based ray tracing
4K with VRR and up to 8K capability
"Double the power of our previous console"
"gee guys, that must mean 12 TEE FLOPS!!"
 

Duxxy3

Member
Oct 27, 2017
21,929
USA
AMD GPUs have existed from the mid-range down for the last several years. Have they shown anything that rivals Nvidia's top end?

I don't think they'll go after the 2080 ti because that thing is insane.

But I do think they'll go after the 2080 with what I think this next gen 12 TFLOPS GPU is based on - the 5800 XT. Whether it does that, and has comparable ray tracing, is completely unknown.

Either way, GPU prices are horribly overinflated right now. Maybe the PS5 and XBSX bring them back down to earth (fingers crossed).
 

Prefty

Banned
Jun 4, 2019
887
Honestly guys... I think some of you are crazy if you think we are going to get anything better than a 1060 + ray tracing, GPU-wise.

That won't translate directly to PC hardware, but who knows this generation. I bet a future "RTX 4060-4070" will be THE GPU of this gen on PC, which is why I'm gonna skip the 20xx and 30xx series completely. I'm sure my 1070 will hold up until 2021 or 2022, even if I have to end up lowering everything to low.
 
OP

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223
We haven't seen anything yet except a lousy and unimportant TFLOP figure. People need to fucking understand that TFLOPs are not comparable between Nvidia and AMD. They're not even comparable among AMD's own GPU generations. We also have zero clue what they mean by ray tracing hardware, and I SERIOUSLY doubt it's anywhere near as good as Nvidia's solution considering how much they invested in developing this technology for the Turing architecture.
That is true. While we have a rough estimate of how RDNA and Turing flops compare (based on performance comparisons of the desktop cards), we can't know for sure, and there is much more to an architecture's speed than just TFLOPs.

I also don't believe AMD's tech will be better than RTX, but we just have to wait and see. Hopefully we will see more at CES.
 

Ragnorok64

Banned
Nov 6, 2017
2,955
I don't think they'll go after the 2080 ti because that thing is insane.

But I do think they'll go after the 2080 with what I think this next gen 12 TFLOPS GPU is based on - the 5800 XT. Whether it does that, and has comparable ray tracing, is completely unknown.

Either way, GPU prices are horribly overinflated right now. Maybe the PS5 and XBSX bring them back down to earth (fingers crossed).
Why do you feel consoles would push GPU prices down?
 

mogwai00

Member
Mar 24, 2018
1,264
The last time that happened was 2005.
And even then it wasn't entirely true (the 360's GPU, Xenos, had a more advanced feature set, but multiplatform games performed better on PC from day one, let alone once new GPUs arrived in 2006).
 

MrBob

Member
Oct 25, 2017
6,671
The new consoles will definitely be the price-to-performance champions for years. I do think we could be in for quite a jump with the new Nvidia cards next year, though. They are moving to a new 7nm EUV process that should allow quite an upgrade - one that might break the "don't upgrade your GPU when new consoles are released" curse.

I'd love to revisit this thread in mid-2020 when Nvidia have their new video cards out.
 

Minsc

Member
Oct 28, 2017
4,160
Minsc
Star Citizen only works fine on SSD. PC game.

Interesting - last time I tried it was one of the test builds where you walked around the ship, but that worked fine on my older PC's standard 7200 RPM drive that I use for storing games. So it just doesn't start at all if it detects it's not running off an SSD? I wonder what happens if you trick it.
 

Deleted member 4783

Oct 25, 2017
4,531
Not at the beginning, but yes, they will be.
 
Oct 28, 2017
1,159
The Series X and PS5 seem to be more powerful relative to existing tech than the PS4/XB1 were when they launched.

I expect it to be more like when the 360 launched in 2005, when it was more powerful than pretty much everything for a year or so.

I expect the price-to-performance ratio of the new consoles to be significantly better than PC for a good while; it seems they are not cutting corners this time.
 

Deleted member 4346

User requested account closure
Banned
Oct 25, 2017
8,976
Why are people ripping on AMD? lol. I own a 5700 XT and it works great. Trades blows with the 2070 Super for $100 less. I don't get it.

People are ripping on AMD because they didn't release a GPU to compete with the GTX 1080 Ti until after RTX launched, and they don't have a GPU to compete with the 2080 Ti. The 5700 XT is low-key a great GPU that punches way above its weight. I've had a Radeon VII since its launch and it's been great for what I paid - 2080 performance for less (in fairness the VII is an odd card that probably should never have come to market). AMD's demise in the GPU space is exaggerated; people are basing it off the 2080 Ti/RTX Titan when almost no one buys those cards, lol.

It's just that expecting miracles out of AMD's first hardware ray tracing implementation doesn't seem very realistic to me. NVIDIA's implementation has a huge performance hit, and AMD's might be even worse. There's no evidence that it'll be better - not yet.
 

Pargon

Member
Oct 27, 2017
12,109
Bullshit.
A 2080 Ti can reach clocks of up to 2 GHz, which puts it at around 17 TFLOPS. People should stop basing the TFLOPS rating of Nvidia GPUs on the conservative advertised boost clocks - boost clocks aren't limited to the numbers Nvidia claims. That's not how boost clocks work. And no, it's not a "limited boost" either.
That doesn't sound typical. I had a quick look and EVGA's $1500 water-cooled card boosts to 1755MHz. That does put it at 15 TFLOPS though.
But going by NVIDIA's performance rating alone—and I believe the 780 Ti could also be pushed far higher than the stock boost clocks—the 780 Ti was nearly 3x the performance of the PlayStation 4 at launch.
Even if 17 TFLOPS was a typical overclock for the 2080 Ti, that only puts it about 40% faster than the XSX.

The performance gap is narrower than it's ever been, and we're talking about a ~$500 console vs a $1500+ GPU here.
Last-gen that was a $400 console vs a $700 GPU.
And I have always bought GPUs in the ~$350 price segment.

Nvidia and AMD are going to be releasing new GPUs before the next gen consoles come out.
Yes, I did go into that in my post. But I don't foresee the gap being nearly as large as it was last generation - that would be equivalent to a 32 TFLOPS monster GPU for $700.
And the topic was asking how current GPUs would hold up. Most people don't have anything like a 2080 Ti; even PC gaming enthusiasts.

As I said; the sky is not falling. It's not going to kill PC gaming, and PC gamers will have faster hardware available to them.
My concern is if the faster hardware is going to be affordable like it once was, and if it's actually going to provide a meaningfully better experience.
The Outer Worlds ran at ≥60 FPS on my current PC. But it also stutters badly when doing so, even if I turn the settings as low as they can go - which ruined the experience for me. And that's been my experience with far too many high-end games this whole generation. More examples here.
 

Techno

Powered by Friendship™
The Fallen
Oct 27, 2017
6,454
Interesting - last time I tried it was one of the test builds where you walked around the ship, but that worked fine on my older PC's standard 7200 RPM drive that I use for storing games. So it just doesn't start at all if it detects it's not running off an SSD? I wonder what happens if you trick it.

It starts on a standard HDD from what I remember; I had it installed not that long ago. But it was basically unplayable because of the low framerate and stuttering - even on low settings. Having it installed on an SSD fixes that.
 

oofouchugh

Member
Oct 29, 2017
4,001
Night City
Multiple hardware SKUs, including the cheaper-tier Xbox and the Switch, mean graphics scalability will take care of us. You'll be just fine as long as you're not targeting max/ultra/4K/144 Hz. Anything around the 1060 should hold up for 1080/60 unless you're absolutely cranking the settings.
 

No_Face

Member
Dec 18, 2017
1,080
Brigerbad, Switzerland
People buy in to console marketing buzzwords too easily.

Fastest SSD on the market
Hardware based ray tracing
4K with VRR and up to 8K capability
"Double the power of our previous console"
"gee guys, that must mean 12 TEE FLOPS!!"
Well, Cerny said faster than anything available on the market at the time of the interview. There is not really much room for interpretation there. Do you think he just flat out lied?
And hardware based ray tracing has been confirmed not just by Sony/MS, but by devs iirc
8K capability is obviously not implying actual gaming performance and no one on here believes it is, so I don't see the issue there
And the 12 TF number for Series X comes from insiders here on ERA; it's not derived from Spencer's comments. So unless they are bullshitting or confusing GCN with RDNA, I don't know what you are on about.
 

Deleted member 4346

User requested account closure
Banned
Oct 25, 2017
8,976
The Outer Worlds ran at ≥60 FPS on my current PC. But it also stutters badly when doing so, even if I turn the settings as low as they can go - which ruined the experience for me. And that's been my experience with far too many high-end games this whole generation. More examples here.

As an aside, I came back to this game with the latest AMD driver, and patches to the game. It doesn't stutter as badly as it did at launch. But I think you're right about the PC experience. Some of the games that have been ported to the PC this gen really show optimization issues. I think of the PC as the "premium gaming platform" but these games didn't feel premium on the PC.
 

GhostTrick

Member
Oct 25, 2017
11,417
That doesn't sound typical. I had a quick look and EVGA's $1500 water-cooled card boosts to 1755MHz. That does put it at 15 TFLOPS though.
But going by NVIDIA's performance rating alone—and I believe the 780 Ti could also be pushed far higher than the stock boost clocks—the 780 Ti was nearly 3x the performance of the PlayStation 4 at launch.
Even if 17 TFLOPS was a typical overclock for the 2080 Ti, that only puts it about 40% faster than the XSX.

The performance gap is narrower than it's ever been, and we're talking about a ~$500 console vs a $1500+ GPU here.
Last-gen that was a $400 console vs a $700 GPU.
And I have always bought GPUs in the ~$350 price segment.


Yes, I did go into that in my post. But I don't foresee the gap being nearly as large as it was last generation - that would be equivalent to a 32 TFLOPS monster GPU for $700.
And the topic was asking how current GPUs would hold up. Most people don't have anything like a 2080 Ti; even PC gaming enthusiasts.

As I said; the sky is not falling. It's not going to kill PC gaming, and PC gamers will have faster hardware available to them.
My concern is if the faster hardware is going to be affordable like it once was, and if it's actually going to provide a meaningfully better experience.
The Outer Worlds ran at ≥60 FPS on my current PC. But it also stutters badly when doing so, even if I turn the settings as low as they can go - which ruined the experience for me. And that's been my experience with far too many high-end games this whole generation. More examples here.


Nah, 1755 MHz sounds too low. Or you're referring to advertised boost clocks, which aren't typical.
Also, comparing TFLOPS doesn't work like this. You're talking about two different architectures here.
 

Aztechnology

Community Resettler
Avenger
Oct 25, 2017
14,166
Current higher end Nvidia cards are still going to be more powerful. Plus granular controls mean I get the performance where I think it's important.
 

GameAddict411

Member
Oct 26, 2017
8,577
Well, Cerny said faster than anything available on the market at the time of the interview. There is not really much room for interpretation there. Do you think he just flat out lied?
And hardware based ray tracing has been confirmed not just by Sony/MS, but by devs iirc
8K capability is obviously not implying actual gaming performance and no one on here believes it is, so I don't see the issue there
And the 12 TF number for Series X comes from insiders here on ERA; it's not derived from Spencer's comments. So unless they are bullshitting or confusing GCN with RDNA, I don't know what you are on about.
Everything you said has room for interpretation. While I don't think that Cerny was lying about the SSD performance, we still don't know how it's going to be implemented. Is it going to be big enough to hold the whole game? Is it just a fast cache system that can accelerate only parts of the game, or only work after the game data is loaded onto the SSD? Is it going to be upgradable? The TFLOP figures are also still vague and don't tell us that much either. I think people should wait until we have concrete information before making boatloads of assumptions like the OP did.
 

Dylan

Member
Oct 28, 2017
3,260
I'm very curious about the upcoming CPU & GPU gens. I really have no idea what to expect.
 

No_Face

Member
Dec 18, 2017
1,080
Brigerbad, Switzerland
Everything you said has room for interpretation. While I don't think that Cerny was lying about the SSD performance, we still don't know how it's going to be implemented. Is it going to be big enough to hold the whole game? Is it just a fast cache system that can accelerate only parts of the game, or only work after the game data is loaded onto the SSD? Is it going to be upgradable? The TFLOP figures are also still vague and don't tell us that much either. I think people should wait until we have concrete information before making boatloads of assumptions like the OP did.
Agreed, I guess I just took issue with the condescending phrasing in his last sentence. 'Cause that is not AT ALL where those numbers are coming from, and stuff like that just feeds into dumb platform warring nonsense.
 

JahIthBer

Member
Jan 27, 2018
10,400
hope my 1660ti last me until 2022 at least
It will at 1080p on non-ultra settings. I think this is going to be the one generation where most PC gamers don't notice the change, since most PC gamers still play at 1080p and consoles are going to waste a ton of performance aiming for 4K.
The only thing might be SSDs becoming a requirement, but 1TB NVMe drives are so cheap now, there isn't much of an excuse anymore.
 

thePopaShots

Member
Nov 27, 2017
1,696
Bought a 2070 Super this year with every intention of upgrading by the time developers start taking advantage of the new hardware in a couple of years. I'm sure I will be fine with my setup until that happens.