
ThreepQuest64

Avenger
Oct 29, 2017
5,735
Germany
So wait, it was common for mid-range PCs to have an RTX 2080 in 2018,
Yeah, some people's perceptions are delusional. I remember an RTX presentation on Metro Exodus where the YouTuber said "for 60fps you only need a single RTX 2080 Ti" like it's the most standard thing you could have.

And no, the existence of an even better card like the 2080 Ti doesn't make a 2080 mid-range, and it doesn't make it common either.

According to Steam's hardware survey from September 2019, an RTX 2080 is used by only 0.90% of users, whereas the most common card (14.01%) is the GTX 1060.
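If you want to sanity-check the scale of that gap, here's a quick Python sketch; note the 125 million active Steam users figure is just an assumed round number for illustration, not from the survey itself:

```python
# Scale of the September 2019 Steam Hardware Survey numbers quoted above.
gtx_1060_share = 0.1401   # GTX 1060: 14.01% of surveyed users
rtx_2080_share = 0.0090   # RTX 2080: 0.90% of surveyed users

# Hypothetical active-user count, used only to turn shares into headcounts.
assumed_steam_users = 125_000_000

print(f"GTX 1060 users per RTX 2080 user: {gtx_1060_share / rtx_2080_share:.1f}")
print(f"Estimated GTX 1060 users: {gtx_1060_share * assumed_steam_users / 1e6:.1f}M")
print(f"Estimated RTX 2080 users: {rtx_2080_share * assumed_steam_users / 1e6:.1f}M")
```

Roughly fifteen 1060 owners for every 2080 owner, whatever the exact user count is.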
 

Keyouta

The Wise Ones
Member
Oct 25, 2017
4,193
Canada
My four-year-old quad-core CPU may start having more issues as time goes on past the next-gen consoles' release, but I don't think the PC will hold back game development to the point where devs are actively cutting ideas and features.
 

Hektor

Community Resettler
Banned
Oct 25, 2017
9,884
Deutschland
Jesus, ERA echo chamber, can you be more obvious? OP isn't entirely wrong. Can you buy top-of-the-line $1000 PC CPUs right now that are miles better than what the PS5/Xbox Next will have? Yes, you can. Does every PC gamer have them? No, they do not.

The situation right now is as follows:
[Chart: Steam Hardware Survey breakdown of physical CPU core counts]


The big majority of PC gamers' CPUs right now have 2 or 4 physical cores. It will be many years before 6 and 8 cores are the majority. So yes, OP is correct: PC spec fragmentation will hold games back next gen. This actually reminds me a lot of last gen, when the 360 came out and destroyed the average gaming PC in the CPU department. There were many PC ports that were inferior on PC compared to the 360.

So again, in the end it's the AVERAGE PC spec that decides what is and isn't holding things back, not the absolute top-of-the-line 0.001% specs.

Buddy, "the average person" on that list is a chinese teenager exclusively playing dota 2

These people do not, in fact, deside what is and isn't holding anything back because they're not even part of any equation
 

Tomasdk

Banned
Apr 18, 2018
910
I guess this is your first time waiting for new consoles, OP, because I saw ridiculous claims like this during every pre-launch period. Even funnier is that we don't even know the specs of the new consoles yet. So if you truly expect what you wrote, you will be severely disappointed, I can assure you of that.
 

Detail

Member
Dec 30, 2018
2,947
Willing to bet my 3970X from 2012 still whoops it with my overclock lol.

I am, however, excited that consoles are getting CPU upgrades. I look forward to having something more than barebones physics and braindead AI again; even the 360 and PS3 had better physics, destruction, and AI than this gen.
 
OP
DonMigs85

Banned
Oct 28, 2017
2,770
It's possible, but you have to define what you mean by "reasonably close." Within 20%, around a 2070 level? Probably! Within 10%, like the 2070 Super? Would be very surprising.
The 2080 is more or less around 1080 Ti-level, a 16 nm GPU that came out in early 2017. It would be kinda embarrassing if PS5 can't at least come somewhat close to that (within 10-15% maybe).
 

Kuosi

Member
Oct 30, 2017
2,366
Finland
Yeah, I'd be worried about those PCs too, not the few hundred million last-gen consoles that most console gamers are still on.
 

Kyougar

Cute Animal Whisperer
Member
Nov 3, 2017
9,354
Buddy, "the average person" on that list is a chinese teenager exclusively playing dota 2

These people do not, in fact, deside what is and isn't holding anything back because they're not even part of any equation

Especially since, with 1.3 billion PC gamers, the 4% of gamers with more than 8 cores is already half the console install base, and the >21% with more than 6 cores is already double the console install base.
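Quick back-of-the-envelope math on that claim; the 1.3 billion PC gamers figure and the ~130 million console install base are the thread's numbers, taken at face value:

```python
# Back-of-the-envelope check of the install-base claim above.
pc_gamers = 1_300_000_000      # 1.3 billion PC gamers (the thread's figure)
console_base = 130_000_000     # ~130M current-gen consoles (also the thread's figure)

share_8_plus_cores = 0.04      # ~4% of PC gamers with 8+ cores (claimed)
share_6_plus_cores = 0.21      # >21% with 6+ cores (claimed)

for label, share in [("8+ cores", share_8_plus_cores), ("6+ cores", share_6_plus_cores)]:
    count = share * pc_gamers
    print(f"PCs with {label}: {count / 1e6:.0f}M "
          f"({count / console_base:.1f}x the console install base)")
```

That works out to roughly 52 million 8-core-plus PCs and 273 million 6-core-plus PCs, which is where the "half" and "double" comparisons come from.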
 

Spark

Member
Dec 6, 2017
2,538
And cross-gen died at PS4 launch.
(It didn't.)

Cross-gen will also be a million times more prominent this generation, with the similar architecture and transferable library. If people are worried about low-end PCs and not the 130+ million old-generation consoles still in use, then they're delusional.

PCs have literally never held back development in the past. Developers always raise the minimum specs at the start of a new generation; this thread is a joke.
 

Inuhanyou

Banned
Oct 25, 2017
14,214
New Jersey
Some may scoff, but OP is right. CPU and SSD alone are things most low-end PC users don't have. Devs will have to code smartly.

It's like how Star Citizen is using things that cause problems for lower-end users.

The jump from base PS4 to PS5 is more significant than PS3 to PS4 in terms of getting rid of bottlenecks. Last gen, memory capacity and bandwidth were the biggest bottlenecks. The GPU and CPUs could have done more.
 

Spark

Member
Dec 6, 2017
2,538
Some may scoff, but OP is right. CPU and SSD alone are things most low-end PC users don't have. Devs will have to code smartly
Then that low-end baseline has to upgrade, *gasp*, like with every other generational shift in the past. They're going to need to upgrade for raytracing as well. It's not going to hold back shit, especially when these developers are no doubt making most of these early next-generation games for current-generation machines as well. Can't be telling those Xbox One S users to slap an SSD in their machines.
 

KayonXaikyre

Member
Oct 27, 2017
1,984
This thread is funny lol. Out of all the years I've been PC gaming, the only time I remember a console doing some work was the Xbox 360, and I can tell you that on the PC side of things the only thing that happened was that it was upgrade time. No dev actually started going, "but what about the PCs!" They never do that. They always go, OK, here are the required specs, and we all go "goddamn, those are min/recommended specs," and then some people complain and then most just upgrade (or find ways to play anyway) lol. Hell, they release games that current PCs have trouble running even though PCs are already way stronger than consoles. Devs always put whatever they want in a game and then just future-proof it, and of course games can obviously scale down to run on some pretty low-end things (Crysis is a good example).

I know the point has changed a lot, but the title of this is "PCs will actually hold consoles back next gen on the CPU side for a while," and that hasn't ever happened and it likely never will. Plus, back then I'm pretty sure consoles were also loss leaders and were losing money on each unit sold, and they don't do that anymore to that extent. You aren't getting some cheap console that somehow smokes a high-end PC, or smokes even all mid-range PCs so badly that devs are going to do anything different.

A few points to consider. First, the PS4, Xbox One, and Switch are going to hold back consoles much more than PCs ever will, considering a lot of games are being developed for those consoles right now and it takes years to make games. That means games that got planned earlier this year might not be coming until 2021 or 2022. So for a while, games will be cross-generation and won't be using a lot of what the new consoles offer.

Next, the install base of the next-generation consoles will be low, considering it takes time to sell consoles. No third-party developer is going to go all-in on the next-generation consoles without a cross-generation version, because they'd be missing out on hundreds of millions of consoles, and Sony and Microsoft rely primarily on third-party output. First-party titles won't be coming to PC anyway (other than maybe Xbox stuff), so that won't even matter to begin with. This means it'll be 2022 before you start to see the majority of true next-generation third-party titles crop up outside of exclusives. By the time 2022 gets here, PC is going to be doing some other shit, but these consoles will be exactly the same.

Finally, keep in mind that games have budgets lol. Not every game is going to have ridiculous assets and graphics. We have games right now that don't use the maximum capabilities of the consoles. You've got beautiful games like DBFZ running on a Switch, and it still looks great; there will be several more games like that to come, because games don't have infinite budgets or require all the features afforded by the next-generation consoles.

So with that, "PCs will actually hold consoles back next gen on the CPU side for a while" is false. Not only will it not be for a while, it won't be at all. It will never happen next generation. Even if there's a 3.2GHz 3700X in there somehow, with some ultra-fast SSD and whatever the fuck else, it still won't matter, because in 2022 people will just have to upgrade if they can't keep up, like we've been doing since the beginning of dedicated graphics cards in PC gaming. This isn't new, and it's not about to stop any game from telling me my PC sucks and I can't play until I buy a new one. But that won't be happening to my i9-9900K @ 5GHz and 2080 Ti next gen, and it probably won't be happening for most people with upper-mid/high-end setups either.
 

b00_thegh0st

Member
Nov 6, 2017
1,017
Yeah, some people's perceptions are delusional. I remember an RTX presentation on Metro Exodus where the YouTuber said "for 60fps you only need a single RTX 2080 Ti" like it's the most standard thing you could have.

And no, the existence of an even better card like the 2080 Ti doesn't make a 2080 mid-range, and it doesn't make it common either.

According to Steam's hardware survey from September 2019, an RTX 2080 is used by only 0.90% of users, whereas the most common card (14.01%) is the GTX 1060.

Most people still play at 1080p/60fps, and a 1060 is more than enough to achieve that. I know that's my case. As I'll keep my screen going into next gen, I'm even wondering if I'll need to upgrade my GPU. I'm more worried about my CPU, but I know the options are there, so there'll be no holding anything back.
 

Sanctuary

Member
Oct 27, 2017
14,207
Some may scoff, but OP is right. CPU and SSD alone are things most low-end PC users don't have. Devs will have to code smartly.

It's like how Star Citizen is using things that cause problems for lower-end users.

So what? It's as though that has never in history happened before with a cutting-edge PC game, and especially not with one that isn't even in the optimization phase yet. Where do you think these magically considerate developers draw the line then, 2012, 2011... 1999 PC specs?
 
Oct 30, 2017
1,249
I have an 8700K. Six cores, 12 threads, running at 4.7GHz.

That should be fine for next gen if I'm targeting 60fps minimum.
 

GhostTrick

Member
Oct 25, 2017
11,305
I remember Watch_Dogs on PS3... It was an abomination


The PS3 was a console released in 2006. Watch Dogs released 8 years later, on far more outdated hardware (DX9-tier vs DX11).
The PS4 released in 2013. The PS4 Pro released in 2016. Heck, the Xbox One X released in 2017.
We're talking about DX11/12-class GPUs vs... upcoming DX12 GPUs.

Sure, those console CPUs are slow. But do people think those new CPUs will be pushed day one at 30fps?
 

FluffyQuack

Member
Nov 27, 2017
1,353
I'm not trying to argue. I literally don't know anything about tech at this level and am curious about what changed between then and now, since I never really understood why, all of a sudden at some point in time, it seemed like consoles just stopped being technically impressive. Someone said that PCs in the early-to-mid '90s didn't have graphics cards, which is actually super interesting to me (I had no idea about that).

I realize that it's not 100% relevant to the topic at hand, but it doesn't seem like there's a lot of discussion left to be had about the OP's original post, so as someone who doesn't really know anything and read the OP and said "huh, that seems plausible," the torrent of "no" responses made me realize I should probably learn more about this.
I think the only time PCs were all-around weaker than consoles and arcade hardware was the '80s (and I'm talking specifically about PCs, as computers like the C64 and Amiga had specialized hardware for 2D graphics, like rendering sprites). During the '90s, PCs made up for their lack of specialized graphics hardware with raw CPU power. And then 3D accelerator cards started becoming the norm during the late '90s, at which point PC hardware was way ahead of the PS1 and N64.

I wanna say that for a long time (the PS1 to PS3 generations) we had a trend where consoles seemed stronger than PCs for 1 or 2 years, after which PC hardware was noticeably more powerful until the next console generation. I think good proof of this claim is looking at ports. PC-to-console ports usually involved major downgrades (for instance, every RTS on console during the '90s). And then you had multi-platform games where the PC version looked better almost every time (for instance, Splinter Cell 1). But if you look at console launch titles (which were rarely available on PC), those usually looked noticeably better than PC games at the time. I dunno if anyone can find a game which looked as good as Halo 1 in 2001 or Kameo in 2005 (it's very possible I'm making this claim in ignorance; if so, please enlighten me).

And of course, hardware comparisons can be pretty complex. While console hardware was pretty good whenever a new console generation launched, it was always lagging behind in certain areas. For instance, PCs were always better when it came to hard drive and RAM sizes, while consoles would have lasting benefits thanks to very specialized hardware: I think I remember reading that the PS2 had lightning-fast RAM, and the Xbox 360 had a pool of RAM shared between the system and the GPU.

Another thing which makes it harder to compare is how different game development was. You had many PC-only developers and many console-only developers. I'd say in general PC developers were better at pushing tech (especially between the mid '90s and early '00s), while console developers were better with art direction. (That's only a general impression, as there were definitely PC games with great art direction and console games pushing tech in interesting ways.)

I think the current generation is pretty unique for 2 reasons. First of all, the PS4 and Xbox One weren't very spectacular compared to PC hardware even at launch. I can remember comparing the performance I got in games available on both PS4 and PC, and it was virtually identical (and I wasn't using high-end hardware at the time, though it wasn't bad either). But I think the biggest thing is that this feels like the first time ever where 80% of developers are making games for everything. I feel like that makes it easier to directly compare games because (aside from console first parties) you have the same games on every platform.
 
Oct 27, 2017
4,018
Florida
This thread is funny lol. Out of all the years I've been PC gaming, the only time I remember a console doing some work was the Xbox 360, and I can tell you that on the PC side of things the only thing that happened was that it was upgrade time. No dev actually started going, "but what about the PCs!" They never do that. They always go, OK, here are the required specs, and we all go "goddamn, those are min/recommended specs," and then some people complain and then most just upgrade (or find ways to play anyway) lol. Hell, they release games that current PCs have trouble running even though PCs are already way stronger than consoles. Devs always put whatever they want in a game and then just future-proof it, and of course games can obviously scale down to run on some pretty low-end things (Crysis is a good example).

I know the point has changed a lot, but the title of this is "PCs will actually hold consoles back next gen on the CPU side for a while," and that hasn't ever happened and it likely never will. Plus, back then I'm pretty sure consoles were also loss leaders and were losing money on each unit sold, and they don't do that anymore to that extent. You aren't getting some cheap console that somehow smokes a high-end PC, or smokes even all mid-range PCs so badly that devs are going to do anything different.

A few points to consider. First, the PS4, Xbox One, and Switch are going to hold back consoles much more than PCs ever will, considering a lot of games are being developed for those consoles right now and it takes years to make games. That means games that got planned earlier this year might not be coming until 2021 or 2022. So for a while, games will be cross-generation and won't be using a lot of what the new consoles offer.

Next, the install base of the next-generation consoles will be low, considering it takes time to sell consoles. No third-party developer is going to go all-in on the next-generation consoles without a cross-generation version, because they'd be missing out on hundreds of millions of consoles, and Sony and Microsoft rely primarily on third-party output. First-party titles won't be coming to PC anyway (other than maybe Xbox stuff), so that won't even matter to begin with. This means it'll be 2022 before you start to see the majority of true next-generation third-party titles crop up outside of exclusives. By the time 2022 gets here, PC is going to be doing some other shit, but these consoles will be exactly the same.

Finally, keep in mind that games have budgets lol. Not every game is going to have ridiculous assets and graphics. We have games right now that don't use the maximum capabilities of the consoles. You've got beautiful games like DBFZ running on a Switch, and it still looks great; there will be several more games like that to come, because games don't have infinite budgets or require all the features afforded by the next-generation consoles.

So with that, "PCs will actually hold consoles back next gen on the CPU side for a while" is false. Not only will it not be for a while, it won't be at all. It will never happen next generation. Even if there's a 3.2GHz 3700X in there somehow, with some ultra-fast SSD and whatever the fuck else, it still won't matter, because in 2022 people will just have to upgrade if they can't keep up, like we've been doing since the beginning of dedicated graphics cards in PC gaming. This isn't new, and it's not about to stop any game from telling me my PC sucks and I can't play until I buy a new one. But that won't be happening to my i9-9900K @ 5GHz and 2080 Ti next gen, and it probably won't be happening for most people with upper-mid/high-end setups either.

This is an accurate take. As long as you've got the cash, the PC won't hold anything back.
 

p3n

Member
Oct 28, 2017
650
So wait, it was common for mid-range PCs to have an RTX 2080 in 2018, even when they launched in the third week of September? Sometimes I think people get confused about what is entry-level, low-end, mid-range, high-end, and enthusiast-level. Did you happen to forget how much the 2080 cost at launch? Even now the 2080 isn't exactly a mid-range card, even with the price drop. I mean, yeah, a system like that will be more than capable with next-gen games, but I think you're underestimating those specs.

I consider it mid-range even if Nvidia's monopolistic pricing model seems out of balance in comparison. The 8700K was "last year's model" in late 2018. The 2080 is the upper end of mid-range, as has been the case with all previous x80 non-Ti models. The top-end Ti models are high-end, while the Titans have been dumb... I mean "enthusiast level" for 3 generations now.

If you don't take GPU pricing into consideration, an 8700K + 2080 is indeed mid-range from a performance perspective.
 

Md Ray

Member
Oct 29, 2017
750
Chennai, India
The 2080 is more or less around 1080 Ti-level, a 16 nm GPU that came out in early 2017. It would be kinda embarrassing if PS5 can't at least come somewhat close to that (within 10-15% maybe).
Considering a GTX 1080 Ti draws 300 watts and the PS5 will likely draw under 200 watts of total system power, it's not really embarrassing that a very expensive modern graphics card that draws a ton of power is faster.
 

Spark

Member
Dec 6, 2017
2,538
This is an accurate take. As long as you've got the cash, the PC won't hold anything back.

To add to that, developers always treat the PC as a new-generation system once the shift begins. Just as the new consoles have small install bases initially, developers put up with that lower sales potential because it'll grow over time. They have never 'held back' due to lower-end PCs; they wait for PC users to upgrade over time, much like they wait for console users to migrate to newer consoles. The premise of this thread is frankly idiotic.
 

Md Ray

Member
Oct 29, 2017
750
Chennai, India
I consider it mid-range even if Nvidia's monopolistic pricing model seems out of balance in comparison. The 8700K was "last year's model" in late 2018. The 2080 is the upper end of mid-range, as has been the case with all previous x80 non-Ti models. The top-end Ti models are high-end, while the Titans have been dumb... I mean "enthusiast level" for 3 generations now.

If you don't take GPU pricing into consideration, an 8700K + 2080 is indeed mid-range from a performance perspective.
You're full of delusions.

The 8700K and RTX 2080 are not mid-range parts in any way, shape, or form.
 

Techno

The Fallen
Oct 27, 2017
6,409
Most people still play at 1080p/60fps, and a 1060 is more than enough to achieve that. I know that's my case. As I'll keep my screen going into next gen, I'm even wondering if I'll need to upgrade my GPU. I'm more worried about my CPU, but I know the options are there, so there'll be no holding anything back.

I have a 1060 as well, and my CPU is an i7-8086K; I'm not sure if I'm upgrading anything any time soon. I might look into a new graphics card, but I think I'll wait for whatever is next.
 

leng jai

Member
Nov 2, 2017
15,117
I consider it mid-range even if Nvidia's monopolistic pricing model seems out of balance in comparison. The 8700K was "last year's model" in late 2018. The 2080 is the upper end of mid-range, as has been the case with all previous x80 non-Ti models. The top-end Ti models are high-end, while the Titans have been dumb... I mean "enthusiast level" for 3 generations now.

If you don't take GPU pricing into consideration, an 8700K + 2080 is indeed mid-range from a performance perspective.

A 2080 isn't mid-range no matter how you spin it; it was Nvidia's second-fastest GPU in 2018. Calling it mid-range even this year is ridiculous; mid-range for your average gamer now is a 1070. Even according to Nvidia, in this generation it's a 2060 Super at most.
 

Md Ray

Member
Oct 29, 2017
750
Chennai, India
A 2080 isn't mid-range no matter how you spin it; it was Nvidia's second-fastest GPU in 2018. Calling it mid-range even this year is ridiculous; mid-range for your average gamer now is a 1070.
Especially considering the only graphics card from Nvidia released this year that's better than the RTX 2080 is the 2080 Super, which is a pathetic 5% faster.
 

TangorFopper

Member
Feb 2, 2018
55
Italy
I had to log in to refute this comment.

I don't blame you for misunderstanding, but clock speed is not the sole factor in determining which CPUs are "quicker" unless you are comparing them within the same company and the same family. For example, a single core of a modern Core i5-9600K running at 2GHz will be much faster than a 3GHz Pentium 4.

What matters is IPC (instructions per cycle). The clock speed (the GHz) dictates how many cycles you get per second, but what really matters is the number of instructions you can churn through in each cycle. Even if a processor is clocked faster, if it has worse IPC, it can still be slower overall. How does a CPU attain higher IPC? By stuffing more transistors into a smaller space and by optimizing the CPU architecture.

Not to mention, different CPUs implement different "computer instructions". The proper term is instruction set architecture (ISA), but that's technical jargon. Intel implements some highly efficient instructions that the CPU can run to perform complex mathematical calculations (if you've taken college physics/math, we call these vector operations).

Also, CPUs can easily overclock to 4GHz now if you get the K versions. You don't even need fancy software or custom cooling; just do it through the BIOS.

Also, multi-threaded programming is notoriously difficult. Just because you have 8 cores/16 threads does not mean you can magically utilize them all in a way that gives you 16x CPU performance. There is the overhead of managing each thread of execution and of cross-thread communication. Trust me, parallel programming is possibly one of the hardest domains of programming. The PS3's Cell had 8 CPU cores way back when PCs had only 2 (maybe "4" with hyperthreading, which is fake threading). But nobody would be caught dead saying the PS3 was faster than top-of-the-line PC CPUs.

Lastly, consoles are inherently limited by their form factor. They don't have the luxury of full-tower cases, so they need to be heat-efficient to fit within the thermal parameters of the chassis. Less heat generation almost certainly implies less power throughput, which is another factor that determines CPU speed.

tl;dr: PC CPUs have more power supplied to them ('cuz they can generate more heat and get away with it), and PC CPUs are more efficient for every "cycle" of instructions they munch through. Core count is a very poor proxy for overall performance. AMD and Intel were forced to go the core/thread route rather than the GHz route after hitting the limits of physics.

Great post.
I hope more people read this so they can get a better understanding of the tech side of things.
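For anyone who wants to see the clock-speed-vs-IPC and core-count points in action, here's a minimal Python sketch; the IPC figures are made-up placeholders, and the 90% parallel fraction in the Amdahl's law part is just an assumed example value:

```python
# Rough model: single-core performance ~ clock speed * IPC.
# The IPC values below are invented placeholders for illustration only.
cpus = {
    "Pentium 4 @ 3GHz (low IPC)": {"ghz": 3.0, "ipc": 1.0},
    "Modern core @ 2GHz (high IPC)": {"ghz": 2.0, "ipc": 4.0},
}
for name, c in cpus.items():
    # billions of instructions per second = GHz * instructions per cycle
    print(f"{name}: ~{c['ghz'] * c['ipc']:.1f} billion instructions/sec")

# Why 8 cores don't give 8x performance: Amdahl's law.
# If a fraction p of the work can be parallelized, the speedup on n cores
# is 1 / ((1 - p) + p / n).
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

for n in (2, 4, 8, 16):
    # assuming 90% of the workload parallelizes (example value)
    print(f"{n} cores: {amdahl_speedup(0.9, n):.2f}x speedup")
```

Even with 90% of the work parallelized, 8 cores only get you about a 4.7x speedup, which is exactly the "more cores isn't magically more performance" point.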
 
OP
DonMigs85

Banned
Oct 28, 2017
2,770
Man, I must be really dirt poor then. I thought mid-range was something like an i5-8400 paired with something around a 1070 or 2070 at best.
 

Deleted member 426

User requested account closure
Banned
Oct 25, 2017
7,273
For most console games, PC is an afterthought. I doubt it'll hold anything back just because many people's computers aren't powerful enough.
 