
Spark

Member
Dec 6, 2017
2,540
That's what they do every single generation. The biggest performance increase (comparatively) seems to be from the SSD this time. Admittedly, that will be a huge improvement, at least as far as sitting forever waiting for levels to load goes.



The 360 was more powerful than most gaming PCs around when Oblivion launched. That was an outlier scenario, but it did happen.
That was the norm at the time. The PS2 could run rings around high end PCs at launch in regards to post processing effects and character models etc.

It wasn't until this generation that consoles became essentially budget PCs in terms of hardware and power. This is the new thing.
 

laxu

Member
Nov 26, 2017
2,782
I have serious doubts about how good the raytracing support on next-gen consoles will be. To give you an idea, running something like Control at 4K with all raytracing settings brings even the 2080 Ti to its knees, to the point that framerates dip to sub-30 fps. Metro Exodus is just about playable at 4K with raytracing, with the fps hovering around 30. There is no way that the consoles are anywhere even close to that powerful. If any raytracing is involved, you can bet that the game will run at something like 1080p at best and be upscaled from there.

At their best I feel current console games are visually just as good as anything on PC, if you ignore the miserable 30 fps framerates and crap controller support (lack of configuration, no gyro aim, etc.) inherent to the platform. Next gen will hopefully raise that to at least 45 fps for AAA games; with variable refresh rate support, 45-60 would be a good range that would allow for impressive graphics at checkerboard 4K or upscaled from 1440p or 1800p. Ultrawide support and better controller support would be welcome.

8K support is pure marketing. It's just that HDMI 2.1 makes it possible, so in X years, when TV manufacturers start peddling more 8K TVs, you can hook one up to your console and have it work. That's it; you will have zero games that actually run at that resolution, everything will be upscaled to it.
 

RivalGT

Member
Dec 13, 2017
6,401
The CPUs didn't advance much at all this gen compared to last gen. And the previous gen we had systems that were not as balanced; the push for HD was too much for them. Devs can still push for 30 though, but we should at least have an option now.
 

Spinluck

▲ Legend ▲
Avenger
Oct 26, 2017
28,488
Chicago
If console gamers really cared about framerate they wouldn't buy most 30fps games.

Insomniac had some pretty valuable data proving this.
 

8byte

Attempted to circumvent ban with alt-account
Banned
Oct 28, 2017
9,880
Kansas
These discussions pop up at the turn of every generation, and at every generation, they are proven wrong.

Developers will continue to target the frame rate that best suits their goals, and it will vary wildly from game to game.
 

ShadowFox08

Banned
Nov 25, 2017
3,524
Well, I doubt the PS5/Scarlett is even going to be able to hit 60fps at 8K. So PCs will still hit 144Hz and above for a long time, since there is legitimately not a single chance that whatever is going into either console at this point will have the power for anything stable at 8K.

Why we're still targeting higher and higher resolutions for systems which can't handle them, I will never know, but that's the console way.
They won't do 8K. It's either going to happen as a mid-gen console refresh or be saved for when the next generation comes.
 
Nov 8, 2017
6,321
Stockholm, Sweden
Ah yes, the obligatory "for the next gen everyone will be targeting 60fps for sure, no really, for sure this time" post. 60 fps will not be a priority for the next gen; ≈4K will become a standard for all games, and ray tracing is super costly, so no, get ready for a lot of games targeting 30 fps, as usual.

This is not a complaint btw, I love when games go for maximum eye candy, depending on the genre of course; 60+ fps is a must for fighting games, driving games and competitive shooters.

The gap will be about as big as it always is between consoles and PC; there are some monstrous CPUs and GPUs on the market now.
 

Quad Lasers

Member
Oct 26, 2017
3,542
Support for raytracing is a tick in favor of more 30fps games, not fewer. Don't know why you cited it.
 

Sanctuary

Member
Oct 27, 2017
14,233
That was the norm at the time. The PS2 could run rings around high end PCs at launch in regards to post processing effects and character models etc

Not an apples to apples comparison. The PS2 could not do justice to the kinds of games that were becoming popular at the time on PC. That was also a period where PC developers were still primarily making PC exclusives (and console games were obviously tailored and a different experience on their respective systems) and we did not have the "console first" side of the business that it became around 2008 or so. Honestly, the only console that really blew me away compared to what a PC was doing at the time of release was the Dreamcast.

Oblivion was the first and only game that I can remember that was on both platforms where the console version clearly won. But then six months later the PC was on top again, and the gap only kept growing. It happens every gen though. The consoles start out strong and usually push graphics, even if they aren't normally quite up to PC levels overall, but then are much weaker in performance and, until this most recent gen, never had a chance of catching up. They still did not really catch up this gen, but it was a good step in the right direction and made a big difference for those with 4K TVs or monitors.

It wasn't until this generation that consoles became essentially budget PCs in terms of hardware and power. This is the new thing.

It's only new in the sense that it's using an architecture that is closer to a PC than not. The overall paradigm hasn't really changed though, especially not when it comes to what is marketed the hardest: graphics over performance. If the new consoles could push 60fps (which is where I am going to remain for a few more years anyway) as their targeted standard, I'd be more than happy with that, but the reality is likely going to be 30fps like it always has been. So unless Nvidia shits the bed yet again with their next lineup of cards, I'll continue doing what I've been doing: buying a console for the exclusives, and playing everything else on PC.


Only as a gimmick: "look at what our console can support", despite the fact that the people who can actually tell the difference between a 4K and an 8K TV (on a 30fps console, on a 55'' to 77'' TV), let alone actually own one, will be a niche within a niche.
 

headspawn

Member
Oct 27, 2017
14,620
They're never closing that gap.

If you're worried about the gap between high-end PC and console, never buy another console.
 

Belthazar90

Banned
Jun 3, 2019
4,316
I'd rather they use the extra power to push for more realistic cloth physics, dense foliage with more realistic interaction with characters, better lighting and animation, more elements on screen, higher resolution and better draw distance, with framerates above 30 being sought only after they have achieved everything they could with graphical fidelity.

PS: I don't play FPS or racing games, so framerate doesn't matter that much to me.
 

ShutterMunster

Art Manager
Verified
Oct 27, 2017
2,460
It's not gonna happen. You get what you pay for, in gaming and everything else. Want to play on the best hardware possible? Build your own high-end system. That simply isn't gonna change, not at the moment the next-gen consoles release, let alone halfway through the gen.

Also, the notion that PCs target some arbitrary superhigh FPS number doesn't make any sense. PCs don't target any number of FPS, unlike consoles, because the hardware is extremely variable. YOU yourself can try targeting a number of FPS while building your own rig, but it is as far as you can get from an exact science given the range of uses (and games) you can put your PC to.

Annnnnnnd /endthread
 

laxu

Member
Nov 26, 2017
2,782
Not an apples to apples comparison. The PS2 could not do justice to the kinds of games that were becoming popular at the time on PC. That was also a period where PC developers were still primarily making PC exclusives (and console games were obviously tailored and a different experience on their respective systems) and we did not have the "console first" side of the business that it became around 2008 or so. Honestly, the only console that really blew me away compared to what a PC was doing at the time of release was the Dreamcast.

Oblivion was the first and only game that I can remember that was on both platforms where the console version clearly won. But then six months later the PC was on top again, and the gap only kept growing. It happens every gen though. The consoles start out strong and usually push graphics, even if they aren't normally quite up to PC levels overall, but then are much weaker in performance and, until this most recent gen, never had a chance of catching up. They still did not really catch up this gen, but it was a good step in the right direction and made a big difference for those with 4K TVs or monitors.



It's only new in the sense that it's using an architecture that is closer to a PC than not. The overall paradigm hasn't really changed though, especially not when it comes to what is marketed the hardest: graphics over performance. If the new consoles could push 60fps (which is where I am going to remain for a few more years anyway) as their targeted standard, I'd be more than happy with that, but the reality is likely going to be 30fps like it always has been.

I agree with everything you said. I'm happy with the visual quality of current PS4 Pro games, but I do want higher framerates. Give me a God of War sequel at that graphics level with fewer disguised level transitions and a solid 60 fps. Give me a Horizon sequel with gyro aim and 60 fps.

This is the first of the x86 console generations that actually uses hardware that is current, obviously just cut-down and downclocked versions to hit their price points. It has surely helped that RAM, VRAM and SSD costs have come down.
 

sn00zer

Member
Feb 28, 2018
6,096
The gap now between consoles and PC is pretty dang small.
PC does better IQ, framerate, and resolution. Back in the mid-2000s PCs were a generation ahead. The difference was pretty staggering (look at the console games that came out in 2004, when HL2, Far Cry, and DOOM 3 released); now it's nicer for sure, but not the leap it used to be, and hasn't been for a long time.
 

ShutterMunster

Art Manager
Verified
Oct 27, 2017
2,460
The gap now between consoles and PC is pretty dang small.
PC does better IQ, framerate, and resolution. Back in the mid-2000s PCs were a generation ahead. The difference was pretty staggering (look at the console games that came out in 2004, when HL2, Far Cry, and DOOM 3 released); now it's nicer for sure, but not the leap it used to be, and hasn't been for a long time.

This too.
 

Mona

Banned
Oct 30, 2017
26,151
60fps on console lmao

devs (and consumers) have shown time and time again that 30fps is preferable to them

would be nice, but im not getting my hopes up
 

Spark

Member
Dec 6, 2017
2,540
The gap now between consoles and PC is pretty dang small.
PC does better IQ, framerate, and resolution. Back in the mid-2000s PCs were a generation ahead. The difference was pretty staggering (look at the console games that came out in 2004, when HL2, Far Cry, and DOOM 3 released); now it's nicer for sure, but not the leap it used to be, and hasn't been for a long time.
Tech in general moved at a faster pace back then. The "myth" of needing to upgrade your PC every two years was a real thing back then. Those $1500 PCs were literally outperformed by the $399 Xbox 360 within the span of a few months. I'm definitely glad those days are over.
 
Nov 8, 2017
6,321
Stockholm, Sweden
The gap now between consoles and PC is pretty dang small.
PC does better IQ, framerate, and resolution. Back in the mid-2000s PCs were a generation ahead. The difference was pretty staggering (look at the console games that came out in 2004, when HL2, Far Cry, and DOOM 3 released); now it's nicer for sure, but not the leap it used to be, and hasn't been for a long time.

This has more to do with most games being released on both PC and consoles now, in a way that they didn't use to be; both HL2 and Doom 3 were ported to the original Xbox in 2005.

The gap in power between a top-of-the-line PC and the base consoles is enormous.
 

0ptimusPayne

Member
Oct 27, 2017
5,754
It will be all over the place, but I'm gonna assume 1st party titles are gonna have a performance mode option based upon the options I've been given on my X and Pro. I do think most MP/FPS games will be 60fps.
 

Sanctuary

Member
Oct 27, 2017
14,233
The gap now between consoles and PC is pretty dang small.
PC does better IQ, framerate, and resolution. Back in the mid-2000s PCs were a generation ahead. The difference was pretty staggering (look at the console games that came out in 2004, when HL2, Far Cry, and DOOM 3 released); now it's nicer for sure, but not the leap it used to be, and hasn't been for a long time.

As it stands right now, it depends on what you mean by "PC". If you mean an average, mid-range gaming PC that costs $1000 for everything, including the card, then maybe. But if you're talking about an enthusiast gaming PC (and it doesn't even have to be on the extreme $$$ side), not really. Aside from a very select few titles (where the X is close to reaching parity), the PC I built last December runs circles around the Pro and the X. Graphically, I think it depends on whether you are sitting five or more feet away from a large TV, or much closer to a smaller monitor with a higher PPI. The PC generally has better visuals by default, but many of them only stand out in screenshots compared to games in motion, unless there's just horrible aliasing going on. Also, I don't understand the bolded part. Those are like the three most important differences, but you consider them small? If you want to talk about value though, the consoles beat PCs easily for just the hardware and general ease of use.

Being locked at 30fps is actually a pretty significant difference. One that can't be captured in a screenshot either. Whether or not it actually matters for some people though is a different story. Personally, it's huge to me, and that's also why I don't really want to go above 60fps right now even on PC. With the consoles being perpetually landlocked to 30fps, it would just feel awful for me to get used to 120fps and then have to go back to playing 30fps for exclusives.

When the PS5 launches though, I think the loading times (or lack thereof) will persuade me to look at it more favorably. I still prefer a higher frame rate, but the loading is really what made the experience so off-putting to me. Going from 2-5s loading times to 30-60s is excruciating.

60fps on console lmao

devs (and consumers) have shown time and time again that 30fps is preferable to them

would be nice, but im not getting my hopes up

Do you simply mean that consumers prefer 30fps if it allows for better graphics? Because in a general sense, I can't believe anyone actually prefers 30fps over 60fps, even if they claim not to be able to tell the difference.
 

Sedated

Member
Apr 13, 2018
2,598
With more CPU power there will be more stuff happening on screen, such as more NPCs, more destruction, etc., and it'll again be 30fps games. First-person shooter games will all be 60 like usual, but with fewer drops this time, I guess.

And a vast majority of PC gamers also play between 30-60fps. Those 100+fps PC people are like a needle in a haystack.
 

monmagman

Member
Dec 6, 2018
4,126
England,UK
I'll be more than happy with 30fps and eye-bleeding graphics... sorry, lol. (Although I imagine this is probably what will happen most of the time anyway.)
 

VariantX

Member
Oct 25, 2017
16,891
Columbia, SC
These discussions pop up at the turn of every generation, and at every generation, they are proven wrong.

Developers will continue to target the frame rate that best suits their goals, and it will vary wildly from game to game.

Frankly, this really needs to stop coming up at the beginning of every generation. It's always been about this. It's been about the developers' vision, to the point that they'll sacrifice both resolution and even a stable framerate, let alone 60 fps, to reach that goal. They're also willing to sacrifice visual fidelity if they want fluid 60fps gameplay. If you want 100+ FPS, then buy or build a PC that can get you that, because that's the only place where you can have that experience on a consistent basis, since you can brute-force your way there with enough money spent in the right places.
 

Bjones

Member
Oct 30, 2017
5,622
Consoles will mainly be 30fps, while the increased base spec will bring down high-end PC gaming at ultra settings.
 

Kuosi

Member
Oct 30, 2017
2,366
Finland
Moore's law is upon us. It's going to be very interesting to see the CPU you'll need in a PC to even double a 30fps PS5/XB2 game, considering they're going to have 8-core/16-thread Ryzen CPUs clocked around 3GHz. Even more difficult with 10+ TFLOP GPUs and SSDs.

It's not Jaguars anymore :P
How about the 3950X AMD is supposed to release this year, for starters? 16 cores, 32 threads, and higher clocks.
 

leng jai

Member
Nov 2, 2017
15,119
I have no doubt we will see a lot of 60fps games on console, but not when it comes to open-world titles (which are a significant share of the market). Ubisoft can't even get their open-world games on PC to 60 right now without significant issues unless you've got a beastly CPU.
 
Oct 25, 2017
11,721
United Kingdom
This gen has seen games with fantastic graphics that have still pushed for 60fps, only struggling to keep a stable framerate because the console specs were not well balanced, with the slow Jaguar CPUs, limited RAM + memory bandwidth and old GPUs.

Next gen's much newer, faster Ryzen CPU (a massive step up from the 2013 Jaguar), much newer, more powerful Navi GPU (again another big step up from the old GCN tech) and more RAM + memory bandwidth should make for much better and faster consoles, which should have fewer problems running games with nice graphics and higher resolutions at 60fps.
 

Deleted member 18161

user requested account closure
Banned
Oct 27, 2017
4,805
How about the 3950X AMD is supposed to release this year, for starters? 16 cores, 32 threads, and higher clocks.

And that's great, but how many people will have a CPU like that when the PS5 and XB2 launch? And even then, what about PC gamers who sneer at anything under 120 or 144fps? I don't see how it will be attainable, at least at launch, for next-gen exclusive third-party games, and that's completely ignoring that most PS5/XB2 games will be native 4K.

My point is that this isn't last gen, where any mid-range i5 could destroy the Jaguars at launch. A Zen 2, 8-core/16-thread Ryzen isn't going to be a meme and won't be easily bested by 90% of PC gamers' rigs, especially when targeting 2160p.
 
Oct 27, 2017
9,429
And that's great, but how many people will have a CPU like that when the PS5 and XB2 launch? And even then, what about PC gamers who sneer at anything under 120 or 144fps? I don't see how it will be attainable, at least at launch, for next-gen exclusive third-party games, and that's completely ignoring that most PS5/XB2 games will be native 4K.

My point is that this isn't last gen, where any mid-range i5 could destroy the Jaguars at launch. A Zen 2, 8-core/16-thread Ryzen isn't going to be a meme and won't be easily bested by 90% of PC gamers' rigs, especially when targeting 2160p.

You think those console CPU cores will be running anything close to PC CPU core speeds of the nearest neighbor chipset?

 

Tagyhag

Member
Oct 27, 2017
12,528
I really hope most people don't think every game will be at least 60fps next gen; you're setting yourself up for disappointment.

Would it be awesome? Hell yeah. But it's not realistic.

That said, the graphics gap has been closing IMO. PC will always do better IQ, resolution, and framerate, but other than the quality of ray tracing, I assume most games will look pretty similar next gen.

That's what happens with Moore's Law and dev costs/time.
 

Deleted member 18161

user requested account closure
Banned
Oct 27, 2017
4,805
You think those console CPU cores will be running anything close to PC CPU core speeds of the nearest neighbor chipset?

And yet look what was achieved on a shitty 1.6GHz Jaguar.

Also, I never said they would. All I said is that when a big-name third party makes a visually impressive, next-gen exclusive game targeting 4K/30fps with RT on PS5/XB2, you're going to need an insane rig to run the same game at 2160p at a locked 60fps on PC, or, perish the thought, 120/144fps.

PCs will always be better if you spend. It's just going to cost a lot more to double framerates next gen, because the consoles won't have netbook-level CPUs. That's all.
 

Bosch

Banned
May 15, 2019
3,680
With the rumoured CPUs and raytracing hardware for next-gen consoles... I'm expecting the gap between high-end PC and consoles to be much less than it has ever been
Nope, because devs will force 4K, or the closest they can get to it, and what we'll get will be 4K in a few games and sub-4K in others, at 30 fps.

A 5700 XT is not enough for 4K. It is a good GPU for 1440p@60, but devs will not go that way; they will force 4K with 30 fps...

There is no magic in consoles. You already know what you will get.