
Aeron

Member
Oct 27, 2017
1,156
The tech the next gen systems are using will be an absolute joke if they can't do 1080p 60fps without compromising quality.

Even with the penchant for cranking the graphics as high as possible, surely they'll realise they can strike a great balance of graphics and performance instead of chasing diminishing returns.
 

H4zeVentura

Alt-Account
Banned
Dec 28, 2018
111
I feel you, buddy. Wish there were more games with a 60 FPS option.
Everything with a faster pace and gameplay than the usual RPG should aim for 60 FPS.
 

Sony

Banned
Oct 30, 2017
565
I hope they don't. I prefer 30fps + bells and whistles in certain games/genres.
 

Dr. Caroll

Banned
Oct 27, 2017
8,111
Nah they'll probably focus on streaming, so 60 will take a back seat again.
Higher framerates reduce streaming input latency.
60 FPS will never be standard on consoles. It's a harder sell than pretty graphics and trendy buzzwords like "4K" and "HDR."
You don't need to "sell" 60fps to audiences when almost all the games audiences actually care about are already 60fps in some fashion. There has been a huge shift. A majority of the best selling games each year are 60fps now. Has been this way for a few years. People claimed that if given the choice, developers would stick to 30fps, and uh... no, that didn't happen at all. Developers have been flocking to adopt 60fps as the new norm. The biggest engine middleware provider on the planet, Epic, are actively promoting 60fps with a proliferation of Unreal 4 games offering quality/performance toggles. Many of the AAA titles released in 2018 were 60fps on Pro and X. Many of the top AAA titles of 2019 will be 60fps on Pro and X, including open world games like Dying Light 2 and Rage 2.

We are only going to see more and more 60fps games. There's no going back at this point. The days of pretending "30fps is good enough, honest" are gonna fade away just like people pretending that games running in the TEENS (Crysis on PS3 says hi) were "good enough because developers will always choose visuals over performance".

A year ago, Fortnite was 30fps on consoles. Nowadays it runs at 60fps on a high end iPhone. (And both home consoles.)
 

HellofaMouse

Member
Oct 27, 2017
6,168
I'm even OK with a stable 30, but we all know it's not gonna happen. FPS isn't as marketable as visual fidelity.
 

KayonXaikyre

Member
Oct 27, 2017
1,984
Man I wish. Playing at 100+ FPS has made me think my stuff's broken when I see 30 fps. Takes me a little bit to adjust. 60 FPS still looks fine though, but there have been a few times where I wasn't sure if something had broken because it looked so choppy.

Problem is consoles aren't going to be strong enough to do 4K 60 consistently (in a way that will actually make the games look substantially better than the PS4 Pro / Xbox One X), and I think they will continue to push resolution and effects over framerate since it's more obvious to the casual consumer (genres that typically already target it, like racing and sports, will obviously stay 60fps). Some people can't even tell the difference between 30 and 60 fps lol.
 

T002 Tyrant

Member
Nov 8, 2018
8,968
Didn't Phil Spencer say Nextbox was focusing on frame rate? Something about the CPU/GPU balance?

This is just my opinion, personally.

If frame rate is a focus, and say MS show Elder Scrolls 6 running at 60fps at 4K (hypothetically), it'll be short-lived: within about two years, once they push the graphical envelope or build an environment more challenging than we can currently imagine, 30fps will become more and more common again.

Simply put, pretty graphics are almost always put ahead of performance. So even if the console is designed to balance both, eventually someone will push the envelope on the graphics and sacrifice the framerate.
 

Valdega

Banned
Sep 7, 2018
1,609
You don't need to "sell" 60fps to audiences when almost all the games audiences actually care about are already 60fps in some fashion. There has been a huge shift. A majority of the best selling games each year are 60fps now. Has been this way for a few years.

Eh? Are we talking about single-player AAA console games? Because the majority of those are still 30 FPS. GTA, RDR, God of War (offers an unlocked mode but was designed for 30 FPS), Spider-Man, Detroit, Assassin's Creed, Far Cry, Watch_Dogs, Uncharted, Last of Us, Horizon, etc. Basically, games that rely on realistic and cinematic presentations stick with 30 FPS and I don't see that changing.

For multiplayer games, 60 FPS has become much more standard because those focus more on gameplay and less on presentation. For single-player AAA, not so much.
 

Error 52

Banned
Nov 1, 2017
2,032
There is no correlation between 60fps and better sales, and as much as we may prefer 60fps, I believe that until there is such a correlation, most games will remain at 30fps.

I don't think it's a coincidence that the biggest multiplayer shooters of the last two gens are 60fps.
 

Adamska

Banned
Oct 27, 2017
7,042
I'd rather have people finally accepting that not every game needs to meet their arbitrary criteria with regard to its framerate. End of the day, what matters is that the game provides the experience its creators set out to realize, and that doesn't necessarily require 60FPS. Heck, even on PC (where I always aim to play at 120FPS) games will sometimes not allow users to alter the game's refresh rate, and that's fine.
 

OwOtacon

Alt Account
Banned
Dec 18, 2018
2,394
No, but 60fps will become more mainstream as a byproduct of more developers working on PC games as well as the need for options given the half-generation console refreshes.
 

sabrina

Banned
Oct 25, 2017
5,174
newport beach, CA
omg did insomniac make the puddle worse than before. omg does spider-man's suit even ripple anymore?

omg does the witcher have way less foliage than before?

omg why was watch_dogs downgraded so much?

omg does shadow of the tomb raider look worse than rise?

omg I don't think odyssey is improved enough over origins

omg how come games aren't all at 60 fps?


it's lose/lose/lose/lose/lose/lose/lose because everyone feels entitled to something
 

Ωλ7XL9

Member
Oct 27, 2017
1,250
The need for 60fps is always an interesting subject. Personally I'd love to see every FPS/sports/racing title have 60fps as standard. Every other genre has been thoroughly enjoyable in its 30fps state. Heck, even God of War (as a fan of the series since 2005) felt superb to play and control at 4K/30fps on PS4 Pro. I didn't feel the need to play at a higher frame rate despite having that option. Here's hoping next gen allows for more games to offer a higher frame rate as one of the enhancement options.
 

Deleted member 22585

User requested account closure
Banned
Oct 28, 2017
4,519
EU
Some will, some won't. But I think that we will get the option between higher fps or higher resolution more often.

Generally though, I think sp games will stick to 30 more often while mp will be 60 most of the time.
 

monmagman

Member
Dec 6, 2018
4,126
England,UK
I'm guessing we might get more at 60, but with the push to 4K, visuals will once again take priority as that is what gets the masses' attention.
 

60fps

Banned
Dec 18, 2017
3,492
I doubt it. Most developers would rather put any extra power into making games look prettier.
I don't get this statement anymore. Games running in 60fps DO look prettier than their 30fps counterparts once in motion. And games are always in motion.

What good is a 4k resolution when you can't make out any details anymore once you move the camera because it runs only in 30fps? In 30fps try focusing on a distant object and start moving the camera. The image just jitters around. 4k in 30fps is only good for pretty screenshots. But games are more than screenshots. They're interactive, controllable mediums, and controlling in 60fps is always better than 30fps, that's a fact.
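
Rough numbers to make the point (purely illustrative, assuming a 90 degrees/second camera pan, not taken from any particular game):

for fps in (30, 60):
    frame_time_ms = 1000 / fps             # 33.3 ms vs 16.7 ms each frame sits on screen
    pan_speed_deg_per_s = 90               # assumed camera pan speed
    step_deg = pan_speed_deg_per_s / fps   # how far the view jumps between frames
    print(f"{fps} fps: {frame_time_ms:.1f} ms/frame, camera jumps {step_deg:.1f} deg/frame")

At 30fps every frame is held twice as long and the camera jumps twice as far between frames, and that's exactly what reads as jitter when you pan.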

This gen we've seen an increase in 60fps titles on consoles despite the very bad CPU

Next gen is likely to have a much more capable CPU, so yeah, likely many more 60fps titles
Agreed. People get used to having high framerate modes nowadays.

There isn't a game in existence that wouldn't benefit from 60fps or higher.
Exactly.
 
Last edited:

EkStatiC

Banned
Oct 27, 2017
1,243
Greece
Question for Devs:

Are they concerned that nowadays games are marketed through video (streams, let's plays, long gameplay trailers/presentations), and in this format a 30fps game is significantly different from a 60fps one?
 

Taker34

QA Tester
Verified
Oct 25, 2017
1,122
building stone people
I don't get this statement anymore. Games running in 60fps DO look prettier than their 30fps counterparts once in motion. And games are always in motion.

What good is a 4k resolution when you can't make out any details anymore once you move the camera because it runs only in 30fps? In 30fps try focusing on a distant object and start moving the camera. The image just jitters around. 4k in 30fps is only good for pretty screenshots. But games are more than screenshots. They're interactive, controllable mediums, and controlling in 60fps is always better than 30fps, that's a fact.
Depends on the game and genre. The same can be said about 1080p/60fps, which doesn't look adequate to me. It's a blurry mess and unpleasant to look at once you start playing at 1440p or 4K. I don't see the benefit of higher framerates when the resolution is just too low. At least I haven't encountered a situation where the 4K image gets compromised by moving the camera; that may depend on the game.
 

Aeron

Member
Oct 27, 2017
1,156
Do enough people have 4K displays that prioritising 4K over ultra quality 1080p 60fps makes sense from the developer's POV?
 

JohnPaulv2.0

Member
Dec 3, 2017
571
Better graphics are much easier to market than 60 fps. A nice looking screenshot makes people go "wow" much quicker than a video of it running at 60 fps.

I think this was definitely more true in the days of print media. However, now that a lot of our videogame announcements are first watched on YouTube etc., where fluidity can be perceived as well as the visuals, perhaps consumers will be more responsive to it.
 

Dr. Caroll

Banned
Oct 27, 2017
8,111
Eh? Are we talking about single-player AAA console games? Because the majority of those are still 30 FPS. GTA, RDR, God of War (offers an unlocked mode but was designed for 30 FPS), Spider-Man, Detroit, Assassin's Creed, Far Cry, Watch_Dogs, Uncharted, Last of Us, Horizon, etc. Basically, games that rely on realistic and cinematic presentations stick with 30 FPS and I don't see that changing.

For multiplayer games, 60 FPS has become much more standard because those focus more on gameplay and less on presentation. For single-player AAA, not so much.
You're compacting a lot of games into a single set. Take some of your examples year by year. In 2016, Watch_Dogs 2, Uncharted 4, and Far Cry Primal were 30fps, sure. But Call of Duty: Infinite Warfare, Call of Duty: Modern Warfare Remastered, Mirror's Edge: Catalyst, Doom 2016, Titanfall 2, and Battlefield 1 were all 60fps. And all bleeding edge games with extremely expensive production values. Most of those games dominated the sales charts. Infinite Warfare was basically the best selling game of the year.

Year after year DICE and Activision and Bethesda release stunning singleplayer campaigns that run at 60fps. EA released THREE singleplayer campaigns that ran at 60fps in 2016. That's in addition to their evergreen FIFA series being, naturally, 60fps. EA did not embrace 60fps until the 8th generation. They had stuff like Nightfire and TimeSplitters back in the 6th gen that were 60fps, but they were the exceptions. Battlefield 3 was a crossgen turning point. 30fps on last gen consoles. 60fps on next gen consoles. And then every Battlefield (and Battlefront, and Titanfall) after that was 60fps in both SP and MP. We have seen a huge shift. DICE and Activision are the proof that you can make 60fps games and also push bleeding edge visual tech. Battlefield is particularly notable for being 60fps and offering huge singleplayer levels with high levels of destruction. If you want your games to be 60fps, you can make them 60fps. And the biggest publishers in the world are making them 60fps because they see value in that. They're not copping out and making the MP 60fps and the SP 30fps.

Square also released Final Fantasy XV, which runs at a somewhat wobbly 60fps on Pro/X.

With the introduction of the PS4 Pro we saw games like Hitman 2016 trying to hit 60fps, something the X managed way better. (It goes without saying Hitman 2 this year is also 60fps.) Go back a year. 2015's The Witcher 3 became a 60fps game with the arrival of the Xbox One X. Rise of The Tomb Raider from 2015 was ported to other platforms and was 60fps on Pro/X. (Shadow of the Tomb Raider is also 60fps.) Gears of War 4 was released in 2016, and became 60fps with the arrival of the X. And of course other 2015 titles were 60fps including Battlefront, Wolfenstein: TOB, Call of Duty: Black Ops III, and Battlefield: Hardline.

We come forward to 2017. Wolfenstein II, Battlefront II, Resident Evil 7, and Call of Duty: WWII, which was a particularly insane sales titan, were all 60fps. Underdog NieR: Automata was also 60fps. Super Mario Odyssey was 60fps. Nioh was 60fps. Sniper Elite 4 was 60fps on Pro. The Evil Within 2 had a 60fps mode on Pro/X. You know, there are not a lot of AAA games released each year when you think about it. The major publishers have atrophied hardcore. Most of EA's output in 2017 was sports games. Most of which were 60fps, mind you.

2018 is a little atypical. Battlefield V runs at 60fps and has a very pretty and underrated campaign, but it didn't tear up the sales charts. Black Ops 4 is 60fps and prints money, but is missing its campaign this year. FIFA made huge money and is 60fps. Shadow of the Tomb Raider was 60fps, but unfortunately sold middlingly. And of course Rockstar came along with RDR2, which runs at 30fps on Pro and X and chugs alarmingly on the other consoles.

Look to next year. Resident Evil 2: Remake will be 60fps on at least some console platforms. Devil May Cry V will be 60fps. Bioware are considering a 60fps mode for Anthem on Pro/X. Rage 2 will be 60fps on Pro/X at least. Doom: Eternal will be 60fps. Gears 5 will be 60fps. Wolfenstein Youngblood will likely be 60fps just like every other Wolfenstein game. These are high profile games. If Activision have Modern Warfare 4 in the wings full of nostalgia bait, it will sell bazillions of copies. And its 60fps campaign will be beautiful and very 60fps. (On Pro and X at least. The older consoles are starting to wheeze.) Given EA's trends, that Respawn Star Wars game will probably be 60fps. Titanfall 3, if it is coming in 2019, will also be 60fps.

Microsoft are firmly pushing 60fps in their first party stuff. EA push 60fps in their high profile stuff. Activision's two biggest cash cows, games that define gaming for huge numbers of people, are 60fps. There are a huge number of games from across the various AAA publishers -- EA, Activision, Bethesda, Square -- that offer 60fps on the higher end consoles or even all models. And these are not games with dated graphics. These are some of the most visually impressive games on the market.

Sony does their own thing. They locked Uncharted 4's campaign at 30fps later in development and left MP at 60fps. Other publishers wouldn't do that. You don't see Activision releasing a Call of Duty where the MP is 60fps and the campaign is 30fps. The campaigns are always 60fps. God of War having a 60fps mode is nothing to sneeze at. You will see more of this from Sony. The industry at large has warmed to performance modes just like N64 developers warmed to sticking a higher resolution mode in everything.

I think it's kinda telling that Sony basically don't make FPS games anymore, and the only FPS games they do make are VR titles that run at 60fps+ through sheer necessity. 30fps FPS games are glaring. People will buy them -- Far Cry 5 sold amazingly because it's ace -- but the lack of responsiveness in a market where basically every competitor is 60fps causes problems.

Look at the Bethesda-published stuff that isn't 60fps: Prey was 30fps. And the biggest complaint about the game was its input latency. Dishonored 2 was 30fps. Same complaint. People pick up Wolfenstein II and Doom 2016 and they're amazing and responsive and crisp. Then they pick up Dishonored 2 and have a "what is this incredibly sluggish nonsense?" moment. If consoles had better CPUs, Prey and Dishonored 2 would be 60fps games. It's those weak Jaguar CPUs holding them back. People, myself included, hope that the new CPU architectures they're using in the next line of consoles will mean that people who buy the high end Xbox or people who buy the PS4 will be playing games across the board in 60fps. Currently, a huge number of the most popular titles on the market are 60fps. And their popularity is significantly boosted by their 60fps responsiveness. (And they're often graphically stunning. It's not some either/or thing.) Resident Evil 7 being 60fps after RE6 ran in the low 20s on PS3/360, and Revelations 2 ran at a somewhat uneven 60fps on PS4/XBO -- this was a huge boost. RE7 feels good. And being 60fps also made PSVR conversion easier.

Go back to the seventh generation. In the AAA space, 60fps games were unicorns. Devs were struggling with games that ran in the low 20s, across the board. Far Cry 3 and 4 ran so badly the opening cutscene in Far Cry 3 chugged along at 15fps on the PS3. Now Ubi's games run at a smooth 30fps for the most part. (And Siege is 60-120fps+) And once they have better CPUs in the next Xbox and PS5, they will push framerates higher just as they currently push resolution and asset fidelity higher due to increased GPU headroom.

Even the moderate CPU boosts of the Pro and X have caused a significant number of games to incorporate 60fps modes. The next series of consoles will have GPUs moderately better than the Pro and X, but CPUs that are astronomically better. It will be a game changer. Consoles have been held back by these weak, weak CPUs -- and despite this heaps of the top selling games each year -- a majority in some years -- are 60fps. The 60fps games put together sometimes sell more than the rest of the chart put together. Naysayers said it wouldn't happen, that publishers and developers don't care about framerate, but it did happen. Unless you live in a bubble and only play Sony third person cinematic games, the AAA industry -- including story-driven singleplayer games -- is overflowing with titles that support 60fps on consoles. And Epic are pushing hard for more developers to target 60fps on Unreal 4. A lot of engine features they've been adding over the past year have been specifically designed to pursue that goal, and they're using Fortnite as an example for developers to aspire to. A game people once thought was never going to work at 60fps due to CPU bottlenecking.

It's kinda interesting how PUBG is only finally running at a reasonably stable 30fps on consoles. But its mobile phone spinoff, PUBG Mobile, which has bazillions of players, runs at 60fps on high end phones. This was unimaginable a few years ago. Fortnite of course runs at 60fps on high end iPhones.
 

Valdega

Banned
Sep 7, 2018
1,609
But Call of Duty: Infinite Warfare, Call of Duty: Modern Warfare Remastered, Mirror's Edge: Catalyst, Doom 2016, Titanfall 2, and Battlefield 1 were all 60fps. And all bleeding edge games with extremely expensive production values. Most of those games dominated the sales charts. Infinite Warfare was basically the best selling game of the year.

Sure, that was true in 2016. In 2018, Red Dead Redemption 2, Spider-Man, Assassin's Creed: Odyssey, Far Cry 5, Fallout 76, Monster Hunter: World, Just Cause 4, God of War, Darksiders 3 and Detroit were all 30 FPS. Call of Duty and Battlefield are only 60 FPS because of their multiplayer focus and if they're going to do it for multiplayer, they might as well do it for single-player as well. If there's a trend of AAA games pushing for 60 FPS, 2018 definitely didn't show it.

The 60fps games put together sometimes sell more than the rest of the chart put together. Naysayers said it wouldn't happen, that publishers and developers don't care about framerate, but it did happen.

I'd love for this to be true but I'm just not seeing it. Call of Duty and Battlefield sell because of their IPs and they only target 60 FPS because of their multiplayer focus. id games like Doom, Wolfenstein and RAGE target 60 FPS because they're id games and id understands the importance of framerate. The majority of AAA IPs still target 30 FPS, however. Uncharted, Last of Us, Spider-Man, GTA, Red Dead, Assassin's Creed, Far Cry, Fallout, Monster Hunter, Just Cause, God of War, Darksiders, Ghost Recon, Mass Effect, Dragon Age, Destiny, Horizon, Killzone, State of Decay, Persona, Final Fantasy, Dead Rising, Zelda, Elder Scrolls, For Honor, etc, are all still 30 FPS. Hell, Ratchet & Clank used to be 60 FPS until Insomniac openly stated that 60 FPS doesn't matter to most players and isn't worth the development hassle. I personally disagree but their decision contradicts the trend you're describing.

Even when given extra hardware power, the majority of AAA developers are still going to use it to push visuals rather than performance for single-player games, especially open-world ones.
 

Fall Damage

Member
Oct 31, 2017
2,058
When shown side by side comparisons of PC high/ultra settings versus PS4/Xbox's (mediumish?), I have a hard time telling them apart. However, judder from camera rotation in 30 fps games really stands out. A while ago I asked my wife, who doesn't game at all, which version of a racing game I had running looked better, without any additional info (30 vs 60). She concluded the 30 fps game looked like it was moving in slow motion, without knowing why or knowing anything about how framerates work.

For me the upgrade from extra frames is always worth the trade-off. If there is camera judder it spoils the look of everything else. I don't see 60 fps as an unmarketable visual effect at all.
 
Oct 25, 2017
11,703
United Kingdom
Assassin's Creed: Odyssey, Far Cry 5, Fallout 76, Monster Hunter: World, Just Cause 4, God of War, Darksiders 3 and Detroit were all 30 FPS.

True, but the PC versions of AC Odyssey, Far Cry 5, Fallout 76, Monster Hunter World, Just Cause 4 and Darksiders 3 were 60fps or higher because PCs have much faster CPUs, so with Ryzen offering a massive leap in CPU power, next gen consoles should be able to target 60fps in these games too. Current gen is just held back by the old Jaguar CPU. Even God of War had a 1080p / 60fps mode on the Pro, which was a decent effort despite the Jaguar.

With console games gaining graphics and performance options too, next gen games should hit 60fps far more often, even if not every game is going to be 4k / 60fps. Developers will have more power and resources at their disposal from the new hardware and all the rendering options that have become common this gen, so developers should still be able to push the nice visuals while also targeting 60fps, like they can on PC.

I expect most games will offer a 60fps mode with different graphics options, and maybe a 4K / 30fps mode for really graphically demanding games.
 

giapel

Member
Oct 28, 2017
4,597
No. But it would be an interesting experiment to let gamers decide. Have a 30fps mode and a 60fps mode and make whatever sacrifices are needed to achieve 60fps. Lower resolution, lower quality textures, shadows, enemy counts, whatever. See then if gamers really value FPS above all else.
 

Dr. Caroll

Banned
Oct 27, 2017
8,111
Sure, that was true in 2016. In 2018, Red Dead Redemption 2, Spider-Man, Assassin's Creed: Odyssey, Far Cry 5, Fallout 76, Monster Hunter: World, Just Cause 4, God of War, Darksiders 3 and Detroit were all 30 FPS. Call of Duty and Battlefield are only 60 FPS because of their multiplayer focus and if they're going to do it for multiplayer, they might as well do it for single-player as well. If there's a trend of AAA games pushing for 60 FPS, 2018 definitely didn't show it.

I'd love for this to be true but I'm just not seeing it. Call of Duty and Battlefield sell because of their IPs and they only target 60 FPS because of their multiplayer focus. id games like Doom, Wolfenstein and RAGE target 60 FPS because they're id games and id understands the importance of framerate. The majority of AAA IPs still target 30 FPS, however. Uncharted, Last of Us, Spider-Man, GTA, Red Dead, Assassin's Creed, Far Cry, Fallout, Monster Hunter, Just Cause, God of War, Darksiders, Ghost Recon, Mass Effect, Dragon Age, Destiny, Horizon, Killzone, State of Decay, Persona, Final Fantasy, Dead Rising, Zelda, Elder Scrolls, For Honor, etc, are all still 30 FPS. Hell, Ratchet & Clank used to be 60 FPS until Insomniac openly stated that 60 FPS doesn't matter to most players and isn't worth the development hassle. I personally disagree but their decision contradicts the trend you're describing.

Even when given extra hardware power, the majority of AAA developers are still going to use it to push visuals rather than performance for single-player games, especially open-world ones.
I think you're kinda ignoring that current consoles are incapable of running the likes of Far Cry 5 at 60fps. This isn't some developer choice thing. They flat-out can't do it. The CPUs are too weak.

Tomb Raider, Hitman, Halo, Gears, Forza, Call of Duty, Battlefield, Battlefront, Rainbow 6, Doom, Wolfenstein, Titanfall, FIFA + basically every other sports franchise, Metal Gear Solid, Monster Hunter, Kingdom Hearts, The Witcher, Final Fantasy, Resident Evil, Devil May Cry, God of War, Fortnite, Sniper Elite, and many other games including smaller titles such as Hellblade and The Surge -- they're 60fps in some capacity thanks to Pro/X. Not necessarily stable 60fps, but 60fps. In the 7th generation, Battlefield did not run at 60fps. Now every single EA FPS game runs at 60fps. Even Mirror's Edge upgraded to 60fps where the original Mirror's Edge was 30fps. Resident Evil has gone from running in the low 20s (RE6) to running at 60fps (RE7 & RE2 REmake). Next year Techland are finally transitioning to 60fps for Dying Light 2. You mention Monster Hunter as a 30fps series, yet it has a performance mode that runs okayish on the X.

Back in the 6th generation, the only 60fps FPS games were Metroid Prime, TimeSplitters, and maybe-kinda Nightfire. Everything else was 30fps. Or even lower. This trend got even worse in the 7th gen where basically nothing except Call of Duty was running at 60fps. Now 60fps is widespread among the leading AAA games. Formerly 30fps series like Halo are 60fps now. FPS games that aren't 60fps are seen as weird. Destiny 2 attracts a lot of criticism compared to the PC version.

Sony don't make all that many FPS games anymore because reasons, but all the FPS games they do make are 60fps because they're PSVR titles. All of Sony's PSVR games are 60fps because anything lower gives people motion sickness. Dropping below 60fps fails PSVR certification.

The trend towards the normalization of 60fps has been growing stronger for years. The only thing holding it back is extremely weak CPUs, something the next generation will hopefully rectify. 2018 was a big year for open world games, and two of the most popular games of 2018 were 30fps Ubisoft titles. But once the CPUs stop sucking, that won't be an issue anymore. I know that's a little bit handwavey, but open world games currently don't target 60fps by and large (bearing in mind Dying Light 2 next year will be 60fps) because the CPUs are just too weak. The GPUs are strong, but the CPUs aren't. A game like The Witcher 3 can run at 60fps on Xbox One X because it's not very CPU demanding. A game like AC: Odyssey can't because the CPU was lame by 2013 standards.

Your argument that Battlefield and Call of Duty are only 60fps because of their multiplayer is very weak. The SP and MP are by different teams, for a start. And these developers creating incredibly good looking games (with heaps of tech that isn't even really used in the MP such as cutting edge facial animation systems) that run at 60fps is not some incidental thing that happened coincidentally because the multiplayer team was doing stuff in their corner.

Compare that to Naughty Dog. Uncharted 4's multiplayer is 60fps. The singleplayer was supposed to be 60fps, and this was a huge bullet point originally, but it ended up only being 30fps.

When the lead developer of The Last of Us and the assortment of Naughty Dog senior developers who had migrated to Infinity Ward were making Call of Duty: Infinite Warfare's campaign, they made a 60fps campaign. When Visceral made Battlefield: Hardline, they made a 60fps campaign. And said campaign is still graphically beautiful today. Having 60fps multiplayer doesn't make 60fps singleplayer trivial.

You mention Dragon Age and Mass Effect. Bioware historically made games that ran terribly on consoles. KOTOR often runs in the teens on Xbox. (It runs at 60fps via Xbox BC, which is cute.) But with their recent work, they targeted 30fps for Dragon Age: Inquisition and Mass Effect: Andromeda. They have claimed that 60fps is under consideration for Anthem on the PS4 Pro and X. We shall see what comes of that. But ten years ago that wouldn't even have been a thing. We took for granted that console games would be 30fps, or more likely something chuggier. Yet in the modern market, AAA games being 60fps on consoles has become normal. Going forward, it will become even more normal.
No. But it would be an interesting experiment to let gamers decide. Have a 30fps mode and a 60fps mode and make whatever sacrifices are needed to achieve 60fps. Lower resolution, lower quality textures, shadows, enemy counts, whatever. See then if gamers really value FPS above all else.
Uh... developers already do that on the Pro and moreso the X. Resident Evil 2: Remake has a 60fps performance mode. Monster Hunter World has a performance mode. The Witcher 3 has a 60fps performance mode. God of War has a 60fps performance mode. Shadow of the Tomb Raider has a 60fps performance mode. Hitman 2 has a 60fps performance mode. Final Fantasy XV has a 60fps performance mode.

The issue with current consoles and framerate is not graphical fidelity. It is terrible CPU performance due to the use of low end Jaguar CPUs. You can't magically make Far Cry 5 run at 60fps on current consoles by turning the graphics to paste. It's entirely a CPU problem. That's why the X can run Far Cry 5 at basically 4K resolution, but the framerate is capped at 30fps.
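
A toy way to see it (made-up millisecond timings, not measured from Far Cry 5 or any real game): a frame takes as long as the slower of the CPU and GPU work, so shrinking only the GPU's share changes nothing.

def fps(cpu_ms, gpu_ms):
    # Whichever stage is slower gates the whole frame.
    return 1000 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=30, gpu_ms=28))   # ~33 fps at 4K: the CPU is already the wall
print(fps(cpu_ms=30, gpu_ms=12))   # still ~33 fps after gutting the graphics
print(fps(cpu_ms=14, gpu_ms=12))   # ~71 fps: only a faster CPU opens up 60+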
 
Last edited:

TechnicPuppet

Member
Oct 28, 2017
10,834
A lot of MS games have recently started having at least 60fps mode in some form. I'd expect that to be the case for nearly all of their games next gen with the better CPU.
 

SunhiLegend

The Legend Continues
Member
Oct 25, 2017
2,573
Honestly, I'd rather keep a stable 30fps with extra graphical features than 60fps. Apart from sports games and a couple of shooters, pretty much every game I've played this gen has been 30; all the big AAA games have certainly been like that, and I doubt they're going to change going forward. Hell, even on my PC where I can run at 60, I always lock to 30 instead and bump up the graphics. 60 is always going to be a compromise unless I spend a ridiculous amount to run max settings at 60fps, so I'd rather keep settings high and run at 30 than reduce a lot to get to 60.
 

Pargon

Member
Oct 27, 2017
12,014
If anything I am more concerned for next-gen, because AAA developers will continue not to care about gameplay and will keep building 30 FPS experiences.
With them having access to a relatively modern and competitive CPU, it concerns me that PCs will no longer be able to brute-force those games to run at high frame rates, just as some games already struggled to stay above 60 FPS this generation.

that is not true, 45fps is awful, either give 30 fps locked or 60fps locked but not this half assed variable framerate stuff, it is highly distracting to have fluctuating framerates
With variable refresh rate displays, higher framerates are pretty much always better, no matter what they are.
It's only old, fixed 60Hz refresh rate displays that have to choose between 30 or 60.

I have a Gsync monitor and before I got it people used to tell me how Gsync makes 45-50fps feel much smoother and very close to how 60fps feels on a locked refresh rate monitor. That's not really true, it's smoother due to lack of judder from duplicate frames but it's still noticeable and very obvious that it's lower than 60.
[…]
I personally think variable refresh rates benefit the most when running above 60fps instead. The reason for this is that refresh rates around 100 or more feel considerably better, but since it's not really possible to consistently hit 144fps or 120fps in the latest games all the time, having a variable refresh rate helps there.
I agree with you completely on these points.
A VRR display does not make sub-60 feel like 60, but does make it feel a lot better than a locked 30.
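
To put rough numbers on the duplicate-frame judder (illustrative only, assuming a game averaging 45fps with vsync, on a fixed 60Hz panel vs a VRR one):

import math

refresh_ms = 1000 / 60   # a fixed 60 Hz panel can only swap images every 16.7 ms
frame_ms = 1000 / 45     # the game finishes a new frame every 22.2 ms

# Fixed 60 Hz + vsync: each frame waits for the next refresh, so hold times snap
# to multiples of 16.7 ms and the cadence comes out uneven.
shown = [math.ceil(i * frame_ms / refresh_ms) * refresh_ms for i in range(1, 8)]
print([round(b - a, 1) for a, b in zip(shown, shown[1:])])   # [16.7, 16.7, 33.3, ...] = judder

# VRR: the panel refreshes whenever the frame is ready, so every frame is held ~22.2 ms.
print(round(frame_ms, 1))

Still not 60fps, but the pacing is even, which is why it feels better than a fixed panel at the same framerate.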

And getting a VRR display made frame rates above 60 far more viable than they were before, since it allows the frame rate to be unlocked while still appearing smooth.
If anything, that has pushed the minimum frame rate that I target in most games up to about 80-90 FPS, rather than making it more acceptable for the frame rate to drop below 60.
If you want to get an idea of how smooth a game feels at those framerates with variable refresh rate, then just play a game with Vsync off on your fixed-rate monitor while it runs at those framerates. It won't look the same because obviously it'll tear everywhere, but the "smoothness" should be comparable as there won't be full duplicate frames.
Can't agree with you here though. V-Sync off is considerably less smooth than using G-Sync, especially on a 60Hz display.

I'd rather have people finally accepting that not every game needs to meet their arbitrary criteria with regard to its framerate. End of the day, what matters is that the game provides the experience its creators set out to realize, and that doesn't necessarily require 60FPS.
I cannot think of any game with moving elements/player interaction that would not benefit from a higher frame rate. Nothing which has a moving camera in it should run at 30 FPS.
I couldn't care less about "the experience its creators set out to realize" if that means it runs at 30 FPS. The only thing that means is that I'm not going to play it.
Heck, even on PC (where I always aim to play at 120FPS) games will sometimes not allow users to alter the game's refresh rate, and that's fine.
That is not remotely 'fine' at all.
 

Nooblet

Member
Oct 25, 2017
13,632
Can't agree with you here though. V-Sync off is considerably less smooth than using G-Sync, especially on a 60Hz display.

I meant smoothness in terms of input and how your mouse/aim/cursor moves in sub 60FPS scenarios, not in terms of image quality because obviously there is tearing.
 
Oct 27, 2017
4,018
Florida
I hope so and hope not at the same time. What I mean is there are some games where I'd rather have the frame budget go to more effects and detail.

Uncharted and TLOU are great examples of games that are 30 with cranked graphics. Hell, Sony's entire first party lineup pretty much.

I'd rather play Halo Infinite 30fps in campaign with cranked graphics and effects and 60fps in multiplayer for example.
 

Detail

Member
Dec 30, 2018
2,947
I think developers will always favour graphical fidelity over frame rate on consoles.

I am mainly a PC gamer but also own a Switch and a One X, and I would take 60fps and more responsive gameplay over graphics any day tbh.
 

Stanng243

Member
Oct 25, 2017
1,242
I just recently bought Assassin's Creed Odyssey on the PS4 Pro, and whilst it is a good game, I'm disappointed with the frame rate. The problem is probably compounded by the fact I played Origins on the PC at 60fps. But Origins feels so much better at that frame rate. It has an immediacy and fluidity that is perfect for that sort of game. The combat is fast paced and I find myself making mistakes because of the input lag and low frame rate in Odyssey. I also don't feel as immersed in Odyssey; it feels stuttery and not as fluid, and no amount of motion blur can cover it up. In fact, it makes it worse. The menus are also slow to load, but at least they got those 4K visuals. Just don't move too much or it trashes the visuals.

Also, GOW was an amazing game, and it was even more amazing in the frame rate mode on the PS4 Pro. Games are always better at higher frame rates. They are never made worse by them.

I don't understand why some developers don't see the value in frame rate. They seem to view games as a single image, when games are played as a series of images. Games feel and look much better when our eyes are fully fooled into believing we aren't seeing a series of images, which is better achieved by showing more of those images each second.

Are developers just going to throw more polygons and higher resolutions and textures at next gen games? Will 4K at 30fps be the new norm, or 8K at sub-30fps, or seeing slightly further with more tessellated junk objects that you can't interact with? Surely we are reaching diminishing returns with single image fidelity. Only a minority will notice or be able to enjoy such visuals, when everyone can enjoy higher frame rates.

Instead of 4K as a selling point, wouldn't "play Assassin's Creed at 60 fps" be a bigger selling point? Not to pick on Assassin's Creed; wouldn't it be a big selling point for any game when everyone else is just focusing on pure single-image visuals?

60fps should be the minimum in my opinion, we should be pushing way beyond that by now, even without VR. High frame rate is vital for immersive gameplay experiences.

This is kind of a plea to developers, please look at the whole experience instead of just the single fleeting image.

So am I wrong? Are people not that bothered about frame rates? Is 30fps forever going to be the norm and that's fine, or is it about time developers started focusing on it?
I'd rather have 30 FPS with more bells and whistles than 60 FPS. I've never really noticed or cared about framerates personally.
 

cnorwood

Banned
Oct 28, 2017
3,345
Maybe. If hybrid VR games like RE7 become more of a thing, then yes, 60fps will become more and more standard IMO. If not, no. There is no reason to; outside of people here, most people don't give a fuck about frame rates. And as someone who notices frame rates, I don't really give a fuck outside of VR.
 

Dezzy

Member
Oct 25, 2017
3,435
USA
I hope they do; games look good enough. Just look at RDR2: the game looks amazingly good and still holds 30 fps. I don't know how they could make that look better, or why they'd ever need to. The only improvement left to make is the performance.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,207
Dark Space
I'd rather have 30 FPS with more bells and whistles than 60 FPS. I've never really noticed or cared about framerates personally.

How much experience do you have with high framerates though? Have you ever existed in an ecosystem where every game ran at 60fps?

A lot of people, not saying this is you as I don't know, brush off 60fps because they just haven't been in the position to live with it and see it as the norm.
 

jon bones

Member
Oct 25, 2017
26,019
NYC
I sure hope so. I love the accessibility of consoles, but it's so hard to go down from 60 fps on my PC.

A lot of MS games have recently started having at least 60fps mode in some form. I'd expect that to be the case for nearly all of their games next gen with the better CPU.

Add in Freesync via HDMI 2.1 and the 'higher end' X model console and I think we'll be able to get a very smooth 4K experience next gen.