
BrickArts295

GOTY Tracking Thread Master
Member
Oct 26, 2017
13,746
I think they "unintentionally" will. The focus will be on 4K, but for the first couple of years 1080p/1440p will be the new 900p, which I'm confident most here will be more than fine with. I hope they can at least offer the option of 1080p/60fps or 4K/30fps, like some games are doing with X/Pro patches.
 

illamap

Banned
Oct 28, 2017
466
What is there left to improve graphically that is only possible in 30fps games? Even this gen we didn't really see any paradigm change that resulted in huge visual gains. PBR, SSR, parallax-corrected cubemaps, capsule AO, SSAO, clustered techniques: all were possible last gen in some capacity. The only new thing I can think of is froxel-based volumetrics instead of the screen-space variants used last gen. Only a couple of titles use SDFs, so it's not exactly a big thing currently, but it's something that might help LOD management next gen. And all of that is possible in 60fps titles this gen.

The biggest graphical bottleneck for 60fps titles compared to 30fps ones, afaik, is the draw-call cost on the CPU, which forces geometrically simpler environments than their 30fps counterparts. Next gen, quad overshading is going to be a bigger problem than it is today, which limits how much more geometrically detailed 30fps titles can get.
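To put rough numbers on that draw-call point, here is a back-of-envelope sketch; the per-call CPU cost and the fixed per-frame CPU work are made-up, illustrative figures, not measurements from any real engine. It just illustrates that halving the frame time from ~33.3 ms to ~16.7 ms shrinks the CPU time left for submitting draw calls by far more than half once the fixed work is accounted for.

```cpp
#include <cstdio>

// Toy model of the CPU-side draw-call budget at 30fps vs 60fps.
// Per-call cost and fixed per-frame CPU work are illustrative assumptions,
// not profiled data from any real engine or console.
int main() {
    const double frame_ms_30 = 1000.0 / 30.0;  // ~33.3 ms per frame
    const double frame_ms_60 = 1000.0 / 60.0;  // ~16.7 ms per frame

    const double cost_per_drawcall_ms = 0.01;  // assumed CPU cost per draw call
    const double other_cpu_work_ms    = 8.0;   // assumed AI, physics, animation, ...

    auto max_drawcalls = [&](double frame_ms) {
        double budget_ms = frame_ms - other_cpu_work_ms;  // time left for submission
        return budget_ms > 0.0 ? static_cast<int>(budget_ms / cost_per_drawcall_ms) : 0;
    };

    std::printf("draw-call budget at 30fps: %d\n", max_drawcalls(frame_ms_30));
    std::printf("draw-call budget at 60fps: %d\n", max_drawcalls(frame_ms_60));
    // With these made-up numbers the 60fps budget is roughly a third of the
    // 30fps one, because the fixed CPU work eats a bigger share of the shorter frame.
    return 0;
}
```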

Another point that is rarely discussed is that games already look fairly pretty, so how do you make an already pretty image even prettier in terms of art? There is also a bottleneck in how much prettier a game can be made on the art side.
 

Sei

Member
Oct 28, 2017
5,708
LA
Man, trying to play Bloodborne after playing Dark Souls 3 at 60 fps is painful and nausea-inducing.

Playing the Witcher 3 on consoles was garbage.

If they really want to tap PC gamers, I would hope they have some new standards.
 

Snake Eater

Attempted to circumvent ban with alt account
Banned
Oct 27, 2017
11,385
I think native 4K with roughly 30FPS will be the norm next generation
 

AtomicShroom

Tools & Automation
Verified
Oct 28, 2017
3,075
They won't. They'll prioritize native 4K and flashier graphics way before they care about 60fps.
 

TheSyldat

Banned
Nov 4, 2018
1,127
Given that everybody is jumping on the 4K bandwagon when both the GPUs and the TV screens are barely able to keep a steady 30 FPS, what do you guys think...

Of course 30 FPS will keep on being the norm...

That is, unless for once in video game history gamers actually give a bloody damn and stop buying games that don't meet the criteria, but also keep playing at 1080p for a full generation...
Games are always better at higher frame rates. They are never made worse by it.
Do some retro gaming on PC or on pre-PC-era microcomputers, or hell, even some Neo-Geo games, and you'll find tons of examples to the contrary...
Again, a higher frame rate doesn't mean much if said higher frame rate isn't being used...
 

Valdega

Banned
Sep 7, 2018
1,609
How has Gears become "less and less relevant"? What are you talking about?

The Last of Us's multiplayer is so popular it has an active player base on PS3 five years after release. It is a popular multiplayer game, which is why its sequel also has multiplayer. Just because people on forums don't talk about it doesn't make it any less popular. As I explained, the popularity of multiplayer in Sony's games kinda seems to be intentionally downplayed to present this image of Sony as the only company making "real" singleplayer games. The fact it is still played half a decade after release kinda disproves the argument some people wheel out about MP in Sony games being "tacked on MP that nobody plays after a few weeks".

After Epic stopped working on Gears, the IP became far less relevant to the general gaming audience. Same thing happened when Bungie stopped working on Halo. When IPs are passed off to different developers, that's generally when people stop caring. There are many people who played the first three Gears/Halo games and then stopped once the original developers left. Gears of War was a killer app for the Xbox 360. That's no longer the case with the Xbox One. Forza is really the only killer app on Xbox these days.

As for TLOU's multiplayer, would you say its player base represents the majority of TLOU fans or a much smaller minority? I'm pretty sure it's the latter, especially when you consider that TLOU has sold over 17 million copies. When I say "tacked on," I'm not saying that nobody plays it. I'm saying it's not the focus of the game because multiplayer has never been the focus of Naughty Dog's games. You could cut the MP modes from TLOU and Uncharted and the vast majority of players wouldn't care. Hell, I'm pretty sure the majority of TLOU/Uncharted players have never even tried the MP modes.

Even if that were true, that has absolutely no bearing on the game itself. This is basically a demographics argument. You're saying that any game series which gains a vocal multiplayer fanbase along the way is no longer a singleplayer game series even if the singleplayer is consistently excellent.

No, I'm saying a multiplayer-centric game is one where the majority of players only care about the multiplayer. CoD and Battlefield are multiplayer-centric games. The majority of TLOU players don't care about its multiplayer and therefore it is not a multiplayer-centric game.

HITMAN 2 has a multiplayer mode. It also has a co-op mode. However, the majority of HITMAN players don't care about either. That's why HITMAN 2 is not a multiplayer-centric game. Assassin's Creed used to have a multiplayer mode but the majority of players didn't care about it so Ubisoft eventually scrapped it. Fortnite started as a co-op game but the vast majority of players only care about its BR mode so classifying it as a BR game is accurate. The focus of the majority of players is what determines if a game is SP- or MP-centric.

It goes hand in hand with this bizarre idea that these Call of Duty campaigns in particular are made in a year. A lot of people got it into their heads that Call of Duty campaigns were somehow "low effort" and "churned out" in a single year as stocking fillers, basically. In reality they are huge AAA projects with 36 month development cycles and involve some of the best singleplayer story-driven game talent these studios can recruit.

In terms of production values, you are absolutely right. In terms of actual gameplay and level design? CoD and BF campaigns are pretty lackluster. They are equivalent to Michael Bay movies: all style and no substance. Rigidly linear and scripted level design, overly reliant on set-pieces, no meaningful weapon or enemy variety, minimal player agency. Glorified shooting galleries, really. They've barely evolved since CoD4 which is why they feel stale and forgettable. CoD and BF campaigns are basically super expensive spectacles but not much else.

And on the other side of the fence, EA are firmly dedicated to singleplayer experiences. Battlefield, Titanfall, Star Wars -- all prominently feature singleplayer. They have a new Star Wars game coming from Call of Duty creators Respawn this year, unless it gets delayed. And it'll probably be 60fps.

Battlefield features single-player but I wouldn't be surprised if they just cut it entirely in the future. Titanfall didn't have single-player until TF2 and it still bombed. The franchise is likely dead now. Several single-player Star Wars games have already been canceled (like 1313 and Visceral's game).

There's long been a prevalent belief that consumers won't accept $60 games that don't include both single-player and multiplayer modes. That's why Tomb Raider 2013 had a multiplayer mode and why Uncharted and TLOU had them as well. In recent years, publishers have become more comfortable going with single-player only games, though that usually translates to "open-world and enough filler to last 100+ hours." Publishers aren't as comfortable going with multiplayer-only games, though that may be changing with games like For Honor, Skull & Bones, R6: Siege and BLOPS4.

In the old days, there was no confusion. Doom was a singleplayer game. Quake was a singleplayer game. Quake II was a singleplayer game. Quake III was a multiplayer game. Quake 4 was a singleplayer game. Unreal was a singleplayer game. Unreal Tournament was a multiplayer game. The term "MP focused" has come to mean "fantastic singleplayer game with a multiplayer fanbase that is rather vocal so this somehow lessens the game's accomplishments as a singleplayer title."

Quake has always been a multiplayer-centric series. The single-player campaigns were passable but the focus of most players was multiplayer, hence the multiplayer-only Q3, Quake Live and Quake Champions. The term "MP focused" means exactly what it says: focused on multiplayer. CoD and BF are focused on multiplayer because the majority of their players focus on multiplayer. You may personally focus on their single-player campaigns but you are the minority.
 

astro

Member
Oct 25, 2017
56,902
Do some retro gaming on PC or on pre-PC-era microcomputers, or hell, even some Neo-Geo games, and you'll find tons of examples to the contrary...
Again, a higher frame rate doesn't mean much if said higher frame rate isn't being used...

They don't mean those types of games though, we're discussing modern games here.

Obviously older games conform to different standards and sometimes for a reason.
 

TheSyldat

Banned
Nov 4, 2018
1,127
They don't mean those types of games though, we're discussing modern games here.

Obviously older games conform to different standards and sometimes for a reason.
Even modern-day games won't benefit from a higher frame rate if the rest of the game mechanics aren't tied and timed accordingly.
Look at the nightmare that Durante went through while trying to save the PC port of "Little King's Story"...
Again, if the rest of the underlying design isn't meant to target 60FPS, then pushing the game to 60 won't improve it.
So when the OP says "games are always better with a higher frame rate," no, it isn't true if the game hasn't been designed for it from the ground up (i.e. the decision was made during pre-production, before ANY line of code was typed, that the game was to hit a higher refresh rate).

There are recent examples, even on PC, where a higher frame rate just kills the gameplay if the game wasn't designed accordingly.
 

Valdega

Banned
Sep 7, 2018
1,609
Look at what kind of games those developers make. Basically all the 30fps games are open-world-style games, because they need lots of CPU power for 60fps, which can't be provided by current-gen CPUs. Now look at the list of 60fps games. Many of them were 30fps last gen (FIFA, Battlefield, Resident Evil, ...) - but thanks to more powerful CPUs, they are now 60fps. The same thing will happen with open-world games. There is a clear trend towards 60fps; it's basically inevitable.

Except for Battlefield, none of the games you listed are CPU-intensive. The developers targeted 60 FPS this gen because they decided that those games play better at 60 FPS and are worth the sacrifice in potential scope/complexity/visuals. For example, RE7 is basically a first-person corridor shooter that never has more than a few enemies on-screen whereas RE6 was a bombastic spectacle with elaborate set-pieces and large, detailed environments. Most developers do not believe the sacrifice is worth it, hence the prevalence of 30 FPS games.

So when the OP says "games are always better with a higher frame rate," no, it isn't true if the game hasn't been designed for it from the ground up (i.e. the decision was made during pre-production, before ANY line of code was typed, that the game was to hit a higher refresh rate).

There are recent examples, even on PC, where a higher frame rate just kills the gameplay if the game wasn't designed accordingly.

Well, no, coders and scripters just have to make sure their logic is framerate-independent. If they do that, the game won't break at variable framerates. Most modern console games target 30 FPS but their code is framerate-independent so they don't have any issues running at 60+ FPS on PC.
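For what it's worth, "framerate-independent" usually just means scaling every per-frame change by the measured delta time instead of hardcoding a per-frame step. A minimal sketch of the idea, with arbitrary example values:

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

// Minimal sketch of framerate-independent movement: the position update is
// scaled by the measured frame time, so the result is the same whether the
// game runs at 30, 60 or 144 fps. All values are arbitrary examples.
int main() {
    using clock = std::chrono::steady_clock;

    double position_m    = 0.0;  // player position in metres
    double speed_m_per_s = 5.0;  // desired speed, independent of framerate

    auto previous = clock::now();
    for (int frame = 0; frame < 120; ++frame) {
        // Simulate an uneven framerate: alternate ~16 ms and ~33 ms frames.
        std::this_thread::sleep_for(std::chrono::milliseconds(frame % 2 ? 33 : 16));

        auto now = clock::now();
        double dt = std::chrono::duration<double>(now - previous).count();
        previous = now;

        // Framerate-independent update: scale by elapsed seconds.
        position_m += speed_m_per_s * dt;

        // A framerate-DEPENDENT update would hardcode the step, e.g.
        // position_m += 5.0 / 60.0;  // correct only at exactly 60fps
    }
    std::printf("travelled %.2f m over ~3 s of mixed frame times\n", position_m);
    return 0;
}
```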
 

astro

Member
Oct 25, 2017
56,902
Even modern-day games won't benefit from a higher frame rate if the rest of the game mechanics aren't tied and timed accordingly.
Look at the nightmare that Durante went through while trying to save the PC port of "Little King's Story"...
Again, if the rest of the underlying design isn't meant to target 60FPS, then pushing the game to 60 won't improve it.
So when the OP says "games are always better with a higher frame rate," no, it isn't true if the game hasn't been designed for it from the ground up (i.e. the decision was made during pre-production, before ANY line of code was typed, that the game was to hit a higher refresh rate).

There are recent examples, even on PC, where a higher frame rate just kills the gameplay if the game wasn't designed accordingly.

Yes, they would. As long as changing the frame rate doesn't break the game, every game will benefit from higher frames simply from the boost to clarity in motion.

So, okay, not EVERY game will benefit from this.

Let's rephrase: any game that COULD hit 60 and play well would 100% be better at 60.
 

TheSyldat

Banned
Nov 4, 2018
1,127
Yes, they would. As long as changing the frame rate doesn't break the game, every game will benefit from higher frames simply from the boost to clarity in motion.

So, okay, not EVERY game will benefit from this.

Let's rephrase: any game that COULD hit 60 and play well would 100% be better at 60.
Even with your rephrasing, you basically straight up agree that the WHOLE team has to have that target in mind from the get-go, at the very start of the project. And like I said, given that everybody is jumping on the 4K bandwagon and that 4K TV screens are barely able to keep up a solid, steady 30 without fluctuation at 4K, no, it won't be on the map for the big AAA hard hitters unless gamers are willing to say they prioritize frame rate over resolution...

At some point you have to choose, folks. Animation fluidity or higher resolution: you CAN'T have both, so which one is it?
 

astro

Member
Oct 25, 2017
56,902
Even with your rephrasing, you basically straight up agree that the WHOLE team has to have that target in mind from the get-go, at the very start of the project. And like I said, given that everybody is jumping on the 4K bandwagon and that 4K TV screens are barely able to keep up a solid, steady 30 without fluctuation at 4K, no, it won't be on the map for the big AAA hard hitters unless gamers are willing to say they prioritize frame rate over resolution...

At some point you have to choose, folks. Animation fluidity or higher resolution: you CAN'T have both, so which one is it?

I don't know why you're clouding this so much, it's very simple.

If a game can hit 60fps it will play better at 60fps. Obviously certain games need to trade off, that was never in question.
 

Nooblet

Member
Oct 25, 2017
13,626
Dismissing a campaign like Infinite Warfare's as "all style, no substance," "lackluster" and "barely evolved since COD4" is as disingenuous as it gets, and is basically saying "I don't like it, so it's lackluster." Infinite Warfare is one of the best-made campaigns out there, not just for a COD game but as a campaign in general. That game is more of a Battlestar Galactica game than any actual Battlestar Galactica game, and yet because it's COD it gets called a Michael Bay-esque game with no substance.

Campaigns in COD games take up more than half of the game's budget; if they weren't important to the franchise and if people didn't play them, then they wouldn't be making them. BO4 doesn't have a campaign not because they thought they'd do fine without one, but because the campaign got mismanaged the fuck out of it, and Acti has made it clear that BO4 not having a campaign does not mean anything for future COD games. If BO4 didn't have Blackout it'd have sold way worse than it has due to the lack of a campaign, that I can say with absolute certainty. Having a AAA BR game really helped it.

As for Gears of War, Epic always tried to downplay the MP and push the single-player for those games, because Cliff apparently wanted the series to be considered a serious, story-focused shooter... right up until Gears 3 they kept doing this. So as far as the developers are concerned, their focus was the single-player. Gears Judgment was a game that focused on the campaign and story even more. The Coalition may be a new studio, but it's led by the same guy who made past Gears games, i.e. Rod Fergusson. When they got the reins to the franchise they wanted to show that they understood Gears first, and as such made a game that showed they understood the franchise and its systems properly, especially after the disaster that was Judgment (and also seeing how Halo 4 turned out after a handover). That's why Gears 4 was such a safe game and focused on the MP, where the game's systems are most clearly visible. They also said that after they'd made the safe game they wanted to experiment, which is why you see Gears 5 going a different route and trying something new in SP.

Point being, just because you don't like something does not make it less relevant; that's just dismissive.

Also, the reasoning that franchise handovers lead to irrelevance is a terrible argument. The reason Halo is less relevant today is that there is more competition, not that there was a franchise handover and Bungie stopped making them. These games sell millions and millions of copies; most people who actually buy them don't even know they were made by a different studio, because as far as they're concerned "Microsoft" made Halo. The only studio out there capable of selling games to the mainstream casual market on their name alone is Rockstar. The rest are literally just a name on the box that no one even blinks at, as far as the majority are concerned.
 

TheSyldat

Banned
Nov 4, 2018
1,127
I don't know why you're clouding this so much, it's very simple.
If a game can hit 60fps it will play better at 60fps. Obviously certain games need to trade off, that was never in question.
I'm not clouding this, I'm pointing out that there is a dissonance between what gamers say they want and what the market shows they actually care about.
And the market clearly shows that folks care much more about "the pretties" than they care about consistent high frame rates and therefore animation fluidity.
So again, I repeat: given that currently 4K TV screens can't hit a solid 60 FPS, and that 1080p on a 4K screen looks ugly as shit, what do you actually favor? The "latest and greatest" in TV screen technology, in which case, given that TV screen manufacturers don't give two fucks about games, you deal with the 30 FPS shenanigans; or you guys stop buying 4K screens and stop buying games for "them 4K pretties," and devs can focus on 60 FPS...

Not clouding anything, just stating a fact: you can't have both, so which is it that you actually prefer this generation?
 

astro

Member
Oct 25, 2017
56,902
I'm not clouding this, I'm pointing out that there is a dissonance between what gamers say they want and what the market shows they actually care about.
And the market clearly shows that folks care much more about "the pretties" than they care about consistent high frame rates and therefore animation fluidity.
So again, I repeat: given that currently 4K TV screens can't hit a solid 60 FPS, and that 1080p on a 4K screen looks ugly as shit, what do you actually favor? The "latest and greatest" in TV screen technology, in which case, given that TV screen manufacturers don't give two fucks about games, you deal with the 30 FPS shenanigans; or you guys stop buying 4K screens and stop buying games for "them 4K pretties," and devs can focus on 60 FPS...

Not clouding anything, just stating a fact: you can't have both, so which is it that you actually prefer this generation?

There's no dissonance.... we're talking about the simple fact that if a game can hit 60fps it will play better at 60fps.

That's it, simple fact. Obviously there are reasons why devs choose to not target 60.
 

TheSyldat

Banned
Nov 4, 2018
1,127
There's no dissonance.... we're talking about the simple fact that if a game can hit 60fps it will play better at 60fps.
That's it, simple fact. Obviously there are reasons why devs choose to not target 60.
And said reasons exist because the market clearly shows that people care more about "them pretties, and all that 4K goodness" than they care about frame rate. So the reason is the very dissonance I just highlighted, which you deny exists...
 

astro

Member
Oct 25, 2017
56,902
And said reasons exist because the market clearly shows that people care more about "them pretties, and all that 4K goodness" than they care about frame rate. So the reason is the very dissonance I just highlighted, which you deny exists...

Except none of that is actually contrary to the point: if a game can hit 60fps it will play better at 60fps.

It will also gain fidelity via clarity in motion.

Yes, there are reasons devs don't always target 60fps. That doesn't change anything about the above statements.
 

Revolsin

Usage of alt-account.
Banned
Oct 27, 2017
4,373
You sacrifice a lot more than just resolution aiming for 60. A lot of the game's graphics as a whole need to be downgraded from the ground up to even allow a high framerate.

You simply wouldn't have gotten games anywhere near as beautiful as GoW or Uncharted 4 on PS4 if their baseline was 60.

So the actual question is, will developers make shittier graphics for the sake of a niche that wants 60fps? Hell no.
The games that actually need 60fps to be good, like shooters and fighters, will keep it, but the rest can and should stay away.
 

astro

Member
Oct 25, 2017
56,902
You sacrifice a lot more than just resolution aiming for 60. A lot of the game's graphics as a whole need to be downgraded from the ground up to even allow a high framerate.

You simply wouldn't have gotten games anywhere near as beautiful as GoW or Uncharted 4 on PS4 if their baseline was 60.

So the actual question is, will developers make shittier graphics for the sake of a niche that wants 60fps? Hell no.
The games that actually need 60fps to be good, like shooters and fighters, will keep it, but the rest can and should stay away.

But the trade-off is too much. The loss of clarity in motion and the impact on gameplay make 30fps not worth it, imo.

Of course, this is why I game on PC: I get to experience both.
 

TheSyldat

Banned
Nov 4, 2018
1,127
User Banned (1 Week): Continued hostility and antagonizing other members; previous infractions
Except none of that is actually contrary to the point: if a game can hit 60fps it will play better at 60fps.

It will also gain fidelity via clarity in motion.

Yes, there are reasons devs don't always target 60fps. That doesn't change anything about the above statements.
Okay, since you're apparently that thick and dense, let me spell it out for you.
Both technically and financially you can't have both, so when a game is slated to be a visual flagship for a platform, do expect it not to hit 60 FPS, because they chose to go for the pretties.
So no, games aren't always better off with higher frame rates; it depends on genre, context, whether it's a multiplat or a first-party exclusive, etc.

And yes, targeting 60 FPS as a dev team can be a meaningless and needlessly stressful endeavour for a project that won't benefit from it.
Sorry, but The Witcher 3 won't be any better than it already is if played at 60 FPS; the story will still be as good, the gameplay won't be improved at all, gamers will just experience a slight improvement in game feel, but really nothing that's truly groundbreaking. Targeting 60 FPS is therefore, in this case, an unreasonable demand, plain and simple... Especially given that it's played on screen technologies that are barely able to hit a steady 30 in the first place...


Of course, this is why I game on PC: I get to experience both.
So you don't even have a dog in this race, because you already made your choice loud and clear, and yet you still feel like dogpiling on everyone. Talk about being a jackass here...
And a troll...
 

astro

Member
Oct 25, 2017
56,902
Okay, since you're apparently that thick and dense, let me spell it out for you.
Both technically and financially you can't have both, so when a game is slated to be a visual flagship for a platform, do expect it not to hit 60 FPS, because they chose to go for the pretties.
So no, games aren't always better off with higher frame rates; it depends on genre, context, whether it's a multiplat or a first-party exclusive, etc.

And yes, targeting 60 FPS as a dev team can be a meaningless and needlessly stressful endeavour for a project that won't benefit from it.


You're being needlessly rude, what's the point?

For the third time, I understand you can't have both.

The fact remains: if it can play at 60fps, it will play better at 60fps.

Knowing this, the trade-off for visuals is too high imo.
Sorry, but The Witcher 3 won't be any better than it already is if played at 60 FPS; the story will still be as good, the gameplay won't be improved at all, gamers will just experience a slight improvement in game feel, but really nothing that's truly groundbreaking. Targeting 60 FPS is therefore, in this case, an unreasonable demand, plain and simple... Especially given that it's played on screen technologies that are barely able to hit a steady 30 in the first place...

TW3 is multitudes better at 60+fps; it's a night-and-day difference.
 

Almagest

Member
Oct 28, 2017
1,447
Spain
Last time I checked, some people said they couldn't even tell 30 from 60 apart, so I highly doubt devs are going to hurt one of their major selling points in AAA games (shiny graphics) for a better frame rate.
 

60fps

Banned
Dec 18, 2017
3,492
Sorry, but The Witcher 3 won't be any better than it already is if played at 60 FPS; the story will still be as good, the gameplay won't be improved at all, gamers will just experience a slight improvement in game feel, but really nothing that's truly groundbreaking. Targeting 60 FPS is therefore, in this case, an unreasonable demand, plain and simple... Especially given that it's played on screen technologies that are barely able to hit a steady 30 in the first place...
What?
Of course The Witcher 3 would be WAY better in 60fps. Controlling a character or the camera in 60fps just feels way better for starters, no matter which game. The controls would be more responsive. And you could actually focus on the mountains in the distance while the camera is moving.

30fps is the reason why people watching me play tell me they get headaches when I move the camera too fast.
Except none of that is actually contrary to the point: if a game can hit 60fps it will play better at 60fps.

It will also gain fidelity via clarity in motion.
Agreed, that's a fact.
 

Revolsin

Usage of alt-account.
Banned
Oct 27, 2017
4,373
What?
Of course The Witcher 3 would be WAY better in 60fps. Controlling a character or the camera in 60fps just feels way better for starters, no matter which game. The controls would be more responsive. And you could actually focus on the mountains in the distance while the camera is moving.

30fps is the reason why people watching me play tell me they get headaches when I move the camera too fast.

Agreed, that's a fact.

A Witcher 3 built for 60fps on consoles wouldn't even resemble the game we have currently, let alone be better.

It'd look closer to Skyrim.
 

TheSyldat

Banned
Nov 4, 2018
1,127
TW3 is multitudes better at 60+fps; it's a night-and-day difference.
I played both the PC and the console versions, and the difference between the two is barely worth mentioning...
Yes, Geralt feels a bit snappier, whoop-de-doo... Did it change the storytelling (the main reason you play the game in the first place)? Answer is no. Did it change the mechanics? No. Did having it run at 60 FPS make it feel snappier? Well, yes, but that's about it really...

So again, and for the last time: it depends on your project, what you want to market yourself as, etc. No, 60 FPS is not always a reasonable target to aim at.
 

aisback

Member
Oct 27, 2017
8,739
I think we will get it more next gen.

The CPUs should be able to handle it a lot better than the Jaguar ones we currently have.
 

Qudi

Member
Jul 26, 2018
5,318
I'll go with yes. I remember this kind of question being asked back on NeoGAF with regard to 1080p and the current console gen. And there were a fair number of people who dismissed the idea because they thought it was unrealistic to get things like better textures, effects and higher polycounts while also raising resolution. Turns out the new consoles were more than capable of doing both.

And you also already have the pro and x dipping into offering higher framerates, so it's clearly something that the console makers and developers are thinking about.



Was 30FPS rock solid in Horizon? Because going from mostly PC gaming to that game was incredibly jarring, everything was a blur and controlling it felt like playing in molasses. A friend had a similar experience with GoW, to the extent that he stopped playing it.
I'm a console gamer mostly and I have no problem playing 30FPS games. I can also switch between 30FPS and 60FPS games without any headaches. It depends on the person and whether they've adapted to different framerates, I guess. As for Horizon, it felt very fluid to me and there were almost never framerate dips below 30FPS (see Digital Foundry for proof).
 

astro

Member
Oct 25, 2017
56,902
I played both the PC and the console versions, and the difference between the two is barely worth mentioning...
Yes, Geralt feels a bit snappier, whoop-de-doo... Did it change the storytelling (the main reason you play the game in the first place)? Answer is no. Did it change the mechanics? No. Did having it run at 60 FPS make it feel snappier? Well, yes, but that's about it really...

So again, and for the last time: it depends on your project, what you want to market yourself as, etc. No, 60 FPS is not always a reasonable target to aim at.

I have said it many times now. I have never once claimed that 60fps is always the right move.

I have simply stated that games will play better at 60fps if they can, and TW3 absolutely does.

You're being dishonest claiming it's not. 60fps+ adds so much to TW3's combat. He doesn't "just feel a bit snappier": you get clarity in motion to the point you can read animation tells far more quickly, you get to respond faster because of the higher frame rate and the fact you can see more clearly while moving, and you get to read the environment better as everything stays clear while you move.

I guess you're just going to ignore how rude you were and move along... okay then...
 

60fps

Banned
Dec 18, 2017
3,492
I played both the PC and the console versions, and the difference between the two is barely worth mentioning...
Yes, Geralt feels a bit snappier, whoop-de-doo... Did it change the storytelling (the main reason you play the game in the first place)? Answer is no. Did it change the mechanics? No. Did having it run at 60 FPS make it feel snappier? Well, yes, but that's about it really...

So again, and for the last time: it depends on your project, what you want to market yourself as, etc. No, 60 FPS is not always a reasonable target to aim at.
At 30fps the image becomes blurry as soon as you move the camera. That alone is reason enough why 60fps is always preferable.

That's why we even have a survival horror game like Resident Evil 7 and an RPG like NieR: Automata running at 60fps nowadays. Who would have expected that a couple of years ago? Developers know that more and more people are getting used to 60fps, so this trend will hopefully continue with the next console generation.
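To put a rough number on that motion-blur point: at the same camera pan speed, each frame at 30fps covers twice the angular distance of a frame at 60fps, so the image steps or smears across twice as many pixels. A back-of-envelope sketch, where the field of view, output resolution and pan speed are made-up illustrative values:

```cpp
#include <cstdio>

// Back-of-envelope: how far the image shifts between frames during a camera
// pan, at 30 vs 60 fps. FOV, resolution and pan speed are illustrative values.
int main() {
    const double screen_width_px    = 1920.0;  // assumed 1080p output
    const double horizontal_fov_deg = 90.0;    // assumed field of view
    const double pan_deg_per_s      = 180.0;   // assumed half-turn per second

    const double px_per_degree = screen_width_px / horizontal_fov_deg;

    const double frame_rates[] = {30.0, 60.0};
    for (double fps : frame_rates) {
        double px_per_frame = pan_deg_per_s / fps * px_per_degree;
        std::printf("%5.0f fps: image shifts ~%.0f px between frames\n",
                    fps, px_per_frame);
    }
    // With these numbers: ~128 px per frame at 30fps vs ~64 px at 60fps.
    return 0;
}
```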
 

Deleted member 9584

User requested account closure
Banned
Oct 26, 2017
7,132
I would love to see developers build their games in a way that allows for 30fps ports on Switch (or Switch Plus) and 4K/60fps versions on consoles. This would make the "graphics leap" not that huge, but 60fps across the board would be awesome.
 

Dr. Caroll

Banned
Oct 27, 2017
8,111
Maybe it's because Era is a bit console-oriented, but it's honestly remarkable how many people don't know the difference between being CPU-limited and GPU-limited. There are some graphical effects that are CPU-heavy, such as shadows, but by and large the biggest problem stopping open-world games from being 60fps on consoles is a weak CPU.

Any PC gamer who has tried to play a CPU limited open world game will tell you that no amount of graphics downgrades will significantly improve performance. You can run the game at 640x480 with everything on Low and it won't magically run at 60fps. In fact, it'll probably run exactly the same as it did on High. There is this idea that fancy visual effects onscreen are directly tied to framerate. But in many cases that isn't true.

4K is obscenely wasteful. Games that try to run at 4K or close to it are openly demonstrating their willingness to basically piss the X's GPU resources up the wall. Oh, look at me, I don't even need all this GPU power. I can just blow it rendering a stupidly large number of pixels. Games running at extremely high resolutions but NOT running at 60fps at something like 1080p are a direct consequence of the fact that they've got oodles of GPU resources to waste but only a low-end notebook CPU. Far Cry 5 can render the luscious scenery at basically 4K on consoles, but it can't render a human NPC further than a few hundred meters away, because AI is a gigantic CPU hog. The next consoles having a new CPU architecture will be a game changer.
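A toy model of why dropping resolution doesn't rescue a CPU-limited game: the frame time is set by whichever of the CPU or GPU takes longer, and resolution only shrinks the GPU side. All the timings below are invented for illustration, not measurements from any real title.

```cpp
#include <algorithm>
#include <cstdio>

// Toy model: frame time = max(CPU time, GPU time). GPU time scales roughly
// with pixel count; CPU time (AI, physics, draw submission) does not.
// All numbers are invented for illustration.
int main() {
    const double cpu_ms = 28.0;  // assumed fixed CPU cost per frame

    struct Res { const char* name; double megapixels; };
    const Res resolutions[] = {
        {"4K    (3840x2160)", 8.29},
        {"1080p (1920x1080)", 2.07},
        {"480p  (640x480)",   0.31},
    };

    const double gpu_ms_per_mpixel = 3.5;  // assumed GPU cost per megapixel

    for (const Res& r : resolutions) {
        double gpu_ms   = r.megapixels * gpu_ms_per_mpixel;
        double frame_ms = std::max(cpu_ms, gpu_ms);
        std::printf("%-19s GPU %5.1f ms, CPU %5.1f ms -> %4.1f fps\n",
                    r.name, gpu_ms, cpu_ms, 1000.0 / frame_ms);
    }
    // With these numbers the GPU stops being the bottleneck below 4K, and the
    // frame rate flatlines at ~36 fps no matter how far the resolution drops.
    return 0;
}
```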
After Epic stopped working on Gears, the IP became far less relevant to the general gaming audience. Same thing happened when Bungie stopped working on Halo. When IPs are passed off to different developers, that's generally when people stop caring. There are many people who played the first three Gears/Halo games and then stopped once the original developers left.
And they were replaced by new fans. It happens. There are many people who stopped playing Assassin's Creed with AC3, but the series is still going strong. There have been plenty of game series that changed developers and are still thriving. Bethesda didn't create Fallout. But Fallout 3 was a huge success. As was Fallout 4. Machinegames didn't create Wolfenstein. But their version of Wolfenstein did well.
Gears of War was a killer app for the Xbox 360. That's no longer the case with the Xbox One. Forza is really the only killer app on Xbox these days.
How do you even quantify that? How is Gears 5, a game that will likely sell millions of copies and earn glowing reviews not a (potential) killer app?
No, I'm saying a multiplayer-centric game is one where the majority of players only care about the multiplayer.
The demographics that buy something do not decide what a work is. GoldenEye was not "multiplayer focused" or "multiplayer centric" or whatever word you wanna use. Its MP was added late in development. The game's singleplayer was a landmark for the genre and remains relevant today, long after its MP, a relic of a world where people played MP primarily on couches, has faded.

Where we fundamentally disagree is that you think that random people on the internet are truth itself. If people on the internet believe that a game flopped, it flopped. The sales figures -- the actual knock-on-wood truth of the matter is not important. People on the internet believe Call of Duty: Infinite Warfare was a flop. It was the best selling game of 2016. They believe "everyone hates it because it's garbage". This is not true for obvious reasons.

Your "voice of the people"-esque logic would almost extend to arguing that if enough people believe a videogame is platform exclusive, then it is platform exclusive.

It is possible for a single work to have multiple fanbases. Just look at Resident Evil. Is RE a singleplayer survival horror series or a co-op series? Both answers are true in different contexts. RE5/6/Revelations 2 are in the middle. They are perfectly good singleplayer games that can also be played in co-op.

Trying to use argumentum ad populum to "prove" that Resident Evil is one thing or the other is not a good argument. I know that's just fancy Latin, but taking a step back, your entire argument is that if "people" don't believe a game is singleplayer, then the game is not singleplayer. Words like "tacked on" are semantic nonsense at this point. A game with singleplayer, where the singleplayer is the primary focus of development because it's super hard to make and requires dozens of millions of dollars -- that game is not "MP focused" with a "tacked-on" campaign. MP fans can believe that all they want. They can also believe that the campaign was "probably tossed together in six months" if they want. They can believe "nobody cares about the campaign" all they want. Doesn't make it true.

You know, another example that comes to mind is AC: Odyssey and Kassandra. Kassandra is the canon female protagonist. There are multiple things pointing to it. The novel, the fact her bird is male, the fact Deimos is a male deity so she doesn't fit the role, and stuff like that.

2/3s of AC: Odyssey players chose to play as Alexios instead. Just because 2/3 chose Alexios does not mean that Alexios is the canon protagonist. It does not make AC: Odyssey a "male protagonist focused" game. Some might cite death of the author here, but the audience's opinions or demographics do not change fundamental truths about a work.

If a game developer creates games for PC and then ports those games to console, the game is "PC oriented". This is factual. Beyond dispute. If the console version sells more copies, that doesn't mean the console version is the original version. This is another one of those weird quirks. People often assume that games are console-first and then ported to PC. And their entire logic process is "Well, the console version sold more, therefore most fans must be console gamers, therefore it must be the primary version."
Assassin's Creed used to have a multiplayer mode but the majority of players didn't care about it so Ubisoft eventually scrapped it.
Ubisoft panicked after the release of AC: Unity, which didn't feature PvP MP like its predecessors but featured co-op. AC multiplayer was always popular. AC: Unity switching from PvP to co-op attracted criticism. The game was rushed, unpolished, and this led Ubisoft to rethink their development practices. They are currently considering bringing back multiplayer in future AC titles. Removing it had zilch to do with "players didn't care about it". It was corner cutting.
Fortnite started as a co-op game but the vast majority of players only care about its BR mode so classifying it as a BR game is accurate. The focus of the majority of players is what determines if a game is SP- or MP-centric.
Fortnite and Fortnite: Battle Royale are two different games. You're straying into a weird argument here -- that the game's GENRE is determined by its F2P spinoff. Really? By your logic, every single game that incorporates a Battle Royale mode that becomes popular is now a Battle Royale game in and of itself. You gotta be more nuanced than that. If 343 caved and released an F2P Halo Infinite Battle Royale spinoff, you would then claim that Halo Infinite is a Battle Royale title. This is like saying that if a techno group releases a rock song that is a huge smash, the entire album it comes from is now a "rock album" and they are a "rock band".
In terms of production values, you are absolutely right. In terms of actual gameplay and level design? CoD and BF campaigns are pretty lackluster. They are equivalent to Michael Bay movies: all style and no substance. Rigidly linear and scripted level design, overly reliant on set-pieces, no meaningful weapon or enemy variety, minimal player agency. Glorified shooting galleries, really. They've barely evolved since CoD4 which is why they feel stale and forgettable. CoD and BF campaigns are basically super expensive spectacles but not much else.
I can't help feeling you probably haven't played a Call of Duty campaign or a Battlefield campaign in a very long time. Battlefield: Hardline was a STEALTH game. The polar opposite of most of what you describe. A game where killing people was almost always your choice and thematically discouraged by your role as a straight arrow police officer. Where the player was placed in sort of wide linear environments and given the freedom to play how they wanted. Battlefield V's general genre shifts from one campaign to the next, but they typically focus on combining stealth and shooting depending on the player's choice. Instead of highly scripted set pieces, Battlefield V largely favors moderate sized sandbox environments where the player can approach them however they want. I'm not sure how anyone could play a recent Battlefield campaign and say, "Oh, yea, when I was freely exploring a non-linear environment using stealth? That was rigidly linear and overly reliant on set pieces. Just a glorified shooting gallery with minimal player agency."



Call of Duty, for its part, has been all over the place design and experimentation wise. I think they're ultimately a bit more conservative than DICE, which has led to them pandering to players a bit much. Less willing to rock the boat. The different developers try different ideas. I have a major problem with your argument on this point. You're someone claiming that Call of Duty does not qualify as a singleplayer story-driven cinematic game of any note. And then your list of supposed complaints is indistinguishable from a list of complaints against something like Uncharted or The Last of Us. Do you like ANY cinematic games? At all?

Call of Duty: Black Ops 2 featured a storyline that changed in response to the player's choices, both visible and invisible, with several endings. Anyone who has played Black Ops II will tell you how their mind was blown when they discovered a certain hostage rescue was not in fact scripted to fail; when they realized that a small but effective assortment of player choices like "I will run really fast and maybe I'll get there in time" or "I'll drive more carefully here" had tangible consequences.

Call of Duty: Advanced Warfare cribbed a lot from Crysis 2, offering combat-arena level design with solid verticality in some cases. You were also given gadgets you could deploy at will. With this game, the series tried to let players use tools to solve combat scenarios instead of using those gadgets only in context-sensitive situations. It also had a really good plot and some top-notch set-piece design. (Black Ops 3, which was a mess, but a curious mess, took these ideas of player tools in any combat scenario even further.)

Call of Duty: WWII was a bit too safe for my tastes -- the best mission in the game being the French Resistance mission with a female protagonist, where you infiltrate the Nazi HQ undercover and try to keep your disguise intact before adopting a Wolfenstein-esque approach to stealth -- but it offered huge tactical freedom during combat encounters. Call of Duty hasn't embraced the sandbox like Battlefield has, but the rails are increasingly being rubbed away. This was paired with the studio finally removing regenerating health, something Infinite Warfare had offered in its Specialist difficulty, where you had non-regenerating health, limb damage, and so on. Uncharted 4's influence is quite clear on WWII, in particular in how it handles vehicle sections. Where an older game might have a single linear route and fail you if you went the wrong way, WWII handles it fluidly: you just follow the train, and wherever you drive leads you to the destination. This is a wide-linear, rubber-banding approach to level design in FPS games that feels more organic.

Call of Duty: Infinite Warfare, for its part, is an amazing game. A love letter to Wing Commander that is the closest thing to Squadron 42 you're likely to get from a mainstream developer for the foreseeable future. It had a bunch of novel innovations. A bigger focus on stealth options during missions. The ability to choose side missions from the war table. Flying around in space, dogfighting. It wasn't super complex dog-fighting, but it was fun. It also has fantastic writing and acting courtesy of Brian Bloom.
Battlefield features single-player but I wouldn't be surprised if they just cut it entirely in the future.
Highly unlikely.
Titanfall didn't have single-player until TF2 and it still bombed. The franchise is likely dead now.
Titanfall 1 didn't have a campaign because they ran out of money. The campaign was by far the most expensive part of the game, and they ran into trouble. Titanfall 2's campaign was rather outstanding. And Titanfall 3 is currently in development. Remember -- EA are rather generous. Look at Crysis. They published Crysis. Sold okay. Published Crysis 2. Sold okay, but they were a bit disappointed. But they gave the developers another chance, and that was Crysis 3. Which... tanked super hard, but at least they tried. Respawn are working on both Titanfall 3 and the new Star Wars game.
Several single-player Star Wars games have already been canceled (like 1313 and Visceral's game).
1313 was cancelled by Lucasarts, not EA. Visceral's development hell project has been recycled into another Star Wars game that is presumably also singleplayer. EA are dedicated to singleplayer. They're just savvy enough to realise that audiences aren't hugely fond of 5-10 hour long games. So they either make open world ones or they pair the moderate length game with multiplayer to ensure it sells.
There's long been a prevalent belief that consumers won't accept $60 games that don't include both single-player and multiplayer modes. That's why Tomb Raider 2013 had a multiplayer mode and why Uncharted and TLOU had them as well. In recent years, publishers have become more comfortable going with single-player only games, though that usually translates to "open-world and enough filler to last 100+ hours."
Bubsy 3D had multiplayer. The Bubsy 3D team made a game called Syphon Filter. It didn't have MP. But every single sequel did. Adding multiplayer to game sequels is a super old thing. People just pretended to be surprised when it happened to Tomb Raider, as you mention. I always find it weird how some folks act as though Tomb Raider 2013 having multiplayer was some kind of oddity for the series considering Rise of the Tomb Raider has multiplayer. And so does Shadow of the Tomb Raider. They switched to a form of co-op multiplayer because that obviously makes more sense for the genre, but as the success of those Lara Croft co-op spinoff games back in the day showed, audiences do enjoy co-op Tomb Raiding and puzzle solving. The business logic is sound. Tomb Raider+Friends=Good stuff.
 

Issen

Member
Nov 12, 2017
6,816
It seems to me like many more games and franchises are 60fps even on consoles this gen. Metal Gear, Yakuza 0/Kiwami, F1/DiRT series, most popular AAA MP games... And that's with the huge detriment of the shitty AMD Jaguar CPUs this gen. Even the games that don't hit 60 seem to have a much more solid lock on 30, and we've seen more widespread use of vsync as well.

Not every game will be 60 FPS, but with FreeSync-like technologies, the much stronger CPUs and current market trends I expect the future of higher, smooth framerates is pretty bright.
 
Calabi

OP

Member
Oct 26, 2017
3,484
Even modern-day games won't benefit from a higher frame rate if the rest of the game mechanics aren't tied and timed accordingly.
Look at the nightmare that Durante went through while trying to save the PC port of "Little King's Story"...
Again, if the rest of the underlying design isn't meant to target 60FPS, then pushing the game to 60 won't improve it.
So when the OP says "games are always better with a higher frame rate," no, it isn't true if the game hasn't been designed for it from the ground up (i.e. the decision was made during pre-production, before ANY line of code was typed, that the game was to hit a higher refresh rate).

There are recent examples, even on PC, where a higher frame rate just kills the gameplay if the game wasn't designed accordingly.

You're being very disingenuous with your arguments. I'm not sure there's much point in arguing with you since you've been banned, but I'll just say this: of course, if the game is broken and runs at double speed or something at higher frame rates, then you're right, but if it wasn't broken and was built to handle variable frame rates, then it would be better. The question is why you would build a game that can't work at higher or variable frame rates in this day and age; you'd have to be kind of a bad developer to do that.
 
Oct 25, 2017
11,692
United Kingdom
Current-gen consoles have decent enough GPU power (more so with Pro and X) but weak CPUs, so developers have to make compromises.

Higher graphical settings at 30fps.

vs

Lower graphical settings at 60fps.

The balance of CPU vs GPU is one-sided, so developers have to make compromises when picking between performance and more complex graphics. This is why PCs can run the same multiplatform games at 60fps: because of the much stronger CPUs available.

Thanks to how massively improved the Ryzen CPUs are compared to the old Jaguar, next-gen consoles should have a much better balance of CPU vs GPU power, so 60fps should be way more common while still allowing developers to push high-end graphics.

Obviously there will still be some compromises, as no system has unlimited resources, but there should be fewer compromises going forward.
 

Valdega

Banned
Sep 7, 2018
1,609
And they were replaced by new fans. It happens. There are many people who stopped playing Assassin's Creed with AC3, but the series is still going strong. There have been plenty of game series that changed developers and are still thriving. Bethesda didn't create Fallout. But Fallout 3 was a huge success. As was Fallout 4. Machinegames didn't create Wolfenstein. But their version of Wolfenstein did well.

So you're suggesting that Gears is still as relevant today as it was 10 years ago? Okay. The post-AC3 games weren't passed off to a different developer. But they did make significant changes to the series with the last two entries, mostly because the series was becoming irrelevant with its old formula. Gears is still using the same formula as the first game. The Fallout example is a poor one, as 99% of modern Fallout fans have never even played the first two games. The same is true of Wolfenstein, though the success of Machinegames' entries is highly debatable. TNC definitely flopped commercially.

How do you even quantify that? How is Gears 5, a game that will likely sell millions of copies and earn glowing reviews not a (potential) killer app?

There are several quantifiable metrics you can use. The new Gears games don't make a lot of headlines or forum threads or get a lot of attention on social media. Everyone remembers the TV spot for the original game; not so much for Gears 4. The first two games had a cross-media presence and were often featured in television shows and movies.

You can also look at the Metacritic scores for each game and see a clear downward trend, with Gears 4 having the lowest score of the numbered entries (84 compared to the 94 of the original). Even the remaster of the original game (GoW: Ultimate Edition) only received an 82, providing pretty good evidence that the Gears formula doesn't really have the appeal it used to.

Where we fundamentally disagree is that you think that random people on the internet are truth itself. If people on the internet believe that a game flopped, it flopped. The sales figures -- the actual knock-on-wood truth of the matter is not important. People on the internet believe Call of Duty: Infinite Warfare was a flop. It was the best selling game of 2016. They believe "everyone hates it because it's garbage". This is not true for obvious reasons.

Infinite Warfare was a failure compared to the previous CoD titles. Activision openly stated that the game failed to meet sales expectations because it failed to resonate with fans. As such, I seriously doubt we'll ever see Infinite Warfare 2. It's important to understand that success is relative. A game can sell millions of copies and still be considered a failure if its predecessors sold twice as much. In this case, Infinite Warfare very likely only sold millions because it had "Call of Duty" in the title. That's not to say the game is bad but consumers don't seem to be very interested in its setting.

The demographics that buy something do not decide what a work is. GoldenEye was not "multiplayer focused" or "multiplayer centric" or whatever word you wanna use. Its MP was added late in development. The game's singleplayer was a landmark for the genre and remains relevant today, long after its MP, a relic of a world where people played MP primarily on couches, has faded.

GoldenEye's single-player isn't relevant today at all. It wasn't even that good when it came out. It was impressive by console standards (which were very low) but it definitely didn't hold up for long. The multiplayer had a much more enduring legacy. There's even an HL2 mod called GoldenEye: Source that still has an active player base. When people reminisce about GoldenEye, they're remembering the 4-player split-screen deathmatch escapades they had with their buddies and how Oddjob was so cheap.

I can't help feeling you probably haven't played a Call of Duty campaign or a Battlefield campaign in a very long time. Battlefield: Hardline was a STEALTH game. The polar opposite of most of what you describe. A game where killing people was almost always your choice and thematically discouraged by your role as a straight arrow police officer. Where the player was placed in sort of wide linear environments and given the freedom to play how they wanted. Battlefield V's general genre shifts from one campaign to the next, but they typically focus on combining stealth and shooting depending on the player's choice. Instead of highly scripted set pieces, Battlefield V largely favors moderate sized sandbox environments where the player can approach them however they want. I'm not sure how anyone could play a recent Battlefield campaign and say, "Oh, yea, when I was freely exploring a non-linear environment using stealth? That was rigidly linear and overly reliant on set pieces. Just a glorified shooting gallery with minimal player agency."

You're right, I haven't played a BF campaign since BF4. It does look like they've become more open-ended, which is good. However, based on the footage you provided, it doesn't look like the stealth mechanics have much depth nor does the level design look particularly interesting. Hitman and Dishonored this is not. Also, are all of the levels like this or are these the exceptions?

Trying to use argumentum ad populum to "prove" that Resident Evil is one thing or the other is not a good argument. I know that's just fancy Latin, but taking a step back, your entire argument is that if "people" don't believe a game is singleplayer, then the game is not singleplayer.

For some reason, you keep ignoring the difference between "only" and "centric." The Battlefield games are multiplayer-centric, not multiplayer-only. There's a pretty big difference. "Multiplayer-centric" means the game contains single-player but the focus (i.e. the reason most people play the game) is the multiplayer. You seem to believe that there's no such thing as a focus and that if a game contains both single-player and multiplayer, then the player base is evenly split between them and they both hold equal weight in terms of consumer appeal. That is objectively false. Statistically, CoD and BF are multiplayer-centric games. You may love their single-player campaigns and to you, those may actually be their most appealing parts. However, you are in the minority. Shipping a CoD/BF game without single-player is entirely plausible (as evidenced by BLOPS4). Shipping a CoD/BF game without multiplayer is not. That fact alone tells you everything you need to know about whether a game's focus is single-player or multiplayer.

A perfect example of "focus" is the recent PC port of Sunset Overdrive. It doesn't include the multiplayer mode of the Xbox version. If it were a multiplayer-centric game, PC players would be outraged and the game would be getting review bombed on Steam. But it isn't. The reviews are Very Positive. Why? Because most players don't care about SO's multiplayer. Sunset Overdrive is a SP-centric game (or in the case of the PC version, SP-only).

Call of Duty, for its part, has been all over the place in terms of design and experimentation. I think they're ultimately a bit more conservative than DICE, which has led to them pandering to players a bit much. Less willing to rock the boat. The different developers try different ideas. I have a major problem with your argument on this point. You're someone claiming that Call of Duty does not qualify as a singleplayer story-driven cinematic game of any note. And then your list of supposed complaints is indistinguishable from a list of complaints against something like Uncharted or The Last of Us. Do you like ANY cinematic games? At all?

I can appreciate cinematic adventure games (à la Telltale's work). When it comes to cinematic action games, not so much. But players who do enjoy cinematic action games seem to favor Uncharted and TLOU over CoD and BF, likely due to the superior quality of the characters and narrative.

You know, another example that comes to mind is AC: Odyssey and Kassandra. Kassandra is the canon female protagonist. There are multiple things pointing to it. The novel, the fact her bird is male, the fact Deimos is a male deity so she doesn't fit the role, and stuff like that.

Two-thirds of AC: Odyssey players chose to play as Alexios instead. Just because two-thirds chose Alexios does not mean that Alexios is the canon protagonist. It does not make AC: Odyssey a "male protagonist focused" game. Some might cite death of the author here, but the audience's opinions or demographics do not change fundamental truths about a work.

Kassandra is the canon protagonist of the upcoming novel, not of the game. I don't think the game has a canon protagonist, though one could easily argue that the game is Alexios-centric for the reasons you mentioned. There's also the fact that he's on the box art and the focus of most of the marketing. For all intents and purposes, when most players think of AC:O, they'll think of Alexios, even if a minority of players believe that Kassandra is better.

Titanfall 1 didn't have a campaign because they ran out of money. The campaign was by far the most expensive part of the game, and they ran into trouble. Titanfall 2's campaign was rather outstanding. And Titanfall 3 is currently in development. Remember -- EA are rather generous. Look at Crysis. They published Crysis. Sold okay. Published Crysis 2. Sold okay, but they were a bit disappointed. But they gave the developers another chance, and that was Crysis 3. Which... tanked super hard, but at least they tried. Respawn are working on both Titanfall 3 and the new Star Wars game.

I have my doubts. When it comes to new IP, publishers are generally pretty lenient with sales expectations. However, if the sequel fails to meet expectations, that's typically the end of the IP. Titanfall 2 didn't meet expectations. TF3 could be in development, but I think it's much more likely that Respawn will be focusing entirely on their Star Wars game.

As for Crysis, that series didn't use the traditional publishing model. If I recall correctly, Crytek handled most of the funding themselves with grants from the government and revenue from their engine licensing business. EA mostly just handled marketing and distribution under the EA Partners program (the same thing they did with The Orange Box, L4D2 and Portal 2). The Crysis series isn't really representative of EA's willingness to accept failure.

Ubisoft panicked after the release of AC: Unity, which didn't feature PvP MP like its predecessors but featured co-op. AC multiplayer was always popular. AC: Unity switching from PvP to co-op attracted criticism. The game was rushed, unpolished, and this led Ubisoft to rethink their development practices. They are currently considering bringing back multiplayer in future AC titles. Removing it had zilch to do with "players didn't care about it". It was corner cutting.

Given the fact that each AC game has like ten studios working on it, I seriously doubt the lack of multiplayer is simply the result of Ubisoft being cheap. It's far more reasonable to believe that multiplayer simply isn't a priority because most AC players don't care about it. If they did, we'd still be seeing multiplayer in AC games.

Fortnite and Fortnite: Battle Royale are two different games. You're straying into a weird argument here -- that the game's GENRE is determined by its F2P spinoff. Really? By your logic, every single game that incorporates a Battle Royale mode that becomes popular is now a Battle Royale game in and of itself. You gotta be more nuanced than that. If 343 caved and released an F2P Halo Infinite Battle Royale spinoff, you would then claim that Halo Infinite is a Battle Royale title. This is like saying that if a techno group release a rock song that is a huge smash, the entire album it comes from is now a "rock album" and they are a "rock band".

Fortnite's massive popularity is a result of its BR mode. The overwhelming majority of players play that mode. I think it's safe to assume that the vast majority of players have never even played the original co-op mode. When everyone identifies Fortnite as a BR game (which they do), it's reasonable to call it a BR game.

Adding multiplayer to game sequels is a super old thing. People just pretended to be surprised when it happened to Tomb Raider, as you mention. I always find it weird how some folks act as though Tomb Raider 2013 having multiplayer was some kind of oddity for the series considering Rise of the Tomb Raider has multiplayer. And so does Shadow of the Tomb Raider. They switched to a form of co-op multiplayer because that obviously makes more sense for the genre, but as the success of those Lara Croft co-op spinoff games back in the day showed, audiences do enjoy co-op Tomb Raiding and puzzle solving. The business logic is sound. Tomb Raider+Friends=Good stuff.

You're neglecting to mention that the co-op in RoTR and SoTR is only available in some of the DLC. It's the very definition of tacked on. Also, the Lara Croft spin-offs weren't very successful. That's why Square stopped making them.
 

julia crawford

Took the red AND the blue pills
Member
Oct 27, 2017
35,181
I'd rather they focus on building more inclusive games that tell different kinds of stories and explore different kinds of gameplay.

I'd rather they read a feminist book than the fiftieth edition of Mastering C++, which definitely has a cool drawing of a funny bird on the cover.
 

DvdGzz

Banned
Mar 21, 2018
3,580
Glad I don't care about FPS as much as some. I enjoyed Halo 1 more than CoD and it was 30 fps. Sure, 60 is better but I can easily handle 30 fps.
 

horkrux

Member
Oct 27, 2017
4,719
I never said it was bad. It was decent, though pretty forgettable compared to Duke 3D's. Q1's legacy is its multiplayer. It basically started eSports and popularized modding.

Cool, but I still made an effort to play its campaign 20 years after its release for the first time, and I still had a blast. I'd say it's still one of the best SP shooters you can play.
I wouldn't have taken notice of its campaign if it hadn't been part of its legacy.

I fail to see why it can't be both if the quality is there.
 

Valdega

Banned
Sep 7, 2018
1,609
Cool, but I still made an effort to play its campaign 20 years after its release for the first time, and I still had a blast. I'd say it's still one of the best SP shooters you can play.
I wouldn't have taken notice of its campaign if it hadn't been part of its legacy.

I fail to see why it can't be both if the quality is there.

It can be. You really enjoyed Q1's SP so for you, Q1 is all about the single-player. However, for people that played Quake when it came out, the multiplayer was the most memorable part.
 

Tedmilk

Avenger
Nov 13, 2017
1,909
From what I've been reading, next gen consoles will have a much stronger CPU relative to their GPU than this gen's consoles do, meaning (in theory) a greater likelihood of 60fps titles. In reality... I doubt it.

Having said that, if the base units are targeting 4K, maybe we'll see pro units that focus on frame rate doubling? In that case, count me in.