
Pargon

Member
Oct 27, 2017
12,012
Like, it's simply terrible. It gives that soap opera effect, no matter the game, it just feels fake. Worse, sometimes the animations feel like they're accelerated. This is pretty noticeable in The Last of Us Remastered, Uncharted trilogy remastered, and Kingdom Hearts remastered cutscenes not capped at 30fps.
I'm all for 60fps+ gameplay, but please, devs, keep the cutscenes at 30fps.
Games that either switch to pre-rendered cutscenes which are only rendered at 30, or worse, games like Quantum Break which are real-time but lock the frame rate to 30 FPS, are terrible.

"Soap opera effect" is literally just smooth motion.
It's old men complaining because they have been conditioned to garbage frame rates by spending their lives exclusively watching movies at home on 60Hz displays with 3:2 pulldown, flat panels with judder and no motion resolution at 24Hz, or in theaters which use multi-bladed shutters to increase the refresh rate.
They are set in their ways and have no interest in adapting to better frame rates - so they made up a term to make high frame rates sound like cheap productions.

The irony of the situation is that 24 FPS was never meant to look this bad.
Back when 24 FPS was chosen as the lowest frame rate they could get away with to save money when shooting on reels of film, it was projected at 24Hz with a single-bladed shutter. That meant the films flickered a lot; hence the name "flicks".
Modern projectors use a multi-bladed shutter to display the same frame multiple times and increase the refresh rate from 24Hz to 48Hz or 72Hz.
The problem is that displaying 24 FPS at 48Hz or 72Hz does not just eliminate the flicker. Repeating the frame like this introduces significant judder, and means that motion is no longer as smooth as it's supposed to be.
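
If you want the cadence arithmetic spelled out, here's a rough sketch (just my own illustration, not any projector's actual spec) of how long each 24 FPS frame is held on screen under these schemes:

Code:
# How long each 24 fps film frame stays on screen under a few presentation schemes.

def hold_times(refresh_hz, repeats_pattern):
    """Seconds each source frame is held, given how many refreshes it is
    repeated for (e.g. [3, 2] is classic 3:2 pulldown on a 60 Hz display)."""
    refresh_period = 1.0 / refresh_hz
    return [round(n * refresh_period, 4) for n in repeats_pattern]

# Single-bladed 24 Hz projector: every frame held exactly 1/24 s (but it flickers).
print(hold_times(24, [1, 1]))        # [0.0417, 0.0417]

# 72 Hz triple-flash projector: the cadence is still even (1/24 s per frame),
# but each frame is flashed three times, which reads as judder on objects
# your eye tries to track.
print(hold_times(72, [3, 3]))        # [0.0417, 0.0417]

# 3:2 pulldown on a 60 Hz TV: frames alternate between 3 and 2 refreshes,
# so hold times alternate 50 ms / 33 ms - an uneven cadence, i.e. judder.
print(hold_times(60, [3, 2, 3, 2]))  # [0.05, 0.0333, 0.05, 0.0333]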

If you actually watch 24 FPS content on a single-bladed projector, or on a CRT, DLP, or OLED display using black frame insertion to reduce the effective refresh rate to 24Hz, you will find that motion suddenly becomes incredibly fluid.
It looks as though you just enabled the smoothest interpolation on the display - except there are no interpolation errors, because all you did was reduce the refresh rate by drawing black frames.
The problem is that it also flickers worse than any display you've ever seen - especially if it's bright. It's only watchable at low brightness levels in a dark room.

Interpolation actually restores the smoothness of motion that was originally present with 24 FPS film - and does it without flickering.
The downside is that interpolation is imperfect and may introduce other visual artifacts.

Not a problem I've ever had with any game I've played.
I don't understand why gameplay would be fine but not cutscenes. It's all real-time graphics in a video game. Why wouldn't gameplay also have this soap opera effect?
It's an issue of perception.
You're conditioned that bad frame rate = "cinematic" so when you see cutscenes your brain expects motion to look terrible. It's not actually different from gameplay.

So it's like that awful motion smoothing that our parents insist on having enabled on their TVs for some godforsaken reason.
It's not. Nearly all animation is interpolated in games. It's just rendered out at a higher frame rate. The animation itself is unchanged.
Motion interpolation on TVs has to analyze a 2D image and guess the motion vectors. Games render the in-between frames from the animation data they were given, rather than guessing.
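
For illustration, here's a minimal sketch of what "rendering the in-between frames from the animation data" means in practice (the function names are made up, not any particular engine's API):

Code:
def lerp(a, b, t):
    return a + (b - a) * t

def sample_track(keyframes, time):
    """keyframes: list of (time_in_seconds, value) pairs, sorted by time.
    Returns the interpolated value at an arbitrary render time."""
    if time <= keyframes[0][0]:
        return keyframes[0][1]
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= time <= t1:
            return lerp(v0, v1, (time - t0) / (t1 - t0))
    return keyframes[-1][1]

# A bone rotation keyed at 30 fps (one key every 1/30 s)...
track = [(0.0, 0.0), (1 / 30, 10.0), (2 / 30, 25.0)]

# ...can still be evaluated at 60, 120, or any other render rate,
# with no guessing from 2D images involved:
for frame in range(5):          # pretend we render at 120 fps
    t = frame / 120
    print(f"t={t:.4f}s  angle={sample_track(track, t):.2f}")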

That being said, good interpolation is not inherently bad. Here are the results from a 10-year-old Sony TV:


It may not be perfect, but I don't know how anyone can look at that and say it's not an improvement.
The downside for gaming is that it adds input lag. That TV goes from 56ms in game mode to 89ms with interpolation enabled. It's actually not a bad increase in latency - only 33 ms. The problem is that the base input lag is high.

Let's break the cycle and make 60 fps standard for all media
It should be 120 now that HDMI officially supports 120Hz.
60 is what we've been stuck with since television's inception. It's a relic.

Nier:A cutscenes were in 30fps and better for it
They were pre-rendered garbage.

[…] Gemini Man is a really interesting example of the challenges that Hollywood faces. The HFR is actually not as jarring as the Hobbit movies were, but the quality of the CG occasionally falls below the threshold of acceptability, something that perhaps might not show itself at 24fps. There are a couple of shots that look like Will Smith is in Uncharted 3.
Any of the examples that people use to point out that the sets/make-up/prosthetics/cg "looks terrible in HFR" always looks equally bad in 24 FPS to me. But people blame the frame rate for it.

Oh lord, no. It'll create an effect very similar to what we see with games that have motion capture at 30, but everything else is significantly higher frame rate. You can already see this on some 30i anime and it looks distracting/awkward sometimes. Normally frame rate judder and stuff doesn't bother me, but there are times where the camera movement being much higher than the actual animation frame rate makes it extremely nauseating to me. That's NOT a good idea, especially given a lot of animation, notably Japanese animation, is animated with fewer frames.
That's how most 2D animation is done. The characters animate at a low frame rate, e.g. 12 FPS, while camera pans and similar motion are animated at 24 FPS.
The reason so much 3D animation which tries to imitate 2D looks bad is that they render everything out at 12 FPS instead of only the character animation.
Keeping the character animation at its original frame rate but rendering all the camera pans etc. at 120 would be an improvement over having to use interpolation (though interpolation can help out with the character animation too).
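
To make that concrete, here's a rough sketch (my own illustration, not from any production pipeline) of rendering at a high frame rate while stepping the character animation at 12 FPS and letting the camera move every rendered frame:

Code:
RENDER_FPS = 120
CHARACTER_ANIM_FPS = 12   # stepped, like hand-drawn animation "on twos"

def character_pose_time(render_time):
    """Quantize the sample time so the character only updates 12 times a second."""
    step = 1.0 / CHARACTER_ANIM_FPS
    return (render_time // step) * step

def camera_pan_x(render_time):
    """The camera keeps moving smoothly on every rendered frame."""
    return 100.0 * render_time   # 100 units per second

for frame in range(0, 31, 5):    # a few of the 120 fps render frames
    t = frame / RENDER_FPS
    print(f"t={t:.3f}s  camera_x={camera_pan_x(t):6.2f}  "
          f"character sampled at t={character_pose_time(t):.3f}s")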
 

NeroPaige

Member
Jan 8, 2018
1,708
Jarring when playing at an unlocked framerate and then going to awful 30fps.

Going from 60 fps gameplay to 30 fps cutscenes is super jarring, which is a common sight for me on PC.
Some PC Japanese games (musou titles) also have that awful BINK video compression whereas PS4 reads from the Blu-Ray.

You can download YouTube videos of PS4 cutscenes and they would still have fewer visible artifacts and no micro stutter. The PC ones look like they're 23.976 fps instead of a solid 30 too.
 

faint

Member
Oct 27, 2017
1,156
I agree that both 30+ FPS cutscenes and the drop from 60+ FPS gameplay to a 30 FPS cutscene can look incredibly jarring, but as a previous poster mentioned, I believe the former is due to perception of what we've been conditioned to consider cinematic. It doesn't make sense to me that I prefer gaming at high frame rates yet loathe some cutscenes at an equal frame rate. I think over time I'll get used to it, although I do think TLOU is a great example of a game where cutscenes just look "off".
 

Mullet2000

Member
Oct 25, 2017
5,904
Toronto
I've never understood "soap opera effect" as being a negative. It's the kind of thing that comes off as people being told something is bad, so they continue to repeat that it's bad.

It...never looked bad to me? 48 fps Hobbit doesn't look bad to me? A 60 fps cutscene doesn't look bad to me? I genuinely don't get it.
 

LCGeek

Member
Oct 28, 2017
5,857
You lost the argument to me using the soap opera effect and then talking about animation speed.

You can control the animation speed; most going to 60fps don't. You can have your cake and eat it too, but sadly most in control of that aspect don't bother to simply adjust for the speed of 60fps.

The actual soap opera effect can be lessened by turning crappy features on your HDTV off.

You don't say the same about your eyes seeing much more temporal motion outside of video games; it's mind-boggling that anyone rational has this argument with others or themselves.

I'm with users who say it's jarring to be playing games at one speed and then have cutscenes at another.

Cutscenes at 60fps are no different than watching movies or a show at 60fps. It just doesn't look natural so I agree. Obviously gameplay at 60fps is no problem.

This will always be wrong. Games don't get the benefit of reality scale and downsampling.
 

PurestGamer78

Member
Oct 27, 2017
210
You lost the argument to me using the soap opera effect and then talking about animation speed.

You can control the animation speed; most going to 60fps don't. You can have your cake and eat it too, but sadly most in control of that aspect don't bother to simply adjust for the speed of 60fps.

The actual soap opera effect can be lessened by turning crappy features on your HDTV off.

You don't say the same about your eyes seeing much more temporal motion outside of video games; it's mind-boggling that anyone rational has this argument with others or themselves.

I'm with users who say it's jarring to be playing games at one speed and then have cutscenes at another.



This will always be wrong. Games don't get the benefit of reality scale and downsampling.

You're entitled to your own opinion. A cutscene is literally like watching a movie or show. People don't think 60fps looks good in movies. Why would a cutscene be any different? The way 60fps doesn't look natural in movies is the same with some cutscenes.
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,683
Any of the examples that people use to point out that the sets/make-up/prosthetics/cg "looks terrible in HFR" always looks equally bad in 24 FPS to me. But people blame the frame rate for it.

Exactly, it's not the frame rate that is the issue, it's that you've taken something that looks like it doesn't belong in real life and made it look more real.

I remember when the BBC switched to HD broadcasting, the presenters complained that their makeup looked terrible. Basically it was, and the little bit of extra "Vaseline on the lens" softness that the older cameras and lower resolution provided simply hid it.
Makeup artists started having to try harder and change the techniques they had been using for decades.

To me it is interesting that we chase "photorealism" but still leave fundamental components of what makes something look real or not by the wayside.
60fps not looking right or not looking natural is actually people saying they prefer things to look unnatural and wrong.
 

Mivey

Member
Oct 25, 2017
17,825
I agree OP.
Everything, from cutscenes to actual game play, should be 24 FPS, with lots of film grain to make everything appear natural. Only then can you have the true cinematic experience.
The human eye can't see more than 30 FPS anyway.
 

60fps

Banned
Dec 18, 2017
3,492
Uh.

Saying 30 is better than a (consistent) 60 is a silly, arbitrary point. If you're not aiming for parity with gameplay and as high as the hardware can push, why 30? Why not 15? I mean, halving it once made it more cinematic, why not do it again? Hell, why not bring it down to 7.5? That's like, at least four times as cinematic as 30fps, right?
I'm all for 30spf (seconds per frame). This gives us all the time to really appreciate those super crisp 4k, or better, 8k images.
 
OP
Swift_Gamer

Banned
Dec 14, 2018
3,701
Rio de Janeiro
Some PR team or persons making premature excuses for gen 9 being 30fps again?
What?
You're entitled to your own opinion. A cutscene is literally like watching a movie or show. People don't think 60fps looks good in movies. Why would a cutscene be any different? The way 60fps doesn't look natural in movies is the same with some cutscenes.
I really don't think it looks good because it's more lifelike and breaks the illusion. It seems fake and staged. I really don't like it. The Hobbit at 48fps was terrible.
 

LCGeek

Member
Oct 28, 2017
5,857
You're entitled to your own opinion. A cutscene is literally like watching a movie or show. People don't think 60fps looks good in movies. Why would a cutscene be any different? The way 60fps doesn't look natural in movies is the same with some cutscenes.

Yes, I'm entitled to my opinion, thank you for reminding me of that.

Like isn't good enough in a discussion like this, and I'm pointing out a huge reason why. It's insanely misguided to apply aspects of a fake rendered scene to something that has been captured from reality and post-processed.

All media based on reality gets the benefits of reality versus being rendered. For instance, you don't need to animate life; it's done for you, and with cameras that shoot thousands of fps you can see there is a lot there. You would have to do the same thing for a rendered scene and then speed it up properly.

I'm not saying 60fps doesn't look janky; I'm saying it looks janky for these reasons. Yet instead of being a whiny consumer, I want devs or movie makers to do a better job post-processing or adjusting for the speed, which is possible.
 

headspawn

Member
Oct 27, 2017
14,608
Yall ever think that maybe 30fps cutscenes are too gamey and maybe 24fps is the sweet spot?
 

Kanann

Banned
Oct 25, 2017
2,170

If they are not "Phil Spencer" (2019 version), they will always look for a chance to say:

"look! 30fps is ok, they don't care"
"we can 30fps forever because they support us!"
"spider-man!"

They should go watch black and white 12fps films or animations if they want a nightmare that bad.


People are starting to buy 120Hz TVs.
Phones are regularly 90Hz or 120Hz in gaming.

and they just.....just.....sigh
 

Asbsand

Banned
Oct 30, 2017
9,901
Denmark
Like, it's simply terrible. It gives that soap opera effect, no matter the game, it just feels fake. Worse, sometimes the animations feel like they're accelerated. This is pretty noticeable in The Last of Us Remastered, Uncharted trilogy remastered, and Kingdom Hearts remastered cutscenes not capped at 30fps.
I'm all for 60fps+ gameplay, but please, devs, keep the cutscenes at 30fps.
Worst of all, all those pre-rendered cutscenes used to mask loading and transitions on consoles become sub-native 30fps cutscenes on PC. Playing Mass Effect 2 and 3 was sometimes egregious because they intermingled pre-rendered with in-game so much.

That Genophage Cure part everyone likes was weak as hell to me because of how artificial everything felt from a technical POV. I later played the game on PS3. It was laggier, but the resolution and the framerate were in sync for that whole setpiece and it had a much more profound effect.

Higher FPS =/= Artistic Result
 
Aug 17, 2018
65
pffff haha no, fuck 30fps wherever it may appear. Switching from 60 or 165 fps gameplay to a 30 fps cutscene is just so immersion-breaking and annoying.
 
OP
Swift_Gamer

Banned
Dec 14, 2018
3,701
Rio de Janeiro
If they are not "Phil Spencer" (2019 version), they will always look for a chance to say:

"look! 30fps is ok, they don't care"
"we can 30fps forever because they support us!"
"spider-man!"

They should go watch black and white 12fps films or animations if they want a nightmare that bad.


People are starting to buy 120Hz TVs.
Phones are regularly 90Hz or 120Hz in gaming.

and they just.....just.....sigh
But 30fps is absolutely ok. Doesn't mean I don't like or want 60fps.
 

Łazy

Member
Nov 1, 2017
5,249
30fps scenes interpolated to 60fps (what occurs in The Last of Us Remastered, and on TVs that have a Smooth Motion mode) is bad; nobody is denying that... that's the only place where I see a logical reason someone would say the cutscenes should be 30 because 60 looks bad.

Now you agree with the OP.
Which means there are at least a few 60+fps games you have played where, when the cutscenes roll out at 60+fps, you didn't like the experience.
So now I would like to know what these games are so I can test them in case it's the games that are borked... maybe you are blanket agreeing to something that only affects one game.

Also note that this whole quote chain started from this post v



So since it looks so wrong, I wanted to see which games you had played that had bad 60+fps cutscenes.
People just post but have nothing to back up what they are saying... even if it's your own opinion... you formed your opinion on something... I want to see that something, maybe we are on the same side.
Looks so wrong... obviously to me.

Wtf, I don't have to back up my tastes wow... I like what I want.
I can't believe it...
 

Black_Stride

Avenger
Oct 28, 2017
7,388
Looks so wrong... obviously to me.

Wtf, I don't have to back up my tastes wow... I like what I want.
I can't believe it...

As I said, your tastes are yours, but if you are going to post them on a public forum you can't then say "oh, but I'm not going to tell you about it."
You opened yourself up to scrutiny by posting.

It's not like I'm even asking you to justify your opinion.

You are failing to mention even one 60fps game that looks wrong (according to your opinion). To have formulated your opinion and tastes you would have had to actually play games with said issues.....
Clearly you don't actually have anything to add to the topic, so I will just leave it at that until you can post at least one non-interpolated 60fps game which has the effect being discussed here.
 

Pargon

Member
Oct 27, 2017
12,012
Exactly, it's not the frame rate that is the issue, it's that you've taken something that looks like it doesn't belong in real life and made it look more real.

I remember when the BBC switched to HD broadcasting, the presenters complained that their makeup looked terrible. Basically it was, and the little bit of extra "Vaseline on the lens" softness that the older cameras and lower resolution provided simply hid it.
Makeup artists started having to try harder and change the techniques they had been using for decades.

To me it is interesting that we chase "photorealism" but still leave fundamental components of what makes something look real or not by the wayside.
60fps not looking right or not looking natural is actually people saying they prefer things to look unnatural and wrong.
I don't just mean that higher frame rates are more revealing, as higher resolutions were too.
I mean that many of the examples used simply look bad and frame rate is not even a factor.

Here's an article that was doing the rounds a couple of years ago criticizing the way HDR and tone mapping are used in games, which calls out The Hobbit as an example of HDR gone wrong in movies, producing an ugly "videogamey" look.
https://ventspace.wordpress.com/2017/10/20/games-look-bad-part-1-hdr-and-tone-mapping/ said:
And so we run head first into the problem that plagues games today and will drive this series throughout: at first glance, these are all very pretty 2017 games and there is nothing obviously wrong with the screenshots. But all of them feel videogamey and none of them would pass for a film or a photograph. Or even a reasonably good offline render. Or a painting. They are instantly recognizable as video games, because only video games try to pass off these trashy contrast curves as aesthetically pleasing. These images look like a kid was playing around in Photoshop and maxed the Contrast slider. Or maybe that kid was just dragging the Curves control around at random.
The funny thing is, this actually has happened to movies before.

[Image: screenshot of Smaug from The Hobbit]


Hahaha. Look at that Smaug. He looks terrible. Not terrifying. This could be an in-game screenshot any day. Is it easy to pick on Peter Jackson's The Hobbit? Yes, it absolutely is. But I think it serves to highlight that while technical limitations are something we absolutely struggle with in games, there is a fundamental artistic component here that is actually not that easy to get right even for film industry professionals with nearly unlimited budgets.
That has nothing to do with frame rate at all, but people used the bad CG as an example of HFR "making the effects look bad".
No: it looks bad even in a screenshot. It's not that the movie looked good at 24 FPS and the illusion was broken with a 48 FPS presentation.

I haven't seen the full movie, but have seen clips from Billy Lynn's Long Halftime Walk which look terrible because of the camera movement and direction that Ang Lee used.
Instead of shooting it like a normal movie, just at a higher frame rate, it has this amateurish direction and does things like fast camera pans back and forth simply because it's possible with the higher frame rate.
 
Oct 27, 2017
1,665
Nope, 60fps isn't terrible for cutscenes. 30fps is what looks terrible. Everything should be 60fps at the minimum, ideally 144fps+.
 

Ninjatogo

Member
Oct 28, 2017
229
Can we make this a poll to see what people think? I, for one, prefer a higher frame rate for every piece of media I consume, and I'm sure there are quite a few like that on here.
 

merchantdude

Member
Oct 29, 2017
276
No. 60fps for cutscenes is perfectly fine. Switching framerate between cutscenes and gameplay on the other hand would be super distracting.
 

Nerdkiller

Resettlement Advisor
Member
While we're on the topic of 30fps scenes and 60fps gameplay, can I just bring up that Platinum should do away with prerendered scenes altogether? It's bad enough that they're 30, but to see compression artifacts while the gameplay has pristine image quality really irritates me.

Oddly enough, I remember Hitman 2: Silent Assassin on consoles having the inverse issue. Gameplay was 30, while cutscenes were 60.
 

60fps

Banned
Dec 18, 2017
3,492
Injustice 2 switches from 60fps gameplay to 30fps special moves cutscenes on the fly. Looks super weird.

It's just time to get rid of this halfway house called 30fps altogether.
 

Evil Lucario

Member
Feb 16, 2019
448
Badly directed cutscenes are really the problem, not the framerate. Actually, I'd argue that higher framerates *enhance* cutscenes.

I just finished Code Vein recently and while I loved the game (and even surprisingly enjoyed the story), you can tell that they didn't put much money into the cutscenes. In that case, 30fps versus 60fps or higher would not have made a difference in cutscene enjoyment.

On the flipside, DMC5 is uncapped and it looks extremely pristine.
 

Tokklyym

Member
Oct 28, 2017
276
I haven't seen this discussion before and it's great. I wonder, if you added an age component to this conversation, whether the max-fps-at-all-times crew would trend younger.

I think this is an artistic choice and not a technical one. If the designers want their story scenes to emulate film, then 24-30 fps for sure. It can be another way to mark the stop in play other than the actual stop in play.

But I get why people think their game is broken when they see 30fps cut scenes. I disagree, but I understand. It's like people who used to say letterbox covered up most of the screen.
 
OP
Swift_Gamer

Banned
Dec 14, 2018
3,701
Rio de Janeiro
What he has been conditioned to accept is what movies are supposed to look like.

Let's be real, the preference for lower framerates doesn't stem from any rational reason, but from the fact that it was the standard we've become accustomed to.
This makes no sense whatsoever.
We were conditioned to like small TVs and sub-240p resolutions for movies, but now everyone wants 4K. So if your argument held any water, people would still prefer lower resolutions for movies since that's what they were used to before.
Sorry, nope. High fps for anything that's not sports or games simply looks bad because it makes everything look fake and cheap. It just comes with the territory of making things look lifelike. God, those talk shows that are 60fps are cringe. And I hate when people film and publish their videos at 60fps. It's simply bad.
 
Oct 25, 2017
14,741
I haven't seen this discussion before and it's great. I wonder, if you added an age component to this conversation, whether the max-fps-at-all-times crew would trend younger.

I think this is an artistic choice and not a technical one. If the designers want their story scenes to emulate film, then 24-30 fps for sure. It can be another way to mark the stop in play other than the actual stop in play.

But I get why people think their game is broken when they see 30fps cut scenes. I disagree, but I understand. It's like people who used to say letterbox covered up most of the screen.
I can handle 30fps cutscenes just fine, but for 24 I'm gonna need some real evidence of any benefit in order to respect such a decision.

What exactly do computer-generated graphics gain from 24fps that could possibly be worth the stutter whenever the camera moves on fixed 60Hz displays?
 

Pokemaniac

Member
Oct 25, 2017
4,944
Having the frame rate drop during cutscenes is super jarring. It's even worse when the cutscenes are actually pre-rendered videos at a lower resolution than what you're playing at, which I saw a bunch when I played NieR: Automata on PC recently.
 

Linus815

Member
Oct 29, 2017
19,765
I can handle 30fps cutscenes just fine, but for 24 I'm gonna need some real evidence of any benefit in order to respect such a decision.

What exactly do computer-generated graphics gain from 24fps that could possibly be worth the stutter whenever the camera moves on fixed 60Hz displays?

It's also missing the main reason for the filmic look - the motion blur. Not even the highest-quality motion blur in PC graphics can emulate that of film. 24 fps in games will NEVER look as "good" as it does for film.
 

Serious Sam

Banned
Oct 27, 2017
4,354
Like, it's simply terrible. It gives that soap opera effect, no matter the game, it just feels fake. Worse, sometimes the animations feel like they're accelerated. This is pretty noticeable in The Last of Us Remastered, Uncharted trilogy remastered, and Kingdom Hearts remastered cutscenes not capped at 30fps.
I'm all for 60fps+ gameplay, but please, devs, keep the cutscenes at 30fps.
No.