Games that either switch to pre-rendered cutscenes rendered at only 30 FPS, or worse, games like Quantum Break which are real-time but lock the frame rate to 30 FPS, are terrible.

Like, it's simply terrible: cutscenes that run above 30 FPS give that soap opera effect, and no matter the game, it just feels fake. Worse, sometimes the animations feel like they're accelerated. This is pretty noticeable in The Last of Us Remastered, the Uncharted trilogy remasters, and the Kingdom Hearts remasters, in the cutscenes that aren't capped at 30 FPS.
I'm all for 60fps+ gameplay, but please, devs, keep the cutscenes at 30fps.
"Soap opera effect" is literally just smooth motion.
It's old men complaining because they have been conditioned to garbage frame rates by spending their lives exclusively watching movies at home on 60Hz displays with 3:2 pulldown, flat panels with judder and no motion resolution at 24Hz, or in theaters which use multi-bladed shutters to increase the refresh rate.
They are set in their ways and have no interest in adapting to better frame rates - so they made up a term to make high frame rates sound like cheap productions.
The irony of the situation is that 24 FPS was never meant to look this bad.
Back when 24 FPS was chosen as the lowest frame rate they could get away with to save money when shooting on reels of film, it was projected at 24Hz with a single-bladed shutter. That meant the films flickered a lot; hence the name "flicks".
Modern projectors use a multi-bladed shutter to display the same frame multiple times and increase the refresh rate from 24Hz to 48Hz or 72Hz.
The problem is that displaying 24 FPS at 48Hz or 72Hz does not just eliminate the flicker. Repeating the frame like this introduces significant judder, and means that motion is no longer as smooth as it's supposed to be.
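To make the cadence concrete, here's a rough sketch of how each 24 FPS frame maps onto common refresh rates (just illustrative arithmetic, nothing official):

```python
# Rough illustration: how many refresh cycles each 24 FPS film frame occupies
# at common refresh rates. 60Hz can't divide 24 evenly, which is where the
# alternating 3:2 pulldown cadence comes from; 48Hz and 72Hz repeat every
# frame the same number of times via the multi-bladed shutter.

def repeat_pattern(film_fps, refresh_hz):
    per_frame = refresh_hz / film_fps          # refresh cycles per film frame
    lo = int(per_frame)
    hi = lo if per_frame == lo else lo + 1     # alternate between lo and hi repeats
    return lo, hi

for hz in (60, 48, 72):
    lo, hi = repeat_pattern(24, hz)
    if lo == hi:
        print(f"{hz}Hz: every frame shown {lo}x (even cadence, frames repeated by the shutter)")
    else:
        print(f"{hz}Hz: frames alternate {hi}x/{lo}x repeats ({hi}:{lo} pulldown, uneven cadence)")
```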
If you actually watch 24 FPS content on a single-bladed projector, or on a CRT, DLP, or OLED display using black frame insertion to reduce the effective refresh rate to 24Hz, you will find that motion suddenly becomes incredibly fluid.
It looks as though you just enabled the smoothest interpolation on the display - except there are no interpolation errors, because all you did was reduce the refresh rate by drawing black frames.
The problem is that it also flickers worse than any display you've ever seen - especially if it's bright. It's only watchable at low brightness levels in a dark room.
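For what it's worth, here's the back-of-the-envelope version of why BFI at an effective 24Hz flickers so hard (assuming a 120Hz panel; the numbers are purely illustrative):

```python
# Sketch: black frame insertion on an assumed 120Hz panel, mimicking a
# single-bladed shutter. Each 24 FPS frame gets 1 lit refresh and 4 black
# ones, so the image pulses at 24Hz with a 20% duty cycle -- very fluid
# motion, but flicker that's only tolerable at low brightness.

refresh_hz = 120
film_fps = 24

cycles_per_frame = refresh_hz // film_fps   # 5 refresh cycles per film frame
lit_cycles = 1                              # only one of them shows the image

duty_cycle = lit_cycles / cycles_per_frame
print(f"image lit {duty_cycle:.0%} of the time, pulsing at {film_fps}Hz")
```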
Interpolation actually restores the smoothness of motion that was originally present with 24 FPS film, and does it without flickering.
The downside is that interpolation is imperfect and may introduce other visual artifacts.
It's an issue of perception.

Not a problem I've ever had with any game I've played.
I don't understand why gameplay would be fine but not cutscenes. It's all real-time graphics in a video game. Why wouldn't gameplay also have this soap opera effect?
You're conditioned that bad frame rate = "cinematic" so when you see cutscenes your brain expects motion to look terrible. It's not actually different from gameplay.
It's not. Nearly all animation is interpolated in games. It's just rendered out at a higher frame rate. The animation itself is unchanged.

So it's like that awful motion smoothing that our parents insist on having enabled on their TVs for some godforsaken reason.
Motion interpolation on TVs has to analyze a 2D image and guess the motion vectors. Games render the in-between frames from the animation data they were given, rather than guessing.
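As a rough sketch of what that means in practice (made-up keyframe data, not any particular engine's API): animation is stored as keyframes, and the engine simply samples them at whatever rate it happens to be rendering.

```python
# Illustrative only: animation stored as (time, value) keyframes can be
# sampled at any output frame rate. No motion vectors are guessed -- the
# in-between values come straight from the animation data.

def sample(keyframes, t):
    """Linearly interpolate a sorted list of (time, value) keyframes at time t."""
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)
            return v0 + a * (v1 - v0)
    return keyframes[-1][1]

# A made-up joint rotation keyed at 30 FPS...
keys = [(0.0, 0.0), (1 / 30, 5.0), (2 / 30, 12.0)]

# ...sampled back out at 30, 60, or 120 FPS:
for fps in (30, 60, 120):
    frames = fps * 2 // 30                  # output frames over the clip
    print(fps, [round(sample(keys, i / fps), 2) for i in range(frames + 1)])
```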
That being said, good interpolation is not inherently bad. Here are the results from a 10-year-old Sony TV:
It may not be perfect, but I don't know how anyone can look at that and say it's not an improvement.
The downside for gaming is that it adds input lag. That TV goes from 56ms in game mode to 89ms with interpolation enabled. It's actually not a bad increase in latency - only 33 ms. The problem is that the base input lag is high.
It should be 120 now that HDMI officially supports 120Hz.
60 is what we've been stuck with since television's inception. It's a relic.
They were pre-rendered garbage.
Any of the examples that people use to point out that the sets/make-up/prosthetics/CG "looks terrible in HFR" always look equally bad in 24 FPS to me. But people blame the frame rate for it.

[…] Gemini Man is a really interesting example of the challenges that Hollywood faces. The HFR is actually not as jarring as the Hobbit movies were, but the quality of the CG occasionally falls below the threshold of acceptability, something that perhaps might not show itself at 24 FPS. There are a couple of shots that look like Will Smith is in Uncharted 3.
That's how most 2D animation is done. The characters animate at a low frame rate, e.g. 12 FPS, while camera pans and similar motion are animated at 24 FPS.

Oh lord, no. It'll create an effect very similar to what we see with games that have motion capture at 30 FPS but everything else at a significantly higher frame rate. You can already see this in some 30i anime and it looks distracting/awkward sometimes. Normally frame rate judder and stuff doesn't bother me, but there are times where the camera movement running at a much higher rate than the actual animation makes it extremely nauseating to me. That's NOT a good idea, especially given that a lot of animation, notably Japanese animation, is animated with fewer frames.
The reason so much 3D animation that tries to imitate 2D looks bad is that they render everything out at 12 FPS instead of only the character animation.

Keeping the character animation at its original frame rate but rendering all the camera pans etc. at 120 would be an improvement over having to use interpolation (though interpolation can help with the character animation too).
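A minimal sketch of that split (hypothetical numbers and function names, just to show the idea):

```python
# Hypothetical sketch: render at a high output rate, but quantize only the
# character animation clock to 12 FPS ("on twos"), while the camera pan is
# sampled at the full output rate. The stepped 2D-style character motion is
# preserved; the pan stays smooth.
import math

OUTPUT_FPS = 120      # assumed display/render rate
CHAR_FPS = 12         # traditional character animation cadence

def character_time(t):
    # Snap to the most recent 12 FPS animation frame.
    return math.floor(t * CHAR_FPS) / CHAR_FPS

def camera_pan_x(t):
    # Made-up pan: 100 units per second, sampled continuously.
    return 100.0 * t

for i in range(12):   # first handful of output frames
    t = i / OUTPUT_FPS
    print(f"t={t:.4f}s  camera_x={camera_pan_x(t):6.2f}  character_anim_t={character_time(t):.4f}")
```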