i understand the difference between delta time and fixed timestep, don't worry. yes, fixed timestep games slow down rather than drop frames, which makes them more manageable when performance is inconsistent, and most old 2D games do this (it's a simpler way to develop, with no real downsides on fixed hardware, after all). but this thread isn't even focused on inconsistent performance: 2D _Metal Slug_ is a fixed timestep game that _targets_ 30fps, and 3D _Doom_ "caps" at 35fps (updating every other frame of VGA mode 13h IIRC).
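to make the distinction concrete, here's a rough sketch (my own toy example, not code from any actual game or engine) of why a fixed timestep game slows down under load while a delta time game keeps correct game speed:

```python
# Toy comparison: how one second of wall-clock time plays out when the
# machine only manages 15 fps instead of the targeted 30 fps.
# (Illustrative only - real game loops also handle input, rendering, etc.)

TARGET_DT = 1 / 30   # the tick length a fixed-timestep game assumes
SPEED = 10.0         # units moved per second of *game* time

def delta_time_sim(frame_times):
    """Scale movement by the measured frame time: game speed stays correct."""
    pos = 0.0
    for real_dt in frame_times:
        pos += SPEED * real_dt          # uses the real elapsed time
    return pos

def fixed_timestep_sim(frame_times):
    """Advance a constant dt per frame: if frames run long, the game slows."""
    pos = 0.0
    for _ in frame_times:
        pos += SPEED * TARGET_DT        # assumes the target rate was hit
    return pos

slow_frames = [1 / 15] * 15             # one real second at 15 fps

print(delta_time_sim(slow_frames))      # ~10.0: a full second of game time
print(fixed_timestep_sim(slow_frames))  # ~5.0: the game ran at half speed
```

same wall-clock second, but the fixed-timestep version only advanced half a second of game time - that's the slowdown-instead-of-dropped-frames behavior.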
finding 30fps to be a trait that makes a video game _inherently unplayable_ means i probably won't relate to your tastes in the slightest.
With _Metal Slug_ you only have 2D camera movement, and it scrolls at an appropriate pace for 30 FPS.
Much of the time spent in those games is on a static screen rather than a scrolling one.
As for _Doom_, I know quite a few people that couldn't play it without getting bad motion sickness until source ports that ran the game at higher frame rates were widely available.
I was never able to get into _Doom_ at all for that reason, but I loved _Quake_.
But people are generally referring to modern 3D games when they say that 30 FPS is "unplayable". My point was that older games were not built the same.
Some of the few 3D games running at 30 FPS that I have been able to return to are the original _Tomb Raider_ trilogy.
Player movement speed and camera motion are controlled in a way that avoids many of the issues that cause bad motion sickness for me in modern 30 FPS games - even though they don't have motion blur to smooth things over.
At least, it was okay for me the last time I tried to return to it. I only played a level or two rather than trying to complete it, and may have a different reaction today.
I think the fact that it's 4:3 helps a lot.
I used to be able to just tolerate _Katamari Damacy_ in 4:3, but playing the newer releases in 16:9 or 21:9 makes me severely motion sick now. Like day-ruining motion sickness.
It's probably something similar to this effect: cover the center of the image and it appears to move much faster; cover the edges and it appears to move much slower.
I think that's been a factor in my increasing intolerance for 30 FPS over time.
And when I say "intolerance" I don't mean "I don't like it", I mean that I get motion sickness to varying degrees of severity.
I don't presume to understand everything in play here though.
At higher frame rates, I found that switching to an ultra-wide monitor helped with motion sickness due to the wider view.
All I can say for certain is that I've had to avoid most 30 FPS games for at least a couple of decades now, and I never owned a PlayStation, Nintendo 64, or Saturn - so I skipped over most of that bad early 3D.
Mid-gen games on the original Xbox are when I remember it starting to get particularly bad, which is probably linked to the growing popularity of widescreen support, now that I think about it (I had a widescreen CRT).
Display size is a factor as well, and that's also the time that TVs started to get a lot bigger.
30 FPS is just not suitable for modern games or displays any more.
60 FPS should be the absolute minimum, and ideally they would be pushing for 90–120.
Nope, no motion smoothing, and game mode is on. Like I said, it's the same input my PS4 Pro used.
Have you actually gone back and done an A/B comparison, or is it that you feel it's worse now on the PS5 than you remember on the PS4 Pro?
Because it could simply be that you have already started to adapt to 60 FPS now that it's an option.