Games will always push graphics, and framerate will suffer. Things won't be much different from this generation, IMO. Variable refresh rate support will be used as a crutch to let unlocked-framerate games run above 30 while almost never reaching 60 - so I guess get used to 45fps games.
Even that would be a significant improvement.
I think the comparison to PC he drew is about giving us options: a 4K/30fps mode or a 1080p-1440p/60fps mode. But they need hardware capable of doing that seamlessly, which is what he's getting at.
That's not how this works, and is exactly the problem that this generation suffers from.
Resolution is mostly GPU-bound, while framerate is a combination of CPU and GPU; these days it's mostly the CPU that is the limiting factor.
That's precisely why you don't have games which run at 4K30 presenting 1080p60 options.
1080p60 only requires 1/2 the GPU power of 4K30, but also requires ≥2x the CPU speed.
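The arithmetic behind that claim can be sketched quickly. This assumes GPU cost scales roughly with pixels per second and CPU cost is paid per frame - a simplification, since real workloads don't scale perfectly linearly:

```python
# Rough arithmetic behind the 4K30 vs 1080p60 comparison.
# Assumption: GPU load ~ pixels/second, CPU load ~ frames/second.

def pixels_per_second(width, height, fps):
    return width * height * fps

gpu_4k30 = pixels_per_second(3840, 2160, 30)     # 4K at 30 FPS
gpu_1080p60 = pixels_per_second(1920, 1080, 60)  # 1080p at 60 FPS

print(gpu_1080p60 / gpu_4k30)  # 0.5 -> half the GPU pixel throughput

# CPU work (game logic, draw-call submission) is paid once per frame,
# so doubling the framerate halves the per-frame time budget:
print(round(1000 / 30, 1))  # 33.3 ms frame budget at 30 FPS
print(round(1000 / 60, 1))  # 16.7 ms frame budget at 60 FPS
```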
I have a Ryzen 1700X CPU running at 4 GHz, which is far more than double the speed of the CPU in the current gen consoles, but that is not enough to brute-force all games to run at 60 FPS.
I don't believe there's a CPU fast enough to do that for all current-gen games, not even an i7-8700K running at 5GHz.
Part of this is poor optimization of PC ports by many developers, but there are also general issues with games not being well multi-threaded.
CPUs are hitting a limit of how fast an individual core can be, and are now getting faster mostly by adding more cores.
Some games are very well optimized and performance continues to scale up as you give them more cores, but the majority of games are still reliant on single-threaded performance. Few games actually use more than ~20% of my Ryzen CPU's capabilities.
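An Amdahl's-law sketch shows why extra cores barely help a mostly single-threaded game. The parallel fractions below are made-up illustrative numbers, not measurements of any real engine:

```python
# Amdahl's law: speedup is capped by the serial (single-threaded) portion.
# The fractions here are hypothetical, chosen only to illustrate the point.

def speedup(parallel_fraction, cores):
    serial = 1 - parallel_fraction
    return 1 / (serial + parallel_fraction / cores)

print(round(speedup(0.30, 8), 2))  # mostly single-threaded game: ~1.36x on 8 cores
print(round(speedup(0.95, 8), 2))  # well-threaded engine: ~5.93x on 8 cores
```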
They also want to maintain 8 cores for backwards compatibility, and get a combo deal on Navi.
They would probably need 8 hardware threads, but I don't know that 8 cores would be necessary.
It's a couple of years away still, and consoles do use custom hardware, but current Zen-based APU designs pair a 4-core CPU with a GPU.
Yeah, two things I'm thinking of: one, they always promise this stuff before a new generation, and two, these consoles will likely have FreeSync technology, making "60fps" not really super important.
I mean, I sure hope they will have FreeSync or similar; there's no reason not to.
The advantage of VRR is that it frees you from being locked to divisors of the display's refresh rate, so you can run games as fast as the system is capable of instead of having to cap it.
It eliminates the stuttering that you get from running unlocked framerates on a fixed refresh-rate display, but does not significantly improve the appearance of low framerates.
40 FPS still looks and feels like 40 FPS - it doesn't magically feel like 60. But it is better than having to cap it to 30.
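The stutter from running an unlocked framerate on a fixed-refresh display can be simulated in a few lines. This is a simplified model, assuming a perfectly steady 45 FPS render rate with vsync snapping each frame to the next 60 Hz refresh tick:

```python
# Why unlocked framerates stutter on a fixed 60 Hz display: with vsync,
# a frame can only appear at the next refresh tick, so a steady 45 FPS
# alternates between 1-tick and 2-tick holds. (Illustrative model only;
# real frame times also vary from frame to frame.)
import math
from fractions import Fraction

REFRESH = Fraction(1000, 60)  # ~16.67 ms between ticks on a 60 Hz panel
FRAME = Fraction(1000, 45)    # ~22.2 ms render time at a steady 45 FPS

def presented_intervals(n_frames):
    """How long each frame stays on screen when snapped to refresh ticks."""
    ticks = [math.ceil(i * FRAME / REFRESH) * REFRESH for i in range(n_frames + 1)]
    return [round(float(b - a), 1) for a, b in zip(ticks, ticks[1:])]

print(presented_intervals(6))  # [33.3, 16.7, 16.7, 33.3, 16.7, 16.7] -> judder
# On a VRR display every frame would instead be shown for exactly ~22.2 ms.
```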
If anything, getting a G-Sync display has pushed me to target higher framerates, since I can run most games at ~80-100 FPS on my system rather than locking to 60. It did not change my opinion on sub-60 FPS gaming.
The difference of going from 60 FPS to ~90 FPS is as significant as going from 30 FPS to 60 FPS.
I think it's more likely that console games will still target a locked 30 FPS, since most people will be using fixed 60Hz displays next gen, but the framerate could be unlocked for ~35-45 FPS gaming if you have a VRR display.
I don't think anyone is suggesting that every game needs to be 60fps.
The majority of displays refresh at a fixed 60Hz, so all content should be running at 60 FPS.
30 FPS results in significant amounts of motion blur and judder on its own, and developers have to add even more motion blur on top of that to try and smooth out the judder.
Modern displays with blur-reduction modes don't allow them to be enabled below 60Hz, and many can't be enabled below 85Hz or so.
Not for me. I can't tell the difference between 20 and 120fps, but I can tell the difference between a 1440p upscale and native 4K.
A lot of my co-workers are PC gamers and they said the same thing. I've been shown multiple videos, and I can never tell the difference; at best it just looks like the characters on screen are moving slightly faster.
You need a 120Hz display to see the advantages of 120 FPS gaming.
Framerate should have no effect on the speed of a game, only fluidity and motion clarity.
Here's a comparison I recorded a few weeks ago when someone else was arguing that high framerate doesn't make a difference in 2D games since sprites are animated at a much lower framerate than the game.
30 and 60fps look similar to me. 24 looks OK; 15 is awful. I could not tell the difference in-game, though.
Make sure you're viewing on a device that can play the 60 FPS stream, and not a phone/tablet.
One of the advantages of higher framerates is that the distance an object travels between frames is significantly shorter (thus motion appears smoother) which is lessened the smaller the screen is.
That's not to say you can't tell the difference between 30/60 FPS on a phone, but the difference is far more significant the larger the screen is, and the more of your vision it fills.
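The per-frame travel distance is easy to put numbers on. The scenario here is hypothetical: an object panning across a 1920-pixel-wide screen in one second, with the speed made up purely for illustration:

```python
# How far an object jumps between frames at a given framerate.
# Assumed speed: one full 1920 px screen-width per second (illustrative).

def pixels_per_frame(speed_px_per_s, fps):
    return speed_px_per_s / fps

for fps in (30, 60, 120):
    print(fps, pixels_per_frame(1920, fps))
# 30 FPS -> 64 px jumps, 60 FPS -> 32 px, 120 FPS -> 16 px per frame.
# The same pixel gap covers more of your field of view on a bigger screen,
# which is why the difference stands out more the larger the display.
```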