Fortunately G-Sync / FreeSync, and the upcoming HDMI VRR, mean that you no longer have to make that decision.
An unsteady 50-60 FPS on a Variable Refresh Rate display is almost as smooth as a locked 60 FPS on a 60Hz display.
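To illustrate with a quick sketch (toy frame times, and a simple model that assumes a frame that misses a refresh is held until the next vblank, as with basic vsync):

```python
import math

# Toy frame times (ms) for an unsteady 50-60 FPS game.
frame_times = [16.6, 19.5, 17.2, 20.0, 16.6, 18.4]

refresh = 1000.0 / 60.0  # fixed 60Hz refresh period, ~16.7 ms
for ft in frame_times:
    # Fixed refresh: a late frame waits for the next vblank, so the
    # previous image is held on screen for a whole extra refresh (judder).
    held_for = math.ceil(ft / refresh) * refresh
    # VRR: the display refreshes when the frame is ready, so on-screen
    # time simply matches the frame time.
    print(f"frame took {ft:4.1f} ms -> 60Hz holds {held_for:4.1f} ms, VRR holds {ft:4.1f} ms")
```

Under this simple model the fixed display keeps snapping between 16.7 ms and 33.3 ms holds, which is the judder you see, while VRR tracks the real pace of the frames.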
I'm playing Shadow of the Colossus on a 4K Bravia in Cinematic mode with HDR and couldn't imagine playing it any other way. People always say resolution doesn't matter, and in some cases it doesn't, but in others it can immerse you into the world that much more. I turned the HDR off and turned Performance mode on and my jaw almost hit the floor. I can't believe people are experiencing this remake that way. It's a significant difference in visual quality.
Also, the higher your fps, the more responsive your game is, and the better it is to play. If I lock Tekken 7 to 30fps, it divides evenly into a 60Hz refresh just fine, with no stuttering, but it now takes twice as long to respond to input. Same deal with every game ever made.
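To put rough numbers on the responsiveness point (a minimal sketch; it assumes input is sampled once per frame and shows up on screen one frame later, ignoring engine pipelining and display lag):

```python
# Frame time sets the floor for how quickly a game can react to input.
# Simple model: input sampled at the start of a frame, visible one frame
# later; worst case, your press lands just after a sample.
for fps in (30, 60, 120):
    frame_time_ms = 1000.0 / fps
    worst_case_ms = 2 * frame_time_ms
    print(f"{fps:>3} fps: {frame_time_ms:5.1f} ms per frame, "
          f"up to ~{worst_case_ms:5.1f} ms from input to display")
```

Halving the framerate doubles every one of those figures, which is the "twice as long to respond" above.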
As an aside:
It's funny that it took strapping displays to your face for Sony and other third-party developers to finally get serious about rock-solid game performance. Which sucks, because it's not unheard of for people to experience discomfort gaming at both low and uneven frame rates (including 30, motion blur or no) on a regular TV.
You're confusing G-Sync with ULMB / BFI implementations.
With a low-persistence impulse-type display, such as a CRT or an LCD using BFI / ULMB, you must match the refresh rate to the framerate. If you do not, you end up with multiple images being displayed. 60 FPS at 120Hz = double images, 60 FPS at 180Hz = triple images etc.
With G-Sync, which currently only operates in a full-persistence sample-and-hold display mode, the higher the display's native refresh rate, the faster the scan-out and the lower the latency will be. So higher native refresh rates benefit G-Sync, but hurt ULMB.
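Back-of-the-envelope arithmetic for both effects (a sketch: the image count on a strobed display is just refresh rate divided by framerate, and scan-out time is approximated here as one full refresh period):

```python
# 60 FPS content on displays of various native refresh rates.
fps = 60
for refresh_hz in (60, 120, 180, 240):
    images = refresh_hz / fps          # strobed (BFI/ULMB): 1.0 is ideal,
                                       # 2.0 means double images, etc.
    scanout_ms = 1000.0 / refresh_hz   # sample-and-hold G-Sync: faster
                                       # native refresh = faster scan-out
    print(f"{refresh_hz:>3} Hz: {images:.1f} image(s) per frame when strobed, "
          f"~{scanout_ms:4.1f} ms scan-out with VRR")
```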
Indeed. PlayStation 5 is around the corner. This topic is more relevant than ever and needs attention.
So Borderlands 3 luckily has a 60fps mode on PS4 Pro. You can switch seamlessly between "resolution" and "performance" in the options menu, and after switching to "resolution" you will instantly notice the input lag while navigating the menu. Just moving the cursor makes a night and day difference.
Framerate is not just about motion clarity, which is perfectly described in this topic, but also about smooth controls. Games are interactive, moving images you control. For all of these aspects (interactivity, motion, control), framerate is the most important factor. Games at higher framerates automatically control better, even when you're just navigating the menu or sorting the inventory. High framerates make games not only look better (in motion), but also control and feel better.
Therefore, I hope 60fps becomes the new standard somewhere during the next console generation. The trend is looking good so far, with all the performance modes we get nowadays, and Phil Spencer saying they will put more focus on performance with the next Xbox.
> This is like an audiophile asking someone listening to music with normal headphones "don't you care about having the best audio quality????"

Not remotely the same. My headphones are good enough and my frames are good enough.
> This is like an audiophile asking someone listening to music with normal headphones "don't you care about having the best audio quality????"

60fps is not about "best quality". That would be 120fps or more. 60fps is the baseline, the standard that should be considered the absolute minimum when playing videogames.
Seriously, indeed. The 30fps defense force is always acting like A) we play static screenshots, and B) input responsiveness does not matter.
It's like talking about a pair of busted earbuds that have no treble or bass and everything sounds muddy vs a decent pair of cheap earbuds. Yeah, you can still make out what's being said, but it's not a good experience.
> It's like talking about a pair of busted earbuds that have no treble or bass and everything sounds muddy vs a decent pair of cheap earbuds. Yeah, you can still make out what's being said, but it's not a good experience.

30fps is ~$60 Sennheiser headphones, 60fps is $120 headphones.
> 30fps is ~$60 Sennheiser headphones, 60fps is $120 headphones.

60fps is ~$15-$30 generic headphones, 120fps is $60 Sennheiser, 200+ fps is $120 headphones.
Motion interpolation works perfectly for me when a game is a locked 30fps. No 31, no 29; a steady 30 works perfectly.
It feels almost like 60 fps.
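For the curious, this is roughly what the TV is doing conceptually (a toy sketch; real sets estimate motion vectors per block rather than blending raw values like this, and implementations vary by model):

```python
# Synthesize an in-between frame so 30fps content can be shown at 60Hz.
# Note the TV must already have the *next* real frame before it can build
# the midpoint, which is one reason interpolation adds input lag.
def midpoint(frame_a, frame_b, t=0.5):
    return [a + (b - a) * t for a, b in zip(frame_a, frame_b)]

prev_frame = [0.0, 10.0]   # toy "pixel" values from one 30fps frame
next_frame = [4.0, 14.0]   # the following 30fps frame
output_60hz = [prev_frame, midpoint(prev_frame, next_frame), next_frame]
print(output_60hz)         # three frames shown where the game made two
```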
> I am on your team. It is shocking how most gamers don't even know or believe this. And there are even some new TVs today with interpolation made just for gaming.

Exactly. So if they can improve MI even more on the next TVs, I am really OK with 30fps. On some games, I really can't tell the difference between 60 and 30 with MI.
> I'm not saying that I don't believe it, but it truly amazes me when people say they can't see the difference between 30fps and 60+fps. I play on both PC and PS4, and higher fps makes the game both feel and look better IMO. God of War was gorgeous, but I absolutely had to play it in the performance mode; it wasn't even a consistent 60fps, but it just felt so much better to play and it still looked amazing. One thing I've found I'm sensitive to is motion blur; I hate it, and I feel it makes overall IQ worse.

I believe it.
60 FPS is better and I feel it gives the devs an opportunity to give you much better controls.
But most games are designed with the limitations of modern displays and around being 30 FPS.
> Any display has been at least 60fps capable for decades. I have actually never seen a 30Hz display in real life.

True, but LCD displays are way slower for inputs than CRTs. At least in my experience.
> Exactly. So if they can improve MI even more on the next TVs, I am really OK with 30fps. On some games, I really can't tell the difference between 60 and 30 with MI.

Examples: Crash Bandicoot, R&C, Uncharted 4, and every Pro version that has a 30fps option. So I have graphics and smoothness. Win-win.
> I am on your team. It is shocking how most gamers don't even know or believe this. And there are even some new TVs today with interpolation made just for gaming.

I tried it out on my LG OLED, and even though it makes 30 look/feel more like 60, the brightness takes a hit and it introduces flicker. Hopefully future models will improve on it.
> 60fps is ~$15-$30 generic headphones, 120fps is $60 Sennheiser, 200+ fps is $120 headphones.

Lol, exactly. That's what I meant by 60fps should be considered the baseline. 30fps is the headphones on the clearance rack for $5 at Walgreens.
> It is about time for 60fps to be the minimum standard.

Completely agreed. 30fps is the reason non-gamers tend to complain about feeling sick or getting headaches whenever the image moves too fast while watching me play at 30fps: your eyes can't focus on a moving 30fps image. When the image starts moving, everything on screen gets blurry. In movies we have professional camera work; in games you take control of the image yourself. Hence: the higher the framerate, the better.
When are graphics going to be enough?
If we keep things like this, we will never improve the fps, because there will always be some shiny effect to trade for it.
But the average person doesn't even understand what fps, stutter, and motion blur mean.
They feel something is wrong but they have no clue, and blame the graphics.
> If I stare at a 30fps game with the same kind of keen eyeball fixity, out in the world, that I do with 60fps games (Borderlands 2, Halo MCC, Gears multiplayer, DMC5), the blurring will literally start to give me a headache. To play 30Hz games, when I want to turn the camera, which is often, I either unfocus my eyes from the oncoming nastiness out in the world, or put my focus on the player character (who doesn't stutter and shimmy at all when you turn the camera past a certain snail's pace). Then, once I've moved my eyes, I quickly whirl the camera to where I want it set; once the camera is there, I tend to play in that field without moving it much, and if I do move it, I move it very slowly so as not to induce the shimmy-judder-blur.

EXACTLY. This is what I'm trying to describe above. Absolutely right.
Exactly. Give me 1080/1440p @ 144 over 4K @ 30 every day of the week.
Once you leave screenshot mode, there's a game to be played.
Lol. Indeed. That's why people who tell you they prefer 30fps with nice graphics actually have no idea what they're talking about.
> Any TV, LCD or CRT, is constantly running at 60fps (Hz) or 120fps; the game or movie does not matter. TVs have input lag, and it is affected by the TV model and its image processing algorithms. When you turn on Game Mode on a TV, you turn off some of those algorithms and get lower input lag. But your TV is always running at 60fps or more.

But the input lag is noticeable compared to a CRT. I just think that most games today are designed around it, so you have to play something older to really notice it.
> This is like an audiophile asking someone listening to music with normal headphones "don't you care about having the best audio quality????"
>
> And a normal person would tell them "lol why should I care stop bugging me".

No. The equivalent is the audiophile telling that person to stop using mp3s now that storage is abundant in handhelds.
I had a theory that most gamers started playing with the PS1 or later, and if they did not experience gaming beforehand or on PC, they would be used to 30fps and generally less responsive gameplay, so 60fps and super-responsive controls wouldn't be as important to them. I put the television in as well, since old-school gamers remember gaming on CRT TVs and monitors where the gameplay and input responses were lightning quick.

(Isn't this topic about fps? lol)

It is true that CRTs usually had less input lag.
With LCD and OLED you need to look up the TV model's input lag before buying it.
Bad TVs can have something like 100ms of input lag.
The good ones are around 20ms, and this is almost perfect.
You should always play games in game mode too, to get the lowest input lag.
Monitors are different.
Most of them have somewhere around 1ms of input lag. You just can't notice it.
This is why I always prefer playing competitive fighting games on the PC.
The bottom line is: never buy a TV if you don't know its input lag numbers. Always play in Game Mode. And if you want near-zero input lag, buy a monitor.
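Putting the thread's numbers together (a toy additive budget; real pipelines overlap stages, and the one-frame render assumption is a simplification):

```python
# Rough input-to-screen budget: one frame of game time plus display lag.
def total_lag_ms(fps, display_lag_ms):
    return 1000.0 / fps + display_lag_ms

setups = {
    "30fps + bad TV (~100ms)":      (30, 100.0),
    "30fps + good TV, Game Mode":   (30, 20.0),
    "60fps + good TV, Game Mode":   (60, 20.0),
    "60fps + low-lag monitor":      (60, 1.0),
}
for name, (fps, lag) in setups.items():
    print(f"{name:29s} ~{total_lag_ms(fps, lag):5.1f} ms")
```

Even on the same TV, going from 30fps to 60fps cuts a noticeable chunk out of the total, which is why the performance modes above feel so different.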