
Jaffo

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
292
Rome, Italy
It is kinda important. Every time I turn on my PS4 after playing on my PC I get motion sick. At the beginning it's just unbearable. After some time your eyes get used to the lower refresh rate though. I will always prefer a smoother frame rate over graphics.

Framerate > resolution (clarity) > graphics
 

impact

Banned
Oct 26, 2017
5,380
Tampa
If the 60fps game is some ugly ass sub-native res it's not going to look good regardless. SMO looks pretty bad on my 4K TV, Horizon ZD does not.
 

JudgmentJay

Member
Nov 14, 2017
5,220
Texas
I'm playing Shadow of the Colossus on a 4K Bravia in Cinematic mode with HDR and couldn't imagine playing it any other way. People always say resolution doesn't matter, and in some cases it doesn't, but in others it can immerse you into the world that much more. I turned the HDR off and turned Performance mode on and my jaw almost hit the floor. I can't believe people are experiencing this remake that way. It's a significant difference in visual quality.

You can leave HDR on in performance mode though.
 

LCGeek

Member
Oct 28, 2017
5,857
I haven't heard about this.

Mark Rejhon of Blur Busters mentions it a few times. It came up in a discussion of combining ULMB and G-Sync together; it's beyond what Nvidia is allowed to do within the driver, as only the OS can do it. This is why I want some of these tech features built in at the firmware/OS level rather than in TVs. You mention as much with black frame insertion: yes, it's nice for a TV to do it, but something within the OS or engine would allow for even better support.

Better TV support or tech inside will not fix games or an OS lacking what they should have to better support the technology in question. This is why the HDR implementation is pissing me off: MS is just slow on the PC side, while on the X1 it's up despite both using the same base Windows kernel.

Also, the higher your fps, the more responsive your game is, the better it is to play. If I lock Tekken 7 to 30fps - it divides evenly just fine with no stuttering - but is now twice as unresponsive to play. Same deal with every game ever made.
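A quick back-of-the-envelope sketch of the point above (my own illustration, not from the thread): at a lower framerate, up to one full frame interval can pass before an input is even sampled and reflected on screen, so halving the framerate doubles that worst-case delay. Real game pipelines add more stages on top of this.

```python
# Worst-case delay before the next frame can react to an input,
# assuming one frame interval between input and the next rendered frame.
def worst_case_input_delay_ms(fps: float) -> float:
    """One full frame time can pass before an input is even sampled."""
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps} fps -> up to {worst_case_input_delay_ms(fps):.1f} ms before the next frame")
```

This is why a 30fps lock feels "twice as unresponsive" as 60fps: roughly 33ms per frame versus roughly 17ms.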

As an aside:
It's funny that it took strapping displays to your face for Sony and other 3rd party developers to finally get serious about rock-solid game performance. Which sucks, because it's not unheard of for people to experience discomfort gaming at both low and uneven frame rates (including 30, motion blur or no) on a regular TV.

In no way does my post walk away from higher fps. My entire point is to counter claims that there aren't any benefits when there clearly are, with a research link to back it up. This has nothing to do with the response of your inputs but rather the motion on your screen; they aren't even the same factor being discussed. Rather, even at 30fps, on a higher refresh rate the flicker of the screen is quicker and thus the image comes up faster. Your monitor is not going to be the thing causing the stutter; that's your machine.

I want good frame pacing, syncing and almost no input lag. Good luck on your journey though.

You're confusing G-Sync with ULMB / BFI implementations.
With a low-persistence impulse-type display, such as a CRT or an LCD using BFI / ULMB, you must match the refresh rate to the framerate. If you do not, you end up with multiple images being displayed. 60 FPS at 120Hz = double images, 60 FPS at 180Hz = triple images etc.
With G-Sync, which currently only operates in a full-persistence sample-and-hold display mode, the higher the display's native refresh rate, the faster the scan-out and the lower the latency will be. So higher native refresh rates benefit G-Sync, but hurt ULMB.
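The double/triple-image arithmetic described above can be sketched in a few lines (illustrative only, not from the post): on a strobed impulse-type display, each refresh flashes an image, so each game frame is shown refresh/fps times, and any multiple above 1 appears as repeated images during eye tracking.

```python
# On a strobed display (CRT, ULMB, BFI), each refresh flashes an image.
# If framerate < refresh rate, the same frame is flashed multiple times,
# which the eye sees as double/triple images while tracking motion.
def visible_image_copies(refresh_hz: int, fps: int) -> int:
    """Number of times each game frame is flashed per refresh cycle."""
    if refresh_hz % fps != 0:
        raise ValueError("non-integer ratios produce uneven judder instead")
    return refresh_hz // fps

print(visible_image_copies(120, 60))  # 60 FPS at 120Hz strobing -> double image
print(visible_image_copies(180, 60))  # 60 FPS at 180Hz strobing -> triple image
print(visible_image_copies(60, 60))   # matched rates -> single clean image
```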

No I'm not; I'm clearly talking about the scan-out speed or refresh rate, not how it's syncing or strobing, just the mere speed of the screen refreshing. That gives a benefit as well as what you mention; all I'm saying is that it's a benefit regardless of your fps, though people should be advised of the factor you mention. I left a link that goes into it as well.

I think the balance has to go both ways, hardware and software. It's completely silly to have 4 different screens with some support yet no single OS or console I use has it. Software can't evolve if it's never worked on to begin with. A lack of hardware standards fucks us on TVs, but no one wants to agree to anything, as I'm sure you can tell from HDR; companies don't always play nice.
 
Last edited:
Oct 28, 2017
1,956
i put framerate and motion clarity under "performance"
a 60 fps game performs better than a 30fps one but won't look as good. i'm talking about fixed settings here obviously
 

60fps

Banned
Dec 18, 2017
3,492
Playstation 5 is around the corner. This topic is more relevant than ever and needs attention.

So Borderlands 3 luckily has a 60fps mode on PS4 Pro. You can switch seamlessly between "resolution" and "performance" in the options menu, and after switching to "resolution" you will instantly notice the input lag while navigating the menu. Just moving the cursor makes a night and day difference.

Framerate is not just about motion clarity, which is perfectly described in this topic, but also about smooth controls. Games are interactive, moving images you control. For all of these aspects, interactive, moving, control, framerate is the most important factor. Games in higher framerates automatically control better, even when you're just navigating through the menu or sorting the inventory. High framerates make games not only look better (in motion), but also control and feel better.

Therefore, I hope 60fps becomes the new standard somewhere down the next console generation. The trend is looking good so far, with all the performance modes we get nowadays, and Phil Spencer saying with the next XBox they will put more focus on performance.
 

jett

Community Resettler
Member
Oct 25, 2017
44,655
Playstation 5 is around the corner. This topic is more relevant than ever and needs attention.

So Borderlands 3 luckily has a 60fps mode on PS4 Pro. You can switch seamlessly between "resolution" and "performance" in the options menu, and after switching to "resolution" you will instantly notice the input lag while navigating the menu. Just moving the cursor makes a night and day difference.

Framerate is not just about motion clarity, which is perfectly described in this topic, but also about smooth controls. Games are interactive, moving images you control. For all of these aspects, interactive, moving, control, framerate is the most important factor. Games in higher framerates automatically control better, even when you're just navigating through the menu or sorting the inventory. High framerates make games not only look better (in motion), but also control and feel better.

Therefore, I hope 60fps becomes the new standard somewhere down the next console generation. The trend is looking good so far, with all the performance modes we get nowadays, and Phil Spencer saying with the next XBox they will put more focus on performance.
Indeed.

The 30fps defense force is always acting like A: we play static screenshots, and B: input responsiveness does not matter.
 

Deleted member 37739

User requested account closure
Banned
Jan 8, 2018
908
I still play a lot of games at 30 FPS and I find I adjust quite quickly, but having played a good deal of 60 FPS games this generation I'll say that I far and away prefer performance over IQ and that motion clarity is an integral part of overall presentation. Horses for courses, though, and games that build their design around their performance target can often eliminate a lot of the issues of lower frame rates.
 

Azurik

Attempted to circumvent ban with alt account
Banned
Nov 5, 2017
2,441
Single player - 30fps with all graphical bells and whistles

MP - Solid 60fps with adjusted graphics

I just much prefer cinematic SP games. Obviously, if next generations of consoles get powerful enough to maintain that cinematic experience at 60fps+, then so be it (even though we would never know if/how it could have looked better at 30fps - Gears 5)
 
Last edited:

LumberPanda

Member
Feb 3, 2019
6,338
This is like an audiophile asking someone listening to music with normal headphones "don't you care about having the best audio quality????"
 

60fps

Banned
Dec 18, 2017
3,492
This is like an audiophile asking someone listening to music with normal headphones "don't you care about having the best audio quality????"
60fps is not about "best quality". That would be 120fps or more. 60fps is the baseline, the standard that should be considered the absolute minimum when playing videogames.

We played games in 60fps 30 years ago, so why should 30fps suddenly be OK in 2019?

Indeed.

The 30fps defense force is always acting like A: we play static screenshots, and B: input responsiveness does not matter.
Seriously.
 

Pankratous

Member
Oct 26, 2017
9,252
I want 30FPS with better graphics because I can't tell the difference between 30/60 in normal gameplay.

If I spin the camera round and round and round and round then yeah there's a difference, but...
 

Deleted member 27315

User requested account closure
Banned
Oct 30, 2017
1,795
Motion interpolation works perfectly for me when a game is 30fps. Not 31, not 29. At 30 it works perfectly.
It feels almost like 60fps.
 

Onikage

Member
Feb 21, 2018
414
It is about time for 60fps to be the minimum standard.

When are graphics going to be enough?
If we keep things like this we will never improve the fps because there will always be some shiny effect to exchange for it.

But the average person doesn't even understand what fps, stutter and motion blur mean.
They feel something is wrong but they have no clue, and blame the graphics.
 
Last edited:

Sulik2

Banned
Oct 27, 2017
8,168
60fps looks so weird, even in gameplay, that it sometimes makes me stop and go "man, that looks strange". Overwatch would make me do it while playing. The most recent Jedi: Fallen Order trailer did that to me too. While it helps gameplay feel smoother, I think 60fps looks so bad it actually makes games look worse.
 

nickfrancis86

Member
Nov 10, 2017
427
I'm not saying that I don't believe it but it truly amazes me when people say they can't see the difference between 30 fps and 60+fps. I play on both PC and PS4 and higher fps makes the game both feel and look better IMO. God of War was gorgeous but I absolutely had to play it on the performance mode, it wasn't even 60 fps consistently but it just felt so much better to play and it still looked amazing. One thing I've found I'm sensitive to is motion blur, I hate it and I feel it makes overall IQ worse.
 

Lobster Roll

signature-less, now and forever
Member
Sep 24, 2019
34,357
Give me 1080/1440p @ 144 over 4K @ 30 every day of the week.

Once you leave screenshot mode, there's a game to be played.
 

bionic77

Member
Oct 25, 2017
30,888
60 FPS is better and I feel it gives the devs an opportunity to give you much better controls.

But most games are designed with the limitations of modern displays and around being 30 FPS.
 

Deleted member 27315

User requested account closure
Banned
Oct 30, 2017
1,795
I am on your team.

It is shocking how most gamers don't even know or believe this.
And there are even some new TVs today with interpolation made just for gaming.
Exactly. So if they can improve MI even more on the next TVs, I am really OK with 30fps. On some games, I really can't tell the difference between 60 vs 30 with MI.

Examples: Crash Bandicoot, R&C, Uncharted 4 and every Pro version that has a 30fps option.

So I have graphics and smoothness. Win-win.
 

bionic77

Member
Oct 25, 2017
30,888
I'm not saying that I don't believe it but it truly amazes me when people say they can't see the difference between 30 fps and 60+fps. I play on both PC and PS4 and higher fps makes the game both feel and look better IMO. God of War was gorgeous but I absolutely had to play it on the performance mode, it wasn't even 60 fps consistently but it just felt so much better to play and it still looked amazing. One thing I've found I'm sensitive to is motion blur, I hate it and I feel it makes overall IQ worse.
I believe it.

If you started playing games on the original PS1 and stuck with Sony since then (which is probably the majority of gamers), the only time you got majority-60 FPS gaming was on the PS2. Most of the big games in all of the other gens were primarily 30 FPS. And that's probably true across all platforms.

If you play PC games or any 2D console you have a lot of experience with really responsive controls. But that is probably a minority of gamers at this point.

At least that's my theory.
 

Onikage

Member
Feb 21, 2018
414
60 FPS is better and I feel it gives the devs an opportunity to give you much better controls.

But most games are designed with the limitations of modern displays and around being 30 FPS.

Any display has been at least 60fps capable for decades.
I've actually never seen a 30Hz display in real life.
 

Onikage

Member
Feb 21, 2018
414
Exactly. So if they can improve MI even more on the next TVs, I am really OK with 30fps. On some games, I really can't tell the difference between 60 vs 30 with MI.

Examples: Crash Bandicoot, R&C, Uncharted 4 and every Pro version that has a 30fps option.

So I have graphics and smoothness. Win-win.

And if the game is 60fps, some TVs turn it into 120fps with interpolation.
It is such a fascinating technology. Instead of using SLI and buying 2 GPUs to gain more power, you are just using a TV algorithm to solve the problem...

With faster chips and possibly AI, interpolation could be a game changer even for PCs.
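Conceptually, motion interpolation synthesizes an in-between frame from two real ones. A toy sketch (my own illustration; real TVs use motion estimation/compensation, not this naive per-pixel blend):

```python
# Naive per-pixel blend as a stand-in for TV motion interpolation.
# Real sets estimate motion vectors; this only illustrates the idea of
# synthesizing an intermediate frame between two real frames.
def interpolate_frame(frame_a, frame_b, t=0.5):
    """Blend two frames (flat lists of pixel intensities) at position t in [0,1]."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

# Two consecutive 30fps frames -> one synthesized middle frame doubles the rate.
frame1 = [0, 10, 20]
frame2 = [10, 30, 20]
print(interpolate_frame(frame1, frame2))  # [5.0, 20.0, 20.0]
```

The catch, as mentioned elsewhere in the thread, is that the synthesized frame can only be shown after the *next* real frame arrives, which is where interpolation's added input lag comes from.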
 

Aaron D.

Member
Oct 25, 2017
9,317
I can play CSGO at 300+ fps; that doesn't mean it's a looker.

Context is important.

One of the most amazing trends over the past few years has been the return of retro-shooters.

Dusk, Amid Evil and Ion Fury have completely reinvigorated my passion for the genre.

While none of these titles are as technically complex on the visual side as modern AAA titles like CoD or Battlefield, being able to play them at max settings, high resolution and a buttery-smooth 120fps w/ G-Sync is a complete game-changer.

Same goes for less-demanding indie titles in other genres, the recent gem 10 Miles To Safety being another example.

None of this "ruins" locked 30fps gameplay by any means, but boy does it make a noticeable difference and dramatically enhance the gaming experience nonetheless.
 

P40L0

Member
Jun 12, 2018
7,618
Italy
60fps smooth gameplay and 30fps pumped up, real time cut-scenes is the best of both worlds, and many games already do this (e.g. Halo 5, Gears 5).
 

Azurik

Attempted to circumvent ban with alt account
Banned
Nov 5, 2017
2,441
I am on your team.

It is shocking how most gamers don't even know or believe this.
And there are even some new TVs today with interpolation made just for gaming.
I tried it out on my LG OLED and even though it makes 30 look/feel more like 60, the brightness takes a hit and it introduces flicker. Hopefully future models will improve on it.
 
Last edited:

Onikage

Member
Feb 21, 2018
414
True but LCD displays are way slower for inputs than CRT. At least in my experience.

Any TV, LCD or CRT, is constantly running at 60Hz or 120Hz; the game or movie does not matter.
TVs have input lag, and it is affected by the TV model and image processing algorithms.
When you turn on Game Mode on a TV you turn off some algorithms and get lower input lag.

But your TV is always running at 60Hz or more.
 

takriel

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
10,221
That's why people who tell you that they prefer 30 fps with nice graphics have actually no idea what they're talking about.
 

60fps

Banned
Dec 18, 2017
3,492
It's like talking about a pair of busted earbuds that have no treble or bass and everything sounds muddy vs a decent pair of cheap earbuds. Yeah, you can still make out what's being said, but it's not a good experience.
60fps is ~$15 - $30 generic headphones, 120fps is $60 Sennheiser, 200+ fps is $120 headphones.

30fps are the headphones in a clearance rack for $5 at Walgreens.
Lol, exactly. That's what I meant with 60fps should be considered the baseline.

It is about time for 60fps to be the minimum standard.

When are graphics going to be enough?
If we keep things like this we will never improve the fps because there will always be some shiny effect to exchange for it.

But the average person doesn't even understand what fps, stutter and motion blur mean.
They feel something is wrong but they have no clue, and blame the graphics.
Completely agreed. 30fps is the reason non-gamers tend to complain about feeling sick or getting headaches whenever the image moves too fast while watching me play at 30fps. Your eyes can't focus on a moving 30fps image: when the image starts moving, everything on screen gets blurry. In movies we have professional camera work. In games you control the image yourself, hence: the higher the framerate, the better.

If I stare at a 30fps game with the same kind of keen eyeball fixity - out in the world - that I do with 60fps games (Borderlands 2, Halo MCC, Gears multiplayer, DMC5) - the blurring will literally start to give me a headache. To play 30Hz games, when I want to turn the camera, which is often, I either unfocus my eyes from the oncoming nastiness out in the world, or put my focus on the player character (who doesn't stutter and shimmy at all when you turn the camera past a certain snail's pace), and then once I've moved my eyes I quickly, in a whirl, turn the camera where I want it set. Once the camera is where I want it, I tend to play in that field without moving it much, and if I do move it I try to move it very slowly so as not to induce the shimmy-judder-blur.
EXACTLY. This is what I'm trying to describe above. Absolutely right.

Give me 1080/1440p @ 144 over 4K @ 30 every day of the week.

Once you leave screenshot mode, there's a game to be played.
Exactly.

That's why people who tell you that they prefer 30 fps with nice graphics have actually no idea what they're talking about.
Lol. Indeed.
 
Last edited:

Atisha

Banned
Nov 28, 2017
1,331
30fps games would look on par, in terms of image clarity in motion, with 60fps games if only our modern-day TVs had an optional 30Hz refresh rate you could select. The 'judder', 'frame doubling', or 'blurry hot mess' wouldn't be a thing with a monitor or television capable of a 30Hz refresh rate. It's the asynchronous connection, the disharmony between a 30fps game and a 60Hz monitor refresh rate, that makes the game look atrocious, especially at the periphery of the game world when you move the camera beyond a snail's pace.

If I stare at a 30fps game with the same kind of keen eyeball fixity - out in the world - that I do with 60fps games (Borderlands 2, Halo MCC, Gears multiplayer, DMC5) - the blurring will literally start to give me a headache. To play 30Hz games, when I want to turn the camera, which is often, I either unfocus my eyes from the oncoming nastiness out in the world, or put my focus on the player character (who doesn't stutter and shimmy at all when you turn the camera at any rate), and then once I've protected my eyes, I quickly, in a whirl, turn the camera where I want it set. Once the camera is where I want it, I tend to play in that field without moving it much, and if I do move it I try to move it very slowly so as not to induce the shimmy-judder-blur.

30 fps sucks.
 
Last edited:

bionic77

Member
Oct 25, 2017
30,888
Any TV, LCD or CRT, is constantly running at 60Hz or 120Hz; the game or movie does not matter.
TVs have input lag, and it is affected by the TV model and image processing algorithms.
When you turn on Game Mode on a TV you turn off some algorithms and get lower input lag.

But your TV is always running at 60Hz or more.
But the input lag is noticeable compared to a CRT. I just think that most games today are designed around it so you have to play something older to really notice it.
 

tr1b0re

Member
Oct 17, 2018
1,329
Trinidad and Tobago
Though the framerate difference is noticeable, it's something that I quickly get used to and forget all about, especially with single player games where I'm not too concerned about being 'optimal'

I have a midrange PC so while I can run many games on high settings, with some I need to choose graphics or framerate, and usually I'll go for graphics with a 30fps lock. I just notice the nicer graphics more consistently than I do the framerate, which is something that I don't pay much attention to after I start playing (unless it's dropping lower than 30 of course)
 

Onikage

Member
Feb 21, 2018
414
But the input lag is noticeable compared to a CRT. I just think that most games today are designed around it so you have to play something older to really notice it.

(isn't this topic about fps? lol)

It is true CRTs usually had less input lag.
With LCD and OLED you need to look up the TV model's input lag before buying it.

Bad TVs can have something like 100ms of input lag.
The good ones are around 20ms, and that is almost perfect.
You should always play games in Game Mode too, to get the lowest input lag.

Monitors are different.
Most of them have somewhere around 1ms of input lag. You just can't notice it.
This is why I always prefer playing competitive fighting games on PC.

The bottom line is: never buy a TV if you don't know its input lag numbers. Always play in Game Mode. And if you want near-zero input lag, buy a monitor.
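Putting the two halves of this thread together (a rough sketch; the display lag figures are the ones quoted in the post above, and the frame-time term is a simplification of a real render pipeline): total input-to-screen delay is roughly the game's frame time plus the display's processing lag, so framerate and TV choice both matter.

```python
# Rough input-to-screen latency budget: one frame interval of game latency
# plus the display's own processing lag. Real pipelines have more stages.
def total_lag_ms(fps: float, display_lag_ms: float) -> float:
    return 1000.0 / fps + display_lag_ms

print(f"30fps + 100ms TV (no Game Mode): {total_lag_ms(30, 100):.0f} ms")
print(f"30fps +  20ms TV (Game Mode):    {total_lag_ms(30, 20):.0f} ms")
print(f"60fps +  20ms TV (Game Mode):    {total_lag_ms(60, 20):.0f} ms")
print(f"60fps +   1ms monitor:           {total_lag_ms(60, 1):.0f} ms")
```

The takeaway matches the post: a bad TV can add more delay than the jump from 60fps to 30fps does, and a good TV in Game Mode gets close to monitor territory.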
 

Deleted member 18161

user requested account closure
Banned
Oct 27, 2017
4,805
Spider-Man and Horizon are great examples of near-4K image quality at 30fps with a solid framerate, perfect frame pacing and fantastic controller response.

60fps is of course preferable, especially in competitive multiplayer games, but there are a lot of 60fps console games this gen even with the awful Jaguar architecture, so I'm very hopeful for 60fps options with lower resolution on the next-gen consoles due to Ryzen, especially with 80% of games being cross-gen for the first couple of years.
 

bionic77

Member
Oct 25, 2017
30,888
(isn't this topic about fps? lol)

It is true CRTs usually had less input lag.
With LCD and OLED you need to look up the TV model's input lag before buying it.

Bad TVs can have something like 100ms of input lag.
The good ones are around 20ms, and that is almost perfect.
You should always play games in Game Mode too, to get the lowest input lag.

Monitors are different.
Most of them have somewhere around 1ms of input lag. You just can't notice it.
This is why I always prefer playing competitive fighting games on PC.

The bottom line is: never buy a TV if you don't know its input lag numbers. Always play in Game Mode. And if you want near-zero input lag, buy a monitor.
I had a theory that most gamers started playing with the PS1 or afterwards, and if they did not experience gaming before that, or on PC, they would be used to 30fps and generally less responsive gaming, so 60fps and super responsive gameplay wouldn't be as important to them. I bring up the television as well because old-ass gamers remember gaming on CRT TVs and monitors where the gameplay and input responses were lightning quick.
 

Tyaren

Character Artist
Verified
Oct 25, 2017
24,753
Games like Horizon Zero Dawn and Spider-Man look graphically amazing and run at a solid, smooth 30fps. They are also pleasing to the eye in motion. Playing those games I don't spend a single second thinking about framerate or how they'd look even better/smoother at 60fps. So 30fps is perfectly fine for them.