
Deleted member 18161

user requested account closure
Banned
Oct 27, 2017
4,805
Yes it's overrated. Many PC gamers who could play at 4K choose not to, because they'd rather have higher FPS or be able to turn on features that would otherwise have to stay off to push the power towards resolution. There have been plenty of instances on Xbox where they brag about 4K, yet the game looks better on PC at a lower resolution because more options are checked.

It's marketing.

Replies like this are brilliant. We had PC gamers going on and on and on for years about resolution, yet as soon as a console comes out that can do or get very close to native 4K, and they can't have the same at 60fps without spending an ungodly amount of cash, suddenly resolution doesn't matter and 1080p or 1440p is 'good enough' lol.

Same thing with the PS4 vs Xbox debates. A few years ago the world was ending if games were 1080p on PS4 vs 900p on XB1, but now it apparently doesn't matter if a game is native 4K on Xbox One X vs 1440p on PS4 Pro (a much larger difference than 900p vs 1080p).

Basically people will justify whatever narrative fits their current console / PC.
 

Coconico

Member
Oct 25, 2017
332
Miami
No. 4K or bust. 1080p is like Vaseline smeared on the screen.

 

Burrman

Banned
Oct 25, 2017
7,633
I'm happy with 1080p for a couple more years. Just up the framerates pls. The worst IMO is when multiplayer is 60 and campaign is 30, like they did with Gears 4. Maybe it was the right choice, but it was so hard to jump into the campaign after a few rounds of MP.
 

funky

Banned
Oct 25, 2017
8,527
TBH I think the type of AA and motion blur affects my enjoyment more than raw resolution.

Using up all that GPU power on native 4k only to slap some blurry ass TAA on top of it just grinds my gears. Might as well target 1440p at that stage.

And as always framerate should be king. I'll take whatever resolution you've got with a smooth framerate over native 4k and random drops to the low 20s.
 

gabdeg

Member
Oct 26, 2017
5,970
🐝
4K just looks normal to me now. I've seen plenty of games shimmer even at native 4K. 8K downsampled, now that's a way for a game to look. Unfortunately most games don't even hit 30fps at that on a 2080 Ti.
 

The Omega Man

Member
Oct 25, 2017
3,933
I prefer FPS over resolution, like 1080p@60fps over 4K@30fps, but I feel like the majority of console gamers prefer higher resolutions and don't mind lower frame rates.
 

Terror-Billy

Chicken Chaser
Banned
Oct 25, 2017
2,460
I prefer higher framerates, but I also understand that the netbook CPUs on consoles can't do much, so I'm happy to have hi-res games on my consoles of choice.

That's why most devs aren't even attempting native 4K on the Pro and X; they usually just do a checkerboard solution or 1440p.
Unless you have a checklist at hand, that is completely false about the X. RDR2 is native 4K ffs.
 

PetrCobra

Member
Oct 27, 2017
954
First we ran out of colors; I remember when the general rule was that twice as many colors was better than twice as much resolution. Now we're running into diminishing returns in resolution. I mean, 4K is already overkill for a home environment. Next up is HDR; I have no idea where that has a limit, 'cause I don't even use HDR myself, but there's bound to be a limit there. I'm personally hoping that after this we're going to stop chasing visual boundaries and instead start focusing on visual quality within the existing boundaries, maybe while prioritising framerate as well, because I believe that's more important than any of the above.

Oh and also, we're not really there yet, OP. Not until Switch is 1080p as a baseline, and I'm not sure when that's gonna happen. Maybe the next iteration, maybe the one after that?
 

gozu

Member
Oct 27, 2017
10,345
America
Assuming a distance of 10 feet between you and the TV, which is where I'm sitting right now:

1080p is fine till around 60".

4K is fine for well above 100". (It is currently used at movie theaters, but the picture is a bit fuzzy; they should honestly be 8K.)

8K is fine for over a hundred feet diagonal.

16K is fine for photorealistic VR (132 megapixels split between both eyes).

There will never be a need for 32K TVs, but a few niche industries might still benefit from 32K panels.
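For anyone who wants to sanity-check figures like these, here's a minimal sketch of the underlying arithmetic, assuming the common 20/20-vision rule of thumb of roughly one arcminute per pixel and a 16:9 panel (the helper name is just for illustration):

```python
import math

ACUITY_ARCMIN = 1.0  # assumed 20/20-vision limit: ~1 arcminute per resolvable detail

def max_diagonal_inches(width_px: int, distance_inches: float,
                        aspect: float = 16 / 9) -> float:
    """Largest diagonal (inches) at which single pixels stay at or below the
    assumed acuity limit when viewed from distance_inches away."""
    # Smallest detail the viewer can resolve at this distance
    resolvable = distance_inches * math.tan(math.radians(ACUITY_ARCMIN / 60))
    # Panel width as a fraction of its diagonal (~0.87 for 16:9)
    width_per_diag = aspect / math.hypot(aspect, 1)
    # Pixel pitch per inch of diagonal; solve pitch == resolvable for the diagonal
    pitch_per_diag = width_per_diag / width_px
    return resolvable / pitch_per_diag

if __name__ == "__main__":
    distance = 10 * 12  # 10 feet, as in the post above
    for name, width in [("1080p", 1920), ("4K", 3840)]:
        print(f'{name}: up to ~{max_diagonal_inches(width, distance):.0f}" diagonal')
```

With those assumptions it works out to roughly 77" for 1080p and 154" for 4K at 10 feet; assume sharper acuity than one arcminute and the numbers come down towards the figures quoted above.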
 

HyGogg

Banned
Oct 27, 2017
2,495
No, we wait until the first console to surpass 4K, then it matters again
Phones are good evidence that resolution wars have an endpoint: when they reach a pixel density where people no longer care. We're not really even at the point where 4K is standard yet, so we have a bit to go still, but eventually it ceases to be a selling point.
 

Wetwork

Banned
Oct 27, 2017
2,607
Colorado
FPS is king. Console makers should focus on making 60 the standard with the One X/Pro and making 144 the standard next gen. What good does the resolution do if your FPS is a fucking slideshow? I just started up Destiny 2 on PC since it's free for now.

Holy shit, 144FPS at 1080p >>>>. Taking anything sub-60FPS is laughable; I'd gladly give up resolution for more frames any day (luckily I have a PC soooo I don't have to).
 

Jaypah

Member
Oct 27, 2017
2,866
I can handle 1080p (and in most instances I have to), but sitting a few feet from a 75-inch 4K screen, the difference between 1080p and 4K is gigantic. I agree 1440p is noticeably better than 1080p, but I'd rather have 4K for everything, even with decreased visuals elsewhere. But I will sacrifice whatever I have to to get to 60fps. Framerate, then resolution, then shader quality/effects for me, in order of importance.
 

headspawn

Member
Oct 27, 2017
14,620
Guywith1080ptv: "bruh, there's barely a difference between 1080p and 4k anyways, way overhyped."

Guyw/1440p144hzmonitor: "who needs HDR anyways, 144hz or everything is a fucking slideshow."

Guyw/triplemonitorsetup: "Not buying your shit games unless you support 5760x1080 resolution, anything less isn't immersive."

Guywith720pLCDfrom2006: "I don't get all the whacking off for high-res buzzwords anyways, I have a PS4 Pro but this is all I need."

 

s3ltz3r

Banned
Nov 12, 2017
1,149
User Banned (3 day): console warring and trolling
Yup. Resolution is overrated.

That's why dynamic 4K or 4K is not a big deal. Unless it becomes a marketing weapon as it is for the Xbox One X.
 

Deleted member 7948

User requested account closure
Banned
Oct 25, 2017
1,285
I'm happy with 1080p60.

Actually, I play on a 1440p screen and will gladly reduce the render resolution if needed (but keep my HUD native please). I barely notice the difference when playing.
 

-PXG-

Banned
Oct 25, 2017
6,186
NJ
Once you see anything in actual, native 4K, even 1080p looks outdated in comparison.
 

Deleted member 4093

User requested account closure
Banned
Oct 25, 2017
7,671
Phones are good evidence that resolution wars have an endpoint: when they reach a pixel density where people no longer care. We're not really even at the point where 4K is standard yet, so we have a bit to go still, but eventually it ceases to be a selling point.
Console wars are a whole different thing man
 

1.21Gigawatts

Banned
Oct 25, 2017
3,278
Munich
I have a 65 inch screen and game either from a chair that's about 2 meters from the screen or a couch that's about 3-4 meters away.
From the chair the difference between 1080p and 4K is pretty obvious. From the couch it's noticeable, but just barely.
With RDR2 I switched from the chair to the couch because the game has a weird blurry look at times (PS4 Pro), but from the couch it looks pretty crisp. (Horizon and God of War I played from the chair.)

So yeah, I think it is important. But there are diminishing returns and I think image quality has reached a point where there isn't much necessity for improvement anymore.
 

Treasure Silvergun

Self-requested ban
Banned
Dec 4, 2017
2,206
For me, it is. In the sense that I have no need for higher resolutions, nor do I throw a tantrum if I don't have the absolute best available.

My gripe with resolution is actually a gripe with screens. Nobody would make such a fuss about resolution if screens had technology that allowed them to perfectly scale resolutions lower than the screen's native res without introducing artifacts or blurring the image.

Also I don't know why so many people care about grass, foliage and tiny print in the backgrounds of their games. It's nice, but in the end, it's just noise (unless it's the whole point, or a very important part, of the game). I mean, I don't need to be able to actually read the license plate of a car flying at my character when an explosion occurs, 'cause I'm supposed to dodge the damn car after all, not look at it. Maybe I've spent so many years playing games where backgrounds were just backgrounds that I'm not interested in giving backgrounds a second look even today. So all that detail is lost on me. It's pretty, but I'm there for the game. Why would I want to read the fine print on a sign in a fictional campus, or a fictional old American west town? Yet modern resolutions allow for that kind of detail and, unfortunately, devs need to fill that empty space with something, and it must be readable because if it's not, people will complain. No wonder the graphically spoiled can't even look at indie games with big pixels (even if some of those are excessive in the opposite way).
 

SleepSmasher

Banned
Oct 27, 2017
2,094
Australia
I'd say nowadays, when TAA is well implemented (like in Fortnite), resolution is becoming a less critical aspect. For instance, Fortnite at 2560x1440 on PC with TAA has *great* IQ, absolutely nothing to complain about.
 

Sandersson

Banned
Feb 5, 2018
2,535
Replies like this are brilliant. We had PC gamers going on and on and on for years about resolution, yet as soon as a console comes out that can do or get very close to native 4K, and they can't have the same at 60fps without spending an ungodly amount of cash, suddenly resolution doesn't matter and 1080p or 1440p is 'good enough' lol.

Same thing with the PS4 vs Xbox debates. A few years ago the world was ending if games were 1080p on PS4 vs 900p on XB1, but now it apparently doesn't matter if a game is native 4K on Xbox One X vs 1440p on PS4 Pro (a much larger difference than 900p vs 1080p).

Basically people will justify whatever narrative fits their current console / PC.
But I mean.. it's just a fact though. BF1 at 4K and 30-40 fps vs BF1 at 1440p and 80-100 fps. The difference is so steep that even my grandparents notice it and pick the higher fps.

It has nothing to do with this "console vs PC" narrative you are trying to peddle. It's just that the jump from 480p to 1080p was totally worth the fps sacrifice, while 1440p to native 4K or higher just isn't. Especially when technologies like checkerboarding exist.
 

daninthemix

Member
Nov 2, 2017
5,024
I am not joking. Fired up Shogun 2 this morning at 4K resolution and couldn't believe how jaggy it looked. After applying MSAA x8 it looked better but still.

It's interesting, I recently replayed GTA5 at 4K, and that game is hugely cleaned up with 4x MSAA. That said, I can play plenty of games at 4K without any AA at all and they look great (in many cases I'd rather have no AA than lose tons of detail with blurry post AA / post TAA).
 

Tyaren

Character Artist
Verified
Oct 25, 2017
24,797
I don't think it is overrated at all. Just look at RDR2 on PS4 in 1080p and then on Xbox One X in native 4K. That is quite the jump in visual fidelity and clarity. And it is really just the pure resolution bump, because other than that the versions (except the base Xbox One version, which has some cutbacks, as Digital Foundry noted) have the same visuals, assets and effects. Even the clarity jump from PS4 Pro's 4K checkerboarding to the X's native 4K is quite big.
 

Shepherd

Member
Oct 25, 2017
1,040
Absolutely not. Playing in 4K on a large TV is incredible. The difference is not as big as SD -> HD, but it's enough to wow you a few times.
 

LCGeek

Member
Oct 28, 2017
5,857
It's overrated because we don't get enough DPI to match the density increase these days. I wish I had never seen an SED screen being demonstrated; compared to that, what we get now is junk.
 

ImaginaShawn

Banned
Oct 27, 2017
2,532
4K plus HDR is the baseline for me next gen; framerate matters much less, as long as it is somewhat stable and over 15fps I honestly won't notice a difference.
 

degauss

Banned
Oct 28, 2017
4,631
Nice thread.

I'd be interested in some developers targeting lower resolution games that look more real, or use more advanced shaders and lighting. Maybe with the way GPUs are designed and scale that's not realistic though, and more resolution is an easier engineering task.

I can't even get my other half to notice the difference between HD and SD broadcast material most of the time. Sometimes I think we are "too inside" with this stuff, and also TVs' built-in scalers are really good these days.

I do notice much more when I play something on a monitor with my face right up next to it; the difference between 1080p and 1440p is mushy vs lush. But maybe it has a bad scaler, or just has a lot less wizardry going on than my Bravia, sort of like reference sound monitors vs fun hifi speakers.

That said, I think Last of Us 2 is still going to look incredible on a poorly compressed, postage-stamp-sized video, and some other games are going to look uninteresting no matter what uncompressed 4K bs video/screenshots you post of them.
 
Last edited:

KingdomKey

Member
Oct 27, 2017
1,106
As long as a game keeps 1080p this gen I'm happy. But 4K is definitely better. Still, I'd rather take higher FPS than resolution if I get the choice.
 

FlintSpace

Banned
Oct 28, 2017
2,817
I always believed so.
Haven't experienced true 4K, but I am mostly sure the allure of moving from 4K to anything above would have waaay diminishing returns.
We are going to get stuck at 4K for quite a while, which is such good news.
 

Ikuu

Banned
Oct 27, 2017
2,294
I'm happy with 1440p/144Hz, if I was going to upgrade I'd probably go ultrawide over 4k.
 

MrKlaw

Member
Oct 25, 2017
33,070
A lot of effects - smoke/reflections/shadows - aren't rendered at the same resolution anyway. So I'd prefer less obsession with output resolution and a general focus on image quality

I'd say 1440p with good quality AA/scaling is probably good enough
 

Pargon

Member
Oct 27, 2017
12,025
I find that the overly-aggressive TAA implementations used in many games now seem to blur the line between native and sub-native rendering - and not in a good way.
I appreciate how effective it can be at eliminating aliasing, but even native 4K looks sub-native in most games now, and ghosting can still be a problem.

Most TAA implementations need a minimum of something like 2.25x downsampling to have the sharpness of a native resolution image.
It's an older comparison now, but:
"Native" rendering with TAA often gives me migraines because I am constantly straining my eyes to try and focus the image. I find that I seem to be more susceptible to motion sickness too - though part of that may be caused by the ghosting.
4K without AA is still super jaggy to me.
Well yeah, any resolution benefits from anti-aliasing.
Most of the people I've seen making claims that 4K does not need anti-aliasing are downsampling from 4K to 1080p on a PC with the expectation that this is how native 4K looks.
Even then, 4x downsampling is not nearly enough to eliminate aliasing. In my tests, you need something like 16x downsampling (at 1080p) to truly eliminate aliasing. It's still preferable to be using some form of post-process AA in addition to that - even something basic like FXAA.
As resolution and pixel density gets higher, less anti-aliasing is required, but you still need to use some form of AA. Even those 13" notebooks with 4K displays (330 pixels per inch) still need anti-aliasing.

I think that 8K will likely be the upper limit for display resolution at most if not all sizes, with the only exception being VR - if it survives that long.

All of that said, I'll take framerate over resolution any day, and have even considered downgrading from a 1440p display to a 1080p one because I find that my GTX 1070 isn't getting the framerates that I want (≥90 FPS) at 3440x1440 any more, and sub-native rendering looks really bad. I've practically stopped buying/playing new releases now.
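To put those downsampling factors in concrete terms (a quick sketch; the helper name is just for illustration): the factor refers to the total pixel count, so it's the square root of the factor on each axis: 2.25x downsampling to 1080p means rendering at 1.5x per axis, and 16x means rendering at 8K.

```python
import math

def render_resolution(target_w: int, target_h: int, factor: float) -> tuple[int, int]:
    """Render resolution needed for a factor-times downsample (by pixel count)
    to a target_w x target_h output."""
    per_axis = math.sqrt(factor)  # pixel-count factor -> per-axis scale
    return round(target_w * per_axis), round(target_h * per_axis)

for factor in (2.25, 4.0, 16.0):
    w, h = render_resolution(1920, 1080, factor)
    print(f"{factor}x downsample to 1080p -> render at {w}x{h}")
# 2.25x -> 2880x1620, 4x -> 3840x2160 (4K), 16x -> 7680x4320 (8K)
```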
 
Last edited:
Mar 17, 2018
2,927
I find that the overly-aggressive TAA implementations used in many games now seem to blur the line between native and sub-native rendering - and not in a good way.
I appreciate how effective it can be at eliminating aliasing, but even native 4K looks sub-native in most games now, and ghosting can still be a problem.

Most TAA implementations need a minimum of something like 2.25x downsampling to have the sharpness of a native resolution image.
It's an older comparison now, but:
"Native" rendering with TAA often gives me migraines because I am constantly straining my eyes to try and focus the image. I find that I seem to be more susceptible to motion sickness too - though part of that may be caused by the ghosting.

Well yeah, any resolution benefits from anti-aliasing.
Most of the people I've seen making claims that 4K does not need anti-aliasing are downsampling from 4K to 1080p on a PC with the expectation that this is how native 4K looks.
Even then, 4x downsampling is not nearly enough to eliminate aliasing. In my tests, you need something like 16x downsampling (at 1080p) to truly eliminate aliasing. It's still preferable to be using some form of post-process AA in addition to that - even something basic like FXAA.
As resolution and pixel density gets higher, less anti-aliasing is required, but you still need to use some form of AA. Even those 13" notebooks with 4K displays (330 pixels per inch) still need anti-aliasing.

I think that 8K will likely be the upper limit for display resolution at most if not all sizes, with the only exception being VR - if it survives that long.

All of that said, I'll take framerate over resolution any day, and have even considered downgrading from a 1440p display to a 1080p one because I find that my GTX 1070 isn't getting the framerates that I want (≥90 FPS) at 3440x1440 any more, and sub-native rendering looks really bad. I've practically stopped buying/playing new releases now.

If it survives that long lol. It will. 4K native usually only needs some decent SMAA for me. While there are some jaggies left it's not that bad, but no AA is not acceptable to me.
 
Oct 31, 2017
8,466
It's definitely an area subject to diminishing returns at this point.
Of course any higher resolution is always going to look prettier than a lower one, but the price in hardware horsepower required (and consequently in performance) makes it a lesser priority in my personal opinion.

I'm currently on a 1440p G-Sync monitor and I have absolutely no desire to go above that, while on the other hand there are never too many frames per second as far as I'm concerned.
In fact I also have a 4K TV (Sony Bravia) and I definitely prefer gaming on my monitor, between the two.

Also, subjectively speaking I don't feel like noticing a pixel is the worst thing ever, while on the other hand low or choppy framerate is garbage.
 

Cliff Steele

Banned
Oct 28, 2017
4,477
I have a 4k TV and mostly just play Switch games on it. And I don't care.

I'm also going to sell my Pro for a Slim on Black Friday.