I'm with you, there's a certain clarity that comes with 4K that 1080p can't hit.
Yes, it's overrated. Many PC gamers who could play at 4K choose not to, because they'd rather have higher FPS, or enable features they'd otherwise have to turn off to push that power towards resolution. There have been plenty of instances on Xbox where they brag about 4K, yet the game looks better on PC at a lower resolution because more options are turned on.
It's marketing.
Are you me? Resolution is everything, and dictates what version of each game I buy. I don't factor in DLC, I factor in resolution. I own literally every system, and resolution is all that matters.
fixed for scientific accuracy
Unless you have a checklist at hand, that is completely false about the X. RDR2 is native 4K, ffs.

That's why most devs aren't even attempting native 4K on the Pro and X; they usually just do a checkerboard solution or render at 1440p.
Phones are good evidence that resolution wars have an endpoint when they reach a pixel density where people no longer care. We're not really even at the point where 4K is standard yet, so we have a bit to go still, but eventually it ceases to be a selling point.

No, we wait until the first console surpasses 4K, then it matters again
Agreed.
I lol'd
Console wars are a whole different thing, man.
But I mean... it's just a fact though. BF1 at 4K and 30-40 FPS vs BF1 at 1440p and 80-100 FPS. The difference is so steep that even my grandparents notice it and pick the higher FPS.

Replies like this are brilliant. We had PC gamers going on and on for years about resolution, and as soon as a console comes out that can do (or get very close to) native 4K, and they can't have the same at 60 FPS without spending an ungodly amount of cash, suddenly resolution doesn't matter and 1080p or 1440p is 'good enough' lol.
Same thing with the PS4 vs Xbox debates. A few years ago the world was ending if games were 1080p on PS4 vs 900p on XB1, but now it apparently doesn't matter if a game is native 4K on Xbox One X vs 1440p on PS4 Pro (a much larger difference than 900p vs 1080p: 4K has 2.25x the pixels of 1440p, whereas 1080p has only 1.44x the pixels of 900p).
Basically people will justify whatever narrative fits their current console / PC.
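For what it's worth, the pixel-count gaps in those comparisons are easy to check; here's a quick sketch (Python, just for the arithmetic):

```python
# Pixel counts for the resolutions mentioned above.
resolutions = {
    "900p":  (1600, 900),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def pixels(name):
    w, h = resolutions[name]
    return w * h

print(pixels("1080p") / pixels("900p"))  # 1.44 - the "world ending" gap
print(pixels("4K") / pixels("1440p"))    # 2.25 - the gap nobody minds now
```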
I am not joking. Fired up Shogun 2 this morning at 4K resolution and couldn't believe how jaggy it looked. After applying MSAA x8 it looked better but still.
Well yeah, any resolution benefits from anti-aliasing.
I find that the overly aggressive TAA implementations used in many games now seem to blur the line between native and sub-native rendering - and not in a good way.
I appreciate how effective it can be at eliminating aliasing, but even native 4K looks sub-native in most games now, and ghosting can still be a problem.
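For anyone wondering where the softness and ghosting come from: at its core, TAA is an exponential blend of the current (jittered) frame with a reprojected history buffer. A toy sketch - the blend weight is illustrative, not taken from any particular engine:

```python
def taa_resolve(history, current, alpha=0.9):
    # history: last frame's accumulated color, reprojected to this frame
    # current: this frame's jittered sample
    # alpha:   history weight. A high alpha suppresses aliasing well, but it
    #          also softens the image, and stale history lingers as ghosting
    #          wherever reprojection or history rejection fails.
    return alpha * history + (1.0 - alpha) * current
```

Real implementations add neighborhood clamping and the like, but the sharpness/ghosting trade-off lives in that one blend.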
Most TAA implementations need a minimum of something like 2.25x downsampling to have the sharpness of a native resolution image.
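To put numbers on that 2.25x figure - it's a total pixel ratio, so 1.5x per axis:

```python
import math

def render_resolution(out_w, out_h, pixel_factor):
    # pixel_factor is the total supersample ratio in pixels,
    # so each axis scales by sqrt(pixel_factor).
    s = math.sqrt(pixel_factor)
    return round(out_w * s), round(out_h * s)

print(render_resolution(1920, 1080, 2.25))  # (2880, 1620) for a 1080p display
print(render_resolution(3840, 2160, 2.25))  # (5760, 3240) for a 4K display
```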
"Native" rendering with TAA often gives me migraines because I am constantly straining my eyes to try and focus the image. I find that I seem to be more susceptible to motion sickness too - though part of that may be caused by the ghosting.
Most of the people I've seen claiming that 4K does not need anti-aliasing are downsampling from 4K to 1080p on a PC and assuming that's how native 4K looks.
Even then, 4x downsampling is not nearly enough to eliminate aliasing. In my tests, you need something like 16x downsampling (at 1080p) to truly eliminate aliasing. It's still preferable to be using some form of post-process AA in addition to that - even something basic like FXAA.
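To make the factors concrete: naive downsampling averages blocks of pixels, so 4x (4K to 1080p) gives each output pixel only four samples, while 16x (8K to 1080p) gives sixteen. A rough sketch of the idea - actual driver downsampling (DSR/VSR) uses better filtering:

```python
import numpy as np

def box_downsample(img, n):
    # Average n x n pixel blocks: 4K -> 1080p is n = 2 (4x the pixels),
    # 8K -> 1080p is n = 4 (16x). Each output pixel is the mean of n*n
    # samples, which is why small factors still leave visible aliasing.
    h = img.shape[0] - img.shape[0] % n
    w = img.shape[1] - img.shape[1] % n
    return img[:h, :w].reshape(h // n, n, w // n, n, -1).mean(axis=(1, 3))
```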
As resolution and pixel density get higher, less anti-aliasing is required, but you still need to use some form of AA. Even those 13" notebooks with 4K displays (330 pixels per inch) still need anti-aliasing.
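That 330 figure is just the diagonal pixel count over the diagonal size in inches - assuming a 13.3" panel, which is what most '13-inch' notebooks actually use:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    # Pixel density: pixels along the diagonal / inches along the diagonal.
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(3840, 2160, 13.3)))  # ~331 PPI for a 13.3" 4K display
```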
I think that 8K will likely be the upper limit for display resolution at most if not all sizes, with the only exception being VR - if it survives that long.
All of that said, I'll take framerate over resolution any day, and have even considered downgrading from a 1440p display to a 1080p one because I find that my GTX 1070 isn't getting the framerates that I want (≥90 FPS) at 3440x1440 any more, and sub-native rendering looks really bad. I've practically stopped buying/playing new releases now.