
Can you reliably tell the difference between 1440p and 4K?

  • I cannot reliably tell the difference: 517 votes (37.4%)
  • I can reliably tell the difference: 866 votes (62.6%)
  • Total voters: 1,383

Tora

The Enlightened Wise Ones
Member
Jun 17, 2018
8,640
Try and play The Witcher at 1440p vs 4K.

One is very sharp, the other is crispy as hell; both look good, but you can tell the difference.
 

noyram23

Banned
Oct 25, 2017
9,372
That would be because of Dolby Vision/HDR probably. BBC documentaries are usually reference quality in the first place. Their Planet Earth doco was the default demo disc for everyone back when BD and HD DVD first came out.



It's mainly the dark scenes with tons of blacks that cause issues at lower bitrates; bright scenes are usually fine.
Yeah, even my B6 has some banding in dark scenes, but the C9 actually doesn't have that problem. This TV is superb.
 

Nooblet

Member
Oct 25, 2017
13,637
Wait, is this true? I swear the Planet Earth/Our Planet 4K stuff looks miles better than even my UHD Blu-ray movies. I don't get it.
Of course.
The bitrate of a 1080p Blu-ray is 35-40 Mb/s; for Netflix 4K it's around 16 Mb/s. The picture will be crisper on Netflix due to the higher resolution, but not better, and it's going to be considerably fuzzier during any motion: not only is it pushing less than half the bitrate, it's using it to cover 4x the resolution plus HDR data (I don't know how much that takes), so the quality drops even further. It's using less information to push a lot more pixels.

This is why I've said before that it's pointless to be a resolution purist and dismiss options like CBR due to them not being "real 4K". The fact of the matter is there are different types of 4K content and they are not all the same. It's the quality of the pixel that matters, not just the quantity.
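To put rough numbers on that (a back-of-the-envelope sketch; the bitrates are the ballpark figures above, and 24 fps film content is an assumption):

```python
# Rough bits-per-pixel budget: 1080p Blu-ray vs. Netflix 4K.
# Bitrates are the ballpark figures quoted above; 24 fps is an assumption.

def bits_per_pixel(bitrate_mbps: float, width: int, height: int, fps: float) -> float:
    """Average encoded bits available per pixel per frame."""
    return bitrate_mbps * 1_000_000 / (width * height * fps)

print(f"1080p Blu-ray: {bits_per_pixel(37.5, 1920, 1080, 24):.2f} bits/pixel")  # ~0.75
print(f"Netflix 4K:    {bits_per_pixel(16.0, 3840, 2160, 24):.2f} bits/pixel")  # ~0.08
```

So the 4K stream has roughly a tenth of the bits per pixel to work with, which is where the fuzziness in motion comes from (this ignores codec efficiency, which the reply further down gets into).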
 

345

Member
Oct 30, 2017
7,386
i can tell the difference when i'm looking for it (55-inch OLED, normal-sized living room) but it's super minor for most games when they're in motion.

it's great for something like forza or wipeout where you're spending a long time looking at the same detailed object (and where there's no FPS tradeoff) but i use the performance modes for almost anything else when available.

this is why i think lockhart is actually a good idea. it shouldn't compromise the scope of games and should turn in good results for anyone who doesn't care much about "native 4K".
 

alexbull_uk

Prophet of Truth
Member
Oct 27, 2017
1,923
UK
Definitely disagree, the difference is immediately noticeable between 1440p and native 4K.

That's not to say that 1440p doesn't look great though.
 

RingRang

Alt account banned
Banned
Oct 2, 2019
2,442
I can tell the difference between 1440p and native 4K, but with each step you take up the resolution ladder the differences become less obvious.

I sit about 1.50-1.70 m (depending on my head's position) away from a 55" Samsung TV and I couldn't tell the difference if my life depended on it. Neither could anybody who's been at my place. 4K is a complete waste of resources as far as I'm concerned and I refuse to believe that anybody could reliably tell those two resolutions apart. I feel like the 1080p → 1440p jump is substantial: 1080p looks washed out, whereas 1440p removes all the softness of the image, but anything beyond that, whether 1800p or 4K, requires a split-screen comparison to show the benefits. And I'm not even sure if that helps. Can you tell 1440p and 4K apart?
Saying something looks "washed out" is usually a way of saying something is lacking color. That would have nothing to do with the resolution.
 

bionic77

Member
Oct 25, 2017
30,894
i can tell the difference when i'm looking for it (55-inch OLED, normal-sized living room) but it's super minor for most games when they're in motion.

it's great for something like forza or wipeout where you're spending a long time looking at the same detailed object (and where there's no FPS tradeoff) but i use the performance modes for almost anything else when available.

this is why i think lockhart is actually a good idea. it shouldn't compromise the scope of games and should turn in good results for anyone who doesn't care much about "native 4K".
This is exactly how it is for me.

It is a noticeable difference, but it's really hard for me to latch on to tiny graphical details once the game starts moving. I definitely notice all of this if I'm watching someone else play a game like Forza. But if I start playing Forza, I eventually get into the game and focus on doing well, and at a certain point I probably wouldn't notice whether the resolution was 1080p or 4K if the action was fast enough. Framerates are always obvious, though.
 

Skyebaron

Banned
Oct 28, 2017
4,416
1440p with some good sharpening, like the driver-level options, can do wonders if your card can't do consistent fps at 4K.

 

Mercador

Member
Nov 18, 2017
2,840
Quebec City
I can see the difference. However, I wonder if I should stay at 1080p@60 when my card struggles to do 2160p@30, or whether 1440p@~45 is fine. It feels like 1440 is not a clean multiple of 2160, so I wonder how the screen manages that. If any expert could explain it to me, that would be great!
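For what it's worth, the scale factors work out like this (a quick sketch; when the factor isn't an integer, the TV or GPU scaler has to interpolate rather than map pixels to clean blocks):

```python
# Scale factors for common render resolutions on a 2160p (4K) panel.
# A factor of 2.0 means each source pixel can map to a clean 2x2 block;
# 1.5 means no 1:1 mapping exists, so the scaler interpolates (e.g. bilinearly).

PANEL_HEIGHT = 2160

for render_height in (1080, 1440, 1800):
    factor = PANEL_HEIGHT / render_height
    kind = "integer" if factor.is_integer() else "non-integer, interpolation required"
    print(f"{render_height}p -> 2160p: x{factor:.2f} ({kind})")
```

In practice most TV scalers interpolate even at integer factors (a later post touches on this), so 1440p usually just looks slightly softer rather than broken.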
 

jandg

Banned
Dec 23, 2019
141
Of course.
The bitrate of a 1080p Blu-ray is 35-40 Mb/s; for Netflix 4K it's around 16 Mb/s. The picture will be crisper on Netflix due to the higher resolution, but not better, and it's going to be considerably fuzzier during any motion: not only is it pushing less than half the bitrate, it's using it to cover 4x the resolution plus HDR data (I don't know how much that takes), so the quality drops even further. It's using less information to push a lot more pixels.

This is why I've said before that it's pointless to be a resolution purist and dismiss options like CBR due to them not being "real 4K". The fact of the matter is there are different types of 4K content and they are not all the same. It's the quality of the pixel that matters, not just the quantity.
Bitrate doesn't tell the full story tho, it's also about compression. I don't know what type of compression Netflix uses for streaming, I'd assume it's something proprietary, but it's really a key component of the service's success.
 

tommyv2

Member
Nov 6, 2017
1,425
Non-native resolution makes this conversation moot. There is no such thing as a 1440p TV, so 1080p to 2160p is integer scaling and will at least seem sharper.
 

Akita One

Member
Oct 30, 2017
4,628
But it's not enough to just have a 4K TV, you actually need 4K content as well, which is still rare.

But yeah, there is a big difference between a game that runs at 4K/60 vs. 1440p/60 at the same settings.

Non-native resolution makes this conversation moot. There is no such thing as a 1440p TV, so 1080p to 2160p is integer scaling and will at least seem sharper.

You beat me to it lol
 

exodus

Member
Oct 25, 2017
9,951
Did a test with 2 friends by playing Star Wars Jedi: Fallen Order and changing resolutions. 1440p and 4K on my 65" Sony are almost impossible to tell apart from a regular viewing distance. The second the game is moving, it's impossible; pure guesswork.

Anyone claiming otherwise is bullshitting or their TV has a crappy upscaler.

It depends on the game. Fallen Order has very, very good image quality even at 1080p.

Some games benefit more from higher resolution than others. It's mostly down to the geometry and nowadays the TAA implementation. Red Dead Redemption 2 comes to mind, as even 1440p can look a bit blurry.
 

Dark1x

Digital Foundry
Verified
Oct 26, 2017
3,530
i play on a 27" 4k PC monitor and can see the difference easily.
This first post is exactly why resolution matters when playing at a desk versus a living room setup. I agree with the OP. On my 65" screen, the difference is minor from a normal viewing distance. With monitors, you sit much closer so it's far more obvious.

Either way, resolution only matters due to our reliance on fixed pixel displays. None of this would be an issue with a CRT.
 

exodus

Member
Oct 25, 2017
9,951
Non-native resolution makes this conversation moot. There is no such thing as a 1440 TV, so 1080 to 2160 is integer scaling and will seem sharper, at least.

1) Most people aren't even integer scaling their 1080p content to their 2160p displays.
2) Most console games aren't even rendering native resolution.
 

exodus

Member
Oct 25, 2017
9,951
True, but the artifacts/compression of 4K streaming are not noticeable when you're sitting at a proper distance (I sit around 6m from a 65" LG C9), and it's worth it for the HDR and other advantages introduced by UHD. It's probably due to the picture quality of the set too, which is why I highly recommend getting an OLED TV unless you want to watch in a sunlit room.
6m from a 65" screen is hardly a proper viewing distance. There's no way you can even resolve 1080p at that distance, let alone 4K.

By the way, if you're on PC there's also 2880x1620 (1620p), an in-between resolution I rarely hear mentioned, which scales nicely and is nowhere near the cost of 4K either.

I'm still on a 1080p display. If a game supports supersampling, 1620p is typically my go-to.
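The pixel-count math backs that up (a quick sketch, assuming GPU cost scales roughly with pixels shaded):

```python
# Relative shading cost of common render resolutions vs. 1080p,
# assuming cost scales roughly linearly with the number of pixels shaded.

base = 1920 * 1080

for name, (w, h) in {
    "1440p": (2560, 1440),
    "1620p": (2880, 1620),
    "1800p": (3200, 1800),
    "2160p": (3840, 2160),
}.items():
    print(f"{name}: {w * h / base:.2f}x the pixels of 1080p")
```

1620p is 2.25x the pixels of 1080p versus 4x for full 4K, and since it's exactly 1.5x 1080p on each axis it scales relatively cleanly too.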
 

eebster

Banned
Nov 2, 2017
1,596
I can tell a difference between 1440p and 4k sitting 8 feet away on a 65 inch TV. 1440p definitely has a softer look. I can't tell the difference between 1800p and 4k though so I typically set all my PC games to 1800p and crank up the effects.

And I hope next gen developers will do that as well. Go for 1800p and maximise effects instead of sacrificing visual quality to achieve 4k
 

jandg

Banned
Dec 23, 2019
141
This first post is exactly why resolution matters when playing at a desk versus a living room setup. I agree with the OP. On my 65" screen, the difference is minor from a normal viewing distance. With monitors, you sit much closer so it's far more obvious.

Either way, resolution only matters due to our reliance on fixed pixel displays. None of this would be an issue with a CRT.
Not for me tho, 65" @ normal viewing distance. And I honestly really could not deal with the jaggies in Sekiro, it made that much of a difference.
 
Jan 21, 2019
2,902
This first post is exactly why resolution matters when playing at a desk versus a living room setup. I agree with the OP. On my 65" screen, the difference is minor from a normal viewing distance. With monitors, you sit much closer so it's far more obvious.

Either way, resolution only matters due to our reliance on fixed pixel displays. None of this would be an issue with a CRT.

Hey John, love your work.

Quick question: Is there future TV tech that excites you?
 

Dark1x

Digital Foundry
Verified
Oct 26, 2017
3,530
Not for me tho, 65" @ normal viewing distance. And I honestly really could not deal with the jaggies in Sekiro, it made that much of a difference.
Well...I guess it also depends on the AA. Sekiro doesn't have great image quality so it's more noticeable. With a strong TAA, it's much less obvious.


Hey John, love your work.

Quick question: Is there future TV tech that excites you?
Not really. Nobody seems to be working on motion resolution. Until modern displays can compare with CRTs in this area, I remain disappointed.
 

noyram23

Banned
Oct 25, 2017
9,372
6m from a 65" screen is hardly a proper viewing distance. There's no way you can even resolve 1080p at that distance, let alone 4K.



I'm still on a 1080p display. If a game supports supersampling, 1620p is typically my go-to.
That's pretty much the distance recommended by my calibrator, although it could be more, since I'm not sure if he said 6m lol
 

Fastidioso

Banned
Nov 3, 2017
3,101
I notice the difference with native 4K even downsampled on a 1080p TV. Below 4K it's quite noticeable.
 

Csr

Member
Nov 6, 2017
2,032
What content did you compare? For various reasons, some games look remarkably good even at lower resolutions.
Also, how long have you been viewing 4K content? I feel there's an adjustment period; in my experience some people have trouble noticing higher resolutions and framerates if they're not used to them.
 

OmniStrife

Member
Dec 11, 2017
1,778
"The difference between 640 x 480 and 800 x 600 is virtually indiscernible even when sitting close to a 25" screen..." Someone on a forum 20 years ago, probably.
 

noyram23

Banned
Oct 25, 2017
9,372
Ooooow, OK, that makes a lot of sense.

I sit about 2m from my 65" OLED, and about 4m from my 135" projector screen :D. 6m for a 65" is HUGE.
The calibrator pretty much set up and recommended everything, and it's probably the best choice I've made when it comes to my entertainment system. I highly recommend getting one.
 

IcyInferno

Member
Oct 26, 2017
373
With my 4K 27-inch monitor, sitting a few feet away, I can see the difference in some games running at 1440p vs 4K, but I'm not sure if that could be caused by interpolation artifacts from running at a non-native resolution, since it isn't a 1:1 upscale. I'd need a monitor of the same size with a native 1440p resolution to really compare. Some games seem to look fine at 1440p though, so maybe those just don't have as high-resolution textures.
 

BloodshotX

Member
Jan 25, 2018
1,596
I don't get it. Are you playing a game at 4K on a 4K screen and then playing the same game at 1440p on the same screen and not seeing a difference? Because I can do that on my 27 inch monitor and see a clear difference.
I think he is, but you're forgetting one thing: with monitors you tend to sit close. This person uses a 4K TV and sits at a "semi" normal viewing distance.

Sure, I can see the difference myself, but it's really subtle (vegetation looks sharper, maybe more detail on the character itself, etc.). It's not like going from 480p to a 720p/1080p screen 10-15 years ago.
 

Pargon

Member
Oct 27, 2017
12,023
Either way, resolution only matters due to our reliance on fixed pixel displays. None of this would be an issue with a CRT.
I'm not so sure about that.
PC gamers have almost universally rejected the gaussian-filtered scaling that NVIDIA uses with their dynamic super resolution feature (DSR).
Nearly any time I see a discussion involving DSR there are people saying that the filter should be disabled.
And you often have people saying that DSR looks bad if you are using non-integer scales. Hmm, I wonder why that could be…

What's the difference between that and a CRT?
The CRT has no "native" resolution that bypasses the gaussian filter the way a fixed-pixel display does.
On the CRT, all resolutions are essentially gaussian-filtered.

This is similar to Apple's devices.
A large number of Apple's devices now default to using a non-native resolution.
But few people complain about that being blurry, because a filtered output is the default.

Fixed-pixel devices are not actually displaying the signal correctly when you have 1:1 mapping.
Proper sampling means that you should have a display resolution which is at least double the source resolution - or a source that is appropriately low-pass filtered.
But people are used to the extra sharpness that 1:1 pixel mapping provides.

This is why strong TAA implementations make it far more difficult to discern the native resolution of a source, and differences like 1600p vs 1800p are far less noticeable or distracting.
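To make the sampling point concrete, here's a toy 1D version (a sketch using numpy/scipy, not anything from a real renderer): detail above the Nyquist limit of the output either gets low-pass filtered away, or folds back as a false lower frequency, which is the shimmer you see on unfiltered subpixel detail.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# A 1D "scene" with detail near the limit of the source resolution
# (a stand-in for fine geometry like foliage or power lines).
x = np.linspace(0, 1, 4096, endpoint=False)
scene = np.sin(2 * np.pi * 900 * x)  # 900 cycles across the frame

naive = scene[::4]                                 # downsample with no filter
filtered = gaussian_filter(scene, sigma=3.0)[::4]  # low-pass first, then downsample

# 1024 output samples can only represent up to 512 cycles, so the naive
# version aliases: 900 cycles folds back to appear as 1024 - 900 = 124.
print(np.abs(np.fft.rfft(naive)).argmax())  # -> 124, a false frequency
print(np.round(filtered.std(), 3))          # -> ~0: unrepresentable detail removed
```

The filtered result is softer (the detail is simply gone), but it's stable; the naive one keeps full contrast at a frequency that was never in the scene, which in motion reads as crawl and shimmer.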

Not really. Nobody seems to be working on motion resolution. Until modern displays can compare with CRTs in this area, I remain disappointed.
LG had a "Crystal Motion" OLED prototype with 3.5ms MPRT at CES 2019 - so it's being worked on.
The main thing which needs to happen is that they need to reduce the MPRT via panel driving techniques rather than black frame insertion. That will give them much better control over persistence and image brightness.

That's pretty much the distance recommended by my calibrator, although it could be more, since I'm not sure if he said 6m lol
Someone confused 6 feet with 6 meters.
At 6m you should be using a projector and at least a 128" screen. That would just meet THX's minimum recommended image size for 1080p content. Ideally it would be 200" or larger.
65" at 6m is postage-stamp sized. It's smaller than holding an iPad at arms length.
 

Tarot Deck

Avenger
Oct 27, 2017
4,233
I can notice higher frame rate easily, but I can barely notice anything over 2K.

And people talk about 8K...
 

tommyv2

Member
Nov 6, 2017
1,425
Someone confused 6 feet with 6 meters.
At 6m you should be using a projector and at least a 128" screen. That would just meet THX's minimum recommended image size for 1080p content. Ideally it would be 200" or larger.
65" at 6m is postage-stamp sized. It's smaller than holding an iPad at arms length.

Exactly this. People are sitting too far from their TVs and using TVs that are too small and yet preaching about pixel counts.
 
Nov 14, 2017
4,928
I don't know why 1080p would make your picture washed out. The picture should be identical at all resolutions, with just an increase in sharpness and detail.

I had my PC hooked up to my 4K OLED for a long time and would decide on a case-by-case basis whether I wanted 1080p, 1440p60, or 2160p30. Sometimes 1080p would look better than 1440p because 1080p scales neatly up to 4K, but sometimes 1440p with good AA (like TAA) would also work.

Honestly, if you can't see the difference while sitting at an appropriate distance, you might need glasses.
 

datamage

Member
Oct 25, 2017
913
Not really. Nobody seems to be working on motion resolution. Until modern displays can compare with CRTs in this area, I remain disappointed.
I gotta say, on the C9, things look pretty smooth @ 120Hz.

As for the topic at hand, 4K looks much more crisp than 1440p, unless you are a good distance away. I'm only several feet from a 55" so that difference is apparent.
 

Tapiozona

Avenger
Oct 28, 2017
2,253
So when does the resolution difference stop being noticeable? I know it's a meme, but at some point the human eye's ability to detect it actually does become a thing, especially with distance from the screen taken into account.

Same question for frames: I personally can't tell much above 60, but that's just me. What about over 120? Can people tell the difference above 120? It seems like the number of frames flickering per second would become undetectable at some point.
 

Lashley

<<Tag Here>>
Member
Oct 25, 2017
59,997
I can't really tell. Even if I could I'd choose 1440p though. I notice FPS more than pixels.