The difference between 1440p and 2160p is quite noticeable in my opinion - especially if the game's output is sharp rather than using a soft TAA implementation.
But it's absolutely not worth the performance trade-off. I'd much rather play games at 1440p with roughly 2.25x the GPU performance headroom than render 2160p natively. It's nothing like a 2x visual improvement.
I've noticed that resolution differences get hyped up here, while performance differences are often downplayed.
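For anyone wondering where the 2.25x comes from, it's just the pixel-count ratio between the two resolutions. A quick sketch (assuming render cost scales roughly linearly with pixel count, which real games only approximate):

```python
# Where the 2.25x figure comes from: pixels per frame at each resolution.
pixels_1440p = 2560 * 1440   # 3,686,400
pixels_2160p = 3840 * 2160   # 8,294,400

# Assuming roughly linear cost per pixel (a simplification), dropping from
# 2160p to 1440p frees about this much GPU work per frame:
print(pixels_2160p / pixels_1440p)  # 2.25
```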
In my opinion, the jump from 60 to 90 FPS is as big as the jump from 30 to 60. It's a very significant difference - you don't even have to go to 120.
Yet here, lots of people act like 60 is barely any different to 30, and 120 or 240 FPS is something few people can see.
That's completely false. Even non-gamers are going to notice frame rate more than resolution.
I'm sure many people have experienced this: parents watching the SD channels because they're "more convenient" (the channel 101 they're used to rather than its HD version on 151, etc.) and because they "can't see the difference anyway".
Meanwhile, features like motion interpolation are popular with the general public because they make action much smoother and clearer - even if movie nerds dislike them.
I strongly disagree with that. Checkerboard rendering is far worse than native 1440p as soon as the image starts to move.
But you won't see why that is on consoles, because you can't disable the motion blur they use to cover up the resolution loss.
Most people sit closer to their monitors, relative to screen size, than they do to their televisions, so 1440p vs 4K is likely to be more noticeable there.
In fact, most of the "home theater" setups I've seen with giant TVs (75″ and up) have them placed in equally giant rooms, where they end up looking smaller than a 55″ or smaller TV in a normal living room, or a 27″ monitor at a desk.
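The geometry backs that up: perceived size is really angular size, which depends on screen width divided by viewing distance. A rough sketch with hypothetical but plausible distances (10 ft for the big-room TV, 2 ft for the desk monitor, both my assumptions):

```python
import math

def angular_width_deg(diagonal_in, distance_in, aspect=(16, 9)):
    """Horizontal angular size of a screen, in degrees (simple flat-screen geometry)."""
    width = diagonal_in * aspect[0] / math.hypot(*aspect)
    return math.degrees(2 * math.atan((width / 2) / distance_in))

# Assumed viewing distances, in inches: 75" TV at ~10 ft vs 27" monitor at ~2 ft.
print(round(angular_width_deg(75, 120), 1))  # ~30.5 degrees of your field of view
print(round(angular_width_deg(27, 24), 1))   # ~52.2 degrees
```

On those assumed distances, the desk monitor actually fills more of your field of view than the 75″ TV across the room.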
That's a 75″ LCD. Doesn't look it from that distance though, does it?
...