The irony of the poll is that the guys who care about hi-res graphics are only getting 300 lines of motion resolution when moving, reducing those nice pixels to a blurry mess! Sure, some TVs have BFI, but even at 1080p 120/144Hz you're actually getting a nicer picture than 4K most of the time.
Sorry, this just isn't true at all…
You're only getting 150 lines of motion resolution if the game is running at 30 FPS.
Motion resolution is directly linked to frame rate on sample-and-hold displays, and the test pattern used runs at 60 FPS. That's why it doubles to ~600 lines when you enable 120 FPS interpolation. So it's halved if the content is only 30 FPS.
And the test pattern isn't even moving particularly fast. It's slower than the camera movement in any game - so the resolution in a game would actually be lower.
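The scaling described above can be sketched as a quick back-of-the-envelope calculation. This is a hypothetical illustration, assuming motion resolution scales linearly with frame rate on a sample-and-hold display, with the thread's ~300-line measurement at the test pattern's 60 FPS as the baseline:

```python
# Assumption from the thread: on a sample-and-hold display, perceived
# motion resolution scales roughly linearly with frame rate, and the
# test pattern measures ~300 lines at 60 FPS.
BASELINE_LINES = 300
BASELINE_FPS = 60

def motion_resolution(fps: float) -> float:
    """Estimate motion resolution (in lines) at a given frame rate."""
    return BASELINE_LINES * fps / BASELINE_FPS

for fps in (30, 60, 120):
    print(f"{fps:>3} FPS -> ~{motion_resolution(fps):.0f} lines")
# 30 FPS -> ~150 lines, 60 FPS -> ~300 lines, 120 FPS -> ~600 lines
```

Those three outputs line up with the numbers in the thread: 150 lines at 30 FPS, 300 at 60, and ~600 with 120 FPS interpolation enabled.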
Once you hit 60, the importance of frames goes down and the importance of graphics goes up.
That's only if you have a 60Hz display.
With a 120Hz display - especially one which supports variable refresh rates - 60 is the new 30. A relic of the past.
We finally have official support for 120Hz in displays with HDMI 2.1, and some people are still pushing for 30 FPS.
That's a PC resolution, though. Not really interested in non-integer scaling. 1080p looks way better on a good 4K display than on a 1080p display of identical size because of tighter subpixel density and integer scaling (unless your display and settings suck and you aren't getting integer scaling).
Hardly any televisions support integer scaling, and it took five years of myself and others campaigning for it to convince NVIDIA to add the feature to their drivers. And they really only added it because Intel said it would be a feature in their upcoming GPUs.
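For anyone unclear on why 1080p is singled out: integer scaling only works when the target resolution is an exact whole-number multiple of the source, so each source pixel maps to a clean NxN block with no interpolation blur. A small sketch (the resolution list is just illustrative) checking which common resolutions scale to 4K UHD that way:

```python
# Check whether a source resolution fits 3840x2160 by an exact
# integer factor. Non-integer factors (e.g. 1.5x for 1440p) force
# interpolation, which is what integer scaling avoids.
TARGET = (3840, 2160)

def integer_factor(w: int, h: int):
    """Return the integer scale factor into TARGET, or None if inexact."""
    fw, fh = TARGET[0] / w, TARGET[1] / h
    if fw == fh and fw.is_integer():
        return int(fw)
    return None

for res in [(1920, 1080), (1280, 720), (2560, 1440), (960, 540)]:
    print(res, "->", integer_factor(*res))
# (1920, 1080) -> 2, (1280, 720) -> 3, (2560, 1440) -> None, (960, 540) -> 4
```

Note that 1440p is the odd one out: it needs a 1.5x factor into 4K, which is why it gets interpolated rather than integer-scaled.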
Integer scaling works great for 2D games, but results are going to be mixed with 3D games. Most of the time, the extra ~75% resolution from rendering at 1440p instead of 1080p is going to look better.
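The "~75%" figure above is easy to verify from the pixel counts; a one-off arithmetic check:

```python
# Pixel counts: 1080p (1920x1080) vs 1440p (2560x1440).
px_1080 = 1920 * 1080   # 2,073,600 pixels
px_1440 = 2560 * 1440   # 3,686,400 pixels
increase = px_1440 / px_1080 - 1
print(f"1440p has {increase:.0%} more pixels than 1080p")
# 1440p has 78% more pixels than 1080p
```

So the exact figure is ~78% more pixels, which is in line with the "~75%" estimate in the comment.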
With the advances in hardware and the modern TAA and temporal upsampling techniques used in 3D games, it could be argued that there's far less need for integer scaling than there was five years ago.
I just tried playing some Shadow of the Tomb Raider, and with its internal resolution scale set to 35% with content-adaptive sharpening enabled, it really doesn't look like you'd expect for a "720p" image.
But as I said earlier: the comparison should have been 4K30 vs 1080p120, not 1080p60 vs 720p120.
You simply can't get good TVs in small sizes. OLEDs start at 55 inches, and so on.
Plus once you're gaming on a 55–65 inch screen, it's kind of shitty downgrading to like half that size. The difference in picture quality and screen size >>>>>>>> the higher refresh rate for almost all games. I only use my 1440p 144Hz LCD for CS:GO. I'm not seeing how 60 more Hz would make RE2 any better when I'd have to play in crummy SDR on a tiny screen.
LG's 2019 OLEDs support 120Hz VRR, with HDR enabled. You don't need a high-end monitor for that any more.
Being at desk distance is pretty bad for a huge TV, for games I mean. Big ultrawides and big displays are good for workstations and work desks, with multiple tickers and spreadsheets going on, for sure. For games, that real estate just has your eyes moving all over the place, and all the curve in the world won't save you from having to move your neck a bit.
The point of an ultrawide display is that it fills your peripheral vision, instead of being a small window in the center of your vision.
This is far more immersive. I also find that it can help with motion sickness, but there are so many factors for that, that I can see it also being worse for some people.