
Lego

Member
Nov 14, 2017
2,100
Then according to the internet you are in luck playing any game in 1080p 120Hz, which I think will be possible with Ori?
Interesting! I wonder if the resolution dip will be worth it. Also, from a quick Google search, it would no longer be in HDR, so I'll have to weigh that up.

But thanks, I didn't actually know this!
 

Senteevs

Member
Oct 28, 2017
449
Latvia
I'm also in the X900E club. Probably won't upgrade for some more years since I can't justify the price of OLED for 120Hz alone.
 
I'm rocking a 75-inch Q80T and I'm happy with it. Almost went with a 65-inch C9, but decided against it due to burn-in risk; my living room is fairly bright and I wanted a larger screen.
 

Serious Sam

Banned
Oct 27, 2017
4,354
This thread is yikes. Everyone blindly yapping "OLED OLED" over and over again, as if this is the only TV in town and as if everyone has thousands to spend on this expensive tech.

OP, go with 65" Q70R. Not only will you have money left, but bigger screen size will be far more impactful than 55" OLED.

Also, OLED's black levels aren't that important for gaming, because on average games are far brighter than movies. I would argue that QLED's peak brightness is far more important for gaming than OLED's black levels, especially if you aren't gaming in a pitch-black room setting.
 

Ferrs

Avenger
Oct 26, 2017
18,829
This thread is yikes. Everyone blindly yapping "OLED OLED" over and over again, as if this is the only TV in town and as if everyone has thousands to spend on this expensive tech.

OP, go with 65" Q70R. Not only will you have money left, but bigger screen size will be far more impactful than 55" OLED.

Also, OLED's black levels aren't that important for gaming, because on average games are far brighter than movies. I would argue that QLED's peak brightness is far more important for gaming than OLED's black levels, especially if you aren't gaming in a pitch-black room setting.

wut? A lot of people are recommending the X900H in here, not OLEDs.
 
Oct 28, 2017
1,540
Think it's between a Sony X900H and a Samsung QLED for me; I don't want to spend a lot of money on an OLED only to be worrying about burn-in all the time.
 

Joo

Member
May 25, 2018
3,883
This thread is yikes. Everyone blindly yapping "OLED OLED" over and over again, as if this is the only TV in town and as if everyone has thousands to spend on this expensive tech.

OP, go with 65" Q70R. Not only will you have money left, but bigger screen size will be far more impactful than 55" OLED.

Also, OLED's black levels aren't that important for gaming, because on average games are far brighter than movies. I would argue that QLED's peak brightness is far more important for gaming than OLED's black levels, especially if you aren't gaming in a pitch-black room setting.
Yeah. OLED is definitely better in many ways, but it's not like there aren't many really good alternatives available. If the main use is gaming, I don't see how some higher-end QLEDs like the Q80T/Q90T or Sony's XH90 (X900H) would be worse. The black levels aren't bad in any way, and those might even be a better choice for the majority of people.

Especially if you're going to use the TV for many years, burn-in might become a problem and ruin the whole TV if you game a lot. OLED still has the best PQ, but something like the Q90T has much better reflection handling, for example.
 

Gouty

Member
Oct 25, 2017
1,659
I own both a Samsung Q9FN and an LG C9.

The difference in picture quality is stark. The OLED mops the floor with the LED.
 

aisback

Member
Oct 27, 2017
8,745
I'm edging towards the 48-inch CX, but I'm hoping the Sony 48-inch is HDMI 2.1 - I don't think it is, though.
 

Pargon

Member
Oct 27, 2017
12,030
VRR doesn't help with microstutters or sudden hitches. What it can help with is games running at 30-45 or 45-60 fps. But this must be targeted by a game to work, and with most users not having VRR-capable displays, I doubt that many will.
Developers do not have to do anything to make VRR work. It's a system-level feature that happens automatically.
This is why G-Sync works with games that were released 20 years prior to it, with no modifications.

If a game is built for 30 FPS and it drops to 25 FPS when things get busy, the display will sync to that automatically.
Same thing if it's built for 60 FPS and were to drop to 50 FPS. In that case it's likely to remain smooth enough that most people won't even notice the drop, while you would have obvious stuttering on a non-VRR 60Hz display.

If the frame rate is dropping because of a CPU or I/O bottleneck, then it is likely to stutter due to a spiky frame-time graph like this:
[Image: spiky frame-time graph]

If the frame rate is dropping due to a GPU bottleneck - which is most common - frame drops are generally smooth; producing a frame-time graph like this:
[Image: smooth frame-time graph]

It doesn't matter what the frame rate is, only that it's a smooth transition.
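To put that in more concrete terms, here's a rough illustrative Python sketch (the function name and the 1.5x spike threshold are made up for the example, not taken from any real capture tool): it looks at frame-to-frame jumps rather than the average frame rate, which is essentially the difference between those two graphs.

```python
# Illustrative only: the average frame rate can be identical in both captures;
# what differs is how much consecutive frame times jump around.
def classify_frame_times(frame_times_ms, spike_ratio=1.5):
    """Label a capture 'spiky' or 'smooth' based on frame-to-frame deltas."""
    avg = sum(frame_times_ms) / len(frame_times_ms)
    spikes = sum(
        1 for prev, cur in zip(frame_times_ms, frame_times_ms[1:])
        if cur > prev * spike_ratio  # a frame taking far longer than the previous one
    )
    return ("spiky - stutter likely even with VRR" if spikes else
            "smooth - VRR will look fine"), avg

# GPU-bound slowdown: frame times climb gradually from 33 ms to 40 ms.
print(classify_frame_times([33, 34, 35, 37, 39, 40, 40, 39]))
# CPU/I-O hitching: mostly 33 ms with occasional 80-90 ms spikes.
print(classify_frame_times([33, 33, 80, 33, 34, 33, 90, 33]))
```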

Where developers may have to intervene is if they also want to unlock the frame rate.
If a game is built for 30 FPS there are two main ways you can achieve that: via V-Sync, or via a frame rate limiter.
  • If it's locked to 30 FPS by V-Sync, it's possible that enabling VRR could automatically remove or double that limit at the system level.
  • If it's locked to 30 FPS by a frame rate limiter, even disabling V-Sync entirely would not remove that limit - since it's a part of the game itself.
In the latter scenario, the developer would have to step in if they wanted to enable frame rates above 30 FPS when using VRR, but they don't have to do anything for VRR to work below 30 FPS.
The developer might want to set an upper limit of 40 FPS, so that the game now fluctuates between 25–40 FPS rather than 25–30 FPS, since leaving it totally unlocked might run at 25–90 FPS and that difference is just too great.
Or the engine might have CPU-related stuttering above 50 FPS on that specific console, so they want to keep the limit below the point where that occurs.
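To illustrate why an in-game limiter survives V-Sync being turned off, here's a minimal toy sketch of a game loop in Python (all names are made up; real engines are obviously far more involved). The cap is part of the game code itself, so no display-side setting - V-Sync off, VRR on - can remove it; only a patch can.

```python
import time

TARGET_FPS = 30                 # hypothetical cap chosen by the developer
FRAME_BUDGET = 1.0 / TARGET_FPS

def run_game_loop(update, render, frames=3):
    """Toy loop with an in-game frame-rate limiter.

    Even with V-Sync disabled and a VRR display attached, frames are never
    presented faster than TARGET_FPS, because the sleep below lives in the
    game itself - raising the cap requires changing this code.
    """
    for _ in range(frames):
        start = time.monotonic()
        update()
        render()
        elapsed = time.monotonic() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)   # the limiter

run_game_loop(lambda: None, lambda: None)
```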


On top of that, even if a game were to be completely locked to 30 FPS and never drop a frame, you still have the latency benefit of V-Sync no longer being engaged when VRR is used.
A 30 FPS game running on a VRR display should have at least one frame lower latency than the same game running on a non-VRR display.

What's the consensus on Philips TVs these days? I was at a mall the other day and saw they make OLED TVs as well; not sure how good or bad they may be.
Philips have been a leader in introducing several display technologies such as motion interpolation - they even had CRT televisions with their Digital Natural Motion feature.
But they also had a reputation for building unreliable displays for decades - though I believe they are now manufactured by a completely different company.

The main reason I see to choose a Philips OLED over others is the Ambilight feature, but none of their 2020 models include HDMI 2.1 support - so I would not recommend any of them to pair with a next-gen console.
If you really want the Ambilight feature, I would suggest waiting for 2021 displays.

If you'd have asked me 6 months ago, I'd have said go for it. I've got a 55" 803 OLED in my living room, and it's been fantastic. Unfortunately it's developed a fault with displaying HDR content, and they're refusing to accept that it's an issue.
EvilBoris - would you concur that this behaviour is not "Just how OLED is"? :P
That looks like posterization, which is not "just how OLED is" but likely just how that TV's tone mapping/HDR processing is, rather than a fault.
It's also possible for some color management systems to introduce this problem when adjusted, so there may be some picture settings that help minimize/remove the artifacts.
This is why smooth gradations are one of the things I value most in a display, and is why I appreciate HDTVtest's near-black testing they do now (that's where posterization is most commonly seen). It's one of the reasons I'm thinking that I may hold out another year for Panasonic to add HDMI 2.1 support, as they have the best-performing OLED in that regard.
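(For anyone unfamiliar with the term: posterization is what you get when a smooth gradient is quantized to too few distinct levels. A tiny illustrative Python sketch - the function and the level counts are just for demonstration:)

```python
def posterize(values, levels):
    """Quantize a 0.0-1.0 ramp to a limited number of output levels -
    roughly what low-precision processing does to a near-black gradient."""
    return [round(v * (levels - 1)) / (levels - 1) for v in values]

gradient = [i / 20 for i in range(21)]   # a smooth 0 -> 1 ramp
print(posterize(gradient, 256)[:5])      # fine steps: still looks smooth
print(posterize(gradient, 8)[:5])        # several inputs collapse to one value: visible bands
```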

This thread is yikes. Everyone blindly yapping "OLED OLED" over and over again, as if this is the only TV in town and as if everyone has thousands to spend on this expensive tech.
OP, go with 65" Q70R. Not only will you have money left, but bigger screen size will be far more impactful than 55" OLED.
This is not an HDMI 2.1 display, so a bad recommendation for anyone buying a TV specifically for next-gen consoles.
It will only support 48–120Hz VRR at 1440p - and the PS5 may not even have a 1440p output option, which would limit you to 1080p.

I'm edging towards the 48-inch CX, but I'm hoping the Sony 48-inch is HDMI 2.1 - I don't think it is, though.
The Japanese site for the A9S says that it only supports 4K60 on the spec page - which would mean HDMI 2.0b.
It's possible that it could receive HDMI 2.1 via a later firmware update like the X900H/Z8H, but I wouldn't count on it. I think they're waiting until 2021 for their OLEDs.
The display is set to launch in Europe by the end of August, so hopefully we'll have details then.
 

aisback

Member
Oct 27, 2017
8,745
The Japanese site for the A9S says that it only supports 4K60 on the spec page - which would mean HDMI 2.0b.
It's possible that it could receive HDMI 2.1 via a later firmware update like the X900H/Z8H, but I wouldn't count on it. I think they're waiting until 2021 for their OLEDs.
The display is set to launch in Europe by the end of August, so hopefully we'll have details then.



I can actually order the TV, but the specs aren't even on Sony's website yet.
 

Serious Sam

Banned
Oct 27, 2017
4,354
This is not an HDMI 2.1 display, so a bad recommendation for anyone buying a TV specifically for next-gen consoles.
It will only support 48–120Hz VRR at 1440p - and the PS5 may not even have a 1440p output option, which would limit you to 1080p.
HDMI 2.1 is an input standard, not a display standard. 4K60 will be more than enough for next-gen consoles. Titles that do 4K120 will probably be countable on two hands, and even those will probably be sub-4K. If one wants to spend twice as much money just so they can brag they have full HDMI 2.1 with 4K120 support, sure, they are free to do so. Also, you don't know that the PS5 won't support 1440p 120Hz.
 

Winstano

Editor-in-chief at nextgenbase.com
Verified
Oct 28, 2017
1,834
That looks like posterization, which is not "just how OLED is" but likely just how that TV's tone mapping/HDR processing is, rather than a fault.
It's also possible for some color management systems to introduce this problem when adjusted, so there may be some picture settings that help minimize/remove the artifacts.
This is why smooth gradations are one of the things I value most in a display, and is why I appreciate HDTVtest's near-black testing they do now (that's where posterization is most commonly seen). It's one of the reasons I'm thinking that I may hold out another year for Panasonic to add HDMI 2.1 support, as they have the best-performing OLED in that regard.

It only happens with blues/greys, and it's only started over the past few months... It's not like 'normal' posterisation/banding, and no amount of tweaking solves the problem. When you pause it in certain places, the edges of the problem areas actually flicker quite a lot as well, which suggests to me that it's a processing fault rather than banding. It looks a lot, lot worse in motion, to the point where certain scenes in movies become completely unwatchable...
 

CNoodles

Banned
Mar 7, 2019
708
I have been looking to buy a new TV as well. I play a lot of JRPGs, so I am thinking of getting the Samsung Q80. Would like the LG CX but don't want to worry about burn-in.
 

gozu

Member
Oct 27, 2017
10,360
America
The CX appears to hold back a little on brightness too, so it's likely to be less prone to burn-in.
So not a great deal.
To be fair, I find the C9 to be too bright, so this is definitely a positive feature for me.

I used to want 4,000-nit HDR, but now I am quite happy with sub-1,000-nit OLEDs. This is the sweet spot imo.
 

mindatlarge

Member
Oct 27, 2017
2,926
PA, USA
If you're investing in a TV in 2020 for games, get something with HDMI 2.1 so you can be sure that you'll have a good time with the PS5 or XSX (or PCs with upcoming HDMI 2.1 GPUs, if you're into that). You want that VRR and up to 120Hz for the next gen.

Quality HDMI 2.1 sets include:
  • LG C9 (2019 OLED - In clearance and harder to find this late into 2020 but good potential clearance discount if you can find one)
  • LG CX (2020 OLED - A top tier gaming set for 2020; only hesitate if price-size ratio isn't doing it for you or you're too anxious about OLED burn in possibilities to use OLED with games)
  • Sony X900H (2020 FALD - Less perfect blacks than OLED and some FALD light halos, but great colors and picture, and Android TV is among the best Smart TV interfaces. Great price for size.)
  • Vizio P-Series Quantum X ("2021" FALD - Reviews pending but this set should be launching in a few days. Highest peak brightness in the land for the most popping HDR in places, but like the Sony, you will get typical FALD halos on certain bright objects in dark backgrounds. Also great price for size like the Sony. Lousy Smart TV functions though, but a non-issue with an Apple TV or Shield TV.)
Would you recommend the Sony X950H over the X900H? Kinda looks like it would be worth it for an extra $200.

www.rtings.com

Sony X950H vs Sony X900H Side-by-Side TV Comparison

The Sony X950H is slightly better than the Sony X900H overall. The X950H has better viewing angles, reflection handling, and it delivers a better HDR experience, as it has a better HDR color gamut and it can get brighter. However, the X900H has a higher contrast ratio since it doesn't have the...
 

Pargon

Member
Oct 27, 2017
12,030
HDMI 2.1 is an input standard, not a display standard. 4K60 will be more than enough for next-gen consoles. Titles that do 4K120 will probably be countable on two hands, and even those will probably be sub-4K. If one wants to spend twice as much money just so they can brag they have full HDMI 2.1 with 4K120 support, sure, they are free to do so.
As I said already: VRR requires 120Hz to work correctly, because most televisions have a minimum refresh rate of 48Hz.
If your TV supports 48–60Hz, that means no VRR below 48 FPS.
48–120Hz means VRR can be active from 0–120 FPS, because below 48 FPS the TV can repeat each frame (low framerate compensation) to keep the panel inside its refresh window.
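Here's a simplified sketch of that frame-repeating behaviour (often called low framerate compensation); the function is hypothetical and the real logic lives in the display/GPU, but the arithmetic is the point: a 48–120Hz window is wide enough that some integer multiple of any low frame rate always lands inside it, while a 48–60Hz window is not.

```python
def lfc_refresh(fps, vrr_min=48, vrr_max=120):
    """Repeat frames so the panel refresh stays inside its VRR window.

    With a 48-120 Hz window the top is 2.5x the bottom, so any frame rate
    below 48 fps can be shown by repeating each frame 2x, 3x, ... times.
    With a 48-60 Hz window (HDMI 2.0 at 4K), no multiple of e.g. 40 fps
    fits, so VRR simply drops out below 48 fps.
    """
    if fps >= vrr_min:
        return fps, 1
    multiple = 2
    while fps * multiple < vrr_min:
        multiple += 1
    refresh = fps * multiple
    if refresh > vrr_max:
        return None, None   # no valid multiple -> VRR disengages
    return refresh, multiple

print(lfc_refresh(25))                          # (50, 2): 25 fps shown at 50 Hz
print(lfc_refresh(40, vrr_min=48, vrr_max=60))  # (None, None): 80 Hz exceeds the 60 Hz cap
```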

It's not about "bragging rights" - the issue is that VRR won't work properly without it, and you will miss out on games that do run at 120 FPS.
If you don't care about new features like VRR support, then you aren't really buying a TV for these next-gen consoles, and could have selected any HDMI 2.0 TV in the past five years.

Now before anyone jumps in, you could technically support 0–60 FPS via HDMI 2.0 with a 24–60Hz display, but that would require halving the minimum refresh rate from 48Hz to 24Hz - which is challenging for many reasons.
LCDs really don't like operating at low refresh rates - especially ones with fast response times.
So you have the technical challenge of producing a new display panel that can support 24Hz without flickering, or the ease of replacing the HDMI 2.0 receiver chip with an HDMI 2.1 receiver chip that costs a little more and supports up to 120Hz.

Also, you don't know that the PS5 won't support 1440p 120Hz.
That's why I said it may not have the option. Not that it won't.
I think chances are slim, considering they did not support it on the PS4/Pro at all.
And even if it does, do you really want to buy a TV which requires you to output 1440p rather than 4K?

It only happens with blues/greys, and it's only started over the past few months... It's not like 'normal' posterisation/banding, and no amount of tweaking solves the problem. When you pause it in certain places, the edges of the problem areas actually flicker quite a lot as well, which suggests to me that it's a processing fault rather than banding. It looks a lot, lot worse in motion, to the point where certain scenes in movies become completely unwatchable...
The images looked like tone mapping / color management precision errors rather than a fault, but you're the one with the display, and you're saying that it's started to get worse recently - so I'll take your word for it.

Would you recommend the Sony X950H over the X900H? Kinda looks like it would be worth it for an extra $200.
The X950H is an HDMI 2.0 display, not HDMI 2.1 like the X900H.
Sony has always been weird about the way they support features across their line-up. Even a decade ago, when I bought my last Sony TV, the HX900 I purchased was technically a model below the LX900, but the HX900 had full-array local dimming while the "higher-end" LX900 was edge-lit; but had an extra-slim design, WiFi support, and was available in 40"/52"/60" sizes rather than 46"/52" sizes.

Note that the X900H has yet to receive a firmware update that enables HDMI 2.1 support, so no-one actually knows how good the implementation is yet.
I wouldn't recommend that anyone buys one until we know how well Sony handles things like VRR.
 

Lump

One Winged Slayer
Member
Oct 25, 2017
16,037
Would you recommend the Sony X950H over the X900H? Kinda looks like it would be worth it for an extra $200.

www.rtings.com

Sony X950H vs Sony X900H Side-by-Side TV Comparison

The Sony X950H is slightly better than the Sony X900H overall. The X950H has better viewing angles, reflection handling, and it delivers a better HDR experience, as it has a better HDR color gamut and it can get brighter. However, the X900H has a higher contrast ratio since it doesn't have the...

There is no HDMI 2.1 on the X950H, but there is HDMI 2.1 (pending firmware update) on the X900H. This should make the X900H a slightly better choice for games even if the other features make the X950H slightly better for media consumption.
 

DarkChronic

Member
Oct 27, 2017
5,040
LG B8 owner here, easily the best TV I've ever owned. Of course there are other great alternatives, but once you see the black levels on this thing you will never want to go back. It's outstanding and it's been worth every penny so far.
 
Oct 28, 2017
2,217
I bit the bullet and just ordered the LG B9 as the C9 price gouging and general unavailability seemed like it was just getting worse with time. I dunno if I'd even be able to tell the difference between the picture quality of the B9/C9, anyway. Looking forward to seeing how games are gonna look on it!
 

Ducayne

Member
Oct 27, 2017
644
I'm looking forward to updating my KS8000 soon with the new consoles. I've been so stuck on getting an OLED, but I usually game in the daytime in my bright, sunny living room (I have big bay windows that don't have curtains). Even with the 1,000-nit backlit display on my KS8000, I have trouble seeing HDR content well on my TV during the day. Hopefully in the future I'll have space for a dedicated game space where I can have it more dimly lit, but I'm already worried that an OLED won't be nearly as bright as I'll need it to be. My dad has a QLED, which is bright as hell, but I have a hard time letting go of getting an OLED.
 

Winstano

Editor-in-chief at nextgenbase.com
Verified
Oct 28, 2017
1,834
The images looked like tone mapping / color management precision errors rather than a fault, but you're the one with the display, and you're saying that it's started to get worse recently - so I'll take your word for it.

It's really hard to describe it... It's on every input, so not just limited to store apps or one device. It's so frustrating. I've run a 4k HDR test video through it at 35Mb/s via Plex, 4k HDR discs, PS4 Pro, everything... When loading screens in TLOU2 come up there's very pixellated halo effects that flicker around the flies/bugs that show up in the top right corner... If it was like it from day one I'd have noticed it, but it's definitely gotten worse. The scene where the guy runs across the battlefield in 1917 was flickering so much that even my father-in-law, who's partially sighted, noticed it!
 

Omnicore

Member
Oct 25, 2017
1,368
Vancouver
Bought the LG B9 77-inch (the 77" was Canada-only, I believe) several months ago when it was on clearance. Still absolutely blown away by it.
Quite minimal differences between the B9 and C9 for the price difference.
 

wollywinka

Member
Feb 15, 2018
3,099
X900H is probably the bang-for-buck champ. And then you have the CX if you can afford the upgrade and have zero burn-in concerns.
I am going to upgrade for next-gen. Should I be concerned about burn-in? It seems a lot of the games I play have static HUDs. That said, new gen, new games. I play a couple of hours a day, but at the weekends, I sometimes have a six-hour session.
 

Jiraiya

Member
Oct 27, 2017
10,295
I went with a 55-inch Q80T. Gorgeous picture, and I'm HDMI 2.1 ready. Absolutely phenomenal.

I really wanted to go OLED... but I'd just worry the entire time I had it.
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,686
I just purchased a 48CX for my games room/office. Absolutely astonishing TV.



If you'd have asked me 6 months ago, I'd have said go for it. I've got a 55" 803 OLED in my living room, and it's been fantastic. Unfortunately it's developed a fault with displaying HDR content, and they're refusing to accept that it's an issue.



EvilBoris - would you concur that this behaviour is not "Just how OLED is"? :P

I've replied on the tweet for you, that's not normal.
It's like it's given up on any attempt to gamut map correctly and is just clipping colours.
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,686
To be fair, I find the C9 to be too bright, so this is definitely a positive feature for me.

I used to want 4,000-nit HDR, but now I am quite happy with sub-1,000-nit OLEDs. This is the sweet spot imo.
If you saw a 4000nit display you would disagree :P
Even a 1500nit display has a visible difference in colour volume
 

Craiji

Member
May 26, 2018
217
Just upgraded from a 65" X850D to the 65" X900H and I have never been so impressed. I hope it gets the needed software updates before the PS5 hits.
 

Moves

Member
Oct 27, 2017
637
Sticking with my 65" Vizio P-Series Quantum; it should serve me fine for next gen as far as low input lag, HDR, and 4K go.
 

dgrdsv

Member
Oct 25, 2017
11,886
If a game is built for 30 FPS and it drops to 25 FPS when things get busy, the display will sync to that automatically.
Same thing if it's built for 60 FPS and were to drop to 50 FPS. In that case it's likely to remain smooth enough that most people won't even notice the drop, while you would have obvious stuttering on a non-VRR 60Hz display.
That's with double buffered vsync being always on. Most games disengage vsync when they can't hit the output frequency these days. VRR will remove the tearing in such cases but not much else.
 

Snake__

Member
Jan 8, 2020
2,450
LG OLED no question, nothing comes close (you should be able to find a 55" for $1000 and a 65" for $1500 if you look hard enough/ wait)

If you don't want something that expensive, then buy a TCL, best value by a mile

Nothing in between is worth the price; if you want something more expensive than a TCL, just save up for an LG OLED.
 

Brat-Sampson

Member
Nov 16, 2017
3,469
My 13-year-old 47-inch Samsung started turning itself off and on every few minutes a couple of days ago, so it was finally time to replace it. I found a decent deal on a 55" B9, so looking forward to the upgrade when it arrives on Monday!
 

Pargon

Member
Oct 27, 2017
12,030
Is there any good TV at 50"
Few televisions are made in the 50" size any more. 55" has typically replaced it.
The 48CX OLED would be my recommendation if you don't want to go larger than 50".
But it's worth considering that newer displays have far slimmer bezels - so they're physically smaller for a given screen size - if you plan on replacing an older 50" TV.
A 55" OLED may be similar in width to - or even smaller than - a 50" plasma TV. LG's 55CX OLED is 122.8cm wide, while a Pioneer 5090 plasma is 123.3cm and a Panasonic VT60 is 120.3cm.

I'm looking forward to updating my KS8000 soon with the new consoles. I've been so stuck on getting an OLED, but I usually game in the daytime in my bright, sunny living room (I have big bay windows that don't have curtains). Even with the 1,000-nit backlit display on my KS8000, I have trouble seeing HDR content well on my TV during the day. Hopefully in the future I'll have space for a dedicated game space where I can have it more dimly lit, but I'm already worried that an OLED won't be nearly as bright as I'll need it to be. My dad has a QLED, which is bright as hell, but I have a hard time letting go of getting an OLED.
That's not really an issue of display brightness - HDR content is not intended to be viewed in a bright room like that.
HDR content is not meant to get brighter as the display does. Only parts of the image get brighter in HDR, not the whole thing.

You could have a scene in HDR where a 1000 nits object is placed against a 400 nits background:
  • With a 500 nits display, the bright object would only be 500 nits against a 400 nits background, and would not stand out much (tone mapping could change this, but that over-complicates the example).
  • With a 1000 nits display, it would be displayed as intended - a very bright 1000 nits object against a 400 nits background.
  • With a 10,000 nits display, it would also be displayed as a 1000 nits object against a 400 nits background. You would not see a 10,000 nits object against a 4000 nits background, like you would if this was SDR.
SDR does not work with explicitly coded brightness values. Everything in SDR is relative to white, which is supposed to be set at 100 nits.
But since displays can often support SDR brightness levels of 500, 600, 700 nits, it makes the entire image five, six, seven times brighter than intended.
You can't do that with HDR, because the HDR spec is built for 10,000 nits and there are few TVs capable of 1/5 of that brightness, let alone 5x or more.
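To make the relative-vs-absolute distinction concrete, here's a deliberately simplified Python sketch (the function names are made up, and real displays tone-map rather than hard-clip): SDR scales with wherever the display's white level is set, while HDR values are absolute nit targets that simply don't scale up on a brighter display.

```python
def sdr_output_nits(signal, display_white_nits):
    """SDR is relative: 'white' (signal = 1.0) lands wherever the display's
    white level is set, so a brighter setting scales the whole image."""
    return signal * display_white_nits

def hdr_output_nits(target_nits, display_peak_nits):
    """HDR is absolute: content asks for a brightness in nits, and anything
    above the panel's peak is clipped (real TVs tone-map instead)."""
    return min(target_nits, display_peak_nits)

# SDR: the same 'white' pixel is 100 nits on a mastering display, 500 on a bright TV.
print(sdr_output_nits(1.0, 100), sdr_output_nits(1.0, 500))

# HDR: a 1000-nit highlight against a 400-nit background.
for peak in (500, 1000, 10_000):
    print(peak, hdr_output_nits(1000, peak), hdr_output_nits(400, peak))
# 500-nit display:    500 vs 400 - the highlight barely stands out
# 1000-nit display:  1000 vs 400 - shown as intended
# 10000-nit display: still 1000 vs 400 - it does not get any brighter
```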

What some TVs are doing now is including options to compress the dynamic range of HDR to make it brighter.
This may be tied to a brightness sensor, or a picture setting. But if you were to compress the dynamic range of HDR to match the brightness of an SDR image in a bright room… well, you aren't really seeing HDR any more.
That's not to say newer TVs won't handle this better than your existing one, but fundamentally, HDR is not intended for that kind of environment and you may be better off using SDR.

It's really hard to describe it... It's on every input, so not just limited to store apps or one device. It's so frustrating. I've run a 4k HDR test video through it at 35Mb/s via Plex, 4k HDR discs, PS4 Pro, everything... When loading screens in TLOU2 come up there's very pixellated halo effects that flicker around the flies/bugs that show up in the top right corner... If it was like it from day one I'd have noticed it, but it's definitely gotten worse. The scene where the guy runs across the battlefield in 1917 was flickering so much that even my father-in-law, who's partially sighted, noticed it!
As I said, that doesn't sound out of the ordinary for a display that has less precision in its image processing/color management systems, or poor tone mapping.
But I won't deny the possibility of your TV developing a fault which could look just like it.

I've replied on the tweet for you, that's not normal.
It's like it's given up on any attempt to gamut map correctly and is just clipping colours.
That's why I don't think it's necessarily faulty, but could be an issue of poor image processing - or a setting that is causing it to happen.
But I've seen some weird results from television failures in the past, with Samsung TVs that turn portions of the image into completely random colored noise, rather than solid areas of one color.

If you saw a 4000nit display you would disagree :P
Even a 1500nit display has a visible difference in colour volume
One of the biggest problems with LG's WOLED design is that the white subpixel affects color accuracy of real-world scenes (not test patterns) and dilutes the color saturation at higher brightness levels. If they could achieve 700-800 nits peak brightness without it, I think the differences would be far less noticeable.
I'm very excited for OLED displays to move beyond color filters though - ideally moving to QD "color filters" which are not really color filters in the traditional sense.

That's with double buffered vsync being always on. Most games disengage vsync when they can't hit the output frequency these days. VRR will remove the tearing in such cases but not much else.
Most games are not using adaptive V-Sync. They're triple-buffered to prevent screen tearing, and to prevent drops below a 30/60 FPS target from cutting the frame rate to 20/30 FPS.
And disabling V-Sync does not eliminate stuttering - it's just that the tearing is so bad you don't notice it as much. If the frame rate is not synchronized to the refresh rate, it cannot be smooth.
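A quick illustrative sketch of that last point (made-up function, idealised timing): on a fixed 60Hz panel, an unsynchronized 50 FPS feed has to hold some frames for one refresh and some for two, so motion judders even though the average rate looks fine; VRR gives every frame exactly one refresh.

```python
import math

def refreshes_per_frame(fps, refresh_hz, frames=12):
    """On a fixed-rate panel each frame is held until the next vblank, so the
    number of refreshes a frame occupies varies whenever fps != refresh_hz."""
    ticks = [math.ceil(i * refresh_hz / fps) for i in range(frames + 1)]
    return [b - a for a, b in zip(ticks, ticks[1:])]

print(refreshes_per_frame(50, 60))  # mix of 1s and 2s: some frames shown twice as long -> judder
print(refreshes_per_frame(50, 50))  # all 1s: one refresh per frame, which is what VRR gives you
```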

VRR reduces latency, eliminates tearing, and eliminates stuttering (when caused by the frame rate/refresh rate being different) without any drawbacks.