Is HDMI 2.1 really worth it right now, though? I'm not fussed about playing anything at 120Hz, plus native 4K hasn't become a standard in console games yet.
New consoles will take advantage of it and you're forgetting about VRR.
There were multiple improvements made in the panel tech between the C6 and C9 that should help mitigate it (bigger red subpixel, etc.). Good luck!
OLED is still a downgrade from plasma in terms of motion resolution/handling. To me, that's a major concession that will probably keep my plasmas as my primary displays until they die, or possibly until I can't source parts for them. Their natural motion/PQ is just so much easier on the eyes than any modern display.
No such thing as a next-gen-proof TV when microLED is right around the corner.
Will they though, initially? I think I'll wait until they're out before upgrading, or pick up a TV next year instead.
Oh your question seemed to imply it wasn't worth trying to get a set with HDMI 2.1. There's no harm in waiting until the consoles are out and there are more TV options to choose from. That's a pretty wise move to make, but the next gen consoles will absolutely utilize HDMI 2.1 features at launch.
The LG CX OLEDs should hopefully exceed Plasma TVs for motion resolution with the new 120Hz BFI feature when it's set to high. They won't have the response time issues of Plasma TVs either.
It's one of the reasons I'm waiting for CX prices to drop rather than pick up a cheaper B9/C9 now, as tempting as it may be.
People mean supporting all the features of next-gen consoles, not "next gen" TV tech.
While I hope that µLED is here soon, I still remember the early 2000s on sites like AVS Forum where OLED was "just around the corner" for a decade.
> BFI max on LG is 60hz. The other settings are at 120. BFI at max produced the most "plasma-like" image and he estimated it reached 1080 lines of motion resolution, but it also introduced severe darkening (100 nits goes down to 36) and affected calibration.

That is exactly what should be expected from BFI - though the panel should be capable of more than 36 nits with it enabled.
> I can't wait for the RTINGS review of the X900H. It's reasonably priced, and supports 4K120, VRR, eARC, and Dolby Vision - pretty much my entire checklist for a next gen friendly TV.

Yeah, reviews will probably be up sometime in June. Will make sure to wait for Vincent and RTINGS. Specs don't seem promising, and it looks like the difference from the 950 is more significant when compared to the X900F, but zone count isn't the be-all and end-all, and this TV is my only hope for this year. If Sony would've just promised to upgrade the 950H through firmware for 2.1 features, I'd be all over the cheap 950G by now...
> As someone who just bought a C9 and did TONS of research before doing so.. burn-in concerns are completely overblown, particularly with video games. It's a non-issue for the VAST majority of owners. Even then, LG apparently is willing to do a one-time replacement for burned-in panels, even though it's not covered by warranty. You can ask their customer support to confirm that if you'd like.

I actually did ask LG; they just told me burn-in isn't anything to worry about, and they wouldn't tell me they will do the panel replacement. :(
I think of it like this: I grew up with CRT TVs, which are also susceptible to burn-in, yet how many kids of the '80s grew up with the UI elements of games like Super Mario Bros. burned into their screens? (The answer: not many.)
> For someone who put in 300+ hours into the last Orcs Must Die game on my projector, I'm scared shitless of burn-in on OLEDs. I may have to wait a year before I jump in.

300 hours probably isn't anything to worry about. RTINGS' test suggests it would take something like 9,000 hours to cause burn-in (COD and FIFA on high brightness 24/7 for a year was the test, IIRC). Burn-in can totally happen, but it will take much longer than 300 hours. Here is the vid in question: https://youtu.be/nOcLasaRCzY
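The 9,000-hour figure from that RTINGS test is easier to reason about as years of real-world use. A quick back-of-the-envelope sketch (the daily-hours values are arbitrary examples, not data from the test):

```python
# Rough arithmetic: how long ~9,000 hours of the SAME static content takes
# to accumulate at various daily play times. The 9,000-hour figure is the
# approximate RTINGS test duration; the daily hours are illustrative assumptions.

BURN_IN_HOURS = 9_000

def years_to_reach(hours_per_day: float) -> float:
    """Years of daily viewing needed to accumulate BURN_IN_HOURS."""
    return BURN_IN_HOURS / (hours_per_day * 365)

for daily in (2, 4, 8, 24):
    print(f"{daily:>2} h/day -> {years_to_reach(daily):.1f} years")
```

Even an 8-hours-a-day habit on a single game takes about three years to reach the test's total, which is why 300 hours is unlikely to matter.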
> I can assure you that the burned-in panels I've replaced on the C9/B9 did not have 9000 hours on them.

How many hours did they have?
> Far less. Store demo units rarely even hit 4000 in a year.

Well, a store demo unit would be a bad example, tbh. They are left in torch mode basically the whole time the store is open, often running the same content over and over in some cases too. The TVs you're talking about would be C9s and B9s that people owned, correct? That would mean they had burn-in within less than a year, then? With all the measures LG has in place, that seems pretty crazy to me. I guess maybe it would have to do with how the TV was used as well, though. But it is certainly disconcerting regardless.
Burn in is a gamble with all self emissive panels and some are more prone to retention than others. A few hundred hours of a static HUD element without changing up content could absolutely cause it
Does the bigger red subpixel thing help mitigate burn-in? My C7 has burn-in from yellow and red bars, but everything else is fine. It's gotten super noticeable, so I'm trying to get a new TV. I was planning on waiting until Black Friday sales to get the new CX or whatever, but if the C9 is good enough I'd get that.
There aren't any substantial panel changes between the C9 and the CX, particularly any that would make burn-in more or less likely, so you should be fine getting a C9 while they're still around. I actually opted for a C9 over the CX recently since the C9 has the full HDMI 2.1 bandwidth available to it (the CX doesn't, they reallocated some of it toward AI picture processing instead), and still has a DTS sound decoder (which was removed from the CX.)
> Does anyone know if the 48" CX lacks any features? I know the smaller models can be missing stuff sometimes.

This model isn't available yet, so we have to wait a few more weeks.
The CX RTINGS review is out and yeah, I'm not sure there's a reason to buy a CX over a C9 at all, tbh.
I understand there is a better processor in the CX over the C9, but I'm not entirely sure what that's even for?
Doing all the post-processing stuff (that we all turn off for game mode)?
Here it is, btw: LG CX OLED Review (OLED48CXPUB, OLED55CXPUA, OLED65CXPUA, OLED77CXPUA) - www.rtings.com
> I understand there is a better processor in the CX over the C9, but I'm not entirely sure what that's even for?

There's potential for better image quality with improved processing, and I believe the display is supposed to adjust the picture to better suit bright rooms. I expect these are things which HDTVtest's review will cover.
> Does anyone know if the 48" CX lacks any features? I know the smaller models can be missing stuff sometimes.

I believe it's supposed to be fully-featured, but we don't know for sure right now.
> 60Hz BFI is a measured 64% brightness reduction, according to DeWayne Davis.

That was a measured drop from 100 nits, which is likely not the same as you would see when brightness is maxed out.
> In addition to the brightness reduction, it also adversely affects calibration (gamma and contrast), and the only BFI setting that doesn't in a meaningful way is Auto.

That doesn't matter - you would calibrate with BFI enabled for those sources anyway.
Auto doesn't affect the picture much because it doesn't do much.
People still don't seem to get it. For BFI to have a significant impact on motion handling, it also has to have a significant impact on brightness.
You can't switch the panel off for 3/4 of every frame and think it won't do that.
It's the same thing for scanline filters. You can't turn half of the screen black and not expect brightness to drop.
They're self-emissive displays with an aggressive ABL. It doesn't scale linearly at all.
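To put rough numbers on the duty-cycle argument: a naive linear model (average brightness ≈ peak brightness × fraction of the frame the panel is lit) shows the scale of the loss, even though, as noted above, ABL means real panels don't follow it exactly. The ~4 ms and 100-nit figures come from this thread; the function itself is only an illustration, not a measurement model:

```python
def naive_bfi_nits(full_nits: float, on_ms: float, frame_ms: float) -> float:
    """First-order estimate: average brightness scales with the duty cycle
    (fraction of each frame the panel is actually lit). ABL and panel drive
    limits make real behavior non-linear, so treat this as a rough bound."""
    duty = on_ms / frame_ms
    return full_nits * duty

# 60 Hz source (16.7 ms frames) with the panel lit for ~4 ms per frame:
print(naive_bfi_nits(100, 4, 1000 / 60))  # 24.0 -> ~24 nits from a 100-nit baseline
```

The measured figure reported earlier in the thread (100 nits dropping to 36) sits above this naive estimate, which is exactly the non-linearity being described.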
> And the calibration impact is absolutely worth mentioning, because some choose to enable/disable it on the same profile depending on content.

Unless there are measurements posted, saying that it "affects calibration" is meaningless.
> And I don't think anyone is arguing that BFI doesn't result in a brightness reduction.

You would be surprised at how many people seem to think that BFI is pointless until it can be done without affecting image quality at all.
> Variable Refresh Rate, eARC, Auto Low Latency Mode, and Quick Media Switching are features that will be relevant to game consoles.

So for the average gamer who isn't needing the best of the best, it's not necessary, right? Like, I can still play next gen without HDMI 2.1?
> This model isn't available yet, so we have to wait a few more weeks.

Thanks for the reply! Thinking it could be a great option.
> So for the average gamer who isn't needing the best of the best, it's not necessary, right? Like, I can still play next gen without HDMI 2.1?

It's not necessary, but without VRR it's possible that games will run at lower frame rates (locked to 30/60 rather than unlocked) and won't be as smooth.
> That is exactly what should be expected from BFI - though the panel should be capable of more than 36 nits with it enabled.

Is this the reason why my TV's (KS9000) BFI option "Clear LED" never worked like I expected and just looks like this?
It's disappointing that the other settings are not still 60Hz, but with increased persistence for people that would prefer to trade off motion resolution for higher brightness. BFI at 120Hz with a 60 FPS source greatly hurts motion quality and I'd say it almost renders the feature useless.
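The double-image problem with 120Hz BFI on a 60 FPS source can be seen from flash timing alone: each source frame gets lit twice, roughly 8.3 ms apart, so a tracking eye sees two displaced copies of everything. A small sketch with idealized timestamps (assuming one lit interval per BFI period, which is a simplification of real panel timing):

```python
def flash_times_ms(source_fps: int, bfi_hz: int, n_frames: int = 2):
    """Idealized times (ms) at which each source frame is lit on screen."""
    frame_ms = 1000 / source_fps
    flashes = bfi_hz // source_fps  # lit intervals per source frame
    return [[round(f * frame_ms + i * 1000 / bfi_hz, 1) for i in range(flashes)]
            for f in range(n_frames)]

print(flash_times_ms(60, 120))  # [[0.0, 8.3], [16.7, 25.0]] - each frame lit twice (double image)
print(flash_times_ms(60, 60))   # [[0.0], [16.7]] - one flash per frame, no double image
```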
> Is this the reason why my TV's (KS9000) BFI option "Clear LED" never worked like I expected and just looks like this?

That's right. 60 FPS sources need 60Hz BFI.
For some odd reason, it seems like the lower-end KS8000 does BFI at 60Hz better, according to reviews I read back then.
So the least these TVs will cost you is $800+ at the absolute best then, ya?
> There was a quickie Value Electronics deep dive a few days ago. Better processing mostly went toward upscaling and cleaning up poor quality content. Brought it closer to Sony level, but not quite.

I will definitely look at that then. I'm interested in this upscaling tech, but I've yet to see actual examples of it, or it explained in a way I could determine value.
BFI for 60Hz sources is the main reason, as it should reduce image persistence to ~4ms.
That should result in motion handling which is similar to, or better than, most Plasma TVs. Previous models only halved the persistence, to ~8ms.
It should also have been able to work with 120Hz sources, but according to RTINGS' review it's not a good implementation and should not be used.
Keep in mind that there will be visible flicker when you enable this option, and it will cut the brightness by more than 50%.
They don't seem to have posted brightness measurements, but I expect it to be 100 nits at most.
It's also going to be incompatible with VRR.
The main reason someone would want it is for retro gaming if they don't have a CRT, or modern games that are limited to a maximum of 60 FPS.
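The persistence numbers above map directly onto visible motion blur: a tracking eye smears each frame across roughly persistence × pan speed pixels. A quick sketch (the 960 px/s pan speed is an arbitrary example; the persistence values are the ones from this comment):

```python
def blur_width_px(persistence_ms: float, pan_speed_px_per_s: float) -> float:
    """Approximate eye-tracking blur width for a sample-and-hold display."""
    return persistence_ms / 1000 * pan_speed_px_per_s

PAN = 960  # example: panning half a 1080p screen width per second
for label, ms in [("no BFI, 60Hz hold", 1000 / 60),
                  ("older OLED BFI (~8 ms)", 8.0),
                  ("CX 60Hz BFI (~4 ms)", 4.0)]:
    print(f"{label}: ~{blur_width_px(ms, PAN):.0f} px of blur")
```

So halving persistence halves the blur width, which is why the drop from ~8 ms to ~4 ms is the headline improvement here.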