
BreakAtmo

Member
Nov 12, 2017
12,824
Australia
Whoa, so I could just buy a C9 now and have all the features for next-gen 4K 120fps VRR?

Pretty much. The CX is better, but mostly in other ways:

[Attached image: feature comparison list of the C9 vs. the CX]
 
Nov 8, 2017
13,086
It must be pretty embarrassing for Sony's TV division that they will only have one TV on the market with a single HDMI 2.1 port by the time PS5 ships. You'd think they would coordinate behind the scenes to be like "yeah maybe we should fast track this since Samsung and LG are ahead and our own console is going to use it".

It's not even their top-end models, lol. The X90 this year will have it, but nothing higher or lower. Super bizarre.
 

entremet

You wouldn't toast a NES cartridge
Member
Oct 26, 2017
59,960
It must be pretty embarrassing for Sony's TV division that they will only have one TV on the market with a single HDMI 2.1 port by the time PS5 ships. You'd think they would coordinate behind the scenes to be like "yeah maybe we should fast track this since Samsung and LG are ahead and our own console is going to use it".

It's not even their top-end models, lol. The X90 this year will have it, but nothing higher or lower. Super bizarre.
Does Sony even care about TVs anymore?
 

BreakAtmo

Member
Nov 12, 2017
12,824
Australia
This seems relevant to the thread:



It's a very useful video, but you should also watch this one:



It demonstrates something that the DF guys didn't go into - basically, when you activate Game Mode (taking away most of the video processing features to improve latency), the PQ gap between OLED and QLED widens significantly. They go from, as the DF video notes, two different kinds of TVs with their own strengths and weaknesses, to the OLED just being better in almost every category. The QLED still has a couple of small advantages with colour, but overall if you're looking for a gaming TV and want to reduce input lag as much as possible, an OLED is the best option.
 

Mathieran

Member
Oct 25, 2017
12,854
Well ya jerks. I just changed last second from buying a B9 to a C9 65". I hope the extra 250 is worth it.
 

criteriondog

I like the chili style
Member
Oct 26, 2017
11,069
I'm hoping for the C9 to go further down in price! Due to burn-in, I had to get my C7 repaired, but it took four repair attempts and LG just bought the TV back off me instead!

So I'm using my 42" LG LCD 1080p panel from 2012, and man, I can definitely notice an extreme difference.

I miss the OLED and HDR so much. The colors on my LCD just look so washed out, and the refresh rate seems wack.
 

TheMadTitan

Member
Oct 27, 2017
27,197
Have they said anything about the B series yet? My ass is cheap, so I'm not going to fork over CX money if the BX is going to follow in the B9's footsteps and be mostly the same.
 

Videophile

Tech Marketing at Elgato
Verified
Oct 27, 2017
63
San Francisco, CA
Do these LG TVs have a way of showing the FPS on the screen? And if HDR/DV is on?

HDR/DV - Yes. Press the OK button on the remote, then select the icon that pops up and it will read out the current signal as well as the resolution.

For FPS you need to go into the HDMI diagnostics menu. This is fairly easy to do:

1. Open the channel menu
2. Highlight the top option
3. Press 1 five times in a row quickly
The HDMI diagnostics overlay will come up.

Here's what it looks like on my CX 77.


 

ss_lemonade

Member
Oct 27, 2017
6,646
It's a very useful video, but you should also watch this one:



It demonstrates something that the DF guys didn't go into - basically, when you activate Game Mode (taking away most of the video processing features to improve latency), the PQ gap between OLED and QLED widens significantly. They go from, as the DF video notes, two different kinds of TVs with their own strengths and weaknesses, to the OLED just being better in almost every category. The QLED still has a couple of small advantages with colour, but overall if you're looking for a gaming TV and want to reduce input lag as much as possible, an OLED is the best option.

What the heck, why does game mode on that Q90 look worse than game mode on my significantly older KS9000? That beginning HZD cave scene sure didn't look like that when I played the game. The blooming with that white box also looks almost as bad as mine, despite mine being well known to be awful at local dimming due to the KS8000/9000 being edge-lit.
 

Fachasaurus

Member
Oct 27, 2017
1,349
HDR/DV - Yes. Press the OK button on the remote, then select the icon that pops up and it will read out the current signal as well as the resolution.

For FPS you need to go into the HDMI diagnostics menu. This is fairly easy to do:

1. Open the channel menu
2. Highlight the top option
3. Press 1 five times in a row quickly
The HDMI diagnostics overlay will come up.

Here's what it looks like on my CX 77.




Wow thank you! So does this work regardless of the application/source that it's receiving signal from?

We have a Shield going through an AV receiver and would still like to use it, but I was curious whether you could actually pull up that information this way, since currently I can only check it on an app-by-app basis.
 

BreakAtmo

Member
Nov 12, 2017
12,824
Australia
What does BFI have to do with near-black uniformity?
Yeah, that makes no sense.
BFI is for helping motion resolution, at the expense of lowered brightness.

Oh, is that part wrong? Are there any other mistakes in that list I should know about?

What the heck, why does game mode on that Q90 look worse than game mode on my significantly older KS9000? That beginning HZD cave scene sure didn't look like that when I played the game. The blooming with that white box also looks almost as bad as mine, despite mine being well known to be awful at local dimming due to the KS8000/9000 being edge-lit.

No idea - is it possible that the QLED panels aren't that great due to Samsung going all-in on video processing? It may also look worse than it really is on camera.
 

Pargon

Member
Oct 27, 2017
11,989
What the heck, why does game mode on that Q90 look worse than game mode on my significantly older KS9000? That beginning HZD cave scene sure didn't look like that when I played the game. The blooming with that white box also looks almost as bad as mine, despite mine being well known to be awful at local dimming due to the KS8000/9000 being edge-lit.
Camera exposure settings are going to affect how dark scenes appear in a video, and can very easily exaggerate differences in display contrast. You really can't judge this in a video, even in side-by-side comparisons.
That being said, when you put local-dimming LCDs into game mode, the dimming effectiveness is reduced - so what you see ends up being far closer to the panel's native contrast ratio.
RTINGS puts the Q90R native contrast ratio at ~3250:1, and 11,200:1 with local dimming enabled. The KS8000 has a native contrast ratio of 6900:1.
Contrast won't drop all the way to native, as local dimming is not disabled entirely, but the KS8000 is likely to have better contrast in game mode than the Q90R.
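To put those ratios in more concrete terms, here's a minimal sketch (in Python, purely illustrative) that converts a contrast ratio into an approximate black level at an assumed peak brightness. The 300-nit peak is an arbitrary assumption, not a measured value; the contrast figures are the RTINGS numbers quoted above. With local dimming largely sidelined in game mode, the Q90R's black floor ends up roughly twice as bright as the KS8000's.

# Black level = peak luminance / contrast ratio.
# PEAK_NITS is an assumed full-field peak, chosen only for illustration.
PEAK_NITS = 300

displays = {
    "Q90R (native contrast, ~game mode)": 3250,
    "Q90R (full local dimming)": 11200,
    "KS8000 (native, edge-lit)": 6900,
}

for name, contrast in displays.items():
    print(f"{name}: black level ~{PEAK_NITS / contrast:.3f} nits")

# Q90R (native contrast, ~game mode): black level ~0.092 nits
# Q90R (full local dimming): black level ~0.027 nits
# KS8000 (native, edge-lit): black level ~0.043 nits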
 

Falus

Banned
Oct 27, 2017
7,656
I'm really wondering. The LG 65C9 is €2,000 here; the 65CX is €3,000.

I want to be 100% future-proof for next gen. Is the C9 enough?

I read this:
  • The LG implementation ignores the media handles for PCM 5.1 and PCM 7.1 audio, which means it is not possible to pass uncompressed HD audio from sources such as Xbox/PS4 consoles that send HD audio uncompressed. There is no technical reason this shouldn't work (it does work on competitor televisions from Sony); this is just an omission on LG's part in supporting the formats. This issue was first reported in rtings.com's review of the LG C9.
  • Owners of 2017 Denon products have reported that their AVRs are not recognized by the LG C9 as eARC-capable devices. It is reported that 2017 Denons also have this issue with other brands' televisions, so possibly this can only be fixed by Denon, or Denon and display makers will have to collaborate on a fix.
  • It has been confirmed that the LG C9 operates properly with eARC delivery when HDMI CEC is turned off on both the source (TV) and destination (AVR). This is accomplished by removing the HDMI configuration for the target AVR in the LG C9 Connections Manager (reset configuration) and disabling ARC and TV control in the Denon/Marantz unit, then enabling ARC and eARC with passthrough in the C9 HDMI audio settings. It is unknown if this works across all AVR brands, but it strongly indicates that LG has properly implemented the feature so that it can be turned on independently of HDMI control (HDMI CEC).
Maybe the CX is better; I need eARC LPCM.
 

FlyStarJay

Member
Jan 7, 2018
429
Looking at the 48-inch and the 55-inch, it's a £300 difference. I feel like they got the pricing wrong; it's like they don't believe the 48-inch would sell well. I was hoping to get the 48-inch, but I'll wait till November and see.
 

gabdeg

Member
Oct 26, 2017
5,956
🐝
Any 12-bit sources on the horizon? Can the panel even display it? Still, seems like an odd thing to cheap out on. Hopefully we'll see a proper Alpha 10 chip next year.
 

Darknight

"I'd buy that for a dollar!"
Member
Oct 25, 2017
22,788
Any 12-bit sources on the horizon? Can the panel even display it? Still, seems like an odd thing to cheap out on. Hopefully we'll see a proper Alpha 10 chip next year.

No, and no, which makes it more of a spec-sheet bullet point than a meaningful real-world feature.
 

1-D_FE

Member
Oct 27, 2017
8,252
They claim the resources are being put to better use. Not sure if that's FUD or not. But is there literally any use you'd ever have for something beyond 10-bit 4K @ 120 Hz with 4:4:4? It really seems like anything more is processing power that's there for zero practical reason.
 

Planet

Member
Oct 25, 2017
1,358
Copying my response from Reddit:

The clickbait headline got me worried a bit at first. While technically not a lie, it makes it sound way worse than it is.
They slightly capped the maximum bandwidth: the HDMI 2.1 specification goes up to 48 Gbit/s, the LG 2020 models seem to have about 40 Gbit/s, and for comparison, HDMI 2.0 tops out at 18 Gbit/s.

This means the LG 2020 models won't support 12-bit HDR at 4K... which is a total non-issue, because no panel on the market realistically supports more than 10-bit anyway. Using 12-bit wouldn't make a visible difference, and LG still supports 10-bit without any compromises at 4K 120 Hz, no chroma subsampling required, which is the absolute maximum you can realistically expect the consoles to deliver anyway.

tl;dr: Forbes is technically correct, but you aren't losing any feature if you buy a 2020 model over e.g. a 2019 one.
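For anyone who wants to sanity-check those figures, here's a rough back-of-the-envelope sketch (Python, illustrative only) of the raw active-video data rate for 4K 120 Hz at full 4:4:4. Real HDMI links also carry blanking intervals and FRL encoding overhead, so the true requirements sit somewhat higher than these raw numbers, which is why 10-bit fits inside a ~40 Gbit/s port while 12-bit is the case that pushes toward the full 48 Gbit/s pipe.

# Uncompressed active-video payload for 4K 120 Hz, full 4:4:4 (3 components per pixel).
# Blanking and FRL encoding overhead are ignored here, so real link requirements
# are a few Gbit/s above these figures.

def raw_video_rate_gbps(width, height, fps, bits_per_component):
    return width * height * fps * bits_per_component * 3 / 1e9

for depth in (8, 10, 12):
    print(f"4K120 4:4:4 @ {depth:>2}-bit: ~{raw_video_rate_gbps(3840, 2160, 120, depth):.1f} Gbit/s")

# 4K120 4:4:4 @  8-bit: ~23.9 Gbit/s
# 4K120 4:4:4 @ 10-bit: ~29.9 Gbit/s
# 4K120 4:4:4 @ 12-bit: ~35.8 Gbit/s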
 

BreakAtmo

Member
Nov 12, 2017
12,824
Australia
Copying my response from Reddit:

The clickbait headline got me worried a bit at first. While technically not a lie, it makes it sound way worse than it is.
They slightly capped the maximum bandwidth: the HDMI 2.1 specification goes up to 48 Gbit/s, the LG 2020 models seem to have about 40 Gbit/s, and for comparison, HDMI 2.0 tops out at 18 Gbit/s.

This means the LG 2020 models won't support 12-bit HDR at 4K... which is a total non-issue, because no panel on the market realistically supports more than 10-bit anyway. Using 12-bit wouldn't make a visible difference, and LG still supports 10-bit without any compromises at 4K 120 Hz, no chroma subsampling required, which is the absolute maximum you can realistically expect the consoles to deliver anyway.

tl;dr: Forbes is technically correct, but you aren't losing any feature if you buy a 2020 model over e.g. a 2019 one.

When are we likely to actually get 12-bit panels? And what exactly would that give us?
 

Wet Jimmy

Member
Nov 11, 2017
809
I'm wondering about that 120 Hz support. Is there any HDMI switch that supports it? There aren't enough HDMI inputs on the TV.

I think it's too early for HDMI 2.1 capable switches. Give it another 12 months.

Today, I recommend looking at the switches sold by Sewell:

Sewell - 6x1 HDMI switch

The best you'll get is 4K @ 60 Hz, though. I've used Sewell a couple of times and they've been the most reliable, least-hassle video switches I've come across.
 

manustany

Unshakable Resolve
Member
Oct 27, 2017
3,528
The Space
Copying my response from Reddit:

The clickbait headline got me worried a bit at first. While technically not a lie, it makes it sound way worse than it is.
They slightly capped the maximum bandwidth: the HDMI 2.1 specification goes up to 48 Gbit/s, the LG 2020 models seem to have about 40 Gbit/s, and for comparison, HDMI 2.0 tops out at 18 Gbit/s.

This means the LG 2020 models won't support 12-bit HDR at 4K... which is a total non-issue, because no panel on the market realistically supports more than 10-bit anyway. Using 12-bit wouldn't make a visible difference, and LG still supports 10-bit without any compromises at 4K 120 Hz, no chroma subsampling required, which is the absolute maximum you can realistically expect the consoles to deliver anyway.

tl;dr: Forbes is technically correct, but you aren't losing any feature if you buy a 2020 model over e.g. a 2019 one.
Thanks for the clarification.
 

Gulfwarvet

Member
Oct 30, 2017
173
This whole thing is peeing me off, lol. I've been waiting since 2019 to get a new 2.1 TV and I'm still waiting 😢. On to 2021 it seems, lol.
 

DrDeckard

Banned
Oct 25, 2017
8,109
UK
Looks like my KS8000 has at least another two years in it :( So the CX is disappointing for 4K 120 VRR??
 

Bosch

Banned
May 15, 2019
3,680
I never thought G-Sync would make so much difference. What a game changer it is. 1440p @ 120 Hz forever.
 

tokkun

Member
Oct 27, 2017
5,399
Well, almost. The 2020 sets will support FreeSync. It might be meaningful for an AMD or Intel GPU PC build (Intel GPU is likely to use FreeSync too).

I think FreeSync support is only going to be useful if you plan to use it immediately. It isn't necessary for future-proofing.

AMD has publicly stated they will release a driver update that supports HDMI Forum VRR. I would imagine that Intel would be more likely to support HDMI Forum VRR than FreeSync-over-HDMI, but maybe they will do both.
 

ppn7

Member
May 4, 2019
740
I don't understand. Is the C9 12-bit 4:4:4 but the CX only 10-bit 4:4:4, or are both 10-bit?

Does 12-bit mean less banding than a 10-bit panel?
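For what it's worth, the arithmetic behind that question is simple: each extra bit doubles the number of steps per colour channel, so 12-bit gradients are four times finer than 10-bit ones and, in principle, band less. In practice, as noted above, no current consumer panel is more than 10-bit native, so a 12-bit signal buys nothing visible today. A tiny illustrative snippet (Python, just for the numbers):

# Steps per colour channel at each bit depth; finer steps mean smoother gradients.
for bits in (8, 10, 12):
    print(f"{bits}-bit: {2 ** bits} levels per channel")

# 8-bit: 256 levels per channel
# 10-bit: 1024 levels per channel
# 12-bit: 4096 levels per channel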