
Super Craig

Member
Oct 27, 2017
653
So essentially, putting a console in PC mode with DC forced off and in Standard mode provides lower input lag and a better overall HDR image than both:

A) PC input type with Game Mode on

and / or

B) Game HDMI input type without DC turned on?

I thought having DC on for HDR content was essential with these TVs to get HDR working properly and to enable dynamic tone mapping.

Can someone briefly explain the benefits of forcing our consoles into PC input mode with its many restricted settings?
PC mode gives you low input lag irrespective of the preset that you use. HDR Standard gives you a brighter image from the off that compensates for the lack of Active HDR/Low Dynamic Contrast. It's a fudge. Ideally, you'd only use it for game consoles and use the proper modes/presets for watching media.

I notice more banding in PC mode in SDR. That can't be right. Shouldn't banding between PC mode and non-PC mode be identical in SDR?
Where are you seeing it?
 

Kyle Cross

Member
Oct 25, 2017
8,413
PC mode gives you low input lag irrespective of the preset that you use. HDR Standard gives you a brighter image from the off that compensates for the lack of Active HDR/Low Dynamic Contrast. It's a fudge. Ideally, you'd only use it for game consoles and use the proper modes/presets for watching media.


Where are you seeing it?
The sky, mainly. It seems to not be so much more banding, but rather the banding that's there is made more pronounced.
 

GReeeeN

Senior Analyst at GSD
Verified
Mar 6, 2018
329
OP, what made you change your original recommendation of Game mode / high DC over to PC mode with Standard HDR, when it appears on AV Forums that even with the latest firmware, HDR Standard mode produces a very different-looking image with lots of detail lost? It looks like Standard HDR mode traditionally raises overall brightness, but detail is lost.

Are you recommending to most people an overall softer / less detailed image in Standard HDR PC mode in favour of more brightness, over a more detailed image with slightly crushed blacks in Game mode with DC on High?

Has this difference in detail between the two been proven?
 
OP
OP
P40L0

P40L0

Member
Jun 12, 2018
7,602
Italy
So essentially, putting a console in PC mode with DC forced off and in Standard mode provides lower input lag and a better overall HDR image than both:

A) PC input type with Game Mode on

and / or

B) Game HDMI input type without DC turned on?

I thought having DC on for HDR content was essential with these TVs to get HDR working properly and to enable dynamic tone mapping.

Can someone briefly explain the benefits of forcing our consoles into PC input mode with its many restricted settings?

OP, what made you change your original recommendation of Game mode / high DC over to PC mode with Standard HDR, when it appears on AV Forums that even with the latest firmware, HDR Standard mode produces a very different-looking image with lots of detail lost? It looks like Standard HDR mode traditionally raises overall brightness, but detail is lost.

Are you recommending to most people an overall softer / less detailed image in Standard HDR PC mode in favour of more brightness, over a more detailed image with slightly crushed blacks in Game mode with DC on High?

Has this difference in detail between the two been proven?

Now that we have a properly fixed PC Input Mode, it's better to permanently switch to it as the HDMI Input for 3 main reasons:
  1. SDR can be set to the fully calibrated ISF Expert (Dark Room) profile, which is a carbon copy of the Technicolor Expert profile. This provides more accurate PQ than SDR Game, as it's now finally possible to select the Auto Color Gamut (instead of Wide) and to include all the calibrated White Balance and CMS adjustments. All while preserving 21ms Input Lag;
  2. HDR Standard is the brightest mode possible for gaming (it's as bright as HDR Technicolor Expert with Active HDR ON, if not brighter) while it can also be color accurate after some adjustments. This is not possible with HDR Game, as the only way to make it brighter is to increase Dynamic Contrast, and the more it's increased, the more fidelity to reference colors is lost (the only way to reach brightness similar to HDR Technicolor Expert with Active HDR was with DC on HIGH...but that noticeably crushes black levels in many scenes. This does not happen in HDR Standard). The only drawback of HDR Standard is that highlights clip in parts of the image over 1,000 nits, but this generally translates only to clipped bright daytime skies...and in the end, it's a more acceptable compromise than totally altering colors in every scene all the time (besides, if you go out in real life on a sunny day, ride in a car and glance at the clear sky, even your own eyes will "clip" all that brightness past a certain intensity. So it's not so bad or strange as a compromise!)
  3. You don't need to switch Inputs back and forth anymore, and you can now settle on PC Input for everything HDMI, as even Dolby Vision Cinema mode is identical to "non-PC" quality...and it will most probably have 21ms Input Lag too, if and when DV games arrive someday!
 

GReeeeN

Senior Analyst at GSD
Verified
Mar 6, 2018
329
Now that we have a properly fixed PC Input Mode, it's better to permanently switch to it as the HDMI Input for 3 main reasons:
  1. SDR can be set to the fully calibrated ISF Expert (Dark Room) profile, which is a carbon copy of the Technicolor Expert profile. This provides more accurate PQ than SDR Game, as it's now finally possible to select the Auto Color Gamut (instead of Wide) and to include all the calibrated White Balance and CMS adjustments. All while preserving 21ms Input Lag;
  2. HDR Standard is the brightest mode possible for gaming (it's as bright as HDR Technicolor Expert with Active HDR ON, if not brighter) while it can also be color accurate after some adjustments. This is not possible with HDR Game, as the only way to make it brighter is to increase Dynamic Contrast, and the more it's increased, the more fidelity to reference colors is lost (the only way to reach brightness similar to HDR Technicolor Expert with Active HDR was with DC on HIGH...but that noticeably crushes black levels in many scenes. This does not happen in HDR Standard). The only drawback of HDR Standard is that highlights clip in parts of the image over 1,000 nits, but this generally translates only to clipped bright daytime skies...and in the end, it's a more acceptable compromise than totally altering colors in every scene all the time (besides, if you go out in real life on a sunny day, ride in a car and glance at the clear sky, even your own eyes will "clip" all that brightness past a certain intensity. So it's not so bad or strange as a compromise!)
  3. You don't need to switch Inputs back and forth anymore, and you can now settle on PC Input for everything HDMI, as even Dolby Vision Cinema mode is identical to "non-PC" quality...and it will most probably have 21ms Input Lag too, if and when DV games arrive someday!
Thanks for the explanation! So essentially, when it comes down to it, we are trying to get the most accurate colours possible and haven't dived into what's happening with the IQ between the two options.

If I'm the type of person that has applied these settings and finds them not "popping" enough for gaming (personal preference), am I better off just moving back to Game mode with High DC (with an apparently more detailed image?) rather than cranking up the colour in your settings to 55-58?

Have you noticed this lost detail and softness of the image people are talking about in Standard HDR mode when compared to Game mode with DC on High?
 

FairFight

Member
Oct 27, 2017
794
Chandler, AZ
I continue to follow this thread and OP, I have to say you're killing me. My OCD is in full effect. At first I thought, hey, this guy has some good "final" settings, I'll give it a shot. And then it's changed over and over and over again... my head is spinning. I'm back to messing around with my settings more than actually playing games. Personal problem, I know, but damn.
 
OP
OP
P40L0

P40L0

Member
Jun 12, 2018
7,602
Italy
Thanks for the explanation! So essentially, when it comes down to it, we are trying to get the most accurate colours possible and haven't dived into what's happening with the IQ between the two options.

If I'm the type of person that has applied these settings and finds them not "popping" enough for gaming (personal preference), am I better off just moving back to Game mode with High DC (with an apparently more detailed image?) rather than cranking up the colour in your settings to 55-58?

Have you noticed this lost detail and softness of the image people are talking about in Standard HDR mode when compared to Game mode with DC on High?
It's impossible to have more "pop" than this while retaining 21ms Input Lag, so the question would be: "what if I want more accurate tone mapping, even if it's noticeably dimmer?"

Then your choice would be returning to HDR Game mode with DC set to MEDIUM/LOW or even OFF for maximum accuracy.

Between HDR Game with DC set to HIGH and HDR Standard, at similar high brightness, HDR Standard is the better compromise overall (clipping only above 1,000 nits instead of warping the color gamma), especially considering most recent games offer an HDR Luminance slider to overcome even this compromise.
 
OP
OP
P40L0

P40L0

Member
Jun 12, 2018
7,602
Italy
I continue to follow this thread and OP, I have to say you're killing me. My OCD is in full effect. At first I thought, hey, this guy has some good "final" settings, I'll give it a shot. And then it's changed over and over and over again... my head is spinning. I'm back to messing around with my settings more than actually playing games. Personal problem, I know, but damn.
Sorry, but no one (including myself) was expecting LG to actually fix "PC Input" on an out-of-production TV series like the 2017 models...but they did!

So I think this time it's really the very end of setting everything up for gaming and mixed usage.

I just added a final small reminder when using HDR Standard for gaming, though:
  • Recommended HDR Luminance in games' sliders (if available) is now 2,000 nits
Tested with Forza Horizon 4, where the game logo disappeared much earlier than with HDR Game and 3,000 nits, and the game looks SO MUCH better now.

Most probably this is widely applicable to most games with similar in-game sliders, but as a general reminder, just make sure to re-hide in-game logos or re-do the in-game HDR calibration after switching to PC Input + HDR Standard. Tone mapping is different between the two, and you will get much better results afterwards.

Enjoy!
 

B_Mild

Member
Dec 1, 2017
77
Hey guys,

I need some help applying the .txt Dolby vision file.

No prompt to update dv profile comes up when I insert my freshly FAT32 formatted USB with the text file.

Also tried NTFS formatting as well and no luck.

I don't think I've ever installed this before so just wondering how to get this working.

To test, I changed some values in the text file to see if it would recognize a modified file and nothing pops up again.
 

BizzyBum

Member
Oct 26, 2017
9,137
New York
V3.2 / PC Mode is now FIXED - X1X SDR/HDR/DV Profiles updated: starting with official firmware v5.80.04, "PC Input" now finally provides a fully calibrated SDR profile + a much brighter HDR that best accommodates all-around content reproduction (Games, Movies, SDR and 4K/HDR Blu-rays), while retaining 21ms low Input Lag


Lowering HDR Standard Brightness to 49 and keeping 4:2:2 Unchecked mitigated the added banding I was facing compared to "non-PC" input, and further tuning of its color made it a viable alternative to HDR Game with DC set to HIGH, especially combined with a much better SDR profile thanks to the fully calibrated ISF Expert (Dark Room)

Weird, my B7A firmware is only 04.71.00 and when I manually check for an update it says no update available.
 

ZeeAyKay

Member
Oct 25, 2017
161
It's already set to automatic updates, but I clicked on "check for updates" and still got no update, whereas in the past if an update was available it would download then.

They don't release the firmware "over the air" at the same time it is available through their website. They also don't have a set timeline, so if you want to update now, the only way is to download it to a thumb drive from the LG website and then plug it into the USB port on your TV.
 

NoWayOut

Member
Oct 27, 2017
2,073
Sorry, but no one (including myself) was expecting LG to actually fix "PC Input" on an out-of-production TV series like the 2017 models...but they did!

So I think this time it's really the very end of setting everything up for gaming and mixed usage.

I just added a final small reminder when using HDR Standard for gaming, though:
  • Recommended HDR Luminance in games' sliders (if available) is now 2,000 nits
Tested with Forza Horizon 4, where the game logo disappeared much earlier than with HDR Game and 3,000 nits, and the game looks SO MUCH better now.

Most probably this is widely applicable to most games with similar in-game sliders, but as a general reminder, just make sure to re-hide in-game logos or re-do the in-game HDR calibration after switching to PC Input + HDR Standard. Tone mapping is different between the two, and you will get much better results afterwards.

Enjoy!

Sorry to bother you again, but could you clarify whether the issue you described earlier in the thread, where PC Input always used 4:4:4 no matter what, is resolved, so it can use 4:2:0 for HDR now? Second question: can the PC label be changed to something else? I ask because I remember reading somewhere (I could be wrong) that if the port is not named PC, the TV will not use the PC Input setting.
 

Samaritan

Member
Oct 25, 2017
6,696
Tacoma, Washington
Sorry, but no one (including myself) was expecting LG to actually fix "PC Input" on an out-of-production TV series like the 2017 models...but they did!

So I think this time it's really the very end of setting everything up for gaming and mixed usage.

I just added a final small reminder when using HDR Standard for gaming, though:
  • Recommended HDR Luminance in games' sliders (if available) is now 2,000 nits
Tested with Forza Horizon 4, where the game logo disappeared much earlier than with HDR Game and 3,000 nits, and the game looks SO MUCH better now.

Most probably this is widely applicable to most games with similar in-game sliders, but as a general reminder, just make sure to re-hide in-game logos or re-do the in-game HDR calibration after switching to PC Input + HDR Standard. Tone mapping is different between the two, and you will get much better results afterwards.

Enjoy!
Is there any particular reason that 2,000 nits is the target luminance value with HDR Standard + PC? Setting in-game luminance has always baffled me, since you'd naturally assume to set it to the max peak brightness of your TV, which would be around 800 for the 7-series.
 

Super Craig

Member
Oct 27, 2017
653
Is there any particular reason that 2,000 nits is the target luminance value with HDR Standard + PC? Setting in-game luminance has always baffled me, since you'd naturally assume to set it to the max peak brightness of your TV, which would be around 800 for the 7-series.
The TV tries to show more detail than the panel would normally allow. With PC mode + HDR Standard, the LG throws away all detail above 2,000 nits. Normally, it would try to preserve up to 4,000 nits' worth of detail. It's personal preference as to which you prefer.
 

Samaritan

Member
Oct 25, 2017
6,696
Tacoma, Washington
The TV tries to show more detail than the panel would normally allow. With PC mode + HDR Standard, the LG throws away all detail above 2,000 nits. Normally, it would try to preserve up to 4,000 nits' worth of detail. It's personal preference as to which you prefer.
So something about the PC input caps the TV's ability to tone map anything above 2,000 nits? Sorry if these are basic questions; tone mapping and setting luminance values in games has always been the part of owning an HDR TV I've struggled the most to understand.

Basically, from what I've gathered, my TV will attempt to tone map anything from 0 - 4,000 nits to its own capabilities of 0 - around 800 nits, but if I manually set the peak luminance in a game, that determines the highest point my TV will attempt to tone map down, correct?
 

Super Craig

Member
Oct 27, 2017
653
So something about the PC input caps the TV's ability to tone map anything above 2,000 nits? Sorry if these are basic questions; tone mapping and setting luminance values in games has always been the part of owning an HDR TV I've struggled the most to understand.

Basically, from what I've gathered, my TV will attempt to tone map anything from 0 - 4,000 nits to its own capabilities of 0 - around 800 nits, but if I manually set the peak luminance in a game, that determines the highest point my TV will attempt to tone map down, correct?
By changing the settings in a game, you're basically telling it what the TV can do.
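The tone-mapping discussion above can be sketched in a few lines of Python. This is a toy model, not LG's actual algorithm: the panel peak, knee point, and linear roll-off are made-up illustrative values. It just shows why declaring a 2,000-nit peak in a game's slider leaves highlights brighter than declaring 4,000 nits, because the TV has less range to squeeze into the same panel headroom.

```python
# Toy tone-mapping sketch (NOT LG's real curve; knee/roll-off are invented).
PANEL_PEAK = 800.0  # approx. peak nits of a 2017 LG 7-series OLED


def tone_map(scene_nits: float, content_peak: float) -> float:
    """Compress [0, content_peak] nits into [0, PANEL_PEAK].

    Linear below a knee point, then a simple linear roll-off above it.
    Anything mastered above the declared content_peak is clipped first.
    """
    x = min(scene_nits, content_peak)   # clip to the declared peak
    knee = PANEL_PEAK * 0.5             # darks/midtones pass through untouched
    if x <= knee:
        return x
    # squeeze the remaining [knee, content_peak] range into the panel headroom
    t = (x - knee) / (content_peak - knee)
    return knee + (PANEL_PEAK - knee) * t


# The same 1,000-nit highlight lands brighter when the game is told
# the chain only preserves up to 2,000 nits instead of 4,000:
print(tone_map(1000, content_peak=2000))
print(tone_map(1000, content_peak=4000))
```

Under this model, lowering the declared peak trades away detail in extreme highlights (everything above 2,000 nits is clipped) in exchange for a brighter mid-highlight range, which matches the trade-off described in the thread.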
 

thuway

Member
Oct 27, 2017
5,168
Any advice on how to get the TV to recognize the USB for custom Dolby profile? I've been trying for hours to no avail
 
OP
OP
P40L0

P40L0

Member
Jun 12, 2018
7,602
Italy
Hey guys,

I need some help applying the .txt Dolby vision file.

No prompt to update dv profile comes up when I insert my freshly FAT32 formatted USB with the text file.

Also tried NTFS formatting as well and no luck.

I don't think I've ever installed this before so just wondering how to get this working.

To test, I changed some values in the text file to see if it would recognize a modified file and nothing pops up again.

Any advice on how to get the TV to recognize the USB for custom Dolby profile? I've been trying for hours to no avail

For the DV patch to work, the USB drive must be formatted as FAT32 and must contain only the .txt file; no other files or folders can be on the drive.

Then you have to load a Dolby Vision movie on the TV using the DV Cinema (User) profile (to obtain the "User" tag, just change any value from its defaults) and plug in the USB drive for the file to be recognized.

If this does not work, try replugging, or turning the TV off and on with the USB drive plugged in; you can also try doing all of the above with another DV source (for example, the Dolby Access webOS app loading a DV demo).
Also try a different USB drive.

If you've never applied the DV patch before, it will work in the end.
 

thuway

Member
Oct 27, 2017
5,168
For the DV patch to work, the USB drive must be formatted as FAT32 and must contain only the .txt file; no other files or folders can be on the drive.

Then you have to load a Dolby Vision movie on the TV using the DV Cinema (User) profile (to obtain the "User" tag, just change any value from its defaults) and plug in the USB drive for the file to be recognized.

If this does not work, try replugging, or turning the TV off and on with the USB drive plugged in; you can also try doing all of the above with another DV source (for example, the Dolby Access webOS app loading a DV demo).
Also try a different USB drive.

If you've never applied the DV patch before, it will work in the end.
How do you know if the patch is applied?
 
OP
OP
P40L0

P40L0

Member
Jun 12, 2018
7,602
Italy
Does anybody know if the recent firmware update is compatible with the OLED 55B7T?
Yours is currently on 04.70.36 even on the download page under support.

Unless regions don't matter, I can see the OLED55B7P firmware working, but I wouldn't risk it until someone on AVS or P40L0 checks it out

LG OLED firmwares are usually split into 2 macro versions, each covering half of the regions:

1) Asia, USA, Canada, Latin America
2) Europe, South Africa, Australia and Middle East

So for the 55B7T you should try downloading 5.80.04 from the US LG website; it should be identical to what will also be available in your country.
The worst that could happen is that the TV won't recognize/accept it from the USB drive and won't upgrade, so it's still worth a try.
 
OP
OP
P40L0

P40L0

Member
Jun 12, 2018
7,602
Italy
Sorry to bother you again, but could you clarify if the issue you described before in the thread where PC Input always used 4:4:4 no matter what is resolved, so it can use 4:2:0 for HDR now? Second question, can the PC label be changed to something else? I ask because I remember reading somewhere (it could be wrong) that if the port is not named PC the TV will not use the PC Input setting.
Yes, using PC Input now, once set up as suggested, means 4:4:4 RGB Limited @ 8-bit in SDR and 4:2:0 YUV @ 10-bit in HDR.
Also, if you A/B switch from PC Input to a regular HDMI Input with the same content, the same HDR profile and the same settings, the PQ will be identical.

You don't need to rename the input to make it work; just be sure to change its icon (from any type) to PC to correctly enable PC Mode.
 

TheZynster

Member
Oct 26, 2017
13,285
If you will use PC Input, yes.

Also be sure to use the Standard/Limited color space to match the TV's Black Level: LOW setting, and for the PS4 Pro select 10-bit color depth (then apply all the other suggested settings).

Why does your profile for the Xbox One X say to use 8-bit, though? Is that a typo or am I missing something?
 
OP
OP
P40L0

P40L0

Member
Jun 12, 2018
7,602
Italy
Why is your profile for the Xbox One X say use 8-bit though? Is that a typo or am I missing something?
On the X1X, selecting 8-bit (and unchecking 4:2:2) is the best option when using the fixed PC Input.

This way the console will use 4:4:4 RGB Limited @ 8-bit for all SDR, and auto-switch to YUV420 @ 10-bit for all HDR/DV
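A back-of-the-envelope Python sketch of why that pairing makes sense: HDMI 2.0 tops out at 18 Gbps, so at 4K60 the link can carry 4:4:4 at 8-bit or 4:2:0 at 10-bit, but not 4:4:4 at 10-bit. The numbers below use the standard CTA-861 4K60 total timing (4400 x 2250 including blanking) and the TMDS 8b/10b encoding overhead; it's a rough model, not the console's exact signaling.

```python
# Rough HDMI 2.0 bandwidth check for 4K60 pixel formats.
HDMI20_LIMIT_GBPS = 18.0
H_TOTAL, V_TOTAL, FPS = 4400, 2250, 60  # CTA-861 4K60 total timing


def gbps(bit_depth: int, components_per_pixel: float) -> float:
    """Link rate in Gbps for a given bit depth and chroma format.

    components_per_pixel: 4:4:4 = 3, 4:2:2 = 2, 4:2:0 = 1.5
    (4:2:0 averages one luma + half a chroma sample per pixel).
    """
    data_bits = H_TOTAL * V_TOTAL * FPS * bit_depth * components_per_pixel
    return data_bits * 10 / 8 / 1e9  # TMDS 8b/10b encoding overhead


print(f"4K60 4:4:4  8-bit: {gbps(8, 3):.2f} Gbps")    # fits under 18
print(f"4K60 4:4:4 10-bit: {gbps(10, 3):.2f} Gbps")   # exceeds the limit
print(f"4K60 4:2:0 10-bit: {gbps(10, 1.5):.2f} Gbps") # fits easily
```

So the console picks the best format each signal type allows: full-chroma 8-bit for SDR (where 10-bit buys little) and 10-bit 4:2:0 for HDR/DV (where bit depth matters more than chroma resolution).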
 

inspectah

Member
Oct 28, 2017
1,183
Germany
I wanted to try these settings; unfortunately you can't use TruMotion in PC mode (not even Real Cinema). :(
I use it for movies and don't want to give it up, because it makes low-framerate content so much more pleasant to watch on this TV.
All my inputs run through my AVR, so I can't use a different input for gaming on the TV.

I'm pretty satisfied with gaming mode though.
The only gripe I really have with it for SDR content is the fixed color gamut.
I would understand if it were fixed to Auto... but Wide???
Why LG??? Why?? I don't understand it.
 

Super Craig

Member
Oct 27, 2017
653
I wanted to try these settings; unfortunately you can't use TruMotion in PC mode (not even Real Cinema). :(
I use it for movies and don't want to give it up, because it makes low-framerate content so much more pleasant to watch on this TV.
All my inputs run through my AVR, so I can't use a different input for gaming on the TV.

I'm pretty satisfied with gaming mode though.
The only gripe I really have with it for SDR content is the fixed color gamut.
I would understand if it were fixed to Auto... but Wide???
Why LG??? Why?? I don't understand it.
Does your AVR have two HDMI outputs?
 
OP
OP
P40L0

P40L0

Member
Jun 12, 2018
7,602
Italy
I wanted to try these settings; unfortunately you can't use TruMotion in PC mode (not even Real Cinema). :(
I use it for movies and don't want to give it up, because it makes low-framerate content so much more pleasant to watch on this TV.
All my inputs run through my AVR, so I can't use a different input for gaming on the TV.

I'm pretty satisfied with gaming mode though.
The only gripe I really have with it for SDR content is the fixed color gamut.
I would understand if it were fixed to Auto... but Wide???
Why LG??? Why?? I don't understand it.
On Dolby Vision you can still enable both Real Cinema and TruMotion in PC Mode.
For games those settings make input lag skyrocket, which is why they're grayed out: to always keep it steady at 21ms

Anyway, both PC Input (all modes) and the SDR/HDR Game modes have an equivalent of Real Cinema's motion handling and de-judder by default, so Real Cinema doesn't need to be enabled: they both have the same benefit out of the box.
In my opinion, TruMotion and its soap opera effect really distort reference material and create too many artifacts for my taste, which is why professional calibrators turn it off for any input/mode
 
OP
OP
P40L0

P40L0

Member
Jun 12, 2018
7,602
Italy
What are the settings we should use for HDR standard mode on PS4 pro?
Select:

4K Resolution
60Hz
YUV420 Chroma Subsampling
10-bit Color Depth
Standard/Limited Color Space

Then apply all the rest of the suggested settings for the TV, and set games to 2,000 nits if you are able to, or re-do the hiding of the logos in-game.
 

thuway

Member
Oct 27, 2017
5,168
Select:

4K Resolution
60Hz
YUV420 Chroma Subsampling
10-bit Color Depth
Standard/Limited Color Space

Then apply all the rest of the suggested settings for the TV, and set games to 2,000 nits if you are able to, or re-do the hiding of the logos in-game.
Out of the OP, which page should I use? The webOS HDR mode?
 

pablogers

Member
Oct 27, 2017
130
Do you think HDR PC mode with these settings looks better than Game mode with dynamic contrast on medium?? I'm still trying to see which one I prefer and I'm having doubts :D, I think the input lag is much better with game mode.....
 

GReeeeN

Senior Analyst at GSD
Verified
Mar 6, 2018
329
Do you think HDR PC mode with these settings looks better than Game mode with dynamic contrast on medium?? I'm still trying to see which one I prefer and I'm having doubts :D, I think the input lag is much better with game mode.....

If you have your input set as PC, from my understanding your input lag should be locked at 21ms, which is the same as game mode.

The advantage of running in PC input mode is that you get a brighter HDR image when using the HDR Standard preset, as opposed to the Game device input type with HDR Game mode.
 

inspectah

Member
Oct 28, 2017
1,183
Germany
On Dolby Vision you can still enable both Real Cinema and TruMotion in PC Mode.
For games those settings make input lag skyrocket, which is why they're grayed out: to always keep it steady at 21ms

Anyway, both PC Input (all modes) and the SDR/HDR Game modes have an equivalent of Real Cinema's motion handling and de-judder by default, so Real Cinema doesn't need to be enabled: they both have the same benefit out of the box.
In my opinion, TruMotion and its soap opera effect really distort reference material and create too many artifacts for my taste, which is why professional calibrators turn it off for any input/mode
DV is not an option, because not all HDR content uses it.
And then there is still SDR.

Of course I use TruMotion only for movies and series, because of the lag.

TruMotion distorts nothing; remember, this is not a binary setting!
I have it set to 3 and there is no visible artifacting in motion (someone did extensive tests on avsforum or avforums, I don't remember which) and the effect is very subtle, so there is also no soap opera effect.

For me the setting is mandatory on the 7 series, because otherwise video content stutters very visibly due to the almost instant pixel transitions of OLED technology.
Cinema projectors and other technologies like LCD have much slower transition times between frames, which adds motion blur and makes the image look smoother.
And that's what reference material (at least for movies) is mastered for.
In the 8 series they added black frame insertion to better combat this stuttering.

Thanks for your settings anyway, I will try them a bit on a different input. :)

P.S.: Do you have a changelog for this FW?
 

pablogers

Member
Oct 27, 2017
130
Select:

4K Resolution
60Hz
YUV420 Chroma Subsampling
10-bit Color Depth
Standard/Limited Color Space

Then apply all the rest of the suggested settings for the TV, and set games to 2,000 nits if you are able to, or re-do the hiding of the logos in-game.


And for SDR games on the PS4 Pro, what do you suggest? I'm having trouble understanding whether I have to change back to RGB / Full and high blacks while in SDR and Expert (Dark Room)... please help :D, and thanks, amazing thread.
 
OP
OP
P40L0

P40L0

Member
Jun 12, 2018
7,602
Italy