
FuturaBold

Member
Oct 27, 2017
2,518
Any reason for gamma at 2.0? I feel like that's way too bright. I run everything at 2.4, but I do have an OLED, so I have great black detail.
According to Rtings, the TCL-P series tracks closer to 2.2 gamma (the HD standard) when set to 2.0. The TCL-P does have one of the best contrast ratios for an LCD TV. As a former long-time Sony TV owner, I also used to set the Sony to 2.4. However, 2.4 on the TCL-P crushes blacks.
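To put rough numbers on why 2.4 crushes blacks on a panel like that: display gamma is approximately a power law from signal level to light output, so the same near-black signal comes out darker the higher the gamma. A simplified sketch (illustrative only; it ignores the TV's own processing entirely):

```python
# Illustrative only: simple power-law gamma, ignoring the set's own processing.
def display_luminance(signal, gamma=2.2, peak_nits=100.0):
    """Map a relative signal level (0.0-1.0) to output luminance in nits."""
    return peak_nits * (signal ** gamma)

# The same dark 10% signal at different gamma settings:
print(display_luminance(0.1, gamma=2.0))  # ~1.00 nit
print(display_luminance(0.1, gamma=2.2))  # ~0.63 nits
print(display_luminance(0.1, gamma=2.4))  # ~0.40 nits -> shadows pushed darker, closer to crush
```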
 

dsk1210

Member
Oct 25, 2017
2,390
Edinburgh UK
You know what annoys me!

The fact that Windows does not have an HDR shortcut key.

If I want to play Final Fantasy in HDR, I always have to go into the settings and change it manually, then go back in and change it back when I am finished.

Simple but annoying and easy to solve I am sure.
 
OP
OP
EvilBoris

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,680

shinbojan

Member
Oct 27, 2017
1,101
If I want to play Final Fantasy in HDR, I always have to go into the settings and change it manually, then go back in and change it back when I am finished.

Yes, that is annoying.
I've been looking for a way to do this with PowerShell, so that I can make a script that would enable/disable it by double-clicking a file on the desktop (not ideal, but better than going into settings).
Will let you know if I find a way.
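In the meantime, one possible route (a rough sketch, assuming a newer Windows build where the Xbox Game Bar maps Win+Alt+B to the HDR toggle; written in Python via ctypes here, though the same keybd_event call could be wrapped in PowerShell with Add-Type):

```python
# Sketch: simulate the Win+Alt+B chord that newer Windows builds with Xbox Game Bar
# map to the HDR on/off toggle. Assumes that shortcut exists on your system.
import ctypes
import time

user32 = ctypes.windll.user32

VK_LWIN, VK_MENU, VK_B = 0x5B, 0x12, 0x42   # Windows key, Alt, 'B'
KEYEVENTF_KEYUP = 0x0002

def toggle_hdr():
    for vk in (VK_LWIN, VK_MENU, VK_B):      # press the chord
        user32.keybd_event(vk, 0, 0, 0)
        time.sleep(0.05)
    for vk in (VK_B, VK_MENU, VK_LWIN):      # release in reverse order
        user32.keybd_event(vk, 0, KEYEVENTF_KEYUP, 0)
        time.sleep(0.05)

if __name__ == "__main__":
    toggle_hdr()
```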
 

shinbojan

Member
Oct 27, 2017
1,101
Can you try out Injustice 2? Demo is free on PSN.
I was pleasantly surprised by how it looks; I was expecting far worse. Of course, I would not play online in HDR; it's just too distracting.
 

J_ToSaveTheDay

Avenger
Oct 25, 2017
18,789
USA
Nice work and thanks for the run-down, EvilBoris!

As someone who just got an LG OLED B7A, I'm still coming to grips with black levels. Compared to my SDR experience and how I used to configure things purely by eye, the black levels this TV reaches in HDR mode sometimes give me the impression of crushed blacks. I follow the generalized HDR Game Mode settings from Rtings with Wide Color Gamut selected, out of a purely subjective preference for more vibrant color.

In many games on my B7A (like AC:O or Injustice 2), I notice there's a point where cranking a slider higher has no visible effect on the reference image at all. In AC:O, for example, going from 900 to 1000 nits on the slider is the last change I can actually see -- no matter how much higher I go from there, black levels don't change and the bright highlights don't appear to get any brighter. Based on the information in your OP, I take it this is where metadata is being fed to the TV and the TV is tone mapping any changes?

If that's the case, is the ideal setting always to just crank these sliders to max and let the TV sort things out, or do I want to set them somewhere in the range where I can still see differences in the image at each notch?

As an additional example, I feel like at around 3500 nits on Injustice 2's calibration slider, I can no longer see the inner box of the calibration screen, but I also can't see the blacks of the image changing at all (I don't see the image getting brighter around Batman standing at his console in the background, for instance).
 
OP
OP
EvilBoris

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,680
If the game has a slider, you are simply telling the engine where to stop producing light brighter than the display can actually show. The game will also often configure other aspects of the image to ensure it still looks right within the range you have set.

The game always sends the raw image data, which tells the TV what to display; the metadata that is sent never changes. The metadata contains information about the type of display the content was mastered on. Your display is then meant to use this data to make some changes to the overall image, to try to ensure the image still looks broadly similar despite your TV perhaps not having the hardware capability of the mastering display.

In the case of the B7, its max output is around 800 nits, so technically going above that won't actually gain anything. With a UHD movie, if you send that TV a disc whose metadata says "the brightest pixel in this movie is 1000 nits", the TV takes that information and adjusts the displayed image so that anything the disc says should be 1000 nits is actually moved down to the brightest the TV can do (800 nits). So every bit of data that exists between 800 and 1000 nits needs to be mapped downwards, without impacting the rest of the image too much.

This is the part that all HDR10 TVs handle differently.
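Purely as an illustration of that kind of roll-off (a simple linear knee above ~75% of the panel's peak; no real TV uses exactly this curve, and the numbers are just examples):

```python
def tone_map(scene_nits, display_peak=800.0, content_peak=1000.0, knee=0.75):
    """Illustrative roll-off: pass everything below a knee through unchanged,
    then compress the rest so content_peak lands exactly at display_peak."""
    knee_nits = knee * display_peak          # e.g. 600 nits on an ~800-nit panel
    if scene_nits <= knee_nits:
        return scene_nits                    # shadows and mids untouched
    t = (scene_nits - knee_nits) / (content_peak - knee_nits)
    return knee_nits + t * (display_peak - knee_nits)

print(tone_map(500.0))    # 500.0 -> unchanged
print(tone_map(900.0))    # 750.0 -> compressed
print(tone_map(1000.0))   # 800.0 -> the content's brightest pixel lands at the panel's peak
```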


So in a game such as Injustice 2, that HDR adjustment you are making should only be affecting the very brightest things. It sounds like it has a very good lighting system and display mapper, and as you increase the slider up towards those 3500-nit values, the algorithm used to generate the grading sees there is a bit more headroom, so not as much 'artificial' contrast is required, and you see the blacks lifting a little.

So in a nutshell, if your TV is configured in a way that means it isn't making dynamic adjustments or moving too far away from the HDR10 PQ standard, setting your in-game setting to match your TV's peak brightness (which is somewhere between 700 and 800 nits) should provide you with a technically optimal image for a game.
 

Waffle

Member
Oct 28, 2017
2,821
Pretty confused about the Tomb Raider luminance slider. So setting it to max lets the TV tone map it without clipping a lot of the bright areas? Also, if I'm trying to eyeball it with the on-screen instructions, I think the game wants me to set it only two clicks from minimum, which I think you said behaves like the Frostbite games -- so does that mean one or two clicks away from minimum is SDR range? I might be misunderstanding and very confused lol
 

ss_lemonade

Member
Oct 27, 2017
6,648
You know what annoys me!

The fact that Windows does not have an HDR shortcut key.

If I want to play Final Fantasy in HDR, I always have to go into the settings and change it manually, then go back in and change it back when I am finished.

Simple but annoying and easy to solve I am sure.
I'm surprised they didn't just add an HDR setting in-game instead of relying on Windows. Is this because there's also a Windows Store version, and they just decided to reuse the same codebase between the two?
 

Samaritan

Member
Oct 25, 2017
6,696
Tacoma, Washington
So in a nutshell, if your TV is configured in a way that means it isn't making dynamic adjustments or moving too far away from the HDR10 PQ standard, setting your in-game setting to match your TV's peak brightness (which is somewhere between 700 and 800 nits) should provide you with a technically optimal image for a game.
This isn't always the case, though. I also own a B7, like J_ToSaveTheDay, and in the case of Assassin's Creed: Origins, for example, setting peak brightness to 700 or 800 produces an incorrect image, as you can see major artifacts in the test image around where the sun is. The brightness on that test image only starts looking correct as you approach 1000, around 950 or so. So what's going on there?
 

Barneystuta

Member
Nov 4, 2017
1,637
Over time, HDR is a game changer. It is already in games today.

However, the tech still lacks a standard. I don't mean Dolby Vision or HDR10, more the fact that calibration for different games will give a different result.

Two extreme examples are Gears of War 4 and ReCore.

Gears is fantastic and I've got a near-perfect (for my TV/setup) calibration. Boot up ReCore and it is a mess.

The more games add in their own calibration, the better, because different games will always have different light levels and colour palettes.
 

texhnolyze

Member
Oct 25, 2017
23,154
Indonesia
Over time, HDR is a game changer. It is already in games today.

However, the tech still lacks a standard. I don't mean Dolby Vision or HDR10, more the fact that calibration for different games will give a different result.

Two extreme examples are Gears of War 4 and ReCore.

Gears is fantastic and I've got a near-perfect (for my TV/setup) calibration. Boot up ReCore and it is a mess.

The more games add in their own calibration, the better, because different games will always have different light levels and colour palettes.
The mass market won't buy HDR displays in droves unless it's a one-click setting at startup that they never have to touch again. People don't want to deal with all the tinkering every time they want to switch games.
 
OP
OP
EvilBoris

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,680
This isn't always the case, though. I also own a B7, like J_ToSaveTheDay, and in the case of Assassin's Creed: Origins, for example, setting peak brightness to 700 or 800 produces an incorrect image, as you can see major artifacts in the test image around where the sun is. The brightness on that test image only starts looking correct as you approach 1000, around 950 or so. So what's going on there?

I would say that's caused by a bug; after all, it's a static image, not a real-time rendered, lit scene.
It may even be that the peak brightness slider only affects a couple of parts of that image, and it's possible to push those parts below the brightness of the surrounding image, which produces those artifacts.
Deus Ex does this in game.
 

RashBandicoot

Member
Nov 3, 2017
124
So far the only two HDR games that don't work for me are Dissidia Final Fantasy and Sea of Thieves. Both look muddy and drab when HDR is switched on.

The rest, including the more recent Ni No Kuni II, look stunning in HDR.
 
OP
OP
EvilBoris

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,680
Over time, HDR is a game changer. It is already in games today.

However, the tech still lacks a standard. I don't mean Dolby Vision or HDR10, more the fact that calibration for different games will give a different result.

Two extreme examples are Gears of War 4 and ReCore.

Gears is fantastic and I've got a near-perfect (for my TV/setup) calibration. Boot up ReCore and it is a mess.

The more games add in their own calibration, the better, because different games will always have different light levels and colour palettes.

That's mainly because ReCore's calibration is actually broken.

We are still in a period where developers haven't found best practice yet.

There are good examples of games that work really well out of the box with no real calibration: Final Fantasy, Destiny 2, Agents of Mayhem, Sea of Thieves, Horizon.

Then there are games that offer calibration and deliver fantastic results: Tomb Raider, COD WW2, Assassin's Creed, Battlefront 2.

Then you've got others in between, where the HDR implementation is perhaps trying to deal with people's misunderstanding of what HDR does, poor user calibration, or even trying to work around the reality that lots of users are running HDR on screens without the hardware capability to make use of it.
God of War is an example of this; it's actually pretty poor as a showcase or "reference" example.

Sea of Thieves, however, has had a lot of people complain about it (although it follows the HDR PQ standard to a T), yet it is still a title that really demonstrates just how impressive HDR can be.
 

inspectah

Member
Oct 28, 2017
1,183
Germany
I got a C7 and the God of War Pro bundle last week and was expecting a great deal of fiddling with HDR, but it was pretty straightforward.

I used the Rtings settings and left everything on default in GoW and it looks beautiful and natural.

Only thing I'm not sure about is the color temperature slider.
I have it on W30 for my UHD collection, because movies are mastered with the warm picture, but I'm not sure about games.
I currently have it on 0 for HDR and SDR, any other opinions?
 

Kyle Cross

Member
Oct 25, 2017
8,413
I got a C7 and the God of War Pro bundle last week and was expecting a great deal of fiddling with HDR, but it was pretty straightforward.

I used the Rtings settings and left everything on default in GoW and it looks beautiful and natural.

Only thing I'm not sure about is the color temperature slider.
I have it on W30 for my UHD collection, because movies are mastered with the warm picture, but I'm not sure about games.
I currently have it on 0 for HDR and SDR, any other opinions?
W45 is actually closest to the mastering standard, at least according to Rtings.
 

Maturin

Member
Oct 27, 2017
3,101
Europe
W45 is actually closest to the mastering standard, at least according to Rtings.

While W45 is close to Warm2 - the supposedly correct setting - I've seen problems with some LG models in HDR at this setting. Not only my issue below, but I've had discussions with others who've had a similar problem (and solution) on 2016/17 LG models.

For my own 2016 LCD, Warm2/W45 is fine for SDR and Dolby Vision. But for HDR - no matter the source - Warm2/W45 gets really overblown: too warm, too saturated, and it seems to damage HDR highlights. This is odd given that Warm2/W45 works absolutely fine for Dolby Vision.

Anyway, whatever the cause, I've found that for HDR10 sources Warm1/W15 looks much closer to how Warm2/W45 looks in SDR/DV. And so Warm1/W15 is my preferred setting when watching/playing anything in HDR, not because I don't like Warm2, but because Warm2 seems to be broken in HDR.
 
OP
OP
EvilBoris

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,680
So I saw that Monster Hunter World had an update recently and thought I'd revisit it, as it looked like it might be a little better than it was.

Looks like this is the case on the default brightness setting of 2:
hyGuXBk.jpg


It still has abnormally raised blacks in this area, but the overall brightness is now significantly reduced.

At brightness 2, even staring at the sun now caps out at closer to 1000 nits.

D5imRMn.jpg


Quite a departure from the eye-searing APL and insane brightness seen back in February.

Monster Hunter World
Much like Deus Ex, Monster Hunter World appears to operate within 4000 nits; however, also like Deus Ex, when HDR is enabled the game appears to have severe black level problems. At the default brightness setting, this is what we are getting.
ux3EHkT.jpg

aRiOgO6.jpg


All mids and highlights, where are the shadows?

Heading in the right direction for improvement anyway.
 

taggen86

Member
Mar 3, 2018
464
Final Fantasy XV
From one Squarenix meh, to a Squarenix wow.
1000 nit fixed max output and a simple brightness slider to drop black levels.
Really fantastic grading throughout and in various lighting conditions.

jpKBjZw.jpg


xzvttv8.jpg


Even the title screen has 2D elements optimised for HDR.
7zGfh6M.jpg






Any tips for the PC version of Final Fantasy XV, EvilBoris? The game has an HDR luminance slider between 0 and 1000 (in nits, I suppose). Should I put it at 1000 or at my TV's peak brightness of 700 (my B6 OLED has a peak brightness between 600 and 800)? 1000 provides a brighter overall image but clips a lot of white detail (it is difficult to differentiate the sun from the clouds, for example, just like in the PS4 version), while 700 is somewhat dimmer but preserves a lot of detail. In the absence of a test image, is it reasonable to use the sun as a reference when adjusting in-game HDR settings?

Also, if we put (non-HDR) brightness as low as possible, as you recommend in games, isn't there a risk that we clip shadow detail? (Games usually have a test image that you should follow for brightness so you don't clip near-black detail.) Shouldn't we follow the in-game test images instead?
 
OP
OP
EvilBoris

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,680
Any tips for the PC version of Final Fantasy XV, EvilBoris? The game has an HDR luminance slider between 0 and 1000 (in nits, I suppose). Should I put it at 1000 or at my TV's peak brightness of 700 (my B6 OLED has a peak brightness between 600 and 800)? 1000 provides a brighter overall image but clips a lot of white detail (it is difficult to differentiate the sun from the clouds, for example, just like in the PS4 version), while 700 is somewhat dimmer but preserves a lot of detail. In the absence of a test image, is it reasonable to use the sun as a reference when adjusting in-game HDR settings?

Also, if we put (non-HDR) brightness as low as possible, as you recommend in games, isn't there a risk that we clip shadow detail? (Games usually have a test image that you should follow for brightness so you don't clip near-black detail.) Shouldn't we follow the in-game test images instead?

I typically use the in-game sun as a reference for the brightest content.
The other brightness slider depends on the game: lots of games work just fine on the default, some require it to be dialled back, and others will simply crush shadow detail.

There are a number of games where the in-game brightness control has only been calibrated for SDR and it doesn't function as intended in HDR mode.
 

taggen86

Member
Mar 3, 2018
464
I typically use the in-game sun as a reference for the brightest content.

When using the sun as a reference, I guess you want the sun as bright as possible while still being able to differentiate it from the nearby clouds (maximise peak brightness while not clipping too much detail)? If that is the aim, then it is clear that adjusting the HDR peak brightness for your TV (as in FF XV on PC) is superior to a 1000-nit lock (FF XV on PS4 Pro), as long as your TV is not capable of 1000 nits (as in the case of my B6 OLED). In the latter case, it is not possible to differentiate the sun from the clouds, and FF XV on PS4 produces a washed-out image in comparison to FF XV on PC.
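To put illustrative numbers on that clipping (hypothetical scene values, not measurements from the game):

```python
def displayed(scene_nits, display_peak=700.0):
    """With no tone mapping, anything brighter than the panel's peak hard-clips
    to the same full white, so detail above the peak is lost."""
    return min(scene_nits, display_peak)

sun, clouds = 1000.0, 850.0                  # hypothetical scene values
print(displayed(sun), displayed(clouds))     # 700.0 700.0 -> indistinguishable
# If the game is instead told the panel peaks at ~700 nits, it grades the scene so
# the sun sits at ~700 and the clouds below it, and the distinction survives.
```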
 

Laserdisk

Banned
May 11, 2018
8,942
UK
I wish 4K and 10K nits were the standard levels; it's a shame that, like UHD discs, they are dumbing it down with lower-nit mastering.
 

Angel DvA

Banned
Oct 27, 2017
1,232
You should analyze Detroit; the game is just stunning and the HDR implementation is the best I've ever seen.
 
OP
OP
EvilBoris

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,680
I wish 4K and 10K nits were the standard levels; it's a shame that, like UHD discs, they are dumbing it down with lower-nit mastering.

How are they going to master it? There isn't a 10,000-nit display available.
Dolby's reference monitor, which can hit 4,000 nits, has to be liquid-cooled because the LEDs generate so much heat.
Even Sony's reference OLED display is 1,000 nits.
 

Dinobot

Member
Oct 25, 2017
5,126
Toronto, Ontario, Canada
Oh I'm subbing to this thread. Got my X900E a week ago.

HDR has impressed me much more than 4K has.

Currently playing God of War.

Tested Ratchet, Uncharted 4 and LL, Infamous SS and FL, Horizon and TLOU.

All are beautiful in their own way. Games that feature a wide color palette really shine (Ratchet, Uncharted, and Horizon being the most colorful due to the locales).
 

Laserdisk

Banned
May 11, 2018
8,942
UK
How are they going to master it? There isn't a 10,000-nit display available.
Dolby's reference monitor, which can hit 4,000 nits, has to be liquid-cooled because the LEDs generate so much heat.
Even Sony's reference OLED display is 1,000 nits.

4K was what I led with, and it's not about future-proofing; mastering for the OLED market is a foolhardy way of doing it.
My point is they need to master higher; it's not their fault that a few displays are hitting under 600 nits full-window.
If they cannot hit it or tone map it down, they need to work on their display tech.
The ZD9 can do both, and most Sony displays can tone map down as well.
 

Videophile

Tech Marketing at Elgato
Verified
Oct 27, 2017
63
San Francisco, CA
Using an HDFury Vertex, I checked the HDR output of an Xbox One X and noticed it did not define any sort of max brightness (MaxCLL) or average brightness (MaxFALL). Does this mean TVs will assume they are receiving a 10,000-nit signal?
 
OP
OP
EvilBoris

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,680
Using an HDFury Vertex, I checked the HDR output of an Xbox One X and noticed it did not define any sort of max brightness (MaxCLL) or average brightness (MaxFALL). Does this mean TVs will assume they are receiving a 10,000-nit signal?

The metadata on all consoles is actually outputting the EOTF as 255.
This value should be 0-4, depending on whether it's SMPTE 2084, HLG, or one of the others.
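For reference, a rough sketch of what that static metadata carries (HDR10 / the Dynamic Range and Mastering InfoFrame; the field names here are simplified and the byte layout is omitted, so treat it as illustrative only):

```python
from dataclasses import dataclass

# Rough sketch of the fields carried as HDR10 static metadata -- simplified names,
# not the exact InfoFrame byte layout.
EOTF_CODES = {
    0: "Traditional gamma (SDR)",
    1: "Traditional gamma (HDR)",
    2: "SMPTE ST 2084 (PQ / HDR10)",
    3: "Hybrid Log-Gamma (HLG)",
    # values beyond the defined ones are reserved, so a stray 255 maps to nothing
}

@dataclass
class Hdr10StaticMetadata:
    eotf: int                 # one of EOTF_CODES
    max_cll: int = 0          # Maximum Content Light Level, nits (0 = "unknown")
    max_fall: int = 0         # Maximum Frame-Average Light Level, nits (0 = "unknown")
    mastering_peak: int = 0   # mastering display max luminance, nits (0 = "unknown")

# What a console sending PQ but no brightness hints might look like:
signal = Hdr10StaticMetadata(eotf=2, max_cll=0, max_fall=0, mastering_peak=0)
print(EOTF_CODES.get(signal.eotf, "reserved/invalid"))
```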

TVs respond to false data in different ways.
The KS8000, for example, just follows the absolute values and clips at a fixed point.

I theorise that some sets assume, for example, 4000-nit data and perform some kind of tone mapping.
This fits with users who describe their TV letting them select 3000-4000-nit max outputs in games, despite the TV not physically being able to display this.
I'm able to replicate the same thing by telling my TV to tone map.

I also suspect there is a lack of awareness of this in gaming. I suspect God of War was mastered on a set that was tone mapping its 4000-nit data (probably an LG OLED, given their ubiquity among AV enthusiasts). When this image is presented on a display that exceeds the peak of that consumer set, or on a display that doesn't try to tone map the image, it just looks too bright and has a lot of clipping.
This also explains why there was a multi-page thread dedicated to how to calibrate the game's very simple settings, which never came to a consensus.

This then leaves a third option: TVs that don't recognise this false data at all. These sets may not even recognise the signal as HDR. Is this why some users describe a washed-out image?

A couple of UHD discs use this trick, Planet Earth II being the most notable. The content doesn't exceed roughly 600 nits, well within the range of most TVs without any tone mapping.
 
OP
OP
EvilBoris

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,680
4K was what I led with, and it's not about future-proofing; mastering for the OLED market is a foolhardy way of doing it.
My point is they need to master higher; it's not their fault that a few displays are hitting under 600 nits full-window.
If they cannot hit it or tone map it down, they need to work on their display tech.
The ZD9 can do both, and most Sony displays can tone map down as well.

For video content it's fine, as you provide the metadata to help those 600-nit displays show that image correctly.

You say you lead with 4,000 nits? What do you do?

For games you don't have that option for metadata, so the best option is to have a display mapper that does it all dynamically. I don't think we will see Dolby Vision or dynamic metadata, because games are by nature very variable and the data is generated in real time, so it can be controlled at the point of output.
I'd like to see a global, system-wide control that sets peak brightness.
 

Laserdisk

Banned
May 11, 2018
8,942
UK
For video content it's fine, as you provide the metadata to help those 600-nit displays show that image correctly.

You say you lead with 4,000 nits? What do you do?

For games you don't have that option for metadata, so the best option is to have a display mapper that does it all dynamically. I don't think we will see Dolby Vision or dynamic metadata, because games are by nature very variable and the data is generated in real time, so it can be controlled at the point of output.
I'd like to see a global, system-wide control that sets peak brightness.
I am not a fan of Dolby Vision; I don't like playing to the cheap seats.
And for video, they are all mastering lower of late, and that is painful to me, as those masters will become obsolete pretty soon.
 

Videophile

Tech Marketing at Elgato
Verified
Oct 27, 2017
63
San Francisco, CA
The metadata on all consoles is actually outputting the EOTF as 255.
This value should be 0-4, depending on whether it's SMPTE 2084, HLG, or one of the others.

TVs respond to false data in different ways.
The KS8000, for example, just follows the absolute values and clips at a fixed point.

I theorise that some sets assume, for example, 4000-nit data and perform some kind of tone mapping.
This fits with users who describe their TV letting them select 3000-4000-nit max outputs in games, despite the TV not physically being able to display this.
I'm able to replicate the same thing by telling my TV to tone map.

I also suspect there is a lack of awareness of this in gaming. I suspect God of War was mastered on a set that was tone mapping its 4000-nit data (probably an LG OLED, given their ubiquity among AV enthusiasts). When this image is presented on a display that exceeds the peak of that consumer set, or on a display that doesn't try to tone map the image, it just looks too bright and has a lot of clipping.
This also explains why there was a multi-page thread dedicated to how to calibrate the game's very simple settings, which never came to a consensus.

This then leaves a third option: TVs that don't recognise this false data at all. These sets may not even recognise the signal as HDR. Is this why some users describe a washed-out image?

A couple of UHD discs use this trick, Planet Earth II being the most notable. The content doesn't exceed roughly 600 nits, well within the range of most TVs without any tone mapping.

This is the info I see from an Xbox One X (playing Far Cry 5) going into an Atomos Ninja Inferno. It seems odd that no HDR info besides the EOTF is being defined.
cXHP7k3.png
 
OP
OP
EvilBoris

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,680
This is the info I see from an Xbox One X (playing Far Cry 5) going into an Atomos Ninja Inferno. It seems odd that no HDR info besides the EOTF is being defined.
cXHP7k3.png
Sorry, I've made a mistake somewhere; I just went and checked my notes. I had it in my head that what I was seeing was the 02:00 value, which was being set to something other than SMPTE 2084.