
Ferrs

Avenger
Oct 26, 2017
18,829
Hello,

I play on a standard PS4 on an LG C8. When I set my console to RGB on automatic it goes to full RGB, which forces me to switch to the high black level on my C8, which apparently is not good for SDR. So I have to force the PS4 into RGB Limited and set my C8's black level to low.

The problem is that you say Limited should only be used for SDR, and therefore that Limited is bad for HDR games. Does that mean I have to change the RGB setting from Limited to Full every time I want to switch between an SDR and an HDR game?

Do I have to set my standard PS4 to Limited/low for both SDR and HDR, or to Full/high?

thank you so much

Limited or Full doesn't matter for HDR; PS4 HDR doesn't output RGB but YUV. Leave your console on Limited because, as you said, the C8 and the PS4 suck at handshaking.
 

2Blackcats

Member
Oct 26, 2017
16,057
ok.

So in a perfect world, with a (non-existent) perfect display, you feed the TV a value (the values are along the bottom) and you get back the corresponding output of light (the vertical axis is 0-10,000 nits).
[Image: SzUoxlm.png]


However, we don't have this type of display: a real display can only get so bright, and then the amount of light it can output flatlines for larger values.
So a display operating very strictly within its limits would do this: all values above 750 nits flatline and produce this hard clip. This presents itself as "blown out" whites and other anomalies in data that sits above that brightness.
[Image: I6c3sdh.png]
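To make the hard clip concrete, here's a minimal sketch (the 750-nit peak is just the example figure from above, and `hard_clip` is a made-up name, not anything a real TV exposes):

```python
# A hypothetical display that tracks the signal 1:1 in linear light
# up to its peak, then hard-clips everything above it.
def hard_clip(scene_nits: float, panel_peak_nits: float = 750.0) -> float:
    """Light the panel emits for a given scene-referred value, in nits."""
    return min(scene_nits, panel_peak_nits)

# Everything from 750 up to 10,000 nits collapses to the same output,
# which is what shows up as "blown out" highlights.
print(hard_clip(500))    # 500.0 - reproduced faithfully
print(hard_clip(4000))   # 750.0 - clipped
print(hard_clip(10000))  # 750.0 - clipped
```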



BT.2390 is a recommended standard for using the metadata about the movie, which says how bright the content is, to decide how the TV should handle it.

So here is one for a 750-nit display being fed a piece of 4000-nit content:
[Image: FQp4p7Q.png]

This helps reduce those artefacts by creating a smooth transition from the detail the display can produce to the data that it cannot, so there are no weird artefacts or apparent overexposure.
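For the curious, here is a rough sketch of that kind of roll-off in normalised PQ space, loosely following the BT.2390 EETF (1:1 up to a knee, then a Hermite spline). It assumes a black level of zero and skips the black-level lift, so treat it as an illustration rather than a reference implementation:

```python
def bt2390_rolloff(e: float, source_max_pq: float, target_max_pq: float) -> float:
    """Roll a PQ-encoded signal mastered to source_max_pq into a display
    that can only reach target_max_pq (both given as PQ values in 0..1)."""
    max_lum = target_max_pq / source_max_pq   # display peak, normalised to the content range
    e1 = min(e / source_max_pq, 1.0)          # normalise the input signal
    ks = 1.5 * max_lum - 0.5                  # knee start: below this, pass through 1:1
    if ks >= 1.0 or e1 < ks:
        e2 = e1
    else:
        # Hermite spline bending everything above the knee into the remaining headroom
        t = (e1 - ks) / (1.0 - ks)
        e2 = ((2 * t**3 - 3 * t**2 + 1) * ks
              + (t**3 - 2 * t**2 + t) * (1.0 - ks)
              + (-2 * t**3 + 3 * t**2) * max_lum)
    return e2 * source_max_pq                 # back to the original PQ scale
```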

Example: right, the original image; left, the 4000-nit tone-mapped image.
[Image: 1ZS4OkS.png]

However, the downside of doing this is that we have lost luminance in certain parts of the image; notice the clouds are now darker than they should be. In order to preserve the colour detail, we have lost luminance.

This is what LG OLEDs traditionally did even for games where there was no data above 1000 nits, so luminance would actually be lost for no good reason.


DTM, on the other hand, will more often than not do something more like this:
[Image: zqCtAYw.png]

to create a more poppy image.
[Image: mT08hd9.png]


But what if the game has already done something similar, and then the TV does the same thing on top of it, or maybe even tries to undo it?

Thanks a million.
 
Oct 25, 2017
13,246
Insightful posts, EvilBoris!

--

Has anyone calibrated the HDR on their TV? I was able to calibrate the SDR mode on my TV to Rec 709 with BT 1886 gamma (the end result is quite pleasing) but guides and workflows for HDR calibration seem to be... sparse.
 
OP

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,680
Insightful posts, EvilBoris!

--

Has anyone calibrated the HDR on their TV? I was able to calibrate the SDR mode on my TV to Rec 709 with BT 1886 gamma (the end result is quite pleasing) but guides and workflows for HDR calibration seem to be... sparse.

What tools are you using?
 

s-7-n

Member
Jun 10, 2019
34
The reason that HGIG suggested this setting is because there is huge variability in how TVs handle tone mapping content that is above their capability.
It makes more sense for the game to perform this task, because it is already doing this in order to tone map the huge 12-16-bit or 32-bit data ranges it works with internally.

It doesn't make sense for a game to:
  1. Tone map a 32-bit buffer to 10-bit,
  2. then for the TV to try to interpret that data and guess (which is what it is doing) how it should look. This affects how bright the game should be, how dark it should be, and how saturated things should be.
  3. This in turn produces a perceptually <10-bit image, which may mean that certain parts of the image suffer from additional unwanted artefacts.

It makes far more sense for a game to:

  • Tone map a 32-bit buffer to 10-bit,
  • then for the TV to display it exactly as it is presented.

^This way developers can create their display mappers to ensure a more consistent experience across displays with differing capabilities.
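As a loose illustration of that game-side flow, here's a sketch assuming a hypothetical engine that holds a linear scene-referred buffer in nits and knows the user's configured peak (the Reinhard-style curve and every name here are placeholders, not how any particular game does it; the PQ constants are from SMPTE ST 2084):

```python
# SMPTE ST 2084 (PQ) inverse-EOTF constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Encode absolute luminance (0..10,000 nits) to a PQ signal in 0..1."""
    y = (max(nits, 0.0) / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def game_tonemap(scene_nits: float, user_peak_nits: float = 750.0) -> float:
    """Hypothetical game-side mapper: roll the scene into the user's configured
    peak, then PQ-encode. With HGIG the TV then displays this value 1:1."""
    # Simple Reinhard-style roll-off standing in for the game's own curve.
    mapped = scene_nits * user_peak_nits / (scene_nits + user_peak_nits)
    return pq_encode(mapped)

# The 10-bit code value the game would write out for a 4000-nit highlight:
print(round(game_tonemap(4000.0) * 1023))
```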



I'm not sure why the DTM confusion still exists. For as long as TVs have existed, everyone has understood that you turn off dynamic black levels, colour enhancement, contrast enhancer, super black mode, etc.
But for some reason there is a lack of understanding that DTM is doing all of these types of things.

And that's fine; the reason that manufacturers oversaturate, crush blacks and brighten things up is because user tests will always show a preference for this.
It's not a problem to like it (there are loads of situations where, subjectively, that content may look better or be easier to watch), but it is not accurate.
It's no more "right" than having your sharpness on 20 and your colour turned up to 100.

The reason that in certain games you will see more clipping happening is if the game is tone mapping ONLY to an arbitrary value of, say, 4000 nits and allows no actual peak tone mapping of its own (see God of War, Horizon Zero Dawn); in that case HGIG will clip all detail above what the display can show.
This clipping is also usually paired with games where the game's exposure has not been configured for anything beyond SDR, which often results in additional overexposure and clipping anyway.

So when I say 99% of games: the vast majority of games do have an adjustable peak brightness control, and HGIG should be used in those situations.
Even 1000-nit-locked games like FF7R and Borderlands 3 are just fine in that mode, as the difference in data between, say, 700 nits and 1000 nits is marginal and you are unlikely to come across many situations where you see it.

That leaves the older titles and titles which didn't really get things 100% right, e.g. God of War and Horizon, which go all the way to 4000 nits. For these games, turning DTM off will give a BT.2390-standard 4000-nit roll-off (these games were probably tested with this), maintain the bulk of the image 1:1, and roll off that upper part of the image.

DTM should really only be reserved for those titles where the game is unplayably dark and has no regular brightness adjustment, as it typically brightens up the mid tones and deals with some highlights, but will typically introduce more near-black crushing to create more pop.

If you don't always play your games in a dark environment, then I'd actually suggest you enable the AI brightness function: this uses an ambient room sensor and applies a predictable PQ brightening as the room gets lighter, but will drop right back down to reference in a dark room.


For games that make use of an adjustable in-game tone mapper, any loss of detail in the highlights or change in saturation is, in most instances, a deliberate product of the system the game has chosen to use.

I've done a little bit of work for Digital Foundry, both written and video, with John and Tom in the past in regards to RDR2 when that came out; I unfortunately caused the kerfuffle with Rockstar that then led to us getting a proper HDR mode in the game. So it was kinda worth it in the end.
They know where I am if they ever need me :)
I have lots of opportunities with various companies to lend my expertise; I've done a few bits for a few game developers now, some hardware manufacturers and some display manufacturers, trying to help make things better across the board.

Thank you, sir, very insightful & helpful 👍
 

Deleted member 50735

User requested account closure
Banned
Dec 10, 2018
519
Meant to post a while ago, but well done on the Doom HDR vid. I liked it! 👍
Not got the game yet, but will do at some point.
 
OP

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,680
I have the i1Display Pro that's rated for up to 2000 nits, I believe. I've been using HCFR so far for my TV calibration, and DisplayCal for monitor calibration.

You want to do a 2-point white balance calibration.

Then a 20-point grayscale to deal with white balance and sort out the PQ curve. Do this with 1000-nit metadata.

Then try and do the best you can with some saturation sweeps; just bear in mind that you won't be able to get some of the more extreme saturations.
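If it helps, the luminance targets for that PQ grayscale sweep come straight from the SMPTE ST 2084 EOTF; a small sketch (the 5% steps are just the usual convention, not a requirement):

```python
# Target luminance for each point of a PQ grayscale sweep, per SMPTE ST 2084.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """PQ signal (0..1) -> absolute luminance in nits."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

# 20-point sweep in 5% steps; roughly 92 nits at 50% stimulus and 1000 nits at 75%.
for pct in range(5, 101, 5):
    print(f"{pct:3d}% -> {pq_eotf(pct / 100):8.1f} nits")
```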
 
Oct 25, 2017
13,246
You want to do a 2-point white balance calibration.

Then a 20-point grayscale to deal with white balance and sort out the PQ curve. Do this with 1000-nit metadata.

Then try and do the best you can with some saturation sweeps; just bear in mind that you won't be able to get some of the more extreme saturations.

Gotcha, thank you!

I have the Mehanik HDR10 patterns from the AVS forums, which seem to be 1000 nits. Do you have any recommendation on which saturation percentage to target for the patterns (50-75%), since TVs can't hit the full Rec 2020 gamut?
 

Theolaitdoux

Member
Apr 25, 2020
19
[QUOTE = "Ferrs, message: 32416650, membre: 8641"]
Limited ou Full n'a pas d'importance pour HDR, PS4 HDR ne produit pas de RVB mais YUV. Laissez votre console à Limited car, comme vous l'avez dit, la C8 et la PS4 sont nulles.
[/CITATION]
Ce n'est pas tout à fait vrai, en 4k oui ce que vous dites est correct, mais en 1080p la ps4 prend en charge RGB en HDR et dans les informations de signal sur la ps4 cela me dit bien: "signal 1080p RGB HDR" donc je dois faire quoi? Je répète que je suis sur PS4 pas pro. Je vous remercie
 

Theolaitdoux

Member
Apr 25, 2020
19
The reason that HGIG suggested this setting is because there is huge variability in how TVs handle tone mapping content that is above their capability.
It makes more sense for the game to perform this task, because it is already doing this in order to tone map the huge 12-16-bit or 32-bit data ranges it works with internally.

It doesn't make sense for a game to:
  1. Tone map a 32-bit buffer to 10-bit,
  2. then for the TV to try to interpret that data and guess (which is what it is doing) how it should look. This affects how bright the game should be, how dark it should be, and how saturated things should be.
  3. This in turn produces a perceptually <10-bit image, which may mean that certain parts of the image suffer from additional unwanted artefacts.

It makes far more sense for a game to:

  • Tone map a 32-bit buffer to 10-bit,
  • then for the TV to display it exactly as it is presented.

^This way developers can create their display mappers to ensure a more consistent experience across displays with differing capabilities.


I'm not sure why the DTM confusion still exists. For as long as TVs have existed, everyone has understood that you turn off dynamic black levels, colour enhancement, contrast enhancer, super black mode, etc.
But for some reason there is a lack of understanding that DTM is doing all of these types of things.

And that's fine; the reason that manufacturers oversaturate, crush blacks and brighten things up is because user tests will always show a preference for this.
It's not a problem to like it (there are loads of situations where, subjectively, that content may look better or be easier to watch), but it is not accurate.
It's no more "right" than having your sharpness on 20 and your colour turned up to 100.

The reason that in certain games you will see more clipping happening is if the game is tone mapping ONLY to an arbitrary value of, say, 4000 nits and allows no actual peak tone mapping of its own (see God of War, Horizon Zero Dawn); in that case HGIG will clip all detail above what the display can show.
This clipping is also usually paired with games where the game's exposure has not been configured for anything beyond SDR, which often results in additional overexposure and clipping anyway.

So when I say 99% of games: the vast majority of games do have an adjustable peak brightness control, and HGIG should be used in those situations.
Even 1000-nit-locked games like FF7R and Borderlands 3 are just fine in that mode, as the difference in data between, say, 700 nits and 1000 nits is marginal and you are unlikely to come across many situations where you see it.

That leaves the older titles and titles which didn't really get things 100% right, e.g. God of War and Horizon, which go all the way to 4000 nits. For these games, turning DTM off will give a BT.2390-standard 4000-nit roll-off (these games were probably tested with this), maintain the bulk of the image 1:1, and roll off that upper part of the image.

DTM should really only be reserved for those titles where the game is unplayably dark and has no regular brightness adjustment, as it typically brightens up the mid tones and deals with some highlights, but will typically introduce more near-black crushing to create more pop.

If you don't always play your games in a dark environment, then I'd actually suggest you enable the AI brightness function: this uses an ambient room sensor and applies a predictable PQ brightening as the room gets lighter, but will drop right back down to reference in a dark room.

For games that make use of an adjustable in-game tone mapper, any loss of detail in the highlights or change in saturation is, in most instances, a deliberate product of the system the game has chosen to use.

I've done a little bit of work for Digital Foundry, both written and video, with John and Tom in the past in regards to RDR2 when that came out; I unfortunately caused the kerfuffle with Rockstar that then led to us getting a proper HDR mode in the game. So it was kinda worth it in the end.
They know where I am if they ever need me :)
I have lots of opportunities with various companies to lend my expertise; I've done a few bits for a few game developers now, some hardware manufacturers and some display manufacturers, trying to help make things better across the board.
Hello, and thank you for your incredibly detailed posts. I have a few questions to make sure I'm on the right settings on my LG C8 for HDR games.

Can you confirm that I should deactivate the DTM option on my C8 for all content that doesn't have HGiG available?

Does DTM bring anything apart from losing detail and distorting the content?

And if I ever want to use DTM, should I adjust the HDR parameters on the PS4 before activating it or after?

Last question: is 4000/200 good for the RDR2 HDR menu?

thanks again
 

GlowingBovine

Prophet of Truth
Member
Nov 27, 2017
790
I'm playing Jedi: Fallen Order on an Xbox One X on an LG C8, and I've noticed I seem to have raised blacks. The loading screen as well as the black background on the brightness setting screen are not pure black. Is that normal with this game?
 

Theolaitdoux

Member
Apr 25, 2020
19
[QUOTE = "LordDraven, message: 32452902, membre: 52336"]
[USER = 24334] EvilBoris [/ USER] préférez-vous réellement le nouveau paramètre HDR pour Red Dead 2 ou pensez-vous toujours que le SDR est meillewith
[/CITATION]
It's better with HDR, not perfect but it's better than SDR.
 

da1eb7150

Member
May 13, 2019
550
I'm playing Jedi: Fallen Order on an Xbox One X on an LG C8, and I've noticed I seem to have raised blacks. The loading screen as well as the black background on the brightness setting screen are not pure black. Is that normal with this game?
It seems to be normal; it's the same for me, and a few others have mentioned it too.
 

Tahnit

Member
Oct 25, 2017
9,965
I am having an issue with AC Odyssey HDR.

When I'm on the title screen, under HDR calibration the image looks fine after calibrating. When I go in game, though, everything is blown out super bright; it's hard to even see details in the ground, etc.

When I go back to the calibration, the image is blown out severely as well. This makes me think there is a bug with the HDR config.

I don't have MSI Afterburner running, as I know that causes issues.

The TV I have is a TCL 43S525. Not the greatest TV, but the best I could get for the size that I wanted.

Anyone have a fix?
 

flyinj

Member
Oct 25, 2017
10,941
When I play Hitman 2 on my PC at 1080p with HDR enabled, the red icons turn bright pink and they pixelate like crazy. This also happens on all the menus. This is on a Series 6 TCL using a 1070ti connected through HDMI.

Raising the resolution seems to bring these elements closer to red, and the pixelation becomes less apparent.

What is causing this, and is there a way to fix it?
 

da1eb7150

Member
May 13, 2019
550
For people who play Assassin's Creed Odyssey on an LG C9: set HDR in-game to 700 and paper white to 120; it looks epic with HGIG set on the TV. I was playing at 800, but setting it to 700 looks so much better. I assume this should be better on The Division 2 as well, but I've not tried that yet.
 

jb1234

Very low key
Member
Oct 25, 2017
7,225
Anyone check out the beta patch for Ori and the Will of the Wisps? Curious how the HDR is.
 

goaman

Member
Oct 5, 2019
269
For people who play Assassin's Creed Odyssey on an LG C9: set HDR in-game to 700 and paper white to 120; it looks epic with HGIG set on the TV. I was playing at 800, but setting it to 700 looks so much better. I assume this should be better on The Division 2 as well, but I've not tried that yet.

I confirm
 
OP

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,680
I saw EvilBoris was tweeting about helping with the HDR, although I don't know if any of those changes are in the beta patch.




I'm not sure which build the public beta is; the HDR is still being tuned and amended almost daily in the build I have. In fact, some fairly large changes have happened since the public build hit.
A lot of extra love is being given to it, hence my involvement. But for anybody who has the public beta installed, we would be interested in what you think so far.
 

Rotanixel

Member
May 26, 2019
8
I've asked something like this before, but should you aim for the HDR in-game adjustment sliders, like in SWBF2/Frostbite games, to match your TV's max nits?
For example, on the X900F I've got, when it's set at the default contrast setting of 90 the two halves match up around 1400/1500, whereas if I change the contrast to 95 the two halves match up at 1000, which is about the max for the set IIRC.
 
OP

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,680
I've asked something like this before, but should you aim for the HDR in-game adjustment sliders, like in SWBF2/Frostbite games, to match your TV's max nits?
For example, on the X900F I've got, when it's set at the default contrast setting of 90 the two halves match up around 1400/1500, whereas if I change the contrast to 95 the two halves match up at 1000, which is about the max for the set IIRC.

Basically if your display isn't calibrated, then you will get slight numerical mismatches.
The best you can do really is just max out the value based upon what you see and apply that logic to other titles too
 

Nola

Banned
Oct 29, 2017
8,025
I'm not sure which build the public beta is; the HDR is still being tuned and amended almost daily in the build I have. In fact, some fairly large changes have happened since the public build hit.
A lot of extra love is being given to it, hence my involvement. But for anybody who has the public beta installed, we would be interested in what you think so far.
Awesome! Keep up the good work

Was just itching to play this and was wondering if they ever got around to adding HDR to the PC version. Guess I'll hold off a bit longer since it looks like a lot of care is being put into this.
 

Rotanixel

Member
May 26, 2019
8
Basically if your display isn't calibrated, then you will get slight numerical mismatches.
The best you can do really is just max out the value based upon what you see and apply that logic to other titles too
So I take it that leaving the TV's contrast at default and using the 1400/1500 mismatch, rather than pushing it to 95 so it reads as 1000 (the max nits it's rated for), is the better option?
 
OP

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,680
So I take it that leaving the TV's contrast at default and using the 1400/1500 mismatch, rather than pushing it to 95 so it reads as 1000 (the max nits it's rated for), is the better option?

Yeah, as I don't know what the actual Sony tone mapper is doing. There is also a relationship between what you see in terms of colour detail and actual luminance output, but sometimes they are untethered.

Go and take a look at what Rtings have recorded for the HDR contrast setting specifically; they will have calibrated it against the display's PQ curve.
 

Rotanixel

Member
May 26, 2019
8
Cheers, looks like they keep it at 90, so I'll leave it there. Keep up the great work by the way, loving the videos.
 

jb1234

Very low key
Member
Oct 25, 2017
7,225
I'm not sure which build the public beta is; the HDR is still being tuned and amended almost daily in the build I have. In fact, some fairly large changes have happened since the public build hit.
A lot of extra love is being given to it, hence my involvement. But for anybody who has the public beta installed, we would be interested in what you think so far.

Thank you for your hard work!
 

Kiyoshi

Member
Apr 4, 2018
109
I'm not sure which build the public beta is; the HDR is still being tuned and amended almost daily in the build I have. In fact, some fairly large changes have happened since the public build hit.
A lot of extra love is being given to it, hence my involvement. But for anybody who has the public beta installed, we would be interested in what you think so far.

Good to know it's being worked on. The beta version with default HDR settings looks great on my TV (Samsung KS7000 UK version). I've moved onto other games for the time being, but I'm going to revisit this once HDR is finished properly.

As an aside, it's cool to see you're Northampton based. I'm a Northamptonian; currently living in a village near Kettering.
 

Broken Hope

Banned
Oct 27, 2017
1,316
So what are the chances that PS5/XSX will have proper HGIG support so we just set it once in the OS and don't have to set HDR per game any more?
 

sosainas

Member
Sep 13, 2018
896
The PS4 already supports it; it's just that only No Man's Sky supports it. I'm hoping that with the next generation it will be a standard feature of games.

I know about HGIG; I didn't know that it was possible to set up the HDR calibration in the console settings so that a game can just take that info and calibrate itself without any further calibration inside the game.
 

Wag

Member
Nov 3, 2017
11,638
I wish more PC games were in HDR.

I'm playing A Plague Tale right now and I looked at the config file. There is an HDR option, but it doesn't work.
 

Broken Hope

Banned
Oct 27, 2017
1,316
I know about HGIG; I didn't know that it was possible to set up the HDR calibration in the console settings so that a game can just take that info and calibrate itself without any further calibration inside the game.
The PS4 got OS-based HDR settings a few updates ago; it's just that only one game uses it so far. If all games used the feature next gen, you'd set the HDR settings once, in the OS, and that would be it.
 

thePopaShots

Member
Nov 27, 2017
1,687
So if I have a display that only has 600 nits peak brightness, am I correct in assuming that I should not even try the HDR options on PS4 or PC? The few times I have messed around with HDR, the color reproduction seemed really off, like the game was just trying to oversaturate to create the illusion of HDR.
 

MatOfTheDead

Member
May 30, 2018
559
Walsall West Midlands
I wonder if anyone could help. I have a Sony Bravia KD43XF7596BU and I'm wondering what the maximum luminance is on this model, as I can't seem to find it anywhere, so I'm struggling to get Assassin's Creed Origins looking good.
 

NutterB

Member
Oct 27, 2017
388
HDR experts, I have a question for you.

I recently went back to RDR2, and the HDR just looked very wrong: washed-out colors and lighting that lost its mojo.

I know the game got a patch last year to "fix" its issues; however, is it still better to play this game in SDR?
 

2Blackcats

Member
Oct 26, 2017
16,057
HDR experts, I have a question for you.

I recently went back to RDR2, and the HDR just looked very wrong: washed-out colors and lighting that lost its mojo.

I know the game got a patch last year to "fix" its issues; however, is it still better to play this game in SDR?

There's 2 versions of HDR. Cinematic & game. Game is the one to go for. Though I believe it's still not great.
 

goaman

Member
Oct 5, 2019
269
Hello, what HDR settings for Jedi: Fallen Order?
On my OLED C9: game mode, HGIG, and HDR brightness 0.7 in the game. Is this correct?
 
Oct 25, 2017
1,137
Michigan
Anyone have good HDR settings for Ori and the Will of the Wisps? On PC (Game Pass) it looks like it was updated with HDR. I assume HGiG on, but there are settings for "Base Brightness", "Contrast", "UI Brightness", "Richness", and "Shadow Detail". Playing on an LG CX, but I assume whatever the best C9 settings are would apply to the CX as well.
 

Mr Delabee

Member
Oct 25, 2017
1,163
UK
Anyone have good HDR settings for Ori and the Will of the Wisps? On PC (Game Pass) it looks like it was updated with HDR. I assume HGiG on, but there are settings for "Base Brightness", "Contrast", "UI Brightness", "Richness", and "Shadow Detail". Playing on an LG CX, but I assume whatever the best C9 settings are would apply to the CX as well.