
ZmillA

Member
Oct 27, 2017
2,160
Copy pasting something I wrote some time ago.

You've probably been playing games or watching content in HDR on your PC while missing a critical component – 10-bit video.

Windows 10, unlike game consoles, does not automatically switch the bit depth of the outgoing video signal when you launch an HDR game.

By default, both NVIDIA and AMD GPUs are configured to output 8-bit RGB.

You might be wondering, "But my TV turns on its HDR mode and games look better" – and that is indeed true. HDR is a collection of different pieces that, when working together, create the HDR effect. Your PC is sending the WCG (Wide Color Gamut)/BT.2020 metadata, as well as other information, to the TV, which triggers its HDR mode, but the PC is still only sending an 8-bit signal.
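
Just for context, here's an illustrative sketch (not any particular driver API – the field names are my own labels) of roughly what that static metadata contains. Note that none of it says anything about bit depth, which is why the TV can flip into HDR mode while still receiving an 8-bit picture:

```python
# Illustrative: roughly the HDR10 static metadata (CTA-861 Dynamic Range &
# Mastering InfoFrame / SMPTE ST 2086) that the GPU sends to the TV.
# Receiving this flips the TV into its HDR mode; the pixel data itself
# can still be 8-bit.
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    eotf: str = "PQ (SMPTE ST 2084)"          # transfer function
    colorimetry: str = "BT.2020"              # wide color gamut container
    max_mastering_luminance_nits: float = 1000.0
    min_mastering_luminance_nits: float = 0.005
    max_cll_nits: float = 1000.0              # brightest single pixel in the content
    max_fall_nits: float = 400.0              # brightest frame-average light level

print(HDR10StaticMetadata())
```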


How to output 10-bit video on an NVIDIA GPU

NVIDIA GPUs have some quirks when it comes to which bit depths can be output with which formats when connected to a display via HDMI. The list is as follows:

  • RGB/YUV444:
    • 8-Bit
    • 12-Bit
  • YUV422:
    • 8-Bit
    • 10-Bit
    • 12-Bit
What does this mean for you? Not much – 12-bit has the same bandwidth requirements as 10-bit. If you do require RGB/YUV444 and send a 12-bit signal to the TV, that signal still only contains 10-bit data in a 12-bit container. The TV will convert the signal back down to 10-bit.

However, if you want to output true 10-bit, then you'll need to step down to a YUV422 signal. Again, not the end of the world. At normal TV viewing distances (and even on 4K monitors) it is very difficult to tell the difference between 4:4:4 and 4:2:2.
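
If you want to sanity-check the bandwidth math yourself, here's a rough sketch in Python. It assumes the standard CTA-861 4K60 timing (4400×2250 total pixels including blanking) and roughly 14.4 Gbps of usable HDMI 2.0 throughput after 8b/10b encoding, so treat the numbers as approximations:

```python
# Back-of-the-envelope 4K60 bandwidth check against HDMI 2.0.
# Assumptions: CTA-861 4K60 timing (4400 x 2250 total pixels) and ~14.4 Gbps
# of usable data on an 18 Gbps HDMI 2.0 link after 8b/10b encoding.
PIXELS_PER_SECOND = 4400 * 2250 * 60
HDMI20_PAYLOAD_GBPS = 14.4

# Bits per pixel on the wire. HDMI carries YCbCr 4:2:2 in a fixed
# 24-bit-per-pixel container, which is why 10/12-bit 4:2:2 fits wherever
# 8-bit RGB does.
FORMATS = {
    "RGB/YUV444 8-bit": 24,
    "RGB/YUV444 10-bit": 30,
    "RGB/YUV444 12-bit": 36,
    "YUV422 8/10/12-bit": 24,
    "YUV420 10-bit": 15,
}

for name, bpp in FORMATS.items():
    gbps = PIXELS_PER_SECOND * bpp / 1e9
    verdict = "fits" if gbps <= HDMI20_PAYLOAD_GBPS else "too much for HDMI 2.0"
    print(f"{name:20s} {gbps:5.1f} Gbps -> {verdict}")
```

Which is exactly why 4K60 HDR over HDMI 2.0 ends up as either 8-bit RGB or 10-bit 4:2:2.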

The recommended setting in this case is YUV422 video at 10-bit, for both SDR and HDR. This makes the switch seamless and does not require any extra work from you.

How to configure NVIDIA GPUs

  1. Right-click on the Windows desktop
  2. Open the NVIDIA Control Panel
  3. On the left side, click on Resolutions
  4. Click on the Output Color Format dropdown menu and select YUV422
  5. Click on Apply
  6. Now click on the Output Color Depth dropdown menu and select 10 bpc (bits per color)
  7. Click on Apply
That's it. Your GPU is now outputting YUV422 10-bit video to your TV or monitor.

Now launch an HDR game and you'll see the full 10-bit color depth!

I don't have YUV as an option, only RGB and YCbCr.
 

Videophile

Tech Marketing at Elgato
Verified
Oct 27, 2017
63
San Francisco, CA
When I turn on the HDR toggle in Windows my entire screen becomes washed out and it does not look good at all. Any idea what is happening?
Yep...This is why on my TV I don't keep this setting enabled.

Windows does its best to convert all the SDR things (the desktop, programs, basically all of Windows) to HDR, and it's not good at it. HDR material like games, movies, and videos on YouTube will look fine, as those are properly mapped, but anything in SDR being upconverted will not look great.

You can try messing with the brightness slider, which adjusts the mapping, but I've never found a value that works well for me.

Here's MSFT's own help page on it.
 

Kyle Cross

Member
Oct 25, 2017
8,407
That's a really good question; I guess games can access the same list of resolutions too. I absolutely bet that is why people get weirdness on PC with HDR.
I suppose one thing to do is edit the resolutions your display's EDID offers in something like CRU.
Are there any tutorials to help me disable the PC resolutions?
 

noomi

Member
Oct 25, 2017
3,686
New Jersey
1. LG CX
2. 2080 Ti
3. YUV422 via nvCPL
4. Sometimes I switch between 4K/120Hz & 1440/120Hz but that's about it.

I just wish HDMI 2.1 was already out because I want to have 4K/120Hz with HDR, but currently that is impossible :**(
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
1. LG CX
2. 2080 Ti
3. YUV422 via nvCPL
4. Sometimes I switch between 4K/120Hz & 1440/120Hz but that's about it.

I just wish HDMI 2.1 was already out because I want to have 4K/120Hz with HDR, but currently that is impossible :**(

I'm just trying out a new adapter on the market, and I currently have 4K 120 Hz HDR coming from a 1060.
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
Ooo! Do tell!

I imagine this must be an adapter that's not available to the general market yet? Not sure if you're able to talk about it, but I'm curious how this would be possible with the current bandwidth limits.

It's a Club 3D CAC-1085; they are just starting to ship now.

You need a GPU that supports DSC (Turing, newer AMD and Intel cards). I only got it today, but I seem to be OK with 10-bit 4K 120 Hz via HDMI 2.1.
That's only about 11 Gbps with 422, so well under the 48 Gbps.
I'm still doing a lot of experimenting with custom resolutions and stuff to see if I can push it further.
As it's a bit of a workaround, there must be a few software and firmware limitations at either end of the chain that are stopping me from going higher.
 

John Rabbit

Member
Oct 25, 2017
10,091
1. LG C9
2. 2080
3. I don't turn on HDR in games at all, as I've found the results between titles to be insanely inconsistent.
 

Vasto

Member
May 26, 2019
342
Yep...This is why on my TV I don't keep this setting enabled.

Windows does its best to convert all the SDR things (the desktop, programs, basically all of Windows) to HDR, and it's not good at it. HDR material like games, movies, and videos on YouTube will look fine, as those are properly mapped, but anything in SDR being upconverted will not look great.

You can try messing with the brightness slider, which adjusts the mapping, but I've never found a value that works well for me.

Here's MSFT's own help page on it.



You are correct. HDR stuff looks good but everything else looks bad. I will mess around with it and see if I can get it looking better because I want to leave the HDR toggle on. Thanks for the link.
 

noomi

Member
Oct 25, 2017
3,686
New Jersey
It's a Club 3D CAC-1085; they are just starting to ship now.

You need a GPU that supports DSC (Turing, newer AMD and Intel cards). I only got it today, but I seem to be OK with 10-bit 4K 120 Hz via HDMI 2.1.
That's only about 11 Gbps with 422, so well under the 48 Gbps.
I'm still doing a lot of experimenting with custom resolutions and stuff to see if I can push it further.
As it's a bit of a workaround, there must be a few software and firmware limitations at either end of the chain that are stopping me from going higher.

Wow, that is fantastic news actually! Thank you for passing the info along, going to check this thing out and see what I can learn.

*edit*

Awww, seems you can't use VRR/G-Sync with the adapter though. WE WERE ALMOST THERE!
 
Last edited:

Issen

Member
Nov 12, 2017
6,813
- nVidia RTX 2070
- "Windows HD Color" enabled, maximum SDR brightness
- Samsung Q6FN TV, HDR enabled, settings recommended by RTINGS.

Everything just works. Forza 7, Forza Horizon 4, Gears 5, F1 2019, Final Fantasy XV, Assassin's Creed Odyssey, Sekiro... They all either automatically detect and enable HDR or have an HDR toggle, but it all shows up and works perfectly just by having Windows HD Color enabled. Non-HDR games work automatically with HD Color enabled and their brightness is adjusted according to the SDR brightness setting. Setting it to maximum looks best to me, but YMMV and it's an extremely simple slider anyway.

Do NOT mess around with HDR settings on the nVidia Control Panel. Just set the color settings to "managed by Windows" or whatever the default option is and forget about it. It only causes trouble.

I also have a PS4 Pro connected to this display and I can confirm HDR looks just as good in both (or, when playing the same games, they look the same).

Also, if SDR content looks bad on your display when HD Color is enabled no matter the SDR brightness setting you use, your gamma/brightness might be incorrectly configured. This happened to me before I followed the recommended calibration settings given by RTINGS.
 

Landford

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
4,678
What is the best cost-benefit HDR monitor I can get right now? As cheap as possible.
 

alpha

Member
Oct 25, 2017
4,994
I have a Radeon 5700 XT and a Samsung C32HG70, 32-inch. So my understanding is:

- Don't use Windows 10's HDR toggle; games with HDR will automatically turn it on, and it fucks up anything that isn't HDR, making it washed out and bad-looking.
- Turn on YCbCr 4:2:2 instead of YCbCr 4:4:4, and use 10-bit color depth.

That about right?
 

Videophile

Tech Marketing at Elgato
Verified
Oct 27, 2017
63
San Francisco, CA
I have a Radeon 5700 XT and a Samsung C32HG70, 32-inch. So my understanding is:

- Don't use Windows 10's HDR toggle; games with HDR will automatically turn it on, and it fucks up anything that isn't HDR, making it washed out and bad-looking.
- Turn on YCbCr 4:2:2 instead of YCbCr 4:4:4, and use 10-bit color depth.

That about right?
That is correct.

Some games won't turn on HDR or show the HDR toggle unless HDR is also toggled on system-wide – Shadow of the Tomb Raider requires the system-wide toggle to be on for the game to detect that it can run in HDR. Same with Borderlands 3.
 

Samaritan

Member
Oct 25, 2017
6,696
Tacoma, Washington
1) Asus ROG PG35VQ
2) 2080Ti
3) Nvidia color settings
4) RGB 10-bit

When a new game that supports HDR comes out, I'll check to see if our resident EvilBoris has taken a look at it himself, and if so will follow his guidance + tweak to accommodate the 1200+ nit peak brightness of my display. If he hasn't taken a look at the game, then I'll usually poke my head into the HDR Games thread and see what people there have to say about it. I usually have to spend a few minutes in every game to get the HDR just right, and in a couple cases it's something I end up messing with throughout the duration of my playthrough, but it's actually a lot better than I was expecting compared to what I had heard before getting this monitor.

I am a bit confused by LtRoyalShrimp's excellent post though, since it states that with Nvidia GPUs you can only select 8 or 12-bit in RGB, but I'm able to, and have always run with, 10-bit RGB and had no problems, that I'm aware of anyway, getting HDR games to work. Wouldn't you ideally want to go with RGB over YCbCr 4:2:2 because it bypasses chroma subsampling? Would I be better off going with YCbCr 4:4:4 over RGB? Have I been limiting the color depth of my games this entire time by using RGB???
 
Last edited:

GasPanic!

Member
Oct 28, 2017
307
1. LG E9
2. 2080 Ti
3. YUV422 10-bit in NVCP

Is there a way to use an ultrawide custom resolution on the LG OLED with HDR enabled? I tried 3840x1600 with GPU scaling and it always resets to RGB 8-bit. I read that CRU might work.
 

Videophile

Tech Marketing at Elgato
Verified
Oct 27, 2017
63
San Francisco, CA
1) Asus ROG PG35VQ
2) 2080Ti
3) Nvidia color settings
4) RGB 10-bit

When a new game that supports HDR comes out, I'll check to see if our resident EvilBoris has taken a look at it himself, and if so will follow his guidance + tweak to accommodate the 1200+ nit peak brightness of my display. If he hasn't taken a look at the game, then I'll usually poke my head into the HDR Games thread and see what people there have to say about it. I usually have to spend a few minutes in every game to get the HDR just right, and in a couple cases it's something I end up messing with throughout the duration of my playthrough, but it's actually a lot better than I was expecting compared to what I had heard before getting this monitor.

I am a bit confused by LtRoyalShrimp's excellent post though, since it states that with Nvidia GPUs you can only select 8 or 12-bit in RGB, but I'm able to, and have always run with, 10-bit RGB and had no problems, that I'm aware of anyway, getting HDR games to work. Wouldn't you ideally want to go with RGB over YCbCr 4:2:2 because it bypasses chroma subsampling? Would I be better off going with YCbCr 4:4:4 over RGB? Have I been limiting the color depth of my games this entire time by using RGB???
The resolution/format list is for HDMI. It's different for DisplayPort. I'll update the post.
 

plagiarize

Eating crackers
Moderator
Oct 25, 2017
27,506
Cape Cod, MA
1) Asus ROG PG35VQ
2) 2080Ti
3) Nvidia color settings
4) RGB 10-bit

When a new game that supports HDR comes out, I'll check to see if our resident EvilBoris has taken a look at it himself, and if so will follow his guidance + tweak to accommodate the 1200+ nit peak brightness of my display. If he hasn't taken a look at the game, then I'll usually poke my head into the HDR Games thread and see what people there have to say about it. I usually have to spend a few minutes in every game to get the HDR just right, and in a couple cases it's something I end up messing with throughout the duration of my playthrough, but it's actually a lot better than I was expecting compared to what I had heard before getting this monitor.

I am a bit confused by LtRoyalShrimp's excellent post though, since it states that with Nvidia GPUs you can only select 8 or 12-bit in RGB, but I'm able to, and have always run with, 10-bit RGB and had no problems, that I'm aware of anyway, getting HDR games to work. Wouldn't you ideally want to go with RGB over YcbCr 4:2:2 because it bypasses chroma subsampling? Would I be better off going with YCbCr 4:4:4 over RGB? Have I been limiting the color depth of my games this entire time by using RGB???
These are my settings. I actually leave HDR on in Windows all the time as a result.

I only use 422 with my monitor if I want to push it higher than 98 Hz (it'll go up to 144 Hz at 422 if I remember right).
 

Lump

One Winged Slayer
Member
Oct 25, 2017
15,954
1) LG CX 48
2) 2080 Ti (until the moment Ampere comes out, then I'm going to sell it off for the 2080 Ti equivalent that has HDMI 2.1 - presuming Ampere has HDMI 2.1)
3) Nvidia color settings
4) YCbCr422, 10 bpc
 

Chris_Rivera

Banned
Oct 30, 2017
292
Cool thread. I noticed the same thing trying 422 in the control panel on my B7; I think the 10-bit color looks better. I tend to use the High black level setting on the TV, as I'm told that is full range vs. limited.
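
In case the full vs. limited part is confusing, here's a tiny sketch of the difference (standard video-levels math, nothing specific to the B7):

```python
# Limited ("low") range packs luma into 16-235; full ("high"/PC) range uses
# the whole 0-255. If the GPU sends one and the TV expects the other, you get
# crushed or grey-looking blacks.
def full_to_limited(v: int) -> int:
    return round(16 + v * 219 / 255)

print(full_to_limited(0), full_to_limited(255))   # 16 235
```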
 

Pwnz

Member
Oct 28, 2017
14,279
Places
Copy pasting something I wrote some time ago.

You've probably been playing games or watching content in HDR on your PC while missing a critical component – 10-bit video.

Windows 10, unlike game consoles, does not automatically switch the bit depth of the outgoing video signal when you launch an HDR game.

By default, both NVIDIA and AMD GPUs are configured to output 8-bit RGB.

You might be wondering, "But my TV turns on its HDR mode and games look better" – and that is indeed true. HDR is a collection of different pieces that, when working together, create the HDR effect. Your PC is sending the WCG (Wide Color Gamut)/BT.2020 metadata, as well as other information, to the TV, which triggers its HDR mode, but the PC is still only sending an 8-bit signal.


How to output 10-bit video on an NVIDIA GPU

NVIDIA GPUs have some quirks when it comes to which bit depths can be output with which formats when connected to a display via HDMI.

HDMI supported formats:
  • RGB/YUV444:
    • 8-Bit
    • 12-Bit
  • YUV422:
    • 8-Bit
    • 10-Bit
    • 12-Bit
What does this mean for you? Not much – 12-bit has the same bandwidth requirements as 10-bit. If you do require RGB/YUV444 and send a 12-bit signal to the TV, that signal still only contains 10-bit data in a 12-bit container. The TV will convert the signal back down to 10-bit.

However, if you want to output true 10-bit, then you'll need to step down to a YUV422 signal. Again, not the end of the world. At normal TV viewing distances (and even on 4K monitors) it is very difficult to tell the difference between 4:4:4 and 4:2:2. Consistent UI and easy alt-tabbing with consistent resolutions is the one thing I miss about 1080p non-HDR.

The recommended setting in this case is YUV422 video at 10-bit, for both SDR and HDR. This makes the switch seamless and does not require any extra work from you.

How to configure NVIDIA GPUs

  1. Right-click on the Windows desktop
  2. Open the NVIDIA Control Panel
  3. On the left side, click on Resolutions
  4. Click on the Output Color Format dropdown menu and select YUV422
  5. Click on Apply
  6. Now click on the Output Color Depth dropdown menu and select 10 bpc (bits per color)
  7. Click on Apply
That's it. Your GPU is now outputting YUV422 10-bit video to your TV or monitor.

Now launch an HDR game and you'll see the full 10-bit color depth!

Yeah, I just figured all of this shit out in the last day. Not intuitive at all. This is definitely an example of consoles' ease of use > PC. Reminds me of PC gaming before Steam. On top of that, I had to enable Input Signal+ on my TV, which automagically does some shit for the PC to even allow HDR selections.

In the future, when I upgrade to a new GPU with HDMI 2.1, I should be able to just keep it at 4K 60 Hz or 120 Hz with 10-bit and not constantly fuck with the settings. Even then, I'm still using 1080p 120 Hz HDR most of the time because I use it for work too, and old Windows apps do not scale at all in 4K.
 

Hadoken

Member
Oct 25, 2017
306
I'll have to check when I get home. I'm using an HDR monitor that connects with display port, so I'm not sure that handles things the same way. When I connect to my 4K tv for hdr, I go the 422 route, but I'm pretty sure I don't have my PC set the same way with my monitor because 422 at the distance I sit from my monitor *is* readily apparent.

Are there any good test screens for banding, etc?

You can try this app on the MS Store called DisplayHDR Test. It has multiple tests, and there's one for banding.

www.microsoft.com

DisplayHDR Test - Official app in the Microsoft Store

The DisplayHDR™ Test Tool allows users to confirm the display parameters including brightness, color and contrast performance of high dynamic range (HDR) laptop and desktop monitors, as set forth in VESA's High-Performance Monitor and Display Compliance Test Specification (DisplayHDR)...
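
If you'd rather roll your own test pattern, here's a minimal sketch (assuming you have NumPy and Pillow installed) that writes a smooth 16-bit grayscale ramp; if the chain is effectively 8-bit you'll see distinct bands instead of a smooth gradient:

```python
# Write a horizontal grayscale ramp as a 16-bit PNG banding test.
import numpy as np
from PIL import Image

WIDTH, HEIGHT = 3840, 400
ramp = np.linspace(0, 65535, WIDTH, dtype=np.uint16)  # one smooth row
img = np.tile(ramp, (HEIGHT, 1))                      # repeat it vertically
Image.fromarray(img, mode="I;16").save("gray_ramp_16bit.png")
```

The viewer you open it in has to support more than 8 bits itself, of course, otherwise the ramp gets quantised before it ever reaches the display.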
 

MazeHaze

Member
Nov 1, 2017
8,572
yeah, but not enough for me to care tbh
Are we sure that Windows actually outputs 10-bit color for HDR? On the C9, you can go to Settings > Channels > highlight "Channel Tuning" and press 11111 to bring up a source menu, and then click on "HDMI Mode" and it will show you a ton of detailed information about the signal the TV is receiving. If I set my Nvidia control panel to 10-bit 422, or even 420, the TV reports that it is getting an 8-bit signal. This is with a 2070 Super, btw. Even when I enable HDR, or run an HDR game, the C9 menu still only reports 8-bit, no matter what settings you change in NVCP.

Edit: the TV also reports HDR on the PS4 Pro as YCbCr422 8-bit BT.2020.
 
Last edited:

plagiarize

Eating crackers
Moderator
Oct 25, 2017
27,506
Cape Cod, MA
You can try this app on the MS Store called DisplayHDR Test. It has multiple tests, and there's one for banding.

www.microsoft.com

DisplayHDR Test - Official app in the Microsoft Store

The DisplayHDR™ Test Tool allows users to confirm the display parameters including brightness, color and contrast performance of high dynamic range (HDR) laptop and desktop monitors, as set forth in VESA's High-Performance Monitor and Display Compliance Test Specification (DisplayHDR)...
Cool, thank you!
 

MazeHaze

Member
Nov 1, 2017
8,572
Can anyone confirm that their Windows PC actually outputs 10-bit color in HDR? None of the games I've tried actually output in 10-bit color, regardless of what you set Nvidia Control Panel to. I'm pretty sure Windows only sends a 10-bit color signal if you run 10-bit content, and HDR games are all actually 8-bit.

Edit: here's Shadow Warrior 2 in HDR with NVCP set to 422 10-bit. 60 Hz, 120 Hz, 420, 422, etc. – nothing makes a difference. Windows does not output 10-bit color in HDR games.

[image attachment]
 
Last edited:

Lump

One Winged Slayer
Member
Oct 25, 2017
15,954
Can anyone confirm that their Windows PC actually outputs 10-bit color in HDR? None of the games I've tried actually output in 10-bit color, regardless of what you set Nvidia Control Panel to. I'm pretty sure Windows only sends a 10-bit color signal if you run 10-bit content, and HDR games are all actually 8-bit.

Edit: here's Shadow Warrior 2 in HDR with NVCP set to 422 10-bit. 60 Hz, 120 Hz, 420, 422, etc. – nothing makes a difference. Windows does not output 10-bit color in HDR games.

[image attachment]

This might explain the banding I get in DisplayHDR Test no matter what I do in Nvidia settings on my LG CX.
 

MazeHaze

Member
Nov 1, 2017
8,572
This might explain the banding I get in DisplayHDR Test no matter what I do in Nvidia settings on my LG CX.
People have been saying for years that you need to manually enable 10-bit to get "true" HDR, and I could never honestly tell the difference, and I've been playing HDR on PC since the first games came out to support it. I always just use default settings and it looks fine to me, same as consoles (which also use 8-bit). I think 10-bit color in Windows is for photo editing/color grading and certain applications that support it (Alien Isolation, I believe, has a 10-bit color mode but no HDR). I've suspected for a while that people had placebo, or that the chroma subsampling changes added variations in gradation or something, but actually being able to see the information about the signal kind of confirms it for me.
 

laxu

Member
Nov 26, 2017
2,782
1) What display are you using for HDR gaming via PC?
2) What GPU are you using?
3) Do you allow Windows and GPU control panels to automate output, or are you customising your settings?
4) If the latter, what settings are you using? (eg: RGB vs YCbCr420 vs YCbCr422 vs YCbCr444, 8-bit vs 10-bit vs 12-bit, etc)

1. LG CX 48" OLED and I also own a Samsung CRG9 (HDR1000 with 10 dimming zones, super ultrawide)
2. 2080 Ti.
3. NVCP and automation? Like that exists! That piece of crap can't even do things like display presets that would be easy to switch. No, you need to manually change settings and hope the game can activate HDR by itself rather than requiring the Windows HDR setting to be enabled. I toggle this even on the OLED so I can avoid any burn-in. On the CRG9, HDR on the desktop just looks washed out; it works acceptably in games but is a far cry from the OLED.
4. I have tried my damnedest to see a difference between 10-bit and 8-bit color on either display, in games or videos, and just can't notice it. So at the moment I am using 4K 60 Hz 8-bit RGB on the LG OLED and 5120x1440 120 Hz 8-bit RGB on the CRG9. This is the most straightforward, as it may only require the HDR toggle.
 

MazeHaze

Member
Nov 1, 2017
8,572
4. I have tried my damnedest to see a difference between 10-bit and 8-bit color on either display, in games or videos, and just can't notice it. So at the moment I am using 4K 60 Hz 8-bit RGB on the LG OLED and 5120x1440 120 Hz 8-bit RGB on the CRG9. This is the most straightforward, as it may only require the HDR toggle.
Yeah, I don't think any games are actually outputting 10-bit color at all, and I've yet to see any evidence to the contrary.

EDIT: EvilBoris, can you shed any light on this? Have you ever seen a display actually report that it was receiving a 10-bit signal from a game?
 
Last edited:

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
Yeah, I don't think any games are actually outputting 10-bit color at all, and I've yet to see any evidence to the contrary.

EDIT: EvilBoris, can you shed any light on this? Have you ever seen a display actually report that it was receiving a 10-bit signal from a game?

All HDR10 games are 10-bit; I've seen it first-hand when I've analysed the frame buffer. Many of them change from 11-bit to 10-bit at the final stage of rendering.
Whilst they might not always output a 10-bit signal.

Alien Isolation has a 10-bit SDR mode.
You can also force 10-bit/12-bit output using the Special K mod.

When you are in 8-bit HDR, you are squeezing everything into roughly 60% of the data that the game would be using if you were just using SDR, so you would likely get even more banding than simply not using HDR at all.
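
For anyone curious where a figure like that comes from, here's a rough sketch using the standard ST 2084 (PQ) constants – about half of the PQ signal range is spent on the 0-100 nit region that SDR devotes its entire 8 bits to, which is the same ballpark:

```python
# How much of the PQ (SMPTE ST 2084) signal range covers 0-100 nits?
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Inverse EOTF: absolute luminance in nits -> normalised PQ signal (0..1)."""
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

frac = pq_encode(100.0)                      # ~0.51
print(f"{frac:.2f} of the PQ range covers 0-100 nits")
print(f"~{frac * 255:.0f} of 255 8-bit codes vs. all 255 for 8-bit SDR")
```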
 

plagiarize

Eating crackers
Moderator
Oct 25, 2017
27,506
Cape Cod, MA
[image: photo of the monitor's OSD showing the signal info]


Now, this is with DisplayPort, but that's my monitor correctly identifying that the signal from my PC (with HDR enabled in Windows) is 10-bit, RGB, HDR.
 

MazeHaze

Member
Nov 1, 2017
8,572
All HDR10 games are 10-bit; I've seen it first-hand when I've analysed the frame buffer. Many of them change from 11-bit to 10-bit at the final stage of rendering.
Whilst they might not always output a 10-bit signal.

Alien Isolation has a 10-bit SDR mode.
You can also force 10-bit/12-bit output using the Special K mod.

When you are in 8-bit HDR, you are squeezing everything into roughly 60% of the data that the game would be using if you were just using SDR, so you would likely get even more banding than simply not using HDR at all.
[image: photo of the monitor's OSD showing the signal info]


Now, this is with DisplayPort, but that's my monitor correctly identifying that the signal from my PC (with HDR enabled in Windows) is 10-bit, RGB, HDR.
Interesting. Any idea why the C9 does not report 10-bit from the PS4 Pro or Windows, no matter what settings are used?
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
interesting. Any idea why the C9 does not report 10 bit from the ps4 pro or windows, no matter what settings are used?

There is a bug on that screen (I presume you are talking about the hidden menu on the C9) if you have 422 enabled (which the PS4 does by default).
 

MazeHaze

Member
Nov 1, 2017
8,572
There is a bug on that screen (I presume you are talking about the hidden menu on the C9) if you have 422 enabled (which the PS4 does by default).
So anything running 422 or 420 (it happens with 420 as well) that reports 8-bit in the HDMI source menu is incorrect? Or is there a glitch preventing the TV from receiving a 10-bit signal?
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
A bug that reports it wrong. If validated with an external tool, it reports correctly.
 

Exentryk

Member
Oct 25, 2017
3,234
Thank you for sharing the details on this. I've had the settings at default and just not been playing in HDR, since capturing in HDR is yet another issue.

Regardless, does anyone know:
1: Should the C9's default Game mode be used when playing in HDR, or something like Cinema Home/Cinema?
2: And can the OLED light be left at 100, or should it be reduced?

P.S. - Ori 2 is something else with HDR!
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
Thank you for sharing the details on this. I've had the settings at default and just not been playing in HDR, since capturing in HDR is yet another issue.

Regardless, does anyone know:
1: Should the C9's default Game mode be used when playing in HDR, or something like Cinema Home/Cinema?
2: And can the OLED light be left at 100, or should it be reduced?

P.S. - Ori 2 is something else with HDR!

1: Game mode supports the game-mode-specific HGIG native tone mapping. It also has some additional burn-in prevention for UI/logos.
Other than that, it's pretty much the same as the other modes in terms of IQ.
2: OLED light at 100 for HDR.
 

Exentryk

Member
Oct 25, 2017
3,234
1: Game mode supports the game-mode-specific HGIG native tone mapping. It also has some additional burn-in prevention for UI/logos.
Other than that, it's pretty much the same as the other modes in terms of IQ.
2: OLED light at 100 for HDR.
Wow, thank you for the quick response! Really appreciate it.
 

Khasim

Banned
Oct 27, 2017
2,260
I'm using an LG B8 OLED with a GTX 1080 Ti at YUV422 10-bit. I thought maybe 12-bit 422 would cause problems, but it's good to know it doesn't.

The most annoying part of HDR on PC is the bugs and the constant need to switch.
Recently I had to restart the game/PC and switch resolutions/window modes like 10 times in Sekiro before it displayed correctly, for example.

Also, if you watch a mix of HDR and SDR content on your PC, then you need to constantly switch HDR on and off, because SDR looks bad in HDR most of the time.
I wish there were an application or a macro that would let me flick it like a switch, instead of having to open the taskbar search and type 'HD color' every time I want to change the dynamic range.
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
The Windows thing is interesting, as activating the HDR mode simply sets brightness to an 80-nit max and limits sRGB to sRGB.
But as 80 nits is not really suitable for a typical PC environment, it looks insanely dim by comparison. Also, most users are used to an oversaturated image, which the HDR mode corrects.
I just turn the brightness up until it is full in the Windows HDR settings. Using this mode will actually put less strain on a TV and prolong its life, so it's not a bad thing if you can actually see what you need to see.
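
To make that concrete, here's a rough sketch of what happens to an SDR pixel once the HDR mode is on (a simplified model, not the exact Windows compositor code – the 80-nit paper white is the default that the SDR brightness slider raises):

```python
# Rough model of SDR-in-HDR: decode the sRGB value to linear light, scale it
# by the chosen SDR "paper white", and that is the luminance the pixel gets
# in the PQ/HDR10 output. The Windows SDR-brightness slider raises paper
# white above the 80-nit default.
def srgb_to_linear(v: float) -> float:
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def sdr_pixel_nits(srgb_code: int, paper_white_nits: float = 80.0) -> float:
    return paper_white_nits * srgb_to_linear(srgb_code / 255)

print(sdr_pixel_nits(255))        # 80.0  -> why the desktop looks dim by default
print(sdr_pixel_nits(255, 200.0)) # 200.0 -> with the slider turned up
```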
 

Roven

Member
Nov 2, 2017
889
I want these settings on a 5700 for my C9, but I guess I need HDMI 2.1 for this?

YUV422
1440p
120 Hz
10-bit
 

craven68

Member
Jun 20, 2018
4,548
It's a Club 3D CAC-1085; they are just starting to ship now.

You need a GPU that supports DSC (Turing, newer AMD and Intel cards). I only got it today, but I seem to be OK with 10-bit 4K 120 Hz via HDMI 2.1.
That's only about 11 Gbps with 422, so well under the 48 Gbps.
I'm still doing a lot of experimenting with custom resolutions and stuff to see if I can push it further.
As it's a bit of a workaround, there must be a few software and firmware limitations at either end of the chain that are stopping me from going higher.
Do you think I can do 1440p 120 Hz and HDR on a 1080 Ti? I can't find a good one to do this :/