
Bjones

Member
Oct 30, 2017
5,622
It's pretty ridiculous. I can hook up my consoles to my HDR monitors, no problem. I can hook up my Mac mini with a Vega 64 eGPU ... flawless.
Switch on HDR in Windows 10 .. everything is blue. Some games are fine .. some are blue ... some are just straight washed out.
I have 2 HDR monitors and 3 HDR TVs. There are issues on all while using Windows 10 with HDR on .. very frustrating.
 

TeenageFBI

One Winged Slayer
Member
Oct 25, 2017
10,240
Some games allow you to enable in-game HDR even if Windows doesn't have it turned on, some don't. Sometimes it'll look correct that way, sometimes it won't.

Yeah, it sucks.

Sekiro is one of the few games I've tried that makes it very easy.
 

Massicot

RPG Site
Verified
Oct 25, 2017
2,232
United States
Some games allow you to enable in-game HDR even if Windows doesn't have it turned on, some don't. Sometimes it'll look correct that way, sometimes it won't.

Yeah, it sucks.

Sekiro is one of the few games I've tried that makes it very easy.

Yep, for me this is the biggest outstanding issue. Monster Hunter World and FFXV won't let you set HDR unless it's on in Windows (and even then, XV's implementation is a bit weird). But Ubisoft games (Division 2, in my experience) can overwrite the general windows setting and enable it and works pretty much with no issues for me.
 

The Shape

Member
Nov 7, 2017
5,027
Brazil
Yeah. I have no problem with games that turn HDR on regardless of whether it's set On or Off in Windows. But leaving it On in Windows itself is awful. Never managed to make it look good. My eyes hurt trying to browse anything.
 

wbloop

Member
Oct 26, 2017
2,273
Germany
It sucks big time.

I upgraded my AVR on Christmas so I can play PC with 5.1 sound and HDR, but I can't activate HDR on Windows. The switch immediately sets back to off for no reason, although I bought a brand-new HDMI2.0/HDCP2.2-compliant high speed cable and the AVR has otherwise no issues getting HDR through on other hardware, be it HDR10 or Dolby Vision content.

So no Monster Hunter World in HDR for me.
 

TeenageFBI

One Winged Slayer
Member
Oct 25, 2017
10,240
Yeah. I have no problem with games that turn HDR on regardless of whether it's set On or Off in Windows. But leaving it On in Windows itself is awful. Never managed to make it look good. My eyes hurt trying to browse anything.
Windows itself is clearly not designed to be seen in HDR. The colors are all wrong.

I'd imagine that MS will attempt to fix HDR in one of those big feature updates. Here's hoping it doesn't break everything.
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,684
There are some key differences between how DirectX 12 and 11 handle HDR.

Turn on HDR in Windows prior to booting games (this allows games to know they are connected to an HDR display when using certain GPU APIs).

As a rule of thumb, run in fullscreen exclusive when available.

For troubleshooting, use a standard HD or UHD resolution and ensure your TV is not in PC mode.
Disable any in-game overlays (Steam, Nvidia, etc.).
These are all known sources of issues.
 

Deleted member 13560

User requested account closure
Banned
Oct 27, 2017
3,087
Sekiro's HDR experience is exquisite. But yeah... other games are a little janky when it comes to HDR. Resident Evil 2 was fine as well, but I think you needed to have Windows HDR enabled for it to work. I always turn desktop HDR off when not running a game requiring it.

Also a little off topic. I have a C9, and when I run in a non-Game mode HDR setting, my latency disappears when setting it to 120Hz. Does the TV automatically turn off some post-processing effects at 120Hz? I mean, I'm not complaining. I get more options and better color outside of game mode anyways.
 
Nov 14, 2017
4,928
I really don't understand why there isn't an easy way to tone map an SDR image into an HDR container to basically make the Windows desktop and other SDR apps look the same as they do in SDR. Like, why does it have to be too dark or washed out no matter how you set the brightness slider?
 

Okinau

Member
Oct 27, 2017
532
I have an Alienware m15 laptop. The screen is not HDR but my 4K TV is. How do I get Windows to output HDR?
 

Legend J 858

Member
Oct 25, 2018
577
A little off topic but still related to HDR.

4K HDR movie playback on an X1X sucks compared to a dedicated 4K UHD HDR player like an Oppo.
 

nikos

Banned
Oct 27, 2017
2,998
New York, NY
You need to configure it properly and have a monitor that supports HDR10, such as the Asus PG27UQ, otherwise you're not going to have a great experience.
 

catboy

Banned
Oct 25, 2017
4,322
i feel like it got WORSE tbh, it's now a nightmare. colour space seems totally incorrect for me using HDR on KS8000 unless i switch the nvidia display colour settings to RGB full or limited after enabling HDR.
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,684
i feel like it got WORSE tbh, it's now a nightmare. colour space seems totally incorrect for me using HDR on KS8000 unless i switch the nvidia display colour settings to RGB full or limited after enabling HDR.

The KS8000 doesn't do HDR very well if the TV is in PC mode (which it will be by default). Change it to Game mode or Blu-ray player to sort that out.
 

laxu

Member
Nov 26, 2017
2,782
It really needs a "turn on HDR only on demand" option, so games can detect that yes, this display has HDR, but can toggle it on and off themselves. HDR in general is pretty flaky in Windows and loves to turn off or just not enable at all.
 

low-G

Member
Oct 25, 2017
8,144
I don't know why HDR couldn't have been handled similarly to how old bit depth in hardware was handled decades ago.

That was elegant, this has been a complete mess.

I know there are standards which hope to replicate actual display output with the signals, and the gamma curve + human perception is a beast, but there needs to be some strong underlying standards.
 
Bjones

OP
Member
Oct 30, 2017
5,622
You need to configure it properly and have a monitor that supports HDR10, such as the Asus PG27UQ, otherwise you're not going to have a great experience.

Configuration has nothing to do with it when it works flawlessly on 3 other platforms, on 4 separate TVs/monitors: LG TVs (OLED and LED), and the Asus XG438Q and LG 27UK850 monitors.
 

inner-G

Banned
Oct 27, 2017
14,473
PNW
I really only use it in games where you can toggle it in-game, like Far Cry, but really you only ever need to switch it on right before launching an HDR game if you want (I never do, though). It does make the desktop and stuff look like crap.
 

Afrikan

Member
Oct 28, 2017
16,988
It sucks big time.

I upgraded my AVR on Christmas so I can play PC with 5.1 sound and HDR, but I can't activate HDR on Windows. The switch immediately sets back to off for no reason, although I bought a brand-new HDMI2.0/HDCP2.2-compliant high speed cable and the AVR has otherwise no issues getting HDR through on other hardware, be it HDR10 or Dolby Vision content.

So no Monster Hunter World in HDR for me.

You didn't mention it, so just making sure... does your PC have an HDMI 2.0 output?
 

pswii60

Member
Oct 27, 2017
26,673
The Milky Way
HDRSwitch is an awesome tool for those few games that don't automatically switch on HDR. You can easily set up a script so it automatically turns it on when the game starts and off when it finishes, great for when using Big Picture Mode.
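The wrapper idea is simple enough to sketch. A minimal sketch in Python, assuming a hypothetical `hdrswitch on` / `hdrswitch off` command line; the real HDRSwitch tool's actual flags may differ, so check its docs:

```python
import subprocess

# Hypothetical CLI: "hdrswitch on"/"hdrswitch off" stands in for whatever
# commands the real HDRSwitch tool actually exposes -- check its docs.
HDR_ON = ["hdrswitch", "on"]
HDR_OFF = ["hdrswitch", "off"]

def build_launch_plan(game_exe):
    """Command sequence: enable HDR, run the game, then disable HDR."""
    return [HDR_ON, [game_exe], HDR_OFF]

def launch_with_hdr(game_exe):
    on, game, off = build_launch_plan(game_exe)
    subprocess.run(on, check=False)        # toggle HDR on before launch
    try:
        subprocess.run(game, check=False)  # blocks until the game exits
    finally:
        subprocess.run(off, check=False)   # always restore SDR afterwards
```

Pointing a Steam or Big Picture shortcut at a wrapper like this is what gets you the console-style "HDR only while the game runs" behavior.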
 

Poison Jam

Member
Nov 6, 2017
2,984
I haven't had any issues in a while, it's working relatively well for me.

I've got HDR turned on in Windows, TV set to "games-console" in the input settings (this is important, do not select PC), and I believe it's the 4:4:4 mode turned on in the Nvidia control panel.

My main issue is that when I adjust volume, have any form of overlay active, and sometimes when alt-tabbing, the TV goes in and out of HDR mode.
 

JudgmentJay

Member
Nov 14, 2017
5,220
Texas
I've only played 9 or 10 games on PC that support HDR and all worked perfectly without tweaks so I guess YMMV. The only annoyance is that some require you to toggle HDR in your display settings and some do not. Also RE2 looks better in SDR. All played on a C6 OLED.
 

Watership

Member
Oct 27, 2017
3,118
Unfortunately this is just life on the PC. Hardware for cards and monitors, software with OS and games. Everyone is going to blame everyone else. It'll shake out, but this happens with every new feature/tech jump on PC.
 

XxLeonV

Member
Nov 8, 2017
1,140
I'd prefer every game just be able to override the Windows setting vs rely on it. And with that, it could just require us to enable in the menu for every title if need be. It's a bit annoying when I forget to enable the Windows setting before playing a game that relies on it for HDR.
 

jb1234

Very low key
Member
Oct 25, 2017
7,231
Half the time, PC ports don't even include the feature, even when their console versions did. It's maddening.
 

Ostron

Member
Mar 23, 2019
1,953
HDRSwitch is an awesome tool for those few games that don't automatically switch on HDR. You can easily set up a script so it automatically turns it on when the game starts and off when it finishes, great for when using Big Picture Mode.
That's a good solution! I'll need to remember this when I get back home.

Now the next hurdle is recording tools like Shadowplay being incompatible with HDR. Sigh.
 

Afrikan

Member
Oct 28, 2017
16,988
I suppose that an RTX 2060 has HDMI 2.0. Afaik it even has HDMI 2.1?

Yes lol.

When you hook it up to the TV, do you display it through only the TV? Not mirrored or extended.

Also are all the HDMI ports on your receiver 2.0?

Also, is there a setting in your receiver's HDMI settings menu to enable HDMI 2.0 features? I forget the terminology. Some TVs give you an on/off option for each HDMI port, so I'm curious if that holds true for your receiver.
 

wbloop

Member
Oct 26, 2017
2,273
Germany
Yes lol.

When you hook it up to the TV, do you display it through only the TV? Not mirrored or extended.

Also are all the HDMI ports on your receiver 2.0?

Also, is there a setting in your receiver's HDMI settings menu to enable HDMI 2.0 features? I forget the terminology. Some TVs give you the option for each HDMI port.
The receiver has HDMI 2.0/HDCP 2.2 on every input. It's the reason why I bought it. HDR passthrough wasn't possible on my old HDMI 1.4 model. But I ran the TV as the third monitor, so that's probably the solution. I probably need to disable extended mode, although the TV/AVR is set as my primary screen in the NVIDIA settings when it's being used.
 

ss_lemonade

Member
Oct 27, 2017
6,658
i feel like it got WORSE tbh, it's now a nightmare. colour space seems totally incorrect for me using HDR on KS8000 unless i switch the nvidia display colour settings to RGB full or limited after enabling HDR.
Have you tried just sticking to YCbCr? That's what I do with my KS9000 so I don't have to worry about black level on the TV.

Can't say I've noticed any color space issues though. Always stuck with auto on the TV and it looks good to me.

It sucks big time.

I upgraded my AVR on Christmas so I can play PC with 5.1 sound and HDR, but I can't activate HDR on Windows. The switch immediately sets back to off for no reason, although I bought a brand-new HDMI2.0/HDCP2.2-compliant high speed cable and the AVR has otherwise no issues getting HDR through on other hardware, be it HDR10 or Dolby Vision content.

So no Monster Hunter World in HDR for me.
Are you sure the input you are using on your receiver can do HDR? Don't know about other receivers, but with mine for instance, even though it is perfectly capable of 4K HDR with HDMI 2.0 rear ports, I found out that the front aux HDMI in cannot do 4K.
 

plagiarize

It's not a loop. It's a spiral.
Moderator
Oct 25, 2017
27,545
Cape Cod, MA
My monitor seems to play nicely with HDR on PC, but support is still lacking. Halo Reach has HDR on console and not PC. Why? I thought we were over that.
 

Yarbskoo

Member
Oct 27, 2017
2,980
The desktop looks okay with HDR on my monitor, but the slider tends to reset itself after the monitor goes to sleep which is kind of a pain in the ass.
 

Ste

Banned
Jun 8, 2018
514
England
It's non-exclusive fullscreen mode games that are stupid, and were stupid before the need for HDR.

I'll always choose exclusive full screen mode if I can.

I will try HDRSwitch though, so thanks for the tip.
 

ss_lemonade

Member
Oct 27, 2017
6,658
I really don't understand why there isn't an easy way to tone map an SDR image into an HDR container to basically make the Windows desktop and other SDR apps look the same as they do in SDR. Like, why does it have to be too dark or washed out no matter how you set the brightness slider?
This is one thing I never understood about the slider. Like, what's the point if SDR applications are still going to look off with bad colors? You're better off just disabling HDR when you don't need it (making the whole on/off juggling with certain games an irritating process).
 

Hawk269

Member
Oct 26, 2017
6,044
Agree, it does suck. I don't understand why they can't have Windows not send an HDR signal unless you are playing HDR content. That is how consoles work: while in the menu, the console sends an SDR signal and you are fine; once you boot a game that has HDR, it sends the HDR signal and the TV switches over. Why they decided to send an HDR signal while on the desktop is just stupid. I am sure it is something that would not be hard to fix, but who knows why they do it this way.
 

InfiniDragon

Member
Oct 25, 2017
2,312
I was wondering if I was doing something wrong. I just got an LG 32ML600M monitor that supports HDR after using a TV as a monitor for a few years, and when I turned it on I was like "this... looks off". Monster Hunter World looks washed out, as does Windows itself. I didn't know Sekiro looks good with it though, so I'll boot it up after work and take a look.
 

GameAddict411

Member
Oct 26, 2017
8,518
I agree, it's awful. What I do is not toggle it on unless I'm about to play a game that supports HDR.
 

Deleted member 7948

User requested account closure
Banned
Oct 25, 2017
1,285
I had problems before when trying to manually choose the output color format. After leaving everything on auto, it has been smooth sailing.

However, I have some problems with Dolby Vision. Clearly there isn't enough bandwidth for it at 1440p120 or 4K60, leading to artifacts on the screen. 1080p120 is fine.
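The bandwidth guess checks out with rough link math. A sketch, assuming standard CTA-861 pixel clocks and HDMI 2.0's 8b/10b coding; the numbers are nominal, and chroma subsampling (4:2:2/4:2:0) would change the result:

```python
# HDMI 2.0 carries 18 Gbps raw; 8b/10b TMDS coding leaves 80% for pixel data.
HDMI20_PAYLOAD_GBPS = 18.0 * 8 / 10  # = 14.4 Gbps usable

# Standard pixel clocks (MHz), blanking included.
PIXEL_CLOCK_MHZ = {
    "4K60": 594.0,      # 3840x2160 @ 60 Hz
    "1080p120": 297.0,  # 1920x1080 @ 120 Hz
}

def required_gbps(mode, bits_per_channel):
    """Data rate for full RGB/4:4:4 output (3 channels per pixel)."""
    return PIXEL_CLOCK_MHZ[mode] * bits_per_channel * 3 / 1000.0

def fits_hdmi20(mode, bits_per_channel):
    return required_gbps(mode, bits_per_channel) <= HDMI20_PAYLOAD_GBPS
```

4K60 at 8-bit needs 14.256 Gbps and just fits; 10-bit (17.82 Gbps) and Dolby Vision's 12-bit (21.4 Gbps) do not, while 1080p120 even at 12-bit (10.7 Gbps) sails through, matching the artifacts described above.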
 

dgrdsv

Member
Oct 25, 2017
11,882
Windows HDR has issues for two reasons:

1. There is more than one HDR implementation option on Windows, and they kinda work differently.
2. Windows is limited in how it can handle display output changes without direct user intervention, meaning that there will be issues until HDMI 2.1, or at least DSC monitors on DP 1.4, arrive.

Another thing is that the PC monitor industry is a burning trash dump, with its "DisplayHDR 400" devices which aren't even physically capable of producing anything close to an HDR image. The TV industry is more in line with what you need to view HDR video; there are fewer ways to cut corners.
 

Pargon

Member
Oct 27, 2017
12,014
I think a lot of this is on developers either not implementing things properly, or using outdated methods of implementing HDR; e.g. vendor-specific implementations which require different settings, rather than using the system-level HDR implementation that should switch seamlessly and doesn't require exclusive mode etc.
Windows 10 itself shouldn't have any major issues with it now.

That being said, with my projector, I found it was a whole lot easier to connect to the HDMI port that doesn't support HDR at all, and treat it as an SDR display.
But that's more because projectors aren't really HDR displays even if they accept an HDR signal, and I thought the results were worse than either using SDR sources, or having madVR tone map HDR video to SDR (which can look better than the SDR release of that same source).
The only downside is that it will only do 1080p120 in 8-bit with that port, rather than 1080p120 10-bit, but I'm not sure that it actually made any visible difference at all - certainly not in motion at 120Hz.

I really don't understand why there isn't an easy way to tone map an SDR image into an HDR container to basically make the Windows desktop and other SDR apps look the same as they do in SDR. Like, why does it have to be too dark or washed out no matter how you set the brightness slider?
That is probably the ideal solution, but there are several issues with it right now:
  1. Most people watch SDR out-of-spec, and doing this would force it to be displayed to-spec (100 nits, BT.709 gamut, 2.4 gamma). Even people with calibrated displays often push SDR to higher brightness levels than intended.
  2. General PC use requires RGB/4:4:4 chroma sampling or else things like text will look awful. That means 8-bit color (and banding) with HDR until HDMI 2.1 gets here.
  3. HDR requires that the display's brightness is maxed-out to be displayed correctly.
    • With an LCD, that means raising the backlight - which means that contrast with SDR content is going to be a lot worse than with the backlight properly set to 100 nits. This is true for any display that doesn't achieve perfect blacks "natively" like OLED does.
    • With OLED you have the problem of pushing the signal into the lower bits, where they don't have the best gradation - which means more color banding.
And that all assumes your display processes an HDR signal accurately to begin with. Any inaccuracies with HDR tone mapping will affect your SDR signal inside the HDR container.
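Point 1 and the banding concern can be made concrete with the PQ (SMPTE ST 2084) curve that HDR10 uses. A sketch using the standard's published constants, showing where spec SDR white (100 nits) lands in the container:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> 0..1 signal.
# Constants as published in the standard.
M1 = 2610 / 16384           # 0.1593017578125
M2 = 2523 / 4096 * 128      # 78.84375
C1 = 3424 / 4096            # 0.8359375
C2 = 2413 / 4096 * 32       # 18.8515625
C3 = 2392 / 4096 * 32       # 18.6875

def pq_encode(nits):
    """Encode absolute luminance (0..10000 nits) as a PQ signal in 0..1."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def pq_10bit_code(nits):
    """Full-range 10-bit code value for the given luminance."""
    return round(pq_encode(nits) * 1023)
```

100-nit SDR white encodes to roughly 0.51, around code 520 of 1023, so the entire SDR desktop lives in the lower half of the signal while the display runs at full brightness, which is exactly where limited gradation shows up as banding.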
 

chronomac

Member
Oct 25, 2017
1,235
Mobile, AL
Turn on HDR in Windows prior to booting games (this allows games to know they are connected to an HDR display when using certain GPU APIs).
You shouldn't have to turn on HDR in Windows before every game. The PS4, Xbox One, Apple TV, Shield, and pretty much every other major media device play content in SDR and switch to HDR automatically based on the content. Why can't Windows 10?
 

ss_lemonade

Member
Oct 27, 2017
6,658
You shouldn't have to turn on HDR in Windows before every game. The PS4, Xbox One, Apple TV, Shield, and pretty much every major media console plays content in SDR and switches to HDR automatically based on the content. Why can't Windows 10?
PS4 seems to be more flexible, where many of its games have an HDR toggle, so it's a per-game setting. Xbox One is pretty similar to Windows, where it's an OS setting. Still a better situation though like you mentioned since you could just leave it enabled and it would get used when dealing with HDR content.