I think a lot of this is on developers either not implementing things properly, or using outdated methods of implementing HDR; e.g. vendor-specific implementations that each require different settings, rather than the system-level HDR implementation that switches seamlessly and doesn't require exclusive fullscreen mode etc.
Windows 10 itself shouldn't have any major issues with it now.
That being said, with my projector, I found it was a whole lot easier to connect to the HDMI port that doesn't support HDR at all, and treat it as an SDR display.
But that's more because projectors aren't really HDR displays even if they accept an HDR signal, and I found the HDR results were worse than either using SDR sources, or having madVR tone map HDR video to SDR (which can look better than the SDR release of that same source).
The only downside is that it will only do 1080p120 in 8-bit with that port rather than 1080p120 10-bit, but I'm not sure it makes any visible difference at all - certainly not in motion at 120Hz.
I really don't understand why there isn't an easy way to tone map an SDR image into an HDR container to basically make the Windows desktop and other SDR apps look the same as they do in SDR. Like, why does it have to be too dark or washed out no matter how you set the brightness slider?
That is probably the ideal solution, but there are several issues with it right now:
- Most people watch SDR out-of-spec, and doing this would force it to be displayed to-spec (100 nits, BT.709 gamut, 2.4 gamma). Even people with calibrated displays often push SDR to higher brightness levels than intended. (See the sketch just after this list for where a to-spec SDR signal lands inside an HDR container.)
- General PC use requires RGB/4:4:4 chroma sampling or else things like text will look awful. That means 8-bit color (and banding) with HDR until HDMI 2.1 gets here (rough bandwidth math at the end of this post).
- HDR requires that the display's brightness be maxed out to be displayed correctly.
- With an LCD, that means raising the backlight - which means contrast with SDR content is going to be a lot worse than with the backlight properly set to 100 nits. For example, a ~3000:1 panel has a black level around 0.03 nits at a 100-nit peak, but around 0.13 nits when pushed to 400 nits for HDR. This is true for any display that doesn't achieve perfect blacks "natively" like OLED does.
- With OLED you have the problem of pushing the signal into the lower bits, where the panel doesn't have the best gradation - which means more visible color banding.
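To make the first point concrete, here's a rough Python sketch of what putting to-spec SDR inside an HDR container means numerically: take an SDR code value, apply gamma 2.4 with a 100-nit reference white (the spec levels mentioned above), and encode the result with the PQ (ST 2084) curve that HDR10 uses. The function names are mine, and it ignores the gamut conversion and black-level handling a real compositor would have to do.

```python
# Rough sketch: where a to-spec SDR signal lands inside an HDR (PQ / ST 2084) container.
# Assumes gamma 2.4 and a 100-nit reference white for SDR; PQ constants are from SMPTE ST 2084.

def sdr_to_nits(code, gamma=2.4, white_nits=100.0):
    """Map a normalized SDR code value (0..1) to display luminance in nits."""
    return white_nits * (code ** gamma)

def pq_encode(nits):
    """SMPTE ST 2084 inverse EOTF: linear luminance (nits) -> PQ signal (0..1)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for code in (0.1, 0.5, 1.0):
    pq = pq_encode(sdr_to_nits(code))
    print(f"SDR {code:.1f} -> {sdr_to_nits(code):6.2f} nits -> PQ {pq:.3f} "
          f"(~10-bit code {round(pq * 1023)})")
```

SDR white ends up around the middle of the PQ signal range (roughly 10-bit code 520), so everything SDR can show has to live in the lower half of the HDR signal.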
And all of that assumes your display processes an HDR signal accurately to begin with. Any inaccuracies in its HDR tone mapping will also affect your SDR signal inside the HDR container.
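And on the RGB/4:4:4 point, some back-of-the-envelope math for why 10-bit RGB doesn't fit over HDMI 2.0 at 4K60. Treat the exact figures as my assumptions: the pixel clocks are the standard CTA-861 timings (which include blanking), and HDMI 2.0's usable video data rate is about 14.4 Gbit/s (18 Gbit/s raw, minus 8b/10b coding overhead).

```python
# Back-of-the-envelope HDMI bandwidth check for RGB/4:4:4 signals.
HDMI20_DATA_GBPS = 14.4  # ~usable video data rate of HDMI 2.0

modes = {
    "1080p120": 297e6,  # pixel clock in Hz (standard timing, incl. blanking)
    "2160p60":  594e6,
}

for name, clock in modes.items():
    for bits_per_channel in (8, 10):
        gbps = clock * bits_per_channel * 3 / 1e9  # 3 channels at 4:4:4
        fits = "fits" if gbps <= HDMI20_DATA_GBPS else "does NOT fit"
        print(f"{name} RGB {bits_per_channel}-bit: {gbps:5.2f} Gbit/s -> {fits} in HDMI 2.0")
```

(At 1080p120 even 10-bit RGB fits comfortably over HDMI 2.0, so an 8-bit limit on a given projector port is presumably down to that port rather than the standard; 4K60 RGB 10-bit is where you need HDMI 2.1.)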