
345

Member
Oct 30, 2017
7,389

doesn't make it untrue! i've never understood why so many PC gamers ignore this stuff; building an amazing rig and using a mediocre monitor is like overcooking a wagyu steak and slathering it in ketchup. something like a GTX 1080 is basically a waste of money if you don't have the output to make use of it.

that said, this thread is enough to make me glad i'm not attempting to use HDR on PC yet. 1440p g-sync IPS will be the sweet spot for a while.
 

Pargon

Member
Oct 27, 2017
12,030
PC gamers spend thousands of pounds on high-end PCs to get the very best graphics. Then they play on tiny, poor-contrast LCD monitors without HDR.

I don't get it.
High refresh rate VRR monitors are considerably smoother, have less motion blur, and are much lower latency than an OLED television.
They might be fine for gaming, but there are a number of issues with the design of OLED televisions that make them unsuitable for many people to use as a monitor, compared to an equivalent LCD display.
HDMI 2.1 should at least solve some of these issues - it should enable 120Hz support at native resolution, and enable HDR without dropping the chroma resolution very low. It won't fix the underlying pixel structure or dimming problems that LG's OLEDs suffer from. I wouldn't be surprised if most PC monitors remain LCD until µLED.
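The bandwidth point above can be sanity-checked with some quick arithmetic. The sketch below uses approximate figures and ignores blanking intervals and link-encoding overhead; the ~14.4 Gbit/s payload figure for HDMI 2.0 is the commonly cited effective rate, not a spec quote.

```python
# Rough uncompressed video data rates, ignoring blanking and link-encoding
# overhead (approximate figures for illustration only).

def bandwidth_gbps(width, height, fps, bits_per_channel, chroma="4:4:4"):
    """Approximate data rate in Gbit/s for an RGB or YCbCr video stream."""
    # Average bits per pixel across three channels under chroma subsampling.
    bpp = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma] * bits_per_channel
    return width * height * fps * bpp / 1e9

# 4K 120Hz 10-bit RGB -- the kind of signal HDMI 2.1 is expected to carry:
print(round(bandwidth_gbps(3840, 2160, 120, 10), 1))   # ~29.9 Gbit/s
# That exceeds HDMI 2.0's ~14.4 Gbit/s payload, hence today's compromises:
print(round(bandwidth_gbps(3840, 2160, 60, 10, "4:2:2"), 1))   # ~10.0 Gbit/s fits
```

So 4K60 HDR only fits under HDMI 2.0 once chroma is subsampled, which is exactly the "dropping the chroma resolution" trade-off mentioned above.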

A bigger issue than LCD's lower contrast - which typically looks fine outside of a dark room - is the fact that most monitors use a matte or semi-matte finish.
Phones, tablets, notebooks, and televisions all have higher-end options that have the display panel bonded to a glass panel with an anti-reflective coating.
95% of monitors have a matte finish that suffers terribly from glare. I don't know why manufacturers keep doing it.

mmjtfsucqqdf.jpg


As for why size is not an issue for most PC gamers - same setup from the front:
ultrawide5qagm.jpg


The monitor is half the area of the TV, but fills more of your field of view.
 
Oct 30, 2017
636
Canada
I'm staying in a hotel for the foreseeable (longish-term) future and I picked up a BenQ HDR monitor that's pretty awesome for my Pro. It has better colours and brightness than my 8000 series Samsung. So the PC market is definitely catching up, and I only paid $500 for mine. Built-in speakers, too.

Edit: that real 1ms response time and built-in adaptive sync really make a difference on console games.
 

medyej

Member
Oct 26, 2017
6,443
doesn't make it untrue! i've never understood why so many PC gamers ignore this stuff; building an amazing rig and using a mediocre monitor is like overcooking a wagyu steak and slathering it in ketchup. something like a GTX 1080 is basically a waste of money if you don't have the output to make use of it.

that said, this thread is enough to make me glad i'm not attempting to use HDR on PC yet. 1440p g-sync IPS will be the sweet spot for a while.

I don't really understand the complaint. There are just different priorities for PC gamers. Monitor tech has had people using 120-144Hz-and-beyond screens with 1ms response times, and that trend started over a decade ago, whereas TVs are still mostly stuck on 60Hz or falsely advertised interpolated '120Hz'. Variable Refresh Rate is a new tech that was invented and adopted on PC gaming in the past few years and has made huge improvements to smoothness and solved screen tearing. Ultrawide is also gaining popularity, as the wide FOV is great in games and is excellent for productivity on the desktop. None of these things would have been possible if PC gamers weren't upgrading and investing in new monitor tech.
 

345

Member
Oct 30, 2017
7,389
I don't really understand the complaint. There are just different priorities for PC gamers. Monitor tech has had people using 120-144Hz-and-beyond screens with 1ms response times, and that trend started over a decade ago, whereas TVs are still mostly stuck on 60Hz or falsely advertised interpolated '120Hz'. Variable Refresh Rate is a new tech that was invented and adopted on PC gaming in the past few years and has made huge improvements to smoothness and solved screen tearing. Ultrawide is also gaining popularity, as the wide FOV is great in games and is excellent for productivity on the desktop. None of these things would have been possible if PC gamers weren't upgrading and investing in new monitor tech.

totally agree with that, it's just that there are an awful lot of dope PCs out there hooked up to shitty 1080p TN panels. monitors are the ultimate bottleneck and yet a lot of people seem to put more thought into incremental silicon gains.

rigs should be built around and defined by the capabilities of the monitor you're willing to buy, otherwise you're just wasting power in the here and now.
 

galv

Avenger
Oct 25, 2017
2,048
Waiting so ever patiently for those BFGDs with G-Sync, 4K, 120Hz and HDR10.

Until then, I see no reason to downgrade from my G-Sync 144Hz setup; judder-free, high-refresh-rate gaming with great IQ is my personal preference.
 

Echo

Banned
Oct 29, 2017
6,482
Mt. Whatever
Waiting so ever patiently for those BFGDs with G-Sync, 4K, 120Hz and HDR10.

Until then, I see no reason to downgrade from my G-Sync 144Hz setup; judder-free, high-refresh-rate gaming with great IQ is my personal preference.

That's the TV from Nvidia right? Well, TV-like thing...?

And it's gonna cost more than a Titan?

I am super curious how quality control is gonna be for such a thing, with regards to light leaking and dead pixels. If only Nvidia supported FreeSync on the side... Cuz I mean there are TVs out there now with 120Hz, 4K, and HDR. We're just waiting on HDMI 2.1 for FreeSync and bam. Suddenly the only reason Nvidia gets to charge so much is for the dang G-Sync proprietary crap. Alas, what can you do? If you want the best power/flops, AMD can't compete with 1080+ power. (And you miss out on tons of Nvidia-exclusive features, which I actually tend to enjoy! :p)
 

tuxfool

Member
Oct 25, 2017
5,858
That's the TV from Nvidia right? Well, TV-like thing...?

And it's gonna cost more than a Titan?

I am super curious how quality control is gonna be for such a thing, with regards to light leaking and dead pixels. If only Nvidia supported FreeSync on the side... Cuz I mean there are TVs out there now with 120Hz, 4K, and HDR. We're just waiting on HDMI 2.1 for FreeSync and bam. Suddenly the only reason Nvidia gets to charge so much is for the dang G-Sync proprietary crap. Alas, what can you do? If you want the best power/flops, AMD can't compete with 1080+ power. (And you miss out on tons of Nvidia-exclusive features, which I actually tend to enjoy! :p)
There is an argument to be made that the people who benefit the most from VRR are those who don't spend exorbitant amounts of money on hardware: mid-range buyers. Unfortunately, G-Sync displays target only high-end buyers.

There are so many monitors that support FreeSync, and not everybody needs or wants the range specified in G-Sync.
 
Nov 14, 2017
4,928
Is support for the SDR brightness slider app-specific or something? Cos when I turn HDR on and put the SDR slider up a bit, Explorer windows and my desktop look OK, but other apps like Chrome still look as bad as they did before with HDR turned on.

Also, what games on PC properly use HDR? Not just HDR brightness, but also the wider range of colours? I think AC:O has full wide colour support. What else?
 

CrichtonKicks

Member
Oct 25, 2017
11,216
The actual April update? I think Microsoft is sending the update out in waves. I only got it a few days ago.

Windows 10 version 1803 is much better with handling HDR content than previous updates.

They also added an option to brighten up SDR content.



The April 2018 update. You can download it from here https://www.microsoft.com/en-us/software-download/windows10 if you don't want to wait for your turn in automatic updates.

Thought I had it since I just updated and rebooted last night. Turns out I needed to reboot again and now it's there. Thanks!
 

PlayBee

One Winged Slayer
Member
Nov 8, 2017
5,543
The only HDR PC game I've played is Ni No Kuni II and that worked about as well as it would have on my PS4.

Which is to say that it's decent if I mess with the dynamic color and contrast settings to compensate for the lack of dynamic tone mapping in game mode on my 2016 OLED.
 

Pharaoh

Unshakable Resolve
Member
Oct 27, 2017
2,676
Dedicated Full Screen -> Turn HDR On -> Boom, done!

Ni No Kuni II works flawlessly. I don't know why some devs want to deal with Windows bullshit if there's a simple solution.
 

qa_engineer

Member
Dec 27, 2017
484
Because the displays connected to PCs are generally not HDR-ready, and therefore the software to support HDR is still half baked (the Windows 10 HDR on/off toggle sucks).

Now that more and more people, myself included, are connecting HDR TVs to PCs, the manufacturers are realizing there's a market for it. Once the HDR displays are in place, the software will mature and catch up to consoles.

Edit:
Destiny 2 and BF1 have amazing HDR implementations. You don't need to enable HDR within Windows. Just make sure the game is fullscreen and boom, it switches to HDR automatically.

Mass Effect Andromeda on PC has the jankiest HDR implementation that I'm aware of, to date.

-sent from my note 8, please excuse typos
 

SirMossyBloke

Member
Oct 26, 2017
5,855
Just got an HDR TV yesterday and my god, what a nightmare to set up. Still can't get it to work for Nex Machina; it's way too dark and almost unreadable.

CoD WWII looks great, Origins looks great, just wish we got better support on PC. Nvidia looking at making more monitors isn't much of a help at all.
 

shark97

Banned
Nov 7, 2017
5,327
It's weird to me how PC is always on the cutting edge of everything - except HDR, where consoles/HDTVs seem way ahead and PC is some lagging clusterfuck.
 

shark97

Banned
Nov 7, 2017
5,327
I'm staying in a hotel for the foreseeable (longish-term) future and I picked up a BenQ HDR monitor that's pretty awesome for my Pro. It has better colours and brightness than my 8000 series Samsung. So the PC market is definitely catching up, and I only paid $500 for mine. Built-in speakers, too.

Edit: that real 1ms response time and built-in adaptive sync really make a difference on console games.


Wait, there's a PC monitor with over 1k nits brightness for $500 (Samsung KS8000 was over 1k right?)? What size is this, 24" or something lame? I need 34" widescreen.

I'm kind of looking to purchase a PC Monitor soon but it looks like HDR will not be a box I'll be able to check affordably.
 
Oct 30, 2017
636
Canada
BenQ 3270U. Absolutely amazing image quality for the price. Speakers are throwaway, but the colours and blacks are on point. It's 32 inches, so that should fit the bill.

I think I'm just amazed that you can get something like this at this price, since I was a first-gen 4Ker and the tech hasn't improved substantially since the 8000 series.

I think LG has a highly lauded HDR monitor too. That one needs sound output though.
 

Lockjaw333

Member
Oct 28, 2017
764
I just got my first HDR display, an LG 27UK600. 27", 4K, and has HDR10 compatibility. I'm under no delusions that the HDR on this monitor is comparable to a mid-to-high-end 4K HDR TV, but it's still pretty great when I can get it to work. I've also seen a bunch of weirdness with HDR on PC in the less than 24 hours since I unboxed this monitor.

Specifically, I can't get HDR to stick with Forza 7. I realized I had to enable it in Windows 10 to get it to work with Xbox/Windows games like Gears 4 and Forza 7. It works in Gears 4, but it turns on in the menus in Forza 7 and turns off once I start a race. Bizarre.

HDR works with AC Origins, BF1, and Madden 19 without issue. It auto-detects and turns on when I start the game (no need to enable in Windows). However, the consistency of the presentation varies from game to game. I found it to look a bit ridiculous in AC Origins. There are HDR settings that I have no idea what I'm supposed to do with, so that could be the issue. It just makes the colors look cartoony.

Madden 19 HDR looks great, sun and reflections off helmets pop. Looks good in BF1 too.

However, I'm all about accurate colors - I calibrate my displays with an X-Rite ColorMunki Display colorimeter. I have to say, I still prefer the calibrated SDR image to the HDR one across the board. I realize the HDR on this monitor might not be representative of true HDR, but I'm not that impressed. Certain things look great, but colors look too intense in some areas and washed out in others.

Mixed bag for sure.
 

low-G

Member
Oct 25, 2017
8,144
I think it's a shame MS can't do "48-bit" color in Windows, with hardware supporting it as best it can (within reason).

Desktop view could still be a restricted range but maybe you could even display HDR info in photos correctly, etc etc.
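The "48-bit" wish (16 bits per channel) is about banding headroom. A small sketch of the arithmetic; the numbers are exact powers of two, nothing here is Windows-specific:

```python
import numpy as np

# Banding sketch: quantise a smooth 0..1 luminance ramp at a given per-channel
# bit depth and count how many distinct output levels survive.
def distinct_levels(bits, samples=100_000):
    ramp = np.linspace(0.0, 1.0, samples)
    return len(np.unique(np.round(ramp * (2 ** bits - 1))))

print(distinct_levels(8))    # 256   levels -- where gradient banding comes from
print(distinct_levels(10))   # 1024  levels -- HDR10's per-channel depth
print(distinct_levels(16))   # 65536 levels -- the "48-bit" (3 x 16) case
```

The jump from 256 to 1024 steps per channel is what makes wide gradients (skies, fog) stop showing visible bands, which is why desktop composition at only 8 bits undercuts HDR content.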
 

Alej

Banned
Nov 1, 2017
399
Via HDMI, because of lack of bandwidth, you can't have HDR10 + RGB 4:4:4 at 60Hz. I'd say it might be a problem on PC, where anything other than RGB 4:4:4 looks like shit on a monitor. Maybe that's why the support is abysmal.

Wait for more bandwidth.
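A toy illustration of why subsampled chroma looks bad on a desktop: one-pixel-wide colour detail (typical of rendered text and UI) is exactly what 4:2:2 averages away. This is a simplified model, not any vendor's actual sampling filter.

```python
import numpy as np

# 4:2:2 halves chroma horizontally: each pair of chroma samples is averaged,
# then repeated back out. Luma (y) is untouched -- that's the whole trick.
def subsample_422(y, cb, cr):
    """Average each horizontal pair of chroma samples, then repeat them back."""
    cb2 = cb.reshape(cb.shape[0], -1, 2).mean(axis=2)
    cr2 = cr.reshape(cr.shape[0], -1, 2).mean(axis=2)
    return y, np.repeat(cb2, 2, axis=1), np.repeat(cr2, 2, axis=1)

# One-pixel-wide colour stripes, like antialiased coloured text on a desktop:
cb = np.tile(np.array([[16.0, 240.0]]), (1, 4))   # alternating chroma codes
y, cb_out, _ = subsample_422(np.full((1, 8), 128.0), cb, cb.copy())
print(cb_out)   # every 16/240 pair averages to 128: the colour detail is gone
```

Photos and film mostly survive this because their chroma varies slowly; pixel-sharp UI colour does not, which is why 4:2:2 is tolerable on a TV across the room and ugly on a monitor two feet away.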
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,686
I think it's a shame MS can't do "48-bit" color in Windows, with hardware supporting it as best it can (within reason).

Desktop view could still be a restricted range but maybe you could even display HDR info in photos correctly, etc etc.

Nvidia causes loads of problems with HDR because windowed 10-bit colour is exclusive to the crazy expensive Quadro range.
 

SliChillax

Member
Oct 30, 2017
2,147
Tirana, Albania
Has anyone compared the same HDR game on PC and consoles on the same TV to see if HDR works better on consoles, or is that just a myth? I haven't done a comparison myself, but HDR on PC works great to my eye. All these complaints make me wonder whether you guys are doing something wrong or I'm too blind to see the difference.
 

JahIthBer

Member
Jan 27, 2018
10,383
This thread is still true (though Fortnite doesn't have HDR on console & BF1 does have HDR on PC), but new games like No Man's Sky Next missing HDR for some reason is just ridiculous, especially since the data for HDR is still in the game. It's like they kept it out of PC on purpose; time to put on my tinfoil hat.
 

MazeHaze

Member
Nov 1, 2017
8,584
Has anyone compared the same HDR game on PC and consoles on the same TV to see if HDR works better on consoles, or is that just a myth? I haven't done a comparison myself, but HDR on PC works great to my eye. All these complaints make me wonder whether you guys are doing something wrong or I'm too blind to see the difference.
The HDR looks the same on PC or console; the problem people have is that getting HDR to work can be game-dependent, and unless you know things like Windows Store games don't run in exclusive fullscreen so you need to turn on HDR manually, or that you need to leave Nvidia Control Panel's color settings at default, etc., it can be confusing.
 

ABK281

Member
Apr 5, 2018
3,004
I have so many issues dealing with HDR and just TV/monitor shit in general. Everything has been a mess since upgrading my 1080p TV to a 4K HDR TV in terms of PC usage. The way it constantly resizes everything when I switch over from my 1080p monitor is obnoxious as hell. It's constantly resizing my Steam chat, desktop icons, just about everything when I switch back to the monitor, and it's a pain to repeatedly resize everything. For some reason Windows can't adjust the text/windows properly when switching resolutions, so you have to sign out and back in to fix everything. The problem with this is half the time when I sign out and back in my Nvidia Control Panel won't load (probably something on my end, but fuck me if I can figure out what it is), which makes it impossible for me to switch from RGB to YCbCr 4:2:2 to enable HDR, which I've been needing lately for FFXV and Origins.

I mean, I could just keep it on YCbCr at all times, but it looks bad, especially when HDR turns itself on for no discernible reason. And to top it all off, if I dare make the mistake of turning off my computer while it's outputting to my TV, I'm forced to unplug the HDMI cable from my TV because Windows REFUSES to acknowledge my monitor after it's been started up on my TV. It's the weirdest thing and I can't figure out why it's happening. It's causing a bunch of unneeded wear on my TV's HDMI port.
 

Pargon

Member
Oct 27, 2017
12,030
Lame, but isn't this something Intel and AMD could make inroads on then? Nvidia would slow adoption but they'd eventually be pressured into relenting.
AMD have the same restrictions.

[…] Windows REFUSES to acknowledge my monitor after it's been started up on my TV.
It's disabled, or it doesn't show up at all? WIN+P should switch multi-monitor display modes.
If you have an NVIDIA GPU, the NVIDIA service is required to be running for things like display detection. It will be enabled by default but some people like to disable everything on their system for some reason.
 

ABK281

Member
Apr 5, 2018
3,004
It's disabled, or it doesn't show up at all? WIN+P should switch multi-monitor display modes.
If you have an NVIDIA GPU, the NVIDIA service is required to be running for things like display detection. It will be enabled by default but some people like to disable everything on their system for some reason.

Oh I'm aware of WIN+P, that's how I go from my monitor to my TV, but it just won't work if I start my computer up after it was previously outputting to my TV. This issue is especially annoying to me because I'm a weirdo who turns my computer off almost every night. See here how it shows both my monitor and my TV while it's currently only outputting to my monitor:
KUsIho3.png



If I turn on my computer with my TV as the main display, it won't show my Dell monitor as an option even if I unplug it and plug it back in. It's incredibly annoying, but far from my biggest problem in this whole mess. I've never disabled anything related to Nvidia, so I doubt that's it. For some reason the control panel just won't start sometimes.
 

Pargon

Member
Oct 27, 2017
12,030
Oh I'm aware of WIN+P, that's how I go from my monitor to my TV, but it just won't work if I start my computer up after it was previously outputting to my TV. This issue is especially annoying to me because I'm a weirdo who turns my computer off almost every night. See here how it shows both my monitor and my TV while it's currently only outputting to my monitor:
KUsIho3.png



If I turn on my computer with my TV as the main display, it won't show my Dell monitor as an option even if I unplug it and plug it back in. It's incredibly annoying, but far from my biggest problem in this whole mess. I've never disabled anything related to Nvidia, so I doubt that's it. For some reason the control panel just won't start sometimes.
Your second image isn't loading for me, but I'd suggest using Display Driver Uninstaller to completely remove the drivers and do a clean install of the latest version if you're running into problems like the control panel not even opening.
 

ss_lemonade

Member
Oct 27, 2017
6,665
Specifically, I can't get HDR to stick with Forza 7. I realized I had to enable it in Windows 10 to get it to work with Xbox/Windows games like Gears 4 and Forza 7. It works in Gears 4, but it turns on in the menus in Forza 7 and turns off once I start a race. Bizarre.
But Gears 4 doesn't have HDR support on PC.

I think this is a problem, since some people think that enabling HDR in Windows (which is the requirement for Windows Store games with actual HDR support) means automatic HDR support on PC too. I mean, other people thought they were getting HDR output with Rise of the Tomb Raider by doing the same, when in fact that game never got HDR support on PC in the first place.

The problem with this is half the time when I sign out and back in my Nvidia Control Panel won't load (probably something on my end, but fuck me if I can figure out what it is), which makes it impossible for me to switch from RGB to YCbCr 4:2:2 to enable HDR, which I've been needing lately for FFXV and Origins.
I think it was the April Windows update that made HDR a bit easier to activate. You could even leave color settings on RGB and HDR would still work (it would run in RGB dithered, or something like that). Of course, as with everything Windows, there is a new catch: exclusive fullscreen games only work in HDR and in fullscreen when running at native resolution, lol. Drop the resolution and you get a smaller window. It's probably Microsoft's solution to getting HDR to still work with arbitrary resolutions.
 

Majorgamer10

Banned
Mar 6, 2018
28
PC games don't need support for HDR! Since Windows 10's Creative thingy update, put any game into borderless fullscreen mode and flip the HDR switch in Windows 10 display settings; it will apply HDR to any game and look great! It's a bit buggy and causes some games to crash initially, but the majority of the time it's amazing and works perfectly. I'm playing Yakuza 0 in HDR 4K right now. Only issue is input lag is around 10ms
-____-.
https://drive.google.com/file/d/1Skbo8FOLPVPQ1-UYgio6X5PQcFnzjhHH/view?usp=drivesdk

1080 Ti Gigabyte Aorus + LG OLED TV. That HDR sign is up only when HDR content is being processed, and I use madVR to make sure HDR content is passed through without any processing, so it's pure. I'm currently using RGB 8-bit though; switching content in YCbCr 10-bit is painful on the eyes, while RGB is always comfy. *RGB with dithering
https://drive.google.com/file/d/1-EmKntUbC_4ClyS3vxb9v812V58qWAf4/view?usp=drivesdk

10-bit YCbCr 4:2:2:
https://drive.google.com/file/d/1aWGp0ujPLaINkvmUmmijxFEuF66AQzN0/view?usp=drivesdk

https://drive.google.com/file/d/1O4WfBcDEIIUBBeZUuH76z215hGhOmBus/view?usp=drivesdk

From what I understand, it's being displayed within the HDR color space, so the image is being displayed as pure HDR content for what is there, but the actual image itself isn't mapped for HDR content. It's like trying to raise the resolution of the original image: the internal resolution of the image is gonna look kinda scratchy when displayed at a higher resolution, but it is being displayed at that higher resolution. You can tweak that with processing though; the LG OLED has a color gamut processor that extends the image's color space. I jam the OLED light to max, contrast to max, color to max, flip the color gamut to extended, and the black level to low with a standard gamma of 2.2.

https://drive.google.com/file/d/12uOBXfqYnWgyYg5xQiNNS83BycsOwOb5/view?usp=drivesdk

Since the OLED has that OLED light, true blacks, infinite contrast and a low peak brightness of around 500 nits, it makes any picture a sweet sweet picture that's 500-nit HDR mapped through the OLED light.

"
Yeah that's not at all how it works. If the game doesn't have HDR support there's no way to force it.

It's a subtle technology so I understand why people who haven't seen proper HDR don't get it (or think they have it when they don't). It doesn't work this way though."

The OLED light forces it as natural light.

'Active HDR', which is similar to HDR10+ or HDR10 with dynamic metadata. This feature analyses the content frame by frame in real time to adjust the HDR tone mapping curve. This has the advantage of displaying each scene with an optimized HDR effect, as opposed to HDR with static metadata, where the whole movie uses the same tone mapping curve, resulting in some scenes being too dark or too bright, or simply not exposed correctly to have the best possible HDR effect.
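The "dynamic metadata" idea in that quoted description can be sketched in a few lines: rescale the tone curve to each frame's measured peak instead of using one static value for the whole film. The extended-Reinhard curve and the 650-nit display peak below are assumptions for illustration, not LG's actual tone mapper.

```python
# Sketch of per-frame ("dynamic metadata") tone mapping: the curve is rescaled
# to each frame's measured peak instead of one static value for the whole film.
# Extended-Reinhard stand-in curve; the 650-nit display peak is an assumption.

def tone_map(nits, frame_peak, display_peak=650.0):
    """Map scene luminance (nits) into the display's range, per-frame-peak aware."""
    x = nits / display_peak
    w = max(frame_peak / display_peak, 1.0)   # never expand frames below display peak
    return display_peak * x * (1 + x / w**2) / (1 + x)

# A 1000-nit highlight in a bright frame compresses to the panel's 650-nit peak,
# while a 100-nit midtone barely moves -- that is the point of tone mapping:
print(round(tone_map(1000, frame_peak=1000), 1))   # 650.0
print(round(tone_map(100, frame_peak=1000), 1))    # 92.3
# With static metadata, the curve fixed at the title's 1000-nit master peak
# would also squash dark scenes; a per-frame peak leaves them untouched:
print(round(tone_map(100, frame_peak=100), 1))     # 100.0
```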
 

ABK281

Member
Apr 5, 2018
3,004
Your second image isn't loading for me, but I'd suggest using Display Driver Uninstaller to completely remove the drivers and do a clean install of the latest version if you're running into problems like the control panel not even opening.

Sorry about that, I didn't attach a second image; I don't know how that got there. This issue has persisted through multiple driver installations, and I always uninstall the old driver via DDU before installing a new one. It's been like this since I got this TV, which would be last November.

Oh, and I should add the control panel always opens on first boot. It's only when I sign out and back into Windows to fix the resolution scaling on the TV that it often fails to open back up. And of course, since I got this other issue to deal with, just restarting the computer to fix the scaling isn't a reliable option. It's a vicious cycle.
 
Oct 27, 2017
9,431
PC games don't need support for HDR! Since Windows 10's Creative thingy update, put any game into borderless fullscreen mode and flip the HDR switch in Windows 10 display settings; it will apply HDR to any game and look great! It's a bit buggy and causes some games to crash initially, but the majority of the time it's amazing and works perfectly. I'm playing Yakuza 0 in HDR 4K right now. Only issue is input lag is around 10ms
-____-.
https://drive.google.com/file/d/1Skbo8FOLPVPQ1-UYgio6X5PQcFnzjhHH/view?usp=drivesdk

1080 Ti Gigabyte Aorus + LG OLED TV. That HDR sign is up only when HDR content is being processed, and I use madVR to make sure HDR content is passed through without any processing, so it's pure. I'm currently using RGB 8-bit though; switching content in YCbCr 10-bit is painful on the eyes, while RGB is always comfy.

Is this a joke? Holy shit if true.
 

Turnabout Sisters

The Fallen
Oct 25, 2017
2,350
For me HDR in Windows is still pretty iffy, at least a few months ago when I last messed with it. I can get Forza 7 to work *sometimes*, but most of the time it messes up and looks washed out. Even when it works it's a little underwhelming compared to what I've seen on PS4. Maybe I will try again with this new update.
 

elektrixx

Banned
Oct 26, 2017
1,923
This thread reminded me to check HDR in Windows and now I leave it on. HDR on and SDR slider all the way up. When I turn HDR on and off, the taskbar and everything looks the same, so that seems like it's now working properly to me.

I just recalibrated my TV recently, so it was a good time to do it. The only thing I changed was turning the contrast down by 5 anyway (from 100).
 

JMY86

Member
Oct 27, 2017
7,071
United States
The only PC game I have ever gotten HDR to work properly on is Mass Effect: Andromeda. I have had zero luck with any other game. I recently tried again with FFXV, which I picked up during the GMG summer sale, and everything looks washed out with HDR on. I am fine playing with HDR off, as the game is fucking stunning, and I was surprised how much it puts the X1-enhanced version to shame even on my aging GTX 970.
 

TyMiles2012

Member
Mar 26, 2018
101
I would say because there's no official standard. My 4K monitor has HDR10 support, but it basically dithers 10-bit down to 8-bit, and has a peak brightness of 450 nits or so, whereas televisions go up to 1000 or something like that. So people are calling it "Fake HDR". Still, "Fake HDR" looks a lot better than SDR. I use it on PS4 and PC whenever possible.
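"Dithers 10-bit down to 8-bit" means trading a little spatial noise for the missing bit depth: add sub-LSB noise before truncating, so the average over an area keeps the extra precision instead of banding. A minimal sketch (the scaling and noise shape are simplified assumptions):

```python
import numpy as np

# Dither 10-bit codes (0..1023) down to 8-bit (0..255): add noise of up to one
# 8-bit LSB before flooring, so fractional levels survive as spatial averages.
rng = np.random.default_rng(0)

def dither_10_to_8(v10):
    """v10: array of 10-bit codes -> 8-bit codes, with random dithering."""
    noise = rng.random(v10.shape)          # uniform 0..1 LSB of the 8-bit target
    return np.clip(np.floor(v10 / 4 + noise), 0, 255).astype(np.uint8)

# A 10-bit value that falls between two 8-bit codes: 513 (= 128.25 in 8-bit).
block = np.full(100_000, 513)
out = dither_10_to_8(block)
print(out.mean())   # ~128.25: the in-between level survives as an average
```

Each pixel becomes 128 or 129, but in roughly a 3:1 ratio, so a flat patch averages back to 128.25; from viewing distance that reads as the intermediate shade rather than a hard band.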
 

K' Dash

Banned
Nov 10, 2017
4,156
Because there doesn't seem to be one official standard for HDR, which is lame and why I haven't jumped into it for either TVs/consoles or PC

This. HDR when done properly is amazing, but most of the time it's shit. I have to calibrate my TV for each game separately; it's a real mess.
 
Oct 27, 2017
9,431
PC games don't need support for HDR! Since Windows 10's Creative thingy update, put any game into borderless fullscreen mode and flip the HDR switch in Windows 10 display settings; it will apply HDR to any game and look great! It's a bit buggy and causes some games to crash initially, but the majority of the time it's amazing and works perfectly. I'm playing Yakuza 0 in HDR 4K right now. Only issue is input lag is around 10ms
-____-.
https://drive.google.com/file/d/1Skbo8FOLPVPQ1-UYgio6X5PQcFnzjhHH/view?usp=drivesdk

1080 Ti Gigabyte Aorus + LG OLED TV. That HDR sign is up only when HDR content is being processed, and I use madVR to make sure HDR content is passed through without any processing, so it's pure. I'm currently using RGB 8-bit though; switching content in YCbCr 10-bit is painful on the eyes, while RGB is always comfy. *RGB with dithering
https://drive.google.com/file/d/1-EmKntUbC_4ClyS3vxb9v812V58qWAf4/view?usp=drivesdk

10-bit YCbCr 4:2:2:
https://drive.google.com/file/d/1aWGp0ujPLaINkvmUmmijxFEuF66AQzN0/view?usp=drivesdk

https://drive.google.com/file/d/1O4WfBcDEIIUBBeZUuH76z215hGhOmBus/view?usp=drivesdk

From what I understand, it's being displayed within the HDR color space, so the image is being displayed as pure HDR content for what is there, but the actual image itself isn't mapped for HDR content. It's like trying to raise the resolution of the original image: the internal resolution of the image is gonna look kinda scratchy when displayed at a higher resolution, but it is being displayed at that higher resolution. You can tweak that with processing though; the LG OLED has a color gamut processor that extends the image's color space. I jam the OLED light to max, contrast to max, color to max, flip the color gamut to extended, and the black level to low with a standard gamma of 2.2.

https://drive.google.com/file/d/12uOBXfqYnWgyYg5xQiNNS83BycsOwOb5/view?usp=drivesdk

Since the OLED has that OLED light, true blacks, infinite contrast and a low peak brightness of around 500 nits, it makes any picture a sweet sweet picture that's 500-nit HDR mapped through the OLED light.

"
Yeah that's not at all how it works. If the game doesn't have HDR support there's no way to force it.

It's a subtle technology so I understand why people who haven't seen proper HDR don't get it (or think they have it when they don't). It doesn't work this way though."

The OLED light forces it as natural light.

'Active HDR', which is similar to HDR10+ or HDR10 with dynamic metadata. This feature analyses the content frame by frame in real time to adjust the HDR tone mapping curve. This has the advantage of displaying each scene with an optimized HDR effect, as opposed to HDR with static metadata, where the whole movie uses the same tone mapping curve, resulting in some scenes being too dark or too bright, or simply not exposed correctly to have the best possible HDR effect.


holy_shit_keegan_michael_key.gif


This actually works on my B7!

Had to switch Rocket League to borderless from fullscreen, which was dropping to SDR mode. But this is absolutely the real deal and looks amazing!

jg31wh.jpg
 

Kyle Cross

Member
Oct 25, 2017
8,437
PC games don't need support for HDR! Since Windows 10's Creative thingy update, put any game into borderless fullscreen mode and flip the HDR switch in Windows 10 display settings; it will apply HDR to any game and look great! It's a bit buggy and causes some games to crash initially, but the majority of the time it's amazing and works perfectly. I'm playing Yakuza 0 in HDR 4K right now. Only issue is input lag is around 10ms
-____-.
https://drive.google.com/file/d/1Skbo8FOLPVPQ1-UYgio6X5PQcFnzjhHH/view?usp=drivesdk

1080 Ti Gigabyte Aorus + LG OLED TV. That HDR sign is up only when HDR content is being processed, and I use madVR to make sure HDR content is passed through without any processing, so it's pure. I'm currently using RGB 8-bit though; switching content in YCbCr 10-bit is painful on the eyes, while RGB is always comfy. *RGB with dithering
https://drive.google.com/file/d/1-EmKntUbC_4ClyS3vxb9v812V58qWAf4/view?usp=drivesdk

10-bit YCbCr 4:2:2:
https://drive.google.com/file/d/1aWGp0ujPLaINkvmUmmijxFEuF66AQzN0/view?usp=drivesdk

https://drive.google.com/file/d/1O4WfBcDEIIUBBeZUuH76z215hGhOmBus/view?usp=drivesdk

From what I understand, it's being displayed within the HDR color space, so the image is being displayed as pure HDR content for what is there, but the actual image itself isn't mapped for HDR content. It's like trying to raise the resolution of the original image: the internal resolution of the image is gonna look kinda scratchy when displayed at a higher resolution, but it is being displayed at that higher resolution. You can tweak that with processing though; the LG OLED has a color gamut processor that extends the image's color space. I jam the OLED light to max, contrast to max, color to max, flip the color gamut to extended, and the black level to low with a standard gamma of 2.2.

https://drive.google.com/file/d/12uOBXfqYnWgyYg5xQiNNS83BycsOwOb5/view?usp=drivesdk

Since the OLED has that OLED light, true blacks, infinite contrast and a low peak brightness of around 500 nits, it makes any picture a sweet sweet picture that's 500-nit HDR mapped through the OLED light.

"
Yeah that's not at all how it works. If the game doesn't have HDR support there's no way to force it.

It's a subtle technology so I understand why people who haven't seen proper HDR don't get it (or think they have it when they don't). It doesn't work this way though."

The OLED light forces it as natural light.

'Active HDR', which is similar to HDR10+ or HDR10 with dynamic metadata. This feature analyses the content frame by frame in real time to adjust the HDR tone mapping curve. This has the advantage of displaying each scene with an optimized HDR effect, as opposed to HDR with static metadata, where the whole movie uses the same tone mapping curve, resulting in some scenes being too dark or too bright, or simply not exposed correctly to have the best possible HDR effect.
This isn't how this works. If the game doesn't support HDR, it doesn't support HDR. You're just forcing SDR into HDR.
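In signal terms, "forcing SDR into HDR" just re-encodes the SDR frame into the HDR container (PQ, SMPTE ST 2084) at some chosen paper-white level; no new highlight information appears. The PQ constants below are the published ST 2084 ones, while the 200-nit paper white is an assumed setting:

```python
# SMPTE ST 2084 (PQ) constants, as published in the standard:
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    """ST 2084 inverse EOTF: absolute luminance in nits -> 0..1 PQ signal."""
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

def sdr_in_hdr_container(code8, paper_white=200.0):
    """8-bit SDR code -> PQ signal, via gamma 2.2 scaled to paper_white nits.
    paper_white is an assumed setting, not a standardised value."""
    nits = paper_white * (code8 / 255.0) ** 2.2
    return pq_encode(nits)

print(round(sdr_in_hdr_container(255), 2))   # SDR white lands around 0.58 PQ
print(round(pq_encode(1000), 2))             # a real 1000-nit highlight: ~0.75
```

The re-encoded SDR image tops out near 0.58 in PQ terms and simply never uses the upper part of the container, which is why it can look "HDR" to the TV's indicator without actually carrying any extra dynamic range.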