
J_ToSaveTheDay

Avenger
Oct 25, 2017
18,789
USA
I've had it up to my ass with this HDR stuff lol.

That's reasonable. There's no universal standard yet, standards vary by content type, etc. -- so the bar has been constantly moving these last few years.

I'm one of the folk that always says they prefer console gaming because of the ease of use, but every time there's a shift like HDR, it becomes a new learning process and can involve some settings tweaks and I admit I find it pretty annoying myself sometimes.

I personally have loved the impact HDR can make, though with the knowledge I have now, I do wonder if some of my PAST good impressions were off the mark when it came to color and contrast accuracy. Still, the more I learn lately, the more confident I'm becoming with HDR.

The funny thing is that both of the new-gen consoles have their system-level HDR tuner so that HDR can theoretically become a set-it-and-forget-it sort of thing for gamers, but devs still haven't adopted its use for the most part. Right now, it feels like Sony's first-party stuff is all utilizing it on PS5 and it's been a pretty great experience for me on my CX, since that's all I have so far besides Devil May Cry 5 Special Edition. I really hope Sony and Microsoft communicate with devs about just looking at the system-side calibration, since that calibration accounts for the user's exact display capabilities and their own subjective preferences and can theoretically apply to any software that runs on the system.
 

Ninjician-

Member
Oct 29, 2017
443
HGIG turns off tone mapping. I use it regardless of whether the game "supports" it or not.

If the game supports system level HGIG, perfect. If the game has sliders, perfect.

If you get a game like God of War, which has neither, you basically just clip the upper highlight detail, but retain a brighter picture.

I would never use DTM, just as I wouldn't use Vivid mode. I even did a tone curve through Calman to roll off ALL HDR at 100%, essentially using HGIG for both games and movies/shows.
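Roughly, the difference between that clipping behavior and a rolloff looks like this. A toy Python sketch with made-up knee and peak numbers, not the actual Calman curve:

DISPLAY_PEAK = 800.0   # nits the panel is treated as capable of (assumed)
KNEE_START = 600.0     # where the example rolloff starts bending (assumed)

def hard_clip(nits):
    """HGIG with no in-game sliders: signal passes through 1:1, anything
    above the panel's peak is simply clipped (highlight detail is lost)."""
    return min(nits, DISPLAY_PEAK)

def rolloff(nits):
    """Tone-curve rolloff: 1:1 below the knee, then brighter highlights are
    compressed into the remaining headroom, so they stay distinct instead of
    flattening to one value."""
    if nits <= KNEE_START:
        return nits
    excess = nits - KNEE_START
    headroom = DISPLAY_PEAK - KNEE_START
    return KNEE_START + headroom * excess / (excess + headroom)

for n in (300, 700, 1000, 4000):
    print(n, round(hard_clip(n)), round(rolloff(n), 1))
# hard_clip flattens 1000 and 4000 to the same 800; the rolloff keeps them
# separated (though compressed), at the cost of dimming everything above the knee.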
 

Crayon

Member
Oct 26, 2017
15,580
That's reasonable. There's no universal standard yet, standards vary by content type, etc. -- so the bar has been constantly moving these last few years.

I'm one of the folk that always says they prefer console gaming because of the ease of use, but every time there's a shift like HDR, it becomes a new learning process and can involve some settings tweaks and I admit I find it pretty annoying myself sometimes.

I personally have loved the impact HDR can make, though with the knowledge I have now, I do wonder if some of my PAST good impressions were off the mark when it came to color and contrast accuracy. Still, the more I learn lately, the more confident I'm becoming with HDR.

The funny thing is that both of the new-gen consoles have their system-level HDR tuner so that HDR can theoretically become a set-it-and-forget-it sort of thing for gamers, but devs still haven't adopted its use for the most part. Right now, it feels like Sony's first-party stuff is all utilizing it on PS5 and it's been a pretty great experience for me on my CX, since that's all I have so far besides Devil May Cry 5 Special Edition. I really hope Sony and Microsoft communicate with devs about just looking at the system-side calibration, since that calibration accounts for the user's exact display capabilities and their own subjective preferences and can theoretically apply to any software that runs on the system.

Demon's Souls looks just amazing. But then in the forest in Horizon it looks like f****** Doom 3, then I'm squinting trying to see down the track in WRC, but then Gran Turismo looks amazing again, then black looks gray in one game, like a tar pit in another.

I don't even know if my TV does real HDR anymore!!!

I only heard of HGIG the other day and started trying to figure it out. I said screw it. I turned it all off.

Same. I have my OLED 'calibrated' with my eyes and some settings from RTINGS and other YouTube videos and am perfectly happy where it is right now with every game I play. I'm done touching the settings lol.

∆∆∆∆ this is me. It's always been a bit of a project to get the settings you like best out of your TV. Once I go through all the trouble of getting it set, I want to forget it. I even set it up to compromise between lights off and lights on because I don't want to mess with it. I can handle that with regular settings. I'm no expert, but I can do it. The HDR stuff is killing me with the inconsistency, though.

Maybe it's me, maybe it's my shitty TV. I'm sure one day it will all be sorted out, but for now I'm returning to my rugged SDR lifestyle!
 

HaL64

Member
Nov 3, 2017
1,821
I have no problems when enabling HGIG in games. Looks great.

But forget watching Netflix or Prime. Some scenes are so dark you can't see shit (see like 80% of Mr. Robot scenes). I was like, why is this so banded and black? I turned off HGIG (turned dynamic tone mapping on) on my CX, and all of a sudden I can see again.
Same with Ultra HD Blu-rays (with HDR). LOTR looks too dark with HGIG. Dynamic tone mapping on and it looks beautiful again.

I mean if you want to be "correct", have fun, I guess.
 

Deleted member 75819

User requested account closure
Banned
Jul 22, 2020
1,520
I have no problems when enabling HGIG in games. Looks great.

But forget watching Netflix or Prime. Some scenes are so dark you can't see shit (see like 80% of Mr. Robot scenes). I was like, why is this so banded and black? I turned off HGIG (turned dynamic tone mapping on) on my CX, and all of a sudden I can see again.
Same with Ultra HD Blu-rays (with HDR). LOTR looks too dark with HGIG. Dynamic tone mapping on and it looks beautiful again.

I mean if you want to be "correct", have fun, I guess.
I'm with you. I'm all for accuracy, but not if it means scenes are so dark that a bunch of detail is too hard to see. DTM may not be as accurate but it still looks 100x better than just SDR, so you're still getting the value of HDR.
 

J_ToSaveTheDay

Avenger
Oct 25, 2017
18,789
USA
So if a game doesn't support HGiG, will there be no difference between "DTM off" and HGiG?

There would still be a difference, as demonstrated in the most recent video here in the thread.

HGIG will turn off ALL tone mapping on the TV itself and allow the content source (game console or the game itself) to do the tone mapping. This tone mapping is going to be the most accurate because the system lets the user calibrate it to the EXACT capabilities of the TV, or, if preferred, to the user's exact preference. The problem with this approach right now is that very few developers actually have their games utilize the system-level tone mapping. That might improve in the future, depending on how strongly Microsoft or Sony communicate this to developers and urge them to use it. HGIG with system-level tone mapping could theoretically be a "set it and forget it" universal setting for users, but right now developers tend to opt out and offer in-game settings on a game-by-game basis, which is where the frustration starts, especially since the language and the exact effect of those settings are still being learned by many users.

HGIG is still useful anyway because it tells the TV not to do any tone mapping in its own processing pipeline. It allows all of the tone mapping to be done by the game console or by the game's own settings, but this requires users either to know about the HDR calibration setting on the PS5 and Xbox consoles and run it, or to know their TV's capabilities to a fairly exact measure so they can configure things on a per-game basis... And unless you have a super popular TV like the LG OLEDs, this information can be relatively elusive. For example, it's generally accepted that LG OLEDs are best set to a peak HDR brightness of 800 nits (though the panel's actual physical output is around 700 nits) wherever that setting is available, but I couldn't tell you where you'd find the peak HDR brightness capabilities of most other models. It might or might not be mentioned by review outlets like RTINGS. If you do know this info, it empowers you to set it in any individual game and feel generally confident that you're configuring games to your display's exact capabilities and producing accurate HDR output.

With DTM off, you're telling the TV not to adjust dynamically based on its internal algorithm, which constantly examines the current frame and adjusts contrast based on its best guess of what it's seeing. Instead of doing it dynamically, DTM off says: assume the peak brightness of the content is 4000 nits, then squish that assumed mastering range into the actual physical capabilities of your display. In the example of the LG CX, it assumes the content's HDR metadata is mastered to a max of 4000 nits and a minimum of 0 nits (pure black) and squishes that into the display's actual 0-700 nit range, which does an okay job but not a 1:1 perfect job.
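To make that "squish" concrete, here's a rough toy sketch in Python of HGIG versus a static tone map. The knee point and curve shape are my own assumptions, not LG's actual processing:

ASSUMED_MASTERING_PEAK = 4000.0  # nits the TV assumes the content targets
PANEL_PEAK = 700.0               # roughly what a CX-class OLED can physically output

def hgig(nits):
    # no tone mapping in the TV: show exactly what the console sends,
    # clipping anything above the panel's physical limit
    return min(nits, PANEL_PEAK)

def static_tone_map(nits):
    # "DTM off": map the assumed 0-4000 nit range into 0-700 nits;
    # roughly 1:1 at the low end, then a rolloff above an assumed knee
    knee = PANEL_PEAK / 2  # 350 nits, made-up knee point
    if nits <= knee:
        return nits
    t = (nits - knee) / (ASSUMED_MASTERING_PEAK - knee)  # 0..1 above the knee
    return knee + (PANEL_PEAK - knee) * t ** 0.5         # gentle compression

for n in (100, 700, 1500, 4000):
    print(n, round(hgig(n)), round(static_tone_map(n)))
# A real 700-nit highlight displays at 700 with HGIG, but only ~460 with the
# static map, because the TV is reserving headroom for 4000-nit content that
# may never actually show up -- "okay but not 1:1".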

Dynamic tone mapping uses an internal processing algorithm to analyze each frame and adjust the peaks and lows based entirely on that algorithm's "best guess" of what the accurate contrast range should be, hence the name. It ignores ANY metadata (or re-processes over existing metadata) and determines the tone mapping per frame on its own, which is why it's deemed inaccurate. Right now this setting is generally observed to favor a brighter-than-accurate image overall, raising blacks and mid-tones beyond their accurate values and blowing out bright details. However, it also produces the brightest possible image in HDR games, which many users find pleasant, since lifted blacks seem to give more visual clarity in dark areas and an overall more "clear" image, even at the cost of accuracy.

As a theoretical example, it's like raising blacks so you can see the pipes hanging in the dark corner of the sewer, but also blowing out the image so you can't see the top bricks where the light pours into the tunnel down the way. People prefer to see those pipes rather than the intended pitch-black, realistic presentation of that image -- it can be a tangible visual benefit to see that stuff even though it's not accurate. However, if you took a screenshot of the "accurate" presentation (tone mapped in HGIG to your exact display's capabilities), threw it into Photoshop and exposed it up, you'd find the image data still preserves all that deep dark detail like the pipes without crushing it, even if that's not what your eye physically perceives. The accurate image is the more "realistic" one for how light would behave in that tunnel, not necessarily the one that gives you the most clarity or gameplay advantage, but that detail really is preserved even if your eye can't make it out.
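The "expose it up in Photoshop" point, as a quick toy example with made-up nit values, just to show the shadow detail is stored distinctly rather than crushed:

# two bits of shadow detail that both read as "black" to the eye on screen
pipe_edge = 0.10      # nits, as tone mapped accurately (assumed value)
pipe_shadow = 0.02    # nits (assumed value)

EXPOSURE_GAIN = 16    # a +4 stop push, like exposing the screenshot up

print(pipe_edge * EXPOSURE_GAIN, pipe_shadow * EXPOSURE_GAIN)
# -> 1.6 vs 0.32: the two levels separate cleanly after the push, because the
# accurate image stored them as different values. If the pipeline had crushed
# both to 0.0, no amount of exposure would bring that detail back:
print(0.0 * EXPOSURE_GAIN, 0.0 * EXPOSURE_GAIN)  # -> 0.0 0.0, detail gone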

Accuracy and experiencing content as the creators intended is still a matter of subjective preference. Your display will always have settings that let you see the pipes in a dark corner of an image, and if that's your preference, go for it. The creators likely intended it to be realistically presented, to where you can't quite make out that corner even though the display is technically preserving that detail in a very dark, nearly imperceptible way. It's up to you which approach you prefer. As someone who generally favors immersion, I prefer the more accurate image, even if it comes at the cost of what I can actually perceive in it. If you've ever been in a dark cave or sewer, not everything within your field of view is easily visible without a light source directed at it, and some parts of your field of view will appear "blacked out" -- that's the kind of experience HDR often wants to present, without physically occluding that detail (not in ALL cases, but that's sort of the intent).

Gears 5 and Demon's Souls both have very good HDR implementations on the new-gen hardware. In both, I admit a lot of scenes look very dark and it can be hard to make out fine, minor detail in dark portions of the image, but I set things to HGIG with this knowledge in mind and I generally feel I'm experiencing the games in as immersive and "realistic" a way as the creators intended. Other players might prefer to see the finer details at the cost of blown-out bright details and raised blacks.
 

TitanicFall

Member
Nov 12, 2017
8,263


Vincent confirms he has only found two games on Series X which fully use HGIG... Gears 5 isn't one of them

his argument for HGIG is sound though


I tried his advice for Spider-Man and didn't like the results. A sunny day in New York felt like a gray day in London. I think this is a case of YMMV. It's like letting your TV do the upscaling or letting your source device do it. One might do a better job than the other in some cases. I'd rather set it and forget it with DTM on.
 

ClamBuster

Member
Oct 27, 2017
4,092
Ipswich, England
I tried his advice for Spider-Man and didn't like the results. A sunny day in New York felt like a gray day in London. I think this is a case of YMMV. It's like letting your TV do the upscaling or letting your source device do it. One might do a better job than the other in some cases. I'd rather set it and forget it with DTM on.

I actually agree

I got rid of it on Doom Eternal because it just killed the image
 

maGs

Member
Oct 27, 2017
239
Perfect image accuracy sounds good on paper, but I'm here to play games at the end of the day. A dark, hard-to-see image isn't worth avoiding some details being slightly too bright. Also, HGIG seems to be intended for playing in an extremely dark room. Who wants to sit around worrying about playing during the day vs. at night? Set your screen to DTM on and be done with it. I tried both and tried to convince myself HGIG was somehow better, but nope. DTM on all the way.
 
Oct 28, 2017
83
Perfect image accuracy sounds good on paper, but I'm here to play games at the end of the day. A dark, hard-to-see image isn't worth avoiding some details being slightly too bright. Also, HGIG seems to be intended for playing in an extremely dark room. Who wants to sit around worrying about playing during the day vs. at night? Set your screen to DTM on and be done with it. I tried both and tried to convince myself HGIG was somehow better, but nope. DTM on all the way.

HDR in general is designed to be viewed with around 5 nits of ambient light. Listening to Vince from HDTVTest in his recent dynamic tone mapping video, I think he makes a really valid point: things look more impressive when you have a really bright object in a dark scene, or vice versa. You want the dynamic range to be as wide as possible, darkest to brightest. With dynamic tone mapping enabled, it lessens that dynamic range by raising the brightness across the whole range, including mid-tones, which I agree looks off -- it looks flatter. The problem is that because tone mapping brightens the whole screen, it gives the image that 'pop' factor, and a lot of people like that and associate it with 'looking better'. With HGIG enabled it looks much more natural imo.
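A quick made-up numbers example of that flattening (invented values, not measurements from any real TV):

# nit values for one dark scene, as mastered vs. after an aggressive
# brightness lift across the whole range (both sets of numbers invented)
as_mastered = {"shadow": 0.05, "mid_tone": 20.0, "highlight": 600.0}
lifted      = {"shadow": 0.50, "mid_tone": 35.0, "highlight": 750.0}

for label, scene in (("as mastered", as_mastered), ("lifted", lifted)):
    ratio = scene["highlight"] / scene["shadow"]
    print(f"{label}: {ratio:,.0f}:1 in-scene contrast")
# as mastered: 12,000:1 -- lifted: 1,500:1. The lifted version is brighter and
# "pops", but the darkest-to-brightest spread within the scene is much smaller.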
 

2Blackcats

Member
Oct 26, 2017
16,054
HGIG turns off tone mapping. I use it regardless of whether the game "supports" it or not.

If the game supports system level HGIG, perfect. If the game has sliders, perfect.

If you get a game like God of War, which has neither, you basically just clip the upper highlight detail, but retain a brighter picture.

I would never use DTM, just as I wouldn't use Vivid mode. I even did a tone curve through Calman to roll off ALL HDR at 100%, essentially using HGIG for both games and movies/shows.

This, exactly.

DTM looks fine and is brighter, but the tones look way less natural to me. It looks much gamier, but maybe that's why people like it.
 

ShadowRunner

Member
Oct 29, 2017
166
It seems like using HGIG is a bit like calibrating SDR to 100 nits (or is it 200?), but it's just not going to look good for the majority of people. Most people accept that this is not a hard and fast requirement for home viewing. Is anyone using Filmmaker Mode for watching cable TV?

It doesn't help that a lot of what is considered correct calibration is taken directly from standards for cinema. Applying those standards to games doesn't always make sense, otherwise we would be playing them at 24fps too lol.
 

Kinggroin

Self-requested ban
Banned
Oct 26, 2017
6,392
Uranus, get it?!? YOUR. ANUS.
What TV do you have?

Just based on popularity, I'm guessing it's an OLED. In which case you should use HGIG, then run each console's in-built HDR calibration app. For any game that gives you the option, set peak brightness at 800 nits.

At first blush, dynamic tone mapping can look like a more pleasing image, as it is much brighter. But it is ultimately less accurate.

Nailed it
 
Oct 27, 2017
1,382
It seems like using HGIG is a bit like calibrating SDR to 100 nits (or is it 200?), but it's just not going to look good for the majority of people. Most people accept that this is not a hard and fast requirement for home viewing. Is anyone using Filmmaker Mode for watching cable TV?

It doesn't help that a lot of what is considered correct calibration is taken directly from standards for cinema. Applying those standards to games doesn't always make sense, otherwise we would be playing them at 24fps too lol.
Nah, it's different, because each pixel has a fixed nit value in the HDR signal, and HGIG displays exactly what it's given (up to ~800 nits, the max the TV is capable of). So it's not really a standard, that's just how HDR works. Dynamic tone mapping tries to make assumptions about the image and manipulates it more than just increasing the brightness. It can lead to reduced detail in darks or highlights and really changes the intended look.
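For what it's worth, the "fixed nit value" bit is literal: HDR10 encodes absolute luminance with the SMPTE ST 2084 (PQ) curve, so every code value in the signal maps to a specific number of nits. A small Python sketch of that; the constants are from the PQ spec, but the 800-nit panel peak is just an assumption:

# SMPTE ST 2084 (PQ) constants
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_to_nits(signal):
    """PQ EOTF: normalized signal value (0..1) -> absolute luminance in nits."""
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

def hgig_display(signal, panel_peak=800.0):
    """HGIG behaviour: reproduce the encoded luminance as-is, clip at the panel."""
    return min(pq_to_nits(signal), panel_peak)

for code in (0.25, 0.50, 0.75, 1.00):
    print(f"signal {code:.2f} -> {pq_to_nits(code):7.1f} nits, shown as {hgig_display(code):6.1f}")
# Signal 1.0 is 10,000 nits by definition; everything the panel can't reach
# just clips at 800 under HGIG instead of being re-mapped by the TV.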

Although if HLG takes off (HDR that uses a gamma curve) we wouldn't have to worry about any of this lol.
 

NoWayOut

Member
Oct 27, 2017
2,073
HGIG is the "correct" way to handle HDR in the sense that it will give you the most accurate picture. DTM makes everything brighter and makes the image pop more at the cost of accuracy. It's kind of a vivid mode for HDR. It's like the Warm 2 white balance debate, it is more accurate but many do not like it and prefer a cooler color tint.

Like many settings, it's very subjective. I personally prefer a more accurate and neutral image whenever possible; others like a brighter, more saturated presentation. It's up to you. Try both and set it the way you like it.
 

Blayde

Member
Oct 27, 2017
1,690
Kentucky
I don't have an OLED. My TV has a tone mapping setting, but not a dynamic tone mapping setting. The tone mapping setting is a value from 0-100, so what am I supposed to put it on? It's one of the new Vizio PQX TVs.
 

HeyNay

Banned
Oct 27, 2017
2,495
Somewhere
Leave it on unless you're playing in a pitch black room, or playing a game that specifically supports HGiG (most don't). I've been going back and forth over this, watching HDTVTest videos, and I finally found a thread that makes sense of it all:

Dynamic tone mapping on or off?


TLDR: Nothing is ever going to be perfectly "accurate" with an at-home display. Dynamic Tone Mapping as a software solution is basically a means to an end, and will ultimately get you a better picture in most situations, especially if you use your set with any ambient light in the room. Professional calibrators and AV junkies always talk about "accuracy" as it relates to numbers or tests, instead of just going with what feels right in their own living room settings, all of which are ultimately different. There's a reason they say not to use other people's settings. They're a good starting point, but ultimately you want to understand what a feature is doing and whether or not it looks good to your eyes.

If you have an HGiG-compatible game (there are still only a few of these), then sure, that'll be the best setting, as it'll allow the game to map the brightness as the creator intended. But if you don't, enabling HGiG will likely result in a duller, dimmer picture. You can get used to it, sure. Even convince yourself that it's "better", but that's arbitrary.

I just went through RE Village with HGiG enabled and some rooms were so dark that I couldn't possibly conceive that it was what the creators intended. When I turned on Dynamic Mapping, the room suddenly illuminated with color and light and felt completely balanced.

The most logical advice I read on this issue is to adjust the HDR settings in your console with dynamic mapping turned off (otherwise it'll change in brightness as you adjust the setting), and then enable dynamic mapping afterwards. Doing this gave me stunning results in RE Village and I was able to forget about all this HGiG stuff.