In general I find HDR a little overblown, personally. It makes the sky in games like Gears 4 and FH4 look great and really stand out, but even in SDR it still looks really damn good. Shadow of the Tomb Raider can look great in the jungles, but in caves, crypts, and tombs I prefer the look of SDR. It's a mixed bag, in my opinion, and not as consistent as I'd hoped.
Hope it looks great in RDR2. Rockstar's first HDR implementation.
Like an updated version? 'Cause at least on consoles, the HDR mode has been there since day one.
--------------
As for the thread, at least in my experience the only really bad one was Destiny 2, and not because it made the game look bad: it's the only HDR I've seen that actively made the gameplay feel worse. For some reason it felt like it added a lot of input lag. Other HDR games didn't do this. Maybe it's my TV, but then I'd expect other games to have the same problem as well.
Destiny 2. The dark areas in the game just go pitch black, which sucks because you often have to navigate and platform through these areas in the new Forsaken expansion.
Have you tried the Sony games? Aside from God of War, they're all so good in HDR.
I agree that overall it's a mixed bag. I thought AC Origins looked pretty great with HDR but now people are saying Odyssey isn't? So weird.
Isn't this the point of hdr though? Destiny 2 hdr is great imo.
Most of the people in this thread don't have their sets' HDR calibrated properly. Once it's done, it's a game changer.
Would definitely be interested.
For me, Nessus and Io look particularly like a blotchy mess, and I feel like I have to choose between being able to see in pitch-black caves or being able to see my HUD or enemies against the white sky.
Thanks.

The game doesn't control the exposure correctly and illuminates many of the daylight scenes above and beyond a normal SDR grade. When you move to other areas of the game it doesn't have this same issue, but it's an issue nonetheless.
Already have it on PC but I bought the X version just for the HDR. Time to sell it I guess :\
Some posts here are really embarrassing.
"It's your TV"
"It's your settings"
"You are wrong and its not a opinion its a fact."
This is not the first TV I've owned, and this is not the only board I'm registered on (AVForums, AVS Forum, hifi-forum, etc.). I know how settings work, thank you very much.
God of War has nice graphics and all, but of the few HDR games I've played (RE7, UC4, Shadow of the Tomb Raider) and the tons of HDR movies I've watched, it is the most unnatural-looking and the most straining on the eyes, with the extreme contrasts in the daylight scenes.
Perhaps it doesn't fit the thread title's description, but I didn't find it good either.
I don't have a PS4, but when I borrowed my brother's Pro and played Uncharted 4 on it, on the OLED I had at the time, it looked great. Although I'm not even sure U4 has HDR?

It does. It looks amazing on my A1 OLED, especially the at-sea and Madagascar levels.
My God of War looks amazing with HDR... so do check your settings, TV, etc.
It's a different case if you compare it to Uncharted 4, for example; in that game I find the colors more real, but a little washed out in some scenes.
Isn't this the point of hdr though? Destiny 2 hdr is great imo.
I'd have to disagree.
Destiny 2 would be one of those games where I think HDR has a poor implementation. So bad that I have to turn the in-game brightness down to 1 to get it halfway to where I think it should be. Gran Turismo 5 has no HDR, yet pitch black is actually pitch black.
So yeah, Destiny 2 gets my vote.
Thanks, but my C7 set is properly calibrated, I played around with the ingame HDR settings though.
It is my opinion that GoW's HDR implementation has problems in a lot of situations, and HDR guru EvilBoris explained why that is the case.
There is really nothing more to say here.
Yeah, I still find HDR gaming on the 7 series sucks most of the time. I tend to use the Cinema Home mode without Real Cinema enabled to brighten things up, thanks to Active HDR.

I suspect the B/C6 users probably get a better deal here, as their displays are not as bright as the data the game is asking for, and they demand 4000-nit data at all times, which the game offers by default.
The patch for the C7 changed this 4000-nit requirement, didn't it?
The point about calibration (certainly for luminance) becomes somewhat moot for HDR with many of the LG sets, as they have undefeatable (and with good reason) global and local contrast enhancement and tone mapping.
Was this on console or PC? I've been playing it on PC at 4K HDR and it looks and controls gloriously.
I don't really understand your post.
AFAIK the 7 series has a peak brightness of 1,000 nits, and every signal above that gets downconverted by the scaler (many HDR movies are graded up to 10,000 nits).
What is different with the 6 series?
While HDR in Assassin's Creed is great, all devs need to stop putting white loading screens in HDR games. It's an eye-killer when you've got a decent HDR TV and they flash a white loading screen at you. Come on, guys.
Thanks, very interesting, as always.

I mean, they get a better deal for God of War, which is fixed at 4000 nits, and the game is categorically too bright, but the '16 sets are nowhere near bright enough to do what it is asking.
The C7 has a peak brightness of roughly 700 nits on a 10% window.
By the time 50% of the screen is requesting HDR code values, the display is only able to supply about 290 nits to it.
So the display has to constantly make decisions about the relative brightness of the pixels distributed on screen. If the skybox is asking for 1000 nits and the display is showing it at its maximum capability of 290 nits, and then an explosion happens on the lower half of the screen requesting 4000 nits, the display would have to drop the skybox back to, say, 150 nits in order to increase contrast and then show said explosion at 300 nits.
However, it can't do this on a frame-by-frame basis, so what displays typically do is dial back the overall brightness across the board, so that when high code-value brightness data comes in, there is enough juice left to deliver an increase.
The problem with video games is that they don't supply metadata telling the display what content to expect. The '16 series sets are not as bright to begin with, and they hold back more because they treat all video games with no metadata as content that might reach 4000 nits.
The newer models had patches for game mode that lowered this threshold.
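The headroom argument above can be sketched numerically. Here's a rough Python toy using the numbers quoted in this thread (a source the display must assume can reach 4000 nits, and a panel peaking around 700 nits); the knee fraction is a made-up illustrative value, and none of this is LG's actual tone-mapping algorithm:

```python
# Toy static tone-mapping curve for a display with no content metadata.
# Numbers come from the posts above; the curve itself is illustrative only.

SOURCE_MAX = 4000.0   # nits the display assumes the game might request
PANEL_PEAK = 700.0    # approximate C7 peak on a 10% window
KNEE = 0.75           # fraction of panel peak that is mapped one-to-one

def tone_map(nits_in: float) -> float:
    """Map a requested luminance to what the panel actually shows."""
    knee_nits = KNEE * PANEL_PEAK          # 525 nits: linear up to here
    if nits_in <= knee_nits:
        return nits_in                     # dark and mid tones pass through
    # Compress everything from the knee up to SOURCE_MAX into the
    # remaining panel headroom, so highlights keep their relative order.
    span_in = SOURCE_MAX - knee_nits
    span_out = PANEL_PEAK - knee_nits
    return knee_nits + (nits_in - knee_nits) * (span_out / span_in)

# A 1000-nit skybox lands well below 1000, leaving headroom so a
# 4000-nit explosion still reads as clearly brighter:
print(round(tone_map(1000)))   # skybox: 549
print(round(tone_map(4000)))   # explosion: 700 (panel peak)
```

This is why a dimmer '16 panel that always budgets for 4000-nit input ends up holding back so much: the bigger the gap between `SOURCE_MAX` and `PANEL_PEAK`, the harder everything above the knee gets squashed.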
not sure if joking..
Read the thread. -_-
The woods of the Witch are stunning with HDR; that game has one of the best HDR implementations (together with Horizon ZD).
Destiny 2 is a mixed bag. Some areas look amazing while others are awful.
For me, the locations on Io are all fucked up. Whatever post-lighting effect Bungie is using creates this crushed purple haze in dark areas. I can't see shit.
I can't go back to SDR because the entire game just looks so washed out.
Try messing with the black slider in the settings. Ignore what the game tells you about making the icon barely visible, blah blah. Just try different slider positions in the right half of the bar until it looks right.
I did this last night on Io in Xur's cave and was shocked at the difference.
BTW my white level is all the way down and bringing it up at all looks really bad for some reason. YMMV
There are white loading screens in AC: Odyssey? I'm only seeing black ones on PS4.

They normally appear after a cutscene, I think. They flash white, then stay white for a good few seconds; it can be blinding at times, even more so if you're playing with the lights off.
i haven't played too many games in hdr, but god of war is the worst i've seen