
Ebb

Member
Oct 27, 2017
102
Ebb Try this:

87:01:1a:28:02:00:c2:33:c4:86:4c:1d:b8:0b:d0:84:80:3e:13:3d:42:40:e8:03:32:00:10:27:90:01

Make it one string obviously, I've taken the string from the Windows GUI, instead of the app.
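For anyone curious what's actually in that string: it appears to be a raw CTA-861.3 Dynamic Range and Mastering InfoFrame (type 0x87, version, length, checksum, then 26 payload bytes of little-endian uint16 fields), which is what the HDFury devices inject. A rough decode sketch in Python, on that assumption:

import struct

def decode_hdr_string(s):
    # Assumes: 87:VV:LL:CS followed by 26 payload bytes, checksum chosen so
    # all bytes sum to 0 mod 256, multi-byte fields little-endian uint16.
    b = bytes(int(x, 16) for x in s.split(":"))
    assert b[0] == 0x87 and b[2] == 26, "not a DRM infoframe"
    assert sum(b) % 256 == 0, "checksum mismatch"
    eotf = b[4]                                          # 2 = SMPTE ST 2084 (PQ)
    f = struct.unpack("<12H", b[6:30])
    return {
        "eotf": eotf,
        "primaries_xy": [v * 0.00002 for v in f[0:6]],   # looks like green, blue, red (P3) here
        "white_point_xy": [v * 0.00002 for v in f[6:8]],
        "max_mastering_nits": f[8],
        "min_mastering_nits": f[9] * 0.0001,
        "maxcll": f[10],
        "maxfall": f[11],
    }

print(decode_hdr_string(
    "87:01:1a:28:02:00:c2:33:c4:86:4c:1d:b8:0b:d0:84:80:3e:13:3d:"
    "42:40:e8:03:32:00:10:27:90:01"))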

Holy crap, yup that did it -- wow this looks great! HDR Game with DC off now looks about the same as Cinema Home with DC on low; switching back and forth between the two I can't really tell much of a difference. Like you said, I can notice a tiny bit of clipping, for example when I look up at the sun/clouds in Horizon, but it's well worth the trade-off for the extra brightness/shadow detail IMO, and nowhere near as bad as Standard/Vivid.

Also - I'm now able to toggle the metadata injection on and off via the app without restarting the device. I think the theories about HDR Game mode assuming 4000 nits peak brightness are correct -- toggling the metadata injection seems to do virtually nothing in any of the other modes aside from Game mode.

I don't want to oversell it - it's still a pretty expensive solution to an issue only some people have a problem with, but it essentially gives your B/C7 Cinema Home-like tone mapping at the 21ms latency of game mode. I'm pretty happy with it, and while the device cost me ~$160 (on sale), it's a whole lot cheaper than upgrading to a B/C8.

Thanks again DOTDASHDOT for the recommendation. I'm going to play around with it more tonight when it gets darker, and can post some more impressions then
 

DOTDASHDOT

Helios Abandoned. Atropos Conquered.
Member
Oct 26, 2017
3,076
Holy crap, yup that did it -- wow this looks great! HDR Game with DC off now looks about the same as Cinema Home with DC on low; switching back and forth between the two I can't really tell much of a difference. Like you said, I can notice a tiny bit of clipping, for example when I look up at the sun/clouds in Horizon, but it's well worth the trade-off for the extra brightness/shadow detail IMO, and nowhere near as bad as Standard/Vivid.

Also - I'm now able to toggle the metadata injection on and off via the app without restarting the device. I think the theories about HDR Game mode assuming 4000 nits peak brightness are correct -- toggling the metadata injection seems to do virtually nothing in any of the other modes aside from Game mode.

I don't want to oversell it - it's still a pretty expensive solution to an issue only some people have a problem with, but it essentially gives your B/C7 Cinema Home-like tone mapping at the 21ms latency of game mode. I'm pretty happy with it, and while the device cost me ~$160 (on sale), it's a whole lot cheaper than upgrading to a B/C8.

Thanks again DOTDASHDOT for the recommendation. I'm going to play around with it more tonight when it gets darker, and can post some more impressions then

Awesome! Glad it's working! Not sure why there are two different strings for HDR, but whatever if it works it works :) Don't forget that with this way of doing things, you keep the blacks, and don't get dark grey kicking in, like with DTM. Win win :0

BTW injection/HDR games do act the same in the other modes, it's just that if you have dynamic contrast on low, you won't see any difference; that's how I compared injection to DTM, using Cinema user and switching between DC off and low whilst keeping injection running.
 
Last edited:

Deleted member 16452

User requested account closure
Banned
Oct 27, 2017
7,276
The Z9F retails for about the same price as Samsung's Q9FN if this is true. I wonder when these TVs will hit stores? Probably not in time for Black Friday deals. :(

If this is true then there will be no reason for anyone to get a Q9FN.

I can't even imagine what it would be like to watch a 4000 nit TV lol, it must be BLINDING.
 

Ukraine

Banned
Jun 1, 2018
2,182
Wouldn't the motion smoothing settings help for movies (while being out of question for games due to input lag)?

Also, is it something you could get used to, even if you're initially sensitive to it?
Is there someone here who was very bothered by it, but then got better?
It helps but I don't like it on. It's very rare that it bothers me. It's really just wide panning shots in movies with high contrast.
 

Deleted member 4346

User requested account closure
Banned
Oct 25, 2017
8,976
If this is true then there will be no reason for anyone to get a Q9FN.

I can't even imagine what it would be like to watch a 4000 nit TV lol, it must be BLINDING.

If Z9F doesn't have variable refresh rate and the super-low input lag of the Q9FN, I imagine more people who want an LCD would still get the Samsung. To me the gaming features are really the only selling point, it's the highest contrast VRR display that you can buy, with some of the lowest input lag, zero burn-in, and you can use the interpolation with minimal processing delay. 4000 nits for peak highlights would be cool but for SDR/HDR image quality overall OLEDs are still going to IMO be better; highlights at 800-900 nits "pop" more when a display can put them right beside 0 nit pixels compared to a super-bright LCD where the highlights blow out blacks in the rest of the zones around it.
 

DOTDASHDOT

Helios Abandoned. Atropos Conquered.
Member
Oct 26, 2017
3,076
If Z9F doesn't have variable refresh rate and the super-low input lag of the Q9FN, I imagine more people who want an LCD would still get the Samsung. To me the gaming features are really the only selling point, it's the highest contrast VRR display that you can buy, with some of the lowest input lag, zero burn-in, and you can use the interpolation with minimal processing delay. 4000 nits for peak highlights would be cool but for SDR/HDR image quality overall OLEDs are still going to IMO be better; highlights at 800-900 nits "pop" more when a display can put them right beside 0 nit pixels compared to a super-bright LCD where the highlights blow out blacks in the rest of the zones around it.

One of the downsides of the current Sony offerings is the X1 Extreme chip, so excitingly, you can pretty well guarantee the next sets will have the X1 Ultimate chip, with whatever upgrades that brings.
 

Deleted member 16452

User requested account closure
Banned
Oct 27, 2017
7,276
If Z9F doesn't have variable refresh rate and the super-low input lag of the Q9FN, I imagine more people who want an LCD would still get the Samsung. To me the gaming features are really the only selling point, it's the highest contrast VRR display that you can buy, with some of the lowest input lag, zero burn-in, and you can use the interpolation with minimal processing delay. 4000 nits for peak highlights would be cool but for SDR/HDR image quality overall OLEDs are still going to IMO be better; highlights at 800-900 nits "pop" more when a display can put them right beside 0 nit pixels compared to a super-bright LCD where the highlights blow out blacks in the rest of the zones around it.

According to the video the leak comes from, the Z9F comes with a new Sony processor called the X1 Ultimate. So the input lag will probably be very competitive, considering there's not a big difference right now with the current X1 Extreme chip. The video also says Android TV is about 60% faster thanks to this new chip.

And of course OLED will always be the superior image quality, I wasn't debating that lol.
 

FuturaBold

Member
Oct 27, 2017
2,521
If this is true then there will be no reason for anyone to get a Q9FN.

I can't even imagine what it would be like to watch a 4000 nit TV lol, it must be BLINDING.
The 4000 nits refer to small specular highlights (10% window). I'm talking car headlights, sun, tracers from gunfire, VFX, etc. Playing something like the new Spiderman would be stunning to say the least. If this rumor is true Sony will no doubt show off Spiderman on their new displays.
 

DOTDASHDOT

Helios Abandoned. Atropos Conquered.
Member
Oct 26, 2017
3,076
The 4000 nits refer to small specular highlights (10% window). I'm talking car headlights, sun, tracers from gunfire, VFX, etc. Playing something like the new Spiderman would be stunning to say the least. If this rumor is true Sony will no doubt show off Spiderman on their new displays.

It will need a ton of zones to pull off full 4000-nit highlights against dark areas.
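Quick back-of-the-envelope on why (the panel dimensions are just 16:9 geometry; the zone counts are round numbers for illustration, not any specific model's spec):

import math

diag = 65.0
width = diag * 16 / math.hypot(16, 9)     # ~56.7 in
height = diag * 9 / math.hypot(16, 9)     # ~31.9 in
area = width * height                     # ~1806 sq in

for zones in (120, 500, 1500):
    side = math.sqrt(area / zones)        # edge of a roughly square zone
    print(f"{zones:>5} zones -> each zone ~{side:.1f} in across")
# Even at 1500 zones each zone is over an inch wide, so a tiny 4000-nit
# highlight still lights up a visible patch of backlight around itself.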
 

Chamber

Member
Oct 25, 2017
5,279
Fuck the brightness talk, the real game changer is the new CPU actually being able to run Android TV at an acceptable level.
 

Deleted member 4346

User requested account closure
Banned
Oct 25, 2017
8,976
1500 zones would be so costly to manufacture that the Z9F won't be priced competitively. This is where FALD finds itself at a disadvantage at this point, where LG has refined their manufacturing process to bring 65" OLED under $2k.
 

Chamber

Member
Oct 25, 2017
5,279
If Z9F doesn't have variable refresh rate and the super-low input lag of the Q9FN, I imagine more people who want an LCD would still get the Samsung. To me the gaming features are really the only selling point, it's the highest contrast VRR display that you can buy, with some of the lowest input lag, zero burn-in, and you can use the interpolation with minimal processing delay. 4000 nits for peak highlights would be cool but for SDR/HDR image quality overall OLEDs are still going to IMO be better; highlights at 800-900 nits "pop" more when a display can put them right beside 0 nit pixels compared to a super-bright LCD where the highlights blow out blacks in the rest of the zones around it.

Yeah, I don't know, I think the Q9 is toast if that $3500 price is true. If gaming was that big a deal in the premium market, QLED wouldn't be getting its teeth kicked in by OLED in the first place. Even from a gaming standpoint, I wouldn't trade Sony's vastly superior (and accurate) image quality and processing for ~10ms of input lag.
 

DOTDASHDOT

Helios Abandoned. Atropos Conquered.
Member
Oct 26, 2017
3,076
Well that was a good choice re-buying Rise of the Tomb Raider for £13.50 on the X! Held off as I'd played 3/4 of it on the PC, 2 years ago, what an awesome HDR implementation.
 

tokkun

Member
Oct 27, 2017
5,405
Yeah, I don't know, I think the Q9 is toast if that $3500 price is true. If gaming was that big a deal in the premium market, QLED wouldn't be getting its teeth kicked in by OLED in the first place. Even from a gaming standpoint, I wouldn't trade Sony's vastly superior (and accurate) image quality and processing for ~10ms of input lag.

Samsung also has VRR, auto game mode, and a motion interpolation mode that runs with only 30 ms of lag, making it usable for many games. It would be really nice to see other manufacturers adopt these features. Improvements to motion resolution are welcome, since that is one of the only areas where current TVs are still way behind what plasma could do.
 

Cronus

Member
Oct 31, 2017
521
I'm looking forward to DOTDASHDOT's input lag test!!

Since the little HDR Home experiment I've noticed many things with my B7 almost 1 year later... things like PC mode not working correctly with Limited RGB color space (this is for SDR), or PC Mode crushing some shadow detail in the low end, even with proper settings.

Using Technicolor or ISF Dark with the PS4 OUTSIDE PC MODE has been mind blowing for me, I am now getting this perfect picture with excellent colors and TONS OF SHADOW DETAIL OUT OF NOWHERE.

This is why I never was able to match the in-app image (Netflix/YouTube) with my PS4, the PS4 always looked darker and artificially contrasty in the end... and it was PC Mode all along... almost 1 and a half years later... :(
Can you explain this in more detail, please? I use PC mode + limited, and this is the first time I've heard anything about this.
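For what it's worth, the usual cause of crushed shadows here is a levels mismatch somewhere in the chain: limited ("video") range puts black at code 16 and white at 235, full ("PC") range uses 0-255, and if one end converts while the other doesn't, near-black detail gets lifted or clipped. A toy sketch of the two mismatch cases (just the arithmetic, nothing LG- or PS4-specific):

def full_to_limited(v):                    # squeeze 0-255 into 16-235
    return round(16 + v * 219 / 255)

def limited_to_full(v):                    # expand 16-235 back to 0-255, clipping outside
    return min(255, max(0, round((v - 16) * 255 / 219)))

shadows = [0, 4, 8, 16, 24, 32]
# Source outputs limited but the display treats it as full: blacks come out grey.
print([full_to_limited(v) for v in shadows])     # [16, 19, 23, 30, 37, 43]
# Source outputs full but the display expands it as if limited: everything
# below code 16 is crushed to black.
print([limited_to_full(v) for v in shadows])     # [0, 0, 0, 0, 9, 19]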
 

Barrett2

Member
Oct 27, 2017
137
Seattle
Finally bought a 55" LG B8 today. Barely begun to go through the various settings.

For playing games on an XBOX ONE X, will I see a substantial difference by tweaking settings manually vs just putting on a pre-set mode like "dark room" or "gaming?" Thanks.
 

MoFo2

Member
Oct 27, 2017
151
I'm looking at a 75", mainly either the

Sony X90F
+ low input lag (21ms) on 4k
+ good motion handling for football (soccer)
- 40ms input lag for non-4k (Switch, Snes mini etc)
- Android is slow (if it's the same as my 2015 55X90C)
or

Samsung Q9FN
+ low input lag
+ Tizen apparently is good stuff
- much more expensive than the Sony
- Not so good on motion as the Sony? Someone on Youtube mentioned motion handling on Samsungs has a distinct look/feel, whatever that means.

Nevertheless, I'm holding out for black friday. Until then I have probably gone through some other sets in my head as well.
 
Oct 27, 2017
1,248
Damn this thread, I fell into this state of mind where I'm thinking about what to get, and if I should do it at all. It's just full circle now - "Well, my 55'' plasma is still great, maybe I should just keep it and not bother... Yeah, but 65'' is looking SO good, immersion will be much better.. Well, what if IQ will be disappointment?.. Well, OLED seems like best option.. Oh but those burn in talks are really scary... Maybe I just should buy best available LED? Philips 8503 is like 2400€ it must be good... But if not, do I really need it?.. Well, my 55'' plasma is still great, maybe I should just keep it and not bother... Yeah, but 65''..."

What do you say, guys? If I go for top LED TV, maybe not especially Philips, maybe a good Sony one, will it be a good upgrade? I'm slowly going away from OLED idea because I really don't want to think about possible burn in on 3000 euro TV. Just possibility of this horror scares me away from it.
 

Sunbro83

Member
Oct 27, 2017
1,262
Holy shit. 65" C8 already down to £2199 (after LG Cashback redemption). Took until BF 2017 for the B7/C7 to hit that price. I'd been waiting for it to hit that price but thought I had a few months and I'm not in a position to buy it right now. Wonder if it can come down to £2000 by BF?

Anyway, that deal is available now at Crampton & Moore. Don't forget to tell them that Vincent sent you!

Edit: deal is over
 
Last edited:

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,683
Awesome! Glad it's working! Not sure why there are two different strings for HDR, but whatever if it works it works :) Don't forget that with this way of doing things, you keep the blacks, and don't get dark grey kicking in, like with DTM. Win win :0

BTW injection/HDR games do act the same in the other modes, it's just that if you have dynamic contrast on low, you won't see any difference; that's how I compared injection to DTM, using Cinema user and switching between DC off and low whilst keeping injection running.

Is there any software to easily adjust the values and get a nice string?

I've been doing it all manually using a spreadsheet and an INT16 converter, but it's slow and painful
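In case it helps, the string is simple enough to build in a few lines of Python instead of a spreadsheet. A sketch assuming the same CTA-861.3 DRM infoframe layout as the string posted earlier in the thread (field order is taken from the spec, so it's worth checking one output against the Vertex GUI before trusting it):

import struct

def hdr_metadata_string(max_mastering=1000, min_mastering=0.005,
                        maxcll=1000, maxfall=400,
                        primaries=(0.265, 0.690, 0.150, 0.060, 0.680, 0.320),
                        white=(0.3127, 0.3290), eotf=2):
    # Payload: EOTF, descriptor ID, 6 primary coordinates, white point,
    # max/min mastering luminance, MaxCLL, MaxFALL, all little-endian uint16.
    payload = struct.pack(
        "<BB12H", eotf, 0,
        *(round(v / 0.00002) for v in primaries),
        *(round(v / 0.00002) for v in white),
        max_mastering, round(min_mastering / 0.0001), maxcll, maxfall)
    header = bytes([0x87, 0x01, len(payload)])
    checksum = (-sum(header) - sum(payload)) % 256   # all bytes sum to 0 mod 256
    return ":".join(f"{b:02x}" for b in header + bytes([checksum]) + payload)

# e.g. signal a 1000-nit mastering peak with P3/D65 primaries (the defaults above):
print(hdr_metadata_string(maxcll=1000, maxfall=400))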
 

DOTDASHDOT

Helios Abandoned. Atropos Conquered.
Member
Oct 26, 2017
3,076
Is there any software to easily adjust the values and get a nice string?

I've been doing it all manually using a spreadsheet and an INT16 converter, but it's slow and painful

Man I don't know, only way I've seen is your way, but I don't mind helping you out, if you give me what combinations you want, I'll generate them on the Vertex, and send them :)
 

ArchAngel

Avenger
Oct 25, 2017
1,472
Damn, can't decide between the two 65" sets: Sony X90F and LG C8.
C8 is 900€ more expensive than the Sony, and I don't like OLEDs for their faults (banding, black crush, smear etc). Need a new TV today .... ARGH.
 

AMDfanTO

Member
Nov 8, 2017
135
When left blank, the TV has to make an assumption about the info. My old Panasonic DX802 ignored all metadata and treated all video as 1000-nit peak, with a roll-off up to clipping at 1300 nits. Ultimately this was the best choice to make, but it did reduce the detail in highlights for a few games. Horizon and Final Fantasy 15 showed lots of clipping in clouds and in specular detail.

LG chooses detail over adhering to the PQ EOTF and treats zeroed-out metadata as a 4000-nit peak, which is not the way to go. Any data beyond 1000 nits is not vital information, and the display should never reduce the APL to compensate. The issue is the aggressive tone mapping of bright yellows and reds. This makes fire and sunsets look muddy on the B7, unless Cinema Home with Active HDR is used or one is using metadata injection. Regular Cinema mode in HDR with Active HDR will tone map the image even further if greater-than-4000-nit content is shown. That's why I would stick with Cinema Home instead.

The aggressive tone mapping of untagged HDR metadata is also why God of War is giving everyone issues and variable results. Look at the ground and it's less than 1000 nits, then look up at the sky and it's greater than 4000 nits, which really trips up the Active HDR algorithm. Basically, Active HDR smoothly switches between 540-, 1000-, and 4000-nit static tone mapping depending on what's on the screen at the time.
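A toy illustration of that last point (not LG's actual curve; just a generic static tone map with a 300-nit knee and an assumed 750-nit panel, to show why the assumed content peak matters):

def tone_map(nits, content_peak, display_peak=750.0, knee=300.0):
    # Pass-through below the knee, then compress [knee, content_peak]
    # linearly into the panel's remaining headroom [knee, display_peak].
    if nits <= knee:
        return nits
    return knee + (nits - knee) * (display_peak - knee) / (content_peak - knee)

for assumed_peak in (1000.0, 4000.0):
    mapped = [round(tone_map(n, assumed_peak)) for n in (400, 700, 1000)]
    print(f"assumed peak {assumed_peak:.0f}: 400/700/1000 nits -> {mapped}")
# assumed peak 1000: [364, 557, 750]; assumed peak 4000: [312, 349, 385]
# i.e. treating everything as 4000-nit content drags ordinary highlights down,
# which is the dimming people describe in HDR Game mode with blank metadata.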

Btw, getting the HDFury Linker today. I will post impressions.
 
Last edited:

Ebb

Member
Oct 27, 2017
102
Cool to see some other people playing around with metadata injection -- interested to see your impressions.

FYI - the HDFury Vertex config GUI has an interface for generating HDR Metadata strings, which the Linker GUI does not have. However, it looks like you can launch the Vertex GUI and use the metadata generator even if you don't have a Vertex connected to your PC (the Linker GUI on the other hand won't launch for me unless the device is connected). So, perhaps you could generate the strings using the Vertex GUI, then copy and paste them into the Linker config.

I played more Horizon last night (including some time in the snowy Frozen Wilds DLC area) using DOTDASHDOT's string, and it continues to look excellent -- the perfect blacks are retained, making the snowy nights look really impressive, especially coupled with the super bright HDR effects from the robots/Alloy's armor.
 

DOTDASHDOT

Helios Abandoned. Atropos Conquered.
Member
Oct 26, 2017
3,076
EvilBoris I'm asking for a mate, can you turn on and off the HDR injection via the Linker itself? He's got a bit of a wait till his Goblue rolls up.
 

Arih

Member
Jan 19, 2018
471
Damn, can't decide between the two 65" sets: Sony X90F and LG C8.
C8 is 900€ more expensive than the Sony, and I don't like OLEDs for their faults (banding, black crush, smear etc). Need a new TV today .... ARGH.

I'm kinda in the same boat. Can't decide between the LG C8 and Samsung Q9FN.
Can't decide mainly because the C8 is an OLED; I'm not made of money, so I'd like to keep whichever TV I buy for at least 8 years, so burn-in is a big concern. And the Q9FN has really bad shadow detail. On a night sky scene, the C8 pops the stars way more than the Q9FN. Plus the Q9FN 55", some reviewers are saying it's bad compared to the 65 and 75? God...

Hate not being able to simply know "hey, this is the best TV you can buy". ARGH
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,683
I'm kinda in the same boat. Can't decide between the LG C8 and Samsung Q9FN.
Can't decide mainly because the C8 is an OLED; I'm not made of money, so I'd like to keep whichever TV I buy for at least 8 years, so burn-in is a big concern. And the Q9FN has really bad shadow detail. On a night sky scene, the C8 pops the stars way more than the Q9FN. Plus the Q9FN 55", some reviewers are saying it's bad compared to the 65 and 75? God...

Hate not being able to simply know "hey, this is the best TV you can buy". ARGH

I still think that OLED is the best choice if you play in a darker room or mainly use SDR content.

If you play lots of HDR content or view in a lit room, then a premium LCD is the way to go.


Both will give you amazing pictures when set up properly.
 
Oct 25, 2017
3,818
I'm kinda in the same boat. Can't decide between the LG C8 and Samsung Q9FN.
Can't decide mainly because the C8 is an OLED; I'm not made of money, so I'd like to keep whichever TV I buy for at least 8 years, so burn-in is a big concern. And the Q9FN has really bad shadow detail. On a night sky scene, the C8 pops the stars way more than the Q9FN. Plus the Q9FN 55", some reviewers are saying it's bad compared to the 65 and 75? God...

Hate not being able to simply know "hey, this is the best TV you can buy". ARGH

Same boat as you all. Just wish the best combo of OLED and QLED would just exist.

Currently waiting for more Vizio P Series Quantum reviews but with this much competition, it may be best to just let price be the deciding factor this Fall.

I'm just getting into the world of high end TVs but at a certain point, I'm just chasing diminishing returns, no?

Concerning QLED versus OLED, my TV will be a 65" and it'll be in my bedroom, meaning the curtains will always be drawn when viewing it. I also sit about 10 feet back from it.

With this information, which of the two would be a better buy for me?
 

Doorakz

Banned
Oct 25, 2017
617
There is an open box B7 65" tv at my local Best Buy store for $1,700. Still comes with 1 year manufacturers warranty and I can buy the extended warranty at a discount. Is this a good deal or should I wait until Black Friday sales?

I've been holding out for good deals on LG OLED 65" TVs.

And how do B7s compare against other LG OLEDs?
 

Yukstin

Member
Oct 31, 2017
223
Nashville, TN
There is an open box B7 65" tv at my local Best Buy store for $1,700. Still comes with 1 year manufacturers warranty and I can buy the extended warranty at a discount. Is this a good deal or should I wait until Black Friday sales?

I've been holding out for good deals on LG OLED 65" TVs.

And how do B7s compare against other LG OLEDs?

At this point I would wait for deals around black Friday.
 

severianb

Banned
Nov 9, 2017
957
One of the downsides of the current Sony offerings is the X1 Extreme chip, so excitingly, you can pretty well guarantee the next sets will have the X1 Ultimate chip, with whatever upgrades that brings.

I'd put money on the 2019 Sony, Samsung and LG sets (and probably more) being certified HDMI 2.1, which will bring super-low latency to everything, including Sony TVs. Some people keep saying that can't happen because chipsets and cables are not available, but those people are wrong. The HDMI organization has already confirmed the Xbox One X will be the first certified HDMI 2.1 device and that HDMI 2.1 certification can happen without the high-bandwidth features.

All confirmed in this video for anyone still in the dark:

https://youtu.be/e0XMQMKk_II

Personally, I don't give a shit about the high-bandwidth stuff. 4K/60fps or 1440p/120fps is PLENTY for me and the current HDMI cables, X1X (which I have) and chipsets can do those easily.

Bring on the HDMI 2.1 sets. Black Friday 2019 or 2020 I finally make the move to 4K HDR with VRR, 120 Hz and all the other little goodies. Can't wait.
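For the bandwidth side of that, a rough back-of-the-envelope (blanking overhead approximated at ~19%, which is roughly the standard CTA 4K/60 timing; real timings vary):

def tmds_gbps(width, height, hz, bits_per_component=8, blanking=1.19):
    # HDMI 2.0 tops out at a 600 MHz pixel clock, i.e. 18 Gbit/s on the wire
    # after 8b/10b coding (3 channels x 10 bits per 8-bit component).
    pixel_clock = width * height * hz * blanking
    return pixel_clock * 3 * (bits_per_component / 8) * 10 / 1e9

modes = {"4K60, 8-bit RGB": (3840, 2160, 60, 8),
         "4K60, 10-bit RGB (HDR)": (3840, 2160, 60, 10),
         "1440p120, 8-bit RGB": (2560, 1440, 120, 8)}
for name, m in modes.items():
    print(f"{name}: ~{tmds_gbps(*m):.1f} Gbit/s vs the 18 Gbit/s HDMI 2.0 limit")
# ~17.8, ~22.2 and ~15.8 Gbit/s respectively: 4K60 8-bit and 1440p120 fit,
# 4K60 10-bit RGB doesn't, which is why HDR at 4K60 falls back to 4:2:2/4:2:0
# chroma on HDMI 2.0 gear.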
 

Deleted member 4346

User requested account closure
Banned
Oct 25, 2017
8,976
What do you say, guys? If I go for top LED TV, maybe not especially Philips, maybe a good Sony one, will it be a good upgrade?

No.

I'm slowly going away from OLED idea because I really don't want to think about possible burn in on 3000 euro TV. Just possibility of this horror scares me away from it.

Bro you are coming from plasma. OLED is far more resistant to burn-in than plasma.
 

Deleted member 16452

User requested account closure
Banned
Oct 27, 2017
7,276
Damn this thread, I fell into this state of mind where I'm thinking about what to get, and if I should do it at all. It's just full circle now - "Well, my 55'' plasma is still great, maybe I should just keep it and not bother... Yeah, but 65'' is looking SO good, immersion will be much better.. Well, what if IQ will be disappointment?.. Well, OLED seems like best option.. Oh but those burn in talks are really scary... Maybe I just should buy best available LED? Philips 8503 is like 2400€ it must be good... But if not, do I really need it?.. Well, my 55'' plasma is still great, maybe I should just keep it and not bother... Yeah, but 65''..."

What do you say, guys? If I go for top LED TV, maybe not especially Philips, maybe a good Sony one, will it be a good upgrade? I'm slowly going away from OLED idea because I really don't want to think about possible burn in on 3000 euro TV. Just possibility of this horror scares me away from it.

You already had a TV (your plasma) with risk of burn in and I assume you did fine with it.

But if you're dead set on getting an LCD, avoid that Philips. An LCD's biggest strength is how bright it can get, and in that department the Philips is severely lacking for the price.




https://www.rtings.com/tv/reviews/best/by-type/led-lcd

Should be a good starting point in determining the right TV for you. I would research the OLEDs more if I were you, it would be a shame to drop so much money on a top end LCD, to then end up disappointed.
 
Last edited:

Hawk269

Member
Oct 26, 2017
6,043
I'm looking at a 75", mainly either the

Sony X90F
+ low input lag (21ms) on 4k
+ good motion handling for football (soccer)
- 40ms input lag for non-4k (Switch, Snes mini etc)
- Android is slow (if it's the same as my 2015 55X90C)
or

Samsung Q9FN
+ low input lag
+ Tizen apparently is good stuff
- much more expensive than the Sony
- Not so good on motion as the Sony? Someone on Youtube mentioned motion handling on Samsungs has a distinct look/feel, whatever that means.

Nevertheless, I'm holding out for black friday. Until then I have probably gone through some other sets in my head as well.

I have used both of these in my home at the 75" size and the Q9FN by far was superior. Both sets are good, but the Q9FN really was a lot more impressive.
 

DJ Lushious

Enhanced Xperience
Member
Oct 27, 2017
3,330
Currently waiting for more Vizio P Series Quantum reviews but with this much competition, it may be best to just let price be the deciding factor this Fall.
Yeah, I'm anxiously awaiting RTings' review. It won't be for at least a month, though.

Reviewed.com posted a review of the PQ last week, but there are some numbers I still want to see, like input lag. RTings is so thorough that they've set the gold standard, really.
 

maks

Member
Oct 27, 2017
418
I'd put money on the 2019 Sony, Samsung and LG sets (and probably more) being certified HDMI 2.1, which will bring super-low latency to everything, including Sony TVs. Some people keep saying that can't happen because chipsets and cables are not available, but those people are wrong. The HDMI organization has already confirmed the Xbox One X will be the first certified HDMI 2.1 device and that HDMI 2.1 certification can happen without the high-bandwidth features.

All confirmed in this video for anyone still in the dark:

https://youtu.be/e0XMQMKk_II

Personally, I don't give a shit about the high-bandwidth stuff. 4K/60fps or 1440p/120fps is PLENTY for me and the current HDMI cables, X1X (which I have) and chipsets can do those easily.

Bring on the HDMI 2.1 sets. Black Friday 2019 or 2020 I finally make the move to 4K HDR with VRR, 120 Hz and all the other little goodies. Can't wait.


I'm looking forward to HDR and higher bandwidth audio coming through. I'll be waiting to upgrade my main gaming TV and receiver until then... likely the PS5 and next Xbox will be here for that.
 

Deleted member 35478

User-requested account closure
Banned
Dec 6, 2017
1,788
I need to figure out how to output Dolby Digital from my gaming PC. Does Nvidia have an audio driver that outputs Dolby vs PCM?
 

Chamber

Member
Oct 25, 2017
5,279
FYI, "some people" saying no HDMI 2.1 in 2019 are people and retailers in the know with contacts at these manufacturers. I think there's enough of them out there saying it that you can consider it reliable.
 
Last edited:

Rogue74

Member
Nov 13, 2017
1,756
Miami, FL
You guys are tempting me on getting an Apple TV 4K for my LG C8.

While I do think UHD blu ray has some benefits and I may opt for physical media for some of my favorites, I have to admit that when your internet connection is fast enough and the encoding is solid, 4K streaming looks phenomenal. Some Netflix shows look very noisy in dark scenes, which is disappointing. But yesterday I rented The Dark Tower from Amazon Prime in 4K HDR and it really looked amazing. Movie sucked but the image quality was really impressive.

iTunes sales and rental pricing in general seems to be something I want to take advantage of.

The other reason is that the built in WebOS apps seem to have issues that I can't make sense of. When watching Netflix or Amazon sometimes I get a mess of digital noise that looks like macroblocking for a split second. It happens on just a portion of the image, like around a person's face. The location of the artifact varies. Sometimes it is in the center of the screen, other times in the corners. It lasts for a second or two and then the image goes back to normal. If I rewind and immediately play the scene again it plays flawlessly. I know this sounds like a problem with my internet connection, but my connection is fast and pretty stable. Also, when the internet speed dips, what usually happens in Netflix is that they dynamically lower the resolution. If it gets bad enough it might go from 2160p to 720p or lower very quickly, which will make the whole picture suddenly look low res. It has never resulted in an artifact on one portion of the image while the rest of it stays super sharp. I haven't seen this problem while watching 4K blu rays on the LG UBK90 or when playing games on the PS4 Pro. Only when streaming from the internal apps. It is very frustrating, hard to describe without having someone see it, and very hard to recreate. It happens randomly.

So yeah, all of the above makes me want to try an Apple TV out.


So I caved in and bought an Apple TV 4K. Pretty slick device.

I watched 2 episodes of Bloodline on Netflix with the wife. Then when she went to bed I rented my first movie on iTunes, The Equalizer, which was in Dolby Vision. Then I watched 2 episodes of Narcos on Netflix. Not once did I notice the noise/artifact I described above. I really think there is something odd going on with the built in apps on my LG C8. It might just be mine, or it happens so quickly and infrequently that others may not have noticed it - or chalked it up to network problems. I will keep testing, but if I had watched a similar amount of content with the WebOS apps I would have seen the issue occur at least twice based on how it has been behaving the last week or so.

The image quality on the ATV4K is pretty damn good. The Equalizer in Dolby Vision looked great most of the time. Detailed with some nice highlights. There was one scene near the end where Denzel's face is shrouded in darkness, and portions of it looked like they just had this black blob over them. Now that did look like either there wasn't enough bandwidth for that particular dark scene or my OLED needs some fine tuning with settings. Other dark scenes looked good though. I don't know if this is the black handling issue with OLEDs mentioned above or what. I've had mine for less than a month so I still have a lot to learn about what settings work best. I was in the Cinema Home mode in Dolby Vision.

Like I said I will keep testing. I browsed iTunes for a while and saw some pretty good deals. For example, Unforgiven in 4K UHD Dolby Vision for like $6. That's pretty damn good. Might make that my first purchase and give it a watch. I see this becoming an issue with me like Steam sales.

So, first impressions are positive. I think I'm going to enjoy using this instead of the built in apps.