
ss_lemonade

Member
Oct 27, 2017
6,659
what's the best way to see if blacks are crushing?

I'm curious too. I put in the Rtings settings and then watched all kinds of footage. With black level set to darker I just increased brightness to see if there's detail in the blacks and indeed there was, quite a lot actually. So I set the black level in the brighter mode and then set my brightness lower until blacks were deep but not crushing any detail...

I like to use the Lagom test images, specifically the black level, white saturation and contrast pages:

http://www.lagom.nl/lcd-test/contrast.php

http://www.lagom.nl/lcd-test/black.php

http://www.lagom.nl/lcd-test/white.php
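
If you'd rather roll your own pattern than trust a website, here's a minimal sketch that generates a near-black bars image (Python with Pillow; the exact levels are my choice, but the Lagom page works on the same idea):

```python
# Minimal near-black test pattern, same idea as the Lagom black-level page.
# Requires Pillow: pip install Pillow
from PIL import Image, ImageDraw

WIDTH, HEIGHT, BARS = 1920, 1080, 16

img = Image.new("RGB", (WIDTH, HEIGHT))  # defaults to black
draw = ImageDraw.Draw(img)
bar_w = WIDTH // BARS

for i in range(BARS):
    level = i * 2  # bars at levels 0, 2, 4, ..., 30
    draw.rectangle(
        [i * bar_w, 0, (i + 1) * bar_w - 1, HEIGHT - 1],
        fill=(level, level, level),
    )

img.save("black_level_test.png")
# View it full-screen in a dark room: every bar except the leftmost should
# be faintly distinguishable from its neighbor. If the first several bars
# merge into one solid black block, the display is crushing blacks.
```

Make sure to display it over the same chain (console/PC into the same TV input) you actually game on, since range settings upstream can change the result.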
 

Deleted member 4346

User requested account closure
Banned
Oct 25, 2017
8,976
Creeping closer and closer to CES. Any rumors of what we might see there in terms of new TV tech or models?

Not yet. The big story is going to be HDMI 2.1 support and, for us, variable refresh rate. It's also a major refresh year for LG's OLED panels. So the show is going to be, IMO, as anticipated as any in recent memory.
 

Deleted member 4346

User requested account closure
Banned
Oct 25, 2017
8,976
With the serious delay in certification for 2.1, do you really think we'll see much, if any, support in 2018? I'm highly doubtful.

Well, I hope we do. The Xbox One X supports adaptive sync, so that increases the chances that we'll see the first TVs support the standard as well. This is a situation where the market would benefit from the open standard winning out, IMO.

It's too bad the X1X doesn't have DisplayPort. VRR has been a part of the DP standard for a while now :)
 

RedlineRonin

Member
Oct 30, 2017
2,620
Minneapolis
Well, I hope we do. The Xbox One X supports adaptive sync, so that increases the chances that we'll see the first TVs support the standard as well. This is a situation where the market would benefit from the open standard winning out, IMO.

It's too bad the X1X doesn't have DisplayPort. VRR has been a part of the DP standard for a while now :)

Oh, I'm with you, it would certainly be great. It just isn't sounding very realistic...

The HDMI 2.1 Compliance Test Specification is still in development and will be published in stages over the first three quarters of 2018. Given that today's release does mark a delay from the originally planned Q2 2017 window, it may be until mid-2019 before consumers are able to take advantage of what HDMI 2.1 has to offer – if manufacturers provide them.

From Anandtech
 

TitanicFall

Member
Nov 12, 2017
8,274
Just got a Sony 55X930E at a great price for my office. I have a 65KS8000 in my living room. Set everything up last night. Booted up Horizon: Zero Dawn to test the HDR. This game looked pretty good on my Samsung, but the HDR performance on this TV is on another level. Despite it being 10" smaller than the Samsung, I think I'm going to be doing most of my gaming on this TV now. I haven't even tried 1080p 120fps on my PC yet. Looking forward to that tonight. Now I feel like I have to upgrade from a 2.1 to a 4.1 sound system.
 

Tratorn

Member
Oct 25, 2017
709
Should work with no problem; all I can say is try different cables.

That happened to me with a cheap HDMI switch that I got. I connected directly to the TV and had no issues. I recently connected to my new receiver and also had no issues. So it's either your cable or something in between.

Thanks, it was indeed the cable. Now it's working great.
Well, functionally at least, but I'm not really happy with the picture yet. I used the Rtings settings, but they're crushing my blacks and look unnatural in some games/movies/series.

Are there any good starting points, or at least some options that should definitely be turned on/off? I'm pretty overwhelmed by all the options and don't really know much about this stuff.
 

Adobe

Member
Oct 27, 2017
378
I want a TV with like 8 HDMI ports.

4 is not enough. lol
Why not buy a receiver? Much easier imo
———

I'm having problems with the Limited/Full HDMI range settings on my X930E: when I set it to Full the picture becomes washed out, and Limited seems to crush the blacks. It's set to Automatic on the PS4.
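
The underlying cause is a black/white-point mismatch between the source and the display. A minimal sketch of the level math (assumed 8-bit values; the function names are mine):

```python
# 8-bit "Limited" video range puts reference black at 16 and white at 235;
# "Full" range uses 0-255. If source and display disagree, levels shift.

def limited_to_full(y: int) -> int:
    """Expand limited-range video (16-235) to full range (0-255)."""
    v = round((y - 16) * 255 / 219)
    return max(0, min(255, v))  # clamp below-black / above-white values

def full_to_limited(v: int) -> int:
    """Compress full-range (0-255) into limited-range (16-235)."""
    return round(v * 219 / 255) + 16

# Source sends Limited but the display assumes Full: black arrives as
# level 16 and is shown as dark grey -> the washed-out look.
# Source sends Full but the display assumes Limited: everything at or
# below level 16 is treated as blacker-than-black and clipped -> crush.
print(limited_to_full(16), limited_to_full(235))  # 0 255
print(full_to_limited(0), full_to_limited(255))   # 16 235
```

The practical fix is just making both ends agree, which is what the PS4's Automatic setting tries to negotiate; the TV's HDMI black level setting has to match what the source is actually sending.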
 
Last edited:

1-D_FE

Member
Oct 27, 2017
8,260
Oh, I'm with you, it would certainly be great. It just isn't sounding very realistic...



From Anandtech

I guess the dream isn't totally dead because that article is a bit contradictory. The last part, the awful part, completely contradicts the earlier part of the article:

The HDMI Forum expects UHS cables to be available in the first half of 2018, noting that these cables may only ship after complying with the as-yet unreleased HDMI 2.1 Compliance Test Specification. These cables will be necessary to enable the fullest HDMI 2.1 functionality – particularly any feature that requires more bandwidth – such as 4K120 and 8K60 display modes.

So going by that, the official rhetoric from the HDMI Forum is that they're still expecting products to ship in the first half (even products that can't ship until they've passed certification tests). Hopefully the latter part is just misguided speculation on Anandtech's part.
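
For what it's worth, back-of-envelope bandwidth numbers show why the new cables gate 4K120 specifically. This is my rough math, not from the article, assuming the standard CTA 4400×2250 total timing for 4K and the nominal effective rate of each link:

```python
# Rough HDMI bandwidth math. Total timings (including blanking) use the
# standard CTA values for 4K: 4400 x 2250 pixels per frame.

def link_rate_gbps(h_total: int, v_total: int, hz: int, bpp: int) -> float:
    """Raw video bit rate in Gbit/s for a given total timing."""
    return h_total * v_total * hz * bpp / 1e9

# 4K60, 8-bit RGB (24 bpp): ~14.3 Gbps, which just fits HDMI 2.0's
# ~14.4 Gbps effective data rate (18 Gbps raw minus 8b/10b overhead).
print(link_rate_gbps(4400, 2250, 60, 24))   # 14.256

# 4K120, 10-bit RGB (30 bpp): ~35.6 Gbps, far beyond HDMI 2.0 but within
# HDMI 2.1's ~42.7 Gbps effective rate (48 Gbps raw, 16b/18b encoding).
print(link_rate_gbps(4400, 2250, 120, 30))  # 35.64
```

So even a set with the right panel can't accept 4K120 without both a higher-bandwidth input chipset and a cable that actually carries the full rate.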
 

LiK

Member
Oct 25, 2017
32,099
So Edge Enhancement should be ON for HDR because it's off on that setting. So weird.
 

RedlineRonin

Member
Oct 30, 2017
2,620
Minneapolis
I guess the dream isn't totally dead because that article is a bit contradictory. The last part, the awful part, completely contradicts the earlier part of the article:

The HDMI Forum expects UHS cables to be available in the first half of 2018, noting that these cables may only ship after complying with the as-yet unreleased HDMI 2.1 Compliance Test Specification. These cables will be necessary to enable the fullest HDMI 2.1 functionality – particularly any feature that requires more bandwidth – such as 4K120 and 8K60 display modes.

So going by that, the official rhetoric from the HDMI Forum is that they're still expecting products to ship in the first half (even products that can't ship until they've passed certification tests). Hopefully the latter part is just misguided speculation on Anandtech's part.

I guess, but the lead time on a cable is way different than on an entire TV line. Those decisions would have been made months ago, with CES coming in January. And the approach would be very odd from a manufacturer's standpoint. Do you change your marketing message all year, as in "oh, now it has VRR because the Q2 test spec is available," and then eARC is available the quarter after that? Sets will hit retail in March/April, so it'd be tough to market any of the 2.1 features if none are really available at that time.

Beyond that, VRR is kind of the linchpin, because eARC isn't going to be earth-shattering for 99% of users, and there certainly won't be 8K panels or 4K120 source devices next year (and probably not the year after? I guess we'll see what consumer Volta has, but not likely).

Cable companies can certify whenever, update packaging etc., but it gets tricky on TVs where you'll theoretically have the same models for a full 12 months. Of course firmware updates can do wonders, and we've already seen a bit of that this year with DV support added and eARC on X7 LGs. It just seems like the marketing and product line stuff could get confusing throughout the year.

Also, idk how much stock to put in the cable stuff either. There are already cables out there claiming to be 2.1 certified (or if not certified, at least capable of 48 Gbps) based on a quick look on Amazon.

Again, at the end of the day, I hope that you're both right, and I hope they can literally firmware-upgrade all the features in throughout the year as they pass spec compliance tests (assuming all the hardware is in from the get-go for stuff like VRR). It just seems like a long shot and maybe not worth getting hopes up for?
 

1-D_FE

Member
Oct 27, 2017
8,260
I guess, but the lead time on a cable is way different than on an entire TV line. Those decisions would have been made months ago, with CES coming in January. And the approach would be very odd from a manufacturer's standpoint. Do you change your marketing message all year, as in "oh, now it has VRR because the Q2 test spec is available," and then eARC is available the quarter after that? Sets will hit retail in March/April, so it'd be tough to market any of the 2.1 features if none are really available at that time.

Beyond that, VRR is kind of the linchpin, because eARC isn't going to be earth-shattering for 99% of users, and there certainly won't be 8K panels or 4K120 source devices next year (and probably not the year after? I guess we'll see what consumer Volta has, but not likely).

Cable companies can certify whenever, update packaging etc., but it gets tricky on TVs where you'll theoretically have the same models for a full 12 months. Of course firmware updates can do wonders, and we've already seen a bit of that this year with DV support added and eARC on X7 LGs. It just seems like the marketing and product line stuff could get confusing throughout the year.

Also, idk how much stock to put in the cable stuff either. There are already cables out there claiming to be 2.1 certified (or if not certified, at least capable of 48 Gbps) based on a quick look on Amazon.

Again, at the end of the day, I hope that you're both right, and I hope they can literally firmware-upgrade all the features in throughout the year as they pass spec compliance tests (assuming all the hardware is in from the get-go for stuff like VRR). It just seems like a long shot and maybe not worth getting hopes up for?

I have no idea. I'll just say this: the marketing on cables is, agreed, meaningless. What I find interesting is that the actual consortium (which does care about those things) is stating that there will be "official" cables in the first half (which requires the testing certification, and contradicts the Q3 stuff Anandtech talks about at the end).

As for displays, LG has been showing high-refresh OLED prototypes for a while now. So they were clearly on board even before HDMI 2.1 started focusing on it. And it's not like their current sets aren't already 4K120 panels. They just don't have an input chipset with high enough bandwidth to accept that signal... so they apply interpolation to 60 Hz content instead.

I guess the real question comes down to this: Is the high-bandwidth HDMI chipset done? Does it need certification, or is it simply a silicon design that's already finished and tested out? I would assume it doesn't need additional testing, because that type of testing would need to be done before you actually committed to the hardware design of the chipset.

As for output devices, assuming things haven't been too delayed, I would absolutely expect new enthusiast GPUs in 2018 to have HDMI 2.1 output. Maybe I'm misremembering, but I always thought graphics cards were among the first devices to support new HDMI standards.

So basically what I'm saying is I'm guardedly optimistic (and likely to be disappointed by CES).
 

RedlineRonin

Member
Oct 30, 2017
2,620
Minneapolis
I have no idea. I'll just say this: the marketing on cables is, agreed, meaningless. What I find interesting is that the actual consortium (which does care about those things) is stating that there will be "official" cables in the first half (which requires the testing certification, and contradicts the Q3 stuff Anandtech talks about at the end).

As for displays, LG has been showing high-refresh OLED prototypes for a while now. So they were clearly on board even before HDMI 2.1 started focusing on it. And it's not like their current sets aren't already 4K120 panels. They just don't have an input chipset with high enough bandwidth to accept that signal... so they apply interpolation to 60 Hz content instead.

I guess the real question comes down to this: Is the high-bandwidth HDMI chipset done? Does it need certification, or is it simply a silicon design that's already finished and tested out? I would assume it doesn't need additional testing, because that type of testing would need to be done before you actually committed to the hardware design of the chipset.

As for output devices, assuming things haven't been too delayed, I would absolutely expect new enthusiast GPUs in 2018 to have HDMI 2.1 output. Maybe I'm misremembering, but I always thought graphics cards were among the first devices to support new HDMI standards.

So basically what I'm saying is I'm guardedly optimistic (and likely to be disappointed by CES).

Totally agree on the input limitation. And I think there are actually better odds there. The connector is no different from what's on 2.0b, so it really is just the cable that needs higher bandwidth, plus whatever hardware is required on the TV side for VRR (and eventually 8K, etc.).

And concerning source devices, I think you're right, the GPUs will likely have the hardware; I was more just referring to the actual horsepower, or TFLOPs, we're seeing on cards. Everyone shy of Titan Xp owners is lucky to get 60fps at 4K in most new games, so native 4K120 seems a ways off, and I'd be surprised if Volta was a full doubling of Pascal's power.

You could argue for further optimism too, at least in LG's case, based on the updates we received this year for both the 6- and 7-series sets. If the hardware is in when it launches, it's not out of the question.

January will certainly be interesting!
 

LiK

Member
Oct 25, 2017
32,099
LMAO, found some guy's video about calibrating for "best picture" and the first thing he recommended was Vivid mode. Bailed immediately.
 

Kudo

Member
Oct 25, 2017
3,884
LMAO, found some guy's video about calibrating for "best picture" and the first thing he recommended was Vivid mode. Bailed immediately.

 

Wagram

Banned
Nov 8, 2017
2,443
Just want to report back in. I've had my OLED for a while now. I've been abusing it while gaming: 4-8+ hour sessions. No image retention at all. OLED light ranging from 35 to 100 depending on the game and HDR.

Only real complaint I have is ABL. I wish I could turn it off. It's not atrociously bad or view-breaking, but it's still annoying.
 
Last edited:
Oct 27, 2017
951
So I went to Best Buy to look at televisions today. My mother was in the market for a new television and was torn between the LG E7, Sony A1E, Sony 930E, and Samsung Q7F. The OLEDs looked amazing, and my mom was very heavily considering the Sony A1E (it looked AMAZING). Unfortunately, my mother's the kind of person who leaves the TV on something like CNN and walks out of the room. Sooner or later, I felt we'd get permanent burn-in. When I mentioned the 930E, the guy working there mentioned they had an open box of that.

After looking at the open boxes, they had a 55" 930E for $1400, a 65" Q7F for $2000, a 75" 900E for $2000, and a 65" Q9F for $2700. The $2700 price tag caught my eye because the sticker price for the Q9F was $3500. This was an $800 discount. What gives?! Well, after looking closely at the price tag and speaking with the attendant, apparently the discounted open-box price was based on a $3200 sticker price for the TV, not $3500. Looking at the open-box sticker on the Q9F, it said the same thing. If we didn't buy the TV right there and then, they were going to have to raise the open-box price of the television to $3200.

On one hand, I could have been getting railroaded into a sale. On the other hand, the price at all online stores for a Q9F is definitely $3500 at the moment. Soon after, another couple showed up and they began their pitch to them to pick up the television. I'm certain part of it was to put pressure on us for a sale. Well, considering this was $800 off an open-box return, and after reviewing the Q9F in person, we decided to go for it and picked up the television, as it was literally only $200 more expensive than a 65" Q7F.

We've set the delivery of the television for two weeks from now because I wanted to report back here and ask: how good of a deal is $2700 for a 65" Q9F? Is there enough of a difference to justify skipping the Sony A1E and LG E7 because of burn-in concerns? Or is this TV a bit of overkill and we should downgrade? We have until January 14th to cancel the sale.
 

RedlineRonin

Member
Oct 30, 2017
2,620
Minneapolis
Just want to report back in. I've had my OLED for a while now. I've been abusing it while gaming: 4-8+ hour sessions. No image retention at all. OLED light ranging from 35 to 100 depending on the game and HDR.

Only real complaint I have is ABL. I wish I could turn it off. It's not atrociously bad or view-breaking, but it's still annoying.

Yep. I've been hammering my C7 since June, even with stuff that had issues before: I burned Destiny 1 into my plasma, and I've been playing Destiny 2 about as much with no IR at all.
 

McCHitman

Member
Nov 22, 2017
57
Pulled the trigger on the LG 65" C7. It should be here Monday. Thank you for the help.

I don't have the fancy calibration tools that are mentioned here. Where should I start? What should I do first?

I do have a Spears & Munsil Blu-ray for calibration. Not sure if it covers 4K HDR though.

Reading through this thread makes me nervous and excited at the same time. I just want to enjoy some fancy gaming lol.
 

Arcadia

Banned
Oct 30, 2017
63
England
I have no idea. I'll just say this: the marketing on cables is, agreed, meaningless. What I find interesting is that the actual consortium (which does care about those things) is stating that there will be "official" cables in the first half (which requires the testing certification, and contradicts the Q3 stuff Anandtech talks about at the end).

As for displays, LG has been showing high-refresh OLED prototypes for a while now. So they were clearly on board even before HDMI 2.1 started focusing on it. And it's not like their current sets aren't already 4K120 panels. They just don't have an input chipset with high enough bandwidth to accept that signal... so they apply interpolation to 60 Hz content instead.

I guess the real question comes down to this: Is the high-bandwidth HDMI chipset done? Does it need certification, or is it simply a silicon design that's already finished and tested out? I would assume it doesn't need additional testing, because that type of testing would need to be done before you actually committed to the hardware design of the chipset.

As for output devices, assuming things haven't been too delayed, I would absolutely expect new enthusiast GPUs in 2018 to have HDMI 2.1 output. Maybe I'm misremembering, but I always thought graphics cards were among the first devices to support new HDMI standards.

So basically what I'm saying is I'm guardedly optimistic (and likely to be disappointed by CES).

I think people are reading way too much into HDMI 2.1 + VRR.

I highly doubt VRR will have good support on consoles for many years, let alone next year.

So what does HDMI 2.1 really bring for console gamers (or general AV use) in the next few years? Basically nothing.

Also, the HDMI 2.1 spec has only just been finalised, so there will likely be zero 2.1 TVs next year. But as I said, none of this is worth worrying about for years.
 

Madness

Member
Oct 25, 2017
791
You shouldn't be using Rtings settings, to be honest. They are fine-tuned for their particular set, in their unique viewing conditions. Messing with 20-point settings will likely cause more harm than good if you aren't calibrating for the room you're viewing in, and that's likely why you're having issues with crushed blacks.

True, but Rtings also allows for certain adjustments, such as "change this if you find the picture too yellow" or "increase brightness if the room isn't pitch black." In the old days, when AVSForum would often have people directly sharing full ISF calibration data and settings, some people who had the equipment found that the settings posted from a calibrated display, even under other viewing conditions, were a better option than Standard, Dynamic, and even Cinema.

I find that LG, Sony and Samsung are pretty good with their Cinema, Custom and Movie modes out of the box now; you can pick one of those and then tinker to your liking.
 

MrBob

Member
Oct 25, 2017
6,670
2018 is a transition year, so which HDMI 2.1 features get implemented will probably be all over the place depending on the vendor.

I'm also not a huge fan of the Rtings settings. Seems silly to adjust the ISF modes.
 
Last edited:

1-D_FE

Member
Oct 27, 2017
8,260
I think people are reading way too much into HDMI 2.1 + VRR.

I highly doubt VRR will have good support on consoles for many years, let alone next year.

So what does HDMI 2.1 really bring for console gamers (or general AV use) in the next few years? Basically nothing.

Also the HDMI 2.1 spec has just been finalised, so likely there will be zero 2.1 TVs next year, but as I said, none of this is worth worrying about for years.

I've gotten used to playing my games in VR and on a high-refresh monitor (neither of which has sample-and-hold blur). Obviously Nintendo is its own category, but the vast majority of my gaming is PC gaming. The thing I want is OLED + 120 Hz + 4K + HDR. It's entirely reasonable that this may be possible next year on PC. That's really the prism I view everything through.

Yeah, I get that AAA titles aren't doing 4K + 120 Hz anytime soon, but I almost never play AAA titles anymore. So my not being able to drive them at that framerate isn't an issue for me anyway.

2018 is a transition year, so which HDMI 2.1 features get implemented will probably be all over the place depending on the vendor.

This is also true. And I'll be completely honest: if the LG OLEDs only had a single HDMI 2.1 port, I'd be perfectly fine with that. PCs are the only component I own that's going to utilize any of that for the foreseeable future anyway. And I remember past HDMI bumps where sets only had a single port with the latest version. I would be pretty ecstatic with that scenario again.
 
Last edited:

LiK

Member
Oct 25, 2017
32,099
Man, playing D2 in HDR is nice. More vibrant, and it just makes everything look cleaner and better for some reason. Even the ships look better.

OW, on the other hand, looks like ass. It's more colorful on OLED, but everything looks muddier because of the lower-res textures. At least I'm way better at the game because of the lower input lag.

2018 is a transition year, so which HDMI 2.1 features get implemented will probably be all over the place depending on the vendor.

I'm also not a huge fan of the Rtings settings. Seems silly to adjust the ISF modes.

Everyone recommends adjusting the ISF modes. You're basically just tweaking what's already pretty accurate. Even pro calibrators recommend it, so why not.
 

The Artisan

"Angels are singing in monasteries..."
Moderator
Oct 27, 2017
8,132
My LG C7 just arrived. Besides basic calibration stuff, is there anything else I should do when I first power it on? Do I need to "break it in" to ensure the best uniformity or to avoid image retention issues/burn-in down the line?
Do most scenes seem dark to you? I don't know how to change that and I have the E7
Look in the video output information in the PS4 settings; it should say HDR - 2K/4K supported, and HDCP 2.2.

When something starts in HDR, a symbol pops up on the TV.
Yeah, I think that's what it says, although I guess I haven't watched anything in true 4K yet.
 

J-Skee

The Wise Ones
Member
Oct 25, 2017
11,109
My Vizio E-Series came in the mail yesterday and... it was a mistake. Is the P-Series much better, or should I dump Vizio altogether and go for something else?
 

Deadceptor

Member
Oct 26, 2017
537
How much did it cost, who did it and how did you find this calibrator?

Sorry for the late reply! There's this big electronics store in Finland that offers a calibration service for TVs they sell. It would have cost about 120€, but I bargained it down to a freebie. And I would definitely recommend a calibration for the Sony X900E, because the greens and blues tend to be way off.

what's the best way to see if blacks are crushing?

I'm curious too. I put in the Rtings settings and then watched all kinds of footage. With black level set to darker I just increased brightness to see if there's detail in the blacks and indeed there was, quite a lot actually. So I set the black level in the brighter mode and then set my brightness lower until blacks were deep but not crushing any detail...

The Lagom test site is fast and easy, but you get much better results with the AVS HD 709 set. Download the files to your PC and stream them to the device whose picture settings you're about to tune (I use Plex, because the app is free and available on almost every device). The test "slides" are actually looping videos with flashing elements that are much easier to use and read than static test images.

But I still couldn't get the blacks right on my B7 no matter what slides I used. I went through countless settings and possible combinations, but there was always either clear black crush or washed-out blacks. According to rtings.com, the out-of-the-box accuracy isn't that great on the 2017 OLEDs, so maybe a professional calibration would fix that.
 

LiK

Member
Oct 25, 2017
32,099
The only game that looks awful on my B7 so far is Overwatch. The blacks look awful and the colors are oversaturated. Other games look fine, so I think the console version was just calibrated for 1080p sets or something, because it did look great on my Vizio.

I don't want to mess with the settings too much for this one game when other games look fine in Game mode. I tried the in-game Brightness and Contrast, and it looked worse the more I messed with it. I dunno. If you guys play OW on the new OLEDs and don't see an issue, please let me know your settings.
 

Kudo

Member
Oct 25, 2017
3,884
The only game that looks awful on my B7 so far is Overwatch. The blacks look awful and the colors are oversaturated. Other games look fine, so I think the console version was just calibrated for 1080p sets or something, because it did look great on my Vizio.

I don't want to mess with the settings too much for this one game when other games look fine in Game mode. I tried the in-game Brightness and Contrast, and it looked worse the more I messed with it. I dunno. If you guys play OW on the new OLEDs and don't see an issue, please let me know your settings.
I think the problem is that Overwatch doesn't really use many pure blacks, from what I remember, most likely resulting in a bit of a washed-out look with cartoony colours.
 

LiK

Member
Oct 25, 2017
32,099
I think the problem is that Overwatch doesn't really use many pure blacks, from what I remember, most likely resulting in a bit of a washed-out look with cartoony colours.

The blacks are too dark for me. It's particularly ugly on some maps, like King's Row. I would love HDR for this game.
 

zoukka

Game Developer
Verified
Oct 28, 2017
2,361
The Monster Hunter beta looked way too bright in HDR for me in game mode; there were no blacks to be seen anywhere. With the same settings, Uncharted looked mind-blowing.
 

LiK

Member
Oct 25, 2017
32,099
That's the problem with OLEDs: the screens are so nice that flaws get magnified.
 