
Yappa

Member
Oct 25, 2017
6,481
Hamburg/Germany
True. But honestly, "what looks good to you" should be the primary metric for a TV you spend your money on.
I can definitely understand when it comes to HDR brightness. Like Dune with proper DV Cinema picture mode on a G1 is still too damn dark. I had to use DV Cinema Home.
Wonder how that movie fares on the Samsung S95B btw, especially the sandworm attack, though obviously HDR10 is different.
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,681
I can definitely understand when it comes to HDR brightness. Like Dune with proper DV Cinema picture mode on a G1 is still too damn dark. I had to use DV Cinema Home.
Wonder how that movie fares on the Samsung S95B btw, especially the sandworm attack, though obviously HDR10 is different.
When I saw it in the theatre with a great projector, I was amazed by how bright the daytime scenes looked - the difference from many other scenes was super pronounced - I suspect that intentional choice carried over into a similar theatrical home grade.
 

BPHusker

Member
Oct 26, 2017
2,125
Nebraska
Based on the Nebraska name/profile, I'm assuming you're in the US? Best Buy should cover burn-in with their 5-year extended warranty, unless they changed that rather recently. I don't think there are many other options, though I think the newer EVO panels now come with 4-5 years of burn-in coverage direct from LG.

I don't know how feasible it truly is, but I've had people tell me in the past, on forums and IRL, that you can basically invent an issue with the TV in ~4 years to get a newer one. Essentially the warranty pays for itself, and then some. Whether or not they'll scrutinize these returns more 4 years from now, I couldn't tell you.
Is this at Best Buy or anywhere they have warranties?
 

Hawk269

Member
Oct 26, 2017
6,043
What's going on in there?
I post there, and it comes down to owners of "other" brands battling with people that have the Samsung. As a person that has the Samsung, it really is as good as owners are saying. It is a big leap in tech in terms of color volume, brightness, etc. The "war", as some put it, is just people infighting because of brand loyalty. Not too much different from this forum and the Xbox vs. PlayStation arguments.

I will copy and paste the 6-hour impressions that I posted over at AVS Forum a little later today.
 

Hawk269

Member
Oct 26, 2017
6,043
Haven't checked there, but if it's what I think, it's nerds upset that their display looks better when the picture is inaccurate. Most movies won't show a big difference when calibrated to display the picture properly. Like in The Batman there is an explosion that is 300 nits. You can make it look more vibrant on the QD-OLED, but it should look exactly the same because the colors are the same vs. WOLED up to 300 nits.
Like some in that AVS thread, you clearly don't know what you are talking about. The Samsung actually does track pretty damn close to the EOTF. There are some easy changes to make it even better. It is not the most accurate TV, but it is pretty close. The one fear that many had is that Samsung was being Samsung and going with very inaccurate out-of-the-box settings to make things look brighter, punchier, etc. This IS NOT THE CASE with the Samsung QD-OLED. They did a great job with the out-of-the-box settings and not too many adjustments are needed.

In game mode it is reported that it is not as accurate as in Movie or Filmmaker mode, but for games do we really care that it tracks exactly to the EOTF? It is close, but then many sets are not accurate.
 

big_z

Member
Nov 2, 2017
7,794
Kind of glad Vincent knocked some people on AVS down a peg. There's a group of individuals there that were getting that toxic elitist attitude, blinded by their own churned-up hype.
 

Jokerman

Member
May 16, 2020
6,936
True. But honestly, "what looks good to you" should be the primary metric for a TV you spend your money on.

Why would you pay that much money for a TV and not want an accurate picture, though? I've never understood people paying thousands for a set and then using it in its default mode, or even worse, vivid. True, you pay your money and are free to use it as you see fit, but you do wonder why people would be on the audio video science forum and defend to the death a TV that cannot be dialled in for accuracy.

You can understand why it kicked off. People have been told in 'reviews' - by people who think just plonking a meter in front of a TV is enough - that these TVs are blowing everything away when it comes to accurate colours, so when an actual expert tells them the truth, of course those who have committed don't want to admit they were wrong.

It is why people should always ignore these initial hype reviews on YouTube.
 

aevanhoe

Slayer of the Eternal Voidslurper
Member
Aug 28, 2018
7,326
Why would you pay that much money for a TV and not want an accurate picture, though?

Because expensive TVs, in addition to "accurate", also do "bright" or "contrasty" better.

For example, I would not use "vivid", but I also don't really care about accuracy that much. I do care about true blacks, bright highlights, punchy colors - so you can understand why I chose an OLED. I didn't choose it because it's a good way to get the closest to creator's intent, I chose it because my lasers and magic spells in games look like neon lights and everything is crispy. Now, in my case, I'm somewhere in the middle - I would not use "standard" or "vivid", but I will - on occasion - use Cinema Home instead of Cinema or use Dynamic Contrast or Dynamic Tone Mapping because, you know, explosions explode more and lightsabers lightsaber more - regardless of creator's intent. For some reason, that almost seems to offend certain people.

I mean, I could ask a similarly wrong question: why pay that much money for a TV and then watch a movie at a dim, dull setting because it's accurate?

The answer, of course, is: because you prefer accuracy to shiny, bright colors. And, vice versa, some people prefer shiny, bright colors to accuracy. Both goals can be better achieved with an expensive TV than with a cheap one. So there you go - that's why some people would pay that much money for a TV even if they don't care that much about an accurate picture.
 

Igorth

Member
Nov 13, 2017
1,309
Things like fire and red lights (common in movies) seem to be greatly improved vs. WOLED. That alone cements it for me; my LG B6 will be replaced with a QD-OLED (Samsung or Sony, or maybe one of the 2023 models).
 

Jokerman

Member
May 16, 2020
6,936
Because expensive TVs, in addition to "accurate", also do "bright" or "contrasty" better.

For example, I would not use "vivid", but I also don't really care about accuracy that much. I do care about true blacks, bright highlights, punchy colors - so you can understand why I chose an OLED. I didn't choose it because it's a good way to get the closest to creator's intent, I chose it because my lasers and magic spells in games look like neon lights and everything is crispy. Now, in my case, I'm somewhere in the middle - I would not use "standard" or "vivid", but I will - on occasion - use Cinema Home instead of Cinema or use Dynamic Contrast or Dynamic Tone Mapping because, you know, explosions explode more and lightsabers lightsaber more - regardless of creator's intent. For some reason, that almost seems to offend certain people.

I mean, I could ask a similarly wrong question: why pay that much money for a TV and then watch a movie at a dim, dull setting because it's accurate?

The answer, of course, is: because you prefer accuracy to shiny, bright colors. And, vice versa, some people prefer shiny, bright colors to accuracy. Both goals can be better achieved with an expensive TV than with a cheap one. So there you go - that's why some people would pay that much money for a TV even if they don't care that much about an accurate picture.

I understand why people buy OLEDs. What I don't understand is why someone would pay a premium for this TV, which is capable of unparalleled colour accuracy, and then not use that feature. It would be like buying an OLED and raising the brightness to lose the perfect blacks.
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,681
I understand why people buy OLEDs. What I don't understand is why someone would pay a premium for this TV, which is capable of unparalleled colour accuracy, and then not use that feature. It would be like buying an OLED and raising the brightness to lose the perfect blacks.
Black is usually clamped, and if anything many of the more contrasty looks that TVs produce via presets or dynamic tone mapping actually favour crushing blacks in order to produce pop. You can make an image feel lighter without moving the black point.
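To make that concrete, here's a toy sketch (illustrative only, not any TV's actual processing): a simple gamma tweak raises the midtones while the black and white points map to themselves.

```python
import numpy as np

def lift_midtones(signal, gamma=0.85):
    # Toy tone curve: gamma < 1.0 raises the midtones while 0.0 (black)
    # and 1.0 (white) map to themselves, so the black point never moves.
    return np.clip(signal, 0.0, 1.0) ** gamma

levels = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
print(lift_midtones(levels))  # black and white unchanged, midtones raised
```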
 

Jokerman

Member
May 16, 2020
6,936
Black is usually clamped, and if anything many of the more contrasty looks that TVs produce via presets or dynamic tone mapping actually favour crushing blacks in order to produce pop. You can make an image feel lighter without moving the black point.

That is true, but my point still stands. Unless, of course, these TVs are more accurate out of the box than a calibrated current-gen OLED.

That makes me wonder about this Samsung. If they are pumping out different measurements for 10% windows (which is a scummy move, there is no getting away from that), how are calibrators actually calibrating them at this point? Experts like VT can look at an image and, through decades of experience, know when it is accurate, but people relying on meters are going to be producing inaccurate calibrations.
 

Ziyi

Banned
Mar 24, 2022
75
That is true, but my point still stands. Unless, of course, these TVs are more accurate out of the box than a calibrated current-gen OLED.

That makes me wonder about this Samsung. If they are pumping out different measurements for 10% windows (which is a scummy move, there is no getting away from that), how are calibrators actually calibrating them at this point? Experts like VT can look at an image and, through decades of experience, know when it is accurate, but people relying on meters are going to be producing inaccurate calibrations.


This is somewhat true, but there are also times when a reference monitor is required to genuinely ascertain whether or not something looks entirely accurate. Without something to refer to (based on industry standards), we have only our brains to rely on, which are constantly sullied by our own subjective experiences, leading to perception distortion through the visual cortex.

VT has stated on multiple occasions that he uses a Sony reference monitor to assist with his analysis and assessment.
 

Jokerman

Member
May 16, 2020
6,936
This is somewhat true, but there are also times when a reference monitor is required to genuinely ascertain whether or not something looks entirely accurate. Without something to refer to (based on industry standards), we have only our brains to rely on, which are constantly sullied by our own subjective experiences, leading to perception distortion through the visual cortex.

VT has stated on multiple occasions that he uses a Sony reference monitor to assist with his analysis and assessment.
Yeah, I should have said that he will know when a picture is off, rather than accurate. I believe this is how he discovered what Samsung were doing a few days ago.
 

ShapeGSX

Member
Nov 13, 2017
5,213
That is true, but my point still stands. Unless, of course, these TVs are more accurate out of the box than a calibrated current-gen OLED.

That makes me wonder about this Samsung. If they are pumping out different measurements for 10% windows (which is a scummy move, there is no getting away from that), how are calibrators actually calibrating them at this point? Experts like VT can look at an image and, through decades of experience, know when it is accurate, but people relying on meters are going to be producing inaccurate calibrations.

Don't all OLED TVs adjust brightness depending on how large the area being lit brightly is, and then also ramp down the brightness over time as power and heat come into play? Regardless of what TV is being calibrated, that is going to provide inaccurate calibrations for certain situations. I'm not sure why this is "scummy". It's a limitation of the tech, and various TVs are going to handle it in different ways.
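The basic idea is easy to model, for what it's worth. A toy sketch of a power-budget ABL, with every number invented purely for illustration (no manufacturer's real limits):

```python
def peak_nits(window_pct, full_field_nits=200.0, per_pixel_cap=1500.0):
    # Toy ABL model: a fixed panel power budget is spread over the lit
    # area, so halving the window roughly doubles the nits available,
    # up to a per-pixel ceiling. All numbers are invented.
    return min(per_pixel_cap, full_field_nits / (window_pct / 100.0))

for pct in (100, 50, 25, 10, 2, 1):
    print(f"{pct:>3}% window -> {peak_nits(pct):.0f} nits")
```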
 

JiyuuTenshi

Member
Oct 28, 2017
836
Don't all OLED TVs adjust brightness depending on how large the area being lit brightly is, and then also ramp down the brightness over time as power and heat come into play? Regardless of what TV is being calibrated, that is going to provide inaccurate calibrations for certain situations. I'm not sure why this is "scummy". It's a limitation of the tech, and various TVs are going to handle it in different ways.
That's not what's happening here according to Vincent in this post: https://www.avsforum.com/threads/sa...-no-price-talk.3240720/page-103#post-61604628

It's ramping up its peak brightness to about 1500 nits in a 1% window for just a few seconds before returning to baseline to make it look like it's brighter than it really is in a real-world scenario.

And it shows the most accurate picture at a 10% window, knowing that it's the industry-standard measurement size, while oversaturating colors and boosting brightness at every other size to make the picture look punchier, even as close as a 9% window.

Seems pretty obvious to me that Samsung is trying to trick people here...
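If anyone wants to sanity-check this themselves, the comparison is easy to script. A sketch, where read_nits() is a hypothetical stand-in for whatever your meter software exposes (not a real API):

```python
def window_anomaly(read_nits, a_pct=10.0, b_pct=9.0, slack=1.10):
    # Under a pure power-budget ABL, shrinking the lit window from 10%
    # to 9% should raise luminance by roughly the area ratio at most
    # (10/9 ~= 1.11x). A much bigger jump suggests the set is treating
    # the 10% pattern specially. read_nits(pct) is hypothetical.
    expected = a_pct / b_pct
    measured = read_nits(b_pct) / read_nits(a_pct)
    if measured > expected * slack:
        print(f"anomaly: {b_pct:.0f}% window is {measured:.2f}x brighter "
              f"than {a_pct:.0f}%; area scaling explains ~{expected:.2f}x")
```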
 

ShapeGSX

Member
Nov 13, 2017
5,213
That's not what's happening here according to Vincent in this post: https://www.avsforum.com/threads/sa...-no-price-talk.3240720/page-103#post-61604628

It's ramping up its peak brightness to about 1500 nits in a 1% window for just a few seconds before returning to baseline to make it look like it's brighter than it really is in a real-world scenario.

And it shows the most accurate picture at a 10% window, knowing that it's the industry-standard measurement size, while oversaturating colors and boosting brightness at every other size to make the picture look punchier, even as close as a 9% window.

Seems pretty obvious to me that Samsung is trying to trick people here...

So you'd get maximum brightness for a firework or an explosion on screen, then. It just sounds like it is operating using different parameters than LG is.
 

Ziyi

Banned
Mar 24, 2022
75
To all potential buyers concerned about QD-OLED: the issue highlighted by VT on AVS Forum isn't related to deficiencies of QD-OLED technology, but to the way Samsung Electronics have manipulated the EOTF curve and color handling on their own product (the S95B).

Sony have their own product, the A95K, which utilizes the same Samsung Display QD-OLED panel as the S95B. Given Sony's recent emphasis on adhering to industry standards for accurate picture modes, one can remain hopeful that they will not follow the same "dirty" tactic employed by Samsung Electronics, whose own TV products have a constant habit of generally over-brightening everything and over-saturating colors.
 
Last edited:

Lowrys

Member
Oct 25, 2017
12,344
London
So you'd get maximum brightness for a firework or an explosion on screen, then. It just sounds like it is operating using different parameters than LG is.
Samsung know exactly what patterns and tests people use to measure and calibrate TVs. They aren't a secret. It's perfectly possible to get a TV to recognise when it's being fed a certain type of test pattern and change its output for that specific circumstance.

I like Samsungs and have owned them for years. Still do. But it's a bit suspicious that this OLED is pumping out higher brightness when showing a specific window test.

But technical stuff aside, your average viewer isn't ever going to notice if the colour curve is very slightly off. I wouldn't, that's for sure.
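Just to show how trivially a set could do that - pure speculation on my part, firmware is a black box - a uniform bright rectangle on a pure black field is about the easiest thing in the world to detect:

```python
import numpy as np

def looks_like_window_pattern(frame, tol=0.01):
    # Speculative sketch: a classic calibration window is one perfectly
    # uniform bright patch on a true black field, static for many frames.
    # Real content almost never looks like this. frame: floats in [0, 1].
    bright = frame > 0.5
    if not bright.any() or bright.all():
        return False
    uniform_patch = frame[bright].std() < tol   # one flat patch level
    black_field = frame[~bright].max() < tol    # pure black surround
    coverage = bright.mean()                    # fraction of screen lit
    return uniform_patch and black_field and 0.005 <= coverage <= 0.25
```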
 

ShapeGSX

Member
Nov 13, 2017
5,213
Samsung know exactly what patterns and tests people use to measure and calibrate TVs. They aren't a secret. It's perfectly possible to get a TV to recognise when it's being fed a certain type of test pattern and change its output for that specific circumstance.

The question is, are you certain that is what they're doing (recognizing the test pattern)? Or do they just know the maximum power they can deliver to the entire panel over time, and deliver more power to the "on" pixels for as long as they can within the operating parameters of the panel? As an engineer, I can see that being one way to operate the panel.

I'd like to see real scenes tested before concluding it is anything nefarious. Dark space scenes with a star field and an explosion, for example. If you get the same brightness there for something that is supposed to output >1500 nits, it would indicate that they aren't trying to recognize the test pattern and it's just a different way to operate the panel.
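To sketch that engineering interpretation (a toy model with invented numbers, not Samsung's actual behaviour): a short boost that decays toward a thermally sustainable level would produce exactly the "bright for a few seconds, then settles" readings described above.

```python
import math

def sustained_nits(t_seconds, boost_nits=1500.0, floor_nits=1000.0, tau=4.0):
    # Toy thermal model: luminance decays exponentially from a short
    # boost toward the level the panel can hold indefinitely.
    # boost_nits, floor_nits and tau are invented for illustration.
    return floor_nits + (boost_nits - floor_nits) * math.exp(-t_seconds / tau)

for t in (0, 2, 5, 10, 30):
    print(f"t={t:>2}s -> {sustained_nits(t):.0f} nits")
```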
 

Igorth

Member
Nov 13, 2017
1,309
That's not what's happening here according to Vincent in this post: https://www.avsforum.com/threads/sa...-no-price-talk.3240720/page-103#post-61604628

It's ramping up its peak brightness to about 1500 nits in a 1% window for just a few seconds before returning to baseline to make it look like it's brighter than it really is in a real-world scenario.

And it shows the most accurate picture at a 10% window, knowing that it's the industry-standard measurement size, while oversaturating colors and boosting brightness at every other size to make the picture look punchier, even as close as a 9% window.

Seems pretty obvious to me that Samsung is trying to trick people here...
According to some videos from YT, it seems like the Samsung is not ramping down like that on real-life content; it does that ramping down with paused videos or games, or with static and/or repeating patterns (like those testing boxes), probably to avoid burn-in.

With movies in motion it does achieve that peak brightness, provided conditions are dynamic enough that it doesn't think you've paused the image.
 

Jokerman

Member
May 16, 2020
6,936
Don't all OLED TVs adjust brightness depending on how large the area being lit brightly is, and then also ramp down the brightness over time as power and heat come into play? Regardless of what TV is being calibrated, that is going to provide inaccurate calibrations for certain situations. I'm not sure why this is "scummy". It's a limitation of the tech, and various TVs are going to handle it in different ways.

Erm, no. Believe it or not, Samsung are deliberately misleading people for positive reviews.
 

ShapeGSX

Member
Nov 13, 2017
5,213
Erm, no. Believe it or not, Samsung are deliberately misleading people for positive reviews.

I've seen no proof of this. Igorth's post seems to refute it, and the "paused content" explanation is plausible. You'd have to show that this situation doesn't occur in motion, with a test case like I mentioned, to prove your accusation that they are deliberately misleading people for positive reviews.
 

Jokerman

Member
May 16, 2020
6,936
I've seen no proof of this. Igorth's post seems to refute it, and the "paused content" explanation is plausible. You'd have to show that this situation doesn't occur in motion, with a test case like I mentioned, to prove your accusation that they are deliberately misleading people for positive reviews.

He is referring to YouTube videos, which have been terrible in all respects when reviewing this TV, including being deceived by Samsung. I will take VT's years of experience calibrating TVs over randoms on YT any day.
 

JiyuuTenshi

Member
Oct 28, 2017
836
According to some videos from YT, it seems like the Samsung is not ramping down like that on real-life content; it does that ramping down with paused videos or games, or with static and/or repeating patterns (like those testing boxes), probably to avoid burn-in.

With movies in motion it does achieve that peak brightness, provided conditions are dynamic enough that it doesn't think you've paused the image.
Do you have links to the videos showing this?

As I'm going to wait till Black Friday anyway before upgrading my TV, there's thankfully enough time to get in-depth reviews of all the contenders (S95B, A95K, C2, G2) and hopefully some firmware fixes for initial issues.
 

RoastBeeph

Member
Oct 29, 2017
1,027
OK, I'm convinced. The new QD-OLEDs seem to be a huge step above LG's OLEDs. I hope the 75+ inch versions come out in the next year.
 

JiyuuTenshi

Member
Oct 28, 2017
836
I've seen no proof of this. Igorth's post seems to refute it, and the "paused content" explanation is plausible. You'd have to show that this situation doesn't occur in motion, with a test case like I mentioned, to prove your accusation that they are deliberately misleading people for positive reviews.
Even if the 1% window brightness spike has a reasonable explanation, it doesn't explain why the picture is deliberately most accurate in a 10% window. At least for me, that's the more important concern with this TV.
 

TKWarner

Member
Feb 28, 2021
224
He is referring to YouTube videos, which have been terrible in all respects when reviewing this TV, including being deceived by Samsung. I will take VT's years of experience calibrating TVs over randoms on YT any day.

I think that was the point of him posting in that discussion in the first place: to stop the spread of misinformation by a YouTube calibrator whom he deemed to be inexperienced and just plain wrong in his assumption that the new Samsung QD-OLEDs present better color in circumstances where they can't or shouldn't. E.g. QD-OLED should only improve color volume at high luminance levels, not at all luminance levels. Existing OLEDs can already present most colors in HDR movies 1:1 without any desaturation, especially those colors that fall within paper white (<= 100 nits), which make up more than 90% of an HDR movie. We are mostly talking about improvements to HDR specular highlights and the overall accuracy of the image when it comes to things such as artifacts and uniformity, not overall color brightness.

This boost in color brightness should have been obvious to anyone who has seen a reference monitor next to a consumer TV (like Vincent has). He pointed out the calibrator's inexperience by highlighting a kink in one of his EOTF graphs that implied he had taken too many consecutive readings of the OLED panel, which had led to some "panel fatigue." Those graphs shouldn't have been used as evidence by the YouTuber in his video when demonstrating the performance of the TV.
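For reference, the curve those EOTF tracking graphs compare against is the SMPTE ST 2084 (PQ) EOTF, which maps an encoded 0-1 signal to absolute nits. A minimal decode using the constants from the standard:

```python
# SMPTE ST 2084 (PQ) EOTF constants.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(encoded):
    # Decode a PQ signal value in [0, 1] to luminance in nits (cd/m^2).
    p = encoded ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(round(pq_eotf(0.508)))  # ~100 nits, the "paper white" mentioned above
```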
 

Igorth

Member
Nov 13, 2017
1,309
Do you have links to the videos showing this?

As I'm going to wait till Black Friday anyway before upgrading my TV, there's thankfully enough time to get in-depth reviews of all the contenders (S95B, A95K, C2, G2) and hopefully some firmware fixes for initial issues.
Somewhere in here, 10 minutes in they start talking about it
 
Last edited:

Duck-Zilla

Member
Feb 21, 2018
533
As a total noob regarding the latest TV trends (I still own a 2011 Samsung plasma, a PN51D550, hooked up to my PC and XSX), I still like this TV tbh. I feel the colors are accurate, blacks are deep, no motion blur... But yeah, it's only 1080p, no HDR, and 51 inches.

How much better is OLED or QD-OLED? Are there any drawbacks vs. my plasma? Considering it's $3,000 Canadian here, just wondering how much better they are. $3,000 is quite a bit of money...

What would be the best logical step to upgrade? OLED? LCD? Keep the plasma and wait for the next best thing?
 

Hawk269

Member
Oct 26, 2017
6,043
I just read your impressions over there. I look forward to seeing how they evolve over time, like your impressions of the A90J did.
The A90J is a fantastic set. The issues I had were around launch, and those were: banding at 4K/120, lack of VRR, and only 2 HDMI 2.1 ports. VRR and the banding have been fixed. The only issue the A90J has for gaming now is that it dims quite a bit. Some may not notice it, but others do.
 

rou021

Member
Oct 27, 2017
526
The question is, are you certain that is what they're doing (recognizing the test pattern)? Or do they just know the maximum power they can deliver to the entire panel over time, and deliver more power to the "on" pixels for as long as they can within the operating parameters of the panel? As an engineer, I can see that being one way to operate the panel.

I'd like to see real scenes tested before concluding it is anything nefarious. Dark space scenes with a star field and an explosion, for example. If you get the same brightness there for something that is supposed to output >1500 nits, it would indicate that they aren't trying to recognize the test pattern and it's just a different way to operate the panel.
It's not just the brightness that gets ramped up, but also the color saturation. Not to mention this large shift starts when you go from a 10% to a 9% window. This would suggest there's something else - something sketchy - going on other than the usual ABL and ABSL type behavior. Unlike WOLED, QD-OLED shouldn't lose any saturation at high luminance levels, so there's no reason the saturation should vary that much with the brightness of the image.

With a not-insignificant portion of HDR content being under 1000 nits, there probably wouldn't be that dramatic of a difference between WOLED and QD-OLED as it is (provided both displays are accurate). Some of the brightest elements on screen - things like a star field, the moon, the sun in broad daylight, etc. - normally aren't that saturated in content to begin with. Even at full field, QD-OLED is only a few dozen nits brighter than the current brightest WOLED. The advantages of QD-OLED's increased brightness and ability to maintain saturation will definitely give it a very real, very noticeable edge in a number of instances, but not in all of them. QD-OLED can certainly look better, but does it look better enough to the average person that it justifies the premium they'd pay over WOLED? When you look at it like that, it makes sense Samsung would want to rig things so they can say they look better in nearly every instance.

Let's not forget this is coming from the same company that had a large marketing campaign trashing WOLED for burn in and offered bounties to WOLED customers with evidence of burn in on their TVs. Like many manufacturers, Samsung is not above dirty tricks to get a competitive edge.
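As a toy model of the saturation point (invented numbers; real panel behaviour is more complex): WOLED reaches its highest luminance by driving a white subpixel, which washes out a saturated colour, while QD-OLED's curve would stay flat.

```python
def woled_saturation_retained(target_nits, rgb_only_limit=400.0):
    # Toy model: up to what the RGB subpixels alone can supply, a colour
    # stays fully saturated; beyond that, a white subpixel tops up the
    # luminance and dilutes the colour. Numbers are invented.
    if target_nits <= rgb_only_limit:
        return 1.0
    return rgb_only_limit / target_nits

for nits in (100, 300, 400, 600, 800):
    print(f"{nits:>3} nits -> saturation retained: "
          f"{woled_saturation_retained(nits):.2f}")
```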
 

BPHusker

Member
Oct 26, 2017
2,125
Nebraska
Based on the Nebraska name/profile, I'm assuming you're in the US? Best Buy should cover burn-in with their 5-year extended warranty, unless they changed that rather recently. I don't think there are many other options, though I think the newer EVO panels now come with 4-5 years of burn-in coverage direct from LG.

I don't know how feasible it truly is, but I've had people tell me in the past, on forums and IRL, that you can basically invent an issue with the TV in ~4 years to get a newer one. Essentially the warranty pays for itself, and then some. Whether or not they'll scrutinize these returns more 4 years from now, I couldn't tell you.
Damn, on a 77" C2 it's $749 for a 5-year warranty. If I get Total Tech, does it give you the same coverage as the normal yearly warranties?
 

Hoodie Season

Member
Jun 17, 2020
1,148
Maybe this isn't known yet, but compared to either a CX or the newer C2, how do the new QD-OLED TVs fare when it comes to input lag in game mode? Better/worse?
 

Ziyi

Banned
Mar 24, 2022
75
It's not just the brightness that gets ramped up, but also the color saturation. Not to mention this large shift starts when you go from a 10% to a 9% window. This would suggest there's something else - something sketchy - going on other than the usual ABL and ABSL type behavior. Unlike WOLED, QD-OLED shouldn't lose any saturation at high luminance levels, so there's no reason the saturation should vary that much with the brightness of the image.

With a not-insignificant portion of HDR content being under 1000 nits, there probably wouldn't be that dramatic of a difference between WOLED and QD-OLED as it is (provided both displays are accurate). Some of the brightest elements on screen - things like a star field, the moon, the sun in broad daylight, etc. - normally aren't that saturated in content to begin with. Even at full field, QD-OLED is only a few dozen nits brighter than the current brightest WOLED. The advantages of QD-OLED's increased brightness and ability to maintain saturation will definitely give it a very real, very noticeable edge in a number of instances, but not in all of them. QD-OLED can certainly look better, but does it look better enough to the average person that it justifies the premium they'd pay over WOLED? When you look at it like that, it makes sense Samsung would want to rig things so they can say they look better in nearly every instance.

Let's not forget this is coming from the same company that had a large marketing campaign trashing WOLED for burn in and offered bounties to WOLED customers with evidence of burn in on their TVs. Like many manufacturers, Samsung is not above dirty tricks to get a competitive edge.

It's not "looking better" in every instance, otherwise at a 9% window the luminance would have been maintained at the same level as it was at 10%. Instead, it takes a nosedive outside of this very specific testing area. Until more evidence is provided, we cannot say for sure what Samsung's motives truly are, although based on their past practices alone (they have manipulated the EOTF curve many times before) it's likely this method of retaining accuracy and pumping nits at testing windows, whilst saturating colors outside of them, points to a "best of both worlds" approach: hook in positive reviews from the critics by pleasing their testing tools, and at the same time purposefully dazzle the general user with an over-brightened and slightly over-saturated picture in real-world viewing scenarios.
 
Last edited: