Agreed, didn't expect to see so many "I don't care about reference and only like what looks good to me" posts on avs.
True. But honestly, "what looks good to you" should be the primary metric for a TV you spend your money on.
I can definitely understand when it comes to HDR brightness. Like Dune with proper DV Cinema picture mode on a G1 is still too damn dark. I had to use DV Cinema Home.
When I saw it in the theatre with a great projector I was amazed by how bright the daytime scenes looked - the difference from many other scenes was super pronounced - I suspect that intentional choice carried over into a similar theatrical home grade.
Wonder how that movie fares on the Samsung S95B btw, especially the sandworm attack, though obviously HDR10 is different.
Is this at Best Buy or anywhere they have warranties?

Based on the Nebraska name/profile, assuming you're in the US? Best Buy should cover burn-in with their 5-year extended warranty, unless they changed rather recently. I don't think there are many other options, though I think the newer EVO panels have 4-5 years of coverage for burn-in direct from LG now?
I don't know how possible it truly is, but I've had people tell me in the past, on forums and irl, that you can basically invent an issue with the TV in ~4 years to get a newer one. Essentially, the warranty pays for itself, and then some. Whether or not they'll scrutinize these returns more 4 years from now, I could not tell you.
I'm wondering if I should wait until October to save a bit, especially with the lag on 4:3 aspect ratio. Does anyone recall how much the C1 dropped during BF?

August? I'd wait for C2 Black Friday deals at that point. They start late October anyways.
Any of y'all check out the AVS forums? It's a total war zone in the Samsung QD-OLED thread lol.

What's going on in there?
Best Buy is the only retailer with an extended warranty that covers burn-in specifically in the US, afaik.
I post there and it comes down to owners of "other" brands battling with people that have the Samsung. As a person that has the Samsung, it really is what owners are saying: it is a big leap in tech in color volume, brightness, etc. The "war", as some put it, is people just infighting because of brand loyalty. Not too much different from this forum and the Xbox vs. PlayStation arguments.
Haven't checked there, but if it's what I think, it's nerds upset that their display looks better when the picture is inaccurate. Most movies won't have a big difference when calibrated to show the picture properly. Like, in The Batman there is an explosion that is 300 nits. You can make it look more vibrant on the QD-OLED, but it should look exactly the same, because the colors are the same vs. WOLED up to 300 nits.

Like some in that AVS thread, you clearly don't know what you are talking about. The Samsung actually does track pretty damn close to the EOTF. There are some easy changes to make it even better. It is not the most accurate TV, but it is pretty close. The one fear many had is that Samsung was being Samsung and going with very inaccurate out-of-the-box settings to make things look brighter, punchier, etc. This IS NOT THE CASE with the Samsung QD-OLED. They did a great job with the out-of-the-box settings, and not too many adjustments are needed.
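For anyone unfamiliar with what "tracking the EOTF" means here: in HDR, a display is accurate when its measured light output follows the SMPTE ST 2084 (PQ) curve, which maps each code value to an absolute luminance in nits. A minimal sketch of that curve in Python; the constants are the standard ones from the spec, and the comparison values in the comments are approximate:

```python
import math

# SMPTE ST 2084 (PQ) constants -- these are the standard values from the spec.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Map a normalized PQ code value in [0, 1] to absolute luminance in nits."""
    v = signal ** (1 / M2)
    num = max(v - C1, 0.0)
    den = C2 - C3 * v
    return 10000.0 * (num / den) ** (1 / M1)

# A display "tracks the EOTF" when its measured output matches this curve:
# a 50% PQ signal should come out around 92 nits, not punched up brighter.
print(round(pq_eotf(0.5), 1))   # ~92.2 nits
print(pq_eotf(1.0))             # 10000.0 nits, the PQ signal peak
```

This is why calibrators measure windows at several stimulus levels: the question is whether the panel lands on this curve, not whether the picture looks pleasing.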
I will copy and paste my 6 hour impressions that I posted over at AVS Forums a little bit later today.
Why would you pay that much money on a TV and not want an accurate picture though?
Because expensive TVs, in addition to "accurate", also do "bright" or "contrasty" better.
For example, I would not use "vivid", but I also don't really care about accuracy that much. I do care about true blacks, bright highlights, punchy colors - so you can understand why I chose an OLED. I didn't choose it because it's a good way to get the closest to creator's intent, I chose it because my lasers and magic spells in games look like neon lights and everything is crispy. Now, in my case, I'm somewhere in the middle - I would not use "standard" or "vivid", but I will - on occasion - use Cinema Home instead of Cinema or use Dynamic Contrast or Dynamic Tone Mapping because, you know, explosions explode more and lightsabers lightsaber more - regardless of creator's intent. For some reason, that almost seems to offend certain people.
I mean, I could ask a similarly wrong question: why pay that much money on a TV to watch a movie at a dim, dull setting because it's accurate?
The answer, of course, is: because you prefer accuracy to shiny, bright colors. And, vice versa, some people prefer shiny, bright colors to accuracy. Both goals can be better achieved with an expensive TV than with a cheap one. So there you go, that's why some people would pay much money on a TV even if they don't want an accurate picture that much.
I understand why people buy OLEDs. What I don't understand is why someone would pay a premium for this TV, which is capable of unparalleled colour accuracy, and not use that feature. It would be like buying an OLED and raising the brightness to lose the perfect blacks.
Black is usually clamped, and if anything, many of the more contrasty presets that TVs offer, via presets or via dynamic tone mapping, actually favour crushing blacks in order to produce pop. You can make an image feel lighter without moving the black point.
That is true, but my point still stands. Unless of course these TVs are more accurate out of the box than a calibrated current-gen OLED.
That makes me wonder about this Samsung. If they are pumping out different measurements for 10% windows (which is a scummy move, there is no getting away from that), I wonder how calibrators are actually calibrating them at this point. Experts like VT can look at an image and, through decades of experience, know when it is accurate, but people relying on meters are going to be producing inaccurate calibrations.
This is somewhat true, but there are also times when a reference monitor is required to genuinely ascertain whether or not something looks entirely accurate. Without something to refer to (based on industry standards), we have only our brains to rely on, which are constantly sullied by our own subjective experiences, leading to perception distortion through the visual cortex.

Yeah, I should have said that he will know when a picture is off rather than accurate. I believe this is how he discovered what Samsung were doing a few days ago.
VT has stated on on multiple occasions that he uses a Sony reference monitor to assist with his analysis and assessment.
That's not what's happening here according to Vincent in this post: https://www.avsforum.com/threads/sa...-no-price-talk.3240720/page-103#post-61604628
It's ramping up its peak brightness to about 1500 nits in a 1% window for just a few seconds before returning to baseline to make it look like it's brighter than it really is in a real-world scenario.
And it shows the most accurate picture at a 10% window, knowing that it's the industry standard measurement size, while oversaturating colors and boosting brightness all the time outside of it to make the picture look punchier, even as close as on a 9% window.
Seems pretty obvious to me that Samsung is trying to trick people here...
So you'd get maximum brightness for a firework or an explosion on screen, then. It just sounds like it is operating using different parameters than LG is.
Samsung know exactly what patterns and tests people use to measure and calibrate TVs. They aren't a secret. It's perfectly possible to get a TV to recognise when it's being fed a certain type of test pattern and change its output for that specific circumstance.
According to some videos from YT, it seems like the Samsung is not ramping down like that on real-life content; it does that ramping down on paused videos or games, or with static and/or repeating patterns (like those testing boxes), probably to avoid burn-in.
Don't all OLED TVs adjust brightness depending on how large the area being lit brightly is, and then also ramp down the brightness over time as power and heat come into play? Regardless of what TV is being calibrated, that is going to provide inaccurate calibrations for certain situations. I'm not sure why this is "scummy". It's a limitation of the tech, and various TVs are going to handle it in different ways.
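The window-size dependence described above can be pictured as a fixed panel power budget: the smaller the lit area, the harder each pixel can be driven, up to a per-pixel emitter limit. A toy model in Python; the nit figures are invented for illustration, not measurements of any actual panel:

```python
# Toy ABL model. The numbers below are hypothetical, chosen only to show
# the shape of the behavior, not taken from any real TV.
PER_PIXEL_LIMIT_NITS = 1000.0   # assumed max a single pixel can sustain
FULL_FIELD_NITS = 200.0         # assumed max with 100% of the screen lit

def peak_luminance(window_fraction: float) -> float:
    """Achievable peak nits when `window_fraction` (0..1] of the screen is lit."""
    if not 0.0 < window_fraction <= 1.0:
        raise ValueError("window_fraction must be in (0, 1]")
    # The power budget per lit pixel scales inversely with the lit area,
    # clamped by the per-pixel emitter limit.
    return min(PER_PIXEL_LIMIT_NITS, FULL_FIELD_NITS / window_fraction)

for w in (0.01, 0.10, 0.25, 0.50, 1.00):
    print(f"{w:>5.0%} window -> {peak_luminance(w):6.0f} nits")
```

Under a model like this, small windows all hit the same emitter-limited ceiling and larger windows fall off smoothly, which is the uncontroversial, non-scummy part of the behavior. The accusation in the thread is about output changing specifically at the 10% measurement size, which a simple power budget like this would not produce.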
Erm, no. Believe it or not, Samsung are deliberately misleading people for positive reviews.
I've seen no proof of this. Igorth's post seems to refute this, and the "paused content" situation is plausible. You'd have to show that this situation doesn't occur in motion, with a test case like I mentioned, to prove your accusation that they are deliberately misleading people for positive reviews.
Do you have links to the videos showing this?
With movies in motion, it does achieve that peak brightness if conditions are dynamic enough that it doesn't think you paused the image.
Even if the 1% window brightness spike has a reasonable explanation, it doesn't explain why the picture is deliberately the most accurate in a 10% window. At least for me, that's the more important concern with this TV.
He is referring to YouTube videos, which have been terrible in all aspects when reviewing this TV, including being deceived by Samsung. I will take VT's years of experience calibrating TVs over randoms on YT any day.
Somewhere in here; 10 minutes in they start talking about it.
As I'm going to wait till Black Friday anyway before upgrading my TV, there's thankfully enough time to get in-depth reviews of all the contenders (S95B, A95K, C2, G2) and hopefully some firmware fixes for initial issues.
Entirely different things, I agree.
Gamers are going to see a huge upgrade in Dolby Vision later this year
It’s all thanks to MediaTek’s upcoming Pentonic series chips (techradar.com)
Is this actually happening in non-flagship SKUs too? Please trickle down ASAP. It says later this year, but likely next year? I was waiting on TCL specs for the 6-series.
I just read your impressions over there. I look forward to how things evolve for you over time, like your impressions of the A90J did.

The A90J is a fantastic set. The issues I had were around launch, and those were: banding at 4K/120, lack of VRR, and only 2 HDMI 2.1 ports. VRR and banding have been fixed. The only issue for gaming the A90J has now is that it dims quite a bit. Some may not notice it, but others do.
The question is, are you certain that is what they're doing (recognizing the test pattern)? Or do they just know the maximum power they can deliver to the entire panel over time, and deliver more power to the "on" pixels for as long as they can within the operating parameters of the panel? As an engineer, I can see that being one way to operate the panel.
I'd like to see real scenes tested before concluding it is anything nefarious. Dark space scenes with a star field and an explosion, for example. If you get the same brightness there for something that is supposed to output >1500 nits, it would indicate that they aren't trying to recognize the test pattern and it's just a different way to operate the panel.
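A frame for that kind of test could be sketched like this: a near-black star field plus one bright region covering roughly 1% of the pixels, so it exercises the same lit area as a 1% window without looking like a standard measurement pattern. This is an illustrative sketch assuming numpy is available; the resolution, levels, and sizes are arbitrary choices, not values from any real test suite:

```python
import numpy as np

# Synthetic "space scene" test frame: near-black background, a sparse
# random star field, and one bright circular "explosion" covering ~1%
# of the pixels. All values here are arbitrary illustration choices.
H, W = 1080, 1920
rng = np.random.default_rng(seed=42)

frame = np.zeros((H, W), dtype=np.float32)          # luminance plane, 0..1

# Star field: light up ~0.05% of pixels at moderate brightness.
stars = rng.random((H, W)) < 0.0005
frame[stars] = 0.4

# Explosion: a filled circle whose area is ~1% of the frame.
radius = int(np.sqrt(0.01 * H * W / np.pi))
yy, xx = np.ogrid[:H, :W]
blast = (yy - H // 2) ** 2 + (xx - W // 2) ** 2 <= radius ** 2
frame[blast] = 1.0                                  # full-scale highlight

lit_fraction = np.count_nonzero(frame) / frame.size
print(f"bright area: {lit_fraction:.2%} of the frame")
```

If the panel hits its claimed small-window peak on the explosion in a moving sequence built from frames like this, the "it only throttles static patterns" explanation holds; if it only hits it on the standard window pattern, that points the other way.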
Damn, on a 77" C2 it's $749 for a 5-year warranty. If I got Total Tech, does it give you the same coverage as the normal yearly warranties?
That is literally the "calibrator" that Vincent called out. You should treat his videos with caution, as you should any YouTube creator without established credentials.

Note taken. I am waiting for more in-depth reviews anyway; the TV is not even on sale in my country yet.
Maybe this isn't known yet, but compared to either a CX or the newer C2, how do the new QD-OLED TVs compare when it comes to input lag while in game mode? Better/worse?

As far as I have seen, on the same level, a tad better in some scenarios.
It's not just the brightness that gets ramped up, but also the color saturation. Not to mention this large shift starts when you go from a 10% to a 9% window. This would suggest there's something else—something sketchy going on other than the usual ABL and ABSL type behavior. Unlike WOLED, QD-OLED shouldn't lose any saturation at high luminance levels, so there's no reason the saturation should vary that much with the brightness of the image.
With a not insignificant portion of HDR content being under 1000 nits, there probably wouldn't be that dramatic of a difference between WOLED and QD-OLED as it is (provided both displays are accurate). Some of the brightest elements on screen—things like a star field, the moon, the sun in broad daylight, etc.—normally aren't that saturated in content to begin with. Even at full field, QD-OLED is only a few dozen nits brighter than the current brightest WOLED. The advantages of QD-OLED's increased brightness and ability to maintain saturation will definitely give it a very real, very noticeable edge in a number of instances, but not in all of them. QD-OLED can certainly look better, but does it look better enough to the average person that it justifies the premium they'd pay over WOLED? When you look at it like that, it makes sense Samsung would want to rig things so they can say they look better in nearly every instance.
Let's not forget this is coming from the same company that had a large marketing campaign trashing WOLED for burn in and offered bounties to WOLED customers with evidence of burn in on their TVs. Like many manufacturers, Samsung is not above dirty tricks to get a competitive edge.