
Blackthorn

Member
Oct 26, 2017
2,316
London
My initial plan was to sell my Apple TV 4K once the app dropped on the C9, but I'm probably gonna keep it for how well its search feature pulls from all the streaming services I use, which I have to assume won't be the case with the app.
 

Mike Works

Member
Oct 28, 2017
1,775
How did you guys get burn-in? Was it from gaming (which game)? Or broadcast TV logo? Tell us more.
It was from videogames. The big offender was Rocket League, though Stardew Valley also caused issues.

With Rocket League, it was from the following elements:

[Image: Rocket League (PS4) screenshot, first goal in overtime, showing the in-game HUD]


The words "BALL CAM" at the lower left, the speedometer in the lower right, and the scoreboard (specifically the orange square where the "1" is in the picture above) at the top middle. The biggest offender was the speedometer -- any time I viewed any content that was yellow, orange, red, purple (in that color spectrum), I could see it. I think Rocket League gave the most problems of any games I played because

A) So much of the damn UI is bright orange, which seems to be the offending burn-in color (along with red and yellow), and
B) I played it in HDR at 100 OLED light an average of 1 hour a day.

The other offender was Stardew Valley, though to a much smaller degree:

[Image: Stardew Valley screenshot showing the toolbar and menu UI]


I could make out the boxes along the lower middle of the screen, along with the top-right menu UI. These elements stay static for a good amount of gameplay, and my partner would play sessions of around 2 hours, though only a couple of times a week. Oddly enough, the "energy" meter at the lower right didn't burn in at all.

I came from a plasma TV (Pioneer Kuro) background and was always very careful with burn-in, so I was pretty surprised I suffered burn in (or, more specifically, burn out) in 1 year on my OLED.

Now that I have a C9, I'm not going to play Rocket League (or anything) in HDR if it has static content of a certain color, which really sucks.

Oddly enough, the game I play the most on my system is NHL 20:

[Image: NHL 20 screenshot showing the on-screen scoreboard]


That game has the scoreboard on the bottom stuck on the screen for 90% of the time playing, and it's caused exactly 0 issues, which again leads me to believe it's the color of the static HUD (along with brightness) that really affects the potential for issues.
 

Kyle Cross

Member
Oct 25, 2017
8,427
This might be the wrong place as it isn't about a TV per se, but not sure where else to ask: are there any HDMI splitters out there that handle 4K60 HDR at 4:2:2 10-bit? Everything I'm finding on Amazon seems to top out at 4:2:0.
 
Oct 26, 2017
805
Virginia, US
With VRR being implemented in TVs now, do you think we will see a comeback of native 240 Hz TVs with HDMI 2.1 and VRR? Back in 2009 and 2010 I remember Samsung making TVs that were native 240 Hz.
 

Branson

Member
Oct 27, 2017
2,772
It was from videogames. The big offender was Rocket League, though Stardew Valley also caused issues. [...]
Thanks for the insight. Luckily you had that warranty to fall back on. If I ever get an OLED in the future I'd pick Best Buy only for that.

Doesn't seem like you really stressed it that much though. According to some people in this thread burn-in is a myth. Good to see some actual discussion of these unfortunate issues. It looks amazing, but damn, that's one frustrating part about the tech.
 

Sanctuary

Member
Oct 27, 2017
14,225
I have a Panasonic G20 (or G25, whichever is 10 years old this year!) and it's still going strong. I got lots of burn-in when it was new, from playing the first PS3 Batman on it. It took a decent amount of time to remove the UI from the screen. Maybe a couple of weeks. That was nothing compared to Dark Souls though; that took MONTHS to remove the burn-in. It went eventually. I did put hundreds of hours into Dark Souls though.

I have a ten-year-old G10 that had crazy image retention issues, and eventually ended up with permanent burn-in that ironically is rather large, but can't even actually be seen when content is playing, even in dark scenes. I ended up playing Demon's Souls (and Arkham Asylum lol) on mine for about eighty hours after an initial hundred-hour "break-in" period, and after extended playing sessions, it would leave traces of the D-pad HUD on the left side, but that would eventually go away after thirty minutes of its version of a screen wipe and a few hours of other content.

However, it only took me three hours of training mode with MvC3 to have six health bars stuck at the top of the screen for almost seven months; they were easily noticeable in a dark scene and kind of ruined the picture. I thought that it was going to be permanent, but somehow that eventually went away. What didn't go away, though, were the SFIV super/ultra meters that fill up most of the bottom of the screen. They ended up getting stuck forever, yet the only time I could actually see them was when the TV was first turned on and a console was reaching the dashboard. Specifically the PS3.

Now whenever I play any games that have heavy HUD elements on that screen, it usually gets image retention that lasts a long while, but since these days I just use that TV for games that I will be putting a lot of hours into anyway, it doesn't really matter.

That's just image retention, it's normal. Burn-in is permanent.

When it ends up sticking for months, compared to other obvious cases of image retention, it looks like burn-in. In my case, I'm 100% sure that I don't have some image retention that lasted ten years. I was surprised, though, when actual retention that I had for over half a year finally went away. Although maybe it's still there and I just can't see it, due to that particular model of TV having black level elevation issues anyway. The black level is much worse than in the first few years I owned it, and I even bought the device that was supposed to reset it to the factory defaults.

Goodbye affordable cables!

Make sure you buy Monster brand. That way, you will know for sure they were worth the extra money!
/s
 

CreepingFear

Banned
Oct 27, 2017
16,766
I went to Costco tonight and checked out the 2019 Vizio P Series Quantum X, since I know design-wise it won't be too different from the 2020 version. I can't wait! I also looked at the C9. It's so fucking sexy. God, I wish image retention wasn't "a thing", otherwise I would get the OLED. Before anyone says anything, it will have lots of CNN and MSNBC on there (all day at work before I come home), so don't try to convince me to get the OLED.
 

Deleted member 35478

User-requested account closure
Banned
Dec 6, 2017
1,788
Thanks for the insight. Luckily you had that warranty to fall back on. If I ever get an OLED in the future I'd pick Best Buy only for that.

Doesn't seem like you really stressed it that much though. According to some people in this thread burn-in is a myth. Good to see some actual discussion of these unfortunate issues. It looks amazing, but damn, that's one frustrating part about the tech.

It's not a myth, and no one says it's impossible. For most owners it's a non-issue, but running OLED light at 100 with static elements in colors like red, yellow, or orange for hundreds or thousands of hours, you run the risk of burn-in. That's primarily on 6 series LG OLED panels; the 7 series saw a large increase in the size of the red subpixel, and the 8/9 series increased it further (just not as large a jump as 6 to 7), which further reduced the chances of burn-in. It's at the point that for most owners, by the time there's enough pixel degradation to cause burn-in, it's time to replace the set anyway.

I also can't even count anymore how many times it's been reiterated that if you view content with a ton of static images on screen like a CNN logo, on top of that run the TV in torch mode, and view that content for thousands of hours, just get an LCD. OLED is not the tech for you. The RTings burn-in test has been brought up over and over, breaking down how many hours it takes to degrade pixels, panel variance, OLED light settings, etc.
 

Brucey

Member
Jan 2, 2018
828
Filmmaker Mode, which is supposed to disable all image processing and edge enhancement, has sharpening enabled in the mode lol. At 3:59 it shows 10 sharpness, when it should be at 0.
 

dallow_bg

Member
Oct 28, 2017
10,629
texas
Filmmaker Mode, which is supposed to disable all image processing and edge enhancement, has sharpening enabled in the mode lol. At 3:59 it shows 10 sharpness, when it should be at 0.

Could just mean it is disabled but still shows a number. Or 10 could mean "no sharpening" and going below that is actually softening it artificially.
 

Brucey

Member
Jan 2, 2018
828
Could just mean it is disabled but still shows a number. Or 10 could mean "no sharpening" and going below that is actually softening it artificially.
I guess it could be disabled, but why show the wrong number? And as far as I'm aware, on LG OLEDs 0 is neutral.

As shown by this GIF from an AVS Forum member who did a pattern test to show the difference:
[GIF: AVS Forum sharpness pattern test comparison]


We also had an Era member who had his set professionally calibrated by Vincent Teoh, who said his TV was set to 0 as well.

Dunno why they have it set to 10.
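(If it helps to picture what a sharpness control is doing, here's a rough sketch of a generic unsharp mask. This is not LG's actual processing, just an illustration of why 0 would mean "leave pixels alone" and anything above it actively alters them.)

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def apply_sharpness(image, amount):
    """Generic unsharp mask: amount 0 leaves the image untouched,
    anything above 0 boosts edges (and can add ringing/halos).
    Illustration only, not LG's algorithm; the TV's menu value is
    just a UI scale, not this 'amount'."""
    blurred = gaussian_filter(image, sigma=1.0)
    return image + amount * (image - blurred)

img = np.random.rand(64, 64)                          # stand-in for a test pattern
assert np.allclose(apply_sharpness(img, 0.0), img)    # 0 really is "do nothing"
print(np.abs(apply_sharpness(img, 1.0) - img).max())  # > 0: pixels have been altered
```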
 

Branson

Member
Oct 27, 2017
2,772
It's not a myth, and no one says it's impossible. For most owners it's a non-issue, but running OLED light at 100 with static elements in colors like red, yellow, or orange for hundreds or thousands of hours, you run the risk of burn-in. That's primarily on 6 series LG OLED panels; the 7 series saw a large increase in the size of the red subpixel, and the 8/9 series increased it further (just not as large a jump as 6 to 7), which further reduced the chances of burn-in. It's at the point that for most owners, by the time there's enough pixel degradation to cause burn-in, it's time to replace the set anyway.

I also can't even count anymore how many times it's been reiterated that if you view content with a ton of static images on screen like a CNN logo, on top of that run the TV in torch mode, and view that content for thousands of hours, just get an LCD. OLED is not the tech for you. The RTings burn-in test has been brought up over and over, breaking down how many hours it takes to degrade pixels, panel variance, OLED light settings, etc.

Lol. I know, I've been following everything. I was mainly joking about the myth part. It's an all-OLED party all the time up in here so I like to mess with it. I picked LCD for my TV last year and I've been happy with it. I'm always following the tech though, and we don't get a lot of burn-in posts in here from people who have it, so I'm always curious. It really didn't seem like he stressed it much and it happened. If you can't have your TV at a high brightness and enjoy it without having issues like that, it sucks.

I still want one though. I finally saw a scene at Best Buy on a Sony set that really made me take notice even though I'm fine with what I have.
 

dallow_bg

Member
Oct 28, 2017
10,629
texas
I guess it could be disabled, but why show the wrong number? And as far as I'm aware, on LG OLEDs 0 is neutral.

As shown by this GIF from an AVS Forum member who did a pattern test to show the difference:
[GIF: AVS Forum sharpness pattern test comparison]


We also had an Era member who had his set professionally calibrated by Vincent Teoh, who said his TV was set to 0 as well.

Dunno why they have it set to 10.
Ah I see what you mean.
 

Darknight

"I'd buy that for a dollar!"
Member
Oct 25, 2017
22,828
They usually come with a cable anyway, so hopefully that's the case and they're 2.1.

To do good cable management, one of the first things you should do is toss out the cables that come with the device and replace them with exact length cables. I'm planning on tossing out every single HDMI cable in my current setup when I upgrade and lay out all new HDMI 2.1 cables when I do my complete rewire overhaul so that the infrastructure is there for the future.
 

dallow_bg

Member
Oct 28, 2017
10,629
texas
All my cables are the same too. Bought a set of certified high speed cables and all my devices use those so there's no question what they are for. Plus looks neater.

Doing the same with USB cables. Recycled a few pounds worth of USB 1.0 thru 3.0 cables that had been collected from all the devices that they came bundled with over the years.

Almost all my USB 3.0 and C cables are now essentially the same so I know the speed and power ratings.
 

GeoNeo

Member
Oct 26, 2017
1,447
Wish someone at CES could take some macro shots of the new OLED screens from LG to compare to last year's models. :)
 

MrBob

Member
Oct 25, 2017
6,670


The TV AI will take over our homes eventually.

Confirms AMD FreeSync support too, which might be old news. Under 6 ms input lag at 120 Hz, noice.
 

Branson

Member
Oct 27, 2017
2,772
I am curious to see how the upscaling features compare to a Sony or something else that's good at that.
 
Oct 28, 2017
12
Given that most were predicting HDMI 2.1 would see its biggest adoption at CES this year, is there any reason why we're not really seeing that, does anyone know?
 

Haint

Banned
Oct 14, 2018
1,361
Honestly wouldn't want to spend over $500 if possible. That's why that used HW-Q90R looked great for $400 if I could find the rear speakers later on lol.

$500's not going to go very far including a receiver. You're probably better off buying the soundbar (the Q90 is a good one as far as soundbars go) and forgetting about the rears (or contacting Samsung's warranty department and seeing if they'll sell you replacements).
 

Mitchman1411

Member
Jul 28, 2018
635
Oslo, Norway
So are there any good/affordable eARC compatible receivers yet?
The Sony STR-DN1080 recently got an eARC firmware upgrade and is priced at USD 600, while its little brother, the STR-DH790, is priced at 350 on Amazon. I have the 1080 and, coming from a range of Onkyo receivers, this is quite a bit easier to use with a similar amount of inputs and outputs.
 

gabdeg

Member
Oct 26, 2017
5,961
🐝
Seems like 120Hz BFI is back in for the 2020 LG OLEDs.
They'll also be able to take 4K 120Hz 4:2:0 over HDMI 2.0; the 2019 models can only do that over HDMI 2.1.
Whether or not this pans out for the actual TVs that are released to customers in the end is another question, but I think LG would be insane to make the same mistake again and show features they can't ship.
 

Hasney

One Winged Slayer
The Fallen
Oct 25, 2017
18,621
Seems like 120Hz BFI is back in for the 2020 LG OLEDs.
They'll also be able to take 4K 120Hz 4:2:0 over HDMI 2.0; the 2019 models can only do that over HDMI 2.1.
Whether or not this pans out for the actual TVs that are released to customers in the end is another question, but I think LG would be insane to make the same mistake again and show features they can't ship.

Not sure what the issue is with the 4:2:0 120Hz over HDMI 2.0 in the current boxes, but it sounds like the 2.1 functionality is already in. I prefer 1440p 120Hz to chroma subsampling anyway and it looks great on the C9, so as long as we can use that full bandwidth once these next Nvidia GPUs are out, I'm happy.
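(For anyone curious why 4:2:0 or 1440p is the trick that squeezes 120Hz into HDMI 2.0, here's some rough back-of-the-envelope arithmetic. The pixel clocks and coding overheads below are approximations rather than official spec figures, and real 4:2:0 signalling halves the TMDS clock rather than literally sending 12 bits per pixel, but the totals come out about the same.)

```python
# Rough HDMI bandwidth arithmetic (illustrative approximations, not spec figures).
# Data rate ~= pixel clock x average bits per pixel; 4:2:0 averages 12 bpp at
# 8-bit depth, while 4:4:4/RGB uses 24 bpp.

HDMI_2_0_PAYLOAD = 18.0e9 * 8 / 10    # 18 Gbps TMDS, 8b/10b coding -> ~14.4 Gbps
HDMI_2_1_PAYLOAD = 48.0e9 * 16 / 18   # 48 Gbps FRL, 16b/18b coding -> ~42.7 Gbps

modes = {
    # name: (approximate pixel clock in Hz with reduced blanking, avg bits per pixel)
    "4K 120Hz 4:4:4 8-bit":    (1188e6, 24),
    "4K 120Hz 4:2:0 8-bit":    (1188e6, 12),
    "1440p 120Hz 4:4:4 8-bit": (498e6, 24),
}

for name, (clock, bpp) in modes.items():
    rate = clock * bpp
    print(f"{name}: ~{rate / 1e9:.1f} Gbps | "
          f"HDMI 2.0: {'fits' if rate <= HDMI_2_0_PAYLOAD else 'needs 2.1'} | "
          f"HDMI 2.1: {'fits' if rate <= HDMI_2_1_PAYLOAD else 'no'}")
```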
 

Blackthorn

Member
Oct 26, 2017
2,316
London
I guess it could be disabled, but why show the wrong number? And as far as I'm aware, on LG OLEDs 0 is neutral.

As shown by this GIF from an AVS Forum member who did a pattern test to show the difference:
[GIF: AVS Forum sharpness pattern test comparison]


We also had an Era member who had his set professionally calibrated by Vincent Teoh, who said his TV was set to 0 as well.

Dunno why they have it set to 10.
Damn it I read that 10 was neutral. Time to go back and adjust all my presets.
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,683
Not sure what the issue is with the 4:2:0 120Hz over HDMI 2.0 in the current boxes, but it sounds like the 2.1 functionality is already in. I prefer 1440p 120Hz to chroma subsampling anyway and it looks great on the C9, so as long as we can use that full bandwidth once these next Nvidia GPUs are out, I'm happy.

I think 4:2:0 isn't in any or many of the CTA or VESA standards, so I suppose there are chips that only work with those pre-defined resolutions for the various processors that exist within a TV.
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,683
Could just mean it is disabled but still shows a number. Or 10 could mean "no sharpening" and going below that is actually softening it artificially.

It is a mystery; when you do a DDC reset in Calman it always sets sharpening to 10 too.
Any test patterns I've tried show it is still sharpening at that point.
 

Deleted member 27551

User requested account closure
Banned
Oct 30, 2017
660
I'm thinking about getting a bigger TV. I currently have an LG B8 55 inch and it's perfect, but just too small.
I've got 4 options, including an LG B8 65 inch or a Sony AF8 65 inch. I'm very tempted to get an 82 inch Q60R, but does anyone know if the hit in picture quality would be worth it for those extra inches? I'm worried a bigger TV would suffer from DSE as well, and I can't stand that. Also seen a 75 inch LG 8600, but heard blacks are utter shit on IPS screens.
 

Sanctuary

Member
Oct 27, 2017
14,225
Also seen a 75 inch LG 8600, but heard blacks are utter shit on IPS screens.

They are, but at least for PC monitors, I find it to be an acceptable trade compared to how ugly TN looks in general, and the shimmering/glow that VA panels can sometimes get. When you are viewing games in full screen, I've found that blacks aren't really all that bad compared to the rest of the screen, unless the game is primarily dark in general.

Where the bad IPS blacks really are a factor IMO is when viewing letterboxed films. What makes IPS look comparatively worse than it used to is how good FALD (usually VA) screens have gotten over the years, and of course OLED. Spend a week on an OLED and even a moderately good plasma looks like crap in the black level department, and plasma already looked better than IPS. Since you already own a B8, I'm not sure you would be satisfied with anything less. Go for either the LG or Sony 65'' OLEDs. If screen size is really an important consideration, is it possible for you to sit a few feet closer?
 

Hasney

One Winged Slayer
The Fallen
Oct 25, 2017
18,621
I'm thinking about getting a bigger TV. I currently have an LG B8 55 inch and it's perfect, but just too small.
I've got 4 options, including an LG B8 65 inch or a Sony AF8 65 inch. I'm very tempted to get an 82 inch Q60R, but does anyone know if the hit in picture quality would be worth it for those extra inches? I'm worried a bigger TV would suffer from DSE as well, and I can't stand that. Also seen a 75 inch LG 8600, but heard blacks are utter shit on IPS screens.

Get the B8 out of the two OLEDs if you spend more time gaming as it has better input lag, but the A8F has slightly better processing in movies. It's such a toss up that I'd go with whatever the cheaper of the two is.

The q60r is fine, but as you mentioned the black levels, the lack of local dimming and the fact it's not got great HDR brightness according to RTings means it would be a significant downgrade to what you have. It's a VA panel, but weirdly enough, people started seeing IPS versions at the end of last year too, so if you do go this route, check model numbers carefully.
 

Deleted member 27551

User requested account closure
Banned
Oct 30, 2017
660
Get the B8 out of the two OLEDs if you spend more time gaming as it has better input lag, but the A8F has slightly better processing in movies. It's such a toss up that I'd go with whatever the cheaper of the two is.

The q60r is fine, but as you mentioned the black levels, the lack of local dimming and the fact it's not got great HDR brightness according to RTings means it would be a significant downgrade to what you have. It's a VA panel, but weirdly enough, people started seeing IPS versions at the end of last year too, so if you do go this route, check model numbers carefully.
Thanks for the replies. I sit as close as I can, which isn't that bad; I just think if I went to 65 it would be a lot more immersive for films.
I didn't know some Q60Rs are IPS. I've been looking at reviews on all the TVs and I think I can rule out the ones that are not OLED. The Sony OLED has a higher response time for gaming, though it is brighter for HDR and movies as mentioned. Really seems a toss of a coin between the OLEDs; I do game a lot but also watch a lot of movies. Really considering the Sony just for the better out-of-the-box accuracy and for films and sound. 30ms input lag compared to 13 is a big difference though. Ugh, more watching and reading reviews I think lol. Is there any reason apart from response time the AF8 would be a lot worse for gaming than a B9?
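(Just to put the 30ms vs 13ms figures in perspective at 60fps, this is simple frame-time arithmetic, not measured data:)

```python
# Converting input lag to frames at 60 fps (simple arithmetic, not measurements)
frame_time_ms = 1000 / 60              # ~16.7 ms per frame at 60 Hz
for lag_ms in (13, 30):
    print(f"{lag_ms} ms ≈ {lag_ms / frame_time_ms:.1f} frames at 60 fps")
```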

Edit: I used multi-quote so I replied to both of you properly, as I never usually use it.
 

Hasney

One Winged Slayer
The Fallen
Oct 25, 2017
18,621
Thanks for the replies. I sit as close as I can, which isn't that bad; I just think if I went to 65 it would be a lot more immersive for films.
I didn't know some Q60Rs are IPS. I've been looking at reviews on all the TVs and I think I can rule out the ones that are not OLED. The Sony OLED has a higher response time for gaming, though it is brighter for HDR and movies as mentioned. Really seems a toss of a coin between the OLEDs; I do game a lot but also watch a lot of movies. Really considering the Sony just for the better out-of-the-box accuracy and for films and sound. 30ms input lag compared to 13 is a big difference though. Ugh, more watching and reading reviews I think lol. Is there any reason apart from response time the AF8 would be a lot worse for gaming than a B9?

Nope, just the input lag. And while the Sony is better at processing, it's probably not something you'd notice much unless they were side by side. If the price difference between the Sony and the LG was enough to get a calibrator out, then I'd just use the difference on that.

You said B8 in the first post and B9 in this one. If it's a B9, just get that one for that sweet HDMI 2.1 bandwidth and features ready for next gen. That's the biggest difference if it is a 9.
 

Deleted member 27551

User requested account closure
Banned
Oct 30, 2017
660
Nope, just the input lag. And while the Sony is better at processing, it's probably not something you'd notice much unless they were side by side. If the price difference between the Sony and the LG was enough to get a calibrator out, then I'd just use the difference on that.

You said B8 in the first post and B9 in this one. If it's a B9, just get that one for that sweet HDMI 2.1 bandwidth and features ready for next gen. That's the biggest difference if it is a 9.
Sorry, must have typed it wrong; I currently have a B8 and meant I'm considering the B9. Yeah, I think it's going to be the B9 as my main use is gaming. Just need the extra size mainly. Anyway, thanks for the replies. I'm going to get the B9; if it's just like a B8 but with better features then it will be great.
 

gabdeg

Member
Oct 26, 2017
5,961
🐝
LG figures that most people will have it sitting on a desk while they game. Removing the components from the back would have also driven the cost up even more.

You could also consider the 55" GX model.
I just mean the stand with the cable management box. Something slimmer would have been a little more desk-friendly imo, so you could move it further back.
 

MazeHaze

Member
Nov 1, 2017
8,579
Lol. I know, I've been following everything. I was mainly joking about the myth part. It's an all-OLED party all the time up in here so I like to mess with it. I picked LCD for my TV last year and I've been happy with it. I'm always following the tech though, and we don't get a lot of burn-in posts in here from people who have it, so I'm always curious. It really didn't seem like he stressed it much and it happened. If you can't have your TV at a high brightness and enjoy it without having issues like that, it sucks.

I still want one though. I finally saw a scene at Best Buy on a Sony set that really made me take notice even though I'm fine with what I have.
Personally I still think there is panel lottery at play here. We've seen people get burn-in from things that are barely ever on the screen. And like, this poster got burn-in from Rocket League and Stardew Valley, but I've played hundreds of hours of Destiny 2 (max brightness in HDR) and Overwatch, which both have bright yellow static HUD elements, and seen nothing, and I use it as my PC monitor with about 6000 hours on it. Who knows what causes these variations; if I had to guess, maybe some hiccup in the pixel scrubber function, or voltage-related issues. Burn-in is definitely real and a problem, but the huge variance between users watching/playing the same content shows there is definitely more to the story.
 

SixelAlexiS

Member
Oct 27, 2017
7,729
Italy
It was from videogames. The big offender was Rocket League, though Stardew Valley also caused issues. [...]
So basically 300-400 hours of Rocket League caused burn-in on your OLED? Eh, and ppl say thousands of hours, sure... uff, the sad thing is that only the LG OLEDs have all the fancy things for next gen... as a VT60 plasma user I'm fu**ing tired of babysitting, I just want to play, but even the new Samsung doesn't have HDMI 2.1 and so on, jeez...
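(That 300-400 hour guess roughly lines up with the usage described earlier, about an hour a day over roughly a year of ownership; trivial arithmetic, assuming the stated usage pattern:)

```python
# Back-of-the-envelope cumulative exposure of the static HUD (assumed usage pattern)
hours_per_day = 1.0    # the poster's stated average Rocket League time
days = 365             # roughly a year of ownership before the burn-in showed
print(f"~{hours_per_day * days:.0f} hours of static HUD on screen")  # ~365 hours
```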
 

Gravemind IV

Member
Nov 26, 2017
1,949
I noticed an annoying bright green pixel on my display (Samsung UE49MU8000) the other day. Is this a stuck pixel? I thought it might have been a scratch due to it shifting colors depending on the background, seeing as I thought a stuck pixel always stays the same color (hence, stuck). I cleaned the area but it didn't seem damaged at all, thankfully.

I tried running jscreen amongst other things, but they haven't helped so far. Anyone know a good tip? (Aside from the ones where you need to 'massage' the pixel; anything involving me touching the TV is a last resort as I don't want to make it worse.)




Also, I noticed (and due to noticing I can't unsee it) these transparent columns on the sides of my TV. Now I know that my model is edge-lit, so I believe it could be that and I simply hadn't noticed before, but are there any settings tweaks I can do to make it less apparent?
 

Deleted member 35478

User-requested account closure
Banned
Dec 6, 2017
1,788
Lol. I know, I've been following everything. I was mainly joking about the myth part. It's an all-OLED party all the time up in here so I like to mess with it. I picked LCD for my TV last year and I've been happy with it. I'm always following the tech though, and we don't get a lot of burn-in posts in here from people who have it, so I'm always curious. It really didn't seem like he stressed it much and it happened. If you can't have your TV at a high brightness and enjoy it without having issues like that, it sucks.

I still want one though. I finally saw a scene at Best Buy on a Sony set that really made me take notice even though I'm fine with what I have.

At OLED Light 100 the output is like 350 nits, and the SDR standard is around 100 nits or so. If you have to view content at 3x the brightness, there's no point in purchasing an OLED; again, just get an LCD.

So basically 300-400 hours of Rocket League caused burn-in on your OLED? Eh, and ppl say thousands of hours, sure... uff, the sad thing is that only the LG OLEDs have all the fancy things for next gen... as a VT60 plasma user I'm fu**ing tired of babysitting, I just want to play, but even the new Samsung doesn't have HDMI 2.1 and so on, jeez...

He ran the tv in torch mode, 100 OLED light, max brightness. It is what it is.
 

laxu

Member
Nov 26, 2017
2,782
LG figures that most people will have it sitting on a desk while they game. Removing the components from the back would have also driven the cost up even more.

You could also consider the 55" GX model.

The LG stand is generally crappy IMO. It's mostly an oversized, badly designed cable organizer in the back. If you need more space, a wall mount or monitor arm is in order. I really hope they redesign the stand and move the inputs to a separate box à la Samsung.

Personally I still think there is panel lottery at play here. We've seen people get burn-in from things that are barely ever on the screen. And like, this poster got burn-in from Rocket League and Stardew Valley, but I've played hundreds of hours of Destiny 2 (max brightness in HDR) and Overwatch, which both have bright yellow static HUD elements, and seen nothing, and I use it as my PC monitor with about 6000 hours on it. Who knows what causes these variations; if I had to guess, maybe some hiccup in the pixel scrubber function, or voltage-related issues. Burn-in is definitely real and a problem, but the huge variance between users watching/playing the same content shows there is definitely more to the story.

The common theme with the user's Rocket League / Stardew Valley burn-in was that both games have plenty of red and yellow in the sections that burned in. I guess that's why after some years they made the red subpixel bigger on LG OLEDs.
 

Deleted member 35478

User-requested account closure
Banned
Dec 6, 2017
1,788
They are, but at least for PC monitors, I find it to be an acceptable trade compared to how ugly TN looks in general, and the shimmering/glow that VA panels can sometimes get. When you are viewing games in full screen, I've found that blacks aren't really all that bad compared to the rest of the screen, unless the game is primarily dark in general.

Where the bad IPS blacks really are a factor IMO is when viewing letterboxed films. What makes IPS look comparatively worse than it used to is how good FALD (usually VA) screens have gotten over the years, and of course OLED. Spend a week on an OLED and even a moderately good plasma looks like crap in the black level department, and plasma already looked better than IPS. Since you already own a B8, I'm not sure you would be satisfied with anything less. Go for either the LG or Sony 65'' OLEDs. If screen size is really an important consideration, is it possible for you to sit a few feet closer?

IPS TVs are complete trash. I have a 75" Sony 850D in the living room, edge-lit IPS, literally the worst combo. I cheaped out to have a larger screen in the living room; wish I'd spent more ($1k) for the 900F or whatever the FALD VA option was at the time. Only saving grace is the wide viewing angles with IPS, which come in handy in my living room. The TV is good for a bright room, the kids streaming Disney+ all day, or watching sports. Other than being a beater TV, it sucks, but for my living room it's good enough lol.