Seeing this makes me think that great CS is being used as cover for below average build quality.
It's not an issue with build quality - it's just how that type of display is.
But they are absolutely being generous with out-of-warranty replacements of OLED panels to not kill its reputation.
Joke aside, and an honest question: is microLED supposed to be an OLED replacement? An LCD/LED replacement?
What are the benefits of this technology vs the current stuff?
What are its disadvantages?
Micro LED is the inorganic equivalent of OLED.
They're much brighter, more efficient, and have longer lifespans - so they should be significantly more resistant to burn-in than OLED.
Think of late-model direct-view CRTs. Outside of 24/7/365 use for things like airport displays, burn-in was essentially a solved problem by the mid-to-late '90s despite technically still being possible, because the longevity had been improved enough.
But affordable consumer displays are likely 5+ years away, unless there is a breakthrough.
Yeah, I can't say I advise it. Burn-in is the tip of the iceberg too - OLEDs have two kinds of brightness limiters that cannot be disabled, called ABL and ASBL. The first kind dims the screen if the TV determines you've been on a static screen too long, which is literally all day every day on a PC. (I've heard this can be disabled with a service remote, but c'mon.) I frequently have to alt-tab while writing an email or designing something to reset the brightness. The second is a contrast thing where the brightest peak dims depending on how much white content is on screen. So if I take a web browser and blow it up full screen, it'll look dimmer than if I display it windowed. You can't disable this one at all.
The Automatic Brightness Limiter (your second description) should be disabled on newer OLEDs if the brightness is set below 150 nits - and SDR is intended to be viewed at 100 nits.
It really shouldn't be an issue when used as a monitor, unless you have the display set extremely bright.
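To make the second limiter described above concrete, here's a toy sketch of how an ABL scales peak luminance with the fraction of the screen at full white (APL). The threshold and roll-off curve are illustrative assumptions, not any vendor's actual values:

```python
# Hypothetical sketch of an Automatic Brightness Limiter (ABL): peak
# luminance is scaled down as the Average Picture Level (APL) rises.
# All numbers here are made-up assumptions for illustration.

def abl_peak_nits(apl: float, max_nits: float = 800.0,
                  full_white_nits: float = 150.0) -> float:
    """Return the allowed peak luminance for a frame.

    apl: fraction of the screen at full white (0.0 to 1.0).
    Small bright highlights get full brightness; a mostly-white
    frame (e.g. a maximized browser window) gets power-limited.
    """
    if apl <= 0.1:  # small highlight window: no limiting
        return max_nits
    # linear roll-off from max_nits down to the full-screen-white limit
    t = (apl - 0.1) / 0.9
    return max_nits - t * (max_nits - full_white_nits)

# A windowed browser (lower APL) is allowed a brighter peak than the
# same page maximized, which matches the dimming effect described above.
windowed = abl_peak_nits(0.3)
fullscreen = abl_peak_nits(1.0)
```

This also illustrates the reply's point: if your target brightness is at or below the full-screen-white limit (150 nits in this sketch), the limiter never visibly kicks in.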
OLED burn-in is an issue when the wear on your screen isn't uniform - e.g. if you're excessively playing a certain game with a bright UI, or watching a channel with specific logos/banners much more than you watch other content.
Please stop trying to frame burn-in as "excessive" use.
It's accumulated wear. Playing a game for one hour and then watching one hour of TV every week for a year should be equivalent to 52 hours of a game in a single week and then 52 hours of TV the next, as far as burn-in is concerned.
The difference should be that the wear is far more gradual in the first scenario, so you wouldn't notice it as easily.
In the second scenario you might end up with mild burn from the game which then fades as you watch the TV content.
This is a hypothetical example though - I don't know how many hours it would take for burn-in to be noticeable.
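The equivalence argued above can be sketched as a toy wear model where each pixel's wear is simply the sum of hours times relative brightness, so it's order-independent. The numbers are made up for illustration; real emitter aging is non-linear:

```python
# Toy model of cumulative OLED wear for a single pixel: wear is the sum of
# (hours x relative brightness) displayed, regardless of the order of the
# sessions. Brightness values (1.0 for a static game UI, 0.5 for varied TV
# content) are illustrative assumptions.

def accumulate_wear(sessions):
    """sessions: list of (hours, relative_brightness) pairs."""
    return sum(hours * brightness for hours, brightness in sessions)

# One hour of game UI plus one hour of TV, every week for a year:
gradual = accumulate_wear([(1, 1.0), (1, 0.5)] * 52)

# 52 hours of the game in one week, then 52 hours of TV the next:
binge = accumulate_wear([(52, 1.0), (52, 0.5)])
```

In this model `gradual == binge`: the total wear is identical, and only the pacing (and hence how abruptly you'd notice it) differs, which is the point made above.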
The newer models are "better" about burn-in mostly because they are newer and people haven't had as much time to burn them in.
At the end of the day, your OLED screen will develop burn-in over time if the same static content is displayed on it for hundreds of hours. There's no way around it. It's just how the tech works.
I love my OLED but it lives in the media room for movies and streaming TV series. Video games and cable TV get played in the living room on an LCD. Better for bright environments anyway.
They have different panels and additional features to reduce it.
It's not just that people haven't spent as much time with them.
I do suspect, however, that these improvements will mostly buy you another year or two before the panel ends up in the same state as an older model.