The future is bypassing the AVR for video by using eARC, rather than passing higher-bandwidth signals through it.
RTINGS' SDR real-scene peak brightness test measured only 284 nits on a C6 OLED (2016), so it's not unreasonable for the display to be set that low.
That's because your old TV probably didn't have full-array local dimming.
My 2010 Sony TV does use FALD, and it's the same story there - contrast is reduced in game mode because the zones never fully turn off, and the dimming lags a frame behind the content.
But it has a panel with 5000:1 native contrast - still higher than many current FALD displays - so it doesn't look as bad in game mode as some of the newer models do. Its dimming algorithms also still seem to be a step above many of them.
I've been mulling over buying the cheapest LG 55B9 I can find - that way, if it does end up suffering from burn-in, I shouldn't be too torn up about it. And it will still provide better image quality than any LCD on the market, at least for dark-room viewing.
I wouldn't recommend buying a TV without VRR - especially if you plan on using it with consoles.
VRR makes motion look much smoother by eliminating tearing and stutter, and it really helps to reduce input lag.