It sounds like this will be 1440p120 or 4K60 on current GPUs. Since it's mid-to-late cycle for these displays and GPUs, I'll be waiting for next year's TVs and GPUs before upgrading.
Hopefully next year's OLEDs will include the improved motion handling which was demoed and then cancelled at the last minute. And if I'm allowed to dream, they will have figured out a way to combine BFI with VRR, like ASUS have with their new ELMB-Sync monitors. ≤1ms MPRT combined with VRR is the next big thing for displays.
Super noob question, but I always thought G-Sync was mostly useful for preventing tearing/stuttering at variable frame rates (which usually occur when the GPU is running at the limit of its capabilities).
It has been a few years since I've played on a console, but is screen tearing ever actually an issue on console? I didn't think it was.
The #1 benefit of VRR technologies is that it eliminates stuttering with variable frame rates, since the refresh rate is now synchronized to the frame rate.
A distant #2 and #3 are that it can reduce the latency introduced by double/triple buffering by 2–3 frames, and that it does so without screen tearing.
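To illustrate benefit #1, here's a rough sketch (the frame times, function names, and 60 Hz figure are my own illustrative assumptions, not from any driver) of why a fixed refresh rate turns small frame-time variation into uneven presentation, while VRR does not:

```python
# Sketch: frame presentation with a fixed 60 Hz refresh vs. VRR.
import math

REFRESH_HZ = 60
REFRESH_MS = 1000 / REFRESH_HZ  # ~16.67 ms per scanout

# A GPU rendering at roughly 60 fps, with normal frame-time variation.
frame_times_ms = [15.0, 18.0, 16.0, 20.0, 15.5]

def fixed_refresh_display_times(frame_times):
    """With vsync on a fixed refresh, a frame is shown at the next refresh
    boundary after it finishes rendering: a slightly late frame waits a
    whole extra refresh interval, which the viewer sees as a stutter."""
    shown, t = [], 0.0
    for ft in frame_times:
        t += ft
        shown.append(math.ceil(t / REFRESH_MS) * REFRESH_MS)
    return shown

def vrr_display_times(frame_times):
    """With VRR, the display refreshes when the frame is ready (within the
    panel's supported range), so presentation tracks render time exactly."""
    shown, t = [], 0.0
    for ft in frame_times:
        t += ft
        shown.append(t)
    return shown

fixed = fixed_refresh_display_times(frame_times_ms)
vrr = vrr_display_times(frame_times_ms)

# Frame-to-frame presentation gaps: the 20 ms frame misses its refresh
# window on the fixed display and shows up as a ~33 ms hitch, while the
# VRR gaps simply match the render times.
print("fixed refresh gaps:", [round(b - a, 2) for a, b in zip(fixed, fixed[1:])])
print("VRR gaps:          ", [round(b - a, 2) for a, b in zip(vrr, vrr[1:])])
```

The same model shows the latency point: with fixed refresh, buffered frames queue up waiting for scanout boundaries, whereas with VRR each frame is presented as soon as it is ready.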
So..."Switch Pro"? (Though I have yet to hear about Tegra processors supporting G-Sync)
I said it from the start, but the Switch should have included a G-Sync display.
It would provide a significant improvement for playing in handheld mode where performance is often inconsistent, and would have been amazing for NVIDIA to get G-Sync in that many people's hands.
I wonder if Sony OLEDs will have this tech? I thought they were rebranded LG TVs?
Anyway, do you guys think my A1E will still be good for the PS5? The FreeSync stuff is awesome, but it's just a nice perk at the end of the day. It should still be a good panel for games/movies.
They buy panels from LG and use their own processing/design. They are not "rebrands".
Unfortunately, while Sony arguably have the best image processing in the business (or Panasonic, if you value accuracy slightly more than picture enhancements), both have been falling behind when it comes to TV features. Sony's Smart TV OS is Android-based and really bad compared to LG's WebOS.
Panasonic have a number of features specific to their TVs which seem to be in practically all of their displays that other manufacturers overlook. I love using their "Hotel Mode" to always start the TV at a fixed volume and on a specified input, for example.
And Sony's image processing has yet to be matched in my experience (smoothest gradients in any source).
But LG is at the top of my list for new TVs right now due to how much better their OS and the support for their TVs is. Samsung seem like a close second now, but they don't sell OLEDs.
Are you telling me consoles with AMD GPUs are able to take advantage of a G-Sync display?
This is not G-Sync, it is "G-Sync Compatible".
That means it's using FreeSync (VESA Adaptive-Sync) or HDMI-VRR, but support has been certified by NVIDIA.
The Xbox One X already supports both FreeSync and HDMI-VRR. The PS4 doesn't seem to have the hardware required for support (or Sony does not care to add it), but next-gen everything should be using it.
Am I the only one who can't see the benefits of G-Sync/FreeSync in actual games?
IMO, higher refresh rates should be prioritized on these TVs.
They're already 120Hz.
As for not seeing the benefits of G-Sync/FreeSync, there are two possibilities:
- You don't notice the stuttering in games caused by variable frame rates.
- You're using G-Sync in windowed mode and not full-screen mode.
Windowed-Mode G-Sync produces bad results compared to Full-Screen G-Sync. It's not properly smooth and stutter-free, and should be disabled.
If you want "Windowed" G-Sync, you should use Kaldaien's SpecialK tool to force the game into DirectFlip mode, or hope that Windows 10's Full-Screen Optimizations are active in the game's Full-Screen mode.
I feel you; I have a 1080 Ti. I see no reason why they can't update those in the future too, besides greed.
It's probably a hardware limitation.
AMD were able to implement FreeSync, and then FreeSync-over-HDMI, on much older products than NVIDIA were with G-Sync, and now FreeSync / FreeSync-over-HDMI / HDMI-VRR.
If I recall correctly, it was something like AMD being an early adopter of DisplayPort 1.2a vs NVIDIA staying at DisplayPort 1.2 for much longer. Until VRR displays, that difference didn't really mean anything.
It wouldn't surprise me if there's a similar limitation with NVIDIA's HDMI implementations.
Doesn't work at all right now, AFAIK.
This is why I am so frustrated at the way eARC is implemented in HDMI 2.1.
eARC seems to require eARC-capable devices on both ends.
What eARC should have been is a dedicated full-bandwidth output on the display that works with any existing HDMI audio device. Even HDMI 1.0 supported 8-channel LPCM, though realistically you'd probably not want to use anything older than HDMI 1.3 (TrueHD/DTS-HD support).
That way, buying an HDMI 2.1 display would have upgraded the audio with any device which is currently hooked up via ARC or S/PDIF.
The way HDMI should always have worked is that you plug your sources directly into the TV, which then outputs audio to the receiver.
Receivers should never have been video switchers or processing devices to begin with.
As it is now, I'm hoping there will be inexpensive adapters to connect regular HDMI audio devices up to displays with eARC support.
There's no reason that anyone with an HDMI 2.0 audio device should have to replace it with an HDMI 2.1 device just for improved video pass-through.
I guess I will wait for the driver to clarify this. Is it bringing generic VRR over HDMI (like generic VRR over DisplayPort)? Or is it literally just for LG TVs?
I would obviously prefer the former, as it allows you to try to get your VRR display working with the NVIDIA driver, which generally succeeds.
They said it may work with other displays but is only certified for these OLEDs right now.
The question is whether this is only HDMI-VRR support, or also FreeSync-over-HDMI, as I believe the specs are slightly different and are not compatible with one another. Hopefully it's not just HDMI-VRR, and FreeSync-over-HDMI monitors will also be supported by this update.
No idea why it needs to be RTX or GTX 16 series only at this point (like integer scaling), but damn, I am happy about what ThereAreFourNaan just highlighted ^_^
My understanding is that scaling is provided by fixed-function hardware in everything prior to Turing, which is why it is not supported on older GPUs.
I'm very glad that my old topic on the GeForce forums actually did result in a new feature being implemented, though I'm unable to actually try it out since I have a 1070.
I have since found that my preference is to use integer scaling to the nearest integer (or +1) and then linear scaling to fit the display. This retains most of the sharpness of "true" integer scaling, but without the black bars when it's not a perfect fit.
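A minimal sketch of that hybrid approach (the function name and the resolution examples are my own, not any driver's API): pick a whole-number factor for a nearest-neighbour pre-scale pass, then let a linear resize cover the remaining fraction so the image fills the screen exactly.

```python
# Sketch: choose the integer pre-scale factor for hybrid
# integer + linear scaling.
import math

def hybrid_scale_factor(src: int, dst: int, round_up: bool = False) -> int:
    """Return the integer factor for the nearest-neighbour pass:
    nearest to dst/src by default, or the next integer up when
    round_up is set (the "+1" variant)."""
    exact = dst / src
    factor = math.ceil(exact) if round_up else round(exact)
    return max(1, factor)

# 1080p source on a 1440p panel (1.33x): nearest integer is 1, so the
# integer pass is a no-op and linear scaling does all the work.
print(hybrid_scale_factor(1080, 1440))
# The "+1" variant pre-scales 1080 -> 2160 lines with sharp 2x
# nearest-neighbour, then shrinks linearly to 1440 to fit.
print(hybrid_scale_factor(1080, 1440, round_up=True))
# 720p on a 2160p panel is an exact 3x fit: pure integer scaling,
# and the follow-up linear pass becomes a no-op.
print(hybrid_scale_factor(720, 2160))
```

The design choice here matches the comment above: the integer pass preserves pixel sharpness, and the final linear pass only has to bridge a fractional remainder, so its blurring is minimal compared to linearly scaling the raw source.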