
Civilstrife

Member
Oct 27, 2017
2,286
As a comfy couch PC gamer, I was thrilled by Nvidia's recent announcement of VRR support for all compatible monitors.

It seemingly couldn't have come at a better time, as more and more TV manufacturers are announcing VRR in their sets.

But it was a splash of cold water to learn that VRR on Nvidia cards only works through DisplayPort, which is a complete non-starter in the TV space.

Are comfy couchers' only options for VRR to either get an AMD card or shell out for Nvidia's exorbitantly priced monstrous 65" "monitor"?

I know there's always that weird workaround with using a Ryzen IGPU in tandem with an Nvidia card or even dual GPUs, but that seems unreliable and spotty at best.

I'm a bit new to the space, so if there's any options I missed, please let me know!
 

Zojirushi

Member
Oct 26, 2017
3,297
Personally I'm just holding out for HDMI 2.1 TVs and either good AMD/Intel GPUs or Nvidia eventually caving.
 

x3sphere

Member
Oct 27, 2017
973
Probably didn't want to enable it since I don't think the G-Sync big format displays are even out yet

I didn't think they'd ever even enable VRR over DisplayPort, but now that they did, I think it's only a matter of time before they support it over HDMI too. Just seems very unlikely they will keep it locked to DP.
 

TitanicFall

Member
Nov 12, 2017
8,273
I think technically you can enable the option after the driver update even if your display isn't officially on the whitelist.
 

Zexen

Member
Oct 27, 2017
522
I don't understand why DisplayPort is not used more, given the advantages it has over HDMI. Financial gains I presume.

Weird that the AX800 I bought some years ago does have a DP, but the current high-end ones with HDR, VRR and all the bells and whistles don't.
 

SlothmanAllen

Banned
Oct 28, 2017
1,834
Well, I feel like the recent addition of support for non-G-SYNC monitors is the beginning of the end for G-SYNC. So I imagine eventually nVidia will support VRR on televisions.
 

tuxfool

Member
Oct 25, 2017
5,858
Driver update, as iirc the 10 and 20 series cards already have support for it.
This is incorrect.

Not even Turing has HDMI 2.1 support. So it is lacking support for the spec in hardware. The only alternative is for them to reimplement a proprietary VRR algorithm on top of HDMI 2.0, like AMD did with Freesync.

It also depends on how flexible their display hardware is, because Nvidia is known to use less flexible display hardware than AMD.

All of this is to say that it isn't a given that they have support for it, or that it only depends on software drivers.
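The hardware point above comes down to bandwidth: a rough back-of-envelope sketch (active pixels only, 24-bit RGB; real signalling adds blanking and encoding overhead, so actual requirements are higher) shows why 4K high-refresh VRR needs HDMI 2.1-class links rather than a driver update alone:

```python
# Illustrative only: compares the raw active-pixel data rate of two modes
# against HDMI 2.0's total 18 Gbps TMDS bandwidth. Blanking intervals and
# 8b/10b encoding overhead are ignored, which understates real requirements.
HDMI_2_0_MAX_GBPS = 18.0

def active_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Approximate data rate of the active picture, in Gbps."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

uhd60 = active_gbps(3840, 2160, 60)    # ~11.9 Gbps: fits within HDMI 2.0
uhd120 = active_gbps(3840, 2160, 120)  # ~23.9 Gbps: exceeds HDMI 2.0
print(f"4K60:  {uhd60:.1f} Gbps, 4K120: {uhd120:.1f} Gbps")
```

So even a card that could be granted HDMI 2.1 VRR in software would still be limited to refresh rates its HDMI 2.0 link can physically carry.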
 

Pargon

Member
Oct 27, 2017
12,014
I don't understand why DisplayPort is not used more, given the advantages it has over HDMI. Financial gains I presume.

Weird that the AX800 I bought some years ago does have a DP, but the current high-end ones with HDR, VRR and all the bells and whistles don't.
DisplayPort is awful if you need anything longer than ~2m. If I recall correctly, 3m is the maximum cable length within spec, but many certified cables had issues with 3440x1440@100Hz for me (same bandwidth as 4K60).
I spent a lot of money trying to get a 10m DisplayPort connection working and the only viable option seemed to be a fiber connection that was far out of budget. Meanwhile HDMI can do much longer runs with zero issues for far less money.
The AX800 is a rare exception which is the only television I can think of with a DisplayPort connection, and if I recall correctly it was added to support a 4K60 input before HDMI was capable of it.
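The "same bandwidth as 4K60" aside checks out with a quick calculation (a rough sketch; active pixels only, ignoring blanking overhead):

```python
# Illustrative comparison of the raw pixel throughput of the two modes.
def pixel_rate(width, height, refresh_hz):
    """Active pixels pushed per second for a given mode."""
    return width * height * refresh_hz

ultrawide = pixel_rate(3440, 1440, 100)  # 3440x1440 @ 100 Hz
uhd60 = pixel_rate(3840, 2160, 60)       # 3840x2160 @ 60 Hz
print(f"ultrawide: {ultrawide/1e6:.0f} Mpx/s, 4K60: {uhd60/1e6:.0f} Mpx/s")
```

The two come out within about half a percent of each other, so a cable that struggles with one will struggle with the other.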
 

Chaosblade

Resettlement Advisor
Member
Oct 25, 2017
6,596
I'd guess it's inevitable and will be supported over HDMI 2.1 once cards have that. But I don't expect them to port a solution back to cards without HDMI 2.1 even if it's technically possible, because that's not how they roll.
 

Zexen

Member
Oct 27, 2017
522
DisplayPort is awful if you need anything longer than ~2m. If I recall correctly, 3m is the maximum cable length within spec, but many certified cables had issues with 3440x1440@100Hz for me (same bandwidth as 4K60).
I spent a lot of money trying to get a 10m DisplayPort connection working and the only viable option seemed to be a fiber connection that was far out of budget. Meanwhile HDMI can do much longer runs with zero issues for far less money.
The AX800 is a rare exception which is the only television I can think of with a DisplayPort connection, and if I recall correctly it was added to support a 4K60 input before HDMI was capable of it.
Yea, I've encountered my share of problems with DP, but I still think that right now, it's a better tech than HDMI.

The TV supports 4K60 over both DP and HDMI, though only one HDMI port does (the 4th one).
 

Alo81

Member
Oct 27, 2017
548
When HDMI 2.1 cards come out, and there are HDMI 2.1 TVs in the wild, I would expect the two to work together exactly as you'd hope.
 

Alexious

Executive Editor for Games at Wccftech
Verified
Oct 26, 2017
909
As a comfy couch PC gamer, I was thrilled by Nvidia's recent announcement of VRR support for all compatible monitors.

It seemingly couldn't have come at a better time, as more and more TV manufacturers are announcing VRR in their sets.

But it was a splash of cold water to learn that VRR on Nvidia cards only works through DisplayPort, which is a complete non-starter in the TV space.

Are comfy couchers' only options for VRR to either get an AMD card or shell out for Nvidia's exorbitantly priced monstrous 65" "monitor"?

I know there's always that weird workaround with using a Ryzen IGPU in tandem with an Nvidia card or even dual GPUs, but that seems unreliable and spotty at best.

I'm a bit new to the space, so if there's any options I missed, please let me know!

DisplayPort isn't a complete non-starter in the TV space anymore, thanks to the 55" Alienware OLED showcased at CES this year.
Beyond that, NVIDIA said they might explore HDMI VRR. That's all we know.
 

dgrdsv

Member
Oct 25, 2017
11,879
NV will definitely support HDMI 2.1 VRR whether implemented on HDMI 2.1 or on HDMI 2.0 devices.

I think it's unlikely that NV will ever support Freesync-on-HDMI though which is what most TVs with VRR are using right now.
 

Serious Sam

Banned
Oct 27, 2017
4,354
This is straight from an NV employee (he's only a customer care rep, though). Unless Nvidia does an RTX line-up refresh with HDMI 2.1 soon, it will be a long time until we have any sort of AdaptiveSync over HDMI support from Nvidia. I think for now the dream is dead.

 

dgrdsv

Member
Oct 25, 2017
11,879
Unless Nvidia does RTX line-up refresh with HDMI 2.1 soon, it will be a long time until we have any sort of AdaptiveSync over HDMI support from Nvidia.
Not necessarily. A) There are no compliance tests to rate anything as HDMI 2.1 yet, which means that Turing might actually be HDMI 2.1 capable, just not rated as such so far. B) HDMI's VRR can be backported to HDMI 2.0, and it's possible that NV will be able to include it in their HDMI 2.0 cards (Pascals and Turings likely, maybe Maxwells too) with a s/w update.