
Pargon

Member
Oct 27, 2017
11,968
Hold on, so it's actually better not to have V-Sync enabled at the driver level, but rather inside the games?
In my experience the opposite is true. Far more games suffer from issues with their own V-Sync implementation than having it enabled in the NVIDIA Control Panel.
The most important thing is that you do have V-Sync on though; whether that's in the control panel or in the game. However:
  • Some games do not provide a V-Sync option.
  • The V-Sync implementation in some games is bad, and causes it to stutter (even with G-Sync).
  • Some games silently implement a frame rate limiter when V-Sync is enabled or disabled so you may need to disable the in-game option as a way of controlling that.
It is extremely rare for there to be problems caused by having V-Sync enabled in the driver in my experience, and far more common that I may wish to disable V-Sync in a game due to some issue that its implementation causes.
So it's a lot less effort to enable V-Sync globally, and make per-game exceptions where required, than set it on a per-game basis.

Similarly, NULL (NVIDIA Ultra Low Latency) is something which causes more problems than it fixes in my experience, so I would disable it globally (set low latency to "On" not "Ultra") and only enable it on a per-game basis where required; e.g. in games that do not have a frame rate limiter of their own, and where external tools like RTSS do not work.
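For anyone curious what an external limiter like RTSS is actually doing under the hood, the core idea is just a pacing loop. Here's a simplified Python sketch (RTSS itself is far more precise, using busy-waits and high-resolution timers); `render_frame` is a stand-in for the game's per-frame work:

```python
import time

def limited_loop(target_fps, render_frame, num_frames):
    """Minimal sleep-based frame limiter: render a frame, then wait
    until that frame's absolute deadline before starting the next.
    Using an absolute deadline (rather than sleeping a fixed amount)
    means one late frame doesn't push every later frame back."""
    frame_time = 1.0 / target_fps
    deadline = time.perf_counter()
    for _ in range(num_frames):
        render_frame()
        deadline += frame_time
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)

# Pacing 50 near-instant "frames" at 100 FPS takes roughly 0.5 s,
# instead of finishing almost immediately.
start = time.perf_counter()
limited_loop(100, lambda: None, 50)
elapsed = time.perf_counter() - start
```

Real limiters also have to deal with GPU queuing and presentation timing, which is why a naive loop like this paces worse than RTSS does.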
 

Skyfireblaze

Member
Oct 25, 2017
11,257
In my experience the opposite is true. Far more games suffer from issues with their own V-Sync implementation than having it enabled in the NVIDIA Control Panel.
The most important thing is that you do have V-Sync on though; whether that's in the control panel or in the game. However:
  • Some games do not provide a V-Sync option.
  • The V-Sync implementation in some games is bad, and causes it to stutter (even with G-Sync).
  • Some games silently implement a frame rate limiter when V-Sync is enabled or disabled so you may need to disable the in-game option as a way of controlling that.
It is extremely rare for there to be problems caused by having V-Sync enabled in the driver in my experience, and far more common that I may wish to disable V-Sync in a game due to some issue that its implementation causes.
So it's a lot less effort to enable V-Sync globally, and make per-game exceptions where required, than set it on a per-game basis.

Similarly, NULL is something which causes more problems than it fixes in my experience, so I would disable it globally (set low latency to "On" not "Ultra") and only enable it on a per-game basis where required; e.g. in games that do not have a frame rate limiter of their own, and other external tools like RTSS do not work.

Okay, thanks for the clarification. Yeah, that was mostly my position too, so I was surprised to see the video saying the opposite.
 

Pargon

Member
Oct 27, 2017
11,968
Okay, thanks for the clarification. Yeah, that was mostly my position too, so I was surprised to see the video saying the opposite.
I think it's more that Chris likes to tweak everything so he personally leaves it up to the game, and would probably enable it on a game profile if the game's own implementation has issues.
I'd rather use a solution that works for most games by default, with minimal intervention required, than one which requires every game to be configured individually.
 

Brazil

Actual Brazilian
Member
Oct 24, 2017
18,392
São Paulo, Brazil
G-Sync monitors are so absurdly expensive here :/

Are GTX 970 cards even compatible with FreeSync? I keep getting mixed messages from Google searches.
 

Skyfireblaze

Member
Oct 25, 2017
11,257
I think it's more that Chris likes to tweak everything so he personally leaves it up to the game, and would probably enable it on a game profile if the game's own implementation has issues.
I'd rather use a solution that works for most games by default, with minimal intervention required, than one which requires every game to be configured individually.

Okay, that makes sense. After the previous discussions we had, I've followed the same approach, and honestly it's been smooth (literally!) sailing for me ever since.
 

Foxashel

Banned
Jul 18, 2019
710
Is the quality control / light bleed on a lot of these monitors still garbage? I went through 4 Acer monitors on Amazon until I found one worth keeping. What a nightmare. At $700+, that's unacceptable.
 

Solus

Member
Oct 26, 2017
304
I'm actually thinking of buying the LG screen in the OP. For 300-350 it seems like a decent screen and would probably make a nice upgrade to my current one, which is a 10 year old 27 inch Samsung 1080p TN.
Upgrading to 32 inch, 144hz, freesync and a VA screen should be nice. I think I would prefer VA over IPS because black levels matter a lot to me. In fact if I could forego all the other options and pick an OLED, I would, but unfortunately that doesn't seem to be an option. I'll probably get one of those C9 screens in a year or two, or whatever the latest model is called by then.
That said, I'd rather double check with you guys first. I'm not an expert and there are no stores here where I can properly see and compare screens for myself.
So what do you guys think? Should I go with the 32GK650F or are there better screens to choose from? Or should I hold out and see what 2020 has to offer?
Thanks in advance!
 

Nekrono

Member
May 17, 2018
563
In my experience the opposite is true. Far more games suffer from issues with their own V-Sync implementation than having it enabled in the NVIDIA Control Panel.
The most important thing is that you do have V-Sync on though; whether that's in the control panel or in the game. However:
  • Some games do not provide a V-Sync option.
  • The V-Sync implementation in some games is bad, and causes it to stutter (even with G-Sync).
  • Some games silently implement a frame rate limiter when V-Sync is enabled or disabled so you may need to disable the in-game option as a way of controlling that.
It is extremely rare for there to be problems caused by having V-Sync enabled in the driver in my experience, and far more common that I may wish to disable V-Sync in a game due to some issue that its implementation causes.
So it's a lot less effort to enable V-Sync globally, and make per-game exceptions where required, than set it on a per-game basis.

Similarly, NULL is something which causes more problems than it fixes in my experience, so I would disable it globally (set low latency to "On" not "Ultra") and only enable it on a per-game basis where required; e.g. in games that do not have a frame rate limiter of their own, and other external tools like RTSS do not work.
Okay, thanks for the clarification. Yeah, that was mostly my position too, so I was surprised to see the video saying the opposite.
Correct, the video stated that V-Sync in some games might trigger other engine optimizations that make it work better, but it can also have the opposite effect depending on the engine/developer (bad frame pacing, etc.), so it seems it might be better to enable V-Sync in the NVCP globally just to be on the safe side.
 

Skyfireblaze

Member
Oct 25, 2017
11,257
Correct, the video stated that V-Sync in some games might trigger other engine optimizations that make it work better, but it can also have the opposite effect depending on the engine/developer (bad frame pacing, etc.), so it seems it might be better to enable V-Sync in the NVCP globally just to be on the safe side.

Yeah I'll keep it that way unless it's specifically mentioned that a game works better the other way around.
 

Mr.Deadshot

Member
Oct 27, 2017
20,285
Any good deals for a PC monitor on amazon.de? Currently running an old SyncMaster 24" 1080p with a GeForce 980 Ti. I am not interested in 4K, but a bigger, better monitor might be a consideration.

I swear I can't find shit this Black Friday, lol.
 

low-G

Member
Oct 25, 2017
8,144
Got my first FreeSync monitor a few days ago. The biggest surprise is classic games like Unreal and Duke 3D, which suddenly run much smoother. They have terrible hitching with V-Sync at 60Hz, but they're smooth like they used to be many years ago with FreeSync and higher refresh rates! Did not expect that.

It also really does improve latency for a lot of games - low spec and high.

However, RetroArch, even properly tweaked, didn't really feel much different. My old monitor was very low latency for 60Hz, and of course I was tweaking the hell out of latency settings. But I don't perceive any difference from what was probably ~20ms latency to ~5ms. I'd like to compare it to a CRT on real hardware to see how much more latency a real SNES has. I bet I would feel the jump from ~5ms latency to ~40ms on a real SNES and CRT.
 

Gitaroo

Member
Nov 3, 2017
7,967
Does G-Sync make 40fps feel the same as 60fps? What's the game-changing thing about this compared to triple buffering?
 

dadjumper

Member
Oct 27, 2017
1,932
New Zealand
Got my first FreeSync monitor a few days ago. The biggest surprise is classic games like Unreal and Duke 3D, which suddenly run much smoother. They have terrible hitching with V-Sync at 60Hz, but they're smooth like they used to be many years ago with FreeSync and higher refresh rates! Did not expect that.

It also really does improve latency for a lot of games - low spec and high.

However, RetroArch, even properly tweaked, didn't really feel much different. My old monitor was very low latency for 60Hz, and of course I was tweaking the hell out of latency settings. But I don't perceive any difference from what was probably ~20ms latency to ~5ms. I'd like to compare it to a CRT on real hardware to see how much more latency a real SNES has. I bet I would feel the jump from ~5ms latency to ~40ms on a real SNES and CRT.
Are you using the rollback option in Retroarch? I found that that helps latency a whole lot. In general I think going from my 60hz TV to my 144hz monitor for retroarch feels way better, but maybe the input lag in general on my other TV was higher.

Does G-Sync make 40fps feel the same as 60fps? What's the game-changing thing about this compared to triple buffering?
It doesn't do that, but it makes things feel much smoother when the framerate fluctuates. When I'm playing a game that can't consistently hit 60, it definitely doesn't feel like a consistent 60, but it feels like a consistent something, like the game isn't chugging or speeding up intermittently.
At least, that's my experience after a couple of days.
 

Xiaomi

Member
Oct 25, 2017
7,237
Does G-Sync make 40fps feel the same as 60fps? What's the game-changing thing about this compared to triple buffering?

In my experience it's hard to tell the difference between 55-60 fps with G-Sync, but anything below that is still noticeably choppy, just not as bad, and without the tearing it would have without G-Sync. It's good for games like RDR2, where I can get 1440p at mostly ultra settings up to the mid-50s with my 2080, but not quite 60. Anything over 70 fps or so feels great, and anything over 90 or so is flawless to me.
 

Pargon

Member
Oct 27, 2017
11,968
Does G-Sync make 40fps feel the same as 60fps? What's the game-changing thing about this compared to triple buffering?
No, 40 FPS feels like 40 FPS.
G-Sync only removes the extra stuttering you'd get on a display which is not running at some multiple of 40. It doesn't increase the frame rate.

The main benefit of G-Sync is when you want to go above 60 FPS in my opinion.
Rather than having to run a game at either 60 or 120 for it to be smooth on a 120Hz display, it would now be smooth at anything in-between that range; e.g. 90 FPS.
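The judder being described falls straight out of the arithmetic. Here's a hypothetical Python sketch of it (exact fractions used to avoid float edge cases): with fixed-refresh V-Sync, a frame stays on screen for a whole number of refresh cycles, so 90 FPS on a 120 Hz panel alternates between ~8.3 ms and ~16.7 ms frame times, whereas with G-Sync every frame can simply last 1000/90 ≈ 11.1 ms:

```python
from fractions import Fraction

def vsync_frame_times(fps, hz, frames):
    """Time on screen (ms) for each frame on a fixed-refresh display:
    a frame is scanned out at the first vblank at or after it is
    ready, so frame times are always whole multiples of the refresh
    period."""
    refresh = Fraction(1000, hz)  # ms per refresh cycle
    frame = Fraction(1000, fps)   # ms per rendered frame
    times, ready, vblank, shown = [], Fraction(0), Fraction(0), Fraction(0)
    for _ in range(frames):
        ready += frame  # moment this frame finishes rendering
        while vblank < ready:
            vblank += refresh  # wait for the next vblank
        times.append(vblank - shown)
        shown = vblank
    return times

fixed = [round(float(t), 1) for t in vsync_frame_times(90, 120, 6)]
# 90 FPS on a 120 Hz panel without VRR: frame times alternate
# between 16.7 ms and 8.3 ms -> visible judder.
# fixed == [16.7, 8.3, 8.3, 16.7, 8.3, 8.3]
# With G-Sync/VRR, every frame would instead last 1000/90 ≈ 11.1 ms.
```

The same arithmetic explains why 40 FPS stutters on a 60Hz panel (alternating 16.7/33.3 ms) but not on a VRR display (a steady 25 ms).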

However, RetroArch, even properly tweaked, didn't really feel much different. My old monitor was very low latency for 60Hz, and of course I was tweaking the hell out of latency settings. But I don't perceive any difference from what was probably ~20ms latency to ~5ms. I'd like to compare it to a CRT on real hardware to see how much more latency a real SNES has. I bet I would feel the jump from ~5ms latency to ~40ms on a real SNES and CRT.
I think runahead has done so much to improve latency in emulation that those benefits are less noticeable now - at least in games where there's a couple of frames you can shave off.
That said, you can still combine G-Sync with runahead to reduce latency further.
G-Sync removes latency from the display end of the equation, while runahead removes latency from the game.

But make sure that you have RetroArch configured correctly: it has a "sync to exact content frame rate" option you should enable.
This is required for it to work correctly with VRR displays, and can make a big difference with arcade games such as R-Type which run at 55Hz rather than the typical 60 (curiously my monitor reports a fluctuating 52-55 FPS when it's running, but the image is perfectly smooth).
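For reference, those options live in retroarch.cfg as well as in the menus. To the best of my recollection the keys look like this (names from memory, so double-check them under Settings → Frame Throttle and Settings → Latency before relying on them):

```
# "Sync to Exact Content Framerate" — needed for VRR/G-Sync displays
vrr_runloop_enable = "true"

# Runahead latency reduction
run_ahead_enabled = "true"
run_ahead_frames = "1"
run_ahead_secondary_instance = "true"
```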
 

low-G

Member
Oct 25, 2017
8,144
Are you using the rollback option in Retroarch? I found that that helps latency a whole lot. In general I think going from my 60hz TV to my 144hz monitor for retroarch feels way better, but maybe the input lag in general on my other TV was higher.
I think runahead has done so much to improve latency in emulation that those benefits are less noticeable now - at least in games where there's a couple of frames you can shave off.
That said, you can still combine G-Sync with runahead to reduce latency further.
G-Sync removes latency from the display end of the equation, while runahead removes latency from the game.

But make sure that you have RetroArch configured correctly: it has a "sync to exact content frame rate" option you should enable.
This is required for it to work correctly with VRR displays, and can make a big difference with arcade games such as R-Type which run at 55Hz rather than the typical 60 (curiously my monitor reports a fluctuating 52-55 FPS when it's running, but the image is perfectly smooth).

Yeah, I've used runahead both before and after the change.

But yeah there is a setting under framerate throttling (not where I'd expect it to be) that allows VRR support. Retroarch was doing really poorly for me before I enabled that.
 

OneBadMutha

Member
Nov 2, 2017
6,059
Do Freesync monitors do both freesync and Gsync? I just bought an Nvidia gaming laptop yet also use my X on it. Will be using next gen AMD consoles on it as well.
 

Theswweet

RPG Site
Verified
Oct 25, 2017
6,397
California
Do Freesync monitors do both freesync and Gsync? I just bought an Nvidia gaming laptop yet also use my X on it. Will be using next gen AMD consoles on it as well.

Your nVidia laptop should support FreeSync over HDMI with the latest drivers, assuming the monitor supports HDMI 2.0b or later. Get a FreeSync display with that and you should be good to go.
 

Pargon

Member
Oct 27, 2017
11,968
Your nVidia laptop should support FreeSync over HDMI with the latest drivers, assuming the monitor supports HDMI 2.0b or later. Get a FreeSync display with that and you should be good to go.
Does it? I thought NVIDIA still only supported G-Sync and VESA Adaptive-Sync via DisplayPort (which AMD brands as "FreeSync") and HDMI-VRR.
FreeSync-over-HDMI is a different standard from HDMI-VRR. The latter was introduced as part of the HDMI 2.1 spec and back-ported to 2.0b, while the former is a proprietary AMD extension.
 

Theswweet

RPG Site
Verified
Oct 25, 2017
6,397
California
Does it? I thought NVIDIA still only supported G-Sync and VESA Adaptive-Sync via DisplayPort (which AMD brands as "FreeSync") and HDMI-VRR.
FreeSync-over-HDMI is a different standard from HDMI-VRR. The latter was introduced as part of the HDMI 2.1 spec and back-ported to 2.0b, while the former is a proprietary AMD extension.

It's a new development, but yes: if you have a Turing card, FreeSync will work, assuming the connection is at least HDMI 2.0b.
 

Yogi

Banned
Nov 10, 2019
1,806
^ Beautiful monitor, but I felt the IPS made fast motion a bit mushy compared to high-refresh TN. I still want it, but just bear that in mind. It's an absolutely beautiful display though. The only other drawback is the heavy gamer aesthetics.

I was about to buy it but the price hasn't dropped for the IPS version in the UK. £350 is for the TN panel on amazon uk.
 

Yappa

Member
Oct 25, 2017
6,470
Hamburg/Germany
So I currently have both the LG 34GK950F and the LG 27GL850 at home for testing, and it's between those two that I'll decide on my first gaming monitor with FreeSync/G-Sync. I'll continue to use my old Dell 24" 1920x1200/60 Hz screen as part of a multi-monitor setup, so either 34" + 24" or 27" + 24".
This past week I mainly used the 34" Ultra wide screen and I like the extra desktop space and gaming was fun with still mostly 90+ fps at native 3440x1440 res playing Modern Warfare using a RTX 2080.
Then I switched over to the 27" and while I miss the extra desktop space, I can now almost reach 144 fps at native 2560x1440 res which is pretty nice too.
What I haven't thought about testing yet (I'll get to it next) is simply gaming at 2560x1440 on the 34GK950F with black bars on both sides. Wouldn't that be the best solution to get the "best of both worlds"? Apart from simply lowering the graphics settings, of course...