Has anyone been able to successfully eliminate FreeSync flickering on a 144Hz monitor running a Nvidia graphics card? I'm currently wondering If I should just turn it off for good :(
Sorry to hear that. Didn't even know that was an issue. I'm using a Dell FreeSync monitor with an Nvidia GPU successfully. All I did was turn V-Sync on globally in the Nvidia Control Panel, and obviously enable G-Sync. Which monitor are you using? By the way, are you using HDMI or DisplayPort? My understanding was that you must use DisplayPort for FreeSync to work on Nvidia cards (someone correct me if I'm wrong). And get a certified cable. The Best Buy Insignia DisplayPort cable works great.
This is dumb, just take a bunch of pictures of connections and plugs, upload to the gallery, drop the link.
Pictures of how you plugged everything from multiple angles. Will only take a few minutes.
My understanding was that you must use displayport for freesync to work on nvidia cards (someone correct me if I'm wrong). And get a certified cable. The best buy insignia displayport cable works great.
I bought two identical 1440p/144Hz monitors recently, with one hooked up through HDMI and the other through DisplayPort. And I'm only getting the G-Sync option on the monitor that is connected via the DisplayPort cable. I didn't know why, but if what you wrote is true then that explains it.
Another thing of note: despite all the settings being identical, my screen connected via HDMI seems a bit brighter than the one with DP. I'm liking the one with HDMI more. Anyone know if they look a little different because of the different cables I used, or does this tend to happen even with identical monitors?
Oh. Turn V-Sync on? I thought you were supposed to have it off so it didn't conflict with G-Sync/FreeSync. I'll try that next.
My monitor is an LG 34UC79G. I'm using DisplayPort through the cable provided with the monitor itself, which I would imagine has to be certified, haha.
I've tried following this tutorial, but it didn't seem to change much. https://www.reddit.com/r/nvidia/comments/agcj4a/how_to_eliminate_flickering_on_gsyncfreesync/
Did you have to meddle with stuff like CRU to get FreeSync to work on your monitor?
That's normal on Zen 2. As for the fan speed, you need to increase the speed up/down intervals or whatever they're called in the bios, so the fan doesn't speed up because the cpu reported 70C for a split second.
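A rough sketch of what those ramp-up/ramp-down delay settings are doing (the function name, RPM values, and sample counts here are invented for illustration, not actual BIOS behavior): the fan target only changes after the temperature has stayed on the other side of the threshold for the whole delay window, so a single split-second 70C boost spike gets ignored.

```python
# Toy model of a BIOS fan "step up/step down time" setting: the fan
# speed only changes after the temperature has disagreed with the
# current speed for `delay_samples` consecutive readings, so brief
# Zen 2 boost spikes (e.g. one 70C reading) don't spin the fan up.
def fan_speed_trace(temps, threshold=65, low_rpm=800, high_rpm=1500, delay_samples=3):
    rpm = low_rpm
    streak = 0
    trace = []
    for t in temps:
        wants_high = t >= threshold
        currently_high = rpm == high_rpm
        if wants_high != currently_high:
            streak += 1
            if streak >= delay_samples:
                rpm = high_rpm if wants_high else low_rpm
                streak = 0
        else:
            streak = 0
        trace.append(rpm)
    return trace

# One isolated 70C spike is ignored; a sustained load ramps the fan up.
print(fan_speed_trace([50, 70, 50, 50, 70, 70, 70, 70]))
```

Increasing `delay_samples` corresponds to raising the step-up/step-down interval in the BIOS: the spike has to persist longer before the fan reacts.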
So, should I just ignore Ray tracing for now and just upgrade again when it becomes more supported?
Edit: If this is the case, I may even scale back for this build to a RX590/580 and wait it out.
...if I'm being honest with myself, I'm having a hard time choosing between a RX 5700 XT and a RTX 2060 Super. Personal pros for the Nvidia are RT and NVENC, but the 5700 XT is more powerful.
OK, I got a question guys. When I first enabled G-Sync in the Nvidia Control Panel, it eliminated screen tearing in a game called Ultra Street Fighter 4. But when I tried again today the screen tearing was back. I disabled and re-enabled it but nothing changed. For the record, I am using a monitor that supports FreeSync and not G-Sync though.
I saw people talking about the Vsync option so I checked it. There was no option for "adaptive", just on, off, fast, and "Use the 3D Application setting". I turned it on and it got rid of the screen tearing but also introduced noticeable input lag. And that input lag is why I never turn Vsync on within the game.
But when I turn on the Gsync indicator it does say Gsync is on when I open USF4 despite the fact that there is still very noticeable screen tearing. Any idea what I can do about this?
EDIT: OK so when I tried the "Fast" setting it eliminated screen tearing and didn't give me any noticeable input lag either. Did this option even require Gsync? It seems like something that works independently from Gsync so I was just wondering if it even mattered that I enabled it?
Adaptive sync has a range, as described in that PC World article. Could the low end of the range be the issue with yours? I'm surprised USF4 is not locked at 60 FPS. Apparently there's a smooth vs. fixed option in SF4? Set it to fixed and then see.
My very limited understanding of "fast" in the vsync options is that it's for when your FPS > monitor refresh rate (like 280 fps on a 144hz monitor). So it selects the best frames to display, something like that.
For G-Sync you need to cap the framerate at 2-3 frames below your max refresh rate (141-142 on a 144Hz panel, for example); if it goes over that cap the image will tear.
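That cap rule is simple arithmetic; here's a minimal sketch of it (the 3-frame margin is the rule of thumb from the comment above, not an official NVIDIA number, and the function names are made up):

```python
# Rule of thumb from the comment above: cap FPS a few frames below the
# panel's max refresh so frametimes never fall outside the VRR window.
def gsync_frame_cap(refresh_hz, margin=3):
    return refresh_hz - margin

def in_vrr_window(fps, vrr_min, refresh_hz, margin=3):
    # Below vrr_min the monitor drops out of its adaptive range (unless
    # LFC kicks in); above the cap, frames outrun the refresh and tear.
    return vrr_min <= fps <= gsync_frame_cap(refresh_hz, margin)

print(gsync_frame_cap(144))         # 141
print(in_vrr_window(200, 48, 144))  # False: over the cap -> tearing
```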
Out of curiosity, would you guys say that more than 8GB of RAM is needed? I see 16GB is recommended, and I guess it's a good idea to get it now that the prices are reasonable. Maybe it's just that I usually don't have all that many programs open (and never an intensive game and another program). I have 8GB of DDR4 2666MHz, which I got when the prices were awful. Is there any way to check if I actually need more? Should I just check the usage while gaming, for example?
Yeah, definitely upgrade to 16GB if you are able to. 8GB of RAM is pretty low nowadays. You're starting to see 16GB become the norm for recommended PC specs, and 8GB is usually quite low. Also, what DDR are you running?
I myself just upgraded to 64gb of RAM but I'm a game developer who works on UE4. I used to have 32gb of RAM but UE4 eats up RAM like it's candy and I was seeing it being maxed out easily.
Yeah, I would check the usage while gaming to see how much is being used. I use CAM, a free monitoring tool made by NZXT. Also bear in mind Chrome alone can use 3-5GB of RAM depending on what addons you have, and Windows 10 uses up some too, so that's around 6GB already used, leaving 2GB for games if you're running Chrome in the background.
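If you'd rather not install monitoring software, you can get a rough number from the OS itself. This sketch parses Linux `/proc/meminfo`-style text (on Windows you'd use Task Manager or a tool like CAM instead); the sample data is hard-coded so the snippet is self-contained, and the numbers are invented for illustration:

```python
# Parse /proc/meminfo-style text (Linux) to estimate used vs. total RAM.
# On a real Linux box you'd read open("/proc/meminfo") instead of SAMPLE.
def parse_meminfo(text):
    fields = {}
    for line in text.strip().splitlines():
        key, _, rest = line.partition(":")
        fields[key.strip()] = int(rest.split()[0])  # values are in kB
    return fields

SAMPLE = """\
MemTotal:        8192000 kB
MemAvailable:    2048000 kB
"""

info = parse_meminfo(SAMPLE)
used_gb = (info["MemTotal"] - info["MemAvailable"]) / 1024 / 1024
print(f"roughly {used_gb:.1f} GiB in use of {info['MemTotal'] / 1024 / 1024:.1f}")
# roughly 5.9 GiB in use of 7.8
```

If that "in use" number sits near your total while gaming, the upgrade will actually help; if there's plenty of headroom, it won't change much.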
I'm planning to get a new GPU soon, so I'd probably postpone the RAM for now.
I'll try that program, thanks! Like I said, I pretty much never keep Chrome or other programs open while playing something intensive, which probably explains why I haven't felt the need for more RAM as of yet. I never experience slowdowns, stutters or weird hangups either, which as far as I know are the usual suspects when there's not enough RAM.
My build is mad messy. You won't be able to see what the hell is going on.
If that's the case, it increases the chance something is wrong about a hundredfold. Tidy it up and take/post pics; nobody is going to be able to offer substantive help to "yes, it's all connected correctly".
So I feel like upgrading the ram of my 4 year old PC from 16gb to 32gb. Problem is that the Mobo only supports 2133 mhz unless I OC, which I don't really want to do nor deal with. It's quite difficult to find 2133 ram, especially if I want to get the same sticks to match my old ones, so I'm thinking of just getting newer ones that have faster speeds to replace the old ones. The idea is to use these sticks for my future new PC eventually, which I should be building in a year or 2 probably.
Will there be any problems using ram that runs faster than supported by the Mobo?
All DDR4 RAM speeds over 2133 are "out of spec" and considered "overclocking" anyway, so don't worry about that. Are you saying enabling XMP forces some shitty optimizations on your CPU in the menus? Usually you can disable those.
Okay quick question.
I updated my Ram yesterday to 2x8 3600 Corsair Vengeance. In HWMonitor I've got one module running at 47 degrees and the other one at a constant 17 degrees. Surely this can't be normal. I'm playing PoE and current memory usage is 7.4GB.
I've checked in CPU-Z and the memory is in Dual Channel.
I was looking at NVMe drives for my friend's build. The mobo specs say this:
- 6 x SATA 6Gb/s ports
- 1 x M.2 slot (Key M)
- Supports up to PCIe 3.0 x4 and SATA 6Gb/s, 2242/ 2260/ 2280/ 22110 storage devices
- Intel® Optane™ Memory Ready
- The SATA2 will be unavailable when installing M.2 SATA device into M.2 slot.
- Before using Intel® Optane™ memory modules, please ensure that you have updated the drivers and BIOS to the latest version from MSI website.
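Given specs like those, the compatibility check boils down to a few conditions. This is a hypothetical sketch (the field names and function are invented for illustration; the slot's sizes and interfaces come from the spec lines above). Per the PCPartPicker URL, the linked drive is an NVMe (PCIe 3.0 x4) 2280 module:

```python
# Hypothetical compatibility check for the M.2 slot described above:
# Key M, PCIe 3.0 x4 or SATA, module sizes 2242/2260/2280/22110.
# Note from the specs: an M.2 *SATA* drive disables the SATA2 port;
# an NVMe (PCIe) drive does not.
SLOT_SIZES = {"2242", "2260", "2280", "22110"}
SLOT_INTERFACES = {"pcie3x4", "sata"}

def m2_compatible(drive_interface, drive_size):
    return drive_interface in SLOT_INTERFACES and drive_size in SLOT_SIZES

# The linked drive is an NVMe (PCIe 3.0 x4) 2280 module:
print(m2_compatible("pcie3x4", "2280"))  # True -> should work
print(m2_compatible("sata", "2280"))     # True, but the SATA2 port is lost
```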
Some of the info in that post is outdated since they've mitigated some of it in the meantime via chipset/BIOS updates, but the gist of it is that those boost fluctuations are normal and by design.
Wow, thanks for that link! That was very helpful. Looks like I need to try a couple of things to see if it settles down a little.
EDIT:
I ended up disabling the Core Performance Boost and another processor boost setting and all is great! Fan is consistent, CPU voltages/temps are much more consistent, but the performance hasn't been hit much at all. Seriously thank you for the link - that issue was the only thing bumming me out about my build
Turning V-Sync on like that did seem to improve things a bit... or maybe it's just placebo, haha. I'm still getting some occasional flickering in Destiny 2, though. I'll continue looking into it. I paid good money for this monitor, damn it, and I will use it properly! Hahaha.
I just did my first FreeSync setup, so all this is fresh in memory. Yup, the key point is V-Sync ON in the Nvidia Control Panel global settings, and then in each game's options menu turn the game's own vsync option OFF. I know, it sounds weird to me too, but basically you want the Nvidia global setting to be in charge.
And yeah trying a new cable (like the best buy one I mentioned) doesn't hurt. I know when it came to HDMI cables I had numerous issues with non certified cables so I presume Displayport has the same quirks.
Did you ever try the Nvidia G-Sync pendulum demo? That's where the flickering was the most intense for me.
I've only had one game flicker on me, and that was Modern Warfare. I don't know if one of the various game patches they've done fixed the problem, but it hasn't happened again since I finished the single player campaign and moved on to multiplayer. Every other game I've played so far has exhibited no flickering whatsoever.
In the NVIDIA Control Panel, I had both G-SYNC and V-Sync enabled, and in Modern Warfare, I had the frame limiter turned on to 140 (or whatever number they have that is closest to 144). Was pretty annoying until it got fixed.
This drive (https://it.pcpartpicker.com/product...b-m2-2280-nvme-solid-state-drive-sa2000m8500g) should be compatible, right?
I assume 17c is below ambient, so you can assume it's misread. Unless there's some other (real) problem, I wouldn't worry too much about it.
I believe something like AIDA64 will allow you to compare memory bandwidth with similar systems.
Yeah, stop worrying about RAM temperature. Even if the module is capable of measuring itself, which is a big question mark as to what it's actually reading, it doesn't matter.
As long as you don't over-volt it to something like 1.45v, you shouldn't even care about RAM temps.
Thanks, all the various terms were leaving me a bit confused. I'm curious to see how different it will be compared to my regular SSD.
I'll double-check if my cable is "certified" later as well. Can't hurt.