
Scott Pilgrim

Member
Oct 25, 2020
25
I've got a Ryzen 9 3900XT overclocked to 4.6/4.5 GHz and an RTX 2080. I've tried many configurations of the graphics settings, but the stutter is the same. Any ideas, please?

Additional info: 32 GB of RAM at 3600 MHz with tight timings, Windows 10, latest drivers/BIOS/chipset.
 

Mars

Member
Oct 25, 2017
1,988
Complete opposite for me: in DX11, most settings on Ultra except shadows (Very High), 1440p, Balanced DLSS, and locked at 60 fps. DX12 crashes immediately after it finishes loading my save(s), no matter what settings I set beforehand, even after erasing my first save and starting a new game.

3800X
2080S
16GB
 

Kaldaien

Developer of Special K
Verified
Aug 8, 2020
298
Which doesn't make a lick of sense.
On that note... AC: Odyssey and AC: Origins have a WMI thread that does nothing but sample CPU utilization from Windows constantly so they can give you a graph with CPU load %.

That sounds like a reasonable design, but then you actually profile the software (which Ubisoft obviously has never heard of), and hilariously, that thread whose job is to measure CPU load % is the single biggest contributor to CPU load %. That is Ubisoft in top form.

Far Cry 5's biggest consumer of CPU is a thread that runs in the background collecting gameplay analytics. You can kill that thread without any negative consequence (and your CPU temps will go down).
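For anyone curious what that looks like in code, here's a minimal standalone sketch of the pattern (illustrative only; this is not Ubisoft's actual implementation): a loop that polls total CPU load through WMI. Each iteration is a full round-trip into the WMI service, which is exactly why a tight sampling thread ends up at the top of the profile.

```cpp
// Minimal sketch of a "CPU load graph" thread that polls WMI constantly.
// Illustrative only -- NOT Ubisoft's actual code. Build (MSVC): cl /EHsc wmi_poll.cpp
#define _WIN32_DCOM
#include <windows.h>
#include <comdef.h>
#include <wbemidl.h>
#include <cstdio>
#pragma comment(lib, "wbemuuid.lib")

int main()
{
    CoInitializeEx(nullptr, COINIT_MULTITHREADED);
    CoInitializeSecurity(nullptr, -1, nullptr, nullptr,
                         RPC_C_AUTHN_LEVEL_DEFAULT, RPC_C_IMP_LEVEL_IMPERSONATE,
                         nullptr, EOAC_NONE, nullptr);

    IWbemLocator*  locator  = nullptr;
    IWbemServices* services = nullptr;
    CoCreateInstance(CLSID_WbemLocator, nullptr, CLSCTX_INPROC_SERVER,
                     IID_IWbemLocator, reinterpret_cast<void**>(&locator));
    locator->ConnectServer(_bstr_t(L"ROOT\\CIMV2"), nullptr, nullptr, nullptr,
                           0, nullptr, nullptr, &services);
    CoSetProxyBlanket(services, RPC_C_AUTHN_WINNT, RPC_C_AUTHZ_NONE, nullptr,
                      RPC_C_AUTHN_LEVEL_CALL, RPC_C_IMP_LEVEL_IMPERSONATE,
                      nullptr, EOAC_NONE);

    for (;;)  // the sampling thread: each pass is a cross-process WMI round-trip
    {
        IEnumWbemClassObject* results = nullptr;
        services->ExecQuery(
            _bstr_t(L"WQL"),
            _bstr_t(L"SELECT PercentProcessorTime FROM "
                    L"Win32_PerfFormattedData_PerfOS_Processor WHERE Name='_Total'"),
            WBEM_FLAG_FORWARD_ONLY | WBEM_FLAG_RETURN_IMMEDIATELY,
            nullptr, &results);

        IWbemClassObject* row = nullptr;
        ULONG returned = 0;
        if (results &&
            SUCCEEDED(results->Next(WBEM_INFINITE, 1, &row, &returned)) && returned)
        {
            VARIANT v;
            row->Get(L"PercentProcessorTime", 0, &v, nullptr, nullptr);
            wprintf(L"CPU load: %s%%\n", v.bstrVal);  // WMI returns uint64 as a BSTR
            VariantClear(&v);
            row->Release();
        }
        if (results) results->Release();

        Sleep(100);  // "sample constantly" -- the shorter this gets, the more the
                     // measuring thread itself dominates the CPU-load measurement
    }
}
```

Shorten that Sleep and the sampler's own cost climbs accordingly; sampling rarely, or reading a far cheaper source like GetSystemTimes, avoids the whole problem.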
 
Nov 25, 2018
13
On that note... AC: Odyssey and AC: Origins have a WMI thread that does nothing but sample CPU utilization from Windows constantly so they can give you a graph with CPU load %.

That sounds like a reasonable design, but then you actually profile the software (which Ubisoft obviously has never heard of), and hilariously, that thread whose job is to measure CPU load % is the single biggest contributor to CPU load %. That is Ubisoft in top form.

Far Cry 5's biggest consumer of CPU is a thread that runs in the background collecting gameplay analytics. You can kill that thread without any negative consequence (and your CPU temps will go down).

I love Ubisoft games, but I've just about accepted that all their games are going to run like butt.

What's destroying Legion this time? Does disabling that weird BattlEye anti-cheat do anything for it?
 

Vuze

Member
Oct 25, 2017
4,186
You have a limit active. It doesn't necessarily have to be V-Sync; you could have used RTSS and turned on Scanline Sync and caused this just as easily.

Unfortunately, RTSS cannot measure render latency; it would go a long way toward telling you what is causing the limit.
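For reference, here's roughly how GPU frame time (one ingredient of render latency) can be read back with D3D11 timestamp queries. This is an illustrative sketch, not Special K's or RTSS's actual code, and it assumes you already have a device and immediate context:

```cpp
// Rough sketch: timing a frame's GPU work with D3D11 timestamp queries.
// Assumes an existing ID3D11Device* dev and ID3D11DeviceContext* ctx.
#include <d3d11.h>
#include <cstdio>

void MeasureGpuFrameTime(ID3D11Device* dev, ID3D11DeviceContext* ctx)
{
    ID3D11Query *disjoint = nullptr, *tsBegin = nullptr, *tsEnd = nullptr;
    D3D11_QUERY_DESC qd = { D3D11_QUERY_TIMESTAMP_DISJOINT, 0 };
    dev->CreateQuery(&qd, &disjoint);
    qd.Query = D3D11_QUERY_TIMESTAMP;
    dev->CreateQuery(&qd, &tsBegin);
    dev->CreateQuery(&qd, &tsEnd);

    ctx->Begin(disjoint);
    ctx->End(tsBegin);              // GPU timestamp before the frame's draws
    // ... submit the frame's rendering here ...
    ctx->End(tsEnd);                // GPU timestamp after
    ctx->End(disjoint);

    // Busy-wait for results; a real overlay would poll these a frame or two
    // later instead of stalling the pipeline like this.
    D3D11_QUERY_DATA_TIMESTAMP_DISJOINT dj = {};
    while (ctx->GetData(disjoint, &dj, sizeof(dj), 0) == S_FALSE) {}
    UINT64 t0 = 0, t1 = 0;
    while (ctx->GetData(tsBegin, &t0, sizeof(t0), 0) == S_FALSE) {}
    while (ctx->GetData(tsEnd,   &t1, sizeof(t1), 0) == S_FALSE) {}

    if (!dj.Disjoint)               // Frequency = GPU ticks per second
        std::printf("GPU frame time: %.3f ms\n",
                    double(t1 - t0) * 1000.0 / double(dj.Frequency));

    tsEnd->Release(); tsBegin->Release(); disjoint->Release();
}
```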
First time using Special K (current SKIF version linked on the official website) and I can't seem to figure out how to open the menu. :(

I disabled the anti-cheat, and I can see that SpecialK64.dll has been injected into the game process using Process Explorer. Also disabled the GFE overlay, Game Bar, uPlay overlay, and Afterburner/RivaTuner.

Yet I can't open the menu with Ctrl+Shift+Backspace (if that's still the default key combo; I couldn't find one in the readme and went by PCGamingWiki). Any ideas what might be wrong?

FWIW, no profile folder for the game is created either.

Just rebooted the system, and the behavior is the same. I feel kinda dumb 😬
 

Jedi2016

Member
Oct 27, 2017
15,614
On that note... AC: Odyssey and AC: Origins have a WMI thread that does nothing but sample CPU utilization from Windows constantly so they can give you a graph with CPU load %.

That sounds like a reasonable design, but then you actually profile the software (which Ubisoft obviously has never heard of), and hilariously, that thread whose job is to measure CPU load % is the single biggest contributor to CPU load %. That is Ubisoft in top form.

Far Cry 5's biggest consumer of CPU is a thread that runs in the background collecting gameplay analytics. You can kill that thread without any negative consequence (and your CPU temps will go down).
Yeah, that's kind of what I'm thinking is happening: something unique to Ubisoft that makes the game behave in ways no other game does. A game built for last-gen Jaguar CPUs should barely wake up the fans on my system.
 

Kaldaien

Developer of Special K
Verified
Aug 8, 2020
298
I disabled the anti-cheat, and I can see that SpecialK64.dll has been injected into the game process using Process Explorer. Also disabled the GFE overlay, Game Bar, uPlay overlay, and Afterburner/RivaTuner.
Did you also change the render API to D3D11? :) I don't support drawing my overlay in D3D12; rudimentary functionality (i.e., the framerate limiter) does work, but I'm not in a hurry to draw the overlay. D3D12 games are unstable enough as things are.

Also, for this game I recommend using local injection (dxgi.dll); Ubisoft does this weird thing with dxdiag that starts/stops several versions of Direct3D every time you start a game. It confuses the hell out of Special K's global injector, and it can't figure out which one of those D3D devices is the actual game.
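For anyone unfamiliar with what local injection means mechanically: a dxgi.dll placed next to the game's executable gets loaded in place of the system copy, so there's never any ambiguity about which process (and which device) belongs to the game. A bare-bones sketch of such a wrapper follows; this is illustrative only, Special K's real dxgi.dll exports far more than this, and x64 is assumed so the export name stays undecorated:

```cpp
// Bare-bones dxgi.dll wrapper: drop it next to the game's exe and Windows
// loads it instead of the system copy. Illustrative only -- Special K's
// real dxgi.dll does much more. x64 assumed (undecorated export name).
#include <windows.h>

using PFN_CreateDXGIFactory1 = HRESULT (WINAPI*)(REFIID, void**);

static HMODULE LoadSystemDXGI()
{
    wchar_t path[MAX_PATH];
    GetSystemDirectoryW(path, MAX_PATH);         // e.g. C:\Windows\System32
    wcscat_s(path, L"\\dxgi.dll");
    return LoadLibraryW(path);                   // the genuine dxgi.dll
}

extern "C" __declspec(dllexport)
HRESULT WINAPI CreateDXGIFactory1(REFIID riid, void** out)
{
    static auto real = reinterpret_cast<PFN_CreateDXGIFactory1>(
        GetProcAddress(LoadSystemDXGI(), "CreateDXGIFactory1"));

    // A wrapper would install its hooks here -- with zero ambiguity about
    // which process and which D3D device is the actual game.

    return real ? real(riid, out) : E_NOINTERFACE;
}
```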
 

Vuze

Member
Oct 25, 2017
4,186
Did you also change the render API to D3D11? :) I don't support drawing my overlay in D3D12; rudimentary functionality (i.e., the framerate limiter) does work, but I'm not in a hurry to draw the overlay. D3D12 games are unstable enough as things are.

Also, for this game I recommend using local injection (dxgi.dll); Ubisoft does this weird thing with dxdiag that starts/stops several versions of Direct3D every time you start a game. It confuses the hell out of Special K's global injector, and it can't figure out which one of those D3D devices is the actual game.
Ohhh, I knew it would be something as simple as that, haha. I wasn't aware DX12 isn't supported for the time being. Thanks!

Actually, the better frame limiter is what I wanted to use primarily; I assume putting a proper profile in place should do the trick. I should be able to figure this out now that I know about this.
 

Oddhouse

Member
Oct 31, 2017
1,035
Is DLSS not working for others?

I've run the benchmark without it (avg 45 fps), in Quality mode (44 fps), and in Performance mode (44 fps).

This doesn't seem right?
 

Nooblet

Member
Oct 25, 2017
13,622
For me it does nothing, and I think some others have also mentioned that it doesn't seem to do anything.
It definitely does something: going from Off to Quality reduces my GPU usage by 10-15%, going to Balanced reduces it by a further 5%, and Performance is another 5% reduction in GPU usage. Ultra Performance reduces it further still, but it also looks super blurry and is full of artifacts everywhere. Performance, Quality, and Balanced look the same in terms of clarity, but depending on which one you choose you get motion artifacts, with Performance having the most of the three and Quality basically having little to none.

None of that translates to framerate, though, so I assume it's a CPU thing, as my CPU usage stays constant throughout. The game is being limited somewhere for some reason, even on newer, capable CPUs, and it's weirdly demanding on the CPU even when there's pretty much nothing on screen.
 
Oct 28, 2019
5,973
Just had my first hard crash: it locked the computer up, and I was treated to a yellow screen of death, a DPC Watchdog violation (palpatineironic.gif). Should I be worried? I've never seen this before. This was with the latest Nvidia driver, released today.
 

Vic20

Member
Nov 10, 2019
3,268
Does anyone else still get stutters when driving even after lowering the setting?
 

Serious Sam

Banned
Oct 27, 2017
4,354
I think GPU power is going to be an issue long before the VRAM becomes an issue.
After yesterday's AMD reveal, this is no longer up for debate. There is a reason AMD didn't dare go below 16GB of VRAM, even on their midrange card. No amount of "10GB VRAM is enough" thread bumping and MSI Afterburner screenshots will convince me otherwise. Game developers and AMD engineers know what's required for next gen better than random forum people. Nvidia knows this too; that's why 16/20GB cards were rumored for so long, and I'm pretty sure the only reason they didn't come out is RTX 30 series production hurdles.
 

Tora

The Enlightened Wise Ones
Member
Jun 17, 2018
8,637
Curious to see how Cyberpunk fares with the full suite of RT, considering its recommended specs are very, very modest.
 

Galava

▲ Legend ▲
Member
Oct 27, 2017
5,080
DLSS really does something, but the thing is that this game is soooo CPU-bound that changes to GPU-bound settings make almost no difference.

If your CPU can do at most 45 fps in this game, it doesn't matter that your GPU can do 90 fps; it will be bound by the CPU's 45. DLSS might take the GPU from 90 to 120 fps, for example, but if the CPU can't do more than 45, the game will stay at 45, regardless of your GPU-bound settings.

I think I'm understanding this correctly.
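The arithmetic behind that is just a min() over the two sides. A tiny sketch with the same example numbers (hypothetical figures, not measurements from this game):

```cpp
// Toy model of CPU vs GPU bottlenecking, using the example numbers above.
#include <algorithm>
#include <cstdio>

int main()
{
    const double cpu_fps      = 45.0;  // what the CPU can simulate/prepare
    const double gpu_fps      = 90.0;  // what the GPU can render natively
    const double gpu_dlss_fps = 120.0; // GPU after DLSS lightens its load

    // The delivered framerate is whichever side is slower:
    std::printf("Without DLSS: %.0f fps\n", std::min(cpu_fps, gpu_fps));      // 45
    std::printf("With DLSS:    %.0f fps\n", std::min(cpu_fps, gpu_dlss_fps)); // still 45
    return 0;
}
```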
 
Oct 28, 2017
1,715
After yesterday's AMD reveal, this is no longer up for debate. There is a reason AMD didn't dare go below 16GB of VRAM, even on their midrange card. No amount of "10GB VRAM is enough" thread bumping and MSI Afterburner screenshots will convince me otherwise. Game developers and AMD engineers know what's required for next gen better than random forum people. Nvidia knows this too; that's why 16/20GB cards were rumored for so long, and I'm pretty sure the only reason they didn't come out is RTX 30 series production hurdles.

Yeah, I'm sure it's not because Ubisoft are fucking incompetent. Couldn't be that.
 

Purslane

Member
Jun 25, 2020
367
I'm glad the requirements specify the need for dual-channel memory. AC Origins and Odyssey had crazy stutter with a single stick of RAM.
 

Tora

The Enlightened Wise Ones
Member
Jun 17, 2018
8,637
DLSS really does something, but the thing is that this game is soooo CPU-bound that changes to GPU-bound settings make almost no difference.

If your CPU can do at most 45 fps in this game, it doesn't matter that your GPU can do 90 fps; it will be bound by the CPU's 45. DLSS might take the GPU from 90 to 120 fps, for example, but if the CPU can't do more than 45, the game will stay at 45, regardless of your GPU-bound settings.

I think I'm understanding this correctly.
Yep. 70% utilisation; when I turn DLSS off, GPU utilisation shoots up to 85%, but my framerate is still the same.
Yeah, I'm sure it's not because Ubisoft are fucking incompetent. Couldn't be that.
10GB isn't going to be enough, though; it's on the lower end, and AMD knows that.
 

Zoyos

Banned
Oct 30, 2017
322
DLSS really does something, but the thing is that this game is soooo CPU-bound that changes to GPU-bound settings make almost no difference.

If your CPU can do at most 45 fps in this game, it doesn't matter that your GPU can do 90 fps; it will be bound by the CPU's 45. DLSS might take the GPU from 90 to 120 fps, for example, but if the CPU can't do more than 45, the game will stay at 45, regardless of your GPU-bound settings.

I think I'm understanding this correctly.

Based on what I've seen so far, this is accurate. It's highly CPU-bound, more than one should expect even for the open-world complexity of the game. Drivers aren't going to fix this; it's on Ubisoft's end.
 

Vincent4756

Member
Oct 27, 2017
543
GeForce Experience won't even let me optimize the game.
2080 Ti and 9700K here.

I've only run the benchmark at Ultra with RTX on so far, and at 1440p I'm barely hitting 60 fps.

Edit: said 9900K by accident
 
Nov 2, 2017
2,275
After yesterday's AMD reveal, this is no longer up for debate. There is a reason AMD didn't dare go below 16GB of VRAM, even on their midrange card. No amount of "10GB VRAM is enough" thread bumping and MSI Afterburner screenshots will convince me otherwise. Game developers and AMD engineers know what's required for next gen better than random forum people. Nvidia knows this too; that's why 16/20GB cards were rumored for so long, and I'm pretty sure the only reason they didn't come out is RTX 30 series production hurdles.
A 600-dollar card is not midrange. 300-400 dollar cards are midrange cards.

Not enough for what? Maxing out games for the rest of the gen? Yeah, you're right, but again, VRAM won't be the issue here. These cards are fine for the cross-gen period, but after that... if your purpose is maxing games at 60+ fps at console resolution, then none of them will last past cross-gen. If that's your goal, you'll just have to upgrade in two years, and it won't matter how much VRAM you have during that period.
 

CollectedDust

Member
Oct 27, 2017
1,044
Indiana
Got the game installed. Runs fine on foot (60 fps) but drops to the mid-40s while driving, as others have stated. Luckily, with G-Sync I don't notice it as much as I normally would, but I still feel it at times. That said, I think the game looks great and don't get what folks are saying in that regard.

I'm playing with Ultra settings (including ray tracing) at 5120x1440 HDR with DLSS set to Quality.

PC: 3800X OC'd to 4.1 GHz / RTX 3080 / 32GB 3200 / running off a WD Black NVMe
 

MDR

Member
Jun 21, 2018
192
Amazing, it's the same story for every Ubi PC release. I stopped playing them on PC around AC Origins.

My theory is that they're designed for 30 fps; once they hit that target, they stop optimizing.
I'd like to hear a game engine engineer weigh in, though.
 

Hasney

One Winged Slayer
The Fallen
Oct 25, 2017
18,592
Is DLSS not working for others?

I've run the benchmark without it (avg 45 fps), in Quality mode (44 fps), and in Performance mode (44 fps).

This doesn't seem right?
For me it does nothing, and I think some others have also mentioned that it doesn't seem to do anything.

Yeah, the most annoying part of this game right now is how inconsistent it is across systems; there's no central advice anyone can give. DLSS is noticeable for me, on the order of 30 fps from native to Performance mode, and unlike some posts above, my 3080 is at 100% utilisation. 5.1 surround over eARC worked immediately too. All out of the box, no tweaking done (unless motion blur or AA is causing all this, because I turned them off immediately).
 

leng jai

Member
Nov 2, 2017
15,117
Amazing, it's the same story for every Ubi PC release. I stopped playing them on PC around AC Origins.

My theory is that they're designed for 30 fps; once they hit that target, they stop optimizing.
I'd like to hear a game engine engineer weigh in, though.

My PC can't even sustain 30 fps on this anymore; it's a new low.
 

Orion514

Member
Oct 25, 2018
366
Illinois
On my LG CX, getting a fluctuating 45-70 fps with a 9900K and RTX 3080 on High settings with RTX + DLSS on. Thank God for VRR, which makes it pretty smooth and playable. Hopefully tomorrow's Ubisoft performance patch helps us.
 

Kaldaien

Developer of Special K
Verified
Aug 8, 2020
298
Actually, the better frame limiter is what I wanted to use primarily; I assume putting a proper profile in place should do the trick. I should be able to figure this out now that I know about this.
Frankly, the situation is insane here.

My limiter can normally fix swapchain timing issues just by turning it on/off. In this case, you've got to turn VSYNC on/off several times and then synchronize to VBLANK a few times. Until you do, frame pacing will make the game unplayable.
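For the curious, "synchronize to VBLANK" maps to a real DXGI call: IDXGIOutput::WaitForVBlank blocks until the display's next vertical blank. A rough sketch of the idea (illustrative only, not Special K's actual limiter code):

```cpp
// Sketch of "synchronize to VBLANK": block until the display's vertical
// blank a few times to re-align frame delivery with scanout.
#include <dxgi.h>
#pragma comment(lib, "dxgi.lib")

void ResyncToVBlank(IDXGISwapChain* swap, int blanks = 3)
{
    IDXGIOutput* output = nullptr;
    if (SUCCEEDED(swap->GetContainingOutput(&output)))
    {
        for (int i = 0; i < blanks; ++i)
            output->WaitForVBlank();   // returns at the next vertical blank
        output->Release();
    }
    // VSYNC itself is per-Present: Present(1, 0) waits for vblank ("on"),
    // Present(0, 0) doesn't ("off") -- so cycling VSYNC on/off is just
    // alternating the sync interval passed to Present.
}
```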
 

JEH

Prophet of Truth
Member
Oct 25, 2017
10,207
If I hadn't gotten this with my 3080, I would be refunding right away.
 

TitanicFall

Member
Nov 12, 2017
8,262
The game runs fine on my 3080 and 3700X, but the cutscenes are awful. It basically runs with everything at max, and then it switches to prerendered videos that look like they came from the PS3/360 era.
 

Vuze

Member
Oct 25, 2017
4,186
The game runs fine on my 3080 and 3700X, but the cutscenes are awful. It basically runs with everything at max, and then it switches to prerendered videos that look like they came from the PS3/360 era.
I almost lost it when I saw that super-compressed intro video that was pillarboxed and letterboxed on my ultrawide, lol. But yeah, it's extremely jarring constantly switching between aspect ratios and levels of image quality...
 

JackEtc

Member
Oct 28, 2017
447
NYC
With RTX Medium on and everything on Ultra except shadows, I get like ~4 fps more with DLSS set to Quality vs. no DLSS at all, and no DLSS seems more stable. I thought DLSS was supposed to be a huge performance boost, lol? Are we not setting it up right? Should our resolution be set to half when using it or something? All the other DLSS games I've played have surfaced that extra resolution setting.

With DLSS Quality I get like 30-70 frames, all over the place, averaging about 60. With it off I get a rock-solid 55. So weird.

9900KS, 3080, 32GB, 1440p UW
 

leng jai

Member
Nov 2, 2017
15,117
Not interested in changing your CPU? 4c/4t isn't really up to snuff now; the minimum recommendation would be a 4c/8t processor like a 6700K or a 3300X.

Can't be bothered at the moment, especially with the new consoles just coming out (getting both). My 6600K still plays 95% of games at 60 fps. I definitely will by the end of next year or early 2022, when cross-gen finishes up, but right now there's really no rush.

This dog excrement of a port certainly won't be the game that pushes me to upgrade.
 

Tora

The Enlightened Wise Ones
Member
Jun 17, 2018
8,637
Can't be bothered at the moment, especially with the new consoles just coming out (getting both). My 6600K still plays 95% of games at 60 fps. I definitely will by the end of next year or early 2022, when cross-gen finishes up, but right now there's really no rush.

This dog excrement of a port certainly won't be the game that pushes me to upgrade.
Fair enough, makes sense.

This definitely isn't the type of game to base a hardware upgrade on; otherwise you'll find yourself with 3090s in SLI trying to scrape out a locked 1080p/60.
 

Flandy

Community Resettler
Member
Oct 25, 2017
3,445
Keep getting fps drops and stutter with my RTX 3080 and i7 9700K in DX11 mode. DX12 doesn't run well either. None of my CPU cores or my GPU reaches 100% usage. The port seems busted right now.
 

Guffers

Member
Nov 1, 2017
384
Yikes at this entire thread. I've got Uplay+, so I'm installing this now, more out of curiosity than anything. 3700X & 5700 XT driving 3440x1440. Wish me luck. I guess I should aim for 30 fps at High settings?