
mordecaii83

Avenger
Oct 28, 2017
6,853
Man, trying to drive around and getting drops into the upper 40s on an RTX 3080 at 1080p ultrawide is rough...
 

CelticKennedy

▲ Legend ▲
Member
Sep 18, 2019
1,877
Strange, today the game isn't even launching for me from the Ubi Client. I click play, the splash screen comes up then it goes away and the game never boots up. No error screen or anything.
 

Isee

Avenger
Oct 25, 2017
6,235
Have you tried pushing your Infinity Fabric and memory clock to 1900MHz in 1:1 mode? That should do more for games than your CPU overclock, especially on a 3900X with only 3 cores per CCX, as it reduces latency when jumping across CCXs (which you'll be doing a lot in a heavily threaded game on a 3900X).

Are your memory subtimings tuned via DRAM calculator? Is your memory running in dual rank mode?

You've still potentially got a decent chunk of additional gaming performance left in your 3900X. Given you're so close to a locked 60fps, it may be enough to get you above that threshold.

Edit: If that memory really is at 3600 CL14 it will easily do 3800MHz, and almost every Zen 2 chip can hit a 1866MHz IF clock at the very least, so you should definitely have some low hanging fruit to get your performance up. It also means it's likely B-die, so if you haven't already, you'll have loads of headroom to really push your secondary and tertiary timings, which can improve 1% lows by as much as 10-20% alone in games that are memory/latency bound (which most "CPU" bound games usually are on Ryzen).
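To put rough numbers on the "memory/latency bound" point above: first-word latency is CAS cycles divided by the real memory clock (half the DDR transfer rate), so a frequency-plus-timings bump moves latency more than the headline numbers suggest. A minimal sketch, with the kits and timings below chosen purely for illustration:

```python
def first_word_latency_ns(transfer_rate_mts: int, cas_latency: int) -> float:
    """First-word latency in ns.

    The real clock in MHz is half the DDR transfer rate (MT/s),
    so latency = CL / (MT/s / 2) microseconds = 2000 * CL / MT/s ns.
    """
    return 2000 * cas_latency / transfer_rate_mts

# Illustrative comparison (not measured results):
for rate, cl in [(3200, 16), (3600, 14), (3800, 14)]:
    print(f"DDR4-{rate} CL{cl}: {first_word_latency_ns(rate, cl):.2f} ns")
```

By this yardstick a stock XMP 3200 CL16 kit sits at 10 ns to first word, while a tuned 3800 CL14 setup lands under 7.4 ns, before even counting the 1:1 IF clock benefit.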

I'm running 4x8GB Samsung B-Die, so yes, it is in dual channel mode. XMP/DOCP is 3600c16-16-16-32. I'm using the following Ryzen DRAM calculator settings: https://abload.de/img/ddr4_3600_c14_145voltsqkku.jpg
The memory itself has no problems with 3800c14 settings (tested in 1:2 ratio) but the IF immediately crashes at 1900MHz. In fact, even 1833MHz IF isn't stable at 1.125V on the SOC. 🤷

I lost the silicon lottery in that regard.
 

SunBroDave

Member
Oct 25, 2017
13,135

brain_stew

Member
Oct 30, 2017
4,727
I'm running 4x8GB Samsung B-Die, so yes, it is in dual channel mode. XMP/DOCP is 3600c16-16-16-32. I'm using the following Ryzen DRAM calculator settings: https://abload.de/img/ddr4_3600_c14_145voltsqkku.jpg
The memory itself has no problems with 3800c14 settings (tested in 1:2 ratio) but the IF immediately crashes at 1900MHz. In fact, even 1833MHz IF isn't stable at 1.125V on the SOC. 🤷

I lost the silicon lottery in that regard.

I was referring to dual rank rather than dual channel, but as you've got 4 sticks you'll already be running dual rank, so no performance to be gained there. Seems you've got every last drop of performance out of your chip, so there's not really much else you can do at this point.

Looks like you've been really unlucky with your IF clock; that will definitely be costing you a small amount, although with dual rank and tuned secondary and tertiary timings you'll be getting better performance than most. That really doesn't bode well for the vast majority who will be running it on a bog-standard Zen 2 machine with XMP memory settings at ~3200MHz or similar :/
 

Uhtred

Alt Account
Banned
May 4, 2020
1,340
Could the issue be one of streaming these high-res textures causing the frame rate drops? (Especially since it mostly happens when moving in a car.) Maybe something we'll have to wait for DirectStorage/RTX IO to fix? Or maybe something that needs to be optimized a bit more using current tech?
 

Isee

Avenger
Oct 25, 2017
6,235
Could the issue be one of streaming these high-res textures causing the frame rate drops? (Especially since it mostly happens when moving in a car.) Maybe something we'll have to wait for DirectStorage/RTX IO to fix? Or maybe something that needs to be optimized a bit more using current tech?

The game runs on PS4 and Xbox One; the streaming in general can't be that advanced.

I tested the game at 720p, no DLSS, no RT, lowest settings possible. 80-90 fps and suddenly: 69 FPS because Thread 2 on core 1 is maxed out. But there's also a non-trivial load on every other core.
A 3900X is certainly not the fastest CPU out there, but it is far faster than the 1.6GHz Jaguar cores in the PS4. You'd assume it could scale from thirty to well beyond sixty, especially at settings significantly lower than on current consoles.
Instead, it is brute forcing its way up there.

In my (meaningless) opinion: The engine wasn't designed with sixty fps in mind in the first place.

[Screenshot: watchdogslegion_2020_baky7.png]


And now imagine how Ubisoft games will run on PC once they're designed to take advantage of PS5 storage and significantly faster Zen 2 CPU cores at thirty FPS.
7GHz, 16-core CPUs on PC can't come soon enough.

(I'm half joking here, but PC optimization is certainly not high on Ubi's to-do list.)
 

zephiross

Member
Mar 27, 2018
137
RTX 3080 + i7 7700K at 4.9 GHz here

Averaging 40-45 FPS at 4K DLSS at ultra settings. The game is so CPU bound it's not even funny... going from 4K to 1440p without changing anything else nets me an amazing 2fps boost lol.
 

Tahnit

Member
Oct 25, 2017
9,965
Anyone with a 2080 Super and an i7 7700K here have anything to report? Seeing if it's possible to get a decent framerate.
 

Uhtred

Alt Account
Banned
May 4, 2020
1,340
The game runs on PS4 and Xbox One; the streaming in general can't be that advanced.

I tested the game at 720p, no DLSS, no RT, lowest settings possible. 80-90 fps and suddenly: 69 FPS because Thread 2 on core 1 is maxed out. But there's also a non-trivial load on every other core.
A 3900X is certainly not the fastest CPU out there, but it is far faster than the 1.6GHz Jaguar cores in the PS4. You'd assume it could scale from thirty to well beyond sixty, especially at settings significantly lower than on current consoles.
Instead, it is brute forcing its way up there.

In my (meaningless) opinion: The engine wasn't designed with sixty fps in mind in the first place.

[Screenshot: watchdogslegion_2020_baky7.png]


And now imagine how Ubisoft games will run on PC once they're designed to take advantage of PS5 storage and significantly faster Zen 2 CPU cores at thirty FPS.
7GHz, 16-core CPUs on PC can't come soon enough.

(I'm half joking here, but PC optimization is certainly not high on Ubi's to-do list.)

Yeah, good points.
 

Kaldaien

Developer of Special K
Verified
Aug 8, 2020
298
In my (meaningless) opinion: The engine wasn't designed with sixty fps in mind in the first place.
It also wasn't designed to have a ton of anti-debug code running in the background. But since Ubisoft's performance target is 30 FPS, I guess they figure it's acceptable.

Best we can hope for is that the next-gen consoles have a performance target of 60 FPS and they stop shackling their PC ports with as much superfluous code that has nothing to do with the actual game.
 

Galava

▲ Legend ▲
Member
Oct 27, 2017
5,080
In my (meaningless) opinion: The engine wasn't designed with sixty fps in mind in the first place
I completely agree. They really aren't expecting us to play this game at 60fps unless you've got a monster PC and don't enable RT.

Also, seeing how Valhalla seems like it might run at 60 on XSX gives me hope.
 

SunBroDave

Member
Oct 25, 2017
13,135
anyone with a 2080 super here and an i7 7700k have anything to report? Seeing if its possible to get decent framerate.
The game targets 30 fps. You can get great visuals at a largely consistent 30 fps. You will have to make serious visual compromises to get 60 fps, and even then, frame time spikes are very common. If you do target 60 fps, a G-Sync display is highly recommended to reduce the impact of those stutters.
 

Galava

▲ Legend ▲
Member
Oct 27, 2017
5,080
Capping the game to 30 is a good experience on a 4K TV (it's not a game that needs 60 anyway), with settings mostly on high/ultra including RT. 60 is not realistically achievable at high settings with RT.
 

m_shortpants

Member
Oct 25, 2017
11,195
Playing this with a 3700x/3080 and was pretty disappointed at the performance at 1440p. Sounds like single thread CPU is key. Kind of agree with the poster who said this seems to have been designed for 30fps.
 

CelticKennedy

▲ Legend ▲
Member
Sep 18, 2019
1,877
Anyone with a 2080 Super and an i7 7700K here have anything to report? Seeing if it's possible to get a decent framerate.
I have a 2080 Super with a 6700K. I'm running High settings with Raytracing OFF at 1440p. Also, I don't have the High-Res Textures installed. The benchmark averages slightly above 60fps. Actually playing the game can be totally different though, especially while driving around the world. I have a G-Sync monitor so it's not too bad.

The game seems very CPU heavy.

I'm thinking about experimenting with cranking a lot of the settings, setting Raytracing to Low or Medium, and locking it to 30fps to see how that feels. Probably not great.
 

Uhtred

Alt Account
Banned
May 4, 2020
1,340
I wouldn't be surprised if they realized they weren't going to hit anything like 60 FPS with ray tracing on consoles, so they optimized for 30 FPS. Then they took a look at the PC code and said, fuck it! They'll power through, right?
 
Aug 30, 2020
2,171
I wouldn't be surprised if they realized they weren't going to hit anything like 60 FPS with ray tracing on consoles, so they optimized for 30 FPS. Then they took a look at the PC code and said, fuck it! They'll power through, right?

I expect the game targeted 30Hz on next-gen consoles from the start, and a conservative 30Hz at that.

It's hard to imagine Ubisoft even considering a 60Hz output target for the title.
 

Tahnit

Member
Oct 25, 2017
9,965
I have a 2080 Super with a 6700K. I'm running High settings with Raytracing OFF at 1440p. Also, I don't have the High-Res Textures installed. The benchmark averages slightly above 60fps. Actually playing the game can be totally different though, especially while driving around the world. I have a G-Sync monitor so it's not too bad.

The game seems very CPU heavy.

I'm thinking about experimenting with cranking a lot of the settings, setting Raytracing to Low or Medium, and locking it to 30fps to see how that feels. Probably not great.
What happens when you have Raytracing on Ultra?
 

Flappy Pannus

Member
Feb 14, 2019
2,337
Are you playing on a TV? It's because Windows offers true 4K (4096x2160) as an option as well as the standard 3840x2160 when a TV is connected. Most games will always display at the resolution you have selected, but some will have parts that automatically go to the highest option available.

There's an old guide on Steam (here) that shows how to fix it, but your results may vary. I tried it, but the computer kept freezing as long as that display was connected, so I reset it.

I wouldn't care so much if the bars were actually black in HDR, but they're just really gray on my C9.

Yeah, this is a thing I have to do on every clean driver install (driver upgrades don't revert it). In a few games it will result in the wrong aspect ratio regardless of the resolution chosen, such as Mankind Divided. Even Alex of DF exhibited this in his Metro Exodus PC review video, where the image was obviously stretched because the game suffers from this issue (all the Metro games are affected, actually). Most games properly set the aspect ratio based on your chosen res, but a handful set it based on the top res reported in the TV's EDID.

It's display specific; it mostly just affects some TVs (I'm not presented with that res on either of my 4K monitors). I've never had a problem using CRU to delete the EDID data myself, though.
 

Patitoloco

Member
Oct 27, 2017
23,598
Does anyone else have this problem? Easily reproducible:

- You go to the Team tab
- Your performance drops to 10fps

¯\_(ツ)_/¯

It's killing all my enjoyment. Otherwise I've found my perfect settings spot, but for some reason the Team screen wants to kill my computer.
 
Nov 19, 2019
18
Smooth here on a 2070 Super and a 3600: without ray tracing, everything on high/ultra at 4K with DLSS, 60+ fps. It's ray tracing that's the resource hog.
 

Scott Pilgrim

Member
Oct 25, 2020
25
Hey guys, I have 25 hours in the game and I have some issues with DLSS. If you go to your team to select a new NPC, my game drops to 20fps or lower. I have to disable DLSS, change resolution, change it back again, and re-enable DLSS, so I think it's DLSS related. I have to do that or else I'll have 20fps in the open world too.

Has anyone of you guys experienced something like that? Remember, it happens when you want to recruit someone or change NPCs, on the NPC tab.
 
Oct 25, 2017
1,387
Yeah, this is a thing I have to do on every clean driver install (driver upgrades don't revert it). In a few games it will result in the wrong aspect ratio regardless of the resolution chosen, such as Mankind Divided. Even Alex of DF exhibited this in his Metro Exodus PC review video, where the image was obviously stretched because the game suffers from this issue (all the Metro games are affected, actually). Most games properly set the aspect ratio based on your chosen res, but a handful set it based on the top res reported in the TV's EDID.

It's display specific; it mostly just affects some TVs (I'm not presented with that res on either of my 4K monitors). I've never had a problem using CRU to delete the EDID data myself, though.

Huh I tried CRU again and it worked this time. Not sure what happened before. Really happy to have that fixed before Valhalla.
 

Resident Guru

Member
Oct 28, 2017
918
Hey guys, I have 25 hours in the game and I have some issues with DLSS. If you go to your team to select a new NPC, my game drops to 20fps or lower. I have to disable DLSS, change resolution, change it back again, and re-enable DLSS, so I think it's DLSS related. I have to do that or else I'll have 20fps in the open world too.

Has anyone of you guys experienced something like that? Remember, it happens when you want to recruit someone or change NPCs, on the NPC tab.
Yes, same thing happened to me. Didn't know DLSS caused it, as I just exited and reloaded the game and it was good. 3080 GPU.
 

Kaldaien

Developer of Special K
Verified
Aug 8, 2020
298
Huh I tried CRU again and it worked this time. Not sure what happened before.
It can be somewhat unreliable. Games don't read the EDID directly, they use DXGI to get a list of resolutions, and DXGI reads a cached copy of the EDID stored in the Windows registry.

I prefer to just modify DXGI ;)

Code:
[Render.DXGI]
MaxRes=3840x2160
MinRes=0x0

Things will get exceptionally messy if you ever move your display to a different HDMI/DisplayPort input, because the EDID cache is tied to the physical port the monitor is plugged into. It's not really practical to fix the problem with CRU unless you want to keep fixing it over and over.
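For context on where those cached resolutions come from: the EDID packs each supported mode into an 18-byte detailed timing descriptor, with the upper bits of the pixel counts split off into shared nibbles. A hedged sketch of decoding one (the sample bytes are hand-built for a 3840x2160 mode, not dumped from real hardware):

```python
def parse_dtd(dtd: bytes) -> tuple[int, int, int]:
    """Decode an 18-byte EDID detailed timing descriptor.

    Returns (h_active, v_active, pixel_clock_khz). The upper 4 bits of
    each active pixel count live in the high nibbles of bytes 4 and 7.
    """
    pixel_clock_khz = int.from_bytes(dtd[0:2], "little") * 10
    h_active = dtd[2] | ((dtd[4] >> 4) << 8)
    v_active = dtd[5] | ((dtd[7] >> 4) << 8)
    return h_active, v_active, pixel_clock_khz

# Hand-built sample descriptor for a 3840x2160 mode (only the first
# 8 bytes matter for this sketch; the rest hold sync/size details).
sample = bytes([
    0x4D, 0xD0,  # pixel clock: 0xD04D * 10 kHz = 533,250 kHz
    0x00, 0x30,  # h active low byte, h blanking low byte
    0xF2,        # high nibbles: h active 0xF00, h blanking 0x200
    0x70, 0x5A,  # v active low byte, v blanking low byte
    0x80,        # high nibbles: v active 0x800, v blanking 0x000
]) + bytes(10)

print(parse_dtd(sample))  # (3840, 2160, 533250)
```

A 4096x2160 descriptor reported first by a TV is exactly the "top res" that the handful of misbehaving games latch onto.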
 
Last edited:

Kaldaien

Developer of Special K
Verified
Aug 8, 2020
298
So is capping this at 30 fps with a 2080 super with RT at 4k a doable thing?
Can't speak for a 2080 Super, but my 2080 Ti is perfectly happy. Most framerate issues are coming from the graphics API and not the GPU though, so you defeat those problems with a faster CPU.
 

jim2011

Member
Oct 27, 2017
233
Maybe it's just me but capping at 30FPS looks terrible in this game. I have no issues with many console games at 30FPS but it just doesn't look smooth here.
 

Kaldaien

Developer of Special K
Verified
Aug 8, 2020
298
Capping to 30 FPS using what?

If you're speaking of the in-game framerate limiter, no... that's not smooth. RTSS can do a better job, and I can give you a 30 FPS so smooth that percentiles stop having meaning :)

[Image: 92a25fee47811b881e84062895f9996279a4f899_2_1152x363.png]


I was expressing frustration at this limit being applied during FMVs, but their limiter sucks equally during gameplay :(
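A toy illustration of why external limiters pace better than naive caps: sleeping for the whole remainder of the frame budget is at the mercy of OS timer granularity, so a common trick is to sleep for most of it and busy-wait the last slice. This is only a sketch of that hybrid wait; Special K and RTSS are far more sophisticated than this:

```python
import time

FRAME_BUDGET = 1 / 30   # ~33.3 ms per frame
SPIN_MARGIN = 0.002     # busy-wait the last 2 ms for precision

def wait_for_next_frame(deadline: float) -> float:
    """Sleep coarsely, then spin until the deadline; return the next deadline."""
    while True:
        remaining = deadline - time.perf_counter()
        if remaining <= 0:
            break
        if remaining > SPIN_MARGIN:
            # Coarse sleep; the actual wake-up time is at the scheduler's mercy
            time.sleep(remaining - SPIN_MARGIN)
        # else: fall through and spin (burns CPU, but lands close to the deadline)
    return deadline + FRAME_BUDGET

# Simulate a few frames of "render, then wait out the rest of the budget"
deadline = time.perf_counter() + FRAME_BUDGET
stamps = []
for _ in range(5):
    stamps.append(time.perf_counter())  # stand-in for rendering a frame
    deadline = wait_for_next_frame(deadline)
```

Note the deadline advances by a fixed budget rather than being recomputed from "now", so a late frame doesn't shift every subsequent frame, which is what keeps the percentiles flat.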
 

Kaldaien

Developer of Special K
Verified
Aug 8, 2020
298
Is this a similar thing to RE2R where the film grain hides the colour banding?
More or less, though the sky is always one of the most visible places for banding in HDR.

10 bits is not enough precision to handle the range this game renders at, so they have to resort to tricks to hide problem areas.

---

I made a video on this particular subject, in fact...

www.youtube.com — HDR10 vs. scRGB (without vidcap confusion): "Same as previous upload, only the driver isn't confused into believing this to be a 10-bit format (thanks NvAPI, you're useless)"

I've modified the engine to render into a 16-bit framebuffer and done image processing myself. I've eliminated a lot of the banding issues, and also created a visualization showing off why you want more precision than the engine uses.
 
Last edited:

Mizkreant

Member
Mar 22, 2019
7
RTX 2060. I'm also playing at 1080p, so the GPU is probably not the main issue. I have a new CPU coming this week; I'll report what kind of difference it makes.

I went ahead and bought the game. At 1440p Medium it's not awful. It's playable, but I wouldn't say it's great. Walking around the city I get 52 fps avg., with a 1% low of 32.
 

jim2011

Member
Oct 27, 2017
233
Capping to 30 FPS using what?

If you're speaking of the in-game framerate limiter, no... that's not smooth. RTSS can do a better job, and I can give you a 30 FPS so smooth that percentiles stop having meaning :)

[Image: 92a25fee47811b881e84062895f9996279a4f899_2_1152x363.png]


I was expressing frustration at this limit being applied during FMVs, but their limiter sucks equally during gameplay :(

Thanks, I'll try RTSS. I mostly used the in-game limiter but thought I had tried Nvidia Control Panel as well.
 

KTroopA

Member
Oct 27, 2017
2,964
London, UK
Use the Sparse option for Vsync in game - it is half refresh rate vsync.

I have been messing with capping fps and used the in-game limiter set to 30fps. I tried Sparse under vsync but wasn't sure what combination I needed for my monitor's refresh rate. Some settings created stutter/judder when panning the camera. Can I get some advice here please? My TV is 1080p.