
Kaldaien

Developer of Special K
Verified
Aug 8, 2020
298
What's weird about the HDR in this game is that there's no adjustment for the UI's luminance. I think this is what is throwing everyone off into believing that HDR is not enabled. So then they go the other way and actually disable HDR, and when that causes the image to blow out and brighten everything -- UI included -- they conclude that they've now enabled HDR.

I think the game may be taking the SDR reference level defined in the Windows HDR Color control panel (that slider you have to adjust) and using it for the game's HUD luminance target. I would do that in my own software, except only UWP DLLs can read those system preferences.

In any event, make sure to play the game in Borderless mode if you want HDR to work correctly. Fullscreen mode does not change the colorspace; you get SDR in a 16-bit framebuffer.
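For what it's worth, here's a minimal C++ sketch of what reading that preference could look like, assuming the Win32 DISPLAYCONFIG_SDR_WHITE_LEVEL query (added in Windows 10 1709) reflects the same slider value:

#include <windows.h>
#include <vector>
#include <cstdio>

// Sketch: query the "SDR content brightness" slider for each active display.
// Assumes DISPLAYCONFIG_SDR_WHITE_LEVEL (Windows 10 1709+) mirrors the value
// set in the Windows HD Color settings panel.
int main()
{
    UINT32 pathCount = 0, modeCount = 0;
    GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS, &pathCount, &modeCount);

    std::vector<DISPLAYCONFIG_PATH_INFO> paths(pathCount);
    std::vector<DISPLAYCONFIG_MODE_INFO> modes(modeCount);
    QueryDisplayConfig(QDC_ONLY_ACTIVE_PATHS, &pathCount, paths.data(),
                       &modeCount, modes.data(), nullptr);

    for (UINT32 i = 0; i < pathCount; ++i)
    {
        DISPLAYCONFIG_SDR_WHITE_LEVEL level = {};
        level.header.type      = DISPLAYCONFIG_DEVICE_INFO_GET_SDR_WHITE_LEVEL;
        level.header.size      = sizeof(level);
        level.header.adapterId = paths[i].targetInfo.adapterId;
        level.header.id        = paths[i].targetInfo.id;

        if (DisplayConfigGetDeviceInfo(&level.header) == ERROR_SUCCESS)
            // SDRWhiteLevel is in 1/1000ths of the 80-nit SDR reference white.
            printf("Display %u: SDR white level = %.0f nits\n",
                   i, level.SDRWhiteLevel / 1000.0 * 80.0);
    }
    return 0;
}

A game using that value as its HUD target would map UI white to that many nits instead of blasting it at peak brightness.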
 
Last edited:

TinTuba47

Member
Nov 14, 2017
3,801
3080, and the performance is garbage.

1440p, I've turned a bunch of stuff off, and shit still stutters constantly, in game and in cut scenes. I would refund if I could
 
Oct 27, 2017
3,962
For a TV? YCbCr 4:4:4 is always best. RGB has to be converted to YCbCr for image processing and then gets converted back to RGB. TVs have never been particularly concerned with doing that conversion in as few steps as possible, and this is where their latency issues come from: pipelining a bunch of conversions and processing steps that could be done as a single step but traditionally have not been, to save cost.
appreciate the info. What about for an actual PC monitor? RGB?
 
Oct 27, 2017
3,962
What's weird about the HDR in this game is that there's no adjustment for the UI's luminance. I think this is what is throwing everyone off into believing that HDR is not enabled. So then they go the other way and actually disable HDR, and when that causes the image to blow out and brighten everything -- UI included -- they conclude that they've now enabled HDR.

I think the game may be taking the SDR reference level defined in the Windows HDR Color control panel (that slider you have to adjust) and using it for the game's HUD luminance target. I would do that in my own software, except only UWP DLLs can read those system preferences.

In any event, make sure to play the game in Borderless mode if you want HDR to work correctly. Fullscreen mode does not change the colorspace; you get SDR in a 16-bit framebuffer.
oh man... we now have ppl stating you need to be in exclusive mode for it to work haha. which one is right!?!? :)
 

Mercador

Member
Nov 18, 2017
2,840
Quebec City
Do you think it will get better after a few patches, or is this the performance level we're stuck with? I got a 1070 but I think I'll wait for my next GPU. I wonder how it runs on old gen...
 
Oct 27, 2017
1,367
With HDR enabled, opening the menu and tabbing to Graphics consistently disables HDR. The menu shows it's still on, but it's not. If I toggle off, apply, toggle on, apply, it works again. This is reproducible.

Played for 3 hours last night, from before you rescue your crew until after the first two story objectives at your base, then went to save manually before calling it a night. The game got stuck at the "saving" screen. Eventually I force quit, and the save didn't take. No auto saves between those two points either, which is insane considering how many mission objectives and new areas I reached.

Performance is OK. Playing from an NVMe SSD with a Ryzen 3600, RTX 2080, and 32 GB of 3600 MHz CL16 RAM. Playing at 2160p resolution with HDR, vsync on, res scaling turned down to 70%. Graphics preset on High except for AA, which is on Low. Getting 60fps in most cases with some occasional dips down to the mid 50s. Haven't made it past your home base, so not sure if there are more graphically demanding areas.
 
Oct 25, 2017
1,402
For HDR I've used both Windows and game HDR on, and also Windows off with game on. Both these worked fine for me in fullscreen. I was blown away by the HDR from the start, so I was surprised people were having issues since it worked for me no problem.

Performance has been alright. On an i9 9900KS/3090 I'm in the mid 50s in towns and 60+ everywhere else at ultra settings. Not great, but I'll take it.
 

Relix

Member
Oct 25, 2017
6,223
So everyone else getting ugly dips in cutscenes? Like drops to 40fps? I am playing 4K and generally it's very close to 60 using the Dynamic res option but cutscenes kill my frames
 

Kaldaien

Developer of Special K
Verified
Aug 8, 2020
298
oh man... we now have ppl stating you need to be in exclusive mode for it to work haha. which one is right!?!? :)
I don't like to stir the pot, but... I am a graphics programmer with custom tools to trace the execution of graphics engines as they engage HDR.

My own logs emphatically show that only borderless window mode is doing BOTH: a 16-bit SwapChain AND applying Rec 709 + linear gamma (scRGB). That is one of the two HDR formats available on Windows for rendering; Rec 709 + 2.2 gamma is not, and that's what you're getting in fullscreen mode.
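To put that in DXGI terms, a minimal sketch (my paraphrase, not the game's actual code) of what doing BOTH steps looks like:

#include <dxgi1_4.h>

// Sketch of the two things that together produce scRGB HDR output:
// (1) a 16-bit float back buffer, (2) the Rec 709 + linear gamma (G10)
// color space. Doing only (1) is exactly "SDR in a 16-bit framebuffer".
HRESULT MakeScRGBSwapChain(IDXGIFactory2* factory, IUnknown* device,
                           HWND hwnd, IDXGISwapChain3** out)
{
    DXGI_SWAP_CHAIN_DESC1 desc = {};
    desc.Format      = DXGI_FORMAT_R16G16B16A16_FLOAT; // (1) 16-bit SwapChain
    desc.BufferCount = 2;
    desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.SwapEffect  = DXGI_SWAP_EFFECT_FLIP_DISCARD;  // flip model, needed for HDR
    desc.SampleDesc  = { 1, 0 };

    IDXGISwapChain1* sc1 = nullptr;
    HRESULT hr = factory->CreateSwapChainForHwnd(device, hwnd, &desc,
                                                 nullptr, nullptr, &sc1);
    if (FAILED(hr)) return hr;

    hr = sc1->QueryInterface(IID_PPV_ARGS(out));
    sc1->Release();
    if (FAILED(hr)) return hr;

    // (2) Rec 709 primaries + linear gamma = scRGB.
    return (*out)->SetColorSpace1(DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709);
}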
 

Relix

Member
Oct 25, 2017
6,223
I don't like to stir the pot, but... I am a graphics programmer with custom tools to trace the execution of graphics engines as they engage HDR.

My own logs emphatically show that only borderless window mode is doing BOTH: a 16-bit SwapChain AND applying Rec 709 + linear gamma (scRGB). That is one of the two HDR formats available on Windows for rendering; Rec 709 + 2.2 gamma is not, and that's what you're getting in fullscreen mode.
Thanks for this. I'll try borderless to get the full HDR. What about performance? Is it affected by being borderless?
 

Vic20

Member
Nov 10, 2019
3,289
I've noticed something weird in the cutscenes: every time the camera cuts to a different angle, the textures become muddy and then sharpen again. Can anyone check if this happens to them?
 
Oct 27, 2017
3,962
Closest that currently exists is this:


I'm in the process of getting LDAT hardware from NVIDIA to collect data for an upcoming blog series on the design of my framerate limiter. I intend to release it as an open source library for inclusion in as many third-party projects as possible. It does, obviously, work as a standalone product too, but I'm trying to spread the open source love far and wide :)

Discussion on tuning the limiter for this game begins here:

discourse.differentk.fyi

Topic-Free Mega Thread - v 1.11.2020

[API.Hook]
LastKnown=128
d3d9=false
d3d9ex=false
OpenGL=false
d3d11=true
d3d12=true

[Render.FrameRate]
TargetFPS=60.0
SleeplessRenderThread=true
SleeplessWindowThread=true
MaxBusyWaitPercent=0.0
PreRenderLimit=6
BackBufferCount=6

[Render.DXGI]
UseFlipDiscard=true
SwapChainWait=1
...

1. where do the dxgi files go for valhalla?
2. when running SKIF it defaults to the Steam games list but where do I go to add games from Ubisoft like Valhalla?
 
Oct 27, 2017
3,962
I don't like to stir the pot, but... I am a graphics programmer with custom tools to trace the execution of graphics engines as they engage HDR.

My own logs emphatically show that only borderless window mode is doing BOTH: a 16-bit SwapChain AND applying Rec 709 + linear gamma (scRGB). That is one of the two HDR formats available on Windows for rendering; Rec 709 + 2.2 gamma is not, and that's what you're getting in fullscreen mode.
I will of course trust in you above all :)
 

hlhbk

Member
Oct 25, 2017
3,117
3080, and the performance is garbage.

1440p, I've turned a bunch of stuff off, and shit still stutters constantly, in game and in cut scenes. I would refund if I could

I am getting the exact same performance. My specs:

Intel 8700K
32 GB RAM
Nvidia RTX 3080
Win 10
SSD

I'm hoping this will be fixed in patches/drivers in the future.

On another note, if I turn HDR on in game, everything is far too dark to see anything.
 

Kaldaien

Developer of Special K
Verified
Aug 8, 2020
298
1. where do the dxgi files go for valhalla?
2. when running SKIF it defaults to the Steam games list but where do I go to add games from Ubisoft like Valhalla?
1. Default would be: C:\Program Files (x86)\Ubisoft\Ubisoft Game Launcher\games\Assassin's Creed Valhalla

2. The version that ships with SKIF is not compatible with this game; it was missing IDXGIDebugInterface1 (NVIDIA Aftermath uses that for crash reporting)

Here is a link to an updated pre-release version of 0.11.1 that's compatible with this game:

discourse.differentk.fyi

Topic-Free Mega Thread - v 1.11.2020

Mostly the reason for that is because I can’t tune this stuff in-game, so I just pushed that setting to its limit rather than re-run the benchmark dozens of times 😛 There’s no real harm, probably no real benefit either, but I’m blind without my overlay giving me information in real-time. BTW...
 

icecold1983

Banned
Nov 3, 2017
4,243
Looks just like Odyssey and performs dramatically worse. Sounds about right for ubi + dx12. Same situation in Watch Dogs.
 

TheMadTitan

Member
Oct 27, 2017
27,235
The constant stuttering -- not even microstutters -- and frame drops into the teens are really starting to get on my nerves. No matter what framerate I choose, vsync option, or graphical settings, I can't even get the drops to just settle around the 40s. If people with 3080s are having issues, I'm not sure how I'm gonna fare with my 1080 long term.
 

-Tetsuo-

Unlimited Capacity
Member
Oct 26, 2017
12,577
Oddly enough, I updated my drivers today and was getting worse performance and some pretty bad stuttering. Rolled back to 457.09 and it is fine again.
 

bodine1231

Banned
Nov 16, 2017
194
I'm not having any issues at all with my 3080, running 4k 60+, usually 60-65 in busy areas/towns and 70+ when adventuring. Playing on a 77" CX with gsync/HDR. If you are having micro stutters using gsync and you use Rivatuner, make sure to uncheck anything with 'power' in the monitoring. It causes microstuttering with gsync.

These are the settings that helped:

Lower clouds to high
Shadows very high
Anti aliasing to low (10fps increase, don't need full AA at 4k)

Everything else maxed.
 

Nintendo

Prophet of Regret
Member
Oct 27, 2017
13,383
For a TV? YCbCr 4:4:4 is always best. RGB has to be converted to YCbCr for image processing and then gets converted back to RGB. TVs have never been particularly concerned with doing that conversion in as few steps as possible, and this is where their latency issues come from: pipelining a bunch of conversions and processing steps that could be done as a single step but traditionally have not been, to save cost.

What about for HDR? YCbCr 4:2:2 12bit?
 

Kaldaien

Developer of Special K
Verified
Aug 8, 2020
298
The constant stuttering -- not even microstutters -- and frame drops into the teens are really starting to get on my nerves. No matter what framerate I choose, vsync option, or graphical settings, I can't even get the drops to just settle around the 40s. If people with 3080s are having issues, I'm not sure how I'm gonna fare with my 1080 long term.
Have you tried Special K? A few posts prior, I linked to it. If there's one thing SK's good at, it's removing worst-case performance spikes. That's literally the entire reason it was built.
 

Deepo

Member
Oct 25, 2017
252
Norway
This is a bit off-topic, but I just tried the game on Xbox Series X using a VRR display (LG C9), and I'm really struggling to see any real differences from my 3090 equipped PC at Ultra. A bit of pop in here and there, but that's about it. Very impressed!
 

Lockjaw333

Member
Oct 28, 2017
764
This is a bit off-topic, but I just tried the game on Xbox Series X using a VRR display (LG C9), and I'm really struggling to see any real differences from my 3090 equipped PC at Ultra. A bit of pop in here and there, but that's about it. Very impressed!
Agreed. Bought it on Series X, playing on a Sony X950G. Also subbed to a month of Uplay+ to check it out on PC. The Series X version looks like it's at the very least equal to the Very High preset, some things maybe on Ultra. Pretty much a 60fps lock; no idea if resolution is dynamically scaling to hit that, but it looks native most of the time.

On PC with a 3080 and 10700K I'm having trouble keeping it at 60fps, and that's with AA on Low, which is probably some sort of sub-native reconstruction like previous games. Graphics-wise the game looks the same as Series X; I'm using the Hardware Unboxed settings, which only turn down a few things.

It also loads faster on Series X when fast traveling and such, it's like instant. I'm playing on an SSD on PC and it's just not as fast. Add in the instant resume and it's just better on X right now IMO.

Very impressed with the Series X version.

Also, the cross-platform cloud saves are not working quite right. Seems my progress on the console is uploading to the cloud, but I can't see that save on PC until hours later. Seems to be the same in reverse too. Not sure if anyone else is experiencing that.
 

Addnan

Member
Oct 28, 2017
65
Anyone have issues with tabbing in and out of the game? Just now the game wouldn't go back to rendering at full speed, was stuck at like 5fps, and had to be restarted. I love going in and out of the game...
 

Slammey

Member
Mar 8, 2018
323
3950X
Mem at 3600 MHz
Strix 2080 Ti
Win 10, latest version

Everything maxed out at 1440p, 100% scaling
HDR on
Reshade
LG CX 48"

Cutscenes are indeed in the 30-40s

But gameplay wise, getting from 55-60 to 80-90 depending on night or day. Just got to my home settlement.
 

Koldanar

Member
Oct 1, 2020
135
1. Default would be: C:\Program Files (x86)\Ubisoft\Ubisoft Game Launcher\games\Assassin's Creed Valhalla

2. The version that ships with SKIF is not compatible with this game; it was missing IDXGIDebugInterface1 (NVIDIA Aftermath uses that for crash reporting)

Here is a link to an updated pre-release version of 0.11.1 that's compatible with this game:

discourse.differentk.fyi

Topic-Free Mega Thread - v 1.11.2020

Mostly the reason for that is because I can’t tune this stuff in-game, so I just pushed that setting to its limit rather than re-run the benchmark dozens of times 😛 There’s no real harm, probably no real benefit either, but I’m blind without my overlay giving me information in real-time. BTW...

Has anyone done this with the Epic Games Store version of the game? I've had no luck so far; I assume something must be running in the background that's interfering. I've turned off RTSS and Xbox Game Bar, and killed just about every process I can before running, with no luck.
 

Relix

Member
Oct 25, 2017
6,223
Just played a few hours. Dropped resolution scale to 90 while keeping adaptive quality. Managed to hit 60 most of the time. Everything ultra high except shadows and volumetric clouds on a slightly OCed 3080.
 

JudgmentJay

Member
Nov 14, 2017
5,220
Texas
This is a bit off-topic, but I just tried the game on Xbox Series X using a VRR display (LG C9), and I'm really struggling to see any real differences from my 3090 equipped PC at Ultra. A bit of pop in here and there, but that's about it. Very impressed!

Honestly the solid performance on consoles together with the fact that it's not on Steam means I'll probably just get this on PS5 if I do ever decide to play it. I wish Ubisoft put more effort into their PC ports.
 

Deepo

Member
Oct 25, 2017
252
Norway
The HDR on the Xbox Series X looks exactly the same as on the PC version by the way, which is to say, not very good imo.
 

Shifty Capone

Member
Oct 27, 2017
620
Los Angeles
I'm not having any issues at all with my 3080, running 4k 60+, usually 60-65 in busy areas/towns and 70+ when adventuring. Playing on a 77" CX with gsync/HDR. If you are having micro stutters using gsync and you use Rivatuner, make sure to uncheck anything with 'power' in the monitoring. It causes microstuttering with gsync.

These are the settings that helped:

Lower clouds to high
Shadows very high
Anti aliasing to low (10fps increase, don't need full AA at 4k)

Everything else maxed.

To anyone on a 4k display or similar, follow this. This boosted my FPS so much. I am on a 3090 and was getting somewhat frustrated with certain aspects. Putting AA to low etc. barely changed what the game looks like, but I now average 90 FPS; more importantly, my drops went from 20fps to 60. 5120x1440. Thanks, didn't even think of doing it earlier for some reason.
 
Oct 27, 2017
3,962
All credit to Hardware Unboxed, but I find it handy to have the optimized settings as an image in a thread like this for reference:

[image: Hardware Unboxed optimized settings chart]
Adapt
Contemporary TVs do not need 12-bit for the range of luminance they support. You are better off without chroma subsampling: if you have a choice between 12-bit 4:2:2 and 10-bit 4:4:4, 10-bit is better.
how come my samsung q90t and 2070 super only have the 8-bit 4:4:4 option and not 10-bit?
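That's almost certainly an HDMI 2.0 bandwidth limit rather than a settings problem. A quick back-of-the-envelope check in C++, assuming the standard ~594 MHz 4K60 pixel clock and HDMI 2.0's ~14.4 Gbps effective payload:

#include <cstdio>

// Back-of-the-envelope HDMI 2.0 bandwidth check for 4K60.
// Assumptions: ~594 MHz pixel clock incl. blanking; 8b/10b encoding leaves
// roughly 14.4 Gbps of HDMI 2.0's 18 Gbps for pixel data.
int main()
{
    const double pixelClockHz = 594e6;
    const double budgetGbps   = 14.4;

    struct { const char* format; int bitsPerPixel; } modes[] = {
        { "8-bit 4:4:4 / RGB",  24 },
        { "10-bit 4:4:4 / RGB", 30 },
        { "12-bit 4:2:2",       24 }, // HDMI carries 4:2:2 at 24 bpp
        { "10-bit 4:2:0",       15 },
    };

    for (const auto& m : modes) {
        double gbps = pixelClockHz * m.bitsPerPixel / 1e9;
        printf("%-18s %5.2f Gbps -> %s\n", m.format, gbps,
               gbps <= budgetGbps ? "fits HDMI 2.0" : "needs HDMI 2.1");
    }
    return 0;
}

So 10-bit 4:4:4 at 4K60 simply doesn't fit over HDMI 2.0, which is why the TV only offers 8-bit 4:4:4 (or subsampled formats) there.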
 
Oct 27, 2017
3,962
1. Default would be: C:\Program Files (x86)\Ubisoft\Ubisoft Game Launcher\games\Assassin's Creed Valhalla

2. The version that ships with SKIF is not compatible with this game; it was missing IDXGIDebugInterface1 (NVIDIA Aftermath uses that for crash reporting)

Here is a link to an updated pre-release version of 0.11.1 that's compatible with this game:

discourse.differentk.fyi

Topic-Free Mega Thread - v 1.11.2020

Mostly the reason for that is because I can’t tune this stuff in-game, so I just pushed that setting to its limit rather than re-run the benchmark dozens of times 😛 There’s no real harm, probably no real benefit either, but I’m blind without my overlay giving me information in real-time. BTW...
all I see at that link is an immediate post with a download that only contains the dxgi .dlls, not a new SKIF/SpecialK download
 

hlhbk

Member
Oct 25, 2017
3,117
I'm not having any issues at all with my 3080, running 4k 60+, usually 60-65 in busy areas/towns and 70+ when adventuring. Playing on a 77" CX with gsync/HDR. If you are having micro stutters using gsync and you use Rivatuner, make sure to uncheck anything with 'power' in the monitoring. It causes microstuttering with gsync.

These are the settings that helped:

Lower clouds to high
Shadows very high
Anti aliasing to low (10fps increase, don't need full AA at 4k)

Everything else maxed.

I'll try this and report back.
 

Raydonn

One Winged Slayer
Member
Oct 25, 2017
919
www.pcgameshardware.de

Assassin's Creed Valhalla tech review: Gorgeous assassination - under DirectX 12 for the first time

PCGH runs its tech review of Assassin's Creed Valhalla and checks what hardware you should have to be ready for the conquest of Britain.
What's interesting to me is how the 5700 XT performs near a 2080 Ti/3070 at 1080p, but as the resolution increases, its performance drops to 2070 Super levels.
I'm guessing the game is not very Nvidia friendly, especially with all the mentions of stutters from the 3080 crowd.
 
Oct 28, 2019
5,974
Does anyone else get 100% GPU usage in conversations only? It's frying my laptop. I have the game locked to 30fps, so the load is far lower during gameplay, but during conversations it somehow stays at 30fps while the GPU load skyrockets.
 

CrichtonKicks

Member
Oct 25, 2017
11,209
Yeah I encourage everyone to give it a look with CPU-Z. I beat the first boss and his 'stage' without problems.

I really think there is something about this RAM speed thing. I don't know how it's possible that I'm running at a locked 60 fps at 4K (80% resolution) with Very High settings on a 2080 Ti/6700k when people with 3080s and 3090s do worse at 1440p. And I'm deep in the game at this point (haven't got to London yet though) so I don't expect it to change. Outside of the occasional cutscene stutter this game is performing flawlessly for me.
 
Oct 27, 2017
12,238
I really think there is something about this RAM speed thing. I don't know how it's possible that I'm running at a locked 60 fps at 4K (80% resolution) with Very High settings on a 2080 Ti/6700k when people with 3080s and 3090s do worse at 1440p. And I'm deep in the game at this point (haven't got to London yet though) so I don't expect it to change. Outside of the occasional cutscene stutter this game is performing flawlessly for me.
I also dropped the environment detail to medium because it would kill my framerate. I haven't tried bumping it back up to high to check if it can hold.
 

hankenta

Member
Oct 25, 2017
670
After landing in England, my experience with the game is considerably worse, from what I assume are VRAM issues on a GTX 970. Usually I get framerates ranging from 40-50 to the rare 60, which is fine enough for me on the aging card; I prefer fluctuating higher framerates to locking to 30.
But in England, performance will suddenly tank to 20-30 and stay there until I restart the game. I had no issues like that in Norway. When this happens, even the menus run below 30 instead of 60.
 

Ravelle

Member
Oct 31, 2017
17,801
I'm averaging 54 to 60 with drops into the 40s in spots, on a 2080 Super/i7 9700K at 1440p. And this is around the starting area, where it's just snow and water. Cutscenes hard cut to a locked 30, which is very jarring.

This can't be right
 

hlhbk

Member
Oct 25, 2017
3,117
I'm not having any issues at all with my 3080, running 4k 60+, usually 60-65 in busy areas/towns and 70+ when adventuring. Playing on a 77" CX with gsync/HDR. If you are having micro stutters using gsync and you use Rivatuner, make sure to uncheck anything with 'power' in the monitoring. It causes microstuttering with gsync.

These are the settings that helped:

Lower clouds to high
Shadows very high
Anti aliasing to low (10fps increase, don't need full AA at 4k)

Everything else maxed.

Tried it and the FPS went up and never dropped below 60, but the microstutters were still present as soon as the benchmark went over the water. Hopefully this is patched by Ubi.
 

SunBroDave

Member
Oct 25, 2017
13,156
Anti aliasing to low (10fps increase, don't need full AA at 4k)
Tried it and the FPS went up and never dropped below 60, but the microstutters were still present as soon as the benchmark went over the water. Hopefully this is patched by Ubi.
It's not AA and it's not 4k. The AA setting controls what internal resolution is used to reconstruct up to the chosen output resolution: AA set to High means you're running at your set resolution (so 4k), Medium means you're rendering at 90% of your set resolution and then reconstructing up to it, and Low is the same thing at 83%. Same as it was in Odyssey.
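Rough numbers for what that means at a 4K output, using those percentages (his figures, not official ones):

#include <cstdio>

// Internal render resolution per AA setting, per the explanation above.
// The 90% / 83% scale factors come from the post, not from Ubisoft.
int main()
{
    struct { const char* aa; double scale; } presets[] = {
        { "High",   1.00 }, // native output resolution
        { "Medium", 0.90 },
        { "Low",    0.83 },
    };
    const int outW = 3840, outH = 2160; // example: 4K output

    for (const auto& p : presets)
        printf("AA %-6s -> renders ~%4.0f x %4.0f, reconstructed to %d x %d\n",
               p.aa, outW * p.scale, outH * p.scale, outW, outH);
    return 0;
}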