Ultra clouds in Horizon Zero Dawn cuts the frame rate by 20 fps with nothing to show for it.
Ultra shouldn't be anything more than a scalability test, and until we have a 3070 it's useless.
Hard to take anyone seriously when they spout the lazy dev rhetoric.
Well, obviously, but NVIDIA tried to bullshit-market it as an 8K card, so it's still funny to see this.
It's ultra settings....ultra.
Why would you boot a modern game in ultra?
Some of these comments....
The video really showed off how great the game can look at max settings. The problem is that the character animations look more jarring than ever before.
It's probably similar to Odyssey, where you put certain settings from high to medium without much visual effect but a huge fps gain.
Then why doesn't it look like ultra? Looks avg as hell and def not ULTRAAAAA.
Except it doesn't look great at all, because graphics are more about talent, budget and passion than brute power.
Nah, they're not lazy, but the game definitely looks average compared to the most beautiful games of this gen, so there's no real reason fps are so low on newer GPUs.
It's not lazy though, that's my point.
Everyone will proceed to act shocked that a Ubisoft open world game with tons of graphical options runs poorly fully maxed out. Like, it doesn't matter how a game runs fully maxed if it scales well (both in terms of visuals and perf). RDR 2 maxed out is quite insane, but the game gets an insane perf boost at the cost of very little visual detail by stepping down some settings.
A lot of optimizations actually get turned off when you max games out.
God damn, that's with a 3090 too? Hoping my 2080 can get at least 1440p60 with some RTX and DLSS loving.
Yeah, hoping for 1440p60 with RTX on and DLSS Quality on my 2080 Ti.
Doesn't hold 4K60 at max settings even with ray tracing off.
Can hold 60 fps with ray tracing on in "Performance" and "Ultra Performance" DLSS modes.
I am sure better results are possible with some tweaking, but this game is heavy on the GPU.
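For context on why those DLSS modes help so much, here's a rough sketch of the internal render resolutions at a 4K output, using the commonly cited per-axis scale factors (these are assumptions for illustration, not values confirmed for this particular game):

```python
# Rough sketch: internal render resolutions for DLSS modes at a 4K output.
# Per-axis scale factors are the commonly cited values (assumptions, not
# confirmed for this title).
OUTPUT_W, OUTPUT_H = 3840, 2160

DLSS_SCALES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

for mode, scale in DLSS_SCALES.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    share = (w * h) / (OUTPUT_W * OUTPUT_H)
    print(f"{mode:>17}: {w}x{h} (~{share:.0%} of native 4K pixels)")
```

Ultra Performance renders only around a tenth of the native pixel count, which is why it can hold 60 fps where native 4K can't.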
Do we know if this benchmark is accurately representative of in-game performance? Wouldn't be the first time the in-game benchmark was either too light or too heavy.
Exactly my thought, the whole benchmark is more or less worthless without going into this.
I'm betting it's something like the difference between Shadows Ultra and High being negligible visually, but like 50 fps technically.
The benchmark has Shadows and Reflections under 'Major Impact Options'.
It's better that it's not destroying the CPU this time around though, average fps is 52 when running everything maxed out at 4K with DLSS Quality. 59% load on a 10900K when using Ultra Performance mode at 4K, although that's with 0% extra details. A 3080 should be ok maxed out at 1440p with DLSS Quality to hit 60. To be honest, from the looks of it this should actually scale very well with future CPUs/GPUs, as opposed to WD2, which is a little funky.
~60% load on that i9 at 4K is actually an indication that it's quite CPU heavy (or utilizes multicore very well), as this is a very, very GPU-bottlenecked scenario.
Ultra settings are almost always badly optimized and run like shit even on top tier hardware, for minimal visual fidelity gains compared to high.
Yeah, I tend to prefer high settings, and sometimes use a few medium settings here and there.
So you're insinuating that the devs are not talented and just threw this together with a low budget and without any passion?
Weird take.
You're right, I'm wrong on this.
Turn some settings down and it probably can.
As always in modern open world games, the Ultra preset will have some specific setting that is tanking the framerate by 20-30%, and lowering it will barely affect visuals while massively improving the framerate.
This is something PC gamers should hold close to the chest; it has happened over and over again, so the 'ultra settings' test is only so much of an actual showing of what performance will look like with optimized settings (this doesn't excuse the likelihood that this title could've been optimized further, it is a Ubisoft title). What I think should be highlighted here, though, is the hammering DLSS is doing for performance. More titles having DLSS means smoother gameplay for all PC gamers (on relatively contemporary hardware).
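To put that 20-30% figure into fps terms, here's a back-of-the-envelope sketch. It assumes the 52 fps maxed-out 4K DLSS Quality average quoted above as the baseline, which is just an assumption for the arithmetic, not a measured result for any specific setting:

```python
# If one Ultra setting is "tanking the framerate by 20-30%", lowering it
# should recover roughly fps_on / (1 - fractional_cost).
# Baseline: the 52 fps maxed-out 4K DLSS Quality average quoted in this
# thread (assumed here for illustration).
baseline_fps = 52

for cost in (0.20, 0.25, 0.30):
    recovered_fps = baseline_fps / (1 - cost)
    print(f"{cost:.0%} cost -> ~{recovered_fps:.0f} fps with the setting lowered "
          f"(+{recovered_fps - baseline_fps:.0f} fps)")
```

Under those assumptions a single setting in that range is worth roughly 13-22 fps, which lines up with the "huge fps gain for barely any visual change" observations elsewhere in the thread.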
Imagine doing 10 different benchmark runs but being too stubborn to try "High" settings even once lol.
So true.
Yes, 8K*.
You seem completely ignorant of how demanding native 4K resolution still is.
We have not reached the point where GPU rasterization has surpassed the demands of 3840 x 2160, especially not in open world games.
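For a sense of scale, here's the raw pixel math behind that point. Rasterization cost doesn't scale perfectly linearly with pixel count, so treat this as a first approximation rather than a performance prediction:

```python
# Pixels shaded per frame at common resolutions: native 4K pushes 4x the
# pixels of 1080p and 2.25x the pixels of 1440p, before any open-world
# draw-distance or geometry costs are even considered.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
```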