It's a cross-gen game. How do you know it's not properly leveraging XSX capabilities? We haven't even seen actual gameplay or a comparison.
I didn't have any issues playing both Origins and Odyssey on Xbox One X.
Both were 4K or dynamic 4K at a solid 30fps with HDR while still looking stunning, therefore properly leveraging all of the X1X's capabilities.
That does not seem to be the case on XSX if 4K/30 with no other modes is all they will offer in the final version, for reasons already explained across many pages in different topics.
People need to understand that PC benchmarks are not performance guides. They are for comparing GPUs relative to each other.
For reference, anyone expecting 4K/60 with RT should look at the performance of a 2080 Ti at 4K, which is more powerful than either of the new consoles coming out.
Consoles don't use ultra settings, and neither should you.
My rig cost me about $2,400 and can't even hit 4K/60/Ultra in Odyssey. You're not building a PC for around $1,000 and getting 4K/60/Ultra. The best Ampere cards alone will cost that much.
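On the "benchmarks are relative" point, here's a minimal Python sketch of what a benchmark chart is actually telling you; all the fps numbers and GPU names below are made up purely for illustration:

```python
# Normalize hypothetical benchmark results against a baseline card.
# The useful takeaway is the ratio between GPUs, not the raw fps,
# which shifts with settings, drivers, and the test scene.
results_4k_ultra = {
    "GPU A (baseline)": 40.0,   # invented numbers, illustration only
    "GPU B": 52.0,
    "GPU C": 31.0,
}

baseline = results_4k_ultra["GPU A (baseline)"]
for gpu, fps in results_4k_ultra.items():
    print(f"{gpu}: {fps / baseline:.2f}x the baseline")
```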
I would just not support evident development laziness at full price.
But then, how will it be a worse experience than what you played in the other ACs? Expectations? That's what I am trying to say: you are setting the bar to get disappointed. I do understand the wish for something better (in all aspects) with next gen, but arbitrary and inflexible expectations will just interfere with us simply enjoying what we get (I can't make any judgement on AC: V of course, it could be crap for all I know).
It's a cross-gen game.
It will release all the way down to the weakest OG Xbox One; we're not talking about a true next-gen "device-killer" title here, plus we've all seen the in-game supercut.
Damn, it won't even support Ray Tracing on new consoles...
A flag switch from Medium/High settings to Ultra shouldn't kill 50% of the performance on a device 4x more powerful CPU-wise, 8x more powerful GPU-wise, and 35x more powerful I/O-wise.
And Valhalla, even if it does look better than Odyssey, does not seem leaps and bounds beyond it, judging from the in-game supercut we've already seen.
I would just not support evident development laziness at full price.
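For what it's worth, those multipliers roughly line up with the public launch specs (base Xbox One vs Series X); a quick back-of-envelope check, with the old HDD throughput being an approximation:

```python
# Rough sanity check of the 4x/8x/35x claim using public specs.
xbox_one = {"gpu_tflops": 1.31, "io_mb_s": 100}    # base 2013 console, ~HDD speed (approx.)
series_x = {"gpu_tflops": 12.15, "io_mb_s": 2400}  # raw NVMe throughput

print(f"GPU: {series_x['gpu_tflops'] / xbox_one['gpu_tflops']:.1f}x")  # ~9.3x
print(f"I/O: {series_x['io_mb_s'] / xbox_one['io_mb_s']:.0f}x raw")    # ~24x, ~48x with compression
# The CPU jump (8 Jaguar cores at 1.75 GHz to 8 Zen 2 cores at 3.8 GHz)
# is commonly ballparked at ~4x, per Microsoft's own messaging.
```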
RDR2 still runs at native 4K, HDR, 30fps on Xbox One X.
I edited my post. The AC Valhalla trailer did have big upgrades over AC Odyssey. It has volumetric lighting similar to RDR2's. AC Odyssey didn't have it; it only had a fake post-process godray effect.
So far there is no confirmed RT.
Will the game support ray tracing too? Because judging by DF's last video, RT is so demanding that I think native 4K is not even possible at 30fps if they're gonna incorporate RT as well.
RDR2 still runs at native 4K, HDR, 30fps on Xbox One X.
This means it could go to 4K/60fps on XSX with a patch made in one day, if only Rockstar wants it, considering the power difference between the two machines.
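The back-of-envelope for that claim, assuming a purely GPU-bound scene that scales linearly (a big assumption for an open-world game):

```python
# Does ~2x the GPU buy ~2x the framerate at the same resolution?
x1x_tflops, xsx_tflops = 6.0, 12.15
current_fps = 30

gpu_ratio = xsx_tflops / x1x_tflops      # ~2.03x
projected_fps = current_fps * gpu_ratio  # ~61 fps, only if GPU-bound

print(f"GPU ratio: {gpu_ratio:.2f}x -> ~{projected_fps:.0f} fps")
# Caveat: a CPU-heavy scene (crowds, physics, streaming) can cap the
# framerate well below this, so "a patch made in one day" is optimistic.
```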
If things don't change, I will have zero issues skipping Valhalla at launch and playing it once it drops down to €19 or something.
Awww buddy, that devolved into something less debatable. "Lazy dev" is probably something we should remove from these discussions. It's not like Ubi became asset flippers. Well anyway, do what feels right for you, it's only games; just don't cut yourself off from nice things because you want to stand your ground.
You used RDR2's volumetric light for perspective, so did I as a resolution/framerate/fidelity/open-world combo to expect going from current gen to the next.
RDR2 has nothing to do with this though. Completely different games. Completely different engines. I only brought it up to explain what volumetric lighting is in case you didn't know. AC Valhalla does utilize XSX's power.
You used RDR2's volumetric light for perspective, so did I as a resolution/framerate/fidelity/open-world combo to expect going from current gen to the next.
You should know the X1X version of RDR2 is not the same as the PC version set to "Low"...
It's mixed: some parts are low, others medium, some even high, and there are some that go even below low!
I agree 100% with you. 30fps is completely fine. Good motion blur and, more importantly, good framepacing make 30fps perfectly fine.
Unpopular opinion: I think 30fps is fine for a lot of games, especially stuff like Assassin's Creed and Red Dead; I would rather take graphical fidelity over 60fps.
That said, 60fps is essential for games where you need quick reflexes, like DMC, fighting games, and racing games.
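The frame-time arithmetic behind that trade-off, as a tiny sketch:

```python
# Per-frame time budget at 30 vs 60 fps.
def frame_ms(fps: int) -> float:
    return 1000.0 / fps

for fps in (30, 60):
    print(f"{fps} fps -> {frame_ms(fps):.1f} ms per frame")
# 30 fps leaves 33.3 ms of render budget (more room for fidelity);
# 60 fps halves that to 16.7 ms, but also halves per-frame input
# latency, which is why reflex-heavy genres want it.
```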
Excuse me?
No, I didn't use it for perspective, only to explain what volumetric lighting looks like, because you didn't notice it in the ACV trailer. Anyway, I've just seen your "lazy devs" shit. Go away.
Might as well cancel next-gen if that's the best they can do.
People need to understand that PC benchmarks are not performance guides. They are for comparing GPUs relative to each other.
No one should actually be using ultra settings to play games unless it's already running so well that it makes no difference.
Consoles don't use ultra settings, and neither should you.
Ultra settings are typically throwing away significant amounts of performance for details that people struggle to see in a direct A/B comparison.
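A sketch of why that is, using a hypothetical frame-time budget (every millisecond figure below is invented for illustration):

```python
# Hypothetical per-setting costs showing the Ultra trade-off.
base_ms = 20.0  # everything else in the frame (invented)
shadows_ms = {"high": 3.0, "ultra": 9.0}       # invented costs
volumetrics_ms = {"high": 2.5, "ultra": 7.5}   # invented costs

for preset in ("high", "ultra"):
    total = base_ms + shadows_ms[preset] + volumetrics_ms[preset]
    print(f"{preset:>5}: {total:.1f} ms/frame -> {1000 / total:.0f} fps")
# high:  25.5 ms -> 39 fps
# ultra: 36.5 ms -> 27 fps, for differences most people can't
# pick out in a direct A/B comparison.
```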
And the game is still the best looking current-gen console game of all.
If things don't change, I will have zero issues skipping Valhalla at launch and playing it once it drops down to €19 or something.
I still have a long backlog to complete, and there will still be a lot of games to enjoy on XSX when it launches later this year (Halo Infinite first and foremost). ;)
Unpopular opinion: I think 30fps is fine for a lot of games, especially stuff like Assassin's Creed and Red Dead; I would rather take graphical fidelity over 60fps.
That said, 60fps is essential for games where you need quick reflexes, like DMC, fighting games, and racing games.
Cool, but my point is a 2080 Ti can barely run RDR2 at 4K/Ultra settings above 40fps. I don't see how anyone is getting 4K/60fps on the new consoles with the same settings.
The trick for consoles will be the same as it is for PC: "don't run it at Ultra settings if you want 4K, dummy". lol
Nowhere is it recommended or required for Ultra settings to be engaged.
I know. I'm just saying a 2080 Ti is brought to its knees like this, so I don't expect the new consoles to fare much better even at lower settings. If you want 4K in these bigger titles, you will have to settle for 4K/30fps on lower/medium settings. That's what I expect, at least.
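To put numbers on that: going from a ~40fps Ultra baseline to 60fps needs 1.5x the throughput, and (assuming a GPU-bound game with roughly linear pixel scaling) here's what that costs in resolution:

```python
# How much resolution a 40 -> 60 fps jump roughly costs, if GPU-bound.
baseline_fps, target_fps = 40, 60
headroom = target_fps / baseline_fps      # 1.5x more GPU throughput needed

native_4k = 3840 * 2160                   # ~8.3 MP
pixel_budget = native_4k / headroom       # ~5.5 MP at 60 fps
print(f"Need {headroom:.2f}x -> ~{pixel_budget / 1e6:.1f} MP "
      f"(vs {native_4k / 1e6:.1f} MP native)")
# ~5.5 MP is about 3200x1728, i.e. a dynamic-res drop from 4K,
# which is exactly the kind of compromise consoles usually make.
```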
Well, minimum, so only a little bit of a boo that it can't do 60fps at 4K.
I don't expect too much from third party cross-gen games. Especially the first batch. The hardware will often be used to basically just brute force stuff, I imagine. Whatever engine AC uses has been built around past hardware. And so has the game itself, for the most part.
It's the first party games that ought to demonstrate what the hardware is capable of, at least on a visual level.
And also, I guess some third party games like The Medium are next-gen only. Or maybe it'll be on PC too?
Kind of a weird wrinkle. Will they introduce NVMe SSDs to system requirements? Or not release these games on PC? Or make them work on slower storage solutions? I'm hoping for the first, but it's like... what's the adoption rate on those? It's commercial viability versus the integrity/ambition of the game design. I think money tends to win in most cases, so it'll probably be the last one.
Anyway I guess this is ramblingly going off-topic.
My question is: with all the reconstruction technology already available and its awesome performance, why settle for native 4K? It's a ridiculous compromise and bragging rights more than anything.
Please note I'm not a PC Gamer.
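The shaded-pixel math makes the case: a rough comparison of native 4K against two common reconstruction setups (checkerboard rendering and a 1440p internal resolution):

```python
# Shaded pixels per frame: native 4K vs common reconstruction inputs.
native_4k = 3840 * 2160           # ~8.29M pixels
checkerboard = native_4k // 2     # CBR shades half the grid each frame
internal_1440p = 2560 * 1440      # typical upscale/reconstruction input

for name, px in [("native 4K", native_4k),
                 ("checkerboard 4K", checkerboard),
                 ("1440p internal", internal_1440p)]:
    print(f"{name:>15}: {px / 1e6:.2f} MP ({px / native_4k:.0%} of native)")
# Reconstruction roughly halves the shading work for an image most
# people can't tell from native at normal viewing distance.
```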
Been shackled to notebook CPUs for a long-ass time now. Next-gen is finally a chance at fully capable CPUs, so a big jump was thought to be possible.
None of the Assassin's Creed games have ever run at 60fps on a console. I don't really know why you all would be expecting this generation to be different.
Not even notebook, netbook brah...
Ahh sorry, meant Netbook lol.
What if they hire Randy?
Think people are overestimating the capability of the next-gen consoles.
Top PC hardware struggles to run AC games at a stable 60fps.
Somehow people expected the consoles to deliver improved graphics, new graphical techniques like ray tracing, and run at 60fps.
Something's gotta give.
Remember the consoles are gonna cost around $500. There's only so much that can be done at that price. Sony/Microsoft don't know magic.
You know what bugs me the most about devs that push this narrative? It's that, most of the time, if you play their games on PC, they either lack a 30fps cap, or the cap is awful and delivers extremely unstable frametimes.
"At Ubisoft for a long time we wanted to push 60 fps. I don't think it was a good idea because you don't gain that much from 60 fps and it doesn't look like the real thing"
Nicolas Guérin, world level design director of AC Unity
"30 was our goal, it feels more cinematic. 60 is really good for a shooter, action adventure not so much. It actually feels better for people when it's at that 30fps. It also lets us push the limits of everything to the maximum."
Alex Amancio, Unity's creative director
Seems that this is still true for Ubisoft lol. More shocking is that people also believe this nonsense. 60fps is always better than 30fps, period.
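On the frame-pacing complaint above: a proper 30fps cap is not hard in principle. Here's a minimal sketch of a limiter that schedules each frame against an absolute deadline, so frametimes hold an even ~33.3ms instead of oscillating (which is what reads as judder):

```python
import time

TARGET_S = 1.0 / 30.0  # 33.3 ms per frame

def game_loop(num_frames: int = 10) -> None:
    next_deadline = time.perf_counter()
    for frame in range(num_frames):
        start = time.perf_counter()
        # ... simulate + render the frame here ...
        next_deadline += TARGET_S  # absolute schedule avoids drift
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)  # real engines busy-wait the last ~1 ms for precision
        print(f"frame {frame}: {(time.perf_counter() - start) * 1000:.1f} ms")

game_loop()
```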
Guess that's the issue with having an engine that is built around supporting old-ass tech. Maybe once Ubi moves Anvil over to next-gen and PC only, the jumps might be there.
After playing Odyssey on PC and seeing how it performs with a Zen 2, I think this game is more CPU-bound than anything else. I guess the AnvilNext engine still sucks and they'd rather keep 30fps than reduce the world density, etc., like they had to do in Unity to fix the consoles' framerate issues.
I just built a $2,000 PC that can't run AC Odyssey at 4K/60. I don't think there are even GPUs that exist that can brute-force it. If the 3000 series can, the card capable of it will certainly be $1,000+ on its own.