Seems like stuttering has become very common all of a sudden; what has changed? Or maybe I'm just imagining things... is this devs having trouble moving to DX12?
> Maybe they're min-maxing on the "safe" side, but for 1080P @ High settings, even an i7-6700 feels high for a recommended CPU.
It's a new-gen exclusive developed on consoles with 8c/16t CPUs; you should be pleasantly surprised the minimum CPU isn't an i5-8400 or 8600.
If I had to guess, I bet they have really bad stuttering on the engine/shader cache side, and the high-ish CPU requirements are to try to sidestep it as much as possible.
> Even if it's that, it is for minimum requirements, and it's an i7 at that. An i7 listed as a minimum requirement for any game, next-gen or not, seems weird.
It's still a 7-year-old quad core. At some point there are just literally no grounds to show bewilderment about ancient CPUs falling into obsolescence.
No option to turn off CA should be a crime!
> FFS this shit again. Can't we get any well-optimized ports on PC anymore?
It's a good port feature-wise, very good, but once again the shader demon strikes from the dark.
Just compile your fucking shaders at the start already.
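To make the "just compile them at the start" demand concrete, here's a minimal, engine-agnostic C++ sketch (all names are invented and the compile call is simulated with a sleep): the only difference between the stuttering path and the smooth one is whether the expensive compile happens on first use mid-frame or up front behind a loading screen.

```cpp
// Minimal sketch (not from any real engine) of on-demand vs. up-front
// pipeline compilation. compile_pipeline() stands in for the expensive
// driver-side compile that causes hitches when it runs mid-frame.
#include <chrono>
#include <string>
#include <thread>
#include <unordered_map>
#include <vector>

struct Pipeline { std::string id; };

// Stand-in for the expensive driver compile (tens of ms per PSO).
Pipeline compile_pipeline(const std::string& id) {
    std::this_thread::sleep_for(std::chrono::milliseconds(50));
    return Pipeline{id};
}

class PipelineCache {
    std::unordered_map<std::string, Pipeline> cache_;
public:
    // On-demand path: a cache miss blocks the render thread -> visible hitch.
    const Pipeline& get(const std::string& id) {
        auto it = cache_.find(id);
        if (it == cache_.end())
            it = cache_.emplace(id, compile_pipeline(id)).first;  // stutter happens here
        return it->second;
    }
    // "Compile your shaders at the start": warm every known PSO while a
    // loading screen is up, so get() never misses during gameplay.
    void warm(const std::vector<std::string>& known_ids) {
        for (const auto& id : known_ids)
            cache_.emplace(id, compile_pipeline(id));
    }
};

int main() {
    PipelineCache cache;
    cache.warm({"opaque_lit", "skinned_lit", "particle_add"});  // loading screen
    cache.get("opaque_lit");  // gameplay: cache hit, no hitch
}
```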
An i7 that's still more powerful than the i5s that came one and two generations after it. Don't expect most people to have the latest and most recent CPUs.
That it's a 4c/8t CPU is a sign that if you don't meet that bar in 2022, you've finally reached the point where it's time to upgrade.
I understand the reality of people not upgrading often. But it's also a reality that developers have never slowed down because of that.
Doesn't matter if the CPU itself is 7 years old; it's being listed in the minimum system requirements, so surely you understand the weirdness of requiring it. Seriously, not everybody lives in the US and can get the recent, powerful CPUs... prices have gone up and there have been chip shortage issues for the last 2-3 years.
Can't wait, I wonder how my laptop 3060 will handle it. =p
Oh that won't be a problem, the higher wattage (115W+) mobile 3060 is actually within 10-15% of the desktop version. If you're gaming at 1080p it should be trivial.
Speaking of Starfield, I am optimistic that its requirements won't be as high as this one's. Bethesda Game Studios games have always been very decent to run on low to lower-middle-end systems: Fallout 3, TES V, Fallout 4, etc.
When new consoles launch, a year or so later, the requirements take a leap; here we are. Bemoan the fact that it's time to upgrade, not that games are moving on to requiring more powerful hardware. The former I get, especially in today's hardware climate; the latter makes less sense, to question WHY a game developed for a more powerful hardware target suddenly has requirements you didn't expect.
GTX 1060 for 1280x720 and low settings should be just as telling. "Next-Gen" PC Requirements are here. Truth is our systems sleepwalked through the entirety of the last generation, and the first brick wall of a technology check is about to happen in real time over the course of 2022 as more new-gen exclusives drop.
Wait until the Starfield requirements hit. Oh Lord.
Eh, I don't know; one of the strong rumors going around was that Bethesda couldn't get it to run on last-gen, likely a CPU bottleneck, so I would assume a decent CPU will be required.
Let's be honest: last-gen console CPUs were not good, and Bethesda games haven't always been great on consoles.
There are days where even my RTX 2080 and i7 9700k feel inadequate with some modern titles at 1440p, especially without DLSS, so I'm fully expecting the end of this generation/transition to next to put that rig out to pasture or just be hitting the minimum requirements.
It's the GTX 1060 and quad cores going completely off the table that people need to start preparing for. Not that they won't play games at all, but it's going to look spooky when the 1070, 1660 Ti, or even the RTX 2060 moves into the minimum GPU slot in hardware requirement releases.
The CPU disagreement aside, the thing about GPU requirements is that the Series S GPU still has to run everything, right? That thing has a steroid-injected 5500 XT-ish GPU.
Here's what I expect for the minimum requirements: some recent-gen i5 CPU, 12GB of RAM, and a GTX 1060 / RX 480 or RX 580 (4 or 6GB). The only thing I'm uncertain about is whether or not they'll recommend an SSD in the minimum requirements.
Well yeah, by the end of the "end of this generation/transition to next" (2026?), the 2080 will be almost a decade old; it'd be well due for a replacement. I have a 9700K + RTX 2080 (200W TDP on the GPU) gaming laptop, and the 2080 is going to easily last this entire generation. Not at maximum settings at the highest resolutions, but it's still a monster.
If it weren't for my Legion 7 with the mobile RTX 3080 and 5900HX (and the Steam Deck to some extent), I'd probably be done with modern PC gaming within the next six years if these GPU prices never stabilize. Instead, because of laptops, I'm still in the high-end game for the long haul.
Never thought gaming laptops would be my savior, but here we are.
Would love Dictator to reach out to Epic on this wider issue. Sure, devs can get around it with some customised work to pre-compile shaders during the title screen, as we've recently seen with UE4 titles The Ascent and Psychonauts 2 after their respective optimisation patches, but by default this isn't how UE4 (or UE5) works. Hence why this issue keeps rearing its ugly head again and again... and will only continue to do so more frequently as more devs move their development to UE.
Ideally UE4 should be compiling the shaders in the background for the given area, rather than waiting until the precise moment they actually need to be used! This is how the RE Engine works, for example.
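As a rough illustration of that background, per-area approach (this is not actual UE4 or RE Engine code; the area names, shader names, and class are all invented), the idea is just a prefetch queue that runs compiles off the render thread before the shaders are ever drawn:

```cpp
// Hedged sketch of "compile in the background for the given area":
// on an area transition, kick off async compiles for what the upcoming
// area needs, so the cache is warm by the time anything is drawn.
#include <chrono>
#include <future>
#include <string>
#include <thread>
#include <unordered_map>
#include <vector>

struct Shader { std::string id; };

Shader compile(const std::string& id) {                 // expensive driver work
    std::this_thread::sleep_for(std::chrono::milliseconds(30));
    return Shader{id};
}

class AreaShaderStreamer {
    std::unordered_map<std::string, Shader> ready_;
    std::unordered_map<std::string, std::future<Shader>> inflight_;
public:
    // Called on area transition: compile what the *next* area needs,
    // off the render thread, before any of it is visible.
    void prefetch(const std::vector<std::string>& upcoming_shader_ids) {
        for (const auto& id : upcoming_shader_ids) {
            if (ready_.count(id) || inflight_.count(id)) continue;
            inflight_.emplace(id, std::async(std::launch::async, compile, id));
        }
    }
    // Called at draw time: ideally always a hit; the synchronous fallback
    // at the bottom is exactly the stutter everyone is complaining about.
    const Shader& get(const std::string& id) {
        if (auto in = inflight_.find(id); in != inflight_.end()) {
            ready_.emplace(id, in->second.get());       // short wait if prefetched early
            inflight_.erase(in);
        }
        auto it = ready_.find(id);
        if (it == ready_.end())
            it = ready_.emplace(id, compile(id)).first; // missed prefetch -> hitch
        return it->second;
    }
};

int main() {
    AreaShaderStreamer streamer;
    streamer.prefetch({"cave_rock", "cave_water"});     // player approaches the cave
    streamer.get("cave_rock");                          // drawn later: already compiled
}
```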
I think there are just more (AAA) UE4 games in recent years, which means we're suffering the issue more. That, and games are using more shaders than ever.
Devs should just add shader compilation as part of the install process.
Thinking back to older UE4 games like PUBG, Ruiner, Observer, Fallen Order and Outer Worlds, they had the same issue.
It's been going on since 2017 at least, and that was the year we first started seeing the bigger-budget UE4 games releasing.
It shouldn't be part of the install process, as shaders can and will get deleted during major driver updates.
OK, then an option in the menu to run it.
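A sketch of where this exchange lands, assuming an invented stamp-file scheme (the file name, version string, and functions here are all made up for illustration): key the persisted shader cache to the driver version, check it at boot, and rebuild it from a boot-time or settings-menu job rather than at install time, so driver updates are caught.

```cpp
// Sketch of a driver-version-keyed shader cache check. A real engine
// would query the driver and also key on the game build; here the
// version is hardcoded and the "cache" is just a stamp file.
#include <fstream>
#include <iostream>
#include <string>

// Assume this queries the graphics driver; hardcoded for the sketch.
std::string current_driver_version() { return "531.29"; }

bool cache_is_valid(const std::string& stamp_path) {
    std::ifstream in(stamp_path);
    std::string cached_version;
    return in && std::getline(in, cached_version)
              && cached_version == current_driver_version();
}

void rebuild_cache(const std::string& stamp_path) {
    // ... recompile every known pipeline here (the "menu option" job) ...
    std::ofstream(stamp_path) << current_driver_version() << '\n';
}

int main() {
    const std::string stamp = "shadercache.stamp";
    if (!cache_is_valid(stamp)) {
        std::cout << "Driver changed - rebuilding shader cache...\n";
        rebuild_cache(stamp);   // run at boot or from a settings-menu entry,
    }                           // not at install time, so driver updates are caught
}
```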
This has been an issue with UE4 since like forever.
We got spoiled by how underwhelming last-gen CPUs were. Eventually this gen, older 4C/8T i7s aren't going to be enough for the minimum either.
I didn't say it did; that's on the complaints about the minimum CPU requirements, which is why I separated it from the UE4 sentence.
He should ask the devs why they don't ask Epic for help instead. Epic has offices in Japan that exist for this very reason.
Dammit, I just bought it on PC from GMG and used the code. Am I really going to need to just go back to only playing on console at this point until they start fixing this?
It's crazy that emulators can manage this, but we still get actual games on PC with stutters.
I would love DF to ask Epic directly what the issue seems to be. It would not only be nice to know why it is happening at all, but also just getting more press to talk about the issue will hopefully light a fire under the asses of Epic and/or publishers to stop releasing these broken games onto the market.
This also is something that has only really started to happen recently. Almost zero Unreal 4 games released prior to two years ago ever had this issue.
"modern" emulators make cache on the fly so they stutter, that's why they are still far to a perfect way to play it.It's crazy that emulator can make this but we still got true games on pc with stutters.