
kami_sama

Member
Oct 26, 2017
6,998
What the fuck are those resolutions? 1280p? 2140p?
But I think I'm OK. My system is around the 1440p High with RT tier.
 

Gestault

Member
Oct 26, 2017
13,356
Maybe they're min-maxing on the "safe" side, but for 1080P @ High settings, even an i7-6700 feels high for recommended CPU.

If I had to guess, I bet they have really bad stuttering on the engine/shader cache side, and the high-ish CPU requirements are to try to sidestep it as much as possible.
 

Duxxy3

Member
Oct 27, 2017
21,699
USA
Seems like stuttering has become very common all of a sudden. What has changed? Or maybe I'm just imagining things... is this devs having trouble moving to DX12?

Typically we had an option to play games in DX11 if there was an issue with the DX12 version.

But fewer and fewer games are giving us that option. Developers are moving to DX12 and not doing a great job with the DX12 version. Nvidia/AMD can't bail them out, because DX12 puts more power in the devs' hands than DX11, and takes it out of Nvidia/AMD's.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,203
Dark Space
Maybe they're min-maxing on the "safe" side, but for 1080P @ High settings, even an i7-6700 feels high for recommended CPU.

If I had to guess, I bet they have really bad stuttering on the engine/shader cache side, and the high-ish CPU requirements are to try to sidestep it as much as possible.
It's a new-gen exclusive developed on consoles with 8c/16t CPUs, you should be pleasantly surprised the minimum CPU isn't one of the i5-8400 or 8600.

That it's 4c/8t CPUs is a sign that if you don't meet that bar in 2022, you've finally reached the point where it's time to upgrade.

Even if it's that, it's for minimum requirements, and it's an i7 at that. i7s listed as a minimum requirement for any game, next gen or not, seems weird.
It's still a 7-year-old quad core. At some point there are just no grounds for bewilderment at ancient CPUs falling into obsolescence.
 

cowbanana

Member
Feb 2, 2018
13,674
a Socialist Utopia
I kinda like CA in some games if it isn't super overdone. Doesn't bother me much here.

I'm still oddly interested in this game; it's just the whole stuttering thing on PC, and the thin gameplay mechanics in an icon-filled open world, that's holding me back. I think this would have worked better as an 8-10 hour game. Time is my most precious commodity, and I really have to pick and choose when it comes to these open-world things that take so much time to get through.

I've only just really started Elden Ring on PC because I spent nearly 80 hours on Horizon Forbidden West. Both those games are well worth investing time in. But open-world games have to really suck me in, or I just don't bother due to the time investment needed.

I can much more easily stomach imperfect games when they're shorter experiences.

I miss the olden days when I could buy a new 8-hour game every weekend and have a good experience. Now it takes a bloody month to finish one game, playing every day and binging on weekends.
 

Rogote

Member
Oct 25, 2017
2,606
Really does kinda feel like we're going back in time to the bad times of PC gaming again.
Even I, very much a "PC for everything that isn't Sony first-party" guy, now regularly think about PS5 versions of multiplat games. That hasn't happened in a decade, but now all of a sudden I'm checking console prices for all these multiplat games I'm interested in.

You know what? Fuck lamenting I can't afford a 3090 or 3080. I wouldn't even fucking want one anymore even if I could get one. What's the point?
 

Polyh3dron

Prophet of Regret
Banned
Oct 25, 2017
9,860
these fucks are really trying to get me to upgrade from 3090 to 4090 aren't they
 
OP
AshenOne
Member
Feb 21, 2018
6,089
Pakistan
It's a new-gen exclusive developed on consoles with 8c/16t CPUs, you should be pleasantly surprised the minimum CPU isn't one of the i5-8400 or 8600.

That it's 4c/8t CPUs is a sign that if you don't meet that bar in 2022, you've finally reached the point where it's time to upgrade.


It's still a 7-year-old quad core. At some point there are just no grounds for bewilderment at ancient CPUs falling into obsolescence.
An i7 that's still more powerful than the i5s that came one and two gens after it. Don't expect most people to have the latest and most recent CPUs.

It doesn't matter that the CPU itself is 7 years old; it's being listed in the minimum system requirements, so surely you understand the weirdness of requiring it. Not everybody lives in the US with easy access to recent, powerful CPUs. Prices have gone up each gen, and there have been chip shortage issues for the last 2-3 years.

With all that said, I'm taking these requirements at face value. Maybe they're being very conservative with them? Time will tell.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,203
Dark Space
An i7 that's still more powerful than the i5s that came one and two gens after it. Don't expect most people to have the latest and most recent CPUs.

It doesn't matter that the CPU itself is 7 years old; it's being listed in the minimum system requirements, so surely you understand the weirdness of requiring it. Not everybody lives in the US with easy access to recent, powerful CPUs. Prices have gone up each gen, and there have been chip shortage issues for the last 2-3 years.
I understand the reality of people not upgrading often. But it's also reality that developers have never slowed down because of that.

When new consoles launch, a year or so later, the requirements take a leap; here we are. Bemoan the fact that it's time to upgrade, not that games are moving on to requiring more powerful hardware. The former I get, especially in today's hardware climate; the latter makes less sense, to question WHY a game developed for a more powerful hardware target suddenly has requirements you didn't expect.

GTX 1060 for 1280x720 and low settings should be just as telling. "Next-Gen" PC Requirements are here. Truth is our systems sleepwalked through the entirety of the last generation, and the first brick wall of a technology check is about to happen in real time over the course of 2022 as more new-gen exclusives drop.

Wait until the Starfield requirements hit. Oh Lord.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,203
Dark Space
Can't wait, I wonder how my laptop 3060 will handle it.=p
Oh that won't be a problem, the higher wattage (115W+) mobile 3060 is actually within 10-15% of the desktop version. If you're gaming at 1080p it should be trivial.

It's the GTX 1060 and quad cores being completely off of the table for which people need to start their preparations. Not that they won't play games at all, but it's going to look spooky when the 1070, 1660 Ti, or even the RTX 2060 moves into the minimum GPU slot in the hardware requirement releases.
 
OP
AshenOne
Member
Feb 21, 2018
6,089
Pakistan
I understand the reality of people not upgrading often. But it's also reality that developers have never slowed down because of that.

When new consoles launch, a year or so later, the requirements take a leap; here we are. Bemoan the fact that it's time to upgrade, not that games are moving on to requiring more powerful hardware. The former I get, especially in today's hardware climate; the latter makes less sense, to question WHY a game developed for a more powerful hardware target suddenly has requirements you didn't expect.

GTX 1060 for 1280x720 and low settings should be just as telling. "Next-Gen" PC Requirements are here. Truth is our systems sleepwalked through the entirety of the last generation, and the first brick wall of a technology check is about to happen in real time over the course of 2022 as more new-gen exclusives drop.

Wait until the Starfield requirements hit. Oh Lord.
Speaking of Starfield, I'm optimistic that its requirements won't be as high as this one's. Bethesda Game Studios games have always been very decent to run on low to lower-mid-range systems: Fallout 3, TES V, Fallout 4, etc.

Here's what I expect for the minimum requirements:

Some recent-gen i5 CPU, 12GB of RAM, and a 1060/RX 480 or RX 580 (4 or 6GB).

The only thing I'm uncertain about is whether they'll require an SSD in the minimum requirements.
 
Last edited:

eathdemon

Member
Oct 27, 2017
9,644
Speaking of Starfield, I'm optimistic that its requirements won't be as high as this one's. Bethesda Game Studios games have always been very decent to run on low to lower-mid-range systems: Fallout 3, TES V, Fallout 4, etc.
Eh, I don't know. One of the strong rumors going around was that Bethesda couldn't get it to run on last gen, likely a CPU bottleneck, so I would assume a decent CPU will be required.
 
OP
AshenOne
Member
Feb 21, 2018
6,089
Pakistan
Eh, I don't know. One of the strong rumors going around was that Bethesda couldn't get it to run on last gen, likely a CPU bottleneck, so I would assume a decent CPU will be required.
Let's be honest, last-gen console CPUs were not good, and Bethesda games haven't always been great on consoles.

The game probably could run on last-gen consoles, but would hit crashes and errors, with not enough power to load the whole world, etc., in my speculated opinion.
 

JigglesBunny

Prophet of Truth
Avenger
Oct 27, 2017
31,101
Chicago
Oh that won't be a problem, the higher wattage (115W+) mobile 3060 is actually within 10-15% of the desktop version. If you're gaming at 1080p it should be trivial.

It's the GTX 1060 and quad cores being completely off of the table for which people need to start their preparations. Not that they won't play games at all, but it's going to look spooky when the 1070, 1660 Ti, or even the RTX 2060 moves into the minimum GPU slot in the hardware requirement releases.
There are days where even my RTX 2080 and i7 9700k feel inadequate with some modern titles at 1440p, especially without DLSS, so I'm fully expecting the end of this generation/transition to next to put that rig out to pasture or just be hitting the minimum requirements.

If it weren't for my Legion 7 with the mobile RTX 3080 and 5900HX (and the Steam Deck to some extent), I'd probably be done with modern PC gaming within the next six years if these GPU prices never stabilize. Instead, because of laptops, I'm still in the high-end game for the long haul.

Never thought gaming laptops would be my savior, but here we are.
 

pksu

Member
Oct 27, 2017
1,239
Finland
I wonder if AMD/NVIDIA have to fix these issues in the end. A bit disappointing but not really surprising.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,203
Dark Space
Speaking of Starfield, I'm optimistic that its requirements won't be as high as this one's. Bethesda Game Studios games have always been very decent to run on low to lower-mid-range systems: Fallout 3, TES V, Fallout 4, etc.

Here's what I expect for the minimum requirements:

Some recent-gen i5 CPU, 12GB of RAM, and a 1060/RX 480 or RX 580 (4 or 6GB).

The only thing I'm uncertain about is whether they'll require an SSD in the minimum requirements.
The CPU disagreement aside, the thing about GPU requirements is that the Series S GPU still has to run everything, right? That thing has a steroid-injected 5500 XT-ish GPU.

Series S Specs

5500 XT Specs

Hell the 5500 XT has more cores and texture units, but the Series S is RDNA2.

As long as a game has to run on the Series S, today's midrange GPUs should survive playing games at 1080p. Forever.

I mean, now that I'm thinking this over, GWT lists the 5500 XT as minimum at 720p/30fps Low? What is the Series S going to do? These specs are likely overblown, just like Elden Ring's were, unless the XSS is going to run this really poorly. I know the yadda yadda console optimization comes in here, but that only goes so far.


There are days where even my RTX 2080 and i7 9700k feel inadequate with some modern titles at 1440p, especially without DLSS, so I'm fully expecting the end of this generation/transition to next to put that rig out to pasture or just be hitting the minimum requirements.

If it weren't for my Legion 7 with the mobile RTX 3080 and 5900HX (and the Steam Deck to some extent), I'd probably be done with modern PC gaming within the next six years if these GPU prices never stabilize. Instead, because of laptops, I'm still in the high-end game for the long haul.

Never thought gaming laptops would be my savior, but here we are.
Well yeah, by the end of the "end of this generation / transition to next" (2026?), the 2080 will be almost a decade old; it'd be well due for a replacement. I have a 9700K + RTX 2080 (200W TDP on the GPU) gaming laptop, and the 2080 is going to easily last this entire generation. Not at maximum settings at the highest resolutions, but it's still a monster.

Your AMD CPU does crush the 9700K, but is the mobile 3080 really more than 15% ahead of the desktop 2080? From my testing the mobile 3080 is 20% ahead of my 2080, and my GPU is 10% slower than the desktop 2080.
 

pswii60

Member
Oct 27, 2017
26,667
The Milky Way

Would love Dictator to reach out to Epic on this wider issue. Sure, devs can get around it with some customised work to pre-compile shaders during the title screen, as we've recently seen with the UE4 titles The Ascent and Psychonauts 2 after their respective optimisation patches, but by default this isn't how UE4 (or UE5) works. Hence this issue keeps rearing its ugly head again and again, and it will only continue to do so more frequently as more devs move their development to UE.

Ideally UE4 should be compiling the shaders in the background for the given area, rather than waiting until the precise moment they actually need to be used! This is how the RE Engine works, for example.
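The tradeoff being described, compiling a shader the first time it's needed mid-frame versus pre-compiling behind a loading or title screen, can be sketched with a toy cache. Everything here is illustrative: the shader names and the 50 ms compile cost are invented, and real engines do this through D3D12/Vulkan pipeline APIs, not a Python dict.

```python
class ShaderCache:
    """Toy model of a shader/PSO cache. Purely illustrative: costs
    and shader names are made up, not from any real engine."""

    COMPILE_COST_MS = 50  # pretend a single shader compile takes 50 ms

    def __init__(self):
        self.compiled = {}

    def get(self, shader_id):
        """Fetch a shader, compiling on first use. Returns (shader, cost_ms)."""
        if shader_id not in self.compiled:
            # In-game, this blocking compile lands mid-frame and blows far
            # past a 16.7 ms (60 fps) frame budget: the visible stutter.
            self.compiled[shader_id] = f"binary:{shader_id}"
            return self.compiled[shader_id], self.COMPILE_COST_MS
        return self.compiled[shader_id], 0

    def prewarm(self, shader_ids):
        """Compile everything up front, e.g. behind a loading screen."""
        for sid in shader_ids:
            self.get(sid)

# On-demand (the default behaviour being criticised): first use pays the cost.
cold = ShaderCache()
_, hitch = cold.get("rock_wet")    # hitch == 50 -> mid-gameplay stutter
_, smooth = cold.get("rock_wet")   # smooth == 0 -> cached afterwards

# Pre-warmed (title-screen compile): the same cost is paid during loading.
warm = ShaderCache()
warm.prewarm(["rock_wet", "skin", "fx_smoke"])
_, cost = warm.get("skin")         # cost == 0 -> no mid-gameplay stutter
```

The total compile time is identical either way; pre-warming just moves it somewhere the player tolerates waiting.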
 

flyinj

Member
Oct 25, 2017
10,941
Would love Dictator to reach out to Epic on this wider issue. Sure, devs can get around it with some customised work to pre-compile shaders during the title screen, as we've recently seen with the UE4 titles The Ascent and Psychonauts 2 after their respective optimisation patches, but by default this isn't how UE4 (or UE5) works. Hence this issue keeps rearing its ugly head again and again, and it will only continue to do so more frequently as more devs move their development to UE.

Ideally UE4 should be compiling the shaders in the background for the given area, rather than waiting until the precise moment they actually need to be used! This is how the RE Engine works, for example.

I would love DF to ask Epic directly what the issue is. It would be nice not only to know why it's happening at all, but getting more press to talk about the issue will hopefully also light a fire under the asses of Epic and/or publishers to stop releasing these broken games onto the market.

This is also something that has only really started happening recently. Almost zero UE4 games released prior to two years ago had this issue.
 
Last edited:

pswii60

Member
Oct 27, 2017
26,667
The Milky Way
This is also something that has only really started happening recently. Almost zero UE4 games released prior to two years ago had this issue.
I think there are just more (AAA) UE4 games in recent years, which means we're running into the issue more. That, and games are using more shaders than ever.

Thinking back to older UE4 games like PUBG, Ruiner, Observer, Fallen Order and Outer Worlds, they had the same issue.

It's been going on since 2017 at least, and that was the year we first started seeing the bigger-budget UE4 games releasing.
 

eathdemon

Member
Oct 27, 2017
9,644
I think there are just more (AAA) UE4 games in recent years, which means we're running into the issue more. That, and games are using more shaders than ever.

Thinking back to older UE4 games like PUBG, Ruiner, Observer, Fallen Order and Outer Worlds, they had the same issue.

It's been going on since 2017 at least, and that was the year we first started seeing the bigger-budget UE4 games releasing.
Devs should just add shader compiling as part of the install process.
 

PHOENIXZERO

Member
Oct 29, 2017
12,073
This has been an issue with UE4 since, like, forever.

We got spoiled by how underwhelming last-gen CPUs were. Eventually this gen, older 4C/8T i7s aren't going to be enough for the minimum either.
 

Adum

Member
May 30, 2019
924
I'm surprised so many people are surprised about higher minimum specs. Especially on games that release only on current gen platforms. It's been more than a year since the PS5/Series consoles released. I know it's not easy or cheap to just buy new hardware, but it also shouldn't be surprising that games have moved beyond 2C/4T CPUs released in 2010.
 

Mifec

Member
Oct 25, 2017
17,733
Would love Dictator to reach out to Epic on this wider issue. Sure, devs can get around it with some customised work to pre-compile shaders during the title screen, as we've recently seen with the UE4 titles The Ascent and Psychonauts 2 after their respective optimisation patches, but by default this isn't how UE4 (or UE5) works. Hence this issue keeps rearing its ugly head again and again, and it will only continue to do so more frequently as more devs move their development to UE.

Ideally UE4 should be compiling the shaders in the background for the given area, rather than waiting until the precise moment they actually need to be used! This is how the RE Engine works, for example.
He should ask the devs why they don't ask Epic for help instead. Epic has offices in Japan that exist for this very reason.

This is a dev issue, not a UE4 issue.
 

andymoogle

Member
Oct 27, 2017
2,307
I'm only at recommended 1080p specs. I really need to upgrade my computer if I want to be able to stay at 3440x1440. Prices in Sweden have started to go down to almost reasonable levels... Might be time soon.
 

Li Kao

Member
Oct 25, 2017
1,729
2600X / 2060 Super / mid RT with DLSS gives me 50-60fps, but it varies a little too much.
That being said, I could be fine with it if I didn't need to set the game to 1080p to keep the cinematics in sync. JFC.
 

craven68

Member
Jun 20, 2018
4,550
Would love Dictator to reach out to Epic on this wider issue. Sure, devs can get around it with some customised work to pre-compile shaders during the title screen, as we've recently seen with the UE4 titles The Ascent and Psychonauts 2 after their respective optimisation patches, but by default this isn't how UE4 (or UE5) works. Hence this issue keeps rearing its ugly head again and again, and it will only continue to do so more frequently as more devs move their development to UE.

Ideally UE4 should be compiling the shaders in the background for the given area, rather than waiting until the precise moment they actually need to be used! This is how the RE Engine works, for example.
It's crazy that emulators can manage this, but we still get actual games on PC with stutters.
 

MaLDo

Member
Oct 27, 2017
1,401
I would love DF to ask Epic directly what the issue is. It would be nice not only to know why it's happening at all, but getting more press to talk about the issue will hopefully also light a fire under the asses of Epic and/or publishers to stop releasing these broken games onto the market.

This is also something that has only really started happening recently. Almost zero UE4 games released prior to two years ago had this issue.

That's not true. Shader compilation stutter, asset loading stutter, and garbage collection stutter have been present in badly programmed games for decades.

There are good programmers and bad programmers.

The problem that must be fixed first is: how in the hell does a game like Elden Ring on PC get good reviews?
If games with technical problems were killed in reviews and avoided by buyers, everything would be solved in one year.
 

pksu

Member
Oct 27, 2017
1,239
Finland
In the end, most games are simple read input => apply input => draw state loops, and if any of the stages block, you get frame drops. The tools used to make games are usually quite primitive, and games are just "amusement software". Nobody dies if a rendering deadline can't be met, so architectures are not really designed to guarantee perfect frame delivery. It would only cost money, and most(?) people don't really care too much.

Wouldn't be surprised if UE5 still has a dedicated rendering thread and so on.
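That loop and its failure mode can be sketched in a few lines; the numbers are invented and purely illustrative:

```python
# Toy model of the read input -> apply input -> draw loop: any frame whose
# stages together block past the frame budget arrives late (a "drop").
FRAME_BUDGET_MS = 16.7  # one frame at 60 fps

def late_frames(frame_costs_ms):
    """Return indices of frames whose total stage cost missed the budget."""
    return [i for i, cost in enumerate(frame_costs_ms)
            if cost > FRAME_BUDGET_MS]

# Frames 0-2 fit the budget; frame 3 hits a 60 ms blocking stall
# (e.g. a mid-frame shader compile or a synchronous asset load).
dropped = late_frames([5.0, 8.2, 12.0, 60.0, 6.1])  # -> [3]
```

A hard real-time architecture would budget or defer every stage so the deadline always holds; as the post notes, games generally don't, because missing a frame merely looks bad.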
 

SixelAlexiS

Member
Oct 27, 2017
7,724
Italy
It's crazy that emulators can manage this, but we still get actual games on PC with stutters.
"Modern" emulators build their caches on the fly, so they stutter; that's why they're still far from a perfect way to play.

Vulkan caches are less stuttery, but they still stutter.

We just need a precache: let us wait at game start and call it a day.