It's really only noticeable if the game has a 60 FPS mode to compare it to. I play TotK on a G1 and it's perfectly fine - I typically only notice when there are frame rate dips. If I try Horizon: Forbidden West at 30 FPS it's very noticeable.
Personally I wish higher FPS were pushed more than resolution. I'd be fine with a 1080p/60fps version rather than 4K/30.
I'm trying to decide whether to get this on Series X or run it on my 2060, which doesn't even meet the recommended specs (everything else does, just not the GPU). Would I even be able to hit 60fps if I can't meet recommended?
As soon as they have to make sure the game runs well at all times at 40 fps, they have to reconsider their budget for the simulation, and that affects all modes and platforms. Going from 30 to 40 fps cuts the frame-time budget from 33.3 ms to 25 ms, a 25% reduction (equivalently, a 33% increase in required frame rate). They've decided to trade frame rate for being able to do more per frame. It's fine to disagree with that decision, but they didn't make it because they're incompetent. They just value having that CPU budget more.
> "Fortunately in this one, we've got it running great. It's often running way above that. Sometimes it's 60. But on the consoles, we do lock it because we prefer the consistency, where you're not even thinking about it."
That's what Todd said. So 40 FPS VRR wouldn't be a problem. But maybe that's one of Todd's sweet little lies :p
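To put rough numbers on the 40 fps point above (plain frame-budget arithmetic, nothing Starfield-specific):

```python
# Frame-time budget per target frame rate: budget_ms = 1000 / fps
for fps in (30, 40, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms, 40 fps -> 25.0 ms, 60 fps -> 16.7 ms

# Moving a 30 fps target up to 40 fps:
budget_cut = 1 - (1000 / 40) / (1000 / 30)
print(f"frame-time budget shrinks by {budget_cut:.0%}")    # 25%
print(f"required frame rate rises by {40 / 30 - 1:.0%}")   # 33%
```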
That sucks.
> I'm out unfortunately, I just don't enjoy 30fps. One for the Xbox Series Z then. I have no doubt this will need to be patched for years to come anyway. Disappointed, but not surprised at all.
Your CPU dictates the framerate you can play a game at, not your GPU. Reconstructed 720p and low settings aren't going to help even a 4090 if you're CPU limited. You can always increase your effective GPU performance by lowering settings; you can very rarely improve your CPU performance to a significant degree.
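A toy way to see this (illustrative numbers, not Starfield measurements): the framerate you actually get is roughly the minimum of what the CPU and GPU can each sustain per frame, and GPU-side settings like resolution only move the GPU number.

```python
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Delivered framerate is capped by the slower of the two stages."""
    return min(cpu_fps, gpu_fps)

# Hypothetical CPU-limited case: the simulation tops out around 45 fps.
print(effective_fps(cpu_fps=45, gpu_fps=60))   # 45 - the GPU already has headroom
print(effective_fps(cpu_fps=45, gpu_fps=120))  # 45 - dropping resolution raised
                                               # the GPU number, not the result
```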
A better way to put it is that your framerate can only go as high as your CPU allows.
> The CPU won't be an issue, I upgraded that not too long ago. But to say the CPU dictates the framerate seems not right... By your logic, are you saying I can run this game happily at 4K/60 as long as my CPU is good enough?
I figured as much. That increase in res is no joke, and I fear my performance is going to tank if I get one. 😋
> It sort of is, but at the same time it isn't. Aspect-ratio-wise it's the same as an ultrawide monitor in terms of what you view, but for full screen, taking full advantage of the horizontal view, well, since there are black bars on top and bottom it's not as immersive as a real ultrawide monitor.
> It's like watching a movie, but once you use a real ultrawide monitor it's hard (maybe impossible) to go back to a 16:9 monitor.
> But regardless, it's still a fine option, but on a real ultrawide it's something special!
> But yeah, that's a cool option still.
One of the things I was really worried about when I got my OLED was 30fps gaming. But Tears was fine, so I'm less worried now. Though I hope my CPU can get me to the 60 fps promised land.
> It really is one of those first world problems. The panel is so good, almost to its detriment. Give it 60/120fps content and it sings.
> FWIW, 30fps content on an OLED isn't that bad. The other qualities of this screen more than make up for it.
The CPU handles logic threads, render draw calls, physics and all that. It needs to complete everything needed for a frame and then tell the GPU to render it. So yes, the CPU dictates the maximum fps you can get, assuming unlimited GPU power. This is why something like a 4090 also requires a powerful CPU: it's so fast that slower CPUs can't keep up.
> The CPU won't be an issue, I upgraded that not too long ago. But to say the CPU dictates the framerate seems not right... By your logic, are you saying I can run this game happily at 4K/60 as long as my CPU is good enough?
> Shouldn't it be more that the CPU and GPU need to be on par, so the CPU can keep the GPU fed, while you still need a decent GPU to even reach the graphical settings you want?
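To make the explanation above concrete, here's a deliberately simplified, non-pipelined game loop with invented timings; even a near-instant GPU can't push fps past the CPU's pace:

```python
import time

def cpu_work():
    """Stand-in for game logic, physics, and draw-call submission."""
    time.sleep(0.020)             # pretend the CPU side takes 20 ms

def gpu_render():
    """Stand-in for the GPU rendering the frame the CPU just submitted."""
    time.sleep(0.002)             # pretend a very fast GPU takes 2 ms

start = time.perf_counter()
n = 30
for _ in range(n):
    cpu_work()                    # the GPU waits for the CPU to hand it a frame...
    gpu_render()                  # ...so fps is set almost entirely by the CPU
elapsed = time.perf_counter() - start
print(f"~{n / elapsed:.0f} fps")  # ~45 fps here, despite the GPU barely working
```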
The CPU won't be an issue, I upgraded that not too long ago. But to say the CPU dictates the framerate seems not right... By your logic, are you saying I can run this game happily at 4K/60 as long as my CPU is good enough?
Shouldn't it be more that the CPU and GPU need to be on par, so the CPU can keep the GPU fed, while you still need a decent GPU to even reach the graphical settings you want?
Both do. It's all about how you use your resources. There's no doubt Starfield could run at 60 on XSX, but for that they'd likely have to compromise on draw distances or physics or whatever, in a way they deemed detrimental to the game.
> The CPU handles logic threads, render draw calls, physics and all that. It needs to complete everything needed for a frame and then tell the GPU to render it. So yes, the CPU dictates the maximum fps you can get, assuming unlimited GPU power. This is why something like a 4090 also requires a powerful CPU: it's so fast that slower CPUs can't keep up.
I really hope the 30 will be stable though. I'm sooo tired of rough 30s...
> Just played TotK for 115 hours at 30 fps and god knows what resolution. I got used to it.
I can't imagine how difficult it must be to optimize a game of this caliber.
That's not entirely true. If the game is CPU limited (which is very likely the case), a performance mode wouldn't be possible.
> If they feel comfortable with Series X locking in at 4K/30fps, there's really no reason not to allow a performance option of locked 1080p/60fps.
> Then again, it's a slow-paced Bethesda RPG and not an action game, so I don't think my personal enjoyment will be negatively impacted by 30fps.
> Overall, performance options are becoming industry standard at this point, and it's odd they just said "fuck it".
If you have both, why was this ever even a question?
> I'm at a point where I'm not willing to go back to 30fps. PC it is.
Ok I see, I think it's the same as with Halo Infinite's 30fps; it always seems to come down to frame-time issues:
> It's really only noticeable if the game has a 60 FPS mode to compare it to. I play TotK on a G1 and it's perfectly fine - I typically only notice when there are frame rate dips. If I try Horizon: Forbidden West at 30 FPS it's very noticeable.
Yeah, I would bet so. The prices here in the UK for the LG C1 are excellent, and I think it will still be a great fit for Starfield.
> It really is one of those first world problems. The panel is so good, almost to its detriment. Give it 60/120fps content and it sings.
> FWIW, 30fps content on an OLED isn't that bad. The other qualities of this screen more than make up for it.
People really don't understand how CPUs limit the performance of a game, huh? It's gut-wrenching to hear the constant "just lower the resolution to get 60fps" armchair critics. Embarrassing.
If it's an unstable 30 that would suuuuck imo.
> Well that's disappointing for console owners, but I doubt it will be a locked 30 as well.
BGS games have always been CPU limited; that's just how they design their games. It's a very fair assessment (which experts like John Linneman agree with, based on what they've seen so far: https://twitter.com/dark1x/status/1668160646276431872 )
> Is it not an armchair assessment to say it's CPU limited? I don't see where Todd Howard said it was.
The recommended PC CPU specs aren't particularly high either.
> Is it not an armchair assessment to say it's CPU limited? I don't see where Todd Howard said it was.
Starfield Predicted Minimum System Requirements
- CPU: AMD FX-8350 / Intel Core i5-6600K
- RAM: 8 GB
- GPU / Video Card: GeForce GTX 1050 Ti / AMD Radeon RX 570
- Storage: 75GB
- Operating System: Windows 8.1 & Windows 10 (64-bit)
Starfield Predicted Recommended System Requirements
- CPU: Intel Core i7-5820K / AMD Ryzen 5 2600
- RAM: 16 GB
- GPU / Video Card: GeForce RTX 3070 / Radeon RX 6800
- Storage: 75GB
- Operating System: Windows 10 or later (64-bit)
The creative choice is to pump the graphics instead of lowering them to have 60FPS.
> "Creative choice" sounds like a joke, who prefers 30 fps over 60?
I mean, it just makes sense if you think about it. We're talking about 1000+ visitable planets, each with their own assets, weather, etc., plus space itself with NPCs, cities, your ship (with various parts!), seamless transitions, tons of real-time interactions with objects and NPCs, and physics. Add that in Bethesda games items usually stay forever where you drop them; assuming it's the same here, that's an insane CPU load at this scale. Alex from DF sums this up pretty well:
> Is it not an armchair assessment to say it's CPU limited? I don't see where Todd Howard said it was.
lol there are so many reasons one would prefer their console over their PC
It's all speculation, as we're not working on the game and we don't have access to their profiling data.
> Is it not an armchair assessment to say it's CPU limited? I don't see where Todd Howard said it was.
I think the best way to think of it is actually as two FPS graphs: CPU frame rate and GPU frame rate. That usually helps people understand it, and it's why many benchmarks visualise it that way, as two numbers and two fps counts.
> A better way to put it is that your framerate can only go as high as your CPU allows.
Is it not an armchair assessment to say it's CPU limited? I don't see where Todd Howard said it was.
The XSS has ⅓ the GPU power of the XSX (but a pretty much equivalent CPU), yet runs the game at ½ the resolution. If the game were just GPU bound, they could run the XSX at the same settings as the XSS and get 60FPS.
> Is it not an armchair assessment to say it's CPU limited? I don't see where Todd Howard said it was.
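Rough back-of-the-envelope behind that argument, with assumed round numbers (XSS at about ⅓ of XSX GPU compute, and GPU-bound frame rate scaling roughly linearly with compute at fixed settings):

```python
# Purely illustrative model, not official figures.
xss_gpu, xsx_gpu = 1.0, 3.0   # relative GPU compute (assumed ratio)
xss_fps = 30                  # announced XSS target at its lower resolution

# If the 30 fps cap were purely GPU-bound, XSX at XSS settings would manage:
xsx_fps_at_xss_settings = xss_fps * (xsx_gpu / xss_gpu)
print(xsx_fps_at_xss_settings)   # 90.0 - comfortably past a 60 fps target

# No such mode is offered, which points at the near-identical CPU as the cap.
```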
Except that's not how it works; both work in tandem. "CPU bound" and "GPU bound" are reductive ways of saying that lowering res (or AA, or other "pure" GPU settings) wouldn't help the framerate by a significant margin in the first case, and that increasing it would hurt the framerate significantly in the second case.
> I think the best way to think of it is actually as two FPS graphs: CPU frame rate and GPU frame rate.
Yes, of course, you're right. It's just a convenient reinterpretation, true enough, and one that actually makes sense to people, at least in my experience.
> Except that's not how it works; both work in tandem. "CPU bound" and "GPU bound" are reductive ways of saying that lowering res (or AA, or other "pure" GPU settings) wouldn't help the framerate by a significant margin in the first case, and that increasing it would hurt the framerate significantly in the second case.
> They're just shorthands to easily summarize an infinitely more complex situation.
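One way to make those shorthands concrete (toy per-frame timings, assuming CPU and GPU work overlap in a pipeline so the slower stage sets the pace):

```python
# Invented per-frame timings in milliseconds, for illustration only.
frames = [
    {"cpu_ms": 20.0, "gpu_ms": 12.0},   # simulation-heavy frame
    {"cpu_ms": 14.0, "gpu_ms": 18.0},   # pixel-heavy frame
]

for f in frames:
    frame_ms = max(f["cpu_ms"], f["gpu_ms"])        # pipelined throughput
    bound = "CPU" if f["cpu_ms"] > f["gpu_ms"] else "GPU"
    print(f"{bound}-bound: {1000 / frame_ms:.0f} fps "
          f"(CPU could do {1000 / f['cpu_ms']:.0f}, "
          f"GPU could do {1000 / f['gpu_ms']:.0f})")
```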