Honestly, I think 120fps will remain a pipe dream this generation unless we're talking small-scale indie titles or games from previous generations, e.g. Overwatch and other 60fps titles.
> Honestly, I think 120fps will remain a pipe dream this generation unless we're talking small-scale indie titles or games from previous generations, e.g. Overwatch and other 60fps titles.
120fps will be useless when only a fraction of TVs will support it.
> 120fps will be useless when only a fraction of TVs will support it.
That too!
I would love it if Halo Infinite launched with a 120fps option for MP (with the same sort of trade-offs we see with the X-enhanced options on current gen). I think that's a genuine possibility, confined to MP.
Spec Analysis: Can Project Scarlett truly deliver Xbox's biggest generational leap?
What if raw performance isn't the game-changer this time?
After a pitch-perfect, well-choreographed introduction to Xbox One X way back in 2016, hopes were high that Microsoft could repeat the trick for the crucial reveal of Project Scarlett at this year's E3. New details were indeed unveiled and major claims were made - but Microsoft muddied the waters somewhat with messaging that still leaves us unclear about what the new box is actually about, how powerful it is, and what the vision is that separates it from Sony's upcoming PlayStation 5, built from the same technological building blocks.
Based on our own information, along with teasing reveals within the Scarlett announcement trailer, here are my thoughts on the set-up of the box - well, one of them at least. Curiously, Microsoft is using very strange PR-speak to avoid the question of the leaked lower-end box, codenamed Lockhart. Another interesting aspect is that while the Xbox One X reveal effectively told us the RAM allocation, SoC size and teraflop count, and even clued us in on the cooler, Microsoft is being a lot more coy this time around, and there may even be some red herrings in the assets.
Scarlett was described as the biggest generational leap in console technology that the firm has delivered - but I do think it's hard to see the computational leap from the OG Xbox to Xbox 360 being bettered. Meanwhile, the 16x increase in RAM allocation seen moving from Xbox 360 to Xbox One is highly unlikely to be surpassed. Then there's the notion of Scarlett delivering a 4x leap in 'processing performance' over Xbox One X. On the CPU side, this does seem likely, but the idea of the machine delivering the equivalent of 24 teraflops of GPU compute is unlikely.
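For context, that 24-teraflop figure is just the claimed 4x multiplier applied to the GPU. A quick sketch of the arithmetic, assuming the widely quoted 6.0 TFLOPs rating for Xbox One X (the 6.0 figure is an assumption here, not something stated in the article):

```python
# Sketch of the arithmetic behind the "4x processing performance" claim.
# The 6.0 TFLOPs Xbox One X rating is assumed from its public spec sheet.
xbox_one_x_gpu_tflops = 6.0
claimed_leap = 4

implied_scarlett_tflops = xbox_one_x_gpu_tflops * claimed_leap
print(implied_scarlett_tflops)  # 24.0 - the GPU equivalent the article calls unlikely
```

The point being made is that the 4x figure is plausible for the CPU, but applying the same multiplier to GPU compute produces a number well beyond what's expected.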
I believe some of the Xbox folks around E3 used words that almost confirm this will be the case. I mean, you don't mention 120fps options for your new Scarlett console without having your flagship game support them, right? :)
> No flagship title, be it on XB or PS5, will offer a 120fps mode. 1000% guaranteed. But once again, I appreciate your..."enthusiasm".
I already said it in a post above, but you are 100% right: almost all TVs won't support 120Hz, so there's no point. Heck, most TVs won't even support VRR, which I know a lot of posters here want supported in games.
> I don't believe - nor expect there to be - 'defining' next-gen features. SSDs and stronger processing power will facilitate some games doing things they couldn't in the past, but smart and interesting game design, imo, is achieved in spite of technical limitations, not because of them.
While I think refinement will definitely be a big piece of next-gen games, I do think the console tools being ramped up to this extent will let developers explore their smart and creative ideas far better. That sort of thinking is held back by technical limitations. You can argue the extent, but it is definitely there.
And that's fine, to be honest. We started off this generation with a bunch of malarkey like "where are the games with next-gen gameplay?", and eventually we stopped seeing nonsense like that as more people realised game design and gameplay isn't some voodoo magic.
All that matters is that the games are good and made ethically, with responsible finances.
> 120fps? New consoles will have Vega 64/GTX 1080 level of perf, so 30fps at 4K and 60fps at 1440p are realistic levels.
Don't compare PC to console; even with Vulkan and DX12, PC cards can never get the same amount of performance that a console can with the same or similarly specced hardware. Low-level access APIs don't compare to programming to a single-spec device.
> I don't really understand why people are trying to make "one size fits all" claims about next-gen resolution and fps. Just like every gen before it, some games will be native 4K, some will be 4K CB, and some will probably even be sub-1440p (1800p CB, for instance); some will be 60, some will be 30. That's just how it goes; every developer chooses whatever fits their game. Developers will have ~4x the CPU, ~7x the GPU power, an SSD and a big pool of fast RAM, and they will build their game in the way that fits their philosophy.
Pretty much.
My point is that consoles are a closed box with a fixed hardware spec, so there isn't infinite scope to increase game complexity while also rendering at twice the framerate, even with 4x the CPU next gen.
> Don't compare PC to console; even with Vulkan and DX12, PC cards can never get the same amount of performance that a console can with the same or similarly specced hardware. Low-level access APIs don't compare to programming to a single-spec device.
Nah. It's true that games are often better optimised for consoles, but there is no "magic" in console perf.
120fps? New consoles will have Vega 64/GTX 1080 level of perf, so 30fps at 4K and 60fps at 1440p are realistic levels.
> Surely devs will continue to choose what suits their individual games' needs? Whilst (hopefully) the availability of VRR will mean that more games will push for higher framerates, knowing that they don't have to hold a stable 60fps 99% of the time?
Who's talking about COD? I was obviously talking about the games mentioned in one of the earlier posts, like RDR2.
Also, have you seen how good the new CoD:MW looks on a PS4Pro?
As for ray tracing, per-pixel RT lighting is definitely off the cards, but we're already seeing alternate uses of RT for lighting, whether that's in Control (providing extra information to the voxel grid), Nvidia's paper on RT radiance fields, or other, unannounced tech.
> Don't compare PC to console; even with Vulkan and DX12, PC cards can never get the same amount of performance that a console can with the same or similarly specced hardware. Low-level access APIs don't compare to programming to a single-spec device.
This has been true for CPU optimization but never really true for GPU optimization. A PC with a similar GPU to the PS4, Pro or even the X1X can do exactly what those consoles do. A 580, for example, is on par with the X1X in what it can do. In that sense, low-level APIs don't really do much on the GPU front; even on PC it's mostly about CPU optimization.
> This has been true for CPU optimization but never really true for GPU optimization. A PC with a similar GPU to the PS4, Pro or even the X1X can do exactly what those consoles do. A 580, for example, is on par with the X1X in what it can do. In that sense, low-level APIs don't really do much on the GPU front; even on PC it's mostly about CPU optimization.
Yeah, definitely.
> 120fps? New consoles will have Vega 64/GTX 1080 level of perf, so 30fps at 4K and 60fps at 1440p are realistic levels.
I don't think a Tetris port running at 8K and/or 120fps is unfathomable for next gen.
The 120fps support is either an MS focus on VR capability, or just a push to make VRR more standard.
I am sure there will be plenty of 60fps options but if the system can push above that, then that is fine. A little headroom is always nice.
> I know increased fidelity will add more burden with next-gen games, but the X is already exceeding those with good IQ this gen in more than just isolated examples.
Often it's checkerboarded 1800p or mixed settings (some graphics settings in The Division 2 are lower than PC low), and keep in mind that next-gen games will look better and be more demanding.
> Look at what RT does to such an old game, I'm very impressed. RT makes a damn big difference. I really hope RT will be like this for the new systems.
Next-gen games with ray tracing gonna look in-fucking-sane :0
> No flagship title, be it on XB or PS5, will offer a 120fps mode. 1000% guaranteed. But once again, I appreciate your..."enthusiasm".
Pretty sure something like Bleeding Edge will aim for 120fps on Scarlett.
> Look at what RT does to such an old game, I'm very impressed. RT makes a damn big difference. I really hope RT will be like this for the new systems.
Just like the 1X BC enhancements, I am interested in seeing which older titles get the RT treatment. I get the feeling MS will handle it really well, just like their previous BC stuff.
Has there been anything new on the horizon to help make better-looking shadow maps without stupidly expensive render costs?
> Pretty sure something like Bleeding Edge will aim for 120fps on Scarlett.
Whether or not that counts as a flagship title is a different question.
How do you even play Overwatch at 60fps on console?
> I don't think a Tetris port running at 8K and/or 120fps is unfathomable for next gen.
Tetris Effect at 8K/120fps :D
> Surely devs will continue to choose what suits their individual games' needs? Whilst (hopefully) the availability of VRR will mean that more games will push for higher framerates, knowing that they don't have to hold a stable 60fps 99% of the time?
I think that HDMI's VRR will force most devs to provide two modes of operation in their next-gen console releases:
> As for ray tracing, per-pixel RT lighting is definitely off the cards, but we're already seeing alternate uses of RT for lighting, whether that's in Control (providing extra information to the voxel grid), Nvidia's paper on RT radiance fields, or other, unannounced tech.
Per-pixel lighting is definitely not off the cards, as any lighting done with RT is per-pixel. What Control does is correct its voxel-based GI with the help of per-pixel visibility data it gets from shooting rays.
You made that point by saying that 4x the processing power gives you access to 4x the simulation, which isn't the case. And it's also worth pointing out that doubling the framerate doesn't mean a straight halving of what you can do with the CPU per frame either.
Devs will choose what is best for their projects based on the resources they have. DrKeo nails it!
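The point about framerate and CPU budgets is easy to show with numbers: fixed per-frame costs don't shrink when the framerate doubles, so the time left for simulation drops by more than half. A toy calculation, with the overhead figure being purely an illustrative assumption:

```python
# Toy frame-budget arithmetic; the 4ms fixed overhead is an assumed,
# illustrative number, not a measured figure from any real engine.
def sim_budget_ms(fps, fixed_overhead_ms=4.0):
    """CPU time left for game simulation per frame, after fixed per-frame
    costs (render submission, OS/driver work, etc.) are paid."""
    frame_time_ms = 1000.0 / fps
    return frame_time_ms - fixed_overhead_ms

at_30 = sim_budget_ms(30)   # ~33.3ms frame, ~29.3ms left for simulation
at_60 = sim_budget_ms(60)   # ~16.7ms frame, ~12.7ms left for simulation

# Going 30 -> 60fps costs more than half the simulation budget:
print(at_60 / at_30)  # ~0.43, not 0.5
```

This is the sense in which doubling the framerate doesn't simply halve what you can do per frame, and why a 4x CPU jump doesn't translate into 4x the simulation at the same framerate.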
> For who, the minuscule amount of people that play their consoles on a screen that supports 120Hz?
Depends on the game.
I don't think this is a bulletpoint worth chasing in the slightest.
You can also play 120fps on a 60Hz screen.
I play 300fps on a 75Hz screen on PC.
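There's a real latency argument behind rendering above the refresh rate: at each scanout the display picks up a fresher frame. A crude model, assuming vsync off and uniform frame times (both are my assumptions, not anything claimed in the thread):

```python
# Crude latency model: with vsync off, the newest completed frame at
# scanout time is, on average, half a frame-time old. Assumes uniform
# frame times; real input-to-photon pipelines have more stages than this.
def avg_frame_age_ms(fps):
    frame_time_ms = 1000.0 / fps
    return frame_time_ms / 2

print(avg_frame_age_ms(75))   # ~6.7ms when rendering at 75fps
print(avg_frame_age_ms(300))  # ~1.7ms at 300fps: fresher frames, less lag
```

So even on a fixed 75Hz panel, a higher internal framerate reduces how stale the displayed frame is, which is why high-fps play on a low-refresh screen isn't pointless.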
"...4x the simulation complexity" was intended as a deliberately vague and arbitrary description of the boost in game complexity that a 4x CPU perf jump brings. There's no need to ascribe specific definitions to an intentionally non-specific and very general statement.
Perhaps I should have worded it as "4x the CPU performance worth of increased game complexity"? I guess I just figured it would have been obvious.
I don't think a Tetris port running at 8k and/or 120 fps is unfathomable for next gen.
This will likely be an unpopular opinion, but the more I see of ray tracing, the less important it seems to me. The payoff for the amount of power it needs doesn't seem worth it.
Does it look better? Of course. But Control having more realistic floor reflections doesn't exactly seem like a game changer to me.
The problem is that it isn't vague or arbitrary when "4x the simulation complexity" is mentioned in reference to the 4x CPU perf jump, and as an opposing argument against 60fps titles. And worse, you don't even mention the possibility of increased simulation complexity with your 60fps option.
> This will likely be an unpopular opinion, but the more I see of ray tracing, the less important it seems to me. The payoff for the amount of power it needs doesn't seem worth it. Does it look better? Of course. But Control having more realistic floor reflections doesn't exactly seem like a game changer to me.
Personally, I think it's very early days for RT support, and the floor reflection stuff is about as basic as RT can be. Metro Exodus used RT in a different way, and I think as developers get hold of it via Scarlett and PS5, more interesting uses of the hardware will appear.
This.
I think a lot of the hype around it is founded more in the novelty of the underlying technology, and in it now being possible in realtime rendering, than in the actual real-life results.
The only RT demo that suitably impressed me so far was the Star Wars one, and that won't even be possible in-game on the next-next-gen of high-end PC hardware, much less game consoles.
I strongly disagree. The lower input latency of 60fps doesn't benefit every game to the same extent, and the compromises involved in reining in the developer's vision for the sake of slightly more responsive input don't make sense for many games in many genres.
We don't need arbitrary framerate standards for all games. Far better to let the devs decide where to apportion their perf budget to achieve their own specific vision.
> Truth is devs are mostly like talking .
Actually, devs do have documentation on both systems if they have both devkits. Everything you'd need to know is in them, with the obvious *subject to change* caveat.
They do have dev kits, target specs, documentation, etc.
It's too bad we're not getting to see anything compared to how it was for XB1 and PS4 before they came out.