
Patent

Self-requested ban
Banned
Jul 2, 2018
1,621
North Carolina
Assuming rumors and the speculated price are true, I'd buy it as a second console for my 1440p widescreen and to take on trips. What will truly define these consoles as next-gen consoles is the SSDs and CPUs. The GPU power is for image quality enthusiasts.
I find it hard to believe the only difference you're gonna see between 12+ TF and 4 TF is gonna be resolution, but if it was the case I could def see that moving many, many units.
 

Vimto

Member
Oct 29, 2017
3,718
Scaled proportionately. You don't need exactly the same memory if you aren't pushing 4K textures. Again, we're going off rumors right now. Rumors show it's the same architecture, using Navi, using Zen, using an SSD with the same IO speeds. It's going to be capable of all the same games, with all the same characters, running at the same frame rate with the same next-gen physics and AI... running at lower resolutions with less sophisticated lighting.

This is too good to be true... there will be big compromises for the $299 tag. I also expect sub-900p @ 30fps will be the standard for Lockhart next-gen games.
 

gundamkyoukai

Member
Oct 25, 2017
21,316
I find it hard to believe the only difference you're gonna see between 12+ TF and 4 TF is gonna be resolution, but if it was the case I could def see that moving many, many units.

I don't see it moving that many early on.
I really feel some people forget the PS3/360 gen and how similar it is to what they're saying.
 

OneBadMutha

Member
Nov 2, 2017
6,059
I find it hard to believe the only difference you're gonna see between 12+ TF and 4 TF is gonna be resolution, but if it was the case I could def see that moving many, many units.

Resolution eats a shit ton of power. Look at the Xbox One S vs the X. Going from 1.3 TFLOPS to 6 TFLOPS with a lot more memory and a slight uptick in CPU gets you from 1080p to 4K with 4K textures.

Then we haven't even covered ray tracing. That eats a shit ton of GPU power. Resolution and lighting start high and scale down. There's really nothing else you'd need to scale for a unit with a 3x drop-off in GPU. Lockhart is still working with next-gen Navi features, a next-gen CPU and next-gen IO speeds.

Again, this is assuming the rumors are facts (which they aren't yet). If the rumors come to fruition, the scaling between Series X and Lockhart is even more seamless and smooth than the scaling between Xbox One X and the S, because there isn't the messy memory structure that we had with the S. It's all the same architecture, same generation and same development methods.
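
A quick back-of-the-envelope sketch of that One S vs One X scaling argument (the TFLOPS and resolution figures are the commonly quoted public specs; this is illustrative arithmetic only, not a performance model):

```python
# Hypothetical, illustrative arithmetic for the S -> X comparison above.
s_tflops, x_tflops = 1.3, 6.0
s_pixels = 1920 * 1080          # 1080p
x_pixels = 3840 * 2160          # 4K

print(f"GPU ratio   : {x_tflops / s_tflops:.1f}x")   # ~4.6x
print(f"Pixel ratio : {x_pixels / s_pixels:.1f}x")   # 4.0x
# The GPU gap is slightly larger than the pixel gap, which is roughly how the X
# fits higher-resolution textures and settings on top of the jump to 4K.
```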
 
Oct 25, 2017
11,798
United Kingdom
Because comparing the Series X to the X is comparing apples with apples.

Comparing a base model to a Pro model is comparing apples with Pink Lady apples.

If there is a 9-to-12 TF gulf, then if you're Sony you want to be very clear that the PS5 is a base model not to be talked about in the same breath as the Series X, and that the PS5 Pro will become the king in due course.

Well, if the gap really is 9.2 TF vs 12 TF, they could easily just focus more on their great games and likely cheaper price as the big selling points for the PS5.

Even with 9.2 TF, we all know Sony developers are going to make some eye-melting stuff after what they did with the PS4's 1.8 TF.
 

Dave.

Member
Oct 27, 2017
6,181
What many here are underrating is how much the next-gen CPUs are going to change the feel and immersion of games. Devs haven't even utilized them on PC much yet because of the lack of a wide market. Now they'll begin to target those better CPUs, and it's exciting.

Would you say computers with these modern CPUs today are... hmmm... let me try and think of the words... "held back"... by weak consoles?

No, that's impossible...
 
Jun 23, 2019
6,446
Man I would love an OLED 4K TV, but my current TCL is only ~14ms in input lag. I don't know if I want to give that up unless the new TVs coming out are baller.
 

OneBadMutha

Member
Nov 2, 2017
6,059
This is too good to be true... there will be big compromises for the $299 tag. I also expect sub-900p @ 30fps will be the standard for Lockhart next-gen games.

The only way you get sub-900p games is if devs are targeting 1440p games on Series X... which is highly unlikely. Absolutely no reason for frame rates to dip if it has a Zen CPU. So nope. Again, there is plenty of real-world evidence to understand how the scaling would work with the rumored specs.
 

19thCenturyFox

Prophet of Regret
Member
Oct 29, 2017
4,313
Some part of my brain still wants to think that Lockhart is basically an "Xbox Game Pass Edition" that plays backwards-compatible games natively and streams next-gen games added to Game Pass. I know there are a million reasons why that isn't going to happen, but I'm trying to make sense of the weak-ass hardware in that thing. I mean, the 5500 XT performs nicely; it plays current PC games at Ultra settings at 1080p and mostly 60+ fps, so I get why a 1080p console might make sense, but that card packs quite a bit more punch than 4 TF. Why not just go for that?
 

Vimto

Member
Oct 29, 2017
3,718
The only way you get sub-900p games is if devs are targeting 1440p games on Series X... which is highly unlikely. Absolutely no reason for frame rates to dip if it has a Zen CPU. So nope. Again, there is plenty of real-world evidence to understand how the scaling would work with the rumored specs.

lmao, you think the CPU is the main part responsible for frame rates?

And yes, next-gen games will be 1440p with checkerboarding techniques.

Native 4K is stupid and a waste of resources.
 

OneBadMutha

Member
Nov 2, 2017
6,059
Would you say computers with these modern CPUs today are... hmmm... let me try and think of the words... "held back"... by weak consoles?

No, that's impossible...

Yes... weak CPUs. The #1 thing that holds games back is economics. They aren't going to make crazy immersive worlds that cost a shit ton to make if only a small share of gaming consumers have the CPUs to enjoy them and everyone else gets a scaled-down version. Now that all next-gen consoles have Zen CPUs, they can start to do things to make worlds more interactive... and yes, still scale them down to weak consoles with Jaguar CPUs.
 

OneBadMutha

Member
Nov 2, 2017
6,059
lmao, you think the CPU is the main part responsible for frame rates?

And yes, next-gen games will be 1440p with checkerboarding techniques.

Native 4K is stupid and a waste of resources.

Yes. CPUs are absolutely the most important factor in getting to 60fps. Devs aren't targeting 1440p on 12 TF Navi. It would be a dumb business move, as it's obvious that people buying those units value resolution.
 

Dust

C H A O S
Member
Oct 25, 2017
32,717
Someone needs to pin Cerny to the ground and scream "IS IT 9 TFLOPS OR HIGHERRRRRRRR" while headlocking him.

 

PrimeRib

Member
Nov 16, 2017
261
Man I would love an OLED 4K TV, but my current TCL is only ~14ms in input lag. I don't know if I want to give that up unless the new TVs coming out are baller.

We'll know more next week at CES. Expect Samsung to push QLED and Sony to showcase their new microLED panels, which are arguably the response to OLED but without the potential for burn-in.

I gotta say, I picked up an LG OLED a few months back and ... it hasn't blown my hair back nearly as much as others have led me to believe. Good tech, crisp blacks (for sure) but I actually still prefer the brighter LED sets. When HDR shines, it shines ungodly bright on those sets and I find that more compelling than the slightly more accurate shades of black. Just my $.02.
 
Jun 23, 2019
6,446
We'll know more next week at CES. Expect Samsung to push QLED and Sony to showcase their new microLED panels, which are arguably the response to OLED but without the potential for burn-in.

I gotta say, I picked up an LG OLED a few months back and ... it hasn't blown my hair back nearly as much as others have led me to believe. Good tech, crisp blacks (for sure) but I actually still prefer the brighter LED sets. When HDR shines, it shines ungodly bright on those sets and I find that more compelling than the slightly more accurate shades of black. Just my $.02.

Yikes. I was literally looking at LG sets on Best Buy when I read this so thanks for saving me lol.
 

Silver-Streak

Member
Oct 25, 2017
3,011
I really hope we get HDMI 2.1 TVs in smaller sizes than OLEDs currently come in. I can't fit anything above a 48-inch in the space for the TV currently, and no OLED goes down to that size right now. :(
 

D BATCH

Member
Nov 15, 2017
148
That's a bit hyperbolic on your part. The input lag on Sony TVs is perfectly fine. Granted, they're not as low as the LG OLEDs, but it's more than acceptable. My Sony XF90 series has input lag of around 40ms connected to a 1080p source (such as the Switch), which isn't great but not the end of the world either. Still playable. However, when connected to a 4K source the input lag drops to around 23ms, which is absolutely fine.
Aw man, not going to argue with you. 40ms is bad, man; 23ms is OK, but lower is better for FPS and competitive play. Samsung TVs are around 11ms, LG OLEDs 12ms, Sony OLEDs are 30-34ms, and LCDs around 25ms. The 850G is not bad @ 14ms, but image quality sucks on that TV. The bottom line: 13ms and under is good; anything above 20 is not as good. These are the facts.
 

PrimeRib

Member
Nov 16, 2017
261
Yikes. I was literally looking at LG sets on Best Buy when I read this so thanks for saving me lol.

I look at two things for TV purchasing: what its primary function is and where it's going to be placed. If I'm mostly playing games in a brightly lit room (with a lot of windows I can't/don't want to cover during gaming), then a big bright LED set is better. You need a set that will outshine the ambient light reflected on the screen. If I'm more into movie watching and the TV is going in a fairly dark corner of the room (or one where I can easily control the ambient light sources), then OLED all the way: deep blacks are incredible (especially in movies like 2001 or the recent Ad Astra). My thoughts, if they help.
 

Turkey

Member
Oct 27, 2017
17
This is too good to be true... there will be big compromises for the $299 tag. I also expect sub-900p @ 30fps will be the standard for Lockhart next-gen games.

Going from 1080p to 4K is 4 times the pixel count, and the One vs One X comparison shows it does indeed take somewhere close to 4 times the GPU to do this (the X gets far better bandwidth to help it).

With this in mind, 4 TF is only a third of 12 TF, so a native 4K game would scale down to something above 1080p.

I don't see the CPU changing in any meaningful way so I doubt we will see frame rates drop unless they are also pushing for closer to resolution parity.

Memory and bandwidth may be slightly lower, but this will scale with resolution, and the other changes may be to do with the system and quality-of-life features at the OS level: fewer instant-resume games supported at once, etc.
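
A minimal sketch of that scale-down arithmetic, assuming pixel throughput scales roughly linearly with TFLOPS (a deliberate simplification; real games don't scale perfectly, and the 12 TF / 4 TF figures are still rumored):

```python
import math

# If the 12 TF box targets native 4K, a 4 TF box rendering the same scene
# at 1/3 of the pixel budget lands above 1080p (illustrative only).
series_x_tf, lockhart_tf = 12.0, 4.0
target_w, target_h = 3840, 2160                  # native 4K on the big console

scale = math.sqrt(lockhart_tf / series_x_tf)     # per-axis resolution scale
w, h = round(target_w * scale), round(target_h * scale)
print(f"~{w}x{h}")                               # ~2217x1247, comfortably above 1920x1080
```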
 
Oct 25, 2017
1,760
Yes. CPUs are absolutely the most important factor in getting to 60fps. Devs aren't targeting 1440p on 12 TF Navi. It would be a dumb business move, as it's obvious that people buying those units value resolution.
This gen, yeah, processors are weak sauce and are probably a limiting factor quite often. I don't see that being the case next gen, at least early on when we're getting a lot of cross-gen games. It's pretty rare to experience CPU-limited frame rates on PC, even at 1440p with a 1080 Ti on a 144Hz monitor, and console processors will be close to (or better than) the more popular ones used by PC gamers today. Exclusives may be a different story, eventually, but I still think GPUs will continue to be a major limitation re: performance.
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,947
Berlin, 'SCHLAND
Oh wow! Yes, DoF is another effect that is best served in appropriate circumstances for guiding the eye. I wasn't aware that it had performance-saving benefits. Shame we never got that in the ME trilogy (then again, DoF effects generally amounted to a Gaussian blur on last-gen HW).

Talking about RE2 and SSS, I am reminded of GoW, another game where it is variable but shows up prominently more often than not (I tend to go for close-ups of the models), and on the flip side, JFO (at least on consoles) has one of the worst implementations of how skin reacts to light.

For example-

Real time cinematics:

[screenshot]


Gameplay when it's good:

[screenshot]


And when it goes pear shaped:

[screenshot]


It's more than just SSS, though. I can't put my finger on it, but it's the way the skin reacts to lighting, as well as character shadowing, that looks "off". And it's not specific to this game either; I have seen it in almost every game. Makes me wonder if ray tracing would be of help here.
I have written this in a topic here before, but real-time sub-surface scattering needs at least 2 real-time light directions (with corresponding accurate shadows) to look correct: 1 light and shadow driving the primary lit side of an object, and one directional light and shadow driving the darker side in the primary shadow. Hence why in cutscenes with multiple lights close to a digital actor, skin looks pretty great. In most gameplay scenes you probably have 1 primary light that casts shadows if it is a console game or outdoors, so the shadowed side looks pretty awkward. It is probably lit with some very inaccurate cube map/probe and SSAO, which does not produce the sub-surface scattering effect. Even worse is when the shadow map resolution is far too low to even produce a real sub-surface scattering effect (your 3rd screenshot shows that here) even if there is a direct light.

That is one reason why RT GI is so interesting: it gives the dark side of an object outside of direct lighting another directional light source... it dramatically improves the shading of the dark side of objects, whether they have sub-surface scattering or even just a normal material which could be matte or metal. Image-based lighting, which usually shades the dark sides of objects, is just not accurate enough to make it look completely convincing... and it also tends to have edges which glow.
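
To make the two-light point concrete, here is a tiny toy sketch in Python/numpy (a hypothetical illustration, not any engine's actual shader code): a wrap-diffuse term stands in for a cheap sub-surface scattering approximation, and the shadowed side of a surface only picks it up once a second light direction (e.g. an RT GI bounce) contributes; with the key light alone, that side falls back to flat ambient.

```python
import numpy as np

def wrap_diffuse(normal, light_dir, wrap=0.5):
    """Cheap SSS-style wrap lighting: light bleeds a bit past the terminator."""
    ndotl = np.dot(normal, light_dir)
    return np.clip((ndotl + wrap) / (1.0 + wrap), 0.0, 1.0)

def shade(normal, lights, ambient=0.03):
    """lights: list of (unit direction, RGB color, shadow term) tuples."""
    color = np.full(3, ambient)
    for direction, light_color, shadow in lights:
        color += shadow * wrap_diffuse(normal, direction) * np.asarray(light_color)
    return color

# Surface point facing away from the key light.
n = np.array([-1.0, 0.0, 0.0])
key  = (np.array([1.0, 0.0, 0.0]), [1.0, 0.9, 0.8], 1.0)    # primary light + its shadow term
fill = (np.array([-0.6, 0.8, 0.0]), [0.3, 0.35, 0.4], 1.0)  # second direction, e.g. an RT GI bounce

print("key light only:", shade(n, [key]))         # shadowed side stays at flat ambient
print("key + fill    :", shade(n, [key, fill]))   # the scattering term now has a direction to work with
```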
 
Oct 25, 2017
1,760
This SSS talk just makes me think of The Irishman, and how the veil drops in outdoor scenes, making it quite obvious that de-aging techniques still have some work ahead. Not sure if it was even SSS to blame (I suspect it's a number of things), but that was the first thing that popped into my mind. Off-topic aside over.
 
Oct 27, 2017
4,657
I really hope we get HDMI 2.1 TVs in smaller sizes than OLEDs currently come in. I can't fit anything above a 48-inch in the space for the TV currently, and no OLED goes down to that size right now. :(
LG is supposed to have a high end 48" OLED this year as part of their 10 series, and personally, that's the one I'm keeping an eye out for.
 

Hey Please

Avenger
Oct 31, 2017
22,824
Not America
I have written this in a topic here before, but real-time sub-surface scattering needs at least 2 real-time light directions (with corresponding accurate shadows) to look correct: 1 light and shadow driving the primary lit side of an object, and one directional light and shadow driving the darker side in the primary shadow. Hence why in cutscenes with multiple lights close to a digital actor, skin looks pretty great. In most gameplay scenes you probably have 1 primary light that casts shadows if it is a console game or outdoors, so the shadowed side looks pretty awkward. It is probably lit with some very inaccurate cube map/probe and SSAO, which does not produce the sub-surface scattering effect. Even worse is when the shadow map resolution is far too low to even produce a real sub-surface scattering effect (your 3rd screenshot shows that here) even if there is a direct light.

That is one reason why RT GI is so interesting: it gives the dark side of an object outside of direct lighting another directional light source... it dramatically improves the shading of the dark side of objects, whether they have sub-surface scattering or even just a normal material which could be matte or metal. Image-based lighting, which usually shades the dark sides of objects, is just not accurate enough to make it look completely convincing... and it also tends to have edges which glow.

Yes, I think the topic pertained to Uncharted 4's Nathan Drake's gameplay model rendering (IIRC).

Thank you for the detailed breakdown. By the sound of it, Metro 2033 with RT enabled should have some of the finest looking NPCs during actual gameplay.

So for next gen the questions are:

1. Can RT be used as a GI solution instead of how it has been conventionally advertised (reflective surfaces)?
2. If it proves too expensive, are there any conventional solutions to alleviate this issue? (Because it sounds like brute-forcing some aspects of it, like much higher-resolution shadow maps, might be a Pyrrhic victory.)
 

Proven

Banned
Oct 29, 2017
5,841
The Xbox One X already runs many games at 4K/30 and quite a few games at 4K/60 and yet people are expecting CB 4K @ 30 with a significantly better CPU and a GPU that's probably around 12 TF????

I expect Halo Infinite to run at 4K/60 and even have the option for a slightly lower resolution to run at uncapped frame rates that achieve well more than 60 FPS.
 

Patent

Self-requested ban
Banned
Jul 2, 2018
1,621
North Carolina
The Xbox One X already runs many games at 4K/30 and quite a few games at 4K/60 and yet people are expecting CB 4K @ 30 with a significantly better CPU and a GPU that's probably around 12 TF????

I expect Halo Infinite to run at 4K/60 and even have the option for a slightly lower resolution to run at uncapped frame rates that achieve well more than 60 FPS.
Devs tend to focus on in-game graphics over resolution or framerate. I don't think it's crazy to think that a lot of games won't try for 4K. Halo, sure, that's a game that is gonna run on the original Xbox One.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
The Xbox One X already runs many games at 4K/30 and quite a few games at 4K/60 and yet people are expecting CB 4K @ 30 with a significantly better CPU and a GPU that's probably around 12 TF????

I expect Halo Infinite to run at 4K/60 and even have the option for a slightly lower resolution to run at uncapped frame rates that achieve well more than 60 FPS.

There's gonna be more effects and more stuff going on. Ray tracing is also a big performance hog. Even with current games with RT, people have to down-res to get 60fps with a 2080 Super-level card. Even without RT, everything maxed out at 4K/60 on PC is difficult. The One X is running more like med/high settings at 4K/CB.
 

Trup1aya

Literally a train safety expert
Member
Oct 25, 2017
21,534
Nope, you don't get to $299 with the GPU alone; they will need to cut the CPU, RAM and GPU.


The CPU will be slightly slower and there will be less RAM, but that would be offset by less demand on those components due to smaller assets and lower resolution.

This thing should definitely be able to play the same games but at a much lower res.
 

CrispyGamer

Banned
Jan 4, 2020
2,774
There's gonna be more effects and more stuff going on. Ray tracing is also a big performance hog. Even with current games with RT, people have to down-res to get 60fps with a 2080 Super-level card. Even without RT, everything maxed out at 4K/60 on PC is difficult. The One X is running more like med/high settings at 4K/CB.
Long-time lurker, first post! I totally agree with this, ZOONAMI. We have no idea what next gen will look like, and I doubt it'll be current-gen graphics at 4K with ray tracing.
 

UltraInstinct

Member
Nov 19, 2017
1,099
Aw man, not going to argue with you. 40ms is bad, man; 23ms is OK, but lower is better for FPS and competitive play. Samsung TVs are around 11ms, LG OLEDs 12ms, Sony OLEDs are 30-34ms, and LCDs around 25ms. The 850G is not bad @ 14ms, but image quality sucks on that TV. The bottom line: 13ms and under is good; anything above 20 is not as good. These are the facts.

Of course the lower the input lag, the better the response times; that's a given. My point is that it's not as 'horrible' as you described it. Personally, for me, 40ms is okay: not great, but playable (I've not complained). 20ms or under is great. Unless you're a serious competitive gamer (in which case you'd be playing on PC anyway), chances are most people will not really notice a difference of, say, 12ms vs 20ms.
 

Patent

Self-requested ban
Banned
Jul 2, 2018
1,621
North Carolina