
Overall maximum teraflops for next-gen launch consoles?

  • 8 teraflops

    Votes: 43 1.9%
  • 9 teraflops

    Votes: 56 2.4%
  • 12 teraflops

    Votes: 978 42.5%
  • 14 teraflops

    Votes: 525 22.8%
  • Team ALL THE WAY UP +14 teraflops

    Votes: 491 21.3%
  • 10 teraflops (because for some reason I put 9 instead of 10)

    Votes: 208 9.0%

  • Total voters
    2,301
Status
Not open for further replies.

44alltheway

Banned
Apr 17, 2019
38
Do people think we won't get anything substantial till E3?

Also, looking past E3, do we have any idea of when Microsoft/Sony will begin their marketing for their next-gen consoles?
 

Tappin Brews

#TeamThierry
Member
Oct 25, 2017
14,879
i expect a few Watch Dogs-like reveals giving us at least a ballpark of where we're heading. maybe not from MS though
 

PetohKing

Alt account
Banned
Apr 16, 2019
82
With 10+ TF as a baseline, open world games should hopefully have Agni's Philosophy-level visuals. Linear games should aim beyond that.

Edit: Might be a lowball even for open world titles.
 
Last edited:

Tappin Brews

#TeamThierry
Member
Oct 25, 2017
14,879
can you imagine PCARS 3 on the Madbox? we need some Madbox speculation up in here

EDIT: just googled it (for shits) and there are a bunch of one-day-old stories about how it's likely dead, killed by Stadia
 

OnPorpoise

Avenger
Oct 25, 2017
1,300
Do people think we won't get anything substantial till E3?

Also, looking past E3, do we have any idea of when Microsoft/Sony will begin their marketing for their next-gen consoles?

We have a really good chance of getting leaks, but they could vary wildly... and even if full dev kit specs leak, those might not be the final clockspeeds.

I'd argue MS is already subtly marketing their next-gen console.
 

PetohKing

Alt account
Banned
Apr 16, 2019
82
I'm going back and watching the old tech demos (Samaritan, Agni's Philosophy, Dark Sorcerer, Elemental) and I think in many ways they've been exceeded. Anyone else share this analysis?
 

flipswitch

Member
Oct 25, 2017
3,955
What's everybody's expectations of the PS5's wifi capabilities?

In other words, will it suck? I hope they put a decent wifi chip in it!
 

Kage Maru

Member
Oct 27, 2017
3,804
Ok hopefully I can respond in a way that makes sense.

this right here is what i want to focus on the most. everything else is just a side conversation.
A game cannot be both 'designed around a 6 tflops GPU' and at the same time be a generational leap on a similar 6 tflops GPU a gen later. the fact is that they didn't push the foliage as high as they could have. the lighting engine didn't get reworked to take advantage of the insane 6 tflops GPU. the car models show more detail, but clearly we both agree that next gen cars will look much better. it's a clear upgrade, no doubt, but not the generational one you would get if it was designed on the Xbox One X.

and no, i have not seen it on the base xbox one console, but i have seen RDR2 running at 900p on my brother's Slim and it looked like a blurry mess on his tv. i am 100% in agreement that there is a clear visual difference between a native 4k Forza Horizon 4 and the 1080p xbox one version, but it's mostly due to the resolution. the foliage, number of NPCs, and some shadows don't amount to much, and certainly don't offer a massive visual leap.

check out the Unity Book of the Dead tech demo running realtime on the ps4 pro. the foliage, the textures, the wind effects, the lighting, everything looks way ahead of RDR2, Horizon, and even Ghost of Tsushima. Because it is utilizing all of the Pro's GPU horsepower.

With all that said, i will extend you an olive branch and accept that if we apply this logic in reverse, with the Anaconda version being the benchmark and the Lockhart version a watered-down 1080p version with fewer NPCs, fewer shadows, a slight downgrade in foliage, less tessellation, reflections etc., then ok, it might work. but i am skeptical for the reasons i list below.

First of all, yes, you can have a next gen leap with a 6TF console next gen when it targets 1080p. You're comparing the two systems apples to apples when that's not how things will play out. For one, we don't know how similar the GPUs will be. Granted, it'll still be the GCN architecture, but there will be efficiencies in the newer GPU due to a newer iteration of that architecture. We can at least expect all of the improvements seen with Vega, plus whatever further improvements they make for Navi. Even IF the GPU is exactly the same 6TF GPU we have now but at 7nm, you will still see massive gains in what's possible when targeting 1080p.

Here's a rough example of why that's the case: a frame has 33.33ms to render for a 30fps game before it needs to be flipped to the front buffer for display. Let's pretend a game is able to run a handful of effects (SSS, SSAO, motion blur, and SSR) at 4K within this 33.33ms budget. At 1080p, however, the same GPU can run this same frame in ~9.5ms, leaving A LOT of extra headroom for further rendering improvements. Throw in the additional benefits of more memory, a faster storage solution, and a much more capable CPU, and the 6TF GPU no longer has to be a bottleneck. This would allow the same developers to provide those larger worlds, better animations, better textures, greater range of effects, smarter AI and everything else we expect from these next gen games. If developers were still targeting 4K with Lockhart, you'd have a point, but that won't be the case.
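For what it's worth, the frame-budget arithmetic in that example can be sketched in a few lines of Python, under the simplifying assumption that the work is purely pixel-bound (real frames carry resolution-independent costs, which is why the ~9.5ms figure above sits a bit over the ideal ~8.3ms):

```python
# Toy model: scale a frame's render time by the pixel-count ratio.
# Assumption (not from the post): GPU cost scales linearly with pixels.

def scaled_frame_time_ms(frame_time_ms: float, src_px: int, dst_px: int) -> float:
    return frame_time_ms * (dst_px / src_px)

px_4k = 3840 * 2160       # 8,294,400 pixels
px_1080 = 1920 * 1080     # 2,073,600 pixels (exactly 1/4 of 4K)
budget_30fps = 1000 / 30  # ~33.33 ms per frame at 30fps

t_1080 = scaled_frame_time_ms(budget_30fps, px_4k, px_1080)
headroom = budget_30fps - t_1080
print(f"same frame at 1080p: {t_1080:.2f} ms, headroom: {headroom:.2f} ms")
```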

Also, how far they were able to push foliage on the 1X had nothing to do with the 1S. I'm not sure how far you think they should have pushed the foliage, but they aren't going to limit themselves because of the 1S. If a scene has 100 bushes on the 1S, they aren't going to hit 150 bushes on the 1X and suddenly say "whoa wait a second, we can't do more because of this other system that's not running this build." Foliage creates overdraw. Overdraw eats up bandwidth. Extra objects will also increase the number of draw calls that need to be processed by the CPU and sent to the GPU. Without access to development documentation it's impossible to decipher exactly why they stopped at 50% more foliage density, but the 1S had nothing to do with it. It's far more likely they hit bandwidth and/or CPU limitations.

Regarding other parts of your comments, like reworking the lighting model or next gen car models: time, budgets, and overall efficiencies have as much, if not more, to do with the results as the hardware. The Forza Horizon engine is a forward+ engine, meaning it is a forward renderer with benefits similar to deferred renderers, allowing many more lights in a scene with far less performance impact than a strictly forward render engine. While supporting many lights per scene, this renderer type also avoids some of the drawbacks we see with a strictly deferred engine, by better handling transparencies and supporting MSAA. It's clear they have tailored this engine to fit their needs. The engine already supports a 24 hour day/night cycle, supports many lights per scene, and in the case of the 1X version, the game also supports dynamic shadows at night. So while it's possible that they could have reworked the lighting engine, there really is no good reason to, nor would that really be the most efficient use of their time.

When you're talking about a lighting engine, you also have to consider all of the assets this lighting engine would touch and light, meaning you may need to author a completely different subset of assets for a different lighting engine that may not be better in the end. This all falls in line with gamers' weird obsession with "new" engines as if they're a magic bullet, when in reality most new engines are optimized iterations with a rewritten renderer, and moving to an entirely new engine does not guarantee better results. As for the car models, again, time and money. We also don't know what their pipeline allows with their current rendering budgets for 4K on the 1X. I'm not really sure what else to say about the leap in FH4 besides that the leap is bigger than you are implying. It's not a next gen leap, but it was never intended to be either. It was never going to be a next gen leap with them targeting 4K.

I've seen the Book of the Dead tech demo and I don't agree at all. Even Unity doesn't agree with you, because they say on their site that the demo supports the OG PS4 and even the OG XBO. Again, another good example of how scalable rendering technology is:

The project runs 30 fps in 1080p on PS4 Pro, Xbox One X and Windows/DX11 (mid-range gaming system). It is also supported on macOS/Metal, Windows/Vulkan/DX12 as well as on PS4, Xbox One and Xbox One S.

This is an interesting example. technically you can port current gen ps4 and xbox one games to the switch, though they tend to run at 360p-480p undocked. literally the three biggest games (Wolfenstein, Doom and Mortal Kombat 11) all spend most of their time between those low resolutions going by DF comparisons. we don't know the exact tflops number of the Tegra GPU in the Switch, and it's a tough comparison due to the massive difference between Nvidia and AMD tflops, but it's safe to assume it's roughly half of the 1.3 tflops xbox one GPU, somewhere around 600 gflops, and it is running these games at way less than half the resolution most of the time, at HALF the framerate to boot (although MK11 runs at 60 fps, it looks pretty bad).

i am not saying that a game designed on a 13 tflops ps5 gpu to run at 4kcb will run at 480p or even 720p on the Lockhart. i am saying these games next gen are going to have some really complex physics and simulations that will make it much harder to downport at just half the resolution or 1080p. there is nothing really that complex about Doom, Mortal Kombat and Wolfenstein, at least not when it comes to physics and next gen simulations, but they still struggle mightily.

which brings us to the worst-case scenario, or concern trolling as you called it, which is that devs use the 6-8 tflops console as the benchmark instead and simply up-res the game for the other versions with other smaller Forza-like upgrades like better shadows, slightly better foliage quality etc. and i don't want slight. i want devs to start with 13 tflops.

Edit: originally thought the Switch had more performance than it really did.

Actually the Switch docked is closer to 1TF of performance. So you may be wondering why games still fall very short of their XBO counterparts if the overall system flop counts are so close. The Switch has more issues than just GPU performance. That has to do with the other aspects of the system: the 4 core CPU, 4GB of memory, and especially the amount of bandwidth available to feed both the CPU and GPU. Because of all of these reasons, the Switch isn't really a good system to look at when determining how things will play out next gen. Lockhart won't be bottlenecked in CPU and memory performance like the Switch is.

Yes, we all want more complex physics and simulation, which is why it's important for MS to have a CPU very close to the performance of the PS5 and Anaconda, if not exactly the same. This is why I said before that if Lockhart has the same CPU and almost as much memory, there should be no reason to believe it will bottleneck next gen. You're correct that the game logic does not scale nearly as well as the renderer. However, even if the CPU is slightly weaker, it still shouldn't be an issue. Developers can still target what they want on the PS5 and Anaconda and do something as simple as scaling down the draw distance on Lockhart so it has to process fewer draw calls; that could be a good way to work around the issue.

The only people that should be bothered by my concern trolling comments are those that were doing it. I understand if you took that personally since you and I have been engaged in a conversation but it wasn't directed at you, just some members in this thread. Going back to your benchmark concerns, I wouldn't be too worried yet. Assuming all things but the GPU can be equal, theoretically developers would be able to do more with a 6TF GPU at 1080p than a 13TF GPU at 4K. In this instance, we're talking about one system that's a bit more than 2x the performance but needs to render 4x the number of pixels. This may be where the 4TF rumors for Lockhart are coming from because if the high end systems are at or below 13TF, you technically don't need 6TF to run games at the same settings, same next gen effects, but at 1080p.
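A quick back-of-envelope for that 13TF-at-4K vs 6TF-at-1080p comparison, looking only at per-pixel flop budgets (it deliberately ignores CPU, bandwidth and fixed per-frame costs, so treat it as a sketch, not a claim about real hardware):

```python
# Compare per-pixel shading budgets between a hypothetical 13TF 4K console
# and a hypothetical 6TF 1080p console (figures taken from the discussion).

def mflops_per_pixel(tflops: float, width: int, height: int) -> float:
    return tflops * 1e12 / (width * height) / 1e6

high_end = mflops_per_pixel(13, 3840, 2160)  # ~1.57 Mflops per pixel
lockhart = mflops_per_pixel(6, 1920, 1080)   # ~2.89 Mflops per pixel

# GPU size needed at 1080p just to match the 13TF console per pixel:
parity_tf = 13 * (1920 * 1080) / (3840 * 2160)  # 13 / 4 = 3.25 TF
print(high_end, lockhart, parity_tf)
```

Which lines up with the point about the 4TF rumors: at a quarter of the pixels, roughly a quarter of the flops buys the same per-pixel budget.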

The Anthem example is the most recent one, but I can promise you that even if they didn't have a troubled development, they couldn't have produced those E3 2017 visuals on the base Xbox One or PS4. That demo looked a gen ahead of RDR2, which is the current graphics king on consoles. Are you saying that if BioWare had enough time and resources they could make the game look that good on consoles? The funny thing is that they had 7 years and spent the last two years downgrading the game to get it to run on consoles.

if you don't like the Anthem example, you can pick any other downgrade: Dark Souls 2, Watch Dogs, Rainbow Six. when they got downgraded on consoles, they also got downgraded on the PCs which were most likely being used to create the vertical slices in the first place. the point is that the base consoles limited the vision of the devs. I fear the same will be the case for third party devs and first party MS devs who are limited to 6-8 tflops in the same way.

Yes, if they didn't struggle with the engine to the point where they were considering taking flying out of it, then there's a chance the final game could have looked a lot like that earlier tech demo on the 1X. However, the game ran into trouble after trouble and was even rebooted twice IIRC. So sure, it was in development for 7 years, but that doesn't really mean much when you waste 5 of those years.

Regarding the rest of the downgraded titles you mentioned, yeah, some developers originally showed more than what was capable on the PS4 and XBO. These were systems that were expected to render games at higher resolutions with more effects, and had CPUs that weren't really a massive improvement over last gen CPUs. This is different from what Lockhart should be next gen, where the base resolution demands will remain the same (1080p) while still increasing the memory pool, GPU, and CPU performance by a huge degree over previous generation base consoles.

This post has been a highly simplified overview of my point, but hopefully now you'll better understand why I'm saying Lockhart doesn't have to be a bottleneck, even at 6TF.
 
Last edited:

Deleted member 1589

User requested account closure
Banned
Oct 25, 2017
8,576
I'm going back and watching the old tech demos (Samaritan, Agni's Philosophy, Dark Sorcerer, Elemental) and I think in many ways they've been exceeded. Anyone else share this analysis?
Yeah, Samaritan was bested by or on par with Arkham Knight. For the rest there's RDR2, Days Gone, The Order, God of War, Hellblade, and the upcoming TLoU2
 

Kage Maru

Member
Oct 27, 2017
3,804
Um, no. Not remotely. The Switch docked is about 352GF, if I remember correctly. I have heard that the Tegra X1 can potentially hit 1TF, but only with PC-level cooling, and that might've been referring to FP16.

That seems extremely low, but looking it up, the only thing I could find is early rumors like the one below. So you may be right. I'll correct that part in my post, thanks.

https://www.gamespot.com/articles/nintendo-switch-specs-report-1-teraflop-of-perform/1100-6446282
 

BreakAtmo

Member
Nov 12, 2017
12,838
Australia
That seems extremely low but looking it up the only thing I could find is early rumors like the one below. So you can be right. I'll correct that part in my post, thanks.

https://www.gamespot.com/articles/nintendo-switch-specs-report-1-teraflop-of-perform/1100-6446282

It's about what was expected; the PS3, 360 and Wii U all hovered around the 170-230GF area. Portable Switch is about the same - I think it varies from about 190-200GF, which is why it runs last-gen games similarly well to the older consoles - and then docked mode uses the fan to beef it up for higher resolutions. I'm pretty sure those early 1TF rumours were due to the Tegra X1 being initially announced with up to 1TF in 2015.
 

Putty

Double Eleven
Verified
Oct 27, 2017
931
Middlesbrough
Add me to the CB/reconstruction camp... Some results on Pro titles have been stellar, so doing the same on a 12tf machine means they could throw the kitchen sink at the IQ dept and STILL have a monstrous budget to lavish each scene with as much detail and complexity as you could shake a stick at.
 
Feb 23, 2019
1,426
Add me to the CB/reconstruction camp... Some results on Pro titles have been stellar, so doing the same on a 12tf machine means they could throw the kitchen sink at the IQ dept and STILL have a monstrous budget to lavish each scene with as much detail and complexity as you could shake a stick at.

I totally agree...and moreover, I think 1440-1880 is the sweet spot for typical large 4K TVs and normal viewing angles

To truly resolve a ton of the detail of full 4K you need to sit extremely close to the TV and that makes games practically unplayable

I cannot really tell a huge difference between resolutions beyond 1440p or so.

I want next gen to hit the resolution sweet spot and then use all those TFs for graphical fidelity or framerate.

Which is also why I'm not a huge fan of a 2 SKU strategy, I want 1 high end SKU baseline
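The viewing-distance point can be roughed out with the usual 20/20-acuity rule of thumb of ~60 pixels per degree (one arcminute per pixel); this is a sketch under that single assumption, not a vision-science result:

```python
import math

# Distance beyond which one pixel subtends less than 1 arcminute,
# i.e. extra resolution stops being resolvable for 20/20 vision.

def max_useful_distance_m(diagonal_in: float, horizontal_px: int) -> float:
    width_m = diagonal_in * 0.0254 * (16 / math.hypot(16, 9))  # 16:9 panel width
    pixel_m = width_m / horizontal_px
    return pixel_m / math.tan(math.radians(1 / 60))

d_4k = max_useful_distance_m(55, 3840)    # ~1.1 m for a 55" 4K panel
d_1440 = max_useful_distance_m(55, 2560)  # ~1.6 m for 1440p on the same panel
print(f'55": full 4K detail within {d_4k:.2f} m, 1440p detail within {d_1440:.2f} m')
```

On those numbers, anyone sitting more than about a metre from a 55" set is already past the point where full 4K detail resolves, which is the territory this post is describing.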
 

Kage Maru

Member
Oct 27, 2017
3,804
It's about what was expected, the PS3, 360 and Wii U all hovered around the 170-230GF area. Portable Switch is about the same - I think it varies from about 190-200GF, which is why it runs last-gen games similarly well to the older consoles - and then docked mode uses the fan to beef it up for higher resolutions. I'm pretty sure those early 1TF rumours were probably due to the Tegra X1 being initially announced with up to 1TF in 2015.

Ah OK that makes sense. The Switch flop count isn't critical to the point I was going for but it's still disappointing that it's so low. At the same time I'm even more impressed with what's been done on the system now.
 
Nov 12, 2017
2,877
I will bet my avatar that the majority of Sony's 1st party will not be implementing native 4K over the lifetime of the PS5, especially if they wish to incorporate complex world geometry, loads of draw calls, increased alpha effects, various post processing effects like real time SSS and even perhaps limited ray tracing usage. This is especially the case given the success GG has had with their 4K CB, something for which the Pro was customized. Furthermore, this gen has seen quite a few games successfully use dynamic resolution to preserve performance.

The expectation of devs implementing native 4K more rigorously for comparatively simplistic looking titles, with current gen parallels being Ashen, Abzu, Nex Machina, Earthlock, Absolver, The Gardens Between, Divinity Original Sin etc., is, however, quite valid.
Take the bet :) I think you're gonna lose your avatar on the first reveal day
 

Deleted member 1589

User requested account closure
Banned
Oct 25, 2017
8,576
I will bet my avatar that the majority of Sony's 1st party will not be implementing native 4K over the lifetime of the PS5, especially if they wish to incorporate complex world geometry, loads of draw calls, increased alpha effects, various post processing effects like real time SSS and even perhaps limited ray tracing usage. This is especially the case given the success GG has had with their 4K CB, something for which the Pro was customized. Furthermore, this gen has seen quite a few games successfully use dynamic resolution to preserve performance.

The expectation of devs implementing native 4K more rigorously for comparatively simplistic looking titles, with current gen parallels being Ashen, Abzu, Nex Machina, Earthlock, Absolver, The Gardens Between, Divinity Original Sin etc., is, however, quite valid.
I'll take that bet.

Naughty Dog is insane enough to try and do it.
 

Saberus

Member
Oct 28, 2017
583
Vancouver, BC
I totally agree...and moreover, I think 1440-1880 is the sweet spot for typical large 4K TVs and normal viewing angles

To truly resolve a ton of the detail of full 4K you need to sit extremely close to the TV and that makes games practically unplayable

I cannot really tell a huge difference between resolutions beyond 1440p or so.

I want next gen to hit the resolution sweet spot and then use all those TFs for graphical fidelity or framerate.

Which is also why I'm not a huge fan of a 2 SKU strategy, I want 1 high end SKU baseline


butt..butt.. Digital Foundry will tell us you're wrong.... .. .
 

Xeontech

Member
Oct 28, 2017
4,059
Add me to the CB/reconstruction camp... Some results on Pro titles have been stellar, so doing the same on a 12tf machine means they could throw the kitchen sink at the IQ dept and STILL have a monstrous budget to lavish each scene with as much detail and complexity as you could shake a stick at.
Not just that but the frame rates man! The frame rates!!
 

Andromeda

Member
Oct 27, 2017
4,846
I will bet my avatar that the majority of Sony's 1st party will not be implementing native 4K over the lifetime of the PS5, especially if they wish to incorporate complex world geometry, loads of draw calls, increased alpha effects, various post processing effects like real time SSS and even perhaps limited ray tracing usage. This is especially the case given the success GG has had with their 4K CB, something for which the Pro was customized. Furthermore, this gen has seen quite a few games successfully use dynamic resolution to preserve performance.

The expectation of devs implementing native 4K more rigorously for comparatively simplistic looking titles, with current gen parallels being Ashen, Abzu, Nex Machina, Earthlock, Absolver, The Gardens Between, Divinity Original Sin etc., is, however, quite valid.
Let's hope so. But the most important thing for me is >60fps. At some point with 4K 30fps, whatever flashy effects you implement in your game, you hit diminishing returns very quickly, because everything becomes a blurry mess in motion (this is obviously aggravated by the motion blur they keep thinking is best for us). The difference in image quality / fidelity between static >1080p resolutions and in motion, at 30fps, is incredibly huge for me. This wasn't the case for 720p games.
 
Nov 12, 2017
2,877
If you watch Digital Foundry videos, there are cases where, even with magnifying apps and tools, they fail to tell CB apart from native 4K. They've publicly apologized in some cases.

Proper use of CB saves so much of the power that native takes up that it literally makes native 4K a waste of resources. Frame rates, fidelity, IQ, all can be increased with the resource savings.

It's honestly a no-brainer if you want the best image quality.
In lots of other games the implementation was horrible, creating blurry images. So whether it's a no-brainer depends on who's behind the game.
 

Deleted member 12635

User requested account closure
Banned
Oct 27, 2017
6,198
Germany
Add me to the CB/reconstruction camp...Some results on Pro titles have been stellar, so doing the same on a 12tf machine means they could throw the kitchen sink at the IQ dept and STILL have a monstrous budget to laivish each scene with as much detail and complexity as you could shake a stick at.
 

illamap

Banned
Oct 28, 2017
466
Let's hope so. But the most important thing for me is >60fps. At some point with 4K 30fps, whatever flashy effects you implement in your game, you hit diminishing returns very quickly, because everything becomes a blurry mess in motion (this is obviously aggravated by the motion blur they keep thinking is best for us). The difference in image quality / fidelity between static >1080p resolutions and in motion, at 30fps, is incredibly huge for me. This wasn't the case for 720p games.

I think that devs do hit a bit of a brick wall when trying to gain a noticeable graphics difference between 30fps and 60fps games. And as such most top tier devs will gear towards 60 fps titles. As was discussed before on twitter by some of the best rendering programmers, improvements to rendering have been quite incremental, esp. during this generation.

On next-gen, for example, I think small triangles are gonna be quite a big bottleneck when it comes to rendering more complex scenes that also LOD well.
 

Doctor Avatar

Member
Jan 10, 2019
2,599
If you watch Digital Foundry videos, there are cases where, even with magnifying apps and tools, they fail to tell CB apart from native 4K. They've publicly apologized in some cases.

Proper use of CB saves so much of the power that native takes up that it literally makes native 4K a waste of resources. Frame rates, fidelity, IQ, all can be increased with the resource savings.

It's honestly a no-brainer if you want the best image quality.

Yep. But people who fetishise arbitrary stats rather than, you know, how the games *actually* look will obsess over the native 4K emperor's new clothes.

When DF get it wrong, with zoomed screens and specific equipment/software, do you think normal people at normal distances playing a moving game will be able to see any difference? Of course not. Even the people that probably would claim they can won't be able to.

Native 4K vs CB 4K is like lossless audio vs high bitrate MP3. A waste of resources the benefit of which is imperceptible (if done well) to 99% of users.

I've said it multiple times in this thread but I'll say it again. At the end of the next gen when we look back the best looking games will use reconstruction techniques.

Getting a 40%+ boost in GPU power for what is often a completely imperceptible reduction in resolution clarity is an absolute no-brainer. Unless you need to wage a PR war over "native vs non native" as a producer, or fetishise numbers rather than reality as a consumer.
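The raw pixel math behind that trade-off, for the curious: checkerboard shades roughly half the samples of native 4K each frame, and the cost of the reconstruction pass (ignored in this sketch) is part of why the realized gain is quoted as 40%+ rather than the theoretical 100%:

```python
# Shaded-sample counts for native 4K vs checkerboarded 4K.
# Assumption: CB shades exactly half the samples per frame, before
# counting the cost of the reconstruction pass itself.

native_4k = 3840 * 2160           # 8,294,400 shaded pixels per frame
cb_4k = native_4k // 2            # 4,147,200 shaded samples per frame
savings = 1 - cb_4k / native_4k   # 0.5 -> half the shading work saved
boost = native_4k / cb_4k - 1     # up to 1.0x extra per-sample budget
print(f"CB 4K shades {savings:.0%} fewer pixels, up to {boost:.0%} more budget per sample")
```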
 
Feb 10, 2018
17,534
sure. i dont like the Radeon VII comparisons because ive no idea wtf they were doing with that card, but they have said Navi GPUs will have twice the capacity, use 2x less power, and deliver 1.35x performance, which im assuming are the clock speed gains you usually get from going to a smaller node.



A few months ago, i looked at the clock speed gains for the Pro and X1X when they went from 28nm to 16nm, and they were roughly 15% and 35% (with vapor chamber cooling) respectively. And the jump from 28 to 16 is less than the jump from 16nm to 7nm. Since they can't just double the 36 and 40 CU counts of the Pro and X1X due to the 64 CU GCN limit, i think they will top out at 60-64 CUs and use the rest of the die space to add RT cores.

60CU at 1.85Ghz=14.2 tflops.

Sony seems to have settled on 56 CUs at 1.85Ghz if the latest rumors are true. it will give them even more space to add RT cores. I am no hardware engineer and no doubt completely out of my element here lol, but the math just adds up:

- 16nm to 7nm = 2.3x reduction in size.
- 40 * 2.3 = 92 CUs for the same die size as the one they had in the Xbox One X with vapor chamber cooling
- Can't exceed the 64 CU limit due to poor performance gains, architectural issues etc.
- The space that could fit 28-32 CUs could be used for RT cores.
- The Nvidia RTX cards are fabricated on a 12nm process, consume only 200-250W and have dedicated RT cores. not entirely shocking to believe that AMD can do the same on a 7nm process. People have no faith in AMD lol

Lastly, if AMD releases the Navi GPUs without ray tracing cores on July 7th of this year, then we are all fucked.
The thing is, the Radeon VII is not 1.8x smaller than Vega 64. So who knows how reliable AMD's claim is that Navi will be 2x smaller. Navi is still GCN, so its CUs can't be that much smaller than Vega's CUs.
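For what it's worth, the teraflop figures in the quoted post check out against the standard GCN throughput formula (FP32 FLOPS = CUs × 64 shaders per CU × 2 ops per clock × clock speed); a quick sketch, with the Xbox One X's known 40 CU / 1172MHz configuration as a reference point:

```python
# Standard GCN FP32 throughput: CUs x 64 shaders x 2 ops/clock (FMA) x clock.

def gcn_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz * 1e9 / 1e12

print(gcn_tflops(60, 1.85))   # 14.208 TF -- the "60CU at 1.85Ghz" figure
print(gcn_tflops(56, 1.85))   # ~13.26 TF -- the rumored 56 CU config
print(gcn_tflops(40, 1.172))  # ~6.0 TF -- Xbox One X, as a sanity check
```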
 

Andromeda

Member
Oct 27, 2017
4,846
I think that is often more to do with the AA solution that is used. Games using CBR that also use TAA tend not to mix too well.
I think it's only a few games that have failed at doing it properly recently: mainly RDR2 and I think a few Japanese games. Most games doing CB on Pro actually do it rather well all things considered (the Tomb Raider games, The Witcher 3, Anthem, The Division, some COD games, BF1 & 5); I am talking about multiplats obviously.

I think RDR2 is the only recent AAA game doing it really badly, because of its post AA solution. But DF were impressed with the CB solution used in Anthem on Pro. At first they even thought it was running at the same native res as the XBX version, while the game was running better on Pro.
 
Feb 10, 2018
17,534
Add me to the CB/reconstruction camp... Some results on Pro titles have been stellar, so doing the same on a 12tf machine means they could throw the kitchen sink at the IQ dept and STILL have a monstrous budget to lavish each scene with as much detail and complexity as you could shake a stick at.

It will work if they explain it correctly.

Cerny on stage:
We believe for a new generation the increase in visuals has to be vast. In order to achieve this goal we have refined our 4K reconstruction techniques, which enables developers to use the resources saved by 4K reconstruction to make their games more rich and detailed while still maintaining excellent clarity on 4K displays.
And maybe have slides of games made with 4K vs 4KCB, showing the 4KCB ones have more effects + polycount.
 

modiz

Member
Oct 8, 2018
17,844
It will work if they explain it correctly.

Cerny on stage:
We believe for a new generation the increase in visuals has to be vast. In order to achieve this goal we have refined our 4K reconstruction techniques, which enables developers to use the resources saved by 4K reconstruction to make their games more rich and detailed while still maintaining excellent clarity on 4K displays.
And maybe have slides of games made with 4K vs 4KCB, showing the 4KCB ones have more effects + polycount.
there is no need to explain that. Sony will obviously give developers the option to do native or checkerboard or whatever other technique or resolution they want (heck, some devs might even push beyond 4K because the PS5 can support up to 8K).
 
Feb 10, 2018
17,534
there is no need to explain that. sony will obviously give the option to developers to either do native or checkerboard or whatever other technique or resolution they want (heck some devs might even push beyond 4k because the PS5 can support up to 8k).

Their messaging and marketing has to be consistent.
If they have "4K" on adverts and packaging but their 1st party games are 4KCB, they will probably receive some negative PR because of it.
 