
When will the first 'next gen' console be revealed?

  • First half of 2019

    Votes: 593 15.6%
  • Second half of 2019 (let's say post-E3)

    Votes: 1,361 35.9%
  • First half of 2020

    Votes: 1,675 44.2%
  • 2021 :^)

    Votes: 161 4.2%

  • Total voters
    3,790
  • Poll closed.
Status
Not open for further replies.
Feb 10, 2018
17,534
If people are seriously expecting disc BC from Sony for anything but PS4 games, then they are in for a disappointment. There is no way Sony would just make that happen. They have no reason other than goodwill. They can sell that stuff on PSN if they need to or ignore it completely. The only BC we are going to get is PS4, because they want to build the foundation for a PlayStation ecosystem and platform.

I would love to pop Silent Hill 2 into my PS5 and I would pay up to a thousand bucks to do that, but it's not gonna happen. The Sony of the PS4 era has shown that they don't care about BC for older games. Just look at the pathetic collection of PS2 games and the lack of PS1 games sold through the store. They will have PS4 BC to lock this gen's consumers down, but there is no benefit in having teams work on emulators for all the other systems since they don't expect customers from that. Also, isn't licensing those old games a real bitch, like Square expecting their games to be more expensive than the rest?

Here is my prediction:

PS5 announced, Cerny talks about generations and drops the bomb: you can play all your PS4 games on PS5, they run (marginally) better, and there will be PS5 patches for the final bunch of PS4 exclusives. Shows some screens or even videos of PS5 enhancements and yadda yadda yadda. No word of last-last-gen games until the journalists pester them. They will give a bullshit excuse like: "We are always listening to feedback and working towards establishing a PlayStation experience that suits all of our fans."

Cue the Microsoft conference. In an attempt to mock Sony, MS releases a video akin to the old "Playstation Used Games Instruction" video, called "This is how you play ALL your old games on Xbox" --> video of Phil popping Halo 1 into the Xbox Infinity.

I don't think Xbox under Phil will mock Sony. Phil has had a friendly attitude towards Sony and Nintendo this gen; he's even mentioned them several times in conferences and interviews.
Also, MS have had opportunities to mock Sony this gen with BC and cross-play, but they didn't.
The strategy at MS seems to be inclusive and non-judgmental.
I don't know if Sony are still in the more cutthroat, "we have to beat the competition" PR stance.
 
Jan 21, 2019
2,902
I don't think Xbox under Phil will mock Sony. Phil has had a friendly attitude towards Sony and Nintendo this gen; he's even mentioned them several times in conferences and interviews.
Also, MS have had opportunities to mock Sony this gen with BC and cross-play, but they didn't.
The strategy at MS seems to be inclusive and non-judgmental.
I don't know if Sony are still in the more cutthroat, "we have to beat the competition" PR stance.

I meant the mock video to be in good spirits, not to spite the competition and its fans, but I guess internet culture would only see vitriol and riot.

So I have a bit of a crazy idea...

We all know that interpolation between 30 to 60fps adds lag.

What if a game engine (the logic on the CPU) updates at 60fps, but the GPU only renders every other frame and interpolates the rest? That would cut down on the lag between action and effect.
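A minimal sketch of that decoupling, with illustrative names and modeled (not real) timing: the simulation ticks at 60Hz, the GPU fully renders only every other tick, and the in-between displayed frame is interpolated from the two real frames around it, which is why the display runs one real frame behind.

```python
# Sketch of a decoupled loop: simulation at 60Hz, real renders at 30Hz,
# with every other displayed frame interpolated between two real renders.
# All names are illustrative; "state" is a stand-in for real game state.

SIM_HZ = 60
RENDER_DIV = 2           # render every 2nd sim tick -> 30fps of real frames

def run(ticks):
    real_frames = []     # sim states that got a full GPU render
    displayed = []       # what the player actually sees, in order
    for t in range(ticks):
        state = t        # stand-in for the simulation state at tick t
        if t % RENDER_DIV == 0:
            real_frames.append(state)
            if len(real_frames) >= 2:
                a, b = real_frames[-2], real_frames[-1]
                # Interpolated frame is shown first, then the newer real frame:
                # the display is always one real frame behind the simulation.
                displayed.append(("interp", (a + b) / 2))
                displayed.append(("real", b))
    return displayed

frames = run(8)
```

Here the "interpolation" is just a midpoint average of two stand-in states; a real implementation would use motion vectors, but the frame scheduling (needing a future real frame before you can insert the in-between one) is the point.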

Input lag is interesting to me, because even between games with the same frame rate, say 60, it can vary significantly:

Call of Duty: World at War - 72.5ms (60fps)
Call of Duty: Modern Warfare 2 - 77.5ms (60fps)
Battlefield 3 - 157.2ms (30fps)
Battlefield 4 - 97.6ms (30fps)
Call of Duty: Infinite Warfare - 39.3ms (60fps)
Call of Duty: Modern Warfare Remastered - 40.3ms (60fps)
Battlefield 1 - 56.1ms (60fps)
Halo 5 - 63.0ms (60fps)
Battlefield 4 - 63.7ms (60fps)
Titanfall 2 - 71.8ms (60fps)
Overwatch - 76.8ms (60fps)
Doom 2016 - 86.8ms (60fps)
Killzone Shadow Fall Multiplayer - 89.8ms (60fps)
Killzone Shadow Fall Single-Player - 110.0ms (30fps)

Now, if you coded a game with an engine that runs at 60fps, like COD: Infinite Warfare in this example with 40ms input lag, but rendered at 30fps and interpolated the image to 60fps, your total input lag would actually be better than Doom's true-60fps rendering lag!

You would be one 30fps frame behind, which is 33ms, as the interpolation requires a future 30fps frame with which to calculate the intervening 60fps frame it's inserting. Which means if you add the engine latency (40ms, likely less because that figure actually includes that game's rendering latency) and the rendering latency of being a frame behind to interpolate (33ms), your total latency would be <73ms - so an interpolated 60fps COD would still be more responsive than the "true 60fps" Doom.
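The arithmetic can be checked directly; the figures below come from the lag list above, with one full 30fps frame of buffering for the future frame (the post rounds 33.3ms down to 33ms):

```python
# Checking the post's latency arithmetic (figures from the lag list above).
FRAME_30FPS_MS = 1000 / 30      # one 30fps frame held back for interpolation, ~33.3ms

cod_engine_lag = 40.0           # COD: Infinite Warfare measured input lag at 60fps (rounded up from 39.3)
doom_true_60_lag = 86.8         # Doom 2016 measured input lag at true 60fps

interpolated_total = cod_engine_lag + FRAME_30FPS_MS
print(round(interpolated_total, 1))   # ~73.3ms, comfortably under Doom's 86.8ms
```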

By not tying your logic update to your graphical update you can reduce the downside of interpolating significantly.

Could this be a technique used by next-generation consoles, with fast silicon-based interpolation, to reduce the GPU render burden of 60fps? By decoupling the rendering refresh rate from the logic refresh rate and interpolating, you get a happy medium between 30fps and 60fps in regards to latency, and can even beat some current-gen "60fps" experiences in regards to lag. Who knows, better Bluetooth communication between the controllers and the console could also shave some ms off that <73 value.

Not to really anger the "fakery doesn't count" crowd, but you could actually use this technique to improve temporal checkerboarding too, since the temporal checkerboard would have far more frames to work with (currently only previous frames, but with interpolation it would have future frames too), meaning that temporal checkerboarding/injection would be of higher quality and exhibit fewer artefacts. Which would mean you could probably produce a 60fps 4K game by rendering half the frames and half the pixels, and it would actually look pretty decent. Not as good as true 4K60, but considering it would require about 25-30% of the GPU power of true 4K60 (rendering 50% of the frames and 50% of the pixels), it would actually be a pretty amazingly efficient way of making good-looking games. Indeed, if there was a silicon-based solution to this temporal checkerboarding and interpolation, it would cost no more GPU power than making a game at half-4K at 30fps, and it would look far, far sharper and smoother, and actually have reduced latency compared to a 30fps game if the engine is running at 60fps.
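The rough GPU-cost claim works out as stated; a two-line check (illustrative arithmetic only, and the extra few percent in the "25-30%" covers the cost of the reconstruction passes themselves):

```python
# Relative GPU cost of rendering half the frames at half the pixels, vs. true 4K60.
frame_fraction = 0.5   # interpolation: only 30 of the 60 displayed frames are rendered
pixel_fraction = 0.5   # checkerboarding: half the pixels per rendered frame

relative_cost = frame_fraction * pixel_fraction
print(relative_cost)   # 0.25 -> ~25% of true-4K60 GPU work before reconstruction overhead
```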

My Sony X900E is pretty good at interpolating video, and I sometimes turn it on for games just to see the difference between 30 and 60 in games like Horizon. As long as the movement is not erratic and the camera pans slowly there is no problem, but normal gameplay introduces many artifacts. Maybe you know more about interpolation than I do, but how do you propose to fight the artifacts that happen during normal gameplay? My understanding is that the interpolation looks at where the pixels travel between frames and invents new approximations between those frames to make it smoother. When I watch a movie, 90% is artifact-free (I just watched Creed with full-on interpolation and it ran beautifully), but games are interactive and things can change on a whim.
 
Last edited:
Jan 17, 2019
964
I don't think Xbox under Phil will mock Sony. Phil has had a friendly attitude towards Sony and Nintendo this gen; he's even mentioned them several times in conferences and interviews.
Also, MS have had opportunities to mock Sony this gen with BC and cross-play, but they didn't.
The strategy at MS seems to be inclusive and non-judgmental.
I don't know if Sony are still in the more cutthroat, "we have to beat the competition" PR stance.

They did. Ybarra and XboxUK had tweeted something about BC and X-play before.
 

Shoshi

Banned
Jan 9, 2018
1,661
Maybe the wrong thread, but is there any chance we will get a new SKU of the PS4 Pro before summer? With 7nm, or at least quieter than the current one (I know the RDR2 bundle is already quite quiet).
 

ImGumbyDammit

Banned
Nov 25, 2018
133
Sony has one more important move: they can move the PS4 / PS4 Pro to 7nm. This would significantly alter the hardware cost structure. They should be able to permanently reduce costs by an additional $50 after the redesigns are released. Generally, node improvements not only lead to lower APU costs, but also reduced cooling and other costs. If they launch it by Fall, they could potentially stabilize Q4 (Q3 FY) sales and slow down the cyclical drops. This generation is already quite expensive going by ASP this late in the cycle.

I think you are oversimplifying such a process and any benefits it could ever bring. You know when you have a car you have owned for a while and it starts to break down more often, but you keep putting money into it with less and less return on that investment? It gets to the point where you are spending more on repairs than the car is worth. Well, Sony is at that point where they need a new car, because replacing the engine or transmission will cost more than their car is worth.
 

Lagspike_exe

Banned
Dec 15, 2017
1,974
I think you are oversimplifying such a process and any benefits it could ever bring. You know when you have a car you have owned for a while and it starts to break down more often, but you keep putting money into it with less and less return on that investment? It gets to the point where you are spending more on repairs than the car is worth. Well, Sony is at that point where they need a new car, because replacing the engine or transmission will cost more than their car is worth.

What? If anything, Sony has always looked to maximize efficiency gains through numerous console iterations linked with node advancement. It's not meant to replace a new generation, it's designed to give more pricing flexibility.
 

Albert Penello

Verified
Nov 2, 2017
320
Redmond, WA
Ok imagine this for the PS5:

It is announced at E3. It launches in March 2020.

It offers:
- Backwards compatibility with ALL Playstation discs, from any generation (PSX, PS2, PS3, PS4).
-It offers backwards compatibility with ALL Digital Playstation games. These are linked by PSN account and would all be playable on one system.

Retails for $499.

Would you bite?

What's the status on the PS2 and PS3 emulation scene? Honest question. PS1 is pretty mature AFAIK (Sony using it on PS Classic). PS4 you can see assuming they stick with AMD. PS1 should be easy but with that library that is a LOT of testing (and licensing nightmare). PS2 and PS3 would have to be SW, and considering the sizes of those libraries that would be a development and testing undertaking of insane proportions. Large enough that I would have to believe we'd have heard something by now.

So I'm betting PS4 B/C is a slam dunk. PS1 is a stretch but do-able. PS2/3 would definitely be a mic drop.
 

Lagspike_exe

Banned
Dec 15, 2017
1,974
Is 7nm for Radeon/Jaguar on AMD's roadmap? I don't think it is. Colbert?

I don't remember AMD talking a lot about the 16nm port either. 16nm Jaguar was only made for PlayStation and Xbox.

If Sony can launch this Fall 2019, they can capture ~8M sales this year and let's say ~10M in 2020 and ~7M in 2021. This is 25M in sales where costs can be reduced significantly. And I think these are somewhat conservative figures, and then there is limited sales potential post 2021.
 

Bloodcore

Member
Mar 24, 2018
137
What? If anything, Sony has always looked to maximize efficiency gains through numerous console iterations linked with node advancement. It's not meant to replace a new generation, it's designed to give more pricing flexibility.
Do not forget that porting a design to a smaller node is very expensive. It made sense when they went from 28nm to 16nm, since it went into other products as well as two consoles.

Doing it again to just give some pricing flexibility for Sony/Microsoft simply won't happen.
It would be cheaper to go for a slower and smaller Zen2/Navi APU or the defective chips that otherwise would have gone into the next-gen consoles.
It would essentially be a crippled PS5/Next-xbox.
 
Last edited:

Lagspike_exe

Banned
Dec 15, 2017
1,974
Do not forget that porting a design to a smaller node is very expensive. It made sense when they went from 28nm to 16nm, since it went into other products as well as two consoles.

Doing it again to just give some pricing flexibility for Sony/Microsoft simply won't happen.
It would be cheaper to go for a slower and smaller Zen2/Navi APU. It would essentially be a crippled PS5/Next-xbox.

I would argue that this is a simple business decision. There are upfront costs (investment) that causes long term cost reduction (cash flow increase). Whether this decision should be made is purely linked with the investment that's needed, per unit cost reduction and just how much Sony projects they'll sell PS4 in the future.
 

Albert Penello

Verified
Nov 2, 2017
320
Redmond, WA
I don't remember AMD talking a lot about the 16nm port either. 16nm Jaguar was only made for PlayStation and Xbox.

If Sony can launch this Fall 2019, they can capture ~8M sales this year and let's say ~10M in 2020 and ~7M in 2021. This is 25M in sales where costs can be reduced significantly. And I think these are somewhat conservative figures, and then there is limited sales potential post 2021.

Your logic is sound. I would point out that the costs of AMD porting Jaguar to 16nm were spread across 2 corporations and 4 consoles. So unless both MS and Sony are doing it, that raises the costs for just one of them. And if they are only doing it for the PS4 Pro, and not for the base model, that raises the costs even further. Your point is valid, and I agree with your forecasting. But the costs would not pan out unless it was being used in more than one console.
 

Dant21

Member
Apr 24, 2018
842
What's the status on the PS2 and PS3 emulation scene? Honest question. PS1 is pretty mature AFAIK (Sony using it on PS Classic). PS4 you can see assuming they stick with AMD. PS1 should be easy but with that library that is a LOT of testing (and licensing nightmare). PS2 and PS3 would have to be SW, and considering the sizes of those libraries that would be a development and testing undertaking of insane proportions. Large enough that I would have to believe we'd have heard something by now.

So I'm betting PS4 B/C is a slam dunk. PS1 is a stretch but do-able. PS2/3 would definitely be a mic drop.
PS2 emulation is very mature, though not quite to the point of GameCube/Wii emulation, for comparison. PS3 emulation is in a fairly early state, but it has made massive advancements in the past two years by having some people working on it full-time and taking full advantage of the vectorization in modern CPUs (mainly for dynamic recompilation). About 40% of the library is playable in RPCS3.

FWIW, Sony must already have a competent PS2 emulator as they've had fully software-emulated "PS2 Classics" on their storefront for both the PS3 and PS4 now. PS3 really is its own beast though.
 

anexanhume

Member
Oct 25, 2017
12,912
Maryland
Is 7nm for Radeon/Jaguar on AMD's roadmap? I don't think it is. Colbert?

AMD has stated that the cost per yielded 250mm^2 die is double on 7nm what it was on 16nm. So, unless they can halve the die size, they won't even break even.

However, a 7nm shrink where they halve the bus width to 128-bit and go to GDDR6 memory could be very interesting as a cost and power saver.
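The break-even point can be made concrete with a quick sketch. Only the 2x cost-per-area figure comes from the post; the die area and scaling factor below are made-up illustrative numbers, not real PS4 Pro values.

```python
# Break-even check for a 16nm -> 7nm shrink given "2x cost per yielded mm^2".
old_area_mm2 = 320.0      # hypothetical 16nm APU die area
area_scaling = 0.4        # hypothetical 7nm area scaling (a 60% smaller die)
cost_per_mm2_ratio = 2.0  # AMD figure: 7nm costs double per yielded mm^2 vs 16nm

relative_die_cost = (old_area_mm2 * area_scaling * cost_per_mm2_ratio) / old_area_mm2
print(relative_die_cost)  # 0.8 -> only ~20% cheaper per die despite the much smaller chip
```

Even a generous shrink leaves a thin saving per die, which is why the thread keeps coming back to whether remaining unit volume can amortize the porting cost.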
What's the status on the PS2 and PS3 emulation scene? Honest question. PS1 is pretty mature AFAIK (Sony using it on PS Classic). PS4 you can see assuming they stick with AMD. PS1 should be easy but with that library that is a LOT of testing (and licensing nightmare). PS2 and PS3 would have to be SW, and considering the sizes of those libraries that would be a development and testing undertaking of insane proportions. Large enough that I would have to believe we'd have heard something by now.

So I'm betting PS4 B/C is a slam dunk. PS1 is a stretch but do-able. PS2/3 would definitely be a mic drop.

RPCS3 has enough Patreon supporters to support a full time developer, and they claim over 33% of the library is playable. Given Sony's use of open source PS1 emu in PS classic, perhaps they could also leverage that for PS5. It would allow them to get rid of their cell and RSX server racks for PSNow.
Your logic is sound. I would point out that the costs of AMD porting Jaguar to 16nm was spread across 2 corporations and 4 consoles. So unless both MS and Sony are doing it, that raises the costs for just one of them. And if they are only doing it for PS4 Pro, and not for the base, that raises the costs even further. Your point is valid, and I agree on your forecasting. But the costs would not pan out unless it was being used in more than one console.

This is a good point. Optimistically, Sony could sell another 20M PS4s with this chip. That might be enough, but they'll have lowered the MSRP a lot by then. I'm sure they make a handsome margin now at $300, but that will go away with price drops.
 

eathdemon

Member
Oct 27, 2017
9,626
What would stop them from using an underclocked Zen? Are devs really coding to timings/frequencies in 2019?
 

AudiophileRS

Member
Apr 14, 2018
378
Slightly late to the subject here, but regarding reconstruction..

I much prefer Insomniac's Temporal Injection to any checkerboard solution I've seen, though I'd put Horizon's 2160 CB very close.

I believe Spider-Man used a 1440p-1600p dynamic resolution with TI on top, within a full 4K framebuffer, to provide an excellent, anti-aliased image that was only a touch soft but very clean.

I'd like to see, next-gen, a dynamic res in the 1700p-2160p range, so they're aiming for native 4K in ideal situations but then drop resolution rather than frames and fill in the gaps with TI.
 

Albert Penello

Verified
Nov 2, 2017
320
Redmond, WA
What would stop them from using an underclocked Zen? Are devs really coding to timings/frequencies in 2019?

You mean for a Pro cost reduction or am I misunderstanding? What would stop them is that at that point, it's literally a different console. It would require a wholesale system redesign on top of creating a brand new SOC. An 'underclocked' Zen still costs you the same $$/chip as one running at the intended speed. Unless you are meaning to create a specific low-powered Zen chip that only runs at the speed necessary to run Pro games. But now you're talking about a second SOC, and if you were going to do that, you'd be better off doing a two-tier strategy as Xbox is rumored to be doing and just create two Gen9 consoles.

It would be barely feasible to justify 7nm on the existing platform if it was only being used by the Pro. It would be much less feasible to create a new SOC that runs 1:1 speed for the PRO, when for the exact same development and parts cost you could create one that was better and just replace the console at the same pricepoint.

The simplest case at this point, IMO, is just to lose a little more money per console if you want to bring the price down, even with 25M+ units left.
 

AudiophileRS

Member
Apr 14, 2018
378
I wonder if it'd be best for Sony to wait for 7nm EUV to shrink the PS4?

And if they need to reinvigorate sales a little in the meantime, perhaps do a small price drop now and another small price drop for the shrink rather than one big drop now or then.
 

eathdemon

Member
Oct 27, 2017
9,626
For anyone that has been following the space, what changed with EUV? Last I heard, they were having issues making photomasks that could withstand it?
 

anexanhume

Member
Oct 25, 2017
12,912
Maryland
I wonder if it'd be best for Sony to wait for 7nm EUV to shrink the PS4?

And if they need to reinvigorate sales a little in the meantime, perhaps do a small price drop now and another small price drop for the shrink rather than one big drop now or then.

7nm+ gains are very small and the cost benefit isn't realized yet.
For anyone that has been following the space, what changed with EUV? Last I heard, they were having issues making photomasks that could withstand it?

Foundries are currently working on getting the machines up to power levels that will support volume throughput.

The issue you're talking about is pellicle lifetime, which is an ongoing effort. Pellicles are not required for EUV, but they are a tremendous help.

https://semiengineering.com/euv-pellicle-uptime-and-resist-issues-continue/
 

Lady Gaia

Member
Oct 27, 2017
2,477
Seattle
What would stop them from using an underclocked Zen? Are devs really coding to timings/frequencies in 2019?

Not intentionally, but you seriously underestimate the chance of finding bugs that are timing sensitive but have never been a problem before with a fixed architecture. Why test your code running under circumstances that have never existed? And if you did find the issue, would you prioritize fixing it over problems that do affect your customers?
 

BreakAtmo

Member
Nov 12, 2017
12,824
Australia
AMD has stated that the cost per yielded 250mm^2 die is double on 7nm what it was on 16nm. So, unless they can halve the die size, they won't even break even.

However, a 7nm shrink where they halve the bus width to 128-bit and go to GDDR6 memory could be very interesting as a cost and power saver.

Hopefully that RAM change is where it becomes worth it. It's a shame, though, that they can't easily just make a new PS4 out of a much smaller Zen2/Navi APU.
 

Doctor Avatar

Member
Jan 10, 2019
2,591
My Sony X900E is pretty good at interpolating video, and I sometimes turn it on for games just to see the difference between 30 and 60 in games like Horizon. As long as the movement is not erratic and the camera pans slowly there is no problem, but normal gameplay introduces many artifacts. Maybe you know more about interpolation than I do, but how do you propose to fight the artifacts that happen during normal gameplay? My understanding is that the interpolation looks at where the pixels travel between frames and invents new approximations between those frames to make it smoother. When I watch a movie, 90% is artifact-free (I just watched Creed with full-on interpolation and it ran beautifully), but games are interactive and things can change on a whim.

It will probably vary between the algorithm and solution used. If a next gen PS5 or Xbox Two used a solution designed to run on silicon by Sony/MS with AMD's help I imagine it would be a pretty high quality one.

https://youtu.be/ONmJjuxEGZ0

This is Horizon running 4K60 - with checkerboarding and interpolation. As far as artefacting goes, I don't see a huge issue there - it looks plenty smooth and crisp to me. And remember, with interpolation and temporal checkerboarding done together it will actually look better than this, because the temporal checkerboard will have a future frame to compare pixels to as well. You could even combine the tech with temporal injection to further increase quality. This means a game that looks as crisp and smooth as that Horizon video would cost ~30% of the GPU power it would if you weren't "cheating". Cutting your rendering cost to a third is easily worth any minor artefacts those techniques would introduce - that video looks great.

The major problem with interpolation is lag. But as I said, my proposed solution of updating the game simulation at 60 on the CPU side would give lag easily equivalent to a current 30fps game, and it would be possible to beat the lag of many current-gen and previous-gen 60fps games (like Doom at 86ms or Smash Ultimate at 96-112ms) with it too.
 
Last edited:
Oct 26, 2017
6,151
United Kingdom
AMD has stated that the cost per yielded 250mm^2 die is double on 7nm what it was on 16nm. So, unless they can halve the die size, they won't even break even.

However, a 7nm shrink where they halve the bus width to 128-bit and go to GDDR6 memory could be very interesting as a cost and power saver.

Your quoted AMD 7nm costs are a single snapshot in time and won't stay that way. As more of the industry transitions over to 7nm and volumes increase on the whole, and as the 7nm process matures, costs per yielded square mm will come down.

Equally, as volumes move away from 16nm production and more and more customers embrace 7nm, 16nm costs will increase. So in my mind a 7nm PS4 is a question of either when or never, with the latter implying a decision on when to stop production of PS4 APUs entirely, because the cost equation on both 16nm and 7nm can no longer be justified.
 

Cthulhu_Steev

Member
Oct 27, 2017
2,379
If people are seriously expecting disc BC from Sony for anything but PS4 games, then they are in for a disappointment. There is no way Sony would just make that happen. They have no reason other than goodwill. They can sell that stuff on PSN if they need to or ignore it completely. The only BC we are going to get is PS4, because they want to build the foundation for a PlayStation ecosystem and platform.

I would love to pop Silent Hill 2 into my PS5 and I would pay up to a thousand bucks to do that, but it's not gonna happen. The Sony of the PS4 era has shown that they don't care about BC for older games. Just look at the pathetic collection of PS2 games and the lack of PS1 games sold through the store. They will have PS4 BC to lock this gen's consumers down, but there is no benefit in having teams work on emulators for all the other systems since they don't expect customers from that. Also, isn't licensing those old games a real bitch, like Square expecting their games to be more expensive than the rest?

Here is my prediction:

PS5 announced, Cerny talks about generations and drops the bomb: you can play all your PS4 games on PS5, they run (marginally) better, and there will be PS5 patches for the final bunch of PS4 exclusives. Shows some screens or even videos of PS5 enhancements and yadda yadda yadda. No word of last-last-gen games until the journalists pester them. They will give a bullshit excuse like: "We are always listening to feedback and working towards establishing a PlayStation experience that suits all of our fans."

Cue the Microsoft conference. In an attempt to mock Sony, MS releases a video akin to the old "Playstation Used Games Instruction" video, called "This is how you play ALL your old games on Xbox" --> video of Phil popping Halo 1 into the Xbox Infinity.

I'll send Phil my copies of Metal Arms and JSRF to see if he can get them working for me.
 
Jan 21, 2019
2,902
It will probably vary between the algorithm and solution used. If a next gen PS5 or Xbox Two used a solution designed to run on silicon by Sony/MS with AMD's help I imagine it would be a pretty high quality one.

https://youtu.be/ONmJjuxEGZ0

This is Horizon running 4K60 - with checkerboarding and interpolation. As far as artefacting goes, I don't see a huge issue there. And remember, with interpolation and temporal checkerboarding done together it will actually look better than this, because the temporal checkerboard will have a future frame to compare pixels to as well. You could even combine the tech with temporal injection to further increase quality.

The major problem with interpolation is lag. But as I said, my proposed solution of updating the game simulation at 60 on the CPU side would give lag easily equivalent to a current 30fps game, and it would be possible to beat the lag of many current-gen and previous-gen 60fps games (like Doom at 86ms or Smash Ultimate at 96-112ms) with it too.

That sounds interesting and maybe even viable for PSVR, as it needs higher framerates to begin with. Yeah, I dig it; I would love some interpolated game or at least an attempt in the vein of it. However, this sounds too good to be true and I doubt we get something like that. I love interpolation, and I considered buying a Philips OLED because their algorithm runs with 55ms of lag, compared to other brands that run at over 130ms with interpolation.
 

Doctor Avatar

Member
Jan 10, 2019
2,591
That sounds interesting and maybe even viable for PSVR, as it needs higher framerates to begin with. Yeah, I dig it; I would love some interpolated game or at least an attempt in the vein of it. However, this sounds too good to be true and I doubt we get something like that. I love interpolation, and I considered buying a Philips OLED because their algorithm runs with 55ms of lag, compared to other brands that run at over 130ms with interpolation.

Using my solution, the latency cost of interpolation would only be 33ms; however, it's important to note that it would actually add no latency compared to the same game running at native 30 without the engine running at 60.

So basically: run the game at 30fps (simulation and rendering) with ~100ms+ of input latency (I can't find examples of 30fps games with better than 100ms latency). Or use my solution (simulation at 60, rendering at 30 with interpolation to 60) and have the smoothness of 60fps and as low as 73ms latency (as in the case of my interpolated COD) for the same GPU cost. So you could technically have a game with reduced latency compared to the 30fps equivalent while rendering at 60fps for the same GPU rendering cost (if interpolation is in silicon, which it would be). As I said, it would be possible to have less latency than many current 60fps games like Smash and Doom!
 

Nachtmaer

Member
Oct 27, 2017
346
I'd honestly be surprised if any of the current gen consoles get a 7nm revision. You'll be spending hundreds of millions of dollars just to get a cheaper SoC way down the line. anexanhume has a point with cutting the memory controller down and using GDDR6 instead, but currently that's also more expensive $/GB. So yeah, at some point you'll get a cheaper box, but are there enough sales left in the tank to make up for all that? I genuinely don't know.
 

VX1

Member
Oct 28, 2017
7,000
Europe
I'd honestly be surprised if any of the current gen consoles get a 7nm revision. You'll be spending hundreds of millions of dollars just to get a cheaper SoC way down the line. anexanhume has a point with cutting the memory controller down and using GDDR6 instead, but currently that's also more expensive $/GB. So yeah, at some point you'll get a cheaper box, but are there enough sales left in the tank to make up for all that? I genuinely don't know.

Yeah, will be interesting to see what happens. I still think we'll get a super-slim 7nm $199 PS4 at some point, as a cheap Fortnite/PUBG/whatever box kids can play on for 3-4 more years.
 

MrKlaw

Member
Oct 25, 2017
33,038
If devkits are in the wild now, as rumored with PS5, and after GDC that's rumored to happen with Xbox, then it's highly unlikely any of these consoles will arrive in 2021; that's just too far out for devkits to be out there already. Devkits should be a fairly close approximation of the final design - if 2021, that design could change dramatically over two and a half years and devkits could end up drastically different (not ideal if you want developers working on making games). That is why they are released at a point when makers have all major things locked down with regard to specs and design. Things can change (at least until tape-out, or easy upgrades like memory) but not dramatically.

People are more correct going with either Spring 2020 (less likely than Fall, but a lot more likely than any 2021 scenario) or Fall 2020 as the easy bets now. Since I'm relying on the devkits being released, all my assumptions would change if there are no devkits this year.

If Sony kits have been out since last year and MS is going out around GDC, then Sony is more likely technically ready for a Q1 2020 launch than MS. But it's possible they choose to wait it out if they think it makes more business sense.

One worry: if they originally planned for 2019 and then postponed, how up to date will the PS5 be? Is there a chance it has more conservative specs if it was drafted earlier?
 

Screen Looker

Member
Nov 17, 2018
1,963
Using my solution, the latency cost of interpolation would only be 33ms; however, it's important to note that it would actually add no latency compared to the same game running at native 30 without the engine running at 60.

So basically: run the game at 30fps (simulation and rendering) with ~100ms+ of input latency (I can't find examples of 30fps games with better than 100ms latency). Or use my solution (simulation at 60, rendering at 30 with interpolation to 60) and have the smoothness of 60fps and as low as 73ms latency (as in the case of my interpolated COD) for the same GPU cost. So you could technically have a game with reduced latency compared to the 30fps equivalent while rendering at 60fps for the same GPU rendering cost (if interpolation is in silicon, which it would be). As I said, it would be possible to have less latency than many current 60fps games like Smash and Doom!

I honestly wouldn't care about 60 FPS if every 30 FPS game ran like Destiny does.
 

RevengeTaken

Banned
Aug 12, 2018
1,711
I can't believe you're paying attention to someone who said this lol

'No way' PS4 has a 30% power advantage over Xbox One
 