Vapor chamber...
So how will you explain the XB1X fitting a 185W TBP equivalent GPU inside it?
Witchcraft?
I've been on the fence about switching to PC for a long time. If the consoles are less than 10TF, I think that will be what pushes me over the edge.
I would characterize it as a refinement of the design philosophy of the PS4 and a direct response to its bottlenecks and shortcomings. It remains to be seen which will be the more defining shift, but I think the PS3 to PS4 improvement will be hard to top.
So... A custom-Ryzen 7 3700x CPU and a custom-Navi 10 GPU (with ray-tracing!) for the PS5 is the likeliest scenario?
Combined with a superfast SSD and faster RAM, would it be an exaggeration to say that this is one of the bigger leaps we've made? It's a lot bigger leap than PS3 > PS4.
Imagining games looking and playing waaay better than PS4 games is just... Insane to me. We're officially hitting CGI-visuals next-gen! 😊
I can't wait to see Naughty Dog's sci-fi adventure game on PS5... (Savage Starlight, baby!)
Moar speed....MOAR SPEED!!! Jeez, I still can't comprehend the speed of these new SSDs.
For the SSD-centric crowd, PHISON just announced they are in the midst of developing an NVMe Gen4 controller that supports read speeds up to 6,500 MB/s. Planned for release in Q1 2020.
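For context on what transfer rates like that mean in practice, here's a rough back-of-envelope. The RAM size and the slower drive speeds are illustrative assumptions, not confirmed specs:

```python
# Back-of-envelope: how long it takes to fill a console's RAM pool at
# different storage speeds. Uses 1 GB = 1000 MB for simplicity; all
# numbers are hypothetical round figures, not measured hardware specs.

def seconds_to_read(total_gb: float, speed_mb_per_s: float) -> float:
    """Time in seconds to read `total_gb` gigabytes at `speed_mb_per_s`."""
    return total_gb * 1000 / speed_mb_per_s

ram_gb = 16  # assumed next-gen RAM pool
for label, speed in [("last-gen 5400rpm HDD (~100 MB/s)", 100),
                     ("SATA SSD (~550 MB/s)", 550),
                     ("PHISON Gen4 target (6500 MB/s)", 6500)]:
    print(f"{label}: {seconds_to_read(ram_gb, speed):.1f} s")
```

At the claimed Gen4 speed, refilling the whole pool takes a couple of seconds instead of minutes, which is why people expect loading screens to mostly disappear.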
Thanks again, Colbert, for taking the time to answer my questions. Does a chiplet design have any performance advantages or disadvantages over a monolithic design?
Or is it 'just' a different manufacturing process to mix and match different chips?
MS's comments about dual purpose would strongly support this design, I think.
Another question I have: when MS talked about the dual-purpose approach, is it good or bad?
Good would be: they are maximizing performance (= higher cost), which will be offset by Azure.
Bad would be: there will be silicon which has limited impact on game performance.
I would like to see MS create a beast of a system, even as a PS fan, but heat and wattage exist.
If anything, the two-SKU approach means that Xbox fans will definitely get a high-end system.
*trapped in a dark room with Joel, after fighting on and off for hours, they are sweaty, bloody, the world burns outside around them, the finale of the game*
The Last of Them, with you controlling an evolved sentient cordyceps trying to stop humans from killing her own kind.
Even if you are sceptical, the more I think about it, the more it makes sense to me.
It would fit well with what MS said about dual-purpose cloud hardware and their two-SKU approach.
According to AMD's Super-SIMD patent, Navi will contain an L0 cache called the "Destination Operand Cache", which ALUs can export to directly, reducing CU downtime. This would allow AMD to support transcendental instructions such as "sqrt", "log" and "exp". Currently this is an advantage for Nvidia, which has a dedicated Special Function Unit (SFU); AMD must perform such instructions in the normal SIMD setup. This would also allow AMD to embed Traversal Units (similar to RT cores) for ray tracing.
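A rough illustration of why dedicated transcendental hardware matters: without it, a function like exp has to be decomposed into a long chain of ordinary multiply-adds on the regular SIMD lanes. This is only a sketch; the term count is arbitrary, and real GPUs use tuned range reduction and lookup tables rather than a plain Taylor series:

```python
# Approximating exp(x) with a truncated Taylor series. Each loop
# iteration is roughly one multiply-add -- work that every SIMD lane
# must do when there is no special function unit to evaluate exp
# directly. Illustrative only, not how GPU math libraries are built.

def exp_taylor(x: float, terms: int = 12) -> float:
    """Sum the first `terms` terms of the Taylor series for exp(x)."""
    result, term = 1.0, 1.0
    for n in range(1, terms):
        term *= x / n        # one multiply-add per term, per lane
        result += term
    return result

print(exp_taylor(1.0))  # close to e = 2.71828...
```

A dozen dependent multiply-adds per lane versus a single special-function instruction is the kind of gap the patent's design is trying to close.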
What are the pros and cons of using a dedicated/discrete CPU and GPU instead of an APU?
This post alleges RDNA is Super-SIMD
Pros:
Better yield
Potentially lower cost due to using desktop equivalent dies
Potentially higher thermal envelope
Cons:
Higher IO power to move data around
Interposer/package costs rise.
Yep.
- SSD
- Ray tracing (even if it's limited to proper reflections, we will no longer have PuddleGate)
- Massive 5x CPU boost
- 8x GPU Boost (in terms of performance if not flops)
So yes, it's no exaggeration. It will be a massive boost. The 8x boost in GPU power alone can push some extremely photorealistic or CGI-quality visuals, as seen in the latest Unity and Unreal Engine demos. The ray tracing tech will help sell the realism by properly applying reflections and lighting. The CPU will increase the number of NPCs, enable more realistic NPC and enemy behavior, and improve destruction and weather effects at the very least. No more dumb friendly AI sneaking around in The Last of Us. Joel and Ellie teaming up like Captain America and Thor, each reacting immediately to the other's moves.
And finally, the ability to traverse environments faster than ever thanks to the SSD. You want to fly a Stormbird in Horizon? Go right ahead. You want a jet to fly at the speed of sound in GTA? No problem. Travel from Liberty City to Vice City? Why use fast travel when you can literally fly there in a minute? You want a Superman game? A Flash game? Now possible without any restrictions on speed. You can also have multiple characters fighting in different places at once and quickly cut between two encounters like in a Christopher Nolan movie. A few seconds of cutscene transition should be enough to load an entirely different level into RAM. Inception dream-within-a-dream levels in a game.
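The "hide the load behind a cutscene" idea can be sketched in a few lines. Everything here (function names, timings) is hypothetical, standing in for whatever async streaming API a real engine exposes:

```python
# Sketch: kick off a level load on a background thread, play a short
# transition on the main thread, and join -- with a fast enough SSD the
# load finishes before the transition does. Timings are placeholders.
import threading
import time

def load_level(name: str, out: dict) -> None:
    time.sleep(0.1)            # stand-in for streaming assets off the SSD
    out[name] = f"{name} loaded"

def cutscene_transition(next_level: str) -> str:
    loaded: dict = {}
    loader = threading.Thread(target=load_level, args=(next_level, loaded))
    loader.start()             # start the load in the background
    time.sleep(0.2)            # the player watches the transition meanwhile
    loader.join()              # by the time it ends, the level is resident
    return loaded[next_level]

print(cutscene_transition("vice_city"))  # vice_city loaded
```

The design point is simply that the load time has to fit inside the transition time; faster storage shrinks the left side of that inequality until the transition itself is the only limit.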
Oh damn.
So where is 8TF coming from? Endless pessimism and the idea that being cynical makes you cool?
You think the RAM bottleneck from the PS3/X360 era, and the (admittedly insane) x16 jump to 8 GB GDDR5 was/is more significant than everything else I listed? You don't think it's on par with the "8 GB GDDR5" announcement?
Reporter: Can we say anything about Renoir [a rumored APU combining Zen 2 technology and Vega GPU cores]? There have been reports that the project is dead.
Su: That is not true. It is doing well.
Drew Prairie, AMD spokesman: We might want to start with, we don't know what Renoir is.
Source
Impressive bit of back-tracking there.
Renoir was/is scheduled for 2020. Zen 2 + Vega seems like a good Xbox candidate, no?
After all, we're all Vega stans here, aren't we?
I think you're overestimating the impact of an SSD here. Pop-in and traversal speed can be impacted by a slow drive, but the CPU remains the biggest culprit in terms of draw distances and asset loading, because of the heavy load.
As for an 8x GPU boost... That remains to be seen, and that'd be, imo, the most generous estimate. Also keep in mind that in the meantime it'll be about pushing higher resolution: 1080p to 4K is the biggest resolution leap since PS1 to PS2. We're still talking about a Vega 64 equivalent GPU in terms of power.
The PS5 patent shows nearly no CPU workload for SSD reads; read the patent explanation. They explain that the CPU is a major problem for exploiting NAND flash speed, and they imagine a solution where the involvement of the CPU is minimal.
I don't think they will be wasting 4x the GPU resources on rendering 4K natively; they will settle for checkerboarding. But we have been here before.
And that's something that's impossible for Sony to do? Because Sony can't or won't spend like $10-$20 extra on cooling if that will give them like 20% more performance at the start of a new gen?
It's not about the CPU impact from reading from the drive. It's about the computational impact of displaying more elements on screen and also the speed of loading them/hiding them fast enough.
Good to see you too... Things are looking positive for next gen.
This is classic Chris. Love it.
Good to see ya, Chris.
Quite positive.
That could be it. Anaconda could be the first server version of Navi, with notably fp64 and Int8 added. Maybe this is exactly what Arcturus is: the first server RDNA GPU. I still really don't see MS using an obsolete and broken GPU (Vega) to start a new gen.
If anything, if MS added fp64 to Navi, it would be like Sony adding fp16 to Polaris.
It's no biggy
True, but they're also forward-looking to PSVR2 which likely has a launch date offset from the PS5.
And you see them using a server GPU on a console?
Holy shit, one day gone and so many new posts. Is there anything substantially new?
That would occur if and only if there's any credence to that rumor that Sony helped develop Navi in exchange for exclusivity in the console space. I doubt it though.
I can't wait for this Vega theory to die. Why would MS use Vega when they know AMD's road map and know that Navi will be available?
I'm partly with you, I don't think MS is using 'Navi' either.....
I also think that both Sony and MS will be using different products with the RDNA architecture tho... Su explicitly said that RDNA is their 'next-gen architecture', and I highly doubt MS will use Vega.
I predict that while Sony is using Navi, a product focused on gaming, MS will use RDNA equivalent of Vega, a product that will be focused on Database usage, and has not been announced by AMD yet..... actually they may have hinted at it with their next gen Instinct GPU line. 🤔
Did you not read what I said? RDNA is the architecture within Navi. Navi is just the name of the gaming GPU product line. MS can still use the architecture without naming Navi as the product that is in their console.
Also, Spencer himself said that the Xbox team is working with the Azure team to create hardware that can be used both in consoles and in servers, and that same hardware will work on compute functions while it isn't being used for gaming. I didn't come up with that out of thin air.
Per AMD, RDNA is already optimized for gaming, what I'm suggesting is that MS would want the part to be able to handle compute workloads (I imagine they want GPUs with FP64 implemented). Rumors have long suggested that Navi will not have FP64, and will be purely meant for the gaming market, hence why Vega is still being marketed as the main GPU for database usage. I imagine we will see server cards based on RDNA architecture next year, and those will fall under a different product line (Arcturus?).
I think it's along the lines of people never hearing of vapor chambers prior to the Xbox One X. They think it's beyond the scope of just anyone, rather than it being fairly simple tech.
One thread version ago, didn't we talk about how going with stuff that is "server based" could hamper a console?
Software arguments, mainly hypothetical 3rd party moneyhats.
What are we gonna argue about when we have the specs? This is my first time keeping up with upcoming console information; I usually relied on YouTube recaps.
Wrong tree or the right one?
GFX909 = Renoir Graphics (7nm (?) Vega Graphics).
*Renoir is Picasso's successor.
I would characterize it as a refinement of the design philosophy of the PS4 and a direct response to its bottlenecks and shortcomings.
The bolded is what happens every generational shift since at least 1994. Whether you're right on the former is a bit early to tell. Shifting to desktop-class CPUs is a pretty big paradigm shift in the history of all consoles, for one.
I am beginning to think so too.
Eh, the PS4 has a CPU that is basically on par with the PS3's CPU, if I understand correctly. That was a continued shortcoming.
I figured the PS3 to PS4 transition was the smallest compared to PS1 to PS2 and PS2 to PS3.