I just saw that Ubisoft has almost 14k employees (!!!)... Sony sent a bunch of dev kits recently to third parties... and we know Ubi leaks like a sieve, sooo...
So how do we know a leak is real? What are we looking for?
Memory controllers can be changed for specific customers. MS, for example, went the custom route and used a DDR3 memory controller for the base Xbox One. https://www.overclock3d.net/news/gpu_displays/amd_navi_pcb_leaks_-_confirms_memory_configuration/1
This is interesting; it could deconfirm HBM2 on PS5. GDDR6 is more likely now if you want to bet on it.
He means that Ubi developers will leak the details to Jason or some other journalist.
Sure, 1440p
Depends on whether the game is designed for Anaconda and downscaled to Lockhart or the other way around. I could see a scenario where the game is designed to be 1800p on Anaconda and runs at 900p on Lockhart.
Yeah, there's also that. But I think that would be in years to come.
Lockhart will have at least as many TF as the Xbox One X, but I suspect a little bit higher.
DF also noticed it:
"If this is an engineering sample board, a potent power delivery set-up like this may simply be in place to put the new silicon through its paces. However, it is a touch concerning if this is an AMD reference board aimed at the consumer - as the configuration suggests that a standard 'blower' design with a single fan may be in place as a cooling solution, something that rarely works out well for Radeon hardware."
:) Looks like they're largely repeating buildzoid's analysis without giving him credit.
Native driver support just moves the s/w part which translates DXR calls into driver/hardware calls from the emulation layer into the driver. DXR is in its essence DX compute. Thus you can either translate it into DX compute via some emulation layer (what the DXR Fallback Layer does) or handle this directly inside the driver, mapping DXR API calls to your GPU's compute capabilities. The second option is better because a) you don't have an emulation layer somewhat slowing things down for you, and b) you can map DXR calls to your GPU h/w better because you're the one who made the h/w. But both are "h/w supported raytracing" since GPU compute is something which is running on dedicated h/w - the GPU.

"But to answer the other part of your initial question, what I think about the examples you listed: I see RT as hardware supported if there is native driver support. It is not emulated in software anymore. Well, that is how I see it."
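The layering described above can be sketched abstractly. This is a toy Python model, not real D3D12/DXR code; every class and method name here is made up for illustration. The point it shows is the one in the post: both the fallback path and the native driver path ultimately execute on the GPU's generic compute hardware, the fallback just adds a translation step.

```python
# Toy model of the two DXR dispatch paths described above.
# All names are illustrative, not part of any real API.

class GpuCompute:
    """Stands in for the GPU's generic compute capability."""
    def run(self, workload):
        return f"gpu executed {workload}"

class FallbackLayer:
    """Fallback path: rewrite the DXR call into DX compute work first."""
    def __init__(self, gpu):
        self.gpu = gpu
    def dispatch_rays(self, scene):
        # Extra translation step: DXR -> generic compute shaders.
        workload = f"compute-shader BVH traversal of {scene}"
        return self.gpu.run(workload)

class NativeDriver:
    """Native path: the driver itself maps DXR calls onto the hardware."""
    def __init__(self, gpu):
        self.gpu = gpu
    def dispatch_rays(self, scene):
        # No intermediate emulation layer; the driver owns the mapping.
        return self.gpu.run(f"driver-mapped ray dispatch of {scene}")

gpu = GpuCompute()
print(FallbackLayer(gpu).dispatch_rays("scene"))
print(NativeDriver(gpu).dispatch_rays("scene"))
```

Either way the work lands on `GpuCompute`, which is why the post calls both variants "h/w supported raytracing".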
It wasn't.
There's no indication that this is the case. However it's worth remembering that next gen consoles are highly unlikely to use the same CLN7FF node which Vega 20 is using. Navi 10 (or whatever will launch this year) likely will though.
Yea, that's a more useful breakdown - not sure if there's any way to make terminology stick, but it helps conversations to call out the differences.

"Maybe we can split the different classes of hardware support into more than two, like you did in your comment:"
I'd tend to agree. I don't expect a ton of fixed function work there (computing whole intersection tests like an RT core; I doubt it). But based on the work AMD/Sony did for things like VR-facing features and CB acceleration etc., I do expect 'some' specific hw customizations that will help the process. So my assumption is we will see Hardware Accelerated RT in consoles but not Hardware Implemented RT.
Perhaps you can elaborate on what you think exactly Vega VII is and what node consoles will be on, then?

"It wasn't. There's no indication that this is the case. However, it's worth remembering that next gen consoles are highly unlikely to use the same CLN7FF node which Vega 20 is using. Navi 10 (or whatever will launch this year) likely will, though."
That isn't a good thing, right?
Hmm. GCN is a memory-starved design as well, requiring the use of expensive HBM for performance gains.
Indeed. Just like DF is still listing a 4TF Lockhart. Don't they still have it wrong about what Arcturus is? I thought it had been explained months ago that it is a product, not a microarchitecture?
Yeah, that seems like it would be a mistake if true. The $299 demographic is likely to buy more physical games than the $399-499 launch crowd. If you're price sensitive, I think you're more likely to want to be able to trade games in and buy used games.
To keep consumers happy and not be seen as forcing people into buying games a certain way, like what happened in 2013.

"They want price sensitive gamers to be gamepass gamers. Why would ms give two fucks about retail and used games?"
Who really knows. Maybe I am making a mistake comparing it to the Xbox One X. But it would be weird if Lockhart had less peak performance than the One X. Would be a marketing nightmare.
A 4TF Lockhart in 2020 is a joke. I don't know how people can trust this.
I agree.

"Who really knows. Maybe I am making a mistake comparing it to the Xbox One X. But it would be weird if Lockhart had less peak performance than the One X. Would be a marketing nightmare."
Vega 20 was designed specifically for the N7 process from the start. There's nothing "shrunk" about it.

"Perhaps you can elaborate on what you think exactly Vega VII is and what node consoles will be on, then?"
That is something I agree about:

"Who really knows. Maybe I am making a mistake comparing it to the Xbox One X. But it would be weird if Lockhart had less peak performance than the One X. Would be a marketing nightmare."
I don't agree with this statement at all. Wtf? Maybe you'd want 8TF to try to get as many games as possible to native 4K... but it is in no way a "minimum". I respect Cerny, but I don't take his word as gospel, especially when we already have more than a year of actual real-world data showing us that a 6TF machine is more than capable of outputting native 4K resolutions on some of the most beautiful games we've ever seen in gaming. I have to be misunderstanding your statement, right? It just doesn't make sense.
You know this is all like putting a blind man in a dark room.
$599 PS5 confirmed
4K/30 FPS and 1080p/60 FPS should be totally doable next gen, and hopefully devs start offering them.
I dunno, the 6TF X1X runs some current gen games at 1440p; trying to push next gen visuals on a 5TF GPU sounds like a nightmare even at 1080p.

"GPU is not the only measure of performance."
A 4 TF GPU, 16-24 GB of GDDR6, and an 8-core Zen 2 would be more capable than the X1X.
However, overhead is needed to guarantee 1080p, so I would expect at least 4.8 TF if Anaconda is 12 TF.
Lockhart needs to be at least 40% of Anaconda's GPU to hit 1080p in the majority of games.
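The 40% / 4.8 TF figures can be sanity-checked with simple pixel-count arithmetic. This assumes GPU cost scales linearly with resolution (a simplification) and takes the rumored 12 TF Anaconda number as given:

```python
# Naive sanity check of the "Lockhart needs ~40% of Anaconda" claim.
# Assumes GPU cost scales linearly with pixel count (a simplification),
# and uses the rumored 12 TF Anaconda figure from this thread.

PIXELS_4K = 3840 * 2160      # 8,294,400 pixels
PIXELS_1080P = 1920 * 1080   # 2,073,600 pixels

anaconda_tf = 12.0

# Pure pixel-ratio scaling: 1080p is exactly 1/4 of the 4K pixel count.
pixel_ratio = PIXELS_1080P / PIXELS_4K     # 0.25
naive_tf = anaconda_tf * pixel_ratio       # 3.0 TF

# Adding the overhead margin argued for above (40% instead of 25%):
tf_with_overhead = anaconda_tf * 0.40      # ~4.8 TF

print(naive_tf, tf_with_overhead)
```

So the 4.8 TF estimate is simply the rumored 12 TF scaled by 40%, roughly the naive quarter-resolution budget plus overhead headroom.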
Very well, then Sony can counter with PS4 and PS Now. That would be cheaper, and PS Now has a lot more traffic, doesn't it?

"They can do that by making gamepass a gentle breeze blowing through your bank account every month and having all first party new games on there day one. Yes, there will be some who complain, but it won't be like 2013, esp with eXpensive box as an option."
They won't be using TF to push cheapbox at all, so it doesn't matter as far as marketing goes. It is a cheap option to play next gen games with new tech like the SSD and Zen, and they will pump out a bunch of Anaconda footage for dat halo effect. + gamepass! Keep your kids quiet for the cost of a Netflix sub! They've got lots of sales hooks besides more power.
If a 6TF GPU can't do 1080p next gen, there's no way 12 does 4K. While not everything scales linearly, resolution does: displaying the same image at 4K instead of 1080p requires 4x the GPU power. A 6TF GPU will not only be fine for 1080p, it will leave devs more room to work with than a 12TF GPU at 4K.

"I dunno, the 6TF X1X runs some current gen games at 1440p, trying to push next gen visuals on a 5TF GPU sounds like a nightmare even at 1080p."
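The "more room for devs" argument can be put into numbers with the same naive linear model: 4K has 4x the pixels of 1080p, but the rumored big machine only has 2x the compute, so the small box ends up with twice the budget per pixel. The 6 TF / 12 TF figures are the thread's rumored numbers, not confirmed specs:

```python
# Compute budget per pixel: a 6 TF GPU at 1080p vs a 12 TF GPU at 4K.
# Naive model: FLOPS spread evenly across the pixels being rendered.

PIXELS_1080P = 1920 * 1080
PIXELS_4K = 3840 * 2160

flops_per_pixel_1080 = 6e12 / PIXELS_1080P    # 6 TF machine at 1080p
flops_per_pixel_4k = 12e12 / PIXELS_4K        # 12 TF machine at 4K

# 4K has 4x the pixels but the bigger GPU only has 2x the FLOPS,
# so the smaller machine has twice the per-pixel budget.
print(flops_per_pixel_1080 / flops_per_pixel_4k)  # 2.0
```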