> PC with GTX 1080 = Teen Gohan Super Saiyan 2
> Xbox One X = Super Vegeta (Cell aka muscles form)
> PS4 Pro = Super Namek Piccolo
> PS4 = Super Saiyan Trunks (Android saga, when he gets his ass beat by the Androids)
> XBO = Piccolo before fusion with Kami (Android saga)
> Switch Docked = Mecha Frieza
> Wii U = Frieza 100% Full Powered (muscles form)
> Switch Handheld = Frieza's Third Form
> PlayStation 3 / Xbox 360 = Frieza's Second Form
> PlayStation Vita = Frieza's First Form

This is so wrong.
The only thing we have seen from Dark Souls is the trailer that was analyzed, and it was dropping frames. Second-hand comments from play sessions at events aren't analytical. As for Skyrim, I never played it on consoles, so I have no clue if the framerate is better, but a resolution boost doesn't have much to do with the CPU.
That's actually quite a good shot; it looked worse than that most of the time, and as mentioned, the draw distance was shocking and blurry. Didn't stop me from loving it as a 1:1 port of the PS3 version at the time.
I remember when the Vita was supposed to have a 2GHz CPU. It's a bit of a shame it turned out underpowered, but at the time it really was more powerful than any smartphone. It's not Sony's fault how quickly mobile tech leapfrogged the Vita, and Sony couldn't really afford to wait another year.
The big problem with the Vita was the lack of exclusives anyway; the hardware being a bit weak and games running sub-native wasn't the issue at all.
> FP16 is not magic, Cerny was telling porky pies. It's the same meme we heard about async compute being a huge game changer.

FP16 is a real thing. It's simply GPU output at 16-bit rather than 32-bit precision; it can't be used for everything, but it can be used for shader code. I've actually been developing game software as a hobby for years, and I took up this method not long ago, so my figure is my own finding. There's also an ex-Ubisoft employee on a Beyond3D forum saying he can do about 70% of his code in FP16. It's not what you think it is; it's just what it is.
> I remember when the Vita was supposed to have a 2GHz CPU. It's a bit of a shame it turned out underpowered, but at the time it really was more powerful than any smartphone. It's not Sony's fault how quickly mobile tech leapfrogged the Vita, and Sony couldn't really afford to wait another year.
> The big problem with the Vita was the lack of exclusives anyway; the hardware being a bit weak and games running sub-native wasn't the issue at all.

"Supposed to"? I think you're confusing that with the theoretical maximums of the ARM A9. No system hits its chipset's maximums, due to real-world constraints.
> FP16 is a real thing. It's simply GPU output at 16-bit rather than 32-bit precision; it can't be used for everything, but it can be used for shader code. I've actually been developing game software as a hobby for years, and I took up this method not long ago, so my figure is my own finding. There's also an ex-Ubisoft employee on a Beyond3D forum saying he can do about 70% of his code in FP16. It's not what you think it is; it's just what it is.

It helps, but it isn't the doubled performance gain people imply when quoting pure FP16 TFLOPS as if that were the "true" power of the machine, since, like you said, it can mainly be used for shader code, which is not everything in a game. There's logic, draw calls, and much more.
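For anyone curious what the FP16 tradeoff being discussed actually looks like, here's a minimal NumPy sketch (an illustration of half precision in general, not anything from a console SDK): FP16 keeps only about 3 decimal digits and tops out at 65504, which is why it suits a lot of shader math but can't replace FP32 everywhere.

```python
import numpy as np

# Half precision (FP16) uses a 10-bit mantissa and 5-bit exponent,
# so it carries roughly 3 decimal digits and maxes out at 65504.
pi32 = np.float32(3.14159265)
pi16 = np.float16(pi32)

err16 = abs(float(pi16) - 3.14159265)   # noticeable rounding error
err32 = abs(float(pi32) - 3.14159265)   # far smaller error
print(pi16, err16 > err32)

# Values past 65504 overflow straight to infinity in FP16
print(np.float16(70000.0))
```

Colors, normals, and many lighting terms fit comfortably in that range and precision, which is why a big chunk of shader code can tolerate FP16, while positions, depth, and game logic generally stay in FP32 or integers.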
> "Supposed to"? I think you're confusing that with the theoretical maximums of the ARM A9. No system hits its chipset's maximums, due to real-world constraints.

The Vita was stated to be 2GHz for a long time; even the Wikipedia page had it at 2GHz, and Sony refused to talk about the actual specs, so we never got 100% confirmation. It really wasn't until it was hacked that people accepted it was 333/444MHz.
> FP16 is a real thing. It's simply GPU output at 16-bit rather than 32-bit precision; it can't be used for everything, but it can be used for shader code. I've actually been developing game software as a hobby for years, and I took up this method not long ago, so my figure is my own finding. There's also an ex-Ubisoft employee on a Beyond3D forum saying he can do about 70% of his code in FP16. It's not what you think it is; it's just what it is.

Well, I'll take your word for it; it just sounded like another of Cerny's famous cases of embellishing the PS4 ("supercharged PC"). Far Cry 5 uses FP16 and it really doesn't give it a massive boost. Maybe it would if a game were completely designed with it in mind, but would that even be possible this gen, even for PS4 exclusives, since you need to make sure it runs on the base PS4?
Some games might look nice in stills, but the framerates can get pretty awful on Vita.
Also, some games look insanely muddy and hard to enjoy, like Need for Speed. You can barely see what's on the horizon due to the low resolution and muddy visuals.
> PC with GTX 1080 = Teen Gohan Super Saiyan 2
> Xbox One X = Super Vegeta (Cell aka muscles form)
> PS4 Pro = Super Namek Piccolo
> PS4 = Super Saiyan Trunks (Android saga, when he gets his ass beat by the Androids)
> XBO = Piccolo before fusion with Kami (Android saga)
> Switch Docked = Mecha Frieza
> Wii U = Frieza 100% Full Powered (muscles form)
> Switch Handheld = Frieza's Third Form
> PlayStation 3 / Xbox 360 = Frieza's Second Form
> PlayStation Vita = Frieza's First Form

God damn, when will this God damn DBZ shit stop.
> The Vita was stated to be 2GHz for a long time; even the Wikipedia page had it at 2GHz, and Sony refused to talk about the actual specs.

Weird, because it looks like Sony squashed those 2GHz rumors early on (the quote is from 2011, before it was even called the Vita):

https://web.archive.org/web/2011030....com/news/43308/Sony-tempers-NGP-power-claims

"Some people in the press have said 'Wow, this thing could be as powerful as a PS3'," he stated. "Well, it's not going to run at 2GHz because the battery would last five minutes and it would probably set fire to your pants."
Pleeeeease explain to me what this is
> Weird, because it looks like Sony squashed those 2GHz rumors early on (the quote is from 2011, before it was even called the Vita):
> https://web.archive.org/web/2011030....com/news/43308/Sony-tempers-NGP-power-claims
> And the ARM A9 does go up to 2GHz, so there were probably people spreading FUD to circlejerk.

Ah right, it was years ago and I'm remembering it a bit wrong, but I do for sure remember tons of people saying the Vita was 2GHz for the longest time.
Isn't "a bit below a PS3, but you also had to worry about battery life" what people at the time assumed? It seemed to me like the surprise was that it was way below the PS3. Sony should not have released those Uncharted Golden Abyss bullshots as it made people think it was going to be closer to a PS3 than it ended up being.
Any hint as to which big FPS project you were working on? Because I remember rumors that Black Ops Declassified had to be super rushed because it was originally meant to be a Black Ops port but the team realized the hardware wasn't as powerful as expected.
> Nvidia flops different tho

I feel like people tend to forget the difference between AMD flops and Nvidia flops.
Anyway, the Vita definitely did not have enough power for its screen resolution, but it was still pretty darn good for a $250 handheld.
I feel like people tend to forget the difference between AMD flops and Nvidia flops.
remember the "mind-blowing development magic" frustum culling gifs that were going around a few years ago?
> I feel like people tend to forget the difference between AMD flops and Nvidia flops.

Not even that; the difference between the Wii U's GPU and the XBO/PS4 GPUs is big from an architecture standpoint.
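To make the "paper flops" point concrete: the TFLOPS figures people throw around are just ALU count × clock × 2 (counting a fused multiply-add as two operations), and say nothing about architecture efficiency or real-world utilization. A quick sketch, using commonly cited shader counts and clocks (the specific numbers here are outside assumptions, not from this thread):

```python
def theoretical_tflops(alus, clock_ghz, ops_per_cycle=2):
    """Peak single-precision TFLOPS: ALUs x clock (GHz) x ops per cycle.
    ops_per_cycle=2 counts one fused multiply-add as two FLOPs."""
    return alus * clock_ghz * ops_per_cycle / 1000.0

# Commonly cited configurations (assumed for illustration):
ps4   = theoretical_tflops(1152, 0.8)    # ~1.84 TFLOPS
xbo   = theoretical_tflops(768, 0.853)   # ~1.31 TFLOPS
wii_u = theoretical_tflops(320, 0.55)    # ~0.35 TFLOPS
print(ps4, xbo, wii_u)
```

Identical formula, wildly different real-world results: how much of that peak a game actually extracts depends on the architecture, which is the whole point being made above.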
From my understanding, instead of rendering the entire screen at once, the screen is broken down into smaller tiles, and the GPU dedicates its resources to one tile at a time, rendering to a small, internal, high-speed cache. This essentially allows the system to sidestep memory bandwidth bottlenecks for most (though not all) effects without needing a giant pool of embedded RAM (as seen in the GCN, Wii, X360, Wii U, and Xbone). The cache only needs to be big enough to store one tile rather than the whole framebuffer. Or at least that's the tile-based part. I have no idea what the deferred part achieves in practical terms (it would mean you render different stuff to different buffers before combining, but I don't personally know to what benefit that's done).
Looking back now, it's insane how hard developers had to work at memory management last gen, especially given how long the gen lasted and how much RAM prices had dropped by the end, making even budget PCs overkill compared to the 360 and PS3.
Whilst TBDR is common in the mobile space, it hasn't caught on elsewhere (until recently), so the Vita is the first time we've seen it used in a dedicated game console since the Dreamcast. The Nintendo Switch (and all Nvidia Maxwell and Pascal GPUs, for that matter) uses a similar tile-based forward rendering solution, which seems to give it a similar performance advantage. AMD is expected to implement TBFR in its next-gen Navi architecture too, meaning next-gen consoles will also get it. Should mean good things for the future.
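The tiling idea described above can be sketched as a toy software renderer (purely illustrative: the tile size, the binning pass, and the triangles-as-bounding-boxes simplification are all assumptions for the example, not how any real GPU is wired):

```python
# Toy tile-based rendering: bin primitives into screen tiles, shade one
# tile at a time in a small "on-chip" buffer, then write each finished
# tile to main memory exactly once. Primitives are simplified to pixel
# bounding boxes (x0, y0, x1, y1) so the example stays tiny.

TILE = 4          # tile edge in pixels (real hardware uses e.g. 16 or 32)
W, H = 8, 8       # tiny framebuffer

def bin_triangles(triangles):
    """Assign each primitive to every tile its bounding box touches."""
    bins = {}
    for tri_id, (x0, y0, x1, y1) in enumerate(triangles):
        for ty in range(y0 // TILE, y1 // TILE + 1):
            for tx in range(x0 // TILE, x1 // TILE + 1):
                bins.setdefault((tx, ty), []).append(tri_id)
    return bins

def render(triangles):
    framebuffer = [[None] * W for _ in range(H)]
    for (tx, ty), tri_ids in bin_triangles(triangles).items():
        # Fast tile buffer: only TILE*TILE pixels of on-chip memory needed,
        # instead of bandwidth to the whole framebuffer per primitive.
        tile = [[None] * TILE for _ in range(TILE)]
        for tri_id in tri_ids:
            x0, y0, x1, y1 = triangles[tri_id]
            for py in range(TILE):
                for px in range(TILE):
                    sx, sy = tx * TILE + px, ty * TILE + py
                    if x0 <= sx <= x1 and y0 <= sy <= y1:
                        tile[py][px] = tri_id  # later primitives overwrite
        # One bulk write to main memory per tile
        for py in range(TILE):
            for px in range(TILE):
                framebuffer[ty * TILE + py][tx * TILE + px] = tile[py][px]
    return framebuffer

fb = render([(0, 0, 3, 3), (2, 2, 6, 6)])
```

The key property is that all the per-pixel traffic happens in the small tile buffer and each tile hits main memory once, which is the bandwidth saving described above. (A deferred renderer like the PVR additionally sorts out visibility per tile before shading, so hidden pixels are never shaded at all.)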
> Sony really oversold the capabilities of the Vita early in its life. It's not a PS3 lite, it's a souped-up Xbox with better shaders and more RAM. More powerful than the 3DS, but not the complete generational leap that the PSP was compared to the DS.

And well, I guess that marketing didn't really work out for them, considering the sales of the console!
The Vita is significantly weaker than most gamers realize. When I first tried running Cosmic Star Heroine on my Vita devkit, it was running at around 10 fps and crashed due to running out of RAM quickly. It took me months of extensive optimization to get it to run well. Now admittedly, Unity adds a lot of overhead, but the fact that devs managed to get games like Gravity Rush & Killzone running on the thing boggles my mind.
I think you're going to need this.
Whilst we're talking about the Vita's hardware, I find the way its RAM is packaged in the SiP rather neat. It's crazy how little space the whole thing takes up on the motherboard. That said, I can't imagine this setup is particularly great for thermals, which is probably part of the reason the Vita is clocked so low. Anyway, you can read more about it here.
The Vita was already outdated in terms of mobile tech when it came out, unfortunately.
Re: Switch, paper specs be damned. I don't care if it's 2015 tech, 2018 tech, whatever. All I know is that what I see on my TV is only barely better than the best-looking Wii U games.
And no, I do not count Doom running at lower-than-low settings with bad framerates and sub-HD resolution as "substantially better" than what I saw from my Wii U either.
Here's hoping Metroid Prime 4 can dazzle.
> PC with GTX 1080 = Teen Gohan Super Saiyan 2
> Xbox One X = Super Vegeta (Cell aka muscles form)
> PS4 Pro = Super Namek Piccolo
> PS4 = Super Saiyan Trunks (Android saga, when he gets his ass beat by the Androids)
> XBO = Piccolo before fusion with Kami (Android saga)
> Switch Docked = Mecha Frieza
> Wii U = Frieza 100% Full Powered (muscles form)
> Switch Handheld = Frieza's Third Form
> PlayStation 3 / Xbox 360 = Frieza's Second Form
> PlayStation Vita = Frieza's First Form

I understood that reference.
> Whilst we're talking about the Vita's hardware, I find the way its RAM is packaged in the SiP rather neat. It's crazy how little space the whole thing takes up on the motherboard. That said, I can't imagine this setup is particularly great for thermals, which is probably part of the reason the Vita is clocked so low. Anyway, you can read more about it here.

It was a marvel of engineering.
The problem with the Vita is that, while the specs were good at the time, people didn't know it was severely downclocked.
A quad-core Cortex-A9 and a PowerVR SGX544 were amazing specs back then; the problem is the CPU clock was around 333MHz, and the GPU clock wasn't high either.
> Can you increase the clockspeed when you are using custom firmware? And if so, where does it get unstable? On PSP you could increase the clockspeed from 222MHz to 333MHz.

You can overclock the Vita:
> The deferred part refers to the rasterization of said tiles. Also, the cache is usually big enough to store multiple tiles at once, to allow for parallelization.

Oh, that's pretty cool. I should look more deeply into this.
> You are confusing tile-based rendering with tile-based deferred rendering, unfortunately. What pushes the PVR isn't just the tile-based rendering, it's the deferred part. Some games use deferred rendering, some games use tile-based rendering, but it's very, very rare to see tile-based deferred rendering outside of the PVR.

Yeah, I know they're not exactly the same thing; that's why I made the distinction that Nvidia uses tile-based forward rendering. I was under the impression Nvidia's solution was at least somewhat similar in execution to ImgTech's, and still offered much of the same benefit except for the parts tied directly to the deferred stage, but admittedly that was purely based on second-hand impressions, so I'll concede if I'm wrong about that.
> Can you increase the clockspeed when you are using custom firmware? And if so, where does it get unstable? On PSP you could increase the clockspeed from 222MHz to 333MHz.

You can overclock the Vita to 444/222 for any game, which to my knowledge no released game ran at natively. This irons out every game's bad performance, apart from the odd ones that absolutely stink, like Resident Evil Revelations, Jak and Daxter, and Assassin's Creed. In my experience (with a full 256GB memory card), it nails a 30fps or 60fps lock on every single title that underperformed at stock.