This team hasn't been beaten on power though - their first console was the 1X, which is more powerful than the PS4 Pro.
Not as difficult when you launch a year later and at a $100 premium.
You're underestimating the overall power upgrade both consoles are bringing. The differences are going to matter less and be less noticeable.
Because they were hit over the head with 1080p v 900p memes last time. No way they wanted it to happen again. When they started designing the console they probably had 'win the power battle' underlined on a whiteboard.
You're right that the differences would be hardly noticeable. However, I think people will still hit them over the head with it.
GAINZ
Honestly the thing I'm most excited about is all the potential multipliers built in to these new consoles rather than the raw power. It sounds like between the dedicated RT hardware, the RT Audio support, the Machine Learning optimizations, the Variable Rate Shading and Mesh Shaders, and the dedicated decompression hardware related to texture streaming that these consoles could end up punching way higher than even their raw TFs compared to the existing consoles. That's not even mentioning the RDNA efficiency gains.
lol nah. The way MS handled the XB1 announcement was an embarrassment. I love the way they are handling XSX now. Even if they get beat on specs, these consoles will be so close compared to last time that the only people who will throw a fit are crazy, annoying fanboys who grasp at anything for their "warz".
No, results are the only thing that matters. The only way people will hit them over the head with it is if games consistently look and perform noticeably worse on XSX compared to PS5.
X was released with the full knowledge it wasn't going to sell like hot cakes, it was a premium product, Phil even said the S would sell more, so sales performance isn't a reason to use against its design. We got plenty of games that received huge gains from X, BC to current gen. 4k RDR & RDR2, Gears 5 @ 4k/60, FH4 @ 4K etc those are enough to demonstrate the machine was a sublime piece of tech, even though some devs didn't invest the amount needed to squeeze out similar performance, that's not an issue with the X itself but their dev pipeline.
In fact, the X's design influenced the XSX, and that wouldn't be the case if it hadn't proved its capabilities.
I don't agree. It was too expensive, which led to it being constantly outsold by every other console. The balance was also really off. The CPU was too weak and the GPU was too strong. This resulted in developers pushing for 4K, which led to sometimes inferior framerates. There's more than one example of games running worse on X than on Pro. That's pretty weird for a console that's stronger on all fronts. Guess MS was really demanding 4K native because that was the previous buzzword. Now it's "teraflops".
Sure, it's mighty impressive X can run RDR2 at 4K native, but I'm pretty sure 4K checkerboarding/1440p for 100 euros/dollars/whatever less would have sold a lot more.
You shouldn't compare the XSX to a stop-gap system. Look at the PS4 and Xbox One, compare the XSX to them, and be amazed.
After thinking about Microsoft's Series X specs a bit, I'm going to give Microsoft an A for the CPU and GPU, a B for the storage subsystem (NVMe SSD) and a C for the memory subsystem.
The storage subsystem only gets a B because a 2.4GB/s NVMe drive isn't very fast compared to the newest NVMe drives. (Forget the 4.8GB/s compressed number, that's a useless marketing number. Engineers only refer to a spec sheet that shows the raw transfer rate, which is 2.4GB/s.) It's a huge step up from last gen, so it does get a B, but it may be surpassed by the competition.
The big disappointment is the memory subsystem. On the Xbox One X I own, Microsoft went very aggressive with a 12GB, 384-bit GDDR5 solution at 326GB/s. For Series X, they increased the memory by only 33%, from 12GB to 16GB. That's a very small jump. And then they used less aggressive engineering and shrunk from a 384-bit bus to a 320-bit bus. And to make it worse, only 10GB is fast memory (560GB/s); the other 6GB is as slow as the Xbox One X (336GB/s). Put together, that's a sure sign they were not focused on maximum performance, but on hitting a lower cost target. Given that Microsoft has a lower cost Lockhart to hit a price point, you would hope that Series X would be focused on top performance, not cutting corners in the memory subsystem.
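For what it's worth, those bandwidth numbers fall straight out of bus width times per-pin data rate. A quick sketch — the per-pin speeds here are my assumptions, inferred back from the quoted figures:

```python
# Back-of-envelope: GDDR bandwidth = bus width (bits) / 8 * per-pin rate (Gbps).
# The pin rates below are assumptions inferred from the quoted bandwidth numbers.

def gddr_bandwidth_gbps(bus_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s for a GDDR memory bus."""
    return bus_bits / 8 * pin_rate_gbps

# Xbox One X: 384-bit GDDR5 at ~6.8 Gbps per pin
print(gddr_bandwidth_gbps(384, 6.8))   # ~326.4 GB/s

# Series X fast pool: 320-bit bus, GDDR6 at 14 Gbps per pin
print(gddr_bandwidth_gbps(320, 14.0))  # 560.0 GB/s

# Series X slow pool: the 6GB effectively sits behind a 192-bit slice
print(gddr_bandwidth_gbps(192, 14.0))  # 336.0 GB/s
```

Note the "slow" pool isn't slower memory chips; the same 14Gbps GDDR6 just presents a narrower effective bus for that address range.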
I have an LG B7 OLED as well, but I've always run my gaming devices through a receiver as there would never be enough inputs on a TV for all of my devices anyways. When I bought the B7 I upgraded my previous receiver to a Marantz receiver capable of 4K HDR pass-through, but it was less than $500 (it is one of the slimline models). If you are only using a couple of devices connected to your TV, then I can see how a full featured receiver could seem like overkill for some, but if you're looking to run a full 5.1 (or higher with Atmos possibly) then it is something to consider.
Yes it was.
The goal of the Xbox One X was never to be the best-selling console; it wouldn't have been the 4K machine if that's what they wanted.
The goal was to provide a premium option for console players who want the best experience a console could offer, which it did in spades.
How about both? The internal SSD can be connected using a hard-to-access m.2 slot, and the external can be connected through an easily accessible m.2 slot.
How would it impact the form of the device? m.2 SSDs are tiny!
This is the most important point to make here. The differences will be ABSOLUTELY negligible to the consumer. I honestly do not forecast a situation where either console RUNS AWAY with a power lead. 10% "more power" means nothing if both games are hitting 4k 60 with HDR10. That's pretty much the ceiling this generation. Even some of the pie-in-the-sky features both have marketed will not be relevant to 14 year old billy at home. Can it hit 4K/60 with HDR10? That's the ceiling.
This makes me sad. Love hdmi in.
Are you me?
I love love love it too, I'm so torn up on that. I love using my switch or surface with the xbox and being able to change back and forth so quickly.
The only remote I use in my entire life is this bad boy
So if this Xbox doesn't have IR, I guess I'm a bit confused how it will control my receiver and TV the way it does now.
Interesting that the XSX has 76MB total cache. That's enough for known RDNA 1 caches and a whole 32MB L3 for Zen 2. If true, it's even more impressive that XSX is only 360mm^2.
It also means no IPC loss versus desktop counterparts, unlike the mobile APUs that only have 8MB of L3 cache.
Other possibility: L3 is only 8/16MB, and RDNA 2 has even more cache than RDNA 1. Great news either way.
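A back-of-envelope sum shows why 76MB is plausible without settling which breakdown is right. Every size below is an assumption on my part (desktop Zen 2 cache sizes, an RDNA 1-style GPU L2 slightly scaled up):

```python
# Sanity check on the 76MB total-cache figure. All sizes are assumptions:
# full desktop-class Zen 2 caches plus an RDNA-style GPU L2.
MB = 1.0
cpu_l3 = 32 * MB              # optimistic reading: full desktop Zen 2 L3
cpu_l2 = 8 * 0.5 * MB         # 8 cores x 512KB L2
gpu_l2 = 5 * MB               # 5700 XT has 4MB; assume a modest bump

other = 76 - (cpu_l3 + cpu_l2 + gpu_l2)
print(other)  # ~35MB left for L1/L0, scalar caches, register files, etc.
```

That leftover is big but not absurd once you count every SRAM structure on a 360mm^2 die, which is why both readings (32MB L3, or a smaller L3 with fatter RDNA 2 caches) fit the number.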
But that's easy when you're launching later. This will follow on from the Xbox One debacle. Being beaten again on power would be an embarrassment.
The balance comes from delivering more options like the S and X. This gen Xbox suffered from things completely different than hardware quality (PR/messaging and first party lineup) because the S and X are arguably the highest quality consoles on the market as of right now.
I know. I'm just saying it could've sold and performed better if other choices were made and it was better balanced.
I agree. Getting whooped on power again would be a big yikes for the MS engineering team.
Watching the DF video on Minecraft's ray tracing, if I am hearing them right, MS had Minecraft going at 1080p60 with DXR on the XSX. What does that mean for a theoretical Lockhart? I'm assuming DXR wouldn't be supported, if for instance every other spec between them is equal besides CU count drop.
Apparently the framerate was variable between 30-60.
As for Lockhart, they're going to have to figure out a way to get it running. I think feature parity - that you'll get the same experience on both machines - is a necessity. Otherwise they greatly undercut Lockhart's value proposition and risk confusing their messaging.
From what I've read, RT largely does scale with resolution. But it does beg the question how much they'll have to downscale things - and how much the end presentation will suffer - if XSX is 'only' rendering at 1080p.
Also worth noting that Minecraft's implementation is the most costly type of RT there is. Most games won't do path tracing.
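To put rough numbers on why resolution is the big lever: primary-ray count scales linearly with pixel count, so 4K casts four times the rays of 1080p before you even count bounces or shadow rays. A toy sketch (the samples-per-pixel value is an arbitrary assumption for illustration):

```python
# Illustration only: primary-ray count scales linearly with pixel count,
# which is why render resolution is the biggest knob for RT cost.

def primary_rays(width: int, height: int, samples_per_pixel: int = 1) -> int:
    """Primary rays cast per frame at a given resolution."""
    return width * height * samples_per_pixel

rays_4k = primary_rays(3840, 2160)
rays_1080p = primary_rays(1920, 1080)
print(rays_4k / rays_1080p)  # 4.0 -- 4K casts 4x the primary rays of 1080p
```

Secondary rays (bounces, shadows) multiply on top of this, which is why path tracing like Minecraft's is so much heavier than single-effect RT.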
So the XSS will run it in 880p 30fps. If you buy a lower-res console, you have to consider that you will have to play games at lower resolution :)
When you design the box such that users can access and replace the internal drive, it's going to inform the placement of all the other internals as well as complicate the development environment when devs don't know exactly what they'll have access to.
I'm not convinced that allowing the replaceable ssd is worth the hassle when users will overwhelmingly gravitate to the far easier options available.
I can all but guarantee that most XSX owners will simply use large USB HDDs as cold storage and use the 1TB internal for the next-gen games they are actively playing.
I hope you remember your justifications when these drives cost an arm and a leg for the next five years, or when your internal drive dies and you have to get a whole new unit.
I'd trade 2TF for a beefier CPU with 1440p games @ 60fps any day. That would've been really masterful. You can't expect multiplatform devs to squeeze out the same performance as first party studios. I'm just saying $399 with 4K checkerboarding (which, let's be honest, most people can't even tell apart from native 4K at a proper distance) would have been the better choice. Probably overall lower resolutions but with better performance.
This is quite nebulous; it's proved to perform well, and if pushed it performs brilliantly. Sales are irrelevant in this context as it achieved its goal.
I hope you re-read the statements from MS regarding storage.
I have an older version of the current slim line of Marantz receivers.
If you don't mind, could you shoot me over a link for the receiver you purchased?
Unless my usage doesn't count because of your last sentence.
here's my setup:
3 in-wall front speakers, 2 in-ceiling rear speakers and subwoofer on the side
Xbox One HDMI -> TV
Xbox One Optical -> Astro Stand (on subwoofer) -> Receiver
PS4 HDMI -> TV
PS4 Optical -> Receiver (just unplugged)
Nintendo Switch HDMI -> Xbox One
(Switch is hidden behind books)
Surface Pro HDMI -> Xbox One
So I guess now I would need 4 hdmi inputs minimum since the Series X doesn't have HDMI-IN : /
edit: would a new receiver need to be HDMI 2.1 to make all the new stuff work?
Right, and 10GB won't always be enough, particularly as the XSX and PS5 will represent a baseline shift. There are PC games today that use north of 8GB VRAM at 4K.
Isn't that the point of having cheaper hardware, for those who don't want RT and other high end features like native 4K? The games will be scalable like PC. You want high end you buy Series X.
Well then pay for it with Series X. I don't get the issue. Even the Nvidia 2060 Super, which retailed for $400, had frame rates drop severely when RT was turned on.
Lockhart will most likely have RT as well. Having a split in support would just be a nail in MS's coffin if they force devs to support both. Then they won't even bother with RT, which undermines MS's push for it.
I watched that and had to laugh at how slow it seemed on the old xbox ha. Pretty cool tech.
So I just rewatched the load time video and holy cow that load time is greatly improved.
Xbox Series X vs X1X is 10 sec vs 52 sec, or 5.2x faster, and if I'm not mistaken that's just a BC game with no added optimizations.
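The 5.2x figure checks out from the quoted times:

```python
# Quick sanity check on the quoted load-time speedup.
xb1x_load_s = 52   # Xbox One X load time, seconds
xsx_load_s = 10    # Series X load time, seconds

speedup = xb1x_load_s / xsx_load_s
print(f"{speedup:.1f}x faster")  # 5.2x faster
```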
Not saying it'll put Xbox into the grave, but it'll make RT support in the future more spotty because devs will have to build for both full rasterizing and RT, and some might just not bother with it. When the baseline has RT, then we can see better optimization that will benefit everyone.
Nail in MS's coffin? Nvidia has games with it and AMD gets the same releases and they don't have cards with RT yet. Not sure I get your point.
I don't want RT to be a "high end" feature, aka hardly supported by anyone other than 1st parties. I think it would be best if it was part of the baseline.
The reason PC games use so much VRAM is because the penalty for not having something in local memory and having to move it in over PCIe is so severe. The GPU ends up holding way more in local memory because of this, whether it needs the full bandwidth of the GPU memory or not. Also, minus the system reservation on the XBSX, there's only 3.5GB of slower RAM left available. I'm thinking most games will easily be able to accumulate this much data in CPU + I/O + lower-bandwidth GPU ops + everything else that's not bandwidth intensive.
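A rough sketch of the budget being described here. The OS reservation size is my assumption, chosen only because it's consistent with the 3.5GB-of-slow-RAM figure above:

```python
# Series X memory budget sketch. The 2.5GB OS reservation is an assumption
# chosen to match the "3.5GB of slower RAM left" figure in the comment.
TOTAL_GB = 16
FAST_POOL_GB = 10                         # 560 GB/s, aimed at GPU-heavy data
SLOW_POOL_GB = TOTAL_GB - FAST_POOL_GB    # 6GB at 336 GB/s
OS_RESERVED_GB = 2.5                      # assumed to come out of the slow pool

slow_available = SLOW_POOL_GB - OS_RESERVED_GB
print(slow_available)  # 3.5 GB left for CPU, audio, I/O, and other low-bandwidth work
```

So the real question is whether games' low-bandwidth working sets stay under ~3.5GB, leaving the full 10GB fast pool for bandwidth-hungry GPU data.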