???? It's literally the Vega architecture on 7nm.....
> Don't know why people are so surprised, we already knew that this is Vega at 7nm. What were people expecting?

Maybe a sizable amount better than the Vega 64 just from the node shrink. Only being barely ahead and still suffering from the power, heat and noise issues is weird.
Sure, they haven't been competing at the high end for about ten years now, right? Shame Nvidia only has 1.5 cards faster than the VII then (counting the 2080 as faster, which is kind of marginal and probably won't hold in any decently forward-looking game rather than dodgy generic UE4 titles, but I'll give it 0.5)...
I'd hate to see if they did compete!
Totally, maybe then it wouldn't take them two years and a node shrink to match a card like the 1080 Ti and have customers crossing their fingers hoping drivers sort the rest. But yeah, super great, and Nvidia's RT tech is just old hat. *eye rolls into infinity*
Also it should be noted that street prices of the 2080 are a good deal higher than the MSRP of VII. A few cards at $699, most closer to $800, a few custom cards at $900... that's a lot to pay for ray-tracing in a single game, which is only playable at 1080p on the 2080...
> Also it should be noted that street prices of the 2080 are a good deal higher than the MSRP of VII. A few cards at $699, most closer to $800, a few custom cards at $900...

Here the street price of a Radeon VII is 749€ and the 2080 is 669€. We'll have to see how that develops in a few weeks, but it doesn't look enticing at all, even if you absolutely don't want any RTX features and don't care about power consumption.
(I don't think comparing prices to high-end third party 2080 models makes sense given the Radeon VII noise results)
> At those prices, yeah, the VII doesn't look enticing at all. Here's the current situation for the 2080 in the US:
> http://www.nowinstock.net/computers/videocards/nvidia/rtx2080/
> Again, though, I stand by this: $699 is way too high for this Vega update. Even $599, around the price of a 2070, would be much more tempting.

There seem to be several pretty good models in stock at $699 at your link too, though (as I said, given the VII noise results, I see no reason to compare to the higher-end third-party models). Still, it's interesting to see a situation where EU prices for GPUs in € are actually cheaper than US prices in $; that hasn't been the case for a while now.
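One wrinkle in the €-vs-$ comparison above: EU street prices include VAT while US prices are quoted pre-tax. A quick back-of-the-envelope sketch of what the euro prices look like with VAT stripped; the 1.13 USD/EUR rate and 19% (German) VAT are my assumptions, not figures from the thread:

```python
# Rough sketch: compare a VAT-inclusive EU street price with a pre-tax US
# price. Exchange rate (1.13 USD/EUR) and VAT (19%) are assumptions.

def usd_equivalent(eur_price: float, rate: float = 1.13, vat: float = 0.19) -> float:
    """Strip VAT from a euro price and convert to USD."""
    return eur_price / (1 + vat) * rate

# 749 EUR Radeon VII vs 669 EUR RTX 2080, as pre-tax USD
print(round(usd_equivalent(749), 2))  # 711.24
print(round(usd_equivalent(669), 2))  # 635.27
```

On those assumptions the 669€ 2080 lands around $635 pre-tax, under its $699 US street price, while the 749€ VII still lands above its own $699 MSRP.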
Nvidia RT tech is basically a tech demo: five months with basically one RTX game.
And? The tech is there and available for anyone to use, and there will be more in the coming months. Each title that supports it is one more than AMD will have for the foreseeable future.
Yeah, this looks like a paper launch basically. Sold out everywhere already. I bet they made less than 50k units of these. Purely a stopgap card that they put out just to have something showing up in higher end benches.
50,000? Launch day allocation for the whole of the UK was ~100 and 20 apiece for France and Spain.
I should clarify; not 20 thousand, 20 units for an entire country.
Metro comes out in a couple of days. It's going to look glorious with RTX. The only case someone can make for this Radeon card is if they want VRR over HDMI, which is a good feature.
> While I understand this is the gaming side of Era, this card fits AMD's current MO to a T. Just like their CPUs, they make products that are great for productivity first, and can also game on the side.

Haven't had a chance to watch yet. Where do they excel in productivity? I know working with 4K video is improved thanks to the memory. Anything else?
From the brief benchmarks I saw on both Gamers Nexus and LTT, the thing absolutely screams in productivity compared to even the 2080 Ti, so approaching this card as a purely gaming product is already the wrong mindset. I'm glad AMD are still putting in work and putting out products, but seeing all the negative press because it doesn't do its secondary function quite as well as Nvidia's arguably primary function is a little disappointing.
> While I understand this is the gaming side of Era, this card fits AMD's current MO to a T. Just like their CPUs, they make products that are great for productivity first, and can also game on the side.

This is all great and all, but we are a gaming forum, and secondly we are all nervous about next-gen console performance given how terribly it's doing.
> Rich posted his video, and now that it is out, I kinda want to say that this is an awkward GPU launch. Other than the highlighted point of using 16 GB of VRAM for editing video on the cheap while forsaking CUDA, I am not sure what this GPU is about.

They get to claim the world's first 7nm GPU.
One niche audience I was thinking this card might be good for: dedicated Linux gamers who want a high-performance card for Proton. Nvidia's Linux drivers are known to be crappy.
Has anyone done Linux-specific benchmarks? Might be interesting to see.
> By the time Navi hits, Nvidia will probably have launched a more polished RTX, and hopefully cheaper.

Why? If the Radeon VII is any indication of what AMD plan to do in the future, they'll just price their similarly performing Navi cards according to where Turing cards sit now.
> It's not only barely ahead; the card even has fewer CUs than Vega 64 (60 vs. 64). They invested the node shrink in higher clocks, not power savings. If they had gone for power savings, they would get the same Vega 64 performance at lower power (since the specs are basically the same).

While Vega 20 is very similar to Vega 10, it's still a different chip made for a different market. The fact that most (if not all) of Vega 20's HPC/DL features don't work in the Radeon VII (they're disabled or simply go unused in gaming) doesn't mean they don't consume additional power.
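The "higher clocks, not power savings" point can be sanity-checked with peak FP32 throughput (CUs × 64 stream processors × 2 FLOPs per FMA × clock, for GCN). A back-of-the-envelope sketch; the ~1.55 GHz (Vega 64) and ~1.75 GHz (Radeon VII) boost clocks are rough assumptions, not figures from the thread:

```python
# Peak FP32 throughput for a GCN GPU: CUs x 64 lanes x 2 ops (FMA) x clock.
# The boost clocks used below are approximations, not official figures.

def peak_fp32_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

print(round(peak_fp32_tflops(64, 1.55), 2))  # Vega 64:    12.7
print(round(peak_fp32_tflops(60, 1.75), 2))  # Radeon VII: 13.44
```

Even with four fewer CUs, the higher clock puts the VII's peak throughput slightly ahead of Vega 64, which fits the quoted read on where the node shrink went.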
> Nvidia RT tech is basically a tech demo: five months with basically one RTX game.

There are more coming, and I wonder: how long do you think it takes to make an AAA game these days? DXR was made available in spring 2018, without hardware acceleration (via Titan V), and released to consumers in autumn 2018, basically alongside the Turing cards. It's been less than a year since developers had any access to DXR at all to add it to their games.
> From the brief benchmarks I saw on both Gamers Nexus and LTT, the thing absolutely screams in productivity compared to even the 2080 Ti.

There's a catch to those benchmarks: most of them compare cards in OpenCL, while Nvidia cards are usually a lot faster when using CUDA. There's also a lot of productivity software that is CUDA-only.
Yeah, results are…
> This is all great and all, but we are a gaming forum, and secondly we are all nervous about next-gen console performance given how terribly it's doing.

Radeon VII-level performance in a ~$500 console in 2019/2020 would be the deal of the century. Whatever is going in the next-gen boxes is probably not even going to perform at that level.