"The top Ampere generation cards will likely get a TSMC 7nm refresh, maybe even with TSMC 7nm EUV."
Would make about zero sense to do this cost-wise, so nah.
"But AMD is going to come in and buy a lot of TSMC's 5nm EUV capacity next year, after Apple has had most of the early 5nm capacity this year."
N5 quotas are already bought up; NV has their share too.
"Nvidia's next-next-gen architecture after Ampere is Hopper. This will likely have MCM capabilities from the start (different from SLI). These MCMs would be Nvidia's answer to AMD's chiplets, for CPUs AND GPUs. Hopper RTX cards could reach consumers as soon as early 2022, perhaps even late 2021, to combat RDNA3 cards."
Also about zero chance of any "MCM" or "chiplet" GPUs reaching consumers any time soon from any IHV. Hopper is likely a 100% data center / AI / HPC oriented design, and there's no clear info on whether it's even a separate architecture or just Ampere chip(let)s in a multi-chip configuration.
"If AMD ends up leading in 4K native and Nvidia has a clear lead with DLSS enabled, then the PC threads are going to be console native vs checkerboarding all over again."
I'd give about a 1% chance of the first happening with Navi 2x vs Ampere, and the second is basically guaranteed.
As if they were ever any different. Remember, DLSS and RT are useless gimmicks, and NV will release a GTX 2080 Ti for a quarter of the RTX 2080 Ti's price because that's how much the RT hardware adds to it. Any day now.
We really need to up our thumbnail game - I mean, just look at this masterpiece.
Getting really tempted by a Samsung G9 monitor recently, but I don't think my 1080 Ti is enough to drive it. Patiently waiting for the 3080 Ti to be fully revealed so I can have a stroke at the price of it.
The funny part is that while driving such a huge monitor sounds like a lot of work, it actually has fewer pixels than a 4K screen. Even a 3080 would likely be fine, especially if you were playing a lot of games in 21:9 rather than the full 32:9 - and with DLSS it would be even easier.
Like, a 2080ti manages about 80fps when running Death Stranding at native 3840x2160. Now imagine 3440x1440 DLSS Quality Mode on something 10-15% stronger.
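For reference, here's the pixel math behind that. The DLSS Quality internal resolution is my assumption of the usual ~0.67x-per-axis scale, not a quoted spec:

```python
# Pixel counts behind the "less work than 4K" point.
def pixels(w, h):
    return w * h

uhd  = pixels(3840, 2160)  # 4K:        8,294,400
g9   = pixels(5120, 1440)  # 32:9 G9:   7,372,800 (~11% fewer than 4K)
uw   = pixels(3440, 1440)  # 21:9:      4,953,600
uw_q = pixels(2293, 960)   # assumed DLSS Quality internal render for 3440x1440

for name, p in [("4K", uhd), ("G9 32:9", g9), ("3440x1440", uw), ("DLSS Q internal", uw_q)]:
    print(f"{name:>16}: {p:>9,} px ({p/uhd:.2f}x of 4K)")
```

So even the full 32:9 panel is a slightly lighter load than 4K, and the 21:9 + DLSS case is roughly a quarter of it.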
Huh. In that case I might just pick up the monitor soon since I already play 4k.
I'm currently running an Acer Predator 32inch 4k G-Sync monitor but I figured the G9 would be harder to run.
"I'm pretty freaking skeptical of a 100% increase in performance for AMD. Especially without a die shrink."
That's comparing a ~$400 mid-range card to what's going to be a high-end card with an updated architecture that will probably follow suit in price. We already have things to go off of with the PS5 and XSX - the former's high clocks and the latter's high CU count (52), both in APU form - which makes it fairly easy to get an idea of what a discrete GPU with 72 CUs would be capable of. It's pretty unlikely RDNA2 is going to be worse than RDNA1 in, like, any area.
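A napkin sketch of why that isn't crazy, using the standard RDNA throughput formula (CUs x 64 shaders/CU x 2 ops/clock x clock). The 72 CU part and its 2.2 GHz clock are rumor/assumption, not confirmed specs:

```python
# Theoretical FP32 throughput for RDNA-style GPUs, in TFLOPs.
def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000

print(f"RX 5700 XT (40 CU @ ~1.9 GHz):    {tflops(40, 1.9):.1f} TFLOPs")
print(f"XSX APU    (52 CU @ 1.825 GHz):   {tflops(52, 1.825):.1f} TFLOPs")
print(f"PS5 APU    (36 CU @ ~2.23 GHz):   {tflops(36, 2.23):.1f} TFLOPs")
print(f"Rumored 72 CU @ PS5-like 2.2 GHz: {tflops(72, 2.2):.1f} TFLOPs")
```

That lands the hypothetical 72 CU card at ~20 TFLOPs vs ~9.7 for the 5700 XT - about 2x on paper before any per-clock architectural gains.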
Might not be the right place to ask, but some discussion here confused me about bottlenecks: if I have a CPU that would be too weak for a new, upgraded GPU, creating a bottleneck, do I just lose out on the potential performance that upgrade could have brought, or does it negatively affect the base performance of the separate components? Like, make the CPU perform worse than it would with a weaker GPU? Not sure I put that in a way that can be understood.
Bottleneck means there is a component that caps the performance. If you change every component except the one that bottlenecks, you should get more or less the same performance. So yeah, if you are CPU-bottlenecked and upgrade your GFX card, you only "lose out" on the new card's potential performance - you don't make your CPU perform worse.
The first situation. The GPU is still just as capable, and you could even max out its full potential if you keep increasing resolution or settings, but it has to wait for the CPU if what you want is high refresh rates. If the CPU can't do 120fps in a certain game, then it limits the maximum fps you can reach regardless of the GPU.
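To make that concrete, here's a toy model of the "slower side caps the frame rate" idea. The per-frame costs are made-up illustrative numbers, not benchmarks:

```python
# Effective fps is limited by whichever side takes longer per frame.
def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    return 1000 / max(cpu_ms_per_frame, gpu_ms_per_frame)

old = fps(cpu_ms_per_frame=10.0, gpu_ms_per_frame=14.0)  # GPU-bound: ~71 fps
new = fps(cpu_ms_per_frame=10.0, gpu_ms_per_frame=7.0)   # CPU-bound: 100 fps
print(old, new)  # the upgrade still helps, it just stops scaling past the CPU's 100 fps cap
```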
"Somehow I don't think that AMD will have their own DLSS-like solution with RDNA2."
Also very doubtful, especially with how much R&D Nvidia has spent on their ML solutions.
As for the rumored performance numbers for Ampere/RDNA2, it's all a bit too wild right now. Insiders are optimistic, but I don't trust them without some concrete benchmark leaks.
At least we won't have to wait too long to get the truth. A few more months.
Damn, DLSS has become an extremely strong selling point for NV. It's literally free performance.
"Well, technically not free performance. It uses the tensor cores, and you are paying for those. It's just very, very efficient, since I'm guessing that similar-sized GPUs without them (like what RDNA2 will offer, I'm guessing) not only wouldn't give the kind of performance boost DLSS offers, but it would also mean Nvidia having to make a separate GPU line for professionals who need the ML hardware. Releasing a single line with both, and finding a way to use ML for games, really paid off for them."
Right. But using the tensor cores for DLSS is free in the sense that the game otherwise wouldn't use them at all - it's not a trade-off in that sense.
Hm. So it seems the rumors say late August to early September release, but do we know what may release then? The 3090 potentially (or 3080 Ti?)? Or just the "base" cards?
Realistically as long as the 3080ti or 3090, whatever they call the top consumer card, releases before November I'll be happy, but sooner is always better!
"How strong do people expect the 3070 to be relative to currently available cards? I'm thinking of going from an RX 5700 XT to a 3070 for DLSS alone, but it'd be good if I weren't losing out on any FPS outside of DLSS games, of course."
I'll have a 2080 Ti going for 400 if you can hang on a few months ;)
I expect it to be around a 2080 Ti, maybe slightly below.
Why do people worry about the PSU? If you're getting the new Ti, a $100 new one is nothing compared to the GPU cost. Also, many people have too big a PSU already, and then there's undervolting: my 1080 Ti and 3700X undervolted need just 220W at max load; add the rest and it's still below 350.
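A toy power-budget check along those lines - the GPU+CPU figure is the one quoted above, the rest are rough assumptions, not measurements:

```python
# Sum estimated component draws, then add headroom for a comfortable PSU rating.
loads_w = {
    "GPU + CPU (undervolted, max load)": 220,
    "motherboard/RAM/SSD/fans (est.)":    60,
    "peripherals/USB (est.)":             20,
}
total = sum(loads_w.values())
headroom = total * 1.3  # ~30% headroom keeps the PSU in its efficient range
print(f"estimated draw: {total} W, comfortable PSU: ~{headroom:.0f} W")
```

So even with generous padding, a quality ~400-500W unit covers an undervolted build like that.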
That'd be perfect. I loved my GTX 570 and 970, so I hope the 3070 will be similarly good bang for the buck.
I'm expecting them to hold off on releasing the top-end consumer cards until AMD launches Navi 2x. So the 3080 and a new Titan seem like the safest bet.
3070 possibly, but with it being based on a different chip now, it may also come later.
The 3090/3080 Ti will probably launch once NV knows where the fastest Navi 21 will land in perf/price.
I'm actually pondering the idea of NV holding off on launches this time and letting AMD go first - but it's more likely that NV will know everything they need to know some months before Navi 2x launches, so it would probably be pointless.
This is getting ridiculous.. announce the damn thing already >_>
I would be happy with even an announcement of the event date.
Did EVGA stop making 2080tis? I'm either blind or not seeing them listed on their website.
"Letting AMD go first wouldn't be terrible, although there is a slight advantage in going first by reinforcing the perception that Nvidia is always ahead."
When I say "go first" I mean something like what happened between the RX 480 and GTX 1060 - i.e. not so much allowing them to go first as launching more or less simultaneously.
"For the 20xx series it was the 2080 and Ti in September, the 2070 in October and the Titan in December."
That was a totally different launch though. NV wasn't threatened by anyone with Turing, and they took their time rolling the cards out due to an excess of Pascal inventory that had accumulated because of a sudden crypto-mining crash.
"Did EVGA stop making 2080tis? I'm either blind or not seeing them listed on their website."
There were rumors recently that all 2070, 2080 and 2080 Ti parts are EOL.
DLSS is great, but it's not free:
- It runs serially, so the CUs are stalled while it's running. This is OK because it's really fast at doing what it does - but it's still a few ms per frame out of your budget.
- The tensor cores also take up silicon area, so for any given GPU size you could argue you'd get more CUs on the same-sized chip if you didn't have tensor cores.
So in theory, with X% more CUs and Y milliseconds of extra time per frame, you might go some way towards being able to render more of those pixels natively.
I still think it's a great solution, and better than rendering natively, though.
Isn't RDNA2 supposed to be on TSMC's 7nm+ (or at least an improved version of the 7nm process they used for RDNA1)?
Wouldn't there be a pretty sizeable difference in transistor densities if NV went with Samsung's 8nm?
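For a rough sense of scale, here are the commonly cited peak logic-density figures for the two processes. Real GPU designs land well below these numbers, so treat the ratio as a paper upper bound, not a prediction:

```python
# Commonly cited peak logic densities; actual chip density depends heavily
# on the design (SRAM/analog mix, target clocks), so this is only indicative.
densities_mtr_mm2 = {
    "Samsung 8LPP": 61.2,
    "TSMC N7":      91.2,
}
ratio = densities_mtr_mm2["TSMC N7"] / densities_mtr_mm2["Samsung 8LPP"]
print(f"TSMC N7 vs Samsung 8nm: ~{ratio:.2f}x denser on paper")
```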
Does it increase latency in rendering frames? I don't have a DLSS card, but how's the performance when it has to fill in the missing detail?
I think it takes about 3-4ms to handle its part of the frame creation. So if you're running at 60fps/16ms, that leaves you with less time to do the rest of your rendering. But then of course you're only rendering a quarter of the pixels (1080p vs 4K), so that should be a big net saving.
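A back-of-the-envelope version of that trade-off, assuming render cost scales roughly linearly with pixel count (all numbers illustrative):

```python
# Frame-time budget: smaller internal render plus a fixed DLSS pass.
NATIVE_4K_MS = 16.6                              # suppose native 4K just fits 60 fps
PIXEL_RATIO  = (1920 * 1080) / (3840 * 2160)     # 0.25: a quarter of the pixels
DLSS_PASS_MS = 3.5                               # the ~3-4 ms quoted above

dlss_ms = NATIVE_4K_MS * PIXEL_RATIO + DLSS_PASS_MS
print(f"native: {NATIVE_4K_MS:.1f} ms/frame (~{1000/NATIVE_4K_MS:.0f} fps)")
print(f"DLSS:   {dlss_ms:.1f} ms/frame (~{1000/dlss_ms:.0f} fps)")
# ~7.7 ms -> ~130 fps: the fixed DLSS cost is easily paid for by the smaller render.
```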
Pardon me for this random and potentially stupid question - I'm new to a lot of this and am researching as I go. I'm currently in the process of building my first PC and am awaiting Ampere info (release, pricing, etc.) for the GPU. Initially I was going to wait on new AMD CPU info as well, but I decided to go with a Ryzen 3900X, partly because I'm not completely certain about the forward compatibility of the motherboard I'm looking at (the MAG X570 Tomahawk WiFi, ATX, AM4) and partly because I'm getting a little close to my budget. Ideally I'd like to not upgrade again for 5+ years with a 4K60 baseline (barring deals I simply can't pass up). Would I be insane to hope that the Ampere GPUs won't clock in at 1k+ in Australian dollars, keeping in mind that I'm currently sitting at a build cost of around 2.7k? Going by Nvidia's past history and some current prices for the RTX 2080 line on PCPartPicker, 1.5k up to 3k has legitimately got me sweating lol.
I'd guess august announce and late September release, but choose the rumor of your choice
3070/3080/Ti would be the initial three, if it's anything like the RTX 2000 launch.
And to Australians: yes, $1000+; if the 3070 fits under that number I'll be surprised.