
Brot

Member
Oct 25, 2017
6,131
the edge
AMD finally being able to compete with Nvidia on the high-end level would be super exciting to see.

I don't know how reliable that YouTube channel is, which is why I'm hoping that other sources can come forward and confirm this.
 

dgrdsv

Member
Oct 25, 2017
12,024
The top Ampere generation cards will likely get a TSMC 7nm refresh, maybe even with TSMC 7nm EUV.
Would make about zero sense cost-wise, so nah.

But AMD is going to come in and buy a lot of TSMC's 5nm EUV capacity next year after Apple has had most of the early 5nm capacity that they have this year.
N5 quotas are already bought up, NV has their share too.

Nvidia's next-next-gen architecture after Ampere is RTX Hopper. This will likely have MCM capabilities from the start (different to SLI). These MCMs would be Nvidia's answer to AMD's chiplets for CPUs AND GPUs. Hopper RTX cards could reach consumers as soon as early 2022, perhaps even late 2021, to combat RDNA3 cards.
Also about zero chance of any "MCM" or "chiplet" GPUs reaching consumers any time soon from any IHV. Hopper is likely a 100% data center / AI / HPC oriented design. No clear info on whether it's even a separate architecture or just Ampere chip(let)s in a multi-chip configuration.

If AMD ends up leading in 4K Native and nVidia has a clear lead with DLSS enabled
I'd give about a 1% chance of the first happening with Navi 2x vs Ampere, and the second is basically guaranteed.

then the PC threads are going to be console native vs checkerboarding all over again
As if they ever were any different. Remember, DLSS and RT are useless gimmicks and NV will release GTX 2080Ti for 1/4th of RTX 2080Ti price because RT h/w adds this much price to it. Any day now.
 

BreakAtmo

Member
Nov 12, 2017
12,962
Australia
Getting really tempted by a Samsung G9 monitor recently, but I don't think my 1080 Ti is enough to drive it. Patiently waiting for the 3080 Ti to be fully revealed so I can have a stroke at the price of it.

The funny part is that while driving such a huge monitor sounds like a lot of work, it actually has fewer pixels than a 4K screen. Even a 3080 would likely be fine, especially if you were playing a lot of games in 21:9 rather than the full 32:9 - and with DLSS it would be even easier.

Like, a 2080 Ti manages about 80fps when running Death Stranding at native 3840x2160. Now imagine 3440x1440 DLSS Quality Mode on something 10-15% stronger.
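For a rough sense of that claim, here's a back-of-the-envelope sketch. The linear pixel scaling, the 2/3-per-axis DLSS Quality input resolution, and the 12% uplift are all assumptions for illustration, not benchmarks:

```python
# Back-of-the-envelope fps estimate. Assumptions (mine, not measured):
# perf scales ~linearly with pixel count (it doesn't exactly),
# DLSS Quality renders internally at 2/3 of the output res per axis,
# and the hypothetical new card is ~12% faster than a 2080 Ti.

base_fps = 80                       # 2080 Ti, Death Stranding, native 4K
base_pixels = 3840 * 2160

internal_w, internal_h = 3440 * 2 // 3, 1440 * 2 // 3   # DLSS Quality input
internal_pixels = internal_w * internal_h

est_fps = base_fps * 1.12 * base_pixels / internal_pixels
print(f"~{est_fps:.0f} fps")   # ~338 fps - naive; ignores CPU limits
                               # and the cost of the DLSS pass itself
```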
 

AlanOC91

Owner of YGOPRODeck.com
Verified
Nov 5, 2017
962
The funny part is that while driving such a huge monitor sounds like a lot of work, it actually has fewer pixels than a 4K screen. Even a 3080 would likely be fine, especially if you were playing a lot of games in 21:9 rather than the full 32:9 - and with DLSS it would be even easier.

Like, a 2080 Ti manages about 80fps when running Death Stranding at native 3840x2160. Now imagine 3440x1440 DLSS Quality Mode on something 10-15% stronger.

Huh. In that case I might just pick up the monitor soon, since I already play at 4K.

I'm currently running an Acer Predator 32-inch 4K G-Sync monitor, but I figured the G9 would be harder to run.
 

SayemAhmd

Unshakable Resolve
Member
Dec 3, 2019
241
It's really interesting to see AMD rumoured to step up like this, though the one thing I feel is really missing from an AMD offering is some sort of DLSS equivalent; that's probably my only holdout.
 

BreakAtmo

Member
Nov 12, 2017
12,962
Australia
Huh. In that case I might just pick up the monitor soon, since I already play at 4K.

I'm currently running an Acer Predator 32-inch 4K G-Sync monitor, but I figured the G9 would be harder to run.

It intuitively feels that way, but in truth it's the same res as two 1440p monitors side by side, and 1440p is about 44% of 2160p. So even when playing something in full 32:9 you'll likely see a performance boost. I can't imagine that's an option everywhere though - supposedly only maybe three-quarters of PC games support 21:9 as it is.
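The arithmetic behind those two claims, for anyone who wants to check it:

```python
# Pixel counts behind the comparison above.
g9  = 5120 * 1440   # Samsung G9:      7,372,800 px
qhd = 2560 * 1440   # one 1440p panel: 3,686,400 px
uhd = 3840 * 2160   # 4K UHD:          8,294,400 px

assert g9 == 2 * qhd                       # exactly two 1440p monitors
print(f"1440p is {qhd / uhd:.0%} of 4K")   # 44%
print(f"The G9 is {g9 / uhd:.0%} of 4K")   # 89%
```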
 

DieH@rd

Member
Oct 26, 2017
10,673
Somehow I don't think that AMD will have their own DLSS-like solution with RDNA2.

As for the rumored performance numbers for Ampere/RDNA2, it's all a bit too wild right now. Insiders are optimistic, but I don't trust them without some concrete benchmark leaks.

At least we won't have to wait too long to get the truth. A few more months.
 

PHOENIXZERO

Member
Oct 29, 2017
12,193
I'm pretty freaking skeptical of a 100% increase in performance for AMD. Especially without a die shrink.
That's comparing a ~$400 mid-range card to what's going to be a high-end card with an updated architecture that's probably going to follow suit in price. We already have things to go off of with the PS5 and XSX - the former's high clocks and the latter's high CU count (52) in APU form - which makes it fairly easy to get an idea of what a discrete GPU with 72 CUs would be capable of. It's pretty unlikely RDNA2 is going to be worse than RDNA1 in, like, any area.

Even if the rumors end up being true, NVIDIA is still probably going to have major advantages that could negate AMD's edge in potential raw rasterization horsepower.
 

Banzai

The Fallen
Oct 28, 2017
2,598
Might not be the right place to ask, but some discussion here confused me about bottlenecks:
If I have a CPU that would be too weak for a new upgraded GPU, creating a bottleneck, do I just lose out on the potential performance that upgrade could have brought, or does it negatively affect the base performance of the separate components? Like, make the CPU perform worse than it would with a weaker GPU?

Not sure I put that in a way that can be understood.
 

eonden

Member
Oct 25, 2017
17,126
Might not be the right place to ask, but some discussion here confused me about bottlenecks:
If I have a CPU that would be too weak for a new upgraded GPU, creating a bottleneck, do I just lose out on the potential performance that upgrade could have brought, or does it negatively affect the base performance of the separate components? Like, make the CPU perform worse than it would with a weaker GPU?

Not sure I put that in a way that can be understood.
Bottleneck means there is a component that caps the performance. If you change all components but the one that bottlenecks, you should get more or less the same performance. So yeah, if you are CPU-bottlenecked and upgrade your GFX, you only "lose out" on potential performance from the new GFX; you don't make your CPU perform worse.
 
Oct 29, 2017
13,605
Might not be the right place to ask, but some discussion here confused me about bottlenecks:
If I have a CPU that would be too weak for a new upgraded GPU, creating a bottleneck, do I just lose out on the potential performance that upgrade could have brought, or does it negatively affect the base performance of the separate components? Like, make the CPU perform worse than it would with a weaker GPU?

Not sure I put that in a way that can be understood.
The first situation. The GPU is still just as capable, and you could even max out its full potential if you keep increasing resolution or settings, but it has to wait for the CPU if what you want is high refresh rates. If the CPU can't do 120fps in a certain game, then it limits the maximum fps you can reach regardless of the GPU.
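A toy model of the same point (real pipelines overlap CPU and GPU work in more complicated ways, so treat this as a sketch):

```python
# Toy bottleneck model: frame rate is capped by whichever component
# takes longer per frame.
def max_fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000 / max(cpu_ms, gpu_ms)

print(max_fps(cpu_ms=10, gpu_ms=8))    # 100.0 - CPU-bound at ~100 fps
print(max_fps(cpu_ms=10, gpu_ms=4))    # 100.0 - faster GPU changes nothing
print(max_fps(cpu_ms=10, gpu_ms=14))   # ~71.4 - higher res/settings shift
                                       # the load back onto the GPU
```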
 

Readler

Member
Oct 6, 2018
1,974
Somehow I don't think that AMD will have their own DLSS-like solution with RDNA2.

As for the rumored performance numbers for Ampere/RDNA2, it's all a bit too wild right now. Insiders are optimistic, but I don't trust them without some concrete benchmark leaks.

At least we won't have to wait too long to get the truth. A few more months.
Also very doubtful, especially with how much R&D Nvidia has spent on their ML solutions.

Damn, DLSS has become an extremely strong selling point for NV. It's literally free performance.
 

Banzai

The Fallen
Oct 28, 2017
2,598
Bottleneck means there is a component that caps the performance. If you change all components but the one that bottlenecks, you should get more or less the same performance. So yeah, if you are CPU-bottlenecked and upgrade your GFX, you only "lose out" on potential performance from the new GFX; you don't make your CPU perform worse.
The first situation. The GPU is still just as capable, and you could even max out its full potential if you keep increasing resolution or settings, but it has to wait for the CPU if what you want is high refresh rates. If the CPU can't do 120fps in a certain game, then it limits the maximum fps you can reach regardless of the GPU.

Great, that's what I thought, thanks! Don't know what it was that confused me.
Might hold on to my Ryzen 2600X just a little bit longer then.
 

ugoo18

Member
Oct 27, 2017
150
Pardon me for this random and potentially stupid question; I'm new to a lot of this and am researching as I go along. I'm currently building my first PC and am waiting on Ampere info - release, pricing, etc. - for the GPU. Initially I was going to wait on new AMD CPU info as well, but I decided to go with a Ryzen 3900X, partly because I'm not completely certain about the forward compatibility of the motherboard I'm looking at (the MAG X570 Tomahawk WiFi, ATX, AM4) and partly because I'm getting a little close to my budget. Ideally I'd like to not upgrade again for 5+ years with a 4K60 baseline (barring deals I simply can't pass up). Would I be insane to hope that Ampere GPUs won't cost 1k+ in Australian dollars? I'm currently sitting at a build cost of around 2.7k, and looking at current prices for the RTX 2080 line on PCPartPicker, 1.5k up to 3k has legitimately got me sweating lol.
 

dragn

Attempted to circumvent ban with alt-account
Banned
Oct 26, 2017
881
Why do people worry about PSUs? If you get the new Ti, a $100 new PSU is nothing compared to the GPU cost. Also, many people have a PSU that's too big, and then there's undervolting: my 1080 Ti and 3700X undervolted need just 220W at max load; add the rest and it's still below 350.
 

BreakAtmo

Member
Nov 12, 2017
12,962
Australia
Also very doubtful, especially with how much R&D Nvidia has spent on their ML solutions.

Damn, DLSS has become an extremely strong selling point for NV. It's literally free performance.

Well, technically not free performance. It uses the tensor cores, and you are paying for those. It's just very, very efficient. I'm guessing that similar-sized GPUs without the ML hardware (like what RDNA2 will offer, I'm guessing) wouldn't give the kind of performance boosts DLSS offers, and leaving that hardware out would also mean Nvidia having to make a separate GPU line for professionals who need it. Releasing a single line with both and finding a way to use ML for games really paid off for them.
 

Readler

Member
Oct 6, 2018
1,974
Well, technically not free performance. It uses the tensor cores, and you are paying for those. It's just very, very efficient. I'm guessing that similar-sized GPUs without the ML hardware (like what RDNA2 will offer, I'm guessing) wouldn't give the kind of performance boosts DLSS offers, and leaving that hardware out would also mean Nvidia having to make a separate GPU line for professionals who need it. Releasing a single line with both and finding a way to use ML for games really paid off for them.
Right. But using the tensor cores for DLSS is free in the sense that they otherwise wouldn't be used at all by the game - it's not a trade-off in that sense.
As you said, it was really a brilliant move to combine their ML hardware with their gaming side, to an extent I didn't expect. I'm curious to see what AMD has to offer, but right now I can't imagine buying a card without DLSS, it's THAT good. I also can't see AMD responding anytime soon, as NV's got quite a lead in visual computing research.
Should DLSS 3.0 work on the driver level... hoo boy.
 

Shifty Capone

Member
Oct 27, 2017
620
Los Angeles
Hm. So it seems rumors point to a late August to early September release, but do we know what may release? The 3090 potentially (or 3080 Ti)? Or just the "base" cards?

Realistically, as long as the 3080 Ti or 3090, whatever they call the top consumer card, releases before November I'll be happy, but sooner is always better!
 

Crazymoogle

Game Developer
Verified
Oct 25, 2017
2,892
Asia
Hm. So it seems rumors point to a late August to early September release, but do we know what may release? The 3090 potentially (or 3080 Ti)? Or just the "base" cards?

Realistically, as long as the 3080 Ti or 3090, whatever they call the top consumer card, releases before November I'll be happy, but sooner is always better!

I'd guess August announce and late September release, but choose the rumor of your choice.

3070/3080/Ti would be the initial three, if it's anything like the RTX 2000 launch.

And to Australians: yes, $1000+; if the 3070 fits under that number I'll be surprised.
 

Deleted member 2834

User requested account closure
Banned
Oct 25, 2017
7,620
How strong do people expect the 3070 to be relative to currently available cards? I'm thinking of going from an RX 5700 XT to a 3070 for DLSS alone, but it'd be good if I weren't losing out on any FPS outside of DLSS games, of course.
 

kiguel182

Member
Oct 31, 2017
9,473
Why do people worry about PSUs? If you get the new Ti, a $100 new PSU is nothing compared to the GPU cost. Also, many people have a PSU that's too big, and then there's undervolting: my 1080 Ti and 3700X undervolted need just 220W at max load; add the rest and it's still below 350.

I'm new to PC Gaming so I worry about everything.
 

dgrdsv

Member
Oct 25, 2017
12,024
Hm. So it seems rumors point to a late August to early September release, but do we know what may release? The 3090 potentially (or 3080 Ti)? Or just the "base" cards?

Realistically, as long as the 3080 Ti or 3090, whatever they call the top consumer card, releases before November I'll be happy, but sooner is always better!
I'm expecting them to hold off on releasing the top-end consumer cards till AMD launches Navi 2x. So a 3080 and a new Titan seem like the safest bet.
The 3070 possibly too, but with it being based on a different chip now, it may also come later.
The 3090/3080 Ti will probably launch once NV knows where the fastest Navi 21 will land in perf/price.
I'm actually pondering the idea of NV holding off on launches this time and letting AMD go first - but it's more likely that NV will know everything they need to know some months before Navi 2x launches, so it would probably be pointless.
 
Nov 8, 2017
13,242
I'm expecting them to hold off on releasing the top-end consumer cards till AMD launches Navi 2x. So a 3080 and a new Titan seem like the safest bet.
The 3070 possibly too, but with it being based on a different chip now, it may also come later.
The 3090/3080 Ti will probably launch once NV knows where the fastest Navi 21 will land in perf/price.
I'm actually pondering the idea of NV holding off on launches this time and letting AMD go first - but it's more likely that NV will know everything they need to know some months before Navi 2x launches, so it would probably be pointless.

Letting AMD go first wouldn't be terrible, although there is a slight advantage in going first by reinforcing the perception that Nvidia is always ahead.
 

laxu

Member
Nov 26, 2017
2,785
Why do people worry about PSUs? If you get the new Ti, a $100 new PSU is nothing compared to the GPU cost. Also, many people have a PSU that's too big, and then there's undervolting: my 1080 Ti and 3700X undervolted need just 220W at max load; add the rest and it's still below 350.

If the new GPUs have high power draw at stock speeds, they might draw even more if they have any substantial overclocking capability. I would expect that anyone with a 650-750W PSU is perfectly fine, but those on 500-600W might at least hear the PSU fan coming into play more often. Most people don't want to buy a new PSU if they can help it; it's an extra cost. I recently got an SFX-size PSU and went with a 750W model just in case, even though for my 3700X + 2080 Ti the 600W Platinum model would have been enough. The good thing is the new PSU reduced my 2080 Ti's coil whine, so it bothers me less.
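A quick way to sanity-check PSU sizing, with illustrative wattages (the numbers below are placeholders, not measurements):

```python
# Rough PSU headroom check. All figures are illustrative guesses.
gpu_w  = 320   # rumored high-end Ampere board power
cpu_w  = 90    # e.g. a 3700X under gaming load
rest_w = 60    # motherboard, RAM, drives, fans

load = gpu_w + cpu_w + rest_w   # 470W total
for psu in (550, 650, 750):
    print(f"{psu}W PSU -> {load / psu:.0%} load")
# 550W -> 85%, 650W -> 72%, 750W -> 63%
```

Staying comfortably below full load is also where most units run quietest and most efficiently.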

I'm expecting them to hold off on releasing the top-end consumer cards till AMD launches Navi 2x. So a 3080 and a new Titan seem like the safest bet.
The 3070 possibly too, but with it being based on a different chip now, it may also come later.
The 3090/3080 Ti will probably launch once NV knows where the fastest Navi 21 will land in perf/price.
I'm actually pondering the idea of NV holding off on launches this time and letting AMD go first - but it's more likely that NV will know everything they need to know some months before Navi 2x launches, so it would probably be pointless.

For the 20xx series it was the 2080 and Ti in September, the 2070 in October and the Titan in December. They might follow a similar pattern, but replace the Titan with the 3090 and leave the full Titan as well as the 3060 for early next year. This would allow them to combat AMD at every turn: have an almost top-tier part on first release getting people to buy that one, throw in the 3070 a bit later as "hey look, you can get something almost as good for less!" (and clear 2080 Ti stock), and finally take back the performance crown with the halo products.

Personally I am in zero rush to upgrade and can wait for the full lineup. Those on 10xx cards can fight for the early bird specials.
 

Vimto

Member
Oct 29, 2017
3,718
This is getting ridiculous... announce the damn thing already >_>

I would be happy with an announcement for the event date
 

RCSI

Avenger
Oct 27, 2017
1,840
I've already accepted that one of the cards (except for the Ti/3090) will be in my PC in October at the latest.
 

F34R

Member
Oct 27, 2017
12,022
If I can sell off enough of the stuff I have in the house lol, I wanna get 3080 Ti x2 + a 32" 4K 144Hz IPS HDR1000 monitor. What a wish to have!!
 

Blade30

Member
Oct 26, 2017
4,648

It's probably out of stock and Nvidia not producing them anymore.

Nvidia Allegedly Kills Off Four Turing (RTX 20 Series) Graphics Cards In Anticipation Of Ampere
https://www.tomshardware.com/news/nvidia-kill-four-turing-graphics-cards-anticipation-ampere
https://www.ithome.com/0/497/387.htm
https://wccftech.com/nvidia-geforce-rtx-20-turing-production-end-geforce-rtx-30-ampere-gaming-graphics-card-launch-close/
 

dgrdsv

Member
Oct 25, 2017
12,024
Letting AMD go first wouldn't be terrible, although there is a slight advantage in going first by reinforcing the perception that Nvidia is always ahead.
When I say "go first" I mean something like what happened between RX480 and GTX1060 - i.e. not that much of allowing to go first as launching more or less simultaneously.
But again, considering that they both usually know what their competitor's upcoming products can be capable of well before the actual launch dates waiting for another month can be fairly pointless - the only thing an IHV can do in a month with a modern GPU is adjust its pricing.

For the 20xx series it was the 2080 and Ti in September, the 2070 in October and the Titan in December.
That was a totally different launch though. NV wasn't threatened anywhere with Turing, and they took their time rolling the cards out due to an excess of Pascal inventory which had accumulated because of a sudden ecoin-mining crash.
Ampere will launch into fierce competition from RDNA2, top to bottom, so going first with "halo" cards - the best you can manage in price/perf or just pure perf - makes sense. All the "Ti" and interim variants will likely come after the competition plays its hand - applies to AMD too, btw.

There were rumors recently that all 2070, 2080 and 2080Ti parts are EOL.
 

MrKlaw

Member
Oct 25, 2017
33,246
DLSS is great but it's not free.

- it runs in serial, so the CUs are stalled while it's running. This is OK because it's really fast at doing what it does - but it's still a few ms per frame out of your budget
- the tensor cores also take up silicon area, so for any given-sized GPU you could argue you'd get more CUs for the same size chip if you didn't have tensor cores

so in theory, if you had X% more CUs and Y milliseconds extra time per frame, you might go some way towards being able to render more of those pixels natively.

I still think it's a great solution and better than rendering natively, though
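To put made-up numbers on that trade-off (X, Y and the frame times below are pure assumptions, chosen only to show the shape of the comparison):

```python
# Toy trade-off: spend the tensor-core area on more CUs, or keep it
# and use DLSS. All numbers are made up for illustration.
native_ms = 16.0    # frame time rendering natively at the target res
extra_cus = 0.10    # X: suppose the tensor area bought 10% more CUs
dlss_ms   = 2.0     # Y: serial cost of the DLSS pass per frame

more_cus_ms = native_ms / (1 + extra_cus)          # ~14.5 ms
dlss_total  = native_ms * (2 / 3) ** 2 + dlss_ms   # DLSS Quality renders
                                                   # (2/3)^2 = ~44% of the pixels
print(f"{more_cus_ms:.1f} ms vs {dlss_total:.1f} ms")   # 14.5 ms vs 9.1 ms
```

Under these assumptions DLSS wins comfortably, which matches the conclusion above.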
 

tusharngf

Member
Oct 29, 2017
2,288
Lordran
DLSS is great but it's not free.

- it runs in serial, so the CUs are stalled while it's running. This is OK because it's really fast at doing what it does - but it's still a few ms per frame out of your budget
- the tensor cores also take up silicon area, so for any given-sized GPU you could argue you'd get more CUs for the same size chip if you didn't have tensor cores

so in theory, if you had X% more CUs and Y milliseconds extra time per frame, you might go some way towards being able to render more of those pixels natively.

I still think it's a great solution and better than rendering natively, though


Does it increase latency in rendering frames? I don't have a DLSS card, but how's the performance when it has to make up the missing detail?
 

Sabin

Member
Oct 25, 2017
4,682
Good thing I bought a Seasonic 750W Platinum PSU recently, huh.

Above 300W at stock settings is insane.
 

Deleted member 25042

User requested account closure
Banned
Oct 29, 2017
2,077
Isn't RDNA2 supposed to be on TSMC's 7nm+ (or at least an improved version of the 7nm process they used for RDNA1)?
Wouldn't there be a pretty sizeable difference in transistor densities if NV went with Samsung's 8nm?
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Isn't RDNA2 supposed to be on TSMC's 7nm+ (or at least an improved version of the 7nm process they used for RDNA1)?
Wouldn't there be a pretty sizeable difference in transistor densities if NV went with Samsung's 8nm?

It should hopefully help them at least stay more competitive on pure performance, and if the indications about Nvidia's power draw are accurate, it seems Nvidia thinks so as well.

Questions remain on RT performance and whether AMD/MS come up with something that works close to as well as DLSS.
 

MrKlaw

Member
Oct 25, 2017
33,246
Does it increase latency in rendering frames? I don't have a DLSS card, but how's the performance when it has to make up the missing detail?

I think it takes about 3-4ms to handle its part of the frame creation. So if you're running at 60fps/16ms, that leaves you with less time to do the rest of your rendering. But then of course you're only rendering a quarter of the pixels (1080p vs 4K), so that should be a big net saving.
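In frame-budget terms, using the ~3-4 ms figure quoted above (the native frame time here is a hypothetical, not a benchmark):

```python
# Frame-time budget at 60 fps with a fixed DLSS pass. Illustrative only.
budget_ms    = 1000 / 60    # ~16.7 ms per frame
dlss_ms      = 3.5          # ballpark quoted above
native_4k_ms = 20.0         # suppose native 4K misses the budget

perf_mode_ms = native_4k_ms / 4 + dlss_ms   # render 1080p (1/4 the pixels),
                                            # then let DLSS upscale
print(f"{perf_mode_ms:.1f} ms vs {budget_ms:.1f} ms budget")   # 8.5 vs 16.7
```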
 

Jimrpg

Member
Oct 26, 2017
3,280
Pardon me for this random and potentially stupid question; I'm new to a lot of this and am researching as I go along. I'm currently building my first PC and am waiting on Ampere info - release, pricing, etc. - for the GPU. Initially I was going to wait on new AMD CPU info as well, but I decided to go with a Ryzen 3900X, partly because I'm not completely certain about the forward compatibility of the motherboard I'm looking at (the MAG X570 Tomahawk WiFi, ATX, AM4) and partly because I'm getting a little close to my budget. Ideally I'd like to not upgrade again for 5+ years with a 4K60 baseline (barring deals I simply can't pass up). Would I be insane to hope that Ampere GPUs won't cost 1k+ in Australian dollars? I'm currently sitting at a build cost of around 2.7k, and looking at current prices for the RTX 2080 line on PCPartPicker, 1.5k up to 3k has legitimately got me sweating lol.

No one can really predict what's going to happen to the gaming industry in 5 years. I bought my machine, an i5-4690K with a GTX 970, in 2015, and it's fine for 1080p, but at 1440p it'd be chugging. I replaced the 970 with a 1070 when that came out. Even if I'd got the i7 it would be better, but there's a huge gap between the i7-4790K and the latest CPUs. At a guess I'd say you'd be fine if your specs are better than the consoles in general. But with PCs, devs develop with the market in mind, so if you buy a mid-tier card, or your high-end card becomes mid-tier, you'll get that performance. That's why I'm probably going with a 3080 over the 3070 - that little bit extra gives some breathing room in terms of performance.

Another thing - the PS5 is looking like it's going to be around $499, and that is serious value, given that building an equivalent PC would cost at least $1200-$1500.

I'd guess August announce and late September release, but choose the rumor of your choice.

3070/3080/Ti would be the initial three, if it's anything like the RTX 2000 launch.

And to Australians: yes, $1000+; if the 3070 fits under that number I'll be surprised.

I'm pretty sure the 970 was $329, which ended up being AU$450ish for me I think after shipping. The 1070 was $429 and about AU$600. Pretty crazy that the 2070 and 3070 are AU$1000.
 

Jimrpg

Member
Oct 26, 2017
3,280
I'm expecting them to hold off on releasing the top-end consumer cards till AMD launches Navi 2x. So a 3080 and a new Titan seem like the safest bet.
The 3070 possibly too, but with it being based on a different chip now, it may also come later.
The 3090/3080 Ti will probably launch once NV knows where the fastest Navi 21 will land in perf/price.
I'm actually pondering the idea of NV holding off on launches this time and letting AMD go first - but it's more likely that NV will know everything they need to know some months before Navi 2x launches, so it would probably be pointless.
Letting AMD go first wouldn't be terrible, although there is a slight advantage in going first by reinforcing the perception that Nvidia is always ahead.

I think Nvidia would already know the performance and will launch according to their own schedule, and if AMD does end up beating them, well, Nvidia could just launch another card and call it a Super or whatever.
 