
Nooblet

Member
Oct 25, 2017
13,676
Weirdly, the video is at 24 fps, which really bothered me. Is AMD trying to sabotage how their card performance looks?
They are trying to achieve that Cinematic Feel ™
You really think it looks like TAAU? I think it looks rather 1440p-ish, to the point where we were judging from stills rather than moving footage (which is harder to do with reconstruction).
IIRC it's just a form of TSSAA; it gets rid of jaggies well, but it doesn't really make the image look any crisper than a standard 1440p/TAA image would. Also, their TAA has a lot of very obvious ghosting.
 

ppn7

Member
May 4, 2019
740
This is just absolutely hilarious, what a shit showcase.


Isn't the PS4's GPU literally just a tweaked HD 7850?

I'm still rocking an R9 280X, which is basically a rebranded 7970 GHz, which is but an updated 7970 from 2012. That card has aged like fine wine and still plays most stuff at medium settings at 1080p. I'm due for an upgrade for next gen though.

Me too, I still enjoy my 280X, but it lacks FreeSync and I hope it survives until the end of the year for Ampere.
 

scabobbs

Member
Oct 28, 2017
2,109
If rumours are true and big Navi is somewhere in between the 3080 and 3090/3080Ti for $999, then another expensive GPU generation is ahead of us.
People would still celebrate a "thousand dollar" big Navi, because you'd get +50-60% performance over a 2080Ti for ~20% less cash.
But this would allow the 3080 and a slightly smaller big Navi to be priced around 700-800€, and the even faster 3090/3080Ti could be put above a thousand dollars again.
We'll see, as all rumours should be taken with a grain of salt, but if both AMD and NVIDIA start playing cat and mouse within 10-15% difference margins, as they did this "GPU generation", then another expensive high-end cycle is ahead of us.

The most interesting part for me is Ray Tracing performance and how it is going to compare. There will be many entertaining benchmarks and tech talks to follow.
Where are the rumors big Navi is between the 3080 and 3090?
 
Nov 2, 2017
2,275
I thought the One X was more like a 980 ti
On average a 980Ti is much faster. The GPU in the X1X is more or less a 580 with a bit more bandwidth. There might be games where a 580 can match a 980Ti but these are exceptions.

I mean the 1060 is faster than or equal to a 980

Correct but the 970 is not that much slower than a 980. The Pro GPU is a good bit below a 470 in raw power, which in turn is already below a 970.
 

Dekuman

Member
Oct 27, 2017
19,039
I don't have time to re-watch that DF video on the 1060, but the 1060 also has two variants: 3GB with 1152 CUDA cores and the really awesome 6GB version with 1280.
The 3GB version is really a much lesser card than the 6GB one, but both are branded 1060.
 

dgrdsv

Member
Oct 25, 2017
12,024
Where are the rumors big Navi is between the 3080 and 3090?
Supposedly it comes from NV's expectation of Navi 21 landing above the 3080, forcing them to introduce a 3080Ti/3090 card as well.
I don't see why it can't tbh. It is doubtful that AMD will take the performance crown but they can surely make a GPU which will be capable of competing in the high end.
 

Nothing

Member
Oct 30, 2017
2,095
That's all you can back your claim of Turing being the worst h/w investment with?
Oh so it was a disingenuous question, just like I thought.

Turing was overpriced from the start. Everyone knew it. And now it's getting usurped by Ampere cards that are going to knock them out of the park, and that will also have to be price-competitive with AMD cards and the new consoles. If you don't think that almost anyone who spent $699+ on the first cycle of RTX is going to have a little bit of buyer's remorse come September, then I don't know what to tell you. Other cycles like the 1080 / Ti didn't elicit that feeling, because those cards held much better long-term value. They also didn't cost as much. Whereas here on this very board we have several people looking to replace their RTX 2000 cards already. Yikes.

The new Ampere generation is shaping up to be much better than Turing, and similar to Pascal (1080 / Ti), in terms of long-term value and owner satisfaction, as shown by the numerous performance leaks which you've very clearly been following. The second generation of ray tracing is hitting its stride, plus more games are finally incorporating it. It's all dependent on the price of course. But the gains are there, and this cycle is shaping up to be a much better time to invest for anyone who cares about "the best time to buy" an expensive piece of PC hardware to last them several years.

I have a good buddy who I did a build for back in March 2019. He bought an EVGA 2080 SC Ultra for $840 from Microcenter at the time. We built him a nice rig for around $2200, $3000 with his Alienware ultrawide. Guess how good that GPU investment is looking today, in June 2020, just fifteen months later?

That's fine if you're happy buying an expensive (Turing) card and replacing it every two years. But most consumers see these types of purchasing decisions much differently. When most people buy an expensive graphics card they don't even want to THINK about having to replace it for at least the next 4 years. In this last (Turing) case, spending $800+ didn't even buy owners any extra longevity.
 

Jroc

Member
Jun 9, 2018
6,145
Oh so it was a disingenuous question, just like I thought.

Turing was overpriced from the start. Everyone knew it. And now it's getting usurped by Ampere cards that are going to knock them out of the park, and that will also have to be price-competitive with AMD cards and the new consoles. If you don't think that almost anyone who spent $699+ on the first cycle of RTX is going to have a little bit of buyer's remorse come September, then I don't know what to tell you. Other cycles like the 1080 / Ti didn't elicit that feeling, because those cards held much better long-term value. They also didn't cost as much. Whereas here on this very board we have several people looking to replace their RTX 2000 cards already. Yikes.

The new Ampere generation is shaping up to be much better than Turing, and similar to Pascal (1080 / Ti), in terms of long-term value and owner satisfaction, as shown by the numerous performance leaks which you've very clearly been following. The second generation of ray tracing is hitting its stride, plus more games are finally incorporating it. It's all dependent on the price of course. But the gains are there, and this cycle is shaping up to be a much better time to invest for anyone who cares about "the best time to buy" an expensive piece of PC hardware to last them several years.

I have a good buddy who I did a build for back in March 2019. He bought an EVGA 2080 SC Ultra for $840 from Microcenter at the time. We built him a nice rig for around $2200, $3000 with his Alienware ultrawide. Guess how good that GPU investment is looking today, in June 2020, just fifteen months later?

That's fine if you're happy buying an expensive (Turing) card and replacing it every two years. But most consumers see these types of purchasing decisions much differently. When most people buy an expensive graphics card they don't even want to THINK about having to replace it for at least the next 4 years. In this last (Turing) case, spending $800+ didn't even buy owners any extra longevity.

There's definitely a subset of people known as the "just buy it" demographic who upgrade annually for single-digit gains. They don't really care about pricing or value across models. If the obscenely priced RTX 2080 Ti was able to outperform the old GTX 1080 Ti, then that was good enough for them. The same kind of people who jumped from the 9900K to the 10900K the second it hit the market.

I guess you could argue that the 1000 series was abnormally good value and that the more mediocre 2000 series was a return to form, but at the end of the day that still makes the 2000 series relatively worse than the 1000 series. The GTX 1070 was a GTX 980 Ti with more VRAM, whereas the RTX 2070 was a weaker GTX 1080 Ti with less VRAM.
 

Nooblet

Member
Oct 25, 2017
13,676
Supposedly it comes from NV's expectation of Navi 21 landing above the 3080, forcing them to introduce a 3080Ti/3090 card as well.
I don't see why it can't tbh. It is doubtful that AMD will take the performance crown but they can surely make a GPU which will be capable of competing in the high end.
I don't understand.
They were always going to have a big card that was going to sit well above 3080, they've always done that for years now.

Unless you are saying that without big Navi sitting above 3080, Nvidia wouldn't have made a Ti card this gen and would've just had a large gap between 3080 and Titan. But then rumours point to the 3090 being the titan equivalent in the first place and nothing above that.
 

RedSwirl

Member
Oct 25, 2017
10,102
That's not true.
A Pro is close/around a GTX 970.
No it's not. If a One X is around a 1060, which is like a 970, then a PS4 Pro can't be near a 970.
I thought the One X was more like a 980 ti
I mean the 1060 is more or less equal to a 980

I think Microsoft itself compared the One X to a 980.
 

Vimto

Member
Oct 29, 2017
3,718
I got it from Igor's Lab (ex-Tom's Hardware Germany).
A bit unknown outside of Germany, but good and deep enough in the hardware circles.
If this is to be trusted, then NV will be forced to release the 3090 alongside the 3080, as they will want to keep the performance crown.

Works for me, I want the highest tier day 1, no more staggered bullshit releases.
 

Nothing

Member
Oct 30, 2017
2,095
There's definitely a subset of people known as the "just buy it" demographic who upgrade annually for single-digit gains. They don't really care about pricing or value across models. If the obscenely priced RTX 2080 Ti was able to outperform the old GTX 1080 Ti, then that was good enough for them. The same kind of people who jumped from the 9900K to the 10900K the second it hit the market.

I guess you could argue that the 1000 series was abnormally good value and that the more mediocre 2000 series was a return to form, but at the end of the day that still makes the 2000 series relatively worse than the 1000 series. The GTX 1070 was a GTX 980 Ti with more VRAM, whereas the RTX 2070 was a weaker GTX 1080 Ti with less VRAM.
Of course there are. But those people are very few and far between, and far more prevalent on a tech related thread/forum. I speak in generalizations, and one only needs to check a hardware survey to see the type of upgrade cycle that the majority of people are on.

The people that buy new cards every cycle will continue to do so, and are neither here nor there really in regard to a "best time to purchase", "best value", or "investment longevity" discussion. They are the outliers. They also buy much more frequently, are more profitable customers, and those are the folks that Nvidia is trying to target with the initial wave of (3080+) Ampere releases. But the vast majority of folks aren't upgrading their expensive core PC components every two years.
 

Nothing

Member
Oct 30, 2017
2,095
I would actually love to do a survey of RTX owners and see the collective answers for the number of ray tracing games they played between Sept 2018 and Sept 2020, the estimated hours they spent playing RT games, and have them rate their satisfaction with the enhanced experience.
The last two times I witnessed people discussing this, they kept RT off in games like Battlefield in order to get better fps.
 
Last edited:

Edgar

User requested ban
Banned
Oct 29, 2017
7,180
I would actually love to do a survey of RTX owners and see the collective answers for the number of ray tracing games they played between Sept 2018 and Sept 2020, the estimated hours they spent playing RT games, and have them rate their satisfaction with the enhanced experience.

The last two times I witnessed people discussing this, they kept RT off in games like Battlefield in order to get better fps.
Did you also see the discussion about Control RTX? Or Metro RTX? I played Control to completion and the visual layer that RT adds is very significant to the overall experience; same for Metro Exodus.
 

icecold1983

Banned
Nov 3, 2017
4,243
You really think it looks like TAAU? I think it looks rather 1440p-ish, to the point where we were judging from stills rather than moving footage (which is harder to do with reconstruction).
It looks a lot cleaner to me than a standard 1440p->4k upscale. Went back and watched the earlier DF vids on DLSS 1.0 and there is definitely a difference.
 

dgrdsv

Member
Oct 25, 2017
12,024
Turing was overpriced from the start. Everyone knew it.
So it's a "common knowledge" argument then.

And now, it's getting usurped by Ampere cards that are going to knock them out of the park.
Just like every generation of GPUs being "knocked out of the park" by the next generation of GPUs. Gotcha.

If you don't think that almost anyone whom spent $699+ on the first cycle of RTX isn't going to have a little bit of buyer's remorse come September then I don't know what to tell you.
Why RTX specifically? Were there no cards at $699 previously, ever?

That's fine if you're happy buying an expensive (Turing) card and replacing it every two years.
Yeah, I'm going out on a limb here and saying that EVERYONE WILL BE FINE with that. Two years is a very long time in PC upgrades. Turing wasn't "short lived" at all. And it won't suddenly turn into trash with the launch of either Ampere or RDNA2. In fact, it will still be pretty much the only h/w currently available capable of running next-gen engines with next-gen features.

But I digress. You've said and I quote:
If this does indeed happen then it's going to retrospectively turn the previous RTX 2000 lineup into one of the worst pc hardware investments of all time.
This means that you can provide us with examples of GPU h/w investments which were better than Turing during the last two years. So please do this.

I don't understand.
They were always going to have a big card that was going to sit well above 3080, they've always done that for years now.
For what years? They haven't been able to beat either the 1080 or the 2080. This means that since 2016 AMD hasn't been able to compete above the x70 performance tier. If RDNA2 is able to, it will be a huge change for the market, the first one in five years.

Unless you are saying that without big Navi sitting above 3080, Nvidia wouldn't have made a Ti card this gen and would've just had a large gap between 3080 and Titan. But then rumours point to the 3090 being the titan equivalent in the first place and nothing above that.
It's possible that NV would've gone with a 3080+Titan approach, as they have done pretty much always since the 600 series.
But it's even more possible that they simply wouldn't have held the "3080Ti/3090" card back and would have launched it straight away alongside the 3080. Them waiting for Navi 21 means that they are planning on adjusting the pricing of this card according to Navi 21's price and performance. Hence why they seem to be expecting Navi 21 to be faster than the 3080 (or the 3070? We don't know much about the lineup yet).
 

Nooblet

Member
Oct 25, 2017
13,676
For what years? They haven't been able to beat either the 1080 or the 2080. This means that since 2016 AMD hasn't been able to compete above the x70 performance tier. If RDNA2 is able to, it will be a huge change for the market, the first one in five years.


It's possible that NV would've gone with a 3080+Titan approach, as they have done pretty much always since the 600 series. But it's even more possible that they simply wouldn't have held the "3080Ti/3090" card back and would have launched it straight away alongside the 3080. Them waiting for Navi 21 means that they are planning on adjusting the pricing of this card according to Navi 21's price and performance. Hence why they seem to be expecting Navi 21 to be faster than the 3080 (or the 3070? We don't know much about the lineup yet).
I was talking about Nvidia.
They've always had a card above the x80 series, be it a Titan or a Ti, for a long time now.

But yeah, launching the Ti equivalent at once would be something Nvidia would do if big Navi sits above the 3080... and I think that's a good thing.
 

Azai

Member
Jun 10, 2020
4,000
I don't know how reliable this "leak" is, but this is supposedly a "leak" of the RTX 3080 Ti spec sheet.


400-A1 ?

So we have binned chips again?
I don't remember exactly, but I think Igor said something about them getting rid of that and now doing something different in that regard.
 

Azai

Member
Jun 10, 2020
4,000


@9:15

He says that, e.g., the 3080 and 3080 Ti (or whatever it's called) use the same chip but with a different memory interface and VRAM.
That means the A and non-A chips are replaced by just naming them differently: an A chip would be a 3080 Ti and a non-A chip would be a 3080. The biggest card would then be the 3090.

Naming the biggest model the 3090 and selling a 3080 Ti that isn't the biggest anymore for a lower price would make sense if they want to sell it.
The 2080 Ti is synonymous with the strongest gaming GPU (Titan excluded). With Ampere they can make another xx80 Ti and many would associate it with the 2080 Ti.
 

dgrdsv

Member
Oct 25, 2017
12,024
So we have binned chips again?
...There was a time when we didn't?..

I don't remember exactly, but I think Igor said something about them getting rid of that and now doing something different in that regard.
A1 is the silicon revision. You're thinking about the XXXA/XXX split which Turing had. That was basically done away with at the launch of the 20 Super series.
But again, all chips are always "binned". The fact that NV decided to mark some bin type on TU chips themselves doesn't mean that GPUs weren't binned previously or won't be binned in the future.
 

Sanctuary

Member
Oct 27, 2017
14,250
I have a good buddy who I did a build for back in March 2019. He bought an EVGA 2080 SC Ultra for $840 from Microcenter at the time. We built him a nice rig for around $2200, $3000 with his Alienware ultrawide. Guess how good that GPU investment is looking today, in June 2020, just fifteen months later?

That's fine if you're happy buying an expensive (Turing) card and replacing it every two years. But most consumers see these types of purchasing decisions much differently. When most people buy an expensive graphics card they don't even want to THINK about having to replace it for at least the next 4 years. In this last (Turing) case, spending $800+ didn't even buy owners any extra longevity.

Did you happen to look into the used market for a 1080 Ti before buying that? I know it was slim pickings after 2018 though. The 2080 SC is obviously a little better at rasterization and had the added benefit of ray tracing (and now the new image sharpening and integer scaling features that are only on the RTX cards), but $840 is still crazy for what you're getting. In November 2018 I got a really good used EVGA 1080 Ti for $500 and it's been great since. I chose to go that route since I already had a 1080 in one of my PCs and the regular 2080 was an abysmally bad upgrade for the price. The 2080 Ti to me was just a stop-gap until the real next-gen cards actually hit, which appears to be Ampere.

Even though the 2080 Ti was way overpriced, I won't hesitate to pick up the Ampere version for up to $1,200 and it should last at least five years in my main gaming PC, and a few years more when that becomes my secondary.

If this is to be trusted, then NV will be forced to release the 3090 alongside the 3080, as they will want to keep the performance crown.

Works for me, I want the highest tier day 1, no more staggered bullshit releases.

Do they always intentionally stagger it just to force people to buy the release cards at launch, only to go "Aha, gotcha!" a few months later? I always assumed it was more of a manufacturing issue than anything: get the cards out that more people are likely to buy, and then start producing the high-end enthusiast version. A day-one flagship is what I would want too. Otherwise I'm going to gamble and use Step-Up, buying whatever the best card is at release (getting one in November, that is), and then hoping the next card is out within ninety days.

Did you also see the discussion about Control RTX? Or Metro RTX? I played Control to completion and the visual layer that RT adds is very significant to the overall experience; same for Metro Exodus.

I have Exodus, but have been waiting to play it. Also will grab Control after I get a new card. Would rather play at 1440p with ray tracing, without it ever going below 60fps, and higher if possible.

Yeah, I'm going out on a limb here and saying that EVERYONE WILL BE FINE with that. Two years is a very long time in PC upgrades. Turing wasn't "short lived" at all. And it won't suddenly turn into trash with the launch of either Ampere or RDNA2. In fact, it will still be pretty much the only h/w currently available capable of running next-gen engines with next-gen features.

Yeah, no. Unless you are always chasing the bleeding edge, it's really not. More like three years is when it's time to start looking for an upgrade, if you were using a mid-to-high-end card and are no longer getting what you consider acceptable performance. I have never done a two-year cycle, aside from the outlier when I went right from a launch 5850 to a launch GTX 580. Otherwise, there has never been a need for me to upgrade any sooner. This was also during a period, though, where most of the better GPUs would eat 1080p for breakfast and were also very competent at 1440p. Since going 4K, there's been a slight paradigm shift where GPUs have all been lagging behind. Look at any random upgrade survey outside of the extreme enthusiast bubble, and most will post 3-5 years, with a select few saying two.
 
Last edited:

Bosch

Banned
May 15, 2019
3,680


@9:15

He says that, e.g., the 3080 and 3080 Ti (or whatever it's called) use the same chip but with a different memory interface and VRAM.
That means the A and non-A chips are replaced by just naming them differently: an A chip would be a 3080 Ti and a non-A chip would be a 3080. The biggest card would then be the 3090.

Naming the biggest model the 3090 and selling a 3080 Ti that isn't the biggest anymore for a lower price would make sense if they want to sell it.
The 2080 Ti is synonymous with the strongest gaming GPU (Titan excluded). With Ampere they can make another xx80 Ti and many would associate it with the 2080 Ti.

It's like the 3070/3080/3080 Ti are being rebranded into the 3080/3080 Ti/3090.

So now a 3070 is a 3080, which lets Nvidia start pricing around $699.
 

Vimto

Member
Oct 29, 2017
3,718
Do they always intentionally stagger it just to force people to buy the release cards at launch, only to go "Aha, gotcha!" a few months later? I always assumed it was more of a manufacturing issue than anything: get the cards out that more people are likely to buy, and then start producing the high-end enthusiast version. A day-one flagship is what I would want too. Otherwise I'm going to gamble and use Step-Up, buying whatever the best card is at release (getting one in November, that is), and then hoping the next card is out within ninety days.

I don't know if it's intentional or not. But yeah, the Ti model always comes months after the x80 launch. Turing was the only exception I know of.
 

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223

Turing is the architecture that pioneered mesh shaders, sampler feedback, DXR acceleration and variable rate shading, all the DX12 Ultimate features that are important for next-gen gaming, as well as boosting machine learning performance considerably. It's far from a bad, overpriced architecture if you can look beyond the raw performance in current-gen titles, and it might last longer than previous architectures because of that advanced feature set.

The value of Turing will become clear once more games use machine learning and DX12 Ultimate features.

I'd argue the only architecture that is overpriced and bad right now is RDNA1, because it lacks the feature set and machine learning acceleration that is so important for next-generation games. Thankfully it's soon to be replaced by RDNA2.
 
Last edited:

Sanctuary

Member
Oct 27, 2017
14,250
There won't be any need to upgrade from Turing after Ampere launch either. So my point stands.

Then what was the point of you even bothering to comment about two years being a very long time for an upgrade? If most people don't actually upgrade until they need to, then two years is all relative. In terms of tech advancements you might have a point, but that's not what you said.

You are also in a different argument about longevity, which the 20 series lost in a big way compared to previous gen-to-gen improvements. All for the RTX early adopter tax.
 