Turing is the architecture that pioneered mesh shaders, sampler feedback, DXR acceleration, and variable rate shading, all the DX12 Ultimate features that matter for next-gen gaming, while also boosting machine learning performance considerably. It's far from a bad, overpriced architecture if you can look beyond raw performance in current-gen titles, and that advanced feature set might make it last longer than previous architectures.
The value of Turing will be clear once more games use machine learning and DX12 Ultimate features.
I'd argue the only architecture that is overpriced and bad right now is RDNA1, because it lacks the feature set and machine learning acceleration that are so important for next-generation games. Thankfully it's soon to be replaced by RDNA2.
> Then what was the point of you even bothering to comment about two years being a very long time for an upgrade?

The point obviously is that two years is a very long time to upgrade.
> DLSS 2.0 and beyond will also have a ton of importance, allowing lower Turing cards to decently run intensive games at "high" resolution in the future. Some of the results they shared with a 2060 were mind-blowing. It still was a bit expensive for what it was at launch, but its value has improved.

It's great to have, but it's barely something you can bank on.
> It's great to have, but it's barely something you can bank on.
> The number of games that support it is infinitesimally small compared to the number of releases at large.

They have said the target for DLSS is to not require whitelisting and to be easy to implement (not in 2.0, sadly), and given the big improvements from DLSS 1.0 to the current standard, I can see that becoming a reality for future games at some point.
> It's great to have, but it's barely something you can bank on.
> The number of games that support it is infinitesimally small compared to the number of releases at large.

The 2.0 tech was just introduced in March, I think, so games releasing soon after didn't have time to implement it.
@9:15
He says that, e.g., the 3080 and 3080 Ti (or whatever it's called) use the same chip but with a different memory interface and VRAM amount.
That means the A and non-A chips are replaced by just naming them differently: an A chip would be a 3080 Ti and a non-A chip would be a 3080. The biggest card would then be the 3090.
Naming the biggest model the 3090, and having a 3080 Ti that is no longer the biggest at a lower price, would make sense if they want to sell it.
The 2080 Ti is a synonym for the strongest gaming GPU (Titan excluded). With Ampere they can make another xx80 Ti, and many would associate it with the 2080 Ti.
> It is like we are having the 3070-80-80 Ti rebranded into 80-80 Ti-90.
> So now a 3070 is a 3080, which will let Nvidia start around $699.

So that's how they are going to make a 3080 cost $500, ahhhhhhhh.
> It is like we are having the 3070-80-80 Ti rebranded into 80-80 Ti-90.
> So now a 3070 is a 3080, which will let Nvidia start around $699.

Wouldn't call it a rebrand. I mean, there will most likely still be 70/60/50 cards in addition to that. Maybe you could consider it a shift/rebrand comparing the 80 Ti with the 90, but the rest sounds like it stays the same as in previous generations.
The point obviously is that two years is a very long time to upgrade.
It doesn't mean that everyone is upgrading every two years nor that there's a need to do it every two years.
So the actual question is what is your point exactly?
> Did you happen to look into the used market for a 1080 Ti before buying that? I know it was slim pickings after 2018, though. The 2080 SC is obviously a little better with rasterization and had the added benefit of ray tracing (and now the new image sharpening and integer scaling features that are only on the RTX cards), but $840 is still crazy for what you're getting. In November 2018 I got a really good, used EVGA 1080 Ti for $500 and it's been great since. I chose to go that route since I already had a 1080 in one of my PCs and the normal 2080 was an abysmally bad upgrade for the price. The 2080 Ti to me was just a stop-gap until the real next-gen cards actually hit, which appears to be Ampere.
> Even though the 2080 Ti was way overpriced, I won't hesitate to pick up the Ampere version for up to $1,200, and it should last at least five years in my main gaming PC, and a few years more when that becomes my secondary.

He wanted something brand new to last him for several years, so he wouldn't have to think about it again. It was his first time building a PC. I tried to talk him into getting a new card for around $200 as a stopgap solution, because the rasterization improvement with RTX 2000 simply wasn't there. But while strongly considering a regular 2070 for $500+, it seemed to make more sense to him to go all-in on stepping up to the 2080 instead.
> The 20 series was nowhere near the same level of improvement Nvidia cards had previously, without the massive inflation of cost.

I've bought a 2080, which used to cost the same as a 1080 Ti while being both faster and more future-proof. So you can preach whatever you want, really; the simple fact is, Turing never was a bad investment and it certainly isn't now. Which is the point where this argument started.
> They have said the target for DLSS is to not require whitelisting and to be easy to implement (not in 2.0, sadly), and given the big improvements from DLSS 1.0 to the current standard, I can see that becoming a reality for future games at some point.

I still only see Nvidia-promoted games implementing this, and there are not a lot of Nvidia-promoted games.
I don't think they will. This time Nvidia has to compete with AMD's big Navi plus the new consoles. Nvidia had a stranglehold on the market previously but this cycle they need to be much more competitive. People have choices now. They also aren't going to freely concede a good portion of the market share to AMD.
> As a 1080 Ti owner, I just don't want to have to spend over $1k for a relatively small upgrade. I've bought the Ti variant 3 gens in a row, but the 2080 Ti was just too much for me after the previous 3 were like $700 and not $1200 (without even factoring in the small performance gain).

Same, which is why I waited for the 3080 Ti/3090. With the 24% upgrade we saw from the 1080 Ti to the 2080 Ti, and the 30-50% upgrade we will see from the 2080 Ti to the 3090... I'm super excited.
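For what it's worth, those two generational percentages compound rather than add. A quick sketch of the arithmetic (the 24% figure is the commenter's number for 1080 Ti to 2080 Ti; the 30-50% range is only a rumor for the 3090, not a benchmark):

```python
def compound(uplifts):
    """Multiply out successive generational gains.

    E.g. [0.24, 0.30] means +24% one gen, then +30% the next.
    Returns the total gain over the starting card as a fraction.
    """
    total = 1.0
    for u in uplifts:
        total *= 1.0 + u
    return total - 1.0

# 1080 Ti -> 2080 Ti (+24%), then 2080 Ti -> 3090 (rumored +30% to +50%)
low = compound([0.24, 0.30])   # ~0.61, i.e. ~61% over a 1080 Ti
high = compound([0.24, 0.50])  # ~0.86, i.e. ~86% over a 1080 Ti
print(f"3090 vs 1080 Ti: +{low:.0%} to +{high:.0%}")
```

So if the rumored range holds, a 1080 Ti owner would be looking at roughly a 60-85% jump overall, which is why skipping Turing entirely can pay off.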
> Not only that, Turing also sold pretty badly compared to Pascal.
> We've reached the maximum folks are willing to pay for a GPU; with another price increase, even more would nope out.

Especially considering the global economy, now is not the time to push prices higher.
> As a 1080 Ti owner, I just don't want to have to spend over $1k for a relatively small upgrade. I've bought the Ti variant 3 gens in a row, but the 2080 Ti was just too much for me after the previous 3 were like $700 and not $1200 (without even factoring in the small performance gain).

You may even be able to hang onto it through the 3000 series, depending on what games you play and at what resolutions.
> You may even be able to hang onto it through the 3000 series, depending on what games you play and at what resolutions.

It's already bothering me that I've had it for nearly four years; no way I can hang onto it past the end of this year.
> Graphics card marketing is so labyrinthine.
> Really just want them to announce already, geeeez.

Yeah, it's going to be a long two months until they are announced in August, lol.
> I've bought a 2080, which used to cost the same as a 1080 Ti while being both faster and more future-proof. So you can preach whatever you want, really; the simple fact is, Turing never was a bad investment and it certainly isn't now. Which is the point where this argument started.

"Turing certainly isn't a bad investment now." ROFL. Yeah, never mind that the PS5/XBSX are putting GPU + CPU combos in their machines that will come close to matching the performance of a 2080 Super, with ray tracing capabilities plus a modern (cut-down) Zen 2 processor, at a cost of around $200-250.
At everything, when compared to all other options.
I'm still waiting on you providing an example of a better investment than Turing during the last two years.
> I'm still waiting on you providing an example of a better investment than Turing during the last two years.

Cool, yeah, you do that. *thumbsup*
> More future proof at what, the hope that more games implement DLSS 2.0? The regular rasterization of the 2080 wasn't even equal to a 1080 Ti at launch in many games, and it took almost a year for it to actually have an insignificant lead in the majority of games.

This revisionist history on first-gen Turing has to stop. Now.
This revisionist history on first-gen Turing has to stop. Now.
Techpowerup 2080 FE Review. 20+ games tested.
Please go through each game's graphs, and count how many in which the 1080 Ti has the advantage, at any resolution. I'll wait for you to report back on your findings, as I already did so and counted.
Please, no pivots.
> More future proof at what, the hope that more games implement DLSS 2.0? The regular rasterization of the 2080 wasn't even equal to a 1080 Ti at launch in many games, and it took almost a year for it to actually have an insignificant lead in the majority of games. Do you think future ray-traced games are somehow going to become more optimized, and that the 2080 won't struggle to run them above a medium setting at 1080p? I guess if you're fine staying at a locked 30fps, or don't mind constant drops into the mid-40s at 1080p, it's great.
> Turing was absolutely a bad investment, even more so if the current benchmarks aren't even of the flagship card. Even a 2080 Ti couldn't do a locked 60fps on a mix of Very High at 4K in more demanding games, and you also had to play at 1080p to get decent ray tracing performance. Like, what the fuck? It's a card where you have to choose between a higher resolution and graphical options, or ray tracing, but can't have both without DLSS 2.0 being implemented.
> If all someone cared about was very high frame rates, regardless of the resolution, then maybe that 25% advantage of the 2080 Ti over the 1080 Ti (or 2080/2080S) would be considered a good value for some, but that's not future-proofing, because you're buying it for the current frame rate advantage it could give. You'll still be stuck with a card that makes you choose ray tracing over resolution and graphics, though, and that's the flagship.
> Ampere looks like it's going to be closer to the jump Turing should have been for the cost. Actually more, but still closer than what Turing offered.

Please stop making stuff up. It's really annoying to argue with someone who just makes stuff up to strengthen their argument. It doesn't help that it takes five seconds of Google to prove you wrong.
> The possible memory size and minor performance edge in most current games could be offset, or go completely the other way, as the 3070-like Ampere cards are rumored to have significantly improved ray tracing capability in games that use it, and more games probably will as the next generation gets rolling. This is a terrible time to buy a 2080 Ti, or any Turing card really.
> IMO, buying a 2080 Ti for more than a 3070 is probably going to be about as good of a decision as buying a 780 Ti when the 970 was close to coming out, or after. It's going to be interesting to see how aggressive clearance sales get.
> If you can get that much, I would absolutely go for it; it would go a long way toward buying a 3080 Ti, or probably cover the cost of a 3080 with money left over.

Welp, I sold it for $980. Not bad, since I paid $1,070 for it 16 months ago. Combined with what I've saved these last few months, I'll be going all-in on whatever the Ti equivalent happens to be. Might even go for a fancy model this time.
That's the level of discussion I've come to expect from people like you.
> The main reason I decided to not get a new 2080 over a used 1080 Ti was precisely because the 2080 wasn't even matching the 1080 Ti.

You've chosen a used 1080 Ti because it was dirt cheap at the end of the GPU mining boom back in 2018. There were no other reasons; don't kid yourself. Newsflash: there is no mining boom now, and you won't get a dirt-cheap 2080 Ti this time.
> That's the level of discussion I've come to expect from people like you.

The discussion is already on this page. Maybe open your eyes and try to read and understand the things that people post.
> Man, I've had my 970 for longer than that, imagine how I feel!

I've upgraded twice since I had my 970s (980 Ti, then 1080 Ti); I can't imagine how nice it'll feel for you to see such a huge jump!
Same. I just received my new components (i7-10700, 32GB RAM, and a 1TB NVMe SSD), but I'm waiting for the next GeForce to replace my 970. I was motivated to upgrade because I want to run DCS World on my Pimax 5K VR headset.
I really hope the next GeForce cards will not have ridiculous prices. I miss having a high-end GPU for 350€. I'm prepared to put in about 700€ tops, but I'm afraid they will start at 800 or some shit.
What's the performance uplift like? I've been wondering what a cutting-edge rig with an older, weaker graphics card would actually be like.
It's better, just not total-solution better. I went from a 4670K to a Ryzen 3600 and overall the framerate is clearly better; certainly Windows and everything else is great. But the 970 just sometimes gets into full-chop situations where the framerate can tank, and it really struggles at 1440p. And I've mentioned this before, but some games (AC: Odyssey, for example) at 1440p will go past the 3.5GB soft limit on the 970, which further tanks performance.
> Interesting. My current rig (from 2013, outside of the graphics card) is a 970 with an i5-3570K, 8GB of DDR3, and a SanDisk SATA SSD. Last Black Friday, when there was an eBay Plus sale, I considered getting most of what I needed for a new rig (3900X, 32GB of 3600MHz DDR4, an NVMe drive, etc.) and keeping the 970 until I could get a 3080 on sale. I decided the result wouldn't be worth it and I'd be better off waiting a year and going Zen 3 instead.
> Speaking of Zen 3, I was wondering: do you think the larger core counts are going to trickle down to the cheaper CPUs this time? Like, say, a 12-core 4800X when the 3800X was only 8-core?

I went from a 4670K to an 8600K (both with a 1070) and I could really see a jump in framerate, mostly in 1% lows. But yeah, it depends on what you are playing: competitive games like OW are CPU-bound, eye-candy games like SotTR are GPU-bound (mostly). In your case, I think a cheap and easy way to get some extra performance without any hassle is to buy another 8 gigs of RAM; more and more games need a lot of RAM.
> Interesting. My current rig (from 2013, outside of the graphics card) is a 970 with an i5-3570K, 8GB of DDR3, and a SanDisk SATA SSD. Last Black Friday, when there was an eBay Plus sale, I considered getting most of what I needed for a new rig (3900X, 32GB of 3600MHz DDR4, an NVMe drive, etc.) and keeping the 970 until I could get a 3080 on sale. I decided the result wouldn't be worth it and I'd be better off waiting a year and going Zen 3 instead.
> Speaking of Zen 3, I was wondering: do you think the larger core counts are going to trickle down to the cheaper CPUs this time? Like, say, a 12-core 4800X when the 3800X was only 8-core?

My current PC is exactly the same as yours and I'm also waiting for Ryzen 4000 and Nvidia's 3000 series. I'm so ready.