
Burai

Member
Oct 27, 2017
2,091
I wouldn't be surprised if people buy them quickly at that price so there would probably never be a chance for me to get one.

The 3060 will still have a few advantages over a used 2080Ti. It won't have been used for hundreds of hours, you'll get a warranty, it'll use less power and it'll come in a variety of form factors so there will be options for all case sizes.

Plus, we really don't know how aggressively Nvidia will respond to whatever AMD unveils.

I feel pretty comfortable not rushing out and committing to anything just yet.
 

BigTnaples

Member
Oct 30, 2017
1,752
My next rig is going to get a 3090.

8K resolution here we come!

@24 fps


We've made it.

Now I just wish 8K TVs would come down in price a bit sooner. My 65" 4K Samsung is feeling long in the tooth, especially after getting my PG35VQ with its 1000-nit HDR and 200Hz G-Sync.

I want an 80-85 inch LED 8K Samsung with HDMI 2.1 and 120Hz to be available. And not for $10,000.

I'd go for it for $4K and below
 

Metalmucil

Member
Aug 17, 2019
1,382
It's a bit on the low side, but I think those cards will be squarely targeting 1080p HFR and 1440p resolutions. 6GB should be acceptable (though not stellar) up to 1440p.
Honestly at the midrange and cheaper, AMD will probably have much more compelling options at competitive prices.
I'm gonna say no. I have a 4GB card now and I'm running into problems often enough at 1080p. You need at least 8GB with what the new consoles have in them. Consoles are the baseline; otherwise you're in the "2GB will be fine!" realm from when last gen transitioned. I'll cross my fingers for an 8GB variant.
 

Timu

Member
Oct 25, 2017
15,613
The 3060 will still have a few advantages over a used 2080Ti. It won't have been used for hundreds of hours, you'll get a warranty, it'll use less power and it'll come in a variety of form factors so there will be options for all case sizes.

Plus, we really don't know how aggressively Nvidia will respond to whatever AMD unveils.

I feel pretty comfortable not rushing out and committing to anything just yet.
I wonder how the 3060 would compare in ray tracing performance as well.
 
Jun 1, 2018
4,523
So the 30 TFLOPS are the peak of what the card can provide, with all FP32 cores running all the time. That's 8704 FP32 cores * 1.71GHz * 2 (because they can do two ops per cycle), and it ends up being 29.767 TFLOPS.
But the other things around those FP32 units haven't changed, so the units will sit idle for longer periods, waiting for other parts to finish before they can compute.
In the end it's still an increase: comparing the 2080 Ti and 3080, which have the same number of SMs, the 3080 is around 30% faster, at least going by leaks. So every TFLOP in Ampere is worth around 65% of a Turing TFLOP.
Thanks for the explanation, so I guess I should just stick with my 2080 Ti for now?
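A quick way to sanity-check that arithmetic (a sketch in Python; the 2080 Ti figures are its reference spec of 4352 cores at a 1.545GHz boost, and the ~30% uplift is the leak-based estimate from the quote, not an official number):

```python
# Peak FP32 throughput = cores * clock (GHz) * 2 ops/cycle (an FMA counts as two).
def peak_tflops(fp32_cores, boost_ghz):
    return fp32_cores * boost_ghz * 2 / 1000  # cores * GHz -> GFLOPS, /1000 -> TFLOPS

tflops_3080 = peak_tflops(8704, 1.71)      # ~29.77, matching the quote
tflops_2080ti = peak_tflops(4352, 1.545)   # ~13.45 at reference boost

# If the 3080 is only ~30% faster in real games (per the leaks), each
# Ampere TFLOP is delivering roughly this fraction of a Turing TFLOP:
print(f"3080: {tflops_3080:.2f} TF, 2080 Ti: {tflops_2080ti:.2f} TF")
print(f"per-TFLOP efficiency vs Turing: {1.30 * tflops_2080ti / tflops_3080:.0%}")
# ~59% -- in the same ballpark as the poster's "around 65%" estimate.
```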
 

CheeseWraith

Member
Oct 28, 2017
618
My current plan is to get the best 3070 in terms of silence/price ratio.

I'd love to go for a 3080 from NVIDIA, the design is awesome...but apparently I'd need to also upgrade my PSU. My current PSU (Seasonic mII evo 520W) is barely enough to cover 100% load on the system with the 3080, while the 3070 fits in my current power envelope without any problems.

The upside: since I also need to wait for October, I can hope AMD will unveil their answer.

That said, I really like DLSS and its intrinsically evolving nature. Tensor cores will be useful and adaptable for future iterations of any kind of AI-based software, and I can't see AMD deploying that kind of feature at the moment.

Decisions, decisions...
 

Flandy

Community Resettler
Member
Oct 25, 2017
3,445
I really didn't see a reason to upgrade from my 1070. It seemed as if they wanted to get RTX out the door and be done with it.
I mean, it's like twice as powerful and should handle next-gen stuff much better than a Pascal card. Not to mention it'll be very good for those of us targeting either high frame rates or high resolutions. Not hard to see why Pascal owners would want one lol
 

kami_sama

Member
Oct 26, 2017
7,021
I mean, it's like twice as powerful and should handle next-gen stuff much better than a Pascal card. Not to mention it'll be very good for those of us targeting either high frame rates or high resolutions. Not hard to see why Pascal owners would want one lol
We talking about Turing or Ampere? Turing was middling at best, Ampere is a must-buy, at least for me.
 

brain_stew

Member
Oct 30, 2017
4,736




So 16GB 3070 Ti and 20GB 3080 Ti.

Edit:
NVIDIA GeForce RTX 3070 Ti spotted with 16GB GDDR6 memory - VideoCardz.com
videocardz.com

Lenovo has confirmed that their Legion T7 gaming desktop will ship with GeForce RTX 3080 and GeForce RTX 3070 graphics cards. Interestingly, Lenovo also confirmed that the Legion T7 will feature the GeForce RTX 3070 Ti 16GB...


No, don't do this, there's finally a card I can buy without buyer's remorse.

GDDR6 would be a little disappointing for a Ti card; moving to 6X feels like an easy performance gain.
 
Oct 27, 2017
1,078
Yea, I'm sticking with the 3090 as my purchase. Those Ti/Super variants should be interesting; I expect them to be released as a response to AMD next year.
 

Serene

Community Resettler
Member
Oct 25, 2017
52,558
Debating camping out in front of Microcenter on the 17th. lol

🤔🤔🤔
 

LordRuyn

Member
Oct 29, 2017
3,910




So 16GB 3070 Ti and 20GB 3080 Ti.

Edit:
NVIDIA GeForce RTX 3070 Ti spotted with 16GB GDDR6 memory - VideoCardz.com
videocardz.com

Lenovo has confirmed that their Legion T7 gaming desktop will ship with GeForce RTX 3080 and GeForce RTX 3070 graphics cards. Interestingly, Lenovo also confirmed that the Legion T7 will feature the GeForce RTX 3070 Ti 16GB...

Goddammit, just as I was settling on buying the 3080 at the end of the year...
 

GlacialTruffle

One Winged Slayer
Member
Oct 25, 2017
572
They've already announced some cards with 2x HDMI 2.1 on the spec sheet. Forgot who.
Those were leaks before. All the actual released specs have 1 HDMI port.

edit: I am full of shit, sorry!

This gigabyte card clearly has 3 HDMI ports:
[image: Gigabyte RTX 30 card rear I/O showing 3 HDMI ports]
100%. Custom AIBs have already shown us that some of them have added an extra HDMI 2.1 port.
Excellent. Thank you!
 
Last edited:

Darktalon

Member
Oct 27, 2017
3,267
Kansas
cool tyvm

no measurement on the width tho. do we know if any of these cards require 3x 8-pins?
Yes, some of the custom AIB cards do; specifically, from your picture, the EVGA FTW3 is using 3x 8-pin.
So I don't see NVLink support for the 3070 or 3080. Am I gonna need to get 2 3090s? That would be ridiculous...
Only the 3090 supports NVLink this gen. Jensen doesn't want you combining the crazy value of two 3080s. (Tbf, most games don't work with SLI nowadays.)
Any educated guesses on the performance difference between the 3080 and 3090? From the specs it seems the difference isn't huge outside of the VRAM size. About 20%. If that reflects real-life performance, I don't see how it's worth the extra money. I was mentally prepared to buy the 3090, but the more math I do, the more skeptical I become.
For double the price, it has to be at least 30%. Has to be.
Just doing the math shows that the 3090 (outside of 8K, VRAM usage, and bandwidth constraints) is 20% faster. So it can only be at best 20%, likely a little less. But hey, you get to pay $800 more than the 3080! It's not good value for gamers.
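The napkin math behind that 20% figure, for anyone checking (spec-sheet core counts and boost clocks, with the $699/$1499 prices from the announcement; real scaling usually lands a bit below the theoretical number):

```python
# Theoretical FP32 uplift and value, 3090 vs 3080, from the spec sheets.
tflops_3080 = 8704 * 1.71 * 2 / 1000    # ~29.8 TF at $699
tflops_3090 = 10496 * 1.70 * 2 / 1000   # ~35.7 TF at $1499

print(f"uplift: {tflops_3090 / tflops_3080 - 1:.1%}")          # ~19.9%
print(f"TF per dollar vs 3080: "
      f"{(tflops_3090 / 1499) / (tflops_3080 / 699):.2f}x")    # ~0.56x
```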


what in the absolute fuck

I will have nightmares now thank you

My main takeaway from this is that they've increased all their prices. 70 series bumped up to where 80 series was, and 80 series slightly higher than where Ti was
I don't see how, when they're offering 2080 Ti performance for $500, you see this as a bump in prices. You gotta stop focusing on the names and focus on the performance per dollar.
When I saw how they were handling heat during the presentation, I facepalmed. That is going to be a problem.
Your gpu dumps hot air into your case. This isn't really different.
No, RTX Titan is slightly faster than a 2080 Ti, and the 3090 is far faster than a 2080 Ti.
The "starting at $699" part is what worries me. I remember them using some sort similar verbage with the 2000 series unveiling.
It is confirmed on Nvidia's website that Founders Editions are priced at MSRP.
As much as I want the OMG BEST right now, which would be pairing the 3090 with my still-perfectly-fine 9900K and 32GB RAM setup... instead of dropping what I can only assume will be $3,000+ AUD on a single item, I will upgrade my monitor to a 4K/high-refresh-rate ultrawide and keep ticking with my Asus 2080 Ti OC card.

(or maybe will just buy the 3090 anyway...).
Why not get the 3080? Still a really large upgrade from 2080 Ti, and it isn't kicking you in the nuts with the price.
I'm assuming the 3090 will fondle my balls while I play games if it's $1500.
Only if you don't mind fan blades.
Isn't that an exponential scale? It's a great card, but this seems disingenuous lol.
Uh what? How do you see that as exponential? They didn't even start from higher numbers this time, these are the most honest Nvidia's graphs have EVER been.
Should I be worried about my pcie3 setup?
Is PCIe 4.0 necessary for the 3090?
No, all benchmarks and slides were using a PCIe 3 system. PCIe 4 may provide some benefit, but we assume that's in the range of 1-5%. Not something to lose sleep over, or to upgrade your whole computer for before you're ready.
I have a question about DLSS that is kinda key in deciding which card to buy, if anyone would be so kind.

I have an LG B9 which only supports G-Sync in the range of 40-60fps on 4k resolution, and 40-120fps on 1440p. My question is, if I play a game in 1440p with DLSS reconstructing the image to 4k, what is my actual resolution in regards to the triggering of the G-Sync? Would I be able to play games in the 40-120fps sync range while the image is being reconstructed to 4k?
You will set your game resolution to 1440p 120Hz or 4K 60Hz, and then touch the DLSS settings separately. And yes, you can always use DSR to supersample (DLSS same thing), but if you want 120Hz, it's going to be supersampled down from 4K to 1440p. Your monitor doesn't give a shit what the internal resolution is, only the frame presented to it.
Do we know why the 3090 is more than twice the price of the 3080? Why so much? Not that it matters much with the specs on the 70 and 80, but still.
It is a card for prosumers who are upgrading from RTX Titans. It's not good value as a gaming card. It's at BEST 20% more performance, starting at $800 more. AIB 3090s are gonna be even more than $1500.
Also just noticed the number of inputs....damn

2 x HDMI 2.1
3 x DP 1.4a

5?!? Makes my TV seem old
The reference board has only 1x HDMI 2.1; some custom boards are adding an extra.
I usually upgrade every 2 generations. Pascal-Ampere seems like a solid upgrade path.
Jensen literally called you out in the video and said it was ok for you to now upgrade.
I just found out that we're getting HDR Shadowplay finally too. Nvidia not holding back.
Nice!
So RTX 3070 is 20 teraflops? Is this TRUE?
Of course this is true, unless we are all just a simulation, or a hallucination of an alien.
To every one of y'all that sold their Turing cards just before the conference, I tip my hat. That was quite the heist!
Thank you! I feel so good that I sold it when I did ^_^
Unless the 3000 cards are instantly sold out for months.
Ye of little faith.
Agreed. I had a good experience with EVGA when I had to RMA a 780 Ti, so I'm inclined to stay with them. Hopefully, they allow you to turn off all the RGB shit on the FTW3.
I was able to turn off the RGB shit on 2080Ti FTW3, I have no reason to expect you can't this time.
Hmmmm, interesting. EVGA's RTX 3090 is a two-slot card instead of a three-slot design like the FE RTX 3090:

EVGA - Articles - EVGA GeForce RTX™ 30 Series
www.evga.com

The EVGA GeForce RTX™ 30 Series are the absolute definition of ultimate performance.
Only the reference board. The FTW3 is a 2.75-slot beast. And the reason for this is: one, they want you to pay more for the better FTW3, and two, some people can't fit more than a 2-slot card in their case but still want a 3090, without having to go the watercooling route.
What are the chances 3rd party cards could have more than one HDMI 2.1 port?
100%. Custom AIBs have already shown us that some of them have added an extra HDMI 2.1 port.
so....the $499 RTX 3070 is more powerful than the graphics card in 98% of people's PCs right now?
Going off steam, only 0.88% of people had a 2080 Ti.
So yeah $500 buys you more gaming power than 99% of computers currently have. What a time to be alive :)
Just saw this video. A lot of the info goes a bit over my head, but am I right in assuming he's claiming that the non-RT performance gain from the 2080 Ti to the 3080 will be minimal?

This guy is fucking insane and full of shit. Watch Digital Foundry if you want some real information.
Are benchmarks out? Don't buy just based on what Nvidia say. It's marketing bullshit. Wait for benchmarks and see what they say.
Digital Foundry has a limited benchmark out right now for the 3080. Watch the video, it's good!
After some late night considering, I may go for a 3080 over a 3090

I mainly want to play at 4K 120hz, so the cheapest card that can get me close to that or on it would be the choice

My current 2070 super doesn't compare to even the 3080 so that's that
Having a difficult time deciding
Don't waste your money on the 3090, it is at BEST 20% faster than a 3080 at 4k. Not worth paying an extra $800+, you could be getting a PS5 or a new monitor, or half of an LG OLED with that kinda money. Or a whole lotta games!
Gotta make some space for the 3080 Ti.

Probably ~9500 cores and 16GB VRAM, at £999 or £1,099 (the 3080 is £649 and the 3090 £1,499).
This is not possible without completely fucking up the bandwidth of the card. It can only be 10, 11, 12, 20, 22, or 24GB... The 3080 Ti will be a 12GB card, mark my words.
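The reason for that short list is the memory bus: each 32-bit channel takes a 1GB or 2GB GDDR6X chip, so capacity is locked to multiples of the channel count. A sketch, assuming the 320/352/384-bit bus widths in play on these dies (the 3090 actually doubles up chips in clamshell mode, but the resulting capacities are the same):

```python
# Valid VRAM capacities given a memory bus width and GDDR6X chip density,
# assuming one 32-bit channel per chip (or a clamshell pair acting as one).
bus_widths_bits = [320, 352, 384]   # 3080, 2080 Ti-style cut, 3090
chip_sizes_gb = [1, 2]

capacities = sorted({(bus // 32) * size
                     for bus in bus_widths_bits
                     for size in chip_sizes_gb})
print(capacities)  # [10, 11, 12, 20, 22, 24] -- exactly the list above
```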
Really tempted to go for a 3080 instead of a 3070, but I only have a 650w PSU... Hopefully some brave soul can test the card out on a lower wattage PSU and see if it's fine or not.
Unless you have an overclocked 10900k, 650w is enough.
Is the 3090 double the performance of a 3080 or is the price just stupid?
The price is more than double for 20% (at best) more performance at 4K.
Is that a bad thing?
3x 8-pin means you can feed more power to the card, which is useful for overclocking.
Yeah, it must be some error; at best, DLSS can give you around twice the fps. I don't get how they end up with such a difference.
Since 8K is 4x the pixels of 4K, some games will completely shit the bed at that resolution. And then DLSS makes up for it by being fucking incredible and reconstructing from 1440p up to 8K.
Nvidia are hilarious.

Their timing is hilarious, these cards are announced just before the console launch. They completely shift the bar for performance, when it had felt like the consoles were kinda catching up.

They have also managed to fuck with the public's price perception over the last few years, to the extent that many people are looking at the 3080 as a bargain.

Funny guys!
Yeah, I do feel that 2080 Ti performance on the 3070 for $500 is a bargain. The 3080 has better price/performance than even the 3070, so yeah, I'd consider it an even better bargain. I guess if some people really think RDNA2 is about to drop cards with even better prices, you could think otherwise, but I think they should prepare to be disappointed.
I slept all day because I stayed up all night to watch a dude in his kitchen, any impressions/benchmark vids to watch?
Any benchmarks for 4k?
Lol! Hell no! These cards were just announced and everyone was blindsided.
Watch the Digital Foundry Video
Which is why Series X basing their whole marketing on '12 teraflops' looks very silly now. We got over 30 teraflops here. I think the next-gen consoles have other perks, namely games, the SSD, and I/O.
Of course, in terms of raw horsepower. The differentiating factor will be the consoles' unique storage arrays. They might be faster than what you can currently get on PCs, but I think that's about it. So you might get faster load times on consoles, but once the game is loaded up, I think these GPUs will knock the socks off the PS5 and XSX.
Introducing NVIDIA RTX IO: GPU-Accelerated Storage Technology For The Next Generation of Games
www.nvidia.com

Load instantaneously, experience vast worlds with endless views and rich detail, and further improve gameplay by leveraging the power of GeForce RTX 30 Series graphics cards and NVIDIA RTX IO.
Yup, even the one thing consoles had going for them, Nvidia has them beat.
I'm looking forward to someday having PCIE 4.0 SSD in my system.
Gotta love how they include the easy-to-run games just to make it look like a super beast at 8K...
Then you look at modern games that weren't already easy to run, and you have to think, "OK, what's the fucking point?" Is the memory what's allowing the card to do this, and preventing the 3080 from achieving somewhat similar results? Still a moot point, though, since 8K anything will probably never be relevant to me for the rest of my life.
8K is 4x the pixels of 4K, and 16x those of 1080p. NO game is "easy to run" at 8K. It requires tremendous, tremendous amounts of power and bandwidth.
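The raw pixel counts back this up:

```python
# Pixel counts per resolution, relative to 1080p.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440),
               "4K": (3840, 2160), "8K": (7680, 4320)}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:>10,} px  ({w * h / base:.1f}x 1080p)")
# 8K comes out to ~33.2M pixels: 4x the pixels of 4K, 16x 1080p, so
# fill-rate and bandwidth demands scale accordingly.
```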

3080 sounds good, but am I the only one really disappointed by the VRAM size?
Yeah the VRAM size is TERRIBLE, they are waaaaay too big of cards, and can you believe the PSU requirements? These cards LITERALLY require you to buy a new PC. I think y'all should just back out and wait for 4000 series.
Thank you. Seems like the 3080 is quite a bit more powerful than the 2080 Ti. The difference between the 3080 and 3090 is good, but not as massive outside of the memory. So for the 3080 the only concern is the 10GB of VRAM. With the next-gen systems allocating about 13GB to games, I wonder how this will impact the 3080.
Yeah, 10GB is concerning; more overhead over the consoles, like 16GB, would be ideal... Then again, the consoles' 13GB also includes other things like audio, which can be stored in system memory on PC. Gears 5, for example, has no issue handling ultra textures on my 2080 with 8GB of VRAM, but the Xbox One X with 9GB can't.
The 3080's 10GB of RAM is appalling. It should be at least 16GB. Once the hype blows over and people see the 3080's RAM bottleneck, the majority will go for the 3070 or 3090 for serious gaming, methinks.
There is a public misconception about how much VRAM a game actually uses versus the number you see in MSI Afterburner/RivaTuner, which includes caching.

We can take Flight Simulator 2020 as an example: if you use the developer FPS overlay, it will tell you exactly how much the game is using.
At 4K, with everything set to Ultra and all sliders to the right except supersampling, FS2020 only uses 8GB of VRAM!

Everything above that is just the game caching assets it may or may not need, and it will fill, fill, fill all that room up with potentially useless data. That's good, because unused VRAM is wasted VRAM, but you will not be at a performance disadvantage with a 10GB card in 4K gaming, even in next-gen titles.

TL;DR: 10GB IS PLENTY FOR 4K Next-gen
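If you want to watch that number yourself outside of any one game's overlay, something like this works (a sketch using the pynvml bindings, installable as the nvidia-ml-py package; note it reports memory the driver has *allocated*, i.e. the same cache-inflated figure Afterburner shows, not a game's true working set, which is exactly the distinction above):

```python
# Reads driver-level VRAM allocation -- this includes opportunistic asset
# caches, so it overstates what a game strictly needs, just like Afterburner.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU
info = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM allocated: {info.used / 2**30:.1f} / {info.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```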
 

Skade

Member
Oct 28, 2017
8,875
Well... I was considering upgrading my 1080 Ti to a 2080 Super; guess I'll wait a bit for a more powerful card that costs less. Fantastic!
 

CreepingFear

Banned
Oct 27, 2017
16,766
Ye of little faith.

I was able to turn off the RGB shit on 2080Ti FTW3, I have no reason to expect you can't this time.
Even if they have more supply this time, it's obvious that more people are excited about the 30 series than were about the 20 series. We'll see how it plays out.

I hope you are right about disabling all the RGB. I want the FTW3 for the cooler and little overclock that it provides. I'm not interested in doing my own overclock.
 

Darktalon

Member
Oct 27, 2017
3,267
Kansas
No, even if you use mods for Skyrim you need more VRAM than that. 10GB is def NOT enough for the whole next generation.
Did you read anything I actually said? The number MSI Afterburner reports is not the actual amount of VRAM in use.

No one said "whole next generation". I said next-gen as a descriptor of the type of game Flight Sim 2020 is.

Also, I am not saying this card is good for the next 7 years. Anyone in here who bought a 780 Ti on release day and hasn't upgraded once?

Edit: Honestly, this type of hit-and-run response is exactly why I hate even using a TL;DR
 
Last edited:

Kewlmyc

Avenger
Oct 25, 2017
26,749
Tempting, but my 2070 Super should still hold up for at least a couple of years, in terms of 1080p144 for newer games and 4K60 for older games.
 

Serious Sam

Banned
Oct 27, 2017
4,354
Did you read anything I actually said? The number that MSI Afterburner reports is not the accurate amount of VRAM in use.
It's fairly accurate. Just like with regular RAM, the more you have, the more will be used, and that's a good thing. I really wish people would stop spreading this nonsense that GPU monitoring apps are inaccurate without actually saying what IS accurate.

Why is it that when I play an old game I see VRAM usage of 500-1000MB, but when I play a new game in 4K with ultra textures it's 8000-10000MB? Seems pretty accurate to me.
 

Qassim

Member
Oct 25, 2017
1,532
United Kingdom
The 3090 release is so clever.

People like me who usually buy the *80Ti cards don't really want to buy the 3080 despite how good it is, because we want the bigger chip (I don't care as much for the 24GB of VRAM).

The 3090 isn't actually that much more expensive than the 2080 Ti was (at least here in the UK), which reduces the impact of the high price.

I'm actually considering a 3090. I can't justify it, it's really not worth the huge extra cost over the 3080 (for me) - but I think I've already convinced myself to get it.

It's fairly accurate. Just like with regular RAM, the more you have, the more will be used, and that's a good thing. I really wish people would stop spreading this nonsense that GPU monitoring apps are inaccurate without actually saying what IS accurate.

Why is it that when I play an old game I see VRAM usage of 500-1000MB, but when I play a new game in 4K with ultra textures it's 8000-10000MB? Seems pretty accurate to me.

Because VRAM used does not necessarily equal VRAM required.

A game may use up your VRAM with no real noticeable performance benefit to you.

Defaulting to caching everything until full, then replacing cache entries with newer data as required, is just a nice safety net in case you run into a scenario, however rare, where something in that cache turns out to be useful.
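As a toy illustration of that behaviour (hypothetical names, not any real engine's code): a cache like this grows to whatever budget it's given and only evicts when something new needs room, so the "usage" an overlay reports tracks the budget, not the workload:

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU asset cache: keeps everything resident until the budget is hit."""
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.used_mb = 0
        self.entries = OrderedDict()          # name -> size_mb, oldest first

    def request(self, name, size_mb):
        if name in self.entries:              # cache hit: mark recently used
            self.entries.move_to_end(name)
            return
        while self.used_mb + size_mb > self.budget_mb and self.entries:
            _, freed = self.entries.popitem(last=False)   # evict oldest entry
            self.used_mb -= freed
        self.entries[name] = size_mb
        self.used_mb += size_mb

cache = TextureCache(budget_mb=10_000)        # pretend 10GB card
for i in range(500):                          # stream 32GB of 64MB textures
    cache.request(f"texture_{i}", 64)
print(f"reported 'usage': {cache.used_mb} MB")  # sits near the full budget
```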
 

Darktalon

Member
Oct 27, 2017
3,267
Kansas
It's fairly accurate. Just like with regular RAM, the more you have, the more will be used, and that's a good thing. I really wish people would stop spreading this nonsense that GPU monitoring apps are inaccurate without actually saying what IS accurate.

Why is it that when I play an old game I see VRAM usage of 500-1000MB, but when I play a new game in 4K with ultra textures it's 8000-10000MB? Seems pretty accurate to me.
Turn on FS2020, use the developer overlay, and come back to me with real numbers. Show me any possible way you can get FS2020 to even reach 10GB without raising the resolution above 4K.
The reason old games don't fill up your VRAM is that old games don't cache. That started when streaming became a thing.

[screenshot: FS2020 developer overlay showing VRAM usage at 1440p]


I really wish people would stop spreading this nonsense that GPU monitoring apps are inaccurate without actually saying what IS accurate.
Just because YOU don't understand how your computer works behind the scenes doesn't make it nonsense. You personally have had at least 6 different people tell you the same thing now.
 

mordecaii83

Avenger
Oct 28, 2017
6,862
The 3090 release is so clever.

People like me who usually buy the *80Ti cards don't really want to buy the 3080 despite how good it is, because we want the bigger chip (I don't care as much for the 24GB of VRAM).

The 3090 isn't actually that much more expensive than the 2080 Ti was (at least here in the UK), which reduces the impact of the high price.

I'm actually considering a 3090. I can't justify it, it's really not worth the huge extra cost over the 3080 (for me) - but I think I've already convinced myself to get it.
The 3080 is the "bigger chip" this time around, it's the same GA102 chip as the 3090 just cut down to about 80% active SM's.
 

Temperance

Member
Oct 25, 2017
5,832
[NO 2FA]
This method of telling the layman "hey, look how small the resolution you currently use is" with nested squares will never not be used.
I await the day 8K is the middle box lol.

[image: Watch Dogs: Legion 1080p/4K/8K resolution comparison]
 

Serious Sam

Banned
Oct 27, 2017
4,354
Because VRAM used does not necessarily equal VRAM required.

A game may use up your VRAM for no real noticeable performance benefit to you.

A default of caching everything until full and then replacing cache entries with newer data as required is just a nice safety net to have in case you run into scenarios where anything in that cache may be useful in a very particular, even if rare, scenario.
Strange argument. Again, just like with regular RAM (or any other component for that matter), you can get away with the minimum requirement, but if your system has more, you'll have a better experience.
Turn on FS2020, use the developer overlay, and come back to me with real numbers. Show me any possible way you can get FS2020 to even reach 10GB without raising the resolution above 4K.
The reason old games don't fill up your VRAM is that old games don't cache. That started when streaming became a thing.

csmKqsh.png
What is your image of FS running at 1440p supposed to prove? The VRAM usage jump from 1440p to 4K is HUGE.
 

SixelAlexiS

Member
Oct 27, 2017
7,741
Italy
So the 30 TFLOPS are the peak of what the card can provide, with all FP32 cores running all the time. That's 8704 FP32 cores * 1.71GHz * 2 (because they can do two ops per cycle), and it ends up being 29.767 TFLOPS.
But the other things around those FP32 units haven't changed, so the units will sit idle for longer periods, waiting for other parts to finish before they can compute.
In the end it's still an increase: comparing the 2080 Ti and 3080, which have the same number of SMs, the 3080 is around 30% faster, at least going by leaks. So every TFLOP in Ampere is worth around 65% of a Turing TFLOP.
Ah, ok. Those 30 TFLOPS always looked off the charts compared to the image shown yesterday comparing the new cards to the old ones.
So realistically a 3080 is around 20 TF in Turing terms?
 

PHOENIXZERO

Member
Oct 29, 2017
12,113
Just saw this video. A lot of the info goes a bit over my head, but am I right in assuming he's claiming that the non-RT performance gain from the 2080 Ti to the 3080 will be minimal?


Dude's calling other people idiots while fishing for views, being bad at math, and not understanding what NV did with Ampere, especially the increase in FP32 units.
 

VariantX

Member
Oct 25, 2017
16,903
Columbia, SC
I figure by the time I have money for a 3080, there will either be low supply or high demand sending prices well above MSRP. So I've kind of resigned myself to waiting for the inevitable mid-gen refresh that Nvidia keeps in their back pocket to keep AMD cornered.
 

Isee

Avenger
Oct 25, 2017
6,235
What are the chances of a 3080 Ti? There is a huge price gap between the 3080 and 3090.