
Deleted member 25042

User requested account closure
Banned
Oct 29, 2017
2,077
The 3080 looks really weak. Only 10% better than a 2080 Ti? The only GPU there that's worth the money is the 3080 Ti.

How did you come to this conclusion?
If it's something like comparing core counts, it's meaningless when not within the same gen.
A 1070 has 1920 CUDA cores, for example, and is slightly faster than a 980 Ti with its 2816 cores.

edit: didn't even see what the source says.
Still calling doubt on those...
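A rough way to see why cross-gen core counts mislead: peak FP32 throughput scales as cores × clock × 2 (one fused multiply-add per core per cycle), and newer architectures clock higher and do more per core besides. A minimal sketch in Python, assuming the usual reference boost clocks (~1.68 GHz for the 1070, ~1.08 GHz for the 980 Ti):

[code]
# Peak FP32 throughput: cores * clock (GHz) * 2 FLOPs/cycle (FMA).
# Boost clocks below are approximate reference figures (an assumption).
def peak_tflops(cuda_cores: int, boost_ghz: float) -> float:
    return cuda_cores * boost_ghz * 2 / 1000

print(f"GTX 1070:   {peak_tflops(1920, 1.68):.1f} TFLOPS")  # ~6.5
print(f"GTX 980 Ti: {peak_tflops(2816, 1.08):.1f} TFLOPS")  # ~6.1
[/code]

Fewer cores, higher clocks, slightly more throughput: exactly why a raw core count says little across generations.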
 

orava

Alt Account
Banned
Jun 10, 2019
1,316
So, like with nearly every node transition in GPU history, then?

I hope these are true. This would definitely be quite a bump after the stagnation we have seen in the last couple generations. The only worrying thing is that I can't really see how AMD can compete with this, performance-wise.
 

Edgar

User requested ban
Banned
Oct 29, 2017
7,180
I hope these are true. This would definitely be quite a bump after the stagnation we have seen in the last couple generations. The only worrying thing is that I can't really see how AMD can compete with this, performance-wise.
I don't think they can. At least not with the highest-end flagship GPUs, and it's sad really, since Nvidia can charge whatever the fuck they want and I will be willing to pay it.
 

orava

Alt Account
Banned
Jun 10, 2019
1,316
I don't think they can. At least not with the highest-end flagship GPUs, and it's sad really, since Nvidia can charge whatever the fuck they want and I will be willing to pay it.

On the other hand, it also looks like the mid-range and entry-level GPUs will be quite strong, and hopefully there will be price wars happening there. AMD ray tracing is still a mystery, though, and they can probably barely match the current 20xx RTX cards.
 

Escaflow

Attempted to circumvent ban with alt account
Banned
Oct 29, 2017
1,317
Day one for the 3080 if it's $700. I'm missing out on the DLSS and RT tech with my current 1080 Ti. Can't be bothered with the 3080 Ti; it's crazy to spend $1,300 on a graphics card during this period. And yes, the 2080 Ti really is selling for $1,300 here.
 

Alvis

Saw the truth behind the copied door
Member
Oct 25, 2017
11,241
Spain
Do we know how much VRAM the 3070 will have? Will it finally go past 8 GB?
 

kris.

The Fallen
Oct 25, 2017
3,248
I'm still rockin a 970. Think I'm gonna treat myself this fall.


 

fade

Avenger
Oct 25, 2017
3,516
Is it just me, or do these types of rumors always come up before a reveal, whether it be core count, HBM, or price, and Nvidia hits us from the top rope with some reality each and every time?
 

Kemono

▲ Legend ▲
Banned
Oct 27, 2017
7,669
3080 or 3080 Ti, dependent entirely on which is out before Cyberpunk.

That's my line of thinking as well...

But I really, really don't want to be pissed that I bought a 3080 if the Ti launches a few weeks later and destroys the 3080.

Maybe I have to be strong and wait a bit longer. The 3080 Ti looks awesome.
 

Tora

The Enlightened Wise Ones
Member
Jun 17, 2018
8,641
Is it just me, or do these types of rumors always come up before a reveal, whether it be core count, HBM, or price, and Nvidia hits us from the top rope with some reality each and every time?
AMD rumours seem to sorta match what they actually show; with NVIDIA it's worse lol
 

UF_C

Member
Oct 25, 2017
3,354
That's my line of thinking as well...

But I really, really don't want to be pissed that I bought a 3080 if the Ti launches a few weeks later and destroys the 3080.

Maybe I have to be strong and wait a bit longer. The 3080 Ti looks awesome.
I think the Ti will release about a year or so after the 3080, so it may be a bit of a wait.
 

Kernal 64

#TeamThierry
Member
Oct 28, 2017
492
NY
Around 1km of duct tape.

If only Nintendo had the foresight to SLI GameCubes, we wouldn't have to resort to investing in duct tape.

For real! Nintendo keeps screwing us with their old-ass technology. And thank you for the duct tape measurements!

28,670/9.4 = 3050

I think we need to move beyond the GameCube, and start using the Switch instead. It would end up needing 29 of them.

Thank you for checking my math!

If we switched to using... uh... the Switch, I suspect we'd have very limited adoption. I bet it would be like trying to get the US to use the metric system. People are just gonna keep using the GameCube because it's what they know.


:D

God I love this meme. It's one of the few classics that I always find funny.

Same!


On a more on-topic note, I've been pretty happy with my 1080 Ti. I'll probably stick with it and reevaluate when the 40xx series arrives, unless I come across something that doesn't run well at 2k60 with acceptable settings on my rig. This card has been really, really good to me.
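For anyone who wants to re-run the duct-tape math above: a throwaway sketch, assuming ~9.4 GFLOPS per GameCube, the marketing-friendly 1 TFLOP per Switch (disputed a few posts down), and the rumored ~28,670 GFLOPS for the top card:

[code]
import math

RUMORED_GFLOPS  = 28_670  # rumored top-card figure used in the joke (an assumption)
GAMECUBE_GFLOPS = 9.4     # commonly cited GameCube number
SWITCH_GFLOPS   = 1_000   # Nvidia's "1 TFLOP" marketing figure, debated below

print(RUMORED_GFLOPS / GAMECUBE_GFLOPS)           # 3050.0 GameCubes
print(math.ceil(RUMORED_GFLOPS / SWITCH_GFLOPS))  # 29 Switches
[/code]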
 

Sanctuary

Member
Oct 27, 2017
14,245
If we switched to using... uh... the Switch, I suspect we'd have very limited adoption. I bet it would be like trying to get the US to use the metric system. People are just gonna keep using the GameCube because it's what they know.

The Switch is literally listed as 1 teraflop though, which not only makes it way more up to date, it makes it ridiculously simple to remember and use for generic comparisons.
 

Kernal 64

#TeamThierry
Member
Oct 28, 2017
492
NY
The Switch is literally listed as 1 teraflop though, which not only makes it way more up to date, it makes it ridiculously simple to remember and use for generic comparisons.

Oh, I agree! It would be a far simpler measure... just like the metric system, lol. Now I ask you, sir, where's the comedy in that?
 

Maple

Member
Oct 27, 2017
11,769
I really can't wait to build a Zen 3 + RTX 3080 PC late this year or early next year.

My 6700k + 1070 system is starting to show its age.
 

Owlet

Owl Enthusiast
Verified
May 30, 2018
1,935
London, UK
I've been using my launch 1080 through the 20xx generation specifically hoping the 30xx cards get a big boost in performance.

I really hope this is accurate, because if it is, then I'm in on a 3080 Ti as soon as possible lol
 

Deleted member 20297

User requested account closure
Banned
Oct 28, 2017
6,943
Interesting that Nvidia, for the Ti, goes the way of more cores at a lower clock speed, while the standard 3080 has fewer cores but a higher frequency.
 

Sanctuary

Member
Oct 27, 2017
14,245
I hope these are true. This would definitely be quite a bump after the stagnation we have seen in the last couple generations. The only worrying thing is that I can't really see how AMD can compete with this, performance-wise.

???

Wasn't the 10 series considered an outlier in performance gains for the cost within two years? The only stagnation I've seen is with the 20 series, and not because it didn't add anything new, but rather because the rasterization power leap was paltry compared to previous gens, and especially compared to the 10 series.
 

Tovarisc

Member
Oct 25, 2017
24,487
FIN
???

Wasn't the 10 series considered an outlier in performance gains for the cost within two years? The only stagnation I've seen is with the 20 series, and not because it didn't add anything new, but rather because the rasterization power leap was paltry compared to previous gens, and especially compared to the 10 series.

There is also a bit of delusion going on with what kind of gen-to-gen improvements people expect, I imagine.

Some hold on with two hands to the belief that gen-to-gen needs to be at least a 40-50% gain or it's trash.
 

Jockel

Member
Oct 27, 2017
685
Berlin
I just got a 2070 Super a few months ago. Still, if the 3080 Ti is as good as this suggests, oh my. Not sure what to do.
 

BeI

Member
Dec 9, 2017
5,999
The 3080 looks really weak. Only 10% better than a 2080 Ti? The only GPU there that's worth the money is the 3080 Ti.

Don't forget about architecture improvements / node shrink that result in more performance per shader core. And on top of that, I'd guess more RT cores means less performance impact with raytracing, which could act as an extra little performance increase multiplier in those types of loads. So if a game runs at 100 fps on a 2080, cut down 50% to 50 fps with RT, and a 3080 at 130 fps (30% higher traditional performance) gets cut down 40% to 78 fps, then the 3080 would be ~56% stronger with RT involved.

I'm unsure how realistic it is to reduce the performance hit of RT by that much, but the traditional performance increase will probably be higher anyway.
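That arithmetic, spelled out (every number is the poster's hypothetical, not a confirmed spec):

[code]
# All inputs are the hypothetical figures from the post above, not benchmarks.
base_2080 = 100      # fps without RT on a 2080
rt_hit_2080 = 0.50   # assumed 50% performance hit with RT on

base_3080 = 130      # assumed 30% higher traditional performance
rt_hit_3080 = 0.40   # assumed smaller RT hit thanks to more RT cores

fps_2080 = base_2080 * (1 - rt_hit_2080)  # 50 fps
fps_3080 = base_3080 * (1 - rt_hit_3080)  # 78 fps

print(f"Relative RT performance: +{fps_3080 / fps_2080 - 1:.0%}")  # +56%
[/code]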
 

Onix555

Member
Apr 23, 2019
3,381
UK
The Switch is literally listed as 1 Teraflop though, which not only makes it way more up to date, it makes it ridiculously simple to remember, and use for generic comparisons.
Wut??

It isn't. It has a max performance of 396 GFLOPS FP32 in docked mode.
The Tegra X1 has a max performance of 1 TFLOP FP16, which Nvidia used for marketing, but that's extremely misleading, as FP16 is used nearly exclusively for machine learning and AI activities.
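Both figures drop out of the usual peak-throughput formula. A quick sketch, assuming the Tegra X1's 256 CUDA cores, the widely reported ~768 MHz docked GPU clock, and double-rate FP16 at the chip's ~1 GHz maximum clock:

[code]
# Peak FLOPS = cores * clock * FLOPs per core per cycle
# (2 for FP32 FMA; 4 for the X1's double-rate FP16).
# The clock figures are widely reported numbers, not official specs.
CORES = 256

docked_fp32 = CORES * 0.768e9 * 2  # docked Switch, FP32
max_fp16    = CORES * 1.0e9   * 4  # X1 at max clock, FP16

print(f"{docked_fp32 / 1e9:.0f} GFLOPS FP32 docked")  # ~393, the ~396 above
print(f"{max_fp16 / 1e9:.0f} GFLOPS FP16 peak")       # ~1024, the "1 TFLOP"
[/code]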
 

disco_potato

Member
Nov 16, 2017
3,145
Those ~2000 MHz boost clocks likely mean 2200-2300 MHz "actual" boost clocks, if the 20x0 series is anything to go by.
 

ps3ud0

Banned
Oct 27, 2017
1,906
Man, I'd love to go back to PC gaming, but I'm too cheap. Wonder if the current crisis might impact Nvidia's pricing...

ps3ud0 8)
 

Yogi

Banned
Nov 10, 2019
1,806
TBH I wish they all had more RT cores. The 3080 only has a little more than double a 2080 Ti (which has terrible RT performance).

256 sounds closer to the 500 I want. They're going to drip-feed the RT cores. Maybe by the time the 5000 series is out we'll have the 500+ RT cores for a good framerate with ultra ray tracing (which won't even be that ultra; we're not getting Minecraft-level RTX in Cyberpunk or anything like that till next next gen, or after that).

But if the consoles aren't going to have that great RT till, like, the mid-gen refresh, then it won't matter much anyway, I guess.

Disable and move on. I just need Intel to put up a serious CPU so I can keep 200+ fps next gen at 1080p. That's what really matters. I can't go back to 100 fps competitive first-person shooters. It's a different, slower, clunkier feel that just isn't right for competitive play. It's fine when you haven't played at 200 for a while, but once you have, going back sucks ass.
 

GameAddict411

Member
Oct 26, 2017
8,527
3080 or 3080 Ti, dependent entirely on which is out before Cyberpunk.
The earliest estimates seem to indicate a September launch for the RTX 3080. If the Ti version comes out in Q4, then it's going to be November or December. The game comes out in September. I think if you can afford the higher-end version, you should wait; that's what I am doing. I doubt Cyberpunk will run poorly on an RTX 2080 Ti, especially with DLSS 2.0.
 