
Tovarisc

Member
Oct 25, 2017
24,407
FIN
Sites were posting completely fabricated "leaks" about 2080 Ti Super what, a week before the lineup was announced?

People get so thirsty for new GPUs that they will keep clicking on the rumor articles, so sites keep posting them even when they know they aren't legit.

True.

In the past few generations NV has released consumer GPUs within a few months of the keynote revealing them, so that's the best indicator of when a release could be. Ampere would be June to August if I were to guess.

Using rumors as an indicator is anything but reliable.
 

Plasmid

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
686
My 2080 super will stay in the tower for another 2 years then.
 

dgrdsv

Member
Oct 25, 2017
11,850
Sites were posting completely fabricated "leaks" about 2080 Ti Super what, a week before the lineup was announced?
How are we to know that NV didn't actually ponder the idea of introducing 2080Ti Super? Many "leaks" don't turn out to be true because this business is way more fluid and prone to last minute changes than people seem to assume.
 

Wag

Member
Nov 3, 2017
11,638
I think you would see a good boost in minimum framerate and frametimes with a CPU upgrade. Mine went up considerably when I went from a 1700x to a 9900k on my 2080 Ti.
I'll wait for the 3080 Ti to be released to see if I should upgrade. My 5820k@4k is still quite usable. By then the current round of CPUs will have come down in price as well.
 

GrrImAFridge

ONE THOUSAND DOLLARYDOOS
Member
Oct 25, 2017
9,668
Western Australia
How are we to know that NV didn't actually ponder the idea of introducing 2080Ti Super? Many "leaks" don't turn out to be true because this business is way more fluid and prone to last minute changes than people seem to assume.

In the case of the 2080 Ti Super, though, the rumour originated on Wccftech and was never corroborated by any other outlet (IIRC, Tom's Hardware DE even repudiated it), and Wccftech stopped short of providing any actual details about the card despite reiterating its supposed existence multiple times across several months. Perhaps Wccftech just had bad, hollow info, but, frankly, it very much seems like it merely assumed the 2080 Ti Super to be a sure thing and invented the rumour expecting an easy credibility boost.

Edit: To be clear, I'm not suggesting Nvidia wasn't kicking around the idea of releasing a 2080 Ti Super, just that the rumour of it being on the horizon was lacking in substance and only got more dubious as time went on.
 
Last edited:

Resident Guru

Member
Oct 28, 2017
919
I'll wait for the 3080 Ti to be released to see if I should upgrade. My 5820k@4k is still quite usable. By then the current round of CPUs will have come down in price as well.
Sounds like a good plan. I kept seeing random framerate dips before my upgrade that were being caused by the 1700x not being able to keep up. I also game at 4k/60.
 

Wag

Member
Nov 3, 2017
11,638
Sounds like a good plan. I kept seeing random framerate dips before my upgrade that were being caused by the 1700x not being able to keep up. I also game at 4k/60.
The 5820k is a better overclocker than the 1700x, that's probably why. My EVGA 2080Ti Black can't overclock for shit tho- I can up the voltage, but a lot of games just don't like it if I overclock even a little, so I just run it at stock. Still fast enough.
 

SolidSnakeUS

Member
Oct 25, 2017
9,595
Comparing the 2080 and the 3080:

2080:

GPU Cores: 2944
RT Cores: 46
Tensor Cores: 368
Boost Clock: 1710 MHz

3080:

GPU Cores: 4608 (56.5% increase over 2080)
RT Cores: 144 (3.13x as many as the 2080)
Tensor Cores: 576 (56.5% increase over 2080)
Boost Clock: 2000 MHz (17% increase over 2080)

Seems like a pretty substantial upgrade over the 2080 and could be a huge winner if it comes out at a similar price point to the 2080's launch.
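For what it's worth, the math in those parentheses checks out. A throwaway Python sketch to verify, using the rumored 3080 figures above (not confirmed specs) against the known 2080 numbers:

```python
# Sanity check of the quoted increases; the 3080 column is the rumored spec
# from this thread, not anything Nvidia has confirmed.
specs = {
    "GPU cores":         (2944, 4608),
    "RT cores":          (46, 144),
    "Tensor cores":      (368, 576),
    "Boost clock (MHz)": (1710, 2000),
}

for name, (rtx2080, rtx3080) in specs.items():
    ratio = rtx3080 / rtx2080
    print(f"{name}: {rtx2080} -> {rtx3080} = "
          f"{ratio:.2f}x ({(ratio - 1) * 100:.1f}% increase)")
```

The RT core line is the only one quoted as a multiplier rather than a percentage, hence the 3.13x.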
 
Nov 8, 2017
13,099
Sounds like a good plan. I kept seeing random framerate dips before my upgrade that were being caused by the 1700x not being able to keep up. I also game at 4k/60.

I've got a 2700 and feel the same way. I just get so many little stutters that "aren't a big deal" but annoy me. Oh my average framerate is super high! Ok but the 0.1% lows aren't. Even with an OC. Tons of games just don't "feel" smooth because I can't seem to lock framerates. Worse because I'm on a 1440/144hz monitor.
 

Wag

Member
Nov 3, 2017
11,638
I've got a 2700 and feel the same way. I just get so many little stutters that "aren't a big deal" but annoy me. Oh my average framerate is super high! Ok but the 0.1% lows aren't. Even with an OC. Tons of games just don't "feel" smooth because I can't seem to lock framerates. Worse because I'm on a 1440/144hz monitor.
I use my Shield to Gamestream to my bedroom TV @ 4k/60Hz so I'm mostly ok with it. My Sony 65XBR900E has a really good game-mode so I don't notice any significant drops. I can jack up most standard games (without RTX features) to high and it's still good.
 

Cup O' Tea?

Member
Nov 2, 2017
3,603
Last night I had a dream that I shoplifted a 3000 series card but the shopkeeper tracked me down and I had to give it back. lol.

Would love to replace my 1070 but I have a feeling the new cards will be way too expensive for me because my country's dollar is in the toilet.
 
Mar 22, 2019
811
Chugging along with my 9900k and 2080 Ti - the issue is I don't have a 4K screen: still have my trusty 34'' UW 120Hz from Asus.
So before I even think about upgrading the GPU I need to invest in a quality gaming screen to take full advantage of it.
 

JahIthBer

Member
Jan 27, 2018
10,377
Comparing the 2080 and the 3080:

2080:

GPU Cores: 2944
RT Cores: 46
Tensor Cores: 368
Boost Clock: 1710 MHz

3080:

GPU Cores: 4608 (56.5% increase over 2080)
RT Cores: 144 (3.13x as many as the 2080)
Tensor Cores: 576 (56.5% increase over 2080)
Boost Clock: 2000 MHz (17% increase over 2080)

Seems like a pretty substantial upgrade over the 2080 and could be a huge winner if it comes out at a similar price point to the 2080's launch.
The GDDR6 is highly likely 16 Gbps too, plus hopefully at least 11 GB.
 

Resident Guru

Member
Oct 28, 2017
919
The 5820k is a better overclocker than the 1700x, that's probably why. My EVGA 2080Ti Black can't overclock for shit tho- I can up the voltage, but a lot of games just don't like it if I overclock even a little, so I just run it at stock. Still fast enough.
I also have the EVGA 2080 Ti Black. I have to agree that it's not a great overclocker. With power limit of 112% I can get +195 on the core and +764 on the memory with the fan at 77%.
 

Wag

Member
Nov 3, 2017
11,638
I also have the EVGA 2080 Ti Black. I have to agree that it's not a great overclocker. With power limit of 112% I can get +195 on the core and +764 on the memory with the fan at 77%.
I can get something around that too, but some games just have a problem with it for some reason so I stopped overclocking.
 

GhostofWar

Member
Apr 5, 2019
512
That Moore's Law Is Dead guy is saying Ampere is going to have DLSS 3.0 that will work on everything that has TAA, and RT performance across the line four times faster than its Turing counterpart (so the 3060 would be faster than a 2080 Ti in RT, which the numbers on the first page imply anyway). There's also a new thing called NVCache, which uses tensor cores for lossless texture compression and can use the SSD as VRAM, like AMD's HBCC and the upcoming consoles.
 

Eslayer

Chicken Chaser
Member
Oct 27, 2017
330
From the video above
[Attached slides: NVAMP-ARC.png, NVAMP-PERF.png, NVAMP-LAUNCH.png]
 
Nov 8, 2017
13,099
Based on those slides, my thoughts:

  • Ampere being a meaningful arch change not just a shrink - likely
  • DLSS 3.0 working inherently with any game that has TAA and implying you can just enable it in the new control panel - very unlikely
  • RTX voice being upgraded / only in beta - this is not a rumour, this is confirmed by Nvidia
  • Encoder improvements - likely
  • 4x better RT performance - hard to say
  • "RTX on shouldn't lower the performance in games" - extremely unlikely, since the limiting factor according to devs is not the RT core but the shading
  • Tensor accelerated vram compression - seems implausible
  • Lower power consumption per tier - plausible
  • No logins for GeForce thingies - Hell freeze over tier (but I want it)
 

GhostofWar

Member
Apr 5, 2019
512
Based on those slides, my thoughts:


  • DLSS 3.0 working inherently with any game that has TAA and implying you can just enable it in the new control panel - very unlikely

Would be interesting to see what Dictator thinks of that, because in his DLSS 2.0 video he said the Nvidia engineers told him it slots into where they put TAA in the render pipeline. So can Nvidia, at a driver level, turn DLSS on in place of TAA now? If they can, this is pretty huge.
 

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223
Eh. Some of those are just rumors from elsewhere, plus a lot of guesses and already known facts. And then you have things like 4x the RT performance, where the presenter doesn't seem to know what causes the performance drop of RT at all. You can't just magically get 4x the RT performance because of an increase in RT cores and some magic sauce; that's not how it works. RT is currently limited by the shader cores.

Watched the video, and the presenter says RDNA1, with no ML hardware, no DXR acceleration and no DX12 Ultimate feature set, will age better than Turing. Also says RDNA2 has much better RT than Turing. Yeah, I know where this is going. Dude has no clue whatsoever.
 
Last edited:

Dries

Banned
Aug 19, 2019
309
How long do you guys think it will take before we get some launch dates on the new cards?
 

pswii60

Member
Oct 27, 2017
26,667
The Milky Way
From the video above
Sounds like fan fiction. Would love for much of that to be true though. Near-global DLSS would be a game changer.

And the Nvidia Control Panel is painful as hell. So I wish they'd overhaul that from the ground up. Why do we have to wait so damn long for the programs list to populate? And that's on a 9900k.

Edit: how would global DLSS work in practical terms? I assume you'd have to tell the game you're running at resolution X even though your monitor is actually set to resolution Y? Never done super sampling but assume it would be a similar method in reverse?
 
Last edited:

BeI

Member
Dec 9, 2017
5,974
Would be interesting to see what Dictator thinks of that, because in his DLSS 2.0 video he said the Nvidia engineers told him it slots into where they put TAA in the render pipeline. So can Nvidia, at a driver level, turn DLSS on in place of TAA now? If they can, this is pretty huge.

Seems way too good to be true. So many games have TAA these days, and it would basically be like a flat 50% performance increase just for the quality DLSS setting.
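To put rough numbers on that: the idea is basically supersampling in reverse, where the game renders internally at a fraction of the output resolution and the reconstruction fills in the native image. A quick sketch of the arithmetic, assuming the commonly reported DLSS 2.0 per-axis scale factors (what a "DLSS 3.0" would actually use is anyone's guess):

```python
# Internal render resolution vs. output resolution for "render low, upscale
# to native". The per-axis scale factors are the commonly reported DLSS 2.0
# presets; they are an assumption here, not confirmed for any future version.
output_w, output_h = 3840, 2160  # 4K output/monitor resolution

presets = {
    "Quality":     2 / 3,
    "Balanced":    0.58,
    "Performance": 0.5,
}

for name, scale in presets.items():
    internal_w, internal_h = round(output_w * scale), round(output_h * scale)
    pixel_share = (internal_w * internal_h) / (output_w * output_h)
    print(f"{name}: {internal_w}x{internal_h} internal "
          f"({pixel_share:.0%} of the native pixel count)")
```

Shading cost scales roughly with pixel count, which is where the big framerate headroom comes from, minus the fixed cost of the reconstruction pass itself.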
 

Black_Stride

Avenger
Oct 28, 2017
7,388
Based on those slides, my thoughts:

  • Ampere being a meaningful arch change not just a shrink - likely
  • DLSS 3.0 working inherently with any game that has TAA and implying you can just enable it in the new control panel - very unlikely
  • RTX voice being upgraded / only in beta - this is not a rumour, this is confirmed by Nvidia
  • Encoder improvements - likely
  • 4x better RT performance - hard to say
  • "RTX on shouldn't lower the performance in games" - extremely unlikely, since the limiting factor according to devs is not the RT core but the shading
  • Tensor accelerated vram compression - seems implausible
  • Lower power consumption per tier - plausible
  • No logins for GeForce thingies - Hell freeze over tier (but I want it)

If they get rid of needing to log in to GFE, that's really all I want.
Why doesn't the program remember my login details? Pretty much every time I update to test new features I'm logged out, and said new features don't work unless you log in.
Since I have auto-login set, it's the last thing I check, so I end up panicking and doing nigh literally everything else before even bothering to log in.
Worse still, when GFE was introduced I found it useless, so my first three or so accounts were throwaways, and some settings are pretty much lost to me forever.
Get rid of that shit.


DLSS 3.0 working across the board is actually a lot more likely than you're thinking, but it definitely isn't going to be an automatic control panel thing. The game would still need to allow DLSS to work with it, so devs would still have to let their game give out some information to the program. I think they're just implying a lot more games are jumping on board because they plan on making the submission to enable DLSS easy as pie: basically, when a dev submits a game for driver optimization, DLSS is implemented at the same time.
An absolute godsend if true, and it might actually push me to go 4K. The perf impact of 4K is basically what stopped me from getting a 4K screen; I'd rather have a few more bells, whistles and frames at 1440p than 4K.


If they really are pushing that many RT cores then the increase in RT performance should be totally doable and the overall impact should be lowered substantially... but there will definitely still be a hit.


The only thing they need to announce to completely win this battle is that the 2080 Ti-beating xx70, with its better RT performance, costs ~500 dollars. Do that and I would be hard pressed to justify getting an XSX for the same money.
 

Darkstorne

Member
Oct 26, 2017
6,813
England
If those slides are true I'll happily make do with a 3060, thanks!

Was considering an 80 before along with a monitor upgrade to an ultrawide. Now with the virus and economic uncertainty I'm thinking of a 70 and sticking with my 1080/60 monitor for another year or two. But if the 60 card is that good... it sounds overkill for a 1080/60 monitor.
 
Nov 8, 2017
13,099
DLSS 3.0 working across the board is actually a lot more likely than you're thinking, but it definitely isn't going to be an automatic control panel thing. The game would still need to allow DLSS to work with it, so devs would still have to let their game give out some information to the program. I think they're just implying a lot more games are jumping on board because they plan on making the submission to enable DLSS easy as pie: basically, when a dev submits a game for driver optimization, DLSS is implemented at the same time.

I'm aware that DLSS 2.0 is no longer trained on a per-game basis, and is understood to be implementable in any game that uses TAA (because it requires the same type of information from the game - motion vectors and the like). But this reporting here sounds kinda vaguely like someone heard that, then thought to themselves "oh yeah, that'll just be something you can toggle on in all games that have TAA! And it'll be called DLSS 3.0!"

I expect DLSS 2.0 to continue to improve and now that the training requirement is gone, I also expect it to become much more widespread. As described by that guy, seems way too good to be true.
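To make the "requires the same type of information" point concrete, here's a rough sketch of the per-frame data a DLSS-2.0-style temporal upscaler needs the engine to hand over. The names are invented for illustration; this is not the actual NGX/DLSS SDK, just the general shape of it:

```python
# Illustrative only: the rough shape of the per-frame inputs a DLSS-2.0-style
# temporal upscaler needs from the game engine. Field names are made up for
# this sketch; they are not the real NGX/DLSS SDK API.
from dataclasses import dataclass
from typing import Any

Texture = Any  # stand-in for a GPU texture/render-target handle


@dataclass
class UpscalerFrameInputs:
    color: Texture            # low-res color buffer, taken before post/UI
    depth: Texture            # matching depth buffer
    motion_vectors: Texture   # per-pixel screen-space motion, engine-provided
    jitter_offset: tuple      # sub-pixel camera jitter applied this frame
    exposure: float           # scene exposure, so the history can be re-weighted
    reset_history: bool       # set on camera cuts etc. to avoid ghosting


# A driver can see API calls and render targets, but it has no reliable way to
# know which target is "the" color buffer, where the motion vectors live, or
# what jitter the engine applied, which is why the hook has to live in the
# engine (where TAA already sits) rather than behind a control-panel toggle.
```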
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,930
Berlin, 'SCHLAND
I do not think there will be a "it works with anything just at the driver level" DLSS in the future, near or far. It just does not work that way at all, especially with DX12 and Vulkan being around.

That sounds like wishlisting.
 

Black_Stride

Avenger
Oct 28, 2017
7,388
I'm aware that DLSS 2.0 is no longer trained on a per-game basis, and is understood to be implementable in any game that uses TAA (because it requires the same type of information from the game - motion vectors and the like). But this reporting here sounds kinda vaguely like someone heard that, then thought to themselves "oh yeah, that'll just be something you can toggle on in all games that have TAA! And it'll be called DLSS 3.0!"

I expect DLSS 2.0 to continue to improve and now that the training requirement is gone, I also expect it to become much more widespread. As described by that guy, seems way too good to be true.

Yeah, I too don't think it'll quite be driver level like DSR is.
But I'd expect Nvidia to push this to devs so hard, and make the process borderline automatic from the dev's perspective, that all devs will just allow it.

For it to be fully driver level is indeed unlikely, because then it would theoretically work retroactively with games that predate it or just never implemented it, something I don't see happening.

What I'm really curious about is, with GPU manufacturer deals and DXR, whether games that have AMD's backing would have a toggle for when to use RTX and when to use AMD's RT. While competition is good, RTX seems so far ahead that PC devs wanting AMD's RT solution seems slightly backwards, but implementing it for when an RTX card isn't present could make sense.
Also, RTX for all is a real killer deal; if the cards below the xx60 also get RTX support, AMD might be in for a bad day at the races.
 
Oct 25, 2017
2,933
That video is incredibly specific, to the point where if a substantial amount of it doesn't happen it'll hurt his credibility long term. MLID was also one of the techtubers pushing 3/4 way SMT for Zen 3 (and 4) before that was essentially de-confirmed.

I think the guy has had some interesting on-the-record guest interviews with general tech industry people in the past few months, but I wouldn't get excited just yet. This rumor is just...too big.

The GA100 DGX leak is the way more relevant one.
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,930
Berlin, 'SCHLAND
I do not think there will be a "it works with anything just at the driver level" DLSS in the future, near or far. It just does not work that way at all, especially with DX12 and Vulkan being around.

That sounds like wishlisting.
I would like to be wrong here of course, but there are too many "specific" things DLSS needs that I do not think the driver has any reasonable way of learning about. Unless there is some sort of AI driven super computer at NV learning how game renderers work and how to insert itself in them, lol.
 

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223
I would like to be wrong here of course, but there are too many "specific" things DLSS needs that I do not think the driver has any reasonable way of learning about. Unless there is some sort of AI driven super computer at NV learning how game renderers work and how to insert itself in them, lol.
What do you think about the RT performance? An improvement of 4x and free Raytracing with Ampere sounds very unrealistic. Or am I wrong? Could it be possible in any way?
 
Last edited:

PHOENIXZERO

Member
Oct 29, 2017
12,073
A bottleneck, for me, is when the game struggles and stops being FPS stable. For example, using an i5 with only 4 cores, some games start to stutter because the CPU is not able to keep up with the GPU. That's my big concern. Nowadays with 6 cores (12 threads) you have more than enough, but... who knows in 2 years; I'm afraid 6 cores won't be enough, especially for higher framerates.

I'm not an expert on hardware so maybe I'm wrong, that's why I'm asking :) thank you
Yeah, it feels a bit early IMO to upgrade, and it'd be a kinda small upgrade to go from a 6-core CPU to an 8-core CPU, such as the 3600X to the 3700X.

The next-gen consoles have 8-core CPUs clocked at around 3.5GHz, which is a huge improvement over the current console CPUs. Seems like it'd be better to wait and see what the new games are like and how CPU hungry they are.

There's a 3950X that has 16 cores/32 threads right now. In a couple of years that's probably going to be the new Ryzen 7 and that'd be a big upgrade and a big jump from a 6 core CPU.



Yeah, the XSX will be around the 2080 level, which is pretty crazy and might be even more crazy depending on whether it's $499-599. Man, I hope they do $399 just to make Sony nervous.

I don't know if Nvidia will allow users to boost that high. Starting with the 10-series cards, they've been really limiting the amount of power allowed, so we haven't been seeing users overclock their cards by 25-30% like with the Maxwell 900-series cards and prior.
Yeah, I'd avoid a 6C/12T CPU right now for gaming if you can afford it, unless you're looking to get another, more expensive CPU later when prices come down, or to build a new system again in a couple of years as development moves away from the current gen and greater use is made of the CPUs in the PS5 and XSX. Definitely hold off until we get solid details on Zen 3, since the new architecture should resolve (or at least mostly resolve) the memory latency issues that have played a role in holding their CPUs back from matching or beating Intel's performance in games.

It helps that the next-gen consoles look as good as they do compared to NVIDIA's 20-series GPUs, thanks in no small part to NVIDIA's decision to focus on things other than raw horsepower; the jump from the GTX 10 series to the RTX 20 series was so small that, IMO, it led to raw performance almost completely stagnating for four years.

I'm leaning heavily towards finally building a new PC either late this year or early next. I've kept my 2500k way longer than I had intended, and I've ended up waiting even longer for Zen 3, since I decided back in 2017 that I would try to hold off for it. Even though it's the end of the road for AM4, I don't want to be an early adopter for AM5 and DDR5.
 

GhostofWar

Member
Apr 5, 2019
512
I would like to be wrong here of course, but there are too many "specific" things DLSS needs that I do not think the driver has any reasonable way of learning about. Unless there is some sort of AI driven super computer at NV learning how game renderers work and how to insert itself in them, lol.

Thanks Alex, I'll take your input over that guy's and won't get my hopes up for a driver-level TAA replacement; I'll just hope Nvidia makes it an easy process that devs start using a lot more, like the other guys here have said.
 

BreakAtmo

Member
Nov 12, 2017
12,830
Australia
I would like to be wrong here of course, but there are too many "specific" things DLSS needs that I do not think the driver has any reasonable way of learning about. Unless there is some sort of AI driven super computer at NV learning how game renderers work and how to insert itself in them, lol.

Would it be at all possible to have a different version of DLSS that actually could work with any TAA game at the driver level, at the cost of not being as good due to not having access to the specific things you're referring to? That would be a nice boost for older games with TAA, though actual DLSS patches would still be preferred.
 

dgrdsv

Member
Oct 25, 2017
11,850
"RTX On shouldn't lower performance much" point is enough to know that this person doesn't have any idea on what he's talking about in this video.

And how exactly do you merge GeForce Experience with GeForce Now?..
 

Deleted member 27751

User-requested account closure
Banned
Oct 30, 2017
3,997
Look, if the 3070 isn't a ridiculous AUD price, sure, I might get one. Otherwise I'm just going to stick with my 1070, because screw paying a ridiculous price, and we're all going to be in a financial crisis anyway.
 

Laiza

Member
Oct 25, 2017
2,170
Comparing the 2080 and the 3080:

2080:

GPU Cores: 2944
RT Cores: 46
Tensor Cores: 368
Boost Clock: 1710 MHz

3080:

GPU Cores: 4608 (56.5% increase over 2080)
RT Cores: 144 (3.13x as many as the 2080)
Tensor Cores: 576 (56.5% increase over 2080)
Boost Clock: 2000 MHz (17% increase over 2080)

Seems like a pretty substantial upgrade over the 2080 and could be a huge winner if it comes out at a similar price point to the 2080's launch.
Absolutely. It's a big enough upgrade that I'm strongly considering it as my upgrade option from my venerable 1080 Ti, assuming the specs are true and it costs around the same ($700-ish).

That raytracing performance looks tasty. I can hardly wait to mess around with ray-traced Minecraft with one of these cards. It's gonna be great.
 

SolidSnakeUS

Member
Oct 25, 2017
9,595
Absolutely. It's a big enough upgrade that I'm strongly considering it as my upgrade option from my venerable 1080 Ti, assuming the specs are true and it costs around the same ($700-ish).

That raytracing performance looks tasty. I can hardly wait to mess around with ray-traced Minecraft with one of these cards. It's gonna be great.

I have a 2080 right now and I'm considering the upgrade. Yeah, that's some wild shit right there.
 