
kostacurtas

Member
Oct 27, 2017
9,060
NVIDIA Cuts Price of GeForce RTX 2060 To $299

With AMD set to launch their new 1080p-focused Radeon RX 5600 XT next Tuesday, NVIDIA isn't wasting any time in shifting their own position to prepare for AMD's latest video card. Just in time for next week's launch, the company and its partners have begun cutting the prices of their GeForce RTX 2060 cards. This includes NVIDIA's own Founders Edition card as well, with the company cutting the price of that benchmark card to $299.

 

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223
That's a great price, especially because you can use DLSS and Raytracing.

AMD can't really compete with that imo.
 

Bluelote

Member
Oct 27, 2017
2,024
Not sure how smart a buy these are with only 6GB of VRAM, but it's nice that RT can be had for $300 now. Still, for it to be really popular I'd think it would need to drop to the $200 range.
 

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223
Not sure how smart a buy these are with only 6GB of VRAM, but it's nice that RT can be had for $300 now. Still, for it to be really popular I'd think it would need to drop to the $200 range.
6 GB will be fine as long as you don't play at 4K or max out textures, I'm pretty sure of that.

Consoles will rely heavily on SSD data streaming and only have 13 GB of usable RAM as well, so those are not really indications that VRAM usage will skyrocket next gen. 6-8 GB cards will be fine.
 

Deleted member 4346

User requested account closure
Banned
Oct 25, 2017
8,976
This is where it should have come out. Really the 2070 should be $400, the 2080 should be $600, and the 2080 Ti should be $800.
 

Rice Eater

Member
Oct 26, 2017
2,814
About fucking time, that thing was overpriced at $350. It's weaker than a 5700 and only a small upgrade over the 1660 Super, which is $120 less. Now we won't have that ridiculous gap from $230 to $350 to upgrade to the next card, because the 1660 Ti sure isn't worth the extra $50 over the 1660 Super.

Admittedly the 1660 Ti doesn't really have a place anymore, unless they drop the price to $250 to make it enticing for potential 1660 Super owners to pay an extra $20 instead of $50 like before.
 
Oct 27, 2017
6,960
That's a great price, especially because you can use DLSS and Raytracing.

AMD can't really compete with that imo.

Nvidia is dropping prices precisely because AMD is competing, even if AMD is not doing amazing.

2060S is not strong enough to drive new games with full detail + raytracing effects. You have to sacrifice performance, standard graphics settings or raytracing. This will only get worse as games naturally get more demanding in the future.

DLSS is not dead, but it is redundant. Sharpening effects are now possible on both Nvidia and AMD GPUs with barely any performance hit, and they work really well (Nvidia's sharpening is a bit better). Sharpening works on everything, with no need for developers to implement anything.
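As a rough sketch of why sharpening is basically free: it's one tiny convolution over the finished frame, so it works on any game and any GPU. Purely illustrative Python, not Nvidia's or AMD's actual filter:

import numpy as np
from scipy.ndimage import convolve

def sharpen(frame, amount=0.5):
    """frame: HxWx3 float image in [0, 1]; returns a sharpened copy."""
    # Laplacian-style 3x3 kernel picks out edges; 'amount' scales the boost.
    kernel = np.array([[ 0, -1,  0],
                       [-1,  4, -1],
                       [ 0, -1,  0]], dtype=np.float32)
    out = frame.copy()
    for c in range(3):  # sharpen each color channel independently
        edges = convolve(frame[:, :, c], kernel, mode="nearest")
        out[:, :, c] = np.clip(frame[:, :, c] + amount * edges, 0.0, 1.0)
    return out  # usage: sharpened = sharpen(frame_array)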

AMD is not popular because they can't even match 2070S, and their drivers seem to be a little more problematic.
 

Prelude

Member
Oct 25, 2017
2,555
That's a great price, especially because you can use DLSS and Raytracing.

AMD can't really compete with that imo.
It's 6GB. Even if we pretend ray tracing is a major factor right now, the 2060 doesn't have the horsepower to actually use it at acceptable framerates, and both AMD and Nvidia have better sharpening filters than DLSS. If anything it's Nvidia that couldn't compete at this price point, hence the price drop.
 

Firebrand

Member
Oct 25, 2017
4,709
This is the 6GB non-Super version yeah? Still, at just $20 difference it's a no-brainer to go with the 2060 instead. Crappy cooler might be a bigger deal on the 2060 though.

The price gap between 1660Ti/Super still makes no sense considering how close they are in performance.
 

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223
Nvidia is dropping prices precisely because AMD is competing, even if AMD is not doing amazing.

2060S is not strong enough to drive new games with full detail + raytracing effects. You have to sacrifice performance, standard graphics settings or raytracing. This will only get worse as games naturally get more demanding in the future.

DLSS is not dead, but it is redundant. Sharpening effects are now possible on both Nvidia and AMD GPUs with barely any performance hit, and they work really well (Nvidia's sharpening is a bit better). Sharpening works on everything, with no need for developers to implement anything.

AMD is not popular because they can't even match 2070S, and their drivers seem to be a little more problematic.
You will always have to sacrifice something with midrange cards, I don't see that as a problem. There's no need to always turn the settings up to the highest; many times there isn't even that much of a difference.

I suggest you inform yourself about what DLSS is and what it does. It's not merely a sharpening filter but rather deep learning image reconstruction software that actually adds detail compared to native resolution. I can recommend the new Digital Foundry video on Wolfenstein RTX on that matter.

The recent Wolfenstein RTX update shows that's not true.

If you play at 1080p without RTX, sure, but it's not enough if you want all the bells and whistles.

Wolfenstein is an unlucky case though, as that game is very VRAM hungry already. Other games don't run into VRAM limits even with full RT on.
 
Nov 8, 2017
13,095
the gpu will run out of juice well before you run into vram limits

For today's games, mostly yeah. These are always more long-term concerns. I doubt you'd ever truly be wrecked by this limit for the next few years, but turning VRAM-hungry settings down will likely start happening as more games use RT implementations, which can be VRAM intensive (Wolfenstein: Youngblood is a recent example - uber settings are fine normally, but turn on RT and it blows through the VRAM limit and starts thrashing, causing major perf degradation until you turn texture streaming down a notch).

I doubt it. Realtime raytracing and DLSS are the future of gaming moving forward. The three latest games to use DLSS all boast excellent results. They have obviously had a breakthrough. Why would they stop?

I believe it was speculated that Control wasn't actually using tensor cores for its DLSS implementation, that it was all in compute. But I'm not sure about others.
 

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223
I think it's more likely Nvidia drops the Tensor cores in future hardware and DLSS silently disappears or morphs into a more general upscaling solution.
To be honest, I don't think that's likely. The tensor cores are needed for inference of the deep learning model Nvidia trained on their servers. If the model had to run on the regular shader cores instead, it would cost way too much performance to make sense. That's why the tensor cores are needed to run inference on the DL model.

You could also have an open, general upscaling solution, but it would not be comparable to the AI-supported one, especially in regard to image quality.
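To put rough numbers on the inference cost (the network here is a made-up assumption, not Nvidia's actual DLSS model, and the RTX 2060 throughput figures are only approximate):

# Back-of-envelope: cost of running a learned upscaler every frame.
# Assumed network: 8 conv layers, 32 in/out channels, 3x3 kernels at 1080p.
frame_pixels = 1920 * 1080
macs_per_frame = frame_pixels * 8 * 32 * 32 * 3 * 3   # multiply-accumulates
flops_per_second = macs_per_frame * 2 * 60             # 2 FLOPs per MAC, 60 fps
print(f"~{flops_per_second / 1e12:.0f} TFLOP/s")       # ~18 TFLOP/s
# RTX 2060, roughly: ~6.5 TFLOPS FP32 on the shader cores vs ~50 TFLOPS FP16 on
# the tensor cores; only the tensor cores have the headroom to absorb this.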
I believe it was speculated that Control wasn't actually using tensor cores for its DLSS implementation, that it was all in compute. But I'm not sure about others.
I believe this is true, and it's supported by the fact that DLSS in Control has a performance hit and looks much worse than what we have in Wolfenstein and DUTTM, which afaik have hardware support confirmed.
 

closer

Member
Oct 25, 2017
3,165
can someone school me on the raytracing for 2060? I heard it's kind of just there and not really something you should buy with "future-proofing" in mind but I'm kind of ignorant on what that might mean

ah nm, someone posted about it above, that the 2060 in terms of performance cost w/ raytracing on is not particularly strong
 

asmith906

Member
Oct 27, 2017
27,355
Nvidia is dropping prices precisely because AMD is competing, even if AMD is not doing amazing.

2060S is not strong enough to drive new games with full detail + raytracing effects. You have to sacrifice performance, standard graphics settings or raytracing. This will only get worse as games naturally get more demanding in the future.

DLSS is not dead, but it is redundant. Sharpening effects are now possible on both Nvidia and AMD GPUs with barely any performance hit, and they work really well (Nvidia's sharpening is a bit better). Sharpening works on everything, with no need for developers to implement anything.

AMD is not popular because they can't even match 2070S, and their drivers seem to be a little more problematic.
I don't know why AMD isn't more popular. The RX 580 is still an insane card for the price it is; it makes getting into PC gaming really cheap.
 
Oct 27, 2017
6,960
I suggest you inform yourself about what DLSS is and what it does. It's not merely a sharpening filter but rather deep learning image reconstruction software that actually adds detail compared to native resolution. I can recommend the new Digital Foundry video on Wolfenstein RTX on that matter.

Thank you for being so formal. Unfortunately I don't need your marketing spiel; instead, I would recommend watching independent reviews to gather information and form your own conclusions:

 

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223
can someone school me on the raytracing for 2060? I heard it's kind of just there and not really something you should buy with "future-proofing" in mind but I'm kind of ignorant on what that might mean
It's pretty sufficient for 1080p60 on most titles with ray tracing on. BF5 runs at native 1080p60 ultra with textures down to high and RT on medium, Metro runs at 1080p60 high with RT on and DLSS. Control runs at full ray tracing settings with DLSS and volumetrics turned down. Wolfenstein runs with RT on at high settings at 1080p60 as well.

Overall, it's pretty good.

Thank you for being so formal. Unfortunately I don't need your marketing spiel; instead, I would recommend watching independent reviews to gather information and form your own conclusions:



Sorry, but the fact that you are referring to a months-old video covering even older games that use DLSS, which is a deep learning algorithm, shows me that you are already quite biased on the matter. DLSS in those games and DLSS in newer games like Wolfenstein and Deliver Us to the Moon are worlds apart; you can check them out from any source you want.
 
Last edited:

hans_castorp

Member
Oct 27, 2017
1,458
ELI5, is this a good thing?
I've been thinking about building a PC and I'm not sure if now is the right time or whether to wait a few more months, as I'm not really in a hurry
 

Monster Zero

Member
Nov 5, 2017
5,612
Southern California
To be honest, I don't think that's likely. The tensor cores are needed for inference of the deep learning model Nvidia trained on their servers. If the model had to run on the regular shader cores instead, it would cost way too much performance to make sense. That's why the tensor cores are needed to run inference on the DL model.

You could also have an open, general upscaling solution, but it would not be comparable to the AI-supported one, especially in regard to image quality.

I believe this is true, and it's supported by the fact that DLSS in Control has a performance hit and looks much worse than what we have in Wolfenstein and DUTTM, which afaik have hardware support confirmed.


Makes sense, in this blog they said utilizing the tensor cores was the next step.
 
Nov 8, 2017
13,095
ELI5, is this a good thing?
I've been thinking about building a PC and I'm not sure if now is the right time or whether to wait a few more months, as I'm not really in a hurry

Price drops are always good but that doesn't necessarily mean this is now the default thing to buy.

I always recommend waiting if you don't literally need something right now. New products will release, prices will come down. Without knowing the exact future we can't know when the "right time to buy" is, of course.
 

Megawarrior

Member
Oct 25, 2017
2,355
The PC I got on Black Friday has one. Really pleased with it so far, even tho I mostly play FFXIV on it.
 
Oct 27, 2017
6,960
ELI5, is this a good thing?
I've been thinking about building a PC and I'm not sure if now is the right time or whether to wait a few more months, as I'm not really in a hurry

Now. If you don't have a gaming PC, the time is now.

CPUs are in a very good spot for price/performance. RAM/storage is probably at its lowest and predicted to climb back up. GPUs in the $200-360 range are great; anything above that drops off heavily in price/performance.
 

shark97

Banned
Nov 7, 2017
5,327
the gpu will run out of juice well before you run into vram limits

For now, maybe. (maybe).

I always like extra RAM on my cards. I remember I got an HD 4890 a long time ago and went with the 1GB version. Everybody advised me to get the 512MB version. Later on, I was sure glad I had the extra VRAM. Heck, I almost used that card to play Destiny 2 on PC in 2019 (I had sold my RX 480 to bitcoiners and had nothing but an Intel IGP!)

Same deal: I recently bought a 570 on the cheap, just to have some basic gaming capability (I'm not much of a PC gamer). Although people like you would claim 4GB was fine, I feel great having the 8GB version.

6GB vs 8GB is less of a delta, but still a little troubling.
 

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223

hans_castorp

Member
Oct 27, 2017
1,458
Price drops are always good but that doesn't necessarily mean this is now the default thing to buy.

I always recommend waiting if you don't literally need something right now. New products will release, prices will come down. Without knowing the exact future we can't know when the "right time to buy" is, of course.
That's what I wanted to know. Are we expecting new cards soonish?
The last time I built a PC for gaming purposes was like 12 years ago, so I have no idea how the market works atm 😅
 

Jroc

Banned
Jun 9, 2018
6,145
I'm thinking 6GB VRAM in 2020 will end up being like 2GB back in 2012. Fine right now, but once the new consoles are up and running things will start to hurt.
 

Monster Zero

Member
Nov 5, 2017
5,612
Southern California
It's safe to assume that DLSS was only software and not powered by AI before Deliver Us To The Moon, which is really dumb on Nvidia's side because DLSS's reputation took a pretty strong hit with the first implementations. Maybe the deep learning models were not ready at that time.

Tech is constantly getting better/improving. Now DLSS is bearing fruit. The same youtubers that shit on DLSS in Metro didn't say anything about it when Nvidia improved it a couple of weeks later. People like to sell negativity.
 

mephixto

Member
Oct 25, 2017
306
It's safe to assume that DLSS was not powered by AI before Deliver Us To The Moon, which is really dumb on Nvidia's side because DLSS's reputation took a pretty strong hit with the first implementations. Maybe the deep learning models were not ready at that time.

That's what deep "learning" is: at the start it's not gonna be that good, and with the passing of time it improves.

The latest DF video about RT and DLSS in Wolfenstein shows how good DLSS can be.

youtu.be

Wolfenstein Youngblood - Ray Tracing/VRS/DLSS in id Tech 6 - A Next-Gen Features Showcase?

id Tech 6 - a properly impressive, performant engine gets a ray tracing upgrade, along with the most impressive DLSS support we've seen yet *and* VRS (variab...
 

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223
I'm thinking 6GB VRAM in 2020 will end up being like 2GB back in 2012. Fine right now, but once the new consoles are up and running things will start to hurt.
Do you think VRAM usage will increase because the next consoles arrive, when they only have 13 GB of shared RAM/VRAM and rely on SSD data streaming? I don't see that. Textures in next-gen games will also likely be made for 4K resolution, so I would think at 1080p (maybe even 1440p) and normal texture settings you will be fine with 6 GB.
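Some rough numbers behind the texture point (the texture sizes are illustrative; BC7 at 1 byte per texel and roughly a third extra for the mip chain are standard figures):

# Rough texture-memory arithmetic for one BC7-compressed material texture.
def texture_mib(side, bytes_per_texel=1.0, mip_overhead=4 / 3):
    return side * side * bytes_per_texel * mip_overhead / 2**20

print(f"4096x4096 ('4K-ready'):  {texture_mib(4096):.1f} MiB")  # ~21.3 MiB
print(f"2048x2048 (1080p-sized): {texture_mib(2048):.1f} MiB")  # ~5.3 MiB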

Lockhart also sets the baseline a bit lower; the RTX 2060 performs a good step above Lockhart.

That's what deep "learning" is: at the start it's not gonna be that good, and with the passing of time it improves.

The latest DF video about RT and DLSS in Wolfenstein shows how good DLSS can be.

youtu.be

Wolfenstein Youngblood - Ray Tracing/VRS/DLSS in id Tech 6 - A Next-Gen Features Showcase?

id Tech 6 - a properly impressive, performant engine gets a ray tracing upgrade, along with the most impressive DLSS support we've seen yet *and* VRS (variab...
Yes that's right. It's pretty amazing how good DLSS has become.
 
Last edited:

ShinUltramanJ

Member
Oct 27, 2017
12,949
It's funny how I occasionally see comments as if AMD isn't serious competition. And yet Nvidia is constantly reacting to them.
 

DSP

Member
Oct 25, 2017
5,120
The EVGA KO now looks like a bad deal since the cooling seems cheap AF, but then again I don't think there is that much old 2060 stock out there, so if you want one you have to buy these new cheaply built cards. The 2060 is a very good card at this new price imo. The 2060 Super now seems very overpriced and just not worth it; it's not really that much better in performance, and $100 for 2GB more VRAM is poor value.

In a perfect world, the 2060 Super would have been the 2060 from the start and would now be selling for $300, sigh.
 

Advc

Member
Nov 3, 2017
2,632
This is excellent news, and it's the reason why competition is always good for the consumer. The 2060 is more than enough for people who are only gonna bother with 1080p (like me), plus it has native ray tracing! I'm just waiting for retailers to drop the prices on their pages so I can order one! Glad I waited, cuz I was about to purchase a 1660 Super last week. Currently waiting for the new KO model by EVGA, but it's still not available on Amazon.