Deleted member 19533

User requested account closure
Banned
Oct 27, 2017
3,873
But they're having no issue selling more cards and we're all hungry for new tech. Nvidia as a company is crushing it right now and has no high end competition. They have no reason to cut prices.
People are holding onto their 10 series cards. They need a reason to upgrade, and the 20XX series didn't provide it. This one needs to. Why would anyone buy a new 3070 or 3080 if a console is delivering the same games with better performance? They're at the start of a new generation. They need to deliver on performance and price.
 

turbobrick

Member
Oct 25, 2017
13,190
Phoenix, AZ
People are holding onto their 10 series cards. They need a reason to upgrade, and the 20XX series didn't provide it. This one needs to. Why would anyone buy a new 3070 or 3080 if a console is delivering the same games with better performance? They're at the start of a new generation. They need to deliver on performance and price.

Well, people will have to upgrade if their cards are too old and they have a hard time playing new games.
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
People like the path of least resistance. I believe the people in the most populous bracket (the $150-$300 range) are the ones most at risk of jumping ship if that path of least resistance is a console. If the 3000 series goes up in price, then at the very least, cutting prices so that the 2070S falls within that bracket would get a lot of people to upgrade to a new card instead of letting their PCs stagnate.
 

Veliladon

Member
Oct 27, 2017
5,565
Well, people will have to upgrade if their cards are too old and they have a hard time playing new games.

My 1080 Ti is 14.3 TFLOPs running at its typical 2 GHz boost clock. That's still faster than a 2080 Super. What exactly am I going to be playing that would force me to upgrade, yet wouldn't also beat the stuffing out of any card below the 2080 Ti?

Our only upgrade path from a 1080 Ti was a 2080 Ti. It cost a fucking grand and didn't deliver the kind of performance increase you'd need to justify a thousand bucks. I'm holding out hope for whatever comes next.
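For anyone wondering where the 14.3 figure comes from, it's the standard peak-FP32 back-of-envelope: two floating-point operations (one fused multiply-add) per CUDA core per clock. A minimal sketch, assuming the 1080 Ti's 3584 CUDA cores and the ~2.0 GHz boost mentioned above:

```python
# Peak FP32 throughput: 2 FLOPs (one fused multiply-add) per CUDA core
# per clock cycle. This is a theoretical ceiling, not measured performance.

def peak_tflops(cuda_cores: int, clock_ghz: float) -> float:
    """Peak single-precision TFLOPs."""
    return 2 * cuda_cores * clock_ghz / 1000

# GTX 1080 Ti: 3584 CUDA cores at a ~2.0 GHz sustained boost
print(round(peak_tflops(3584, 2.0), 1))  # -> 14.3
```

Real-world throughput depends on memory bandwidth and how well a workload keeps the cores fed, so treat the number as an upper bound.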
 

BeI

Member
Dec 9, 2017
6,045
Not high end competition, no.

Make no mistake, the 3080 is going to run circles around what's in the new consoles.

If it's priced like the 2080, it might be something like $700+ for the GPU alone, and it might not even be twice as powerful.

Fingers crossed that there's some big advancement in this next GPU series that will greatly benefit future games, like the 8xxx series in 2006 with unified shaders.
 

Sabin

Member
Oct 25, 2017
4,704
Nvidia ain't giving this kind of performance increase away without a bump.

3080 Ti - $1499
3080 - $999
3070 - $699
3060 - $499

You know it to be true.

Leaks suggested that Nvidia is looking to make the RTX 3080 and RTX 3070 slightly cheaper than their Turing counterparts at launch.
 

turbobrick

Member
Oct 25, 2017
13,190
Phoenix, AZ
My 1080 Ti is 14.3 TFLOPs running at its typical 2 GHz boost clock. That's still faster than a 2080 Super. What exactly am I going to be playing that would force me to upgrade, yet wouldn't also beat the stuffing out of any card below the 2080 Ti?

Our only upgrade path from a 1080 Ti was a 2080 Ti. It cost a fucking grand and didn't deliver the kind of performance increase you'd need to justify a thousand bucks. I'm holding out hope for whatever comes next.

Well, a 1080 Ti might be a bit of an exception, as it's still pretty good. There are still people on 1070s and 1080s, which will show their age going forward, and they will be the ones buying the 3070 or 3080.
 

Mesoian

▲ Legend ▲
Member
Oct 28, 2017
26,943
If it's priced like the 2080, it might be something like $700+ for the GPU alone, and it might not even be twice as powerful.

Fingers crossed that there's some big advancement in this next GPU series that will greatly benefit future games, like the 8xxx series in 2006 with unified shaders.

Oh, expect the 3080s to be at least a grand. I doubt Nvidia has learned its lesson at all.

But still, it'll be running laps around the consoles.
 

Veliladon

Member
Oct 27, 2017
5,565
It all comes back to RTG being a complete joke.

God I hope Xe can put some price pressure on Nvidia or nothing is going to change.
 

Sabin

Member
Oct 25, 2017
4,704
Oh, expect the 3080s to be at least a grand. I doubt Nvidia has learned its lesson at all.

But still, it'll be running laps around the consoles.

It's important to point out that it's very early to talk about pricing, which NVIDIA tends to set pretty much at the last minute. So definitely take this with a grain of salt, although I thought it was interesting enough that NVIDIA was communicating to its partners that it's looking to make the Ampere lineup more price competitive than Turing currently is. Whether they will end up following through on this or not is still up in the air and will definitely be influenced by what AMD does with Radeon next year.

wccftech

At worst, I'm expecting the same launch price as Turing.
 

Deleted member 19533

User requested account closure
Banned
Oct 27, 2017
3,873
Not high end competition, no.

Make no mistake, the 3080 is going to run circles around what's in the new consoles.
From a raw power perspective, sure. But given that console versions of games tend to be much better optimized, since they only need to run on one set of hardware, the difference won't be that big at the beginning of the generation.
 

Armaros

Member
Oct 25, 2017
4,902
From a raw power perspective, sure. But given that console versions of games tend to be much better optimized, since they only need to run on one set of hardware, the difference won't be that big at the beginning of the generation.

We have already seen from the past that console hardware does not punch above its predicted level.

And historically, even some of the worst PC ports have run better than their console equivalents, and with higher settings.
 
OP
OP
Earvin Infinity
Oct 27, 2017
6,905
A console matching a 2018 GPU isn't putting any pressure on Nvidia, lol. Plus the new GPUs will widen the gap quite a bit across the board; consoles are on the cheaper side for a reason.

With the new consoles looming, as well as Big Navi this year, I think Nvidia will at least not price gouge on Ampere as badly as they did with Turing.
 

Deleted member 19533

User requested account closure
Banned
Oct 27, 2017
3,873
We have already seen from the past that console hardware does not punch above its predicted level.

And historically, even some of the worst PC ports have run better than their console equivalents, and with higher settings.
I don't agree with that at all. A 1070, for example, plays the same games as a PS4 Pro at the same resolution at 60 FPS. Cleaner, but the visual difference isn't significant.

I'll also point out that I bought Nioh on PC recently, and while it's cleaner, it runs like shit and hiccups regularly for no reason (at 1080p, on much more powerful hardware). So I'm not sure which ports you're talking about, because there are some games that are just better optimized on console.
 

Rice Eater

Member
Oct 26, 2017
2,822
Not high end competition, no.

Make no mistake, the 3080 is going to run circles around what's in the new consoles.

True, but that's a $700 card at minimum. How will the $400 card (very likely the 3060) compare to the next-gen Xbox?

It sure looks like MS has aimed so high that even at $600 their next console will provide pretty good value compared to what PC will be offering in late 2020. I don't think we'll see "build an Xbox for $600" videos a few months after it releases this time around.
 

DeaDPooL_jlp

Banned
Oct 31, 2017
2,518
With the new consoles looming, as well as Big Navi this year, I think Nvidia will at least not price gouge on Ampere as badly as they did with Turing.

I hope so. The 2070 Super was part of my first build and it cost $500 on its own. I'm prepared to go up to $1,000 if I can be sure I'll get much better ray tracing performance.
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
I don't agree with that at all. A 1070, for example, plays the same games as a PS4 Pro at the same resolution at 60 FPS. Cleaner, but the visual difference isn't significant.

I'll also point out that I bought Nioh on PC recently, and while it's cleaner, it runs like shit and hiccups regularly for no reason (at 1080p, on much more powerful hardware). So I'm not sure which ports you're talking about, because there are some games that are just better optimized on console.
EDIT: disregard, I was thinking of Sekiro
 

BeI

Member
Dec 9, 2017
6,045
I don't agree with that at all. A 1070, for example, plays the same games as a PS4 Pro at the same resolution at 60 FPS. Cleaner, but the visual difference isn't significant.

I'll also point out that I bought Nioh on PC recently, and while it's cleaner, it runs like shit and hiccups regularly for no reason (at 1080p, on much more powerful hardware). So I'm not sure which ports you're talking about, because there are some games that are just better optimized on console.

It often seems to be the case that PC games are played at a lower resolution than the hardware could probably manage, too. My husband plays Battlefield V at 1440p Ultra with a ~10 TFLOP 2070 Super, yet an X1X plays it at a dynamic 1800p-4K, and I don't think Ultra settings look amazingly better anyway.

With even less difference in settings at the start of next gen, I could easily see there not being enough of a gap between the Xbox and high-end PC GPUs to make the latter as worthwhile.
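For context, the raw pixel counts behind those resolutions are easy to tally up. A rough sketch (frame cost scales only loosely with pixels shaded, since settings and CPU load matter too, so treat the ratios as ballpark):

```python
# Pixel counts for the resolutions discussed above. Rendering cost scales
# only roughly with pixel count; settings and CPU load also matter.

resolutions = {
    "1440p": 2560 * 1440,
    "1800p": 3200 * 1800,
    "2160p (4K)": 3840 * 2160,
}

base = resolutions["1440p"]
for name, px in resolutions.items():
    print(f"{name}: {px / 1e6:.1f} MP ({px / base:.2f}x 1440p)")
# 2160p works out to 2.25x the pixels of 1440p
```

So a dynamic 1800p-4K output is pushing roughly 1.5x to 2.25x the pixels of fixed 1440p, before any settings differences.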
 

BreakAtmo

Member
Nov 12, 2017
12,984
Australia
It often seems to be the case that PC games are played at a lower resolution than the hardware could probably manage, too. My husband plays Battlefield V at 1440p Ultra with a ~10 TFLOP 2070 Super, yet an X1X plays it at a dynamic 1800p-4K, and I don't think Ultra settings look amazingly better anyway.

With even less difference in settings at the start of next gen, I could easily see there not being enough of a gap between the Xbox and high-end PC GPUs to make the latter as worthwhile.

I have heard many times that Ultra settings tend to carry a serious performance penalty for what is arguably a minimal visual improvement, though of course that'll vary from game to game.
 

Haint

Banned
Oct 14, 2018
1,361
Not high end competition, no.

Make no mistake, the 3080 is going to run circles around what's in the new consoles.

Most high-end PC gamers are whales that buy everything. But while 2-year GPU refreshes are easily absorbed financially, two $500 consoles AND a $700+ GPU all at once will assuredly push the limits for most. Nvidia will absolutely be competing for the same gamer dollars as the consoles, and if they put out shitty upgrades at ridiculous prices, they're going to lose big.
 

Csr

Member
Nov 6, 2017
2,042
I don't agree with that at all. A 1070, for example, plays the same games as a PS4 Pro at the same resolution at 60 FPS. Cleaner, but the visual difference isn't significant.

I'll also point out that I bought Nioh on PC recently, and while it's cleaner, it runs like shit and hiccups regularly for no reason (at 1080p, on much more powerful hardware). So I'm not sure which ports you're talking about, because there are some games that are just better optimized on console.

Without benchmarks you can't really draw any conclusions.
For example, in RDR2 a 580 does perform worse than the X1X, which has a similar GPU, but the difference is small (from what I can tell, around 10%, maybe slightly higher), and the X1X has some settings lower than the lowest setting on PC, so the performance of these GPUs is probably very close. Some GPUs with low VRAM have fallen further behind in some games, though.
Perhaps with much better CPUs in the consoles, things will be different.
 

Wumbo64

Banned
Oct 27, 2017
327
I have heard many times that Ultra settings tend to carry a serious performance penalty for what is arguably a minimal visual improvement, though of course that'll vary from game to game.

Frostbite is a nasty culprit for diminishing returns at Ultra settings. Good PC ports now often show you screenshots of what is being improved at different settings. Oftentimes Ultra is immediately and clearly better: stupid draw distances, more and higher-fidelity lighting, more foliage density, and of course just uncompressed textures.

Older software definitely has a worse issue with Ultra being far too demanding relative to what the player gets in return. Stuff like Ultra shadows in BioShock Infinite comes to mind, where running at High regains substantial performance while showing virtually zero loss in quality under even scrupulous examination.

It's why I wait to read user reviews and watch tech breakdowns like those from Digital Foundry. If there is some bizarre anomaly, they'll cover it. I game on a 4K monitor with a 1080 Ti, and I usually just let things run wide open. I only disable smeary AA and screen effects as standard.

I also welcome the new consoles; they are the main thing that will force NVIDIA to really reinvigorate their product line. Some games, even at low settings, have outstanding IQ and framerates on consoles (and often at 4K as well!). Nvidia's new cards need to be far more performant than the consoles to serve multiple markets. I think when we finally get benchmarks, the 3080 Ti will be close to twice as potent as a 1080 Ti (which is still a powerhouse).

That excites me greatly. It also kind of makes me sad. Even with double the performance, a lot of current and not-so-old titles will still be VERY difficult to run at 4K. Like, I could probably run Jedi: Fallen Order at just 80 fps with everything cranked. I can't imagine how under-powered these things will be by the time next-gen games get PC ports and the graphics demands are just insane.
 

Csr

Member
Nov 6, 2017
2,042
So will my 2080 be behind the X1X? I got a 9700K to pair with it and 32 GB of 3600 RAM.

No one can say for sure.
I suspect you'll have similar performance, and if the game supports DLSS your card will probably have the lead (assuming AMD doesn't introduce something as good as the recent DLSS version for the consoles).
 

GhostTrick

Member
Oct 25, 2017
11,490
I don't agree with that at all. A 1070, for example, plays the same games as a PS4 Pro at the same resolution at 60 FPS. Cleaner, but the visual difference isn't significant.

I'll also point out that I bought Nioh on PC recently, and while it's cleaner, it runs like shit and hiccups regularly for no reason (at 1080p, on much more powerful hardware). So I'm not sure which ports you're talking about, because there are some games that are just better optimized on console.

You mean the same Nioh which can drop its resolution down to 720p on PS4 Pro to maintain 60fps?
 

Kadath

Member
Oct 25, 2017
622
I think the most worrying aspect is that PC hardware whales aren't that statistically significant. New games running poorly for MOST people will drive those people to consoles, and PC game sales will see a sharp decline. That sharp decline will lead to tighter budgets, less optimization, and more rushed games (or fewer ports), so worse performance, and so on.

This chain of effects continues until PC parts are once again more accessible.
 

Deleted member 13560

User requested account closure
Banned
Oct 27, 2017
3,087
From a raw power perspective, sure. But given that console versions of games tend to be much better optimized, since they only need to run on one set of hardware, the difference won't be that big at the beginning of the generation.

The GTX 780 released in May 2013 and significantly outperformed the XB1 and PS4; the 780 Ti came out the same year and was an even bigger leap in performance. It wasn't until the XB1X released in 2017 that a console beat out the 780 Ti. The closest consoles ever came was 2005, with the release of the Xbox 360, but even then the mythical 8800 GTX released just one year later, in November 2006, the same month as the PS3, and outperformed both consoles. The Series X's GPU has performance comparable to NVDA's 2016/17 flagship cards.

I mean, Nvidia's top-end card will probably cost over twice as much as the new Xbox.

I don't think very many people who buy flagship GPUs are looking too closely at the price-to-performance ratio.


A 12TF console isn't competition?

12 TFLOPs was achieved by overclocking Titan X (Pascal) cards in 2016. People overclocking the 1080 Ti also game at 12+ TFLOPs. The Titan Xp (2017 version) does 12.1 TFLOPs at its rated boost clock, and the 2080 Ti, released in 2018, does 13.4 TFLOPs at its reference boost clock and 14+ overclocked.

Will a Series X/PS5 be the best bang for the buck? Hell yes. Anyone saying they could build a machine with a comparable GPU/CPU/storage/memory/etc. for the same money on the day the new consoles release is just talking nonsense. So they will definitely be competition for people deciding between a console and a low- to mid-level enthusiast GPU.
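All of those TFLOP figures come from the same peak-FP32 rule of thumb: 2 FLOPs (one fused multiply-add) per CUDA core per clock. A quick sanity check on the numbers in this post, using the public core counts and approximate boost clocks (the clocks here are assumptions for illustration, not official ratings):

```python
# Peak FP32 TFLOPs = 2 FLOPs (FMA) per CUDA core per clock.
# Core counts are public specs; clocks are approximate boost speeds.

def peak_tflops(cores: int, clock_mhz: int) -> float:
    return 2 * cores * clock_mhz / 1e6

cards = [
    ("Titan Xp (2017)", 3840, 1582),              # ~12.1 TFLOPs
    ("GTX 1080 Ti @ 2.0 GHz OC", 3584, 2000),     # ~14.3 TFLOPs
    ("RTX 2080 Ti (reference boost)", 4352, 1545),# ~13.4 TFLOPs
]
for name, cores, mhz in cards:
    print(f"{name}: {peak_tflops(cores, mhz):.1f} TFLOPs")
```

Peak TFLOPs also aren't directly comparable across architectures (Turing does more per FLOP than Pascal, and RDNA2 differs again), so the console comparison is looser than the raw numbers suggest.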

I think the most worrying aspect is that PC hardware whales aren't that statistically significant. New games running poorly for MOST people will drive those people to consoles, and PC game sales will see a sharp decline. That sharp decline will lead to tighter budgets, less optimization, and more rushed games (or fewer ports), so worse performance, and so on.

This chain of effects continues until PC parts are once again more accessible.

But multiplat games released since 2013 have run just fine on PC for the most part. The main gripes have been low-effort ports, but a low-effort or bare-bones port doesn't equate to a worse-performing game. Most issues I've seen arise from people having bottlenecks in their systems. People maxing out games and then claiming they're poorly optimized is another big one. That's why I wish developers would label settings to show the console equivalents and let people step settings up from there.
 
Last edited:

ShadowFox08

Banned
Nov 25, 2017
3,524
So with GDC in SF pushed back to summer due to the coronavirus scare, I wonder if Nvidia will still reveal around March, perhaps online via an Nvidia Direct?
 

BreakAtmo

Member
Nov 12, 2017
12,984
Australia
Frostbite is a nasty culprit for diminishing returns at Ultra settings. Good PC ports now often show you screenshots of what is being improved at different settings. Oftentimes Ultra is immediately and clearly better: stupid draw distances, more and higher-fidelity lighting, more foliage density, and of course just uncompressed textures.

Older software definitely has a worse issue with Ultra being far too demanding relative to what the player gets in return. Stuff like Ultra shadows in BioShock Infinite comes to mind, where running at High regains substantial performance while showing virtually zero loss in quality under even scrupulous examination.

It's why I wait to read user reviews and watch tech breakdowns like those from Digital Foundry. If there is some bizarre anomaly, they'll cover it. I game on a 4K monitor with a 1080 Ti, and I usually just let things run wide open. I only disable smeary AA and screen effects as standard.

I also welcome the new consoles; they are the main thing that will force NVIDIA to really reinvigorate their product line. Some games, even at low settings, have outstanding IQ and framerates on consoles (and often at 4K as well!). Nvidia's new cards need to be far more performant than the consoles to serve multiple markets. I think when we finally get benchmarks, the 3080 Ti will be close to twice as potent as a 1080 Ti (which is still a powerhouse).

That excites me greatly. It also kind of makes me sad. Even with double the performance, a lot of current and not-so-old titles will still be VERY difficult to run at 4K. Like, I could probably run Jedi: Fallen Order at just 80 fps with everything cranked. I can't imagine how under-powered these things will be by the time next-gen games get PC ports and the graphics demands are just insane.

This is why I'm waiting for Black Friday to get a new rig with a 3080. I'm looking forward to it, especially since I'm hoping to target 3840x1610 DLSS rather than native 2160p. Hopefully that will let me really push framerates and graphics settings.
 

Wumbo64

Banned
Oct 27, 2017
327
This is why I'm waiting for Black Friday to get a new rig with a 3080. I'm looking forward to it, especially since I'm hoping to target 3840x1610 DLSS rather than native 2160p. Hopefully that will let me really push framerates and graphics settings.

Honestly, games just need to slow down on the tech front. People who want ray tracing on top of dozens of gigabytes of uncompressed high-resolution textures and gigantic realized vistas are going to ensure every game has a 5-to-7-year development cycle and never runs well on the hardware of its day.

DLSS is a neat feature, but I greatly prefer games with native dynamic resolution options.
 

pswii60

Member
Oct 27, 2017
26,756
The Milky Way
This is why I'm waiting for Black Friday to get a new rig with a 3080. I'm looking forward to it, especially since I'm hoping to target 3840x1610 DLSS rather than native 2160p. Hopefully that will let me really push framerates and graphics settings.
FYI, only a few games support DLSS. It's not like checkerboarding etc.; it's not something that can simply be enabled or forced. For DLSS, a game has to be sent to Nvidia's DLSS team, where it gets put through their AI computers to create the data the tensor cores ultimately call upon when you're playing the game.

Edit: I'm wrong! Apparently that manual process is no longer required for DLSS. Awesome.
 
Last edited:

pswii60

Member
Oct 27, 2017
26,756
The Milky Way
That excites me greatly. It also kind of makes me sad. Even with double the performance, a lot of current and not-so-old titles will still be VERY difficult to run at 4K. Like, I could probably run Jedi: Fallen Order at just 80 fps with everything cranked. I can't imagine how under-powered these things will be by the time next-gen games get PC ports and the graphics demands are just insane.
JFO runs at a locked 60 on a 2080 Ti at locked native 4K at Ultra settings. Compare that to the X, which runs at an unlocked 30 with way lower settings at a dynamic 1440p that drops down to 1080p. It's a huge jump.

Of course, as we get a couple of years into next gen and games are targeting 2160cb/30fps on PS5's rumoured ~9 TF as a baseline, it's certain to be a struggle to expect native 4K/locked 60/Ultra settings on a 2080 Ti, but knocking the resolution down slightly or lowering the settings from Ultra to High could get you in the ballpark.

Of course by 2022 we'll be talking about the 4080 Ti and the 2080 Ti's 15+ teraflops (OEM) will seem terribly old fashioned.
 

BreakAtmo

Member
Nov 12, 2017
12,984
Australia
FYI, only a few games support DLSS. It's not like checkerboarding etc.; it's not something that can simply be enabled or forced. For DLSS, a game has to be sent to Nvidia's DLSS team, where it gets put through their AI computers to create the data the tensor cores ultimately call upon when you're playing the game.

That's how it was before, but DLSS has since been upgraded. It no longer needs a specific algorithm trained for each individual title, and it also seems to deliver better results. Wolfenstein: Youngblood is the first game to use it, and Control also shipped a version that didn't even use the tensor cores.

www.kotaku.com.au

Nvidia Very Quietly Made DLSS A Hell Of A Lot Better

When Nvidia launched their RTX GPUs, the cards shipped with a wealth of potential to leverage AI in different scenarios. One of those was deep learning super sampling (DLSS), an AI-powered anti-aliasing technique that was designed to improve frame rates at higher resolutions by using neural...

Based on these advances, I'm expecting most next-gen games to come with it.
 

scabobbs

Member
Oct 28, 2017
2,110
People are holding onto their 10 series cards. They need a reason to upgrade, and the 20XX series didn't provide it. This one needs to. Why would anyone buy a new 3070 or 3080 if a console is delivering the same games with better performance? They're at the start of a new generation. They need to deliver on performance and price.
Because they play games on their PC?
 

pswii60

Member
Oct 27, 2017
26,756
The Milky Way
That's how it was before, but DLSS has since been upgraded. It no longer needs a specific algorithm trained for each individual title, and it also seems to deliver better results. Wolfenstein: Youngblood is the first game to use it, and Control also shipped a version that didn't even use the tensor cores.

www.kotaku.com.au

Nvidia Very Quietly Made DLSS A Hell Of A Lot Better

When Nvidia launched their RTX GPUs, the cards shipped with a wealth of potential to leverage AI in different scenarios. One of those was deep learning super sampling (DLSS), an AI-powered anti-aliasing technique that was designed to improve frame rates at higher resolutions by using neural...

Based on these advances, I'm expecting most next-gen games to come with it.
I did not know this!! Very interesting.
 

ShadowFox08

Banned
Nov 25, 2017
3,524
That's how it was before, but DLSS has since been upgraded. It no longer needs a specific algorithm trained for each individual title, and it also seems to deliver better results. Wolfenstein: Youngblood is the first game to use it, and Control also shipped a version that didn't even use the tensor cores.

www.kotaku.com.au

Nvidia Very Quietly Made DLSS A Hell Of A Lot Better

When Nvidia launched their RTX GPUs, the cards shipped with a wealth of potential to leverage AI in different scenarios. One of those was deep learning super sampling (DLSS), an AI-powered anti-aliasing technique that was designed to improve frame rates at higher resolutions by using neural...

Based on these advances, I'm expecting most next-gen games to come with it.
"the key technical point is that Control's implementation of DLSS didn't actually use the special tensor cores that are exclusively part of Nvidia's RTX cards, but regular shader cores that you can find on any GPU. "

Now that's really interesting.

edit: but the article goes on to say that this version brings a lot of artifacts and shimmering, and that the future is to use the tensor cores for DLSS
 
Last edited:

dex3108

Member
Oct 26, 2017
22,966
"the key technical point is that Control's implementation of DLSS didn't actually use the special tensor cores that are exclusively part of Nvidia's RTX cards, but regular shader cores that you can find on any GPU. "

Now that's really interesting

And kinda limiting. That's why the DLSS 2.0 used in Wolfenstein: Youngblood is better. They learned a lot from the Control version (let's call it 1.9, like Digital Foundry did) and carried that into 2.0, which uses the Tensor Cores and achieves great results.
 

dgrdsv

Member
Oct 25, 2017
12,065
GTC is still on:
www.nvidia.com

GTC 2021: #1 AI Conference

Free registration. April 12-16, 2021. - The conference for AI innovators, technologists, and creatives. Join fellow AI innovators for webinars, training, demos, and more. Registration now open.
Which means that we will get some info on Ampere there most likely.

Nvidia ain't giving this kind of performance increase away without a bump.

3080 Ti - $1499
3080 - $999
3070 - $699
3060 - $499

You know it to be true.
With the 3060 being faster than a 2080 Ti, sure. Otherwise, nope.
There is no reason to increase prices on the 30 series.
 
Oct 26, 2017
9,859
GTC is still on:
www.nvidia.com

GTC 2021: #1 AI Conference

Free registration. April 12-16, 2021. - The conference for AI innovators, technologists, and creatives. Join fellow AI innovators for webinars, training, demos, and more. Registration now open.
Which means that we will get some info on Ampere there most likely.


With the 3060 being faster than a 2080 Ti, sure. Otherwise, nope.
There is no reason to increase prices on the 30 series.

If the 3060 is indeed faster than the 2080 Ti, that's actually good.

Assuming they can price it at 499 bucks and not higher.
 

wollywinka

Member
Feb 15, 2018
3,118
I'm not a PC gamer, but I recently got a new MacBook Pro, which I installed Boot Camp on to do some gaming. Really enjoying it. While performance is acceptable, laptops are noisy and hot, so I think I'm gonna build my first gaming PC. At what point should I buy into Ampere? I would go for a high-end card. Relatively silent operation is very important to me.