
z0m3le

Member
Oct 25, 2017
5,418
I mean, it's like what, 250W just for this card? Isn't that more than the entire Xbox One X? You put something like this in a console and it's the RROD saga all over again.
The XB1X uses something like 180W in a console half the size of the Xbox. It wouldn't surprise me if the Xbox Series X is designed for 300W. With a 140mm fan exhausting through the top, it should cool the console enough for such power consumption.
 

MPrice

Alt account
Banned
Oct 18, 2019
654
From what we're hearing, the Series X will push beyond 300W


I like the contrast

The XB1X uses something like 180W in a console half the size of the Xbox. It wouldn't surprise me if the Xbox Series X is designed for 300W. With a 140mm fan exhausting through the top, it should cool the console enough for such power consumption.

I can definitely believe 300W. That likely isn't all going to the GPU, though; they aren't using Jaguar cores this go-around.
 

kaputt

Member
Oct 27, 2017
1,204
I don't expect a new console to be better than a GPU that, alone, probably draws about as much power as the entire console.

My main concern with PC parts right now is cost-efficiency. I'm delaying my PC upgrade because I don't believe any of the current GPUs offer good price-to-performance compared to the consoles.
 

HeWhoWalks

Member
Jan 17, 2018
2,522
They're a useful metric for researchers and academics. Not enthusiasts on a gaming forum. I disagree about the esoteric nature of the workloads. I think low occupancy workloads will become increasingly important as time goes on. Volumetric effects and ray tracing are both pretty hard to make high-occupancy. I get that you don't think there's a point to explaining it, but I think it's worth saying how pointless this whole discussion is.


Fine, I'll satisfy your pedantry. TFs aren't useful here because there are so many performance characteristics that differ between a console and a desktop GPU. This is the point of my earlier posts. When comparing two desktop GPUs, TFs may be a valid measure. They're still mostly marketing fuzz, but they can be useful if you know that your workload is (as said above) high-occupancy. This means the majority of operations are floating-point operations. Machine learning is an applicable workload.

Comparing a console GPU and a desktop GPU with just TFLOPs is nonsense, though. There's so much more going on. I'm sorry if saying "TFs don't mean anything" pricked you, but as far as this conversation is concerned, they don't. They're not useful and they just end up confusing people who think they know more than they do.

Well said.
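
For anyone curious what a "high-occupancy, FLOP-dominated" workload actually looks like, here's a minimal CUDA sketch (a toy kernel written purely for illustration; the names and numbers are made up, it's not from any real engine or ML library). Almost every instruction is a fused multiply-add on registers, so the card's rated TFLOPS figure is roughly what limits it:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Toy "high-occupancy" kernel: a long chain of fused multiply-adds kept
// entirely in registers. There is almost no memory traffic, so throughput
// is limited by the GPU's floating-point rate -- the kind of workload
// where a TFLOPS figure is actually a meaningful comparison.
__global__ void fma_burn(float* out, int iters)
{
    float a = 1.0001f, b = 0.9999f, c = 0.5f;
    for (int i = 0; i < iters; ++i) {
        c = fmaf(a, c, b);   // 2 FLOPs per iteration, all in registers
    }
    // One store per thread so the compiler can't discard the work.
    out[blockIdx.x * blockDim.x + threadIdx.x] = c;
}

int main()
{
    const int blocks = 1024, threads = 256, iters = 100000;
    float* d_out;
    cudaMalloc(&d_out, blocks * threads * sizeof(float));
    fma_burn<<<blocks, threads>>>(d_out, iters);
    cudaDeviceSynchronize();
    printf("issued roughly %.0f GFLOP of pure FMA work\n",
           2.0 * iters * blocks * threads / 1e9);
    cudaFree(d_out);
    return 0;
}
```

A kernel like this can get reasonably close to a card's advertised TFLOPS number; the point being argued above is that real game shaders rarely look like it.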
 

Wumbo64

Banned
Oct 27, 2017
327
In raw performance, I am sure the 2080 could be better. However, the optimization for consoles goes a lot farther (just ask John Carmack), so the point is moot.

If it is anywhere in the neighborhood of the same base performance, plus a good dev cycle, you'll get a hell of a lot more performance per dollar.
 

Xx 720

Member
Nov 3, 2017
3,920
Would it outperform a card due to customization? Don't consoles generally outperform their on-paper specs?
 

Minsc

Member
Oct 28, 2017
4,118
Wow, my RTX 2060 is already fucked, I should've waited.

Can't you just get a new GPU down the road (sell and upgrade)? Pretty much anyone replacing/upgrading their system a few years after the consoles' release will be able to get good value and be set for the rest of the gen - so maybe just count on that down the road. In the meantime, just bump stuff down to 1080p or something. I'm curious how it'll all play out, but there'll always be a solution around the corner.
 

ApeEscaper

Member
Oct 27, 2017
8,720
Bangladeshi
Can't you just get a new GPU down the road (sell and upgrade)? Pretty much anyone replacing/upgrading their system a few years after the consoles' release will be able to get good value and be set for the rest of the gen - so maybe just count on that down the road. In the meantime, just bump stuff down to 1080p or something. I'm curious how it'll all play out, but there'll always be a solution around the corner.
Well, it's a prebuilt PC and I don't have the boxes that come with the graphics card, etc. Is it still sellable at a decent price without the official boxes?
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,930
Berlin, 'SCHLAND
In raw performance, I am sure the 2080 could be better. However, the optimization for consoles goes a lot farther (just ask John Carmack), so the point is moot.

If it is anywhere in the neighborhood of the same base performance, plus a good dev cycle, you'll get a hell of a lot more performance per dollar.
Modern DX12 and Vulkan + more easily usable hw (you can extract perf more easily these days, especially on NV) make this old John Carmack quote you are thinking of not very relevant. In DX9 days? Sure.
 

sangreal

Banned
Oct 25, 2017
10,890
This slide is talking about the slim laptop he is holding, not a desktop 2080.

That's why you also see the weight and size listed -- it's a contrast with an older 4.5kg laptop.

Who knows what they meant by "next-generation consoles".
 

Alexandros

Member
Oct 26, 2017
17,800

Fafalada

Member
Oct 27, 2017
3,065
Modern dx12 and Vulkan + more easily usable hw (you can extract perf more easily these days, especially on NV)
There are still things you just can't do under DX12/Vulkan, and optimizing around a single hw target is simply more cost-effective; there's no way around that until we stop using humans for the work ;)

How much it all matters is definitely debatable though, even if you had a definite % number you can expect.
 

chris 1515

Member
Oct 27, 2017
7,074
Barcelona Spain
Oh my. This is gonna be fun to respond to.


Consoles cut back on anisotropic filtering because it thrashes your cache. You have to do 4-8x the memory loads you'd normally do, evicting the most recently used data that's kept around for faster future accesses.


Please don't be so pedantic. It's pointless for business reasons. You could push a shooter to 120FPS but it wouldn't look good. It wouldn't go through HDMI. No TV would be able to display it, and no average consumer would be able to tell the difference. As a result, we prioritized things that the consumer would notice or that are marketed to them (4K).


This is not at all what I said. PC doesn't have longer loading times. PC uses loading times to transfer as much data through the PCI-E bus as it can. Consoles use this time to fill up GDDR memory. I'm just explaining one technique desktops use for latency hiding.


It's a made-up marketing term that no one in the real world would use. Some people do use it, but not for comparing a desktop and a console GPU. A GPU executes millions of tiny shader programs every second. If those shader programs consist of nothing but floating-point operations, then TFLOPS might be more important. But they include logical operations, conditional operations, integer operations, and most importantly loads and stores from texture units (GDDR). It doesn't matter if your GPU is 24 TFLOPS if all you have is loads and stores in your shader programs. You won't be doing any floating-point ops.

I don't know if you can answer, but will this still be the case for anisotropic filtering on the next-generation consoles, or, if you can't answer that, on RDNA GPUs?
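
To make the quoted point about loads and stores concrete, here is a contrasting CUDA sketch (again a toy example; the indirection table just stands in for texture fetches, it isn't how a real shader is written). The only arithmetic is a single add, so a 24 TFLOPS card runs it no faster than a much weaker card with the same memory system:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Toy "memory-bound" kernel: a dependent load through an index table,
// one add, one store. Throughput is set almost entirely by the memory
// system, not the ALUs -- extra TFLOPS won't make it faster.
__global__ void gather_add(const float* src, const int* idx,
                           float* dst, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        float v = src[idx[i]];   // scattered read, likely a cache miss
        dst[i] = v + 1.0f;       // one FLOP hiding behind two memory ops
    }
}

int main()
{
    const int n = 1 << 24;   // ~16M elements
    float *src, *dst;
    int *idx;
    cudaMallocManaged(&src, n * sizeof(float));
    cudaMallocManaged(&dst, n * sizeof(float));
    cudaMallocManaged(&idx, n * sizeof(int));
    for (int i = 0; i < n; ++i) {
        src[i] = 1.0f;
        idx[i] = static_cast<int>((1ull * i * 7919) % n);  // scattered pattern
    }
    gather_add<<<(n + 255) / 256, 256>>>(src, idx, dst, n);
    cudaDeviceSynchronize();
    printf("dst[0] = %f\n", dst[0]);
    cudaFree(src); cudaFree(dst); cudaFree(idx);
    return 0;
}
```

Most real shaders sit somewhere between the two extremes, which is why a single TFLOPS number tells you so little on its own.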
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,930
Berlin, 'SCHLAND
There are still things you just can't do under DX12/Vulkan, and optimizing around a single hw target is simply more cost-effective; there's no way around that until we stop using humans for the work ;)

How much it all matters is definitely debatable though, even if you had a definite % number you can expect.
Haha, yeah. GCN on PC probably also loves that the wavefront mapping already done on console works out so well under Vulkan :D Turing, Pascal and such see less love there.
 

CelestialAtom

Mambo Number PS5
Member
Oct 26, 2017
6,037
There is no doubt the 2080 is more powerful, but I would rather get an entire console instead of an overpriced GPU.
 

Inuhanyou

Banned
Oct 25, 2017
14,214
New Jersey
This thread went about as well as expected.

I don't think anybody should have expected the GPUs of these consoles to reach the absolute highest end of PC hardware, even a year out from release. It certainly wasn't true this gen or the gen before. I can say definitively, before we get "official" details, that they absolutely will not get that far.

However, it's also true that Nvidia's slides here are reminiscent of their exact words at the beginning of this generation, before they got the Switch contract ("consoles are dead anyway", blah blah blah). It's almost mandatory for them to do this; the HD twins are almost as much of an opponent in terms of mindshare as AMD itself, especially since they use AMD's hardware. The fact that they even felt the need to put the next-gen machines on the same level as their premium products says it all, really.

What I can say is that the consoles beefing up in terms of power and getting rid of bottlenecks they have had for generations (CPU, moving from HDD to SSD tech, investment in RT, etc.) means that Nvidia and Intel both have to step up their game to keep their value propositions against AMD's hardware and also Sony's and MS's brands.


Also, anyone saying that game development is not going to be significantly pushed forward by the consoles adopting these technologies and setting a far higher base standard than the current gen is being silly; it's just reality. Now all developers will have access to Star Citizen-level tech, instead of just a single dev that isn't accounting for all types of PC configurations, and that's good for all parties involved.

A rising tide lifts all boats, or whatever.
 

BeI

Member
Dec 9, 2017
5,974
Would it outperform a card due to customization? Don't consoles generally outperform their on-paper specs?

It would seemingly outperform similarly specced PC cards if it makes better use of its features, such as dynamic resolution and variable rate shading, while the PC is brute-forcing resolution and shading.
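
For what it's worth, the dynamic resolution part of that is usually just a feedback loop on GPU frame time. Here's a rough sketch (host-side C++/CUDA-style code; the struct name, scaling bounds and frame times are all made up for illustration, not what any particular console or engine actually uses):

```cuda
#include <algorithm>
#include <cstdio>

// Hypothetical dynamic-resolution heuristic (host-side logic only).
// Each frame, compare measured GPU time against the frame budget and
// scale the internal render resolution so the next frame is likely to fit.
struct DynamicResolution {
    float scale     = 1.0f;   // fraction of native resolution per axis
    float min_scale = 0.6f;   // never drop below ~60% (made-up bound)
    float max_scale = 1.0f;
    float budget_ms = 16.6f;  // 60 fps target

    void update(float gpu_frame_ms) {
        // Nudge the scale proportionally to how far off budget we are,
        // damped so the resolution doesn't oscillate frame to frame.
        float error = (budget_ms - gpu_frame_ms) / budget_ms;
        scale = std::min(max_scale,
                         std::max(min_scale, scale * (1.0f + 0.5f * error)));
    }

    void resolution(int native_w, int native_h, int* w, int* h) const {
        *w = static_cast<int>(native_w * scale);
        *h = static_cast<int>(native_h * scale);
    }
};

int main() {
    DynamicResolution drs;
    // Simulated GPU frame times (ms): a heavy scene that then calms down.
    const float frames[] = {22.0f, 21.0f, 18.0f, 15.0f, 14.0f, 13.5f};
    for (float ms : frames) {
        drs.update(ms);
        int w, h;
        drs.resolution(3840, 2160, &w, &h);
        printf("frame took %.1f ms -> render next frame at %dx%d\n", ms, w, h);
    }
    return 0;
}
```

Real implementations filter over several frames and often scale the axes independently, but the effect is the same: spend the GPU budget where the scene needs it instead of brute-forcing a fixed resolution.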
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,930
Berlin, 'SCHLAND
They do not outperform their paper specs, they more closely come to utilising their paper specs.

Punching above weight is a phrase I really dislike, rather they leverage their weight.
It would seemingly outperform similarly specced PC cards if it makes better use of its features, such as dynamic resolution and variable rate shading, while the PC is brute-forcing resolution and shading.
VRS is available on PC (that is where it started on NV cards) and DRS is, likewise, available in PC titles and is a game per game thing, much like it is on console.
 

chris 1515

Member
Oct 27, 2017
7,074
Barcelona Spain
They do not outperform their paper specs, they more closely come to utilising their paper specs.

Punching above weight is a phrase I really dislike, rather they leverage their weight.

VRS is available on PC (that is where it started on NV cards) and DRS is, likewise, available in PC titles and is a game per game thing, much like it is on console.

Yes, that's a much better explanation. New console generations are always a great moment because they move the minimum requirement target and make new GPU features a major target, and software improves a lot as a result. This time, the CPU, RAM and GPU improvements won't be the only big things, with huge storage bandwidth and latency improvements as well.
 

HeWhoWalks

Member
Jan 17, 2018
2,522
There is no doubt the 2080 is more powerful, but I would rather get an entire console instead of an overpriced GPU.

Well, I wouldn't imagine you using the 'overpriced GPU' with nothing else. In that case, I'll take my much more powerful PC over any console, but that's beside the point. Can't make a realistic 1:1 comparison on this anyway, not when several unknowns are involved.
 

Altair

Member
Jan 11, 2018
7,901
I would hope it would be around there. It has to last the entire generation unless Sony and MS plan on doing mid-gen refreshes again. Nvidia will have its 3xxx series cards out before those consoles launch anyway, and you'll likely find something like a 3060 retailing for less than a 2080 does right now, for basically the same performance.
 

Briareos

Member
Oct 28, 2017
3,037
Maine
Modern DX12 and Vulkan + more easily usable hw (you can extract perf more easily these days, especially on NV) make this old John Carmack quote you are thinking of not very relevant. In DX9 days? Sure.
GPU-based pipelines are probably more relevant to the issue at hand, since they significantly reduce surface area exposure to the command buffer generation portion of the API. Memory management on PCs is still messy, and the compiler infrastructure is still far more opaque and difficult to deal with (PSO build time, code gen quality, intrinsics, etc.), so things aren't necessarily great still.
Haha, yeah. GCN on PC probably also loves that the wavefront mapping already done on console works out so well under Vulkan :D Turing, Pascal and such see less love there.
I'm not sure what you mean here (are you just talking about 64-thread dispatch granularity?), but if I were NV, one threat I would consider is that rendering programmers think almost entirely in terms of GCN now. That said, NV's devrel/architects are great folks too, and we have excellent conversations with them fairly regularly.
 
Jul 13, 2018
469
In other news, water is wet and the sky is blue. No one with a shred of common sense would think next-gen consoles would outperform a 2080.

It's the same shit every console gen: people hype up/overestimate the "power" of next-gen hardware, only to fall back to reality once the specs come out.
 

SeriousGoku

Alt Account
Banned
Jun 20, 2019
752
They do not outperform their paper specs, they more closely come to utilising their paper specs.

Punching above weight is a phrase I really dislike, rather they leverage their weight.

VRS is available on PC (that is where it started on NV cards) and DRS is, likewise, available in PC titles and is a game per game thing, much like it is on console.
Nope. Consoles performed exactly as expected based on their specs, as was comprehensively proven during this generation.
OK, I concede. Regardless, we're all gonna eat next gen.

If the consoles are anywhere near 2080 levels and games are being built from the ground up for that power level, then it will be a special gen.
 

MPrice

Alt account
Banned
Oct 18, 2019
654
They do not outperform their paper specs, they more closely come to utilising their paper specs.

Punching above weight is a phrase I really dislike, rather they leverage their weight.

VRS is available on PC (that is where it started on NV cards) and DRS is, likewise, available in PC titles and is a game per game thing, much like it is on console.
Right. It's more accurate to say that consoles perform closer to their theoretical max more often than PC parts do. PCs have heavier abstraction that blunts some of the performance. Consoles have abstraction as well, but it can be bypassed if a dev chooses, which isn't a realistic option for PC developers who have to support a wide array of architectures.
 

Armaros

Member
Oct 25, 2017
4,901
Yes. They always do. Every single generation. Look what they were doing with a measly 512MB of RAM at the end of last gen.

You do realize that last gen, developers used a crap ton of tricks and lowered graphical settings in order to get things to run?

They were 100% not equivalent to a better PC while getting more 'power' out of the specs. That was 100% a lie.

The secret sauce does not exist.
 

Shalashaska

Prophet of Regret
The Fallen
Oct 25, 2017
1,423
The 2080 costs more than the new consoles will and that's just for a GPU. I don't think it should be too surprising to anyone that it is more powerful.
 

dmix90

Member
Oct 25, 2017
1,885
GTX 680, released in 2012 - 3.2TF GPU, 2GB VRAM (most popular model)



I think it's doing pretty meh tbh, and that VRAM has been a joke for a long time... almost 2 times the power of the PS4's GPU (and those are NVIDIA flops vs old AMD flops). I bet some of those games are running at lower than PS4 settings, even.
 

sangreal

Banned
Oct 25, 2017
10,890
Seriously people, watch the video. He is talking about a laptop. You're doing all of this for nothing.

It's not even a new line; he said the same thing at CES:
"Laptops are the fastest growing gaming platform — and just getting started," said Jensen Huang, founder and CEO of NVIDIA, who introduced the lineup at the start of the annual CES tradeshow. "The world's top OEMs are using Turing to bring next-generation console performance to thin, sleek laptops that gamers can take anywhere. Hundreds of millions of people worldwide — an entire generation — are growing up gaming. I can't wait for them to experience this new wave of laptops."
nvidianews.nvidia.com

NVIDIA GeForce RTX Powers Record Number of New Gaming Laptops

CES -- NVIDIA today announced that the world’s top manufacturers are bringing to market a record number of laptops based on its revolutionary Turing GPU architecture — including the fastest, thinnest and lightest systems ever created.
 

metalslimer

Avenger
Oct 25, 2017
9,558
Why are people talking about GPU costs as if Sony and MS would be paying consumer prices for these cards?
 

Altair

Member
Jan 11, 2018
7,901
In other news, water is wet and the sky is blue. No one with a shred of common sense would think next-gen consoles would outperform a 2080.

It's the same shit every console gen: people hype up/overestimate the "power" of next-gen hardware, only to fall back to reality once the specs come out.

The 2080 was the baseline for what people thought would be in those consoles. People were bringing up the 2080 Ti in some threads. I get that next gen gets people hyped, but some of the things being said in those threads were absolutely ridiculous.