Status
Not open for further replies.

kostacurtas

Member
Oct 27, 2017
9,060
What are they going to do with tensor cores on consumer GPUs?

Tensor cores are ultra specialized for training neural nets. What is a consumer going to do with that?

As far as I've seen, the only real application today is denoising ray-traced output. Though I don't know how flexible that is: whether the algorithm requires retraining for different visual inputs, or whether it works in a wide variety of situations.
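For what it's worth, the basic operation a tensor core accelerates is easy to sketch. Each Volta/Turing tensor core performs a fused 4x4 matrix multiply-accumulate, D = A·B + C, with FP16 inputs and FP32 accumulation. A rough NumPy emulation (numerics simplified; the real hardware performs many of these per clock):

```python
import numpy as np

def tensor_core_mma(A, B, C):
    """Emulate one tensor-core op: D = A @ B + C on 4x4 tiles."""
    a16 = A.astype(np.float16)   # inputs are half precision
    b16 = B.astype(np.float16)
    # products are accumulated at full (FP32) precision
    return a16.astype(np.float32) @ b16.astype(np.float32) + C.astype(np.float32)

A = np.ones((4, 4))
B = np.ones((4, 4))
C = np.zeros((4, 4))
D = tensor_core_mma(A, B, C)   # each entry is 4.0 (dot product of four ones)
```

The denoising use case is just this op applied at scale: a trained network's inference pass is mostly matrix multiplies of exactly this shape.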
NVIDIA says tensor cores could be useful for gaming too
And I think we already really appreciate the work that we did with Tensor Cores. With the updates now coming out from the frameworks (Tensor Cores are a new instruction set and a new architecture), deep learning developers have really jumped on it, and almost every deep learning framework is being optimized to take advantage of Tensor Cores. On the inference side, and that's where it would play a role in video games, you could use deep learning now to synthesize and generate new art, and we've been demonstrating some of that, as you may have seen: improving the quality of textures, generating artificial characters, and animating characters, whether it's facial animation for speech or body animation.

The type of work that you could do with deep learning for video games is growing, and that's where Tensor Core take-up could be a real advantage. If you look at the computational throughput we have with Tensor Cores compared to a non-optimized GPU or even a CPU, it's now two-plus orders of magnitude greater. And that allows us to do things like synthesize images in real time, synthesize virtual worlds, and create characters and faces, bringing a new level of virtual reality and artificial intelligence to video games.
 

Celcius

Banned
Oct 25, 2017
2,086
If the rumored specs are true then I'm a bit disappointed with the memory sizes staying the same, especially if the RTX 2080 is marketed as a 4K gaming card. Makes me wonder how it will fare once next-gen consoles come out.
 

Bernd Lauert

Banned
May 27, 2018
1,812
Yes, the Turing chip on the RTX 2080/Ti has dedicated cores for ray tracing, tensor cores (576 on the Ti, 384 on the 2080), and the usual CUDA cores (4352 on the Ti, 2944 on the 2080).

It's impressive that they can put all that on a card and still offer a decent increase in performance. Now we just need more games that support this stuff. Maybe they will announce some in 2 days.
 

Vimto

Member
Oct 29, 2017
3,714
I'm hoping they show a lot of gameplay videos with and without ray tracing so we can see how good this tech is.
 

asmith906

Member
Oct 27, 2017
27,354
Just saw that the 2080 is supposed to be $800. That's insane. I was hoping to sell my 1070 to make upgrading less painful but prices have tanked super hard. I feel like this might be a really bad time to sell. If prices are really that high I could easily see 10-series cards rising in value.
 

Deleted member 25042

User requested account closure
Banned
Oct 29, 2017
2,077
Just saw that the 2080 is supposed to be $800. That's insane. I was hoping to sell my 1070 to make upgrading less painful but prices have tanked super hard. I feel like this might be a really bad time to sell. If prices are really that high I could easily see 10-series cards rising in value.

Same story with my 1080.
Some people have been panic-selling their cards at ridiculous prices, and it seems like nobody's willing to offer a decent price anymore (not that I blame them).
If the 2080 is somewhat disappointing and expensive, who knows...
 
Oct 26, 2017
505
Italy
Should be an interesting night in Cologne. :)
 
MrBob

Member
Oct 25, 2017
6,668
Just saw that the 2080 is supposed to be $800. That's insane. I was hoping to sell my 1070 to make upgrading less painful but prices have tanked super hard. I feel like this might be a really bad time to sell. If prices are really that high I could easily see 10-series cards rising in value.
That's because mining is at a low and gamers are waiting for the 20 series.

If ray tracing is the goods with decent developer support right away, and the 20 series isn't seriously overpriced, then I don't see 10-series value going higher. The best time to sell at peak value was a couple of weeks ago. Normally, if the 20 series sold out it could temporarily raise the value of older cards, but those older cards don't do ray tracing, so someone interested in RTX isn't going to buy a GTX instead.

Your best hope is if Nvidia tries to price gouge the 20 series but I don't believe they will tank the 20 series to sell more 10 series cards. We will find out Monday.

Also, that $800 price for the 2080 is not confirmed.
 

Tovarisc

Member
Oct 25, 2017
24,396
FIN
These ray tracing implementations in current games, like in the new Metro, are not that impressive to me at all. I guess it's a nice feature for owners of these cards.
I wonder when we'll see a game built from the ground up using these new ray tracing techs?

Have we even seen any game running ray tracing tech, other than that first glimpse from Exodus a few months ago?

It will be interesting to see if and how implementation and use have improved since.

Should be an interesting night in Cologne. :)

Very cool, excited to see how you guys leverage the new GPU gen and new feature set.
 

Deleted member 1852

User requested account closure
Banned
Oct 25, 2017
2,077
For anyone not familiar with this site, there is a LOT of leaked info ahead of the official announcement already:
https://videocardz.com/

The huge rumor currently going around is that the Ti will debut alongside the regular 2080. If so, then MIND BLOWN.

I was planning on buying the Ti nine months later like usual, not at the same time as the regular 2080.
 

kostacurtas

Member
Oct 27, 2017
9,060
Goodbye 4K60 if true.
How so?

There are dedicated RT cores just for ray tracing. Any game with ray-tracing elements will utilize those cores, which doesn't affect the CUDA cores or the performance of the card.

The same goes for the tensor cores: they are available for AI, animations, etc., again without affecting performance.
 

Nothing

Member
Oct 30, 2017
2,095
What we really need to know and understand is how 7-8GB of GDDR6 RAM compares to 11GB of GDDR5X RAM.
 

1-D_FE

Member
Oct 27, 2017
8,252
Have we even seen any game running ray tracing tech other than that first glimpse few months ago from Exodus?

Will be interesting see if and how implementation and use has improved since.



!

Very cool, excited to see how you guys leverage new GPU gen and new feature set.

Not that I'm aware of. YouTube recommended the Nvidia channel to me today, and I watched a bunch of new videos, but nothing that was really new. Things like Tim Sweeney showing off the Porsche demo, and then a graph showing the difference, with Turing 6x faster than Pascal in that scene (2 fps on Pascal vs 12 fps on Turing). No videos that actually demonstrated why you're going to want to use these features, though.
 

kostacurtas

Member
Oct 27, 2017
9,060
What we really need to know and understand is how 7-8GB of GDDR6 RAM compares to 11GB of GDDR5X RAM.
Well, any game that doesn't use more than 7-8GB (basically every game to date, except a few special cases, mods, etc.) will perform better on GDDR6 because of its higher speed and, as a result, the card's higher bandwidth.

That's in theory, assuming the same memory interface. If you're thinking of a comparison between the RTX 2080/2070 (8/7GB GDDR6) and the GTX 1080 Ti (11GB GDDR5X), then the Ti is still better because it has a wider memory interface and higher bandwidth.

GDDR6 also has better power consumption (if that matters to you).
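To make the bandwidth comparison concrete, here's the back-of-envelope math. Peak bandwidth is just the per-pin data rate times the bus width; the 2080 figures below are the rumored specs, so treat them as assumptions rather than confirmed numbers:

```python
def peak_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak theoretical memory bandwidth in GB/s: per-pin rate * bus width / 8."""
    return data_rate_gbps * bus_width_bits / 8

# Rumored RTX 2080: 8GB GDDR6 at 14 Gbps on a 256-bit bus
rtx_2080 = peak_bandwidth_gb_s(14, 256)     # 448.0 GB/s

# GTX 1080 Ti: 11GB GDDR5X at 11 Gbps on a 352-bit bus
gtx_1080_ti = peak_bandwidth_gb_s(11, 352)  # 484.0 GB/s
```

On those numbers the 1080 Ti still has higher peak bandwidth despite the older memory type, purely because of its wider bus.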

Next year? Nvidia has no competition in the high-end segment. They will milk those for a long time.
Indeed, if the new AMD cards are not competitive, Nvidia could easily refresh this new generation without moving to a smaller process.
 

KKRT

Member
Oct 27, 2017
1,544
Not that I'm aware of. Youtube recommended me the Nvidia channel today, and I watched a bunch of new videos, but nothing that was really new. Things like Tim Sweeney showing off the Porshe demo and then a graph showing the difference and how Turing was 6X faster than Pascal at the scene (Pascal was 2 fps vs 12 fps for Turing). No videos that actually demonstrated why you're gonna want to use these features, though.
This is a good video for seeing the differences in technology between RT and rasterization/shaders.
https://www.youtube.com/watch?v=bFUWu387ErM

And here's the same, but extended: https://youtu.be/tjf-1BxpR9c
 

BobLoblaw

This Guy Helps
Member
Oct 27, 2017
8,288
I have absolutely no need for a 2080 Ti since I can still shred through most of my games with my 1080 at 1440p, but I'm gonna get one anyway. 4K downsampling everything!
 

Deleted member 25042

User requested account closure
Banned
Oct 29, 2017
2,077
What we really need to know and understand is how 7-8GB of GDDR6 RAM compares to 11GB of GDDR5X RAM.

What do you mean?
We know how the cards should compare bandwidth-wise.
Or are you expecting the 7-8GB of GDDR6 to punch above its weight, so to speak, when it comes to capacity?
If so, that has nothing to do with the memory type but with the compression techniques used.
 

Fatmanp

Member
Oct 27, 2017
4,438
What do you mean?
We know how the cards should compare bandwidth-wise.
Or are you expecting the 7-8GB of GDDR6 to punch above its weight, so to speak, when it comes to capacity?
If so, that has nothing to do with the memory type but with the compression techniques used.

How does the increase in bandwidth affect performance? I don't really understand it. Are there games out there which are constrained by bandwidth?
 

kostacurtas

Member
Oct 27, 2017
9,060
Honest question. Would these cards have been released last year had Vega been competitive?
I don't think so.

But indeed, Nvidia has had these cards ready for launch for some time now; they held off the release in order to clear 10-series stock once demand for GPUs dropped significantly after the mining craze stopped.
 

Nothing

Member
Oct 30, 2017
2,095
All the 10 series cards on PNY's website have prices like "349.99" and "599.99". The true price could be 999.99, but the entry listing $1000 even leads me to believe it's placeholder. No point panicking until it's official (if it becomes official).
Exactly. It's a purposeful "mistake" done to get everybody checking their website, and then the $899.99 pre-order price is going to seem like a great deal. Lots of people will order from them. No manufacturer would want to be aiming for a $1000 price point anyway; that's also a negative water-cooler talking point.

How do companies like PNY even make these "mistakes" anyway? It wasn't one, that's why. It's publicity, and it drives sales.
 

KKRT

Member
Oct 27, 2017
1,544
I don't think so.

But indeed, Nvidia has had these cards ready for launch for some time now; they held off the release in order to clear 10-series stock when demand for GPUs dropped significantly because of the mining downturn.
Really doubt that, as they just announced and released the Quadro RTX versions. They just weren't ready earlier; otherwise they would not have held back the release for their non-gaming markets.

---

You have secrets, right?
Support for RTX in BF V was almost a given, as Microsoft confirmed that DXR will be used by a few titles already this year, and among the listed engines and vendors were Unity, Unreal, and Frostbite.
 