Sadly I don't play any of the games that currently have DLSS 2.0. It needs much more widespread adoption.
There's a branch of UE4 with DLSS built in now, so it should only be a matter of time
> I still remember people on this forum shitting on ray tracing and saying it's just better reflections, before the new consoles got announced. Good times.

THIS rofl.
That is what happened with the old version. The new version doesn't have that oil-painting look; 2.0 works very differently.
> There's a branch of UE4 with DLSS built in now, so it should only be a matter of time

Engine support, or automation of the process by the engine providers, had to happen, and I'm glad it's already happening. Other big ones like Frostbite, Unity, Decima or Snowdrop will probably follow sooner rather than later.
I still remember people on this forum shitting on ray tracing and saying it's just better reflections, before the new consoles got announced. Good times.
Too true. It's always like that:
- Feature gets announced; at first, only cutting-edge PC hardware supports it.
- Gets shit on by everyone who can't use said feature (customers from the opposing "team" and the console fanbase).
- The opposing team announces hardware for the feature; their customers praise it.
- Console makers announce hardware for the feature; their customers praise it.
- Users who defended said feature back when it was first revealed sigh in unison, because now people realize "the feature is good and is the future".
> I hope games that support DLSS 2.0 will automatically be supported by any further improvements to the tech. As in, they will benefit from future updates to the AI training, if that makes sense.

They will be. As more games gain support, older games that support 2.0 will get better through Nvidia driver updates.
> You went from a 2070 to a 5700 XT? Ouch. Your previous GPU was much more future-proof with DX12 Ultimate support. Can't you get it back?

I agree with you lol.
Btw, Control used DLSS 1.9, which looks okay but not great either. DLSS 2.0 comes to Control this week, and it looks so much better.
March 26 or 29.
This technology is super cool, but does anyone else find it a little concerning that Control with RT at 4K and DLSS just barely scrapes above 60 FPS on a $1000 GPU? It's already an impressive showcase, but what's going to happen once the new consoles raise the bottom line for game development? I guess 1440p can mitigate this a bit, but DLSS is already supposed to drop the actual rendering cost to similar levels.
Then again, maybe full RT will be fairly niche with the PS5 coming in at only 36 CUs. Excited to see what the Nvidia 3000 series has in store, at the very least.
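For a sense of why upscaling moves the needle at 4K, here's a back-of-envelope pixel-count comparison. I'm assuming a 1440p internal render upscaled to 4K (roughly DLSS quality mode; the exact internal resolution per mode is an assumption on my part):

```python
# Back-of-envelope: shading cost scales roughly with pixel count.
native_4k = 3840 * 2160        # 8,294,400 pixels shaded at native 4K
internal_1440p = 2560 * 1440   # 3,686,400 pixels at the assumed internal res

ratio = native_4k / internal_1440p
print(f"Native 4K shades {ratio:.2f}x the pixels of a 1440p internal render")
# -> Native 4K shades 2.25x the pixels of a 1440p internal render
```

So even in the "concerning" 4K + RT case, the GPU is only shading a 1440p-class load; the rest of the budget is going to ray tracing and the upscale itself.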
> Is there any chance that something like this can be implemented in the next-gen consoles? AMD's answer to DLSS, basically

Absolutely. Nobody can say what the overhead and results would be, but there isn't an obvious technical reason why next-gen machines can't do it.
Is there any chance that something like this can be implemented in the next-gen consoles? AMD's answer to DLSS, basically
> Don't think there is much hope for dedicated hardware like tensor cores being present. Microsoft have DirectML though, so we could possibly see that used for a similar thing. Whether they can achieve DLSS 2.0 levels without tensor-core-style hardware, we'll have to wait and see. There are some games using DLSS without running it on the tensor cores; Control was one, and I think there was another. But that improved Control DLSS is using the hardware, apparently.

Nah, there's only Control. It was considered "DLSS 1.9". Every game before that used the tensor cores.
> Don't think there is much hope for dedicated hardware like tensor cores being present. Microsoft have DirectML though, so we could possibly see that used for a similar thing. Whether they can achieve DLSS 2.0 levels without tensor-core-style hardware, we'll have to wait and see. There are some games using DLSS without running it on the tensor cores; Control was one, and I think there was another. But that improved Control DLSS is using the hardware, apparently.

Unlikely - the tensor cores are totally separate from the shaders etc., so there is no impact on performance when using them, whereas the DirectML method is literally taking FLOPS away from rendering.
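That tradeoff can be sketched as simple frame-budget arithmetic. All numbers below are made up for illustration, not measured figures:

```python
# Illustrative only: inference on shared shader cores eats into the frame
# budget, while dedicated units (tensor cores) leave the shader pool largely
# free for rendering.
frame_budget_ms = 16.7   # 60 fps target
upscale_cost_ms = 2.0    # assumed cost of running the upscaling network

# DirectML-on-shaders path: the network's time comes out of the same pool
# the renderer uses, so it directly shrinks the rendering budget.
render_time_shared = frame_budget_ms - upscale_cost_ms

print(f"Shader time left for rendering: {render_time_shared:.1f} ms "
      f"of {frame_budget_ms} ms")
# -> Shader time left for rendering: 14.7 ms of 16.7 ms
```

The upscale still has to pay for itself by letting you render fewer pixels; the question is just whether that cost lands on idle dedicated hardware or on the shaders.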
> You mean after that :D Every game before Youngblood, I think, didn't use tensor cores.

They did. It's only Control that doesn't.
> Is there any chance that something like this can be implemented in the next-gen consoles? AMD's answer to DLSS, basically

I think MS will do it in some way, since they have int8 and int4 support and have demonstrated it in the past in prototyping on Forza - and it's a faster rate (a multiplicative amount faster than 1:1 TFLOPS for fp32). Sony did not seemingly add this functionality to PS5, or at least did not talk about it at all.
> I think MS will do it in some way, since they have int8 and int4 support and have demonstrated it in the past in prototyping on Forza - and it's a faster rate (a multiplicative amount faster than 1:1 TFLOPS for fp32). Sony did not seemingly add this functionality to PS5, or at least did not talk about it at all.

The biggest problem is that it uses shader cores, so it's not free like on tensor cores, but it will still be very impactful for performance.
That int8 support MS added is something like 8x slower than the tensor cores in an RTX 2080, I think, but still, it is supported.
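To make the "faster rate" point concrete: assuming int8 runs at 4x and int4 at 8x the fp32 rate (those multipliers are my assumption; 12.15 TFLOPS is the commonly quoted Series X fp32 figure), the math works out like this:

```python
# Illustrative mixed-precision throughput math for the Series X GPU.
fp32_tflops = 12.15          # commonly quoted Series X fp32 rate
int8_tops = fp32_tflops * 4  # assumed 4x rate for int8 dot products
int4_tops = fp32_tflops * 8  # assumed 8x rate for int4 dot products

print(f"int8: ~{int8_tops:.1f} TOPS, int4: ~{int4_tops:.1f} TOPS")
# -> int8: ~48.6 TOPS, int4: ~97.2 TOPS
```

That's a big multiplier over running inference at fp32, even if it's still well short of dedicated tensor cores.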
> It should show up in more games now that there isn't a training cost for developers

I hope to see it in more UE4 games, as there's a DLSS branch of the engine.
> The biggest problem is that it uses shader cores, so it's not free like on tensor cores, but it will still be very impactful for performance.

Yeah, that is indeed a problem. It requires taking resources away on XSX, where it is offloaded on Turing (though it still takes execution time, of course).
I actually think console games will use dynamic upscaling with DLSS-like solutions.
But the biggest detriment is that it's a black box right now.
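A dynamic-resolution loop feeding an upscaler could look something like this toy sketch (my own illustration, not any shipping implementation):

```python
# Toy dynamic resolution scaler: nudge the internal render scale to hold a
# frame-time target; a DLSS-style pass would then upscale to the fixed output.
TARGET_MS = 16.7                 # 60 fps budget
MIN_SCALE, MAX_SCALE = 0.5, 1.0  # clamp on the internal resolution scale

def next_render_scale(scale: float, last_frame_ms: float) -> float:
    """Adjust the scale by how far the last frame missed the budget."""
    headroom = TARGET_MS / last_frame_ms  # > 1 means we finished early
    new_scale = scale * headroom ** 0.5   # cost ~ pixels ~ scale^2, hence sqrt
    return max(MIN_SCALE, min(MAX_SCALE, new_scale))

# A slow frame (22 ms) pushes the internal resolution down...
print(next_render_scale(1.0, 22.0))   # ~0.87
# ...while a fast frame lets it climb back toward full scale.
print(next_render_scale(0.8, 12.0))   # ~0.94
```

The appeal is that the output resolution (and the black-box upscaler's input contract) stays fixed while the render cost flexes underneath it.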
> So does this basically mean this entire tech is off the cards for PS5, from what we know?

For now, as Sony hasn't said anything, it's not totally off the cards, but it could be extremely expensive, to the point of not being worth it.
> So does this basically mean this entire tech is off the cards for PS5, from what we know?

Andrew Goossen said they added the integer support on the XSX, but I don't really believe Sony has nothing to offer on that front. It could also be part of RDNA2, in response to the Turing tensor cores. I think machine learning will be very important in the next generation, way beyond reconstructing images. It would be foolish of Sony not to incorporate hardware support for ML in some way.
So does this basically mean this entire tech is off the cards for PS5, from what we know?
Not necessarily. Even if they didn't have explicit acceleration, it may still be possible at a lower performance level, though I don't know what kind of penalty that might carry. And it may have the same features, just not disclosed yet. Not much point speculating with so many unknowns.
> It should show up in more games now that there isn't a training cost for developers

There was one before? I was under the impression that Nvidia handled the training.
> They do, but I've always presumed it was paid for, or at the very least per application, hence such low support for something that gives such large free benefits.

Actually having to pay to get results like DLSS 1.0? If that's the case, then it doesn't surprise me that support was poor.
Trying to learn all about this now, so pardon me if this was already answered.
Is DLSS only available on RTX cards? My 1080 Ti FTW3 cannot use this tech?
Also...
Is this an option that's only available for certain games? Meaning not in the Nvidia control panel, but rather game devs have to add a graphics option for DLSS?
Only Turing cards (yours is not supported), and it's a graphics option that has to be implemented per game.
> I'm still unclear as to whether Turing can do other work concurrently, or if the tensor cores saturate the shared resources too heavily.

If you look at Nsight, you can see that it always happens at the very end of frame generation, when nothing else is happening anyway, since the frame has already been composed. I guess we would have to learn that by looking at tensor core usage for something like an ML denoiser, which would occur earlier.