
ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
Senior art director at Nvidia Gavriil Klimov posted some high-resolution screens of their Marbles RTX project, and in his description he states:
It's a fully playable game that is ENTIRELY ray-traced, denoised by NVIDIA's AI and DLSS, and obeys the laws of physics - all running in real-time on a single RTX GPU. Absolutely mind-blowing.
So far, Nvidia only uses the tensor cores for DLSS, and DLSS doesn't provide any denoising, just upscaling, sharpening, and anti-aliasing. I think it was in Digital Foundry's interview that Nvidia said they were working on using the tensor cores for denoising, so could this be the first showing of it?

For those that don't know, denoising is pretty resource-heavy, in some instances even heavier than the ray tracing itself. Finding a way to speed up denoising while increasing its quality would be a major step toward making RT a more viable rendering tool.
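The sampling math behind that trade-off can be sketched in a few lines of plain Python (a toy stand-in for path tracing, not any real renderer): Monte Carlo noise falls off as 1/√N, so quadrupling the sample count only halves the noise, which is why a cheap denoise pass is worth so much.

```python
import random
import statistics

def render_pixel(n_samples, seed=0):
    """Toy Monte Carlo pixel estimate: average of noisy light samples.
    The 'true' pixel value here is 0.5 (mean of uniform [0, 1) samples)."""
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(n_samples)) / n_samples

def noise_level(n_samples, trials=2000):
    """Standard deviation of the pixel estimate across many trials."""
    estimates = [render_pixel(n_samples, seed=t) for t in range(trials)]
    return statistics.stdev(estimates)

# Noise scales as 1/sqrt(N): 4x the samples only halves the noise,
# so brute-force sampling gets expensive fast compared to denoising.
ratio = noise_level(16) / noise_level(64)
print(round(ratio, 1))  # roughly 2
```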

[Image: xeDLOQQ.jpg]


Also, images of Marbles RTX, because it's pretty:

[Image: gavriil-klimov-m-final-02.jpg]

[Image: gavriil-klimov-m-final-07.jpg]


 

Deleted member 12317

Account closed at user request
Banned
Oct 27, 2017
2,134
In Blender 2.82 you can use an RTX card's AI to denoise renders when using the Cycles renderer with OptiX (using the RT Cores to render), and in the 2.83 beta you can also denoise the viewport in real time when previewing in Cycles.

In the 2.83 beta it's quite useful to be able to preview renders without noise really quickly, and it works really well on final renders too: you can save time by rendering with fewer samples per pixel and letting the AI denoise the result.
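As a toy illustration of that "fewer samples + denoise" trade-off, here is a naive box-filter denoiser in plain Python; real AI denoisers (like Blender's OptiX one) are far smarter, but the principle is the same: a cheap post-pass recovers quality that would otherwise cost many extra samples.

```python
import random
import statistics

def box_denoise(pixels, radius=2):
    """Naive spatial denoiser: replace each pixel with the mean of its
    neighborhood. Crude compared to an AI denoiser, but same idea."""
    out = []
    for i in range(len(pixels)):
        lo, hi = max(0, i - radius), min(len(pixels), i + radius + 1)
        out.append(sum(pixels[lo:hi]) / (hi - lo))
    return out

rng = random.Random(0)
truth = [0.5] * 200                                # flat grey scanline
noisy = [0.5 + rng.gauss(0, 0.1) for _ in truth]   # low-sample "render"
clean = box_denoise(noisy)

def err(img):
    """Mean absolute error against the noise-free ground truth."""
    return statistics.mean(abs(a - b) for a, b in zip(img, truth))

print(err(clean) < err(noisy))  # True: denoised image is closer to truth
```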

 
OP

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
Deleted member 12317 said:
In Blender 2.82 you can use an RTX card's AI to denoise renders when using the Cycles renderer with OptiX (using the RT Cores to render), and in the 2.83 beta you can also denoise the viewport in real time when previewing in Cycles.

In the 2.83 beta it's quite useful to be able to preview renders without noise really quickly, and it works really well on final renders too: you can save time by rendering with fewer samples per pixel and letting the AI denoise the result.

It uses AI, but it runs denoising on the shader cores like everything else; RT Cores don't do rendering.
 

KKRT

Member
Oct 27, 2017
1,544
Yeah, we need this ASAP :) It will be such a boost to RT performance, judging by Minecraft's frame-time distribution.

ILikeFeet said:
I think it was in Digital Foundry's interview that Nvidia said they were working on using the tensor cores for denoising, so could this be the first showing of it?
I'm pretty sure that Nvidia already talked about it on RTX reveal.

--edit--
Even earlier it seems:

Deleted member 12317 said:
So why can OptiX only run on RTX cards?
It states here: https://wiki.blender.org/wiki/Reference/Release_Notes/2.81/Cycles#NVIDIA_RTX

What I understand is that it uses the RT Cores to render, and it's much faster than CUDA on the same card.
They don't say how the denoiser runs, but they say you need an RTX card and use OptiX, so it should use some RTX-specific hardware.

If it only runs on shader cores, why limit it to RTX cards? (and to OptiX when using an RTX one)
OptiX can use the Tensor cores. From my link above:
"The OptiX AI denoising technology, combined with the new NVIDIA Tensor Cores in the Quadro GV100, delivers 3x the performance of previous-generation GPUs and enables fluid interactivity in complex scenes."
 
Last edited:

Deleted member 12317

Account closed at user request
Banned
Oct 27, 2017
2,134
ILikeFeet said:
it uses AI, but it runs denoising on shader cores like everything else. RT cores don't do rendering
So why can OptiX only run on RTX cards?
It states here: https://wiki.blender.org/wiki/Reference/Release_Notes/2.81/Cycles#NVIDIA_RTX
Cycles now has experimental support for rendering with hardware-accelerated raytracing on NVIDIA RTX graphics cards. To use, enable OptiX

What I understand is that it uses the RT Cores to render, and it's much faster than CUDA on the same card.
They don't say how the denoiser runs, but they say you need an RTX card and use OptiX, so it should use some RTX-specific hardware.

If it only runs on shader cores, why limit it to RTX cards? (and to OptiX when using an RTX one)
 
OP

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
Deleted member 12317 said:
So why can OptiX only run on RTX cards?
It states here: https://wiki.blender.org/wiki/Reference/Release_Notes/2.81/Cycles#NVIDIA_RTX

What I understand is that it uses the RT Cores to render, and it's much faster than CUDA on the same card.
They don't say how the denoiser runs, but they say you need an RTX card and use OptiX, so it should use some RTX-specific hardware.

If it only runs on shader cores, why limit it to RTX cards? (and to OptiX when using an RTX one)
Ask Nvidia; it ain't the first time they've had arbitrary limitations, just look at the whole deal with Pascal and RT. And RT Cores don't render: they accelerate BVH traversal and ray intersection testing. The shader cores do the actual rendering, same as in games.
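For anyone curious what "intersection testing" means here, this is the classic ray/AABB slab test in plain Python: the kind of box check that gets run millions of times per frame while walking a BVH, and the kind of work RT Cores do in fixed-function hardware (illustrative sketch, not Nvidia's implementation).

```python
def ray_aabb_hit(origin, direction, box_min, box_max):
    """Slab test: does a ray starting at `origin` with `direction`
    enter the axis-aligned box [box_min, box_max]?"""
    t_near, t_far = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if d == 0.0:
            if o < lo or o > hi:
                return False          # parallel ray outside this slab
            continue
        t0, t1 = (lo - o) / d, (hi - o) / d
        if t0 > t1:
            t0, t1 = t1, t0           # order the slab entry/exit times
        t_near, t_far = max(t_near, t0), min(t_far, t1)
        if t_near > t_far:
            return False              # slab intervals don't overlap: miss
    return True

# A ray along +x from the origin hits a unit box centered at (5, 0, 0)...
print(ray_aabb_hit((0, 0, 0), (1, 0, 0), (4.5, -0.5, -0.5), (5.5, 0.5, 0.5)))  # True
# ...but a ray aimed along +y misses it.
print(ray_aabb_hit((0, 0, 0), (0, 1, 0), (4.5, -0.5, -0.5), (5.5, 0.5, 0.5)))  # False
```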
 

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223
Very good catch, I didn't notice that. It really seems like AI denoising on the tensor cores for games will be ready soon.

I suppose the AI training took a long time to get good results, similar to DLSS 2.0.
 

Zaimokuza

Member
May 14, 2020
957
In this video Eric Haines says Nvidia is very interested in denoising as it dramatically reduces the number of rays that need to be traced.

Edit: grammar

Ray Tracing Essentials Part 7: Denoising for Ray Tracing (www.youtube.com)

In the final video of the series, NVIDIA's Eric Haines describes the process of denoising for ray tracing.
 

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223
Zaimokuza said:
In this video Eric Haines says Nvidia is very interested in denoising as it dramatically reduces the number of rays that need to be traced.

Ray Tracing Essentials Part 7: Denoising for Ray Tracing (www.youtube.com)
Yep. A significant speedup in RT performance could happen with AI denoising for two reasons:

1. Currently denoising is done on the shader cores, meaning the GPU has less power left to render the graphics themselves. By offloading that task to the tensor cores, the ALUs are freed up and can put more rendering power into the graphics pipeline, resulting in significantly higher framerates.

2. A well-trained AI denoiser could potentially be much more efficient and better at denoising than a shader filter, meaning fewer rays have to be cast, which significantly reduces load on both the RT and shader cores.
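Toy numbers for reason 1 (purely illustrative, not measured from any real game): if a denoise pass currently eats a slice of the shader-core frame budget, overlapping it on otherwise-idle tensor cores hands that slice back.

```python
# Hypothetical 60 fps frame budget; every figure below is made up
# to illustrate the argument, not taken from a benchmark.
shader_work_ms = 13.7   # geometry + shading + RT resolve on shader cores
denoise_ms = 3.0        # denoise pass, currently also on shader cores

before_fps = 1000 / (shader_work_ms + denoise_ms)  # denoise serialized
after_fps = 1000 / shader_work_ms                  # denoise on tensor cores

print(round(before_fps), round(after_fps))  # 60 73
```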

This is going to be good. I hope it's implemented in a game soon.