Thanks Dennis :) I am impressed with the speed at which Digital Foundry can put out quality content like this.
Your work is fantastic. The in-depth detail and the knowledge behind the concepts make it a joy to watch.
what's with everyone deep diving these days?
(very interesting, will watch later. I would like a comparison with SEUS PTGI etc. I still think it looks better than the official RTX, which can be a bit garish and overdone at times)
I mean, it looks lovely and all, but the fact that it takes quite a bit of horsepower to run makes it a bit of a niche feature for the masses, does it not?
edit - going off some of the videos where the GPU is a 2080ti
gonna watch, might do a summary if no one else is up to it
- "what's the difference between ray tracing and path tracing?" not much; RT usually means single-purpose solutions (RT shadows, reflections, etc.) while PT does everything at once
- started in March 2019, no one remembers who came up with the idea
- contrary to popular belief, the simple art style doesn't make path tracing easier, as Minecraft has a lot of polygons on screen and physically based materials
- RenderDragon already supported DX12 so adding DXR wasn't too much work
- it took 2-3 weeks for some simple path traced AO
- uses DXR1.0, they'll look into 1.1 support, but might not happen (might not need it)
- irradiance caching is used in Minecraft but not in Quake 2 RTX
- irradiance caching stores ray data on geometry and accelerates secondary rays for more detail and performance
- it's used to get multiple bounces
- Metro Exodus stored GI data using spherical harmonics to reconstruct specular data
- perfect mirrors get 8 bounces, rougher surfaces only do 2 bounces
- volume fog uses a similar method as rasterized volume fog
- RT allows them to make colored shadows
- each transparent surface has an RGB transmission value; as a ray passes through, a transmission ray is cast to collect the values of every transparent surface behind it, which determine the tint of whatever non-transparent surface the ray eventually hits (this one was difficult to summarize, so correct me if I'm wrong)
- water is slightly different in that transmission loss is heightened in order to get that fade to darkness with the depth
- their method for motion vectors "works", will be getting better
- denoising costs the most (15% cache updates, 40% ray tracing, 45% denoising)
- GI is noisy AF
- denoising is very bandwidth heavy
- Minecraft uses 3 separate denoisers (shadows, speculars, diffuse)
- diffuse and shadows move differently than specular/reflection so they have to be handled differently
- screen space denoising via spherical harmonics
- more emissive surfaces = easier denoising, because of a larger, cleaner signal
- explicit lights (torches, rods, lamps) are small, so harder to denoise
- more rays = less noise, but there are diminishing returns
- rays have to increase exponentially to have the same jump in denoise quality
- AI denoising is used a lot in offline rendering, but not yet in realtime; research is being done
- performance drop with higher render chunks is a memory issue
- objects in the distance are shaded at the same level of detail as objects up close (I recall reading somewhere, probably Beyond3D, that LoD is an issue with ray tracing because of this)
- primary visibility is done via RT rather than rasterization because of laziness; to rasterize it, they'd have needed to modify the render engine to output a g-buffer
- the lensing effect is caused by casting those primary rays, so switching primary visibility to the rasterizer would lose that effect unless they worked it into the renderer; "they'll see"
- global illumination data is in the fog volume, but not in the beta build, so emissives will light the fog once they (hopefully) update the beta
- mesh-based caustics are a crazy idea
- temporal lag is being worked on, is a denoising issue
- there's a light leaking issue in caves
- particles are rasterized
- path tracing is more general than rasterization (rasterizers can be different between genres due to needs)
- path tracing allows for a lot of unplanned features like lenses and camera obscura
- in first person, you don't have a body to be rendered; maybe they'll add something for the final release
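The transmission point in the summary above can be sketched as a toy calculation. This is purely illustrative; the function name and the RGB values are made up, not from the actual renderer:

```python
def tint_through_transparents(transmissions, opaque_color):
    """Multiply the per-channel RGB transmission of every transparent
    surface the ray crosses, then apply the result to the color of the
    opaque surface it finally hits."""
    throughput = [1.0, 1.0, 1.0]
    for t in transmissions:  # each t is an (r, g, b) transmission value
        throughput = [a * b for a, b in zip(throughput, t)]
    return tuple(c * w for c, w in zip(opaque_color, throughput))

# Example: a ray through red stained glass (passes red, blocks green/blue),
# then a faint grey pane, landing on a white surface:
tinted = tint_through_transparents([(0.9, 0.1, 0.1), (0.8, 0.8, 0.8)],
                                   (1.0, 1.0, 1.0))
```

The white surface ends up tinted red, which matches the colored-shadow behavior described above.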
For full ray tracing solutions yes, but a hybrid implementation is still feasible (say, the way Control does it) for more graphically intense games.
You were asking the right questions too! I personally was hoping you were asking about DXR 1.1 and AI-Denoising and you delivered! You are doing such a good job at the coverage and especially with RT related stuff, I can't stress it enough!
One thing I've wondered about - not sure if talked about in this video - I haven't seen any RT bouncing of specular highlights from one bounce into the lighting data for a primary bounce.
ie I have two shiny non-mirrored surfaces; the lighting on the surface I'm directly viewing doesn't account for the specularity of the surfaces that are lighting it.
is this just a practical processing choice - ie, only diffuse data is read for secondary+ bounces?
tldr: why don't I see the shiny highlights of surfaces reflected in the shiny surfaces I'm directly viewing?

that probably requires more bounces than might be feasible
> Was also wondering this. and also how much of a performance cost each light bounce is when multiplying from a single bounce.

it's something I've seen noticed in other games with RT reflections, particularly Deliver Us the Moon. rougher surfaces incur more of a penalty when using RT reflections than mirror and mirror-like surfaces, so the performance cost is probably too high
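On the cost question: the summary's point about diminishing returns (rays have to grow roughly 4x to halve the noise) is just Monte Carlo error scaling as 1/sqrt(N). A toy estimator with made-up numbers, not anything from the actual renderer, shows it:

```python
import random

def mean_abs_error(n_rays, trials=2000, seed=1):
    """Average |estimate - true mean| of a Monte Carlo average of
    uniform [0,1) samples (true mean 0.5), over many trials."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        estimate = sum(rng.random() for _ in range(n_rays)) / n_rays
        total += abs(estimate - 0.5)
    return total / trials

e16 = mean_abs_error(16)
e64 = mean_abs_error(64)  # 4x the rays...
ratio = e16 / e64         # ...only about halves the noise (ratio near 2)
```

So quadrupling the ray count buys only about half the noise, which is why everyone leans on denoisers instead of brute-force sample counts.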
> Seems like the SEUS PTGI looks more realistic, except in really low light situations.

There is going to be no scenario where SEUS is more realistic; it does not path trace nearly as much.
It will become more and more common with each passing day, and one of the next-gen consoles has demonstrated the capability to run ray-traced Minecraft.
> There is going to be no scenario where Seus is more realistic, it does not path trace nearly as much.
> In spite of that, i think it is an excellent mod that I have spent dozens of hours in.

When looking at 2:31 in the video I quoted, the RTX version looks washed out compared to the SEUS version.
> Seems like the SEUS PTGI looks more realistic, except in really low light situations.

I think this will highlight the biggest hurdle for ray tracing: people assuming something is more realistic when they just prefer one method for aesthetic reasons.
> When looking @2:31 in the video I quoted the RTX version looks washed out compared to the Seus version.

MC RTX has volumetric lighting (fog/air condensation), so the air itself is being lit by the sun. So if it is on screen, it will blend over backgrounds, making them tinged sun-coloured.
There is going to be no scenario where Seus is more realistic, it does not path trace nearly as much.
In spite of that, I think it is an excellent mod that I have spent dozens of hours in.
When looking @2:31 in the video I quoted the RTX version looks washed out compared to the Seus version.
Seems like the SEUS PTGI looks more realistic, except in really low light situations.
> I always felt that the hard shadows in minecraft RTX were too sparse compared to what I thought they should look like. But someone pointed out that the sun in minecraft is really big and that's a big factor in why outdoor scenes have softer lighting or are "too lit", with a lack of shadows making them seem less accurate to life. When I learned that, it all made sense. It's not because RTX is inaccurate but rather because it adheres completely to how realistic lighting would look using minecraft assets as a base. I realize I might be completely wrong but it makes sense to me. Thoughts?

That is indeed how it works - pretty neat!
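The big-sun point checks out with basic geometry: the penumbra (soft edge) of a shadow widens roughly with occluder height times the tangent of the light's angular diameter, so a large in-game sun gives much softer shadows than the real ~0.5 degree sun. The numbers below are illustrative assumptions, not Minecraft's actual sun size:

```python
import math

def penumbra_width(occluder_height, angular_diameter_deg):
    """Approximate width of a shadow's soft edge on the ground, cast by
    an edge at the given height, for a light of the given angular size."""
    return occluder_height * math.tan(math.radians(angular_diameter_deg))

real_sun = penumbra_width(10.0, 0.53)  # real sun: ~0.53 degrees across
big_sun = penumbra_width(10.0, 5.0)    # an assumed, much larger in-game sun
# big_sun comes out roughly 10x wider: softer, sparser-looking shadows
```

With an accurate area-light sun, a tall occluder's shadow edge smears out over several blocks, which is exactly the "too lit" look described above.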
I've been showing the Minecraft RTX demo to my friends that don't even know what Raytracing is and they are constantly in awe at how much the game changes. Raytracing is gonna do wonders next gen.
That said, I think it would be great if the consoles reached current Turing levels of RT performance, but it wouldn't surprise me if it's a lot less than that.