Chaos Corona Forum
General Category => General CG Discussion => Topic started by: agentdark45 on 2017-11-16, 14:41:48
-
https://blogs.nvidia.com/blog/2017/05/10/ai-for-ray-tracing/
Around the 2 minute mark in the video above.
Very interesting happenings! I wonder what else neural nets could speed up in the rendering process?
-
It looks very interesting!
Maybe it will be possible to integrate it into Corona as a "fast denoise GPU solver" =)
-
maybe ;)
-
Mother of God! O_O That would be very interesting!
-
maybe ;)
=D
-
NVidia will actually ship the denoiser as part of the OptiX 5.0 SDK, so anyone can integrate it into their application. I've looked at the OptiX samples briefly and it doesn't seem too complicated to use. The denoiser can be used separately - the renderer itself doesn't need to be written with OptiX or use GPUs at all.
However, keep in mind that the OptiX denoiser *requires* an NVidia GPU in order to run. If you want to use it on a render farm for final frames, you'll need GPUs on the farm (which is what NVidia wants, for sure :))
Best regards,
Vlado
-
And what about memory requirements? Would implementing this mean a) lower system RAM consumption during the denoising phase, and b) would it be limited by the available amount of GPU RAM?
-
Wouldn't mind trying it either :-)
-
Faster denoising would be a huge bonus. We occasionally do some stupidly big renders (20k or more) and denoising those takes crazy long, and even on standard 5-8k renders it can be a really slow process. So this would be pretty exciting to see implemented!