Author Topic: Question about Optix Denoiser  (Read 2887 times)

2018-10-01, 17:50:08

xwilz14

Hello Corona Team,
I want to ask a few questions about the recently added NVIDIA OptiX denoiser.
Does it run on the CPU or the GPU? In my test, the overall render speed is still determined by CPU speed.
So OptiX is used for denoising only. Is this correct?
Do I need a GPU with a lot of memory for complex scenes?
Thanks for the answers.

2018-10-01, 18:01:04
Reply #1

TomG

Yes, it runs on the GPU. And yes, it's denoising only; no rendering is harmed in the use of the OptiX denoiser.

Image size will determine how much GPU memory you need, not scene complexity (as it is only running on the final 2D image).
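
As a very rough back-of-the-envelope sketch (the buffer count and layout here are my assumptions, not measured figures), the footprint scales with the pixel count roughly like this:

Code:
# Rough GPU memory estimate for AI-denoising a single image.
# Assumption: the denoiser keeps a handful of full-resolution float buffers
# on the GPU (noisy beauty, auxiliary passes, output), each RGBA 32-bit float.
# Driver/display overhead is ignored.

def denoise_mem_estimate_gib(width, height, buffers=4):
    bytes_per_pixel = 4 * 4                  # 4 channels x 4 bytes (float32)
    return width * height * bytes_per_pixel * buffers / 1024 ** 3

for res in [(1920, 1080), (3840, 2160), (6000, 4000)]:
    print(res, "~ %.2f GiB" % denoise_mem_estimate_gib(*res))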

Hope this helps!
Tom Grimes | chaos-corona.com
Product Manager | contact us

2018-10-01, 18:15:42
Reply #2

xwilz14

OK, thanks for your reply. So I don't need a fancy GPU, just one capable enough to run OptiX, right?
It's a great feature for making preview renders!
Thanks.

2018-10-01, 18:24:28
Reply #3

TomG

Kepler support is required. I can run it on my GTX 680 for IR, but there are some 7xx-series cards that aren't Kepler-based, so it's not just a matter of GPU generation. However, I can't run it for final renders, due to the 2 GB of memory on that card. Naturally, the newer and more powerful the card, the faster the AI denoising is. So the GPU processor will determine speed, and the GPU memory will determine the maximum render size (regardless of GPU processor).
Tom Grimes | chaos-corona.com
Product Manager | contact us

2018-10-01, 18:48:43
Reply #4

xwilz14

Okay, I understand most of it.
Do you know the resolution limit of a 2 GB card?
In my case, since I have 5 PCs and only 3 of them have an NVIDIA card, my best bet is to install the NVIDIA card I have (a GTX 1070 8 GB) in my manager machine, right?
The other 4 PCs can help via the DR server, but they don't need an NVIDIA card since they only help on the CPU side.
Am I right?

Note:
My current workflow mainly uses Backburner to send renders.

2018-10-01, 18:57:55
Reply #5

TomG

For DR, correct, only the master machine needs the compatible GPU.

Note that this is different to using Backburner (or other network manager) where each machine will render an entire frame by itself; in that situation, every machine would need a compatible GPU.

I don't have any figures on what resolutions would fit within a 2GB restriction, sorry :(
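
One thing you can do is check how much of the card's memory is actually free before kicking off a big denoised render; a quick sketch, assuming the NVIDIA driver's nvidia-smi tool is on the PATH:

Code:
# Print each GPU's name, total and free memory (nvidia-smi ships with the
# NVIDIA driver). Handy to run before trying a large final render.
import subprocess

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,memory.total,memory.free",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(out.stdout.strip())   # e.g. "GeForce GTX 680, 2048 MiB, 1536 MiB" (made-up values)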

EDIT - I have never tested sending a DR job via Backburner. I'd think only the master DR machine would need a compatible card, since it gathers the image and then does the denoising. Again, that's different from plain Backburner rendering, where each machine does its own frame :)
Tom Grimes | chaos-corona.com
Product Manager | contact us

2018-10-01, 19:26:56
Reply #6

xwilz14

Thanks for your answers, Tom.
You are correct; my setup for the last 8 months has been like that.
So I have 5 PCs, and I only start the Backburner manager and Backburner server on the master PC.
For the other 4, we keep working on those PCs, and when we take a break for the day
we just launch DrServer on those 4 PCs to help with DR (no Backburner server is required).
This setup has never failed us over the past year.
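
For reference, the end-of-day step on each worker PC is basically just starting the Corona DR server; a minimal sketch (the install path below is an assumption, adjust it to your own setup):

Code:
# End-of-day routine on a worker PC: start the Corona DR server so the
# machine can join distributed rendering overnight.
import subprocess

# Assumed install path - change it to wherever DrServer.exe lives on your machines.
DR_SERVER = r"C:\Program Files\Corona\DR Server\DrServer.exe"

# Launch without blocking so a scheduled task (or this script) can exit.
subprocess.Popen([DR_SERVER])
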
I will test it further, but this will cut a lot of time off making preview renders for our clients.
Just 5-10 passes is all it takes to get a 'presentable image'.
Thanks a lot Tom. :)

2018-10-01, 19:38:14
Reply #7

TomG

Sounds like it should work fine in that case - of course, we'd be interested in the results of your tests, so do let us know! And you are welcome!
Tom Grimes | chaos-corona.com
Product Manager | contact us