Author Topic: nvidia denoise out of memory error  (Read 2874 times)

2019-03-06, 18:23:27

iancamarillo

  • Active Users
  • **
  • Posts: 281
    • View Profile
Hi,
I rendered a 10K image but I'm running out of memory and hitting an error when denoising. Is it the memory on my old Quadro K4200 graphics card, or system RAM? I have 128 GB of RAM. Is it possible to resume rendering and use High Quality denoising in Max so I don't have to re-render? Thanks

2019-03-06, 18:28:14
Reply #1

TomG

  • Administrator
  • Active Users
  • *****
  • Posts: 5468
    • View Profile
Depends on what error message you are getting, best to screen grab and post that.
Tom Grimes | chaos-corona.com
Product Manager | contact us

2019-03-06, 18:33:10
Reply #2

iancamarillo

  • Active Users
  • **
  • Posts: 281
    • View Profile
Hi, when I press Print Screen, CIE crashes, but I did get the window into the clipboard. Thanks

2019-03-06, 18:35:54
Reply #3

TomG

  • Administrator
  • Active Users
  • *****
  • Posts: 5468
    • View Profile
And the answer is in the message :) It's the NVIDIA denoiser, i.e. the GPU, that has run out of memory.

2019-03-06, 18:41:10
Reply #4

iancamarillo

  • Active Users
  • **
  • Posts: 281
    • View Profile

2019-03-06, 18:58:09
Reply #5

TomG

  • Administrator
  • Active Users
  • *****
  • Posts: 5468
    • View Profile
On the question of whether you can resume rendering and switch denoisers - you can, but it is possible denoising won't work or won't be as high quality as starting from scratch (see https://coronarenderer.freshdesk.com/support/solutions/articles/12000009087)

I tested starting with NVIDIA denoising, stopping after 5 passes, and saving to CXR. Then I changed to High Quality, and both Resume Last and Resume From File worked, and both did denoise. I didn't compare whether denoising that way was as effective as starting with High Quality from the beginning. You could switch to the Intel Denoiser, as it uses the same information as the NVIDIA one (High Quality saves and uses extra information, which is one of the reasons it is higher quality, and why it takes more system RAM; Intel will use system RAM too, of course).

2019-03-06, 19:01:37
Reply #6

iancamarillo

  • Active Users
  • **
  • Posts: 281
    • View Profile
Ok great. I'll give it a shot! For now I'm gonna try denoising on a 1080ti because of time constraints. Thanks for following up

2019-03-06, 19:10:37
Reply #7

TomG

  • Administrator
  • Active Users
  • *****
  • Posts: 5468
    • View Profile
Ok - note that a 10K image is going to be large for a GPU to handle, and the memory needed may multiply based on how many render elements you have (if using LightMix, for instance). Let us know how it goes!
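For a feel of why a 10K image strains a 4 GB card like the K4200, here is a rough back-of-envelope estimate. This is purely illustrative arithmetic, not Corona's or NVIDIA's actual memory model; the channel and buffer counts are assumptions for the sketch:

```python
# Illustrative estimate of GPU memory needed to hold float32 image
# buffers during denoising. NOT Corona's real memory model -- the
# buffer/channel counts below are assumptions for a rough estimate.

BYTES_PER_CHANNEL = 4  # float32


def estimate_gib(width, height, channels_per_buffer=4, num_buffers=3):
    """Memory, in GiB, for `num_buffers` float32 image buffers
    (e.g. beauty input, auxiliary data, denoised output)."""
    pixels = width * height
    total_bytes = pixels * channels_per_buffer * BYTES_PER_CHANNEL * num_buffers
    return total_bytes / 2**30


# A "10K" frame (assumed 10240 x 5760, 16:9) with three buffers:
print(round(estimate_gib(10240, 5760), 2))  # ~2.64 GiB

# Each extra render element adds roughly another buffer:
print(round(estimate_gib(10240, 5760, num_buffers=6), 2))
```

Even under these simple assumptions, three buffers alone approach the K4200's 4 GB of VRAM, and each enabled render element pushes it further - which is consistent with disabling elements letting the denoise fit on the 1080 Ti.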

2019-03-06, 19:29:17
Reply #8

iancamarillo

  • Active Users
  • **
  • Posts: 281
    • View Profile
Yes, I was thinking about that before rendering, so I disabled all elements except wire and it worked on the 1080 Ti. Really looking forward to using the Intel denoiser soon, because my PC is unusable while I'm rendering with NVIDIA's :)

2019-03-06, 20:00:20
Reply #9

mferster

  • Active Users
  • **
  • Posts: 523
    • View Profile
I'm just curious - is there a particular reason why you're using a K4200 on your workstation when you have access to a 1080 Ti?

2019-03-06, 20:07:45
Reply #10

iancamarillo

  • Active Users
  • **
  • Posts: 281
    • View Profile
Good question. Our workstations are currently in flux as we're awaiting new machines with RTX cards, so we're kind of jumping around depending on who needs what.