Author Topic: Large RAM usage change during render?  (Read 2521 times)

2018-03-29, 10:13:28

Rhodesy

  • Active Users
  • Posts: 560
I'm rendering out quite a big scene (Beta 1) which hits 70GB RAM usage during the initial few passes. But when I check back on it, it's only using about 40GB with the CPU still at 95%. It's denoising now and the RAM has peaked up a bit to 50GB and back down to 37GB, but nowhere near the 70GB. There is 128GB on the machine, so it's not disk swapping. Is this to do with adaptivity? It's great that it comes down, but annoying that it has to break the 60GB mark to start with, as that stops me rendering it on one of our 64GB nodes.

Having said that, I have had this work in my favour before, when I was rendering a scene which started off disk swapping and really slow on a 64GB node. I left it overnight anyway, and in the morning the scene had actually rendered OK.

2018-03-29, 10:34:32
Reply #1

3dboomerang

  • Active Users
  • Posts: 217
  • Head of 3D
  • 3DFLOW
Same in my scenes; I have 48GB and it can go up to 47.5GB and start dumping to a local disk, and then when it starts rendering it suddenly uses something like 25GB of RAM.

I had the same problem with the 64GB threshold - very annoying when using RebusFarm, for example - because my renders wouldn't render; I had to contact IT and they had to manually put them on 128GB render nodes. A waste of time and energy.

Would be lovely to be able to SET a RAM limit inside max-corona...
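
As far as I know Corona doesn't expose such a limit, but on Windows you can approximate one from the outside with a Job Object. Here's a rough Python sketch using pywin32; the 60GB cap and the PID are placeholders, and be aware this makes allocations fail once the cap is hit (so the render dies fast instead of pushing the node into swap) rather than making Corona use less memory:

```python
# Rough sketch: cap a running process's memory with a Windows Job Object.
# Assumes pywin32 is installed (pip install pywin32).
import win32api
import win32con
import win32job

PID = 12345                  # hypothetical PID of the 3dsmax.exe process
LIMIT_BYTES = 60 * 1024**3   # hypothetical 60GB cap

# Create an anonymous job object and give it a per-process memory limit.
job = win32job.CreateJobObject(None, "")
info = win32job.QueryInformationJobObject(
    job, win32job.JobObjectExtendedLimitInformation)
info["BasicLimitInformation"]["LimitFlags"] = (
    win32job.JOB_OBJECT_LIMIT_PROCESS_MEMORY)
info["ProcessMemoryLimit"] = LIMIT_BYTES
win32job.SetInformationJobObject(
    job, win32job.JobObjectExtendedLimitInformation, info)

# Attach the render process to the job; allocations beyond the
# limit will now fail instead of spilling onto the local disk.
handle = win32api.OpenProcess(
    win32con.PROCESS_SET_QUOTA | win32con.PROCESS_TERMINATE, False, PID)
win32job.AssignProcessToJobObject(job, handle)
```

More of a fail-fast safeguard than a fix, but at least a farm job would error out immediately instead of grinding in swap for hours.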


2018-03-29, 10:37:24
Reply #2

houska

  • Former Corona Team Member
  • Active Users
  • Posts: 1512
  • Cestmir Houska
Hi there, Rob!

The plugin and Corona are actually so complex that it's hard to say off the top of our heads whether what you're describing is correct, but memory usage swings are expected as various parts of the code kick in. At the beginning you have document cloning, scene parsing, preparation, and conversion to Corona; on Corona's side, you have the UHD Cache and various other precalculations; then rendering itself (RAM should be pretty stable at this point, or dropping as memory is released back to the system); and at the end, postprocessing with tonemapping, denoising, etc., which will see increased memory usage again.

So yeah, what you're describing is actually pretty natural.
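
If you want to actually see those phases, one thing you could do is log the Max process's memory from the outside and match the peaks to parsing, precalculation, rendering, and denoising. Here's a rough Python sketch, assuming psutil is installed and the host process is 3dsmax.exe (both my assumptions, nothing Corona-specific):

```python
# Rough sketch: poll 3dsmax.exe and log its working set once a second.
# Assumes psutil is installed (pip install psutil).
import time
import psutil

def find_process(name="3dsmax.exe"):
    """Return the first process whose name matches, or None."""
    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() == name:
            return proc
    return None

proc = find_process()
if proc is None:
    raise SystemExit("3dsmax.exe not found")

peak = 0
while proc.is_running():
    rss = proc.memory_info().rss  # resident/working-set size in bytes
    peak = max(peak, rss)
    print(f"{time.strftime('%H:%M:%S')}  {rss / 1024**3:6.2f} GB"
          f"  (peak {peak / 1024**3:.2f} GB)")
    time.sleep(1)
```

The logged peak would also tell you up front whether a scene really crosses the 64GB mark, i.e. which nodes it can safely go to.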

2018-03-29, 13:42:39
Reply #3

Rhodesy

  • Active Users
  • Posts: 560
Thanks Cestmir. Yes, complicated no doubt. It's just quite a big swing. In the end it's better than starting small and getting too big! I always need to watch out for denoising crashing on RAM usage; I have had it crash once during denoising, and I put it down to sailing too close to the RAM limit, but I could be wrong. I find Corona does gobble up quite a bit of RAM, but I do use a lot of passes, so maybe I need to back off on those a bit and really look at what I use.

On a similar note, I find that big scenes often don't work with IR. It just gets stuck in a GI cache / displacement loop (in the scene I'm working on, but it has happened in others too).