I'd like to understand what's happening with Corona and network rendering, and with Corona's RAM usage, as it seems my render nodes do not have enough RAM. I've set up network rendering using my workstation and 3 render nodes. Each of my nodes has 16 GB of RAM. It works: I have successfully sent jobs across DR, and the nodes join in and render passes.
Below is last night's 4K render using DR. It's my usual type of scene: XRefs, proxies, Forest Pack, downloaded models, 4K textures, etc.

I'm not sure whether it's actually helping on my larger scenes, and I don't understand why it says I'm running out of RAM. As you can see, it says the nodes Updated but did not render any passes. Does this mean they still contributed? I limit the render to 100 passes, 6 hours, and a noise level of 3.5. The log showed it reached something like 65/100 passes, with none of the nodes rendering any passes, and it took a little under 4 hours to finish. I've pasted in the log from one of the render nodes:
4.1 GB used by render elements
5.6 GB used by geometry
17.9 GB used by textures
Wow, what's actually happening? I can see the nodes are running out of RAM, but all the bitmaps for this scene come to no more than 2 GB at most.
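My guess is that the 2 GB is what the bitmaps take up on disk (compressed JPEG/PNG), while the renderer has to hold them decompressed in RAM, so I tried a back-of-the-envelope calculation. This is just a rough sketch with made-up numbers (4096x4096, RGBA, 8-bit per channel), not Corona's actual memory layout:

# Rough estimate of what bitmaps cost once decompressed in RAM.
# Assumed figures, not taken from my actual scene: 4096x4096 textures,
# 4 channels, 1 byte per channel, ignoring mip-maps and duplicates.

def uncompressed_mb(width, height, channels=4, bytes_per_channel=1):
    """Size of one decompressed bitmap in megabytes."""
    return width * height * channels * bytes_per_channel / (1024 ** 2)

per_texture = uncompressed_mb(4096, 4096)  # ~64 MB each
print(f"One 4K RGBA texture: {per_texture:.0f} MB")
print(f"100 such textures:   {per_texture * 100 / 1024:.1f} GB")

By that estimate, 100 uncompressed 4K textures already come to roughly 6 GB, and higher bit depths or extra copies would multiply that further, which might explain part of the gap between 2 GB on disk and 17.9 GB reported. I don't know if that's what Corona is actually doing, though.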
So why is this happening, and how can I work around it?
Upgrading the nodes with more RAM is out of the question. DR only starts to become useful on big, complex scenes that take a long time to render, so is it actually working if the nodes don't contribute any passes? If not, how can I make it work on the scenes that will be render intensive?