Chaos Corona Forum

Chaos Corona for 3ds Max => [Max] I need help! => Topic started by: snakebox on 2014-12-10, 07:31:17

Title: Memory optimization - discuss please
Post by: snakebox on 2014-12-10, 07:31:17
Hi Everyone,

I would like to start a discussion about how people manage Corona's memory usage on big commercial jobs, because Corona eats A LOT of memory, way more than is practical for most jobs. We are currently having big issues with very standard, simple exterior shots that use more than 34 GB of RAM when rendered at 5000x5000 pixels. The quick and easy tip would be "get 64GB", but that's not a solution; similar scenes in Vray use 8GB at the same resolution.

Corona HAS to become more memory efficient one way or another, but due to its mostly unbiased nature I can't figure out what to do differently to get some memory back.

My problems:
1. Corona uses a lot of memory just by itself.
2. Proxies seem to be only for viewport performance, because unlike Vray, Corona loads the entire scene into memory before rendering no matter what (proxies, xrefs etc.), whereas Vray loads proxies only as needed per bucket.
3. ...well, I'm already out of ideas here.

But given that 5K x 5K images are painful to render even without much detail, actually big scenes with forests etc. will become impossible.
Title: Re: Memory optimization - discuss please
Post by: Ludvik Koutny on 2014-12-10, 10:46:38
Vray can eat less memory because it can render out of core, meaning you can fit more stuff into a scene than what fits in your RAM. That's not gonna change anytime soon for Corona. Unless you use the internal resolution multiplier set to 2, or many render elements, a 5k x 5k image should not eat 34GB of RAM due to the VFB. The problem is probably that you are used to Vray's out-of-core rendering. I can render very huge scenes at very high resolution with Corona, but I always have to be mindful about the amount of unique polys and texture sizes. It's always been that way. If you really have jobs that regularly require huge amounts of unique geometry and high-detail texturing (high-end VFX), then Corona may not be the renderer of choice for you in the near future.
Title: Re: Memory optimization - discuss please
Post by: snakebox on 2014-12-11, 04:47:57
I agree with you there.

My concern is that on most commercial jobs you don't get to pick and choose your geometry to be perfect and optimal, as 95% of the time you have to deal with a super shit CAD model from some other software.

I'm really just trying to figure out what specifically throws Corona off memory-wise. But I guess I will have to do some testing.
Title: Re: Memory optimization - discuss please
Post by: Juraj on 2014-12-11, 05:08:51
That's quite a discrepancy, to be honest. I use Vray 3 a little, mostly on animation jobs, but since I've used it side by side with Corona for more than a year, I can tell from my experience that memory usage is pretty equal, with Corona being only slightly more hungry (but growing more with instances, since I do not use proxies in any case).

Btw, I use 64GB RAM workstations, not really for scenes but rather for the freedom of multi-task working, but I rarely go over 20GB per scene, and I render 8k with internal res = 2, zero optimization whatsoever, 4-8k textures, etc.

If it's not because of 3D displacement (remember Corona does not have a texture-space option like Vray, and most people equate the two without realizing the difference in what they use; 3D displacement can eat a lot of memory if used carelessly), it could be because of the geometry threshold for instances? Try using Vray 3.0 with Embree on a high-memory scene and watch it fail due to a memory leak, since it will load instances as unique geometry as well.

I think this should be traced to something specific, because in general I do not see that much of a difference in my daily use.

Experimental guess (could be stupid advice, if so I am sorry, just putting it out there since you mention it revolves mostly around forests and other scattering): if you set Embree mode to low-memory in Devel/Debug and lower the min. instance saving, does that change anything? My only forest scene in Corona didn't have thousands of trees, but there were still a few hundred of them, each 10mil polys, and it rendered quite fine.
Title: Re: Memory optimization - discuss please
Post by: snakebox on 2014-12-11, 05:29:23
Thanks for the feedback!

Firstly, we generally do not use displacement at all. We just don't need it; if something needs the detail, it usually ends up being done in geometry anyway.

And my "problem" is rare and very scene-specific, but I do occasionally see such scenes (max files around 1GB or above worth of 3D crap). Also, I may be slightly jaded as we currently do not run Vray and Corona side by side, because Corona has made life so much easier on a lot of fronts. But I guess the reason I feel slight concern is that I don't have a lot of knobs to tweak in Corona (visible ones anyway).

Also, we don't actually do forests that big: your typical Forest Pack Pro planting plus a few feature trees on average. But there is always the one-off job.

I will do a quick test of overall mem use with the suggested settings (didn't even notice there was a debug UI).

Edit: we have 1-2 workstations with 64GB of RAM, but all the render nodes and the remaining workstations are on 32GB.

Title: Re: Memory optimization - discuss please
Post by: snakebox on 2014-12-11, 06:17:13
So interestingly..

If I enable "Low memory" under Embree, none of the geometry is rendered or even parsed, just like it's hidden. Untick it again, and everything renders as normal.

I will try to raise and/or lower the min. instance setting, but from the little I have tried, it seems like lowering it makes Max take up more memory (I am assuming that's because Corona keeps fewer things as instances?).

Edit: With the min. instance saving setting I can't see any visible difference. The scene renders on my test machine regardless, but we have had a lot of render nodes fail after X time, which makes it a little hard to test. But what is that setting meant to do? (And why isn't anything loading when low mem is on?)
Title: Re: Memory optimization - discuss please
Post by: Captain Obvious on 2014-12-11, 12:14:15
One thing worth noting is that Corona seems to be much more efficient at dealing with heavy point instancing (like forests and such). It's only really noticeable if you render on the order of millions of instances, but still.



I understand that out-of-core geometry can be tricky, but some kind of native bitmap paging system could be really beneficial, especially for big heavy HDRIs. You can use things like tiled OpenEXR files and load/cache only the bits that are needed, as and when they're needed. That helps a lot.
Title: Re: Memory optimization - discuss please
Post by: Ondra on 2014-12-11, 13:26:50
low memory embree option is currently broken, I know about that issue
Title: Re: Memory optimization - discuss please
Post by: zules on 2017-02-21, 11:59:32
Any news about Corona's RAM usage?

My render node (i7 / 16GB RAM) always gets alerts for RAM usage, and my 32GB workstation also gets full pretty quickly...
Title: Re: Memory optimization - discuss please
Post by: maru on 2017-02-21, 18:54:38
The previous post is from 2014. :) There have been some memory optimizations added since then, and there are some new ones in 1.6 already.

The main question is - in what specific case are you experiencing these issues? It would be best if you could contact us at https://coronarenderer.freshdesk.com/support/tickets/new
Title: Re: Memory optimization - discuss please
Post by: Charlie Nicols on 2017-02-27, 16:46:26
I made a post about this a little while ago. We're in the same position: as soon as we go over 5k images with passes, even our 64GB machines nearly run out of RAM.

Looking forward to solutions for this.
Title: Re: Memory optimization - discuss please
Post by: maru on 2017-02-27, 17:23:24
Do you think it would be possible to render this with Vray or some other engine? If so, do you have some test results or sample scenes that you could share?
One idea that would help a bit with RAM usage when rendering at very high res is disabling the VFB. You can learn about it, and other memory optimization methods, in this guide:
https://coronarenderer.freshdesk.com/support/solutions/articles/5000675854
Title: Re: Memory optimization - discuss please
Post by: Charlie Nicols on 2017-03-14, 17:50:59
OK, I have some test results to post on this issue. A summary for people not wanting to read it all:

Test Computer
i7-6800k - 64GB Ram - Windows 10

Test Scene - https://we.tl/BYBvTJEDfP   (We Transfer Link)
1 Plane - 1 Box -- 1 Sphere Light

**All units in GB**

Base Ram usage for 3DS MAX
0.932

-- Render size 2000x2000 --
After the first 4 passes - 1.1
After the 5th pass completed - 1.25
After 10 passes - 1.25

-- Render size 2000x2000 -- Denoising --
After the first 4 passes - 1.49
After the 5th pass completed - 1.63
After 10 passes - 1.63

Stopped at 10 passes - denoising kicks in and RAM usage goes up to 2.7 before finishing


-- Render size 7000x7000 --
After the first 4 passes - 4.53
After the 5th pass completed - 6.25 (spiked to 7.4)
After 10 passes - 6.26 (spiked to 7.24)

-- Render size 7000x7000 -- Denoising --
(Base 3ds Max RAM usage has increased to 2.54 even though the VFB has been cleared in Corona) **note

After the first 4 passes - 8.925
After the 5th pass completed - 10.6 (spiked to 13.2)
After 10 passes - 10.6 (spiked to 14)

Stopped at 10 passes - denoising kicks in and RAM usage goes up to 18 and slowly rises to 24.4 before finishing


** Note: at this point my 3ds Max is consuming 9GB of RAM; to bring it back down I had to render a 10px x 10px image. It currently sits at 2.2. **
-- I tried erasing the previous 7000x7000 render with denoising and then refreshing the window, as well as resetting Corona's settings --

To get the base RAM usage of 3ds Max back down I restarted Max and opened the test file - current usage 0.67

-- Render size 7000x7000 -- Denoising + elements we need for production --

Normals, AO, Alpha, Direct Lighting, 6 mask passes for IDs 1-19, Shadows, and the following raw components: Raw Refraction, Raw Reflection (with denoising on the element), Raw Diffuse

After the first 4 passes - 19.98
After the 5th pass completed - 21.6 (spiked to 24.2)
After the 10th pass completed - 21.6 (spiked to 24.2)

Stopped at 10 passes - denoising kicks in and RAM usage goes up to 34 and slowly rises to 36 before finishing

And I tested this in Vray on all very high settings - the most RAM it used was 11.7 (took ages though, don't make me go back!)
Title: Re: Memory optimization - discuss please
Post by: maru on 2017-03-24, 14:44:07
The RAM usage described here is probably expected. Each render element requires some RAM. Denoising requires extra render elements, which are not visible to the user. Adaptivity also requires additional RAM (that's why RAM/CPU usage may change every 5 passes).
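Some back-of-envelope math supports this (my own estimate, assuming 32-bit float RGBA buffers per element; Corona's actual internal layout isn't documented here, so treat the numbers as an order-of-magnitude sketch):

```python
def framebuffer_gb(width, height, elements, channels=4, bytes_per_channel=4):
    """Rough size of full-float render buffers for one frame, in GB."""
    return width * height * elements * channels * bytes_per_channel / 1024**3

# A bare 7000x7000 beauty pass is already sizeable...
print(round(framebuffer_gb(7000, 7000, 1), 2))   # ~0.73 GB
# ...and a dozen elements (user passes plus denoising's hidden ones) adds up fast.
print(round(framebuffer_gb(7000, 7000, 12), 2))  # ~8.76 GB
```

That is buffers alone, before geometry, textures and the denoiser's working set, which is consistent with the jumps reported in the tests above.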
Title: Re: Memory optimization - discuss please
Post by: Charlie Nicols on 2017-03-24, 18:46:39
Hey maru, if it's to be expected in the current Corona version, that's OK, but it will force most visualisation companies, whose clients expect this size of render, to invest in at least 64GB machines.

I do think this should be pointed out somewhere, as I have seen people asking for help with what kind of spec machine they should buy to run Corona and having a 32GB machine recommended by part of your team.

Don't get me wrong though, we love this renderer and it has done wonders in production. The RAM issues have caused a major headache though, and a lot of extra investment in hardware.
Title: Re: Memory optimization - discuss please
Post by: Juraj on 2017-03-24, 22:19:52
Quote
I do think this should be pointed out somewhere as I have seen people asking for help with what kind of spec machine they should buy to run corona and having 32gb machine recommended by part of your team.

I would personally start suggesting 64GB as the minimum for current workstations; I constantly run out of it on machines where I have it.
Title: Re: Memory optimization - discuss please
Post by: zules on 2017-04-24, 17:20:39
Quote
The RAM usages described here are probably expected. Each render element requires some RAM. Denoising requires extra render elements, which are not visible to the user. Adaptivity also requires additional RAM (that's why RAM/CPU usage may be changed every 5 passes).

That's interesting... so could you add a checkbox to render a LightSelect RE but hide it from the output, or not? Maybe I didn't see how to do that already..?
Using PSDManager, I'm removing those REs from it every time...
Title: Re: Memory optimization - discuss please
Post by: tony_morev on 2017-04-25, 18:45:18
The most useful thing we miss compared to V-Ray is fully functional proxies:
not just a viewport gimmick, but something that actually allows saving RAM while rendering.
Instancing helps, but only to a point.
Title: Re: Memory optimization - discuss please
Post by: Fluss on 2017-04-25, 18:52:13
I think Vray proxies are efficient in bucket mode because geometry can be loaded dynamically when it's needed. I don't know if this could be achieved with a progressive render engine.
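The bucket idea can be sketched like this (purely illustrative; `Rect`, `load_proxy` and `render_bucket` are my own made-up names, not any renderer's API). The point is that only proxies overlapping the current bucket ever get pulled into RAM, and an LRU cap bounds how many stay resident:

```python
from dataclasses import dataclass
from functools import lru_cache

@dataclass(frozen=True)
class Rect:
    x0: int
    y0: int
    x1: int
    y1: int

    def intersects(self, other):
        return (self.x0 < other.x1 and other.x0 < self.x1
                and self.y0 < other.y1 and other.y0 < self.y1)

@lru_cache(maxsize=4)  # only a few proxies stay resident at once
def load_proxy(path):
    """Stand-in for reading proxy geometry off disk on demand."""
    return f"mesh:{path}"

def render_bucket(bucket, scene):
    """Load only the proxies whose screen bounds overlap this bucket."""
    return [load_proxy(path) for path, bounds in scene
            if bounds.intersects(bucket)]
```

A progressive engine touches the whole frame on every pass, so there is no bucket-shaped working set to exploit, which is presumably why this trick doesn't carry over directly.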
Title: Re: Memory optimization - discuss please
Post by: Daniel Schmidt on 2017-05-17, 22:53:56
Quote
That's interesting... so you could add a check box to render but hide Lightselect RE or not ? Maybe I didnt see how to do that already..?
Using PSDManager, I'm removing those RE everytime from it...

@zules - you can manually control whether an element is added to the PSD by toggling the Create Layer/Channel checkbox in the render elements rollout. If you right-click a render element in the list of render elements, a popup menu opens and lets you save new default options for this type of render element (e.g. CShading_LightSelect). However, I just noticed that this preset currently does NOT include the option of whether the element is added to the PSD.

Would it help you if this was changed in psd-manager (so you could exclude LightSelect elements from export in psd-manager by default)?

(http://i.imgur.com/OYz1RtJ.png)

Daniel
Title: Re: Memory optimization - discuss please
Post by: zules on 2017-05-19, 10:47:16
Yes, that's a good trick, thank you Daniel.