Author Topic: handle heavy scenes causing high ram usage (automotive)  (Read 11935 times)

2017-03-10, 12:50:29

Duron

  • Active Users
  • **
  • Posts: 106
    • View Profile
    • Portfolio
Hi,

I just want to know about your experiences handling heavy scenes with large polygon counts at high resolutions on low-RAM machines, such as a 32GB workstation.

As time passes and the quality of the models gets higher and higher, I'm running more and more often into Max crashes when I want to render high-resolution images >4K with >30 million polygons (CAD files). I know that the internal resolution is set to 1 by default, which is great, but I need further optimizations just to get a more stable scene for production rendering, as I get crashes even at 2-3K without any displacement or render elements. I'm working on a 32GB workstation at home. The same scenes work fine on workstations with 96GB at work, but I need workarounds for my 32GB workstation.

I read something on the Corona helpdesk about "Reduce the total number of instances". Normally I prepare all my scenes by deleting all mirrorable parts and then mirroring them back in as instances. So now I'm a bit confused by that advice. Which way is better for reducing RAM usage?

What exactly does Corona do when "lowmem" is activated? How much can it help for renders above 4K?

How big are the benefits of disabling the VFB? Any drawbacks? Anyone with experience here?

Not an option:
settling for low resolution
reducing the polygon count

Also, I could imagine an optional "high-res production" mode in Corona where the VFB shows only a 10-20% image resolution, just to see progress and keep control over post-processing, while Corona renders internally at a resolution like 12K. This could be really helpful; I know this kind of feature from Autodesk VRED, which already has it.

Thanks for any tips or help!

Cheers,
Duron

2017-03-10, 14:13:07
Reply #1

pokoy

  • Active Users
  • **
  • Posts: 1861
    • View Profile
I don't know how much geometry you have in the scene, but this may help. Since you say CAD data - how do you get it into Max? Do you import it as Body objects or already as a mesh? If you are using Body objects, make sure you specify a reasonable render tessellation resolution, and make sure it's not using different resolutions for the viewport and the rendering. Also, when you convert Body objects to Mesh/Poly, they will have all their patches unwelded, which will add a lot of memory overhead depending on the number of subelements. A good way to get rid of them is to weld with the ProOptimizer modifier (enable 'Merge Vertices' with a tiny threshold, leave everything else as it is, run 'Calculate' and convert to Mesh/Poly). Don't weld with the Mesh/Poly Weld command, as this will invalidate explicit normals.
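To get a feeling for how much unwelded patches can cost, here is a rough back-of-the-envelope sketch. The 32 bytes per vertex (float32 position, normal and UV) and the welded-vertex estimate are illustrative assumptions, not the renderer's actual memory layout:

```python
def vertex_mb(n_vertices, bytes_per_vertex=32):
    """Estimate raw vertex storage in MB.

    32 bytes = float32 position (12) + normal (12) + UV (8);
    an assumption for illustration, not Max's or Corona's real layout.
    """
    return n_vertices * bytes_per_vertex / 2**20

tris = 30_000_000
unwelded = 3 * tris   # fully unwelded: every triangle carries its own 3 vertices
welded = tris // 2    # closed welded mesh: V is roughly T/2 (Euler's formula)

print(vertex_mb(unwelded))  # ~2746 MB (~2.7 GB)
print(vertex_mb(welded))    # ~458 MB
```

For a 30M-triangle CAD mesh, fully unwelded data weighs in at roughly 2.7 GB of raw vertex storage versus under 0.5 GB welded, which is why a single ProOptimizer pass can matter on a 32GB machine.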

Using Render Elements will increase memory usage, since Corona allocates additional memory per element; this can easily add a few GB if you use several of them at this resolution. As for the VFB being displayed, I'm not sure how much overhead that adds.
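As a rough sanity check on the "a few GB" figure, assuming each render element is stored as a 32-bit float RGBA buffer (an assumption for illustration; the actual internal format isn't documented here):

```python
def element_mb(width, height, channels=4, bytes_per_channel=4):
    """Estimate framebuffer memory for one render element in MB.

    Assumes a float32 RGBA buffer, which is an illustrative guess,
    not Corona's documented internal format.
    """
    return width * height * channels * bytes_per_channel / 2**20

print(element_mb(3840, 2160))  # 4K UHD: ~127 MB per element
print(element_mb(7680, 4320))  # 8K: each dimension doubles, memory quadruples: ~506 MB
```

Under those assumptions, five elements at 4K already cost around 630 MB; at 8K the same five elements approach 2.5 GB.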

Another option is to use a Corona Proxy for each instanced mesh. And the term 'instance' on the helpdesk probably means a unique mesh; that's at least how the term is defined in other places, such as Corona's MultiMap.

2017-03-10, 17:43:36
Reply #2

Ondra

  • Administrator
  • Active Users
  • *****
  • Posts: 9048
  • Turning coffee to features since 2009
    • View Profile
We have some slight memory optimizations planned for 1.6 (and have already done some), but nothing drastic. The bad part is that 3ds Max effectively doubles the memory usage by storing the entire evaluated scene and retaining it during and after rendering. We have yet to find out how to flush this copy.
Rendering is magic. | How to get minidumps for crashed/frozen 3ds Max | Sorry for short replies, brief responses = more time to develop Corona ;)

2017-03-10, 18:14:40
Reply #3

pokoy

  • Active Users
  • **
  • Posts: 1861
    • View Profile
...The bad part is that 3ds Max effectively doubles the memory usage by storing the entire evaluated scene and retaining it during and after rendering...
Ouch!

2017-03-10, 19:02:43
Reply #4

Duron

  • Active Users
  • **
  • Posts: 106
    • View Profile
    • Portfolio
@pokoy
The geometries are definitely mesh data, no NURBS.

@ondra
Too bad! But nothing surprises me that has Autodesk in its name...
What do you think about the idea of a small render window? Is it generally possible to do something like that?

2017-03-10, 19:05:14
Reply #5

Ondra

  • Administrator
  • Active Users
  • *****
  • Posts: 9048
  • Turning coffee to features since 2009
    • View Profile
I would love to do it - it would fix some other stuff (such as the "Cannot create render output" message), but it is probably not doable in a transparent way, so the UI would suffer.

2017-03-10, 19:34:38
Reply #6

pokoy

  • Active Users
  • **
  • Posts: 1861
    • View Profile
@pokoy
The geometries are definitely mesh data, no NURBS.
If the data came from CAD, you might still want to check for separate elements within the meshes; if there are lots of unwelded vertices, it might help a bit. But it will probably be more effective to sacrifice a render element.

2017-03-10, 19:41:04
Reply #7

Duron

  • Active Users
  • **
  • Posts: 106
    • View Profile
    • Portfolio
Silly that the memory issue remains after rendering. I just noticed that RAM usage decreases slowly if you leave Max alone for a while, which is also really strange.

How much does the denoiser stress RAM usage here? Does the denoiser cause higher RAM usage after rendering compared to the actual render process? This could also explain the crashes I get overnight. I guess the rendering itself finishes fine, but the denoiser then leads to the crash because more RAM is needed. Correct?

@pokoy
I think this can really help, but the amount of work involved, together with the risk of destroying the normals (which is generally a problem with CAD meshes in 3ds Max), makes it questionable.

2017-03-10, 19:55:47
Reply #8

pokoy

  • Active Users
  • **
  • Posts: 1861
    • View Profile
@pokoy
I think this can really help, but the amount of work involved, together with the risk of destroying the normals (which is generally a problem with CAD meshes in 3ds Max), makes it questionable.
Hence the ProOptimizer workaround - it'll respect explicit normals and keep them intact. Maybe useful for future projects then ;)

2017-03-10, 22:17:04
Reply #9

Ondra

  • Administrator
  • Active Users
  • *****
  • Posts: 9048
  • Turning coffee to features since 2009
    • View Profile
The denoiser requires about 4-5 additional render elements to be allocated, so it can actually make a big difference for high-resolution renders.
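Taking that 4-5 extra buffers figure at face value, and again assuming 32-bit float RGBA storage per buffer (an illustrative assumption, not the documented format), the denoiser's overhead can be estimated like this:

```python
def denoiser_extra_gb(width, height, n_buffers=5, channels=4, bytes_per_channel=4):
    """Estimate the extra memory the denoiser's auxiliary buffers need, in GB.

    n_buffers=5 comes from the '4-5 additional render elements' figure above;
    the float32 RGBA layout is an assumption for illustration.
    """
    return n_buffers * width * height * channels * bytes_per_channel / 2**30

print(denoiser_extra_gb(3840, 2160))  # 4K: ~0.6 GB
print(denoiser_extra_gb(7680, 4320))  # 8K: ~2.5 GB
```

Under those assumptions, ~0.6 GB extra at 4K is harmless, but ~2.5 GB at 8K, allocated on top of the beauty pass and any other elements right at the end of a render, fits the overnight-crash pattern described above.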

2017-03-11, 03:48:51
Reply #10

chilombiano

  • Active Users
  • **
  • Posts: 53
    • View Profile
Damn! So 32GB is low RAM? :)

I don't know how things come in from CAD, but I've always found that Max struggles when dealing with simply too many objects, independent of the poly count, and I'd say the scene, and the moment the render starts to kick off, improves a lot by collapsing things like crazy - like a whole building. Don't know why, but it works for me.
There is an old, very handy script called DANDG_Collapse. It screws up the UVs in some rare cases, but generally gives you a nice single mesh with a multi-material in seconds.
Or you could split the scene and try XRef. Gross.



2017-03-11, 09:48:53
Reply #11

FrostKiwi

  • Active Users
  • **
  • Posts: 686
    • View Profile
    • YouTube
Would the V-Ray framebuffer trick work in Corona as well if Corona had its own way of inputting the resolution, or did Corona already fix what V-Ray couldn't?

Quote
Due to technical reasons, the original 3ds Max frame buffer is still created at render time even if rendering to the V-Ray Frame Buffer. However, when the V-Ray Frame Buffer is enabled, V-Ray will not render any data to the 3ds Max frame buffer. To preserve memory consumption when using the V-Ray Frame Buffer, the following settings are recommended:
  • Under the Common tab Output Size section in the Render Setup window, set both Width and Height to low values such as 100.
  • In the V-Ray Frame Buffer rollout (described on this page), disable Get resolution from MAX and enter the desired Width and Height values along with any other needed information such as Image aspect.
I'm 🐥 not 🥝, pls don't eat me ( ;  ;   )

2017-03-11, 20:04:48
Reply #12

Ondra

  • Administrator
  • Active Users
  • *****
  • Posts: 9048
  • Turning coffee to features since 2009
    • View Profile
i can probably do the same trick - but not in "UI smooth" way :/

2017-03-12, 08:42:00
Reply #13

FrostKiwi

  • Active Users
  • **
  • Posts: 686
    • View Profile
    • YouTube
i can probably do the same trick - but not in "UI smooth" way :/
Maybe hide it in the devel tab, because if I render at 4K vs. 8K, my usage goes from 8GB to "does not fit into RAM anymore".
Maybe just for those special cases, for users who are willing to dig into the Corona UI to get the render going at all.

2017-03-12, 10:21:21
Reply #14

romullus

  • Global Moderator
  • Active Users
  • ****
  • Posts: 8833
  • Let's move this topic, shall we?
    • View Profile
    • My Models
+1
Yes please - if the choice is between being able to render at higher resolutions with a crippled UI or not rendering at all, I'd choose the former. As long as there's still the normal UI for lower-resolution renders, of course.
I'm not Corona Team member. Everything i say, is my personal opinion only.
My Models | My Videos | My Pictures