Author Topic: Better memory management on PCs with limited RAM / option to set a memory usage limit  (Read 16648 times)

2017-03-26, 13:04:43

shiftman2012

I would like to request better memory management for PCs with a limited amount of RAM.
Running out of memory while rendering with Corona is a common problem, even with 32 GB of RAM, and right now there is no option to tell Corona not to use all available RAM. When memory runs out, the system starts swapping to disk and CPU usage drops.
In my experience Corona 1.5 uses almost all available RAM (with displacement turned off) and crashes at around 31.6 GB of the PC's 32 GB. I have read the same experience on a Russian forum: Corona uses far more RAM than V-Ray.
It would be great to have an option to set a limit on how much RAM Corona may use (as V-Ray does with dynamic geometry and the low-memory Embree checkbox), so the computer does not run out of memory.
Such an option should make Corona load data on demand at render time, not all at once before rendering.

I am still not using Corona on a daily basis because of this memory management problem: I never know whether Corona will render my project fine or run out of all my memory and crash.

2017-03-26, 13:08:41
Reply #1

shiftman2012

I have also noticed that after a Corona render finishes, the used memory is not released properly, and I need to restart 3ds Max to bring its RAM usage back down.
When I open my project, 3ds Max uses 8 GB of RAM; during rendering it uses 31.5 GB; after rendering is done it still uses 16 GB. Is it possible to make Corona clean up memory after rendering? Or is there a workaround for this problem? How can I free the memory after rendering?

2017-03-26, 13:11:58
Reply #2

shiftman2012

Is it possible to implement in Corona the same kind of memory limit / dynamic memory option, together with a "conserve memory" option, as in V-Ray?

I attached the V-Ray settings.

2017-03-26, 13:28:22
Reply #3

Juraj

I have also noticed that after a Corona render finishes, the used memory is not released properly, and I need to restart 3ds Max to bring its RAM usage back down.
When I open my project, 3ds Max uses 8 GB of RAM; during rendering it uses 31.5 GB; after rendering is done it still uses 16 GB. Is it possible to make Corona clean up memory after rendering? Or is there a workaround for this problem? How can I free the memory after rendering?

Yeah, this is a big issue. I also have to restart scenes when I am rendering finals, because subsequent renders take double the original memory footprint. It easily exceeds the 64 GB of memory I have in most machines.

Ondra mentioned that 3ds Max creates a copy of the data and they don't know how to flush it.


2017-03-26, 13:58:20
Reply #4

shiftman2012

I am not sure if anyone has already posted this request, but is it possible to implement a dynamic-geometry RAM usage limit in Corona (as it is done in V-Ray)? Right now we only have a static mode that loads all geometry and other data at once before rendering, am I right?

2017-03-26, 17:32:29
Reply #5

Cheesemsmsm

+1

The memory issue is a pain.
I have to move back to V-Ray when I'm working on a big scene because it's much more memory-friendly.

2017-03-27, 08:17:06
Reply #6

Ryuu

  • Former Corona Team Member
As Juraj already said, 3ds Max caches a lot of data on its side and we have no way of releasing that data.

Also, are you sure that it is really the geometry that is consuming most of your memory? In our experience, the biggest memory consumers are render elements and textures.
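For what it's worth, a partial cleanup that some users run from the MAXScript listener after a render is sketched below. It only releases what 3ds Max itself can free (cached bitmaps, the undo buffer and garbage-collectable script memory), not the internal scene copies mentioned above, so treat it as a rough stop-gap rather than a fix; behaviour may vary between 3ds Max versions.

Code (MAXScript):
-- rough post-render cleanup sketch; run from the MAXScript listener
freeSceneBitmaps()   -- drop bitmaps 3ds Max has cached for rendering
clearUndoBuffer()    -- discard undo history, which can hold large meshes
gc light:true        -- light garbage collection of the MAXScript heap
-- a full gc() is more thorough but can take noticeably longer on heavy scenes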

2017-03-27, 09:18:52
Reply #7

shiftman2012

Yes, I am sure geometry consumes a lot of memory. The Laubwerk plugin uses a lot of memory; in V-Ray I have to switch from static to dynamic geometry with a memory limit to be able to render a scene with Laubwerk.

Anyway, an option to set dynamic geometry with a memory usage limit is a must in Corona. I also cannot use Corona for big exterior projects, because Corona eats all my RAM and crashes. There are no big textures in my projects, and I use only a few render elements.

It would be nice to hear Ondra's opinion on this problem and whether it will be solved soon.

2017-03-27, 10:04:52
Reply #8

Frood

The Laubwerk plugin uses a lot of memory

Are you referring to the plugin itself here? So there is an essential difference between using 10 Laubwerk trees with their plugin and 10 Laubwerk trees converted to mesh and scattered/instanced?


Good Luck



2017-03-27, 14:08:50
Reply #9

shiftman2012

Obviously yes, there is a difference.
2017-03-30, 15:26:08
Reply #10

fraine7

I'm surprised that memory optimization (lower memory usage in 3ds Max; not out-of-core) is currently in last place in the most wanted features poll (1% of the vote?!). I have a constant battle to render final images at around 6K. Upgrading all machines from 32 GB to 64 GB would be the most immediate solution, but it is going to set me back a few thousand pounds, and from what Juraj mentioned it may not fix my issues completely.

Typical scenes include scattered grass, gravel and proxied trees (Corona Scatter), probably in the region of 25-30 million polys. It gets mentioned a lot and for good reason: V-Ray's use of RAM makes for painless rendering of similar scenes, and it's the only thing that bugs me about Corona.

Also, displacement is no longer an option for me unless I am rendering tiny preview images. At full resolution it's just not possible to have anything displaced, which really affects the quality of my output at times.

2017-03-30, 16:30:39
Reply #11

Juraj


Also, displacement is no longer an option for me unless I am rendering tiny preview images. At full resolution it's just not possible to have anything displaced, which really affects the quality of my output at times.

I know this is not a solution, just some of my workarounds:

- At high resolution I move to 3 px displacement detail. The displacement then looks rather ugly for the amount of precomputation time and memory it eats, so instead...
- ...I rather use TurboSmooth with isoline display (and render-time iterations), a manual displace baked into the geometry for the rough outline, and a detailed normal map for the rest; a rough sketch of the setup follows below. Most of the time it surprisingly looks even better than pure displacement and is very predictable, but it is hardly usable for big surfaces.
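A minimal MAXScript sketch of the TurboSmooth part of that workaround (the baked displacement and the normal map still have to be set up by hand; the property names below assume the standard TurboSmooth modifier):

Code (MAXScript):
-- add a render-time-only TurboSmooth to the selected objects
for obj in selection do
(
    local ts = TurboSmooth()
    ts.iterations = 0              -- keep the viewport mesh light
    ts.useRenderIterations = true  -- subdivide only at render time
    ts.renderIterations = 2
    ts.isolineDisplay = true       -- isoline display, as mentioned above
    addModifier obj ts
)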


2017-03-31, 01:45:35
Reply #12

fraine7

Thanks Juraj,

I have found myself using normal maps and the Displace Mesh modifier a lot more since the switch to Corona. As you mentioned, it's not a perfect solution, but better than nothing I guess. There is a scene I am currently working on that features a shot of a path leading up to a house; the camera is about 30 cm above the path and the shot is meant to show the 'stamped' pattern in the concrete. Fair to say my bank balance will be taking a hit for additional RAM very soon ;)

2017-05-19, 01:47:09
Reply #13

annkos

I have big issues with the RAM management too. I did 2-3 projects lately full of vegetation, and for the final-resolution renders I was praying that 3ds Max would not crash most of the time! Even when the used memory was about 28 GB of my 32 GB of available RAM, most of the scenes still crashed; I had to manually close all running programs, from Skype to the antivirus, to free a couple more GB of RAM in order to get the renders out. I am in love with Corona, but that RAM management pushes me back to V-Ray for big projects. I really hope you guys do something about this in version 1.7.

Edit: I just checked the 1.7 roadmap and there will be memory optimisations! Can't wait!

2017-05-19, 09:35:53
Reply #14

Frood

Obviously yes, there is a difference.

I had the opportunity to test this and could not reproduce it. I tried 300 trees, native Laubwerk vs. one mesh of the same tree scattered with Forest Pack, and got (almost) no difference in memory usage. Are you sure the Laubwerk objects were instanced? After making them unique, everything from parsing time to memory went crazy of course, ending at 84 GB of committed memory =:)
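If anyone wants to verify the instancing in their own scene, a quick check from the MAXScript listener could look roughly like this (InstanceMgr is the standard 3ds Max interface; the returned count includes the object itself):

Code (MAXScript):
-- report how many instances each selected object has
for obj in selection do
(
    local inst = #()
    local n = InstanceMgr.GetInstances obj &inst
    format "% -> % instance(s)\n" obj.name n
)
-- objects reporting only 1 are unique copies; they get parsed and stored
-- separately, which is where parsing time and memory explode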


Good Luck


