Author Topic: Backburner VS DR RAM usage

2020-10-22, 13:51:59

Gabiru

Hello everyone, I've been working on a very big architecture project of a district. It has around 7 buildings with a lot of shops and people.
I know I should use low-poly models for things that are relatively far from the camera so I can save some RAM.
The problem I'm having is really weird. If I send my renders through Backburner with DR activated, only 2 of our nodes can render the job because of RAM usage (those 2 have 48GB, the other 8 have 24GB).
However, if I deactivate DR and send the job through Backburner to one of the 24GB nodes, rendering on that node alone only uses about 14GB.
The same goes for the 48GB nodes: I noticed that with DR they go up to 60GB (spilling into virtual memory), while sending through Backburner with DR deactivated uses 15GB.
This makes no sense. I know 24GB isn't that much and we are about to upgrade, but there's still no logic in the job using much less memory through Backburner alone.
What should I do? The deadline is coming and I'm a bit worried about our renders. I feel like renders were much faster in VRay. I've noticed that the Corona denoiser and render elements use far too much RAM, but we need some render elements for post-production, and the denoiser to keep noise down, so I can't cut much on those things.
Best regards

2020-10-22, 14:59:05
Reply #1

dj_buckley

I've raised this multiple times with no real valid answer as yet.

DR Nodes use way more RAM than the scene actually needs.

I'm rendering right now. 

The scene on my host PC is using less than 50GB RAM, yet both nodes are using over 70GB.  The only program running on those nodes is Corona DR/Max.  On the host PC, I'm also running Skype, Chrome (with lots of tabs open), Spotify, Photoshop amongst others.

It happens on every scene without fail.  It's actually stopping one of the nodes from being productive in the overall render, even though it has enough RAM (64GB) to handle what the scene is actually using (45GB).

Here's my original post from 3 years ago, which I revisited a year ago and got no resolution - https://forum.corona-renderer.com/index.php?topic=18118.15


2020-10-22, 15:06:57
Reply #2

Gabiru

Wow, that's not what I wanted to hear...
Do you have any trick to counter this problem?
I've been using Corona for almost a year, and I really like the workflow leading up to renders. At render time, though, VRay was much more stable and we could estimate our times much better. I really hate how Corona behaves with renders; a good render time always feels like a matter of luck...
I hope someone can help us. If you find something that helps, please reply, and I'll do the same.
At this point I'd almost rather send each render to a single node (like we do with animation) and get all the renders done at the same time, because DR is actually a joke.

2020-10-22, 15:17:15
Reply #3

dj_buckley

Nope, nothing, no tricks to resolve it.

I don't know why it does it.  That's why I originally asked.  It seems nobody is able to answer why it's happening.

I've been using Corona pretty much since Alpha (switching from VRay), but I'm becoming increasingly frustrated with it.


2020-10-22, 15:27:39
Reply #4

Gabiru

Yeah, same here, especially with my boss complaining about render times all the time and me not knowing how to solve it.
I hope we get an answer soon. If I find any solution I'll reply here, but I hope a Corona team member comes in to help; I've already spent too much time searching for solutions and none seem to work.

2020-10-22, 17:09:56
Reply #5

maru (Corona Team)
@Gabiru and @dj_buckley
Have you contacted us about this via our helpdesk? If not, please do: https://coronarenderer.freshdesk.com/support/tickets/new
We do monitor the forum, but the support we can provide here is limited. We can offer better help, and more often, via the helpdesk.

First, we need all the basic information:

Which version of Corona is this?
Which version of 3ds Max?

What exact network rendering setup are you using? Corona's DR only? Backburner? Deadline? Other?

Is this happening only in one specific scene?

Is this happening in a scene with just 1 teapot? (RAM usage unexpectedly high?)

Are you using some 3rd party plugins? (Forest Pack? Multiscatter? Siger? Other?)

Can you send any problematic scene to us? (archived, with all assets)

Have you tried using the "conserve memory" option? (screenshot)
Following the RAM guides? https://coronarenderer.freshdesk.com/support/solutions/articles/12000023310

Please contact us with the above information using https://coronarenderer.freshdesk.com/support/tickets/new


Looking forward to your support tickets!

Marcin Miodek | chaos-corona.com
3D Support Team Lead - Corona | contact us

2020-10-22, 18:03:10
Reply #6

dj_buckley

Hi Maru

If you look at the other thread I posted a link to, all of this information was in that thread, and it was you I was discussing it with. The thread was almost 2 years long and then just went cold.

I haven't gone through the helpdesk because that previous thread looked like it was gaining traction. Another reason for keeping it on the forums was so other users could chip in if they had the same issue - which they did.  I've also opened 3 official tickets previously, all of which are now closed and only 1 was solved (by me).  I tend to have much more success on the forums.

But to answer the questions here for the benefit of others, and I'll copy and paste into an official support ticket -

Which Corona Version - every version that I can remember back to whatever version I was using in 2017 in that original thread

Which Max Version - I use Max 2018 (can't vouch for anyone else)

Which Network Rendering Method - Corona DR - fire up DR Server on the nodes, press render on the master workstation.

Is this happening in a specific scene - nope, every single scene I've created since first noticing the issue.

Is this happening in a scene with just 1 teapot - probably, but I've never tested; RAM usage would be so low you'd probably not notice. It appears cumulative: the heavier the scene, the bigger the discrepancy between master and nodes.

Are you using some 3rd Party Plugins - the only plugins I use are Forest Pack, Railclone (very rarely) and Multitexture/Floorgen - nothing unusual

Can you send any problematic scene to us - this would suggest you can't replicate the issue at your end? Is that right? Does this not happen if you open any random heavy scene you have access to and render using the method above?

Have you tried using the conserve memory option? - no, because render speed isn't a hit I can afford to take 90% of the time, especially after investing heavily in machines loaded with 128GB of RAM. I don't see how it would solve the discrepancy either; surely it would just reduce RAM usage by the same amount on each machine while the nodes still use more.

Following the RAM Guides - this isn't really about reducing how much RAM is used. The scenes don't use as much RAM as my machines have when I render them locally (I made sure of that by spending big on lots of RAM); it's only when I render using DR that they use more. So I shouldn't have to spend time on every scene reducing RAM usage when it should really just work.

This is the part I get frustrated with too: whenever I have issues, the solutions always feel like backwards steps or counterintuitive processes, or they go against what Corona's own documentation previously led me to believe - delete this, delete that, don't use this, don't use that, compress this, compress that - or the issues just don't get resolved and I have to deal with them.

The biggest consumer of RAM in most of my scenes is displacement, no doubt about it.  Enter 2.5D displacement - and we know how that turned out: https://forum.corona-renderer.com/index.php?topic=26782.0 - the solution in that thread was either to use the old displacement, so back to square one with the high RAM usage, or to subdivide the geometry until the artifacts became unnoticeable - not very practical on entire scenes, because that in turn increases the poly count and subsequently RAM usage, completely negating the RAM saving in the first place.

It's almost at the point where I need to factor 'troubleshooting' into every project timeline I issue - if only I could charge that time back to clients and have them understand why the renders might not be ready on time.  I genuinely can't remember the last project that ran trouble-free, whether the issue was displacement, RAM, DR, caustics, tonemapping or something else.

Finally, the other reason I like to stick to the forums is that if more people chime in with the same problem, it becomes harder to pass off as user error.  Other users may have the same issue and not report it, thinking it's something they're doing wrong.  Had I gone through the support system, nobody would know I have the same issue too.

Mini rant over :)
 

2020-10-22, 18:14:53
Reply #7

maru (Corona Team)
Sure, we can continue on the forum, no problem. If the discussion starts "freezing", please bump this forum thread and/or feel free to contact us at support@corona-renderer.com or me directly at miodek@corona-renderer.com

(started an internal report to keep this issue in one place - id=582842271)

2020-10-22, 18:24:58
Reply #8

dj_buckley

  • Active Users
  • **
  • Posts: 872
    • View Profile
no problem - just to add to this sentence: "Following the RAM Guides - this isn't really about reducing how much RAM is used. The scenes don't use as much RAM as my machines have when I render them locally (I made sure of that by spending big on lots of RAM); it's only when I render using DR that they use more. So I shouldn't have to spend time on every scene reducing RAM usage when it should really just work."

It's about the wildly differing RAM usage between host workstations and nodes when rendering through DR. Sometimes it's not a 'problem' on smaller scenes, but that doesn't mean the issue isn't there.  It only becomes a 'problem' when the RAM usage forces a node to become unproductive.

2020-10-22, 18:27:41
Reply #9

Gabiru

I agree with dj_buckley. If there are at least 2 people with this problem, and I believe there are a lot more, it should be in the public domain so everyone can find answers quickly while searching the web.
That said, I opened a ticket and shared my file; I hope to get an answer soon and get everything solved. The number of problems related to RAM and DR (together) is simply frustrating for everyone, especially when you work at a studio and your bosses pressure you... They keep saying we should go back to VRay because of the number of problems Corona has been causing. I know some of it may be problems with our workflow rather than the render engine itself, but the lack of documentation and solutions is just ridiculous for anyone who works under pressure and needs to solve things fast.
If I get a specific answer or a solution to my problem I'll post it here.

2020-10-23, 10:47:26
Reply #10

jms.lwly

Just to add a +1...

I've experienced this on many occasions: a scene renders fine locally on the workstation (e.g. using ±45GB RAM) but pushes upwards of 64GB on a DR node that is doing nothing else - the workstation always completes more passes than the DR machines. Using the 'conserve memory' function did reduce the load on both the workstation and the DR nodes, but as Dave says this compromises speed - and doesn't explain the difference in memory usage.

Switching over to Backburner, the renders on the nodes return to normal (50GB) memory usage.

I don't have a specific scene to share, it seems to be "just how it works"... 🤷🏼‍♂️


2020-10-23, 12:59:04
Reply #11

pokoy

  • Active Users
  • **
  • Posts: 1850
    • View Profile
Have seen this the other day as well. RAM usage on a DR machine was 15-20% higher than locally, which is strange considering the DR machine doesn't need any resources to display Max and framebuffers, and only needs RAM for what is actually rendering (no overhead from the material editor with unused maps etc). I don't recall this being the case with earlier versions, but I can't tell for sure.

Are you guys using displacement in your scenes? Maybe DR machines tessellate displacement with higher settings for some reason?

2020-10-23, 14:26:22
Reply #12

Gabiru

I have a theory, though I don't know if it's correct.
When you use Backburner, it transfers the file to the target machine, while DR nodes have to load it into RAM? It's the only thing that makes sense in my head.
If that's the case, I think it should be fixed somehow; it makes no sense that DR uses so much RAM.

2020-10-23, 16:39:02
Reply #13

dj_buckley

  • Active Users
  • **
  • Posts: 872
    • View Profile
Quote from: pokoy
"Have seen this the other day as well. RAM usage on a DR machine was 15-20% higher than local, which is strange considering the DR machine doesn't need any resources to display max and framebuffers and only needs RAM for what is actually rendering (no overhead from material editor with unused maps etc). I don't recall this being the case with earlier versions but can't tell for sure.

Are you guys using displacement in your scenes? Maybe DR machines tessellate displacement with higher settings for some reason?"

Can't remember the last time I didn't use displacement - never 2.5D displacement though, because it's not production-ready IMO

2020-10-24, 00:43:56
Reply #14

pokoy

  • Active Users
  • **
  • Posts: 1850
    • View Profile
I had to test it here - displacement doesn't seem to have much effect (2.5D in my case).

But it seems to depend on 'who' you ask. On a test scene I got these results while rendering:

DR Master
Task manager reports that 3dsmax.exe uses 31.5 GB

DR Slave
DR window reports 38.2 GB
however, Task manager reports that 3dsmax.exe uses 25.8 GB (30.5 GB initially but usage drops after a few minutes)

It looks like the DR window reports a higher RAM usage while Task Manager reports a much lower number for the actual render task. So maybe the DR window just reports the system's usage in general, NOT the render task alone. When looking at the entire system's usage, the DR window's number is even a bit higher than what the system reports (Task Manager shows 37.5 GB of overall RAM usage)...
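Putting the numbers above side by side supports that reading: the DR window's figure sits much closer to the system-wide Task Manager reading than to the 3dsmax.exe process itself. A minimal sketch, using only the values reported in this post:

```python
# Values reported above (in GB), as measured on the DR slave
dr_window_gb = 38.2   # RAM usage shown in the Corona DR window
process_gb = 25.8     # 3dsmax.exe usage per Task Manager (after settling)
system_gb = 37.5      # whole-system RAM usage per Task Manager

# How far is the DR window's figure from each Task Manager reading?
diff_process = abs(dr_window_gb - process_gb)  # ~12.4 GB apart
diff_system = abs(dr_window_gb - system_gb)    # ~0.7 GB apart

# The DR window's number lands within about 1 GB of total system usage,
# which is consistent with the guess that it reports system-wide RAM,
# not the render task alone.
closest = "system-wide" if diff_system < diff_process else "per-process"
print(closest)
```

If that's what the DR window is doing, its figure would include the OS and any other processes on the node, so comparing it directly against the master's per-process Task Manager number would overstate the real discrepancy.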