Author Topic: Memory usage does not build up during "parsing scene"

2021-12-06, 15:49:00

Image Complete

Hello,

I am fairly new to distributed rendering, so I am not very confident with it.
I am trying to use it, but something does not seem to be working correctly.
Although everything looks fine at the beginning, memory usage does not build up during the "parsing scene" phase. It climbs to 15-20 GB and then stops. After approx. 5 minutes, memory usage drops and DrServer appears to restart. It then climbs back up to around 20 GB, and the cycle repeats over and over.

I am using Corona 6 Hotfix 2 with 3ds Max 2020.
Any help?

Thanks

2021-12-13, 10:47:00
Reply #1

Image Complete

Any ideas, someone?

I really need DR

Cheers

2021-12-13, 11:15:54
Reply #2

muoto

I think DR uses the memory it needs according to the scene you send. How's the CPU usage on the slaves? If they are at 100%, it should be OK.
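
If you want to check that without standing in front of the slave, here is a minimal sketch (an assumption on my side: Python with the psutil package installed on the slave, and DrServer.exe / 3dsmax.exe as the relevant process names - adjust to your setup):

Code:
# cpu_check.py - sample CPU usage of the DR-related processes (process names are assumptions)
import psutil

WATCHED = {"DrServer.exe", "3dsmax.exe"}  # adjust to the names on your slave

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] in WATCHED:
        # cpu_percent(interval=1.0) samples usage over one second;
        # values can exceed 100% on multi-core machines
        usage = proc.cpu_percent(interval=1.0)
        print(f"{proc.info['name']} (pid {proc.pid}): {usage:.1f}% CPU")

Near 0% while the node is still in "parsing scene" is normal, though - the CPUs only load up once rendering actually starts.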

2021-12-13, 11:25:08
Reply #3

Frood

Any help?

Max.log and DrLog.txt of a (failing) slave would be useful. Additionally: are you able to open/render the scene on that DR slave without issues? Does it work with any other, maybe smaller test scene?
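
If it helps, here is a small sketch to grab the tails of both logs from a slave. The paths below are assumptions (the Max.log location depends on the 3ds Max version and language, the DrLog.txt location on where DrServer was installed), so adjust them to your machine:

Code:
# collect_logs.py - print the last lines of Max.log and DrLog.txt (paths are assumptions)
from pathlib import Path

LOGS = [
    # assumed default for 3ds Max 2020 (English); varies by version/language
    Path.home() / "AppData/Local/Autodesk/3dsMax/2020 - 64bit/ENU/Network/Max.log",
    # assumed location next to the DrServer installation; adjust as needed
    Path("C:/Program Files/Corona/DrServer/DrLog.txt"),
]

for log in LOGS:
    print(f"--- {log} ---")
    if log.exists():
        print("\n".join(log.read_text(errors="replace").splitlines()[-20:]))
    else:
        print("not found - adjust the path for this machine")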

Btw: you should have posted here to get a quick answer from support:

https://forum.corona-renderer.com/index.php?board=29.0


Good Luck



2021-12-14, 10:11:25
Reply #4

Image Complete

I think DR uses the memory it needs according to the scene you send. How's the CPU usage on the slaves? If they are at 100%, it should be OK.

The scene is heavy; it needs 90 GB of RAM. The slave machine is not under any load at all. Essentially, it doesn't do anything.

2021-12-14, 10:20:29
Reply #5

Image Complete

Any help?

Max.log and DrLog.txt of a (failing) slave would be useful. Additionally: are you able to open/render the scene on that DR slave without issues? Does it work with any other, maybe smaller test scene?

Btw: you should have posted here to get a quick answer from support:

https://forum.corona-renderer.com/index.php?board=29.0


Good Luck

Funny, I didn't know about the "I need help!" part of the forum!

The scene needs approx. 90 GB of RAM, which is more than the slave machine has (64 GB). The odd thing is that it never reaches the full 64 GB: memory usage goes up to about 20 GB and back down, again and again. I have seen the same slave machine on the same LAN max out its memory, and as a result it became very slow at delivering passes. But that's not what is happening now. Have you heard of anything like this before?

Thank you!

2021-12-14, 11:23:31
Reply #6

Frood

I didn't ask those questions (which you didn't answer) for no reason :)


Good Luck




2021-12-14, 13:00:45
Reply #7

Image Complete

I didn't ask those questions (which you didn't answer) for no reason :)


Good Luck

I forgot to answer the whole question. My apologies.
As I said, the slave machine has 64 GB of RAM and the scene needs 90 GB, so no, I can't render the scene on the slave.
I have used the same LAN and DR setup with smaller scenes in the past and it worked. I don't know what would happen if I tried now, although it should still work, as nothing has changed since I last used it successfully with smaller scenes.

I don't have the logs available at the moment, but when I checked them everything looked alright. I can run DR again and save them this time if that would help.

2021-12-14, 14:00:55
Reply #8

Frood

So no, I can't render the scene on the slave.

There are scenes which need 90 GB on a 128 GB node but render fine (and need less memory) on a computer with significantly less RAM. But my question was more about missing plugins or loading/parsing errors, hence the logs.

Does the scene use 90 GB to open, or to render? What happens if you open the scene on a slave interactively? If you are not able to load it on a slave, due to lack of memory or a missing plugin for example, you are of course also not able to render it there via DR, simple as that.
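
If you want actual numbers for that test, here is a quick sketch for the slave (same assumption as before: Python with psutil, and 3dsmax.exe as the process name). Run it after opening the scene interactively, and it prints the current and peak working set of 3ds Max:

Code:
# peak_mem.py - current and peak working set of 3ds Max (Windows-only fields; assumes psutil)
import psutil

GB = 1024 ** 3

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "3dsmax.exe":  # assumption: adjust if your process name differs
        mem = proc.memory_info()
        # on Windows, memory_info() includes wset/peak_wset (working set) fields
        print(f"pid {proc.pid}: working set {mem.wset / GB:.1f} GB, "
              f"peak {mem.peak_wset / GB:.1f} GB")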


Good Luck





2021-12-14, 15:44:33
Reply #9

Image Complete

So no, I can't render the scene on the slave.

There are scenes which need 90 GB on a 128 GB node but render fine (and need less memory) on a computer with significantly less RAM. But my question was more about missing plugins or loading/parsing errors, hence the logs.

Does the scene use 90 GB to open, or to render? What happens if you open the scene on a slave interactively? If you are not able to load it on a slave, due to lack of memory or a missing plugin for example, you are of course also not able to render it there via DR, simple as that.


Good Luck

I am not sure I understood your statement that some computers might need less memory than others for the same scene. It sounds very weird to me. Or maybe I just don't understand your point. Never mind.

The logs do not show any errors. All the necessary plugins are there. The RAM usage goes up and down again and again during the "parsing scene" phase and never gets past it.

The scene needs 55 GB to open and 90 GB to render. I am pretty sure I can open the scene on the slave, but I will not be able to render it. Or I will, but very slowly, because Corona will start using the SSD instead of RAM for the remaining 26 GB of data.

What's unusual, at least to me, is that during the "parsing scene" phase it doesn't reach the slave's full 64 GB. It only goes up to around 20 GB max. When you try to render a scene that needs more RAM than you actually have, Corona normally fills up your RAM first and then spills over to the SSD. It doesn't just load 20 GB of RAM and then stop.

2021-12-15, 16:59:38
Reply #10

Frood

It sounds very weird to me.

Windows memory management, 3ds Max memory usage and garbage collection, 3ds Max/plugin requests for RAM leading to absurd commit values twice as big as the actually needed memory - all of that is exactly that weird :)

What's unusual, at least to me, is that during the "parsing scene" phase it doesn't reach the slave's full 64 GB. It only goes up to around 20 GB max.

Hard to say without watching a live task manager. If the process just sits there at 20 GB and restarts after a while, it could be a (too) large memory allocation requested at once, or maybe a blocked task. You can try this if you are still interested. Anyway, I fear you will never be able to render that specific scene at any reasonable resolution with 64 GB.
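
If you cannot watch the slave directly, here is a minimal polling sketch (assumptions as before: Python with psutil on the slave, and DrServer.exe as the process name) that logs working set and private bytes over time, so you can see whether the process really stalls at ~20 GB and exactly when it restarts:

Code:
# watch_drserver.py - log DrServer memory over time (Windows-only fields; assumes psutil)
import time
import psutil

GB = 1024 ** 3
NAME = "DrServer.exe"  # assumption: adjust to the actual process name

while True:
    stamp = time.strftime("%H:%M:%S")
    procs = [p for p in psutil.process_iter(["name"]) if p.info["name"] == NAME]
    if not procs:
        print(stamp, "no process found - restarting?")
    for p in procs:
        try:
            mem = p.memory_info()
            # rss = working set; 'private' approximates the commit charge on Windows
            print(stamp, f"pid {p.pid}: {mem.rss / GB:.1f} GB working set, "
                         f"{mem.private / GB:.1f} GB private (commit)")
        except psutil.NoSuchProcess:
            pass  # process vanished between listing and reading
    time.sleep(10)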


Good Luck


