Hi folks,
When rendering an animation sequence, is it correct/expected that the other machines on the network have to close and reopen Max for every frame? I get this whether I render the sequence with DR or with plain old Backburner frames: for every frame, the render server seems to dump Max and reload the scene from scratch. This isn't the case on my main workstation, where the job was submitted from; there the scene only loads once.
This happens even when the HD cache has already been saved out and is just being read back, so the nodes should be doing pure frame rendering.
My network nodes are super slow as a result: they need a minute or so to load the file and get going on every single frame.
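In case anyone wants to confirm the same behaviour on their nodes, here's a rough sketch of how the reloads could be logged from a startup script on a render server, using pymxs. The callback events (filePostOpen, preRenderFrame) are standard Max ones, but the log path and the #bbTrace ID are just examples I made up, and I haven't verified that filePostOpen fires on a Backburner job load in every Max version:

```python
# Rough diagnostic, not a fix: timestamp scene loads vs. frame starts on a
# render node so the log shows whether the file really reloads per frame.
from pymxs import runtime as rt

LOG = r"C:\temp\bb_trace.log"  # hypothetical location; any writable path works

# MAXScript snippet that appends one timestamped line to the log file.
stamp = ('(local f = openFile @"{path}" mode:"a"; '
         'format "% {tag}\\n" localTime to:f; close f)')

# Fires every time a scene file finishes loading on this machine.
rt.callbacks.addScript(rt.Name("filePostOpen"),
                       stamp.format(path=LOG, tag="SCENE LOADED"),
                       id=rt.Name("bbTrace"))

# Fires at the start of every rendered frame.
rt.callbacks.addScript(rt.Name("preRenderFrame"),
                       stamp.format(path=LOG, tag="frame start"),
                       id=rt.Name("bbTrace"))
```

If the log on a node shows a "SCENE LOADED" line before every "frame start", but the workstation only shows one, that would confirm it's the job handling reloading the scene rather than the renderer itself.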
Any ideas?