Author Topic: Displacement testing needed

2013-02-08, 14:04:20
Reply #15

lacilaci

  • Active Users
  • Posts: 749
Because Vray does smooth subdivision of its own, afaik. It's even possible to set the VRayDisplace modifier to do subdivision only, giving you a memory-efficient "turbosmooth" equivalent on the fly during rendering (memory efficient because it subdivides only the rendered portions, bucket by bucket)...
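For anyone wondering how the per-bucket trick saves memory, here is a rough C++ sketch of the idea. It is illustrative only, with made-up names (Tri, Bucket, tessellate), not actual Vray or Corona code.

Code:
// Rough sketch of per-bucket, on-demand subdivision (hypothetical code,
// not Vray/Corona source). Only triangles overlapping the current bucket
// get tessellated, and the refined geometry is discarded afterwards, so
// peak memory is bounded by one bucket's worth of micro-triangles.
#include <cstdio>
#include <vector>

struct Tri { float minX, maxX, minY, maxY; };   // screen-space bounds
struct Bucket { float x0, x1, y0, y1; };

// Placeholder for real refinement: one triangle -> 4^levels micro-tris.
static size_t tessellate(const Tri&, int levels) { return 1u << (2 * levels); }

int main() {
    std::vector<Tri> mesh = { {0, 50, 0, 50}, {40, 120, 10, 90},
                              {200, 300, 150, 250} };
    Bucket b = {0, 64, 0, 64};                  // one 64x64 render bucket
    size_t alive = 0;
    for (const Tri& t : mesh) {
        bool overlaps = t.maxX >= b.x0 && t.minX <= b.x1 &&
                        t.maxY >= b.y0 && t.minY <= b.y1;
        if (!overlaps) continue;                // never tessellated at all
        alive += tessellate(t, 4);              // shade/trace, then discard
    }
    std::printf("micro-triangles held for this bucket: %zu\n", alive);
    return 0;
}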

2013-02-08, 15:09:10
Reply #16

Ondra

  • Administrator
  • Posts: 9048
  • Turning coffee to features since 2009
This is because the normals are computed from the new geometry after displacement, which is derived from the old real geometry, so the old shading normals are discarded. This effectively disables shading normals when using a very small displacement height or a mostly black texture. A workaround is to subdivide the geometry before displacement. I tried to go further and do a spherical interpolation of the shading normals, but it gave unstable results. I'll revisit it at some point, but for now I've already spent too much time on displacement.

PS: it looks like Vray interpolates the normals correctly, but not the actual geometry.
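To make the difference concrete, here is a small C++ sketch of the two strategies (illustrative only, not Corona's or Vray's actual source): strategy A recomputes the normal from the displaced triangle itself, which is why smooth shading disappears at near-zero displacement heights; strategy B keeps interpolating the base mesh's stored shading normals, which is what the PS suggests Vray still does.

Code:
// Sketch of the two normal strategies being contrasted (illustrative only).
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x};
}
static Vec3 normalize(Vec3 v) {
    float l = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return {v.x / l, v.y / l, v.z / l};
}

// Strategy A: normal recomputed from the displaced triangle itself.
// With near-zero displacement this reduces shading to the faceted
// geometry, because the smooth vertex normals are no longer consulted.
static Vec3 geometricNormal(Vec3 p0, Vec3 p1, Vec3 p2) {
    return normalize(cross(sub(p1, p0), sub(p2, p0)));
}

// Strategy B: barycentric interpolation of the base mesh's shading
// normals. u, v, w are barycentric coordinates summing to 1.
static Vec3 interpolatedNormal(Vec3 n0, Vec3 n1, Vec3 n2,
                               float u, float v, float w) {
    Vec3 n = {u*n0.x + v*n1.x + w*n2.x,
              u*n0.y + v*n1.y + w*n2.y,
              u*n0.z + v*n1.z + w*n2.z};
    return normalize(n);
}

int main() {
    Vec3 p0{0, 0, 0}, p1{1, 0, 0}, p2{0, 1, 0};     // flat displaced triangle
    Vec3 n0 = normalize({-0.3f, -0.3f, 1.0f}),       // smooth base normals
         n1 = normalize({ 0.3f, -0.3f, 1.0f}),
         n2 = normalize({-0.3f,  0.3f, 1.0f});
    Vec3 g = geometricNormal(p0, p1, p2);
    Vec3 s = interpolatedNormal(n0, n1, n2, 1.0f/3, 1.0f/3, 1.0f/3);
    std::printf("geometric:    %.2f %.2f %.2f\n", g.x, g.y, g.z);
    std::printf("interpolated: %.2f %.2f %.2f\n", s.x, s.y, s.z);
    return 0;
}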

2013-02-08, 21:06:27
Reply #17

lacilaci

  • Active Users
  • Posts: 749
Here's a pretty nice result using a 32-bit .exr for displacement.
The mesh on the left is the original (about 1500-2000 polys); on the right it's Corona displacement with 1px projection and 500 max subdivs.

It's not perfect, mainly because the original mesh is too low-poly, but I wanted the displacement to do most of the work.

2013-02-11, 09:18:36
Reply #18

Javadevil

  • Active Users
  • Posts: 399

Here's a quick test with "Projected size in pixels": 3000 max polygons and 1px.

I found "Max Size units" chews up ram like crazy, and to get anything decent, it needs to be 0.2cm,  I haven't have a successful displacement from it yet, even with low memory Embree.

cheers


2013-02-11, 10:14:21
Reply #19

lacilaci

  • Active Users
  • Posts: 749
Well, using max size in units tessellates the whole geometry, so it will surely be heavier. It's useful for animations, so that you don't see the geometry changing on objects, but I guess if you want to use it, you need to do most of the subdivision with turbosmooth or similar first...
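Roughly, the two criteria differ like the following C++ sketch suggests (made-up numbers, not Corona's implementation): the world-units rule is view-independent, so every edge gets the same subdivision no matter how far from the camera it is, while the pixel rule refines less with distance, which is where the memory saving comes from.

Code:
// Hypothetical comparison of the two tessellation criteria.
#include <cstdio>

// World-space rule ("max size in units"): subdivide until every edge is
// shorter than a fixed length, regardless of where the camera is.
static int levelsWorld(float edgeLenUnits, float maxUnits) {
    int levels = 0;
    while (edgeLenUnits > maxUnits) { edgeLenUnits *= 0.5f; ++levels; }
    return levels;
}

// Screen-space rule ("projected size in pixels"): subdivide until the
// edge's projection is under the pixel budget; distant edges need far
// fewer levels.
static int levelsScreen(float edgeLenUnits, float distToCamera,
                        float pixelsPerUnitAtUnitDist, float maxPixels) {
    float projected = edgeLenUnits * pixelsPerUnitAtUnitDist / distToCamera;
    int levels = 0;
    while (projected > maxPixels) { projected *= 0.5f; ++levels; }
    return levels;
}

int main() {
    float edge = 10.0f;                      // edge length in scene units
    for (float dist : {1.0f, 10.0f, 100.0f})
        std::printf("dist %6.1f: world rule levels=%d, pixel rule levels=%d\n",
                    dist, levelsWorld(edge, 0.2f),
                    levelsScreen(edge, dist, 100.0f, 1.0f));
    return 0;
}

The flip side is exactly the animation point above: the pixel rule changes the tessellation whenever the camera moves, while the units rule keeps it stable from frame to frame.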

2013-02-18, 10:38:19
Reply #20

Javadevil

  • Active Users
  • Posts: 399
Hi lacilaci, 
Yep, I realise that. I still don't think max size in units is memory efficient; it chews up my 32 GB very quickly. Other renderers cope a lot better.

2013-02-18, 15:04:33
Reply #21

lacilaci

  • Active Users
  • Posts: 749
Well, that's possible, I haven't really compared it with other renderers...
I'm curious though.
Is it possible to have Corona dynamically load and unload geometry during rendering? I guess it wouldn't work in progressive mode, but maybe in bucket mode?
You know, set the amount of RAM you want to use and let Corona do the magic :D This would make it possible to render scenes heavier than would fit into RAM, at the cost of performance.
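Something like the following sketch is presumably what that would look like: a fixed byte budget with least-recently-used eviction, so geometry is unloaded and re-tessellated as buckets need it. Entirely hypothetical (GeometryCache and all the numbers are made up, this is not a Corona feature); it just shows why the scheme trades RAM for re-load time.

Code:
// Hypothetical fixed-budget geometry cache with LRU eviction.
#include <cstdio>
#include <list>
#include <string>
#include <unordered_map>

class GeometryCache {
    size_t budget_, used_ = 0;
    std::list<std::string> lru_;                     // front = most recent
    struct Entry { size_t bytes; std::list<std::string>::iterator it; };
    std::unordered_map<std::string, Entry> entries_;

public:
    explicit GeometryCache(size_t budgetBytes) : budget_(budgetBytes) {}

    // Request an object's tessellated geometry: load it if absent and
    // evict the least recently used objects until it fits the budget.
    void touch(const std::string& object, size_t bytes) {
        auto found = entries_.find(object);
        if (found != entries_.end()) {               // cache hit: mark recent
            lru_.splice(lru_.begin(), lru_, found->second.it);
            return;
        }
        while (used_ + bytes > budget_ && !lru_.empty()) {
            const std::string& victim = lru_.back(); // evict to make room
            used_ -= entries_[victim].bytes;
            std::printf("unload %s\n", victim.c_str());
            entries_.erase(victim);
            lru_.pop_back();
        }
        std::printf("load   %s\n", object.c_str()); // (re)tessellate here
        lru_.push_front(object);
        entries_[object] = {bytes, lru_.begin()};
        used_ += bytes;
    }
};

int main() {
    GeometryCache cache(100);       // pretend budget: 100 "bytes"
    cache.touch("terrain", 60);     // bucket 1 needs the terrain
    cache.touch("statue", 50);      // evicts terrain to fit the statue
    cache.touch("terrain", 60);     // bucket 2 reloads it: the perf cost
    return 0;
}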