Author Topic: Vray Hybrid rendering  (Read 15826 times)

2017-06-29, 20:54:54

Jadefox

  • Primary Certified Instructor
  • Active Users
  • ***
  • Posts: 251
    • View Profile
    • Renderlab
Hi Guys

I just watched a video on this new feature promoted by Vray.
Is a hybrid CPU-GPU mode planned for Corona? It really seems to utilize
the whole system and greatly improves render times.

Thanks

2017-06-30, 07:12:02
Reply #1

xwilder

  • Active Users
  • **
  • Posts: 26
    • View Profile
    • My Portfolio
The V-Ray guy (Blago) said that hybrid rendering in V-Ray means running the V-Ray RT code on both the GPU and the CPU. The CPU can run the V-Ray RT code well, but running the V-Ray Adv code on a graphics card is very slow and ineffective (for now). And because Corona doesn't have a GPU renderer, it won't have hybrid rendering (for now, I hope).
But that's just my amateur point of view; maybe I'm wrong.

2017-06-30, 11:29:49
Reply #2

Ryuu

  • Former Corona Team Member
  • Active Users
  • **
  • Posts: 654
  • Michal
    • View Profile
The "Vray guy" is Vlado, as in Vladimir Koylazov :)

If I understood it correctly, V-Ray RT was originally written for the GPU and the same code was simply recompiled to run on the CPU. While it may seem trivial to do the same thing the other way around and just recompile CPU code for the GPU, it is not that simple. Without going into details, let's just say that we would have to pour considerable resources (time and money) into implementing and maintaining a usable GPU renderer. We're currently preparing a CPU vs GPU article where this issue will be explained in a bit more detail.
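
To illustrate the kind of problem involved (this is purely my own sketch for the forum, not Corona or V-Ray code): idiomatic CPU renderer code tends to rely on virtual dispatch, recursion and heap-allocated containers, all of which are restricted or expensive inside GPU kernels, so such code has to be redesigned rather than merely recompiled. A hypothetical CPU-style fragment in C++ might look like this:

// Illustration only: hypothetical CPU-style path tracer fragment, not Corona
// or V-Ray source. Virtual dispatch, recursion and std containers are fine
// on the CPU but map poorly to GPU kernels (no std containers in device code,
// limited stack depth, divergent branching), which is why a simple recompile
// is not enough.
#include <cstdio>
#include <memory>
#include <vector>

struct Ray   { float org[3]; float dir[3]; };
struct Color { float r = 0, g = 0, b = 0; };

struct Material {                                    // virtual dispatch
    virtual Color shade(const Ray& r) const = 0;
    virtual ~Material() = default;
};
struct Diffuse : Material {
    Color shade(const Ray&) const override { return {0.7f, 0.7f, 0.7f}; }
};

struct Scene {
    std::vector<std::unique_ptr<Material>> materials;   // heap allocation
    const Material* intersect(const Ray&) const {       // stand-in hit test
        return materials.empty() ? nullptr : materials.front().get();
    }
};

Color trace(const Scene& s, const Ray& ray, int depth) { // recursion
    if (depth > 8) return {};
    const Material* m = s.intersect(ray);
    return m ? m->shade(ray) : Color{};
}

int main() {
    Scene scene;
    scene.materials.push_back(std::make_unique<Diffuse>());
    Color c = trace(scene, Ray{{0, 0, 0}, {0, 0, 1}}, 0);
    std::printf("%.2f %.2f %.2f\n", c.r, c.g, c.b);
    return 0;
}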

On the other hand, we never said a definite no to using GPUs. While we don't see any benefit in moving our rendering code to the GPU at this moment, there are a few GPU-related things that have been on our internal wishlist for quite some time. An example would be implementing post-processing effects or denoising on the GPU, since these are the tasks a GPU is best suited for. We just can't make any promises on when we might have the time to try and implement it.
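
As a rough illustration of why post-processing fits the GPU so well (again, just my own sketch, not anything from the Corona code base): a tone-mapping pass is a pure per-pixel operation with no dependencies between pixels, so on a GPU it becomes one thread per pixel with no synchronization at all. In CPU-style C++ it could look like this:

// Illustration only: a simple Reinhard-style tone-mapping pass written as a
// plain CPU loop. Every pixel depends only on its own input value, so the
// loop body translates directly into a GPU kernel with one thread per pixel,
// which is what makes post-processing (and similarly denoising) such a
// natural GPU workload.
#include <cstddef>
#include <vector>

struct Pixel { float r, g, b; };

void tonemap(std::vector<Pixel>& image, float exposure) {
    for (std::size_t i = 0; i < image.size(); ++i) {  // on a GPU: i = thread id
        Pixel& p = image[i];
        p.r = (p.r * exposure) / (1.0f + p.r * exposure);
        p.g = (p.g * exposure) / (1.0f + p.g * exposure);
        p.b = (p.b * exposure) / (1.0f + p.b * exposure);
    }
}

int main() {
    std::vector<Pixel> img(4, Pixel{1.5f, 0.5f, 4.0f});  // tiny test buffer
    tonemap(img, 1.0f);
    return 0;
}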

2017-07-03, 05:39:20
Reply #3

Christa Noel

  • Active Users
  • **
  • Posts: 911
  • God bless us everyone
    • View Profile
    • dionch.studio
..We're currently preparing a CPU vs GPU article where this issue will be explained in a bit more detail..
Nice idea, that will be a very interesting article. I really appreciate it.
These days we (users) see many promising GPU renderers, with claims and demos suggesting they're better than CPU renderers, but on the other side this great renderer team says there is no benefit in GPU power for our renderer.
So we're simply waiting for that article to clear up our confusion.

We just can't make any promises on when we might have the time to try and implement it.
Just promise us that article, that's more than enough ;)

2017-07-03, 18:24:13
Reply #4

Ondra

  • Administrator
  • Active Users
  • *****
  • Posts: 9048
  • Turning coffee to features since 2009
    • View Profile
I see it is that time of year when people copy the changelog of one renderer into the wishlist of another ;). Our position on the GPU is still the same - maybe one day, but it's not our current task.
Rendering is magic. How to get minidumps for crashed/frozen 3ds Max | Sorry for short replies, brief responses = more time to develop Corona ;)

2017-07-06, 19:26:50
Reply #5

tedvitale.cg

  • Users
  • *
  • Posts: 1
    • View Profile
The "Vray guy" is Vlado, as in Vladimir Koylazov :)

Well, the other V-Ray guy is Blago, or Blagovest Taskov, the lead developer of the V-Ray GPU team. But yes, essentially they ported all the GPU code over to the CPU. I still haven't played much with it, but I know it was a massive pain in the ass to do.
« Last Edit: 2017-07-06, 19:30:50 by tedvitale.cg »

2017-07-06, 20:18:47
Reply #6

Ludvik Koutny

  • VIP
  • Active Users
  • ***
  • Posts: 2557
  • Just another user
    • View Profile
    • My Portfolio
From my experience, V-Ray RT GPU is significantly slower than V-Ray Adv running on a CPU of similar price and release date to the GPU. Even Bertrand feels that way: http://bertrand-benoit.com/blog/not-just-another-coffee-machine/

So you may arrive at a grotesque scenario where you, for example, render a scene on a GTX 1080 Ti + Ryzen 1800X with V-Ray GPU, and then realize it renders much faster with V-Ray Adv on the CPU only :)

2017-07-10, 09:24:38
Reply #7

Ryuu

  • Former Corona Team Member
  • Active Users
  • **
  • Posts: 654
  • Michal
    • View Profile
Well, the other V-Ray guy is Blago, or Blagovest Taskov, the V-Ray GPU Team lead developer.

OK, sorry about that. I didn't know that one yet :)

2017-08-17, 03:13:31
Reply #8

dfcorona

  • Active Users
  • **
  • Posts: 342
    • View Profile
From my experience, V-Ray RT GPU is significantly slower than V-Ray Adv running on a CPU of similar price and release date to the GPU. Even Bertrand feels that way: http://bertrand-benoit.com/blog/not-just-another-coffee-machine/

So you may arrive at a grotesque scenario where you, for example, render a scene on a GTX 1080 Ti + Ryzen 1800X with V-Ray GPU, and then realize it renders much faster with V-Ray Adv on the CPU only :)

I would love to see some of your test results; I'm not being a smart ass, I just would like to see the other side of the comparison. I have been using GPU renderers lately and I find them miles faster than the CPU. I have a 1080 Ti in one system, which is $800, and I can't imagine there is a CPU out there for that price that can match it. I am about to get a Threadripper 1950X, so I will see how that performs against the 1080 Ti, but I still don't expect it to beat it, and it's a $1,000 CPU. If it does beat it, great! But the thing is that even if it can beat it by say even 5%, unlike the CPU I can put 3-4 more 1080 Tis in the one system; the cost of building CPU systems to compare to that is astronomical. But I really am interested in your findings and that CPU vs GPU article.

Also, Bertrand says an issue with his system might be why the GPU was a lot slower; I have a feeling that was exactly the problem.

2017-08-17, 08:20:36
Reply #9

Ryuu

  • Former Corona Team Member
  • Active Users
  • **
  • Posts: 654
  • Michal
    • View Profile

I'll just leave this in here :)

https://software.intel.com/en-us/articles/how-embree-delivers-uncompromising-photorealism
2017-08-18, 09:36:39
Reply #10

dfcorona

  • Active Users
  • **
  • Posts: 342
    • View Profile
I'll just leave this in here :)

https://software.intel.com/en-us/articles/how-embree-delivers-uncompromising-photorealism

LOL... did you really just post a marketing piece by Intel telling you that their products were better for rendering? This tells us nothing; it's like Nvidia telling you to buy their cards because they're better at rendering. There are a lot of holes in their story too. I love how they compare their 6900K on Corona with... "two GPU" renderers... what two GPU renderers? The slowest and worst-optimized ones they could find. That scene should have taken about a minute for a good GPU renderer with a GTX 1080. Worthless Intel marketing.

2017-08-18, 10:00:47
Reply #11

Ryuu

  • Former Corona Team Member
  • Active Users
  • **
  • Posts: 654
  • Michal
    • View Profile
Well, I wouldn't necessarily call that biased marketing crap. I know a little bit about how those scenes were rendered and measured, since I did the renderings myself on my home computer ;)

Since I would love to do some GPU programming (and I've been pushing GPU for post-processing for at least a year now), you can believe me that I am by no means biased towards the CPU being better.

While I can't disclose which GPU renderers were used, I can at least say that one of them has attracted a very big crowd of GPU rendering fanboys and has been almost universally praised as the future of GPU rendering.

2017-08-18, 10:16:47
Reply #12

dfcorona

  • Active Users
  • **
  • Posts: 342
    • View Profile
Well, I wouldn't necessarily call that biased marketing crap. I know a little bit about how those scenes were rendered and measured, since I did the renderings myself on my home computer ;)

Since I would love to do some GPU programming (and I've been pushing GPU for post-processing for at least a year now), you can believe me that I am by no means biased towards the CPU being better.

While I can't disclose which GPU renderers were used, I can at least say that one of them has attracted a very big crowd of GPU rendering fanboys and has been almost universally praised as the future of GPU rendering.

I'm pretty sure I know which GPU renderer you're talking about, since I own and use it also. I can tell you that with that GPU renderer there is no way it would take 5 minutes for that render, especially with a GTX 1080, unless someone deliberately set it up to fail.

GPU for post-processing would be great; it is super fast at that also.

2017-08-18, 10:38:42
Reply #13

lacilaci

  • Active Users
  • **
  • Posts: 749
    • View Profile
Well, I wouldn't necessarily call that biased marketing crap. I know a little bit about how those scenes were rendered and measured, since I did the renderings myself on my home computer ;)

Since I would love to do some GPU programming (and I've been pushing GPU for post-processing for at least a year now), you can believe me that I am by no means biased towards the CPU being better.

While I can't disclose which GPU renderers were used, I can at least say that one of them has attracted a very big crowd of GPU rendering fanboys and has been almost universally praised as the future of GPU rendering.

I'm pretty sure I know which GPU renderer you're talking about, since I own and use it also. I can tell you that with that GPU renderer there is no way it would take 5 minutes for that render, especially with a GTX 1080, unless someone deliberately set it up to fail.

GPU for post-processing would be great; it is super fast at that also.

Well, since you own Octane and Corona as well... maybe you could do a more "fair" comparison on a production scene, with 1:1 results.

2017-08-18, 14:29:24
Reply #14

romullus

  • Global Moderator
  • Active Users
  • ****
  • Posts: 9284
  • Let's move this topic, shall we?
    • View Profile
    • My Models
While I can't disclose which GPU renderers were used, I can at least say that one of them has attracted a very big crowd of GPU rendering fanboys and has been almost universally praised as the future of GPU rendering.

Must be FurryBall then, no? :]
I'm not a Corona Team member. Everything I say is my personal opinion only.
My Models | My Videos | My Pictures