Chaos Corona Forum

Chaos Corona for 3ds Max => [Max] General Discussion => Topic started by: eXeler0 on 2020-02-19, 23:34:44

Title: How much longer will using CPU only for rendering make sense?
Post by: eXeler0 on 2020-02-19, 23:34:44
Howdy
I love Corona; for me it kicks the hell out of V-Ray from a usability viewpoint, but what I find myself doing more and more is working in Unreal 4. After getting an RTX card, doing stuff in Unreal and getting instant feedback from ray-traced reflections and RT AO makes me want to stay in that workflow. All non-realtime renderers make you feel like you're wasting time instead of being productive.
So what do you guys think? How much longer will it make sense to have a CPU-based renderer, or even a hybrid one, as opposed to moving entirely to future versions of, say, Unreal, backed by ever more powerful GPUs with hardware ray tracing and ever more clever AI to save computational time? (Who knows, maybe we'll have hardware-accelerated, AI-assisted, real-time caustics in a couple of years.)
Title: Re: How much longer will using CPU only for rendering make sense?
Post by: Designerman77 on 2020-02-19, 23:49:20
I guess in 10 years, no-one will have to wait for test renders. :)
Title: Re: How much longer will using CPU only for rendering make sense?
Post by: sprayer on 2020-02-20, 10:50:13
You should understand how speculative the real-time ray tracing on RTX is and how low the quality actually is. There are other similar renderers, for example D5 Render, but it is also not fully real-time and the quality is low compared to real ray-traced renders. So the "real-time ray tracing" talk is still speculation.
It's the same as being able to paint a similar result in Photoshop faster than you could model every 3D asset for the same scene in real 3D. Some concept artists work that way: a basic 3D scene, with all the details painted in 2D. It's a faster workflow for cool pictures.
Title: Re: How much longer will using CPU only for rendering make sense?
Post by: eXeler0 on 2020-02-22, 00:11:00
Quote from: sprayer on 2020-02-20, 10:50:13
You should understand how speculative the real-time ray tracing on RTX is and how low the quality actually is. There are other similar renderers, for example D5 Render, but it is also not fully real-time and the quality is low compared to real ray-traced renders. So the "real-time ray tracing" talk is still speculation.
It's the same as being able to paint a similar result in Photoshop faster than you could model every 3D asset for the same scene in real 3D. Some concept artists work that way: a basic 3D scene, with all the details painted in 2D. It's a faster workflow for cool pictures.

Sure, the raw quality from RTX without denoising is terrible, but then you have hardware-accelerated AI that does its magic and ta-da, the image looks good. That will obviously only improve over time. So, hypothetically speaking, once you can't tell the difference between an AI-denoised RTX render and a 4-hour CPU render, this discussion will be highly relevant.
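
To make the "noisy RT image + AI denoise" step concrete, here is a minimal sketch of how a renderer typically hands a noisy beauty pass (plus optional albedo and normal guide channels) to an AI denoiser. It uses Intel's Open Image Denoise C++ API purely as an illustration; the NVIDIA AI denoiser mentioned later in the thread works along the same lines but through a different API, and the buffer pointers and image size are assumed to come from the renderer.

    #include <OpenImageDenoise/oidn.hpp>
    #include <iostream>

    // Assumed to be provided by the renderer: interleaved float RGB buffers of size width*height.
    void denoisePass(float* color, float* albedo, float* normal, float* output,
                     int width, int height)
    {
        // Create and commit a denoising device.
        oidn::DeviceRef device = oidn::newDevice();
        device.commit();

        // "RT" is OIDN's generic ray-tracing denoise filter.
        oidn::FilterRef filter = device.newFilter("RT");
        filter.setImage("color",  color,  oidn::Format::Float3, width, height); // noisy beauty
        filter.setImage("albedo", albedo, oidn::Format::Float3, width, height); // optional guide
        filter.setImage("normal", normal, oidn::Format::Float3, width, height); // optional guide
        filter.setImage("output", output, oidn::Format::Float3, width, height); // denoised result
        filter.set("hdr", true); // the beauty pass is HDR
        filter.commit();

        filter.execute();

        const char* errorMessage;
        if (device.getError(errorMessage) != oidn::Error::None)
            std::cerr << "Denoise error: " << errorMessage << std::endl;
    }

The guide channels (albedo, normal) are what let the denoiser preserve texture and edge detail at very low sample counts, which is exactly why a few noisy samples per pixel can end up looking presentable.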


Title: Re: How much longer will using CPU only for rendering make sense?
Post by: Juraj on 2020-02-22, 19:27:52
Devs are already experimenting with this in some way; Chaos Group just launched a public beta (or alpha?) of their Project Lavina (very similar to D5).

The closest thing to an interactive workflow right now would be 3ds Max 2020, Corona Interactive running within the viewport (not in the framebuffer or ActiveShade), and the NVIDIA-powered AI denoiser going at the same time.
Title: Re: How much longer will using CPU only for rendering make sense?
Post by: eXeler0 on 2020-03-08, 22:07:39
Quote from: Juraj on 2020-02-22, 19:27:52
Devs are already experimenting with this in some way; Chaos Group just launched a public beta (or alpha?) of their Project Lavina (very similar to D5).

The closest thing to an interactive workflow right now would be 3ds Max 2020, Corona Interactive running within the viewport (not in the framebuffer or ActiveShade), and the NVIDIA-powered AI denoiser going at the same time.

You are right.
The question then becomes: will Chaos Group differentiate between these products, with the Project Lavina tech becoming the next V-Ray RT while Corona continues to be CPU only?
It's difficult to know what development will go where. We remember a time when Corona was a competitor to V-Ray :-)
Anyway, these are all just tools, of course, and people use whatever does the job best from their perspective. I sure hope Corona will continue to evolve into something simple and beautiful while managing to remain relevant for a long time to come.
Title: Re: How much longer will using CPU only for rendering make sense?
Post by: maru on 2020-03-09, 11:57:15
It seems that CPU and GPU technologies are competing right now in a reasonable way. This drives the development of both. GPU results are great, but on the other hand both Intel and AMD are working on some exciting tech:
https://www.tomshardware.com/news/intel-lakefield-foveros-3d-chip-stack-hybrid-processor,40205.html
https://www.tomshardware.com/news/amd-announces-x3d-chip-stacking-and-infinity-architecture
Title: Re: How much longer will using CPU only for rendering make sense?
Post by: Nejc Kilar on 2020-03-09, 12:22:58
Quote from: maru on 2020-03-09, 11:57:15
It seems that CPU and GPU technologies are competing right now in a reasonable way. This drives the development of both. GPU results are great, but on the other hand both Intel and AMD are working on some exciting tech:
https://www.tomshardware.com/news/intel-lakefield-foveros-3d-chip-stack-hybrid-processor,40205.html
https://www.tomshardware.com/news/amd-announces-x3d-chip-stacking-and-infinity-architecture

I just watched AMD's FAD presentation and they touched on the topic of x3d chip stacking. Seems like a great innovative way to scale up. The same goes for the Foveros stacking, of course :)

Still, I can't help but wonder what Microsoft is going to do about it. You can pretty much stack 4+ GPUs together in one node, but Windows seems limited when it comes to thread handling. I know it's probably not going to be like that forever, but they seem somewhat slow in making Windows handle high-thread-count CPUs better. Yeah, 128 threads can be handled right now, but it kind of seems like x86 CPUs will plateau a bit early in terms of architectural improvements, and I personally suspect we could see even higher thread count CPUs to offset that, and then... hello Windows! :)
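
For context on the Windows side of this: since Windows 7 the scheduler splits logical processors into "processor groups" of at most 64, and (at least up to Windows 10) a thread only runs inside one group by default, so an application on a 128-thread machine has to spread its workers across groups itself. Below is a minimal sketch of what a render worker pool has to do, using the documented Win32 processor-group calls; the one-worker-per-logical-processor layout is just an assumption for illustration.

    #include <windows.h>
    #include <cstdio>
    #include <thread>
    #include <vector>

    int main()
    {
        WORD  groupCount   = GetActiveProcessorGroupCount();
        DWORD totalLogical = GetActiveProcessorCount(ALL_PROCESSOR_GROUPS);
        std::printf("%u processor group(s), %lu logical processors total\n",
                    groupCount, totalLogical);

        std::vector<std::thread> workers;
        for (WORD group = 0; group < groupCount; ++group)
        {
            DWORD logicalInGroup = GetActiveProcessorCount(group);
            // Mask covering all logical processors in this group.
            KAFFINITY mask = (logicalInGroup >= 64) ? ~KAFFINITY(0)
                                                    : ((KAFFINITY(1) << logicalInGroup) - 1);
            for (DWORD i = 0; i < logicalInGroup; ++i)
            {
                workers.emplace_back([group, mask]
                {
                    // Without this call, every thread stays in the group the process
                    // started in, leaving the other groups (i.e. half of a 128-thread CPU) idle.
                    GROUP_AFFINITY affinity = {};
                    affinity.Group = group;
                    affinity.Mask  = mask;
                    SetThreadGroupAffinity(GetCurrentThread(), &affinity, nullptr);
                    // ... render buckets / tiles would be processed here ...
                });
            }
        }
        for (auto& t : workers)
            t.join();
    }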

Still, fun times on both the CPU and GPU fronts!
Title: Re: How much longer will using CPU only for rendering make sense?
Post by: Jpjapers on 2020-03-09, 14:42:32
I saw a news article a few days ago about Intel having a breakthrough in CPU architecture, and it outpaced a Titan GPU by a factor of something like 5. I can't find the link.
Title: Re: How much longer will using CPU only for rendering make sense?
Post by: Nejc Kilar on 2020-03-09, 15:33:12
Quote from: Jpjapers on 2020-03-09, 14:42:32
I saw a news article a few days ago about Intel having a breakthrough in CPU architecture, and it outpaced a Titan GPU by a factor of something like 50. I can't find the link.

Can't find the link myself either, but afaik it was for AI purposes alone. Some company figured out a different algorithm or something of the sort :)
Title: Re: How much longer will using CPU only for rendering make sense?
Post by: Jpjapers on 2020-03-09, 16:02:53

Quote from: Nejc Kilar on 2020-03-09, 15:33:12
Can't find the link myself either, but afaik it was for AI purposes alone. Some company figured out a different algorithm or something of the sort :)

True, but do you not think that AI will have a hand in path tracing eventually?
Title: Re: How much longer will using CPU only for rendering make sense?
Post by: Nejc Kilar on 2020-03-09, 16:09:34
Good point!

I think there are some papers out there experimenting with it, and I know some are in development but I can't talk about them (not that I know much at this point anyway). I would guess it's potentially still a ways off.

Even so, GPUs are predominantly used for AI purposes because they excel at that. Whether this Intel / third-party collaboration is really that successful, we'll just have to wait and see. It might be useful for certain kinds of AI training and not for others, at least that's what I'm guessing.

But bottom line, GPUs already have RT cores, so that definitely helps them out. CPUs, if they get to stack chips first and Microsoft gets their crap together, seem fun as well :D

We really need to find that article lol :)
Title: Re: How much longer will using CPU only for rendering make sense?
Post by: Jpjapers on 2020-03-09, 16:31:29
Quote from: Nejc Kilar on 2020-03-09, 16:09:34
Good point!

I think there are some papers out there experimenting with it, and I know some are in development but I can't talk about them (not that I know much at this point anyway). I would guess it's potentially still a ways off.

Even so, GPUs are predominantly used for AI purposes because they excel at that. Whether this Intel / third-party collaboration is really that successful, we'll just have to wait and see. It might be useful for certain kinds of AI training and not for others, at least that's what I'm guessing.

But bottom line, GPUs already have RT cores, so that definitely helps them out. CPUs, if they get to stack chips first and Microsoft gets their crap together, seem fun as well :D

We really need to find that article lol :)

Found it

https://wccftech.com/intel-ai-breakthrough-destroys-8-nvidia-v100-gpu/
Title: Re: How much longer will using CPU only for rendering make sense?
Post by: Nejc Kilar on 2020-03-10, 12:58:04
Ah, I thought it was wccftech; I just couldn't find it there. Thanks for the link! I guess we'll just have to wait and see, ha! :D
Title: Re: How much longer will using CPU only for rendering make sense?
Post by: Juraj on 2020-03-10, 13:02:44
With both Intel getting into GPUs and AMD testing Infinity Fabric for absolutely everything, we might see some interesting solutions on a 2-3 year horizon.

My workstation is prepared to take 4 GPUs if that becomes a benefit in the future.
Title: Re: How much longer will using CPU only for rendering make sense?
Post by: Jpjapers on 2020-03-10, 13:21:30
Quote from: Juraj on 2020-03-10, 13:02:44
With both Intel getting into GPUs and AMD testing Infinity Fabric for absolutely everything, we might see some interesting solutions on a 2-3 year horizon.

My workstation is prepared to take 4 GPUs if that becomes a benefit in the future.

I think we might see more power in smaller form factors.