Author Topic: How much longer will using CPU only for rendering make sense?  (Read 6338 times)

2020-02-19, 23:34:44

eXeler0

  • Active Users
  • Posts: 7
Howdy
I love Corona, for me it kicks the hell out of V-Ray from a usability standpoint, but I find myself doing more and more of my work in Unreal Engine 4. Since getting an RTX card, the instant feedback from ray-traced reflections and RT ambient occlusion in Unreal makes me want to stay in that workflow. Every non-realtime renderer now makes me feel like I'm wasting time instead of being productive.
So what do you guys think? How much longer will it make sense to have a CPU-based renderer, or even a hybrid one, as opposed to moving entirely to future versions of, say, Unreal, backed by ever more powerful GPUs with hardware ray tracing and increasingly clever AI to save computation time? (Who knows, maybe we'll have hardware-accelerated, AI-assisted, realtime caustics in a couple of years.)

2020-02-19, 23:49:20
Reply #1

Designerman77

  • Active Users
  • Posts: 507
I guess in 10 years, no-one will have to wait for test renders. :)

2020-02-20, 10:50:13
Reply #2

sprayer

  • Active Users
  • Posts: 794
You should understand how speculative "real-time ray tracing" on RTX is, and how low the quality actually is. There are other similar renderers, for example D5 Render, but it is also not fully real-time and its quality is low compared to real ray-tracing renderers. So the "real-time ray tracing" story is still mostly speculation.
It's like being able to paint a similar result in Photoshop faster than you could model every asset for the same scene in 3D. Some concept artists work that way: they build a basic 3D scene and paint all the details in 2D, because it's a faster workflow for good-looking pictures.

2020-02-22, 00:11:00
Reply #3

eXeler0

  • Active Users
  • Posts: 7
Quote from: sprayer on 2020-02-20, 10:50:13
You should understand how speculative "real-time ray tracing" on RTX is, and how low the quality actually is. There are other similar renderers, for example D5 Render, but it is also not fully real-time and its quality is low compared to real ray-tracing renderers. So the "real-time ray tracing" story is still mostly speculation.
It's like being able to paint a similar result in Photoshop faster than you could model every asset for the same scene in 3D. Some concept artists work that way: they build a basic 3D scene and paint all the details in 2D, because it's a faster workflow for good-looking pictures.

Sure, the raw quality from RTX without denoising is terrible, but then hardware-accelerated AI does its magic and ta-daa, the image looks good. That will obviously only improve over time. So, hypothetically speaking, once you can't tell the difference between an AI-denoised RTX render and a 4-hour CPU render, this discussion becomes highly relevant.
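
To make that concrete, here is a minimal sketch of such an AI denoising pass using Intel's Open Image Denoise 1.x C API, the same family of AI denoisers that renderers like Corona integrate. This is purely an illustration, not Corona's actual implementation; the resolution and the zero-filled buffers are placeholders.

Code:
#include <OpenImageDenoise/oidn.h>
#include <stdbool.h>
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    const size_t width = 1920, height = 1080;

    /* Noisy HDR beauty pass, 3 floats per pixel (placeholder data). */
    float* color  = calloc(width * height * 3, sizeof(float));
    float* output = calloc(width * height * 3, sizeof(float));

    /* Create and commit a denoising device. */
    OIDNDevice device = oidnNewDevice(OIDN_DEVICE_TYPE_DEFAULT);
    oidnCommitDevice(device);

    /* "RT" is the generic ray-tracing denoising filter. */
    OIDNFilter filter = oidnNewFilter(device, "RT");
    oidnSetSharedFilterImage(filter, "color", color,
                             OIDN_FORMAT_FLOAT3, width, height, 0, 0, 0);
    oidnSetSharedFilterImage(filter, "output", output,
                             OIDN_FORMAT_FLOAT3, width, height, 0, 0, 0);
    oidnSetFilter1b(filter, "hdr", true); /* input is HDR radiance */
    oidnCommitFilter(filter);

    oidnExecuteFilter(filter); /* output now holds the denoised image */

    const char* message;
    if (oidnGetDeviceError(device, &message) != OIDN_ERROR_NONE)
        fprintf(stderr, "OIDN error: %s\n", message);

    oidnReleaseFilter(filter);
    oidnReleaseDevice(device);
    free(color);
    free(output);
    return 0;
}

Feeding the filter additional "albedo" and "normal" AOVs improves edge and texture preservation, which is roughly what the in-renderer integrations do.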



2020-02-22, 19:27:52
Reply #4

Juraj

  • Active Users
  • Posts: 4765
Devs are already experimenting with this in some ways; Chaos Group just launched a public beta (or alpha?) of their Project Lavina (very similar to D5).

The closest thing to an interactive workflow right now would be 3ds Max 2020 with Corona Interactive running within the viewport (not the framebuffer or ActiveShade) and the NVIDIA-powered AI denoiser running at the same time.
Please follow my new Instagram for latest projects, tips&tricks, short video tutorials and free models
Behance  Probably best updated portfolio of my work
lysfaere.com Please check the new stuff!

2020-03-08, 22:07:39
Reply #5

eXeler0

  • Active Users
  • Posts: 7
Quote from: Juraj on 2020-02-22, 19:27:52
Devs are already experimenting with this in some ways; Chaos Group just launched a public beta (or alpha?) of their Project Lavina (very similar to D5).

The closest thing to an interactive workflow right now would be 3ds Max 2020 with Corona Interactive running within the viewport (not the framebuffer or ActiveShade) and the NVIDIA-powered AI denoiser running at the same time.

You are right.
The question then becomes whether Chaos Group will differentiate between these products, with the Project Lavina tech becoming the next V-Ray RT while Corona continues to be CPU only.
It's difficult to know which development will go where. We remember a time when Corona was a competitor to V-Ray :-)
Anyway, these are all just tools, of course, and people use whatever does the job best from their perspective. I sure hope Corona will continue to evolve into something simple and beautiful while managing to remain relevant for a long time to come.

2020-03-09, 11:57:15
Reply #6

maru

  • Corona Team
  • Active Users
  • Posts: 12781
  • Marcin
It seems that CPU and GPU technologies are competing in a healthy way right now, and this drives the development of both. GPU results are great, but on the other hand, both Intel and AMD are working on some exciting tech:
https://www.tomshardware.com/news/intel-lakefield-foveros-3d-chip-stack-hybrid-processor,40205.html
https://www.tomshardware.com/news/amd-announces-x3d-chip-stacking-and-infinity-architecture
Marcin Miodek | chaos-corona.com
3D Support Team Lead - Corona | contact us

2020-03-09, 12:22:58
Reply #7

Nejc Kilar

  • Corona Team
  • Active Users
  • Posts: 1251
Quote from: maru on 2020-03-09, 11:57:15
It seems that CPU and GPU technologies are competing in a healthy way right now, and this drives the development of both. GPU results are great, but on the other hand, both Intel and AMD are working on some exciting tech:
https://www.tomshardware.com/news/intel-lakefield-foveros-3d-chip-stack-hybrid-processor,40205.html
https://www.tomshardware.com/news/amd-announces-x3d-chip-stacking-and-infinity-architecture

I just watched AMD's FAD presentation and they touched on the topic of X3D chip stacking. It seems like a great, innovative way to scale up. The same goes for Foveros stacking, of course :)

Still, I can't help but wonder what Microsoft is going to do about it. You can pretty much stack 4+ GPUs together in one node, but Windows seems limited when it comes to thread handling. I know it's probably not going to be like that forever, but they seem somewhat slow in making Windows handle high-thread-count CPUs better. Yeah, 128 threads can be handled right now, but it kind of seems like x86 CPUs will plateau a bit early in terms of architectural improvements, and I personally suspect we could see even higher thread counts to offset that, and then... hello, Windows! :)
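
For anyone curious, the limitation being referred to is, as far as I understand it, Windows' processor groups: logical processors are partitioned into groups of at most 64, and by default a thread is scheduled within a single group, so software has to be explicitly group-aware to use everything. A minimal sketch with the Win32 API (Windows 7+ calls; group 1 is chosen arbitrarily and error handling is trimmed):

Code:
#include <windows.h>
#include <stdio.h>

int main(void) {
    /* Logical processors are partitioned into groups of at most 64. */
    WORD groups = GetActiveProcessorGroupCount();
    DWORD total = GetActiveProcessorCount(ALL_PROCESSOR_GROUPS);
    printf("%u processor group(s), %lu logical processors total\n",
           groups, (unsigned long)total);

    if (groups > 1) {
        /* To run on CPUs outside its home group, a thread must opt in. */
        DWORD cpus = GetActiveProcessorCount(1);
        GROUP_AFFINITY affinity = {0};
        affinity.Group = 1; /* second processor group */
        affinity.Mask  = (cpus >= 64) ? ~(KAFFINITY)0
                                      : (((KAFFINITY)1 << cpus) - 1);
        if (SetThreadGroupAffinity(GetCurrentThread(), &affinity, NULL))
            printf("calling thread moved to processor group 1\n");
    }
    return 0;
}

This is why a renderer on a 128-thread machine only saturates all cores if it spreads its worker threads across groups itself.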

Still, fun times on both the CPU and GPU fronts!
Nejc Kilar | chaos-corona.com
Educational Content Creator | contact us

2020-03-09, 14:42:32
Reply #8

Jpjapers

  • Active Users
  • Posts: 1659
I saw a news article a few days ago about Intel having a breakthrough in CPU architecture that outpaced a Titan GPU by a factor of something like 5. I can't find the link.

2020-03-09, 15:33:12
Reply #9

Nejc Kilar

  • Corona Team
  • Active Users
  • Posts: 1251
Quote from: Jpjapers on 2020-03-09, 14:42:32
I saw a news article a few days ago about Intel having a breakthrough in CPU architecture that outpaced a Titan GPU by a factor of something like 5. I can't find the link.

Can't find the link myself either, but AFAIK it was for AI purposes alone. Some company figured out a different algorithm or something of the sort :)
Nejc Kilar | chaos-corona.com
Educational Content Creator | contact us

2020-03-09, 16:02:53
Reply #10

Jpjapers

  • Active Users
  • Posts: 1659

Quote from: Nejc Kilar on 2020-03-09, 15:33:12
Can't find the link myself either, but AFAIK it was for AI purposes alone. Some company figured out a different algorithm or something of the sort :)

True, but don't you think AI will have a hand in path tracing eventually?

2020-03-09, 16:09:34
Reply #11

Nejc Kilar

  • Corona Team
  • Active Users
  • Posts: 1251
Good point!

I think there are some papers out there experimenting with it, and I know some things are in development that I can't talk about (not that I know much at this point anyway), but I would guess it's potentially still a ways off.

Even if not, GPUs are predominantly used for AI purposes because they excel at that. As for whether this Intel / third-party collaboration is really that successful, we'll just have to wait and see. It might be useful for certain kinds of AI training and not for others, at least that's what I'm guessing.

But bottom line, GPUs already have RT cores, so that definitely helps them out. CPUs, if they get to stacking chips first and Microsoft gets their act together, seem fun as well :D

We really need to find that article lol :)
Nejc Kilar | chaos-corona.com
Educational Content Creator | contact us

2020-03-09, 16:31:29
Reply #12

Jpjapers

  • Active Users
  • Posts: 1659
Quote from: Nejc Kilar on 2020-03-09, 16:09:34
Good point!

I think there are some papers out there experimenting with it, and I know some things are in development that I can't talk about (not that I know much at this point anyway), but I would guess it's potentially still a ways off.

Even if not, GPUs are predominantly used for AI purposes because they excel at that. As for whether this Intel / third-party collaboration is really that successful, we'll just have to wait and see. It might be useful for certain kinds of AI training and not for others, at least that's what I'm guessing.

But bottom line, GPUs already have RT cores, so that definitely helps them out. CPUs, if they get to stacking chips first and Microsoft gets their act together, seem fun as well :D

We really need to find that article lol :)

Found it

https://wccftech.com/intel-ai-breakthrough-destroys-8-nvidia-v100-gpu/

2020-03-10, 12:58:04
Reply #13

Nejc Kilar

  • Corona Team
  • Active Users
  • Posts: 1251
Ah, I thought it was wccftech; I just couldn't find it there. Thanks for the link! I guess we'll just have to wait and see, ha! :D
Nejc Kilar | chaos-corona.com
Educational Content Creator | contact us

2020-03-10, 13:02:44
Reply #14

Juraj

  • Active Users
  • Posts: 4765
With both Intel getting into GPUs and AMD testing Infinity Fabric for absolutely everything, we might see some interesting solutions on a 2-3 year horizon.

My workstation is prepared to take 4 GPUs if that becomes a benefit in the future.
Please follow my new Instagram for latest projects, tips&tricks, short video tutorials and free models
Behance  Probably best updated portfolio of my work
lysfaere.com Please check the new stuff!