Author Topic: GPU Doubts....  (Read 7186 times)

2017-02-17, 10:43:58

orenvfx

  • Active Users
  • **
  • Posts: 21
    • View Profile
Hey Corona guys,

How much does the thought of "no plan for Corona GPU" bother you?

I'm a V-Ray user making the move to Corona, but I find myself really thinking about the whole GPU future and Ondra's words: "no plans for GPU rendering".

Do you feel like me, or do you have a different take on the future of GPU rendering?

Don't tell me about the VRAM limit, because soon we'll see 24 GB cards at more sensible prices...

Sorry to rush it, but I'm sharing my thoughts with you as a 3D artist.

Thanks

2017-02-17, 14:58:31
Reply #1

FrostKiwi

  • Active Users
  • **
  • Posts: 686
    • View Profile
    • YouTube
Hey Corona guys,

How much does the thought of "no plan for Corona GPU" bother you?
Thanks
The devs have stated often that they are looking at the GPU market and other alternatives.

Do you feel like me, or do you have a different take on the future of GPU rendering?
The Future™ is still a ways off. Corona has proved that with well-developed kernels like Embree, the CPU is on the same level in "$/speed", and can even pull ahead in highly cluttered environments like 3ds Max with a ton of plugins.
GPUs are not the only ones seeing rapid development. Intel has announced Lake Crest, a new architecture aimed squarely at deep-learning performance, which may boost classic number crunching quite a bit as well.

Lastly, VRAM is not the main reason why GPU is still not an option; dev time is.
Writing a renderer for the GPU is not simply copy-pasting code and clicking "compile for GPU". As a result, all GPU renderers struggle with compatibility across the many plugins out there, and dev time for new features is slower. You also have to put up with the industry standards, with 3ds Max being awful but everyone using it.

So you are right, GPU rendering is the Future™, but right now the Future™ still needs some time to arrive.
Once GPUs become much easier to develop for and truly outpace CPUs in the $/speed metric, I'm sure the Corona devs will consider it.

This discussion has been had tons of times; use the search function and you will find detailed dev statements.
I'm 🐥 not 🥝, pls don't eat me ( ;  ;   )

2017-02-17, 16:03:25
Reply #2

orenvfx

  • Active Users
  • **
  • Posts: 21
    • View Profile
I like your answer.

2017-02-17, 16:42:14
Reply #3

maru

  • Corona Team
  • Active Users
  • ****
  • Posts: 13655
  • Marcin
    • View Profile
We will release a comprehensive blog post about CPU vs GPU rendering. Promise. But it's definitely not high priority, so I can't yet say exactly when to expect it.
Marcin Miodek | chaos-corona.com
3D Support Team Lead - Corona | contact us

2017-02-18, 02:47:56
Reply #4

Christa Noel

  • Active Users
  • **
  • Posts: 911
  • God bless us everyone
    • View Profile
    • dionch.studio
IMHO, based on my experience, the only thing that caught my attention is how GPU renderers can't handle noise in scenes where the lighting is dominated by GI. But that was a long time ago; I'm out of date when it comes to technology developments.
We will release a comprehensive blog post about CPU vs GPU rendering. Promise. But it's definitely not high priority, so I can't yet say exactly when to expect it.
sounds yummy, I'm waiting for that.

2017-02-18, 07:47:53
Reply #5

orenvfx

  • Active Users
  • **
  • Posts: 21
    • View Profile
We will release a comprehensive blog post about CPU vs GPU rendering. Promise. But it's definitely not high priority, so I can't yet say exactly when to expect it.

Until then, you can read about the pros and cons of CPU vs GPU here:

http://www.ronenbekerman.com/unbaised-gpu-rendering-what-is-the-big-deal/

2017-02-19, 00:07:20
Reply #6

Benny

  • Active Users
  • **
  • Posts: 171
    • View Profile
Hmm, that is an interesting article; the guy seems to know what he is talking about. Still, it seems the greatest benefit is the interactivity and not the final render time, which is fine, as I also think interactivity is more important: that is where I actively sit in front of the monitor and work.

I must admit I haven't really played much with these GPU solutions beyond a quick dabble with FStorm, but he lists the following as the main arguments against V-Ray and Corona. I kind of feel most of these apply to Corona as well; am I wrong?

• A fantastic real-time renderer with a live render region.
• Insanely fast previews of the result while working.
• Unbiased: super easy setup, hardly any settings to care about, it just gives me realism out of the box.
• More or less final images directly from the renderer.
• Change exposure, white balance, and other camera settings without having to re-render (see the sketch after this list).
• Beautiful, error-free GI calculations, no splotches or glitches.
• Lens effects like glow and glare calculated in real time.
• White balance and camera focus pickers directly in the frame buffer.
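
On the exposure and white balance point: that works without re-rendering because it is pure post-processing on the HDR framebuffer the renderer already keeps, so it is not really tied to the GPU at all. A minimal sketch of the idea (my own illustration, not how any particular renderer implements it):

import numpy as np

# hdr: the linear high-dynamic-range framebuffer, shape (height, width, 3).
# Re-grading it needs no new rays, so it can run interactively on any engine.
def grade(hdr, exposure_ev=0.0, white_balance=(1.0, 1.0, 1.0)):
    img = hdr * (2.0 ** exposure_ev)               # exposure in EV stops
    img = img * np.asarray(white_balance)          # per-channel white balance
    img = img / (1.0 + img)                        # simple Reinhard tone mapping
    return np.clip(img, 0.0, 1.0) ** (1.0 / 2.2)   # gamma for display

# Example: brighten by one stop and warm the image slightly.
preview = grade(np.random.rand(1080, 1920, 3) * 4.0,
                exposure_ev=1.0, white_balance=(1.05, 1.0, 0.95))

As far as I know, Corona's frame buffer offers this kind of post-render adjustment too, which supports the feeling that much of this list applies to Corona as well.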

2017-02-23, 13:37:30
Reply #7

hybaj

  • Active Users
  • **
  • Posts: 5
    • View Profile
Actually, GPUs might not be the real competitor to CPUs when it comes to ray tracing.

The CEO of a company that makes a GPU rendering package said in a recent interview that they theorize a new version of the ray-tracing ASIC (application-specific integrated circuit) from Imagination Technologies (the company that makes GPUs for iPhones), at a power draw of 120 W, could do 6 billion rays per second.

A 2 W mobile version already does around 150 million rays per second!

So, 6 billion rays per second at 120 watts: even if you account for scenes where rays have to take complicated paths (which are the enemy of the GPU), it's still orders of magnitude faster than what today's CPUs or GPUs can do. We're basically talking about real-time path tracing at, let's say, 1080p.
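
To put a rough number on what "real-time at 1080p" would require, here is a back-of-the-envelope check (my own arithmetic; the frame rate is an assumption, not something from the interview):

# Rough sanity check of the "real-time path tracing at 1080p" claim.
# The 30 fps figure is an assumption for illustration, not from the interview.
rays_per_second = 6e9               # claimed throughput of the 120 W ASIC
pixels_per_frame = 1920 * 1080      # 1080p, about 2.07 million pixels
fps = 30                            # assumed "real-time" frame rate

rays_per_frame = rays_per_second / fps                 # 200 million rays per frame
rays_per_pixel_per_frame = rays_per_frame / pixels_per_frame
print(round(rays_per_pixel_per_frame))                 # roughly 96

So roughly 96 rays per pixel per frame; since one path typically consumes several rays (bounces plus shadow tests), that would leave maybe a couple of dozen full samples per pixel, which sounds more like a noisy-but-recognizable real-time preview than a clean final frame.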

I do wonder what the downsides of such an architecture might be: which types of scenes, materials, etc. might pose a problem for such tech when it comes to use in high-end VFX or real-time games.
« Last Edit: 2017-02-23, 13:46:28 by hybaj »

2017-02-23, 14:54:15
Reply #8

FrostKiwi

  • Active Users
  • **
  • Posts: 686
    • View Profile
    • YouTube
The CEO of a company that makes a GPU rendering package said in a recent interview that they theorize a new version of the ray-tracing ASIC (application-specific integrated circuit) from Imagination Technologies (the company that makes GPUs for iPhones), at a power draw of 120 W, could do 6 billion rays per second.
ASICs for rendering have been proposed so goddamn often, I lost count.
Not to mention the insane dev costs; not a single prototype has ever been shown, so by now I ignore all of the ASIC nonsense (Xeon Phi doesn't count). The closest we got were the Bitcoin mining ASICs.

Quote
We're basically talking about real-time path tracing at, let's say, 1080p.
No, we are not. There are multiple players in this field, with OTOY's Brigade at the forefront. If anyone is going to show a prototype, it's them. And even they have stated often that this is far, far off in the Future™.
If it actually happens, SIGGRAPH will have a session on it and every outlet will write about it. We will notice.

Until then, the next time you read "look at this proposal X, where we revolutionize field Y with Incredible Product Z by company Ω", don't click and give them AdSense money.
« Last Edit: 2017-02-23, 14:58:31 by SairesArt »
I'm 🐥 not 🥝, pls don't eat me ( ;  ;   )

2017-02-23, 17:56:52
Reply #9

hybaj

  • Active Users
  • **
  • Posts: 5
    • View Profile
Quote
ASICs for rendering have been proposed so goddamn often, I lost count.
Not to mention the insane dev costs; not a single prototype has ever been shown, so by now I ignore all of the ASIC nonsense (Xeon Phi doesn't count). The closest we got were the Bitcoin mining ASICs.

Umm, Imagination Technologies has already released the Caustic R2100 and R2500 cards, which had a limited sort of commercial run; people used them with the Caustic Visualizer (development of the Visualizer ended in 2015) and liked them. The cards were released in 2013 and discontinued just a year later, in 2014. So umm, not sure if we're talking about the same things... if not, then I'm sorry.

Quote
No, we are not. There are multiple players in this field, with OTOY's Brigade at the forefront. If anyone is going to show a prototype, it's them. And even they have stated often that this is far, far off in the Future™.
If it actually happens, SIGGRAPH will have a session on it and every outlet will write about it. We will notice.

Until then, the next time you read "look at this proposal X, where we revolutionize field Y with Incredible Product Z by company Ω", don't click and give them AdSense money.

It was Jules Urbach who said that they received a new 2 W PowerVR PCI-E prototype for testing and achieved 50 to 120 Mrays/s; in a video there are little glimpses of two real-time scenes being rendered, and it looks fast. He also said they received a 10 W prototype and that it "scaled perfectly", which probably means at least 5x the performance (rough numbers below). Then, in a recent interview for Road to VR, he said that the probable upcoming 120 W card would do 6 billion rays/s. This all seems to be their plan for the future of Brigade in Unity. Foveated rendering for VR HMDs in real time: sort of his words too.
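
For what it's worth, the "scaled perfectly" remark lines up with a simple linear extrapolation; this is my own rough math under that assumption, not figures from Urbach:

# Linear power-scaling extrapolation for the PowerVR ray-tracing prototypes.
# Assumes throughput scales proportionally with power draw, which is optimistic.
mrays_at_2w = 120                             # upper end of the quoted 50-120 Mrays/s
mrays_per_watt = mrays_at_2w / 2              # 60 Mrays/s per watt

mrays_at_10w = mrays_per_watt * 10            # 600 Mrays/s, i.e. the "at least 5x" figure
grays_at_120w = mrays_per_watt * 120 / 1000   # 7.2 Grays/s

print(mrays_at_10w, grays_at_120w)

So the claimed 6 billion rays/s at 120 W actually sits a bit below a naive linear extrapolation from the 2 W prototype, which at least makes the numbers internally consistent.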

Even John Carmack showed optimism about the whole thing in a tweet from 18 March 2014: "I am very happy with the advent of the PVR Wizard ray tracing tech. RTRT HW from people with a clue!"

So, if I'm getting this correctly, Urbach is not someone from Imagination Technologies talking up their own product. I'm not sure whether there is any lying or over-hyping involved. He does seem to be invested, since he's planning a grand future for both Octane and Brigade in the two biggest game engines.

2017-02-23, 19:44:12
Reply #10

FrostKiwi

  • Active Users
  • **
  • Posts: 686
    • View Profile
    • YouTube
Wow, I looked into it and I was wrong.

Still, all of the things I read about were controlled tech demos. Interesting stuff either way.
Thanks for the info. Interesting times we live in.
I'm 🐥 not 🥝, pls don't eat me ( ;  ;   )