Author Topic: Embrace GPU rendering  (Read 776 times)

2025-01-13, 22:56:05

ATa

  • Active Users
  • Posts: 15
Corona has long been a CPU powerhouse, but with the RTX 50 series cards becoming more affordable, it's time to consider hybrid GPU rendering. This shift would significantly speed up workflows while maintaining the simplicity and quality that Corona is known for. As the industry moves towards GPU-based rendering, embracing this change will ensure Corona remains competitive. Hope the developers change their minds this time and take this crucial step forward!

2025-01-14, 14:44:21
Reply #1

TomG

  • Administrator
  • Active Users
  • Posts: 5922
Sorry, this will never change. As mentioned previously, it would take about two years' work and involve almost all the devs to get it done, as it's a fundamental change to the codebase (the way GPUs render is just totally different from how CPUs render). That would mean the core source of our business, CPU Corona, would see no improvements, no new features, and only limited bug fixes for that entire development time, and this is not acceptable to us.

And then GPU rendering, no matter how fast your GPU, always has certain limitations and compromises, which is exactly what people want to avoid when they choose Corona; they really do want the best in realism without shortcuts. I think there is a mistaken belief that "GPU Corona" would be "Corona but just on GPU", but this is not the case. It's the same as how V-Ray GPU does not look identical to V-Ray CPU, and is just inevitably different in some ways. So people wouldn't get what they actually want, as such a thing is not possible.

Of course, if you want GPU rendering, then you have Vantage, already one of the best ray tracers for GPU out there, and that doesn't require Corona development in other areas to come to a halt. So there is the best of both worlds there!

You will notice that the NVIDIA figures for "how amazingly fast these new GPUs are!" are based on DLSS 4, where AI imagines 3 out of every 4 frames - in other words, not real frames at all. Take that out of the equation and the speed / power boost looks more like 10 to 20%, which is fine, but is no quantum leap over previous generations. I look forward to seeing more third-party reviews of performance, rather than NVIDIA's selected figures :)
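To make that frame-generation arithmetic concrete, here is a minimal Python sketch. The fps numbers are hypothetical, chosen only for illustration, and are not NVIDIA's actual benchmark data; the point is just that when 3 of every 4 displayed frames are AI-generated, only a quarter of the claimed frame rate reflects real rendering work:

```python
# Back out the "real" rendered-frame rate from a marketing figure
# that includes multi-frame generation (3 AI frames per rendered one).
# All numbers below are hypothetical, for illustration only.

def rendered_fps(displayed_fps, generated_per_rendered=3):
    """Only 1 in (1 + generated_per_rendered) displayed frames
    is actually rendered; return that real rendered frame rate."""
    return displayed_fps / (1 + generated_per_rendered)

# Hypothetical claim: 240 fps with frame generation on,
# versus a previous-generation card doing 55 fps natively.
new_native = rendered_fps(240)       # 60.0 truly rendered frames/s
old_native = 55
speedup = new_native / old_native    # ~1.09, i.e. roughly +9%
print(f"{new_native:.0f} rendered fps, {speedup:.2f}x the old card")
```

With those made-up figures the "over 4x" headline collapses to a single-digit percentage gain in real rendering throughput, which is the shape of the argument above.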
Tom Grimes | chaos-corona.com
Product Manager | contact us

2025-01-14, 20:41:38
Reply #2

ATa

  • Active Users
  • Posts: 15
Hi Tom, thanks for your detailed response earlier—it made me think more about the technical challenges. As an architect and industrial designer, realism is my top priority, and I’ve now come to appreciate why CPU-based engines like Corona are unmatched in this regard. CPUs excel in handling the intricate calculations, recursion, and massive datasets required for photorealistic rendering without compromises. GPUs, while fast, often face limitations in memory and precision, which can impact the level of detail critical for achieving true realism. Your commitment to preserving Corona's quality over speed makes perfect sense to me now. 🤠
« Last Edit: 2025-01-14, 20:51:01 by ATa »

2025-01-14, 20:51:10
Reply #3

TomG

  • Administrator
  • Active Users
  • Posts: 5922
TY for the feedback, and you are correct about why CPUs are better at this stuff when it comes to realism and accuracy. GPUs excel at "performing the same calculation loads of times, once per pixel", since they can divide that work up among all the cores on the GPU, and those cores don't need to talk to each other very much. The thing is, as light bounces around a scene, the calculations have to branch, meaning lots of different calculations have to be done, not just "the same thing loads of times". CPUs are architecturally built (physically) to handle exactly that sort of situation, where each core may be doing something very different from the others and may need to exchange data and sync efficiently, so there is little wasted idle time as logic branches, while GPUs are built (physically) in a way that is not so good at that.
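The lockstep-versus-branching point can be sketched with a toy cost model. This is purely an illustration, not Corona's code or how any real GPU scheduler works: the idea is that a "warp" of GPU lanes executing in lockstep pays for every distinct branch path that at least one of its lanes takes, while a single CPU thread pays only for the path it actually follows:

```python
# Toy model of SIMT branch divergence (an assumption for illustration,
# not real hardware behaviour): a lockstep warp must execute every
# branch path taken by any of its lanes, masking off the other lanes.

def warp_cost(branch_taken, cost_a=10, cost_b=10):
    """Cycles a lockstep warp spends: one full pass per distinct
    branch path that occurs anywhere in the warp."""
    paths = set(branch_taken)  # which branch outcomes occur in the warp
    return sum(cost_a if path else cost_b for path in paths)

uniform = warp_cost([True] * 32)                        # all agree -> 10
divergent = warp_cost([i % 2 == 0 for i in range(32)])  # mixed -> 20
print(uniform, divergent)
```

When every lane agrees, the warp pays for one path; once the lanes diverge, as ray bounces tend to make them do, the warp pays for both paths and half the lanes idle through each. A CPU core running one thread per path never pays that masking cost, which is the architectural point above.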

So CPU rendering will be around for a long time to come, and we aim to keep Corona right up there as a top choice for realism in archviz on CPU :)
Tom Grimes | chaos-corona.com
Product Manager | contact us

2025-02-05, 15:15:38
Reply #4

Mr.White

  • Active Users
  • Posts: 5
Corona is very close to my heart and I use it professionally every day - this will not change in the medium to distant future.

However, Octane as a pure GPU renderer is technically absolutely on a par - in terms of quality and features, without a doubt. I don't see any technical limitations in the final image itself, from caustics to the fantastic simulation of light, and so on.

I think VERY realistic images are possible in both worlds - CPU as well as GPU.