Author Topic: The new 3000 series graphics cards from Nvidia and GPU rendering  (Read 2820 times)

2020-09-05, 23:09:12

John.McWaters

  • Active Users
  • **
  • Posts: 143
    • View Profile
    • JohnMcWaters.com
As the release date for the new 3000 series GPUs from Nvidia draws near, I wanted to see what the general thoughts are from the Corona community about the future of GPU rendering. Is it known whether Corona plans on implementing GPU-based rendering into its capabilities in the future? Or do you think CPU rendering is here to stay?

I've seen some impressive arch viz from The Boundary that was done in Unreal, which I assume means it was GPU-based. If real-time rendering gains popularity and can deliver quality comparable to what is achievable in Corona today, then I would say its future is promising.

I'm curious because I could see investing in a 3000 series card being a wise decision if GPU rendering picks up... especially if engines like Corona can take advantage of the RTX capabilities of the cards.

2020-09-06, 14:43:51
Reply #1

marchik

  • Active Users
  • **
  • Posts: 63
    • View Profile
I hope that over time we will be able to use the GPU, at least to speed up the calculation of caustics, because sooner or later the "proudly CPU based" approach will start losing competitiveness to hybrid solutions.

2020-09-07, 08:30:41
Reply #2

Ondra

  • Administrator
  • Active Users
  • *****
  • Posts: 8921
  • Turning coffee to features since 2009
    • View Profile
Hi,
while there are no immediate plans, we are watching the development closely. There are not only raw performance improvements, but also the development of the whole ecosystem, which is important for developers, as it determines how much work is needed to unlock the performance and which features will be supported. If/when GPU rendering technology reaches the marketing promises ("make all your features 100x faster over a weekend"), we will definitely spend the weekend :D. We currently have 3 GPU renderers at Chaos Group, so we have a good source of verified info.

I would personally advise you NOT to buy a super-expensive GPU unless you want to do GPU rendering RIGHT NOW. If you do, and have the cash, then sure, go ahead and buy the card. But we have seen so many instances of people buying strong GPUs "for the future" that never materialized. I personally bought a GPU over my intended budget back in the 9x00 times (yes, 9000, not 900), because usable GPU computing was "just around the corner". Hardware is a depreciating asset, and it does not make any economic sense to buy it "for the future". It wouldn't make any sense to buy a brand new car to sit in the garage for 2 years "before things pick up", and hardware becomes obsolete much faster than cars. If you buy it now for the future, it might already be obsolete when you finally make the switch.
Rendering is magic.
Private scene uploader | How to get minidumps for crashed/frozen 3ds Max | Sorry for short replies, brief responses = more time to develop Corona ;)

2020-09-07, 14:08:08
Reply #3

burnin

  • Active Users
  • **
  • Posts: 1041
    • View Profile
"Buy it for the future" - pay for it and be part of the beta-test group in our R&D for the next couple of years.

2020-09-08, 13:39:16
Reply #4

John.McWaters

  • Active Users
  • **
  • Posts: 143
    • View Profile
    • JohnMcWaters.com
Thank you for your detailed reply. I guess investing in a GPU for GPU rendering is only wise if you know you'll be able to make use of it within that generation of GPU.

I'm curious to know what the 3 GPU rendering engines from Chaos Group are.

2020-09-18, 11:52:59
Reply #5

Juraj

  • Active Users
  • **
  • Posts: 4159
    • View Profile
    • studio website
My guess: Vray GPU, Project Lavina, Vray for Unreal.

Ampere GPUs are amazing though <3 The 3090 is just glorious; there will not be a better GPU in the next two years. It's the next 1080 Ti, which was top for 4 years.
talcikdemovicova.com  Website and blog
be.net/jurajtalcik   Our studio Behance portfolio
Instagram   Our studio Instagram, managed by Veronika

2020-09-18, 13:34:24
Reply #6

agentdark45

  • Active Users
  • **
  • Posts: 554
    • View Profile
I do feel like the value proposition of GPUs for consumers (at least in terms of rendering vs single monster CPUs) is highly overlooked, and should be factored into development decisions.

For example, the 3080 is about twice as fast in Blender/Vray GPU/Octane as the previous 2080 Ti: https://techgage.com/article/nvidia-geforce-rtx-3080-rendering-performance/

Now consider the cost of system upgrades for end users:

One could easily get 4x 3080s or two or more 3090s (with 48 GB of shared VRAM, since the 3090 is the only model in the range that supports NVLink) and use them in pretty much any barebones system, within reason. Now compare this to someone wanting to upgrade to a Threadripper 3990X based system from an older platform: you're looking at a hell of a lot more money all things considered - and this is coming from a TR 3970X user.
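A rough back-of-envelope on that comparison (using approximate USD launch MSRPs, which are my assumption here, and ignoring motherboard/RAM costs on the CPU route):

```python
# Back-of-envelope hardware cost comparison (approximate launch MSRPs in USD;
# these figures are assumptions, and street prices vary wildly).
RTX_3080_MSRP = 699
RTX_3090_MSRP = 1499
TR_3990X_MSRP = 3990  # CPU alone, before the TRX40 board and RAM

quad_3080 = 4 * RTX_3080_MSRP   # 4x 3080, dropped into an existing box
dual_3090 = 2 * RTX_3090_MSRP   # 2x 3090 with NVLink (48 GB pooled VRAM)

print(f"4x 3080: ${quad_3080}")            # $2796
print(f"2x 3090: ${dual_3090}")            # $2998
print(f"3990X (CPU only): ${TR_3990X_MSRP}")
```

Either GPU option comes in under the price of the 3990X chip by itself, before you even count the new platform it needs.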

There are times where I use Fstorm and my single 2080 Ti to render scenes that Corona just can't keep up with (mostly high-end product shots/animations with DOF). Others on the Fstorm Facebook group are also pumping out stunning high-res interior visuals in the 30-minute render time region.

Looking at future CPU development, I can't see a 256+ core Threadripper emerging within 5 years (at least for a reasonable price). However, if we keep getting double the performance in the GPU space every 1.5-2 years, plus a doubling of VRAM, out-of-core tech, dedicated ray tracing core utilisation in render engines, and continued NVLink updates, I have no doubt that GPU-based rendering will dominate the market.
« Last Edit: 2020-09-18, 13:45:04 by agentdark45 »
Vray who?

2020-09-19, 08:38:13
Reply #7

Juraj

  • Active Users
  • **
  • Posts: 4159
    • View Profile
    • studio website
I don't know how well all features are supported when using OptiX with RT cores in something like Octane. I always remember how Vray GPU was behind Vray CPU, and that didn't even account for OptiX/RT cores. Does FStorm support the OptiX engine?
How long did it take to support NVLink? Worse, only the 3090 has NVLink now, and only at 1/4 of the bandwidth links available. So while the 3090 does support memory pooling, it probably doesn't do so without quite a performance loss compared to Quadro and A100. nVidia threw us a bone (24 GB VRAM! Hell yeah), but made sure we don't eat into their enterprise profits.

When looking at conventional pure CUDA gains, we're not talking about "doubling" every two years. It took 4 years to actually double (1080 Ti to 3080), and it will take another 4 years as well, with the progressive worsening of node upgrades. Ampere got very lucky with the relatively cheap Samsung node, but that came after two years of battling to get better silicon (Intel has been struggling with a node upgrade for 6 damn years). So this is where the progress will halt again.
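The gap between a two-year and a four-year doubling cadence compounds quickly; a quick sanity check of the implied annual growth rates (pure arithmetic, no benchmark data):

```python
# Implied compound annual performance growth for different doubling periods.
def annual_growth(doubling_years: float) -> float:
    """Annual growth rate that doubles performance in `doubling_years` years."""
    return 2 ** (1 / doubling_years) - 1

print(f"Doubling every 2 years: {annual_growth(2):.1%}/yr")  # 41.4%/yr
print(f"Doubling every 4 years: {annual_growth(4):.1%}/yr")  # 18.9%/yr
```

So the marketing-friendly "2x every generation" story assumes more than double the yearly improvement actually observed between the 1080 Ti and the 3080.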

I still agree GPUs are amazing at the moment, and they are far easier to use, stack and afford. No comparison at all. They rock. Selling outdated workstations is a pure loss of value (90%); selling outdated GPUs easily reclaims 40% of their value.
The 3090 is galactic value compared to the 3990X Threadripper. You can buy 3 of them!

But they're not progressing that fast ;- ).

Still... Corona-GPU would be spiffy =) Let's hope!


2020-09-20, 13:11:37
Reply #8

sprayer

  • Active Users
  • **
  • Posts: 785
    • View Profile
Juraj, but there is the 3090, and there will be a Ti too, I think. Nvidia is lazy because they don't have a competitor from AMD on desktop. Why should they double performance if you're buying their expensive cards anyway? And there's no ray tracing from AMD.

2020-09-20, 15:36:48
Reply #9

Juraj

  • Active Users
  • **
  • Posts: 4159
    • View Profile
    • studio website
In my opinion nVidia really tried their absolute best this time. They couldn't affordably (without the more expensive 7nm node from TSMC instead of the 8nm Samsung one) create anything faster than the 3090. The small margin (<15-20%) between the 3080 & 3090, the fact that the GDDR6X memory is clocked at 19 instead of 21 Gbps and already runs at 100°C, let alone the whole <350W :- ) TDP, shows it's at the absolute edge of what they could achieve.
It's quite reasonable to think they did it because AMD will not be such a fail as their current marketing campaign suggests :- ).

The silicon is getting denser and denser, and the dies themselves are massive. The only thing that could rapidly improve performance (aside from tech like OptiX & DLSS) is "gluing" dies together like AMD Zen :- ).

nVidia is definitely charging (at least they did for Turing) a massive "winner" tax. But they were never lazy or complacent in advancing their tech. I find their GPUs fascinating, and the 3090 will be a day-one buy for me.

2020-09-20, 22:29:01
Reply #10

Jpjapers

  • Active Users
  • **
  • Posts: 1400
    • View Profile
In my opinion nVidia really tried their absolute best this time. [...]

Tech Jesus mentioned that there is a larger Ampere die on the compute node cards, so the Ampere support is there for a more powerful card with a larger chip. Plus there are more SKUs leaked by Gigabyte for a 3080 with 20 GB VRAM, which could potentially make the value proposition much, much more compelling. The only issue being no NVLink support, and no more hardware SLI after the 3090 :(

2020-09-27, 12:32:38
Reply #11

Flavius

  • Active Users
  • **
  • Posts: 133
    • View Profile
First of all, I really, really like Corona and the Corona team, which has always been supportive when I had a problem with a scene, a bug etc., so thanks for that.

However, with the new 3000 series, it's the first time I have EVER looked into GPU rendering: Vray GPU and Redshift.

Before spending €1500 on a GPU, I thought I'd do some tests with the environment I typically use: large oceans, foam and boats.

The results were that in Vray GPU, a puny GTX 1060 (what I currently have) is only slightly slower than a Ryzen 3950X (in both Vray CPU and Corona), so in terms of price/performance, imagine what an RTX 3080 (similar in price to a 3950X) would do.

I have 2 huge problems with Corona: it renders big refraction planes unbelievably slowly (as I posted previously, it doubles the render times), and the unending light fireflies, as on yachts everything is reflective (ceilings, walls, floors, railings, glass, pools etc.). If you add small LED lights near these surfaces (i.e. step lighting), I am screwed. I have to wait about 3-4h for the render to clean up, on a scene where there is very little GI bouncing. GPU renderers seem to perform much better in situations like these.

Now, Vray GPU is buggy and unstable as hell; Redshift isn't, it is much, much more mature.

However, Corona just feels more polished, and the feeling that it just bloody gives the results you are expecting with little effort is gold.

My bottom line is........

Corona GPU please
 
I imagine there is a lot, a lot of work, but reading the Vray forums, I will quote one guy working at some CGI studio: "Nevertheless, adding some cards, specially the new 30 serie, the speed increase may be enough to convince us to gradually abandon Corona and adopt the GPU way (with the lack of features and stability, I know)."

So... Please

Corona GPU - I'm sure it will be the best GPU renderer out there, because you guys don't mess about when doing things.

Thanks!


2020-09-28, 13:52:06
Reply #12

Juraj

  • Active Users
  • **
  • Posts: 4159
    • View Profile
    • studio website
How do you feel about Vray GPU compared to Corona? I briefly tried it... didn't know how to set it up properly, and the result looked visually weird to me (also, the converter did an ultra shitty job).
But it did feel quite fast... and I loved the interactive in-viewport manipulation (though the selection was somewhat offset..?)

At this point Corona is so straightforward that Vray feels very weird to me, even though I love its features (the crazy good frame buffer, for example).

But yea, absolutely bring on CoronaGPU :- ). It's time for heresy!

(Although there's one aspect I don't welcome :- D The damn heat... my 3990X is a mini-sauna, and that's just 280W (often only 270W). The RTX 3090 has a continuous draw of 380W, lol. Imagine a 3990X + NVLinked 2x 3090 = 1040W of heat alone! I guess I would have to build a water loop and put the radiator outside the window, because that is half of my air-conditioning power.)
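For what it's worth, that heat math checks out; converting to air-conditioner terms (1 W ≈ 3.412 BTU/h, assuming essentially all drawn power ends up as room heat):

```python
# Total power draw of the hypothetical 3990X + 2x RTX 3090 box,
# converted to BTU/h for comparison against air-conditioner ratings.
CPU_W = 280          # Threadripper 3990X under load
GPU_W = 380          # RTX 3090 continuous draw
total_w = CPU_W + 2 * GPU_W
btu_per_hour = total_w * 3.412   # 1 W = 3.412 BTU/h

print(f"{total_w} W is about {btu_per_hour:.0f} BTU/h")  # 1040 W is about 3548 BTU/h
```

That is indeed a sizeable chunk of a typical small-room air conditioner's capacity.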

2020-09-28, 14:51:44
Reply #13

romullus

  • Global Moderator
  • Active Users
  • ****
  • Posts: 6739
  • Let's move this topic, shall we?
    • View Profile
    • My Models
The results were that in Vray GPU, a puny GTX 1060 (what I currently have) is only slightly slower than a Ryzen 3950X (in both Vray CPU and Corona), so in terms of price/performance, imagine what an RTX 3080 (similar in price to a 3950X) would do.

That's weird, I have the same GPU and I feel it's not much faster, if at all, than my 4771 Celeron, which is more than 5 times slower than your 3950X. But to be honest, I didn't do comprehensive tests - the renderers were different, the scenes were different and I didn't track the render time. It's just my subjective feeling that my GPU renders at a similar speed to my CPU, with the added "benefit" of a much shittier workflow.
I'm not Corona Team member. Everything i say, is my personal opinion only.
My Models | My Videos | My Pictures

2020-09-28, 16:22:59
Reply #14

Flavius

  • Active Users
  • **
  • Posts: 133
    • View Profile
How do you feel about VrayGPU compared to Corona? I briefly tried it... [...]

Hi there,
I have only tried it on exterior scenes, so shiny yachts on water with something like 10-12 million plus particles for the foam, and it feels quite fast. But as you said, the converter does a shit job, and I have my brain wired for Corona :)) There are not that many settings to mess with in it anyway. I cannot say that much at the moment because I am limited by my GPU's VRAM; I had to strip the scenes down a lot. I really wanted to get an RTX 3090, but yeah... go figure... no chance at all. Damn bots and scalpers.

Also, Vray GPU feels a bit buggy and not very trustworthy, so to say; a lot of the stuff is still not supported, and you literally have to have the "supported checklist" in front of you. At one point I felt like I was in that VW 32 Golf concept Jeremy Clarkson was driving, which had almost no buttons and dials linked to anything. Same here: you change some stuff, nothing happens, then you realize it is not supported.

I don't know how it handles complex GI; I only need like 5-6 bounces for my scenes.

I may not have a 3990WX, but the heat... well... my apartment faces south, on the 11th (top) floor, and the 3950X runs hot. The AC is working around the clock. I don't even want to know what an RTX 3090 will do. I might disconnect my central heating; I guess I won't be using it during the winter IF I can get my hands on one of those cards by then.

However, I keep an eye on Intel's Xe graphics and have high hopes for Raja Koduri. Check this out:
feature=youtu.be&t=2740   https://forum.corona-renderer.com/index.php?topic=25571.msg153082#msg153082

The bottom line is, for interiors I will not switch from Corona to anything else, but for animations I might need to go with Vray GPU and RTX 3090s.

@romullus -> I only tested what I specified above, using the latest Vray GPU 5 hotfix 2. The way I did it was by setting a time limit on my very simple scenes and eyeballing the noise. I don't see any other way, at least for now, of comparing these. At least in Vray there is consistency between CPU and GPU results, of course using only supported features.
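A slightly more objective variant of that method is to log the renderer's reported noise level over time and interpolate the time each engine needs to reach a target threshold; a minimal sketch with made-up sample curves (the numbers below are hypothetical, not measurements):

```python
# Interpolate how long a renderer needs to reach a target noise level,
# given (time_in_seconds, noise_percent) samples logged during a render.
def time_to_noise(samples, target):
    """Linear interpolation between the two samples bracketing `target`.
    Assumes noise decreases monotonically; returns None if never reached."""
    for (t0, n0), (t1, n1) in zip(samples, samples[1:]):
        if n0 >= target >= n1:
            return t0 + (t1 - t0) * (n0 - target) / (n0 - n1)
    return None

# Hypothetical noise curves (seconds, noise %) - NOT real benchmark data.
cpu_curve = [(60, 12.0), (300, 6.0), (900, 3.0), (1800, 2.0)]
gpu_curve = [(60, 9.0), (300, 4.0), (900, 2.5), (1800, 1.8)]

print(time_to_noise(cpu_curve, 3.0))  # 900.0
print(time_to_noise(gpu_curve, 3.0))  # 700.0
```

That way you compare time-to-equal-quality instead of eyeballing two differently noisy images at the same time limit.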
« Last Edit: 2020-09-28, 16:28:40 by Flavius »