Both, I guess? iray can produce really good results, but it suffers from some pretty severe workflow limitations. Limited texturing, it's always limited by GPU memory (unlike Redshift, and Octane is also going in that direction), and it really is quite slow. For iray to be really fast you need a big cluster of GPU machines, and preferably Quadro-cards to get additional memory (which you will need), so the whole thing is going to be massively expensive. If you spent the same amount of money and just rendered in Corona or Maxwell, or hell even V-Ray, you'd probably get better results in less time.
I never had problems with texture memory in iray, but I had a LOT of problems with geometry. You may be missing some materials, but the main problem is in Nvidia's hands, which is what I said before.
One of the biggest strengths of Corona is that it's extremely well-integrated into 3ds Max -- more so than iray -- which means you don't need to adapt your workflow. It's not limited by GPU memory, it supports all (more or less) native textures, material blending, render elements, etc etc etc.
Agreed, especially about the memory limitation.
I respectfully but strongly disagree. If you look at the new generation of Nvidia cards, the GTX 970 basically gives you Titan-level performance (albeit with 1/3rd less memory) at a significantly lower cost and a significantly lower power draw. The Titan was released about a year and a half ago, cost $999 and consumed 250 watts. Now the GTX 970 gives you about the same performance for $329 and about 150 watts. In terms of performance per TCO* -- which is what really matters for a farm -- the 900-series is a huge improvement over the previous generation. Performance isn't vastly improved, but the power reduction and price reduction mean you can buy more of them.
The picture I showed you compares a 980 against a 780 Ti. Where did you see those benchmarks that put the 970 at the same performance level as a Titan? I'm really interested, and I'm also interested in a 980 vs. Titan benchmark. I wasn't able to find anything that isn't related to real-time gaming performance; that one is the only GPGPU benchmark I found.
Additionally, the fact that they're able to get such great performance out of a card with a relatively low power draw, despite it being manufactured on the same 28 nm process, means they have plenty of headroom to grow.
Honestly, in terms of hardware performance evolution, GPU rendering has never looked better. The new Haswell-E chips are a good improvement as well, but mostly for cost reasons: they're not much faster than the previous generation, but they are much cheaper.
* Total Cost of Ownership
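To make the TCO argument above concrete, here's a rough back-of-the-envelope sketch using the prices and wattages quoted in the post. The electricity rate, the three-year continuous-rendering lifetime, and the "roughly equal per-card performance" premise are all illustrative assumptions, not measured figures:

```python
# Rough performance-per-TCO comparison for a render node, using the
# $999/250 W (Titan) and $329/150 W (GTX 970) figures from the post.
# Electricity price and usage hours are assumed, not authoritative.

KWH_PRICE = 0.15          # assumed electricity cost in $/kWh
HOURS = 3 * 365 * 24      # assume 3 years of round-the-clock rendering

def tco(card_price, watts):
    """Purchase price plus electricity cost over the assumed lifetime."""
    return card_price + (watts / 1000) * HOURS * KWH_PRICE

titan = tco(999, 250)
gtx970 = tco(329, 150)

# If, as the post claims, the two cards render at roughly the same
# speed, performance per TCO dollar is just the inverse of TCO:
print(f"Titan TCO:   ${titan:,.0f}")
print(f"GTX 970 TCO: ${gtx970:,.0f}")
print(f"970s you could run for one Titan's TCO: {titan / gtx970:.2f}")
```

Under these assumptions you could run more than two 970s for the lifetime cost of a single Titan, which is the sense in which the 900-series wins on performance per TCO even without a big raw-performance jump.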
You really need to have a look at Redshift :-) it's not a perfect render engine, but it definitely shows that fast development and a plethora of features is possible on the GPU as well.
You are right, and here is where the greed comes out. There used to be two kinds of GPUs, just as there are two kinds of CPUs: the low-power-consumption ones and the high-performance ones. Now they take the low-power-consumption series, brand it as the top-performance series, and make that the flagship. If they had kept a true high-performance series (which doesn't exist, at least publicly), we could be talking about an astonishing difference in rendering performance.
Also, they keep the GPU memory at 4 GB... great scenes you'll be able to fit in there... especially when you don't have instancing. And no matter what the Octane people say, if you use instancing on the GPU you lower its speed by a high factor, depending on how many times you use it. And iray doesn't have instancing at all, so... well, 4 GB is nothing. No spectacular opportunity for GPU rendering with those limitations.
Regarding Redshift: I don't like mental ray/V-Ray-style render engines anymore, and I won't invest in a GPU farm to run a biased render engine when Corona has demonstrated that a biased engine can be optimized to deliver incredible speed; it just has to be worked out. For me, that level of bias is a big NO-NO nowadays, and that level of configuration complexity is also a big NO-NO. That's why Redshift has never been an option for me. It's also been said on this forum before that Redshift's great speed comes with a quality cost. I'm sure Corona could deliver the same speed with a similar quality cost, but IMHO that's just not the target it's pursuing.
Of course, don't take my thoughts about Redshift as TRUE in capital letters, especially since I haven't tried it personally. But I really don't have time to deal with another complex-to-configure render engine; I prefer to spend my time on creativity, modelling, texturing, animation, etc. :) and that is what Corona gives me.
Cheers!