Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - Juraj

Pages: 1 ... 279 280 [281] 282 283 ... 318
4201
Gallery / Re: BMW
« on: 2014-09-08, 19:31:56 »
Excellent!

I believe though that some slight blurring of the highlights would avoid the little bit of aliasing I am seeing there. Have you tried downsampling, or rendering with internal res=2 perhaps? And a little glow in post?

But really great, and I love the slightly lifted blacks.

(Cool username)

4202
[Max] General Discussion / Re: Glossiness behaviour
« on: 2014-09-08, 19:23:58 »
One thing that particularly bothers me is that I often find it hard to achieve consistent materials across a scene. Some materials will inherently end up too reflective, and some not reflective enough, and there is nothing I can do about it outside of eye-balling, which is not something I am very talented at. Corona will of course always render the material as physically plausible thanks to energy conservation, but it will look off compared to what my eyes are used to when I look at the whole scene.
So while it's easy enough to set up materials right now, it's easier with the PBR approach to make them all consistent with each other.

It's not about scientific, nerdy or complicated stuff. This model is getting widely adopted now because it's even more artist-friendly than the outdated spec/gloss model, which really is just legacy.

4203
Not to mention there are some with 128 or 96 GB of RAM, and SSDs inside as well...

That's not how it works most of the time, though. When a whole node pack is listed in this fashion, you need to divide the RAM per node; on average the nodes come with 24 GB, or 48 GB in the more expensive variants.

Do your calculations precisely though, because from what I have observed, you often get only a notch higher performance compared to conventional contemporary builds. After all, you are buying Core 2-generation Xeons or early Westmere ones. The higher-specced C6100s often go up to 4000 dollars on eBay (8x X5670, which actually is quite serious performance), but even then you save only about 40% of the price (OK, that can be a lot to some) compared to a contemporary custom build of equal performance.
It looks like an amazing budget deal if you only count "cores" on paper, but once you take into account the architecture family, frequency and power draw... well, it's not that amazing anymore. It has quite a few drawbacks imho.
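If you want to do that math yourself, here is a back-of-envelope sketch. Apart from the roughly 4000 dollar eBay price mentioned above, every number below is a placeholder assumption; swap in your own benchmark scores, wattages and electricity rate:

```python
# Rough total-cost-of-ownership comparison: used node pack vs. contemporary
# build. All figures are illustrative placeholders, not measurements.

def total_cost(price_usd, watts, hours_on, usd_per_kwh=0.15):
    """Purchase price plus electricity over the given usage period."""
    return price_usd + (watts / 1000.0) * hours_on * usd_per_kwh

def cost_per_point(total_usd, benchmark_points):
    """Dollars per unit of render throughput; lower is better."""
    return total_usd / benchmark_points

# Example: 2 years of rendering 8 hours a day (hypothetical figures).
hours = 2 * 365 * 8
used = cost_per_point(total_cost(4000, 1200, hours), 8000)  # 8x X5670 pack
new = cost_per_point(total_cost(2500, 400, hours), 6000)    # assumed new build
print(f"used: ${used:.3f}/pt  new: ${new:.3f}/pt")
```

With power draw included, the "cheap cores" often stop looking cheap; that's the whole point of doing the calculation precisely.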

Not that it can't be a good deal, but I see a lot of starry-eyed guys on internet forums thinking it's an amazing deal. Then I look at it... and, not really. Be cautious.

4204
[Max] General Discussion / Re: Glossiness behaviour
« on: 2014-09-08, 18:25:03 »
Yes, I am absolutely sure, at least for the next 2+ years :- ). I don't want to side-track this thread, so briefly: all the demos are nice (really nice!), but... they are extremely simple. A few boxy shapes, simple furniture. Try taking a complex, regular archviz scene, doing a proper (even if automatic!) unwrap for lightmapping, and watch the fun. If you're able to produce something in a month that you would do offline in 3 days, come back to me and I will give you the holy grail ;- ).

I am not talking about quality (which is often quite excellent... up to a certain point, of course). Archviz isn't only about quality, it's about productivity. The current workflow is very laborious and based on concepts long abandoned in such a streamlined environment as visualization (try unwrapping your whole set of decorations from Evermotion/DC/etc., optimizing models and textures...). That may change once more advanced real-time GI systems arrive (I don't hold out hope for any voxel-based solution like SVOGI, which is sadly "dead" for Epic, but whatever else comes eventually, we can only hope it's usable).

Koola created a lot of hype... but most people just won't be able to produce anything at all. They don't see much behind all the complexity; it's only a "toy" for most people right now and will stay in such a state for some time.

Look at how many people can't create a simple nice rendering with offline tools, where everything is a single click of a button, with thousands of tutorials and ready-to-import assets (Ever/DC+/etc.)... and you expect something far more complex like real-time to take off any time soon? Real-time has been here for the past 5 years (I was using CryEngine 4 years ago in college!!). The tools have only evolved, they're not drastically different; the workflow is identical, just streamlined. There is no revolution going on. There is just hype, and easy dreams :- ).

But yes, it would be great if you proved me wrong :- ) I am trying to prove myself wrong!

4205
[Max] General Discussion / Re: Glossiness behaviour
« on: 2014-09-08, 18:02:57 »
I would appreciate any paper/resource you guys know about on this matter. I usually end up reading physics-based documents and I get lost in the process.

Here is the original paper : "Physically-Based Shading at Disney"

http://disney-animation.s3.amazonaws.com/library/s2012_pbs_disney_brdf_notes_v2.pdf

The most important part is chapter 5, "Disney 'principled' BRDF":

"1. Intuitive rather than physical parameters should be used.
2. There should be as few parameters as possible."

I always see counter-arguments about how it is complicated, unnecessary, etc. Not at all. It is in fact easier, more logical, and artist-friendly.

Notice some of these parameters: sheen, clearcoat, etc. A single shader that lets you directly create every type of material, in super easy fashion and with fully plausible behaviour.
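For reference, the paper's full parameter list can be sketched as a plain struct. The names follow the Disney notes; the default values below are only illustrative placeholders, not the paper's reference settings:

```python
# The parameter set of the Disney "principled" BRDF, as a plain dataclass.
from dataclasses import dataclass

@dataclass
class PrincipledMaterial:
    base_color: tuple = (0.8, 0.8, 0.8)
    subsurface: float = 0.0      # blends toward a subsurface-like diffuse
    metallic: float = 0.0        # 0 = dielectric, 1 = metal
    specular: float = 0.5        # remaps dielectric reflectivity, no IOR field
    specular_tint: float = 0.0
    roughness: float = 0.5       # drives both highlight spread and sheen falloff
    anisotropic: float = 0.0
    sheen: float = 0.0           # extra grazing reflection, e.g. for cloth
    sheen_tint: float = 0.5
    clearcoat: float = 0.0       # second, fixed-IOR specular lobe
    clearcoat_gloss: float = 1.0

# A brushed-metal-ish material needs exactly two changes:
mat = PrincipledMaterial(metallic=1.0, roughness=0.3)
```

Every parameter is a 0-1 slider; that is the "intuitive rather than physical" principle in practice.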

4206

Remember the GGX hype, or the Grant Warwick tutorial hype in general? For a while, people thought they could not live without GGX and manually separating RGB components in the reflectance curve, yet after a short while they realized it just makes the workflow take a bit longer while not really making their materials look noticeably better. And now the hype is over, and everyone is back to their usual routine. Turns out the absence of GGX, or of separated RGB component curves in their reflectance falloff, wasn't really the only reason their materials did not look the way they would like ;)

Not really. Everyone wished for GGX so they could avoid the absolutely bothersome stacking of multiple specular layers. The RGB curves were a completely different thing; that was about getting a slightly unique colour behaviour for metals instead of a single shade, and yes, that was over the top. But not so much GGX, which does all of that with a single internal algorithm. Necessary for life? No, nothing is. But then why even bother with anything new? How can you defend software with a unique and novel approach, while at the same time always reserving that same dismissive opinion for any sort of progress that's not in line with your current understanding of the workflow? You're not even on the same page with your own argument.

And regarding your overused argument about Alex Roman and Bertrand (or other outliers) having zero problems making nice stuff: except that Bertrand reworks his workflow every few months, and it takes him considerable effort to do what he does. I don't think a few outliers represent the critical mass of users just because they manage to utilize extreme out-of-the-box thinking and peculiar attention to detail. Most do not. And neither of those two needed Corona (or even uses it daily, for that matter), so who do you think this software is actually catering to?

With that said, I agree with you that not every addition is necessary; I also like the simplicity of Corona, precisely because it doesn't pile on features. But there are changes that aren't additions, yet could greatly streamline the workflow for those who wish to "easily" achieve photoreal results. Which, after all, is the main motto behind Corona Renderer.

Re: your examples: almost every one of your arguments is a logical fallacy, or doesn't apply to the topic it illustrates. What does the correlation of users' quality of work have to do with the usability of a renderer? All these minor-market renderers like Thea, Indigo, etc. only show that they never attracted a critical mass of users among which a few outliers could demonstrate their highly above-average skill. Yet give one of them to such an outlier and he will largely exceed their portfolio (Bertrand with Octane, Maxwell, Corona, etc... not only Vray).

Can we apply reverse logic to that? MentalRay has some of the most amazing work on the internet, given how long it has been on the market and how large its userbase used to be (and probably still is). Does that prove Mental Ray's superiority in usability over the others?
It doesn't, and neither does your point.

4207
[Max] General Discussion / Glossiness behaviour
« on: 2014-09-08, 02:32:02 »
I wanted to start a small discussion on the behaviour of the glossiness parameter.

How the model currently works is not unique to Corona; it is equally shared by Vray and MentalRay, and is more a trait of the legacy specular/glossy shader model.
Meanwhile, both the Maxwell engine and various current real-time renderers have integrated Disney's PBR model, where glossiness is instead swapped for a "Roughness" parameter.

On paper, Roughness = inverted Glossiness. Almost. The big difference between the two is this: in the Specular/Glossy shader model, glossiness is purely a subset of specular reflection,
a value (numeric or texture driven) that defines how spread out the reflection appears (and, in line with energy conservation, more spread out obviously appears weaker).
Roughness, on the other hand, defines the surface property both in how the specular reflection spreads and, most importantly, in how much of the grazing-angle reflectance (white 1.0 for non-metals, white or tinted 1.0 intensity for metals)
becomes visible.
In line with that behaviour, the following difference manifests in practice:

Spec/Gloss:    0.0 glossiness (matte) = visible specular sheen, or rather an overlay of the grazing colour.
Roughness:    100% roughness (matte) = 100% Lambertian shader; the diffuse colour for non-metals, and the base specular reflectivity for metals, becomes fully visible with no overlay of the grazing colour at all.
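To see why that grazing-angle reflectance sits at 1.0 in the first place, here is Schlick's textbook approximation of the Fresnel curve in Python. This is a generic illustration of the physics, not any renderer's actual shader code:

```python
import math

def schlick_fresnel(f0, cos_theta):
    """Schlick's approximation of Fresnel reflectance.
    f0 is the head-on (base) reflectivity; cos_theta is the cosine of the
    angle between the view direction and the surface normal."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

f0 = 0.04  # a typical dielectric, roughly IOR 1.52
for deg in (0, 45, 75, 89):
    c = math.cos(math.radians(deg))
    print(f"{deg:2d} deg -> {schlick_fresnel(f0, c):.3f}")
```

Head-on you get the 0.04 base value, and it climbs toward 1.0 at grazing angles regardless of anything else; roughness only decides how much of that climb survives as a visible sheen.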

Why the roughness model is, in my opinion, more logical and easier to use for photorealistic materials:

 Consistency: polished wood (not lacquered!) and rough, matte wood are the very same material. The only difference is microscopic, in the surface. Both have the same reflectivity: IOR 1.52, or rather 0.04 base reflectivity and 1.0 grazing reflectivity.
   The only thing you need to change to achieve either is the roughness parameter. Since it's a linear value, it's the same across all engines (from Maxwell to Unreal Engine 4 and many others soon) and is easily eye-balled from reference.
   We are only changing one value.
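A quick sanity check of that IOR 1.52 vs. 0.04 equivalence, using the standard normal-incidence Fresnel formula for dielectrics:

```python
# Base (head-on) reflectivity of a dielectric from its index of refraction:
# F0 = ((n - 1) / (n + 1))^2
def f0_from_ior(n):
    return ((n - 1.0) / (n + 1.0)) ** 2

print(f0_from_ior(1.52))  # glass/lacquer territory, approximately 0.04
```

So "IOR 1.52" and "0.04 base reflectivity" really are the same statement in two notations.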

In Spec/Gloss (Vray, Corona, MentalRay, etc.), to simulate identical behaviour we have to "guess" the reflectance value, because full reflectivity with zero glossiness produces a visible sheen, an overlay of our 90-degree value (white by default). And the two don't trade off in a linear fashion, i.e. we can't just set glossiness to 30% and deduct 70 from our reflectance value, i.e. 0.3. There is no such direct relationship, so we are juggling two values, and at no step can we be sure whether this is actually a physically correct material. The fact that it looks "right" to us doesn't change the fact that it's more complicated and much less logical.
The biggest problem for most users here is that they often end up with an incorrect albedo, because they have both too high a specular reflectance and too high a diffuse (which is also 'reflectance'). Corona recently introduced a small algorithm that "corrects" this for you in the background (by dimming your diffuse if your specularity is too high).

Physically plausible =/= physically correct. "Physically based" quickly became a marketing buzzword and is used in that fashion by all major renderers. But all it basically means is that they follow physical laws. You are still allowed to create a material wildly differing from its real-world counterpart. That is actually good; the problem is, you don't know when you have crossed that line.


Some random illustrations:

1) Vray: 0.0 glossiness produces a very strong, velvet-like sheen.
2) Corona: 0.0 glossiness produces a weaker sheen, but exhibits similar behaviour.
3) Corona with a 128-grey material and red 1.0 reflection at 0.0 glossiness; the overlay is quite visible.
4) Roughness chart (this one is from the Maxwell website, but it's always the same). On the left, 100% roughness (= 0.0 glossiness) = an ideal Lambertian surface, with zero overlay of the grazing-angle colour (which is blue).


The end: nothing :- ) I am not actually saying I want this strongly right now or anything; it's not really any sort of request. I just wish to hear your understanding of this issue and, generally, what you think.

Personally, I am a big fan of the PBR approach, as you can see me joining in on all the tidbits here and there on the forum. It's a small but revolutionary thing, something I am a much bigger fan of than
features everyone else already has. In my free time I am creating stuff in Unreal 4 (and no, it's not going to replace offline rendering at all, it won't even become popular in archviz... trust me ;- ), not any time soon),
and boy... I am having so much fun with the material system. It's vastly superior imho.

Cheers, all in good faith:- ) !

4208
Vicnaum deserves a medal for clarifying things! :- )

Yes, some PBR models (like the one in Unreal Engine 4, derived from Disney's paper) use inverted specular reflectivity values because they wanted a UNIFIED model for specular reflectivity across both metals and non-metals.

Why this is logical (I am not actually advocating that we need this at all, just writing an argument against the "all the mainstream renderers do something, so it's obviously right" type of fallacy above):

1) Metals don't translate well using only the simplified Fresnel formula (the n number alone), so they end up looking "incorrect" (to what degree depends on the case, of course). To avoid the complex model (n, k), which
for example Maxwell has, they ditched it altogether and just let you specify base reflectivity and metalness (0 or 1: whether the material is a metal or not). The rest is adjusted internally.
This also lets you linearly scale the specular value from non-metal (which will be at 0.02-0.04) to something like 0.6 for chrome or 0.9 for aluminium. One value for all material types, instead of two.

2) If we discount refractiveindex.com, it's easier to get measured values in this inverted fashion. They're still potato values, but easier to use, because again we are only using ONE value (the Fresnel curve is computed internally afterwards):
from 0.02-0.04 for non-metals like plastics, paints, etc., up to 0.2-0.95 for metals. This is easier than both custom-modelling the curve from the refractive index website and inputting the two values for full Fresnel n/k (too scientific).

Both points above become a strong positive when it comes to authoring textures. The whole workflow was designed so you can paint your textures more universally. You only paint a single specular map, which applies only to metals
and contains both colour and intensity (in the scale above, 0.2+), but is ignored for non-metals, which fall back to the default 0.04 (1.52 IOR, as Vicnaum wrote). Both keep 1.0 at the grazing angle, and their variance is driven
by one other map: roughness. It's a very simple model, actually. I suggest everyone play with Unreal to get a feel for how it works. It's not more abstract; it's more user-friendly and logical.
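A minimal sketch of how a metalness/roughness shader typically derives its two lobes from a single base-colour map and a metallic mask; this is a generic illustration of the workflow described above, not Unreal's actual shader code:

```python
# Deriving diffuse and specular inputs from base colour + metallic,
# with the 0.04 dielectric fallback mentioned above.

DIELECTRIC_F0 = 0.04  # default non-metal reflectivity, ~IOR 1.52

def lerp(a, b, t):
    """Linear interpolation between a and b."""
    return a + (b - a) * t

def derive_lobes(base_color, metallic):
    """Returns (diffuse_color, specular_f0), both per channel."""
    # Metals have no diffuse component; the base colour moves to specular.
    diffuse = tuple(c * (1.0 - metallic) for c in base_color)
    specular = tuple(lerp(DIELECTRIC_F0, c, metallic) for c in base_color)
    return diffuse, specular

# Gold-ish metal: all colour ends up in the specular lobe.
print(derive_lobes((1.0, 0.76, 0.33), metallic=1.0))
# Dielectric: colour stays diffuse, specular falls back to 0.04.
print(derive_lobes((0.5, 0.2, 0.2), metallic=0.0))
```

One colour map, one mask, one roughness map; the shader sorts out the rest internally.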

It doesn't make creating materials "hell", nor does it look anything like what Modo does. It's the same stuff. It's actually easier, and it prevents people from creating "incorrect" materials (a metal with too high an albedo because you used diffuse, but incorrectly low specularity because you used a random IOR number like 8-12 instead of the unique curve, which varies a lot for metals and can only be simulated by the respective curve or the full Fresnel formula, n/k, as in Maxwell).

All the other stuff applies the same way. There is no "capping" of specular reflectivity; it always goes to 100% at the grazing angle, but the curve can be so steep that, coupled with a rough surface, the material simply appears matte (and shades Lambertian).

Last but not least, to counter the "people have no problem creating photo-realistic materials" argument:

Well, I would say they do. 99% of people, including me, have that problem. The issue is not that it's impossible now; it is just fine. It's that it could be refined to allow "easier" creation of such materials, by a model that
navigates you toward the correct result and removes the possibility of physically incorrect materials.
The current model promotes them: the fact that you can cap reflectivity for non-metals means that the vast majority of people have materials that are less reflective than they should be. They then often compensate with too-high diffuse values, or by generally upping the exposure for the whole image, and complain on forums about a "flat, washed out" look.


4209
[Max] General Discussion / Re: Corona render speed
« on: 2014-09-07, 22:24:03 »
Seriously, I'm not so amazed by GPU rendering so far. So many high-end cards to buy, only to obtain a little improvement at the end, and scalability doesn't seem good either, not to mention the difficulty of programming for it.
I don't understand this GPU hype. They'd better develop new CPUs, improve software, and leave the GPU for games and display...

Then you are smarter than most and don't fall easily for a marketing scam :- ). GPU rendering is indeed still mostly just 'hype'. It's not 10,000 times faster; it's not 10 times faster either. It IS faster, considering its parallel nature and
the fact that GPU chips (by both nVidia and AMD) evolved quite a lot more than CPU chips did over the last 5 years, but at what cost? Until Redshift came, you had two options: fit your scene into a mainstream gaming card (Radeon, GTX), which run between
2 and 6 GB, but c'mon, 6 GB is still nothing; or pay for re-branded "pro" cards and get 6 to 12 GB of VRAM at 10!!! times the cost (Quadro and Tesla for CUDA, FirePro for OpenCL), at which point it was no longer scalable or cheap. It was, in fact, more expensive.

GPU rendering is too tied to the drastically shifting politics of the GPU companies (AMD ignoring OpenCL development at times? nVidia 'crippling' game cards to literally force you to buy the same chip in the pro segment? Adding extra memory only when others (gamers, scientists, etc.) need it, so that it took 4 years! to get to 6 GB in the mainstream; congrats, you can render rotating cars).

Octane Render came out in 2009, 5 years ago, with claims of GPU rendering being a game-changer, a revolutionary act. 5 years later... where is it? Still nowhere. So maybe in the next 3-4 years it will do better; it might. But really, there is nothing amazing about it now,
just tons of clueless people brainwashed by simple (and factually wrong) propaganda.

tl;dr: gpu rendering still sucks. Get over it.

(Disclosure: I have owned an Octane licence for many years. I use Unreal/CE3; I love GPUs, just not for pure ray-tracing.)

4210
[Max] General Discussion / Re: Is corona using MLT?
« on: 2014-09-07, 20:41:51 »
Perfect document!! Has it been shared around enough..? (I don't know, I just came back from "vacation", so I am a bit out of touch.) It's plenty informative and describes the Corona philosophy nicely.

Actually, the most informative thing about it is that it describes what "stuff" (to avoid being rude :- ) ) the regular mainstream user does, and why the software needs to adapt to his behaviour rather
than catering to a few "power users".

I have a question regarding slide 34. I surely wouldn't remove that reflection, and I have always dreamed of the day when proper reflected (GI) caustics (the ones strongly boosting interior illumination and forming shapes reflected from the floor, not the fancy refracted ones behind a beer glass...) become efficient enough to be used. I haven't played with VCM because it's currently suggested to avoid it in production due to the lack of certain features and its sidelined development, but does VCM also speed up this type of caustics? If so, that would be brutal and I would definitely give it a try.

4211
[Max] General Discussion / Re: Materials Guide
« on: 2014-09-07, 20:34:00 »

Just a question: is this Marmoset Toolbag something similar to Quixel dDO? Or are there any differences between them?


They're two completely different pieces of software. The Quixel Suite (nDO, dDO) is a texture-authoring package; it's a creation tool.
Marmoset Toolbag is a real-time content viewer, with a material and lighting system.

The only thing they share is the computer/video-games-oriented market, but there is no reason why you can't use them elsewhere.
I used Marmoset to quickly play with normal maps for sculpted things I did in ZBrush, and I use nDO to create all my normal maps for regular offline rendering.

The PBR guide is an excellent thing to go through, even though it can't be applied directly (unfortunately...) to Corona or any other renderer (with the slight exception of Maxwell, which shares half the principles),
because the Specular/Glossy workflow we use (and which is subpar) is different from the Metalness/Roughness workflow of the general (Disney's) PBR model.

Still, you learn a lot of important concepts:
1) Everything is darker in diffuse than we would think.
2) Things are more reflective than we think, and the surface specular reflection varies because of roughness (the inverted glossiness parameter, driving both intensity and spread), not because of specularity (a single material surface is mostly even).
3) Metals have no diffuse property; in pure form they contain only specular reflection. That's why metals and non-metals should be created in different ways.

4212
[Max] General Discussion / Re: Questions about Corona
« on: 2014-09-01, 18:23:08 »
You could have seen me on cgarchitect; I talk a lot everywhere :- ).

Anyway, I think what you're asking about isn't any different in Corona, or in any rendering engine for that matter; the same workflow concepts apply. IBL lighting is easier, because Corona samples the environment directly and doesn't require the Dome-light side-step.
But materials, textures, their relationship to the scene: it's all identical. Vray and Corona really aren't that different.

What might interest you are content-management plugins to speed up your workflow, like the SigerShaders library. It does come with pre-made (very mediocre) materials, but mostly it's an easily manageable library for storing your own materials, much
superior to the default 3dsMax clusterfuck. Check it out.

4213
[Max] General Discussion / Re: Questions about Corona
« on: 2014-09-01, 15:34:49 »

What I don't like about Vray (what really gets me annoyed) is how complex it is to apply materials. Hopefully Corona will address this and make it simpler.


What do you mean, if you don't mind me asking?

4214
Gallery / Re: Rolleiflex
« on: 2014-08-31, 15:36:46 »
It's not bad at all :- ) Worth doing a short animation to show it off better?

A little 'bleaching' of the shadows would give it a more cinematic look.

4215
Yeah not a bug, then.

Anyway, I'm trying to get a sofa to look right (again) and this still confuses me. How should I go about replicating a fabric? Is a falloff map in diffuse really the only way to get anything like the low-glossiness reflection effect? Because that's nineties as hell. Overall, I don't think the glossiness curve Corona has right now is quite natural. Am I doing something wrong here, or do others have the same problem?

Yes, diffuse only. I do agree the curve is a bit odd. Ideally, low glossiness, even at full specular reflection (there is really no reason to lower it), should produce a look identical to a fully Lambertian shader.
In most renderers (with the exception of Maxwell) this doesn't happen (they often leave an artificial 'sheen' instead)... and imho it complicates life, because you have to tweak it with the wrong controls (lowering the actual specularity, the IOR/curve, etc.) to achieve what you need.

(It works excellently in Disney's PBR model, as in Unreal 4 :- ) ...)
