Author Topic: Tonemapping - Plz Halp  (Read 79817 times)

2020-04-28, 22:59:10
Reply #60

cjwidd

  • Active Users
  • **
  • Posts: 1069
    • View Profile
    • Artstation
Isn't there something futile about discussing color management at all with respect to Corona Renderer for 3ds Max? As I understand it, Max is not a color managed workspace, and if it's taken Autodesk like 7 years just to add a new spinner to the chamfer modifier, why should we even begin to expect that color management is going to appear(?)

2020-04-28, 23:22:37
Reply #61

Juraj

  • Active Users
  • **
  • Posts: 4324
    • View Profile
    • studio website
Waiting for Autodesk might indeed be futile, but the renderer can supply a lot of native features, as Corona shows 🙂
As long as it's the main rendering engine you use, it can even be the better-integrated solution (selection in the framebuffer, etc.).

V-Ray already went down this path, so it makes sense that Corona will too.
talcikdemovicova.com  Website and blog
be.net/jurajtalcik   Our studio Behance portfolio
Instagram   Our studio Instagram, managed by Veronika

2020-04-29, 00:20:44
Reply #62

Jpjapers

  • Active Users
  • **
  • Posts: 1507
    • View Profile
A Trump meme already 😀 Boy, this escalated quickly.

Points for guessing the Kelvin temperature of his face.

2020-04-29, 08:36:37
Reply #63

niljut

  • Active Users
  • **
  • Posts: 16
    • View Profile
Quote from: cjwidd
Isn't there something futile about discussing color management at all with respect to Corona Renderer for 3ds Max? As I understand it, Max is not a color managed workspace, and if it's taken Autodesk like 7 years just to add a new spinner to the chamfer modifier, why should we even begin to expect that color management is going to appear(?)

https://makeanything.autodesk.com/3DSMAX/public-roadmap-25B1-201DB.html#rendering-future

It is definitely on their minds at least.

2020-04-29, 09:21:43
Reply #64

cjwidd

  • Active Users
  • **
  • Posts: 1069
    • View Profile
    • Artstation
oh wow, thanks for sharing that^ I had no idea they were even thinking about it

2020-04-29, 09:23:53
Reply #65

Fluss

  • Active Users
  • **
  • Posts: 553
    • View Profile
Quote from: niljut
Quote from: cjwidd
Isn't there something futile about discussing color management at all with respect to Corona Renderer for 3ds Max? As I understand it, Max is not a color managed workspace, and if it's taken Autodesk like 7 years just to add a new spinner to the chamfer modifier, why should we even begin to expect that color management is going to appear(?)

https://makeanything.autodesk.com/3DSMAX/public-roadmap-25B1-201DB.html#rendering-future

It is definitely on their minds at least.

Well, that request has been sitting there for 3 years already, so I wouldn't count on it 😁: https://forums.autodesk.com/t5/3ds-max-ideas/opencolorio-ocio-support/idi-p/7028915

2020-04-29, 09:55:06
Reply #66

Fluss

  • Active Users
  • **
  • Posts: 553
    • View Profile
Also, for those who missed it, you really should read Chris Brejon's online book: https://chrisbrejon.com/cg-cinematography/

He is a lighting artist who has worked at some of the biggest studios. There is a whole chapter about ACES and plenty more useful tips. Check it out!

2020-04-29, 10:05:58
Reply #67

cjwidd

  • Active Users
  • **
  • Posts: 1069
    • View Profile
    • Artstation
yeah this is a great [FREE] resource, I've had it bookmarked for some time now :/

2020-04-29, 10:38:25
Reply #68

Fluss

  • Active Users
  • **
  • Posts: 553
    • View Profile
Do you check it from time to time? He keeps adding new stuff, most recently in the ACES chapter.

2020-04-29, 10:49:51
Reply #69

Juraj

  • Active Users
  • **
  • Posts: 4324
    • View Profile
    • studio website
Holy shit, that resource has become a serious wiki-bible. Now it's even more daunting :-D

Seriously, if I were starting in CGI today and came across this, I would immediately find another career.
I need to bite into it.

The first section alone is mind-boggling. The whole CGI community spent 10 years deciding on the question of "What can change the nature of gamma?" and this guy just outlines the whole matrix.
talcikdemovicova.com  Website and blog
be.net/jurajtalcik   Our studio Behance portfolio
Instagram   Our studio Instagram, managed by Veronika

2020-04-29, 10:57:22
Reply #70

cjwidd

  • Active Users
  • **
  • Posts: 1069
    • View Profile
    • Artstation
Quote from: Juraj
Seriously, if I were starting in CGI today and came across this, I would immediately find another career.

lmao

Quote from: Fluss
Do you check it from time to time? He keeps adding new stuff, most recently in the ACES chapter.

Nah, I haven't checked it since it was first recommended to me some time ago - it was already very thorough then. I'll have to see what chapters have been updated.


2020-04-29, 12:08:48
Reply #71

Fluss

  • Active Users
  • **
  • Posts: 553
    • View Profile
Quote from: Juraj
Holy shit, that resource has become a serious wiki-bible. Now it's even more daunting :-D

Seriously, if I were starting in CGI today and came across this, I would immediately find another career.
I need to bite into it.

The first section alone is mind-boggling. The whole CGI community spent 10 years deciding on the question of "What can change the nature of gamma?" and this guy just outlines the whole matrix.

Hehe, if I had come across this at the beginning, I'd be a way better artist today!

2020-04-30, 08:18:04
Reply #72

cjwidd

  • Active Users
  • **
  • Posts: 1069
    • View Profile
    • Artstation
When I first posted this thread, I was a little concerned there would be some self-righteous indignation about how I was basically asking for a magic button to cure my bad images, or something like that. Of course, that was not what I was asking, but I was, more or less, implying that a significant proportion of what constitutes a good image is tonemapping (or lighting).

I recently subscribed to Johannes Lindqvist's Patreon to get a behind-the-scenes look at his approach to archviz / image creation, and he shared an anecdote in response to a Corona forum thread that *WAS* asking for a magic trick (based on Bertrand Benoit's Varenna project).

He said,

Quote
If you take a photo in really bad lighting, the image still looks photorealistic. It may look like shit, but still realistic. It doesn't matter what lighting circumstances you take your photo in, it still looks realistic, since it's obviously a real photograph. And the same principle actually applies to 3D as well: even if you have bad, ugly lighting, that alone is not the reason why your image doesn't look realistic.

I guess I did not have the common sense or clarity of mind to arrive at this really obvious truth before reading it, but it really struck me.

2020-04-30, 10:14:36
Reply #73

Fluss

  • Active Users
  • **
  • Posts: 553
    • View Profile
While I totally agree with what Johannes said about real photography, I'm also a bit disappointed by the CGI part. That statement is true when everything is done right, not only tone mapping. There are a lot of phenomena hidden under the hood: caustics, light polarization, diffraction, etc., to name a few. The thing is, while we call render engines unbiased, we're still faking or mimicking those effects. Tonemapping really is important, but with the wrong shaders, wrong light energy, not enough detail, or perfectly clean surfaces, the image won't look realistic at all. It's a balance between a whole load of stuff. Good tone mapping is crucial for achieving photorealism, but so is the rest. So yes, we need it, but it won't revolutionize the whole industry like some people seem to say.

If you compare a Sony and a Fuji camera's output, the pictures do not look the same. The colors are different, the tone mapping is different, but both still look realistic, simply because they capture the right scene-referred data (everything behaves as it should, the way we're used to seeing it). So if you don't feed the renderer the right data, the tone mapping curve won't make it look good.

So photorealism really isn't that simple to define, as there are a whole lot of intricacies. The key is to use references and observe the world around you. I often look at light interactions around me, and a lot of these observations make me think: "If I saw this in one of my renders, I'd probably think it's a bug".

It would be really interesting to start a thread with great images and review them to highlight why they work well.

Also, what makes FStorm appealing at first sight, to me at least, is that the tone mapping curve is not driven by the LUT, so you can use the LUT slot for other purposes. If you look at Johannes's workflow, he uses LUTs to stylize his renders (desaturating colors, shifting blacks, etc.).
« Last Edit: 2020-04-30, 10:27:14 by Fluss »

2020-04-30, 10:39:45
Reply #74

Juraj

  • Active Users
  • **
  • Posts: 4324
    • View Profile
    • studio website
Quote from: Fluss
That statement is true when everything is done right, not only tone mapping. There are a lot of phenomena hidden under the hood: caustics, light polarization, diffraction, etc., to name a few.

Hah, I started writing a response before you, and then you wrote 90% of what I wanted! Absolutely agree with the above.


I would argue that "ugly" lighting in CGI makes it easier to pull off photorealism, the best example being flashlight lighting: high contrast, small & directional, creating super strong highlights, placing the focus on shapes instead of surfaces.
And vice versa: high-end studio photography sets look very CGI-like, similar to how heavily retouched people look almost plastic.

This is where I would caution against jumping to the conclusion that this necessarily has anything to do with "tonemapping". Your cell phone is not doing any advanced tonemapping that would look better than Reinhard. Every digital sensor captures data linearly, and that linear data, extracted as-is, looks exactly like Corona output with everything at 0 (and highlight compression = 1). Yet it looks photoreal because of the amount of detail.
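
For readers unfamiliar with the terms, here is a minimal sketch of the difference between plain display encoding and a Reinhard tone curve. This is an illustration only, not Corona's or any camera's actual pipeline; the function names and sample values are made up for the example:

```python
import numpy as np

def reinhard(x):
    """Classic Reinhard tone curve: compresses [0, inf) linear radiance into [0, 1)."""
    return x / (1.0 + x)

def linear_to_srgb(x):
    """Standard sRGB encoding applied to display-referred values in [0, 1]."""
    x = np.clip(x, 0.0, 1.0)
    return np.where(x <= 0.0031308, 12.92 * x, 1.055 * np.power(x, 1.0 / 2.4) - 0.055)

# Linear radiance samples, as a sensor or an unbiased renderer would produce them
hdr = np.array([0.05, 0.18, 1.0, 4.0, 16.0])

gamma_only = linear_to_srgb(hdr)            # everything above 1.0 clips hard to white
tonemapped = linear_to_srgb(reinhard(hdr))  # highlights roll off smoothly instead
```

The point of the sketch is Juraj's: the tone curve only decides how highlights compress; the realism of the underlying linear data is untouched either way.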

The opposite is also true to some degree: very good tonemapping & grading can make a simple & lazy CGI set look photoreal... but only in low-res and only for a moment.

So both of those aspects are important, something Lolec and a few others pointed out. It can't all be tied to tonemapping.

Also... the mythical "DSLR-like" tonemapping. It's not, imho, what people think it is. Your camera only applies a gamma curve to the linear data; the raw converter then applies an S-curve (either its own, or the one described in the raw file, which is usually based on the camera maker's JPEG processing) plus its own bit of tonemapping, which you have available in the settings on the front page of ACR, for example. At that point the camera itself hasn't done anything: the Sony sensor (whether sold to you by Sony, Nikon, Fuji, etc.) has only provided linear data. No tonemapping whatsoever. The camera does attach the associated S-curve to the file (which the raw converter only interprets! The curve is only applied identically if you save directly to JPEG in camera).
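
The two-step pipeline described above (gamma curve first, S-curve on top) can be sketched roughly like this. The curves chosen here are generic stand-ins, assumed for illustration; real cameras and raw converters use their own proprietary curves:

```python
import numpy as np

def gamma_encode(x, gamma=2.2):
    """Simple power-law gamma, standing in for the camera's encoding step."""
    return np.power(np.clip(x, 0.0, 1.0), 1.0 / gamma)

def s_curve(x):
    """Smoothstep contrast S-curve, a stand-in for the raw converter's tone curve:
    it darkens shadows, brightens highlights, and keeps mid-gray roughly in place."""
    return x * x * (3.0 - 2.0 * x)

linear = np.array([0.02, 0.18, 0.5, 0.9])  # linear data off the sensor
encoded = gamma_encode(linear)             # step 1: gamma curve (the camera's part)
final = s_curve(encoded)                   # step 2: S-curve (the raw converter's part)
```

Note that all the "look" comes from the second step; the sensor's contribution (the linear data) is curve-free, which is exactly the point being made about "DSLR-like" tonemapping.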

So if we want "DSLR-like" tonemapping, there would have to be a way to use raw-converter tools. But there's a catch: they don't expect the amount of information a CGI render provides. Raw formats are barely 16-bit at best, already with a gamma curve baked in.
If you force-open a 32-bit file in Adobe Camera Raw, it will do some arbitrary auto-equalizing, which can look almost bizarre depending on the dynamic range of your scene. This is the only "magic" it does to photos as well... but the algorithm obviously gets very confused by the kind of data it gets from a rendering.
You can do the old trick of zeroing everything in the older process version settings; then you can use the default state of the rendering and apply only the tonemapping that Adobe Camera Raw offers. It only affects the highlights.

Quote from: Fluss
So photorealism really isn't that simple to define, as there are a whole lot of intricacies.

Yeah, very much this.

There is interesting research done by Paul Debevec & co. (or John Hable? I'm not sure; I came across it on Filmic Worlds, which is his blog) where ultra-realistic head models (like the most high-end Hollywood VFX stuff we can do) still looked wrong to our eyes. Only when they flipped them upside down was our brain finally fooled. Our evolutionarily trained sensory recognition is so good that we perceive almost minute differences between real humans and CGI humans. Even when we're talking about superbly rendered, superbly lit, superbly post-produced photo-scans with perfectly shaped-in emotion.

I believe this effect manifests in architecture to a much higher degree than we would like to admit. After all, how much time do humans spend inside some kind of architecture? Same with nature: it always looks somewhat wrong. The opposite example would be cars, a recent invention, rather simple in shape. I think almost nobody can tell the difference between a CGI car and a real one... even in not exactly great renderings.

Every time people congratulate something on the forum for looking very photorealistic, there is some crutch strongly helping it: super heavy DOF, an odd angle focusing on a detail, etc. A lot of the time these projects have 4-5 very "real" looking images... until the last one, with flat lighting and a wide angle showing the whole room, the one the client actually requested. And it doesn't look real at all. With no hack to help it, the same grading & tonemapping suddenly doesn't help at all.

Can most artists tell when the issue comes down to tonemapping & grading, and when it's the detail? I would say the ratio is a lot different than people think.

talcikdemovicova.com  Website and blog
be.net/jurajtalcik   Our studio Behance portfolio
Instagram   Our studio Instagram, managed by Veronika