Chaos Corona Forum
Chaos Corona for Cinema 4D => [C4D] General Discussion => Topic started by: FRENKH on 2024-10-24, 11:09:43
-
Hi guys, is there a correct way to match colors between Corona and Fusion? I'm exporting linear 16-bit .exr files from Cinema 4D Corona with the LUT turned off; I keep it on only for visualization while I'm working. Not to mention that in the DaVinci Color page the colors are completely different.
In Fusion 18 I was using the same Corona LUT with a gamma correction of 2.2 for visualization, and it worked fine.
Has anyone gone through the same workflow successfully?
Thanks
-
Hmm hmm, fellow Resolve user here (I use it daily). I'm not really 100% sure what you're running into, but things do work as expected on my end if I do the following:
- Make sure the tonemapping stack is disabled (including ACES OT)
- Save a 32 bit EXR
- Drag it into Resolve (color science: I'm on DaVinci YRGB, but without color management, Rec.709 (Scene))
This allows for a fully linear workflow, i.e. you can then do advanced compositing with it. If you'd like to get the ACES OT look back directly in Resolve, go to the Color page and add an "ACES Transform" node there. You'll probably want sRGB (Linear) as input and sRGB as output.
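For reference, the sRGB output that transform targets is the standard piecewise transfer curve from IEC 61966-2-1, not a plain power law. A minimal NumPy sketch (the function name is just illustrative):

```python
import numpy as np

def linear_to_srgb(x):
    """Encode linear light with the piecewise sRGB curve (IEC 61966-2-1)."""
    x = np.asarray(x, dtype=np.float64)
    return np.where(x <= 0.0031308,
                    12.92 * x,                              # linear toe for deep shadows
                    1.055 * np.power(x, 1.0 / 2.4) - 0.055) # power segment above the toe

# Linear mid-grey (0.18) encodes to roughly 0.461 on an sRGB display:
print(linear_to_srgb(0.18))
```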
Now if you'd like to use tonemapped images but still keep some extra data by going EXR 16/32-bit, then from what I recall you just export a tonemapped image, load it up in Resolve, and it should all work automatically.
Let me know if the above helps; if not, I'd be happy to dive deeper :)
-
Hi Nejc, thanks for the suggestions
- Tone mapping is disabled; the only remaining operator is "Photographic Exposure"
- 32 bit, done.
- Color management disabled, done.
What kind of LUT do you use for visualization in Fusion? And can you use a LUT from Corona?
-
Keeping it 'linear', you use the "Linear to Gamma 2.2" LUT (LUT > VFX IO > ...)
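Worth noting: a pure gamma 2.2 curve and the sRGB preview curve are close but not identical, which is why the two can look slightly different, mostly in the shadows. An illustrative NumPy sketch of the deviation (function names are hypothetical):

```python
import numpy as np

def gamma22_encode(x):
    # Pure power-law encode, i.e. what a "Linear to Gamma 2.2" LUT applies
    return np.power(x, 1.0 / 2.2)

def srgb_encode(x):
    # Piecewise sRGB curve for comparison (IEC 61966-2-1)
    x = np.asarray(x, dtype=np.float64)
    return np.where(x <= 0.0031308,
                    12.92 * x,
                    1.055 * np.power(x, 1.0 / 2.4) - 0.055)

x = np.array([0.001, 0.01, 0.18, 0.5])
deviation = gamma22_encode(x) - srgb_encode(x)  # the biggest gap sits in the shadows
```

The mismatch near mid-grey is only a fraction of a percent, but in deep shadows it is large enough to see.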
-
Keeping it 'linear', you use the "Linear to Gamma 2.2" LUT (LUT > VFX IO > ...)
It doesn't work; you get washed-out colors and less contrast in Fusion compared to the Corona VFB.
-
Well, then you're simply doing something wrong.
Here are 2 EXRs (from the VFB & C4D) with the aforementioned LUT applied, and a JPG without it, in Resolve.
-
Thank you burnin, but how do you view colors in Fusion?
-
Same.
PS
I suspect you're applying the LUTs later on, and not in the Media Pool...
-
At this point I think it's a bug in the Mac version of DaVinci. I'm pretty sure I did everything you suggested, but I'm still getting a color mismatch...
Using "DaVinci YRGB Color Managed" was probably my worst mistake, now fixed, and applying the gamma LUT on the Media page is a good idea, but in my opinion there is something else going on in the Mac environment. There are also a couple of preferences, Mac-specific I suppose, that I tried to change, but that made it even worse.
-
Speaking of the Corona VFB, image contrast is very close to what I see in Fusion, while saturation is closer to the DaVinci Edit page, so there is no way to say which is right.
-
Ahh, mac...
Maybe this thread can help you: Apple DWG Workflow Gamma Shift in Rec709 2.2 (https://forum.blackmagicdesign.com/viewtopic.php?f=21&t=191822&sid=b9f9d40f7332a60598f15a8bda391575) (@ Blackmagic Forum)
-
Ahh, mac...
Maybe this thread can help you: Apple DWG Workflow Gamma Shift in Rec709 2.2 (https://forum.blackmagicdesign.com/viewtopic.php?f=21&t=191822&sid=b9f9d40f7332a60598f15a8bda391575) (@ Blackmagic Forum)
Thank you burnin. Anyway, that post was about a Mac export issue that was recently resolved with the introduction of the "Rec.709-A" gamma setting specifically for Mac. I'm pretty sure mine is a different problem.
-
Yes, that was a specific issue, but there are also other tips (and links) there on how to get consistent results and manage color, grading, levels, etc. across different types of displays.
Either way, you can post your example Resolve project here and we'll see how it fares on other systems.
Edit/
or just share your .exr, for starters
-
Yes, that was a specific issue, but there are also other tips (and links) there on how to get consistent results and manage color, grading, levels, etc. across different types of displays.
Yes, very interesting!
Either way, you can post your example Resolve project here and we'll see how it fares on other systems.
Edit/
or just share your .exr, for starters
Here's a link to a Dropbox folder containing both the Cinema 4D and the DaVinci scenes. If anyone wants to have fun... :D
https://www.dropbox.com/scl/fo/mg1ccmc6z0yx0i2ki67vc/AIJRF_BVCMCOJ0vXbKiV3pE?rlkey=wp33nh7sz5y241zv8tpdk349x&st=t9qm4ve8&dl=0 (https://www.dropbox.com/scl/fo/mg1ccmc6z0yx0i2ki67vc/AIJRF_BVCMCOJ0vXbKiV3pE?rlkey=wp33nh7sz5y241zv8tpdk349x&st=t9qm4ve8&dl=0)
-
OK. I've checked and noticed that you didn't fully disable Corona's tone mapping, and you also have Cinema's LWF set to sRGB; these are the most likely sources of your issues.
The more you know, the less you know... ;)
-
Hi burnin,
should I choose "Linear" instead of "sRGB" for Cinema's "Input color profile"?
The tone mapping in Corona was set to "Photographic Exposure"; it looks like it only affects the general exposure calculation related to the camera settings, am I wrong? But I can turn it off, no problem.
Anyhow, on Windows you managed to match colors perfectly between the Fusion and Edit pages with the same .exr, and that's the most important fact to me.
-
I'm not aware that you should change it to linear, but I'm open to learning if I'm wrong...
-
The LWF choice depends on the input maps/textures.
sRGB is correct in this case. I had mixed up my test render results/names. Also, I mostly use HDR textures, hence the uncertainty and confusion. Here's your scene redone anew, with the render results passed through Resolve.
(https://forum.corona-renderer.com/index.php?action=dlattach;topic=43781.0;attach=204413;image)
source:
Plaster material maps (https://ambientcg.com/view?id=Plaster001), Studio IBL maps (https://polyhaven.com/hdris/studio)
Edit: typo correction
-
Wow! Good job!
I see a huge difference in the mid-grey sphere... it looks more accurate with sRGB, am I wrong?
-
That's a "Color checker" issue that needs fixing (adapting values manually, until some TLC from Maxon). With properly prepared maps it's more than fine (and preferred ;)
So here's the corrected LWF-Linear render, where you can only observe a slight difference in dark areas between the EXR (HDRI) and the JPG (LDRI) (but that's an old Corona issue ~ and another topic :P
-
Hmm hmm, fellow Resolve user here (I use it daily). I'm not really 100% sure what you're running into, but things do work as expected on my end if I do the following:
- Make sure the tonemapping stack is disabled (including ACES OT)
- Save a 32 bit EXR
- Drag it into Resolve (color science: I'm on DaVinci YRGB, but without color management, Rec.709 (Scene))
This allows for a fully linear workflow, i.e. you can then do advanced compositing with it. If you'd like to get the ACES OT look back directly in Resolve, go to the Color page and add an "ACES Transform" node there. You'll probably want sRGB (Linear) as input and sRGB as output.
Now if you'd like to use tonemapped images but still keep some extra data by going EXR 16/32-bit, then from what I recall you just export a tonemapped image, load it up in Resolve, and it should all work automatically.
Let me know if the above helps; if not, I'd be happy to dive deeper :)
Sorry to hijack this, but I'm using Max and Fusion. Regardless, my question is about saving out the linear EXR. You say to disable the tone-mapping stack, but if you're using simple exposure, would you not leave that switched on? Also, there was a thing a while back where, when using ACES OT, you had to add a stop of exposure to compensate for something (I can't remember what we were compensating for). If that's still the case, once we disable ACES OT, should we then remove a stop of exposure from Simple Exposure to compensate for ACES OT being disabled? Hope that makes sense.
-
Don't get confused: disabling the whole stack simply avoids any manipulation/contamination.
1 = 1
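That "1 = 1" point can be made concrete: in linear light, exposure is literally just a multiply (one stop = a factor of two), so nothing is lost by baking it in or re-applying it later. A minimal NumPy sketch (the function name is just illustrative):

```python
import numpy as np

def apply_exposure(linear_pixels, stops):
    """Exposure in linear light is a plain multiply: +1 stop doubles every value."""
    return np.asarray(linear_pixels, dtype=np.float64) * (2.0 ** stops)

px = np.array([0.1, 0.18, 0.9])
up = apply_exposure(px, +1.0)    # every value doubled
back = apply_exposure(up, -1.0)  # round-trips exactly back to the original
```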
-
Disclaimer: I'm no photography pro and I'm far from having a deep technical understanding of it, but...
Surely you want your exposure to remain in the stack at a bare minimum? Just like when taking a photo you plan to edit in raw: you still expose the photo first, and that's your starting point, right?
It's something that always confuses me, haha. Especially that ACES OT +1 stop of exposure thing.
I'd love someone to point me toward a tutorial where someone takes a linear render, tonemaps it in Fusion/Resolve (ideally Fusion), and gets it back looking similar to how it did in the frame buffer before disabling the stack.
The way I see it, if editing the linear, untonemapped image is akin to editing a raw photo, then the frame buffer is akin to the screen on the back of your camera: it's giving you a preview of the tonemapped raw render/photo. You then take that render/photo into your editor of choice (Fusion/Lightroom etc.) and tonemap it using the flexibility of the tools in that editor. But my point is: your raw photo still relies on the exposure you set when taking the photo, right? I appreciate that the idea of RAW is that you can alter whatever you want after the fact, but you still want as close to a correctly exposed starting point as possible, right?
What am I missing? I want to be educated... hit me with it.
-
If you export it as EXR, you can and should still use Exposure and White Balance.
I have my workflow like this:
- set correct Exposure and White Balance (you can still adjust them later in post, but it's better if you are already close to what you want, like in real life)
- for test renderings and IR, I use a LUT that I created to simulate the color corrections I will do later in post
- for the final rendering, I deactivate the LUT, so only Exposure and White Balance are active
- export it as EXR
- final editing in Affinity Photo or Fusion
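Both of those adjustments boil down to per-channel multiplies in linear light, which is why they survive an EXR export cleanly: the result is still scene-linear. A hypothetical NumPy sketch (function name and gain values are just for illustration):

```python
import numpy as np

def grade_linear(img, stops=0.0, wb_gains=(1.0, 1.0, 1.0)):
    """Exposure and white balance on a linear RGB buffer: both are multiplies."""
    img = np.asarray(img, dtype=np.float32)
    # 2^stops scales overall brightness; per-channel gains shift the white point
    return img * np.float32(2.0 ** stops) * np.asarray(wb_gains, dtype=np.float32)

frame = np.full((2, 2, 3), 0.18, dtype=np.float32)          # flat mid-grey frame
warmed = grade_linear(frame, stops=0.5, wb_gains=(1.05, 1.0, 0.95))
```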
-
If you export it as EXR, you can and should still use Exposure and White Balance.
I have my workflow like this:
- set correct Exposure and White Balance (you can still adjust them later in post, but it's better if you are already close to what you want, like in real life)
- for test renderings and IR, I use a LUT that I created to simulate the color corrections I will do later in post
- for the final rendering, I deactivate the LUT, so only Exposure and White Balance are active
- export it as EXR
- final editing in Affinity Photo or Fusion
Exactly as I've always assumed it would be done, although I don't use a LUT; I just use ACES OT and a couple of other things in the stack to counteract the parts of ACES OT that I don't like.
-
Going back to the original topic: I spoke with someone from Blackmagic's customer support, and he acknowledged the problem. He also noticed the difference in colors and contrast between the Color and Fusion pages in the Mac version.
-
Disclaimer: I'm no photography pro and I'm far from having a deep technical understanding of it, but...
Surely you want your exposure to remain in the stack at a bare minimum? Just like when taking a photo you plan to edit in raw: you still expose the photo first, and that's your starting point, right?
It's something that always confuses me, haha. Especially that ACES OT +1 stop of exposure thing.
I'd love someone to point me toward a tutorial where someone takes a linear render, tonemaps it in Fusion/Resolve (ideally Fusion), and gets it back looking similar to how it did in the frame buffer before disabling the stack.
The way I see it, if editing the linear, untonemapped image is akin to editing a raw photo, then the frame buffer is akin to the screen on the back of your camera: it's giving you a preview of the tonemapped raw render/photo. You then take that render/photo into your editor of choice (Fusion/Lightroom etc.) and tonemap it using the flexibility of the tools in that editor. But my point is: your raw photo still relies on the exposure you set when taking the photo, right? I appreciate that the idea of RAW is that you can alter whatever you want after the fact, but you still want as close to a correctly exposed starting point as possible, right?
What am I missing? I want to be educated... hit me with it.
Basics of HDR? (http://facweb.cs.depaul.edu/sgrais/HDR.htm)
There's really no 'consumer' digital camera with an HDRI sensor available. Last I checked, 14 bpc was the max. (?)
RAW is just a standardized industry image-capturing format, but still 'lossy' (minimally processed) in this regard.
To create an HDRI with a consumer camera, multiple differently exposed shots need to be combined into a single image.
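As a toy illustration of that merge: each bracketed shot is divided by its exposure time to estimate radiance, and the estimates are blended with weights that trust well-exposed pixels. A hypothetical NumPy sketch using a simple hat-shaped weighting (not any specific camera's response model):

```python
import numpy as np

def merge_hdr(shots, exposure_times):
    """Blend bracketed linear LDR shots into one radiance map."""
    shots = [np.asarray(s, dtype=np.float64) for s in shots]
    num = np.zeros_like(shots[0])
    den = np.zeros_like(shots[0])
    for img, t in zip(shots, exposure_times):
        w = 1.0 - np.abs(img - 0.5) * 2.0   # hat weight: trust mid-tones most
        w = np.clip(w, 1e-3, None)          # never fully discard a pixel
        num += w * (img / t)                # radiance estimate = value / time
        den += w
    return num / den

# Two frames of the same scene: 0.4 @ 1s and 0.8 @ 2s imply the same radiance
radiance = merge_hdr([np.full((2, 2), 0.4), np.full((2, 2), 0.8)], [1.0, 2.0])
```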
Virtually, the Corona engine processes its light-transport simulation in parallel universes, with either the Adobe RGB or ACES color space and a depth of 32 bpc. (Results highly depend on the 'creator's' input.)
So, to express an intent, display a 'vision', or just televise an idea, you must adapt all the gathered information/data considered influential to another being's experience, using a certain medium, to the best of your abilities (know thy tech/industry). Then, depending on capital/budget, the results are cut, clamped and filtered down to 8-12 bpc for 'big, lovely eyes' to attend to and consume :)
Here's where "Tone Mapping Operators" (https://docs.chaos.com/display/CRC4D/Tone+Mapping+Operators) (docs) come to help.
Personally I use many different ones, depending on the style I'm after and the mood I'm in. Keeping it linear as much as possible, until the very end, is my preferred way. As for a shits-and-giggles comparison, IMHO, human vision's dynamic range surpasses 160 bpc, depending on food, mood, age, skill, wear and tear... But note that ALL OF THIS HERE is just a discrete 'digital' measure, an excerpt from the 'vision of reality' in an infinite analog spectrum. Avoid distractions. Fix ADHD with OCD. :P
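For a concrete, minimal example of what a tone-mapping operator does: the classic global Reinhard curve x / (1 + x) compresses an unbounded linear range into [0, 1) instead of clipping. An illustrative NumPy sketch (not Corona's actual implementation):

```python
import numpy as np

def reinhard_tonemap(linear_rgb):
    """Global Reinhard operator: x / (1 + x). Maps [0, inf) into [0, 1)."""
    x = np.asarray(linear_rgb, dtype=np.float64)
    return x / (1.0 + x)

hdr = np.array([0.18, 1.0, 4.0, 100.0])
ldr = reinhard_tonemap(hdr)  # highlights roll off smoothly instead of clipping
```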
-
Going back to the original topic: I spoke with someone from Blackmagic's customer support, and he acknowledged the problem. He also noticed the difference in colors and contrast between the Color and Fusion pages in the Mac version.
Nice to know :)
Care to link the thread?