Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - Juraj

Pages: [1] 2 3 ... 281
[Max] General Discussion / Re: Tonemapping - Plz Halp
« on: 2021-01-21, 21:59:46 »
Great comparison. It would be nice to see it also with Filmic from the latest Vray 5 VFB2.

If you have a really detailed, high-resolution normal map, it will create a certain level of natural anisotropy by itself.

Anisotropy is the crucial factor in the look of wood; it can't be simulated with reflection or glossiness maps (they can help fake it, though, so it's best to use everything).

Dual 3090s enable NVLink memory pooling up to 48GB, although at far lower bandwidth than Quadros and A100s, so I'm not sure whether there are speed bottlenecks.

Triple and quad 3090 setups are back to 24GB though; only a single pair of cards can be linked for all cards that are not A100s (which cost 10 times as much as GeForce and 3 times as much as Quadros). At least in Windows.

48GB is a lot, since GPU engines have really good compression methods and a lot of them offer out-of-core rendering for textures.

Hardware / Re: Upgrading RAM?
« on: 2021-01-21, 15:00:06 »
You have a decently powerful laptop CPU; I have the same one in my Razer Blade.

64GB of memory is surprisingly affordable for it; I think I paid less than 220 Euro in total for two 32GB modules (HyperX, I think).

Make sure you buy 2667 memory, since most laptops can't run XMP profiles, so they will run the JEDEC profile that is the default for the kit.
(What I mean is, there is a risk that if you buy faster memory, 2933 or 3200, it will actually run slower, because it could default to 2133 or 2400.)

Tom and Maru said everything: more memory doesn't make a laptop faster, but it helps you run more tasks and bigger scenes. 64GB is a good minimum today, even for a laptop.

If I am buying brand new, I mostly buy from local e-shops like Alza. If it's unavailable, I go to Amazon, and if I need something that's not crucial to have brand new (like a PSU or motherboard) I look it up on eBay :- ). I am quite a bargain hunter...

The Zenith II is worth it only for overclockers, as all other boards have more than enough VRM capacity anyway. The Zenith II does have better memory support for those who want to run extreme configs like 256GB at 3600/CL16; something like that is not quite stable on other boards.

As for your memory, you do need 4 sticks, since it is a quad-channel platform and it will not run at full performance without them.

Corsair is tricky because each of their models (not just model ranges) is made by someone else. The AX series, for example, is made by Seasonic, except for the super-high-end AX1600i, which uses a digital platform from Flextronics.
So there is no particular benefit to Corsair; they are simply a popular brand. A Corsair PSU can be super shit or super high quality, each model fully unrelated to the next.

While I use the AX1600i for my two latest workstations (it's the best PSU on the market), I use the Seasonic Prime series for everything else (Seasonic doesn't have a 1600W unit).

My currently suggested memory kits for the Zen architecture are Crucial Ballistix (Micron Rev. E), Patriot Viper (Hynix CJR) and G.Skill, all listed as 3600 CL18 32GB DIMMs.

As fantastic as the CX range from LG is, there are some specifics to its PC use that pertain to accuracy, mainly sustained 100% white brightness, where the TV dims considerably. It's best to treat it like a monitor: run it in SDR mode at 120 to 150 nits when not used for gaming & cinema.

It also doesn't feature good factory calibration for PC mode and doesn't have true hardware calibration (through a 3D 14-bit LUT).
So to have truly system-wide calibration, it has to be calibrated through manual settings in the Dark Mode preset.

(Using an ICC profile in Windows is not considered 'system-wide' because it's ignored by non-managed applications, which is almost everything except Photoshop, raw converters, etc. 3dsMax and Corona are not color-managed, so they ignore ICC profile calibration and show colors stretched to whatever gamut the TV currently has selected.)

I would personally not buy a CX as a main display for color-critical work, as beautiful as it is (and it is stunningly beautiful; I would definitely buy one for my living room if I didn't consider it a waste of money).

Now, the LG OLED 32" UltraFine... that will be something else. It sadly features an OLED panel from JOLED, not LG, so it will probably be very expensive (4-6k Euro).

Lastly, IPS panels with hardware calibration have no saturation issues that would be inherent to the technology. Any mid- to high-end IPS monitor with a hardware calibration option (or a good factory sRGB preset) shows saturation without any issue.

Still... long live OLED. Hopefully we get as many options to buy as possible! There is nothing quite like it.

Unfortunately I don't have much time to answer personally in private; I have a baby and a husky jumping around me all day while I try to run the business as best I can.
I just come here every few days to check and write something :- ). It's best to ask a general question on the forum, so everyone can chime in and read the answers.

If it's something specific, you can reach me via the 'info' email on my website. But I don't guarantee a timely answer.

But if your question concerns CPU vs GPU and CPU vs GPU renderers, I honestly have no opinion on that currently. I mainly stay with Corona because of its stability in production and the large team behind its development. I currently consider it the best engine for commercial work in Arch Viz.

Out of interest I tried Vray 5 GPU and... it was just weird. It didn't even seem to support most features, and how long has it been in development alongside regular Vray? It's like it's always behind. F-Storm I sadly never even tried; I just don't have time to experiment at all. Because I have a considerable render farm, I am not very limited by speed currently.

I plan to buy a 3090 when the FE edition becomes available in Europe again, mainly for Unreal Engine and such.

If Corona ever adopts GPU, that would be fantastic! I would support that in a heartbeat :- ). But I personally don't plan to switch away from Corona; the engine itself is the least of the issues in commercial production.

That was kind of expected though, no :- )? The RTX 3090 is a killer deal.

So Johannes bought a 3990X and triple 3090s? Nice.

Here is how bad the low contrast of desktop monitors is (this applies to all desktop monitors that are not OLED, so all of them; low-end ones are worse by a lot, but the effect is there on the ultra high-end as well). TN, VA, IPS: all technologies have it, but IPS has it most pronounced, despite being the highest-quality type of non-OLED panel used for monitors.

This is how your monitor looks in a dark setup; you might not perceive it as much until you photograph it. That's not white... that's a completely black display, 0/0/0 RGB. The darker the room and the brighter your display is set, the worse this becomes. This is why you're seeing completely incorrect brightness and contrast, even though in the moment your brain is telling you otherwise. This monitor has a contrast of 1000:1; it's a Dell monitor with an LG IPS panel.

And here is my setup in the afternoon; I believe I had the ceiling lights turned off to highlight the effect of the bias lighting (Philips Hue). Of course, the dramatic tonemapping from a cell-phone camera makes it look a lot more drastic than reality :- ). But the important factor is... look how black the black on my monitor is. And that's also your standard 1000:1 IPS panel (an IPS panel from AU Optronics? It's a BenQ monitor)! There are also frontal bias lights (reading lamps you can mount on the monitor; BenQ makes one, but I haven't been able to buy it yet) that are excellent for working at night.

Damn, Maybejensen wrote the same in 3 sentences..

(Disclaimer before my terrible wall of text below... sorry everyone. Anyway, I check my images on my phone (AMOLED) and Surface tablet (high-end glossy IPS) and they look identical to my desktop monitor, just nicer, crisper. But effectively they look the same, which I take as a great comfort.)

I am glad when my clients view my renders on phones and tablets, because nowadays they feature absolutely stunning displays compared to even the most high-end monitors. There are multiple reasons for that:

1) At worst, they feature high-quality (2000:1 contrast) IPS panels with a glossy screen (older iPhones).
    At best, they feature stunningly high-quality (a million, or rather... infinity:1 contrast) OLED panels with a glossy screen (every high-end Android and the latest iPhones).

2) High brightness (they effectively go to a sustained 800 nits... that is bonkers) compared to the low brightness of desktop monitors (sustained brightness maxes out at 250-300 nits, but we usually use them around the sRGB standard of 125).

3) Crisp detail of up to 600 PPI, compared to an average of 140 PPI for a 4K 32" display. Phones can look twice as good as a high-end glossy magazine.

4) Contrary to popular opinion, they often feature perfect calibration... but by default show an incorrect color gamut (often called "Vivid" mode). Still, all popular brands (iPhone, Samsung, Huawei, etc.) feature a standard sRGB mode for natural colors in unmanaged applications/environments.
But even if your clients use a wide gamut (whether managed, like the DCI-P3 mode on iPhones, or just something random like Vivid on Samsung/Huawei), they will just see your image as a punchier, over-saturated version. It will not look uglier. Quite the contrary... most people enjoy that carnival look, which is why it's the default. "Nice colors!!"

So with that said... we can conclude that in most situations, phones actually show your work more accurately. Often... a lot more accurately. In a small number of cases, the client is at fault with a wrong device setup.

Now we can use a bit of stupid pseudo-reverse-engineering to try to pinpoint where the difference comes from in a typical situation :-)

A) Phones show the image too dark ---> While some phones feature drastic power saving and default to low brightness in auto-brightness mode, which can look quite dark outdoors, I presume these clients aren't total simpletons and view the images inside, where even drastic power saving still means above 200 nits. Which would still be brighter than how the common desktop is set up.

So rather, most people use too high a brightness or too dark an environment. Worse, many people use both (a bright display... and working in a pitch-black room at night), which is the complete opposite of how people consume visual content on cellphones (during daylight, in a bright room).

Solution: Set your display to 125 nits; for most displays, that is 40-50% brightness. Work in a moderately lit room. Obviously not in direct sunlight... but the bullshit with the dark room has to end. That only makes sense for IMAX cinema post-production, where the content will be consumed in dark theatres.

B) Phones show the image too contrasty ----> Unfortunately, phones show contrast much more accurately than even the best monitors on the market. Since most phones have OLED screens, their contrast is absolute: the blacks are true blacks, the whites are nicely bright. That cannot be said for desktop monitors. We have a few types of panels for desktops:

- TN - Only used for cheap gaming and office monitors (sub-250 Euro). The static contrast is often as low as 500:1.
- IPS - Used for all professional displays; static contrast ranges from 700:1 (cheaper sub-700-Euro displays) through 1000:1 (most common) to 1300:1 for the most expensive, high-tech screens (costing a few thousand).
- VA - Used for most gaming displays and all curved screens. While VA can have contrast as high as 8000:1 when used in TVs, desktop-monitor VAs only have a contrast of 1500:1 to 2000:1. So they are barely above IPS, not enough to offset their absolutely abysmal viewing angles.
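To put those contrast ratios in perspective, here is a quick sketch (my own illustration, not from any spec sheet) of the black level each panel type actually produces at the 125-nit white point recommended above:

```python
# Black level = white luminance / static contrast ratio.
# At the same white point, the panel's contrast ratio directly
# sets how bright "black" really is.
def black_level(white_nits: float, contrast_ratio: float) -> float:
    """Luminance (in nits) of a full-black screen at a given white level."""
    return white_nits / contrast_ratio

WHITE = 125.0  # the sRGB-ish working brightness suggested above

for panel, ratio in [("TN", 500), ("IPS", 1000), ("VA (monitor)", 2000),
                     ("VA (TV)", 8000), ("OLED", float("inf"))]:
    print(f"{panel:>13}: black = {black_level(WHITE, ratio):.4f} nits")
```

A 1000:1 IPS panel at 125 nits thus shows "black" at 0.125 nits, which in a dark room is clearly visible glow, while OLED sits at a true zero.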

One LG OLED monitor is coming to market this year, and there will be FALD and mini-LED IPS displays (they feature an array of LEDs to increase dynamic contrast, mostly during HDR playback; the static contrast is still only 1000:1, and these displays will cost 3 to 5k Euro. Very expensive. The Apple Pro Display XDR also belongs in this group, with its 576 LED zones behind its 32" 6K panel).

Solution:  - Use correct brightness (125 nits, or 40-50%) and correct contrast (the default for the sRGB mode your monitor should have).
               - Use moderate ambient lighting (daylight). Don't work in a dark room. If you do, set your monitor to as low a brightness as you can... but really, don't. Don't imitate cinema post-production & color grading; it's a different beast.
               - Use bias lighting to increase the perceived contrast of your desktop monitor. Because the contrast on most desktop monitors is too low, it's best when you can perceive it as high as possible, and bias lighting (something behind your display) does this really well. It's also much easier on and healthier for your eyes.

I've only touched on the main hardware reasons, because color calibration and compression, quite frankly, don't turn a great image into a terrible one. They just make it slightly wrong in temperature, saturation and clarity. But if the image appears dark and contrasty, Facebook's compression algorithm is not at fault.

I don't even think 128 cores are in for Epyc Milan :- ). There just isn't any space on the chip with the current node and architecture design. They would either have to jump a node to 5nm or feature some 3D stacking.
There will be fantastic IPC gains of course, and perhaps slight frequency gains.

(Speculation time!) I believe two things happened at same time:

1) The Lenovo deal has been a failure so far. I think we can clearly deduce this from the fact that there are almost 50% discounts on the website... taking the usually overpriced corporate builds from 20k+ down to an absurd 10k price.
So now we have plenty of 3rd-gen Threadrippers in "Pro" variants that need to go somewhere. If 4th-gen (5xxx) TR chips arrived now, those chips would be dead weight. Corporations don't buy obsolete hardware, even if it's only a single generation behind.

2) With TSMC's foundries overcrowded to hell, there is little reason to cannibalize the small number of Epyc (Milan) chips that can be manufactured for select clients by diluting it with DIY Threadrippers and OEM integrators.

Perhaps if things go well, 5xxx Threadrippers might still be announced this year, but perhaps in summer or later. Why even have a halo product when you are winning so much?

Oh c'mon, someone is still running a 4-core CPU here.

Sometime around a year ago, when I bought my 3990X, I finally reached a point (together with our render farm) where rendering power is no longer limiting me... not even a tiny bit (well, we don't do animations though...), and I realized further how slow I am.
I am now rendering even previews in 8K... (and downscaling before showing them, of course, but I can set up my post-production layouts in final mask dimensions, get very clean AA, catch mistakes earlier and have at least some semblance of a reason to procrastinate)

I would like to test this, but I am talking purely about bitmap color profiles; of course 3dsMax ignores any ICC profiles, so the colors look wrong unless the monitor is clamped through an OSD profile LUT.

I tested this (it was 6 years back) on a saturated red color, one that could have been outside of sRGB, but I sadly don't remember (and that makes a lot of difference for this test)...
My setup was the following:

My wide-gamut display at the time (a Dell 3011, I believe) was set to wide-gamut AdobeRGB in the monitor OSD, and Windows was set to the corresponding AdobeRGB profile as well.
Thus, everything color-managed in Windows, like Photoshop, displayed wide-gamut colors correctly, while 3dsMax oversaturated everything, because its native sRGB values were stretched into AdobeRGB as interpreted by the monitor.
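For illustration, the "stretching" can be put into numbers. This is my own sketch (using the standard published sRGB and AdobeRGB 1998 primaries; the matrix values are the usual rounded ones): a properly managed pure sRGB red lands around 219 on the red channel in AdobeRGB, so an unmanaged app that passes the raw 255 straight to a wide-gamut panel overshoots, i.e. oversaturates.

```python
import numpy as np

# Standard D65 RGB -> XYZ matrices (IEC 61966-2-1 sRGB, Adobe RGB 1998).
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
ADOBE_TO_XYZ = np.array([[0.5767, 0.1856, 0.1882],
                         [0.2974, 0.6274, 0.0753],
                         [0.0270, 0.0707, 0.9911]])

def srgb_decode(rgb8):
    """8-bit sRGB -> linear light (piecewise sRGB transfer function)."""
    v = np.asarray(rgb8, dtype=float) / 255.0
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def adobe_encode(lin):
    """Linear light -> 8-bit AdobeRGB (gamma 563/256, i.e. ~2.2)."""
    lin = np.clip(lin, 0.0, 1.0)
    return np.round(255.0 * lin ** (256.0 / 563.0))

def srgb_to_adobe(rgb8):
    """Correct color-managed conversion of an sRGB 8-bit color to AdobeRGB."""
    xyz = SRGB_TO_XYZ @ srgb_decode(rgb8)
    return adobe_encode(np.linalg.inv(ADOBE_TO_XYZ) @ xyz)

# Managed: pure sRGB red drops to roughly 219 in AdobeRGB coordinates.
# An unmanaged app (like 3dsMax) keeps sending 255, which the wide-gamut
# panel then shows as a much deeper, oversaturated AdobeRGB red.
print(srgb_to_adobe([255, 0, 0]))
```

The same math run in reverse is why neutral grays survive (the primaries share the D65 white point), while saturated colors are the ones that visibly shift.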

I authored the red-color bitmap as TIFFs with sRGB and AdobeRGB color profiles (two textures). I loaded them into 3dsMax and they displayed differently.

To test this thoroughly, I would need to test it with 4 sets of textures like this:

a) sRGB color bitmap with sRGB color profile
b) sRGB color bitmap with AdobeRGB (or DCI-P3 nowadays) color profile
c) Wide-gamut color bitmap with sRGB color profile
d) Wide-gamut color bitmap with AdobeRGB (or DCI-P3) color profile.
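Authoring such tagged test textures can be scripted. Below is a sketch using Pillow (my own illustration; Pillow only ships a built-in sRGB profile builder, so the AdobeRGB .icc path is a hypothetical file you would have to supply yourself):

```python
# Sketch of authoring the test textures (a-d) above with Pillow.
# The AdobeRGB .icc path is hypothetical -- Pillow has no built-in
# wide-gamut profile, so it must come from a file on disk.
import os
from PIL import Image, ImageCms

def make_texture(path, rgb, icc_bytes):
    """Save an 8x8 solid-color TIFF with the given ICC profile embedded."""
    Image.new("RGB", (8, 8), rgb).save(path, icc_profile=icc_bytes)

srgb_icc = ImageCms.ImageCmsProfile(ImageCms.createProfile("sRGB")).tobytes()

# a) sRGB red values tagged as sRGB
make_texture("red_srgb.tif", (255, 0, 0), srgb_icc)

# b) the same pixel values tagged AdobeRGB -- only if the profile exists
ADOBE_ICC = "AdobeRGB1998.icc"  # hypothetical path, not shipped with Pillow
if os.path.exists(ADOBE_ICC):
    with open(ADOBE_ICC, "rb") as f:
        make_texture("red_adobe.tif", (255, 0, 0), f.read())

# Sanity check: the tag survives a save/load round trip.
assert Image.open("red_srgb.tif").info.get("icc_profile") is not None
```

Loading each TIFF into the 3dsMax bitmap loader and comparing against a managed viewer like Photoshop would then reveal whether the embedded profile is honored, stripped, or silently replaced by sRGB.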

And of course, test in Vray with ICC correction in the framebuffer. Since I want to discount how colors are displayed, I just want to know exactly what kind of interpretation the 3dsMax bitmap loader does.

Because I don't believe it ignores/strips them, but rather assigns sRGB automatically, making wide-gamut input impossible without nodes like VrayICC for wide-gamut bitmaps. Which is then of course stretched when displayed, because it doesn't respect any ICC correction.

Max is kind of weird with color profiles... I am not sure if it ignores them completely, at least with regard to textures. With a bitmap having no color profile attached, it acts as sRGB. But I once loaded an AdobeRGB texture and it did appear more saturated (visible only when the monitor is set to wide gamut, since 3dsMax does ignore system-wide ICC profiles).

I am currently not using a wide-gamut monitor, otherwise I would do a comparison with Vray's ICC node to see what 3dsMax actually does by default.

