Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - Juraj

Pages: 1 2 [3] 4 5 ... 275
Hardware / Re: Buying a Monitor
« on: 2020-08-07, 00:26:39 »
HiDPI displays don't cause eyestrain unless you are using them without scaling :- ).

But that's exactly how I want to use the monitor :] Somehow scaling makes no sense to me at all.

But have you experienced it :- )? People have a poor opinion of scaling because it was implemented poorly in Windows from the start. It's very good right now, at least OS-wide, but still not the smooth experience you get on macOS across all software, since apps don't follow a unified approach.

I, for example, thought high-refresh-rate displays were overblown... until I bought a Razer laptop last week with a 144 Hz panel (so not even the trendy 240). I was converted... in five seconds. It was a 'wow' moment. And I wasn't playing any games... just moving the cursor around.
Same goes for integer-scaled HiDPI displays. I am absolutely in love with the screen of my Surface Pro: 12" at 3K, resulting in 270 PPI. When I used it next to my 140 PPI main display... it was just a sad moment. I want to scale that Surface display up to 32".

Ask any programmer what they think of HiDPI; they love it. I know guys running code on multiple 8K Dell UP3218Ks. It's a different, much more natural experience: the clarity is unreal, and the ease of reading text and vector graphics is almost comparable to a high-end print magazine.
Now, there are some odd, controversial arguments people will make about visual acuity calculations and how much you can perceive with 20/20 eyesight. But the pixel grid ceasing to be perceptible is not the limiting factor; you still perceive far more clarity up to absurd PPI, and it still affects anti-aliasing.
Similar to how digital sensors are not stopping at 100 MP, despite diffraction setting in (and grumpy folks complaining about marketing-driven spec races). You still get more accurate color information.

Regardless, this is where the industry is moving, although slowly and from multiple directions:

-) Near future (next 5 years): Mini-LED FALD with 500-1000 zones, 120 Hz refresh rate, 140-180 PPI, 1500:1 static contrast for desktop via IPS, 8000:1 static contrast via VA for TVs, 1,000,000:1 dynamic contrast for both, plus any OLED. Full coverage of DCI-P3.
-) Long future (next 10 years): Micro-LED as an alternative to OLED. 240+ Hz, 180+ PPI, static and dynamic contrast like OLED, much higher peak brightness, the best displays reaching Rec. 2020.

Hardware / Re: Buying a Monitor
« on: 2020-08-06, 21:40:26 »
Hmm Juraj, I don't want to argue with you, but people are different and their requirements are different too. 90 PPI might be tragic for you, but my current monitors have ~90 PPI and I know for sure that my next one will definitely have the same, or very similar, pixel density, because anything higher would cause too much strain on my poor eyesight. In fact I'm thinking about a 32" 2560px monitor, but not LG. The only concern for me is physical size; I'm not sure if 32" would be too big for me.

HiDPI displays don't cause eyestrain unless you are using them without scaling :- ).

On 32"/4K, my scaling is 150%. The 32"/6K Apple XDR uses 200% scaling, which, being an integer factor, is the sharpest, smoothest and most pleasant to look at.
My clients require 8K renderings; I need to see them comfortably during post-production, and not at a 1/8 zoom factor :- ).

Performance-wise, they do require more resources to drive, but not necessarily during rendering. Go high enough in PPI for integer scaling, and you can set even the rendering to use 2X pixel size and still retain crystal-sharp, smooth font rendering.
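The integer-factor point is plain pixel arithmetic: at 200% each UI point maps to an exact 2x2 block of physical pixels, while 150% forces fractional resampling. A minimal sketch (assuming the Apple XDR's published 6016x3384 panel; the helper name is mine):

```python
def logical_resolution(width_px: int, height_px: int, scale: float):
    """UI-space resolution of a panel under a given OS scaling factor."""
    return width_px / scale, height_px / scale

# 32"/4K at 150% -> 2560x1440 of UI space; each UI point spans
# 1.5 physical pixels, so the OS has to resample fractionally
print(logical_resolution(3840, 2160, 1.5))  # (2560.0, 1440.0)

# 32"/6K Apple XDR at 200% -> 3008x1692; an exact 2x2 pixel block
# per UI point, hence the integer-factor sharpness
print(logical_resolution(6016, 3384, 2.0))  # (3008.0, 1692.0)
```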

And now to your arguments, Sprayer:

VA being equal tech to IPS? Definitely not in monitors. We are not talking about Samsung QLED TVs with FALD, contrast ratios of 8000:1 and higher, and better viewing angles. We are talking monitors, and every single VA monitor is a budget monitor.
The fact that color calibration is far easier to achieve with IPS dictated the market's development: only IPS-like panels (so LG IPS, Sharp IGZO, Samsung PLS, AU Optronics AHVA) are developed to a high standard of quality.

A common VA monitor barely reaches a contrast ratio of 2000:1, a far cry from where TV VA panels are. The viewing angles are abysmal; they are not ideal in high-end TVs either, but here they are barely tolerable. I have a 32" 500 Euro Samsung VA monitor in the living room because it is only ever seen as pretty decoration, and... it's just bad.
Brightness has nothing to do with panel type, and most VA monitors on the market don't have stronger backlighting than the IPS offerings. This is a made-up point. The brightest HDR monitors on the market, the ProArt PA32UCX and the Apple XDR, both with 1600 nits peak brightness, are both IPS. And for good reasons.
High-end IPS monitor panels reach 1300:1 contrast in bigger sizes (27"+) or even 1500:1 in laptops with glass covers (like the latest Dell XPS 16:10 panels with 500 nits and 100% DCI-P3 coverage). The advantage VA has here... is minimal, almost imperceptible.
I am not saying they are bad; they still have their audience, but it's not professional users. And that is why you won't find one even marketed as such.

The argument about buying from the panel maker: in theory it makes sense, but in practice it only applies in a few cases. LG IPS is one such case, but Dell, BenQ and Apple have historically offered better build quality, factory calibration, etc.

Highly agree with . Another great source is RTINGS, but both have pretty limited sample sizes.

Hardware / Re: Buying a Monitor
« on: 2020-08-04, 14:35:07 »
That is an absolutely terrible choice, no offense to anyone :- ).

The LG 32GK850F is 32" with only 2560px, which is a tragically low PPI of around 90 (for comparison, the popular Apple "Retina" is 220, but 140 is the golden standard you get with 32"/4K). It looks super pixelated, definitely not worthy of work of any kind.
It's also a low-end VA panel with terrible off-axis contrast and color loss.
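For reference, the PPI numbers thrown around here are just the diagonal pixel count divided by the diagonal size in inches. A quick sketch (the function name is mine):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 32)))  # 92  -- the "tragically low" ~90 class
print(round(ppi(3840, 2160, 32)))  # 138 -- the ~140 golden standard
print(round(ppi(5120, 2880, 27)))  # 218 -- Apple "Retina" territory
```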

There are gaming monitors which are also perfect for work, like the few upcoming LG 27" IPS 4K 120-240 Hz models, which I think will be available in Q4 2020, so a few months still to go. Those will be superb for gaming, HDR content and work as well. But a low-res VA panel, or even a high-res VA panel, is a total no-go, completely hands off :- ).

Don't get 27" QHD in IPS either; go straight to 27"/4K. It's absolutely worth it, it 100% looks incredibly better. I use 27"/4K as my travel monitor, and it's so much sharper than my desktop 32"/4K.
500 Euro is plenty to find some outliers or discounts. It doesn't really matter much if you get LG, Dell, BenQ or even AOC, etc.; they use the same LG or AU Optronics panels for the IPS.

Oooh OK, that makes total sense now :- ). That is a really bad setup haha, but I also thought of doing it that way at one point :- ).

Yes, a big plus for Synology is ease of use, as the whole system is single-purpose (though I found the venerated ecosystem of apps to be user-friendly but slow across every performance metric; I used to have one, and real-time back-up sync in particular was just ultra-slow).

I originally built a normal file server when I realized a bigger NAS with an add-in 10GbE card was a 1K investment, and was like LOL, I can just repurpose my old PC for 10 times the performance.
The best file server would be the 12-core Threadripper Pro with 128 PCIe lanes; that's just crazy. Too bad it's OEM only, and who knows when it comes to market in the fall.

The latest Windows Server edition is based on Windows 10, so it's very easy to use and ultra-stable. But drivers on it are such a hassle that I would still install Enterprise and just reboot it every 3 months for the required update installs.

The downside there is that if that machine reboots, the others lose the path.

I am lost here :- ). Why do they lose paths? A reboot takes a few minutes and everything is back; no different from a NAS. And you don't need to reboot either one if you get the server edition of Windows. That one lets you run 365 days a year; it installs updates without restarts. But you won't find drivers on it for a lot of mainstream hardware.

There is no difference in how VPN will work either.

They are fundamentally identical solutions; the NAS is just a Linux mini-PC with 3rd-party software preinstalled.

I've recently built a file server with an eBay 24-core Xeon Gold (true retail for a few bucks!), an Asus WS Sage LGA3647 board with 2x 10GBit, plenty of PCIe lanes for a bifurcated M.2 add-in card, straight add-in PCIe SSDs, 10 SATA SSDs... for a grand total (without the cost of drives!) of 1500 +/- Euros. I can still connect a billion HDDs to it if I want, with simple enclosures.
A NAS is still superior for people who want very easy and convenient hot-swap and RAID setups. But otherwise it's still more of a back-up than a file server.

The moment you want to use PCIe drives in a NAS (how many models even offer such an option?), the performance just isn't there to drive them.

[Max] Feature Requests / Re: VFB shadows/whites/blacks
« on: 2020-08-03, 23:01:05 »
I believe they are planning some rework here... (and hopefully it's a copy of the latest Vray).

But here is a quick and painlessly easy-to-implement semi-solution that doesn't affect the current VFB in any way at all.
Let's just add a parametric mode to the existing point-mode Curve, like in Adobe ACR. The parametric sliders in ACR are basically doing the same thing as the parametric mode of the tone curve.

Yeah, but the saturation... let's just kill that one and bring in proper Vibrance (or do the Vray thing and give everyone everything).


Hey Juraj, thanks a lot for the help.
I will order it in the next few days. Really curious to try IR.

Please tell me your experience then, because I am personally somewhat disappointed in IR.

This is not Corona's fault, I believe, but even if I dedicate only 50% of threads to IR, the rest of Max is still 4 times slower to react during that time, because the I/O is saturated nonetheless.
I observe the same behavior with any other tool, but with Max it's most severe.

I tried very extensive Process Lasso exercises to cheat around Windows scheduling, but I don't know if there is anything that can really be done here.
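For what it's worth, the core pinning Process Lasso does via the Windows affinity API can be scripted too. A minimal sketch in Python, reserving half of the logical cores for the current process (this uses the POSIX call from the standard library; on Windows you would use psutil's Process.cpu_affinity, or Process Lasso itself). As noted, though, it can't help once memory I/O is saturated:

```python
import os

# Pin the current process to the first half of its allowed logical
# cores, leaving the other half free for the renderer.
all_cpus = sorted(os.sched_getaffinity(0))     # cores we may run on
half = all_cpus[: max(1, len(all_cpus) // 2)]  # keep at least one
os.sched_setaffinity(0, half)

print(sorted(os.sched_getaffinity(0)) == half)  # True
```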

From a performance point of view, it's a massive improvement. It's a beast. But it still doesn't improve multi-tasking in any way; for me IR is still painful. It's very different from GPU engines, where the GPU and CPU don't share any I/O bandwidth during IR (you do need another GPU just for the viewport, heh, but that is easy with GPUs since they stack; with a CPU, you've got one CPU, and its I/O has to deal with everything at the same time, and there are only so many resources...).

I might be missing something, but it seems like the 3995WX is a wash for Corona users? Same clock speed/core count, and since the latest gen isn't super restricted by RAM speed, the gains of 8-channel memory won't be significant. Is that about right? Just considering my upgrade options over the remainder of the year. Thanks!

Are you able to source the motherboard plus the chip? Because it's strictly OEM, with Lenovo first, and then other integrators.

I think I've already seen benchmarks (by AMD itself, or Lenovo?) where there wasn't any improvement in either Corona or Vray.

BUT... 8 channels are 8 channels. Recently one of my RAM sticks broke, and for a moment, after taking it out, I ran benchmarks in single-channel mode. Corona absolutely tanked... while Cinebench and many other benchmarks showed no difference.
Likewise, I currently own a few very high-end competitor machines, and their higher bandwidth doesn't seem to improve Corona either.

But it's important to note that all these systems are running at JEDEC spec, which would be 3200 CL24. The Threadripper Pro didn't seem to rule out unbuffered memory, at least at 256GB capacity (since that is where unbuffered ends, due to the 32GB maximum per DIMM), so you would lose the capacity benefit of Threadripper Pro but still gain the bandwidth.
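The "8 channels are 8 channels" point is easy to put in numbers: theoretical peak bandwidth is channels x transfer rate x the 8-byte DDR4 bus width. A back-of-the-envelope sketch (real-world throughput will be lower; the helper name is mine):

```python
def ddr4_peak_gbs(channels: int, mt_per_s: int) -> float:
    """Theoretical peak bandwidth in GB/s: channels x MT/s x 8-byte bus."""
    return channels * mt_per_s * 8 / 1000

print(ddr4_peak_gbs(1, 3200))  # 25.6  -- single-channel degraded mode
print(ddr4_peak_gbs(4, 3200))  # 102.4 -- quad-channel TRX40
print(ddr4_peak_gbs(8, 3200))  # 204.8 -- 8-channel Threadripper Pro / Epyc
```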

All in all, I am a bit salty that something like this happened at all. AMD didn't want to cannibalize Epyc (which already has the single-socket, high-TDP P-series), until the Mac Pro came along and system integrators wanted an equivalent; suddenly it makes sense business-wise.


Hey, so:

- Noctua is the right choice, always. Why it absolutely beats BeQuiet and other TR4 coolers is the heatpipe orientation: Noctua runs the heatpipes along the length of the chip, while everyone else runs them across the width.
  Because of this, only Noctua actually covers all the dies on the 3990X with heatpipes. All TR4-dedicated coolers cover the heatspreader, but direct access to the heatpipes is what really makes the difference. KitGuru reviewed this, and the difference was bigger than 10°C on the 3990X.
On the 3960X/3970X the heatpipe layout is OK, but since those run hotter it's still a bad idea. Noctua is always the correct answer. Always.

Regarding memory, I recently upgraded to 32GB 3600 CL18 DIMMs from Patriot; they use the same Hynix CJR dies as G.Skill & Corsair, but were so much cheaper :- ). I paid 280 Euro for each 64GB (2x32GB) kit. And it works super well on my Aorus Xtreme TRX40, super stable at XMP.
If you want a higher-end kit, then Crucial Ballistix with Micron Rev. E dies offer 32GB DIMMs at 3600/CL16. These are even better than Hynix CJR. (There is no Samsung B-die with 32GB DIMM capacity for consumers.)

I cannot say anything about HyperX Fury, no idea what it is, but since it's binned at 3200/CL16 I would say it's not the greatest quality. The speed itself isn't very important for 3rd-gen Threadrippers, but higher-end memory kits have a higher chance of running stable in 128-256GB configurations at higher speeds. For those who want 256GB at 3600 CL16-18, you need the best kits.

Good choice of SSDs, can't go wrong with those: PCIe 3.0, but a good controller, a DRAM cache, and TLC NAND with an SLC cache.

GPU, sure, any choice you like.

Hardware / Re: Monitor Recommendations
« on: 2020-07-15, 01:51:15 »
There is not a single professional or prosumer monitor with more than 60 Hz except one upcoming unit from ASUS (the 32" FALD IPS with 1120 zones and a 4000 Euro price tag). Every single other one is 60 Hz.

If you find a Samsung that's comparatively cheap, it will probably use a VA panel, which is easier to drive fast (although with smeared blacks) and has much better contrast but generally poor viewing angles (the angles are very good on VA TVs... but the PC panels are of inferior quality). And because these are always marketed towards gaming or general media consumption, they never come with good factory calibration even if they have good gamut coverage. Since they won't have hardware calibration either, it's best to avoid them for color-critical work.

Regarding the reviews... that depends. Some of their top models have very good reviews on sites like RTINGS and TFTCentral. If you look at Amazon reviews, that's just disgruntled people who ended up with a faulty unit; no one's QC is perfect.
With that said... people don't tolerate the quirks of expensive monitors well. A 1000 Euro IPS monitor is still an IPS monitor; it will have backlight bleed and poor blacks, that's just the reality of the technology. Even the 6000 Euro Apple XDR has a static contrast of 1000:1 (when not accounting for the dynamic contrast of FALD, which isn't used outside of HDR content) and backlight bleed. And 60 Hz ;- ).

If you want more than 60 Hz, you have to make peace with the fact that you are not buying a monitor suited for color-critical work. That's how market segmentation ended up: it's either 4K/60 Hz/good calibration or <3K/144-240 Hz/random calibration. Nothing in between, sadly (except that one ultra-expensive Asus which isn't on the market yet).

Eh, it's an OEM deal by Lenovo. Looks like it won't be possible to buy it for DIY : /

So it's like a Mac Pro, except not very pretty. But with powerful hardware.

Interesting things:
- Only 4 single-slot or 2 dual-slot GPUs? Well, effectively 3 dual-slot, but this is how they market it themselves. So how am I going to use those 128 PCIe lanes with 2 GPUs and a single AIC? That would be 48; add NICs and a few M.2s, and we're still comfortably in normal TR4 territory.
- Memory is cooled :- ). I think this will become a staple for all upcoming workstations, not just those with super-hot RDIMMs. In fact, everyone should probably bolt a tiny 4cm fan onto each quad-stack. Looks ugly... but works well.

- They market it with a 27" 2560px monitor ;- )? I would rather have the 6K one from Apple..

But please send me one, I will shill for free.

But is only the "blending" the culprit? Or is the whole UVWRandomizer node (with the features of the stock version 5 build, not the daily) generally taxing as well?

OK, so who is taking a guess at what the 3995X will bring with it? Wild rumor mill once again :- ).

To the devs:

A finding, using the latest daily: I noticed one of my current scenes (a simple bathroom scene) took 18 hours to complete (a 5K render on a Ryzen 3970)... it baffled me why it took so long when other renders of this size take between 2-4 hours. I then started to diagnose the issue and narrowed it down to the UVWRandomizer. High-quality blending takes the longest; with it unchecked it's a little better, but still longer than without using the UVWRandomizer altogether.
When I removed the UVWRandomizer, the scene rendered in just under 4 hours. I realise your tooltip says it takes longer to compute (double in the worst-case scenario), however in my case it was 4x longer.

Just a heads up. One more feature I will not touch then, good warning. Is this the case only when the UVWRandomizer does the blending to avoid tiling, or also with the version 5 basic features like simple offset/scale?

Super craftsmanship :- ). It fills me with anxiety just thinking about eventual maintenance.

10/10 for choice of tubing and fittings!
