Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - Juraj

Pages: [1] 2 3 ... 275
1
How do you feel about VrayGPU compared to Corona? I briefly tried it... didn't know how to set it up properly, and the result looked visually weird to me (the converter also did an ultra shitty job).
But it did feel quite fast... and I loved the interactive in-viewport manipulation (though the selection was somewhat offset..?)

At this point Corona is so straightforward that Vray feels very weird to me, even though I love its features (crazy good framebuffer, for example).

But yea, absolutely bring on CoronaGPU :- ). It's time for heresy!

(Although, one aspect I don't welcome :- D The damn heat... my 3990X is a mini-sauna, and that's just 280W (often only 270W). The RTX 3090 has a continuous draw of 380W lol. Imagine a 3990X + NV-Linked 2x 3090 = 1040W of heat alone! I guess I would have to build a water loop and put the radiator outside the window, because that is half of my air-conditioning power.)
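The back-of-napkin heat math above can be sketched out like this; the draw figures are the rough numbers from the post (not measured values), and the component names are just labels:

```python
# Rough heat output of the hypothetical 3990X + 2x RTX 3090 build above.
# Power figures are the approximate draws quoted in the post.
components = {
    "3990X (CPU)": 280,    # often ~270 W under render load
    "RTX 3090 #1": 380,    # continuous draw
    "RTX 3090 #2": 380,    # NV-Linked second card
}

total_watts = sum(components.values())
print(f"Total heat dumped into the room: ~{total_watts} W")  # ~1040 W
```

Nearly all of that power ends up as heat in the room, which is why it compares directly against air-conditioning capacity.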

2
Gallery / Re: .green smoke
« on: Yesterday at 13:45:46 »
Beautiful work again, I absolutely love the bathroom shot, amazing atmosphere.

I love that you post crazy good work, without any unnecessary double-page A4 bullshit essay.
And most importantly, looks like you're having a lot of fun making it.

I wish I had the eye and energy for this kind of work. Really feels inspiring.

3
Would be amazing to see this built into the IR (docked and floating windows) as a series of buttons along the top perhaps...

Yeah, this would not be a bad idea if done correctly. No idea how such a UI would look, but I thought the same.

4
I did, 2 or 3 years ago, with a scene that had a lot of 3D people, and particularly hair(cuts) on those people :- ). Also, plenty of scenes with displacement.

But Corona has since done so many memory-usage optimizations (plus 2.5D displacement) that I'm no longer even coming close.

5
In my opinion, nVidia really tried their absolute best this time. They couldn't affordably (without the more expensive 7nm node from TSMC instead of Samsung's 8nm) have created anything faster than the 3090. The small margin (<15-20%) between the 3080 & 3090, the fact that the GDDR6X memory is clocked at 19 instead of 21 Gbit/s and already runs at 100°C, let alone the whole <350W :- ) TDP, shows it's at the absolute edge of what they could achieve.
It's quite plausible they did it because AMD will not be such a fail as their current marketing campaign suggests :- ).

The silicon is getting denser and denser, and the dies themselves are massive. The only thing that could rapidly improve performance (aside from tech like OptiX & DLSS) is "gluing" dies together like AMD Zen :- ).

nVidia is definitely charging (at least they did for Turing) a massive "winner" tax. But they were never lazy or complacent in advancing their tech. I find their GPUs fascinating, and the 3090 will be a day-one buy for me.

6
Gallery / Re: Personal project called 'shadow'
« on: 2020-09-19, 08:46:12 »
It has to be a strange lens. :)

Something like the 7Artisans lenses, they're getting very popular :- ).

7
I don't know how well all features are supported when using OptiX with RT cores in something like Octane. I always remember how VrayGPU was behind VrayCPU, and that didn't even account for OptiX/RT cores. Does F-Storm support the OptiX engine?
How long did it take to support NV-Link? Worse, only the 3090 has NV-Link now, and only with 1/4 of the bandwidth links available. So while the 3090 does support memory pooling, it probably doesn't do so without quite a performance loss compared to the Quadro and A100. nVidia threw us a bone (24GB of VRAM! Hell yeah), but made sure we don't overeat into their enterprise profits.

When looking at conventional pure CUDA gains, we're not talking of "doubling" every two years. It took 4 years to actually double (1080 Ti -> 3080). And it will take another 4 years as well, with the progressive worsening of node upgrades. Ampere got very lucky with the relatively cheap Samsung node, but that's after two years of battling to get better silicon themselves (Intel has been struggling with its node upgrade for 6 damn years). So this is where the progress will halt again.

I still agree GPUs are amazing at the moment, and they are far easier to use, stack and afford. No comparison at all. They rock. Selling outdated workstations is a pure loss of value (90%). Selling outdated GPUs? Easily reclaimable 40% of the value.
The 3090 is galactic value compared to the 3990X Threadripper. You can buy 3 of them!

But they're not progressing that fast ;- ).

Still... Corona-GPU would be spiffy =) Let's hope!


8
My guess: VrayGPU, Lavina Project, Vray for Unreal

Ampere GPUs are amazing though <3 The 3090 is just glorious; there will not be a better GPU in the next two years. It's the next 1080 Ti, which was top for 4 years.

9
[Max] Feature Requests / Re: Multiple LUTS
« on: 2020-09-03, 14:16:30 »
"Hmm" the new Vray 5 framebuffer :-D Stack everything you want; it's full-on Photoshop heh.

10
Hardware / Re: Monitor Recommendations
« on: 2020-09-03, 14:12:19 »
That occasional flicker is an unfortunate hardware defect; mine does it too, but only veeery infrequently. It's actually quite a common modulator issue, but I consider it a better issue than all the backlighting horrors elsewhere :- ).

The consumer & prosumer monitor market is like this... no matter how expensive the monitor is, it will have some issues. Looks like the design & quality control are just not on par, unless you pay NEC & Eizo money: identical visual quality at 3-4 times the price, but with guaranteed quality control.

11
Hardware / Re: threadripper 1950x issue while rendering
« on: 2020-09-03, 14:09:20 »
Can you confirm that I can buy a motherboard for Threadripper 3rd generation (TRX40) for my Threadripper 2nd generation (2990WX)?

No, they are unfortunately not compatible. The only good & stable boards for the 2990WX were the X399 MSI MEG and X399 Asus Zenith Alpha as first tier (best), and the X399 Asus Zenith (non-Alpha) and X399 Aorus Xtreme as second tier (good enough with good airflow in the case).

I would consider selling the 2990WX and getting a newer Threadripper (if you have the budget), or slightly "downgrading" to a Ryzen 9 3950X. Even as a slightly slower multithreader, the Ryzen is a much better overall CPU. (Although Ryzen is just dual-channel, with only 4 memory slots.)

12
This is super important, so good for bringing it up!

I never tested the speed of opening, but I routinely use 4096px and my scenes open fairly quickly. I am tempted to believe the slow-down would rather be due to swapping between GPU VRAM and system RAM. The 11GB of the 2080 Ti is still not a lot, because 3dsMax doesn't use any compression at all, so a single bigger scene can eat up to 10GB in textures alone, and that is just the Max viewport; Photoshop, web browsers, image viewers, etc. will eat VRAM at the same time.
(But now I am gonna test this.)

When you run out of VRAM (you can check this on Windows 10 in Task Manager; there is a GPU tab, and since the latest update you can even assign a different GPU to different tasks/software), the performance completely tanks as well: if you had 200 FPS, for example, you'll drop right down to <5 FPS...

I've been on a Quadro RTX 5000 with 16GB of VRAM for over a year now for this reason alone. And I am upgrading all PCs to the GeForce 3090 with 24GB to make life even easier (though GPU rendering & Unreal 4-5 will be a nice bonus).

I have another "trick" people might not know with Corona: CoronaBitmap doesn't use the viewport Texture Size setting like the Max Bitmap does, but the Procedural Maps one! I was baffled why my CoronaBitmap materials looked blurry in the viewport, and Ondra explained it's due to this.

This is the reason why I advised people all year to buy a 350 Euro 1080 Ti from the used market, and not a brand-new 2070/2080 with a pathetic 8GB. 3dsMax doesn't care much about the speed of your GPU (in fact, it runs fantastically on an integrated Intel GPU lol), but VRAM is everything.
And the solution would be so easy for Autodesk... Unreal, for example, uses lossy compression that saves 75% of VRAM; such massive compression introduces artifacts on even surfaces (like a pure white/grey texture), but who cares in a viewport?
I pondered lately whether it would be possible to impose such compression via an nVidia settings override.

In the meantime, with Unreal 5 we're gonna get real-time mipmapping and unlimited texture size & amount :/. 3dsMax, on the other hand, can be completely killed by a single 20k texture shown 1:1.
(Would be nice to get that in Corona too; Corona can also be killed fairly easily with a shit-ton of 8K textures, and no one uses tiled EXRs.)
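To see why a handful of big textures eats VRAM so fast, here's a rough sketch of the numbers. It assumes uncompressed RGBA8 (4 bytes per pixel, which is roughly how an uncompressed viewport texture sits in memory), a full mip chain adding ~1/3 on top, and a generic ~4:1 lossy block compression ratio (the kind of BC-style compression game engines use); these are illustrative assumptions, not measured 3dsMax or Unreal figures:

```python
# Rough VRAM cost of an uncompressed texture vs. ~4:1 block compression.
# Assumptions: RGBA8 (4 bytes/pixel), mip chain adds ~33%, 4:1 lossy ratio.

def texture_mb(side_px, bytes_per_px=4, mipmaps=True):
    base = side_px * side_px * bytes_per_px
    if mipmaps:
        base = base * 4 // 3  # full mip chain adds roughly a third
    return base / 1024**2

uncompressed = texture_mb(8192)   # a single 8K RGBA8 texture: ~341 MB
compressed = uncompressed / 4     # ~4:1 compression = the "75% saved" figure
print(f"8K texture: ~{uncompressed:.0f} MB raw, ~{compressed:.0f} MB compressed")
```

So roughly thirty 8K textures, uncompressed, already fill a 2080 Ti's 11GB; with 4:1 compression the same set would fit in under 3GB.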

13
Gallery / Re: Personal project called 'shadow'
« on: 2020-09-02, 13:56:12 »
Stunning work! Love the peculiar DOF :- ).

14
Gallery / Re: Malalcahuello Cabin
« on: 2020-09-02, 13:49:22 »
With one exception (the hillside can't be excused :- ), you did a fantastic job. Great details (I love the attention to the roofing), but most importantly, a very filmic atmosphere. Which I guess was the purpose, given the inspiration :- ).

15
Hardware / Re: Monitor Recommendations
« on: 2020-09-02, 13:34:10 »
Juraj, could you name your top three monitor choices for color-critical work, as of this writing?
I second this. Been looking at the LG UltraFine 4K made for Macs. Really dig the design and picture quality.

The UltraFine 4K for Mac is only two models though, the old 21.5" and the newer 24". Both are really small for any CGI work.

If you mean the UltraFine 5K for Mac, in 27", that is an excellent monitor with a few caveats:

- You need an nVidia 2xxx or upcoming 3xxx card, because to get 5K you need DSC compression via DisplayPort, and then a USB-C to DP cable to connect it (it natively uses Thunderbolt, which won't work for a PC GPU connection).
- You have to hack-install the Boot Camp drivers to even be able to select brightness and color space. It's a 500-nit, 99% DCI-P3 panel, so without native control in Windows... that's trouble. This workaround might not work forever, but it does right now.


And finally, my personal recommendations. Any price level? Since I recently bought a Razer Blade laptop with a 144Hz refresh rate... I've become addicted to refresh rate. Not for gaming, for work! It's fucking brutal, it's amazing beyond measure. Everything is so smooth: scrolling websites, rotating 3D models, just moving your cursor around the screen. It's glorious. So with that said, my ideal monitor is:

- 4K+. An absolute must. At least 130 PPI (in 32" it's ok: not great, not terrible), ideally 160+ PPI (which aligns with 27"; very crisp, nice). We can only dream of Apple's 220 PPI... oh well.
- 120/144Hz. Right now, the only 4K/120-144Hz monitors on the market are in 27" size, and most of them will come in Q4, towards the end of the year. The EVE Spectrum will be one of them, using the LG panel. Others currently on the market look a bit too gamery...
- 99+% DCI-P3. Corona is not color-managed right now, but it won't be without it forever. All content consumption will be moving away from the pure sRGB we are at right now. Like 90% of mobiles support wide-gamut colors, and that's where people watch content.
- IPS and IPS-like (Samsung PLS, Sharp's IGZO, AU Optronics AHVA-IPS, ...). I already described why VA panels in monitors (not to be confused with high-end TV VA panels) don't qualify for any color-critical work... or any work at all.
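For anyone wanting to check the PPI tiers above for other sizes, it's just the diagonal pixel count divided by the diagonal inches; a quick sketch:

```python
# PPI for the resolutions/sizes discussed above:
# pixel density = diagonal resolution in pixels / diagonal size in inches.
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f'4K @ 32": {ppi(3840, 2160, 32):.0f} PPI')  # ~138, the "ok" tier
print(f'4K @ 27": {ppi(3840, 2160, 27):.0f} PPI')  # ~163, the "very crisp" tier
```

The same formula puts Apple's 5K 27" iMac panel at around 218 PPI, which is where the "we can only dream" figure comes from.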

I am not writing unrealistic things like an OLED screen or a 1000-zone FALD IPS, because we simply aren't getting those (or not any time soon (1-2 years), or cheap (it will be 4k+ Euro)).

So right now, I don't like any monitor on the market anymore. 4K/60Hz just doesn't make me happy anymore :- ). I would buy the Eve Spectrum in 27"/4K/144Hz. You will need a Turing (DisplayPort DSC compression) or Ampere (both DSC and HDMI 2.1) GPU to drive such a monitor though; DisplayPort 1.4b alone, without DSC, or HDMI 2.0 don't offer enough bandwidth.
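The bandwidth claim above can be sanity-checked with simple arithmetic. This sketch uses 8-bit-per-channel color, a simplified ~10% blanking overhead instead of exact video timings, and the published payload rates after 8b/10b encoding (DP 1.4 HBR3 carries 25.92 Gbit/s of the 32.4 Gbit/s link rate; HDMI 2.0 carries 14.4 of 18):

```python
# Why 4K/144Hz needs DSC on DisplayPort 1.4: the uncompressed stream
# exceeds the link's usable payload. Blanking overhead is approximated.

def stream_gbps(w, h, hz, bits_per_px=24, blanking=1.10):
    """Approximate uncompressed video data rate in Gbit/s."""
    return w * h * hz * bits_per_px * blanking / 1e9

needed = stream_gbps(3840, 2160, 144)   # ~31.5 Gbit/s
dp14_payload = 32.4 * 8 / 10            # HBR3 after 8b/10b encoding: 25.92
hdmi20_payload = 18.0 * 8 / 10          # HDMI 2.0 after encoding: 14.4

print(f"4K/144Hz needs ~{needed:.1f} Gbit/s; DP 1.4 carries {dp14_payload:.2f}")
assert needed > dp14_payload and needed > hdmi20_payload
```

With DSC's roughly 3:1 visually lossless compression, the same stream drops to ~10 Gbit/s, comfortably inside DP 1.4, which is why Turing/Ampere (with DSC support) are required.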

Now, 32"/4K/120-144Hz... most of those will come in Q1 2021. Seems like LG had trouble with such a panel, and Innolux likewise; I don't know of anyone else who was planning such panels.

If you need something really good right now in 32" size, 4K res and 60Hz, I still stand by the BenQ models: the 3220U for a newer look, wider gamut, etc., or the older 3200U for the budget-conscious. Or one of the top LG ones (LG 32UL950-W or the older 32UD99-W) in the UltraFine category (not to be confused with UltraFine for Mac, which are different).
