Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Topics - Jpjapers

Pages: [1] 2 3 ... 22
1
I'm just curious whether there's much benefit nowadays to rendering complex material structures out to maps where possible. I know many of us use a lot of procedural and complex materials in our scenes, and triplanar/randomisers don't really work when rendered to a map, but is there much in the way of speed improvement from reducing material complexity? I very rarely render my nodes to maps, just in case I need to tweak them.

Does anyone do it often and see a benefit beyond, perhaps, real-time export?

2
[Max] I need help! / Volumetric optimisations
« on: 2025-01-31, 12:19:21 »
Are there any settings I can tweak in volumetric-heavy scenes to speed up render times?

I have a very foggy and dusty scene using volumetric materials, volume grids and VDBs. No global volume material.
I've adjusted the step size (I feel the documentation on this could feature larger-scale examples), but is there anything 'under the hood' that can help?

3
[Max] General Discussion / Neural rendering - Nvidia RTX
« on: 2025-01-23, 17:12:48 »
NVIDIA recently revealed a ton of neural features in their new 50-series cards.



A couple of them stood out to me.

RTX Mega Geometry:

Real-time LOD generation allowing for crazy amounts of geometric detail, down to sub-pixel triangles, enabling highly detailed models without normal maps.

RTX Neural Materials:

Small AI models that aid shading through compression and faster material handling, resulting in high-fidelity materials that render extremely quickly.



Whilst I understand that Corona is, and always will be, a CPU renderer for many reasons, each time I see a new development in real-time it feels like the quality gap between offline and real-time content is shrinking, and it makes much of the offline workflow feel slow. Corona has often relied on the crutch of absolute realism to defend against the question of GPU/hybrid rendering, but now we have a built-in upscaler and have had a denoiser for some time, both of which make assumptions and take shortcuts that reduce overall accuracy. So can that really still be an argument against a hybrid approach?

My questions to the team:

- Can any of these features be used to speed up our offline workflow?
- Will you ever be looking at a hybrid approach at all, now that the current generation of GPUs has so many neural capabilities?

When the result of things like neural shading looks 99% the same, but the shader can be rendered to final quality in real time, does the absolute accuracy of your path tracing really matter?
Can rendering be accelerated by any of these developments at the cost of accuracy if the user deems it acceptable?
Could these technologies help reduce memory consumption?

It feels like there is inevitably a limit to how far CPU rendering can go without eventually needing to become a hybrid renderer in order to stay relevant. I think that time is coming around pretty quickly. It's not here just yet, but it's becoming ever more apparent that people will accept slightly lower-quality images if it means getting them in less than half the time. With the quality gap between real-time and offline continuing to shrink, it's only a matter of time before it becomes imperceptible to the vast majority of clients and audiences. Eventually clients are going to expect those faster results because of what they see in real-time, and if offline rendering doesn't find some major speed improvements somewhere, it will become less and less important to be 100% accurate with your path tracing and more and more important to LOOK accurate.


4
[Max] I need help! / Scatter + Distance map insanely slow
« on: 2025-01-14, 18:27:21 »
I have about 100 trees on a landscape, all of which are proxies displayed as point clouds. Separate proxies for each tree, instanced.

I'm then using a distance map in a Corona Scatter, with the trees assigned in the distance map. The distance map drives the scatter to add smaller plants around the trees, all of which are displayed as boxes.

As a setup it's INSANELY slow, but I don't entirely understand why. It takes about 15 minutes for the scatter to load into the scene. Any changes to the distance map take about 10-15 minutes to take effect. Even opening and closing the include/exclude list without making changes forces a complete reload of the scatter.

It's a little complex as far as scattering goes, but I don't think it should be this slow just to navigate the scene. I just tried importing a 700 KB FBX and the whole of Max locked up for over 90 minutes before I had to quit, yet the same file loads into a new scene absolutely fine.

Is there something inherently bad about this approach?

5
[Max] I need help! / Phoenix & Lightmix
« on: 2024-12-18, 14:34:40 »
Are we able to add PhoenixFD fire to LightMix layers?

Currently LightMix places it on the 'rest' layer. I can't add the actual grid to a LightMix layer, but I can add the particle source object. However, this results in an empty LightMix layer, and the actual light ends up on the 'rest' layer again.

What scene object do I need to add to gain control over the fire objects in my scene using LightMix?

6
General CG Discussion / Render Network
« on: 2024-12-12, 12:18:56 »
I just stumbled across this and thought it was interesting.

https://rendernetwork.com/

It's essentially a P2P-style render farm, if I'm understanding correctly.
I'd love to see something like this for CPU renderers to somewhat democratise render farming. Does such a project exist?

It's a long shot and probably a pipe dream, but I can't help imagining that if something like DR Server had additional functionality allowing users to 'rent' their licensed workstations as render nodes in a P2P fashion, it would be a useful addition, seeing that Chaos Cloud compatibility doesn't appear to be in the pipeline for Corona.

7
[Max] I need help! / UHD Cache precomp issue
« on: 2024-12-09, 17:45:44 »
I'm having an issue with a previously fine file whereby the UHD Cache precomputation stage seemingly doesn't finish in either interactive or production rendering. On the last attempt at a 2K image I stopped it after 90 minutes.
There have been some small changes to the file since I last rendered, but these were purely positional changes to objects in the scene and a switch to a slightly different camera angle.

I tried merging into a new file, but the issue persists. I've tried hiding various selections of objects to narrow down a problematic one, but still nothing.
The UHD Cache success rate is extremely low, in the low single digits according to the VFB, whereas in a previous save version it is around 25%.

It's an exterior scene lit with an HDRI, with lots of translucent vegetation and some water on the right side of the image.

Any ideas how I can further troubleshoot this file? Currently it just will not render, and reverting to an older save loses quite a bit of work.


8
[Max] Bug Reporting / Can't always type numbers into New VFB
« on: 2024-10-04, 16:31:26 »
Several times over the past few months I've been frustrated with the new VFB when it randomly stops accepting keyboard input. It won't let me type anything into any spinner. If I use the numpad, it randomly defaults to the numpad's 'arrow key' mode even with Num Lock enabled, although the numpad works fine in other applications. Even the numeric keys on the main keyboard won't work when this happens. The spinners can still be dragged to change values, but values can't be typed in. It seems to happen at random.

I encountered this back in v2 too. It appears to happen mostly with large scenes, both during IR and production rendering.

9
Does the VFB support the OCIO output transform?
Does the CIE retain the colour management info?

I rendered on a render farm and the CXR that came back has some colours that look very different from what renders from the exact same file locally, and I'm now questioning my understanding.

Essentially, what was in the VFB locally appeared correctly. In what I've got back from the farm, some colours (strangely, not all) are much brighter, while the image appears much more muted in general.
It also happens when I render using DR locally.

10
I work in a team, as many others here likely do. Sometimes, further down the pipeline, they might want to make a stylistic decision that could be resolved by adjusting a LightMix layer.

Would I need a separate full Corona license just to give them the ability to open and adjust the CXR files?

11
I've noticed recently, whilst using OCIO colour management, that if I set a Corona Color node to a specific sRGB colour, the colours have all changed rather significantly when I save and reopen the file.

For instance, I have a brand colour of 0,61,165, and on file open it had changed to 54,62,156.
0,0,255 changed to 64,36,239.

This isn't a one-time occurrence. I thought perhaps I had screwed up somewhere, but it has happened to every CoronaColor node. I've switched to using PNGs because I can rely on them.

Max 2024, Corona 12.

12
Does a higher-resolution image require more passes to reach a certain noise level? Or does each pass just take longer?
For instance, will an image at 1000x1000 and an image at 5000x5000 reach a 3% noise level in the same number of passes, with each pass simply taking longer on the 5000x5000 image?
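
For context, here is a rough sketch of the arithmetic behind the question, assuming (and it is only an assumption on my part) that per-pass cost scales roughly with pixel count:

Code:
 -- Assumption for illustration only: per-pass cost scales roughly with pixel count
 pxSmall = 1000 * 1000   -- 1,000,000 pixels
 pxLarge = 5000 * 5000   -- 25,000,000 pixels
 ratio = (pxLarge as float) / pxSmall
 format "Each pass at 5000x5000 touches roughly %x the pixels of 1000x1000\n" ratio -- prints 25.0x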

I've never really thought about it before; I usually just let it run until it looks nice.

This help page is really useful, but it didn't answer my question, unless I missed it.

https://support.chaos.com/hc/en-us/articles/4528236666257-How-many-passes-is-enough


13
Hi,

Whenever I try to change the baseColor of Corona materials via MAXScript, the values end up being input as linear by default. I'm assuming this is because of the way Max internally handles colours and the conversion between RGB 0-255 and RGB 0-1. How can I ensure they are added as sRGB, or how can I convert the values correctly?

For instance, take the colour 250,200,10. If I manually enter it into the picker in sRGB mode, the colour is correct.
However, when I add it via MAXScript it is added in linear mode, so the resulting sRGB colour is 254,228,59.

How can I fix this? Is there a formula I need to implement in my script, or do I need a different approach?
I tried using the Max standard colour map, but the same thing happens even if I set the gamma to 1.0. It has to be something to do with the way MAXScript handles (color R G B) values internally.

Thanks!

EDIT: The solution was to gamma-correct the sRGB colours.

Here's a function:

Code:
 fn adjustColorGamma _inputColor _maxVal _gammaVal _doInverse =
 (
     if _doInverse then
     (
         -- 1/gamma: linear -> gamma-encoded (values get brighter)
         outputR = _maxVal * ((_inputColor.r / _maxVal) ^ (1.0 / _gammaVal))
         outputG = _maxVal * ((_inputColor.g / _maxVal) ^ (1.0 / _gammaVal))
         outputB = _maxVal * ((_inputColor.b / _maxVal) ^ (1.0 / _gammaVal))
         (color outputR outputG outputB)
     ) else
     (
         -- gamma: gamma-encoded (sRGB-style) value -> linear (values get darker)
         outputR = _maxVal * ((_inputColor.r / _maxVal) ^ _gammaVal)
         outputG = _maxVal * ((_inputColor.g / _maxVal) ^ _gammaVal)
         outputB = _maxVal * ((_inputColor.b / _maxVal) ^ _gammaVal)
         (color outputR outputG outputB)
     )
 )

 adjustColorGamma (color 255 255 255) 255 2.2 false
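
And a minimal usage sketch for the original baseColor problem. This assumes the Corona Physical Material's MAXScript class is CoronaPhysicalMtl (an assumption on my part); the idea is simply to convert the intended sRGB value to linear before assigning it:

Code:
 -- Hypothetical usage sketch; CoronaPhysicalMtl is assumed to be the physical material's MAXScript class
 mat = CoronaPhysicalMtl()
 -- Convert the intended sRGB value to linear so the picker shows 250,200,10 in sRGB mode
 mat.baseColor = adjustColorGamma (color 250 200 10) 255 2.2 false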

 

14
It would be fantastic if we could save and load custom presets in the Physical Material presets list for common starter materials.
The ones there are useful, but having your own custom ones would be even better. Think of it as an alternative to a material library: base materials already set up the way you like them.

15
Scene setup:

Metal ball, white diffuse, 0.0 roughness
Box below, Frosted Glass material library preset
Corona light below the box.


When the metal ball has a roughness value of 0.0, the reflections are correct. Anything higher and the frosted material almost disappears, with the hotspot from the light being the only thing visible.
This doesn't happen with the Corona Physical frosted glass preset, only with the legacy material.
