Author Topic: TX image format?  (Read 4519 times)

2023-08-16, 03:04:57

Tom

  • Active Users
  • **
  • Posts: 236
    • View Profile
    • www
Hi,

I recently used a Chaos Cosmos asset in one of my scenes and, after I put the material in the SME, I saw that all the textures used in the proxy model were .TX files.

I've never used this format before, and I was wondering what the advantages of using it are.
Also, I tried to open such images in PS and it doesn't work, so I was wondering how these files are created?

I stumbled upon these articles:
https://www.autodesk.com/support/technical/article/caas/sfdcarticles/sfdcarticles/How-to-convert-bitmap-textures-to-TX-format-for-rendering-with-Arnold-in-3ds-Max.html

and

https://help.autodesk.com/view/ARNOL/ENU/?guid=arnold_for_3ds_max_arnold_3dsmax_html#Textures-ManuallyGenerate.txTextures

which lead me to think .TX files are more of an Arnold thing, so why are they being used by my Cosmos proxy model?

Sorry for my newbie question, but it's a bit confusing.

Thanks in advance for your help, guys!

2023-08-16, 09:18:28
Reply #1

James Vella

  • Active Users
  • **
  • Posts: 540
    • View Profile
V-Ray also uses .tx files for its built-in library, so I would assume all Chaos assets are either already in .tx or will be.

The reason they use this format is the benefit of mip-mapping and render speed. A few other benefits are listed below; you can read more in these three links (V-Ray, Open as TIFF, Mip-Mapping), but this is the TL;DR:
- they are intended to increase rendering speed
- they reduce aliasing artifacts
- the files require less memory
- a high-resolution mipmap image is used for high-density samples, such as objects close to the camera
- lower-resolution images are used as the object appears farther away
- they improve image quality by reducing aliasing and Moiré patterns that occur at large viewing distances

You can also open .tx files as TIFF to edit in Photoshop and save them as something else. It's not recommended to overwrite the file, as you lose all the mip-mapping and optimizations; you have to use the maketx batch script for the time being. Would be nice if there were a simpler way, but that's about all I know on the topic.
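
In case it helps anyone, here is a minimal sketch of what that batch step can look like, assuming OpenImageIO's maketx is installed and on your PATH (the folder name is just an example):

```python
import subprocess
from pathlib import Path

# Convert every JPG in a folder to a tiled, mip-mapped .tx file.
src = Path("textures")

for img in src.glob("*.jpg"):
    out = img.with_suffix(".tx")
    # -u only regenerates the .tx when the source image is newer
    subprocess.run(["maketx", "-u", str(img), "-o", str(out)], check=True)
    print(f"wrote {out}")
```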



2023-08-17, 01:41:06
Reply #2

Tom

  • Active Users
  • **
  • Posts: 236
    • View Profile
    • www
Thanks James,

Do you use such texture files yourself?

Unless it really increases rendering speed & quality, I see this more as a waste of time, as you first have to convert your regular JPG, TIFF... textures to .tx before you can use them, so it adds one more step to the workflow.

2023-08-17, 08:21:23
Reply #3

James Vella

  • Active Users
  • **
  • Posts: 540
    • View Profile
No I don't. I would if it were an automatic function in Substance/Photoshop/3D apps, or if the files could be easily read/edited by native file viewers.

Currently I don't have any reason to overcomplicate my workflow, so for the time being I won't be authoring .tx files. I'll keep my eye on it, but for now I tend to agree with you. I would be interested to hear the Corona devs' thoughts on this.

2023-08-17, 09:33:00
Reply #4

mraw

  • Active Users
  • **
  • Posts: 162
    • View Profile
Interesting to read that Chaos uses tx files as well; I thought tx files were an Arnold thing. In Arnold it is automated: there is a tool for converting textures to tx files, but I believe most people use the 'auto-convert textures to tx' checkbox. In some cases you want to alter the mip-map bias to get the texture sharper, but usually I don't care about the tx files at all.

2023-08-17, 14:27:19
Reply #5

Juraj

  • Active Users
  • **
  • Posts: 4761
    • View Profile
    • studio website
I always felt this should be a run-time operation done behind the scenes by the renderer, not externalized into an actual physical file on the hard drive. Similar to GPU texture compression (weird that it never happened in offline rendering; it's a massive resource saver with minimal artefacting for most common bitmaps).

I am already angry at the insistence of many texture shops (like textures.com) on giving me uncompressed TIFF files. Like I absolutely do not need a 500 MB 16-bit lossless & uncompressed AO bitmap. JPEGs for everything but displacement is the right way.
Please follow my new Instagram for latest projects, tips&tricks, short video tutorials and free models
Behance  Probably best updated portfolio of my work
lysfaere.com Please check the new stuff!

2023-08-17, 14:28:50
Reply #6

James Vella

  • Active Users
  • **
  • Posts: 540
    • View Profile
I always felt this should be a run-time operation done behind the scenes by the renderer

Yep, my thoughts exactly.

2023-08-17, 15:22:53
Reply #7

arqrenderz

  • Active Users
  • **
  • Posts: 996
  • https://www.behance.net/Arqrenderz1
    • View Profile
    • arqrenderz
I always felt this should be a run-time operation done behind the scenes by the renderer, not externalized into an actual physical file on the hard drive. Similar to GPU texture compression (weird that it never happened in offline rendering; it's a massive resource saver with minimal artefacting for most common bitmaps).

I am already angry at the insistence of many texture shops (like textures.com) on giving me uncompressed TIFF files. Like I absolutely do not need a 500 MB 16-bit lossless & uncompressed AO bitmap. JPEGs for everything but displacement is the right way.

Amen to all of that

2023-08-18, 01:48:07
Reply #8

Tom

  • Active Users
  • **
  • Posts: 236
    • View Profile
    • www
I always felt this should be a run-time operation done behind the scenes by the renderer, not externalized into an actual physical file on the hard drive. Similar to GPU texture compression (weird that it never happened in offline rendering; it's a massive resource saver with minimal artefacting for most common bitmaps).

I am already angry at the insistence of many texture shops (like textures.com) on giving me uncompressed TIFF files. Like I absolutely do not need a 500 MB 16-bit lossless & uncompressed AO bitmap. JPEGs for everything but displacement is the right way.

Yup, I absolutely agree.

2023-08-18, 11:07:42
Reply #9

Nejc Kilar

  • Corona Team
  • Active Users
  • ****
  • Posts: 1251
    • View Profile
    • My personal website
I always felt this should be a run-time operation done behind the scenes by the renderer, not externalized into an actual physical file on the hard drive. Similar to GPU texture compression (weird that it never happened in offline rendering; it's a massive resource saver with minimal artefacting for most common bitmaps).

I am already angry at the insistence of many texture shops (like textures.com) on giving me uncompressed TIFF files. Like I absolutely do not need a 500 MB 16-bit lossless & uncompressed AO bitmap. JPEGs for everything but displacement is the right way.

I am nowhere close to being an expert on compression, obviously (it is complex, to say the least), but from my understanding this is very much a thing for offline renderers as well - in Corona, for example, we have out-of-core textures, which is akin to traditional mip-mapping and saves you quite a few GB of memory.
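
Roughly speaking, the idea is to keep only a budgeted amount of texture data in RAM and pull the rest from disk on demand. A toy sketch of that caching idea (textbook concept only, not our actual implementation):

```python
from collections import OrderedDict

class ToyTextureCache:
    """Toy out-of-core cache: keep at most `budget` bytes of texture
    data in RAM, evicting least-recently-used entries when over budget."""

    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.resident = OrderedDict()  # (path, mip) -> bytes held in RAM
        self.used = 0

    def fetch(self, path, mip):
        key = (path, mip)
        if key in self.resident:
            self.resident.move_to_end(key)  # mark as recently used
            return self.resident[key]
        data = self._read_from_disk(path, mip)
        while self.used + len(data) > self.budget and self.resident:
            _, old = self.resident.popitem(last=False)  # evict LRU entry
            self.used -= len(old)
        self.resident[key] = data
        self.used += len(data)
        return data

    def _read_from_disk(self, path, mip):
        # Stand-in for a real tiled read; a .tx file would let a renderer
        # read just the tiles/mip level it needs instead of the whole image.
        with open(path, "rb") as f:
            return f.read()
```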

If we look at our brethren, V-Ray GPU, it also has mip-mapping support along with GPU texture compression (which is especially useful now that NVLink has gone the way of the dodo and you're quicker to become VRAM-limited on bigger projects).

I am assuming most texture shops would want you to have the highest-quality image possible so that you don't run into artefacting issues or things like that? Granted, I agree with you, so I personally also like having the option of downloading a JPG or a PNG instead of a 500 MB AO map... I'm quite a big fan of those "selectors" where you can decide before downloading how big an image you'll get and in what format :)
Nejc Kilar | chaos-corona.com
Educational Content Creator | contact us

2023-08-18, 11:58:42
Reply #10

Juraj

  • Active Users
  • **
  • Posts: 4761
    • View Profile
    • studio website
Neither out-of-core nor mipmapping is a compression alternative. The first just offloads the texture, and the second mip-maps down to a lower resolution.
Compression reduces the actual footprint, in lossy form, in exchange for lowered visual fidelity.

https://docs.unrealengine.com/4.27/en-US/RenderingAndGraphics/Textures/TextureCompressionSettings/
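
Back-of-the-envelope numbers for a single 4K map make the distinction clear (illustrative figures only, not from any specific renderer):

```python
# Why mip-mapping and compression are different things, in numbers.
w = h = 4096
rgba8 = w * h * 4              # uncompressed RGBA8: 64 MiB
with_mips = rgba8 * 4 // 3     # a full mip pyramid ADDS ~1/3: ~85 MiB
bc1 = w * h // 2               # BC1/DXT1 at 4 bits per texel: 8 MiB

for name, size in [("RGBA8", rgba8),
                   ("RGBA8 + mip chain", with_mips),
                   ("BC1 compressed", bc1)]:
    print(f"{name:>18}: {size / 2**20:5.1f} MiB")
```

Mip-mapping only saves memory when the lower levels are sufficient for the shot; compression shrinks the footprint unconditionally.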


2023-08-18, 12:33:53
Reply #11

romullus

  • Global Moderator
  • Active Users
  • ****
  • Posts: 8854
  • Let's move this topic, shall we?
    • View Profile
    • My Models
Forgive my ignorance, I might be speaking complete nonsense here, but wouldn't compression be detrimental to rendering speed? I mean, if every time the renderer needs to access a texture it has to decompress it first, surely that would be a massive hit on render time. I don't think I would be happy to trade that for some memory gain. GPUs are maybe different, because they can probably compress/decompress textures at the hardware level, so the operation has a significantly lower impact on rendering speed. I think mip-mapping already saves a lot of RAM without significantly impacting render time; do we really also need compression? Once again, apologies if I misunderstood the topic and wrote a bunch of nonsense here.
I'm not Corona Team member. Everything i say, is my personal opinion only.
My Models | My Videos | My Pictures

2023-08-18, 13:21:10
Reply #12

Nejc Kilar

  • Corona Team
  • Active Users
  • ****
  • Posts: 1251
    • View Profile
    • My personal website
Neither out-of-core nor mipmapping is a compression alternative. The first just offloads the texture, and the second mip-maps down to a lower resolution.
Compression reduces the actual footprint, in lossy form, in exchange for lowered visual fidelity.

https://docs.unrealengine.com/4.27/en-US/RenderingAndGraphics/Textures/TextureCompressionSettings/

Indeed, they are not compression algorithms, but they do help with what you stated - a large bitmap gets automatically "converted down" to something less resource-intensive.
At least that's what I thought you were aiming at with your OP :)

2023-08-18, 16:28:14
Reply #13

Juraj

  • Active Users
  • **
  • Posts: 4761
    • View Profile
    • studio website
Forgive my ignorance, I might be speaking complete nonsense here, but wouldn't compression be detrimental to rendering speed? I mean, if every time the renderer needs to access a texture it has to decompress it first, surely that would be a massive hit on render time. I don't think I would be happy to trade that for some memory gain. GPUs are maybe different, because they can probably compress/decompress textures at the hardware level, so the operation has a significantly lower impact on rendering speed. I think mip-mapping already saves a lot of RAM without significantly impacting render time; do we really also need compression? Once again, apologies if I misunderstood the topic and wrote a bunch of nonsense here.

GPU compression for real-time use in Unreal, for example, is not usually done at runtime (unless the shader is built to request it) and doesn't decrease performance. In fact it often leads to faster rendering times, but that is maybe a side effect of assets swapping in & out faster.
The only time I had issues with artifacts was with very clean & smooth textures; the blocky rasterization was very reminiscent of low-quality JPEGs. It was still suitable for 90 percent of bitmaps.

Also, this isn't a request thread and I didn't request anything here :- ). Pretty sure no such feature is ever in the plan.

To Nejc:

Mip-mapping is conditional: render at a high-res framebuffer and it's back to the full footprint. But I am definitely in support of it. Automated mip-mapping behind the scenes (but with control still allowing the user to modify the amount/level) would of course be fantastic. Most of my machines have 128 GB of RAM and they're not enough... but that's also because 3ds Max and Corona effectively load all assets twice. Not much to do there.
Out-of-core is just glorified swapping.

I'm not arguing that these features don't exist, but they're not the features I mentioned. "Here is thing A, which is maybe similar to feature B." Well, OK.
« Last Edit: 2023-08-18, 16:34:03 by Juraj »

2023-08-23, 11:08:35
Reply #14

Nejc Kilar

  • Corona Team
  • Active Users
  • ****
  • Posts: 1251
    • View Profile
    • My personal website
...
To Nejc:

Mip-mapping is conditional: render at a high-res framebuffer and it's back to the full footprint. But I am definitely in support of it. Automated mip-mapping behind the scenes (but with control still allowing the user to modify the amount/level) would of course be fantastic. Most of my machines have 128 GB of RAM and they're not enough... but that's also because 3ds Max and Corona effectively load all assets twice. Not much to do there.
Out-of-core is just glorified swapping.

I'm not arguing that these features don't exist, but they're not the features I mentioned. "Here is thing A, which is maybe similar to feature B." Well, OK.

Yeah, I understand what you mean better now, thanks for clarifying. :) I still think it would be useful to post a feature request for it, if for nothing else than to let us know we have users who would appreciate more memory optimizations; then it is up to the team to find good solutions for everyone :)

2023-09-27, 08:01:26
Reply #15

nino anurogo

  • Users
  • *
  • Posts: 4
    • View Profile


Mip-mapping is conditional: render at a high-res framebuffer and it's back to the full footprint. But I am definitely in support of it. Automated mip-mapping behind the scenes (but with control still allowing the user to modify the amount/level) would of course be fantastic. Most of my machines have 128 GB of RAM and they're not enough... but that's also because 3ds Max and Corona effectively load all assets twice. Not much to do there.
Out-of-core is just glorified swapping.

I'm not arguing that these features don't exist, but they're not the features I mentioned. "Here is thing A, which is maybe similar to feature B." Well, OK.

Hi Juraj, can we work around 3ds Max and Corona effectively loading assets twice by creating an asset proxy and setting its display to wirebox? Does this help reduce RAM usage during the rendering process? Thank you very much.

2023-09-27, 08:28:27
Reply #16

Juraj

  • Active Users
  • **
  • Posts: 4761
    • View Profile
    • studio website
Not sure, to be honest; I don't usually use proxies (a hassle to create & manage, although I've considered some of the 3rd-party plugins that offer a more robust solution to this).
But maybe? :- )

2023-09-27, 13:43:04
Reply #17

burnin

  • Active Users
  • **
  • Posts: 1535
    • View Profile
Regarding this, check PRMan's documentation on their "txmake" tool.
Also, almost all the 'heavy lifting' was automated a few years ago with the "RenderMan Utility Manager" (by Amir Ashkezari on GitHub).

2023-09-28, 00:57:02
Reply #18

arqrenderz

  • Active Users
  • **
  • Posts: 996
  • https://www.behance.net/Arqrenderz1
    • View Profile
    • arqrenderz
I would really love to push more for memory optimizations in Corona.

I made a little experiment with .tx textures.
I have some HUGE map textures that we are using to map some mountains in a video; the setup was made from 9 different textures, each 35K px!
The difference between Corona Bitmap and TX was in favor of .tx textures by 20 GB of RAM... Pretty mind-blowing...
This is an edge case, I know... but we can all benefit from lower memory consumption!

2023-09-28, 09:24:36
Reply #19

Juraj

  • Active Users
  • **
  • Posts: 4761
    • View Profile
    • studio website
Was it a 20 GB saving across all 9 maps? I.e. roughly 2 GB per 35K (8-bit RGB?) texture?

I mostly dislike adopting workflows that require me to use an arbitrary file format, but maybe this could be a good solution for HDRIs.
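
Back-of-the-envelope, the numbers do line up (assuming the maps sit uncompressed as 8-bit RGB in RAM; the actual setup may differ):

```python
# Sanity check on the 9 x 35K texture experiment above.
w = h = 35_000
per_map = w * h * 3                                 # one 35K RGB8 map, in bytes
print(f"one map:   {per_map / 2**30:.2f} GiB")      # ~3.42 GiB
print(f"nine maps: {9 * per_map / 2**30:.1f} GiB")  # ~30.8 GiB
print(f"saving per map at 20 GB total: {20 / 9:.1f} GB")  # ~2.2 GB
```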


2023-09-28, 10:37:39
Reply #20

romullus

  • Global Moderator
  • Active Users
  • ****
  • Posts: 8854
  • Let's move this topic, shall we?
    • View Profile
    • My Models
I made a little experiment with .tx textures.
I have some HUGE map textures that we are using to map some mountains in a video; the setup was made from 9 different textures, each 35K px!
The difference between Corona Bitmap and TX was in favor of .tx textures by 20 GB of RAM... Pretty mind-blowing...
This is an edge case, I know... but we can all benefit from lower memory consumption!

Did you try this with the out-of-core feature turned on or off? I think with out-of-core there shouldn't be much difference between TX and more traditional file formats, if any at all, because to my knowledge Corona converts textures to TX internally.

2023-10-04, 15:01:23
Reply #21

Avi

  • Corona Team
  • Active Users
  • ****
  • Posts: 507
    • View Profile
Hi,

The .tx format is used in Cosmos models because it includes mip-mapping (using different resolutions of the texture depending on its size in the final image). This results in significant memory savings and improved rendering speed. While Corona has its own out-of-core rendering for textures, support for the .TX format was provided to ensure compatibility of Cosmos models across all Chaos products, not just Corona.
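
As a rough illustration of how mip selection works (a textbook sketch, not Corona's exact heuristic):

```python
import math

# Pick the mip level whose resolution roughly matches how many
# pixels the texture covers in the final image.
def mip_level(texture_size, pixels_on_screen):
    max_level = int(math.log2(texture_size))  # the 1x1 level
    if pixels_on_screen <= 0:
        return max_level
    level = int(math.log2(texture_size / pixels_on_screen))
    return max(0, min(level, max_level))

# A 4K texture seen at different sizes in the final image:
for px in (4096, 1024, 64):
    print(f"covers {px:4d} px -> mip level {mip_level(4096, px)}")
# level 0 = full 4096 res, level 2 = 1024x1024, level 6 = 64x64
```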

In Corona, all textures benefit from out-of-core rendering: it unloads the memory used by textures to the hard drive and uses a lower-resolution version of them, depending on how far from the camera they appear in the scene.
Arpit Pandey | chaos-corona.com
3D Support Specialist - Corona | contact us

2023-10-05, 01:24:55
Reply #22

Tom

  • Active Users
  • **
  • Posts: 236
    • View Profile
    • www
Thanks, it makes a lot of sense, but honestly, given the power of today's CPUs and the amount of RAM available in today's workstations, I don't really feel the need for it, especially as it adds a new step to the workflow in terms of editing the textures, since .TX files aren't directly editable in PS. So it's unlikely I'll start using this format unless the rendering speed gain is significant.

What is the average rendering speed gain?

2023-10-08, 07:21:20
Reply #23

Mohammadreza Mohseni

  • Active Users
  • **
  • Posts: 152
    • View Profile
    • Instagram
Just in case somebody wants to create .tx images, you can do that easily with the maketx tool from OpenImageIO.
It converts images to tiled, MIP-mapped textures (.tx format).

It's best to add its folder to the user PATH variable in the Windows system environment to make it accessible system-wide.
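
A typical invocation looks something like this (flags per the OpenImageIO docs; the file names are invented for the example):

```python
import subprocess

# Assumes maketx is on PATH after the step above.
subprocess.run([
    "maketx",
    "--oiio",                # tile layout & metadata tuned for OIIO readers
    "--filter", "lanczos3",  # filter used when downsizing the mip levels
    "wood_albedo.png",
    "-o", "wood_albedo.tx",
], check=True)
```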


It has many options and it is indeed a great tool.

You may also find it in the Arnold and V-Ray plugin folders for 3ds Max.



« Last Edit: 2023-10-08, 07:31:06 by Mohammadreza Mohseni »