Author Topic: TX image format?  (Read 4517 times)

2023-08-16, 03:04:57

Tom

  • Active Users
  • **
  • Posts: 236
    • View Profile
    • www
Hi,

I recently used a Chaos Cosmos asset in one of my scenes and, after putting the material in the SME, I saw that all the textures used by the proxy model were .TX files.

I've never used this format before and I was wondering what the advantages of using it are.
Also, I tried to open such images in PS and it doesn't work, so I was wondering how to create them?

I stumbled upon these articles:
https://www.autodesk.com/support/technical/article/caas/sfdcarticles/sfdcarticles/How-to-convert-bitmap-textures-to-TX-format-for-rendering-with-Arnold-in-3ds-Max.html

and

https://help.autodesk.com/view/ARNOL/ENU/?guid=arnold_for_3ds_max_arnold_3dsmax_html#Textures-ManuallyGenerate.txTextures

which tend to make me think .TX files are more of an Arnold thing, so why are they being used by my Cosmos proxy model?

Sorry for my newbie question, but it's a bit confusing.

Thanks in advance for your help guys,

2023-08-16, 09:18:28
Reply #1

James Vella

  • Active Users
  • **
  • Posts: 540
    • View Profile
Vray also uses .tx files for its built-in library, so I would assume all Chaos assets are either already in .tx or will be.

The reason they use this format is the benefit of mip-mapping and render speed. A few other benefits are listed below; you can read more in these three links (Vray, Open as Tiff, Mip-Mapping), but this is the TL;DR:
- they are intended to increase rendering speed
- they reduce aliasing artifacts
- the files require less memory at render time
- a high-resolution mipmap image is used for high-density samples, such as objects close to the camera
- lower-resolution images are used as the object appears farther away
- they improve image quality by reducing aliasing and moiré patterns that occur at large viewing distances
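To put rough numbers on the memory point: a full mip chain only adds about a third on top of the base image, while letting the renderer sample a tiny level for distant objects instead of the full-resolution bitmap. A quick back-of-the-envelope sketch (the 4K RGBA texture is just an example size):

```python
def mip_chain_bytes(size: int, channels: int = 4, bytes_per_channel: int = 1) -> int:
    # Sum the footprint of a square texture plus all of its mip levels,
    # halving the resolution until the 1x1 level.
    total = 0
    while size >= 1:
        total += size * size * channels * bytes_per_channel
        size //= 2
    return total

base = 4096 * 4096 * 4          # 4K RGBA8 base level alone: 64 MiB
chain = mip_chain_bytes(4096)   # base + all mip levels: ~85 MiB (~1.33x the base)
distant = mip_chain_bytes(256)  # a far-away object can sample a ~341 KiB chain instead
```

The whole chain converges to at most 4/3 of the base level, which is why the "files require less memory" point holds in practice: the renderer rarely needs to touch the top level at all.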

You can also open .tx files as TIFF to edit in Photoshop and save them as something else. It's not recommended to overwrite the file, as you lose all the mip-mapping and optimizations. You have to use the maketx batch script for the time being. It would be nice if there were a simpler way, but that's about all I know on the topic.
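For what it's worth, a minimal sketch of what such a maketx batch conversion could look like, assuming the maketx tool (shipped with OpenImageIO / Arnold) is on PATH; the folder layout and the `--oiio` flag usage here are illustrative, not a definitive recipe:

```python
import pathlib
import subprocess

def maketx_cmd(src: pathlib.Path) -> list[str]:
    # Build one maketx command line: write a tiled, mip-mapped .tx
    # next to the source bitmap (--oiio requests OIIO-optimised output).
    return ["maketx", "--oiio", str(src), "-o", str(src.with_suffix(".tx"))]

def convert_folder(folder: str, pattern: str = "*.jpg") -> None:
    # Convert every matching bitmap; the original file is left untouched,
    # so you keep an editable master alongside the render-optimised .tx.
    for src in sorted(pathlib.Path(folder).glob(pattern)):
        subprocess.run(maketx_cmd(src), check=True)
```

Keeping the original jpg/tiff as the editable master and regenerating the .tx after edits avoids the overwrite problem mentioned above.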



2023-08-17, 01:41:06
Reply #2

Tom

  • Active Users
  • **
  • Posts: 236
    • View Profile
    • www
Thanks James,

Do you use such texture files yourself?

Unless it really increases rendering speed & quality, I see this more as a waste of time, as you first have to convert your regular jpg, tiff ... textures to tx before you can use them, so it adds one more step to the workflow.

2023-08-17, 08:21:23
Reply #3

James Vella

  • Active Users
  • **
  • Posts: 540
    • View Profile
No I don't. I would if it were an automatic function in Substance/Photoshop/3D, or if the files could be easily read/edited by native file viewers.

Currently I don't have any reason to over-complicate my workflow, so for the time being I won't be authoring .tx files. I'll keep my eye on it, but for now I tend to agree with you. I would be interested to hear the Corona devs' thoughts on this.

2023-08-17, 09:33:00
Reply #4

mraw

  • Active Users
  • **
  • Posts: 162
    • View Profile
Interesting to read that Chaos uses tx files as well. I thought tx files were an Arnold thing. In Arnold it is automated: there is a tool for converting textures to tx files, but I believe most people use the 'auto-convert textures to tx' checkbox. In some cases you want to alter the mip-map bias to get the texture sharper, but usually I don't care about the tx files at all.

2023-08-17, 14:27:19
Reply #5

Juraj

  • Active Users
  • **
  • Posts: 4761
    • View Profile
    • studio website
I always felt this should be a run-time operation done behind the scenes by the renderer, not externalized into an actual physical file on the hard drive. Similar to GPU texture compression (weird that it never happened in offline rendering; it's a massive resource saver with minimal artefacting for most common bitmaps).

I am already angry at the insistence of many texture shops (like textures.com) on giving me uncompressed TIFF files. I absolutely do not need a 500 MB 16-bit lossless & uncompressed AO bitmap. JPEGs for everything but displacement is the right way.
Please follow my new Instagram for latest projects, tips&tricks, short video tutorials and free models
Behance  Probably best updated portfolio of my work
lysfaere.com Please check the new stuff!

2023-08-17, 14:28:50
Reply #6

James Vella

  • Active Users
  • **
  • Posts: 540
    • View Profile
I always felt this should be a run-time operation done behind the scenes by the renderer

Yep, my thoughts exactly.

2023-08-17, 15:22:53
Reply #7

arqrenderz

  • Active Users
  • **
  • Posts: 996
  • https://www.behance.net/Arqrenderz1
    • View Profile
    • arqrenderz
I always felt this should be a run-time operation done behind the scenes by the renderer, not externalized into an actual physical file on the hard drive. Similar to GPU texture compression (weird that it never happened in offline rendering; it's a massive resource saver with minimal artefacting for most common bitmaps).

I am already angry at the insistence of many texture shops (like textures.com) on giving me uncompressed TIFF files. I absolutely do not need a 500 MB 16-bit lossless & uncompressed AO bitmap. JPEGs for everything but displacement is the right way.

Amen to all of that

2023-08-18, 01:48:07
Reply #8

Tom

  • Active Users
  • **
  • Posts: 236
    • View Profile
    • www
I always felt this should be a run-time operation done behind the scenes by the renderer, not externalized into an actual physical file on the hard drive. Similar to GPU texture compression (weird that it never happened in offline rendering; it's a massive resource saver with minimal artefacting for most common bitmaps).

I am already angry at the insistence of many texture shops (like textures.com) on giving me uncompressed TIFF files. I absolutely do not need a 500 MB 16-bit lossless & uncompressed AO bitmap. JPEGs for everything but displacement is the right way.

Yup, I absolutely agree.

2023-08-18, 11:07:42
Reply #9

Nejc Kilar

  • Corona Team
  • Active Users
  • ****
  • Posts: 1251
    • View Profile
    • My personal website
I always felt this should be a run-time operation done behind the scenes by the renderer, not externalized into an actual physical file on the hard drive. Similar to GPU texture compression (weird that it never happened in offline rendering; it's a massive resource saver with minimal artefacting for most common bitmaps).

I am already angry at the insistence of many texture shops (like textures.com) on giving me uncompressed TIFF files. I absolutely do not need a 500 MB 16-bit lossless & uncompressed AO bitmap. JPEGs for everything but displacement is the right way.

I am not anywhere close to being an expert on compression, obviously (it is complex, to say the least), but from my understanding this is very much a thing for offline renderers as well - in Corona, for example, we have out-of-core textures, which is akin to traditional mip-mapping and saves you quite a few GB of memory.

If we look at our brethren, V-Ray GPU, it also has mip-mapping support along with GPU texture compression (which is especially useful now that NVLink has gone the way of the Dodo and you're quicker to become VRAM limited on bigger projects).

I am only assuming, but most texture shops would probably want you to have the highest-quality image possible so that you don't run into artefacting issues or stuff like that? Granted, I agree with you, so I personally also like to have the option of downloading a JPG or a PNG instead of a 500 MB AO map... I'm quite a big fan of those "selectors" where you can decide before downloading how big an image you'll get and in what format :)
Nejc Kilar | chaos-corona.com
Educational Content Creator | contact us

2023-08-18, 11:58:42
Reply #10

Juraj

  • Active Users
  • **
  • Posts: 4761
    • View Profile
    • studio website
Neither out-of-core nor mip-mapping is a compression alternative. The first just offloads the texture and the second mip-maps it to a lower resolution.
Compression reduces the actual footprint, in lossy form, in exchange for lowered visual fidelity.

https://docs.unrealengine.com/4.27/en-US/RenderingAndGraphics/Textures/TextureCompressionSettings/
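For a sense of the footprint difference being pointed at here: block-compressed GPU formats such as BC1/BC7 store each 4x4 pixel block in a fixed number of bytes and are decoded in hardware at sample time. A rough sketch (block sizes are the standard BCn ones; the 4K texture is just an example):

```python
def bcn_bytes(width: int, height: int, bytes_per_block: int) -> int:
    # BCn formats encode fixed-size 4x4 pixel blocks; partial blocks at
    # the edges are rounded up, since the GPU allocates whole blocks.
    blocks_x = (width + 3) // 4
    blocks_y = (height + 3) // 4
    return blocks_x * blocks_y * bytes_per_block

uncompressed = 4096 * 4096 * 4    # RGBA8, 4K texture: 64 MiB
bc1 = bcn_bytes(4096, 4096, 8)    # BC1 (8 bytes per block): 8 MiB, 8:1
bc7 = bcn_bytes(4096, 4096, 16)   # BC7 (16 bytes per block): 16 MiB, 4:1
```

The fixed block size is also what makes random access cheap: the GPU can locate and decode any block without touching the rest of the file, which is why the saving comes with so little runtime cost.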


2023-08-18, 12:33:53
Reply #11

romullus

  • Global Moderator
  • Active Users
  • ****
  • Posts: 8854
  • Let's move this topic, shall we?
    • View Profile
    • My Models
Forgive my ignorance, I might be speaking complete nonsense here, but wouldn't compression be detrimental to rendering speed? I mean, if every time the renderer needs to access a texture it has to decompress it first, surely that would be a massive hit on render time. I don't think I'd be happy to trade that for some memory gain. GPUs are maybe different, because they can probably compress/decompress textures at the hardware level, so the operation has a significantly lower impact on rendering speed. I think mip-mapping already saves a lot of RAM without significantly impacting render time; do we really also need compression? Once again, apologies if I misunderstood the topic and wrote a bunch of nonsense here.
I'm not Corona Team member. Everything i say, is my personal opinion only.
My Models | My Videos | My Pictures

2023-08-18, 13:21:10
Reply #12

Nejc Kilar

  • Corona Team
  • Active Users
  • ****
  • Posts: 1251
    • View Profile
    • My personal website
Neither out-of-core nor mip-mapping is a compression alternative. The first just offloads the texture and the second mip-maps it to a lower resolution.
Compression reduces the actual footprint, in lossy form, in exchange for lowered visual fidelity.

https://docs.unrealengine.com/4.27/en-US/RenderingAndGraphics/Textures/TextureCompressionSettings/

Indeed, they are not compression algorithms, but they do help with what you stated: a large bitmap gets automatically "converted down" to something less resource-intensive.
At least that's what I thought you were aiming at with your OP :)

2023-08-18, 16:28:14
Reply #13

Juraj

  • Active Users
  • **
  • Posts: 4761
    • View Profile
    • studio website
Forgive my ignorance, I might be speaking complete nonsense here, but wouldn't compression be detrimental to rendering speed? I mean, if every time the renderer needs to access a texture it has to decompress it first, surely that would be a massive hit on render time. I don't think I'd be happy to trade that for some memory gain. GPUs are maybe different, because they can probably compress/decompress textures at the hardware level, so the operation has a significantly lower impact on rendering speed. I think mip-mapping already saves a lot of RAM without significantly impacting render time; do we really also need compression? Once again, apologies if I misunderstood the topic and wrote a bunch of nonsense here.

GPU compression for real-time use in Unreal, for example, is not usually done at runtime (unless the shader is built to request it) and doesn't decrease performance. In fact it often leads to faster rendering times, though that is maybe a side effect of faster swapping of assets in & out.
The only time I had issues with artifacts was with very clean & smooth textures, where the blocky rasterization was very reminiscent of low-quality JPEGs. It was still suitable for 90% of bitmaps.

This also isn't a request thread, and I didn't request anything here either :- ). Pretty sure no such feature was ever in the plan.

To Nejc:

Mip-mapping is conditional: render at a high-res framebuffer and it's back to the full footprint. But I am definitely in support of it. Automated mip-mapping behind the scenes, but with user control to modify the amount/level, would of course be fantastic. Most of my machines have 128 GB of RAM and they're not enough... but that's also because 3ds Max and Corona effectively load all assets twice. Not much to do there.
Out-of-core is just glorified swapping.

But I'm not arguing these features don't exist; they're just not the features I mentioned. Here is thing A which is maybe similar to feature B. Well, ok.
« Last Edit: 2023-08-18, 16:34:03 by Juraj »

2023-08-23, 11:08:35
Reply #14

Nejc Kilar

  • Corona Team
  • Active Users
  • ****
  • Posts: 1251
    • View Profile
    • My personal website
...
To Nejc:

Mip-mapping is conditional: render at a high-res framebuffer and it's back to the full footprint. But I am definitely in support of it. Automated mip-mapping behind the scenes, but with user control to modify the amount/level, would of course be fantastic. Most of my machines have 128 GB of RAM and they're not enough... but that's also because 3ds Max and Corona effectively load all assets twice. Not much to do there.
Out-of-core is just glorified swapping.

But I'm not arguing these features don't exist; they're just not the features I mentioned. Here is thing A which is maybe similar to feature B. Well, ok.

Yeah, I understand what you mean better now, thanks for clarifying. :) I still think it would be useful to post a feature request for it, if for nothing else than to let us know that we have users who would appreciate more memory optimizations; then it's up to the team to find good solutions for everyone :)