Important Texture changes

After talking with Mojo, we’ve decided to enable S3TC DXT1 compression (or DXT5 for images with alpha) by default – when the user’s card supports it, of course. On machines supporting S3TC compression, you should see a bump in FPS. (Try TestMultitexture from CVS vs. the current .8 webstart, for example.) On machines that don’t support S3TC, nothing will have changed.



When you create your Texture with the TextureManager, you can override this behavior by calling the following method and specifying the imageType yourself.


public static com.jme.image.Texture loadTexture(URL file, int minFilter,
            int magFilter,
            int imageType,
            float anisoLevel,
            boolean flipped)
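
For example, overriding the guessed format might look something like this (just a sketch – the filter constants, resource path, and anisotropy value are placeholders; TextureManager is com.jme.util.TextureManager and the constants come from com.jme.image.Texture / com.jme.image.Image):

URL imageLocation = MyGame.class.getClassLoader()
        .getResource("data/texture/wall.png"); // placeholder resource

Texture tex = TextureManager.loadTexture(
        imageLocation,
        Texture.MM_LINEAR_LINEAR,  // minFilter
        Texture.FM_LINEAR,         // magFilter
        Image.RGBA8888_DXT5,       // imageType: force DXT5 compression
        1.0f,                      // anisoLevel
        true);                     // flipped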



Also, please note that the isMipMapped argument that was in many of the TextureManager methods is now completely removed. It was a lame-duck argument, not affecting or attached to anything at all. I know that will require us to change our texture code slightly (removing a single boolean) but it's better than having useless extra arguments. I've already fixed this in all the demos.
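
If your code still passes the old boolean, the change is just dropping it – something along these lines (the exact old overload varied, so treat the commented line as illustrative):

// before (old overload with the unused isMipMapped flag):
// Texture t = TextureManager.loadTexture(url, Texture.MM_LINEAR_LINEAR, Texture.FM_LINEAR, true);

// after (boolean removed):
Texture t = TextureManager.loadTexture(url, Texture.MM_LINEAR_LINEAR, Texture.FM_LINEAR);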

Hum… I remember running some tests on DXT quality; the DXT1 output was very ugly compared to DXT3 or DXT5…



Well, if we can override this behavior, it’s good :slight_smile:



Note: S3TC shouldn’t be used with normal mapping (not implemented in the core yet, but maybe some people used it with shaders) or any other complex per-pixel lighting operations. There’s a specific compression extension for this called “3Dc”, but it’s only available on high-end cards (GeForce 6800 and ATI X800). For a comparison of normal mapping with DXT5 enabled vs. 3Dc enabled, go here. But heh, this is off-topic :wink:



Chman

Yeah, DXT1 is only used in the case where you have no alpha channel, and in that case only DXT1 is available (at least going by the S3 spec I read).
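
In other words, the guessing rule boils down to roughly this (a sketch, not the actual TextureManager code; the DXT1 constant name is my guess):

// alpha channel present -> DXT5, otherwise DXT1
int compressedFormat = hasAlpha ? Image.RGBA8888_DXT5 : Image.RGB888_DXT1;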



We’ll keep the normal mapping thoughts in mind for that release. Thanks for bringing that up.

Hmmm… I’m probably missing something, but I just updated with the latest from CVS and had to make changes to my image loading based on the new stuff specified above.



However, in order to make my GUI stuff look right without any image burbs (for lack of a better word), I had to set the type to Image.RGBA8888 instead of -1.





Example:

(Don’t know why the image tags aren’t working…)

Note the strange image ‘indentation’ on the title bar and the lack of smooth transitions from one color to another, both on the title bar and the button.

Just as a follow-up to the above: when using the default -1 setting, jME selects RGBA8888_DXT5 for me. This may just be an issue with using this format on my card.



The (lossy) compression that the system uses is probably not a good idea for graphics like these, meant for a UI where crisp visuals are needed.



Thus, I’m going to keep it hardcoded to uncompressed RGBA8888 until told otherwise.
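
i.e. roughly something like this for the UI textures (the filters and path are just placeholders):

Texture titleBar = TextureManager.loadTexture(
        getClass().getClassLoader().getResource("ui/titlebar.png"), // placeholder
        Texture.MM_LINEAR_LINEAR,
        Texture.FM_LINEAR,
        Image.RGBA8888,  // hardcoded uncompressed format so the GUI stays crisp
        1.0f,
        true);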

Hey guurk, update from CVS and use imageType = Image.GUESS_FORMAT_NO_S3TC.



Image.GUESS_FORMAT will guess the format and use S3TC compression if available.

Image.GUESS_FORMAT_NO_S3TC will guess the format but not allow compression, even if available.



The default if you don’t specify imageType is still Image.GUESS_FORMAT.
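
So, roughly (url and filter constants are placeholders):

// guess the format and compress with S3TC if the card supports it (the default):
Texture compressed = TextureManager.loadTexture(url, Texture.MM_LINEAR_LINEAR,
        Texture.FM_LINEAR, Image.GUESS_FORMAT, 1.0f, true);

// guess the format but never compress (better for crisp GUI images):
Texture uncompressed = TextureManager.loadTexture(url, Texture.MM_LINEAR_LINEAR,
        Texture.FM_LINEAR, Image.GUESS_FORMAT_NO_S3TC, 1.0f, true);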

Works like a charm, thanks.

"renanse" wrote:
The default if you don't specify imageType is still Image.GUESS_FORMAT.

Strange, I just checked out from CVS and unless I specify Image.GUESS_FORMAT the image looks poor. This is with 32-bit PNGs.

Not sure… send me one of your images and I’ll check it out.

It does use DXT5 for images with an alpha channel, do you mean use it for images without an alpha channel?

Renanse, I’m sending you one of the images. The PNG files are using alpha.

Cool, I’ll have a look in the morning. Please include your texturestate setup code. I’ll pop it on a box and have a look.

Renanse, I'm sending you one of the images. The PNG files are using alpha.


Strange that you get bad quality with DXT5 enabled (due to the alpha channel)...

It does use DXT5 for images with an alpha channel, do you mean use it for images without an alpha channel?


Well, at least use DXT3 ;)
It looks far better than DXT1.

Chman

Hmmmmmmmmmm



The NV20 bug… That could be why DXT1 was ugly compared to DXT3 on this computer (I’m not at home at the moment)… Here I’m running a GeForce 3 Ti.



Chman

Yeah, could be… I just tried one of EsotericMoniker’s images with and without S3TC, and while compression does make the edges slightly fuzzier, it’s really not that noticeable…



see here:



No S3TC (defaulted to Image.RGBA8888):

[screenshot]

With S3TC (defaulted to RGBA8888_DXT5):

[screenshot]

Note the difference in FPS… also please note that the texture is not square and not a power of 2…

"renanse" wrote:
It's really not that noticeable...

... On this picture ;) I'll try that at home on my ATI card and see how the result looks!

also please note that the texture is not square and not a power of 2...


:? I think I've missed something... Has this kind of texture support been added to jME? If so, I'm happy to hear that!

Something that could interest you, here: a great article on DXTC and FXT compression, with a lot of detailed tests and screenshots... You'll see the limits of S3TC ;)

Chman


Edit: just want to clarify something – I really like this S3TC support! My post could make you think the opposite...

Non-power-of-2 texture support is card dependent, not jME dependent.
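
If you want to check at runtime whether a card handles it, you can query the extension string yourself – this is plain LWJGL/OpenGL (org.lwjgl.opengl.GL11), not a jME API, and it has to run on the thread that owns the GL context:

// hypothetical check: look for the NPOT extension in the GL extension string
String extensions = GL11.glGetString(GL11.GL_EXTENSIONS);
boolean npotSupported = extensions != null
        && extensions.indexOf("GL_ARB_texture_non_power_of_two") != -1;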



As for FXT, I’m not sure why your link shows S3TC quality as worse than FXT, because his comparisons show S3TC to be better in almost every case… (on compression amount, yes, but quality is much more important today). Note though that it’s a really old comparison (back when GeForce2 cards ruled the earth) and probably isn’t using the best S3TC algorithms we now have available.



From what I can see, if it doesn’t look good with jME’s default S3TC support turned on, you should try specifying RGBA8888_DXT3 or compressing it yourself in Photoshop… Or simply shrug your shoulders and turn off compression. :slight_smile:

"Chman" wrote:
"renanse" wrote:
It's really not that noticeable...

... On this picture ;) I'll try that at home on my ATI card and see how the result looks!
In case anyone wonders, the above pics were on an ATI Radeon 9800XT...

The link was just to show the comparison between uncompressed and compressed textures in special cases (high-quality gradients, noise, etc.); FXT1 is an odd compression system :stuck_out_tongue:

But you’re right about the video card used – I hadn’t paid attention to it…

"renanse" wrote:
In case anyone wonders, the above pics were on an ATI Radeon 9800XT...

Thanks, I have the same card :P
I'll just test using a different set of textures, with gradients and such...

Chman