Question about texture formats

Solved (but another issue appeared).

First, I found that on Intel graphics cards there is a problem with LTC and LATC formats, which we use for normals…
All we need to do is run a small test:

renderer.getCaps().contains(Caps.TextureCompressionLATC);

if not…

manager.registerLoader(J3MSmartLoader.class, "j3m");
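
Putting those two pieces together, this is roughly how it looks in the init code (a minimal sketch assuming a standard SimpleApplication, where renderer and assetManager are the inherited fields; the Main class name is only for illustration, J3MSmartLoader is our own loader class):

import com.jme3.app.SimpleApplication;
import com.jme3.renderer.Caps;

public class Main extends SimpleApplication {

    @Override
    public void simpleInitApp() {
        // If the GPU cannot decode LATC/BC5, register the loader that
        // rewrites normal-map texture paths to a supported format.
        if (!renderer.getCaps().contains(Caps.TextureCompressionLATC)) {
            assetManager.registerLoader(J3MSmartLoader.class, "j3m");
        }
    }
}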

and inside J3MSmartLoader, in readValue I’m testing:

        int idx = texturePath.lastIndexOf("bc5.dds");
        if (idx > -1)
        {
            texturePath = texturePath.subSequence(0, idx).toString();
            texturePath += "dxt5nm.dds";
        }

So I only need to keep proper names for the textures; the rest is done automatically.
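
To make the rewrite concrete, here is the same check run against a made-up texture name (RemapDemo and the path are purely illustrative):

public class RemapDemo {
    public static void main(String[] args) {
        String texturePath = "Textures/skull_normal_bc5.dds"; // made-up example path
        int idx = texturePath.lastIndexOf("bc5.dds");
        if (idx > -1) {
            texturePath = texturePath.subSequence(0, idx).toString();
            texturePath += "dxt5nm.dds";
        }
        System.out.println(texturePath); // prints Textures/skull_normal_dxt5nm.dds
    }
}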
Finally I ran Skullstone on an Intel graphics card, but… let’s see:

I compared Intel’s caps with NVidia’s caps. The only difference is the lack of LATC and TextureBuffer.
Is the TextureBuffer cap important in deferred shading? Does anyone have experience with their own deferred pipeline on Intel?

EDIT: is there a way to disable the TextureBuffer cap on a machine that supports it? I found GL11.glDisable() but cannot find a proper value for TextureBuffer.
I just need to run the game without that particular cap and see if it has the same graphics issues.

The subSequence() call caught my eye and I was wondering if there is an advantage to it over substring(), i.e.:
texturePath = texturePath.substring(0, idx);

…or is it an artifact of method-completion coding?

Heh, I wanted substring() but I was in a hurry… Thanks for pointing that out :wink:

Ok, just making sure there wasn’t some new lore I didn’t know about. :slight_smile:

How about using a DDS loader in software and converting the textures to uncompressed ones on the fly if the GPU cannot deal with DDS directly?
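
Something along these lines, perhaps (a rough sketch against jME’s AssetLoader/DDSLoader API; FallbackDDSLoader and its two private helpers are hypothetical names, and the actual block decompression would still have to be written):

import java.io.IOException;

import com.jme3.asset.AssetInfo;
import com.jme3.asset.AssetLoader;
import com.jme3.texture.Image;
import com.jme3.texture.plugins.DDSLoader;

public class FallbackDDSLoader implements AssetLoader {

    private final DDSLoader delegate = new DDSLoader();

    @Override
    public Object load(AssetInfo assetInfo) throws IOException {
        Object result = delegate.load(assetInfo);
        if (result instanceof Image) {
            Image image = (Image) result;
            if (isUnsupportedOnThisGpu(image.getFormat())) {
                // Hypothetical CPU-side decode to an uncompressed format.
                return decompressToRgba8(image);
            }
        }
        return result;
    }

    // Hypothetical: would consult the renderer caps gathered at startup.
    private boolean isUnsupportedOnThisGpu(Image.Format format) {
        throw new UnsupportedOperationException("not implemented in this sketch");
    }

    // Hypothetical: would unpack the compressed blocks into an RGBA8 image.
    private Image decompressToRgba8(Image image) {
        throw new UnsupportedOperationException("not implemented in this sketch");
    }
}

I think registering it for the dds extension, the same way as the j3m loader above, would then replace the stock DDS loader.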

Looks like I solved the DDS problems. Only a few formats are not supported, so I can replace them with supported ones. I only need to keep two normal-map textures for every model that needs the best quality (BC5); the rest look good with DXT3nm.

Now I need to find out what causes the white-pink-blue pony image :wink: Additionally, the game is very unstable on Intel. Today I’ll debug it on a similar machine.

In my experience, Intel drivers are poor and unreliable, especially on older hardware. Newer ones are better… :expressionless:

Tell your users to upgrade.
Or buy a decent GPU.

An AAA game studio can tell its users anything. I’m (we all here are) an indie, so we shouldn’t do that. It’s a matter of the quality of the whole game, including graphics.

The funny thing is that I fixed a few small issues and the game finally works on Intels - old Intels!
There are still serious issues on GPUs integrated with CPUs (i5, i7 and so on), so now I’m looking for a laptop with such a card.

The strange thing: the shader function distance(p1, p2) is causing the game to crash on Intels.