texture3D() deprecated?!

Hi everyone,

In my game I use a 3D texture and access its data with the texture3D() function in the fragment shader. My problem is that although texture3D() has been marked deprecated since GLSL 130, it works perfectly on my Radeon R7970; on another computer with an Nvidia card it does not work and instead returns: “error C7533: global function texture3D is deprecated after version 120”.

So my guess is that the Nvidia card supports a newer version of GLSL that removes the long-deprecated texture3D() function. I tried to fix the problem by replacing all texture3D() calls with texture() calls (as suggested in the GLSL documentation), but the compiler tells me that texture() has no overload applicable to the types I provide - yet the GLSL documentation clearly states that GLSL 150 supports this call: texture(gsampler3D sampler, vec3 P [, float bias]).
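For reference, here is a minimal #version 150 fragment shader sketch (the uniform and in/out names are just illustrative) that samples a 3D texture with exactly that overload:

[java]
#version 150

uniform sampler3D m_LightMap0;

in vec3 texCoord;
out vec4 fragColor;

void main() {
    // texture() replaces the deprecated texture3D();
    // this resolves to the overload texture(gsampler3D, vec3 [, float bias])
    fragColor = texture(m_LightMap0, texCoord);
}
[/java]

If something this small compiles, the problem is presumably somewhere in the surrounding shader code rather than in the overload itself.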

My texture call looks like this, for example:

[java]
uniform sampler3D m_LightMap0;
uniform sampler3D m_LightMap3;
uniform sampler3D m_LightMap6;

// […]

vec4 getLightMapValueAt(in vec3 blockPosition) {
    vec3 shadowMapPosition = vec3(modulo(blockPosition.x, 32.0) / 32.0,
                                  blockPosition.y / 128.0,
                                  modulo(blockPosition.z, 32.0) / 32.0);

    if (blockPosition.x < 0.0) {
        if (blockPosition.z < 0.0) {
            return texture(m_LightMap0, shadowMapPosition);
        } else if (blockPosition.z < 32.0) {
            return texture(m_LightMap3, shadowMapPosition);
        } else {
            return texture(m_LightMap6, shadowMapPosition);
        }
    }
}

// […]
[/java]

So my questions are:

  • Why can’t I use texture() instead of texture3D() when I specify GLSL 150 in my j3md file?
  • Is there a way to pin the version of the shader? If my guess is correct and Nvidia’s compiler has not only deprecated texture3D() but removed it entirely, then I need to tell the compiler to use version 150 while still accepting the texture3D() call.
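(For what it’s worth, desktop GLSL does have a compatibility profile that keeps the deprecated built-ins available - though since jME inserts the #version line itself, I don’t see how to request that from a j3md file. In raw GLSL it would look something like this:)

[java]
#version 150 compatibility

uniform sampler3D m_LightMap0;

void main() {
    // texture3D() and gl_FragColor are removed from the core profile,
    // but still legal under the compatibility profile
    gl_FragColor = texture3D(m_LightMap0, vec3(0.5));
}
[/java]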

What does your j3md file look like when you get: “error C7533: global function texture3D is deprecated after version 120”?

It looks like this:

[java]
MaterialDef Phong Lighting {

    MaterialParameters {

        Boolean Sparkle

        // Attributes
        Texture2D GlobalAttributesMap

        // Texture of the lit parts
        Texture3D LightMap0
        Texture3D LightMap1
        Texture3D LightMap2
        Texture3D LightMap3
        Texture3D LightMap4
        Texture3D LightMap5
        Texture3D LightMap6
        Texture3D LightMap7
        Texture3D LightMap8

        Boolean Holographic

        TextureArray DiffuseTextures
        Texture2D DiffuseTexture
        TextureArray GlowTextures
        TextureArray SpecularTextures
        TextureArray SpecularWithSparklingTextures

        // Additional vector for fallen items
        Vector3 ShiftVector
        Matrix3 ShiftRotation
    }

    Technique {

        VertexShader GLSL150:   MatDefs/Lighting.vert
        FragmentShader GLSL150: MatDefs/Lighting.frag

        WorldParameters {
            WorldViewProjectionMatrix
            NormalMatrix
            WorldViewMatrix
            ViewMatrix
            ProjectionMatrix
            CameraPosition
            WorldMatrix
            Time
        }

        Defines {
            LIGHTMAPSHADING : LightMap0
            TEXTUREATLAS : DiffuseTextures
            SHIFT : ShiftVector
        }
    }
}
[/java]

“GLSL150” has the effect of inserting #version 150 into your shader, which may be the issue.
Try with GLSL100.

@nehon said: "GLSL150" this has the effect of inserting #version 150 in your shader, which may be the issue. try with GLSL100

I can’t test that because I don’t have the Nvidia card - and my shader also uses features that need at least 150 … so this is not really a solution for me :wink:

Hmm … I just recreated the case in a small sample app, and there everything seems to work. Very odd … I guess it has something to do with my code or the engine version I use …

By the way: is there a way to get the shader code that is sent to the compiler? I can’t really make much sense of the line numbers when JME adds/changes things beforehand.

When it fails, you should see a huge dump of shader with line numbers. Check the log and stderr.

@abies said: When it fails, you should see a huge dump of shader with line numbers. Check the log and stderr.

Yes - you’re right, my fault: I disabled the output by setting the log level to SEVERE. Still, I don’t get why a shader error is not considered severe :wink:

OK - now it’s official: I’m an idiot :wink: Look at this function:

[java]
vec4 getValueAt(in sampler3D texture, in vec3 blockPosition) {
    blockPosition.x += 3.0;
    blockPosition.z += 3.0;

    vec3 shadowMapPosition = vec3(blockPosition.x / 38.0, blockPosition.y / 128.0, blockPosition.z / 38.0);

    return texture(texture, shadowMapPosition);
}
[/java]

Do you see the problem? No? I didn’t see it either, but the problem is simply that texture is defined as a parameter and thereby shadows the built-in function of the same name. That’s why I couldn’t use texture() as described in the standard. And it doesn’t have anything to do with the code fragment I posted before (that was from another place in the source - I just thought it was the problem).

So if you change it like this it suddenly works:

[java]
vec4 getValueAt(in sampler3D atexture, in vec3 blockPosition) {
    blockPosition.x += 3.0;
    blockPosition.z += 3.0;

    vec3 shadowMapPosition = vec3(blockPosition.x / 38.0, blockPosition.y / 128.0, blockPosition.z / 38.0);

    return texture(atexture, shadowMapPosition);
}
[/java]

So thanks, everyone, for being patient with me while I solved the problems I created myself :wink:

I saw yesterday that Khronos has released a reference GLSL compiler. It would be really great to have it as an IDE plugin to test-compile shader code the same way the IDE does for Java code.

http://www.opengl.org/sdk/tools/glslang/

Too bad it doesn’t have an OS X port, so I can’t test it.