GLSL uniform sampler2D and texture units

Just tried out multitexturing with GLSL on my Nvidia GeForce Go 7400 - it won't work! Apparently, only the first texture unit is used. I probably got something fundamentally wrong somewhere, so now I'm asking for help in understanding all this. According to http://en.wikipedia.org/wiki/Comparison_of_NVIDIA_Graphics_Processing_Units#GeForce_Go_7_series , the chip has 4 "texture mapping units" - isn't that what "uniform sampler2D" in GLSL refers to? If so, shouldn't the following produce a blend halfway between the textures in units 0 and 1?



Vertex shader:


void main(){
    gl_TexCoord[0] = gl_MultiTexCoord0;
    gl_TexCoord[1] = gl_MultiTexCoord0;
    gl_Position = ftransform();
}



Fragment shader:

uniform sampler2D TextureUnit0;
uniform sampler2D TextureUnit1;
void main(){
    vec4 col0 = texture2D(TextureUnit0, vec2(gl_TexCoord[0]));
    vec4 col1 = texture2D(TextureUnit1, vec2(gl_TexCoord[0]));
    gl_FragColor = 0.5 * (col0 + col1);
}


It doesn't really work; apparently it only interpolates between TextureUnit0 and TextureUnit0 again (I found this out when playing with the texture coordinates).
It doesn't work in another program (Typhoon Lab's ShaderDesigner) either, so it's definitely not a jME problem. Still, I thought it might be most appropriate in Effects, since it's not really what I'd call "Offtopic"…

Are you setting uniforms for the texture units in jME? Like this:



yourShader.setUniform( "TextureUnit0", 0 ); // tell the sampler named "TextureUnit0" to read from texture unit 0

yourShader.setUniform( "TextureUnit1", 1 ); // etc…
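
This isn't jME-specific, by the way: sampler uniforms work the same in plain OpenGL. Here's a rough C sketch of what setUniform boils down to at the raw GL level (the helper name and variables are made up for illustration, and it assumes GL 2.0 entry points are loaded, e.g. via GLEW):


#include <GL/glew.h>

/* Hypothetical helper: bind two textures to units 0 and 1 and point
   the sampler uniforms at them. "program", "tex0" and "tex1" are
   assumed to be a linked shader program and two valid texture IDs. */
void setupMultitexturing(GLuint program, GLuint tex0, GLuint tex1)
{
    glUseProgram(program);

    /* Put each texture object on its own texture unit. */
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, tex0);
    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_2D, tex1);

    /* Tell each sampler which unit to read from. Sampler uniforms
       default to 0, which is why both lookups returned the same
       texture before the uniforms were set. */
    glUniform1i(glGetUniformLocation(program, "TextureUnit0"), 0);
    glUniform1i(glGetUniformLocation(program, "TextureUnit1"), 1);
}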

Being totally "noob" to GLSL, I didn't. That solved my problem, thanks! Actually, I was wondering how my program was supposed to know about the texture units, but I didn't figure I had to set them as uniforms. The tutorial I thought was a good starting point didn't mention this either (or maybe I just read over it…). I assume this is not jME-specific, but common in OpenGL?



EDIT: Never mind, on second thought it's actually quite obvious that you have to set uniforms when you use the "uniform" keyword… Oh well, looks like I should put that to rest for today and get a little distraction before I continue to embarrass myself in public!

It's not very clear at all, and should really be put in the wiki with an exclamation mark, so nothing to feel nooby about :wink:

I'll see if I get around to doing that…

It's good that it works. Just a thing to note: some GLSL implementations are broken (for example, ATI on my laptop) and only consider the first 8 characters of uniform/attribute names. So if I may suggest, use "Tex0Unit" and "Tex1Unit" instead of "TextureUnit0" and "TextureUnit1". It's damn hard to find these kinds of bugs.

Whoa, good to know. Thanks for mentioning this! I really love this community around here  :slight_smile:

Got any document about that max-8-characters thing? Proven, or just something you know by trial and error?

It's by trial and error. I googled it just now but couldn't find anything on it, except this thread on opengl.org:



http://www.opengl.org/cgi-bin/ubb/ultimatebb.cgi?ubb=get_topic;f=11;t=001166#000000



It says that the max length of uniform/attribute names is implementation-specific, and that there are functions for querying the length. On the other hand, I doubt ATI deliberately implemented this as 8 characters. It's more of a bug: the compiler accepts names longer than 8 characters, but if two names don't differ in their first 8 characters, the result isn't what it's supposed to be.
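
If you want to check what the driver actually stored, GL 2.0 can enumerate the active uniform names after linking. A small C sketch (assuming a successfully linked program object and loaded GL 2.0 entry points, e.g. via GLEW):


#include <GL/glew.h>
#include <stdio.h>

/* Print every active uniform name as the driver reports it,
   which is handy for spotting truncated or aliased names.
   "program" is assumed to be a linked program object. */
void dumpActiveUniforms(GLuint program)
{
    GLint count = 0;
    glGetProgramiv(program, GL_ACTIVE_UNIFORMS, &count);

    char name[256];
    GLint size;
    GLenum type;
    for (GLint i = 0; i < count; ++i) {
        glGetActiveUniform(program, (GLuint)i, sizeof(name), NULL,
                           &size, &type, name);
        printf("uniform %d: %s\n", i, name);
    }
}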

My GLSL code works very well with ATI and Nvidia, and I use more than 8 characters in uniform names.

Are you sure that the current driver is installed on your system?





GLSL support is implemented in the driver by the graphics-card vendor, isn't it?  :?