Too many vertex shader default uniform block components?

I am working on a shader that needs an array of floats holding the definitions used to calculate the vertices per instance, so I gave it a FloatArray uniform to pass this data. But it gives me the following error:

SEVERE: Uncaught exception thrown in Thread[jME3 Main,5,main]
com.jme3.renderer.RendererException: Shader failed to link, shader:Shader[numSources=2, numUniforms=5, numBufferBlocks=0, shaderSources=[ShaderSource[name=MatDefs/guiTBTBackground.frag, defines, type=Fragment, language=GLSL150], ShaderSource[name=MatDefs/guiTBTBackground.vert, defines, type=Vertex, language=GLSL150]]]
error: Too many vertex shader default uniform block components

Here I define the parameters:

    MaterialParameters {
        Int BoundDrawBuffer

        Texture2D atlasTexture
        FloatArray atlasDefinition

        // For instancing
        //Boolean UseInstancing

        // Alpha threshold for fragment discarding
        Float AlphaDiscardThreshold (AlphaTestFallOff)

        Float  time //ToDo find out built in timer 

    }

And this is the vertex shader part where I define the uniforms:

uniform float[20000] m_atlasDefinition;

attribute vec3 inPosition;

varying vec2 texCoord;

const float vertCalc[12]=float[]( 1.0,-0.5, 0.0,   0.0,-0.5, 0.0,   0.0, 0.5, 0.0,   0.0, 0.5, 1.0  );

Got it… 20000 is too many floats for this array. Tuning it down to 5120 works. I did not bother to find out why and what the upper limit would be. If someone can provide the theory behind this, that would be great.

You can query the limit with GL_MAX_VERTEX_UNIFORM_COMPONENTS or GL_MAX_FRAGMENT_UNIFORM_COMPONENTS, depending on the shader stage.

I am 90% sure the returned value counts scalar components rather than whole uniforms, but I would have to recheck the specs.
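For the theory part: the GL spec counts these limits in scalar components, not in whole uniforms, and only guarantees a minimum (1024 for GL_MAX_VERTEX_UNIFORM_COMPONENTS in core GL 3.x); anything above that is driver-dependent, which is why 20000 fails on one GPU while a smaller count passes. A minimal sketch of the budget check, assuming a hypothetical limit of 16384 components and one component per float (note that some drivers pad each array element in the default uniform block to a full vec4, quadrupling the cost):

```java
public class UniformBudget {
    // Hypothetical limit for illustration; the real value must be queried from
    // the driver at runtime via glGetInteger(GL_MAX_VERTEX_UNIFORM_COMPONENTS).
    static final int MAX_COMPONENTS = 16384;

    /** Components a float[] uniform needs in the default block, assuming
        one component per element (no vec4 padding). */
    static int cost(int arrayLength) {
        return arrayLength;
    }

    static boolean fits(int arrayLength) {
        return cost(arrayLength) <= MAX_COMPONENTS;
    }

    public static void main(String[] args) {
        System.out.println("float[20000] fits: " + fits(20000)); // false
        System.out.println("float[5120]  fits: " + fits(5120));  // true
    }
}
```

The limit and the packing rule are assumptions here; only a runtime query against the actual driver gives the real numbers.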

I tuned the number down and, like I wrote before, the error message was gone. But somehow the data does not reach the shader.

I fed the material the definition array (of floats) and attached the material to the geometry:

    void loadMaterial(String _materialName) {
        material = atlas.gui.assetMgr.loadMaterial(_materialName);
        material.setTexture("atlasTexture", atlas.atlasImg);
        material.setParam("atlasDefinition", VarType.FloatArray, atlas.definitionArray);

        geometry.setMaterial(material);
        Utils.LOG("Def test pos#4:" + atlas.definitionArray[4]);
    }

To be sure, I had the app print a value to check that the array was filled. At position #4 the value in the array is 9.0.
In the shader, however, the value is 0.0.

Even when I sized the uniform down to just 16 floats - a number mentioned in another post on this topic - the array was still empty in the shader.

Worked for me; show the j3md and the relevant shader code.

The material parameters from the j3md:

    MaterialParameters {
        Int BoundDrawBuffer

        Texture2D atlasTexture
        FloatArray atlasDefinition

        // For instancing
        //Boolean UseInstancing

        // Alpha threshold for fragment discarding
        Float AlphaDiscardThreshold (AlphaTestFallOff)

        Float  time //ToDo find out built in timer 

    }

And the vert-shader:

uniform float[16] m_atlasDefinition;

attribute vec3 inPosition;

varying vec2 texCoord;
varying vec4 testColor;

const float vertCalc[12]=float[]( 1.0,-0.5, 0.0,   0.0,-0.5, 0.0,   0.0, 0.5, 0.0,   0.0, 0.5, 1.0  );


void main(){

[...]
    
    int offsetAtlas=0;//int(inInstanceData[0].x*20);

    texCoord=vec2(
        m_atlasDefinition[int(offsetAtlas+4+inPosition.x)]/512,
        m_atlasDefinition[int(offsetAtlas+8+inPosition.y)]/512
    );
    
    testColor=vec4(m_atlasDefinition[4]/12,0.0,1.0,1.0);

[...]
    
}

In the frag shader I directly write testColor to gl_FragColor, so I should see a purple rectangle, since I know m_atlasDefinition[4] should be holding a value of 9.0. But the rectangle I get is pure blue.

Looks OK.
Shooting in the dark: is the array you pass 16 floats long?
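One way to rule that out on the Java side (a sketch with stand-in data; `definitionArray` mirrors `atlas.definitionArray` from the posts above) is to assert the length against the declared uniform size before handing the array to the material:

```java
public class LengthCheck {
    // Must match "uniform float[16] m_atlasDefinition" in the .vert file.
    static final int UNIFORM_SIZE = 16;

    static void checkLength(float[] definitionArray) {
        if (definitionArray.length != UNIFORM_SIZE) {
            throw new IllegalStateException("array has " + definitionArray.length
                    + " floats, shader declares " + UNIFORM_SIZE);
        }
    }

    public static void main(String[] args) {
        float[] definitionArray = new float[UNIFORM_SIZE]; // stand-in for atlas.definitionArray
        definitionArray[4] = 9.0f;
        checkLength(definitionArray); // passes silently when the sizes match
        System.out.println("length ok, value at #4 = " + definitionArray[4]);
    }
}
```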

Just a note that the 12 should be 12.0.

I thought that even NVIDIA on Windows throws a warning for those casts.

Yeah, me too. But it occurs to me that I was only making assumptions about which version of JME is being used… and perhaps even then there is a regression or something.

The only way to know for sure is to put an intentional error in the shader so JME dumps the full source with the #version line at the top.

This is what it says:

aug. 18, 2024 9:00:16 P.M. com.jme3.renderer.opengl.GLRenderer updateShaderSourceData
WARNING: Bad compile of:
1	#version 150 core

And maybe noteworthy: I am currently using my travel laptop that only has an onboard graphics card (Intel).

Interesting… it could be that the Intel GPU driver does not give an error about bad float literals.

In which case, it’s also possible that it did integer math instead of float math. And that could explain why yourValue/12 is always 0.

For floating point literals, you should always specify the decimal point or many many drivers will fail to compile your shader at all.
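The integer-vs-float split described above is easy to demonstrate outside GLSL; the same rule applies in plain Java, where 9 / 12 truncates to 0 while 9.0f / 12.0f gives 0.75:

```java
public class DivisionDemo {
    public static void main(String[] args) {
        int   intResult   = 9 / 12;       // integer division truncates toward zero: 0
        float floatResult = 9.0f / 12.0f; // floating-point division: 0.75
        System.out.println("int:   " + intResult);   // prints 0
        System.out.println("float: " + floatResult); // prints 0.75
    }
}
```

So if the driver silently did integer math on `m_atlasDefinition[4]/12`, a red channel of 0 (pure blue output) is exactly what you would get, even with a 9.0 in the array.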

Tested that right away: when I change it to value/12.0 it still gives 0, and multiplying by 0.1 does too.

And in most cases on this laptop, forgetting the decimal point does result in errors, just not in that specific case. I do a lot of int math in other shaders because it is needed for indices, and in most cases it will complain about it.

I am 100% sure the arrays worked with 3.6 and on master a few months ago.
I don't know when I will find the time to boot up the PC and write a test.

If it were me… I’d confirm every other weird thing my fingers might have gotten wrong…

…starting by changing the parameter to a float and seeing if even one value gets through. Or change the parameter name so that code breaks if you are running the wrong things, etc…

“Kilroy debugging”… making sure the code you think is running is even what’s running, etc… because I also agree that setting an array “should” work.