Shader: unsized uniform array

Hello!
I need to make additional markings on an object. I could do this with a texture, but it would require a lot more vertices and triangles and, of course, an adjusted texture atlas.

So I am trying it with my own fragment shader: in Java I create an array of ints. The actual size of that array depends on the 3D object.

[java]
uniform int[] m_MyArray;
uniform int m_Columns;

varying float row;    // passed from vertex shader
varying float column; // passed from vertex shader

void main() {
    int index = int(floor(row) * m_Columns + floor(column));
    int value = m_MyArray[index];

    ...
}
[/java]

Running that code I get the following message: error C7559: OpenGL requires constant indexes for unsized array access (m_MyArray).

It seems that the only possible solution is to use a DEFINE in the material definition and use it in the shader code. But that causes recompilation of the shader. There can be hundreds of such objects in the scene, and each object will probably have its own array size. Will that number of shader variants slow down the rendering rate, or is it not that important?
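For reference, the define-based route would look roughly like this on the shader side; ARRAY_SIZE is just whatever define name gets mapped to a material parameter in the material definition, and every distinct size then produces its own compiled shader variant:

[java]
// ARRAY_SIZE is assumed to be injected as a material Define,
// giving the array a compile-time constant size.
uniform int m_MyArray[ARRAY_SIZE];
uniform int m_Columns;

varying float row;
varying float column;

void main() {
    int index = int(floor(row) * float(m_Columns) + floor(column));
    int value = m_MyArray[index];
    // ...
}
[/java]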

Why not encode your ints in a 1D texture? That way you need no recompile.
Pass the number of pixels as a single int so you can calculate the correct texture coordinates,
and use the sampler function that allows you to specify the mipmap level.
Remember to disable filtering for the 1D texture.
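Roughly, the shader side would be something like this (just a sketch: m_Data and m_Size are placeholder names for the data texture, one value per texel, and its width in texels; textureLod needs GLSL 1.30 or later in the fragment shader, and with Nearest filtering and no mipmaps a plain texture2D lookup works too):

[java]
uniform sampler2D m_Data; // data texture, Nearest filtering, no mipmaps
uniform int m_Size;       // number of texels

vec4 fetchValue(int index) {
    // sample the center of texel 'index'; lod 0.0 = the full-size texture
    float u = (float(index) + 0.5) / float(m_Size);
    return textureLod(m_Data, vec2(u, 0.5), 0.0);
}
[/java]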


Thank you for the fast reply!
I forgot to mention that I already tried 2D textures, but texture2D() returns an interpolated value. Maybe I missed a filter setting.
Could you please give me a link to read or an example?

https://www.khronos.org/opengles/sdk/docs/man3/html/textureLod.xhtml
That should help; you can specify the wanted LOD level there, i.e. level 0 for the full-size texture version.

The rest is done in JME directly on the Texture2D object.
MagFilter and MinFilter should be set to Nearest, so it always uses the nearest pixel instead of interpolating.

If it is possible to define a global maximum length for the array, I would go that route:

[java]
uniform int myArray[ARRAY_GLOBAL_LIMIT];
uniform int currentLength;
[/java]

@dendrit said: The actual size of that array depends on the 3D object.

Sounds a little bit like per-vertex data?


After rethinking, I guess it depends on the size and the usage which route I would go.

@zzuegg said:

Sounds a little bit like per-vertex data?

No, the array itself is not per-vertex data. In my code example, row and column are vertex data passed via VertexBuffer.Type.TexCoord2 to the vertex shader (as a vec2) and then to the fragment shader as separate varying values.
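The vertex shader side of that looks roughly like this (a simplified sketch; the attribute and uniform names follow the usual jME conventions, and which component holds the row and which the column is just how I show it here):

[java]
uniform mat4 g_WorldViewProjectionMatrix;

attribute vec3 inPosition;
attribute vec2 inTexCoord2; // filled from VertexBuffer.Type.TexCoord2

varying float row;
varying float column;

void main() {
    // unpack the per-vertex (row, column) pair into two separate varyings
    row = inTexCoord2.x;
    column = inTexCoord2.y;
    gl_Position = g_WorldViewProjectionMatrix * vec4(inPosition, 1.0);
}
[/java]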

@Empire Phoenix said: https://www.khronos.org/opengles/sdk/docs/man3/html/textureLod.xhtml That should help; you can specify the wanted LOD level there, i.e. level 0 for the full-size texture version.

The rest is done in JME directly on the Texture2D object.
MagFilter and MinFilter should be set to Nearest, so it always uses the nearest pixel instead of interpolating.

MagFilter and MinFilter are set to Nearest by default, aren’t they?

P.S. Using a DEFINE failed: the array was too large.

I’m not sure what the defaults are, might be, might not; better safe and just set them always.

@dendrit said: MagFilter and MinFilter are set to Nearest by default, aren't they?

No, because that would make 95% of use-cases for 3D graphics extremely ugly. All games would look like Minecraft by default so more sensible defaults are used.

If you need discrete pixels then you have to set those yourself.


Thank you for the answers!
I create a Texture2D of 3x1000 pixels (the exact values depend on the object).
JME adjusts the size of the texture before passing it to the GPU. Will that affect the way to get a pixel's vec4 in the shader?

@pspeed said: No, because that would make 95% of use-cases for 3D graphics extremely ugly. All games would look like Minecraft by default so more sensible defaults are used.

If you need discrete pixels then you have to set those yourself.

But the Javadoc on the Texture class says: “Default values are as follows: minificationFilter - NearestNeighborNoMipMaps, magnificationFilter - NearestNeighbor”
And in the code, MinFilter.BilinearNoMipMaps and MagFilter.Bilinear are set.

@dendrit said: But the Javadoc on the Texture class says: "Default values are as follows: minificationFilter - NearestNeighborNoMipMaps, magnificationFilter - NearestNeighbor" And in the code, MinFilter.BilinearNoMipMaps and MagFilter.Bilinear are set.

Yes, I guess the javadoc is wrong. Those would be very strange defaults so maybe it’s always been wrong.

Thanks to all! I did it!
For those who are interested, here are some code snippets:

Java:
[java]
import java.nio.ByteBuffer;

import com.jme3.math.ColorRGBA;
import com.jme3.texture.Image;
import com.jme3.texture.Texture;
import com.jme3.texture.Texture2D;
import com.jme3.texture.image.ImageRaster;
import com.jme3.util.BufferUtils;

// one RGBA8 texel per (row, column) cell
ByteBuffer buffer = BufferUtils.createByteBuffer(4 * rows * columns);
Image image = new Image(Image.Format.RGBA8, columns, rows, buffer);
ImageRaster raster = ImageRaster.create(image);

for (int row = 0; row < rows; row++) {
    for (int column = 0; column < columns; column++) {
        // setPixel expects (x, y), i.e. (column, row)
        raster.setPixel(column, row, new ColorRGBA(/*Your color here*/));
    }
}
image.setUpdateNeeded();

Texture2D roadMarking = new Texture2D(image);
roadMarking.setMagFilter(Texture.MagFilter.Nearest);
roadMarking.setMinFilter(Texture.MinFilter.NearestNoMipMaps);
roadMarking.setWrap(Texture.WrapMode.Clamp);

// roadMat is the Material that uses the custom material definition
roadMat.setTexture("Marking", roadMarking);
roadMat.setInt("Columns", columns);
roadMat.setInt("Rows", rows);
[/java]

Shader. Notice the floor(column) + 0.000001; it is needed for the three-column case, so the coordinate does not land exactly on a texel border.

[java]
uniform sampler2D m_Marking;
uniform int m_Columns;
uniform int m_Rows;

varying float row;
varying float column;

void main() {
    vec2 position = vec2((floor(column) + 0.000001) / float(m_Columns),
                         floor(row) / float(m_Rows));
    vec4 color = texture2D(m_Marking, position);
    ...
}
[/java]

Here is the result and its wireframe.

Oh cool, is this for road markings?

Could you maybe briefly explain how the general idea behind this works? I kind of struggle with that in one of my projects.

Cool that you got it working.

I would also be interested in the underlying idea. I have to admit that I still don’t get what you are doing with the Columns/Rows to make the road markings (are those the white lines/stripes?) work.

In general, the steps are as follows:

1) Generate the road mesh from anchor points.

2) Generate a marking mask for each lane for each meter of lane,

e.g. markingData[lane][meter] = 0b0011111100;

Actually there are 4 masks for each lane for each meter: 2 for the thin marking lines touching the neighbouring lanes and 2 for the internal marking lines.
That makes it possible to handle single markings (staggered or solid) as well as double markings (solid-staggered, like on the screenshot, or double solid).

3) In the vertex shader, get the per-vertex lane number and the distance from the beginning of the road and pass them to the fragment shader via varyings.

4) In the fragment shader, get the marking mask for the current lane and the current meter. Get the bit number with floor(fract(distance) * 8) and extract that bit from the corresponding mask (see the sketch after this list).

I succeeded with only one IF statement in the shader and one array of floats.
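As an illustration of step 4, a single bit can be extracted from a mask stored as a float without bitwise operators (which are not available in older GLSL versions). This is only a sketch with hypothetical names, not the exact shader from the project:

[java]
// mask:     marking mask for the current lane/meter, stored as a float (e.g. 0..255)
// distance: distance from the beginning of the road, in meters
float markingBit(float mask, float distance) {
    // 8 bits per meter: which bit of the mask does this fragment fall into?
    float bit = floor(fract(distance) * 8.0);
    // "shift right" by 'bit' and test the lowest bit, using only float math
    return mod(floor(mask / pow(2.0, bit)), 2.0); // 1.0 if the bit is set, else 0.0
}
[/java]

The returned value could then feed the single IF mentioned above to decide whether a marking pixel gets drawn.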

@zzuegg said: Cool that you got it working.

I would also be interested in the underlying idea. I have to admit that I still don’t get what you are doing with the Columns/Rows to make the road markings (are those the white lines/stripes?) work.

Row and column are actually the lane number and the distance from the beginning. It’s a bit tricky: texture2D accepts float coordinates from 0 to 1. In the shader you only know the current lane number (floor(lane)), so you have to divide the current lane number by the total number of lanes to get a value you can pass to texture2D. The same goes for the distance.

Hope that helps you understand it.


I see, I think I got the technique you are using.