and tried to change the class com.jme3.renderer.lwjgl.TextureUtil.java to upload the texture with the “GL_LUMINANCE_INTEGER_EXT” format, but this constant doesn't exist in LWJGL…
So, any idea how to read integer values from a texture? It doesn't even have to go through the real texture coordinate position like in the code above; I just want to use the texture as some kind of array.
I already tried using uniform arrays and they work, but the amount of data that can be uploaded seems to be very limited, and the data seems to be re-uploaded on every rendering step. So I don't think uniform arrays are a good solution for this.
Normal textures aren't designed to hold integer values. You can fill them from your program with bytes in the range 0 to 255, but in your shader a texture lookup will always produce a float in the range 0 to 1. If you want an integer in your shader, you can multiply this float by 255 as you said, but make sure to use only texture coordinates that map to the center of a texel! Otherwise the GPU interpolates between the surrounding texels and you will get something like 10.3 or 9.6 instead of a clean 10.
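To make the texel-center math concrete, here is a minimal Java sketch of the two sides of that round trip. The helper names (`texelCenterU`, `decodeByte`) are hypothetical, not part of any jME3 or LWJGL API; the point is just the `(i + 0.5) / width` coordinate formula and rounding the sampled float back to a byte:

```java
public class TexelCenterDemo {

    // Hypothetical helper: the U coordinate that hits the exact center of
    // texel i in a texture of the given width, so no neighbor is blended in.
    static float texelCenterU(int texelIndex, int textureWidth) {
        return (texelIndex + 0.5f) / textureWidth;
    }

    // Hypothetical helper: a normalized texture sample is value/255, so
    // rounding after scaling recovers the original byte despite float noise.
    static int decodeByte(float sampledValue) {
        return Math.round(sampledValue * 255.0f);
    }

    public static void main(String[] args) {
        // Texel 3 in a 256-wide texture: 3.5 / 256 = 0.013671875
        System.out.println(texelCenterU(3, 256));
        // A sample that came back as 10/255 decodes to the integer 10.
        System.out.println(decodeByte(10.0f / 255.0f));
    }
}
```

The same idea in GLSL would be `int value = int(texture2D(tex, uv).r * 255.0 + 0.5);` with `uv` computed from the texel index as above.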