Inside LWJGLTextureState.load() I need to call glTexImage2D so that it interprets the data as floats rather than bytes.
This is JME's current code:
GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0,
        TextureStateRecord.getGLDataFormat(image.getFormat()),
        image.getWidth(), image.getHeight(), hasBorder ? 1 : 0,
        TextureStateRecord.getGLPixelFormat(image.getFormat()),
        GL11.GL_UNSIGNED_BYTE,
        image.getData(0));
break;
I need to have this:
GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0,
        TextureStateRecord.getGLDataFormat(image.getFormat()),
        image.getWidth(), image.getHeight(), hasBorder ? 1 : 0,
        TextureStateRecord.getGLPixelFormat(image.getFormat()),
        GL11.GL_FLOAT,
        image.getData(0));
break;
Could I request that this be changed from a hardcoded value to a variable inside TextureStateRecord (which I could then access and modify from my library)? The new code would then look like:
GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0,
        TextureStateRecord.getGLDataFormat(image.getFormat()),
        image.getWidth(), image.getHeight(), hasBorder ? 1 : 0,
        TextureStateRecord.getGLPixelFormat(image.getFormat()),
        TextureStateRecord.getGLType(),
        image.getData(0));
break;
It would make sense to put this in all the glTex calls throughout this class.
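To illustrate the request, the TextureStateRecord side could be a simple field with a getter/setter, defaulting to the current unsigned-byte behaviour. This is only a sketch: the method names (getGLType/setGLType) and the static design are my assumptions, not jME's actual API, and the local constants stand in for LWJGL's GL11 values (their numeric values are the standard OpenGL enums).

```java
// Hypothetical sketch of a configurable pixel type in TextureStateRecord.
// Method names and static design are assumptions, not jME's actual API.
public class TextureStateRecord {
    // Standard OpenGL pixel-type enums (same values as GL11.GL_UNSIGNED_BYTE
    // and GL11.GL_FLOAT in LWJGL).
    public static final int GL_UNSIGNED_BYTE = 0x1401;
    public static final int GL_FLOAT = 0x1406;

    // Defaults to the current hardcoded behaviour: unsigned bytes.
    private static int glType = GL_UNSIGNED_BYTE;

    public static int getGLType() {
        return glType;
    }

    public static void setGLType(int type) {
        glType = type;
    }
}
```

My library would then call TextureStateRecord.setGLType(GL_FLOAT) before texture loading, and every glTexImage2D call in LWJGLTextureState that reads TextureStateRecord.getGLType() would pick up the float interpretation automatically.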