Generating A Texture

Hello,
While writing terrain shaders I have come to the point where I cannot get around using a lookup texture for permutations (mainly for noise). I was wondering if it was possible to generate a texture for this purpose.
Thanks muchly

Yes.

You can use a custom shader to generate the texture on the GPU side or use the ImageRaster class to generate the texture on the CPU side.


Yeah, I didn’t mean to be terse but not enough info was provided to really provide a good answer beyond “yes it’s possible”. There are about a dozen ways to make a noise texture each with pretty large tradeoffs depending on what you actually want to do and what your source data is.


You can generally think of textures as arrays for shaders; they are basically your only means of getting lots of data over to the GPU with the least amount of overhead. Basically they are just that: arrays. Update the texture on the CPU side and it will be sent to the GPU for further processing. Note however that the transfer of the data is by far the most time-intensive part of this. It might not be intuitive, but changing the data causes much more overhead than even the most complex computations on the data once it's already on the GPU.

So no, it doesn’t matter if you loaded the texture with the assetmanager or generated every pixel with java code, they will work the same.
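To make the "textures are arrays" idea concrete, here is a minimal, engine-free Java sketch (plain NIO buffers, no jME classes; the class and method names are made up for illustration) that fills a width×1 RGBA8 buffer the same way a permutation/noise lookup texture would be filled before upload:

```java
import java.nio.ByteBuffer;

public class LookupTable {
    // Fill a width x 1 RGBA8 buffer with a grayscale ramp -- exactly the
    // kind of data a permutation/noise lookup texture holds. This buffer
    // is what you would hand to the engine's Image for upload to the GPU.
    static ByteBuffer grayscaleRamp(int width) {
        ByteBuffer data = ByteBuffer.allocate(width * 4); // 4 bytes per RGBA8 texel
        for (int x = 0; x < width; x++) {
            byte v = (byte) (x * 255 / (width - 1)); // 0..255 ramp
            data.put(v).put(v).put(v).put((byte) 255); // R, G, B, A
        }
        data.flip(); // ready for reading/upload
        return data;
    }

    public static void main(String[] args) {
        ByteBuffer ramp = grayscaleRamp(256);
        System.out.println(ramp.remaining());         // 1024 bytes total
        System.out.println(ramp.get(0) & 0xFF);       // first texel red = 0
        System.out.println(ramp.get(255 * 4) & 0xFF); // last texel red = 255
    }
}
```

The expensive step is handing that buffer to the driver for upload, not filling it, which is why you want to avoid re-uploading every frame.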

<cite>@normen said:</cite> You can generally think of textures as arrays for shaders; they are basically your only means of getting lots of data over to the GPU with the least amount of overhead. Basically they are just that: arrays. Update the texture on the CPU side and it will be sent to the GPU for further processing. Note however that the transfer of the data is by far the most time-intensive part of this. It might not be intuitive, but changing the data causes much more overhead than even the most complex computations on the data once it's already on the GPU.

So no, it doesn’t matter if you loaded the texture with the assetmanager or generated every pixel with java code, they will work the same.

<cite>@kwando said:</cite> You can use a custom shader to generate the texture on the GPU side or use the ImageRaster class to generate the texture on the CPU side.

Ah ok that clears a lot up. Thanks for the time.

So I decided to do a test and generate a texture on the CPU side using ImageRaster, then sample it to set a mesh's color. But wherever I sample it, the mesh is still white…
Here’s the java code where I make the mesh and texture:

// Size the buffer for the actual 256x1 image (it was sized for 64x64 before,
// which doesn't match the image dimensions below)
ByteBuffer data = BufferUtils.createByteBuffer(
        (int) Math.ceil(Format.RGBA16F.getBitsPerPixel() / 8.0) * 256 * 1);
Image img = new Image();
img.setHeight(1);
img.setWidth(256);
img.setFormat(Format.RGBA16F);
img.setData(data);
ImageRaster raster = ImageRaster.create(img);
for (int x = 0; x < 256; x++) {
    // ColorRGBA components run 0..1, so normalize the 0..255 index
    float v = x / 255f;
    raster.setPixel(x, 0, new ColorRGBA(v, v, v, 1f));
}
Texture tex = new Texture2D(img);
System.out.println(raster.getPixel(32, 0));
Mesh mesh = new Sphere(128, 128, 1);
Geometry g = new Geometry("Mesh", mesh);
Material m = new Material(assetManager, "mat.j3md");
m.setTexture("table", tex);
g.setMaterial(m);
rootNode.attachChild(g);

and here is how I sample it:

color = vec4(texture2D(table, vec2(150.0,0.0)));

Am I doing something horribly incorrectly?
Thanks

color = vec4(texture2D(table, vec2(150.0,0.0)));

150 is about 150x outside the range of texture coordinates (which run 0.0 to 1.0), so you are probably just sampling the same edge texel over and over. You can either enable texture repeat (wrap mode) on the texture or keep your texture coordinates within 0 to 1.