Texture Coordinates

Hey guys,

So I have this simple image:


… and I have a big mesh made up of quads. When I am generating the quads for the big mesh (terrain), I want to assign certain quads the grass, rock, or dirt texture. I’m trying to set the texture coordinates while generating the quads so the right tile gets applied, but I’m having serious difficulty stretching just the tile I want over an entire quad.

How do I use the texture coordinates to do this?

Am I going about this right?

Figured it out –

These would be the texture coordinates to pull out the grass and fill a quad:

[java]Vector2f t1 = new Vector2f(0.25f, 0.875f);
Vector2f t2 = new Vector2f(0.375f, 0.875f);
Vector2f t3 = new Vector2f(0.25f, 1f);
Vector2f t4 = new Vector2f(0.375f, 1f);[/java]

Notice that 0.25f is the X position of the grass tile as a fraction of the image width (0.25 * 256 (image width) = 64 pixels over).

0.125f is the tile size as a fraction of the image (32 / 256 pixels), so 0.25f + 0.125f = 0.375f marks the right edge of the grass tile.

What was really screwing me up: the second texture coordinate (Y) is measured from the bottom of the image, so 1f is actually pixel row 0 (the top of the image).
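The arithmetic above can be generalized. Here is a minimal sketch of a helper that computes the four UV corners for any tile in the atlas, assuming a 256x256 image with 32x32 tiles and tiles addressed by (column, row) from the top-left; the class and method names are my own, not part of any engine API:

```java
// Hypothetical helper: computes atlas texture coordinates for one tile.
// Assumes a 256x256 atlas divided into 32x32 tiles (an 8x8 grid).
public class AtlasUV {
    static final float ATLAS_SIZE = 256f;
    static final float TILE_SIZE  = 32f;

    // col and row are counted from the top-left of the image.
    // Returns {u0, v0, u1, v1}, with V flipped so that row 0 maps
    // to the top of the image (v = 1), matching the coordinates above.
    static float[] tileUV(int col, int row) {
        float step = TILE_SIZE / ATLAS_SIZE;       // 0.125f per tile
        float u0 = col * step;                     // left edge
        float u1 = u0 + step;                      // right edge
        float v1 = 1f - row * step;                // top edge (flipped)
        float v0 = v1 - step;                      // bottom edge
        return new float[] { u0, v0, u1, v1 };
    }

    public static void main(String[] args) {
        // The grass tile at column 2, row 0 reproduces the values above.
        float[] uv = tileUV(2, 0);
        System.out.printf("u0=%.3f v0=%.3f u1=%.3f v1=%.3f%n",
                uv[0], uv[1], uv[2], uv[3]);
        // → u0=0.250 v0=0.875 u1=0.375 v1=1.000
    }
}
```

The four returned corners map directly onto the four `Vector2f` texture coordinates of a quad.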

Is it possible to use the Lighting material with such a texture atlas?

In our game bloxel we are using such a texture atlas too … I tried to create a normal map for it, but the result doesn’t look so good :slight_smile:

Does the Lighting material support a normal-map atlas?

Why not create the quad in Blender the way you want? Then everything works fine.


Yes, it works with the Lighting material too (as a “DiffuseMap”). I am using the Lighting material myself now.


… because I am procedurally generating the quads into a large mesh at runtime, and each quad could be assigned any type of texture, not just grass. If I had a fixed quad (or mesh), Blender would likely be the better option.