How to set a region of a texture on a quad?

Given a rectangle (x, y, width, height) on a certain texture, how do I set this region of the texture on a quad?

After looking at the Unshaded and spritesheet materials, I’m achieving this via a custom material which has four parameters - one for the width ratio, another for the horizontal offset ratio, and likewise two more for the vertical coordinates - and I’m computing the texture coordinates in that custom material as follows:

uniform mat4 g_WorldViewProjectionMatrix;
uniform float m_dw, m_dh, m_dx, m_dy;
attribute vec3 inPosition;
attribute vec2 inTexCoord;
varying vec2 texCoord;
void main() {
    texCoord.x = inTexCoord.x * m_dw + m_dx;
    texCoord.y = inTexCoord.y * m_dh + m_dy;
    gl_Position = g_WorldViewProjectionMatrix * vec4(inPosition, 1.0);
}

I’m pre-computing the m_dw parameter as the ratio of the crop rectangle width to the texture width, and so on.
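For reference, the Java side of that pre-computation looks roughly like this (a sketch; it assumes the custom MatDef declares Float parameters named dw, dh, dx, dy, which map to m_dw etc. in the shader, and "MatDefs/Crop.j3md" is a placeholder path):

Material mat = new Material(assetManager, "MatDefs/Crop.j3md"); // placeholder .j3md path

// Crop rectangle and texture size in pixels (example values)
float texWidth = 1024f, texHeight = 1024f;
float x = 128f, y = 256f, w = 64f, h = 32f;

mat.setFloat("dw", w / texWidth);   // width ratio
mat.setFloat("dh", h / texHeight);  // height ratio
mat.setFloat("dx", x / texWidth);   // horizontal offset ratio
mat.setFloat("dy", y / texHeight);  // vertical offset ratio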

It works fine, although I feel I’m missing something very basic and that the same could be achieved much more simply. Is there another, simpler, or maybe more efficient way of achieving the same result?

If not, I can live with what I currently have; the only thing I’d love to improve is having access to the texture’s pixel width and height in the .vert file. That way I could pass the rectangle parameters as-is to the material without computing the dw, dh, dx, dy ratios. I thought attribute vec2 inTexCoord; might be it, but apparently it is not the right one. I did look at some definitions in jmonkeyengine/jme3-core/src/main/resources/Common/MatDefs/Misc at master · jMonkeyEngine/jmonkeyengine · GitHub but didn’t find an obvious clue how it could be done.

You can set the texture coordinates on the quad. That would work with regular Unshaded. You could even batch quads with that approach.

…I’m not sure which part of that is confusing so I don’t know where to start explaining.

I apologize for a very basic question; I have near-zero experience with 3D and a lot of this is new to me. I did read the tutorials and it is quite possible that I missed some basic part. I did find some related questions on this forum, but most of the links led nowhere.

I’m not even sure whether using a shader for this purpose is an adequate solution, whether it is overkill, or whether setting texture coordinates would be a better approach.

I did find an example which seems to be exactly what I need, but without a custom material:

Vector2f[] texCoord = new Vector2f[4];
texCoord[0] = new Vector2f(x, y);
texCoord[1] = new Vector2f(x + 1, y);
texCoord[2] = new Vector2f(x, y + 1);
texCoord[3] = new Vector2f(x + 1, y + 1);

quad.setBuffer(Type.TexCoord, 2, BufferUtils.createFloatBuffer(texCoord));

If you don’t mind answering one more basic question - is there a way to access the texture’s pixel width/height in the .vert file (i.e. to know that the texture width is, say, 1024 pixels)?

Texture coordinates are in a normalized space so that you don’t have to worry about the actual image size in pixels. (0, 0) is the min corner, (1, 1) is the max corner, (0.5, 0.5) is the middle. Trust me that it’s better this way.

…the thing setting the texture coordinates can worry about what 0 and 1 mean because it will know what the image is. (You can get the image width/height in Java code just by asking the texture’s image.)
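For example, something along these lines (the texture path is just a placeholder):

Texture tex = assetManager.loadTexture("Textures/sprites.png"); // placeholder path
int texWidth  = tex.getImage().getWidth();
int texHeight = tex.getImage().getHeight();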

No worries. I just wasn’t sure where to start and it’s a lengthy topic if I had to cover the whole thing.

Note: sometimes the JME source code is also helpful. Not always, but sometimes. Quad.java is one such case. It’s about the simplest and most straightforward of the prebuilt JME meshes:

Germane to this discussion is this bit:
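(Paraphrasing the relevant part of Quad.java rather than quoting it exactly: for the default, non-flipped case it sets up the texture-coordinate buffer essentially like this.)

setBuffer(Type.TexCoord, 2, new float[]{0, 0,
                                        1, 0,
                                        1, 1,
                                        0, 1});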

So as a test, in your own code:

Quad mesh = new Quad(100, 100);
// Only sample the quarter of the texture from (0, 0) to (0.5, 0.5)
mesh.setBuffer(Type.TexCoord, 2, new float[] {0, 0,
                                              0.5f, 0,
                                              0.5f, 0.5f,
                                              0, 0.5f});

…stick that into a geometry, give it an unshaded material with a texture, and see what happens.
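In other words, something like this (a sketch, assuming this runs inside a SimpleApplication and "Textures/test.png" is a placeholder for whatever texture you have handy):

Geometry geom = new Geometry("croppedQuad", mesh);
Material mat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
mat.setTexture("ColorMap", assetManager.loadTexture("Textures/test.png")); // placeholder texture
geom.setMaterial(mat);
rootNode.attachChild(geom);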

Mess with the texture coordinates again and see what happens.

Path to enlightenment begins.

Edit: P.S.: A test pattern texture is a useful thing to have for this if you need one, because any piece of it is identifiable from the colors.


Thank you very much, I’ll look more into this.


A bit of context for my original question: I got the part about texture coordinates being relative by looking at a spritesheet material definition that someone shared on this forum and coming up with my own that uses the dw, dh, dx, dy ratios - which I believe is very close to the math I’d need if I did the texture mapping myself via setBuffer(Type.TexCoord...).

The reason I was asking about texture width in pixels is that, with it available, I could offload computing those ratios to the .vert file and pass in the rectangular crop coordinates (in pixels) that I already have for the texture image I’m using. I do agree that the standard Unshaded material plus setBuffer is a much simpler approach than a custom material, so I will try that and dive deeper into the Quad code…
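As a first attempt, I’m thinking of something roughly like this (a sketch; the helper name is mine, and depending on how the image is flipped on load the v coordinates may need inverting):

import com.jme3.scene.Mesh;
import com.jme3.scene.VertexBuffer.Type;
import com.jme3.texture.Texture;

// Hypothetical helper: map a pixel-space crop rectangle onto a quad's texture coordinates
static void applyCropRect(Mesh quad, Texture tex, float px, float py, float pw, float ph) {
    float tw = tex.getImage().getWidth();
    float th = tex.getImage().getHeight();
    float u0 = px / tw,        v0 = py / th;
    float u1 = (px + pw) / tw, v1 = (py + ph) / th;
    // Vertex order matches Quad: (0,0), (w,0), (w,h), (0,h)
    quad.setBuffer(Type.TexCoord, 2, new float[] {u0, v0,
                                                  u1, v0,
                                                  u1, v1,
                                                  u0, v1});
}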

…but then it runs every frame instead of just once when you set the quad up.

Keeping it in Java code has other advantages, too… because you could have 500 quads that all share the same material instance but all have slightly different texture coordinates. If you put it into the material then you’d have to have 500 different material instances also, 500 different context switches in the GPU, etc…

With 500 quads with custom texture coordinates and the same Material instance, you could even batch them into just one mesh.
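A sketch of that idea (GeometryBatchFactory is the jME utility in jme3tools.optimize; Sprite, makeCroppedQuad, and sharedUnshadedMaterial are hypothetical placeholders):

import jme3tools.optimize.GeometryBatchFactory;

Node spriteRoot = new Node("sprites");
for (Sprite s : sprites) {                                  // Sprite: hypothetical data holder
    Geometry g = new Geometry(s.name, makeCroppedQuad(s));  // makeCroppedQuad: hypothetical
    g.setMaterial(sharedUnshadedMaterial);                  // one shared Material instance
    g.setLocalTranslation(s.x, s.y, 0);
    spriteRoot.attachChild(g);
}
GeometryBatchFactory.optimize(spriteRoot);                  // merges into as few meshes as possible
rootNode.attachChild(spriteRoot);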


Thank you for the insight. IIUC it also means that the vertex-based approach is much more efficient than a custom spritesheet material when there are a lot of sprites, so basically the takeaway is to use as few materials as possible if performance/resource utilization is important - and I guess it is, e.g. on Android or older iPhone devices.

It’s always a trade off.

On any platform, what will kill you in the end are the draw calls.

500 separate quads in separate geometries is 500 separate draw calls. If you can get that down to one mesh with 500 quads in it, then it’s just one draw call.

…using strategies that maximize material reuse gives you more options with respect to batching as above.

There can be other reasons to have custom sprite sheet materials but it’s usually related to animating frames and such… the parameters of which could still be encoded into the mesh in different vertex attributes and otherwise use the same material instance.
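For example, a per-quad frame index could be stashed in a spare texture-coordinate slot and read by a custom shader (a sketch; the shader would declare a matching attribute, e.g. inTexCoord2 in jME naming):

// Same value for all four vertices of this quad; only the material needs to know what it means
float frame = 7;
mesh.setBuffer(Type.TexCoord2, 1, new float[] {frame, frame, frame, frame});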

But start simple doing something you 90-100% understand… then shake things up with something new until you 90-100% understand that, and so on. Some of it becomes second nature after a while.
