UV mapping for vertices

Hey everyone,



recently I have managed to significantly reduce the vertex count of the loaded models :slight_smile:

I also made fixes so that animations work fine.

But I came across a problem with UV mapping.



Imagine you have a cube made of 8 vertices.

I load the cube and I have a model with exactly 8 vertices on my scene.

But I wanted my cube to have a texture, and each face needs to display a totally different part of my image.

Because in my model one vertex belongs to 3 different faces, it would need to use three different UV coordinates.



Up to this point every triangle of the mesh had separate vertices, so there was no problem: I simply created a buffer that held one UV coordinate per vertex. But now I would need to attach several UV coordinates to one vertex.
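For reference, the one-UV-per-vertex approach looks roughly like this in jME (a minimal sketch with a made-up quad, not my actual loader code):

```java
import com.jme3.scene.Mesh;
import com.jme3.scene.VertexBuffer.Type;
import com.jme3.util.BufferUtils;

// A single quad: 4 vertices, each carrying exactly one UV pair.
float[] positions = { 0,0,0,  1,0,0,  1,1,0,  0,1,0 };
float[] uvs       = { 0,0,    1,0,    1,1,    0,1 };
int[]   indices   = { 0,1,2,  0,2,3 };

Mesh mesh = new Mesh();
mesh.setBuffer(Type.Position, 3, BufferUtils.createFloatBuffer(positions));
mesh.setBuffer(Type.TexCoord, 2, BufferUtils.createFloatBuffer(uvs));
mesh.setBuffer(Type.Index,    3, BufferUtils.createIntBuffer(indices));
mesh.updateBound();
```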



How can I achieve that?


I think you need to split those vertices and have two wherever you have two different UV mappings…

Seems like you’re unfortunately right :confused:

The Ogre exporter also duplicates vertices when they have separate UVs.

Yes, a “vertex” has to include all attributes attached to the vertex. To be shared, it must have the same location, normal, texture coordinate, color, etc…
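Roughly speaking, the splitting boils down to deduplicating by the whole attribute tuple rather than by position alone. A hypothetical sketch (all names made up, keying positions together with UVs):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch: each unique (positionIndex, u, v) combination
// becomes its own output vertex, so a cube corner shared by 3 faces
// with 3 different UVs turns into 3 separate vertices.
static int[] splitByUv(int[] triIndices, float[] triUvs /* 2 floats per corner */,
                       List<float[]> outVertices /* receives {positionIndex, u, v} */) {
    Map<String, Integer> seen = new LinkedHashMap<>();
    int[] newIndices = new int[triIndices.length];
    for (int corner = 0; corner < triIndices.length; corner++) {
        float u = triUvs[corner * 2], v = triUvs[corner * 2 + 1];
        String key = triIndices[corner] + "/" + u + "/" + v;
        Integer idx = seen.get(key);
        if (idx == null) {
            idx = seen.size();
            seen.put(key, idx);
            outVertices.add(new float[] { triIndices[corner], u, v });
        }
        newIndices[corner] = idx;
    }
    return newIndices;
}
```

For the textured cube this yields 24 vertices (4 per face) instead of 8.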

OK, I’m almost done with this feature.



Just have two more questions.

1.

When a mesh has several different materials applied in Blender, I have to separate it into several meshes in jME. Each will have a different material, and then I put every geometry created this way into one node.

I create a single vertex buffer and apply it to all of my meshes. Are the entries in the buffer copied by the mesh or referenced?

In other words: should I create separate Position buffer for every mesh that will contain only its vertices or can I use one buffer for every mesh?



2.

I created a mesh that had one material, no textures, no UVs, and was all smooth.

In Blender I saw that it has 1008 vertices.

When I import it, my resulting mesh has exactly 1008 vertices - that is what I see in the debugger (mesh.getVertexCount()).

But on the debug console window that is displayed on the scene I see about 7000 vertices.

The console itself takes about 930 vertices, so I wonder why there are about 5000 more vertices than expected.

@Kaelthas said:
I create a single vertex buffer and apply it to all of my meshes. Are the entries in the buffer copied by the mesh or referenced?
In other words: should I create separate Position buffer for every mesh that will contain only its vertices or can I use one buffer for every mesh?

Do not share vertex buffers across meshes. This used to work before, but it is no longer supported due to overhead in the animation system handling. To make it easier to avoid sharing vertex buffers, a new method was added called Mesh.extractVertexData(). As the javadoc says, it will use "this" mesh's index buffer to access vertex data in the "other" mesh given in the argument; the accessed vertex data is then written into "this" mesh's buffers. As an example usage: you create a Mesh object that has all the attributes of the mesh in Blender, each material on that mesh then gets its own index buffer, and you extract the data away from the global mesh like so: materialsMesh.extractVertexData(globalMesh).
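In other words, something along these lines (a sketch; globalMesh and indicesForThisMaterial are assumed to come from your loader):

```java
import com.jme3.scene.Mesh;
import com.jme3.scene.VertexBuffer.Type;
import com.jme3.util.BufferUtils;

// globalMesh holds all vertex attributes loaded from Blender.
// For each material, build a mesh that initially contains only
// that material's index buffer, then copy in the referenced data.
Mesh materialsMesh = new Mesh();
materialsMesh.setBuffer(Type.Index, 3,
        BufferUtils.createIntBuffer(indicesForThisMaterial));
materialsMesh.extractVertexData(globalMesh); // copies, does not reference
materialsMesh.updateBound();
```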

@Kaelthas said:
I created a mesh that had one material, no textures, no UVs, and was all smooth.
In Blender I saw that it has 1008 vertices.
When I import it, my resulting mesh has exactly 1008 vertices - that is what I see in the debugger (mesh.getVertexCount()).
But on the debug console window that is displayed on the scene I see about 7000 vertices.
The console itself takes about 930 vertices, so I wonder why there are about 5000 more vertices than expected.

It’s possible you have lighting enabled. Each light (directional / point / spot) requires rendering the whole mesh again. If you use Mesh.updateCounts(), then the getVertexCount() value should be correct.
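For example (a sketch, where mesh is the mesh you load):

```java
// Refresh the cached counts after replacing or resizing buffers,
// then read them back.
mesh.updateCounts();
System.out.println("vertices: " + mesh.getVertexCount()
        + ", triangles: " + mesh.getTriangleCount());
```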

OK I solved the problem by separating the vertices during loading.

Most cases now seem to work great. But I came across a problem I am stuck on.



Imagine we have two simple planes, each made of four vertices.

In other words two separate meshes in one node.

I add an armature and attach the vertices of one of the planes to the bone. The second plane remains untouched.



The scene loads fine when both of the planes have the same material.

When I apply a different material to one of them and load the scene, the plane attached to the bone is strongly displaced.

The vertices are moved to the positions of the second plane.



When I have two materials in one node I create two separate meshes.

Each of them has its own vertices, normals, UVs and indices.

Binding buffers are also separate.



The BoneIndex and BoneWeight buffers are created for each plane.

For a plane that has 4 vertices, the BoneIndex buffer looks like this:

1, 0, 0, 0

1, 0, 0, 0

1, 0, 0, 0

1, 0, 0, 0



The first column of ones indicates that each vertex is attached to the bone with index ‘1’.

For the second plane it looks like this:

0, 0, 0, 0

0, 0, 0, 0

0, 0, 0, 0

0, 0, 0, 0

Which is also fine, because these vertices are attached to the bone with index ‘0’, which is used for non-moving vertices.



They look fine to me, but the scene is still not OK.



When only one material is used, the BoneIndex buffer looks like this:

1, 0, 0, 0

1, 0, 0, 0

1, 0, 0, 0

1, 0, 0, 0

0, 0, 0, 0

0, 0, 0, 0

0, 0, 0, 0

0, 0, 0, 0

We have 8 vertices and the first 4 are attached to bone ‘1’.

And then everything is ok on the scene.
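For reference, this is roughly how such bone buffers can be set up in jME (a sketch for the 4-vertex attached plane; it assumes the mesh already has its Position and Normal buffers set):

```java
import com.jme3.scene.Mesh;
import com.jme3.scene.VertexBuffer.Type;
import com.jme3.util.BufferUtils;

// Bone data for the 4-vertex plane attached to bone 1:
// four index slots and four weight slots per vertex, unused slots zero.
byte[]  boneIndices = { 1,0,0,0,  1,0,0,0,  1,0,0,0,  1,0,0,0 };
float[] boneWeights = { 1,0,0,0,  1,0,0,0,  1,0,0,0,  1,0,0,0 };

mesh.setBuffer(Type.BoneIndex,  4, BufferUtils.createByteBuffer(boneIndices));
mesh.setBuffer(Type.BoneWeight, 4, BufferUtils.createFloatBuffer(boneWeights));
mesh.setMaxNumWeights(1);    // at most one non-zero weight per vertex here
mesh.generateBindPose(true); // snapshot Position/Normal as the bind pose
```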



If I am unclear somewhere, please ask. I’d love to make that commit as soon as possible - but not with such serious bugs :wink:

So where does that transform come from? Are the vertices themselves displaced or are they being displaced by the incorrect bone weight/index buffers? You have the resultant model, so you know where that transform comes from.

It looks like they are being displaced by the bone weight/index buffers somehow.

The vertices are fine and their bind poses are also fine, because when I set a single material on both planes everything is OK.

I just do not know why the displacement takes place.

Maybe I could commit the code (it works fine in all other cases) and upload the model to you, so you could see what the resulting meshes look like.

I made one more experiment. I added a second bone and attached the unattached plane to it.

It behaves as before.

Maybe it is not allowed to attach two separate meshes to a single armature?

Sounds to me like the root bone’s skinning transform ends up being non-identity somehow. Check your bind pose and animation transforms? When multiplied, they should result in an identity transform (assuming there’s no animation).



Skeleton.computeSkinningMatrices() will give you the matrices that the vertices are multiplied by. It’s computed by multiplying each bone’s model-space animation transform by its model-space inverse bind transform.
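For example, a quick check could look like this (a sketch; model is assumed to be your loaded spatial carrying an AnimControl):

```java
import com.jme3.animation.AnimControl;
import com.jme3.animation.Skeleton;
import com.jme3.math.Matrix4f;

// With no animation playing, every skinning matrix should be identity
// (up to float precision): modelSpaceTransform * inverseBindTransform.
Skeleton skeleton = model.getControl(AnimControl.class).getSkeleton();
Matrix4f[] skinning = skeleton.computeSkinningMatrices();
for (int i = 0; i < skinning.length; i++) {
    if (!skinning[i].isIdentity()) { // exact check; use an epsilon if needed
        System.out.println("Bone " + i + " skinning matrix is not identity:\n"
                + skinning[i]);
    }
}
```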



You can commit the code if you want, I can take a look at the resultant model and tell you where the problem comes from but you will have to fix whatever is causing it.

OK, the code is there.

I’ll check your suggestion today :slight_smile: