[SOLVED] Mesh Update not working

Hi everyone,

I have a planet, and what I’m trying to do is build a simplified mesh over it.
My first attempt was to create a sphere with the same radius but fewer triangles.
That mesh doesn’t account for the mountains on the planet. So what I’m trying to do now is check whether a vertex of the simplified mesh is under a mountain and, if so, translate it to the contact point with the mountain (using a ray that goes from the center of the planet toward the vertex).
Problem is that when I try to update the mesh it doesn’t work.
That’s my attempt:

public void adjustMesh(Planet planet){
    Geometry planetGeom = planet.getPlanet();
    Geometry meshGeom = planet.getNavMesh();
    Mesh gridMesh = meshGeom.getMesh();
    Node node = new Node();
    VertexBuffer vb = gridMesh.getBuffer(Type.Position);
    float[] data = BufferUtils.getFloatArray((FloatBuffer) vb.getData());
    for(int i=0;i<data.length;i+=3){
        Vector3f origin = planetGeom.getLocalTranslation();
        Vector3f vertex = new Vector3f(data[i],data[i+1],data[i+2]);
        Ray r = new Ray(origin, vertex.subtract(origin).normalize());

        CollisionResults res = new CollisionResults();
        node.collideWith(r, res);
        Vector3f realP = res.getFarthestCollision().getContactPoint();
        data[i] = realP.x;
        data[i+1] = realP.y;
        data[i+2] = realP.z;
    }
    // ... (update of the vertex buffer omitted here)
}


The problem is in the way I update it, because even if I do not do anything in the for loop, the mesh still disappears. Any hints?

Thanks in advance


There are a number of things that could be going wrong here. Mesh is conceptually easy but sometimes tricky to get right in practice.

If your entire mesh is disappearing, I suspect either something is wrong with the low level buffer management or something is wrong with your coordinates and your vertices are getting displaced to somewhere strange. Try printing the original and translated vertices and seeing what comes out. Do you ever apply any transformation to the meshGeom? If so, you’ll need to take it into account when setting the vertex positions.
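If a transform on `meshGeom` turns out to be the culprit, a hedged sketch of the conversion (assuming jME3’s `Spatial.worldToLocal`) might look like this; `LocalWrite` and `writeContact` are made-up names for illustration:

```java
import com.jme3.math.Vector3f;
import com.jme3.scene.Geometry;

// Sketch, not a drop-in fix: collision contact points come back in world
// space, but the position buffer stores mesh-local coordinates. If meshGeom
// carries any translation/rotation/scale, convert before writing back.
public class LocalWrite {
    static void writeContact(Geometry meshGeom, Vector3f contactWorld,
                             float[] data, int i) {
        Vector3f local = meshGeom.worldToLocal(contactWorld, new Vector3f());
        data[i]     = local.x;
        data[i + 1] = local.y;
        data[i + 2] = local.z;
    }
}
```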

Also, make sure normals are being set correctly.

P.S. You have a lurking NullPointerException waiting to happen in this line:

            Vector3f realP = res.getFarthestCollision().getContactPoint();

If no collisions are detected (the ray misses), getFarthestCollision() returns null and the chained getContactPoint() call will throw a NullPointerException.
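A hedged sketch of a guard (jME3 types assumed; `SafeContact` and `contactOr` are hypothetical names):

```java
import com.jme3.collision.CollisionResult;
import com.jme3.collision.CollisionResults;
import com.jme3.math.Vector3f;

// Sketch: fall back to the original vertex when the ray misses, instead of
// letting getFarthestCollision() return null and NPE on getContactPoint().
public class SafeContact {
    static Vector3f contactOr(CollisionResults res, Vector3f fallback) {
        CollisionResult far = res.getFarthestCollision();
        return (far != null) ? far.getContactPoint() : fallback;
    }
}
```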


If I do that

      float[] data = BufferUtils.getFloatArray((FloatBuffer) vb.getData());
      float[] data2 = BufferUtils.getFloatArray(BufferUtils.createFloatBuffer(data));

they are the same, even if I do not modify the vertices. But if I try to update the buffer with that same data, the mesh disappears, so I guess the problem is in the way I update them…
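For what it’s worth, a hedged sketch of the write-back step that is typically needed here (jME3 assumed; `MeshUpdate`/`writeBack` are made-up names). Forgetting updateBound() in particular can make a mesh seem to vanish, because the camera culls it against a stale bounding volume:

```java
import com.jme3.scene.Mesh;
import com.jme3.scene.VertexBuffer;
import com.jme3.scene.VertexBuffer.Type;
import com.jme3.util.BufferUtils;

// Sketch (jME3): after editing the float[] copy, push the data back into the
// vertex buffer and refresh the mesh bound, otherwise the geometry may be
// culled away as if it had disappeared.
public class MeshUpdate {
    static void writeBack(Mesh mesh, float[] data) {
        VertexBuffer vb = mesh.getBuffer(Type.Position);
        vb.updateData(BufferUtils.createFloatBuffer(data));
        mesh.updateBound();   // stale bounds make the camera cull the mesh
    }
}
```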

About the null pointer exception, I think it is not the case that it will ever happen, but even if it does I can handle it easily.


Without looking too deeply, you could compare your update code with update code in classes like Quad to see the difference.

Though, given that you are doing the inefficient “copy all the data out” then “copy all the data in” approach, you could look at Quad.java to see even simpler ways.
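As a rough illustration of that simpler style (a sketch in the spirit of Quad.java, not its actual code): hand arrays straight to setBuffer() instead of copying data out and back in.

```java
import com.jme3.scene.Mesh;
import com.jme3.scene.VertexBuffer.Type;

// Sketch: build a minimal one-triangle mesh by handing float/short arrays
// directly to setBuffer(), the way jME's built-in shapes do.
public class SimpleQuadLike {
    static Mesh triangle() {
        Mesh m = new Mesh();
        m.setBuffer(Type.Position, 3, new float[] {
            0, 0, 0,
            1, 0, 0,
            0, 1, 0 });
        m.setBuffer(Type.Index, 3, new short[] { 0, 1, 2 });
        m.updateBound();
        return m;
    }
}
```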


Yeah, no surprise there. I’ve recently done a couple different projects involving building/modifying custom meshes, and it’s surprisingly touchy.

Yeah, as long as your meshes are correctly constructed and updated such that the rays always hit this won’t happen. Just thought I’d mention it in case it popped up later.


Also, if you want to see what the minimum you have to do and most efficient way to update a JME mesh’s buffers in place, you can look at Lemur’s DMesh.

This is the method that should help most:


Nice it worked thanks!

One last thing, it seems that some parts are not showing up.
I’m not sure but my first guess would be that I need to change the normals now.
Right now I’m using the same normals as the basic mesh, and it seems to work for some mountains… Is there a way to update the normals according to the new positions of the vertices?


Normals only matter with lighting. Normals are used to compute how light and dark a particular point is based on the direction of the light.

That’s it.

Normals probably don’t matter here.
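A hedged illustration of that point in plain Java (not jME API; `DiffuseDemo` is a made-up name): the shading contribution of a normal is essentially max(0, N·L), so a wrong normal changes brightness, not which triangles exist.

```java
// Plain-Java sketch of diffuse (Lambertian) shading: the normal only scales
// how bright a point is relative to the light direction.
public class DiffuseDemo {
    static float diffuse(float nx, float ny, float nz,
                         float lx, float ly, float lz) {
        float d = nx * lx + ny * ly + nz * lz; // N dot L
        return Math.max(0f, d);                // back-facing light contributes 0
    }
}
```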

Try turning off face culling in the material. If that fixes it then you have a vertex winding problem.
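For reference, a minimal sketch of how that check might look (jME3 assumed; `CullCheck`/`disableCulling` are hypothetical names, and `geom` stands for whichever Geometry shows the holes):

```java
import com.jme3.material.RenderState.FaceCullMode;
import com.jme3.scene.Geometry;

// Sketch: disable back-face culling on the geometry's material. If the holes
// fill in with culling off, the triangle winding is reversed somewhere.
public class CullCheck {
    static void disableCulling(Geometry geom) {
        geom.getMaterial().getAdditionalRenderState()
            .setFaceCullMode(FaceCullMode.Off);
    }
}
```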


Turning the face culling off did not fix it.
The empty zones are still there (below is a screenshot from inside the planet with face culling off).

That’s the code

public void updateMesh() {
    Mesh mesh = planet.getNavMesh().getMesh();
    Mesh target = new Mesh();
    VertexBuffer sourcePos = mesh.getBuffer(Type.Position);
    VertexBuffer sourceNorms = mesh.getBuffer(Type.Normal);

    VertexBuffer targetPos = matchBuffer(sourcePos, target);
    VertexBuffer targetNorms = matchBuffer(sourceNorms, target);

    // Make sure we also have an index and texture buffer that matches
    // ...even though we don't transform them we still need copies of
    // them.  We could just reference them but then our other buffers
    // might get out of sync
    matchBuffer(mesh.getBuffer(Type.Index), target);
    matchBuffer(mesh.getBuffer(Type.TexCoord), target);

    morph(sourcePos, sourceNorms, targetPos, targetNorms);
    Geometry newMesh = new Geometry("new", target);
    // rootNode.attachChild(planet.getPlanet());
}
protected VertexBuffer matchBuffer( VertexBuffer source, Mesh mesh ) {
    if( source == null )
        return null;

    VertexBuffer target = mesh.getBuffer(source.getBufferType());
    if( target == null || target.getData().capacity() < source.getData().limit() ) {
        target = source.clone();
        mesh.setBuffer(target);  // attach the cloned buffer to the target mesh
    }
    return target;
}
protected void morph( VertexBuffer sourcePos, VertexBuffer sourceNorms,
                      VertexBuffer targetPos, VertexBuffer targetNorms ) {
    FloatBuffer sp = (FloatBuffer)sourcePos.getData();
    FloatBuffer sn = (FloatBuffer)sourceNorms.getData();
    FloatBuffer tp = (FloatBuffer)targetPos.getData();
    FloatBuffer tn = (FloatBuffer)targetNorms.getData();

    morph(sp, sn, tp, tn);
}
protected void morph( FloatBuffer sourcePos, FloatBuffer sourceNorms,
                      FloatBuffer targetPos, FloatBuffer targetNorms ) {

    int count = sourcePos.limit() / 3;
    Vector3f v = new Vector3f();
    Vector3f normal = new Vector3f();
    Node node = new Node();
    for( int i = 0; i < count; i++ ) {
        System.out.println(i + " out of " + count);
        v.x = sourcePos.get();
        v.y = sourcePos.get();
        v.z = sourcePos.get();
        normal.x = sourceNorms.get();
        normal.y = sourceNorms.get();
        normal.z = sourceNorms.get();
        Vector3f origin = planet.getPlanet().getLocalTranslation();
        Ray r = new Ray(origin, v.subtract(origin).normalize());

        CollisionResults res = new CollisionResults();
        node.collideWith(r, res);
        Vector3f realP = res.getFarthestCollision().getContactPoint();
        v.x = realP.x;
        v.y = realP.y;
        v.z = realP.z;

        // write the morphed vertex and its normal into the target buffers
        targetPos.put(v.x).put(v.y).put(v.z);
        targetNorms.put(normal.x).put(normal.y).put(normal.z);
    }
}

Well, there’s nothing in the processing that will cause this. If you are missing triangles then it’s because their vertexes are bad or something.

1 Like

Found the problem: sometimes the farthest collision returned infinity. I don’t know why it happened, but I solved it by taking the second collision instead.
Thanks again for the help! Much appreciated.
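A hedged sketch of that workaround idea in plain Java (`FarthestFinite` is a made-up name, and the floats stand in for CollisionResult.getDistance() values): rather than trusting the farthest collision blindly, pick the farthest hit with a finite distance.

```java
// Sketch: scan collision distances and return the index of the farthest
// finite one, skipping infinity/NaN entries. Returns -1 if none qualify.
public class FarthestFinite {
    static int farthestFiniteIndex(float[] distances) {
        int best = -1;
        float bestDist = -1f;
        for (int i = 0; i < distances.length; i++) {
            float d = distances[i];
            if (Float.isFinite(d) && d > bestDist) {
                bestDist = d;
                best = i;
            }
        }
        return best;
    }
}
```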