Applying real-world quaternions to JME model

Hello!

I have a rather complex problem (I think) that I hope you can help me with! :crossed_fingers:

In short, I am creating a motion capture app using wearable IMU sensors with quaternion output, attached to body parts, to rotate a model using JME SkinningControl.

So what I have done so far is to import a .j3o model (ybot from Mixamo in N-pose) and build an app where two quaternions are applied to the back and head joints of the model to rotate the body. That all works (thanks to previous help from you guys :pray:)

Now what I would like to achieve is to apply the correct rotation to the model. E.g. when I tilt the head sensor forward, the head should do the same movement, and when I rotate it, the head should rotate, and so on… This is where I could use some help.

What I have done to try to achieve this is:

  1. Create offset quaternion at first frame
  2. Multiply the offset quaternion to each following quaternion
  3. Apply the new quaternion to the joint with setLocalRotation

Code example:

public class Test {
    // Left null so the first-frame calibration check below works
    private Quaternion back_offset;
    private Quaternion head_offset;

    private Joint back;
    private Joint head;

    // JME3 model is initiated
    public void setupModel() {
        Node human = model.getRootNode();

        SkinningControl sc = findAnimRoot(human).getControl(SkinningControl.class);

        back = sc.getArmature().getJoint("mixamorig:Spine1");
        head = sc.getArmature().getJoint("mixamorig:Head");
    }

    public void onDataUpdate(float[] back_sensor, float[] head_sensor) {

        // Quaternions come in continuously, in XYZW order from each sensor.
        // Note: the four-argument constructor is used on purpose; JME's
        // Quaternion(float[]) constructor interprets the array as Euler angles.
        Quaternion back_quat = new Quaternion(back_sensor[0], back_sensor[1], back_sensor[2], back_sensor[3]);
        Quaternion head_quat = new Quaternion(head_sensor[0], head_sensor[1], head_sensor[2], head_sensor[3]);

        // Perform offset calibration at the first frame, for each body part
        if (back_offset == null && head_offset == null) {
            back_offset = new Quaternion(0, 0, 0, 1).mult(back_quat.inverse());
            head_offset = new Quaternion(0, 0, 0, 1).mult(head_quat.inverse());
        } else {
            // After the first frame, the offset quaternions are multiplied onto the new incoming quaternions
            Quaternion back_quat_calib = back_offset.mult(back_quat);
            Quaternion head_quat_calib = head_offset.mult(head_quat);

            // Apply the calibrated quaternions to the joints of the JME3 model
            back.setLocalRotation(back_quat_calib);
            head.setLocalRotation(head_quat_calib);
        }
    }
}

This results in a nice N-pose, but when I rotate the head, the head tilts instead. So the coordinate systems don't match. How should I approach aligning the coordinate systems? You can see an illustration of the setup below.

Hope you can help!


So, you want to align the IMU sensor coordinate system with the jme3 coordinate system?

If you apply the sensor-based Quaternions to root-level spatials not part of an armature, are they rotated correctly or does the rotation get swapped there, too?

Edit: because if they are correct in world space and just not in that level of the armature then it’s a relatively simple matter of doing a change of basis.
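For reference, a change of basis on quaternions is the conjugation q_jme = B * q_sensor * B⁻¹, where B is the rotation taking the sensor's axes onto JME's. Here is a minimal, dependency-free sketch of that idea; the 90°-about-X basis in main is a made-up example, since the real basis depends on how the sensor is mounted:

```java
// Minimal quaternion helpers on double[]{x, y, z, w} (Hamilton convention),
// kept dependency-free so the change-of-basis idea can be checked outside JME.
public class ChangeOfBasisDemo {

    /** Hamilton product a * b. */
    public static double[] mult(double[] a, double[] b) {
        return new double[] {
            a[3] * b[0] + a[0] * b[3] + a[1] * b[2] - a[2] * b[1],
            a[3] * b[1] - a[0] * b[2] + a[1] * b[3] + a[2] * b[0],
            a[3] * b[2] + a[0] * b[1] - a[1] * b[0] + a[2] * b[3],
            a[3] * b[3] - a[0] * b[0] - a[1] * b[1] - a[2] * b[2]
        };
    }

    /** Inverse of a unit quaternion (its conjugate). */
    public static double[] inverse(double[] q) {
        return new double[] { -q[0], -q[1], -q[2], q[3] };
    }

    /**
     * Change of basis: re-express a rotation measured in the sensor frame
     * in the JME frame. The conjugation basis * q * basis^-1 rotates the
     * rotation axis of q by basis.
     */
    public static double[] toJmeFrame(double[] basis, double[] sensorQuat) {
        return mult(mult(basis, sensorQuat), inverse(basis));
    }

    public static void main(String[] args) {
        double h = Math.sqrt(0.5);
        // Hypothetical basis: 90 degrees about X (e.g. a Z-up sensor frame
        // against JME's Y-up frame; measure your own device to find this).
        double[] basis = { h, 0, 0, h };
        // 90 degrees about the sensor's Z axis...
        double[] sensorQuat = { 0, 0, h, h };
        // ...comes out as 90 degrees about JME's -Y axis: (0, -h, 0, h).
        double[] jmeQuat = toJmeFrame(basis, sensorQuat);
        System.out.printf("jmeQuat = (%.4f, %.4f, %.4f, %.4f)%n",
                jmeQuat[0], jmeQuat[1], jmeQuat[2], jmeQuat[3]);
    }
}
```

In JME the same thing would be `basis.mult(sensorQuat).mult(basis.inverse())` with `com.jme3.math.Quaternion`.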

Yes, I think that could solve it, but I don't know the correct way to align the coordinate systems. Can you help?


Well, there are multiple ways you can do this. One way is to create an interpreter utility class that simply converts from jme vector orientation to sensor vector orientation and from sensor orientation to jme orientation, for example:

public final class OrientationUtil {

    private OrientationUtil() {
    }

    /**
     * Gets a Jme vector from a sensor vector.
     * @param sensorVec the sensor-aligned vector
     * @return a new Jme-aligned vector
     */
    public static Vector3f getJmeVectorFromSensorVector(Vector3f sensorVec) {
        return new Vector3f(sensorVec.y, sensorVec.z, sensorVec.x);
    }

    /**
     * Gets a sensor vector from a Jme vector.
     * @param jmeVec the Jme-aligned vector
     * @return a new sensor-aligned vector
     */
    public static Vector3f getSensorVectorFromJmeVector(Vector3f jmeVec) {
        return new Vector3f(jmeVec.z, jmeVec.x, jmeVec.y);
    }

    /**
     * Gets a Jme Quaternion from a sensor Quaternion.
     * @param sensorQuat the sensor-aligned Quaternion
     * @return a new Jme-aligned Quaternion
     */
    public static Quaternion getJmeQuatFromSensorQuat(Quaternion sensorQuat) {
        // Quaternion fields are protected in jme3, so use the getters
        return new Quaternion(sensorQuat.getY(), sensorQuat.getZ(), sensorQuat.getX(), sensorQuat.getW());
    }

    /**
     * Gets a sensor Quaternion from a Jme Quaternion.
     * @param jmeQuat the Jme-aligned Quaternion
     * @return a new sensor-aligned Quaternion
     */
    public static Quaternion getSensorQuatFromJmeQuat(Quaternion jmeQuat) {
        return new Quaternion(jmeQuat.getZ(), jmeQuat.getX(), jmeQuat.getY(), jmeQuat.getW());
    }
}

I have no way to test this, so you might try it and report back (you can change those interpretations if they are wrong).

EDIT:
Other ways may involve interpreting the sensor values into jme orientation before even sending the results, on your hardware layer, but I think it's better to do it on the jme side (this helps if your sensor sends data to multiple utilities at the same time, since each utility, including jme, will need its own interpreter).


This results in a nice N-pose, but when I rotate the head, the head tilts instead. So the coordinate systems don't match. How should I approach aligning the coordinate systems? You can see an illustration of the setup below.

I'm trying to understand. As I understand it, it rotates correctly, but the coordinates of some “root point of the head” do not match your sensor coordinates?

But based on what you told us, I don't see that the IMU sensor has a coordinate system at all; it just outputs quaternions, so which coordinates do not match?

If your sensors also have a coordinate system that needs to match the JME bone coordinates, then you would want to use IK (inverse kinematics) bones, like the ones in the Minie physics examples. Though if the rotations also need to match, you would then need to fix the head rotation for IK when doing it.


We need ground truth on what the sensor values mean. I asked a question that hasn’t been answered.

…until then we’re all just randomly guessing. Because it’s just as likely everything is fine but the sensors are always in world space and the bone is not.


Yes, it's like getting a ticket in corporate work that you need to move to “more information” status, because its specification lacks crucial information.


Because the problem to me sounds identical to what would happen if you had world-oriented sensors update a bone in a hierarchy where that bone's space is some multilevel concatenation of all of the bones above it.

…pretty classic example, in fact.


Thanks for trying to help me @oxplay2 and @pspeed! I am still working on your first reply @pspeed, I am a “bit” slow when it comes to understanding all of this, so sorry for the late response and bad information I gave you :wink:

When you say I should “apply the sensor-based Quaternions to root-level spatials”, should it be to this Node: Node human = (Node) assetManager.loadModel("assets/Models/human.j3o"); ?

human.setLocalRotation(quat_calib);

When I try to do this in my application I get “Scene graph is not properly updated for rendering”. I guess I need to perform the rotation in the SimpleApplication, but I am not sure how I can pass the sensor values from my fragment (LogFragment) to the Model.java class?

Here is a fuller example of my app:

LogFragment.java:

public class LogFragment extends Fragment implements DataChangeInterface {

    private Node human;
    private Model model;
    private boolean modelReady = false;

    private Joint backJoint;
    private Joint headJoint;
    
    private Quaternion backOffset;
    private Quaternion headOffset;
    
    @Override
    public void onViewCreated(View view, Bundle savedInstanceState) {
        super.onViewCreated(view, savedInstanceState);
        
        Button startRecording = (Button) view.findViewById(R.id.recording_button);
        startRecording.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                setupModel(); // Setup JME model
            }
        });
        
        // JME stuff
        final JmeSurfaceView gl_surfaceView = view.findViewById(R.id.glView);
        model = new Model();
        gl_surfaceView.setLegacyApplication(model);
        gl_surfaceView.startRenderer(0);
    }

    protected static Spatial findAnimRoot(Spatial s) {
        if (s.getControl(AnimComposer.class) != null) {
            return s;
        }
        if (s instanceof Node) {
            for (Spatial child : ((Node) s).getChildren()) {
                Spatial result = findAnimRoot(child);
                if (result != null) {
                    return result;
                }
            }
        }
        return null;
    }
    
    // onDataChanged receives updates from sensor quaternion stream
    @Override
    public void onDataChanged(SensorData data, String sensor) {
        if (getActivity() != null) {
            getActivity().runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    if (modelReady) { // if model is ready, then start moving it!
                        
                        float[] quat = data.getQuat(); // Get quaternion values from sensor (in WXYZ order)
                        Quaternion quaternion = new Quaternion(quat[1], quat[2], quat[3], quat[0]); // Change to JME XYZW order
                        
                        if (sensor.equals("back")) {
                            if (backOffset == null) { // Perform offset calibration at first frame, for each bodypart
                                backOffset = new Quaternion(0, 0, 0, 1).mult(quaternion.inverse());
                            } else {
                                Quaternion quat_calib = backOffset.mult(quaternion); // Offset correct the following quaternions
                                // backJoint.setLocalRotation(quat_calib);
                                human.setLocalRotation(quat_calib); // RECEIVES ERROR: Scene graph is not properly updated for rendering.
                            }
                        }

                        if (sensor.equals("head")) {
                            if (headOffset == null) { // Perform offset calibration at first frame, for each bodypart
                                headOffset = new Quaternion(0, 0, 0, 1).mult(quaternion.inverse());
                            } else {
                                Quaternion quat_calib = headOffset.mult(quaternion); // Offset correct the following quaternions
                                headJoint.setLocalRotation(quat_calib);
                            }
                        }
                    }
                }
            });
        }
    }
    
    private void setupModel() {
        human = model.getRootNode();

        // Model is now ready. Initialize stuff to JME model.
        SkinningControl sc = findAnimRoot(human).getControl(SkinningControl.class);

        // Define bodyparts to elements
        backJoint = sc.getArmature().getJoint("mixamorig:Spine1");
        headJoint = sc.getArmature().getJoint("mixamorig:Head");
        
        modelReady = true; // Model is now ready to move
    }
}

Model.java:

public class Model extends SimpleApplication {
    @Override
    public void simpleInitApp() {
        Node human = (Node) assetManager.loadModel("assets/Models/human.j3o");
        rootNode.attachChild(human);
    }
}

At your android activity or fragment use:

jmeSurfaceView.getLegacyApplication().enqueue(() -> human.setLocalRotation(someRotation));

Or use the simple app instance directly, it is similar to android runOnUiThread().

What kind of android sensors are you registering? Geomagnetic and gyroscope?


I use Inertial Measurement Units (IMU) sensors with accelerometer, magnetometer and gyroscope integrated.

Thanks for the tip, I can now rotate the root model with a sensor quaternion.

I have made a short video showing the problem. In the video, the bottom sensor is controlling the root model and the top sensor is rotating the head joint.

As you can see, the head joint rotates double when rotating the body. How can that be?

https://youtube.com/shorts/_AUG_CLyaDU?feature=share

Because the head sensor is in world space and the head is relative to the body. So it rotates for the body (in world space) and itself (in world space)… so twice what it should.

You will need to do a change of basis to make the head sensor rotation relative to the body sensor rotation instead of relative to the world.

Off the top of my head:
relativeHeadRotation = bodySensorRotation.inverse().mult(headSensorRotation);
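To make that formula concrete, here is a small dependency-free sketch using plain double[]{x, y, z, w} quaternions instead of JME's Quaternion class (same Hamilton math). It shows that the relative rotation, composed back onto the body rotation, recovers the head's world-space rotation:

```java
// Change of basis between two world-space sensors: make the head sensor's
// rotation relative to the body sensor's rotation. Quaternions are stored
// as double[]{x, y, z, w}, Hamilton convention.
public class RelativeRotationDemo {

    /** Hamilton product a * b. */
    public static double[] mult(double[] a, double[] b) {
        return new double[] {
            a[3] * b[0] + a[0] * b[3] + a[1] * b[2] - a[2] * b[1],
            a[3] * b[1] - a[0] * b[2] + a[1] * b[3] + a[2] * b[0],
            a[3] * b[2] + a[0] * b[1] - a[1] * b[0] + a[2] * b[3],
            a[3] * b[3] - a[0] * b[0] - a[1] * b[1] - a[2] * b[2]
        };
    }

    /** Inverse of a unit quaternion (its conjugate). */
    public static double[] inverse(double[] q) {
        return new double[] { -q[0], -q[1], -q[2], q[3] };
    }

    /** pspeed's formula: relative = bodyWorld^-1 * headWorld. */
    public static double[] relativeHeadRotation(double[] bodyWorld, double[] headWorld) {
        return mult(inverse(bodyWorld), headWorld);
    }

    public static void main(String[] args) {
        double h = Math.sqrt(0.5);
        double[] body = { 0, h, 0, h }; // body turned 90 degrees about Y
        double[] head = { 0, 1, 0, 0 }; // head at 180 degrees about Y in world space
        // The relative rotation is the remaining 90 degrees about Y,
        // and body * relative recovers the head's world rotation.
        double[] rel = relativeHeadRotation(body, head);
        System.out.printf("relative = (%.4f, %.4f, %.4f, %.4f)%n",
                rel[0], rel[1], rel[2], rel[3]);
    }
}
```

In JME terms this is just `bodySensorRotation.inverse().mult(headSensorRotation)` before calling setLocalRotation on the head joint.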


Yes perfect! That should do it! Thanks a lot for the help!! :pray:


Glad it works. Btw, if you want to get the most out of this, you may reconsider your architecture; for example, calling Activity#runOnUiThread(Runnable) from within the UI thread does nothing special (the Runnable just runs inline).

In order to synchronize the sensors' input with the jMonkeyEngine update loop in a better way, you can create a BaseAppState that listens to your sensor inputs and feeds the values back to the jme3 update on the next frame. This kind of state is a singleton listener that must be registered on both sides, the Android SensorManager and the jme AppStateManager. Here is an example:

public final class JmeSensorManager extends BaseAppState implements SensorEventListener2 {

    private static final JmeSensorManager jmeSensorManager = new JmeSensorManager();
    // volatile for thread safety
    private volatile boolean hasReceivedSensorSignals = false;
    private final ArrayList<SensorAction> actions = new ArrayList<>();

    private JmeSensorManager() {
    }

    public static JmeSensorManager getJmeSensorManager() {
        return jmeSensorManager;
    }

    @Override
    public void onSensorChanged(final SensorEvent event) {
        if (!isEnabled()) {
            return;
        }
        // TODO: your orientation calculations here; this runs on the sensors' thread
        actions.forEach(action -> action.sensorUpdate(event));
        hasReceivedSensorSignals = true;
    }

    /**
     * Updates the sensor values with the jme update.
     */
    @Override
    public void update(float tpf) {
        if (!hasReceivedSensorSignals) {
            return;
        }
        // update the values with jme3 by setting local rotations on some targets, etc.
        actions.forEach(action -> action.jmeUpdate(tpf));
        hasReceivedSensorSignals = false;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }

    @Override
    public void onFlushCompleted(Sensor sensor) {
        // required by SensorEventListener2
    }

    public void addSensorAction(SensorAction action) {
        actions.add(action);
    }

    // BaseAppState lifecycle boilerplate
    @Override
    protected void initialize(Application app) {
    }

    @Override
    protected void cleanup(Application app) {
    }

    @Override
    protected void onEnable() {
    }

    @Override
    protected void onDisable() {
    }
}

public interface SensorAction {
    void sensorUpdate(SensorEvent evt);
    void jmeUpdate(float tpf);
}

Now, in order to use it, at your SimpleApplication level do this:

...
@Override
public void simpleInitApp() {
    ...
    // activate the sensor manager update
    getStateManager().attach(JmeSensorManager.getJmeSensorManager());
    ...
}

And at your Activity, while retrieving the sensor manager and setting up the listener, do this:

// register to send sensor updates to our custom manager
sensorManager.registerListener(JmeSensorManager.getJmeSensorManager(), gyroscope, SensorManager.SENSOR_DELAY_NORMAL);

// hook a call
LogFragment fragment = new LogFragment();
JmeSensorManager.getJmeSensorManager().addSensorAction(fragment);

// attach the fragment (the container id comes from your own layout)
getSupportFragmentManager().beginTransaction()
        .replace(R.id.fragment_container, fragment)
        .commit();

And at your Fragment, do this:

public class LogFragment extends Fragment implements SensorAction {
    .....
    @Override
    public void sensorUpdate(SensorEvent evt) {
        // do calculations on the sensors' thread.
        // Remember: fields shared with the jme thread should be volatile for thread safety.
    }

    @Override
    public void jmeUpdate(float tpf) {
        // update with jme
    }
}

EDIT:
An even more optimized solution is to create a separate thread for the SensorAction, for example:

public final class JmeSensorManager extends BaseAppState implements SensorEventListener2 {

    private static final SensorThread sensorThread = new SensorThread();
    // volatile so the double-checked locking below is safe
    private static volatile JmeSensorManager jmeSensorManager;
    // volatile for thread safety
    private volatile boolean hasReceivedSensorSignals = false;
    private final ArrayList<SensorAction> actions = new ArrayList<>();

    private JmeSensorManager() {
    }

    public static JmeSensorManager getJmeSensorManager() {
        if (jmeSensorManager == null) {
            synchronized (JmeSensorManager.class) {
                if (jmeSensorManager == null) {
                    jmeSensorManager = new JmeSensorManager();
                    sensorThread.start();
                }
            }
        }
        return jmeSensorManager;
    }

    @Override
    public void onSensorChanged(final SensorEvent event) {
        if (!isEnabled()) {
            return;
        }
        // TODO: your orientation calculations here; they will be posted on the sensor thread
        actions.forEach(action -> sensorThread.postAction(() -> {
            action.sensorUpdate(event);
            hasReceivedSensorSignals = true;
        }));
    }

    /**
     * Updates the sensor values with the jme update.
     */
    @Override
    public void update(float tpf) {
        if (!hasReceivedSensorSignals) {
            return;
        }
        // update the values with jme3 by setting local rotations on some targets, etc.
        actions.forEach(action -> action.jmeUpdate(tpf));
        hasReceivedSensorSignals = false;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }

    @Override
    public void onFlushCompleted(Sensor sensor) {
        // required by SensorEventListener2
    }

    public void addSensorAction(SensorAction action) {
        actions.add(action);
    }

    // BaseAppState lifecycle boilerplate
    @Override
    protected void initialize(Application app) {
    }

    @Override
    protected void cleanup(Application app) {
    }

    @Override
    protected void onEnable() {
    }

    @Override
    protected void onDisable() {
    }

    private static class SensorThread extends Thread {
        private volatile Runnable action;

        @Override
        public void run() {
            // busy-wait loop; a BlockingQueue would be kinder to the CPU
            for (;;) {
                if (action == null) {
                    continue;
                }
                action.run();
                action = null;
            }
        }

        public synchronized void postAction(Runnable action) {
            this.action = action;
        }
    }
}

Just to be additive:
some sensors work in a different space system, so you may need to check this list:

  1. Left-handed or right-handed space, for sensors and/or GL/DirectX.
  2. Then the UP vector: for example, camera X may be MCU Z, lidar Y, or screen -Y.
  3. Data format: big- or little-endian. Java uses big-endian while PCL and others use little-endian, whether on Windows or Linux.

Once those match, rotations work well. (It took me a while to figure these out.)
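As a concrete sketch of the axis-alignment point, here is the sensor(x,y,z) → JME(y,z,x) mapping from the OrientationUtil example earlier in the thread, expressed on plain arrays. This particular cyclic permutation is even, so it preserves handedness and the quaternion's vector part permutes exactly like a vector with w unchanged; odd permutations or single-axis flips change handedness and also need sign flips, so check your device's datasheet:

```java
// Axis remap between a sensor frame and JME's frame, using plain arrays
// (vectors {x, y, z}, quaternions {x, y, z, w}) so it can be verified
// without JME on the classpath.
public class AxisRemapDemo {

    /** Vector: sensor (x, y, z) -> JME (y, z, x). */
    public static double[] vecSensorToJme(double[] v) {
        return new double[] { v[1], v[2], v[0] };
    }

    /**
     * Quaternion: for an even (handedness-preserving) axis permutation,
     * the vector part permutes the same way and w is unchanged.
     */
    public static double[] quatSensorToJme(double[] q) {
        return new double[] { q[1], q[2], q[0], q[3] };
    }

    public static void main(String[] args) {
        double h = Math.sqrt(0.5);
        // A 90-degree turn about the sensor's X axis becomes a 90-degree
        // turn about JME's Z axis, since sensor X maps to JME Z here.
        double[] jmeQuat = quatSensorToJme(new double[] { h, 0, 0, h });
        System.out.printf("(%.4f, %.4f, %.4f, %.4f)%n",
                jmeQuat[0], jmeQuat[1], jmeQuat[2], jmeQuat[3]);
    }
}
```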