Android Motion Sensor in JME

Hello, I am making an Android project where the jME camera should rotate the same way the user is holding their phone. I am using jME 3.1 alpha. I added a dummy scene that shows a landscape with a box on each side, but when I set the camera rotation to the quaternion values returned by the motion sensor, it starts to rotate in a weird way.

Here is the code I am using for the scene:

package camsensor;

import com.jme3.app.SimpleApplication;
import com.jme3.app.state.AppState;
import com.jme3.math.ColorRGBA;
import com.jme3.math.Vector3f;
import com.jme3.scene.Spatial;

/**
 * test
 * @author normenhansen
 */
public class Main extends SimpleApplication {
    public static AppState state=null;
    
    public static void main(String[] args) {
        Main app = new Main();
        app.start();
    }

    @Override
    public void simpleInitApp() {
        this.cam.setLocation(Vector3f.ZERO);//set the camera location to the center
        this.getViewPort().setBackgroundColor(new ColorRGBA(135.0f/255.0f, 206.0f/255.0f, 250.0f/255.0f, 0));//sky color
        Spatial geom=assetManager.loadModel("Scenes/Scene.j3o");
        rootNode.attachChild(geom);
        if(Main.state!=null)
            Main.state.initialize(stateManager, this);
    }
}

This is the code I am using for Android:

package com.mycam.camsensor;
 
import android.app.Activity;
import android.app.FragmentManager;
import android.content.Context;
import android.hardware.*;
import android.os.Bundle;
import android.view.Window;
import android.view.WindowManager;
import com.jme3.app.AndroidHarnessFragment;
import java.util.logging.Level;
import java.util.logging.LogManager;
import camsensor.Main;
import com.jme3.app.Application;
import com.jme3.app.state.AppState;
import com.jme3.app.state.AppStateManager;
import com.jme3.math.Quaternion;
import com.jme3.renderer.Camera;
import com.jme3.renderer.RenderManager;
 
public class MainActivity extends Activity implements SensorEventListener{
    private SensorManager mSensorManager;
    final float[] mValuesMagnet = new float[3];
    final float[] mValuesAccel = new float[3];
    final float[] mValuesOrientation = new float[3];
    final float[] mQuaternion = new float[4];
    final float[] mRotationMatrix = new float[9];
    private final float []angles=new float[3];
    private final Quaternion quat=new Quaternion();
    private Application currentApp=null;
    private Camera cam=null;
    /*
     * Note that you can ignore the errors displayed in this file,
     * the android project will build regardless.
     * Install the 'Android' plugin under Tools->Plugins->Available Plugins
     * to get error checks and code completion for the Android project files.
     */

    private void createMotionSensor() {
        if (mSensorManager == null) {
            Context context = this.getApplicationContext();
            mSensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
            Sensor sensor = mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
            mSensorManager.registerListener(this,
                    sensor,
                    SensorManager.SENSOR_DELAY_GAME);
            Sensor asensor = mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
            mSensorManager.registerListener(this,
                    asensor,
                    SensorManager.SENSOR_DELAY_GAME);
        }
    }
 
    public MainActivity(){
        // Set the default logging level (default=Level.INFO, Level.ALL=All Debug Info)
        LogManager.getLogManager().getLogger("").setLevel(Level.INFO);
    }
 
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Set window fullscreen and remove title bar
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        getWindow().setFlags(
                WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);
        setContentView(R.layout.main);
 
        // find the fragment
        FragmentManager fm = getFragmentManager();
        AndroidHarnessFragment jmeFragment =
                (AndroidHarnessFragment) fm.findFragmentById(R.id.jmeFragment);
        Main.state=new AppState() {
            private boolean isInit=false;
            public void initialize(AppStateManager asm, Application aplctn) {
                isInit=true;
                currentApp=aplctn;
                cam = currentApp.getCamera();
                createMotionSensor();
            }

            public boolean isInitialized() {
                return isInit;
            }

            public void setEnabled(boolean bln) {
            }

            public boolean isEnabled() {
                return true;
            }

            public void stateAttached(AppStateManager asm) {
            }

            public void stateDetached(AppStateManager asm) {
            }

            public void update(float f) {
            }

            public void render(RenderManager rm) {
            }

            public void postRender() {
            }

            public void cleanup() {
            }
        };
        // uncomment the next line to add the default android profiler to the project
        //jmeFragment.getJmeApplication().setAppProfiler(new DefaultAndroidProfiler());
    }

    public void onSensorChanged(SensorEvent se) {
        
        switch (se.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                System.arraycopy(se.values, 0, mValuesAccel, 0, 3);
                break;

            case Sensor.TYPE_MAGNETIC_FIELD:
                System.arraycopy(se.values, 0, mValuesMagnet, 0, 3);
                break;
        }
        // The arrays are final and never null; what can fail is getRotationMatrix
        // (e.g. while the device is in free fall), so check its return value instead.
        if (SensorManager.getRotationMatrix(mRotationMatrix, null, mValuesAccel, mValuesMagnet)) {
            SensorManager.getQuaternionFromVector(mQuaternion,
                    SensorManager.getOrientation(mRotationMatrix, mValuesOrientation));
            rotateCameraQuaternion(mQuaternion);
//            rotateCameraEuler(mValuesOrientation);
        }
    }

    public void onAccuracyChanged(Sensor sensor, int i) {
    }
    
    private float toDegree(float angle)
    {
        return (float) Math.toDegrees(angle);
    }
    
    private void printCamAngle()
    {
        this.cam.getRotation().toAngles(angles);
        System.out.println("ANGLES:"+this.toDegree(angles[0])+","+this.toDegree(angles[1])+","+this.toDegree(angles[2]));
    }
    
    private void printQuaternion()
    {
        System.out.println("QUATERNION:"+this.cam.getRotation().getX()+","+this.cam.getRotation().getY()+","+this.cam.getRotation().getZ()+","+this.cam.getRotation().getW());
    }

    private void rotateCameraEuler(float[] mRadAngles) {
        if(currentApp!=null)
        {
            this.cam.setRotation(quat.fromAngles(mRadAngles[0], mRadAngles[1], mRadAngles[2]));
            printCamAngle();
            printQuaternion();
        }
    }

    private void rotateCameraQuaternion(float[] mQuaternion) {
        if(currentApp!=null)
        {
            quat.set(mQuaternion[1], mQuaternion[2], mQuaternion[3], mQuaternion[0]);
            this.cam.setRotation(quat);
            printCamAngle();
            printQuaternion();
        }
    }
 
    public static class JmeFragment extends AndroidHarnessFragment {
        public JmeFragment() {
            // Set main project class (fully qualified path)
            appClass = "camsensor.Main";
 
            // Set the desired EGL configuration
            eglBitsPerPixel = 24;
            eglAlphaBits = 0;
            eglDepthBits = 16;
            eglSamples = 0;
            eglStencilBits = 0;
 
            // Set the maximum framerate
            // (default = -1 for unlimited)
            frameRate = -1;
 
            // Set the maximum resolution dimension
            // (the smaller side, height or width, is set automatically
            // to maintain the original device screen aspect ratio)
            // (default = -1 to match device screen resolution)
            maxResolutionDimension = -1;
 
            // Set input configuration settings
            joystickEventsEnabled = false;
            keyEventsEnabled = true;
            mouseEventsEnabled = true;
 
            // Set application exit settings
            finishOnAppStop = true;
            handleExitHook = true;
            exitDialogTitle = "Do you want to exit?";
            exitDialogMessage = "Use your home key to bring this app into the background or exit to terminate it.";
 
            // Set splash screen resource id, if used
            // (default = 0, no splash screen)
            // For example, if the image file name is "splash"...
            //     splashPicID = R.drawable.splash;
            splashPicID = 0;
        }
    }
}

These are the assets I am using for the project: Dropbox - assets.zip

Thanks in advance


Did you have any success with this? I’ve got the same problem and can’t get it to work across all 3 axes.

Hello,

Sorry for the late response. I got it to work across all 3 axes by following the sensor fusion project code:
https://bitbucket.org/apacha/sensor-fusion-demo/src
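The core idea behind sensor fusion, stripped down to a single axis, is a complementary filter: integrate the fast-but-drifting gyroscope reading, and gently pull the result toward the slow-but-stable absolute angle from the accelerometer/magnetometer. This is just a toy sketch of that idea (the class name and the ALPHA value are my own, not taken from the linked project):

```java
// Toy one-axis complementary filter illustrating the sensor fusion idea:
// the gyro rate is integrated for responsiveness, while the absolute
// accel/magnet angle corrects long-term drift.
// ALPHA close to 1 means "trust the gyro more between corrections".
public class ComplementaryFilter {
    static final float ALPHA = 0.98f;
    private float angle; // fused estimate, in radians

    // gyroRate: angular velocity from the gyroscope (rad/s)
    // accelMagAngle: absolute angle derived from accel + magnetometer (rad)
    // dt: time since the last sample (s)
    public float update(float gyroRate, float accelMagAngle, float dt) {
        angle = ALPHA * (angle + gyroRate * dt) + (1f - ALPHA) * accelMagAngle;
        return angle;
    }
}
```

The real project does this in 3D with quaternions, but the principle is the same: the gyro gives smoothness, the other sensors give a drift-free reference.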

But take into consideration that Android devices (especially cheap ones) commonly lack a gyroscope, and sometimes even a compass or accelerometer. Handling all these cases was very frustrating, and in the end the user experience was poor on devices without a gyroscope, especially for augmented reality apps.
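To give an idea of what "handling all these cases" means, picking an orientation strategy from whatever sensors the device reports could be structured roughly like this (the enum and helper are purely illustrative; on Android the booleans would come from `SensorManager.getDefaultSensor(...) != null` checks):

```java
// Rough sketch: choose an orientation strategy based on available sensors.
// The enum and method names are illustrative, not from any real project.
public class OrientationStrategyPicker {

    public enum Strategy {
        GYRO_FUSION,   // best case: fuse gyroscope with accel/magnet
        ACCEL_MAGNET,  // fallback: works across 3 axes but is jerky
        UNSUPPORTED    // no usable orientation source on this device
    }

    public static Strategy pick(boolean hasGyro, boolean hasAccel, boolean hasMagnet) {
        if (hasGyro && hasAccel && hasMagnet) {
            return Strategy.GYRO_FUSION;
        }
        if (hasAccel && hasMagnet) {
            return Strategy.ACCEL_MAGNET;
        }
        return Strategy.UNSUPPORTED;
    }
}
```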

Hope this was helpful, and good luck :)

Thanks for the reply. I treated myself to a phone with a gyro a few days ago, as (like you say) using the accelerometer leads to a jerky, irritating experience. But with a gyro it works fine, very smooth. The only problem I have now is that jME renders everything upside-down. :(

Hey again,

Well, did you try the sensor fusion demo? It's better than the code I wrote for this example, and it should be easy to hook the sensor fusion code into jME. Concerning the upside-down case, I faced this issue with jME before, and the solution was to multiply the quaternion by an axis-flip quaternion to get the correct motion. Or you can multiply the retrieved rotation matrix by the axis flip. But as I said, try using the sensor fusion code instead, since it is more advanced and more precise.
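To make the axis-flip idea concrete, here is a small self-contained sketch using plain float math (not jME's Quaternion class, though `Quaternion.mult()` does the same thing). Which axis you need to flip depends on your device and screen orientation; I use a 180° flip about Z purely as an example:

```java
// Sketch of the axis-flip fix: compose the sensor quaternion with a fixed
// 180-degree "flip" quaternion to correct an upside-down view.
// Quaternions here are float[4] in (x, y, z, w) order, like jME uses.
public class QuatFlip {

    // 180-degree rotation about the Z axis; the axis to flip depends on
    // the device and screen orientation, so treat this as an example.
    static final float[] FLIP_Z = {0f, 0f, 1f, 0f};

    // Hamilton product: result = a * b
    static float[] mul(float[] a, float[] b) {
        return new float[] {
            a[3] * b[0] + a[0] * b[3] + a[1] * b[2] - a[2] * b[1], // x
            a[3] * b[1] - a[0] * b[2] + a[1] * b[3] + a[2] * b[0], // y
            a[3] * b[2] + a[0] * b[1] - a[1] * b[0] + a[2] * b[3], // z
            a[3] * b[3] - a[0] * b[0] - a[1] * b[1] - a[2] * b[2]  // w
        };
    }

    // Apply the flip to each sensor reading before cam.setRotation(...)
    static float[] corrected(float[] sensorQuat) {
        return mul(sensorQuat, FLIP_Z);
    }
}
```

The point is only that a constant 180° quaternion composed with every sensor reading flips the rendered view back the right way up.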

Hope this helps.