Creating Engine Support for Android Sensor Input

After reading the recent post about someone wanting access to Android’s sensor interface, I was inspired to work on adding engine support for it. Before starting, I want to make sure people think this is a good idea and that no one is already working on it.

I think there are two choices: create a new input type, SensorInput, similar to TouchInput, or shoehorn sensors into one of the existing input types. Personally, I think it needs a new input type.

Is there any interest?


This is something I thought about a while ago, and it’s a must for the engine. It was quite hard for me to learn and get to grips with the accelerometer, and then you’ve also got the gyroscope, magnetic field, and other sensors. Built-in support in jME would be so sweet!!

1 Like

Thank you for your interest @iwgeric.

I first thought those components had already been implemented in the library.

There are a lot of game ideas that could be achieved with those sensor inputs in combination with the jME3 library.

It would really be great if someone added those components to the jME library.

jME3 is one of the most powerful and sophisticated game engine libraries.

Thank you.

@iwgeric yeah would be a nice feature.

A new SensorInput input type sounds good. Mobile phones aren’t the only devices with accelerometers; some gamepads have them too, so it would not be that specific to mobiles.

@normen, @Momoko_fan what do you think?

1 Like

On a side note, if you do add the input type, maybe it could work with the existing AnalogListener if you can’t make a SensorListener, though a dedicated listener would be preferable.

The TouchInput is badly designed right now IMO, because it shortcuts JME’s input mapping, and does not work like any other input.

It will likely be rewritten from scratch at some point (3.1 maybe), to fit with JME input system.

So, to keep things clean, it would be nice if SensorInput worked the way key, mouse, and joystick input work.


I’ve got a structure in place to enable a SensorInput, SensorTrigger, MotionSensorInterface, MotionSensorEvent, etc. I’m planning on starting with the Magnetic Field sensor, Accelerometer sensor, and Orientation sensor (Orientation is actually not a sensor, but I’m going to treat it like one). I’m not sure exactly what you are unhappy about with TouchInput, but I’m assuming it’s the fact that only the keys have their own triggers and the soft event types do not. If so, I’ve made it so each sensor type has its own trigger.
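To make the per-sensor-trigger idea concrete, here is a rough sketch of what such a trigger could look like, modeled on jME3's existing KeyTrigger. The Trigger interface below is a minimal stand-in for com.jme3.input.controls.Trigger so the snippet is self-contained; the hash offset and naming are my own assumptions, not the actual patch.

```java
// Minimal stand-in for com.jme3.input.controls.Trigger (illustration only).
interface Trigger {
    String getName();
    int triggerHashCode();
}

// Hypothetical per-sensor trigger, one trigger per sensor type,
// analogous to how KeyTrigger wraps a key code.
class SensorTrigger implements Trigger {
    private final int sensorType;

    public SensorTrigger(int sensorType) {
        this.sensorType = sensorType;
    }

    public String getName() {
        return "SensorTrigger - type " + sensorType;
    }

    // Give each sensor type a distinct hash so each type maps separately,
    // the same way KeyTrigger hashes key codes into a reserved range.
    // The offset here is arbitrary and purely illustrative.
    public static int sensorHash(int sensorType) {
        return 0x5e450000 + sensorType;
    }

    public int triggerHashCode() {
        return sensorHash(sensorType);
    }
}
```

With a scheme like this, `new SensorTrigger(MAGNETIC_FIELD)` and `new SensorTrigger(ACCELEROMETER)` hash to different mappings, which is exactly the property TouchInput's single ALL trigger lacks.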


I still have a lot of work to do to implement the sensor data in a good way, but I would like some input on the structure I’m planning for the core, which allows for context-specific sensor implementations. Should I post the new classes and the changes to InputManager now so you can see what I’m thinking, or should I wait until I finish the AndroidSensorInput class as well?

1 Like

What I’m unhappy with is that you have to do inputManager.addMapping("whatever", new TouchTrigger(TouchInput.ALL));

then check for the event type in the event listener.

The mapping becomes useless.

We should be able to do inputManager.addMapping("whatever", new TouchTrigger(TouchInput.TAP)); for example.

Other than that, don’t you have a MotionSensorListener?

@iwgeric <3 can’t wait for this :slight_smile:

@nehon said:
What i'm unhappy with is that you have to do inputManager.addMapping("whatever",new TouchTrigger(TouchInput.ALL));
then check for the event type in the event listener.
The mapping become useless.
We should be able to do inputManager.addMapping("whatever",new TouchTrigger(TouchInput.TAP)); for example

Other than that don't you have a MotionSensorListener?

Yeah, that's what I thought you meant.

And yes, I have a MotionSensorListener as well. I'm treating any sensor that returns a 3-component result (x, y, z) as a motion sensor right now. They all share the same listener and event class, but each sensor has its own trigger.
Something like:

    inputManager.addMapping("myMagneticSensor", new SensorTrigger(SensorInput.MAGNETIC_FIELD));
    inputManager.addListener(this, "myMagneticSensor");

    inputManager.addMapping("myAccelerometerSensor", new SensorTrigger(SensorInput.ACCELEROMETER));
    inputManager.addListener(this, "myAccelerometerSensor");

    public void onSensorChange(String mapName, int sensorType, float x, float y, float z, float dX, float dY, float dZ) {
        logger.log(Level.INFO, "onSensorChange: {0}, Type: {1}, x: {2}, y: {3}, z: {4}",
                new Object[]{mapName, sensorType, x, y, z});
    }

Returning the sensor type is probably redundant since the trigger and map name are different for each sensor type, but I left it in for now. Is it a good idea to share the same listener for all motion sensors, or to break it up into separate listeners? I was thinking it would be better to keep it common so it would be easier to add additional sensor types in the future without creating a listener for each type. Not sure yet.

@wezrule said:
@iwgeric <3 cant wait for this :)

In a couple of days, I should have the basic framework done and at least one sensor type implemented. There are a lot of files to touch to make SensorInput a core feature and have the non-Android implementations return null, so I'm not sure how easy it will be to apply my patches. We'll see :)
1 Like
@iwgeric said:
Is it a good idea to share the same listener for all motion sensors or break it up into seperate listeners?

Yes it is IMO, keep it as a single listener.
From the look of it one could do
inputManager.addMapping("myMagneticSensor", new SensorTrigger(SensorInput.MAGNETIC_FIELD));
inputManager.addMapping("myOwnBrewedSensor", new SensorTrigger(SensorInput.WHATEVER_OTHER_SENSOR));
So that's fine, and generic enough.

Keep it up, this looks really nice.
1 Like
@nehon said:
Yes it is IMO, keep it as a single listener.

I think I'll also leave the sensor type as a parameter in the listener. That way it could also be used like:

    inputManager.addMapping("mySensors",
            new SensorTrigger(SensorInput.MAGNETIC_FIELD),
            new SensorTrigger(SensorInput.WHATEVER_OTHER_SENSOR));
1 Like

Yeah, but in most cases if you do that, it's because you want them to have the same behavior, so the type does not really matter.

1 Like

Well, that’s true :slight_smile:

1 Like

I just committed the first round of changes to support sensor input. Only Android has an implementation for sensors at this time, but the support is wired into the InputManager so other platforms can add sensor implementations later.

Basically, there are three sensor types defined: Magnetic, Acceleration, and Orientation. Each sensor type has a separate trigger, but they all share a common listener for receiving data. To conserve battery life, the sensors must be enabled by the application before data is collected and delivered. The default state is disabled so we don’t waste battery life for apps that don’t require sensor input.

One major design task left to complete is returning the data in a coordinate system that is useful for applications.

The Magnetic and Acceleration sensors return data according to Android’s device coordinate system. See this link for a description of the coordinate system used for the Magnetic and Acceleration sensors.

Orientation is really a method call to Android’s SensorManager that calculates the device orientation based on the data received from the Magnetic and Acceleration sensors. The data is relative to the Earth, which makes it a little tricky to use in applications unless we can figure out how to convert that coordinate system into OpenGL screen coordinates. See this link for a description of the method used to get the device orientation.
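To give a feel for the math involved: Android combines both sensors to build a full rotation matrix, but the tilt part (pitch and roll) can be sketched from the accelerometer alone; the azimuth additionally needs the magnetometer. This is an illustration under one common axis/sign convention, not the engine's actual code:

```java
// Illustrative tilt math: derive pitch and roll angles from a gravity
// reading in device coordinates (x right, y up, z out of the screen).
// Conventions and signs here are assumptions for illustration only;
// the azimuth (compass heading) is omitted since it needs the magnetometer.
class TiltFromAccelerometer {
    /** Tilt angle along the device x-axis, in radians. */
    static double pitch(double ax, double ay, double az) {
        return Math.atan2(-ax, Math.sqrt(ay * ay + az * az));
    }

    /** Tilt angle along the device y/z plane, in radians. */
    static double roll(double ay, double az) {
        return Math.atan2(ay, az);
    }
}
```

For a device lying flat (gravity reading roughly (0, 0, 9.81)), both angles come out near zero; standing the device on its left edge swings pitch toward ±90 degrees.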

Below is an example program that defines the InputManager triggers to receive the sensor data and applies the data to an object. Be warned: the objects do not move the way you might expect, because of the difference between the OpenGL world coordinate system and the sensors’ coordinate system.

I would appreciate any thoughts on how to convert the coordinate system for the sensor data so that it can be applied directly to geometries, vehicles, and characters.

When running the program below, touch the screen to enable / disable the sensors. When the sensors are enabled, the geometries should move as the device is rotated.


    package mygame;

    import com.jme3.app.SimpleApplication;
    import com.jme3.input.MouseInput;
    import com.jme3.input.SensorInput;
    import com.jme3.input.controls.ActionListener;
    import com.jme3.input.controls.MotionSensorListener;
    import com.jme3.input.controls.MouseButtonTrigger;
    import com.jme3.input.controls.SensorTrigger;
    import com.jme3.material.Material;
    import com.jme3.math.ColorRGBA;
    import com.jme3.math.Quaternion;
    import com.jme3.math.Vector3f;
    import com.jme3.renderer.RenderManager;
    import com.jme3.scene.Geometry;
    import com.jme3.scene.shape.Box;
    import java.util.logging.Level;
    import java.util.logging.Logger;

    public class Main extends SimpleApplication implements ActionListener, MotionSensorListener {

        private static final Logger logger = Logger.getLogger(Main.class.getName());
        private Geometry geomZero = null;
        private Geometry geomXPlus = null;
        private Quaternion rotQuat = new Quaternion();

        public static void main(String[] args) {
            Main app = new Main();
            app.start();
        }

        @Override
        public void simpleInitApp() {
            Box b = new Box(Vector3f.ZERO, 1, 1, 1);
            geomZero = new Geometry("Box", b);
            geomXPlus = new Geometry("Box", b);

            Material mat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
            mat.setColor("Color", ColorRGBA.Blue);
            geomZero.setMaterial(mat);
            geomXPlus.setMaterial(mat);

            geomZero.setLocalTranslation(0f, 0f, 0f);
            geomXPlus.setLocalTranslation(3f, 0f, 0f);
            rootNode.attachChild(geomZero);
            rootNode.attachChild(geomXPlus);

            inputManager.addMapping("MouseClick", new MouseButtonTrigger(MouseInput.BUTTON_LEFT));
            inputManager.addListener(this, "MouseClick");
            inputManager.addMapping("MagneticSensor", new SensorTrigger(SensorInput.SENSOR_TYPE_MAGNETIC_FIELD));
            inputManager.addListener(this, "MagneticSensor");
            inputManager.addMapping("AccelerometerSensor", new SensorTrigger(SensorInput.SENSOR_TYPE_ACCELEROMETER));
            inputManager.addListener(this, "AccelerometerSensor");
            inputManager.addMapping("OrientationSensor", new SensorTrigger(SensorInput.SENSOR_TYPE_ORIENTATION));
            inputManager.addListener(this, "OrientationSensor");
        }

        @Override
        public void simpleUpdate(float tpf) {
            //TODO: add update code
        }

        @Override
        public void simpleRender(RenderManager rm) {
            //TODO: add render code
        }

        public void onMotionSensorChange(String string, int sensorType, float x, float y, float z, float dX, float dY, float dZ) {
            if (string.equalsIgnoreCase("OrientationSensor")) {
                logger.log(Level.INFO, "onMotionSensorChange for {0}, x:{1}, y:{2}, z:{3}, dX: {4}, dY: {5}, dZ: {6}",
                        new Object[]{string, x, y, z, dX, dY, dZ});
                if (geomZero != null) {
                    // apply the absolute orientation angles to one geometry
                    rotQuat.fromAngles(x, y, z);
                    geomZero.setLocalRotation(rotQuat);
                }
                if (geomXPlus != null) {
                    // apply the orientation deltas to the other geometry
                    geomXPlus.rotate(dX, dY, dZ);
                }
            }
        }

        public void onAction(String string, boolean pressed, float tpf) {
            if (string.equalsIgnoreCase("MouseClick") && pressed) {
                if (inputManager.getSensorInput() != null) {
                    // toggle the sensors on and off with each mouse click / touch
                    boolean orientEnable = inputManager.getSensorInput().isEnabled(SensorInput.SENSOR_TYPE_ORIENTATION);
                    inputManager.getSensorInput().setEnable(SensorInput.SENSOR_TYPE_ORIENTATION, !orientEnable);
                }
            }
        }
    }

Yayyyyy <333 I haven’t tested this but it looks pretty sweet so far :slight_smile: The only thing I would say is the lines can become quite long (I can’t read them fully on my phone). Couldn’t the onMotionSensorChange method use a Vector3f instead of x, y, and z? And to be more consistent with jME naming, maybe call it SensorInput.SENSOR_ACCELEROMETER (without the word TYPE), etc.

Keep up the good work! Can’t wait to have a seamless Android implementation for the sensors; it’s gonna be so awesome. Regarding the coordinate system, hmm, it also needs to take into account the orientation of the Android device (portrait or landscape). Will the data returned be the same in both cases?

1 Like
@wezrule said:
The only thing I would say is the lines can become quite long (i can't read them fully on my phone). Could the onMotionSensorChange method not use a Vector3f instead of x y and z. And I think to be more consistent with jME naming maybe call it SensorInput. SENSOR_ACCELEROMETER (without the word TYPE) etc.

Both good remarks. The only thing that bothers me is that those are actually rotation angles... feeding them in as a Vector3f would be convenient, but one would assume it's actually a vector...
1 Like

Oh, I haven’t looked into how the code operates. I thought he was going to translate them into a sort of 3D mouse, with normal x and y but with an additional z component. If it’s rotation, then perhaps an array of size 3 would be more appropriate, or maybe even a quaternion.
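Turning three rotation angles into a quaternion is straightforward; jME3's own Quaternion.fromAngles already does something like this. A self-contained sketch of the math, using the common ZYX (yaw-pitch-roll) order (the class name and the angle ordering are my own assumptions for illustration):

```java
// Illustrative only: build a unit quaternion {w, x, y, z} from three Euler
// angles in the common ZYX (yaw-pitch-roll) order. In a real jME3 app you
// would just call Quaternion.fromAngles; the ordering here is an assumption.
class EulerToQuaternion {
    /** @return {w, x, y, z} for yaw (about z), pitch (about y), roll (about x). */
    static double[] fromAngles(double yaw, double pitch, double roll) {
        double cy = Math.cos(yaw * 0.5),   sy = Math.sin(yaw * 0.5);
        double cp = Math.cos(pitch * 0.5), sp = Math.sin(pitch * 0.5);
        double cr = Math.cos(roll * 0.5),  sr = Math.sin(roll * 0.5);
        double w = cr * cp * cy + sr * sp * sy;
        double x = sr * cp * cy - cr * sp * sy;
        double y = cr * sp * cy + sr * cp * sy;
        double z = cr * cp * sy - sr * sp * cy;
        return new double[]{w, x, y, z};
    }
}
```

Zero angles give the identity quaternion, and a 90-degree yaw gives a pure rotation about z, so a listener could hand the result straight to a geometry's rotation once the axis remapping is sorted out.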

I was just having a browse through the code for r9610, and I think in the onPause() event handler:


should be


1 Like

Based on some of the comments I’ve received, it sounds like this implementation needs to change quite a bit. If I understand it right, mappings registered with InputManager should only produce single boolean results returned via onAction or single float results returned via onAnalog, and things like sensor input should instead be exposed as method calls on InputManager and/or received via a RawInputListener with its own encapsulating data class.

I can change this, but I want to make sure it is being done the correct way. I’ll try to spend a couple of days coming up with an outline of the changes. In my next post, I’ll list the planned methods that I think need to be added to InputManager and the planned listeners for RawInputListener (other than the onMotionSensorChanged that already exists in RawInputListener).

Does it make sense to allow users to register a mapping for an individual axis of a sensor, or should they just get the current sensor data with a method call to InputManager (in addition to the RawInputListener)?

Then, if we can all agree on these methods, I’ll make the changes. Sound like a plan?

@wezrule Thanks for the catch in AndroidHarness onPause. I’ll also look into whether the data is being rotated as the device is rotated. I also agree that the data should be relative to the device orientation.

@Momoko_Fan I think the changes in AndroidHarness need to be there so that the sensors can be disabled when another activity comes to the foreground and the game activity goes to the background to conserve battery life.

1 Like

Sorry I wasn’t here earlier.

I actually created an issue report for this:

The general problem seems to be that sensor and touch input are too different from keyboard and mouse input, so frankly I am not sure how to implement them within the current system.