Touch input system with InputManager

A rather big change to the input system is coming: AndroidInput gets integrated into the jME3 InputManager, and TouchEvent is the new

generic class for transporting the events. Generic here means it is no longer Android-dependent; it could theoretically come from any touch input device.



The benefit for app developers: there is now no difference between Android and desktop code; the same code compiles and runs on both.



For the upcoming details, please see this thread:

http://hub.jmonkeyengine.org/groups/contribution-depot-jme3/forum/topic/android-input-system-integration/



Usage details will follow as soon as the commits are in SVN.


Using the raw input listener:



I took the following example from my own code.

It shows how to use the raw input listener interface; an example for using actions/triggers will follow.



[java]public class VirtualJoystick implements RawInputListener {

    public VirtualJoystick(InputManager inputManager, Node rootNode) {
        inputManager.addRawInputListener(this);
    }

    …
[/java]

[java]    /**
     * Gets called by the InputManager on all touch/drag/scale events
     */
    @Override
    public void onTouchEvent(com.jme3.input.event.TouchEvent evt) {
        switch (evt.getType()) {
            case MOVE:
                lastX = evt.getX();
                lastY = fScreenHeight - evt.getY();
                …
                break;
            case TAP:
                lastX = evt.getX();
                lastY = fScreenHeight - evt.getY();
                …
                break;
            case KEY_DOWN:
                key = evt.getKeyCode();
                break;
        }
    }
[/java]
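A side note on the `fScreenHeight - evt.getY()` lines above: they flip the Y coordinate between a bottom-origin and a top-origin convention, which is needed whenever the event source and the consuming code disagree on where Y starts. A self-contained sketch of just that conversion (the screen height of 800 is an assumed value for illustration):

```java
// Flip a Y coordinate between bottom-origin and top-origin conventions,
// as in the fScreenHeight - evt.getY() pattern above.
public class YFlip {
    static float flipY(float screenHeight, float y) {
        return screenHeight - y;
    }

    public static void main(String[] args) {
        float fScreenHeight = 800f; // assumed screen height for illustration
        float touchY = 50f;         // 50 px from the bottom
        float uiY = flipY(fScreenHeight, touchY);
        System.out.println(uiY);    // 750.0: 50 px from the top
        // Applying the flip twice returns the original value.
        System.out.println(flipY(fScreenHeight, uiY)); // 50.0
    }
}
```

The conversion is its own inverse, so the same helper works in both directions.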

Will it work with Windows Touch? I suppose there are people who like leaving greasy finger marks on their PC screens…

Theoretically yes, as touch events are now supported by the jME3 core. In practice, a WindowsTouchInput class would be needed to grab the events from the OS, just as the AndroidInput class does now.

Great news! I’m going to test it as soon as possible… I’ll get in touch with any feedback.

Using the mapped/triggered touch listener:

[java]public class VirtualJoystick implements TouchListener {

    public VirtualJoystick(InputManager inputManager, Node rootNode) {
        inputManager.addMapping("Touch", new TouchTrigger(0));
        inputManager.addListener(this, new String[]{"Touch"});
    }

    …
[/java]

[java]    /**
     * Gets called by the InputManager on all touch/drag/scale events
     */
    @Override
    public void onTouch(String name, TouchEvent evt, float tpf) {
        switch (evt.getType()) {
            case MOVE:
                lastX = evt.getX();
                lastY = fScreenHeight - evt.getY();
                logger.info("Pressure " + evt.getPressure());
                …
[/java]

I tested it and it is working, but only on a real device. How can I test it using the mouse in a standard desktop environment?



I tried this:



touchInput.setSimulateMouse(true);





but touchInput is null …

You mean with the emulator? You can’t.

Or on the desktop? There you won’t get touch events.

I meant on the desktop.

OK, thanks. Sad :frowning:

touchInput.setSimulateMouse(true) enables MouseEvent generation in the touch input.



You would need TouchEvent generation in the mouse input :slight_smile:

But I thought the Android emulator simulates touch input? I mean, Android doesn’t even have a concept of a mouse.

It seems that these events work on both desktop and Android:

MouseButtonTrigger

MouseAxisTrigger



except that for me, MouseAxisTrigger is inverted (right is left and top is down) on Android. So I wanted to give the TouchListener a try.



It would be cool if mouse events could be translated into TouchEvents on the desktop, to save the pain of testing on Android. I enjoy developing on the desktop and only testing on a real device occasionally to check that everything works.

You could generate the touch events yourself on the desktop side.



Implement a RawInputListener to catch the mouse events and feed them to your touch listener.



What do others think about this: would it make sense to add this functionality to the mouse input?
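To sketch the idea above: the listener tracks whether a mouse button is held and emits touch-style DOWN/MOVE/UP events accordingly. The event types below are simplified stand-ins for jME3’s MouseButtonEvent/MouseMotionEvent/TouchEvent (the real classes need the engine on the classpath), so only the mapping logic is shown:

```java
// Sketch of translating desktop mouse events into touch-style events.
// The Touch/TouchType types are simplified stand-ins, not jME3 classes.
public class MouseToTouch {
    enum TouchType { DOWN, MOVE, UP }

    static class Touch {
        final TouchType type;
        final float x, y, dx, dy;
        Touch(TouchType type, float x, float y, float dx, float dy) {
            this.type = type; this.x = x; this.y = y; this.dx = dx; this.dy = dy;
        }
    }

    private boolean buttonDown = false;

    // Called for mouse button presses/releases: press maps to DOWN, release to UP.
    Touch onMouseButton(boolean pressed, float x, float y) {
        buttonDown = pressed;
        return new Touch(pressed ? TouchType.DOWN : TouchType.UP, x, y, 0, 0);
    }

    // Called for mouse motion; only a drag (button held) counts as a touch MOVE.
    Touch onMouseMotion(float x, float y, float dx, float dy) {
        return buttonDown ? new Touch(TouchType.MOVE, x, y, dx, dy) : null;
    }

    public static void main(String[] args) {
        MouseToTouch t = new MouseToTouch();
        System.out.println(t.onMouseButton(true, 10, 20).type);  // DOWN
        System.out.println(t.onMouseMotion(15, 25, 5, 5).type);  // MOVE
        System.out.println(t.onMouseButton(false, 15, 25).type); // UP
        System.out.println(t.onMouseMotion(20, 30, 5, 5));       // null: no drag
    }
}
```

In a real jME3 RawInputListener you would do the same state tracking inside onMouseButtonEvent/onMouseMotionEvent and hand the resulting events to your TouchListener.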

Just as a note: the MT4j framework has this kind of mouse touch emulation. It even emulates two fingers with two mice! That functionality is based on a third-party native DLL, so if you want this in jMonkeyEngine, you could investigate it.

Hello,

I have a problem when I move an object on the screen (an Acer A500 tablet). The object doesn’t move correctly. Could someone please help me? Thank you very much.



Here is my code:



[java]public class SimpleTexturedTest extends SimpleApplication implements
        ActionListener, TouchListener {

    protected Box refrid;
    Boolean isRunning = true;
    AndroidInput input;

    @Override
    public void simpleInitApp() {
        Logger.getLogger("").setLevel(Level.SEVERE);
        inputManager.addMapping("Touch", new TouchTrigger(0));
        inputManager.addListener(this, new String[] { "Touch" });

        flyCam.setEnabled(false);
        viewPort.setBackgroundColor(new ColorRGBA(0.76f, 0.8f, 0.84f, 1));

        // assetManager.registerLocator("./test.zip", ZipLocator.class.getName());

        refrid = new Box(Vector3f.ZERO, 1, 1, 1); // create cube shape at the origin
        Geometry geom = new Geometry("Box", refrid); // create cube geometry from the shape
        Material mat = new Material(assetManager,
                "Common/MatDefs/Misc/Unshaded.j3md"); // create a simple material
        mat.setColor("Color", ColorRGBA.Blue); // set color of material to blue
        geom.setMaterial(mat); // set the cube's material
        rootNode.attachChild(geom);

        // Create a wall with a simple texture from test_data
        Box box = new Box(Vector3f.ZERO, 2.5f, 2.5f, 1.0f);
        Spatial wall = new Geometry("Box", box);
        Material mat_brick = new Material(assetManager,
                "Common/MatDefs/Misc/Unshaded.j3md");
        mat_brick.setTexture("ColorMap", assetManager
                .loadTexture("Textures/Terrain/BrickWall/BrickWall.jpg"));
        wall.setMaterial(mat_brick);
        wall.setLocalTranslation(2.0f, -2.5f, 0.0f);
        rootNode.attachChild(wall);

        // Display a line of text with a default font
        guiNode.detachAllChildren();
        guiFont = assetManager.loadFont("Interface/Fonts/Default.fnt");
        BitmapText helloText = new BitmapText(guiFont, false);
        helloText.setSize(guiFont.getCharSet().getRenderedSize());
        helloText.setText("Hello World");
        helloText.setLocalTranslation(300, helloText.getLineHeight(), 0);
        guiNode.attachChild(helloText);
    }

    @Override
    public void onTouch(String name, TouchEvent event, float tpf) {
        float lastX;
        float lastY;
        float deltaX;
        float deltaY;
        float pressure;
        int objectId = -1;

        switch (event.getType()) {
        case MOVE:
            lastX = event.getX();
            lastY = event.getY();
            deltaX = event.getDeltaX() / 100;
            deltaY = event.getDeltaY() / 100;

            // Unproject the touch point at depths 0 and 1 to build a pick ray
            Vector2f touch2D = new Vector2f(lastX, lastY);
            Vector3f touch3D = cam.getWorldCoordinates(
                    new Vector2f(touch2D.x, touch2D.y), 0f).clone();
            Vector3f dir = cam
                    .getWorldCoordinates(new Vector2f(touch2D.x, touch2D.y), 1f)
                    .subtractLocal(touch3D).normalizeLocal();

            CollisionResults results = new CollisionResults();
            Ray ray = new Ray(touch3D, dir);
            rootNode.collideWith(ray, results);
            CollisionResult closest = null;
            if (results.size() > 0) {
                closest = results.getClosestCollision();
            }

            int nNode = rootNode.getQuantity();
            Log.e("", "Node number " + nNode);
            CollisionResults resultsN;
            // Guard on closest so a miss (closest == null) cannot cause an NPE
            for (int j = 0; j < nNode && closest != null; j++) {
                objectId = -1;
                resultsN = new CollisionResults();
                rootNode.getChild(j).collideWith(ray, resultsN);
                if (resultsN.size() > 0) {
                    if (closest.compareTo(resultsN.getClosestCollision()) == 0) {
                        objectId = j;
                        break;
                    }
                }
                // Log.e("", "Node number " + j);
            }
            if (objectId >= 0) {
                com.jme3.math.Transform transform = rootNode.getChild(objectId).getLocalTransform();
                Vector3f v = transform.getTranslation();
                Log.e("", " coordinate of node " + objectId
                        + " x= " + v.x + " y= " + v.y + " z= " + v.z);
                float tmpx = v.x + deltaX;
                float tmpy = v.y + deltaY;
                Log.e("", " coordinate changed " + tmpx + " " + tmpy + " " + v.z);
                rootNode.getChild(objectId).setLocalTranslation(v.x + deltaX,
                        v.y + deltaY, v.z);
            }
            break;
        case TAP:
            break;
        case LONGPRESSED:
            break;
        case UP:
            break;
        case FLING:
            break;
        default:
            break;
        }
        Log.e("", "Event Type " + event.getType());
        event.setConsumed();
    }

    @Override
    public void onAction(String name, boolean value, float tpf) {
        Log.e("", name);
    }
}[/java]
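For reference, the MOVE branch in the code above builds its pick ray by unprojecting the touch point at view depths 0 and 1 and normalizing the difference. Here is that vector math in isolation, with plain float arrays standing in for jME3’s Vector3f:

```java
// Computes a normalized ray direction from a near point to a far point,
// as done with subtractLocal().normalizeLocal() in the snippet above.
public class RayDir {
    static float[] direction(float[] near, float[] far) {
        float dx = far[0] - near[0];
        float dy = far[1] - near[1];
        float dz = far[2] - near[2];
        float len = (float) Math.sqrt(dx * dx + dy * dy + dz * dz);
        return new float[] { dx / len, dy / len, dz / len };
    }

    public static void main(String[] args) {
        // A point 10 units straight ahead yields a unit ray along +Z.
        float[] d = direction(new float[] { 0, 0, 0 }, new float[] { 0, 0, 10 });
        System.out.println(d[0] + " " + d[1] + " " + d[2]); // 0.0 0.0 1.0
    }
}
```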



[java]public class JmeAndroidActivity extends AndroidHarness {
    // /** Called when the activity is first created. */

    public JmeAndroidActivity() {
        // Set the application class to run
        appClass = "test.android.jme.SimpleTexturedTest";
        // eglConfigType = ConfigType.FASTEST; // Default
        // Edit 22.06.2011: added a switch to get the highest (best) egl config
        // available on this device. It's usually RGBA8888.
        eglConfigType = ConfigType.BEST;
        // Exit dialog title & message
        exitDialogTitle = "Exit?";
        exitDialogMessage = "Press Yes";
        // Edit 25.06.2011: Enable verbose logging
        eglConfigVerboseLogging = true;
        // Edit 30.06.2011: Choose screen orientation
        // screenOrientation = ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE;
        // Edit 12.07.2011: Invert the MouseEvents X (default = true)
        mouseEventsInvertX = true;
        // Edit 05.07.2011: Invert the MouseEvents Y (default = true)
        mouseEventsInvertY = true;
        // screenFullScreen = false;
    }
}[/java]

This program doesn’t run correctly on the second move. Could anyone please give me an idea?

Hello my friend,



I need your help. Is this a bug in jME3-Android?

stop it already :x