Android Input/Sensor system proposal with test

Hello,



I have been working on the JME/Android package and have a few suggestions. I think that trying to map Android MotionEvents to mouse/keyboard events is not a good idea at all… From my perspective, we should forget mouse/keyboard in JME on Android for the following reasons:


  • Android devices are exclusively touch devices (or trackball for some, but that yields the same events)
  • The keyboard should only be used for typing text, not for driving a game/application.



    I built a completely new AndroidInput class with high-level events:
  • TAP: User taps on screen (single touch)
  • DOUBLETAP: User double-taps on screen (single touch)
  • LONGPRESS: User touches down without releasing and holds the touch for more than 1 s
  • DRAG: User drags his finger on screen (single touch)
  • SCALE: User pinches (or spreads) two fingers on screen (double touch)
  • FLING: User drags rapidly across the screen, as if throwing something away (single touch)



    and low-level events:
  • GRAB: User touches down on the screen (single/double touch)
  • RELEASE: User lifts off the screen (single/double touch)
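    The whole taxonomy above could be captured in a single enum. A minimal sketch (the type name `TouchEventType` is illustrative, not the actual class from my code):

```java
// Illustrative enum of the proposed event taxonomy.
// The name TouchEventType is an assumption, not the real class.
public enum TouchEventType {
    // High-level events
    TAP, DOUBLETAP, LONGPRESS, DRAG, SCALE, FLING,
    // Low-level events
    GRAB, RELEASE
}
```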



    Every event is synchronized with the application thread using a classical ArrayBlockingQueue + pool mechanism, to avoid deadlocks and lost events.
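    As a rough sketch of that mechanism (class and method names are illustrative, not my actual code): the UI thread only calls non-blocking offer/poll, so it can never deadlock, and event objects are recycled through the pool instead of being allocated per touch:

```java
import java.util.concurrent.ArrayBlockingQueue;

// Illustrative sketch of the queue + pool synchronization.
// The Android UI thread pushes events; the application thread
// drains them once per frame and returns them to the pool.
public class EventQueueSketch {
    static final int CAPACITY = 64;

    static class InputEvent {
        int type;
        float x, y;
    }

    // Pending events, filled by the UI thread.
    private final ArrayBlockingQueue<InputEvent> pending =
            new ArrayBlockingQueue<>(CAPACITY);
    // Recycled event objects, to avoid per-event allocation.
    private final ArrayBlockingQueue<InputEvent> pool =
            new ArrayBlockingQueue<>(CAPACITY);

    public EventQueueSketch() {
        for (int i = 0; i < CAPACITY; i++) {
            pool.offer(new InputEvent());
        }
    }

    // Called on the UI thread; offer/poll never block, so no deadlock.
    public boolean push(int type, float x, float y) {
        InputEvent e = pool.poll();   // reuse an event object
        if (e == null) {
            return false;             // pool exhausted: drop, don't block
        }
        e.type = type;
        e.x = x;
        e.y = y;
        return pending.offer(e);
    }

    // Called once per frame on the application thread.
    public int drain() {
        int processed = 0;
        InputEvent e;
        while ((e = pending.poll()) != null) {
            // ... dispatch e to the scene or the application here ...
            pool.offer(e);            // return the event to the pool
            processed++;
        }
        return processed;
    }
}
```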

    Once detected and synchronized:
  1. A picking pass (only one, at event start) is done to get the scene object at the event location.
  2. The object is asked whether it has an action defined for this type of event.

    => If so, the corresponding action is called.

    => If not, we ask the application whether it has an action defined, and call that instead.



    This hierarchical process lets the programmer decide whether his application will be ‘event driven’ or ‘object driven’.
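    The object-first, application-fallback dispatch described above can be sketched like this (all names are illustrative, not my actual API):

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of hierarchical event dispatch:
// the picked scene object is asked first; if it has no action
// bound for the event type, the application-level table is used.
public class DispatchSketch {

    interface Action {
        void perform();
    }

    // A picked scene object with its own action table.
    static class SceneObject {
        final Map<String, Action> actions = new HashMap<>();
    }

    // Application-level fallback action table.
    private final Map<String, Action> appActions = new HashMap<>();

    public void bindAppAction(String eventType, Action a) {
        appActions.put(eventType, a);
    }

    // 'picked' is the result of the single picking pass (may be null).
    public boolean dispatch(SceneObject picked, String eventType) {
        Action a = (picked != null) ? picked.actions.get(eventType) : null;
        if (a == null) {
            a = appActions.get(eventType);  // fall back to the application
        }
        if (a == null) {
            return false;                   // nobody handles this event
        }
        a.perform();
        return true;
    }
}
```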



    I have used this mechanism since JME2 on multitouch screens/walls; it is very easy to program and use, and very responsive.

    In fact, I ‘downgraded’ it from multitouch to two-touch by reducing the number of event types, and ported it to Android.



    I’m working on a SensorInput class (almost finished).



    The enclosed video shows a demo on a Samsung Galaxy Tab. Some explanations about it:

    Unless I am missing something, I was not able to set more than one Mesh and more than one Material in the scene. Doing this yields a black screen but no crash, and the application stays responsive. The cause might be in OGLESShaderRenderer.java, but it is hard to tell if you are not familiar with this 2600-line file…

    Enabling verbose logging shows lots of “glError 1280” messages, but that kind of error arises even on a working scene.

    That is why we got three boxes using the same bird-texture Material… I used Toasts to show when Actions are processed.



    . Left box has a ‘reversed scale’ animation attached to TAP action, a ‘move’ action on DRAG and a ‘scale’ action on SCALE.

    . Right box has a ‘rotation’ animation attached to TAP.

    . Middle box has no action attached, so events pass through to the Application.

    . Middle box is attached to the gyroscope sensor, so it keeps itself vertical in 3D and points north…

    . Application has a ‘pan’ action (real pan, i.e. move in XY plane) attached to DRAG, a ‘zoom camera’ attached to SCALE and a ‘fit view’ attached to TAP.



    http://vimeo.com/19178070



    If anyone on the jMonkey Team is interested, I can share the code.



    Hope this helps. That’s all folks.

If your code is less disturbing than the title of this post, please share it :P GSoC 2011 is coming up, so this is an ideal time to bring in Android contributions.



I’m not completely sure if this applies, but there was a similar project referred to on this forum, called Multiplicity. Do these two developments relate somehow?

Hmm… This looks useful! Makes me wish I had an Android phone! A few questions:


  1. Is it jME3 or jME2?
  2. Does it work well with Android?
  3. Are you going to work/expand more on it? (Does it need expansion? lol)



    So yes, it looks pretty cool! I would totally help (if I was more knowledgeable/had more experience).



    Either way (be it jME2 or jME3, I have both), I would be very appreciative if you would share the code!
erlend_sh said:
If your code is less disturbing than the title of this post, please share it :P GSoC 2011 is coming up, so this is an ideal time to bring in the Android contributions.


Hello erlend_sh,
Thanks for your answer. Don't be surprised: I changed the post title to be less disturbing... :P
Today I checked the details about GSoC so I can apply very soon. I also have other suggestions, about AppSettings and a test framework tailored for Android; I'll bring those up then.

In the meantime, I would have liked to go further with the Android tests, but the 1 mesh / 1 texture / no GUI limitations are a very big restriction.
Have you already planned to fix these?

erlend_sh said:
I'm not completely sure if this applies, but there was a similar project referred to on this forum, called Multiplicity. Do these two developments relate somehow?


You're right, our work relates to Multiplicity. We are working on a kind of 'Google App Inventor' multitouch framework to create 2D/3D applications for Android and desktop platforms.
nacklab said:
I would have liked to go further with the Android tests, but the 1 mesh / 1 texture / no GUI limitations are a very big restriction.
Have you already planned to fix these?
You'll have to get in touch with @antonyudin about that, as he's the main Android developer.

nacklab said:
You're right, our work relates to Multiplicity. We are working on a kind of 'Google App Inventor' multitouch framework to create 2D/3D applications for Android and desktop platforms.
Wow, that's a mouthful. Please help me get this straight. Your work relates to Multiplicity how exactly? Your project uses it? Parts of it? Are you in touch with the mp developers?

(On a sidenote, erm, did you just create a new account to change your nickname? ^^ There's no problem with the nickname of your other account, so by all means keep on using it if you thought that was an issue)

Hello.



The new input system sounds great.

Is there a way to still keep mouse emulation for the case where somebody wants it?



Where can we find the code?



Thanks!

@erlend_sh:

Hello !

Sorry for this late answer, I was away last week.

How can I send you the AndroidInput class I’ve worked on? It’s about 20 files; only 3 relate to the input itself, but the rest helps with understanding, I think. It’s the complete test I showed on video last week. Pastebin is not suitable for that.


erlend_sh said:
You'll have to get in touch with @antonyudin about that, as he's the main Android developer.

OK. Done. We are in contact.

erlend_sh said:
Wow, that's a mouthful. Please help me get this straight. Your work relates to Multiplicity how exactly? Your project uses it? Parts of it? Are you in touch with the mp developers?

We do not use Multiplicity at all. From what I can see, we share the same philosophy: multitouch screens/tables/walls as a new framework area.
On our side, we have developed a framework to create 2D/3D applications for target devices such as Android and desktop platforms.
The user builds touch-screen applications by creating objects (or reusing complex objects he has already created), connecting them to define their behaviour, and deploying his application on targets without any knowledge of code.
He only has to understand what an Object is, an Object built from other objects, an Event and an Action.
Enough to rebuild the world...
nacklab said:
(...) How can I send you the AndroidInput class I've worked on?

Either upload a zip to your Docs or simply send it to us (and we'll forward it to Anton) by mail at contact@jmonkeyengine.com.

Done. I chose the email.

It’s a ZIP file containing the source code directory (can be compiled) and a technical document.

It’s rough source code, no Javadocs, no comments.

I’ve added a PDF explaining the architecture with a technical diagram.

If you have any question, feel free to ask.

So… This is jME2, right?

JME3

Hello erlend_sh,

I have a project with JME3 + Android. I want to move an object as in the example video, but I don’t know how to start. Could you share the code with me?

@remi said:
Hello erlend_sh,
I have a project with JME3 + Android. I want to move an object as in the example video, but I don't know how to start. Could you share the code with me?

https://wiki.jmonkeyengine.org/legacy/doku.php/jme3#tutorials_for_beginners

Thank you for your response. I have code like this:



public void initTouch() {
    inputManager.addMapping("Touch", new TouchTrigger(TouchInput.ALL));
    inputManager.addListener(new TouchListener() {
        @Override
        public void onTouch(String name, TouchEvent event, float fpt) {
            if (event.getType() == Type.MOVE) {
                Vector3f v = refrid.getLocalTranslation();
                refrid.setLocalTranslation((v.x + event.getDeltaX()) * speed,
                        (v.y + event.getDeltaY()) * speed, v.z);
                Log.i("INFO", Float.toString(v.x));
                Log.i("INFO UPDATE", Float.toString(v.x + event.getDeltaX()));
            }
        }
    }, "Touch");
}



but when I touch the object (refrid) and move it, it goes off screen (it is no longer visible). Can you give me an idea?

(I’m sorry, my English is not good)

Please, can someone help me? Thank you so much.

Here is all my code. I want to move the object refrid; could someone help me?



public class SimpleTexturedTest extends SimpleApplication {

    protected Spatial refrid;
    Boolean isRunning = true;
    AndroidInput input;

    @Override
    public void simpleInitApp() {
        flyCam.setEnabled(false);
        viewPort.setBackgroundColor(new ColorRGBA(0.76f, 0.8f, 0.84f, 1));

        refrid = assetManager.loadModel("refrigerator_v1.obj");
        Material mat = new Material(assetManager,
                "Common/MatDefs/Misc/ShowNormals.j3md");
        refrid.setMaterial(mat);
        rootNode.attachChild(refrid);

        // Create a wall with a simple texture from test_data
        Box box = new Box(Vector3f.ZERO, 2.5f, 2.5f, 1.0f);
        Spatial wall = new Geometry("Box", box);
        Material mat_brick = new Material(assetManager,
                "./Common/MatDefs/Misc/Unshaded.j3md");
        mat_brick.setTexture("ColorMap", assetManager
                .loadTexture("./Textures/Terrain/BrickWall/BrickWall.jpg"));
        wall.setMaterial(mat_brick);
        wall.setLocalTranslation(2.0f, -2.5f, 0.0f);
        rootNode.attachChild(wall);

        // Display a line of text with a default font
        guiNode.detachAllChildren();
        guiFont = assetManager.loadFont("Interface/Fonts/Default.fnt");
        BitmapText helloText = new BitmapText(guiFont, false);
        helloText.setSize(guiFont.getCharSet().getRenderedSize());
        helloText.setText("Hello World");
        helloText.setLocalTranslation(300, helloText.getLineHeight(), 0);
        guiNode.attachChild(helloText);

        initTouch();
    }

    public void initTouch() {
        inputManager.addMapping("Touch", new TouchTrigger(TouchInput.ALL));
        inputManager.addListener(new TouchListener() {
            Vector3f v;

            @Override
            public void onTouch(String name, TouchEvent event, float fpt) {
                v = refrid.getLocalTranslation();
                if (event.getType() == Type.MOVE) {
                    refrid.setLocalTranslation(v.x + event.getX() * speed,
                            v.y + event.getY() * speed, v.z);
                }
            }
        }, "Touch");
    }
}