AndroidHarness vs AndroidApplication

Hey,



I’d like to know what the difference is between AndroidHarness and AndroidApplication, because they seem to be two different ways to do the same thing. Is one better than the other ?



Thx !

Seeing the question and the answer posted directly next to each other is always funny :stuck_out_tongue:



Also check this article on wikipedia.

This is the answer I’m looking for… I read the topic you’re talking about, but it still doesn’t explain the differences between using AndroidHarness, which extends Activity to embed jME apps in an Android project, and AndroidApplication, which extends Application to do the same thing.



What I mean is that both embed jME apps in an Android project, just in two different ways… but do they provide the same features ?

For instance, using AndroidApplication makes my app buggy with touch events. I’d like to figure out why, because according to what is said in the topic you mention in your screenshot, AndroidHarness is supposed to handle touch events. Is there a way to fix it ?



So in the end, on one hand we have AndroidHarness, which extends Activity, and on the other AndroidApplication, which extends Application, and both are used to embed a jME app into Android. My questions are:


  • Which one is supposed to work better (features, monitoring, achievements, …) ?


  • Until now I have been working with AndroidApplication… should I switch to AndroidHarness ?

AndroidApplication is deprecated.

Ok thx.

AndroidApplication was my first attempt and is conceptually totally different from AndroidHarness.



AndroidHarness wraps a jME application, so the application does not need to know anything about Android - it can run on all target platforms,

like desktop, applet, Android, … (What’s next?) :slight_smile:

AndroidHarness emulates a mouse and sends MouseEvents to the application. If your app is designed to use a mouse, then that’s the way to go.
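
For reference, a minimal activity using the harness could look roughly like this (just a sketch - the appClass field and the package name are assumptions on my part, so check the actual AndroidHarness source):

[java]
// hypothetical activity; AndroidHarness starts the wrapped jME application for you
public class MainActivity extends AndroidHarness {

    public MainActivity() {
        // fully qualified name of your plain jME Application subclass (assumed name)
        appClass = "com.mygame.MyGame";
    }
}
[/java]

The game class itself stays a normal jME application, which is why the same code can also run on desktop.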



AndroidApplication gives you access to the Android motion events via input.addListener() and to the Android system via the activity.

But it’s not standard and will not run on anything other than Android.



Issues start when you need multi-touch or gesture recognition in your app, as these cannot be converted cleanly to MouseEvents.

I think the cleanest solution would be to have more than one mouse input in jme to support that.



Any ideas on this issue would be very welcome, because I currently do not know how to cleanly integrate these special smartphone features.

The app I’m trying to develop targets Android exclusively… which one would you recommend then ? AndroidApplication, since according to what you’re saying, AndroidHarness seems to run a lot of superfluous things (mouse emulation, etc.) ?



I also worry that you might focus only on AndroidHarness improvements, which would force me to switch later.



What’s your advice then ?

If you intend to run only on Android and need the native input events, then I think it’s best to use neither of the two.

Copy the harness code into your activity, derive your app from the jME Application class and implement the AndroidTouchInputListener interface.

Then call input.setListener(app) somewhere.



I’m not sure whether AndroidApplication belongs in the repository, because it does nothing more than give you direct access to the input system

and implement an uncaught exception handler which displays the error message on screen instead of only in the log. It’s non-standard and mostly fits my own needs. It’s very easy to recreate this functionality in your own application.
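
For example, the exception-handler part can be rebuilt with plain Java in your own code - a minimal sketch (how you actually display the message is up to you):

[java]
// log uncaught exceptions and keep the message so it can be shown on screen
Thread.setDefaultUncaughtExceptionHandler(new Thread.UncaughtExceptionHandler() {
    public void uncaughtException(Thread thread, Throwable thrown) {
        android.util.Log.e("MyGame", "Uncaught exception", thrown);
        // additionally show thrown.getMessage() in a dialog or overlay on the UI thread
    }
});
[/java]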

We could have additional input types, like touch events. For that we may need a dedicated “TouchInput” class which the context could provide; if it is provided, then the InputManager registers for it.

That’s a great idea. If it’s OK for you to have changes to core jME3, that would be the best solution.

Currently only the TouchEvent class in com.jme3.input.android is completely decoupled from the Android SDK

(meaning it compiles without the android.jar).



So if I write a TouchInput class which does not rely on the Android SDK, it could be integrated into the core?

I like the idea too (who wouldn’t?). Some kind of MotionInput class to handle the gyroscope/accelerometer/etc. would be awesome too, because I really have no idea how I could do that… and it would solve my thread problem.

Yes, a TouchInput class, then a sub-interface of RawInputListener that provides touch events, TouchEvent, and TouchTrigger (for user code).
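
To make that concrete, a rough sketch of the pieces user code would see - all names and signatures here are only guesses at the proposed design, none of this exists in jME3 yet:

[java]
// a sub-interface of RawInputListener that also delivers touch events
public interface TouchInputListener extends RawInputListener {
    void onTouchEvent(TouchEvent evt);
}

// a trigger so user code can map touch input like keys or mouse buttons
public class TouchTrigger implements Trigger {
    private final int touchId;

    public TouchTrigger(int touchId) {
        this.touchId = touchId;
    }

    public String getName() {
        return "Touch";
    }

    public int triggerHashCode() {
        return touchId; // a real implementation would encode this more carefully
    }
}
[/java]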

Before you go crazy with Android-specific input classes, maybe there is a chance to generalize it a bit to different kinds of input devices? Java3D went slightly overboard with that with InputDevice and Sensor, but it was aiming to support all the fancy pointing devices for CAD drawing etc. That is probably out of scope for jME3, but there are some interesting gaming input devices too - accelerometers, ‘guitars’, car wheels… maybe even Kinect?



I know LWJGL supported some controllers - has anybody played with that?

@abies

I like the idea of having some kind of Bluetooth PS3 controller support for Android in jME. But as you stated, one can easily go overboard in terms of affordability and sanity.

IMHO the important point to consider here is that Android is a platform and not an input device.

@larynx:


...implement the AndroidTouchInputListener interface.
Then do a input.setListener(app) somewhere.


I'm not sure what I have to do... could you explain a little more please? For now I have my activity with the harness code in it, and I have my app which extends AndroidApplication (I know you said it should extend Application, but I think there is no difference), and now I'm kinda stuck.

You have to implement the AndroidTouchInputListener interface to receive the events:

[java]
public class YourClass implements AndroidTouchInputListener {
    // implement the interface's touch/motion/key callbacks here
}
[/java]



Then you have to tell the AndroidInput class where it should send the events:



input.setInputListener(yourClassInstance);
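
Putting the two pieces together, a rough sketch (the onTouchEvent callback is only an assumption for illustration - implement whatever methods the AndroidTouchInputListener interface actually declares):

[java]
public class MyGame extends Application implements AndroidTouchInputListener {

    // assumed callback name; the real interface may declare several
    // methods for touch, motion and key events
    public void onTouchEvent(TouchEvent evt) {
        // react to the raw touch event here
    }
}

// later, wherever you have access to the AndroidInput instance:
// input.setInputListener(myGameInstance);
[/java]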

And can I implement motion events here too or should I split it ?



Edit: OK, my bad - I just took a look at AndroidTouchInputListener… I have my answer :).

OK, now I get the implementation of the interface, but I still can’t figure out the “input.setInputListener(yourClassInstance);” part, because with normal inputs I had to add them to the InputManager with code like this:



[java] application.getInputManager().addMapping("Trigger", new KeyTrigger(KeyInput.KEY_SPACE));

application.getInputManager().addMapping("Repop", new KeyTrigger(KeyInput.KEY_U));

application.getInputManager().addListener(actionListener, new String[] {"Trigger", "Repop"});[/java]



So why can’t I do it the same way? (I’m sorry, but I’m a bit uncomfortable with listeners etc.… even more so because I don’t really know what’s behind the jME code).

That’s exactly the issue we currently have with making Android a standard jME platform. :wink:



Android events are not supported by the InputManager yet. I have to write a TouchInput class to integrate with

the InputManager.



Until that is finished, you have to handle the input listeners yourself. :evil:
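
Once that TouchInput integration exists, the idea is that touch would be mapped just like your KeyTrigger example - roughly along these lines (hypothetical names, none of this is in jME3 yet):

[java]
// hypothetical future usage, mirroring the KeyTrigger mappings above
inputManager.addMapping("Tap", new TouchTrigger(0));
inputManager.addListener(touchListener, new String[] {"Tap"});
[/java]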

Is this going to take long? (Just to get an idea - no pressure.) :stuck_out_tongue:



Actually, I’m asking because I have a work schedule to keep. If you tell me you’re going to do it quickly (i.e. faster than I could do it myself, which would take a while considering my coding skills :D), I’ll let it go. But if it’s not at the top of your TODO list, I need to know, because I’ll have to review my planning with my boss to fit this task in. I’d just like to know, that’s all.



Thx :wink: