TouchEvent in Eclipse RCP AwtPanel


I want to add a gesture control feature to my application so I can use it on a Windows Surface tablet.

I added the following mapping to my inputManager, but nothing happens.

inputManager.addMapping("tap", new TouchTrigger(TouchInput.ALL));
TouchListener touchEventListener = new TouchListener() {
    @Override
    public void onTouch(String name, TouchEvent event, float tpf) {
        System.out.println("touch: " + name); // never reached
    }
};
inputManager.addListener(touchEventListener, "tap");

Then I realized that the touchInput attribute in my SimpleApplication class is null.

It is null because the method getTouchInput() of the AwtPanelsContext isn’t implemented and always returns null.

How can I enable the touch behavior for my AwtPanel?

I configure my SimpleApplication as follows:

AppSettings settings = new AppSettings(true);

I got a bad feeling while investigating the classes. Maybe there is no way to get the touch events at all.
Is there a possibility to convert mouse input into touch input? The only thing I need is zooming of my model. Translation and rotation already work out of the box with fingers.

The AppSettings class offers the method setEmulateMouse. If I understand the documentation correctly, I only get the position of one finger and not of both?
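For reference, this is roughly the configuration I mean (a sketch; setEmulateMouse does exist on jME3’s AppSettings, but I’m not sure it has any effect on desktop, and `app` stands for my SimpleApplication instance):

```java
AppSettings settings = new AppSettings(true);
// Deliver touch events as mouse events (primarily intended for Android);
// a multi-finger pinch would presumably collapse to a single pointer.
settings.setEmulateMouse(true);
app.setSettings(settings);
```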

There are no suggestions yet. Perhaps I haven’t provided enough information.

Is it in general possible to get touch events in an AWT panel?
If not, can I at least get the position of my second mouse pointer, which is my finger :slight_smile: ?

I’m not really an expert on this, but Googling led me to understand that in Java only JavaFX supports proper touch events; AWT/Swing does not. It seems there are some really old implementations available that are pretty hard to track down (like Google Code Archive - Long-term storage for Google Code Project Hosting.).

I’m not sure how feasible it is to add one of these to JME, but you should be able to extend or craft your own AwtPanel. And maybe hook in JavaFX or something to provide the touch events?

First of all, thank you for your answer. I have an Eclipse RCP application that uses SWT for its UI. I integrate jMonkeyEngine into my application as follows:

import java.awt.Frame;
import org.eclipse.swt.SWT;
import org.eclipse.swt.awt.SWT_AWT;
import org.eclipse.swt.widgets.Composite;
import com.jme3.system.awt.AwtPanelsContext;
import com.jme3.system.awt.PaintMode;

Composite comp = new Composite(aParent, SWT.EMBEDDED | SWT.NO_BACKGROUND);
Frame frame = SWT_AWT.new_Frame(comp);
final AwtPanelsContext ctx = (AwtPanelsContext) getContext(); // SimpleApplication.getCanvas()
canvas = ctx.createPanel(PaintMode.Accelerated);

The Composite that hosts the AwtPanel has a method named addGestureListener, but I don’t receive the touch events there, probably because the AwtPanel sits on top of it.

If I don’t add the AwtPanel to my composite (comp), then I do get the events.

comp.addGestureListener(e -> System.out.println(e));

When I make a zoom movement with my fingers, an event arrives and prints out:

GestureEvent{Composite {} time=10144046 data=null stateMask=0 detail=8 x=337 y=162 rotation=-10.948043017931768 xDirection=0 yDirection=0 magnification=0.0}
GestureEvent{Composite {} time=10144093 data=null stateMask=0 detail=8 x=337 y=162 rotation=-10.904097034919513 xDirection=0 yDirection=0 magnification=0.0}
GestureEvent{Composite {} time=10144109 data=null stateMask=0 detail=8 x=336 y=162 rotation=-10.706340111364327 xDirection=0 yDirection=0 magnification=0.0}
GestureEvent{Composite {} time=10144109 data=null stateMask=0 detail=4 x=336 y=162 rotation=-10.706340111364327 xDirection=0 yDirection=0 magnification=0.0}
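Since the magnification arrives as part of the GestureEvent, my plan for the zoom itself would be something like the sketch below. It’s plain Java, independent of SWT and jME; `ZoomModel` and `applyMagnification` are names I made up:

```java
// Sketch: accumulate pinch magnification into a clamped camera distance.
class ZoomModel {
    private double distance;   // current camera distance
    private final double min;  // closest allowed distance
    private final double max;  // farthest allowed distance

    ZoomModel(double start, double min, double max) {
        this.distance = start;
        this.min = min;
        this.max = max;
    }

    /**
     * Applies a relative magnification factor (> 1 zooms in, < 1 zooms out)
     * and returns the new, clamped distance. A factor of 0 is ignored,
     * since the event dump above shows magnification=0.0 for non-magnify events.
     */
    double applyMagnification(double magnification) {
        if (magnification > 0) {
            distance = Math.max(min, Math.min(max, distance / magnification));
        }
        return distance;
    }
}
```

In the gesture listener I would then call something like `zoom.applyMagnification(e.magnification)` when `e.detail` indicates a magnify gesture, and feed the result into the camera.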

Either way, I can’t catch the events once the AwtPanel covers the composite.

I have no idea how to catch the touch events with the AwtPanel inside the SWT Composite.
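As a fallback I’m considering attaching plain AWT listeners directly to the panel, since the AwtPanel is the component that actually swallows the input. AWT has no gesture events, but at least wheel-based zoom could be wired straight onto it; `WheelZoom` is a name I made up, and I haven’t verified this inside the RCP setup:

```java
import java.awt.event.MouseWheelEvent;
import java.awt.event.MouseWheelListener;

// Sketch: zoom driven by plain AWT mouse-wheel events on the AwtPanel.
class WheelZoom implements MouseWheelListener {
    private double distance = 10.0; // hypothetical camera distance

    @Override
    public void mouseWheelMoved(MouseWheelEvent e) {
        // each wheel notch scales the distance by 10%
        distance *= Math.pow(1.1, e.getWheelRotation());
    }

    public double getDistance() {
        return distance;
    }
}
```

Registration would then be `canvas.addMouseWheelListener(new WheelZoom());` on the panel returned by createPanel.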