Lemur touch-move events not being consumed by jme3's listeners

I’m trying to make a panel (in the guiNode) move with the finger and consume the events when doing it (so the 3D scene doesn’t receive the event). However, the move events aren’t consumed for the com.jme3.input.controls.TouchListener but clicks are. Is this an intended behavior?

I always call the setConsumed() on those events.

Edit: my fault, the clicks aren’t being consumed either. So… I suppose they aren’t meant to be consumed in jme3’s internal system. Is there a way to achieve this? Maybe a lemurEvent.getEvent().setConsumed()?

Edit: wrong again… Ok, I think I get it: the DOWN/UP events are being consumed, but not the TAP and MOVE.
I’m adding MouseEventControl.addListenersToSpatial(panel, ConsumingMouseListener.INSTANCE), but those events are still passing through the filter.

You’d have to look in the source code to be sure. I’m not sure whether the original touch-related events are being consumed when you call the Lemur-equivalent consume, or if maybe JME is ignoring the consume (less likely).

I didn’t write the touch event code so I’m a little unfamiliar with it.

Ok, from the code it seems the motion event isn’t consumed in any of the extensions of BasePickState. In MouseAppState the mouse observer has the following piece of code:

@Override
    protected void dispatchMotion() {
        Vector2f cursor = getApplication().getInputManager().getCursorPosition();
        getSession().cursorMoved((int)cursor.x, (int)cursor.y, scrollWheel);
    }

    protected void dispatch( MouseButtonEvent evt ) {
        if( getSession().buttonEvent(evt.getButtonIndex(), evt.getX(), evt.getY(), evt.isPressed()) ) {
            evt.setConsumed();
        }
    }

    protected class MouseObserver extends DefaultRawInputListener {
        @Override
        public void onMouseMotionEvent( MouseMotionEvent evt ) {
            //if( isEnabled() )
            //    dispatch(evt);
            scrollWheel = evt.getWheel();
        }

        @Override
        public void onMouseButtonEvent( MouseButtonEvent evt ) {
            if( isEnabled() ) {
                dispatch(evt);
            }
        }
    }

Here we can see that the motion event isn’t consumed but the button event is.

With the TouchAppState we have something similar:

protected class TouchObserver extends DefaultRawInputListener {

        @Override
        public void onTouchEvent(TouchEvent te) {
            if (!isEnabled()) {
                return;
            }
            PointerData pointerData;
            switch (te.getType()) {
                case DOWN:
                    pointerData = getPointerData(
                            te.getPointerId(), (int)te.getX(), (int)te.getY());
                    if (dispatchButton(pointerData, true)) {
                        te.setConsumed();
                    }
                    break;
                case MOVE:
                    pointerData = pointerDataMap.get(te.getPointerId());
                    if (pointerData != null) {
                        pointerData.lastX = (int)te.getX();
                        pointerData.lastY = (int)te.getY();
                    }
                    break;
                case UP:
                    pointerData = pointerDataMap.get(te.getPointerId());
                    if (pointerData != null) {
                        pointerData.lastX = (int)te.getX();
                        pointerData.lastY = (int)te.getY();
                        if (dispatchButton(pointerData, false)) {
                            te.setConsumed();
                        }
                        pointerDataMap.remove(te.getPointerId());
                    }
                    break;
                default:
                    break;
            }
        }
    }

When DOWN and UP are dispatched, the original events are consumed, but the motion event isn’t (and it is lost in that method).
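
To make the asymmetry concrete, here is a tiny stand-in model (hypothetical types, not the real jME/Lemur classes) of the pattern in the TouchObserver above: button events are consumed on the spot, while motion is only buffered for a later dispatch, so the original event’s consumed flag is never set:

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in model only; these are NOT real jME/Lemur classes.
class ModelEvent {
    final String type;            // "DOWN", "MOVE" or "UP"
    boolean consumed = false;
    ModelEvent(String type) { this.type = type; }
    void setConsumed() { consumed = true; }
}

class ModelObserver {
    // Motion is only recorded here and dispatched on a later update
    // loop, by which time the original event can no longer be consumed.
    final List<ModelEvent> pendingMotion = new ArrayList<>();

    void onEvent(ModelEvent e) {
        switch (e.type) {
            case "DOWN":
            case "UP":
                e.setConsumed();      // button events consumed in place
                break;
            case "MOVE":
                pendingMotion.add(e); // buffered; never consumed
                break;
        }
    }
}
```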

In both cases, the motion is dispatched on the next update loop. Would it be possible to ensure that update runs before the inputManager’s (where the motion is also dispatched)? Or would a better solution be to dispatch the motion the moment the event is received?

In the case of the mouse, it’s a tricky problem because we don’t use any of JMEs mouse motion events… so there is no event to consume from JME.

I guess touch is implemented in a similar way.

We aren’t good citizens if we always consume them. And it’s unclear when/if we should selectively consume them. At least in the case of touch, I guess we got an initial event to begin with. But there are likely very good reasons that the dispatch is delayed (like I guess for touch you always want the downs to be processed before the moves or something.)

I don’t know if there was a good reason or not… unfortunately, I didn’t write that bit of code and have never used Android.

Edit: note that if you know on a sort of global level when you want to disable motion then you could always add a RawInputListener to consume those JME events… like when the Lemur GUI is open or something. Kind of limited applicability I guess.
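
The chain mechanics behind that suggestion can be sketched with stand-in types (not the real jME classes): an event consumed early in the listener chain never reaches the listeners registered after it, which is what a “global” raw listener would rely on.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Illustrative stand-in types only; not the real jME input classes.
class StubMotionEvent {
    boolean consumed = false;
    void setConsumed() { consumed = true; }
}

class StubInputChain {
    private final List<Consumer<StubMotionEvent>> listeners = new ArrayList<>();

    void addListener(Consumer<StubMotionEvent> l) { listeners.add(l); }

    void dispatch(StubMotionEvent evt) {
        for (Consumer<StubMotionEvent> l : listeners) {
            if (evt.consumed) {
                return; // consumed events stop propagating down the chain
            }
            l.accept(evt);
        }
    }
}
```

With the real types, the first listener in the chain would be a RawInputListener registered on the inputManager that calls setConsumed() on motion events while the Lemur GUI is open, hiding them from the scene’s listeners.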

Well, couldn’t the tricky problem be solved by storing the events the observer receives and consuming them when the motion is dispatched (with its delay)? If that update can run before the inputManager’s, it would work as intended; if not, the other solution (calling the dispatches on reception, just like with the button events) would do the trick. Why aren’t those solutions possible? Is there something I don’t see?

What I want right now is to have Lemur consume all the events when the mouse is over any of its panels (only while it actually touches them, and for almost all of the panels).

On the mouse side, a raw mouse listener might receive a bunch of events for every Lemur motion event; the frequency can be set and controlled. So, for example, rather than running potentially expensive picking once per frame, you could do it only 20 times a second. (If you are running at 100 Hz or more, that’s a big savings… huge.)
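
That frequency control amounts to a simple rate limiter (a generic sketch, not the actual Lemur code): raw events arrive every frame, but the expensive work runs at most a fixed number of times per second.

```java
// Generic sketch of frequency-limited dispatch; not Lemur's implementation.
class ThrottledDispatcher {
    private final long intervalNanos;
    private long lastDispatch;
    private int dispatchCount = 0;

    ThrottledDispatcher(int hz) {
        this.intervalNanos = 1_000_000_000L / hz;
        this.lastDispatch = -intervalNanos; // so the first update dispatches
    }

    // Called once per frame; returns true when the expensive dispatch
    // (e.g. picking) actually ran this frame.
    boolean update(long nowNanos) {
        if (nowNanos - lastDispatch >= intervalNanos) {
            lastDispatch = nowNanos;
            dispatchCount++;
            return true;
        }
        return false;
    }

    int getDispatchCount() { return dispatchCount; }
}
```

At a 100 Hz frame rate with a 20 Hz dispatch limit, only one frame in five runs the expensive pick.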

It doesn’t matter if we consume any of them because they were already delivered long ago. Potentially, Lemur could consume all subsequent events until its next motion dispatch… but that doesn’t seem very friendly either.

How about we take a step back and talk about your game state. What’s going on in the non-Lemur space that you want to stop doing when over Lemur panels?

In the non-Lemur space everything is “clicky”: the scene can be moved and selected everywhere. The Lemur panels can also be dragged when touched, so if a player is interacting with a Lemur panel, they should only be interacting with that panel, not the 3D scene. Currently, when dragging a Lemur list, for example, the 3D entities are also being moved.
If this can’t currently be solved easily on the Lemur side, I suppose the best bet is to make a single static listener, added to every panel that must “consume all”, so that when it receives a DOWN it disables the 3D environment events and re-enables them on UP.

But why is the 3D scene not also using Lemur picking and stuff?

That’s… a great question. Currently, I’m not sure; maybe that is just the way to go (do you know that feeling of being dumb when you have a problem and try to solve it your own way, and then pspeed comes along and shows you how easy it was and how silly you were?? xD).

After looking at the code: I think I stuck with this separate approach because of the SCALE_MOVE and LONGPRESSED events.

Might be easier to somehow add support for those than to fix the other problem.

Yes, maybe. Is @iwgeric doing it, or should I attempt to make my own version?

Another difference between the InputManager and the Lemur events is that with the inputManager I only get a move event if the touch actually moves; with Lemur, however, just holding a press fires continuous move events (it may not have a huge impact on functionality, since fingers normally aren’t that precise, but it is noticeable with an emulator :stuck_out_tongue: )

Well, that’s the other reason Lemur fires its own motion events. The scene could be moving underneath the mouse/cursor and you’d never know if you waited for a motion event.