TouchInput on desktop

Hello, I am trying to use touch input on the desktop. I create the listener and the mapping and so on, but it does not work. After some investigation, I found that context.getTouchInput() returns null. I have a PQLabs on-screen touch device. How can I make it work with JME3?
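
For reference, this is roughly what I set up; it follows the standard JME3 touch mapping API (the mapping name is just mine). On Android this kind of code fires events, but on the desktop nothing ever arrives, since the touch input behind it is null:

```java
import com.jme3.input.TouchInput;
import com.jme3.input.controls.TouchListener;
import com.jme3.input.controls.TouchTrigger;
import com.jme3.input.event.TouchEvent;

// Inside simpleInitApp(): map all touch events to one mapping and listen.
inputManager.addMapping("Touch", new TouchTrigger(TouchInput.ALL));
inputManager.addListener(new TouchListener() {
    @Override
    public void onTouch(String name, TouchEvent event, float tpf) {
        System.out.println(event.getType() + " at "
                + event.getX() + ", " + event.getY());
    }
}, "Touch");
```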

I would guess that, in the worst case, that listener is Android-only.

But if you do not use multitouch, shouldn't the mouse listener fire events for a touch surface? (Kind of like Windows tablets do, so you can use all normal applications?)

That's what I was afraid of… and no, I need real multitouch. While waiting for the answer, I have already incorporated the tuio4j library into my project and created a touch-managing framework. Soon I will be testing it. I think I will develop this approach further and, in the end, produce a full-powered desktop multitouch library for JME3.
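
In case it helps anyone, the client delivers cursor callbacks on its own receiving thread, roughly like this. This is a sketch based on the reference TUIO Java API; tuio4j's class and method names may differ slightly:

```java
import TUIO.TuioClient;
import TUIO.TuioCursor;
import TUIO.TuioListener;
import TUIO.TuioObject;
import TUIO.TuioTime;

public class TouchReceiver implements TuioListener {

    public void start() {
        TuioClient client = new TuioClient(); // default UDP port 3333
        client.addTuioListener(this);
        client.connect(); // callbacks arrive on the client's own thread
    }

    @Override
    public void addTuioCursor(TuioCursor tcur) {
        // Coordinates are normalized to 0..1; scale to the screen yourself.
        System.out.println("down " + tcur.getCursorID()
                + " at " + tcur.getX() + ", " + tcur.getY());
    }

    @Override
    public void updateTuioCursor(TuioCursor tcur) { /* move / drag */ }

    @Override
    public void removeTuioCursor(TuioCursor tcur) { /* up */ }

    // Fiducial markers and frame refresh, unused here.
    @Override
    public void addTuioObject(TuioObject tobj) {}
    @Override
    public void updateTuioObject(TuioObject tobj) {}
    @Override
    public void removeTuioObject(TuioObject tobj) {}
    @Override
    public void refresh(TuioTime frameTime) {}
}
```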



However, there are two more things I would like to ask.


  1. I am not using the input manager provided by JME for handling my input. Currently I receive TUIO messages on a separate thread and pipe them into my framework, which then affects Spatials and other objects (see the sketch after this list). Is there any way to plug this into the input manager instead? I am asking because, as far as I know, it gets its input from the system context, and I am not sure I can hook into it.


  2. What is the best option for realizing (displaying) a 2D touch interface? I know that JME is for 3D, but I need a mixture of 2D and 3D, and half of the input happens while in 2D. Do you suggest using the GUI render bucket for that? Currently I am planning to use Quads and cast rays from the touches, displaying the Quads as if they were 2D.
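
Regarding question 1, here is the sketch I mentioned: since the scene graph must only be modified on the render thread, my receiver currently hands every event over with app.enqueue() (the class name and the dispatch comment are just placeholders for my own framework):

```java
import java.util.concurrent.Callable;

import com.jme3.app.SimpleApplication;

public class TuioBridge {

    private final SimpleApplication app;

    public TuioBridge(SimpleApplication app) {
        this.app = app;
    }

    // Called from the TUIO receiving thread. JME's scene graph is not
    // thread-safe, so the work is handed over to the render thread.
    public void touchDown(final float x, final float y) {
        app.enqueue(new Callable<Void>() {
            @Override
            public Void call() {
                // Safe to modify Spatials here.
                // ... dispatch into the touch framework ...
                return null;
            }
        });
    }
}
```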

I don’t think JInput supports touch, so you’d have to do your own driver anyway.

Weehee! Got it! Touch detection works perfectly. Complex event processors like drag, scale and so on are yet to be implemented, but tapping already works. Geometric shapes now catch touches! Actually, it is not much different from casting rays from the mouse, but handling touches properly and conveniently requires additional machinery.
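
For anyone curious, picking from a touch point is essentially the same as picking from the mouse. A minimal sketch, assuming touchX and touchY are already scaled to screen pixels and that this runs on the render thread (cam and rootNode come from SimpleApplication):

```java
import com.jme3.collision.CollisionResults;
import com.jme3.math.Ray;
import com.jme3.math.Vector2f;
import com.jme3.math.Vector3f;

// Build a ray through the touch point and collide it with the scene.
Vector2f screen = new Vector2f(touchX, touchY);
Vector3f origin = cam.getWorldCoordinates(screen, 0f);
Vector3f direction = cam.getWorldCoordinates(screen, 1f)
        .subtractLocal(origin).normalizeLocal();

CollisionResults results = new CollisionResults();
rootNode.collideWith(new Ray(origin, direction), results);
if (results.size() > 0) {
    System.out.println("Tapped: "
            + results.getClosestCollision().getGeometry().getName());
}
```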



BUT. The question still open is: what is the best way to implement a 2D interface? I am thinking of using textured Quads instead of buttons, screens and so on. Or is it strongly recommended to use Nifty GUI for that?
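
To illustrate what I mean, here is a minimal sketch of one such "button" as a textured Quad attached to the guiNode, which is drawn in the GUI bucket (orthographic, coordinates in pixels); the texture path and sizes are just placeholders:

```java
import com.jme3.material.Material;
import com.jme3.scene.Geometry;
import com.jme3.scene.shape.Quad;

Geometry button = new Geometry("myButton", new Quad(128, 64));
Material mat = new Material(assetManager,
        "Common/MatDefs/Misc/Unshaded.j3md");
mat.setTexture("ColorMap",
        assetManager.loadTexture("Interface/button.png")); // placeholder path
button.setMaterial(mat);
button.setLocalTranslation(20, 20, 0); // position in pixels from bottom-left
guiNode.attachChild(button);
```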



P.S. As it happened with the ARMonkey Toolkit port for JME3, born of my great gratitude to JME and free software in general, I can contribute this one too when it is done (although nobody seemed to need the AR lib, not even its own author :D:D:D (hello, Adam!!!)), in case someone ever wants to do multitouch on the desktop with JME3. Although I have yet to implement the complex event processors, the lib already has a powerful backbone touch subsystem with registries, prefabs, abstract classes and a control system based on JME3 Controls.