Tamarin - Virtual reality utility library

I have been creating a library for VR applications (Tamarin) and I wanted to (a) let people know about it and (b) get some early feedback on it. I intend to continue putting “enabling code” into JMonkey itself and keep the more utility-style stuff within Tamarin.

Library Features:

Backporting action-based OpenVR into JMonkey 3.5

Recently I put partial support for action-based OpenVR (aka semi-modern VR) into JMonkey, but that will not be available until 3.5.1 or 3.6. Tamarin also contains that functionality (as well as a lot of other OpenVR functionality that I intend to put into the JMonkey engine as soon as I get around to it).

Action-based VR allows for far better cross-system compatibility. Actions are essentially abstract versions of button presses that are bound to specific controller buttons (and can be redefined by the user if necessary).
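For readers unfamiliar with the model: in OpenVR, actions are declared in a JSON action manifest and the runtime maps them to physical inputs per controller. A minimal sketch of such a manifest (the action names here are made-up examples, not ones Tamarin ships):

```json
{
  "action_sets": [
    { "name": "/actions/main", "usage": "leftright" }
  ],
  "actions": [
    { "name": "/actions/main/in/grab",     "type": "boolean" },
    { "name": "/actions/main/in/trigger",  "type": "vector1" },
    { "name": "/actions/main/in/handpose", "type": "pose" }
  ],
  "default_bindings": [
    { "controller_type": "knuckles", "binding_url": "bindings_knuckles.json" }
  ]
}
```

The game then asks “is grab active?” rather than “is the trigger pressed?”, and the runtime (or the user, via SteamVR’s binding UI) decides which physical input that maps to.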

Support for binding a hand model to the VR controllers

The raw calls to OpenVR for getting back pose positions, bone positions etc. are all in JMonkey itself (or the backported code), but the VRHandsAppState provides a more persistent way to bind a hand model (with an appropriate armature) such that it tracks the user’s hand positions (and finger positions) on an ongoing basis. The bound hands also contain a number of other useful features:

Grab support


The bound hands can have a grab action bound to them; when that grab action is triggered, geometry picking is used to detect geometries near the hands (see the red and pink lines above for the areas scanned by default). Any spatial that is detected is scanned for an AbstractGrabControl (parents of the geometry are also scanned). If one is found then it is informed of the grab event. AutoMovingGrabControl is a concrete implementation of a grab control that allows a geometry to be moved around when grabbed and then remain where it is when released.
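The lookup described above (scan the picked spatial, then its parents, for a grab control) can be sketched in plain Java. This is a toy stand-in, not Tamarin’s actual classes; the names `SceneNode`, `GrabControl`, `onGrab` etc. are hypothetical:

```java
// Hypothetical sketch (these class and method names are made up, not
// Tamarin's real API) of the lookup described above: when the hand grabs,
// the picked node and then its parents are searched for a grab control,
// and the first one found is notified.
class GrabControl {
    boolean grabbed = false;

    void onGrab() { grabbed = true; }     // hypothetical grab callback
    void onRelease() { grabbed = false; } // hypothetical release callback
}

class SceneNode {
    final SceneNode parent;
    final GrabControl control; // null if no grab control is attached

    SceneNode(SceneNode parent, GrabControl control) {
        this.parent = parent;
        this.control = control;
    }
}

class GrabHelper {
    // Walk from the picked node up through its parents; return the first
    // grab control found, or null if no ancestor carries one.
    static GrabControl findGrabControl(SceneNode picked) {
        for (SceneNode n = picked; n != null; n = n.parent) {
            if (n.control != null) {
                return n.control;
            }
        }
        return null;
    }
}
```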

Picking support

Raw picking is supported via a ray coming from the palm or a ray coming from just in front of the thumb, giving CollisionResults as the return (just in front of the thumb is where a gun would fire from).
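To illustrate what such a pick computes (this is not Tamarin’s API, which returns jME CollisionResults): a ray cast from the palm is intersected with scene geometry and the nearest hit distance is reported. A self-contained ray–sphere version of that idea:

```java
// Self-contained illustration of ray picking: intersect a ray (with a
// normalized direction) against a sphere and report the distance to the
// nearest hit, or -1 for a miss. Real code would test against arbitrary
// mesh geometry, but the idea is the same.
class RayPickSketch {
    static double raySphere(double[] origin, double[] dir,
                            double[] center, double radius) {
        // Vector from sphere centre to ray origin.
        double[] oc = { origin[0] - center[0],
                        origin[1] - center[1],
                        origin[2] - center[2] };
        double b = oc[0]*dir[0] + oc[1]*dir[1] + oc[2]*dir[2];
        double c = oc[0]*oc[0] + oc[1]*oc[1] + oc[2]*oc[2] - radius*radius;
        double disc = b*b - c;
        if (disc < 0) return -1;          // ray misses the sphere entirely
        double t = -b - Math.sqrt(disc);  // nearest of the two intersections
        return t >= 0 ? t : -1;           // ignore hits behind the ray origin
    }
}
```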

Nodes to hang things off

The bound hand has two key nodes supporting two coordinate systems. The first is the palm coordinate system (good to attach held items to if not using the AbstractGrabControl); the zero point of this coordinate system is the middle of the metacarpal bone of the middle finger.

The other is the xPointing coordinate system, which has its zero at the point OpenVR puts its zero, just in front of the thumb. (The default rotation of OpenVR seemed a bit mad, with z pointing up at a 45-degree angle; that system is also available, but seems like a huge pain.)

(Note that because the hands distort, the x, y & z axes of these systems will not perfectly align, although they are similar.)

Lemur support

Lemur is optional; a project that doesn’t have Lemur will work fine with Tamarin, but if Lemur is on the class path then additional functionality is available for interaction with 3D UIs.

Lemur support allows for a tracking cursor to be projected onto whatever is currently being pointed at (effectively a mouse pointer in 3D). If a “click” is called on the bound hands, and either a MouseListener is attached to the Lemur UI object or it is a button, then that click is passed to Lemur (it’s a limited click, with no X,Y coordinates). Also shown is a picking line attached to the xPointing coordinate system.

Test bed

An example application using all the above features is at GitHub - oneMillionWorlds/TamarinTestBed: An example project that uses Tamarin to produce a simple VR game

Basic hand models

Also within Tamarin are some hand models (and reference textures for them). They are “fine, not great” but good enough to get started with. Equally, I have included their Blender files within the git project if anyone wants to start with them but make something better. Getting the bones right was a huge pain, so I hope this will save people time.

License

I went back and forth between an MIT license and a BSD-3 license; it’s currently BSD-3 to copy what JMonkey has, but I have no strong feelings about it other than wanting minimal requirements on end users.

Controversial decisions

Compile only dependencies

Both JMonkey and Lemur are compileOnly dependencies of Tamarin. I did this because I consider Tamarin to be extending both of those, not “using” them, and I don’t want to pin people to a particular version (or require them to do excludes in their gradle files).
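In practice this means the consuming project supplies its own engine (and optionally Lemur) versions rather than inheriting pinned ones. A sketch of what that might look like (the jme3-core and Lemur coordinates are their real Maven coordinates; the versions shown are just examples):

```groovy
// In the consuming project's build.gradle: supply your own engine version,
// since compileOnly dependencies are not carried over transitively.
dependencies {
    implementation "org.jmonkeyengine:jme3-core:3.5.0-stable"
    implementation "com.simsilica:lemur:1.16.0"  // optional, for 3D UIs
}
```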

VRHands app state not controls

I considered the player’s hands to be global things, so the hands are controlled by an AppState, not by controls on the hand geometries. I liked this because it means they are easily available anywhere, rather than references to them having to be passed around, but I can see the argument the other way.
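The convenience being claimed is jME’s class-keyed state lookup: any code can fetch an attached AppState by its class from the state manager instead of holding a reference. A toy version of that lookup pattern (this is not jME’s real AppStateManager, just an illustration of the idea):

```java
// Toy illustration (not jME's real AppStateManager) of the class-keyed
// state lookup that makes an AppState convenient: any code can fetch the
// hands state by class instead of having a reference passed to it.
import java.util.HashMap;
import java.util.Map;

class ToyStateManager {
    private final Map<Class<?>, Object> states = new HashMap<>();

    void attach(Object state) { states.put(state.getClass(), state); }

    <T> T getState(Class<T> type) { return type.cast(states.get(type)); }
}

class VrHandsState { } // stand-in for something like Tamarin's VRHandsAppState
```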

Current availability

Currently this is only available to build from source from GitHub - oneMillionWorlds/Tamarin: A VR utilities library for JMonkeyEngine

What’s next

Maven central

I’m in the process of getting ownership of the groupId com.onemillionworlds on maven central. Once I do I’ll publish the library there

[Edit: now released to Maven Central]

JMonkey store & wiki

Similarly once I’ve got this on maven central I’ll add this to the JMonkey store and add documentation in the user contributions section to the wiki.

Add to JMonkey core

Doing the real test application I realised I needed a lot more of the OpenVR stuff than I thought. I’ll put that core stuff into jme-vr so it’ll be available for 3.6, and then I can remove the backporting stuff from Tamarin.

OpenXR

Obviously I’m aware that OpenXR will replace OpenVR at some point; when that happens I’ll need to do some further work to update all of this, probably creating a version 2. I can imagine a similar backporting exercise into Tamarin happening then, but if I don’t write that myself I’ll obviously ask first.


This looks extremely useful. Not that I’m currently working on any jME projects, but I can see the library getting jME at least somewhat up to speed with other engines when it comes to VR specific functionality.

Another thing to add to the library that might be worth considering is a grabbable system, so it’s easy to add objects that can be grabbed and moved by the player, similar to what Unity’s XR Interaction Toolkit offers. It would be cool if such functionality was readily available, as I find myself always reinventing that wheel, no matter what engine I use (yes, also in Unity where such functionality is supposedly readily available :sweat_smile:).

Grab support is already in :slight_smile: I thought exactly the same, that it would be useful. If a spatial (and I mean spatial, not just geometries) has an AbstractGrabControl as a control and it is picked when the hand grabs, then the control is informed of the grab (and later informed of the release).

There is a concrete AutoMovingGrabControl that actually does move things (used in the video for the moving cubes). I suspect a real game would want to couple to a physics engine, so it may use AbstractGrabControl directly, but AutoMovingGrabControl gets you a lot of the way there really easily for a quick demo.


Wonder how easy it would be to tie that to Lemur’s drag-and-drop containers. That would be pretty slick.


Love that idea. I’ll investigate

Version 1.0.0 is now available from mavenCentral with dependency:

dependencies {
    implementation 'com.onemillionworlds:tamarin:1.0.0'
}

It also now has documentation at https://github.com/oneMillionWorlds/Tamarin/wiki/Getting-started

And I’ve submitted it to the JMonkey store (not yet approved, unsurprisingly, as I submitted it mere seconds ago).


Re drag and drop: looking at how DragAndDropControl works, it seems it’s in 2D viewport coordinates for quite a lot of the time, and projects a pick line from the capture root towards the cursor position. That feels quite difficult to make compatible with hand-based interaction, which would like to talk in 3D coordinates if possible (I could calculate where an imaginary cursor in one of the eye viewports would be to end up hitting that 3D position, but it would probably have issues with the hands being in the way).

Is calculating the 2D viewport position the best bet, or is there a 3D way into the same system? (This is why, for clicking, I currently just pick off the MouseListener directly and give it a synthetic MouseButtonEvent, to avoid going via 2D.)
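For what it’s worth, the “imaginary cursor” calculation mentioned above is just a perspective projection of a world point into viewport coordinates (jME’s Camera.getScreenCoordinates does this properly, handling the camera’s actual position and orientation; the sketch below assumes a camera at the origin looking down -Z with a symmetric frustum):

```java
// Sketch of projecting a world-space point into viewport pixel coordinates,
// assuming the camera sits at the origin looking down -Z with a symmetric
// perspective frustum. Y is measured up from the bottom of the viewport,
// and the point must be in front of the camera (z < 0).
class ProjectionSketch {
    static double[] worldToScreen(double x, double y, double z,
                                  double fovYRadians, double width, double height) {
        double aspect = width / height;
        double tanHalf = Math.tan(fovYRadians / 2.0);
        double ndcX = x / (-z * tanHalf * aspect); // -1..1 across the view
        double ndcY = y / (-z * tanHalf);          // -1..1 up the view
        return new double[] {
            (ndcX + 1.0) / 2.0 * width,
            (ndcY + 1.0) / 2.0 * height
        };
    }
}
```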

Hi Richtea. Thanks for sharing this awesome lib.

I have a broad question regarding VR-dev. What is your typical workflow ?

I have found that putting on the HMD, removing it, getting hands on the controller, waiting for SteamVR to start, env.isInitialized(), is a pain. Not really a pain, but it is slow.

In software development there’s a concept called the feedback loop. It is the time between writing code and seeing the code in action. It may be a unit test (really fast feedback loop), or starting a web server and checking in a browser (for a webapp; kind of slow), etc. The faster the feedback loop, the better. Back in the day I had the “Infinitest” plugin for Eclipse, which launched unit tests as soon as you saved a file. There’s nothing faster than that! Also hot-reload was really nice; I haven’t seen it in the gamedev ecosystem (unit tests as well).

The VR device makes this feedback loop kind of slow.

Do you work with HMD always on your head ?

Totally agree with you. The headset-on, headset-off loop is a real pain and I haven’t found anything to make that better. Hotswap can avoid rebooting the app, but not the headset on-off cycle.

I have the headset feed duplicated to a second monitor, which means I can sometimes avoid putting on the headset if I just want to look at something. That can help, but often you need the full experience.


I don’t use JME for VR, but if it makes you feel any better, it’s the same pain no matter the engine you use. Personally I got used to it, as the only real difference that you can make is to add SteamVR launch to your editor startup script (Unity in my case) and leave it running for as long as you’re working on VR stuff.
There’s no avoiding putting the headset on and off, though. I’d avoid keeping the headset always on your head, as random dust from hair can scratch the lenses and, depending on the headset in question, it can be really uncomfortable.


As you’re my target market: what would make you choose JME for VR? If JME supported OpenXR, would that be enough, or does Unity give you more than that that’s valuable to you?


What’s that ? A JME feature ?

Regarding my question (off topic now, I realize, sorry!): I tried to code with the HMD on. It’s doable. With Oculus Quest 2 Link, you can see your desktop. It makes things a little bit faster, but it hurts the eyes a little. And you have to know your keyboard, of course.

Depends. If the OpenXR support was magically working without any hitches whatsoever and with a perfectly usable API, then I might consider it, as both Unity and Godot’s OpenXR implementations are prone to random bugs, stupid API design decisions and general jankiness.

However, I’m not sure if I’d be able to deal with jME’s code-first approach to asset management after spending so much time in engines with proper scene editors that embrace the “scene graph object = code class” philosophy. No matter how you look at it, games in those engines get built much faster because of it, even though jME’s way of doing things might suit some people better. This isn’t a VR-specific thing, but I really don’t see a use for jME in any project that doesn’t need to directly use OpenGL in some capacity (as this is the one area where I feel jME does better than better-known engines). It’s simply too low-level an engine for practical usage, especially when non-programmers start getting involved in a project.
But we’re getting a little off topic :sweat_smile:


Hotswap is a feature of some IDEs (I know both IntelliJ and Eclipse have it) where, in debug mode, minor code changes can be made and “hot swapped” into the application at runtime without restarting the application.


Also a feature found in jME SDK :wink:


The colors in VR are different from the colors in “regular” mode. In VR, all colors seem brighter, and the shadows are darker.

Below: VR view:

Below: regular view (FPS, or non-VR) of the same scene, with the sun in approximately the same position:

For example, the sky color. It’s hardcoded, for now, in a fragment shader like so:

31.0f/255.0f,100.0f/255.0f,179.0f/255.0f

So for the sky, using a color picker tool (Photoshop), we can see that the VR view is the closer match to the specified color.

For the ground, it’s defined in Java like so: new Color(97, 213, 54). Again, the VR view is the closer match.

The ground material is defined like so:

// in our case ambientColor is "new Color(97, 213, 54)" ; it's passed as an argument
Material mat = new Material(assetManager, "Common/MatDefs/Light/Lighting.j3md");
mat.setBoolean("UseMaterialColors", true);
mat.setColor("Ambient", ambientColor);
mat.setColor("Specular", ColorRGBA.White);
mat.setColor("Diffuse", ambientColor);
mat.setFloat("Shininess", 128.0f);

And there are two lights :

sunLight = new DirectionalLight();
sunLight.setColor(ColorRGBA.White);
sunLight.setDirection(new Vector3f(-.5f, -.5f, -.5f).normalizeLocal());

AmbientLight ambientLight = new AmbientLight();
ambientLight.setColor(ColorRGBA.White.mult(0.1f));

Both VR and non-VR render the same scene, of course.

So … I wonder which view renders the colors correctly. Also, apart from shaders/materials/lighting, I don’t have the slightest clue what in my code could affect the colors in such a way.

That’s weird. Do you have a small example application (or applications) that you could post showing the problem? (I’d suggest a new topic, to avoid having too much competing stuff going on in this one.)