OpenVR Available, Convert?

Making really exciting & quick progress here! The main scene isn’t rendered anymore when the compositor is enabled. However, I added a “mirroring” option that copies one of the eye framebuffers back to the main display. This is a really quick way to enable mirroring, and it should even display the game without distortion for spectators :smiley:
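
For anyone curious, the mirroring boils down to something like the sketch below (not the exact code in the library): the eye framebuffer handle is hypothetical, a null destination targets the on-screen framebuffer, and the exact `copyFrameBuffer` overload varies a bit between jME3 versions.

```java
import com.jme3.renderer.RenderManager;
import com.jme3.texture.FrameBuffer;

public class MirrorHelper {

    /**
     * Blit a finished eye image onto the main window for spectators.
     * "eyeBuffer" is a hypothetical handle to the framebuffer one eye renders
     * into; passing null as the destination targets the default (on-screen)
     * framebuffer, so the mirrored view skips the distortion pass entirely.
     */
    public static void mirrorToWindow(RenderManager renderManager, FrameBuffer eyeBuffer) {
        renderManager.getRenderer().copyFrameBuffer(eyeBuffer, null, false);
    }
}
```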

One big piece still seems to be missing: camera separation for the 3D effect. I don’t think I noticed it even when plugging in my DK2…

Another status update: hooked up my DK2 to the latest build.

Extended mode worked pretty well. The distortion mesh actually looked decent this time around, for some reason. There was a bit of latency involved, but that could be due to the Optimus setup on my laptop. I would like to try it out on my primary desktop.

I made the boxes bigger in the test demo, but it was still hard to test if eye separation / stereo was working. I did notice the eye matrices were different for each eye, which means it should be working. A better test scene should help here.

Direct mode actually kinda worked! It had lots of judder, and the left eye’s view was far too big & drawing into the right eye. Not sure how to go about fixing that, or if it was just a problem with the compositor. Perhaps I’ll post about it on the SteamVR forums.

EDIT: Asking about judder & eye overlap:

I just confirmed the eye overlap & judder exists in the sample OpenVR application too, so it is either a problem with SteamVR or my Optimus setup.

EDIT: Might be a problem when trying to use 0.6.0.1 SDK. Next attempt will be moving back to the 0.4.4 SDK:

I suspect the juddering is due to Compositor->WaitGetPoses. I think it is waiting too long & causing a low frame rate. It is possible to get the poses without this compositor function, as is done with the distortion mesh path. This problem might not exist if you use 0.4.4 SDK, though (instead of the latest 0.6.0.1 SDK).

Status update: positional tracking is working great & IPD separation is now working. GUI positioning should now be working too; it defaults to AUTO.

I’m trying to get Direct mode working, or at least an easier Extended mode, by finding the VR device & sending renders directly to it. I’ve checked in an attempt at this: creating a fullscreen JFrame on the VR device & rendering into it via jMonkeyEngine’s Canvas context. I’m hoping the VR device will show up even in Direct mode, but it may not… we’ll see.
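
For reference, the canvas side of that looks roughly like the sketch below. It uses the standard jME3 canvas API; the application instance and resolution are placeholders, and the real code would then move the frame onto the headset’s display.

```java
import java.awt.Canvas;
import java.awt.Dimension;
import javax.swing.JFrame;

import com.jme3.app.Application;
import com.jme3.system.AppSettings;
import com.jme3.system.JmeCanvasContext;

public class CanvasLauncher {

    /** Start the app in canvas mode and embed it in an undecorated JFrame. */
    public static void launchInFrame(Application app) {
        AppSettings settings = new AppSettings(true);
        settings.setResolution(1920, 1080); // DK2 panel resolution, as a placeholder
        app.setSettings(settings);
        app.createCanvas();

        JmeCanvasContext context = (JmeCanvasContext) app.getContext();
        Canvas canvas = context.getCanvas();
        canvas.setPreferredSize(new Dimension(settings.getWidth(), settings.getHeight()));

        JFrame frame = new JFrame("VR output");
        frame.setUndecorated(true); // no borders or title bar on the headset's display
        frame.add(canvas);
        frame.pack();
        frame.setVisible(true);     // would then be moved/fullscreened onto the VR device

        app.startCanvas();
    }
}
```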

I think this is a good example of the power of making this library a VRApplication instead of just an AppState. If this context approach pans out, I can start the jME3 application in “canvas” mode instead of Display mode, based on what VR hardware is present, what OS you are running & whether the VR Compositor is working.

Porting to LWJGL 3.0 might make this easier, as the library it is wrapping (GLFW) has multi-monitor support. Actually this is possible even with LWJGL 2.x by creating an undecorated window and then setting its position to be on the extended display.
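
Roughly, the LWJGL 2.x route would look like the sketch below; the 1920px offset is just a placeholder for wherever the headset’s monitor happens to sit on the desktop.

```java
import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;

public class ExtendedModeWindow {

    public static void main(String[] args) throws LWJGLException {
        // Strip the window decorations; this property must be set before the Display is created.
        System.setProperty("org.lwjgl.opengl.Window.undecorated", "true");

        Display.setDisplayMode(new DisplayMode(1920, 1080)); // DK2 panel size
        Display.create();

        // Move the window onto the extended display; the real offset would come
        // from querying where the headset's monitor sits (e.g. AWT GraphicsDevice bounds).
        Display.setLocation(1920, 0);

        while (!Display.isCloseRequested()) {
            Display.update(); // render loop goes here
        }
        Display.destroy();
    }
}
```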

Yes, I did look into those two options. When jME3 goes LWJGL 3.0, it might be easier to work with. Ultimately, I want to get Direct mode working, and if GetScreenDevices() doesn’t return a Rift in Direct mode, I don’t think LWJGL 3.0 will be able to interface with it… might make handling extended mode easier at least.

Your second option is almost exactly what I am trying. Instead I am setting the undecorated window “fullscreen” with GraphicsDevice.setFullScreenWindow(…), GraphicsDevice being the detected VR headset. I also make sure to set the GraphicsDevice to the maximum refresh rate & resolution automatically (so no settings dialog needs to appear).
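
In sketch form, using plain AWT calls (the device-matching logic here is simplified to a resolution check, whereas the real code matches against what OpenVR reports for the HMD):

```java
import java.awt.DisplayMode;
import java.awt.GraphicsDevice;
import java.awt.GraphicsEnvironment;
import javax.swing.JFrame;

public class HmdWindowUtil {

    /** Pick the screen device whose current resolution matches the headset panel. */
    public static GraphicsDevice findHmdDevice(int width, int height) {
        for (GraphicsDevice dev : GraphicsEnvironment
                .getLocalGraphicsEnvironment().getScreenDevices()) {
            DisplayMode mode = dev.getDisplayMode();
            if (mode.getWidth() == width && mode.getHeight() == height) {
                return dev;
            }
        }
        return null;
    }

    /** Fullscreen the (not yet displayable) frame on the headset at its best display mode. */
    public static void moveToHmd(JFrame frame, GraphicsDevice hmd) {
        frame.setUndecorated(true);      // must happen before the frame becomes displayable
        hmd.setFullScreenWindow(frame);  // full-screen exclusive mode on the headset's device

        if (hmd.isDisplayChangeSupported()) {
            DisplayMode best = hmd.getDisplayMode();
            for (DisplayMode mode : hmd.getDisplayModes()) {
                // prefer the highest refresh rate at the native resolution
                if (mode.getWidth() == best.getWidth()
                        && mode.getHeight() == best.getHeight()
                        && mode.getRefreshRate() > best.getRefreshRate()) {
                    best = mode;
                }
            }
            hmd.setDisplayMode(best);
        }
    }
}
```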

My only remaining worry is VSync, which I hope will still work…

Direct mode isn’t working, but god damn I am still VERY HAPPY!

This Extended mode JFrame thing is working amazingly well. It really takes much of the pain out of “Extended” mode. Tracking is as smooth as butter. In its simplicity, it is almost as good as Direct mode.

Demo & explanation of details:

We have someone confirming this works on Linux smoothly! Woooohooo! Great job, all around, guys. Now we need to make it better :smile:

EDIT: Also updated to the latest OpenVR libraries.

Great work. I have some catching up to do as I’ve been working on some other stuff.
Not being able to test things properly is a downer and my DK2 is out of town…

Lots of developments. There were camera positioning & memory usage problems; all of that should now be fixed. It might be better to watch the commits on GitHub here to see the progress being made:

Next big project will be handling scene processors & filters. The trick will be getting those filters into the separate eyes, and not the distortion scene (but only in VR mode)!
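
The rough idea is to attach a FilterPostProcessor to each eye’s ViewPort instead of the main (distortion) viewport, with separate filter instances per eye since jME3 filters can’t be shared between processors. A sketch follows; the viewport handles are hypothetical accessors and the real library may expose them differently.

```java
import com.jme3.asset.AssetManager;
import com.jme3.post.FilterPostProcessor;
import com.jme3.post.filters.CartoonEdgeFilter;
import com.jme3.renderer.ViewPort;

public class EyeFilterUtil {

    /** Attach a separate cartoon-edge filter to each eye's viewport. */
    public static void addPerEyeFilters(AssetManager assetManager,
                                        ViewPort leftEyeViewPort,
                                        ViewPort rightEyeViewPort) {
        FilterPostProcessor leftFpp = new FilterPostProcessor(assetManager);
        leftFpp.addFilter(new CartoonEdgeFilter());
        leftEyeViewPort.addProcessor(leftFpp);

        FilterPostProcessor rightFpp = new FilterPostProcessor(assetManager);
        rightFpp.addFilter(new CartoonEdgeFilter());
        rightEyeViewPort.addProcessor(rightFpp);
    }
}
```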

Major update: finally fixed positional & rotational tracking FOR GOOD! I finally found the root cause: the pose information the headset provides through OpenVR is relative to the headset’s orientation, not absolute. Looking up & moving up gave different positional information than looking forward & moving up, for example. All accounted for now.
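
In jME3 math terms, the kind of correction described amounts to rotating the reported offset out of the headset’s local frame before using it. A simplified sketch follows; whether the rotation or its inverse applies depends on which way the convention runs, and the library also has to account for OpenVR’s matrix layout.

```java
import com.jme3.math.Quaternion;
import com.jme3.math.Vector3f;

public class PoseUtil {

    /**
     * Convert a positional offset reported relative to the headset's
     * orientation into tracking (world) space, so that "move up" means the
     * same thing regardless of where the user is looking.
     */
    public static Vector3f toTrackingSpace(Vector3f headRelativeOffset, Quaternion hmdOrientation) {
        return hmdOrientation.mult(headRelativeOffset);
    }
}
```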

Scene processing & filter system is in place. A new demo includes a filter test when hitting F:

Ambient occlusion is a little wonky; it seemed to only display in one eye. Minor problem that I’ll get fixed as soon as possible…

I’m going to take a little time to develop 5089, so I wanted to leave a checklist of things we should accomplish for this library. @rickard (or anyone, really) – feel free to grab one of these and make a pull request:

  1. Mouse cursor in VR: Right now, if the mouse cursor is displayed, it draws over the distortion scene. The cursor needs to be replaced automatically by a spatial in the GUI scene. Perhaps VRApplication can set the cursor icon to “null” (so it doesn’t draw the default white one) & monitor whether the cursor is visible? If it is visible, add the VRGuiNode-attached spatial & set it to the cursor position? (A rough sketch of this follows after the list.)

  2. VR Input: OpenVR has functions for accessing different input methods. We need to make those available, easily, to our jMonkeyEngine developers.

  3. Filter tweaks: in the current demo, I noticed the ambient shader looks different in one eye (at least when hooked up to the Rift). The cartoon lines worked fine, though, and they are part of the same shader… perhaps test other filters, like FXAA etc.

  4. More GUI positioning options: right now, it is either stuck to your face or floating freely in space. There should be more auto-positioning options, like ones that let you “look down” at the GUI, but look up away from it (like looking down at a map). In that case, the GUI would be clamped to looking left & right, but not up & down. Another option is to clamp the GUI elements to the observer spatial, and not the position of the headset. This would allow a player to look around the GUI elements, but they’d always be nearby.
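
For item 1 above, here is the rough sketch I mentioned. The texture path is hypothetical, and hiding the native cursor under LWJGL may also grab the mouse, so a fully transparent cursor image could be used instead.

```java
import com.jme3.asset.AssetManager;
import com.jme3.input.InputManager;
import com.jme3.math.Vector2f;
import com.jme3.scene.Node;
import com.jme3.ui.Picture;

public class VRCursor {

    private static final int CURSOR_SIZE = 32;

    private final Picture cursor;

    public VRCursor(AssetManager assetManager, Node guiNode) {
        cursor = new Picture("vr cursor");
        cursor.setImage(assetManager, "Interface/cursor.png", true); // hypothetical texture path
        cursor.setWidth(CURSOR_SIZE);
        cursor.setHeight(CURSOR_SIZE);
        guiNode.attachChild(cursor);
    }

    /** Call once per frame: keep the quad under the (hidden) mouse position. */
    public void update(InputManager inputManager) {
        inputManager.setCursorVisible(false); // don't draw the default white cursor
        Vector2f pos = inputManager.getCursorPosition();
        cursor.setPosition(pos.x, pos.y - CURSOR_SIZE); // Picture origin is bottom-left
    }
}
```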

So much for taking a break on this project… trying to get 4089 into VR with this required lots of improvements :smile:

Major update: VRApplication no longer extends SimpleApplication. It is a more lightweight structure extending Application. No more AppState, either: I needed more control over when the VR functions run, and I can’t guarantee that with just an AppState. Now everything always runs in the right order, regardless of where AppStates get added.

I’ve also added some functionality into the core jME3 that will tell us how long we are waiting for VSync. This allows us to better time when to grab the pose from the headset. You’ll have to use my jME3 branch for this.

All in all, the TestOpenVR.java example actually got a little easier! You don’t need to call super.simpleInitApp() anymore; all of that is handled in VRApplication.initialize(). Yay for simplicity!
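
So a test app now looks roughly like this (an illustrative subclass, not the actual TestOpenVR.java; the hook names follow the description above):

```java
public class SimpleVRTest extends VRApplication {

    public static void main(String[] args) {
        SimpleVRTest app = new SimpleVRTest();
        app.start();
    }

    @Override
    public void simpleInitApp() {
        // Scene setup only: VR device detection, eye cameras and the
        // compositor/distortion setup are handled by VRApplication.initialize(),
        // so there is no super.simpleInitApp() call anymore.
    }
}
```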

OK, @rickard, got something that I could use your help on again. This is pretty ambitious, but I think we are very close!

Direct mode.

I’ve got a wrapper library to create a DirectX device & I’m using OpenVR to get the virtual reality display’s adapter index & output index. We have jherico’s code here that got Direct mode working, which I have been working off of:

This is the “core” of the new functionality:

https://github.com/phr00t/jmonkeyengine-virtual-reality/blob/master/src/jmevr/util/DirectVR.java#L60

Where I am stuck:

final IDXGISwapChain swapChain = DXGIFactory.CreateSwapChain(DXGIDevice, scDesc);

… causes an exception saying “the Present call was invisible to the user”. This is a DXGI_STATUS_OCCLUDED error, which means the window wasn’t visible. That kinda makes sense, because I only see the window on the taskbar; even if I “show” it maximized, it acts like it is on a different screen. Now, I DO NOT HAVE MY RIFT with me at the moment, so perhaps it will magically work when the headset is plugged in & the window will be considered “visible” there? I doubt it, but I can’t test it right now.

I bet @jherico could really help here, too.

Hi.

I’ve been AFK for a week and just got back home. I doubt I’ll have time to look at this this week, but I’ll try to find some.

Thanks. There have been lots of changes & updates along the way, but it is getting near production-ready. I’ve actually started using this library in my commercial game, 4089, on Steam. I’m trying to tweak the latency system. I think Direct Mode is the final, big missing feature.

Oculus will be dropping support for “Extended mode” in v0.7 of their SDK, which makes getting Direct mode working critical. Hopefully the VR Compositor will be cleaned up so we can rely on that. There is also a “Direct Driver” feature developed alongside AMD & NVIDIA, which I’m hoping will be accessible without the Oculus SDK & will actually make Direct mode easier to implement. We’ll see.

jMonkeyEngine’s OpenVR support is getting production-ready. I’m using it now in 4089 here:

I have a special latency system in place for when the VR Compositor isn’t being used, and it is working very well. Mouse integration is in, too. However, I really look forward to VR Compositor updates so we can start using that (and say goodbye to Extended mode).

I’ve got a Vive developer trying out some test builds of this library. Apparently there is an eye separation problem, but I’m having trouble figuring out why: the IPD, projection & camera location values all look fine, and it works fine on the DK2. I also had someone report problems on a DK1 via Facebook, though.

Anyway, this project is very much alive and rather stable for the DK2 at the moment.
