Making really exciting & quick progress here! The main scene is no longer rendered when the compositor is enabled. I’ve also added a “mirroring” option that copies one of the eye framebuffers back to the main display. This is a quick way to enable mirroring, and it should even display the game without distortion for spectators.
One big piece still seems to be missing: camera separation for the 3D effect. I don’t think I noticed it even when plugging in my DK2…
Another status update: hooked up my DK2 to the latest build.
Extended mode worked pretty well. The distortion mesh actually looked decent this time around, for some reason. There was a bit of latency, but that could be due to the Optimus setup on my laptop. I’d like to try it out on my primary desktop.
I made the boxes bigger in the test demo, but it was still hard to tell whether eye separation / stereo was working. I did notice the eye matrices were different for each eye, which suggests it should be working. A better test scene would help here.
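As a quick sanity check (my own illustration, not code from the build): if the two eye cameras are set up correctly, their world positions should be separated by roughly the IPD. A minimal sketch in plain Java, assuming you can read back each eye camera’s position:

```java
public class StereoCheck {
    // Distance between the two eye camera positions. If stereo is
    // working, this should come out close to the configured IPD.
    public static double eyeSeparation(double[] leftPos, double[] rightPos) {
        double dx = rightPos[0] - leftPos[0];
        double dy = rightPos[1] - leftPos[1];
        double dz = rightPos[2] - leftPos[2];
        return Math.sqrt(dx * dx + dy * dy + dz * dz);
    }

    public static void main(String[] args) {
        double ipd = 0.064; // meters, a typical default IPD
        // Hypothetical eye positions read back from the two cameras.
        double[] left  = { -ipd / 2, 1.7, 0 };
        double[] right = { +ipd / 2, 1.7, 0 };
        double sep = eyeSeparation(left, right);
        // If this separation is ~0, stereo is effectively off.
        System.out.println(Math.abs(sep - ipd) < 1e-9); // prints true
    }
}
```

A check like this could go in the test scene so “is stereo on?” doesn’t depend on eyeballing boxes.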
Direct mode actually kinda worked! It had lots of judder, and the left eye’s view was far too big & drew into the right eye. Not sure how to fix that, or whether it was just a problem with the compositor. Perhaps I’ll post about it on the SteamVR forums.
I suspect the judder comes from Compositor->WaitGetPoses. I think it is waiting too long & causing a low frame rate. It is possible to get the poses without this compositor function, as the distortion mesh path does. This problem might not exist with the 0.4.4 SDK, though (instead of the latest 0.6.0.1 SDK).
Status update: positional tracking is working great & IPD separation is now working. GUI positioning should now be working too, which defaults to AUTO.
I’m trying to get Direct mode working, or at least an easier Extended mode, by finding the VR device & sending renders directly to it. I checked in an attempt at this: create a JFrame & use jMonkeyEngine’s Canvas context to render directly to that fullscreen JFrame on the VR device. I’m hoping the VR device will show up while in Direct mode, but it may not… we’ll see.
I think this is a good example of the power of making this library a VRApplication instead of just an AppState. If this context approach pans out, I can start the jME3 application in “canvas” mode instead of Display mode, based on what VR hardware is present, what OS you are running & whether the VR Compositor is working.
Porting to LWJGL 3.0 might make this easier, as the library it is wrapping (GLFW) has multi-monitor support. Actually this is possible even with LWJGL 2.x by creating an undecorated window and then setting its position to be on the extended display.
Yes, I did look into those two options. When jME3 goes LWJGL 3.0, it might be easier to work with. Ultimately, I want to get Direct mode working, and if GetScreenDevices() doesn’t return a Rift in Direct mode, I don’t think LWJGL 3.0 will be able to interface with it… might make handling extended mode easier at least.
Your second option is almost exactly what I am trying. Instead I am setting the undecorated window “fullscreen” with GraphicsDevice.setFullScreenWindow(…), GraphicsDevice being the detected VR headset. I also make sure to set the GraphicsDevice to the maximum refresh rate & resolution automatically (so no settings dialog needs to appear).
My only remaining worry is VSync, which I hope will still work…
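For reference, the automatic mode selection described above (max refresh rate & resolution, so no settings dialog is needed) can be sketched like this. The method names are mine, not from the library:

```java
import java.awt.DisplayMode;

public class ModePicker {
    // Pick the DisplayMode with the largest resolution, breaking ties by
    // refresh rate, then bit depth. Mirrors the "maximum refresh rate &
    // resolution automatically" behavior described above.
    public static DisplayMode pickBest(DisplayMode[] modes) {
        DisplayMode best = null;
        for (DisplayMode m : modes) {
            if (best == null || score(m) > score(best)) {
                best = m;
            }
        }
        return best;
    }

    private static long score(DisplayMode m) {
        long pixels = (long) m.getWidth() * m.getHeight();
        // Weight resolution first, then refresh rate, then bit depth.
        return (pixels << 20) + ((long) m.getRefreshRate() << 8) + m.getBitDepth();
    }

    public static void main(String[] args) {
        DisplayMode[] modes = {
            new DisplayMode(1920, 1080, 32, 60),
            new DisplayMode(1920, 1080, 32, 75), // DK2's native refresh rate
            new DisplayMode(1280, 800, 32, 60),
        };
        DisplayMode best = pickBest(modes);
        // prints 1920x1080@75
        System.out.println(best.getWidth() + "x" + best.getHeight() + "@" + best.getRefreshRate());
    }
}
```

The chosen mode would then be applied after entering full-screen exclusive mode: call GraphicsDevice.setFullScreenWindow(frame) first, then GraphicsDevice.setDisplayMode(best) if isDisplayChangeSupported() returns true.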
Direct mode isn’t working, but god damn I am still VERY HAPPY!
This Extended mode JFrame thing is working amazingly. It takes away much of the pain of “Extended” mode. Tracking is smooth as butter. Its simplicity makes it almost as good as Direct mode.
Major update: positional & rotational tracking is fixed FOR GOOD! I finally found the root cause: pose information from the headset in OpenVR is provided relative to the headset’s orientation, not in absolute terms. Looking up & moving up gave different positional information than looking forward & moving up, for example. All accounted for now.
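A simplified sketch of that kind of coordinate fix (my own illustration, not the library’s actual code): if the tracker reports translation in the headset’s local frame, rotating it by the headset’s orientation quaternion converts it to a world-space translation, so “moving up” means the same thing regardless of where you look.

```java
public class PoseFix {
    // Rotate vector v by unit quaternion q = (w, x, y, z):
    // v' = v + 2w (u × v) + 2 (u × (u × v)), where u = (x, y, z).
    public static double[] rotate(double[] q, double[] v) {
        double w = q[0];
        double[] u = { q[1], q[2], q[3] };
        double[] t = cross(u, v);
        t[0] *= 2; t[1] *= 2; t[2] *= 2;
        double[] ut = cross(u, t);
        return new double[] {
            v[0] + w * t[0] + ut[0],
            v[1] + w * t[1] + ut[1],
            v[2] + w * t[2] + ut[2],
        };
    }

    static double[] cross(double[] a, double[] b) {
        return new double[] {
            a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0],
        };
    }

    public static void main(String[] args) {
        // Headset pitched 90° up (rotation about the +X axis).
        double s = Math.sqrt(0.5);
        double[] lookingUp = { s, s, 0, 0 };
        // The tracker reports "move up" in the head's local frame…
        double[] headLocalUp = { 0, 1, 0 };
        // …which maps onto a different world axis once converted.
        double[] world = rotate(lookingUp, headLocalUp);
        System.out.printf("%.3f %.3f %.3f%n", world[0], world[1], world[2]);
    }
}
```

Without the conversion, the same head-local offset would be applied directly in world space, which is exactly the “looking up & moving up differs from looking forward & moving up” symptom.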
Scene processing & filter system is in place. A new demo includes a filter test when hitting F:
I’m going to take a little time to develop 5089, so I wanted to leave a checklist of things we should accomplish for this library. @rickard (or anyone, really) – feel free to grab one of these and make a pull request:
Mouse cursor in VR: Right now, if the mouse cursor is displayed, it draws over the distortion scene. The cursor needs to be replaced automatically by a spatial in the GUI scene. Perhaps VRApplication can set the cursor icon to “null” (so it doesn’t draw the default white one) & monitor whether the cursor is visible? If it is visible, add the VRGuiNode-attached spatial & set it to the cursor position?
VR Input: OpenVR has functions for accessing different input methods. We need to make those available, easily, to our jMonkeyEngine developers.
Filter tweaks: in the current demo, I noticed the ambient shader looks different in one eye (at least when hooked up to the Rift). The cartoon lines worked fine, though, and they are part of the same shader… perhaps test other filters, like FXAA etc.
More GUI positioning options: right now, it is either stuck to your face or floating freely in space. There should be more auto-positioning options, like ones that let you “look down” to the GUI, but look up away from it (like looking down at a map). In that case, the GUI would be clamped to looking left & right, but not up & down. Another option is to clamp the GUI elements to the observer spatial, and not the position of the headset. This would allow a player to look around the GUI elements, but they’d always be near.
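The “look down to the GUI” option could work roughly like this (a sketch under my own assumptions; the constant and method names are hypothetical, not part of the library): the panel follows head yaw so it is always in front of you horizontally, but sits at a fixed downward pitch, like glancing at a map.

```java
public class GuiAnchor {
    // Fixed downward pitch for the panel, like looking down at a map.
    // The -45° value is an assumption; it would be tunable.
    static final double GUI_PITCH = Math.toRadians(-45);

    // Direction from the observer to the GUI panel: follows the head's
    // yaw (horizontal placement) but ignores its pitch (vertical stays
    // clamped), so you can look up and away from it.
    public static double[] guiDirection(double headYaw) {
        double cp = Math.cos(GUI_PITCH);
        return new double[] {
            Math.sin(headYaw) * cp, // x follows yaw
            Math.sin(GUI_PITCH),    // y: always 45° below the horizon
            Math.cos(headYaw) * cp, // z follows yaw
        };
    }

    public static void main(String[] args) {
        // Yaw changes the panel's horizontal placement…
        double[] facingFwd = guiDirection(0);
        double[] facingRight = guiDirection(Math.PI / 2);
        // …but the vertical offset stays fixed regardless of head pitch.
        System.out.printf("%.3f %.3f%n", facingFwd[1], facingRight[1]);
    }
}
```

The “clamp to the observer spatial” option would be the same idea with the anchor taken from the observer node’s rotation instead of the headset pose.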
So much for taking a break on this project… getting 4089 into VR with this required lots of improvements.
Major update: VRApplication no longer extends SimpleApplication. It is a more lightweight structure extending Application. No more AppState, either: I needed more control over when the VR functions run, and I couldn’t guarantee that as just an AppState. Now everything always runs in the right order, regardless of where AppStates are added.
I’ve also added some functionality into the core jME3 that will tell us how long we are waiting for VSync. This allows us to better time when to grab the pose from the headset. You’ll have to use my jME3 branch for this.
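The idea, roughly (my own illustration, not the actual branch code): if the driver spent W ms blocked waiting on VSync last frame, we can afford to delay the pose read by almost that long this frame and still finish in time, which shrinks motion-to-photon latency. A minimal sketch of that scheduling:

```java
public class PoseTiming {
    // How long we can safely delay sampling the headset pose, given how
    // long the last buffer swap blocked on VSync. A safety margin is kept
    // so we never miss the frame. All times in nanoseconds.
    public static long poseSampleDelay(long lastSyncWaitNanos, long safetyMarginNanos) {
        return Math.max(0, lastSyncWaitNanos - safetyMarginNanos);
    }

    public static void main(String[] args) {
        long syncWait = 8_000_000L; // last frame blocked ~8 ms in VSync
        long margin = 2_000_000L;   // 2 ms safety margin (an assumption)
        // We could wait ~6 ms before grabbing the pose this frame.
        System.out.println(poseSampleDelay(syncWait, margin)); // prints 6000000
    }
}
```

The jME3 change mentioned above would supply the measured VSync wait; the margin would need tuning per machine.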
All in all, TestOpenVR.java example actually got a little easier! You don’t need to call super.simpleInitApp() anymore! All of that is handled in the VRApplication.initialize(). Yay for simplicity!
OK, @rickard, got something that I could use your help on again. This is pretty ambitious, but I think we are very close!
I’ve got a wrapper library to create a DirectX device, & I’m using OpenVR to get the virtual reality display’s adapter index & output index. We have jherico’s code here that got Direct mode working, which I have been working from:
final IDXGISwapChain swapChain = DXGIFactory.CreateSwapChain(DXGIDevice, scDesc);
… causes an exception saying “the Present call was invisible to the user”. This is a DXGI_STATUS_OCCLUDED error, which means the window wasn’t visible. It kinda makes sense, because I only see the window on the taskbar. Even if I “show” it maximized, it acts like it is on a different screen. Now, I DO NOT HAVE MY RIFT with me, so perhaps it will magically work when plugged in & the window will be considered “visible” then? I doubt it, but I can’t test at the moment.
Thanks. There have been lots of changes & updates along the way, but it is getting near production-ready. I’ve actually started using this library in my commercial game, 4089, on Steam. I’m trying to tweak the latency system. I think Direct Mode is the final, big missing feature.
Oculus will be dropping support for “Extended mode” in v0.7 of their SDK, which makes getting Direct mode working critical. Hopefully the VR Compositor will be cleaned up so we can rely on it. There is also a “Direct Driver” feature from AMD & NVIDIA, which I’m hoping will be accessible without the Oculus SDK & will actually make Direct mode easier to implement. We’ll see.
jMonkeyEngine’s OpenVR support is getting production-ready. I’m using it now in 4089 here:
I have a special latency system in place for when the VR Compositor isn’t used, and it is working very well. Mouse integration is in, too. However, I really look forward to VR Compositor updates so we can start using that (and say goodbye to Extended mode).
I’ve got a Vive developer trying out some test builds of this library. Apparently there is an eye separation problem, but I’m having trouble figuring out why. The IPD, projection & camera location values are fine, and it works fine on the DK2. Someone also reported problems on a DK1 through Facebook, though.
Anyway, this project is very much alive and rather stable for the DK2 at the moment.