OpenVR Available, Convert?

Wow, wasn’t aware OSVR lacked positional tracking. OpenVR is the way to go, at least for now.

I’m trying to make progress on getting the “null driver” to work:

For some reason, the “null driver” is saying it is unable to load. Awaiting a response from Valve.

In the meantime, we can probably create the shader framework from the OpenVR example:

Once the source code is out all those issues should hopefully go away.
I am sure someone would fork it and add positional tracking if it didn’t support it already.

It’s a hardware issue. The devkit seems to lack the sensors for it (IR LEDs, in the Oculus Rift’s case). They claim support from other vendors (like Leap Motion, Sixense, etc.). Whether that means out-of-the-box support in the OSVR SDK or going through external libraries remains to be seen.

I think Momoko_Fan is referring to the general SDK, not the actual OSVR hardware. I would hope the OSVR Oculus Rift plugin supports positional tracking, even if the OSVR hardware doesn’t. However, I can see OSVR taking a slow approach to supporting third-party hardware compared to OpenVR & Oculus (who both have stronger financial motivation to provide great & quick support). I hope OSVR fixes things up, but I’m not going to spend much time on that project while OpenVR is ahead.


Added some framework shaders from the OpenVR example:

https://github.com/phr00t/jmonkeyengine-virtual-reality/commit/d1df93025b8b0b99e80d7074e008ea8366d8bd41

Obviously not tested at all, but it looks right :smile: Here is how the texture coordinates for the 3 colors should be used (which isn’t implemented yet, as far as I know):

Feel free to pick this up if you can, @rickard

Still no progress on using this OpenVR library with a “null” driver. I’m not even sure it would work with my Rift hooked to my Linux laptop, because the error reports “unable to load Oculus driver” in addition to “unable to load null driver”. Still awaiting an update from Valve on this. In the meantime, I’m making progress elsewhere, as seen above with the shaders.

Read the comment I made on that last commit – looks like OpenVR is doing some math on positioning off the GPU:

vert.position = Vector2( Xoffset+u, -1+2*y*h );

… while in our converted implementation that is done in the vertex shader, as seen in OpenVR.vert:

vec2 pos = inPosition.xy * 2.0 - 1.0;

Makes sense to do it off the GPU, so it doesn’t need to be recomputed for every vertex on every frame. However, for now, we just have to make sure we don’t cause a bug by doing the math twice (i.e. both off & on the GPU). I left it inside the vert shader; we will just have to exclude it when converting the OpenVR C++ code (since I think jMonkeyEngine handles that attribute automatically somehow).
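To make the two equivalent forms concrete, here is a rough, untested Java sketch of the CPU-side remap the OpenVR sample does when building the grid; gridSize and xOffset are assumed names for illustration, not the sample’s actual variables:

    // Hypothetical sketch of the CPU-side remap in the OpenVR sample.
    public class DistortionGridSketch {
        public static void main(String[] args) {
            int gridSize = 4;            // lens-grid segments per eye (assumed)
            float xOffset = -1f;         // left eye spans NDC x in [-1, 0]
            float h = 1.0f / (gridSize - 1);
            for (int y = 0; y < gridSize; y++) {
                for (int x = 0; x < gridSize; x++) {
                    float u = x * h;
                    float posX = xOffset + u;       // matches Xoffset+u
                    float posY = -1f + 2f * y * h;  // matches -1+2*y*h
                    // These are already normalized device coordinates, so if we
                    // feed them in as-is, OpenVR.vert must NOT also apply
                    // "inPosition.xy * 2.0 - 1.0", or the remap runs twice.
                    System.out.printf("vertex (%.3f, %.3f)%n", posX, posY);
                }
            }
        }
    }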

Looks like Linux development is hitting a brick wall:

VR Compositor doesn’t exist for the platform yet. I can get past “Hmd not found (108)” when developing on my virtual machine, so something is broken with the null driver on Linux. I’m going to try pushing ahead on Windows within my virtual machine & see how far I can get. This is what happens when I run the current OpenVR conversion code in Windows:

Hit another snag. BridJ doesn’t support structs being returned by value, and OpenVR is littered with them. I posted in BridJ’s support forum asking for help:

https://groups.google.com/forum/#!forum/nativelibs4java

JNA looks like it does support this feature:

http://www.eshayne.com/jnaex/example04.html

So, this may mean switching to JNA. Bleh.
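For reference, the by-value pattern from that JNA example would look roughly like this in our context. This is an untested sketch: the HmdMatrix34_t layout mirrors OpenVR’s 3x4 matrix type, but GetSomePoseMatrix is a placeholder function, not a real OpenVR binding:

    import com.sun.jna.Library;
    import com.sun.jna.Native;
    import com.sun.jna.Structure;
    import java.util.Arrays;
    import java.util.List;

    public interface ByValueSketch extends Library {
        ByValueSketch INSTANCE =
                (ByValueSketch) Native.loadLibrary("openvr_api", ByValueSketch.class);

        // Placeholder struct roughly mirroring OpenVR's 3x4 matrix type.
        public static class HmdMatrix34_t extends Structure {
            public float[] m = new float[12];
            @Override
            protected List<String> getFieldOrder() {
                return Arrays.asList("m");
            }
            // Tagging a subclass with Structure.ByValue is how JNA returns or
            // passes the struct by value rather than via a pointer.
            public static class ByValue extends HmdMatrix34_t
                    implements Structure.ByValue {}
        }

        // Hypothetical native function that returns the struct by value.
        HmdMatrix34_t.ByValue GetSomePoseMatrix(int deviceIndex);
    }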

Sorry that I’m not so active on this now. I’m working on some other things and am having a hard time finding motivation with the release of the Vive so distant. However, I’ve signed up for an OSVR devkit and will probably buy one anyway if I don’t get it. When that arrives there will be something to test on, at least.

Too bad about BridJ. I’m not well-versed in the JNI stuff, but I’m slightly concerned about JNA, as it’s supposed to have much worse performance than JNI. That might be significant in VR terms. On the other hand, I don’t have a better suggestion.

Good to hear from you, rickard!

Release of the Vive may be some months away, but I hope to have a solid library by then. Lots to do! :smile:

JNI sounds like a last resort. I’m actually now working with JNA direct mapping, which should be very fast and comparable to BridJ.
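To illustrate what I mean by direct mapping, a minimal sketch (the VR_Init signature here is an assumption for illustration, not the verified one):

    import com.sun.jna.Native;
    import java.nio.IntBuffer;

    // Native.register binds the static native methods below straight to
    // symbols in openvr_api, skipping the reflective proxy that
    // interface-based JNA mappings go through.
    public class DirectMappingSketch {
        static {
            Native.register("openvr_api");
        }

        // Assumed binding for illustration; the real signature may differ.
        public static native long VR_Init(IntBuffer errorCode);
    }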

I made some progress with a JNAerator output of the C++ interface. I even got the “Hmd not found (108)” error from VR_Init(), which was expected. Unfortunately, JNA doesn’t support C++, so I couldn’t get access to IVRSystem or the compositor. Luckily, I found an OpenVR C interface:

… problem is, there is no VR_Init() function in there. Also, I get errors like the IVRSystem_GetWindowsBound symbol being missing… so I’m not sure the library is complete for the C interface. I asked for help in the SteamVR forum:

Huge success update!

I’ve got OpenVR working with the C interface & direct-calling JNA. This should be fast & have all the tools we need to complete this project. I’ve tested VR_Init(), GetRecommendedRenderTargetSize(), and getting the FOV & IPD. All return good, expected results. I haven’t been able to get the Compositor working, but I believe that’s because I’m running Windows in a virtual machine & it may have trouble picking up the virtual display.

The solution was to add VR_ to many of the C function names. Apparently that prefix is missing in the openvr_capi.h header file, but I found the real names by dumping the library’s symbol tables.
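In code, the fix looks like this, as a sketch; the parameter types are my assumption from the header, not verified:

    import com.sun.jna.Native;
    import java.nio.IntBuffer;

    // The symbol exported by openvr_api carries a VR_ prefix that the 0.9.x
    // openvr_capi.h omits, so the native method must be declared with it.
    public class PrefixedBindingSketch {
        static {
            Native.register("openvr_api");
        }

        // Assumed parameter types for illustration.
        public static native void VR_IVRSystem_GetRecommendedRenderTargetSize(
                IntBuffer width, IntBuffer height);
    }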

I committed my changes. I converted the first half of things inside OpenVR.java. I recommend you check this out, @rickard:

Here is the new JOpenVR library, using direct-access JNA:

Looks like we’ll need to update & reconvert the rendering stuff. A new “hello world VR” program was written with the OpenVR 0.9.2 update here:

If you find time, feel free to do this @rickard :slight_smile: I’ll get to it myself eventually, though.


Update: all of what was previously written in OpenVR.java has been converted to use JNA. I’ve tested some basic functionality using the “null driver” in Windows, and the pose structures appear to be populating correctly. The TestVRApplication.java program runs, although all I see is a black screen. I notice the getFOV() function returns 0, because that is what the OpenVR routine returns for the null driver. I’ll probably need to report that to Valve. The next step will be getting something to display on the screen!

https://github.com/phr00t/jmonkeyengine-virtual-reality/commit/043fde0a124cd9181d055022d46acb24100de5d7


I’ve been plowing ahead with progress, but I may be reaching the limit of what I can accomplish within a Windows virtual machine. I’m hitting a “function not supported by video hardware” exception in jMonkeyEngine within the virtual machine. However, I think I got everything working up to that point. I changed how the TrackedDevicePose_t structure array is handled, which I think is finally correct according to JNA examples.
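For anyone following along, the contiguous-array pattern from those JNA examples looks roughly like this (fields abbreviated and the 3x4 matrix flattened for brevity; not the exact committed code):

    import com.sun.jna.Structure;
    import java.util.Arrays;
    import java.util.List;

    public class PoseArraySketch {
        public static class TrackedDevicePose_t extends Structure {
            public float[] mDeviceToAbsoluteTracking = new float[12]; // 3x4 matrix, flattened here
            public int eTrackingResult;
            public byte bPoseIsValid;
            public byte bDeviceIsConnected;

            @Override
            protected List<String> getFieldOrder() {
                return Arrays.asList("mDeviceToAbsoluteTracking",
                        "eTrackingResult", "bPoseIsValid", "bDeviceIsConnected");
            }
        }

        public static void main(String[] args) {
            // toArray() allocates all 16 elements back-to-back in one native
            // block, rather than as 16 separate allocations, which is what a
            // WaitGetPoses-style C call expects.
            TrackedDevicePose_t[] poses =
                    (TrackedDevicePose_t[]) new TrackedDevicePose_t().toArray(16);
            System.out.println("array spans " + poses[0].size() * poses.length + " bytes");
        }
    }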

@rickard, this may be something you could really help with: we need to set up the UV coordinates for the OpenVR filter shaders, which handle distortion. This is something you probably have more experience with. This is the sample application doing it:

I’ve created the shader & filter for OpenVR that the values from the example above should be fed into:

Keep in mind that the inPosition.xy * 2.0 - 1.0 remap is happening in our OpenVR.vert shader, while in the hellovr application I think it happens in the SetupDistortion() function.

Great. I hope to be able to spend some time on this during the vacation (which is due in a couple of weeks).
If you don’t mind doing some project management work, feel free to specify some tasks that need doing. It will help me get into it quicker.
Yeah, I can look into the distortion filters. It’s a bit different from how we did it for the Oculus. They’re using meshes, right?

You may need to enable GPU support in your virtual machine configuration. Then whatever features your host video card supports will be exposed in the guest OS.

I don’t think they are using meshes for distortion. It looks like it is only a vertex & fragment shader. It looks like they have UV mappings for each color, so the channels will be placed in the right positions. This is meant to correct chromatic aberration at the same time as distortion. Pretty neat, in my opinion.

I might tackle this piece myself within a few weeks… we’ll see where we’re at by then.

@Momoko_Fan,

I’m running VirtualBox & I did enable 3D acceleration & installed DirectX. I can run very few 3D applications, and it seems very limited compared to what VMware Fusion might provide. I think my best-case scenario is to dual-boot with Windows, though. If I run into any problems within a VM, it might be hard to tell whether the VM is the cause. Nobody will be using VR in a VM anyway; better to develop for the real deal.

Good news! I’m now dual-booting Windows 8.1, so I can develop OpenVR with a native Windows install. I also got our OpenVR library initializing both the hardware & the compositor now. The code runs without crashing, but nothing is displayed on the screen yet. That is because the OpenVR filter needs work – the texture coordinates need to be set so the fragment/vert shaders work. I’m working on a few projects (5089, Spermination, this library), so I’ll get to it as soon as I can. However, if you get some free time @rickard, feel free to pick this part up (as I posted before). If I get it done first, I’ll find something else you can work on next :smile:

Great. This is my final week before vacation. Then I’ll be able to dig into this.


OK, I jumped in and started on the distortion implementation. Yes, it does use a distortion mesh after all. I went through and did my best to convert the OpenVR distortion mesh creation function here:

I put “TODO” comments in important spots where things still need work. One important piece that I hope @nehon or @Momoko_Fan may be able to help with: multiple texture coordinates per vertex. OpenVR wants a UV coordinate for each color channel. I see TexCoord2 & TexCoord3 (in addition to TexCoord) as VertexBuffer types, but I don’t see them referenced elsewhere in jme3-core… Here is roughly what I picture for the buffer setup, as a sketch below.
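An untested sketch; fillUVs and the per-channel float arrays are hypothetical names, but TexCoord2/TexCoord3 are the real VertexBuffer types mentioned above:

    import com.jme3.scene.Mesh;
    import com.jme3.scene.VertexBuffer;
    import com.jme3.util.BufferUtils;

    // Idea: store the red-channel UVs in the usual TexCoord slot and the
    // green/blue UVs in TexCoord2/TexCoord3, so the distortion shader can
    // declare inTexCoord, inTexCoord2 and inTexCoord3 attributes.
    public class DistortionMeshSketch {
        public static void fillUVs(Mesh mesh, float[] redUV, float[] greenUV, float[] blueUV) {
            mesh.setBuffer(VertexBuffer.Type.TexCoord,  2, BufferUtils.createFloatBuffer(redUV));
            mesh.setBuffer(VertexBuffer.Type.TexCoord2, 2, BufferUtils.createFloatBuffer(greenUV));
            mesh.setBuffer(VertexBuffer.Type.TexCoord3, 2, BufferUtils.createFloatBuffer(blueUV));
        }
    }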

We also need this distortion mesh to be used instead of the normal quad when this filter gets rendered (correct?). Would this require changes to the jMonkeyEngine library itself? I’m, of course, fine with opening that up in my jME3 repository if needed.

I’m going to go back and work on 5089 for a bit, so this is where you may pick up where I left off, @rickard, considering you are starting vacation soon.

Another oddity that I’m unsure of… it looks like the distortion mesh contains both eyes. I assume this means we only need one filter for both eyes? That sounds efficient, but I’m not sure how it fits with our current two-filter (one for each eye / viewport) setup. Perhaps you need to render both viewports without filters, one on each side of the screen, and then pass that through the single filter.
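Something like this is what I have in mind for the layout, as a rough untested sketch (class and parameter names are placeholders):

    import com.jme3.renderer.Camera;
    import com.jme3.renderer.RenderManager;
    import com.jme3.renderer.ViewPort;

    // Render each eye's camera into its half of the window with no per-eye
    // filter, then let a single fullscreen distortion filter consume the
    // combined result.
    public class SingleFilterLayout {
        public static void layoutEyes(RenderManager rm, Camera leftCam, Camera rightCam) {
            leftCam.setViewPort(0f, 0.5f, 0f, 1f);   // left half of the window
            rightCam.setViewPort(0.5f, 1f, 0f, 1f);  // right half of the window
            ViewPort leftView  = rm.createMainView("LeftEye", leftCam);
            ViewPort rightView = rm.createMainView("RightEye", rightCam);
            // A single FilterPostProcessor holding the distortion filter would
            // then be attached to one fullscreen view over both halves.
        }
    }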