Google Cardboard support

Hi.

This is a heads-up for the core team regarding support for Google Cardboard, since some changes to core have been necessary to accommodate it. I hope to be able to publish a repo and make a pull request this weekend.

Here’s the explanation:

A Cardboard application should extend CardboardApplication to get access to the convenience methods and functions that the Cardboard API supplies. Since a jMonkeyEngine Android application extends AndroidHarness, I’ve created a corresponding CardboardHarness that extends CardboardApplication.
So far so good.

It also expects a CardboardView.Renderer to be implemented. To do this, I extended OGLESContext and implemented the interface there, in a class called CardboardContext.

Finally, to instantiate this, I extended JmeAndroidSystem in JmeAndroidCardboardSystem, which returns the CardboardContext in newContext.

So far, no changes to core: these all fit in a small jar of their own.
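To make the structure concrete, here’s a rough sketch of the three classes (method bodies and anything not named above are placeholders, not the actual code; the Cardboard base class is written as CardboardActivity here, see the correction further down the thread):

```java
import com.google.vrtoolkit.cardboard.CardboardActivity;
import com.jme3.system.AppSettings;
import com.jme3.system.JmeContext;
import com.jme3.system.android.JmeAndroidSystem;
import com.jme3.system.android.OGLESContext;

// The activity: what AndroidHarness is to a plain jME Android app,
// CardboardHarness is to a Cardboard app.
class CardboardHarness extends CardboardActivity {
    // lifecycle plumbing similar to AndroidHarness would go here
}

// The context: an OGLESContext that also implements the Cardboard renderer
// interface (CardboardView.Renderer -- its callbacks are omitted here since
// they depend on the SDK version) so the Cardboard SDK can drive jME's rendering.
class CardboardContext extends OGLESContext {
}

// The system delegate: hands out the Cardboard-aware context instead of the
// default Android one.
class JmeAndroidCardboardSystem extends JmeAndroidSystem {
    @Override
    public JmeContext newContext(AppSettings settings, JmeContext.Type contextType) {
        return new CardboardContext();
    }
}
```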

However, as you might realize, something needs to check for this JmeAndroidCardboardSystem. So, I had to add a check in JmeSystem:
systemDelegate = tryLoadDelegate("com.jme3.system.android.JmeAndroidCardboardSystem");

before it looks for JmeAndroidSystem. I couldn’t find any other way of injecting the context (although I’ve tried a few). I’ll be happy to refactor if anyone has an idea of how to achieve this without modifying core. The good thing is that it has no dependencies other than this.
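For context, the lookup with the fallback looks roughly like this (paraphrased; JmeSystem’s actual delegate-loading code has a bit more error handling around it):

```java
// Try the Cardboard-aware delegate first; if it isn't on the classpath,
// tryLoadDelegate returns null and we fall back to the stock Android delegate.
systemDelegate = tryLoadDelegate("com.jme3.system.android.JmeAndroidCardboardSystem");
if (systemDelegate == null) {
    systemDelegate = tryLoadDelegate("com.jme3.system.android.JmeAndroidSystem");
}
```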


Hey cool.
I’m actively looking into Oculus Rift support as I use it at work.
Cardboard support would definitely be nice too.

Edit:
About the system delegate injection, can’t it be done in the Harness?

…and I’m wondering why application subclassing was needed, exactly. This should be avoided when possible.

Yeah, that’s what I was going to say. You also did this for the Oculus Rift, but couldn’t everything be put in the app state? (I guess you have one there too.)

This way there’s no need to extend the harness, and we can just have a special view for Cardboard applications.

This would also allow for an easy switch between standard and split-screen 3D modes.

P.S.: Rickard, I also miss your old avatar. I always need to take a moment to mentally register that it’s you. :slight_smile:

and by CardboardApplication, I meant CardboardActivity.
I’m not too happy about the (essentially wholesale) copying of AndroidHarness into CardboardHarness, either.
I’d like to point out that this is different from the Android VR project in that this creates a jME app using the Cardboard API, whereas the other one brings VR to Android using jME’s API. The benefit of doing it the Cardboard way is that it handles everything except the rendering itself. It even provides the viewports for each of the eyes.
Depending on how Google plays Cardboard, I think it can be beneficial for projects to say “It’s a Cardboard app” over “It’s an Android app with VR support”.

There is an explanation of the need for OVRApplication in the “epic” (another word for “unreadable”) Oculus VR thread. It’s because the OVR needs to be initialized in a static context before the application is started. Doing it in the subclassed application was the only way of doing it without the user having to do it themselves for each project.

I’ll see if I can inject the context from CardboardHarness. I believe I looked into it at some point, but now that I have everything working, it’s worth another look.

Edit: Oh and my avatar was lost when the old forum lost its theme, somehow. I don’t know if I still have it.

You could do it as a static initializer in an AppState too… no real difference to me if it’s an app state sent to the constructor of the application. Unless there are Application-specific things that it needs… but then maybe that’s something we need to discuss.

We’re currently spitballing various ideas for refactoring the Application classes.

In 3.1 master I added the capability to disable swap buffers - not sure if anybody noticed. Is this why you needed to extend Application, @rickard?

Can’t wait for this!!! Will this work with SDK 3.0?

I would also request an easy way to get Vuforia running with JMonkey. It now has a Java version, but I was not able to get it working. I want to have AR markers and be able to put JMonkey models on top of the camera feed, but using a Google Cardboard-type setup where your hands are free. I want this for DnD.

Thanks
-Greg

You could do it as a static initializer in an AppState too… no real difference to me if it’s an app state sent to the constructor of the application.

It has to be done before the application starts; whether it’s static or not is of less importance (I believe).

In 3.1 master I added the capability to disable swap buffers - not sure if anybody noticed. Is this why you needed to extend Application, rickard?

No, it’s due to the Oculus having to be initialized before the application is started.

I’ve tested this against 3.1 trunk.
When I have a stable version, I’ll check against 3.0.

A static initializer will be run as soon as the class is referenced… so if you were passing the app state to the super constructor of your app, it would be run during app construction but before it started.
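Something like this (the class name and the body of the static block are just placeholders):

```java
import com.jme3.app.state.AbstractAppState;

// The static block runs as soon as the class is first referenced -- e.g. when
// the app state is constructed and passed to the SimpleApplication
// constructor -- which happens before start() is called on the application.
public class VrInitAppState extends AbstractAppState {

    static {
        // Whatever has to happen before the application starts would go here
        // (e.g. native OVR initialization); a println stands in for it.
        System.out.println("VR init, before the application starts");
    }
}
```

The state would then be passed to SimpleApplication’s varargs constructor (available in 3.1), so the class gets loaded during app construction.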

I think I’ve tried that at some point (I believe I tried everything) but I can’t swear to it, so maybe it’s worth a shot.
My guess is that it has something to do with memory management. The JNI/JNA stuff seems to be pretty picky about it.

But that’s for when I give the Oculus lib another shot.

Edit: And of course, since it’s open source, anyone who has a better solution is free to contribute it :slight_smile: It’s just that I’ve spent countless hours trying to get around this particular issue, and since I don’t know its actual cause, it’s difficult to explain to others.

To get the thread back on track:

I’ve committed an initial release which works quite well. In the end, using the new context without changing JmeSystem was easy (just call JmeSystem.setSystemDelegate).
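In other words, something along these lines (presumably early in CardboardHarness, before the application is initialized; see the repo for the exact placement):

```java
// Install the Cardboard-aware system delegate up front, so that when jME asks
// JmeSystem for a context it gets the CardboardContext without any core changes.
JmeSystem.setSystemDelegate(new JmeAndroidCardboardSystem());
```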

Repo is here:

How-to and announcement blog post:

I’ll do a more in-depth post about the technical stuff soon; I just need some more time.

In short: the integration is much cleaner than the Oculus Rift one. No post processors. It just applies the view matrices and transformations provided by the API to two cameras in the AppState.
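To give a feel for what that looks like on the jME side, here’s a rough sketch; the float[16] eye view matrix would come from the Cardboard API (the exact accessor differs between SDK versions), and the helper below is illustrative, not the code from the repo:

```java
import com.jme3.math.Matrix4f;
import com.jme3.math.Quaternion;
import com.jme3.renderer.Camera;

// Illustrative helper: take a per-eye 4x4 view matrix from the Cardboard API
// (a float[16], assumed column-major as usual for OpenGL) and apply it to a
// jME camera by inverting it into the camera's world transform.
public final class EyeCameraUtil {

    public static void applyEyeView(Camera eyeCam, float[] eyeViewColumnMajor) {
        Matrix4f view = new Matrix4f();
        view.set(eyeViewColumnMajor, false);   // false = read the array as column-major
        Matrix4f camWorld = view.invert();     // the view matrix is the inverse of the camera's transform
        eyeCam.setFrame(camWorld.toTranslationVector(),
                new Quaternion().fromRotationMatrix(camWorld.toRotationMatrix()));
    }
}
```

The AppState would call something like this once per eye, per frame, with the matrix for that eye.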


Ahhh … I see. You have to create a CardboardView instead of a GLSurfaceView in the OGLESContext. Makes sense.
Only thing is, there’s a lot of duplicate code… @iwgeric was actually planning on changing AndroidHarness to make it less ugly, and this will then require you to change your modified version of it as well to stay compatible.
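For reference, the CardboardView swap mentioned above boils down to roughly this (variable names are placeholders, and the Cardboard SDK method names are from memory, so treat them as assumptions):

```java
// Instead of the GLSurfaceView that OGLESContext normally creates, build a
// CardboardView, register the renderer (the OGLESContext subclass implementing
// CardboardView.Renderer), and hand the view to the CardboardActivity.
CardboardView cardboardView = new CardboardView(cardboardHarness);
cardboardView.setRenderer(cardboardContext);
cardboardHarness.setCardboardView(cardboardView);
```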

Yeah, I know. I didn’t see any other way of doing it, though (except copying large parts of CardboardActivity instead).

The benefit is that the changes will make CardboardHarness ‘less ugly’ too :wink:

I’ve created another project on GitHub with some of the jMonkeyEngine examples running on Google Cardboard.
There aren’t a lot of terribly exciting ones to show, but I got TestJaime and TestMovingParticles running. If anyone has any suggestions, I’d be happy to update it with some more.

And in the ‘pics or it didn’t happen’ department, I also made a video:


Google posted this video this week, if anyone is interested:

[Cardboard: Java API - YouTube]


Is this still being worked on?

This seems like an awesome way to improve framerates:

Yes, it’s probably the most efficient method. The downside is that it requires custom models (fairly high-poly) and custom shaders. I want an app using the Cardboard lib to be as easy to make as a regular application (and even be able to use the same one). Most of the work is on the developer in this case anyway: custom model, custom shader. If someone wants this and finds something missing in the lib to achieve it, let me know.
