Virtual Reality FPS while walking around in real life

Hi everybody,

Last year I did my master's thesis in computer science on a virtual reality multiplayer game. The idea was to play a game on the Oculus Rift while walking around in real life, moving through the virtual world at the same time. The game was created in the jMonkeyEngine. We managed to come up with a prototype and had some fun with it. If you are interested, you can read the paper version (6 pages) that I presented at a conference in Mexico, or the full thesis.

We also posted a video clip on YouTube where we explain the concept and show a short demo.

I would also like to thank the two guys who helped me last year with figuring out how to use the Oculus Rift together with the jMonkeyEngine: @phr00t, @rickard. Thumbs up for you two!


Very interesting! Do you have a video demo of it that could be shared as a use case?

Edit: I forgot to congratulate you; it is a really fun topic.

Thanks for your comment! I just added a link to a YouTube video where we briefly explain the concept and show a short demo.

Hey, this looks pretty cool! How responsive was it irl, though? In the video it seems to take a while to respond to the person's movement.

Thanks! We were able to update the location of a player 4 times per second. This could be increased to roughly 8 times with a better smartphone. But when you actually play the game in real life, the result was better than expected. So it is not perfect yet, but we are already looking into other methods to increase the responsiveness.

Only 4 times? But shouldn't it only be sending something like 7 floats (3 for location and 4 for rotation)? That could surely go faster. I guess there could be some interpolation to make it smoother.
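
For example, something like this: a rough jMonkeyEngine-style sketch of easing toward the latest snapshot every frame (the class and the smoothing constant are just illustrative, not from the actual prototype):

```java
import com.jme3.math.Quaternion;
import com.jme3.math.Vector3f;

// Smooths the ~4 Hz snapshots (3 floats position + 4 floats rotation)
// by easing toward the latest one every frame.
public class RemotePlayerSmoother {

    private final Vector3f targetPos = new Vector3f();
    private final Quaternion targetRot = new Quaternion();
    private final Vector3f currentPos = new Vector3f();
    private final Quaternion currentRot = new Quaternion();

    // Call when a new snapshot arrives over the network (~4 times per second).
    public void onSnapshot(Vector3f pos, Quaternion rot) {
        targetPos.set(pos);
        targetRot.set(rot);
    }

    // Call every frame (e.g. from simpleUpdate) with the frame time tpf.
    public void update(float tpf) {
        float alpha = Math.min(1f, tpf * 8f); // assumed smoothing speed

        // Linear interpolation of the position toward the latest snapshot.
        currentPos.addLocal(targetPos.subtract(currentPos).multLocal(alpha));
        // Spherical interpolation of the rotation toward the latest snapshot.
        currentRot.slerp(targetRot, alpha);
    }

    public Vector3f getPosition()   { return currentPos; }
    public Quaternion getRotation() { return currentRot; }
}
```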

The 4 times is actually quite good. The problem is that you are running a resource-intensive image processing algorithm on a smartphone. We already used a quite simple algorithm, but it was still the bottleneck. With the newest smartphones, which have more processing power, the results get better. Real-time image processing on a smartphone or a similar device is a fairly new research area, so better results can be expected in the future. We are already looking at possible improvements to this part of the prototype. Besides improving the tracking algorithm itself, another option might be to predict how a user is going to move. We also have some other ideas; this was our first prototype.
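
To give an idea of the prediction part: even simple linear extrapolation (dead reckoning) from the last two tracked positions could already fill the gaps between updates. A rough sketch, with purely illustrative names and threshold:

```java
import com.jme3.math.Vector3f;

// Linear extrapolation (dead reckoning) between the ~4 Hz tracking updates.
public class DeadReckoning {

    private final Vector3f lastPosition = new Vector3f();
    private final Vector3f velocity = new Vector3f(); // metres per second
    private float lastUpdateTime;
    private boolean initialized;

    // Called whenever the image-processing pipeline delivers a new position.
    public void onTrackedPosition(Vector3f position, float timeSeconds) {
        if (initialized) {
            float dt = timeSeconds - lastUpdateTime;
            if (dt > 0f) {
                // Estimate velocity from the two most recent measurements.
                velocity.set(position).subtractLocal(lastPosition).divideLocal(dt);
            }
        }
        lastPosition.set(position);
        lastUpdateTime = timeSeconds;
        initialized = true;
    }

    // Estimated position at some time after the last measurement.
    public Vector3f predict(float timeSeconds) {
        float dt = timeSeconds - lastUpdateTime;
        return lastPosition.add(velocity.mult(dt));
    }
}
```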

I should also point out that if you test the prototype it feels better than you might think. One reason for this is that you are walking around without seeing the real world, so you walk quite carefully.

Interesting experiment!
Like you say, @P0seid0n, there are numerous ways to improve the experience. Maybe you have already thought of the following:
Place a smartphone on the player/user and use the accelerometer and compass in the smartphone itself, together with the lower-rate CV-based positions, in a filter (Kalman perhaps?) to help give more frequent predictions of movement (see the sketch at the end of this post).
Place the smartphone in front of the HMD and take snapshots with the camera which are also processed (if it spots the other player, you will know more about the locations of both players).
I’m not familiar with Camshift but it sounds expensive. Maybe a marker-based solution will give you better performance? Then you get not only position but also direction. Together with the front-mounted camera you can then also read markers placed around the ‘arena’.
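
For the first suggestion, here is a very rough sketch of what such a fusion could look like. It is a simple complementary-style blend rather than a real Kalman filter, and all names and gains are made up:

```java
import com.jme3.math.Vector3f;

// Dead-reckon from the smartphone accelerometer at a high rate and pull the
// estimate back toward the low-rate camera position whenever one arrives.
public class PositionFusion {

    private final Vector3f position = new Vector3f();
    private final Vector3f velocity = new Vector3f();

    // High-rate step with world-frame acceleration (gravity already removed).
    public void onAccelerometer(Vector3f accel, float dt) {
        velocity.addLocal(accel.mult(dt));
        position.addLocal(velocity.mult(dt));
    }

    // Low-rate correction from the camera tracking (~4 Hz).
    public void onCameraPosition(Vector3f measured) {
        float gain = 0.8f; // assumed: how much to trust the camera
        // Blend toward the absolute measurement.
        position.addLocal(measured.subtract(position).multLocal(gain));
        // Damp the integrated velocity so accelerometer drift does not build up.
        velocity.multLocal(1f - gain);
    }

    public Vector3f getPosition() { return position; }
}
```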

The combination of tracking and using the accelerometer is definitely an option.
I hadn't thought about placing a smartphone in front of an HMD; that's a good option.
Camshift is actually not expensive at all; it is a quite simple and old color-based algorithm. Image processing in general is just still quite heavy for smartphones, and a fairly new research field.
Right now we get the direction by comparing your current position with the previous one. The marker-based solution also sounds interesting; the only issue is that we want the framework to stay very mobile. With markers you need to set up more things than just the four cameras.
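
Concretely, the direction estimate is essentially something like this (an illustrative sketch, not the actual code from the prototype):

```java
import com.jme3.math.FastMath;
import com.jme3.math.Vector3f;

public class HeadingEstimate {

    // Heading around the vertical (Y) axis, in radians, derived from two
    // consecutive tracked positions; NaN if the player barely moved.
    public static float heading(Vector3f previous, Vector3f current) {
        float dx = current.x - previous.x;
        float dz = current.z - previous.z;
        if (dx * dx + dz * dz < 1e-4f) { // less than ~1 cm of movement
            return Float.NaN;
        }
        return FastMath.atan2(dx, dz);
    }
}
```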
Thanks for your input!