Head-tracking Math, or, Porting Johnny Lee's WiiDesktopVR C# / DirectX Demo

Hi,



I've just started having a bit of fun hooking my Nintendo Wii controller up to my Mac and PC. I'm giving a presentation at JavaOne on Thursday ("Extreme GUI Makeover") and I got a last-minute idea to give a demo with the Wii doing head-tracking with Java / jMonkeyEngine, in the same vein as Johnny Lee's famous YouTube video (www.youtube.com/watch?v=Jd3-eiid-Uw). Looking at the C# code for the demo, I see that the real magic occurs in applying a transformation to the vertices in the demo to simulate perspective changes. I'm in way over my head, and was wondering if some kind soul could help me port the C# code over to Java / jME. The whole program is actually rather simple with the exception of the matrix math involved in the transform; the code is available here: http://www.cs.cmu.edu/~johnny/projects/wii/WiiDesktopVR.zip. All the relevant source is in four source files in the root of the ZIP (the .cs files).



I'd really appreciate any help; for it to make it into the talk, I need it within the next 24 hours. This is an optional demo for my talk, but it would be a great opportunity to further expose jME, as Extreme GUI Makeover is one of JavaOne's most popular talks.



Thanks,



Ben



(Rikard and Joshua, if you read this, I'm the guy who chatted with you before and after the talk this evening.)

Wouldn't it be enough to port the camera setup code? If yes, it should be easy, as jME provides similar methods to those used in this piece of C# code:

device.Transform.World = Matrix.Identity;

// Set up our view matrix. A view matrix can be defined given an eye point,
// a point to look at, and a direction for which way is up. Here, we place the
// eye at the tracked head position, look straight at the screen plane (z = 0),
// and define "up" to be in the y-direction.
device.Transform.View = Matrix.LookAtLH(new Vector3(headX, headY, headDist), new Vector3(headX, headY, 0), new Vector3(0.0f, 1.0f, 0.0f));

// For the projection matrix, we set up a perspective transform (which
// transforms geometry from 3D view space to 2D viewport space, with
// a perspective divide making objects smaller in the distance). To build
// a perspective transform, we need the field of view (1/4 pi is common),
// the aspect ratio, and the near and far clipping planes (which define at
// what distances geometry should no longer be rendered).

// Compute the near plane so that the camera stays fixed to
// (-.5f * screenAspect, .5f * screenAspect, -.5f, .5f).
// Computing a closer plane rather than simply specifying xmin, xmax, ymin, ymax
// allows things to float in front of the display.
float nearPlane = .05f;
device.Transform.Projection = Matrix.PerspectiveOffCenterLH(nearPlane * (-.5f * screenAspect + headX) / headDist,
                                                            nearPlane * (.5f * screenAspect + headX) / headDist,
                                                            nearPlane * (-.5f - headY) / headDist,
                                                            nearPlane * (.5f - headY) / headDist,
                                                            nearPlane, 100);



Or do you mean another piece of code?
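
For reference, here is a rough sketch of what a jME equivalent of that SetupMatrices() snippet could look like. This is untested and based only on my reading of the jME 2 javadoc; headX, headY, headDist and screenAspect are assumed to come from your Wii tracking code, the HeadTrackedCamera name is just made up for the sketch, and the signs may still need flipping since DirectX is left-handed while jME is right-handed:

    import com.jme.math.Vector3f;
    import com.jme.renderer.Camera;

    public class HeadTrackedCamera {

        /** Rough port of SetupMatrices(): put the camera at the tracked head
         *  position, look straight at the screen plane (z = 0) and build an
         *  off-center frustum so the screen rectangle stays fixed in space. */
        public static void update(Camera cam, float headX, float headY,
                                  float headDist, float screenAspect) {
            float nearPlane = 0.05f;

            // Equivalent of Matrix.LookAtLH(eye, target, up): the target sits
            // directly in front of the eye, so the view direction is simply -z.
            cam.setFrame(new Vector3f(headX, headY, headDist), // location
                         new Vector3f(-1f, 0f, 0f),            // left
                         new Vector3f(0f, 1f, 0f),             // up
                         new Vector3f(0f, 0f, -1f));           // direction

            // Equivalent of Matrix.PerspectiveOffCenterLH(left, right, bottom, top, near, far);
            // note that jME's setFrustum takes (near, far, left, right, top, bottom).
            cam.setFrustum(nearPlane, 100f,
                    nearPlane * (-0.5f * screenAspect + headX) / headDist, // left
                    nearPlane * ( 0.5f * screenAspect + headX) / headDist, // right
                    nearPlane * ( 0.5f - headY) / headDist,                // top
                    nearPlane * (-0.5f - headY) / headDist);               // bottom

            cam.update();
        }
    }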

I am pretty sure that the perspective code there is initializing the camera vectors, so you can simply replace Matrix.PerspectiveOffCenterLH with camera.setFrustum, the rest of the code should be fairly easy to port. How exactly are you receiving the positional data though? Are you accessing the Wii controller through some library like WiiRemoteJ?

Hi Irrisor,



Unfortunately, having very little experience with 3D, I'm not sure how much of the C# code will present a problem once I try to port it. The sections that I suspect would give me trouble are:


  • OnCreateVertexBuffer(…). This has something to do with populating custom structs that are associated with each buffer. I have no idea what is happening here, nor do I know if jME has a similar event. I haven't found a "VertexBuffer" type in jME, but I do see a bunch of Vertex* classes that could potentially do the same thing.


  • SetupMatrices(). Yes, this is the code you pasted below. How do I set this up in jME?


  • Warper.cs and AffineTransformSolver.cs are what really made me call out for help, but, looking at WiiDesktopVR.cs, they don't appear to be used anywhere in the code. Hmm… sweet, that makes things simpler.



Is there anything else you can see that wouldn't be a straightforward port? Sounds like this is within my abilities after all…



Thx very much,



Ben
Momoko_Fan said:

I am pretty sure that the perspective code there is initializing the camera vectors, so you can simply replace Matrix.PerspectiveOffCenterLH with camera.setFrustum, the rest of the code should be fairly easy to port. How exactly are you receiving the positional data though? Are you accessing the Wii controller through some library like WiiRemoteJ?


RE: Wii. I have experience using wiiusej and both it and the Wiimote C# library are very similar thin wrappers around the Wiimote attributes. So that bit shouldn't be a problem.

RE: Camera. So it looks like:

Matrix.LookAtLH == Camera.setFrame
Matrix.PerspectiveOffCenterLH == Camera.setFrustumPerspective

While PerspectiveOffCenterLH takes six args in the C# code, the jME method only takes four, but I suspect that will be easy to work out. The two methods appear to be doing the same things, based on the JavaDocs and the comments in Johnny Lee's code.
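
If I use plain setFrustum instead, as Momoko_Fan suggested, I think all six values carry over directly and only the argument order changes. This is just my reading of the javadoc (the little helper below is mine, not jME API), so correct me if it's wrong:

    // DirectX: Matrix.PerspectiveOffCenterLH(left, right, bottom, top, near, far)
    // jME:     Camera.setFrustum(near, far, left, right, top, bottom)
    void applyOffCenterFrustum(com.jme.renderer.Camera cam,
                               float left, float right, float bottom, float top,
                               float near, float far) {
        cam.setFrustum(near, far, left, right, top, bottom);
        cam.update();
    }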

Am I right?

So I did a grep, and "Warper" and "AffineTransformSolver" don't return any hits in the source, so… whew. This may be less complex than I feared.



Here's the vertex buffer code I mentioned earlier, along with the struct definition:



        public void OnCreateVertexBuffer(object sender, EventArgs e)
        {
            VertexBuffer vb = (VertexBuffer)sender;
            // Create a vertex buffer (100 custom vertices)
            CustomVertex.PositionNormalTextured[] verts = (CustomVertex.PositionNormalTextured[])vb.Lock(0, 0); // Lock the buffer (which will return our structs)
            for (int i = 0; i < 50; i++)
            {
                // Fill up our structs
                float theta = (float)(2 * Math.PI * i) / 49;
                verts[2 * i].Position = new Vector3((float)Math.Sin(theta), -1, (float)Math.Cos(theta));
                verts[2 * i].Normal = new Vector3((float)Math.Sin(theta), 0, (float)Math.Cos(theta));
                verts[2 * i].Tu = ((float)i) / (50 - 1);
                verts[2 * i].Tv = 1.0f;
                verts[2 * i + 1].Position = new Vector3((float)Math.Sin(theta), 1, (float)Math.Cos(theta));
                verts[2 * i + 1].Normal = new Vector3((float)Math.Sin(theta), 0, (float)Math.Cos(theta));
                verts[2 * i + 1].Tu = ((float)i) / (50 - 1);
                verts[2 * i + 1].Tv = 0.0f;
            }
            // Unlock (and copy) the data
            vb.Unlock();
        }




        struct Vertex
        {
            float x, y, z;
            float tu, tv;

            public Vertex(float _x, float _y, float _z, float _tu, float _tv)
            {
                x = _x; y = _y; z = _z;
                tu = _tu; tv = _tv;
            }

            public static readonly VertexFormats FVF_Flags = VertexFormats.Position | VertexFormats.Texture1;
        };



Why is he defining his own Vertex struct? Is that just normal C# DirectX code?

You are not supposed to port ALL of the code… The only part you actually need to port is the conversion between the Wii positional data and frustum calculation. The rest should be done in pure jME code.
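
For example, the whole OnCreateVertexBuffer callback just builds a textured cylinder side by hand; in jME you would get that kind of geometry from a stock shape instead of locking a vertex buffer. A rough sketch (name and dimensions made up, rootNode as in SimpleGame), which I believe also generates the texture coordinates for you:

    // inside simpleInitGame(), using com.jme.scene.shape.Cylinder:
    // 2 axis samples, 50 radial samples, radius 1, height 2 -- roughly the
    // ring of quads the C# code fills in by hand.
    Cylinder grid = new Cylinder("grid", 2, 50, 1f, 2f);
    rootNode.attachChild(grid);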

Momoko_Fan said:

You are not supposed to port ALL of the code.. The only part you actually need to port is the conversion between the Wii positional data and frustum calculation. The rest should be done in pure jME code.


heh... I get that the DirectX API in C# will be different from the jME API in Java. Unfortunately, I'm rather busy through 9 pm Pacific today, so I'll only have from 9 pm to 2 am Pacific tonight to get this working in jME, and I want to make sure I won't hit any roadblocks between now and then.

So it sounds like you're saying that the VertexBuffer stuff is just weird DirectX arcana that doesn't apply to jME whatsoever? I can just render my scene graph, do the camera stuff, tie camera movement into the Wii, and I'm good to go? I can ignore the rest of the code?

Thx,

Ben
bgalbs said:

I can just render my scene graph, do the camera stuff, tie camera movement into the Wii, and I'm good to go? I can ignore the rest of the code?

Yep, that's what I'd say, too.

Is there already documentation on how to handle custom input devices? I'm primarily concerned with the threading issues. The Wii API, like other input device APIs I suspect, is event-driven, so I'm planning on having a dedicated thread receive events from the WiiuseJ API. Can I invoke jME API directly from such a thread, or do I need to pass events to a pump that processes them on a unified thread?

This’ll be of some help regarding multithreading in jME:



http://www.jmonkeyengine.com/wiki/doku.php?id=chapter_14_-_multiple_threads&s=gametaskqueuemanager

bgalbs said:
Can I invoke jME API directly from such a thread, or do I need to pass events to a pump that processes them on a unified thread?

You need to do it in the jME update loop. If you are using StandardGame you can use the GameTaskQueueManager as Hal suggested. But as you most probably just want to run a plain jME demo, you can use an InputHandler for it, which is updated in the game loop. Simply add a repeating action:


        input.addAction( new ActionInvokedEachFrame() );


and



    public class ActionInvokedEachFrame extends MouseInputAction {
        public void performAction( InputActionEvent evt ) {
            // update camera here
        }
    }


(Don't let the "Mouse" in front of the "InputAction" confuse you, just copy it ;))
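
To make the hand-off concrete, here is a minimal sketch (class and method names are made up; HeadTrackedCamera.update refers to the frustum sketch earlier in the thread). The Wii listener thread only writes a few volatile fields, and the per-frame action reads them on the jME thread:

    import com.jme.input.action.InputActionEvent;
    import com.jme.input.action.MouseInputAction;
    import com.jme.renderer.Camera;

    public class HeadTrackingAction extends MouseInputAction {
        // Written by the Wii event thread, read once per frame by jME.
        private volatile float headX = 0f, headY = 0f, headDist = 2f;
        private final Camera cam;
        private final float screenAspect;

        public HeadTrackingAction(Camera cam, float screenAspect) {
            this.cam = cam;
            this.screenAspect = screenAspect;
        }

        // Call this from your Wii IR listener callback (event thread).
        public void onHeadPosition(float x, float y, float dist) {
            headX = x; headY = y; headDist = dist;
        }

        // Invoked by the InputHandler during update() on the jME thread.
        public void performAction(InputActionEvent evt) {
            HeadTrackedCamera.update(cam, headX, headY, headDist, screenAspect);
        }
    }

Register it the same way: input.addAction(new HeadTrackingAction(cam, screenAspect));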

No more posts from you, bgalbs - does this mean you were successful?

irrisor said:

No more posts from you, bgalbs - does this mean you were successful?


Heh, nope! Just sitting down to play with this now...

Hi everyone,



I have to bring up this thread again to ask whether anyone has followed this path successfully.

I just recently decided to start my next game as a 3D-based project (so this is my first post in this forum :)).

My idea was to create something like the good old Time Crisis (PlayStation) with rich use of the physics engine. Furthermore, I want to control/shoot with a Wii Remote, which shouldn't be a problem as I have some successful experience with the WiiRemoteJ library.

Some hours ago I thought about the video from Johnny Chung Lee mentioned above. In my opinion this would be a great opportunity to bring both "technologies" together, as the game will be using a Wii Remote anyway. Why not offer the possibility of the ultimate FPS experience (hiding, peeking, …)? :roll:



Sure, it will be much more work than I can imagine right now, but it should be worth the effort. So, as you already brought up a similar idea, I wanted to know whether you've made any further progress.



Any other help or tips on how to implement this functionality correctly are appreciated.



Thanks and regards,

Daniel


Hi,



Nope, I had one night to get it working and failed, so I fell back to a different Wii demo. Haven't had a chance to revisit since.



Ben

I might be wrong, but isn't this just a camera handler?



Modify the standard jME FirstPersonHandler to accept Wiimote inputs, then you should be all set. Or do it the other way around: fire key-pressed events in response to Wiimote inputs.



His stadium example is kind of interesting, though: when the sensor moves toward the TV, the camera actually moves backward in the game space. However, strafing actions are consistent throughout all the examples.

Ok thank you both.



I didn't start my tests yet. I had to buy a new Bluetooth stick, because this **** Vista did not accept my old one. I also had to change the library for my Wiimote connections: the WiiRemoteJ setup (at least BlueCove) has some issues with 64-bit OSes, so I'll try WiiuseJ now, which seems to be OK. I can't use the Wii Balance Board as expected, but that's fine.



To start with, I'll follow your hint and try to adapt the camera location/rotation to the eye position. I'll keep you informed if there is any successful progress :wink: I will just use this thread.



Thanks and regards

So, it’s me again.



I took the liberty of trying that head tracking combined with Java and the jMonkeyEngine.



It works nearly as you expected, by adjusting the camera's location/direction and frustum. Currently the view stays fixed along the z-axis, like in the video from Johnny Lee (as he had two-dimensional objects). So the next thing I will try is to add motion along this axis too, to be able to provide an additional horizontal head rotation.



I captured a little video (not the best quality, sorry :frowning: ).

http://www.youtube.com/watch?v=KWVtBzAnuKg



If you are interested in further progress, I can continue to post in this thread, or you can just read this little blog I opened to keep track of the development status.

http://wiicrisis.blogspot.com/



Have a nice weekend.

Regards,

Daniel

Cool! :slight_smile: