Hi all, I’m trying to develop a Cardboard app where the JME camera points in the same direction as the phone. However, although I’m able to get the phone’s orientation (using SensorManager.getOrientation(), which returns yaw, pitch and roll), I’m having real problems using those figures to point the camera in the same direction.
I’ve tried the following:-
float orientation[] = /* android sensor code */;
float[] newOr = new float[3];
// The order is different between Android and JME's yaw/pitch/roll:
// fromAngles() expects [xAngle (pitch), yAngle (yaw), zAngle (roll)].
newOr[0] = orientation[2]; // Android index 2 -> JME x (pitch) slot
newOr[1] = orientation[1]; // Android index 1 -> JME y (yaw) slot
newOr[2] = orientation[0]; // Android index 0 -> JME z (roll) slot
cam.setRotation(new Quaternion().fromAngles(newOr));
The above code almost works. If I replace “newOr[2] = orientation[0];” with “newOr[2] = 0;” (to cancel out the yaw received from Android), then looking up/down and rolling left/right work great, although there’s no turning left/right. But with the code exactly as above, everything goes haywire. I’ve read loads of pages on Stack Overflow, but to no avail. Someone must have written this code before, but I’m stuck. Any help is much appreciated!
Just to be sure, since your post has this the wrong way round, in JME:
Yaw = orientation around y axis
Pitch = orientation around x axis
Roll = orientation around z axis
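For reference, that maps onto Quaternion.fromAngles like this (a minimal sketch; the angle values are just placeholders):

import com.jme3.math.FastMath;
import com.jme3.math.Quaternion;

// fromAngles() takes (xAngle, yAngle, zAngle), i.e. (pitch, yaw, roll).
float pitch = 0f;               // rotation around the x axis
float yaw   = FastMath.HALF_PI; // rotation around the y axis (a 90-degree turn)
float roll  = 0f;               // rotation around the z axis
Quaternion q = new Quaternion().fromAngles(pitch, yaw, roll);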
Thanks for the reply. I’ve tried changing the values but to no avail. The movement still goes crazy.
I’m surprised there aren’t more (any?) examples that I can find of this kind of thing, since I assume it’s something that all JME Cardboard apps need to do. Am I going about this the right way, or is there a simpler way?
I’m outputting the angles provided by the sensors, and they all seem normal; they look like the actual angles:-
float orientation[] = new float[3];
// R2 is the 3x3 rotation matrix (presumably from SensorManager.getRotationMatrix()).
// Observed order here: 0 = azimuth (yaw), 1 = roll, 2 = pitch,
// although the docs say 0 = azimuth, 1 = pitch, 2 = roll.
SensorManager.getOrientation(R2, orientation);
int a = (int) Math.toDegrees(orientation[0]);
int b = (int) Math.toDegrees(orientation[1]);
int c = (int) Math.toDegrees(orientation[2]);
Log.i(TAG, "onSensorChanged " + a + "," + b + "," + c);
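(For reference, R2 would come from something like the sketch below, give or take the remapping. This isn’t my exact code: gravity and geomagnetic stand in for the latest accelerometer and magnetometer readings, and the AXIS_X/AXIS_MINUS_Z remap is a guess that depends on how the phone is held.)

import android.hardware.SensorManager;

// Sketch: build a rotation matrix from the raw sensor vectors, optionally
// remap it for how the device is held, then extract the orientation angles.
float[] R = new float[9];
float[] remapped = new float[9];
float[] orientation = new float[3];
if (SensorManager.getRotationMatrix(R, null, gravity, geomagnetic)) {
    // AXIS_X / AXIS_MINUS_Z are placeholders; the right pair depends
    // on the device mounting (e.g. landscape in a Cardboard holder).
    SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_X,
            SensorManager.AXIS_MINUS_Z, remapped);
    SensorManager.getOrientation(remapped, orientation);
    // orientation[0] = azimuth (around -z), [1] = pitch (around x), [2] = roll (around y)
}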
The only thing I can think of is that the yaw (or “azimuth” as the Android API calls it) is measured relative to magnetic north, rather than to the direction the phone was facing when the app started, whereas my quaternion code is effectively treating yaw as relative to that starting direction. This is only a guess, though, and I don’t know how to correct for it, or if it’s even a correct assumption.
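If that guess is right, one crude way to test it would be to capture the first azimuth reading and measure later ones relative to it (a hypothetical sketch; initialAzimuth and relativeYaw are names I’ve made up):-

// Treat the first azimuth reading as "straight ahead" and measure
// later readings relative to it.
private float initialAzimuth;
private boolean hasReference = false;

private float relativeYaw(float[] orientation) {
    if (!hasReference) {
        initialAzimuth = orientation[0];
        hasReference = true;
    }
    return orientation[0] - initialAzimuth; // yaw relative to the start direction
}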
Here’s some values that I’ve just generated by holding the phone at various angles:-
Looking straight ahead: onSensorChanged -14,1,-85
45deg to the left: onSensorChanged -86,-1,-86
45deg to the right: onSensorChanged 21,2,-89
Flat on the table (screen pointing up, so phone facing down): onSensorChanged -56,0,0
I’ve never done anything with Android sensors… but your values make it look like they give you yaw, roll, pitch…
And in this case, if you want looking straight ahead with the phone to be looking straight ahead with your camera then you will need to adjust pitch by 90 degrees.
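Something like this might do it (a sketch, assuming sensorRotation is the quaternion you build from the sensor angles; the sign of the offset may need flipping):

import com.jme3.math.FastMath;
import com.jme3.math.Quaternion;

// Apply a fixed -90 degree pitch offset so "phone held upright"
// corresponds to "camera looking at the horizon".
Quaternion pitchOffset = new Quaternion().fromAngles(-FastMath.HALF_PI, 0f, 0f);
cam.setRotation(pitchOffset.mult(sensorRotation));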
Thanks for the pointers; however, AFAICS the above code just converts a couple of the angles to negative versions, which I don’t think will help, although I’m going to try it.
I’ve also investigated further up in the code you linked to, to see where it actually gets the orientation from in the first place, and that led me to the code quoted below, which looks complicated enough to make me think it might provide some help.
What integrations? I’ve started with GitHub - nordfalk/gvr-android-jme: JMonkeyEngine integration with Google VR SDK and am trying to extend it to move the camera when the phone moves, since AFAICS there’s nothing currently in that code to do that. Is there an easier way? I’m completely new to VR/Cardboard. Is there something out there that already does all this?
public void onNewFrame(HeadTransform headTransform) {
    logger.fine("onNewFrame ");
    // Build the camera matrix and apply it to the ModelView.
    if (app.getCamera() != null) {
        Camera cam = app.getCamera();
        // GVR fills headRotation as [x, y, z, w]...
        headTransform.getQuaternion(headRotation, 0);
        // ...but the components are passed to JME in reverse order here,
        // i.e. (w, z, y, x) into Quaternion.set(x, y, z, w).
        orientation.set(headRotation[3], headRotation[2], headRotation[1], headRotation[0]);
        cam.setRotation(orientation);
    }
}
Whether the orientation ends up correct or not in that block, I don’t know. I think the transform isn’t directly translatable (at least in my experience).
You should just have to do what you would in a normal application. The VR stuff is taken care of by gvr and the bridge.
Thanks. So, just to clarify, if I run sample-jme-startravel, I should be able to “look around” and the view should change as my phone’s orientation changes? I’ve tried that and unfortunately it doesn’t, but I’m guessing that’s because my phone doesn’t have a gyro, which I assume the GVR bridge uses to determine the phone’s orientation.
Yes. If I remember correctly, a gyro is required for cardboard/googlevr to work. I think you’ll find info on that in a general search, though. It’s been a while since I worked on that.
Yeah, I think I’ll have to admit defeat on this one, at least until I invest in a phone that has a gyro. I just can’t get it to work, and even the axes that I can get to work are very jerky and unsteady. I assume/hope that with a gyro it’s much smoother. (Although I did download a “VR Rollercoaster” app, and that seemed to work pretty well even without a gyro.)