[SOLVED] Rotating camera orientation around fixed axis

Here is a display of the problem:

I’ve stored a default camera translation and lookAt with respect to a player’s board in my game (the blue player). Other players (green) have their own boards that have been translated and rotated elsewhere in the environment. My goal is for each green player to have the same camera orientation with respect to their own board as the blue player has, so the original camera orientation has to be rotated by the same rotation as each green player’s board. Everything works wonderfully until the camera is directly over the lookAt position, and then the rotation no longer works. Any idea what is happening? Here is the code:

Vector3f defaultlocation = new Vector3f(0, distanceForward * cameraUnit, distanceUp * cameraUnit);
Vector3f defaultLookAt = new Vector3f(0, lookAtDistanceForward * cameraUnit, 0);

Vector3f boardLocation = clientGUI.getBoardFromPlayerId(playSessionModel.getPlayerId()).getWorldTranslation();
Quaternion boardRotation = clientGUI.getBoardFromPlayerId(playSessionModel.getPlayerId()).getWorldRotation();

// Set up fake nodes to rotate camera and lookAt with respect to board
Node parent = new Node();
Node cameraFake = new Node();
Node lookAtFake = new Node();

parent.attachChild(cameraFake);
parent.attachChild(lookAtFake);

// Set the locations to what was calculated, then move/rotate the
// parent onto this player's board
cameraFake.setLocalTranslation(defaultlocation);
lookAtFake.setLocalTranslation(defaultLookAt);
parent.setLocalTranslation(boardLocation);
parent.setLocalRotation(boardRotation);
parent.updateGeometricState();

cam.setLocation(cameraFake.getWorldTranslation());
cam.lookAt(lookAtFake.getWorldTranslation(), Vector3f.UNIT_Z);

Can you be more specific? It rotates incorrectly? You cannot rotate at all?

The above line works fine until you are looking parallel to the Z axis. If your X axis never changes then you can calculate a Z from look and X (cross product)… else you might have to get a little creative in picking an axis to calculate a proper up vector.
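For example (a minimal sketch of that cross-product idea in plain Java, with my own tiny helpers rather than jME's Vector3f — the class and method names are illustrative):

```java
public class UpFromCross {

    // Cross product a x b: perpendicular to both inputs
    static float[] cross(float[] a, float[] b) {
        return new float[] {
            a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]
        };
    }

    static float[] normalize(float[] v) {
        float len = (float) Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
        return new float[] { v[0] / len, v[1] / len, v[2] / len };
    }

    public static void main(String[] args) {
        // Looking straight down -Z: a fixed Z up would be ambiguous here,
        // but a stable X axis still gives a usable up vector.
        float[] look = { 0, 0, -1 };
        float[] x    = { 1, 0, 0 };
        float[] up   = normalize(cross(x, look)); // perpendicular to both
        System.out.println(java.util.Arrays.toString(up));
    }
}
```

This only stays well-defined as long as look and X never become parallel, which is the whole point of picking a stable second axis.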


Thanks for your response pspeed. I tried to show the gif of what was happening as I think it can explain better than words but essentially it has the wrong rotation when looking parallel to the Z axis as you mention. The calculation works for the green player (the right screen in the gif) until they are looking parallel to the Z axis and then their rotation wonks out and is 180 degrees rotated from where it should be.

In my search for what the problem could be I ran into a concept called gimbal lock. Does anyone know if that applies to this situation? I tried changing the worldUpVector for the camera and got a camera that was rotated differently, but at least it didn’t wonk out and produce different results when parallel to the Z axis.

Here is the code change:

cam.lookAt(lookAtFake.getWorldTranslation(), Vector3f.UNIT_Y);

Here is a view of it:

If cam.lookAt() is expecting a worldUpVector, why would that change or have to be different when looking parallel to the Z axis? Isn’t the up vector of the world always the same?

When you are looking down the up vector what should your rotation be? It’s completely ambiguous.

To define a coordinate space you need three axes. The look, up, and left… one can always be derived from the other two using a cross product… but you have to have at least two.

When you specify a fixed up vector, then the left vector is calculated… and then the real up vector is calculated. When you are looking down the specific up vector then you can’t do either of those things because it’s entirely ambiguous which direction you should be facing.

As a cheat, if you never rotate very much from one frame to the next, then you can use the last local up vector as your next up vector. So if you are trying to use Z as your up:
Vector3f up = Vector3f.UNIT_Z;

cam.lookAt(point, up);
up = cam.getUp();
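A self-contained sketch of that cheat (plain Java with my own helpers, not jME types): the look direction pitches through straight-down-Z, and because the up vector is carried over from the previous frame, the cross products never degenerate.

```java
public class FeedbackUp {

    static float[] cross(float[] a, float[] b) {
        return new float[] {
            a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]
        };
    }

    static float dot(float[] a, float[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    static float[] normalize(float[] v) {
        float len = (float) Math.sqrt(dot(v, v));
        return new float[] { v[0] / len, v[1] / len, v[2] / len };
    }

    // Pitch the look direction from 45 to 135 degrees, passing straight
    // through -Z, reusing last frame's derived up each step.
    // Returns the final up vector.
    static float[] sweep() {
        float[] up = { 0, 0, 1 }; // start with world Z as up
        for (int deg = 45; deg <= 135; deg++) {
            double t = Math.toRadians(deg);
            float[] look = { 0, (float) Math.cos(t), (float) -Math.sin(t) };
            // up trails look by roughly 90 degrees, so this cross is never zero
            float[] left = normalize(cross(up, look));
            up = normalize(cross(look, left)); // re-derive a proper up
        }
        return up;
    }

    public static void main(String[] args) {
        System.out.println(java.util.Arrays.toString(sweep()));
    }
}
```

Note the carried-over up is only an approximation each frame (lookAt() re-orthogonalizes it anyway), and the cheat assumes the camera never pitches a full 90 degrees in a single frame.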


Thank you for that explanation; I’m trying to understand it as best as I can. I also started to think that the wonky rotation had to do with not being able to disambiguate which direction to face. I think I have a misunderstanding about what the worldUpVector that cam.lookAt() expects means. My understanding is that it is what you consider up to be in your world, which would remain constant independent of the rotations of individual objects. Can you or someone give me your best description of what worldUpVector means for cam.lookAt()?

Let’s say you want to look at a point on the wall. Do you want to look at it sideways? Upside down? Right side up? The up vector controls this because it points to where the top of your head should be.

Now, lookAt lets you use a rough approximation of up. It’s always local up… but it can be off and will still work because lookAt() doesn’t trust it to be a proper up vector.

For any rotation, to know which direction the top of your head needs to be you need at least two vectors. One pointing forward (your nose) and the other pointing 90 degrees from that… either your left ear or the top of your head. Something to help disambiguate all of the other rotations possible looking in that same nose direction.

lookAt() takes the look direction and the up direction you supply and effectively uses that to calculate the left ear. It can’t trust the up supplied as it might not be a proper relative up vector but given two vectors it is easy to determine one that is 90 degrees from both. Once it has the look direction (the nose) and the left vector (the left ear) it knows which way to orient your head.

If the look vector and the up vector are the same, it has no idea which way to orient the head.

However, every time you successfully set the look vector, you now have a different up and left vector relative to your current orientation. You can use this up vector in your next look at because it will be pretty close to a relative up vector… and will obviously be very different than your look vector (unless you rotate 90 degrees pitch in a single frame).
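Conceptually, that construction looks like this (a sketch of the idea in plain Java, not jME's actual lookAt() source; sign conventions for "left" vary by handedness). You can see exactly where it falls apart when look and up are parallel:

```java
public class LookBasis {

    static float[] cross(float[] a, float[] b) {
        return new float[] {
            a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]
        };
    }

    static float length(float[] v) {
        return (float) Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
    }

    static float[] normalize(float[] v) {
        float len = length(v);
        return new float[] { v[0] / len, v[1] / len, v[2] / len };
    }

    // Build a left/up/look basis from a look direction (the nose) and a
    // rough up hint (roughly where the top of the head should be).
    static float[][] basis(float[] look, float[] upHint) {
        float[] left = cross(upHint, look); // the "left ear"
        if (length(left) < 1e-6f) {
            // look and upHint are (anti)parallel: every roll angle is
            // equally valid, so the orientation is ambiguous
            throw new IllegalStateException("ambiguous orientation");
        }
        left = normalize(left);
        float[] up = normalize(cross(look, left)); // the real up, 90 deg from both
        return new float[][] { left, up, normalize(look) };
    }

    public static void main(String[] args) {
        float[][] b = basis(new float[]{0, 0, -1}, new float[]{0, 1, 0});
        System.out.println(java.util.Arrays.deepToString(b));
    }
}
```

The rough up hint just has to be non-parallel to the look direction; everything after the first cross product is exact.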


I really appreciate you taking the time to write that I understand that and your first post a lot better now. After looking at thread after thread after thread on camera rotation I found this:

Other threads seem to confirm that if you are staring straight down the world up vector at your lookAt point, lookAt() produces undesirable results. The easiest fix I found was adding a small offset (.0001f) to your location so you aren’t staring straight down the world up vector:

Vector3f defaultlocation = new Vector3f(0, (distanceForward - .0001f) * cameraUnit, distanceUp * cameraUnit);

Seems hacky, but I don’t really mind my camera being slightly off position. Here is the proof:

Well, my solution is super simple and actually right… but do what works I guess.


I did try your solution first but got these results:

I would prefer the non-hacky solution. I bet your way would work if the camera didn’t start out looking down the up vector, but because it starts wonky it maintains that throughout.

Edit: I just tested the camera starting at the farthest distance you see in the demonstrations and your method does work. But if it starts by looking down the up vector it maintains the wonkiness.

Then you didn’t do it right. Because there is no more “looking down the up vector” because the up vector is constantly changing.
