Object points at the mouse

I’ve been messing with this quite a bit and I just can’t get it figured out. Basically, what I want is to have an object rotate to face the position of the mouse. The camera is set up at a 45-degree isometric view, and movement is along the x and z coordinates. My problem is that the mouse seems to have a range of movement of only about 1 unit. I’m using the code from the mouse picking example (and the mouse picking works perfectly), which makes it a bit confusing for me.

MouseInput thisMouse = am.getMouseInput();

// Get the screen position that the mouse is pointing to
Vector2f screenPos = new Vector2f(
        thisMouse.getXAbsolute(), thisMouse.getYAbsolute());
// Get the world location of that x,y value
Vector3f worldCoords = display.getWorldCoordinates(screenPos, 0);
Vector3f mouseFollow = worldCoords.subtract(-15f, 15f, 15f);

The mouseFollow variable holds the 3D location of the mouse. I’ve moved my camera to (-15, 15, 15) and it points at an object at (0, 0, 0); the camera always points at this object from that location. I attached a node to the scene graph with a box in it so I could see where things were, and moving the mouse from the upper-left corner to the lower-right corner only moves the box about 1 unit (it basically stays in the center of the screen). I’m pretty sure I’m doing something stupid in this little chunk of code (like misunderstanding what the worldCoords vector actually represents). I’m hoping to make this my last newbie-type question; I hate to ask it, but I’ve been stuck on it for quite a few hours now.
Basically my problem is the mouse seems to only have a range of movement of about 1.

What do you mean? That the mouse only seems to move along the x or y axis about one unit per frame? If so, that’s because the picking example uses a relative mouse: it gives the mouse position for a frame, then sets it back to the center of the screen.

The picking example uses an AbsoluteMouse. What I mean is that if I click in the upper-left corner of the window and then in the lower-right, the resulting vectors would be something like (0, 0, 0) for the upper left and (0.5, -1, -1) for the lower right. I’m aiming for a range more like 30x30, or probably more; I’m not sure how wide an area the screen actually covers, I’m just sure something isn’t right with the world coordinates. I think what’s going on is that I’m only allowed to move about one logical unit because that’s all the screen represents right at the camera, but changing the z value passed to getWorldCoordinates had the opposite of the effect I expected: setting it to 15 or -15 instead of 0 reduced the movement range.

I found the posts that led to the inclusion of the ‘z’ parameter for converting mouse position to world position. Apparently it doesn’t do what I expected: feeding it 0 gives you a point on the near clipping plane, and 1 gives you a point on the far clipping plane. Either way, I finally figured out how to get my mouse coords the way I want… it’s ugly and probably creates a lot of extra objects per frame (not to mention I probably have twice the necessary math).
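One common way to turn those two unprojected points (z = 0 on the near plane, z = 1 on the far plane) into a ground position is to intersect the resulting ray with the y = 0 plane. Here's a minimal sketch of just that ray-plane math in plain Java; the point values are made up, and in jME the two endpoints would come from getWorldCoordinates with z = 0 and z = 1 rather than being hard-coded:

```java
public class MouseRayDemo {
    /**
     * Intersect the ray from `near` to `far` with the ground plane y = 0.
     * Points are {x, y, z}; returns null if the ray is parallel to the ground.
     */
    static float[] groundPoint(float[] near, float[] far) {
        float dy = far[1] - near[1];
        if (Math.abs(dy) < 1e-6f) return null;   // ray never crosses y = 0
        float t = -near[1] / dy;                 // ray parameter where y hits 0
        return new float[] {
            near[0] + t * (far[0] - near[0]),
            0f,
            near[2] + t * (far[2] - near[2])
        };
    }

    public static void main(String[] args) {
        // made-up unprojected points: near above the ground, far below it
        float[] near = { -15f, 15f, 15f };
        float[] far  = {  15f, -15f, -15f };
        float[] hit = groundPoint(near, far);
        System.out.println(hit[0] + ", " + hit[1] + ", " + hit[2]); // 0.0, 0.0, 0.0
    }
}
```

This gives you a point whose x/z range matches the visible ground rather than the near plane, which would explain the "range of about 1" seen right at the camera.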

The bad news is I can’t figure out how to get the angle between two vectors correctly. As I understand it, vector1.normalize().dot(vector2.normalize()) should give me the angle between the vectors? I admit to being pretty weak at math (especially when it gets all 3D on me). The formula doesn’t seem to work as expected, or hopefully I’m just doing it wrong.

Right, the dot product equals this:

dotProduct = lengthOfVector1 * lengthOfVector2 * cos(theta)

Now you want the angle right?

float length1 = vector1.length();
float length2 = vector2.length();
float dot = vector1.dot(vector2);
float cosTheta = dot / (length1 * length2);
float theta = FastMath.acos(cosTheta);

And there’s your angle.

Enjoy :)


Somehow that didn’t work either. I’m going to take a few easy, known angles and see where it’s going wrong… and spend some more time hammering math back into my brain.

Works fine here:

import com.jme.math.FastMath;
import com.jme.math.Vector3f;

public class TestDot {

   /**
    * @param args
    */
   public static void main(String[] args) {
      Vector3f vec1 = new Vector3f(0, 10, 0);
      Vector3f vec2 = new Vector3f(13, 0, 0);
      float dot = vec1.dot(vec2);
      float len1 = vec1.length();
      float len2 = vec2.length();
      float cosTheta = dot / (len1 * len2);
      float theta = FastMath.acos(cosTheta);
      System.out.println(theta * FastMath.RAD_TO_DEG);
   }
}
That gives an output of 90 degrees, which is what I would expect from two vectors pointing up and sideways…


When one of the vectors is a zero vector, the angle is always the same (unless both are zero vectors). Is this normal?

[X=0.0, Y=0.0, Z=0.0]

[X=-2.965703, Y=0.0, Z=9.496717]

3.1415927 (180 degrees)

When testing with known numbers like (1, 0, 0) and (0, 1, 0) I got the results I expected. I’m going to spend some time hand-verifying the numbers to see where the problem is (as always, I’m sure I’m doing something stupid…).

Yep, it’s quite ‘normal’ that you always get the same result if one vector is zero. What would you expect the angle between a point and a line to be?!
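The math backs that up: with a zero vector the denominator length1 * length2 is 0, so cosTheta is 0/0 and the angle is simply undefined. A small plain-Java sketch of the same acos formula, with a guard for zero-length inputs and a clamp so float rounding can’t push the cosine outside acos’s domain (plain arrays stand in for Vector3f here):

```java
public class AngleDemo {
    /** Angle in radians between two 3D vectors {x, y, z}. */
    static float angleBetween(float[] a, float[] b) {
        float dot  = a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
        float lenA = (float) Math.sqrt(a[0]*a[0] + a[1]*a[1] + a[2]*a[2]);
        float lenB = (float) Math.sqrt(b[0]*b[0] + b[1]*b[1] + b[2]*b[2]);
        if (lenA == 0f || lenB == 0f) {
            // 0/0 below would be NaN; the angle simply isn't defined
            throw new IllegalArgumentException("angle undefined for a zero vector");
        }
        // clamp to [-1, 1] so rounding error can't make acos return NaN
        float cosTheta = Math.max(-1f, Math.min(1f, dot / (lenA * lenB)));
        return (float) Math.acos(cosTheta);
    }

    public static void main(String[] args) {
        float[] up   = { 0f, 10f, 0f };
        float[] side = { 13f, 0f, 0f };
        // perpendicular vectors -> approximately 90 degrees
        System.out.println(Math.toDegrees(angleBetween(up, side)));
    }
}
```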


I went with the 2D method of using atan to determine the angle, since I was already projecting the second vector to be level with the first. If I go off and create terrain that isn’t level I might have to revisit this issue, but for now I’m satisfied with the idea of making something that looks like a game out of all this.
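For the flat-terrain case, that 2D approach boils down to a single arctangent on the x/z deltas; atan2 is the usual choice since it handles all four quadrants. A minimal plain-Java sketch (the positions are made-up values, and in jME you would feed the resulting yaw into a rotation about the y axis):

```java
public class FaceMouseDemo {
    /**
     * Yaw in radians that turns an object at (ox, oz) to face
     * a target point at (tx, tz) on the ground plane.
     */
    static float yawToTarget(float ox, float oz, float tx, float tz) {
        // atan2 handles every quadrant, unlike plain atan
        return (float) Math.atan2(tx - ox, tz - oz);
    }

    public static void main(String[] args) {
        // object at the origin, mouse point straight out along +x
        // -> approximately 90 degrees of yaw
        System.out.println(Math.toDegrees(yawToTarget(0f, 0f, 10f, 0f)));
    }
}
```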