Determining line of sight offset

Given a direction vector (e.g. 0.5, 1.0, 0.0) and an up vector (e.g. 0.0, 0.0, 1.0).

What you need to do is project that position vector (well, actually the difference between that position and your current position) onto each of your axes.

I don't know how good your vector math is, so I'll assume not very good.

      Vector3f current = null, up = null, direction = null, left = null, point = null; // initialized as desired. Assume up, direction and left are orthonormal
      Vector3f target = point.subtract(current);
      // Since the axes are orthonormal, a dot product gives each projection
      float projDr = target.dot(direction);
      float projLt = target.dot(left);
      float projUp = target.dot(up);
      // The original point can be found as a combination of your vectors.
      Vector3f test = new Vector3f();
      test.addLocal( direction.mult( projDr ) );
      test.addLocal( up.mult( projUp ) );
      test.addLocal( left.mult( projLt ) );
      test.addLocal( current );
      // You can now compare test and point
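If you want to see it run outside the engine, here's a self-contained sketch of the same idea in plain Java (tiny array-based stand-ins for Vector3f's subtract/dot/mult, just an illustration, not jME code):

```java
import java.util.Arrays;

public class ProjectionDemo {
    // Minimal stand-ins for the vector ops used above
    static float[] sub(float[] a, float[] b) {
        return new float[]{a[0] - b[0], a[1] - b[1], a[2] - b[2]};
    }
    static float dot(float[] a, float[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }
    static float[] madd(float[] acc, float[] v, float s) { // acc + v * s
        return new float[]{acc[0] + v[0] * s, acc[1] + v[1] * s, acc[2] + v[2] * s};
    }

    public static void main(String[] args) {
        float[] current   = {1f, 2f, 3f};
        // An orthonormal camera frame (here simply the world axes)
        float[] direction = {0f, 1f, 0f};
        float[] up        = {0f, 0f, 1f};
        float[] left      = {1f, 0f, 0f};
        float[] point     = {4f, 7f, 5f};

        float[] target = sub(point, current);   // (3, 5, 2)
        float projDr = dot(target, direction);  // 5
        float projLt = dot(target, left);       // 3
        float projUp = dot(target, up);         // 2

        // Rebuild the point from the projections and compare
        float[] test = {0f, 0f, 0f};
        test = madd(test, direction, projDr);
        test = madd(test, up, projUp);
        test = madd(test, left, projLt);
        test = madd(test, current, 1f);
        System.out.println(Arrays.equals(test, point)); // prints "true"
    }
}
```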

I'd consider my vector math pretty poor, so I don't have a clue what's going on in that code.

I'm just throwing some ideas around below. The purpose of this code would be to determine whereabouts to place an arrow, around the four sides of the window, pointing to a particular target.

Therefore my thought was to identify the perpendicular offset of the target from the line of sight, which determines whether the arrow should show on the left, right, top or bottom panel. Then I was thinking the ratio of these offset values (X and Y) would determine the position along the panels. So if length(Y) > length(X), the arrow would show on the top or bottom panel, and the sign of Y then distinguishes top from bottom.
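In rough code, the panel-selection part of that idea might look like this (all names made up, just a sketch of the logic):

```java
public class ArrowPick {
    // projLt and projUp would be the target's offsets along the camera's
    // left and up axes (the X and Y offsets from the line of sight)
    static String pickPanel(float projLt, float projUp) {
        if (Math.abs(projUp) > Math.abs(projLt)) {
            // vertical offset dominates: top or bottom, by the sign of Y
            return projUp > 0 ? "top" : "bottom";
        }
        // horizontal offset dominates: left or right, by the sign of X
        return projLt > 0 ? "left" : "right";
    }

    public static void main(String[] args) {
        System.out.println(pickPanel(0.2f, -0.9f)); // prints "bottom"
    }
}
```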

The other approach, not suggested by myself, included the following steps:

  • First you determine which axis of the target vector is furthest away
  • That tells you whether the arrow will appear on the top, bottom, left, or right
  • So if its Y axis is negative (below you) and that is the largest "difference" from your position, you would determine that it needs to be on the bottom panel
  • From there you decide if it's x+ or x- to determine what direction on the bottom the arrow should appear
  • Then make this local to your position
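If I've understood those steps, the "what direction on the bottom" part might be sketched like this (hypothetical names; dx is the offset along the panel, dy the dominant offset that chose the panel):

```java
public class PanelOffset {
    // Sketch: once the dominant axis has picked, say, the bottom panel,
    // use the x offset relative to your position to place the arrow
    // along it. Returns a ratio in [-1, 1]: -1 = left end, +1 = right end.
    static float alongPanel(float dx, float dy) {
        float ratio = dx / Math.abs(dy);        // |dy| >= |dx| since dy dominated
        return Math.max(-1f, Math.min(1f, ratio)); // clamp, just in case
    }

    public static void main(String[] args) {
        System.out.println(alongPanel(1f, -2f)); // prints "0.5"
    }
}
```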


So, if I understand correctly, you have a target point, and your up, left and direction vectors that determine your camera. And you want to know the relative position of your target point in your view, right… (that is, where that point would be in your window, as a function of your 3 vectors)

This image shows what I'm talking about… The black axes are your camera reference vectors (assume the camera is at the origin, although my code does not assume it). You have a target at some coordinates (red dot). The gray lines are just extensions of your axes. The thin purple lines are the projections of your target onto your axes… they are orthogonal to the axes (but not aligned with them).

Now, the good thing about your axes being orthonormal is that a simple dot product tells you the coordinates in the new reference system. While target might have some coordinates in the traditional coordinate system, it also has some coordinates in the system defined by your axes.

The coordinates in that system are tuples (x, y, z) where the position of a point would be x·left + y·up + z·direction… and (x, y) would be the relative position of that point from the direction vector. What I compute in that code are the values of x, y and z that yield target = x·left + y·up + z·direction.
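A quick worked example of that, with the frame rotated 90° about Z so it isn't just the world axes (plain array stand-ins for the vector math, only to illustrate):

```java
public class BasisDemo {
    static float dot(float[] a, float[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    public static void main(String[] args) {
        // Orthonormal camera frame, rotated 90 degrees about Z
        float[] left      = {0f, 1f, 0f};
        float[] up        = {0f, 0f, 1f};
        float[] direction = {1f, 0f, 0f};
        float[] target    = {5f, 2f, 3f};   // world coordinates

        float x = dot(target, left);        // 2
        float y = dot(target, up);          // 3
        float z = dot(target, direction);   // 5
        // Indeed 2*left + 3*up + 5*direction = (5, 2, 3) = target,
        // so (x, y) = (2, 3) is the target's position relative to the view.
        System.out.println(x + " " + y + " " + z); // prints "2.0 3.0 5.0"
    }
}
```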

Hope this helps…

I'm sure it will. Thanks for explaining. I'll give this a shot later tonight and get back to you on the result  :slight_smile: