I went through the tutorial and, although useful, it didn’t help me work out what my issue was; I thought there might be a method to rotate a vector by a Quat, but nothing is looking obvious.
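For context, my understanding is that rotating a vector by a quaternion is the usual q * v * q⁻¹ sandwich, which is the operation I went looking for in the API. Here's a minimal plain-Java sketch of that maths (throwaway names of my own, not engine classes), in case my expectation itself is off:

[java]
public class QuatRotate {
    // Rotate vector v by the unit quaternion (x, y, z, w), using the
    // expanded form v' = v + 2w*(q_xyz × v) + 2*(q_xyz × (q_xyz × v)).
    static float[] rotate(float x, float y, float z, float w, float[] v) {
        // t = 2 * cross(q.xyz, v)
        float tx = 2f * (y * v[2] - z * v[1]);
        float ty = 2f * (z * v[0] - x * v[2]);
        float tz = 2f * (x * v[1] - y * v[0]);
        // v' = v + w * t + cross(q.xyz, t)
        return new float[] {
            v[0] + w * tx + (y * tz - z * ty),
            v[1] + w * ty + (z * tx - x * tz),
            v[2] + w * tz + (x * ty - y * tx)
        };
    }

    public static void main(String[] args) {
        // sanity check: 90 degrees about +Z should take (1,0,0) to roughly (0,1,0)
        float s = (float) Math.sin(Math.PI / 4);
        float c = (float) Math.cos(Math.PI / 4);
        float[] r = rotate(0f, 0f, s, c, new float[] { 1f, 0f, 0f });
        System.out.println(r[0] + " " + r[1] + " " + r[2]);
    }
}
[/java]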
To explain a bit more then…
When the application starts, a front vector (0.79018366, -0.029632535, -0.6121444) and an up vector (-0.61171776, 0.022939915, -0.79074347) are generated and used to set a Quat via the .lookAt(front, up) method.
All spatials added to the scene, including the main Camera, have their local rotation set to this Quat, so as the camera is moved using the device sensors the world appears flat (perpendicular to the camera). The terrain quads are translated to their correct locations and also rotated by this Quaternion, so the world still lines up with the camera.
The difficulty is creating new spatials that auto-magically sit on top of the terrain, because when using:
[java]
Vector3f worldOrigin = new Vector3f(128f, -4f, -43f);
CollisionResults results = new CollisionResults();
// cast a ray straight down from the world origin
Vector3f direction = new Vector3f(0f, 1f, 0f);
terrain.collideWith(new Ray(worldOrigin, direction.mult(-1f)), results);
float dist = 0f;
if (results.size() > 0) {
    // react to the closest collision
    CollisionResult closest = results.getClosestCollision();
    dist = closest.getDistance();
    System.out.println("Distance? " + dist);
}
[/java]
either no collision results are produced or the distance is wrong (i.e. spatials end up above or below the terrain).
I appreciate it would make a lot of sense not to rotate everything by the original front and up vectors, but unfortunately, due to the real-world coordinate system I’m using, this isn’t possible.
I anticipated that using the original up vector as the Ray’s direction would produce the correct result, but it doesn’t, hence this post.
Thanks.