Ahh, so it is rotating about the world Z axis. That’s not what I’m trying to do.
I am still confused by this. There must be a good way of doing this; I just haven’t figured it out yet. So again, the application is a sonar sensor on the vehicle implemented with rays, and I would also like a visual representation of each ray’s path. I have a couple of ideas for how to do this:
- For each cycle of the update loop, create the “sonar sensor” object, which is made up of “Line” objects. I prefer my Line class because it lets me set the line color, and eventually I would like the color to change when the ray hits an object within the length of that line (effectively showing a sonar “hit”).
To create the sonar object, I would draw my lines along the world UNIT_X at the length they need to be, then rotate them upward (about the world UNIT_Z) by the sensor’s spread angle. I would then build a cone by repeating this rotated line, incrementing the rotation about UNIT_X until I have covered a full 360 degrees. Once the entire sonar sensor is drawn this way relative to the world UNIT_X and UNIT_Z, I would translate it to the front of the vehicle, calculate where to place it, and rotate it appropriately. This would have to happen every cycle of the update loop. Then, once every physicsTick, I could get the localToWorld of each line’s start and endpoint and cast a Ray along the path of each line. I think this would take far too long, especially if I have a lot of lines or more than one sensor on the vehicle.
- Create the sonar sensor as stated above just once in the SimpleInit, and attach it as a child of the vehicle. Once every physicsTick I would then somehow access the start and endpoints of each line (using localToWorld) and cast a Ray along the path of that line. This takes much less time than the first method, but I’m not sure I will be able to access the lines’ start and endpoints.
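For reference, the cone-building math I have in mind looks roughly like this. It’s a sketch in plain Java with a tiny vector helper standing in for jME’s Vector3f (the spread angle and segment count are made-up parameters); in the actual app I would build the same directions with Quaternion.fromAngleAxis about UNIT_Z and UNIT_X:

```java
import java.util.ArrayList;
import java.util.List;

public class SonarCone {
    // Minimal stand-in for com.jme3.math.Vector3f.
    record Vec3(double x, double y, double z) {
        double dot(Vec3 o) { return x * o.x + y * o.y + z * o.z; }
    }

    /**
     * Unit directions forming a cone around the world UNIT_X axis:
     * tilt UNIT_X by the half-angle (a rotation about UNIT_Z), then
     * spin that tilted vector about UNIT_X in equal increments.
     */
    static List<Vec3> coneDirections(double halfAngleDeg, int segments) {
        double theta = Math.toRadians(halfAngleDeg);
        List<Vec3> dirs = new ArrayList<>();
        for (int i = 0; i < segments; i++) {
            double phi = 2.0 * Math.PI * i / segments; // spin angle about UNIT_X
            dirs.add(new Vec3(
                Math.cos(theta),                  // forward component stays fixed
                Math.sin(theta) * Math.cos(phi),  // tilted component swept around X
                Math.sin(theta) * Math.sin(phi)));
        }
        return dirs;
    }

    public static void main(String[] args) {
        // Every direction makes the same angle (the half-angle) with UNIT_X.
        Vec3 unitX = new Vec3(1, 0, 0);
        for (Vec3 d : coneDirections(15.0, 12)) {
            System.out.printf("angle = %.1f deg%n",
                Math.toDegrees(Math.acos(d.dot(unitX))));
        }
    }
}
```

Scaling each direction by the sonar range gives a line endpoint, so the same list serves both the Line geometry and the Ray directions.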
For either method above, I will only cast rays if I am within range of something. This is accomplished by attaching a spherical ghostNode to the vehicle with a radius equal to the sonar’s maximum sensing distance. If the ghostNode overlaps any collidable, I proceed with method 1 or 2 above to check whether the object is in front of the sonar sensor(s).
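The gating test itself is just a sphere-overlap check; in the engine the ghost node reports overlaps for me, but the logic amounts to the following sketch (the obstacle list and range are placeholders, and I’m checking obstacle centers only rather than full collision shapes):

```java
public class SonarGate {
    // Minimal stand-in for com.jme3.math.Vector3f.
    record Vec3(double x, double y, double z) {
        Vec3 sub(Vec3 o) { return new Vec3(x - o.x, y - o.y, z - o.z); }
        double length() { return Math.sqrt(x * x + y * y + z * z); }
    }

    /**
     * True if any obstacle is within the sonar's maximum range of the vehicle;
     * only then is it worth casting the individual sonar rays.
     * (Stand-in for asking the spherical ghost node whether it overlaps anything.)
     */
    static boolean anythingInRange(Vec3 vehiclePos, double maxRange, Vec3[] obstacles) {
        for (Vec3 obstacle : obstacles) {
            if (obstacle.sub(vehiclePos).length() <= maxRange) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        Vec3 vehicle = new Vec3(0, 0, 0);
        Vec3[] world = { new Vec3(5, 0, 0), new Vec3(50, 0, 0) };
        System.out.println(anythingInRange(vehicle, 10.0, world)); // within range
    }
}
```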
Now, I don’t know how to get the localToWorld of the lines’ start and endpoints once they have been moved or rotated. If I just use the start and endpoints I created them with, that won’t take the vehicle’s translations and rotations into account, will it?
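For what it’s worth, this is the transform I believe localToWorld applies: keep the local start/end points the lines were built with, and each tick map them through the vehicle’s current world transform (worldPoint = rotation * (localPoint * scale) + translation). Sketched here with a yaw-only rotation to keep it short; a real implementation would use the full quaternion:

```java
public class LocalToWorld {
    // Minimal stand-in for com.jme3.math.Vector3f.
    record Vec3(double x, double y, double z) { }

    /**
     * worldPoint = worldRotation * (localPoint * worldScale) + worldTranslation.
     * Rotation here is yaw about the world Z axis only, for brevity.
     */
    static Vec3 localToWorld(Vec3 local, double yawRad, double scale, Vec3 translation) {
        double x = local.x() * scale, y = local.y() * scale, z = local.z() * scale;
        double cos = Math.cos(yawRad), sin = Math.sin(yawRad);
        return new Vec3(
            cos * x - sin * y + translation.x(),
            sin * x + cos * y + translation.y(),
            z + translation.z());
    }

    public static void main(String[] args) {
        // A line endpoint built along UNIT_X, after the vehicle yaws 90 degrees
        // and drives to (10, 5, 0): the endpoint follows the vehicle.
        Vec3 world = localToWorld(new Vec3(2, 0, 0), Math.PI / 2, 1.0,
            new Vec3(10, 5, 0));
        System.out.println(world); // roughly (10, 7, 0)
    }
}
```

The ray for each line is then just origin = worldStart, direction = worldEnd − worldStart (normalized), recomputed once per physics tick.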
Am I going about this all wrong? Is there a better method? Any suggestions on how to get the start and endpoints of the lines after attaching them to the vehicle?