Hello there, I'm currently playing around with Nodes and vectors, and I found something that I believe is a bug.

Let's imagine I have this structure of Nodes: rootNode -> centerOfUniverseNode -> planetNode -> planetCenterNode
for my little planet space shooter. centerOfUniverseNode is constantly rotating (the planet orbiting around the center), planetNode is translated further from the center at the start, and planetCenterNode's children are the planet and the planet's building geometries. With this structure I'm using Ray-to-Geometry (planet sphere) collision to determine where I clicked with the cursor. What I found out is that, for some reason, the getContactPoint() method returns the world coordinates of the hit on the planet, which is fine, but getContactNormal() does NOT follow this. The contact normal I'm getting starts at the hit position but ends in the middle of the world, unaffected by any of the planet sphere's parent node transformations, which results in something like this (the pink arrows are contact normals):

(for the pink arrows I'm using planetCenterNode, with local translation set to posHit and direction set to contactNormal, both transformed from world to local on planetCenterNode)
I'm currently correcting it by applying the parents' transformations myself, like this:

Is it a bug, or am I not getting something? Thanks in advance for any response. I'm using JME SDK 3.1.0-beta1-SNAPSHOT and Java 1.8.
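(Editor's note: one way this behavior can arise is if the intersection is computed against the sphere in its local/model space, so the resulting normal stays local unless the engine explicitly rotates it into world space. A minimal ray-sphere sketch in plain Java, which is NOT JME's actual implementation, just an illustration of where the normal comes from:)

```java
public class RaySphere {
    // Intersect a ray (origin o, unit direction d) with a sphere at center c, radius r.
    // Returns the surface normal at the first hit, or null on a miss.
    static double[] hitNormal(double[] o, double[] d, double[] c, double r) {
        double[] oc = { o[0] - c[0], o[1] - c[1], o[2] - c[2] };
        double b = oc[0] * d[0] + oc[1] * d[1] + oc[2] * d[2];
        double q = oc[0] * oc[0] + oc[1] * oc[1] + oc[2] * oc[2] - r * r;
        double disc = b * b - q;
        if (disc < 0) return null;
        double t = -b - Math.sqrt(disc);
        double[] p = { o[0] + t * d[0], o[1] + t * d[1], o[2] + t * d[2] };
        // Normal = (hit - center) / r : computed in the SAME space as the inputs,
        // so if the inputs were in local space, the normal is local too.
        return new double[] { (p[0] - c[0]) / r, (p[1] - c[1]) / r, (p[2] - c[2]) / r };
    }

    public static void main(String[] args) {
        double[] n = hitNormal(new double[]{-5, 0, 0}, new double[]{1, 0, 0},
                               new double[]{0, 0, 0}, 1.0);
        // Hit is on the -X side of the sphere, so the normal points along -X.
        System.out.printf("%.1f %.1f %.1f%n", n[0], n[1], n[2]);
    }
}
```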

EDIT

As pspeed suggested, the contactNormal is a direction, so I should not use worldToLocal on the normal the way I do on contactPoint. But when I use it as a local direction on planetCenterNode, which is rotated around the Y axis by about 90 degrees, the normals are not rotated; their direction stays as if the parent were not rotated at all.
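(Editor's sketch, plain Java rather than the poster's code: the 90° Y rotation and the translation below are made-up stand-ins for planetCenterNode's accumulated world transform. It shows the distinction being discussed: a point goes world-to-local by undoing translation AND rotation, while a direction only has the rotation undone.)

```java
public class PointVsDirection {
    // Rotate (x, y, z) around the Y axis by the given angle in radians.
    static double[] rotateY(double[] v, double angle) {
        double c = Math.cos(angle), s = Math.sin(angle);
        return new double[] { c * v[0] + s * v[2], v[1], -s * v[0] + c * v[2] };
    }

    public static void main(String[] args) {
        double angle = Math.PI / 2.0;           // parent rotated ~90 degrees around Y
        double[] translation = { 10, 0, 0 };    // parent translated along X

        double[] contactPoint  = { 11, 0, 0 };  // a POINT in world space
        double[] contactNormal = { 1, 0, 0 };   // a DIRECTION in world space

        // world -> local for a POINT: undo the translation, then undo the rotation
        double[] p = rotateY(new double[] {
                contactPoint[0] - translation[0],
                contactPoint[1] - translation[1],
                contactPoint[2] - translation[2] }, -angle);

        // world -> local for a DIRECTION: undo the rotation ONLY, never translate
        double[] n = rotateY(contactNormal, -angle);

        System.out.printf("local point:  %.1f %.1f %.1f%n", p[0], p[1], p[2]);
        System.out.printf("local normal: %.1f %.1f %.1f%n", n[0], n[1], n[2]);
    }
}
```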

A normal is just a direction. It has no location. Any “lack of transform of the parents” would be related to rotation, not location. So one of us is not understanding something about this question.

Edit: have you tried printing the value of the contact normal? A lot of the intersection paths don’t set a contact normal at all.

I look at vectors more like a line starting at (0,0,0) and ending at a specified point, which is what they are. But OK, let's say they don't have a starting point and are purely a direction, defined somehow. That would mean I can use the contactNormal without modifying it for the local node, right? The problem is that when I rotate planetCenterNode and click on the sphere geometry, the normal at that point is not rotated as well, so I have to apply the rotation manually myself. So you are right: I can use the local (0,0,0) as the starting point, but I still have to apply the rotation.

EDIT: I clearly have a value in that contact normal, since I'm using Ray-vs-Geometry collision. And I'm printing it as the pink arrows, as I said: the end of the arrow is always the normal value and the start is the contact point of the collision.

Basically, what I have here is: contactPoint yields the correct world coordinates of the hit, while contactNormal yields a direction that is not modified by the rotation of the clicked geometry or its parents.

Mathematically they are defined as such, but in 3D graphics they have special characteristics such as being “just direction. It has no location.”, meaning they are not translated. It is a useful distinction in many cases, so it is rather prevalent in 3D graphics.

OK, so how do you define such a direction if not the mathematical way (two points)? How can you compute its length (it's in JME3) if you have “just direction”? How can you add/subtract/… them if they are not defined by those two points? Also, points in 3D are basically vectors too: just a translation from the center (0,0,0) in some direction with some magnitude (the length of the vector).
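(Editor's note on the operational part of this question: the usual vector operations are computed from the three components alone, so no second point ever appears in the formulas. A minimal plain-Java sketch:)

```java
public class VectorOps {
    // Length needs only the components (x, y, z); no "starting point" is involved.
    static double length(double x, double y, double z) {
        return Math.sqrt(x * x + y * y + z * z);
    }

    // Addition is likewise component-wise.
    static double[] add(double[] a, double[] b) {
        return new double[] { a[0] + b[0], a[1] + b[1], a[2] + b[2] };
    }

    public static void main(String[] args) {
        System.out.println(length(3, 4, 0)); // 5.0
        double[] s = add(new double[]{1, 2, 3}, new double[]{4, 5, 6});
        System.out.println(s[0] + " " + s[1] + " " + s[2]); // 5.0 7.0 9.0
    }
}
```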

I'm just trying to understand that, since I only know the mathematical definition, which makes sense to me and has also worked for me in 3D in JME3. I actually think 3D is math in its purest form, so I'm really not sure about those “special characteristics in 3D”. Every vector or point is based on some center point (0,0,0); it just depends where that (0,0,0) actually is. Maybe we are just not understanding each other: vectors themselves cannot be translated, but you can change their starting point, their direction, and their magnitude.

Take a look at this; it is in 2D, but the concepts are the same.
If you have specific questions while reading it, feel free to ask: http://mathinsight.org/vector_introduction

Was that directed at me? Everything in that is what I already know. A vector's starting point is (0,0), and you define its direction based on that starting point. It also has a magnitude. I know how vectors work in math, Empire_Phoenix. But jmaasing suggested that you don't need two points to define a vector, and that in 3D it's different from the mathematical definition. That's why I was asking how you can do all these operations with them. I should have quoted, probably.

Yes, and the direction is defined by the starting point (0,0,0) and the ending point (your direction). Back to my problem: contactPoint has world coordinates, so I can use worldToLocal to get the correct local coordinates based on a different starting point (any point in 3D is also a vector, but the magnitude of the vector is an important factor). As for contactNormal, I thought it would work the same way, that I could just use worldToLocal to get the correct direction based on a different starting point. That was wrong of me, since I looked at it as a point. The problem is that if I use it purely as a direction on my local node, its parents' rotations are not applied, so it is based only on the clicked Geometry without taking the rotations of its parents into account.

For a “direction vector” the w-component is 0 when expressed in homogeneous coordinates. Meaning that when you multiply a “direction vector” by a matrix, the translation is multiplied by 0, i.e. ignored. Only the rotation remains. This is useful for normals and light calculations where you don't care about the position.
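(Editor's sketch of the w-component point above, with a hand-rolled 4x4 multiply in plain Java; no JME types, and the matrix values are made up for illustration:)

```java
public class HomogeneousDemo {
    // Multiply a 4x4 matrix (row-major double[4][4]) by a column vector (x, y, z, w).
    static double[] mult(double[][] m, double[] v) {
        double[] r = new double[4];
        for (int i = 0; i < 4; i++)
            for (int j = 0; j < 4; j++)
                r[i] += m[i][j] * v[j];
        return r;
    }

    public static void main(String[] args) {
        // A transform that translates by (5, 0, 0); identity rotation for clarity.
        double[][] m = {
            { 1, 0, 0, 5 },
            { 0, 1, 0, 0 },
            { 0, 0, 1, 0 },
            { 0, 0, 0, 1 },
        };
        double[] point     = { 1, 2, 3, 1 }; // w = 1: the translation applies
        double[] direction = { 1, 2, 3, 0 }; // w = 0: the translation column is multiplied by 0

        double[] p = mult(m, point);     // becomes (6, 2, 3, 1)
        double[] d = mult(m, direction); // stays   (1, 2, 3, 0)
        System.out.printf("point:     %.0f %.0f %.0f%n", p[0], p[1], p[2]);
        System.out.printf("direction: %.0f %.0f %.0f%n", d[0], d[1], d[2]);
    }
}
```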

I don't know exactly how matrix multiplication works for vectors, but I understand it is some combination of translate, rotate, and scale operations, right? If every normal and point is basically a vector starting at (0,0,0), where with a normal you only care about its direction (its magnitude would most likely be unit length), you don't translate either of them. You just rotate them and then multiply by a scalar to match the target point. Which means you cannot differentiate between a “direction vector” and a point in 3D, except for which information in them is relevant to you at the moment.

So I'm still not persuaded that a “direction vector” is some special 3D case different from a standard mathematical vector.

I'm not sure I got what the question was. Since vectors in JME are defined as free vectors by 3 values (for Vector3f), yeah, you can think of them as defined by two points, with the 1st point always being (0, 0, 0), if you like. The “specialness” of a free vector is therefore that, if you consider the definition by two points, the 1st point is always the origin, not something else.

Well, I think of them as standard math vectors, so I also consider the starting point. But hey, we are arguing about something other than my problem with contactNormal. Everyone is trying to persuade me that I don't know how vectors work while ignoring the question I asked.

Wait, but I thought you answered it yourself here:

so what I can assume from this is that you need to take the world rotation instead of the local one. Some code at this point would probably make your question clearer anyway.

Just check the images in the question. The first one is me (wrongly) using worldToLocal on contactNormal, where I assumed I could use it the same way as contactPoint. The second one is me “correcting” contactNormal to match world coordinates (I was still thinking about it as just a point somewhere, which it basically is), but then pspeed suggested it is only a direction. However, if I use it as it is, I get the wrong world rotation of contactNormal on my local node. Is there a “worldToLocal” equivalent for rotation only? Because my correction basically transformed the contactNormal endpoint to true world coordinates based on the clicked geometry.
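(Editor's note on the “worldToLocal for rotation only” question: in jME the usual idiom, assuming the standard Spatial/Quaternion API, is roughly `spatial.getWorldRotation().inverse().mult(worldNormal)`. A self-contained plain-Java sketch of the same idea, using a quaternion for a made-up 90° Y world rotation:)

```java
public class RotationOnlyWorldToLocal {
    // Rotate vector v by unit quaternion (w, x, y, z): v' = v + 2w(u x v) + 2u x (u x v)
    static double[] rotate(double w, double x, double y, double z, double[] v) {
        double[] u = { x, y, z };
        double[] c1 = cross(u, v);
        double[] c2 = cross(u, c1);
        return new double[] {
            v[0] + 2 * (w * c1[0] + c2[0]),
            v[1] + 2 * (w * c1[1] + c2[1]),
            v[2] + 2 * (w * c1[2] + c2[2]) };
    }

    static double[] cross(double[] a, double[] b) {
        return new double[] {
            a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0] };
    }

    public static void main(String[] args) {
        double half = Math.PI / 4;                   // half-angle of a 90-degree Y rotation
        double w = Math.cos(half), yq = Math.sin(half);

        double[] worldNormal = { 1, 0, 0 };
        // world -> local for a direction: apply the INVERSE world rotation (the
        // conjugate quaternion), and never the translation. The jME equivalent
        // would be (assumed API): spatial.getWorldRotation().inverse().mult(normal)
        double[] localNormal = rotate(w, 0, -yq, 0, worldNormal);
        System.out.printf("%.1f %.1f %.1f%n",
                localNormal[0], localNormal[1], localNormal[2]);
    }
}
```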

That's the key. Without seeing their relations, one can barely advise you on the magic line that will do what you want. We (even @pspeed, so far it seems) haven't gotten the context of the task yet.

I already have the magic line: I was able to transform the world normal to the correct local normal with my “correction”. But it seems a bit clunky and inconsistent to me. contactPoint already has correct world coordinates, on which you just need to use worldToLocal, but the same cannot be said about contactNormal. It does not care whether the geometry you clicked on was rotated somewhere in its parents, unlike contactPoint, and there is no “worldToLocal” equivalent for rotation only, if the normal is just a direction as many here have tried to convince me. Imagine it as if the contact point of the geometry you clicked on were in the middle of the world and you had to manually add all the transformations of the geometry's parents to it.

So it seems that this is not a bug, just an inconvenience and an inconsistency?

No, I understand the question now. My problem is that I don't have time to make my own test example, and I've looked through the code and I can't figure out where the contact normal is set at all in any of the collideWith() cases that would be relevant. So that makes me think it's not even using collideWith(), and we need to see an actual single-class test case that illustrates the issue.

Generally, I agree that the contact normal should be in world space. I can't find what code needs to be fixed, though… as it would generally have to be doing extra work to put the normal back into local space.