I'm trying to embed JavaFX scenes inside jME3 textures, which would allow interactive HUDs to be placed directly on scene geometry. Display already works, but event handling is, of course, the complicated part.
My plan was to cast a ray from the camera/mouse location onto the geometry, get the collision results, convert them into JavaFX scene coordinates, and process the events from there. Unfortunately, it turns out that CollisionResult doesn't expose the texture coordinates of the hit! I also suspect that even the contact normal is not the true normal stored in the geometry, but one computed from the triangle's points.
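For context, the picking part I have working looks roughly like this (a sketch using jME3's standard picking API, inside a SimpleApplication where `cam`, `inputManager` and `rootNode` are available):

```java
// Cast a ray from the cursor into the scene and read back the collision
// data that IS available today: contact point, triangle index, geometry.
Vector2f click = inputManager.getCursorPosition();
Vector3f near = cam.getWorldCoordinates(click, 0f);
Vector3f far  = cam.getWorldCoordinates(click, 1f);
Ray ray = new Ray(near, far.subtract(near).normalizeLocal());

CollisionResults results = new CollisionResults();
rootNode.collideWith(ray, results);
if (results.size() > 0) {
    CollisionResult hit = results.getClosestCollision();
    Geometry geom  = hit.getGeometry();
    Vector3f point = hit.getContactPoint();   // world-space hit position
    int triIndex   = hit.getTriangleIndex();  // index of the hit triangle
    // ...but no UV here: texture coordinates have to be derived manually.
}
```

From the triangle index and contact point it should be possible to recover the UV, which is what the proposal below is about.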
I suppose I could compute it myself: retrieve the triangle's data by index from the mesh, use the contact point to derive interpolation weights from the vertex positions, and then apply the same weights to interpolate the texture coordinates. Do you think that having
public void getTriangleData(VertexBuffer.Type type, int index, Vector3f v1, Vector3f v2, Vector3f v3)
public void getTriangleData(VertexBuffer.Type type, int index, Vector2f v1, Vector2f v2, Vector2f v3)
in the Mesh class would be ok? If yes, I'll create a pull request for that against jme3.
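To illustrate the interpolation step I have in mind, here is an engine-independent sketch in plain Java (class and method names are just illustrative, not part of any proposed API): compute barycentric weights of the contact point within the hit triangle, then blend the per-vertex UVs with those weights.

```java
// Barycentric interpolation of texture coordinates at a contact point.
// Vectors are double[3]; UVs are double[2]. Purely illustrative helper.
public class UvInterp {

    // Barycentric weights (u, v, w) of point p in triangle (a, b, c),
    // using the standard dot-product formulation.
    static double[] barycentric(double[] p, double[] a, double[] b, double[] c) {
        double[] v0 = sub(b, a), v1 = sub(c, a), v2 = sub(p, a);
        double d00 = dot(v0, v0), d01 = dot(v0, v1), d11 = dot(v1, v1);
        double d20 = dot(v2, v0), d21 = dot(v2, v1);
        double denom = d00 * d11 - d01 * d01;
        double v = (d11 * d20 - d01 * d21) / denom;
        double w = (d00 * d21 - d01 * d20) / denom;
        return new double[] { 1.0 - v - w, v, w };
    }

    // Blend the three vertices' UVs with the barycentric weights.
    static double[] interpolateUv(double[] bary, double[] uvA, double[] uvB, double[] uvC) {
        return new double[] {
            bary[0] * uvA[0] + bary[1] * uvB[0] + bary[2] * uvC[0],
            bary[0] * uvA[1] + bary[1] * uvB[1] + bary[2] * uvC[1]
        };
    }

    static double[] sub(double[] x, double[] y) {
        return new double[] { x[0] - y[0], x[1] - y[1], x[2] - y[2] };
    }

    static double dot(double[] x, double[] y) {
        return x[0] * y[0] + x[1] * y[1] + x[2] * y[2];
    }
}
```

With the proposed getTriangleData, the triangle's positions and tex coords could be fetched by the collision's triangle index and fed straight into something like this; the resulting UV then maps directly to a JavaFX scene coordinate.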