From UV to Vertex on a Sphere

Hello,
Given that I have a bitmap used as a texture for a sphere, I would like to write a function that takes a point coordinate on that bitmap and a sphere geometry (U, V, Sphere) and returns a Vector3f of the sphere vertex closest to that bitmap U,V.

Does anyone know the math, or does jME3 have a ready-to-use function for calculating it?

Thank You,
Adi
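For what it's worth, if the sphere's texture mapping is a plain equirectangular one (u wrapping longitude, v spanning latitude — that's an assumption, jME3's Sphere mesh may lay out its UVs differently, so check against your actual mesh), the math is roughly: convert the UV to a direction on the unit sphere, then scan the vertices for the one most aligned with that direction. A minimal sketch in plain Java, with hypothetical names:

```java
public class UvToVertex {
    // Convert a (u, v) texture coordinate to a unit direction on the sphere,
    // assuming an equirectangular mapping: u in [0,1] wraps longitude,
    // v in [0,1] spans latitude from south pole to north pole.
    static float[] uvToUnitVector(float u, float v) {
        double lon = (u - 0.5) * 2.0 * Math.PI; // -pi .. +pi
        double lat = (v - 0.5) * Math.PI;       // -pi/2 .. +pi/2
        return new float[] {
            (float) (Math.cos(lat) * Math.sin(lon)),
            (float) Math.sin(lat),
            (float) (Math.cos(lat) * Math.cos(lon))
        };
    }

    // Find the vertex (x,y,z triples in 'positions') closest to the given UV.
    // For points on a sphere, maximizing the dot product with the target
    // direction is the same as minimizing the distance to it.
    static int closestVertexIndex(float[] positions, float u, float v) {
        float[] dir = uvToUnitVector(u, v);
        int best = 0;
        double bestDot = Double.NEGATIVE_INFINITY;
        for (int i = 0; i < positions.length; i += 3) {
            double dot = positions[i] * dir[0]
                       + positions[i + 1] * dir[1]
                       + positions[i + 2] * dir[2];
            if (dot > bestDot) {
                bestDot = dot;
                best = i / 3;
            }
        }
        return best;
    }
}
```

In jME3 you'd pull the positions with mesh.getFloatBuffer(VertexBuffer.Type.Position) and wrap the winning triple in a Vector3f; the axis convention above is illustrative only.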

Soooooo much easier if you generate your sphere mesh in a way that the points are predictable in the first place.

like a cube extruded as a sphere?

I’m using the Hello Material sample code:
https://wiki.jmonkeyengine.org/jme3/beginner/hello_material.html
and replaced the bitmap with an Earth bitmap, and now I want, at any given time, to “shoot” at or apply some effect to a specific point on “Earth”.

Then just fire a ray from the camera location in the direction of the camera, and if it hits anything it will give you a contact point.

https://wiki.jmonkeyengine.org/jme3/beginner/hello_picking.html

I need to maintain all kinds of “events” around the globe. Each event can be represented by some special effect. An event might happen in a geographic location that is currently hidden, but I still want to do something there (start a particle system, for example), so I don’t think finding a picking point is good for this use case.

Well, you said “shoot at”. I guess you mean it as in shooting a meteorite at London or whatever. Use a cube sphere; you can work out the coords easily then. It’s just 6 quads.

Let’s say I want to make it simple and track just the first vertex position. I wrote this code in the simpleUpdate() method:

FloatBuffer buff = sphereMesh.getFloatBuffer(VertexBuffer.Type.Position);
float x = buff.get(0); // first vertex, components 0..2
float y = buff.get(1);
float z = buff.get(2);

But even though I constantly rotate my sphereGeo, I always get the same values for x, y, and z. What am I missing?

A mesh is just a bunch of arrays. If you ask for element 0 of the array every time then you are always going to get element 0. There is no magic.
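To make that concrete: the position buffer holds model-space coordinates, so rotating the Geometry never changes them. The current world position is the spatial's transform applied to the stored value — in jME3, geometry.localToWorld(modelPos, null) does that for you. A minimal sketch of the underlying math (just the rotation part, for a globe spinning about the y-axis; plain Java, no jME types):

```java
public class WorldPos {
    // Rotate a model-space point about the y-axis by 'angle' radians.
    // The input point (what the vertex buffer stores) never changes;
    // only the result of applying the current rotation does.
    static float[] rotateY(float[] p, double angle) {
        double c = Math.cos(angle);
        double s = Math.sin(angle);
        return new float[] {
            (float) ( p[0] * c + p[2] * s),
            p[1],
            (float) (-p[0] * s + p[2] * c)
        };
    }
}
```

Each frame you'd apply the spatial's *current* rotation to the same stored vertex, which is exactly what localToWorld does for the full transform (rotation, translation, and scale).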

I’m still not clear why you need the vertex… what effect are you going to do that wouldn’t be ok with the UV?

I need to write a 3D simulation at work where things happen at specific geographic points (data flowing to those points, or some kind of alerts being raised).
I want to present those things (events) with an effect or particle-system node, but the globe (Earth) is always rotating, and I need to find, in real time, the current position of a geographic location (say, the Eiffel Tower) on my globe so I can correctly position the effect node.

I hope my problem is clear 🙂

Finding lat/lon on a globe and putting something there that rotates with the globe is really trivial.

For the first, lon is the y-axis angle and lat is the x-axis angle, as in new Quaternion().fromAngles(lat, lon, 0) with both angles in radians.

…then attach the children to the globe node that you are rotating and they will automatically rotate with it.

“But I’m rotating a sphere geometry…”

…make the sphere geometry a child of the globe node and rotate the globe node.

Do you want to do this in a shader? If so, I have some GLSL code that I wrote for damage splatting. I pass the collision point to the material, and the shader has a buffer of locations that it translates to texture coordinates so that it can overlay the damage texture at that point on the mesh. I can dig up the code. It translates, in the shader, the mesh position to the texture position regardless of the mesh complexity or UV mapping.


Though for planet earth, I’d expect the UV to be lon/lat respectively. Too easy to find 2D whole-earth maps projected in this way.
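Assuming the texture really is such an equirectangular whole-earth map (longitude spanning -180..180 left to right, latitude -90..90 bottom to top — check the orientation of your particular image), the lat/lon-to-UV conversion is a one-liner each way. A hedged sketch:

```java
public class GeoUv {
    // lat/lon in degrees -> (u, v) texture coordinate on an
    // equirectangular whole-earth map, u and v in [0, 1].
    static float[] latLonToUV(double latDeg, double lonDeg) {
        return new float[] {
            (float) (lonDeg / 360.0 + 0.5),
            (float) (latDeg / 180.0 + 0.5)
        };
    }
}
```

If your map's v axis runs top-to-bottom instead, flip it with 1 - v.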

I designed the shader so my user-designed space ships can show damage when they are pew-pewed by other players.

Yes, pew pew is a technical term I invented 😛


Yes, yours is the cooler general case.

Splatting things on planet earth is pretty trivial in comparison. Though I gather from OP’s posts that it’s real objects they are trying to locate on the earth… not just a splat.

Then I misunderstood the use case. If he just wants to find real objects, then the mesh approach is what he wants, not a shader one.

My contention is that he doesn’t even need that. (But to be fair, it’s a little unclear what OP wants… a picture would be worth a thousand words here.)

Node earth = new Node("earth");
earth.attachChild(sphereGeometry);

// Washington Monument lat/lon, converted to radians
float lat = 38.8895f * FastMath.DEG_TO_RAD;
float lon = -77.0353f * FastMath.DEG_TO_RAD;
Quaternion rot = new Quaternion().fromAngles(lat, lon, 0);

Spatial washingtonMonument = ...
earth.attachChild(washingtonMonument);
washingtonMonument.setLocalRotation(rot);
washingtonMonument.setLocalTranslation(rot.mult(Vector3f.UNIT_Z).mult(sphereRadius));

…and for fun…

public void update( float tpf ) {
    earth.rotate(0, tpf * FastMath.TWO_PI / (24f * 60f * 60f), 0);
}

…or something like that for real time earth rotation.


@pspeed

…and for fun…

public void update( float tpf ) {
   earth.rotate(0, tpf * FastMath.TWO_PI / (24f * 60f * 60f), 0);
}

…or something like that for real time earth rotation.

^That was pretty slick.

For what it’s worth, looking at it again, I think it’s rotating the earth backwards… but oh well. 🙂

Edit: nope.

Has your issue been resolved? If not, I’d like to pursue it with you.