Am I on the right track here?

I’m trying to give my user a complete sphere of vision using a Lambert azimuthal equal-area projection. I was told that I should really use hardware acceleration from a graphics engine, at which point I could use a vertex shader to create the projection I want. I’m new to 3D programming, and I wanted to know whether I’m on the right track using jMonkeyEngine and trying to learn its vertex shaders.
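For reference, the forward Lambert azimuthal equal-area mapping is easy to state: a view direction at angle θ off the camera’s forward axis lands at radius r = 2·sin(θ/2), so the whole sphere fits in a disc of radius 2. Here is a minimal, engine-agnostic sketch in plain Java (class and method names are my own, not jME API), assuming the camera looks down −Z:

```java
public class LambertAzimuthal {

    /**
     * Project a unit view direction (dx, dy, dz) onto the plane,
     * with the camera looking down -Z. The whole sphere maps into
     * a disc of radius 2. The point directly behind the camera,
     * (0, 0, 1), is the one degenerate point (scale blows up).
     */
    public static double[] project(double dx, double dy, double dz) {
        // r = 2*sin(theta/2) with cos(theta) = -dz collapses to:
        double scale = Math.sqrt(2.0 / (1.0 - dz));
        return new double[] { dx * scale, dy * scale };
    }

    public static void main(String[] args) {
        double[] forward = project(0, 0, -1); // straight ahead -> disc center
        double[] side    = project(1, 0, 0);  // 90 deg off-axis -> radius sqrt(2)
        System.out.printf("forward -> (%.3f, %.3f)%n", forward[0], forward[1]);
        System.out.printf("side    -> (%.3f, %.3f)%n", side[0], side[1]);
    }
}
```

The same few lines of math are what would eventually go into the shader; having them testable on the CPU first makes the GPU version much easier to debug.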

It sounds more like you want to change the world-view projection. You’re also going to need to make sure that clipping and culling of off-screen objects is disabled or appropriately configured, etc.



It’s certainly a pretty specialised use, so I think someone more knowledgeable about the internals than me is going to need to reply.

Actually, use two cameras with 180° view frustums (you can’t create a single 360° one, for technical and mathematical reasons).

Then render each of them to its own texture.



Then you can supply those textures to a shader that does the rest.
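That shader’s job is essentially the inverse mapping: for each output pixel, recover the sphere direction and decide which hemisphere texture to sample. A hedged sketch of that logic in plain Java (the real version would live in the fragment shader; class and method names are my own, and the camera is assumed to look down −Z):

```java
public class InverseLambert {

    /**
     * Invert the Lambert azimuthal equal-area projection: map a
     * disc coordinate (x, y), with x*x + y*y <= 4, back to a unit
     * direction. Radius 0 is straight ahead (0, 0, -1); radius 2
     * is the point directly behind the camera (0, 0, 1).
     */
    public static double[] unproject(double x, double y) {
        double r2 = x * x + y * y;              // squared disc radius
        double s  = Math.sqrt(1.0 - r2 / 4.0);
        return new double[] { x * s, y * s, r2 / 2.0 - 1.0 };
    }

    public static void main(String[] args) {
        // dz <= 0 -> the direction lies in the forward camera's
        // hemisphere, so sample its texture; dz > 0 -> the rear one's.
        double[] d = unproject(Math.sqrt(2), 0); // exactly 90 deg off-axis
        String tex = d[2] <= 0 ? "front" : "back";
        System.out.printf("(%.3f, %.3f, %.3f) -> %s texture%n",
                d[0], d[1], d[2], tex);
    }
}
```

Once you have the direction, the remaining step is turning it into UV coordinates for the chosen camera’s texture, which depends on how those two render passes are actually set up in the engine.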

Does anyone know of any open source projects similar to what I’m attempting here? I feel like working examples might be useful. At this point I’m at a bit of a loss as to where to start, although I’ll be looking into the suggestions from the posters above.