JME2 to JME3 transition with ARMonkeyKit

Hello!



Currently I have a project that requires working with augmented reality. I have found a very neat tool for that: ARMonkeyKit. However, it is programmed to work with jME2 and NyARToolkit 2.5, while my setup is jME3 with NyARToolkit 3, and I have no wish to work with legacy versions; I also need the latest OpenGL support. So now I am trying to port ARMonkeyKit to work with the newer library versions. Some parts of it are simple, but on many occasions I find that jME3 lacks classes and methods from jME2 and I cannot find any substitution. For example, Texture.ApplyMode is gone… and many, many more things like that. That, and my lack of skill with jME or the internals of OpenGL, makes the task very difficult for me.



Thus the questions: is there any transition guide for jME2 → jME3? Is it true that JOGL is excluded from jME3, since I cannot find libs for it? Has anyone else tried and succeeded in upgrading ARMonkeyKit? Are there any other suggestions or advice you may give me on augmented reality using jME?


Well, all those fixed-pipeline functions, like the texture apply mode, are done with shaders in jME3.

I suggest you look at how shaders work and either try to recreate that logic in a shader, or drop the part it did if it is not needed.
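To make that concrete, here is a minimal sketch (not ARMonkeyKit code) of how the shader-based replacement usually looks in jME3: instead of a fixed-function texture apply mode, you assign a Material built from a shader-backed material definition (Unshaded.j3md here) and pass the texture as a material parameter. The quad size and the texture path are placeholders for this example.

import com.jme3.app.SimpleApplication;
import com.jme3.material.Material;
import com.jme3.scene.Geometry;
import com.jme3.scene.shape.Quad;

public class UnshadedTextureExample extends SimpleApplication {

    public static void main(String[] args) {
        new UnshadedTextureExample().start();
    }

    @Override
    public void simpleInitApp() {
        // The fixed-function texture state is replaced by a Material backed
        // by a shader-based material definition (.j3md).
        Geometry quad = new Geometry("VideoQuad", new Quad(4, 3));
        Material mat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
        // "Textures/marker.png" is a placeholder asset path for this example.
        mat.setTexture("ColorMap", assetManager.loadTexture("Textures/marker.png"));
        quad.setMaterial(mat);
        rootNode.attachChild(quad);
    }
}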



Also, yes, jME3 officially uses only LWJGL. It could probably be made to work with JOGL, but I don't really see why that would make any sense. After all, it makes no difference to the end user what it uses, as long as it works.

@EmpirePhoenix:



Ok, I understand about the transition from the fixed pipeline to shaders. I am new to shaders as well, but I understand the concept.



Could you please tell me which direction I should go to express these lines in the new way?





// Bind the texture that holds the video frame…
GL11.glBindTexture(GL11.GL_TEXTURE_2D, texture.getTextureId());

// …and upload a videowidth × videoheight region of pixel data from the buffer into it.
GL11.glTexSubImage2D(GL11.GL_TEXTURE_2D, 0, 0, 0, videowidth, videoheight, pixelformat, dataformat, buffer);



These two lines bind the texture and then upload a region of pixel data from the buffer into it, interpreted according to pixelformat and dataformat… It uses OpenGL 1.1, a very old API… and I am completely lost with the new OpenGL and the jME3 pipeline… I have no idea how to achieve what texture binding and sub-imaging do in this new way… any useful links to relevant info would be appreciated.

As far as I see it, the first line binds a texture (this is done automagically by jME3; just load and use it).

→ The second works on a subimage (kinda like a substring, but for images) of the larger texture. This could be done either in the shader by manipulating the texture coordinates passed in,

or on the CPU by simply creating a new Texture (depending on how often this is done).
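If you go the CPU route, a rough sketch of wrapping the frame buffer in a jME3 texture could look like the code below; the helper names, the RGBA8 format and the frame size are assumptions for illustration, not ARMonkeyKit code. Re-setting the data on the existing Image is an alternative to creating a new Texture every frame when updates are frequent.

import java.nio.ByteBuffer;

import com.jme3.texture.Image;
import com.jme3.texture.Texture2D;
import com.jme3.util.BufferUtils;

public class VideoTextureSketch {

    // Wraps a raw RGBA frame in a jME3 Texture2D, standing in for the manual
    // glBindTexture/glTexSubImage2D pair.
    public static Texture2D createVideoTexture(ByteBuffer frameData,
                                               int videoWidth, int videoHeight) {
        Image image = new Image(Image.Format.RGBA8, videoWidth, videoHeight, frameData);
        return new Texture2D(image);
    }

    // For frequent updates, replacing the data of the existing Image avoids
    // recreating the texture object every frame.
    public static void updateVideoTexture(Texture2D texture, ByteBuffer frameData) {
        texture.getImage().setData(frameData);
    }

    public static void main(String[] args) {
        // Hypothetical frame size; a real capture source would fill this buffer.
        int videoWidth = 640, videoHeight = 480;
        ByteBuffer frame = BufferUtils.createByteBuffer(videoWidth * videoHeight * 4);
        Texture2D tex = createVideoTexture(frame, videoWidth, videoHeight);
        updateVideoTexture(tex, frame);
    }
}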

Ok, thank you! I think I've got it for now. I will keep implementing the necessary ARMonkeyKit functionality for jME3… right now I am at the very beginning, and I feel there is much to learn… I think I understand how to do it for now, so when I get to the testing point it will either work or not… I hope it can be done today. If I run into any more problems, I will ask here again.