Reloading a texture 30-60 times per second

Hi, I'll start by mentioning that this is my first post here, and right off the bat I need a bit of help.



The problem can be summarized as follows:

I want to capture a video feed from my camera, put it straight into a texture, and still run at 30+ fps.



The purpose can be summarized as follows:

I want to use jMonkeyEngine's features (like COLLADA model loading, etc.).



I have borrowed heavily from this project:

Shiva's Cafe: NyArtoolkit on jMonkeyEngine



However, it is rewritten to use OpenCV with help from this guy:

いー ドット ぷりんとすたっくとれーす: NyARToolkit-2.3.1 for Java on Snow Leopard (using OpenCV/JNI)



I have also had a look at http://www.jmonkeyengine.com/forum/index.php?topic=2908.0 (without getting any wiser).



I've narrowed the problem (bottleneck) down to this piece of code:


bg = TextureManager.loadTexture(image,
      Texture.MinificationFilter.BilinearNearestMipMap,
      Texture.MagnificationFilter.Bilinear, true);
bgts.setTexture(bg);


Without this texture loading I get 200 fps; with it I get about 6 fps.


About image: it is set from a buffer (RGB24):


RGBrasterFromCamera.setBuffer( i_buffer );
image = RGBrasterFromCamera.createImage();



I would like for all of this to happen on the GPU, and was basically wondering if there is a way to shoot the buffer right into a texture, without wandering into system memory. Any help would be appreciated.

Yes I am.

Having a bit of trouble getting the buffer into the com.jme.image.Image type, which seems like the type a setTexture call would like as input.

You could update the texture data directly using OpenGL, maybe…



public void updateTextureData(int x, int y, int w, int h, FloatBuffer newTextureData, Texture theTexture) {
   IntBuffer idBuffer = BufferUtils.createIntBuffer(16);

   // get the current texture binding
   GL11.glGetInteger(GL11.GL_TEXTURE_BINDING_2D, idBuffer);
   int oldTex = idBuffer.get();

   // bind to our texture
   GL11.glBindTexture(GL11.GL_TEXTURE_2D, theTexture.getTextureId());
   GL11.glPixelStorei(GL11.GL_UNPACK_ALIGNMENT, 1);

   newTextureData.rewind();

   // set the texture data
   GL11.glTexSubImage2D(GL11.GL_TEXTURE_2D, 0, x, y, w, h, GL11.GL_RGBA, GL11.GL_FLOAT, newTextureData);

   try {
      Util.checkGLError();
   } catch (OpenGLException e) {
      e.printStackTrace();
   }

   // rebind to old texture
   GL11.glBindTexture(GL11.GL_TEXTURE_2D, oldTex);
}


I'm almost there now (I think).

Having a bit of trouble with the buffer, I think.





where the buffer is of this kind:


public GLQtNyARRaster_RGB(int i_width, int i_height)
{
    super(i_width, i_height);
    this._gl_flag = GL.GL_RGB;
    this._gl_buf = new byte[this._size.w * this._size.h * 3];
    this.i_height = i_height;
}

public byte[] getGLRgbArray()
{
    return this._ref_buf;
}



and gets buffered like this:

ByteBuffer bbuf = ByteBuffer.allocate(921600*3);
bbuf.put(i_rasterAlsoKnownasnewtexturedata.getGLRgbArray()) ;



GL11.glTexSubImage2D(GL11.GL_TEXTURE_2D, 0, x, y, w, h, GL11.GL_RGB, GL11.GL_UNSIGNED_BYTE, bbuf);



Anyway, the error shows as follows. Any thoughts, anyone?
org.lwjgl.opengl.OpenGLException: Invalid value (1281)
at org.lwjgl.opengl.Util.checkGLError(Util.java:54)
at jogl.jp.nyatla.nyartoolkit.jogl.sample.JMETest1.updateTextureData(JMETest1.java:269)
at jogl.jp.nyatla.nyartoolkit.jogl.sample.JMETest1.simpleUpdate(JMETest1.java:169)
at com.jme.app.SimpleGame.update(SimpleGame.java:60)
at com.jme.app.BaseGame.start(BaseGame.java:84)
at jogl.jp.nyatla.nyartoolkit.jogl.sample.JMETest1.main(JMETest1.java:77)

Calling the buffer like this:

   ByteBuffer buf = ByteBuffer.wrap(i_rasterAlsoKnownasnewtexturedata.getGLRgbArray());
      GL11.glTexSubImage2D(GL11.GL_TEXTURE_2D, 0, x, y, w, h, i_rasterAlsoKnownasnewtexturedata.getGLPixelFlag(), GL11.GL_UNSIGNED_BYTE, buf);



Used successfully here:

      ByteBuffer buf = ByteBuffer.wrap(i_raster.getGLRgbArray());
      gl_.glDrawPixels(rsize.w,rsize.h, i_raster.getGLPixelFlag(), GL.GL_UNSIGNED_BYTE, buf);



Is there any significant difference between drawpixels and texture2D?

It is preferable to use textures, as these are the native way that images get rendered with hardware acceleration. I would suggest directly copying the RGB data into a native ByteBuffer, then updating the texture using the method Renderer.updateTexture() (or whatever it's called).


ByteBuffer buf = ByteBuffer.wrap(i_raster.getGLRgbArray());
      gl_.glDrawPixels(rsize.w,rsize.h, i_raster.getGLPixelFlag(), GL.GL_UNSIGNED_BYTE, buf);


This code generates garbage. When you pass a non-native, array wrapped buffer to glDrawPixels it must be copied to an intermediate native buffer. You should create a native buffer and re-use it like so:


// on init:
ByteBuffer buf = BufferUtils.createByteBuffer(width * height * 3);

// every frame:
byte[] data = i_raster.getGLRgbArray();
buf.clear();
buf.put(data);
buf.flip();
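The distinction being drawn here can be checked in plain Java, with no GL context: a wrapped array is a heap buffer, while allocating the way BufferUtils is assumed to (via allocateDirect) gives a direct, native-order buffer that LWJGL can hand to the driver without an intermediate copy. A minimal sketch, where createNativeBuffer is an illustrative stand-in, not the actual BufferUtils source:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class BufferCheck {
    // Stand-in for BufferUtils.createByteBuffer: a direct, native-order
    // buffer the driver can read without an extra copy (assumption).
    static ByteBuffer createNativeBuffer(int capacity) {
        return ByteBuffer.allocateDirect(capacity).order(ByteOrder.nativeOrder());
    }

    public static void main(String[] args) {
        byte[] frame = new byte[640 * 480 * 3]; // one RGB24 frame

        ByteBuffer wrapped = ByteBuffer.wrap(frame);           // heap buffer
        ByteBuffer direct = createNativeBuffer(frame.length);  // direct buffer

        // Refill the same direct buffer every frame instead of re-wrapping:
        direct.clear();
        direct.put(frame);
        direct.flip();

        System.out.println("wrapped is direct? " + wrapped.isDirect()); // false
        System.out.println("native is direct?  " + direct.isDirect());  // true
        System.out.println("bytes ready: " + direct.remaining());       // 921600
    }
}
```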


OK, I got it working; it runs at 44 fps with vsync (up from 4-6 fps). Thanks for the buffer tip, Momoko_Fan.



I'll post how I got it working as a reference for others interested in shooting video to a texture.



Firstly:

Init the buffer and the image (not awt.BufferedImage) in simpleInit:


buf = BufferUtils.createByteBuffer(640 * 480 * 3);
hypaImg = new Image(Format.RGB16, 640, 480, buf);



Init the bg texture with a dummy 1x1 RGB image at startup (any suggestions on how this could be done otherwise would be appreciated):

BufferedImage bbbb = new BufferedImage(1, 1, BufferedImage.TYPE_INT_RGB);
// SETS DUMMY TEXTURE FOR FIRST PASS (GOD KNOWS WHY)
bg = TextureManager.loadTexture(bbbb,
      Texture.MinificationFilter.BilinearNearestMipMap,
      Texture.MagnificationFilter.Bilinear, true);



Update the hypaImg with data from cam

      
RGBrasterFromCamera.setBuffer(i_buffer);
byte[] data = RGBrasterFromCamera.getGLRgbArray();
buf.clear();
buf.put(data);
buf.flip();
//Add the buffer to the image
hypaImg.addData(buf); //this is com.jme.image type, this exists on the GPU (i think)



Instead of manipulating the buffer, just flip the background node around (when setting the scene up) so it all comes out right:

Matrix3f m = new Matrix3f();
m.fromAngleAxis((float) Math.toRadians(180), new Vector3f(0, 1, 0));
background.setLocalRotation(m);
m.fromAngleAxis((float) Math.toRadians(180), new Vector3f(0, 0, 0));
background.setLocalRotation(m);




Update the bg with a new Image on each update:


TextureManager.deleteTextureFromCard(bg); //Without this one, the computer runs out of memory after 20 seconds
bg.setImage(hypaImg);
bgts.setTexture(bg);



Hope this is helpful for anyone in the same situation.
Video here: http://www.youtube.com/watch?v=Xf4TMS6bjJ4&feature=youtube_gdata

It's good that you have it working now. Not trying to nitpick, but there are quite a lot of problems with the code in your last post which might stop you from getting even higher fps.


Init the buffer and the image (not awt.BufferedImage) in the simpleInit

hypaImg = new Image(Format.RGB16, 640, 480, buf);


Why are you using the format RGB16? This is a high-precision format that uses 16 bits for each component and would be costly for some cards to store. I guess the driver is smart enough to detect that you're only sending it 8 bits per component and ignore your internal format suggestion. Still, to prevent issues on older video cards and such, you should always use RGB8 or a similar format like BGR8.
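The storage difference is easy to quantify: for a 640x480 RGB texture, 8 bits per component is about 0.9 MB, while 16 bits per component doubles that, even though the camera only delivers 8-bit data. A quick back-of-the-envelope sketch (plain Java; the method name is illustrative):

```java
public class TextureCost {
    // Bytes needed for one width*height RGB texture at the given bits per component.
    static int textureBytes(int width, int height, int bitsPerComponent) {
        return width * height * 3 * (bitsPerComponent / 8);
    }

    public static void main(String[] args) {
        int w = 640, h = 480;
        System.out.println("RGB8:  " + textureBytes(w, h, 8) + " bytes");  // 921600  (~0.9 MB)
        System.out.println("RGB16: " + textureBytes(w, h, 16) + " bytes"); // 1843200 (~1.8 MB)
    }
}
```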

Init the bg texture with a dummy 1x1 rgb image at startup(any suggestions to how this can be done otherwise would be appreciated)

Using TextureManager is not necessary here.
Ideally you should create an image from scratch and a texture from scratch:


private ByteBuffer dataBuf;
private Image image;
private Texture tex;
//..
dataBuf = BufferUtils.createByteBuffer(width * height * 3);
image = new Image(width, height, dataBuf);
tex = new Texture();
tex.setImage(image);
// Important: Do NOT enable mipmaps for video and user interface textures
tex.setMinificationFilter(MinificationFilter.BilinearNoMipmaps);
tex.setMagnificationFilter(MagnificationFilter.Bilinear);



Update the hypaImg with data from cam

hypaImg.addData(buf);


Using the addData method is not needed; it just adds another reference to the same ByteBuffer to an ArrayList (look inside the Image class source). You should just set the ByteBuffer on the image once, in the constructor, and never touch the Image object again.
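The growth described above can be illustrated with a plain List: each call appends another reference, so calling it once per frame grows the list indefinitely even though it is always the same buffer. This is an illustrative sketch of the described behavior, not jME's actual Image source:

```java
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;

public class AddDataLeak {
    // Stand-in for the ArrayList inside com.jme.image.Image, per the description above.
    private final List<ByteBuffer> data = new ArrayList<ByteBuffer>();

    public void addData(ByteBuffer buf) { data.add(buf); } // appends on every call
    public int dataCount() { return data.size(); }

    public static void main(String[] args) {
        AddDataLeak image = new AddDataLeak();
        ByteBuffer frame = ByteBuffer.allocateDirect(640 * 480 * 3);

        // Simulate "one addData per frame" for three frames:
        for (int i = 0; i < 3; i++) {
            image.addData(frame);
        }
        // Same buffer every time, but the list keeps growing:
        System.out.println(image.dataCount()); // 3
    }
}
```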


m.fromAngleAxis((float) Math.toRadians(180), new Vector3f(0, 1, 0));
background.setLocalRotation(m);
m.fromAngleAxis((float) Math.toRadians(180), new Vector3f(0, 0, 0));
background.setLocalRotation(m);


The fromAngleAxis() method overwrites the rotation in the transform matrix, the setLocalRotation method overwrites the rotation in the spatial. The first two lines therefore do nothing.
We eliminate the first two lines and are left with this:


m.fromAngleAxis((float) Math.toRadians(180), new Vector3f(0, 0, 0));
background.setLocalRotation(m);


A zero vector is an illegal argument for an axis, but it seems the fromAngleAxis() method ignores that. The implementation sets m00 = m11 = m22 = cos(angle). The cosine of 180° is -1, and as a result, when the matrix transformation is applied to a model, it is inverted along the X, Y and Z axes.

Surprisingly enough, the Quaternion class, which is supposed to mirror the rotation matrix class, behaves differently. It checks explicitly whether the axis is zero and, if so, loads the identity quaternion, since an illegal quaternion would otherwise be generated. This is a bug in the Matrix3f class, which should be fixed if identical results are expected from a rotation matrix and a rotation quaternion.

Anyway, this should just be taken as a hint that using the Matrix3f class for rotation is not recommended. In fact, the Quaternion is used internally within the scene graph, so it's best to supply rotation in that format to avoid conversion.
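The behavior described above follows from Rodrigues' rotation formula, R = cos(θ)·I + sin(θ)·[a]× + (1-cos(θ))·aaᵀ: with a zero axis, the sin and outer-product terms vanish, leaving only cos(θ) on the diagonal, and cos 180° = -1 mirrors every axis. A plain-Java sketch of just that diagonal term (not jME's actual Matrix3f source):

```java
public class ZeroAxisRotation {
    // Diagonal entry m00 = m11 = m22 of Rodrigues' formula when the axis is the
    // zero vector: only the cos(angle)*I term survives.
    static double diagonalForZeroAxis(double angleDegrees) {
        return Math.cos(Math.toRadians(angleDegrees));
    }

    public static void main(String[] args) {
        double d = diagonalForZeroAxis(180.0);
        System.out.println("diagonal = " + d); // -1.0: the "rotation" mirrors X, Y and Z

        // Applying diag(d, d, d) to a point inverts it along every axis:
        double[] p = {1, 2, 3};
        System.out.printf("(%.0f, %.0f, %.0f)%n", d * p[0], d * p[1], d * p[2]); // (-1, -2, -3)
    }
}
```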

TextureManager.deleteTextureFromCard(bg); //Without this one, the computer runs out of memory after 20 seconds
bg.setImage(hypaImg);
bgts.setTexture(bg);


Why aren't you using the updateTextureSubImage() method in the Renderer class? You are probably losing a quite hefty amount of fps right there. Updating a texture avoids the re-allocation cost of creating and destroying textures. The last two lines are not needed, by the way, if you initialized the TextureState and Texture properly.


All of this reminds me of when I was working on my ffmpeg based cinematic lib for jME. I even bothered to use a YUV->RGB conversion shader so that I don't have to do colorspace conversions on the CPU... These kinds of things matter especially for HD videos where decoding eats most of the CPU and you can no longer afford to convert 50 million pixels per second.
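For reference, the colorspace conversion being moved to the GPU here is per-pixel arithmetic along these lines (standard BT.601 video-range coefficients; this is a generic sketch, not the actual shader from that lib):

```java
public class Yuv2Rgb {
    static int clamp(double v) {
        return (int) Math.max(0, Math.min(255, Math.round(v)));
    }

    // One pixel of BT.601 YCbCr (video range: Y 16..235, U/V 16..240) to 8-bit RGB.
    static int[] toRgb(int y, int u, int v) {
        double c = 1.164 * (y - 16);
        return new int[] {
            clamp(c + 1.596 * (v - 128)),                     // R
            clamp(c - 0.392 * (u - 128) - 0.813 * (v - 128)), // G
            clamp(c + 2.017 * (u - 128)),                     // B
        };
    }

    public static void main(String[] args) {
        int[] black = toRgb(16, 128, 128);  // video black -> (0, 0, 0)
        int[] white = toRgb(235, 128, 128); // video white -> (255, 255, 255)
        System.out.println(black[0] + "," + black[1] + "," + black[2]);
        System.out.println(white[0] + "," + white[1] + "," + white[2]);
    }
}
```

Doing this per pixel on the CPU is exactly the cost the shader avoids, which is why it matters at HD resolutions.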

Hi. Any chance of sharing your code? I'm in the early stages of an Augmented Reality project, and could do with some working code to examine. Thanks!

Hello,

I would also be interested in seeing your code.

Thank you in advance,



David.

@smerten hasn’t been active on this site for a long time; it’s hard to tell if he’ll ever come back.



Maybe this is what you’re looking for though:

http://code.google.com/p/armonkeykit/



Seems there was some activity up until April, so might still be worth trying to get in touch.

Hey. I'm still alive. I'll post something tomorrow.

smerten said:
Hey. I'm still alive. I'll post something tomorrow.
Well what'd'ya know! Another great success for notifications. :D

Welcome back smerten, and thanks in advance.

As Erlend said, http://code.google.com/p/armonkeykit/ is probably your best bet for some nicely rounded-up code. There is a video of it in use here:

http://www.youtube.com/watch?v=xwUkRKB80LE&feature=player_embedded



I was contacted by a talented person regarding the use of AR in the Jmonkeyengine, he even wrote a blog about it:

http://armonkeykit.wordpress.com/



Another example

http://www.youtube.com/watch?v=UkMHNwBmApE



I've sent the creator of ARMonkeyKit an email; he may drop by here.

Hi. I was the developer of the 42Below Vodka AR project shown above. I definitely recommend using ARMonkeyKit. I originally went through all of the steps to integrate NYARToolkit with JMonkeyEngine, but the biggest obstacle was getting video to play efficiently within JMonkeyEngine. ARMonkeyKit has dealt with this really well, based on some video code by llama. Andrew Hatch, one of the developers, talks about it here:



http://www.andrewhatch.net/2010/01/armonkeykit.html



The performance improvement was very noticeable compared to more hacky video-to-texture methods I had found floating around the forum.



The kit comes with an Eclipse project, and can be made to build pretty easily on OS X. From what I can remember, on OS X you just need to locate QTJava.zip (it's in a system folder somewhere), exclude SWC from the library, and make it build in 32-bit.

Great to see you adding your voices in here, guys. The 42Below Vodka AR won't be forgotten any time soon :wink:



I sure hope we get to see an augmented reality project powered by jME3 one of these days tralala!

Hello,

First of all, thank you for your quick answers.

I would like to detect a marker, not in a video from a webcam, but only in a picture. This picture represents, for example, a house with a marker in the garden.



I am working with the example ARIDTeapot.java and I have made some changes in ARMonkeyKitApp.java, in this function:

[java]
protected void simpleUpdate() {
    if (showCamera == true) {
        cameraBG.update();
    }
    markerProcessor.update(cameraBG.getRaster());
    callUpdates();
}
[/java]

I would like to run the update on my picture (a BufferedImage):

[java]
protected void simpleUpdate() {
    if (showCamera == true) {
        cameraBG.update();
    }
    markerProcessor.update(myImage.getRaster());
    callUpdates();
}
[/java]

but I need an INyARRgbRaster, and I don't know how to create one.

Can you help me, please?

Any ideas, please?

This thread’s topic direction is convoluted enough as it is. I suggest you start a new topic for your troubles. Also, are you working with jME2?