It's good that you have it working now. I'm not trying to nitpick, but there are quite a few problems with the code in your last post that might be stopping you from getting even higher fps.
Init the buffer and the image (not awt.BufferedImage) in the simpleInit
hypaImg = new Image(Format.RGB16, 640, 480, buf);
Why are you using the format RGB16? That is a high-precision format that uses 16 bits per component and is costly for some cards to store. I guess the driver is smart enough to detect that you're only sending it 8 bits per component and to ignore your internal format suggestion. Still, to prevent issues on older video cards and such, you should always use RGB8 or a similar format like BGR8.
Init the bg texture with a dummy 1x1 rgb image at startup(any suggestions to how this can be done otherwise would be appreciated)
Using TextureManager is not necessary here.
Ideally you should create an image from scratch and a texture from scratch:
private ByteBuffer dataBuf;
private Image image;
private Texture tex;
//..
dataBuf = BufferUtils.createByteBuffer(width * height * 3);
image = new Image(Format.RGB8, width, height, dataBuf); // 8 bits per component, see above
tex = new Texture2D(); // Texture is abstract, use the 2D subclass
tex.setImage(image);
// Important: Do NOT enable mipmaps for video and user interface textures
tex.setMinificationFilter(MinificationFilter.BilinearNoMipMaps);
tex.setMagnificationFilter(MagnificationFilter.Bilinear);
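To complete the setup, the texture still has to be attached to the background quad once at init. A sketch of how that could look in jME 2.x (I'm assuming `display` is your active DisplaySystem and `background` is the quad the video is drawn on):

```java
// Create the TextureState once, attach the texture, and never touch it again.
TextureState ts = display.getRenderer().createTextureState();
ts.setTexture(tex);
background.setRenderState(ts);
background.updateRenderState();
```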
Update the hypaImg with data from cam
hypaImg.addData(buf);
Using the addData method is not needed; it just adds the same ByteBuffer to an ArrayList over and over (look inside the Image class source). You should just set the ByteBuffer on the image once in the constructor and never touch the Image object again.
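The per-frame update then only touches the buffer, not the Image. A minimal JDK-only sketch of that pattern (the camera read is stubbed with a fill here):

```java
import java.nio.ByteBuffer;
import java.util.Arrays;

public class BufferRefill {
    public static void main(String[] args) {
        int width = 4, height = 4;
        // Allocate once; the Image keeps a reference to this same buffer.
        ByteBuffer dataBuf = ByteBuffer.allocateDirect(width * height * 3);

        // Each frame: rewind, overwrite with the new camera frame, rewind again
        // so the renderer reads from position 0.
        byte[] frame = new byte[width * height * 3]; // stand-in for camera data
        Arrays.fill(frame, (byte) 0x7F);
        dataBuf.rewind();
        dataBuf.put(frame);
        dataBuf.rewind();

        System.out.println(dataBuf.remaining()); // 48
        System.out.println(dataBuf.get(0));      // 127
    }
}
```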
m.fromAngleAxis((float) Math.toRadians(180), new Vector3f(0, 1, 0));
background.setLocalRotation(m);
m.fromAngleAxis((float) Math.toRadians(180), new Vector3f(0, 0, 0));
background.setLocalRotation(m);
The fromAngleAxis() method overwrites the rotation in the matrix, and setLocalRotation() overwrites the rotation in the spatial, so the first two lines do nothing.
We eliminate the first two lines and are left with this:
m.fromAngleAxis((float) Math.toRadians(180), new Vector3f(0, 0, 0));
background.setLocalRotation(m);
A zero vector is an illegal argument for an axis, but it seems the fromAngleAxis() method ignores that. The implementation sets m00 = m11 = m22 = cos(angle). The cosine of 180° is -1, so when the matrix transformation is applied to a model it will be mirrored along the X, Y and Z axes. Surprisingly enough, the Quaternion class, which is supposed to mirror the rotation-matrix class, behaves differently: it checks explicitly whether the axis is zero and if so loads the identity quaternion, since otherwise an illegal quaternion would be generated. This is a bug in the Matrix3f class which should be fixed if identical results are expected from a rotation matrix and a rotation quaternion. Anyway, take this as a hint that Matrix3f is not the recommended way to supply rotations. In fact, Quaternion is what the scene graph uses internally, so it's best to supply rotations in that format and avoid the conversion.
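With Quaternion, the 180° turn (presumably around the Y axis, as in your first snippet) could look like this in jME 2.x:

```java
// Flip the background 180 degrees around Y, supplied as a Quaternion
// so the scene graph doesn't have to convert from a matrix.
Quaternion q = new Quaternion();
q.fromAngleAxis(FastMath.PI, Vector3f.UNIT_Y);
background.setLocalRotation(q);
```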
TextureManager.deleteTextureFromCard(bg); //Without this one, the computer runs out of memory after 20 seconds
bg.setImage(hypaImg);
bgts.setTexture(bg);
Why aren't you using the updateTextureSubImage() method in the Renderer class? You are probably losing quite a hefty amount of fps right there: updating a texture in place avoids the allocation cost of creating and destroying textures every frame. The last two lines are not needed, by the way, if you initialized the TextureState and Texture properly.
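A sketch of what the per-frame update could look like instead, assuming the jME 2.x Renderer (the exact signature may differ between versions, so check your Renderer javadoc):

```java
// Re-upload the new frame's pixels into the texture already on the card
// instead of deleting and recreating the texture each frame.
dataBuf.rewind();
display.getRenderer().updateTextureSubImage(tex, 0, 0, image, 0, 0, 640, 480);
```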
All of this reminds me of when I was working on my ffmpeg based cinematic lib for jME. I even bothered to use a YUV->RGB conversion shader so that I don't have to do colorspace conversions on the CPU... These kinds of things matter especially for HD videos where decoding eats most of the CPU and you can no longer afford to convert 50 million pixels per second.