Is it possible to have two cameras creating a split-screen effect, and then add a quad serving as a HUD for both? Multiple cameras are working fine for me, but when I try to create the HUD (a single textured quad rendered in ortho mode), it only shows up in the first camera's viewport. I draw the HUD by calling Renderer.draw(Spatial). Is there anything special that prevents the HUD from being drawn in the second camera?
When you set the viewport in the render method, do something like this:
standardGame.getDisplay().getRenderer().renderQueue();
// switch the viewport to the right half of the screen
standardGame.getCamera().setViewPort(0.5f, 1f, 0f, 1f);
standardGame.getCamera().update();
standardGame.getDisplay().getRenderer().draw(orthoQuad);
standardGame.getDisplay().getRenderer().renderQueue();
// switch the viewport back to the left half of the screen
standardGame.getCamera().setViewPort(0f, 0.5f, 0f, 1f);
standardGame.getCamera().update();
The renderQueue() calls are very important.
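If it helps, here is a rough sketch of how a full split-screen pass might look, with the scene and the same HUD quad drawn once per viewport. It is only an illustration: renderSplitScreen, sceneRoot and the two camera positions are made-up names, and the rest follows the calls from the snippet above.

import com.jme.math.Vector3f;
import com.jme.renderer.Camera;
import com.jme.renderer.Renderer;
import com.jme.scene.Spatial;

// Hypothetical helper, called from your render method with
// standardGame.getDisplay().getRenderer() and standardGame.getCamera().
private void renderSplitScreen(Renderer renderer, Camera cam,
        Spatial sceneRoot, Spatial orthoQuad,
        Vector3f leftPlayerPos, Vector3f rightPlayerPos) {
    // left half of the screen
    cam.setViewPort(0f, 0.5f, 0f, 1f);
    cam.setLocation(leftPlayerPos);   // reposition if both views share one Camera object
    cam.update();
    renderer.draw(sceneRoot);
    renderer.draw(orthoQuad);
    renderer.renderQueue();           // flush while the left viewport is active

    // right half of the screen
    cam.setViewPort(0.5f, 1f, 0f, 1f);
    cam.setLocation(rightPlayerPos);
    cam.update();
    renderer.draw(sceneRoot);
    renderer.draw(orthoQuad);
    renderer.renderQueue();           // flush again so the HUD also lands in this viewport
}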
That's the point - how do I set a quad to appear on a certain screen when in ortho mode?
Just split the texture for the quad in two and set the second half on the second screen separately.
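If you do go that route, here is a rough sketch of the idea. The useTextureHalf helper is made up for this example and assumes the jME 2.x texture coordinate API (Geometry.getTextureCoords returning a TexCoords with a public FloatBuffer); older versions expose the buffer through get/setTextureBuffer instead.

import java.nio.FloatBuffer;
import com.jme.scene.shape.Quad;

// Hypothetical helper: remaps a quad's texture coordinates so it only shows
// one horizontal half of its texture. Pass uOffset = 0f for the left half,
// 0.5f for the right half.
public static void useTextureHalf(Quad quad, float uOffset) {
    FloatBuffer coords = quad.getTextureCoords(0).coords;
    for (int i = 0; i < coords.limit(); i += 2) {
        // each vertex stores (u, v); squeeze u into [uOffset, uOffset + 0.5]
        coords.put(i, coords.get(i) * 0.5f + uOffset);
    }
}

// usage: one ortho quad per viewport, each showing half of the HUD texture
// useTextureHalf(leftHudQuad, 0f);
// useTextureHalf(rightHudQuad, 0.5f);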
It works! The call to Renderer.renderQueue() was what was missing. What does this method actually do? The JavaDoc is rather minimal about it.
It renders everything that was placed in the render queue.
A more elaborate explanation: jME puts every spatial that needs to be drawn into a queue; the queued spatials are then sorted either by render state (if opaque) or back-to-front (if transparent). This reduces state changes and lets transparent objects blend correctly. Actual drawing only happens when the queue is flushed with renderQueue(), which is why it has to be called once per viewport in the split-screen case.
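For illustration, a small sketch of how the queue modes and the flush fit together (someOpaqueMesh and someGlassMesh are placeholder names; the constants are the standard ones on com.jme.renderer.Renderer):

import com.jme.renderer.Renderer;

// opaque geometry: sorted by render state to minimise state switches
someOpaqueMesh.setRenderQueueMode(Renderer.QUEUE_OPAQUE);
// transparent geometry: sorted back-to-front so blending comes out right
someGlassMesh.setRenderQueueMode(Renderer.QUEUE_TRANSPARENT);
// HUD elements: drawn last, in orthographic projection
orthoQuad.setRenderQueueMode(Renderer.QUEUE_ORTHO);

// draw() only queues these spatials; nothing reaches the framebuffer
// until the queue is flushed for the current viewport
renderer.draw(someOpaqueMesh);
renderer.draw(someGlassMesh);
renderer.draw(orthoQuad);
renderer.renderQueue();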