Quad buffer stereo

I want some advice on how to implement quad buffer stereo with jME3. What I need is a way to render the scene twice, once for each eye. Before each render I need to insert user code that changes the camera for that eye and calls GL11.glDrawBuffer(GL11.GL_BACK_LEFT) or GL11.glDrawBuffer(GL11.GL_BACK_RIGHT) respectively.
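To make the intended per-frame flow concrete, here is a sketch. Everything below is stand-in code (setCameraForEye, renderScene, and the logging are placeholders, not jME3 API); only the GL constant values are real, taken from the OpenGL headers:

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch of the intended per-frame flow with stand-in methods
    (not jME3 API). GL constant values are from the OpenGL headers. */
public class StereoFrameSketch {
    static final int GL_BACK_LEFT = 0x0402;
    static final int GL_BACK_RIGHT = 0x0403;

    List<String> log = new ArrayList<>();

    void glDrawBuffer(int buffer) {           // stand-in for GL11.glDrawBuffer
        log.add("drawBuffer=" + Integer.toHexString(buffer));
    }

    void setCameraForEye(int eye) {           // hypothetical camera offset hook
        log.add("camera eye=" + eye);
    }

    void renderScene() {                      // stand-in for rendering the scene graph
        log.add("render");
    }

    void renderStereoFrame() {
        glDrawBuffer(GL_BACK_LEFT);           // direct output to the left back buffer
        setCameraForEye(-1);
        renderScene();
        glDrawBuffer(GL_BACK_RIGHT);          // then to the right back buffer
        setCameraForEye(+1);
        renderScene();
        // the buffer swap then presents both back buffers to the stereo display
    }
}
```

The question is which jME3 hook the two glDrawBuffer/render pairs should live in.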

The best reference on the forum I’ve found is this:


I think AppSettings.setStereo3D(true) will enable hardware stereo as oOMoeOo requested in his post. However, I don’t know where he inserted his code, as I cannot find SimpleModule in the jME javadoc.

I’m still a jME newbie, so I’m hoping someone can suggest where to insert the code I need. I’ve thought about overriding SimpleApplication.update() as it calls the renderManager.render(…) method. But is there another way, like setting up two viewports that cover the screen and hooking up a callback before each viewport is rendered?

Note: I’m ONLY interested in quad buffer stereo, NOT anaglyphic or side by side.


See here:


search for “quad buffer”. I think you know that it requires you to have a proper video card, like an nVidia Quadro or similar, which has the quad buffer, but I will say that anyway in case you’re not sure.

@noncom said:
See here:


search for "quad buffer". I think you know that it requires you to have a proper video card, like an nVidia Quadro or similar, which has the quad buffer, but I will say that anyway in case you're not sure.

Yes, as far as I can tell, setStereo3D will enable quad buffering on the driver. However, if that is all I do, I will only render to the left eye. I cannot find any evidence that jME3 handles rendering the left and right eye. What I'm wondering is how I can hack in this code.

What I'm really asking is how to best render the scene graph twice: the rootNode with two different cameras for the left and right eye (I'll provide them myself). The guiNode must also be rendered twice, but with the same camera setup; I don't want stereo on the HUD.

I work at a research institute that has a VR lab. We have a 4 meter projected stereo screen that I want to get up and running with jME. Previously we've used Java3D, which supports this type of thing by design.

I understand what you’re talking about; I have implemented quad buffer support for the Processing graphics library and have been using it at a research institute too. However, I have not tried this with jME3.

So there are two steps to do this kind of 3D:

  1. Enable the capability
  2. Manage rendering to the correct buffers and swap them properly. This usually adds a few extra lines to the rendering loop and camera setup.
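The camera part of step 2 largely amounts to shifting one centre position sideways along the camera's right vector, half the eye separation per eye. A minimal sketch of that arithmetic in plain Java (the vectors and the separation value are made-up example numbers, not jME3 API):

```java
/** Sketch of per-eye camera placement: each eye sits half the eye
    separation away from the centre position along the camera's right
    vector. All values here are example numbers, not jME3 API. */
public class EyeOffset {

    // eye = -1 for the left eye, +1 for the right eye
    static float[] eyePosition(float[] center, float[] right, float separation, int eye) {
        float half = eye * separation / 2f;
        return new float[] {
            center[0] + right[0] * half,
            center[1] + right[1] * half,
            center[2] + right[2] * half,
        };
    }

    public static void main(String[] args) {
        float[] center = {0f, 0f, 10f};      // shared "head" position
        float[] right = {1f, 0f, 0f};        // camera's right vector
        float separation = 0.065f;           // roughly human interocular distance in metres
        float[] leftEye = eyePosition(center, right, separation, -1);
        float[] rightEye = eyePosition(center, right, separation, +1);
        System.out.println(leftEye[0] + " " + rightEye[0]);
    }
}
```

Both cameras then look at the same focus point, which is what produces the parallax between the two images.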

Now that we have a solution for point 1, it is still unclear what to do with point 2. Let us look here: AppSettings source. Again, search for “Stereo” - it says:

[java]Once enabled, filters or scene processors that handle 3D stereo rendering could use this feature to render using hardware 3D stereo.[/java]

However, I did not find any such filters or processors in the standard bundle. And that is where the post you mentioned becomes useful. There is no SimpleModule, right, but from the code in that post, I can conclude that you can replace SimpleModule with SimpleApplication. Just try that. Also pay attention that he had to adjust the pixel format. You will surely have to make adjustments to the code that is there; just see how it comes out and go with the flow. Don’t forget to tell us how it goes )

As I understand it, two separate viewports, each with their own camera, could generate the left and right eye images, and then you just merge the output of those viewports, depending on what form of 3D you are using.

It’s not an area I’ve any experience in though so someone else might have a better suggestion.

I just checked the source and AppSettings.setStereo3D(true) is not used to create the PixelFormat in the renderer. I’ve issued a bug report on that.

@noncom: thanks for your advice. I’ll look into creating my own SimpleStereoApplication.

@tom Ummm, what source file exactly did you look into? I cannot find the usage of these flags anywhere, so maybe I did not look well for it, or maybe they are used in the native binding somehow. So this is not really a criterion here… Although I am no pro and can’t speak to that. Maybe the Mighty Creators have a word to say about this… MoeOo did request that the PixelFormat be handled automatically, but I’m not sure if it has been taken care of yet. So yeah, let’s see what they say about the bug. Anyway, you can download the jME3 source, add (for yourself, not the official version) what you think is missing there, and propose a patch; maybe it gets included if it is worthy.

And yeah, their SimpleModule looks like it extends SimpleApplication itself. I am also interested in how your endeavor goes, since it may happen to me too in a while, so please stay in touch!

Here is a link to the bug I posted: http://code.google.com/p/jmonkeyengine/issues/detail?id=532

I had to bite the bullet and download the source code to look at how to best solve this. I have fixed the PixelFormat bug locally so I can test things out.

I am pretty sure I can get it working by hacking a custom “SimpleApplication”. I say “hack” because I will have to call the OpenGL method glDrawBuffer directly in the application code. This should be in the renderer, like all other OpenGL calls.

I have been thinking and have come to the conclusion that the best solution would be to add a “drawBuffer” field to ViewPort. RenderManager.renderViewPort(ViewPort vp, float tpf) would then set it on the renderer before rendering the ViewPort; that is, only if the ViewPort does not render to a FrameBuffer. I think this makes sense, as one can think of drawBuffer as similar to FrameBuffer in that it defines where things should be rendered. Have a look at the OpenGL reference for more details: http://www.opengl.org/sdk/docs/man/xhtml/glDrawBuffer.xml
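To illustrate the proposed design, here is a sketch using stand-in classes (these are not the real jME3 types; the real change would touch ViewPort, Renderer, and RenderManager):

```java
/** Sketch of the proposed design with stand-in classes (not jME3 types):
    the ViewPort carries a drawBuffer value that the render manager applies
    before rendering, unless an output FrameBuffer takes precedence. */
public class DrawBufferSketch {
    // GL constant values from the OpenGL headers
    static final int GL_BACK = 0x0405;
    static final int GL_BACK_LEFT = 0x0402;

    static class FrameBuffer { }

    static class ViewPort {
        FrameBuffer outputFrameBuffer;   // null means render to the window
        int drawBuffer = GL_BACK;        // the proposed new field
    }

    static class Renderer {
        int currentDrawBuffer = GL_BACK;
        void setDrawBuffer(int buffer) {
            // a real implementation would call GL11.glDrawBuffer(buffer)
            currentDrawBuffer = buffer;
        }
    }

    // mirrors what RenderManager.renderViewPort(ViewPort vp, float tpf) would do
    static void renderViewPort(Renderer renderer, ViewPort vp) {
        if (vp.outputFrameBuffer == null) {
            renderer.setDrawBuffer(vp.drawBuffer);
        }
        // ... render the viewport's scenes ...
    }
}
```

This way a left-eye ViewPort would just carry GL_BACK_LEFT, and no application code would touch OpenGL directly.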

Would be interesting to hear what the jME3 developers think of this. Although it might be a hard sell to make changes to support quad buffer stereo, since not many people will be able to use it. Stereo might become more common in the future with 3D TVs and the Oculus Rift. But those will probably use split screen stereo, which is something completely different. Still, there are people out there using Java for scientific visualization who would be able to use it.


@tom I see, well, looks like the core team is on vacation right now! However, I am not sure that they will work on this, because of many other very important tasks and 3D not being so widely used. These two are common reasons why certain things do not get included or worked on… However, since the setStereo() command is already present in the source, I hope they have some plan for seeing this one through.

Don’t know if 3D systems will get very popular over time; they burn eyes and brains out. The only humane 3D system that I worked with was one based on light polarization. The system works like this:

On a special wall, from the back side, two groups of projectors project the pictures for the two eyes. Their light is polarized so that the two pictures have polarizations orthogonal to each other. Each group consists of three projectors to account for the intensity loss. Then, each user wears special glasses which filter out the correct polarization for each eye. So the entire system is passive, which is just great! Any number of users is supported because the glasses are just plain old polaroids, and the quality and color of the picture is quite supreme. Although I think there is room for improvement, the technology itself is the only one acceptable to me (and many other people) from what I saw. At least I can look at it for more than 30 minutes without getting a headache or eyesore. But the system itself is not that cheap. I think that if glasses like the Oculus Rift make it to real life, and are not as punishing as all the common modern 3D, then there is certainly huge commercial potential in this.

@noncom We use back projection and passive glasses as well. It is the same type of system some cinemas use. I’ve come across two types of passive glasses; this one uses slightly tinted red and green lenses. We only have one projector per eye though. One of the problems we have is that the image is quite dark. That, and you get a halo effect from the lamp shining through the canvas.

As for the code: I’ve implemented the solution I mentioned above. I’m going to test it out tomorrow in the lab. Today I’ve got 4 demos and I don’t dare touch it before they’re over. I’m going to send the code in as a patch to the jME team. Hopefully they will accept it. The code will not break any existing applications, but it changes the Renderer interface, so all third party renderers have to be updated.

Btw, our company has pledged money to get an Oculus Rift. I can’t wait to test it out. Hopefully it is as good as they say. I also want to buy a Razer Hydra to get accurate hand tracking. We already have a Kinect setup that can be used for some laggy positional head tracking.

I’ve issued a patch to add support for quad buffer stereo:


I’ve tested it on a machine with Quadro 5000 and it works.


Since it won’t work on any consumer-level cards, I only applied the part of the patch that sets the stereo 3D property on the pixel format. The capability to change the draw buffer was not added.

Thank you for patching the pixel format. I understand you are focusing on games and not scientific visualisation, and that you don’t want to clutter up the API. Although it forces me to use OpenGL commands outside the renderer, which feels a bit hackish.

The best workaround I’ve come up with is to create a SceneProcessor that sets the drawBuffer in preFrame and sets it back to BACK in postFrame. This means translucent objects will always be rendered to BACK, but I can live with that.

I’ll post the code here in case anyone needs it in the future.

[java]import com.jme3.post.SceneProcessor;
import com.jme3.renderer.RenderManager;
import com.jme3.renderer.ViewPort;
import com.jme3.renderer.queue.RenderQueue;
import com.jme3.texture.FrameBuffer;
import org.lwjgl.opengl.GL11;

/**
 * Sets the specified DrawBuffer mode on preFrame and resets to BACK on
 * postFrame. Can be used to do quad buffer stereo. Set BACK_LEFT on the left
 * eye ViewPort and BACK_RIGHT on the right eye ViewPort. Remember to set
 * AppSettings.setStereo3D(boolean) to true. Note that Translucent objects are
 * processed after postFrame and will always be rendered to BACK.
 *
 * @author tomrbryn
 */
public class DrawBufferProcessor implements SceneProcessor {

    public enum Mode {
        NONE(GL11.GL_NONE),
        BACK(GL11.GL_BACK),
        BACK_LEFT(GL11.GL_BACK_LEFT),
        BACK_RIGHT(GL11.GL_BACK_RIGHT);

        private final int openGLValue;

        private Mode(int value) {
            this.openGLValue = value;
        }

        public int getOpenGLValue() {
            return openGLValue;
        }
    }

    private final Mode drawBuffer;

    public DrawBufferProcessor(Mode drawBuffer) {
        this.drawBuffer = drawBuffer;
    }

    public void initialize(RenderManager rm, ViewPort vp) {
    }

    public void reshape(ViewPort vp, int w, int h) {
    }

    public boolean isInitialized() {
        return true;
    }

    public void preFrame(float tpf) {
        // direct all rendering of this ViewPort to the requested buffer
        GL11.glDrawBuffer(drawBuffer.getOpenGLValue());
    }

    public void postQueue(RenderQueue rq) {
    }

    public void postFrame(FrameBuffer out) {
        // reset to the default buffer so later rendering is unaffected
        GL11.glDrawBuffer(GL11.GL_BACK);
    }

    public void cleanup() {
    }
}[/java]
Here is an example that renders a box at different positions to a left and right ViewPort:

[java]import com.jme3.app.SimpleApplication;
import com.jme3.material.Material;
import com.jme3.math.ColorRGBA;
import com.jme3.math.Vector3f;
import com.jme3.renderer.Camera;
import com.jme3.renderer.ViewPort;
import com.jme3.scene.Geometry;
import com.jme3.scene.shape.Box;
import com.jme3.system.AppSettings;

/**
 * Tests quad buffer stereo.
 */
public class TestStereo extends SimpleApplication {

    public static void main(String[] args) {
        AppSettings appSettings = new AppSettings(true);
        appSettings.setStereo3D(true);
        TestStereo helloStereo = new TestStereo();
        helloStereo.setSettings(appSettings);
        helloStereo.start();
    }

    protected Geometry player;
    protected Camera leftCamera;
    protected ViewPort leftViewPort;
    protected Camera rightCamera;
    protected ViewPort rightViewPort;

    @Override
    public void simpleInitApp() {
        // disable rendering of the default ViewPort, the left and right
        // ViewPorts render the scene instead
        viewPort.addProcessor(new DrawBufferProcessor(DrawBufferProcessor.Mode.NONE));

        leftCamera = new Camera(settings.getWidth(), settings.getHeight());
        leftCamera.setFrustumPerspective(45f, (float) cam.getWidth() / cam.getHeight(), 1f, 1000f);
        leftCamera.setLocation(new Vector3f(-1f, 0f, 10f));
        leftCamera.lookAt(new Vector3f(-1f, 0f, 0f), Vector3f.UNIT_Y);
        leftViewPort = renderManager.createMainView("Left", leftCamera);
        leftViewPort.setClearFlags(true, true, true);
        leftViewPort.attachScene(rootNode);
        leftViewPort.addProcessor(new DrawBufferProcessor(DrawBufferProcessor.Mode.BACK_LEFT));

        rightCamera = new Camera(settings.getWidth(), settings.getHeight());
        rightCamera.setFrustumPerspective(45f, (float) cam.getWidth() / cam.getHeight(), 1f, 1000f);
        rightCamera.setLocation(new Vector3f(1f, 0f, 10f));
        rightCamera.lookAt(new Vector3f(1f, 0f, 0f), Vector3f.UNIT_Y);
        rightViewPort = renderManager.createMainView("Right", rightCamera);
        rightViewPort.setClearFlags(true, true, true);
        rightViewPort.attachScene(rootNode);
        rightViewPort.addProcessor(new DrawBufferProcessor(DrawBufferProcessor.Mode.BACK_RIGHT));

        /* this white box is our player character */
        Box b = new Box(Vector3f.ZERO, 1, 1, 1);
        player = new Geometry("white cube", b);
        Material mat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
        mat.setColor("Color", ColorRGBA.White);
        player.setMaterial(mat);
        rootNode.attachChild(player);
    }

    /* Use the main event loop to trigger repeating actions. */
    @Override
    public void simpleUpdate(float tpf) {
        // make the player rotate:
        player.rotate(0, 2 * tpf, 0);
    }
}[/java]




I have just set this up on my PC. I applied all the patches that were not moved into the core and downloaded these two files. But all I see is one rotating box. Is it not supposed to alternate between the left and right views?

It seems that all that is rendered is the left viewport. If I remove the left viewport, I see the box from the right viewport’s perspective…


Edit: Sorry, I was a little quick to jump to the forums. I just tested the quad buffered 3D view with a 3D movie player, and it also only renders a single viewport… I tried updating my drivers, but it still didn’t work. Anyway, the problem seems to be my hardware/driver setup, not jME.