Stereoscopy with jME3


I want to know if it’s possible to create anaglyph 3D applications with jME3. I only found examples for jME1 or jME2. How can I do this with jME3?

Take a look at

Thanks for your answer. Your example describes how to place 2 cameras in a scene.

I think I need some kind of color filtering for the two cameras now. What’s the best approach for that?

Momoko_Fan’s example seems to do something different.

Btw., is there no 3D driver like iZ3D that’s compatible with jME? That would be the easiest solution.

You could add a ColorOverlayFilter to each viewport, one green, one red…
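A rough sketch of that idea (a jME3 fragment, not a complete app — `assetManager` and `viewPort` come from SimpleApplication, and `viewPort2` is an assumed second view created elsewhere):

```java
// Sketch: one FilterPostProcessor per viewport, each applying a tint.
// Assumes viewPort2 was created via renderManager.createMainView(...).
FilterPostProcessor redFpp = new FilterPostProcessor(assetManager);
redFpp.addFilter(new ColorOverlayFilter(ColorRGBA.Red));
viewPort.addProcessor(redFpp);

FilterPostProcessor greenFpp = new FilterPostProcessor(assetManager);
greenFpp.addFilter(new ColorOverlayFilter(ColorRGBA.Green));
viewPort2.addProcessor(greenFpp);
```

As discussed below, this naive per-viewport approach runs into the full-screen-filter bug.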

I tested it and unfortunately the second viewport is black.

I’m gonna look into it

Did you look into it? Is it a bug? Is there a workaround?

Yeah, I looked into it… it’s definitely a bug, and it’s a lot more complicated than I expected.

The problem here is that the filter is applied on the full screen, so if you add filters on 2 different view ports, the first one is “erased” by the second one.

I need Kirill’s (Momoko_Fan’s) master mind to resolve this, but to be honest, this problem is very particular and not likely to happen very often.

Since we all have a lot of things to do in the team, I think it will be fixed at best during the beta phase.

Anyway, I thought about what you are trying to do, and a multi-view layout is not what you need.

What you want is to “mix” the results of the 2 viewports. This can’t be achieved by rendering the 2 main viewports on screen, because they would be separated and wouldn’t overlap.

What I would do is create a second Camera, attach each camera (the original one and the second) to its own CameraNode, then attach the CameraNodes to a single parent node, and attach this parent node to the scene.

The camera nodes must be positioned with an offset from each other.

Then I’d create another ViewPort with the second camera and attach the scene to both viewports.
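The rig described above might be sketched like this (a jME3 fragment, e.g. inside `simpleInitApp()` of a SimpleApplication; names and the eye-separation value are illustrative):

```java
// Sketch of the stereo camera rig: two cameras on CameraNodes,
// both under one parent node, plus a second main view.
Camera leftCam = cam;            // the application's default camera
Camera rightCam = cam.clone();   // second camera for the right eye

CameraNode leftNode = new CameraNode("left eye", leftCam);
CameraNode rightNode = new CameraNode("right eye", rightCam);
rightNode.setLocalTranslation(0.1f, 0, 0);  // eye-separation offset

Node head = new Node("head");
head.attachChild(leftNode);
head.attachChild(rightNode);
rootNode.attachChild(head);

// Second viewport rendering the same scene with the right-eye camera:
ViewPort rightView = renderManager.createMainView("right view", rightCam);
rightView.attachScene(rootNode);
rightView.setClearFlags(true, true, true);
```

Moving the “head” node then moves both eyes together while keeping the offset.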

I’d then create a RenderProcessor that will handle the rendering of the 2 viewports to framebuffer objects. The results of these 2 renders would be combined with a shader on a full screen quad. This way it can easily mix the 2 textures and apply a color filter on each one.
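The offscreen part of that plan might look roughly like this (a sketch, assuming a 640×480 view; the quad material that mixes the two textures is left out):

```java
// Sketch: redirect a viewport into an offscreen FrameBuffer whose color
// attachment is a Texture2D, so a fullscreen quad can sample it later.
FrameBuffer offBuffer = new FrameBuffer(640, 480, 1);
Texture2D offTex = new Texture2D(640, 480, Image.Format.RGBA8);
offBuffer.setDepthBuffer(Image.Format.Depth);
offBuffer.setColorTexture(offTex);
viewPort.setOutputFrameBuffer(offBuffer);
// Doing the same for the second viewport yields two textures that a
// material on a fullscreen Picture could tint (red/cyan) and add together.
```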

What do you think?

Alright, thanks… I’ll try that.

Do you have any example code for creating a RenderProcessor?

Yep, look at the FilterPostProcessor, the WaterProcessor, the HDRRenderer and the PSSMShadowRenderer.

Ok, I give up… I’ve been working on this for some hours now…

That’s the first time I’m using jMonkey; I don’t really understand this processing system, and I couldn’t find any documentation (besides the examples).

That’s my “result”, which is producing a black screen (for testing I wanted to view only ViewPort1). I think I’m doing it wrong…



    /*
     * To change this template, choose Tools | Templates
     * and open the template in the editor.
     */
    package mygame;

    import com.jme3.asset.AssetManager;
    import com.jme3.math.ColorRGBA;
    import com.jme3.post.FilterPostProcessor;
    import com.jme3.post.SceneProcessor;
    import com.jme3.post.filters.ColorOverlayFilter;
    import com.jme3.renderer.RenderManager;
    import com.jme3.renderer.ViewPort;
    import com.jme3.renderer.queue.RenderQueue;
    import com.jme3.scene.Geometry;
    import com.jme3.texture.FrameBuffer;
    import com.jme3.texture.Image;
    import com.jme3.texture.Texture2D;
    import com.jme3.ui.Picture;

    /**
     * @author Stefan Baust
     */
    public class MyProcessor implements SceneProcessor {

        AssetManager assetManager;
        ViewPort vp1 = null;
        ViewPort vp2 = null;
        FrameBuffer fb = new FrameBuffer(640, 480, 0);
        FrameBuffer fb2 = new FrameBuffer(640, 480, 0);
        RenderManager renderManager;

        public MyProcessor(AssetManager assetManager, ViewPort vp1, ViewPort vp2) {
            this.assetManager = assetManager;
            this.vp1 = vp1;
            this.vp2 = vp2;

            Texture2D filterTexture = new Texture2D(640, 480, Image.Format.RGB32F);

            FilterPostProcessor fp = new FilterPostProcessor(assetManager);
            fp.addFilter(new ColorOverlayFilter(ColorRGBA.Cyan));

            FilterPostProcessor fp2 = new FilterPostProcessor(assetManager);
            fp2.addFilter(new ColorOverlayFilter(ColorRGBA.Red));
        }

        public void initialize(RenderManager rm, ViewPort vp) {
            renderManager = rm;
        }

        public void reshape(ViewPort vp, int w, int h) {
        }

        public boolean isInitialized() {
            return vp1 != null && vp2 != null && renderManager != null;
        }

        public void preFrame(float tpf) {
        }

        public void postQueue(RenderQueue rq) {
        }

        public void postFrame(FrameBuffer out) {
            Picture fsQuad = new Picture("filter full screen quad");
            renderManager.getRenderer().clearBuffers(true, true, true);
            Geometry g = new Geometry("hans");
            // renderManager.renderGeometry(fsQuad);
        }

        public void cleanup() {
        }
    }

Well you certainly can’t expect that everything works right out of the box.

Using a 3D engine and creating a game means learning a lot, over quite some time.

Okay… I think I made it…

I tried again to “port” the “StereoRenderPass” class Momoko_Fan posted, and it seems to work now. I think I have to do some fine-tuning now.