Questions on Render to Texture, RootNodes and stuff

So after some false starts and issues, I have some fairly broad questions. But I will start with what I want to do.

I intend to calculate light probes for PBR on the GPU. In pure OpenGL this is fairly easy; however, getting this to fit into the jME pipeline is proving more difficult.

The steps are:

  1. Render to a texture cube with 6 viewports. This works, no problem. (OK, so a small problem, but resolved.)
  2. Render this to an octahedral mapping, that is, a square texture. Already having issues here.

So we won’t worry about the rest.
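For what it's worth, step 2's octahedral mapping is mostly just math, independent of the engine. Here is a minimal plain-Java sketch (class and method names are mine, not jME API) of the encode/decode, producing coordinates in [-1,1]² that a shader would then remap to [0,1] texture space:

```java
// Sketch of the octahedral mapping: fold a unit direction onto a square.
// Upper hemisphere maps to the inner diamond; the lower hemisphere is
// folded outward over the diagonals. Names here are hypothetical helpers.
public final class OctahedralMapping {

    /** Encode a unit direction into octahedral coordinates in [-1,1]^2. */
    public static float[] encode(float x, float y, float z) {
        float invL1 = 1f / (Math.abs(x) + Math.abs(y) + Math.abs(z));
        float u = x * invL1, v = y * invL1;
        if (z < 0f) { // fold the lower hemisphere over the diagonals
            float fu = Math.copySign(1f - Math.abs(v), u);
            float fv = Math.copySign(1f - Math.abs(u), v);
            u = fu;
            v = fv;
        }
        return new float[] { u, v };
    }

    /** Decode octahedral coordinates back to a unit direction. */
    public static float[] decode(float u, float v) {
        float z = 1f - Math.abs(u) - Math.abs(v);
        if (z < 0f) { // unfold the lower hemisphere
            float fu = Math.copySign(1f - Math.abs(v), u);
            float fv = Math.copySign(1f - Math.abs(u), v);
            u = fu;
            v = fv;
        }
        float len = (float) Math.sqrt(u * u + v * v + z * z);
        return new float[] { u / len, v / len, z / len };
    }
}
```

The fragment shader for the cube-to-octahedron pass would run `decode` per texel and use the resulting direction to sample the cube map.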

I wish to render a texture I just rendered, to another texture. That is: render to a texture, set a new material with that texture and the required shaders, then render to another texture via just a single quad.

However, any geometry not in the root node won’t work in a RenderManager.renderViewPort(ViewPort, float) call, and I don’t want to be switching out root nodes all the time.

So how can I have more than one root? How can I use geometry or scene graphs that are independent of the root node?

Thanks in advance.

Which root node? You have as many as you want… one per viewport (or many per viewport in fact).

Are you using a preview viewport to render your stuff? We may need to know more about what you have tried already and why it didn’t work.

If you want to render a quad you don’t need a scene.
You can just render the quad into a frame buffer; that’s what the FilterPostProcessor does.

Note that the quad has a special way of being projected to screen: it’s a 1×1 quad.
Use the post.vert shader in your material to project it properly.

import com.jme3.export.JmeExporter;
import com.jme3.export.JmeImporter;
import com.jme3.math.ColorRGBA;
import com.jme3.post.SceneProcessor;
import com.jme3.renderer.Camera;
import com.jme3.renderer.RenderManager;
import com.jme3.renderer.ViewPort;
import com.jme3.renderer.queue.RenderQueue;
import com.jme3.scene.Spatial;
import com.jme3.texture.FrameBuffer;
import com.jme3.texture.Image;
import com.jme3.texture.Image.Format;
import com.jme3.texture.Texture;
import com.jme3.texture.Texture2D;
import com.jme3.util.BufferUtils;
import com.jme3.util.Screenshots;
import java.awt.image.BufferedImage;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.LinkedList;

public class RenderToTexture extends Texture2D implements SceneProcessor 
{
    private RenderManager renderManager;
    private ViewPort viewport;
    private boolean enabled = true;
    private transient float time;
    private transient float updateTime = 1 / 30f;
    private ByteBuffer cpuBuf;
    private LinkedList<Spatial> offscreenScenes;

    public RenderToTexture(String name, int width, int height, RenderManager renderManager) 
    {
        this(name, width, height, Format.RGBA8, 1, new Camera(width, height), renderManager);
    }


    public RenderToTexture(String name, int width, int height, Format format, int sample, Camera camera, RenderManager renderManager) 
    {
        //MemoryTracer.trace(this);
        setMinFilter(Texture.MinFilter.Trilinear);
        init(name, format, sample, camera, renderManager);
    }



    private void init(String viewPortName, Format format, int sample, Camera camera, RenderManager renderManager) 
    {
        this.renderManager = renderManager;
        final int width = camera.getWidth();
        final int height = camera.getHeight();
        viewport = renderManager.createPreView(viewPortName, camera);
        //viewport.setClearEnabled(true);
        viewport.setClearFlags(true, true, true);
        viewport.setBackgroundColor(ColorRGBA.BlackNoAlpha);
        // setup framebuffer's texture
        Image image = new Image(format, width, height, null);
        setImage(image);

        FrameBuffer frameBuffer = new FrameBuffer(width, height, sample);
        frameBuffer.setDepthBuffer(Format.Depth);
        frameBuffer.setColorTexture(this);

        viewport.setOutputFrameBuffer(frameBuffer);
    }



    public ViewPort getViewPort() 
    {
        return viewport;
    }

    public void setScene(final Spatial... target) 
    {
        viewport.clearScenes();
        attachScene(target);
    }

    public void attachScene(final Spatial... target) 
    {
        for (Spatial spatial : target) 
        {
            viewport.attachScene(spatial);
        }

    }

    public void attachOffscreenScene(final Spatial... target) 
    {
        if (offscreenScenes == null) 
        {
            offscreenScenes = new LinkedList<Spatial>();
            viewport.addProcessor(this);
        }
        for (Spatial s : target) 
        {
            offscreenScenes.add(s);
            viewport.attachScene(s);
        }
    }

    public boolean hasScene(final Spatial target) 
    {
        return viewport.getScenes().contains(target);
    }

    public Camera getCamera() 
    {
        return viewport.getCamera();
    }
    
    public void setBackgroundColor(ColorRGBA bg) 
    {
        viewport.setBackgroundColor(bg);
    }

    public void setRefreshTime(final float time) 
    {
        updateTime = time;
    }

    public void setEnabled(final boolean enable)
    {
        viewport.setEnabled(enable);
    }

    public void takeSnapshot() 
    {
        time = updateTime;
    }

    public void reset() 
    {
        viewport.clearScenes();
        setEnabled(false);
    }

    public void dispose() 
    {
        reset();
        setEnabled(false);
    }

    public BufferedImage getBufferedImage(BufferedImage bufferedImage) 
    {
        final int width = getCamera().getWidth();
        final int height = getCamera().getHeight();
        if (cpuBuf == null) 
        {
            cpuBuf = BufferUtils.createByteBuffer(width * height * 4);
        }
        if (bufferedImage == null) 
        {
            bufferedImage = new BufferedImage(width, height, BufferedImage.TYPE_4BYTE_ABGR);
        }
        cpuBuf.clear();
        renderManager.getRenderer().readFrameBuffer(viewport.getOutputFrameBuffer(), cpuBuf);

        synchronized (bufferedImage)
        {
            Screenshots.convertScreenShot(cpuBuf, bufferedImage);
        }

        return bufferedImage;
    }

    public void useRenderBuffer() 
    {
        FrameBuffer fb = getViewPort().getOutputFrameBuffer();
        fb.setDepthBuffer(Format.Depth);
        fb.setColorBuffer(Format.RGBA8);
    }

    public void initialize(RenderManager rm, ViewPort vp) 
    {

    }

    public void reshape(ViewPort vp, int w, int h) 
    {

    }

    public boolean isInitialized() 
    {
        // The viewport is created in the constructor, so this processor
        // is ready as soon as it has been constructed. (Returning false
        // here would make the RenderManager call initialize() every frame.)
        return viewport != null;
    }

    public void preFrame(float tpf) 
    {
        for (Spatial s : offscreenScenes) 
        {
            s.updateLogicalState(tpf);
            s.updateGeometricState();
        }
        time += tpf;
        if (time >= updateTime) 
        {
            time = 0;
            viewport.setEnabled(enabled);
        } 
        else 
        {
            viewport.setEnabled(false);
        }
    }

    public void postQueue(RenderQueue rq) 
    {

    }

    public void postFrame(FrameBuffer out) 
    {
        viewport.setEnabled(enabled);
    }

    public void cleanup() 
    {
        renderManager.removePreView(viewport);
    }

    public void read(JmeImporter importer) throws IOException 
    {

    }

    public void write(JmeExporter exporter) throws IOException 
    {

    }

}

Just an updated version of a helper class malova posted (actually, now that I post it, I’m wondering if it’ll work on 3.1…)

Usage:

RenderToTexture rtt = new RenderToTexture("RTT", chosenWidth, chosenHeight, Format.ABGR8, 0, cameraNewOrSameOne, renderManager);
rtt.attachOffscreenScene(someNodeThatIsNotTheRootNodeYouStartWithNorIsItAttachedToRootNode);

Since RenderToTexture is a texture, materials can be set to use it as a diffuse map, in filters, or anything else. Just thought I’d post it since I normally make a proper mess using render to texture, and this helped me keep things clean.
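Side note: the setRefreshTime()/takeSnapshot()/preFrame() trio in that class boils down to a small frame throttle: accumulate tpf and only enable the viewport once the interval has elapsed. Isolated as a standalone sketch (class name is mine, engine-free):

```java
// Standalone sketch of the refresh-throttle logic used in preFrame():
// accumulate per-frame time and only allow a render once the configured
// refresh interval has elapsed.
public final class RefreshThrottle {
    private float time;
    private float updateTime = 1 / 30f; // default: at most ~30 renders/sec

    public void setRefreshTime(float seconds) {
        updateTime = seconds;
    }

    /** Force a render on the next tick, mirroring takeSnapshot(). */
    public void takeSnapshot() {
        time = updateTime;
    }

    /** Returns true when enough time has accumulated to render this frame. */
    public boolean shouldRender(float tpf) {
        time += tpf;
        if (time >= updateTime) {
            time = 0;
            return true;
        }
        return false;
    }
}
```

In the helper class, shouldRender()'s result is what drives viewport.setEnabled() each frame.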


Thanks guys. That should get me sorted.

To answer your questions: I was trying to render a quad attached to a node, but I would always get an error that I can’t change the scene graph in a different thread, or that the scene wasn’t initialized properly. Yet I was obviously in the same thread.

I did spend some time looking at the different filters and things, but totally missed RenderManager.renderGeometry. That is perfect for my current needs.

This happens when you are trying to change something about a Spatial after updateGeometricState() has already been called on it. Multithreading is just the most likely case that bites people, so that’s what the error mentions. (99% of the cases are thread issues.)

Edit: or alternately, if you are trying to render a Spatial through certain render methods without having called updateGeometricState() on it.

Just for reference, does calling updateGeometricState again fix this issue?

Since I am this deep down the rabbit hole, I may as well learn all that I can.

Yes… for various definitions of “fix”.

The real answer is “it depends”.

This is a sanity check put into the engine because usually if you try to render something that’s dirty then that’s bad. updateGeometricState() makes it “undirty” because it applies transforms, resolves lighting, whatever… to get it ready for rendering.

Whether this “fixes” it is situational. For example, rendering something, modifying it, and then rendering it again in a different pass is often a bug itself. In which case the flag check is properly showing you a problem.

The other common case where people see this is when they are making their own viewports and not managing them properly. (See the probably few hundred threads that mention ViewportAppState or whatever. :))

Indeed, I have read those threads. In my case this would be OK, since I was trying to have my own independent “scene” that only ever gets rendered to a texture/viewport, i.e. copying the rootNode life cycle. Obviously doing this would have performance implications.

However, for what I really want to do, renderManager.renderGeometry is a much better match for my needs.

Thanks again.