Can a Viewport be a part of a screen, inside a panel?

Hello guys,
I am trying to create a game where the user can type commands and a robot will respond to those commands. The screen will be split into two basic parts, left and right: on the right there will be an editor, and on the left the main game with the whole scene and everything.

So, to do that, I created a Nifty GUI in XML with a square panel on the left. Then I thought I could integrate the whole viewport into that panel, just the way you integrate an image. The problem is that I can’t find out how to do that, and I don’t even know if it is possible.

So, is it possible, and if so, how? And if it is not possible, what’s the correct way?

If Nifty doesn’t support it directly, JME does! Just overlay a quad carrying the framebuffer’s texture from your offscreen viewport on top of your Nifty UI. There are fairly easy-to-follow tutorials on how to do this among JME’s tutorials.
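For example, something along these lines should work (an untested sketch; it assumes you are inside a SimpleApplication, so settings and guiNode are in scope, and that offTex is the Texture2D your offscreen viewport renders into):

[java]
// sketch: show the offscreen texture on a GUI-bucket quad
Picture gameView = new Picture("game view");
gameView.setTexture(assetManager, offTex, false);
gameView.setWidth(settings.getWidth() / 2);  // cover the left half of the screen
gameView.setHeight(settings.getHeight());
gameView.setPosition(0, 0);
// Gui-bucket geometry is normally drawn after Nifty, so this appears above the UI
guiNode.attachChild(gameView);
[/java]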

@SeriousGamer

Also, there are some samples pertaining to this that you can look at if you create the JME tests project under New Projects:
TestMultiViews under render
TestAWTPanels under awt

That might help as well. (edit) Though they are supplementary info only. (edit)

(edit)
It occurs to me to clarify: the first shows how to split a viewport up between cameras, and the second shows how to split viewports up into separately rendering, separately visible panels.
(edit)
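Roughly, the first test boils down to something like this (a sketch from memory, so double-check the actual source):

[java]
// clone the default cam and confine the clone to the left half of the screen
Camera cam2 = cam.clone();
cam2.setViewPort(0f, 0.5f, 0f, 1f);
cam2.setLocation(new Vector3f(0f, 0f, 10f));
cam2.lookAt(Vector3f.ZERO, Vector3f.UNIT_Y);

ViewPort view2 = renderManager.createMainView("Left View", cam2);
view2.setClearFlags(true, true, true);
view2.attachScene(rootNode); // same scene, rendered by a second camera
[/java]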

Unfortunately the first sample leads to a 404 error, and for the second there are some reported problems, as I found in some topics. What I tried is to create an image from the viewport like this:
[java]
Image image = app.getViewPort().getOutputFrameBuffer().getDepthBuffer().getTexture().getImage();
[/java]

I don’t know if this works, because I tried to integrate it with ImageBuilder, but ImageBuilder reads from a file. So, if this code works, how can I integrate the image into a panel?

@Relic724 said: Also, there are some samples pertaining to this that you can look at […] TestMultiViews under render, TestAWTPanels under awt […] the first shows how to split a viewport up between cameras, and the second shows how to split viewports up into separately rendering, separately visible panels

I wish I could remember the specific test, but I believe there is also one that covers rendering to texture, which may be more suited to the above scenario… maybe.

Oh! I found it! Yeah, the TestRenderToTexture example.

It is actually one of the ones I’m studying now. It boils down to setting up a camera, a viewport, and a texture (to receive the “color” or “diffuse” map), then attaching the texture to a framebuffer, which is in turn attached to the renderManager of your app as a PreView.

That scene is then rendered before any “MainView” in the rendering pipeline. The “magic” of the setup is in this method:

[java]
// these two declarations are fields of the enclosing class
private Geometry offBox;
private ViewPort offView;

public Texture setupOffscreenView(){
    Camera offCamera = new Camera(512, 512);

    offView = renderManager.createPreView("Offscreen View", offCamera);
    offView.setClearFlags(true, true, true);
    offView.setBackgroundColor(ColorRGBA.DarkGray);

    // create offscreen framebuffer
    FrameBuffer offBuffer = new FrameBuffer(512, 512, 1);

    //setup framebuffer's cam
    offCamera.setFrustumPerspective(45f, 1f, 1f, 1000f);
    offCamera.setLocation(new Vector3f(0f, 0f, -5f));
    offCamera.lookAt(new Vector3f(0f, 0f, 0f), Vector3f.UNIT_Y);

    //setup framebuffer's texture
    Texture2D offTex = new Texture2D(512, 512, Format.RGBA8);
    offTex.setMinFilter(Texture.MinFilter.Trilinear);
    offTex.setMagFilter(Texture.MagFilter.Bilinear);

    //setup framebuffer to use texture
    offBuffer.setDepthBuffer(Format.Depth);
    offBuffer.setColorTexture(offTex);
    
    //set viewport to render to offscreen framebuffer
    offView.setOutputFrameBuffer(offBuffer);

    // setup framebuffer's scene
    Box boxMesh = new Box(Vector3f.ZERO, 1,1,1);
    Material material = assetManager.loadMaterial("Interface/Logo/Logo.j3m");
    offBox = new Geometry("box", boxMesh);
    offBox.setMaterial(material);

    // attach the scene to the viewport to be rendered
    offView.attachScene(offBox);
    
    return offTex;
}

[/java]

Notice that you can have an entire scene rendered this way… in the example, the material is a simple unshaded type (Logo.j3m).
You can then use the returned texture on another geometry, like a Quad, in its material setup:

[java]
Texture offTex = setupOffscreenView();

Material mat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
mat.setTexture("ColorMap", offTex);
quad.setMaterial(mat);
[/java]

And because any scene you create in a PreView has to be constructed fully lit… the resulting texture should always be treated as a “color” or “diffuse” map when used on a geometry.
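For instance, lighting the offscreen scene yourself might look like this (a sketch; instead of attaching offBox directly, hang it under a lit node):

[java]
// use a Node as the offscreen root so a light can affect the whole scene
Node offRoot = new Node("offscreen root");
DirectionalLight sun = new DirectionalLight();
sun.setDirection(new Vector3f(-1f, -1f, -1f).normalizeLocal());
offRoot.addLight(sun);
offRoot.attachChild(offBox);
offView.attachScene(offRoot);
// a scene outside rootNode must be updated manually in simpleUpdate():
// offRoot.updateLogicalState(tpf); offRoot.updateGeometricState();
[/java]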

(edit) The code is snipped from the example; there is no javadoc crediting the original author. (edit)
hope that helps.

That reminds me of the Nifty GUI projection tutorial. So you can project the texture that is created from the buffer onto a Quad (512x512), and this viewport will be on top of the Nifty GUI? I guess your code would have to run in an update method so that the texture is refreshed each frame.

As far as I can tell, the Nifty GUI documentation doesn’t say anything about interacting with other systems. So I think I should create a GUI and place the viewport on top of it, but smaller and at a specific position. Can this be done without rendering to textures?

@SeriousGamer

Yes. Read on.

From what I’ve learned so far, the Application class, by default, sets up two Nodes, two ViewPorts (one main view, and one post view for the GUI to overlay), and two Cameras (one with a 45-degree perspective frustum, and the other orthogonal, ignoring scaling in Z, for the GUI). The SimpleApplication class extends Application by adding a control (flyCam) for the main view’s camera, and adds some GUI elements to the guiNode to show FPS, vertex, and triangle counts and such.

So,
If you look at the code, the pixel width/height is initialized when the Camera object is constructed:

[java]
Camera myCam = new Camera(myXpixels, myYpixels);
[/java]

The viewport then renders to the screen by default, unless you want to do the whole “using a PreView as a MainView texture” trick:

[java]
// the string just needs to be a unique name for your view; the attached camera
// defines how many pixels to render
ViewPort myViewport = renderManager.createMainView("My Special Viewport", myCam);
// also, myViewport.setOutputFrameBuffer(myBuffer) is the data rerouter that makes
// the output render onto something other than your screen
[/java]

Finally, and this is probably the magic method you were looking for:
[java]
// signature: setViewPort(float left, float right, float bottom, float top)
// each value goes from 0.0f to 1.0f
// for example, to render to the right half of your screen:
myCam.setViewPort(0.5f, 1f, 0f, 1f);
[/java]

To see how easily you can manipulate your cameras in a SimpleApplication, just try these two lines:

[java]
this.getViewPort().getCamera().setViewPort(0f, 0.5f, 0f, 1f);
this.getGuiViewPort().getCamera().setViewPort(0.5f, 1f, 0f, 1f);
// you'll notice some odd scaling of the GUI and objects because the cameras'
// width/height have not been altered... all we have done is split what is normally
// the GUI laid on top of the normally rendered scene into left/right halves of the screen
[/java]
This is not the proper way to do it, just “a” way to show the principle of the whole Node/Camera/ViewPort relationship, and the fact that jMonkeyEngine sets up a couple of them for you to play with by default.
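If you wanted to fix that scaling for the main view, one way (a sketch, assuming the usual 45-degree field of view) is to recompute the frustum aspect for the area the camera actually covers:

[java]
Camera mainCam = this.getViewPort().getCamera();
mainCam.setViewPort(0f, 0.5f, 0f, 1f); // left half
// recompute the aspect for a half-width viewport so the scene isn't squashed
float aspect = (settings.getWidth() * 0.5f) / settings.getHeight();
mainCam.setFrustumPerspective(45f, aspect, 1f, 1000f);
[/java]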

I hope this helps.


Yes, it worked! I am very excited! Thank you!

EDIT: But what if we need this screen on top of the GUI?

@SeriousGamer said: Yes, it worked! […] But what if we need this screen on top of the GUI?

Ah, so kind of like having a GUI as the background for your game to play out upon? Sure thing, if I’m understanding you correctly. You would create a new node/cam/viewport setup similar to JME’s guiNode/guiCam/guiViewPort, with the exception that you would use a statement like:

[java]
// ViewPort guiBackViewPort;
guiBackViewPort = renderManager.createPreView("GUI Background ViewPort", guiBackCam);
[/java]
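A fuller sketch of that setup, modeled on how SimpleApplication builds its own guiNode (the names are just suggestions):

[java]
Camera guiBackCam = new Camera(settings.getWidth(), settings.getHeight());
ViewPort guiBackViewPort = renderManager.createPreView("GUI Background ViewPort", guiBackCam);
guiBackViewPort.setClearFlags(true, true, true);

Node guiBackNode = new Node("GUI Background");
guiBackNode.setQueueBucket(RenderQueue.Bucket.Gui); // ortho rendering, pixel coordinates
guiBackNode.setCullHint(Spatial.CullHint.Never);
guiBackViewPort.attachScene(guiBackNode);
// again, a scene outside rootNode needs manual updates in simpleUpdate():
// guiBackNode.updateLogicalState(tpf); guiBackNode.updateGeometricState();
[/java]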

The cool part is that you can learn a lot about a typical JME class by right-clicking the thing you want to learn about and choosing the Go To Source option in the context menu. You can get the rest of the info you need by looking at the Application and SimpleApplication (for controls and input mapping) classes.

I have noticed that the renderManager has three major levels of rendering order: PreView, MainView, and PostView, which come by default in the Application class. I mention this because I may have a question on “rendering order dependencies” at some point later on.

The reason I learned how to render offscreen to a framebuffer is so that I could make several scenes and affect the “rendering order” just by adjusting each quad’s relationship to the camera (the z coordinate). Further cool stuff is available by setting the offscreen viewport’s clear color:

[java]
someViewPort.setBackgroundColor(new ColorRGBA(1.0f, 1.0f, 1.0f, 0.0f)); // zero alpha is "invisible"
[/java]

and then adjusting the quad’s rendering style:

[java]
renderedQuad.getMaterial().getAdditionalRenderState().setBlendMode(RenderState.BlendMode.Alpha);
[/java]

So then, whatever pixels don’t get rendered just pass through to your GUI background, like a tabletop picture or something. Um, some cool examples of this can be found in the “A ‘3D’ Capable GUI” thread.
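Putting those pieces together, the overlay quad might look something like this (a sketch; offTex would come from a setupOffscreenView()-style method using the transparent clear color above):

[java]
Material overlayMat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
overlayMat.setTexture("ColorMap", offTex);
overlayMat.getAdditionalRenderState().setBlendMode(RenderState.BlendMode.Alpha);

Geometry renderedQuad = new Geometry("overlay", new Quad(512, 512));
renderedQuad.setMaterial(overlayMat);
renderedQuad.setQueueBucket(RenderQueue.Bucket.Transparent); // respect the alpha
rootNode.attachChild(renderedQuad);
[/java]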

Game On


I did what you told me and it worked! I also now have a better understanding of how this whole thing works. I have to say that going to the source gives a lot of information and much better understanding. So here’s my code:

[java]
Camera guiCam = new Camera(app.getContext().getSettings().getWidth(), app.getContext().getSettings().getHeight());
ViewPort guiBackViewPort = app.getRenderManager().createPreView("guiBackViewPort", guiCam);
NiftyJmeDisplay niftyDisplay = new NiftyJmeDisplay(assetManager, inputManager, app.getAudioRenderer(), guiBackViewPort);
Nifty nifty = niftyDisplay.getNifty();
nifty.fromXml("Interface/GameGui.xml", "game");
guiBackViewPort.addProcessor(niftyDisplay);
app.getViewPort().getCamera().setViewPort(0f, 0.7f, 0.18f, 0.9f);
[/java]

My class extends AbstractAppState; that’s why I use the getter methods. Please tell me if you think there is a better way, or if there is anything else worth discussing.