Non-rectangular/diagonal viewport

hi,

i wish to divide the window/screen into four parts diagonally.

imagine you have a rectangular screen and you draw a line from the top left corner to the bottom right corner, and another from the bottom left corner to the top right corner. it divides the screen into 4 triangles.

i will show the four sides of the same node in four viewports. are non-rectangular viewports possible in jME3?

if not, is there another engine you can suggest?

thanks.


Could you maybe explain why? It might help us understand how one would attempt such a thing.


i will project the screen onto a glass prism to achieve a holographic effect.


I would probably render each viewport to a different texture and then display those textures on triangle meshes.

A couple of options for the mesh: you could do four different triangle mesh objects, each with its own material, or you could do one mesh object comprised of four triangles and use a single material that uses a second set of uv coordinates, vertex colors, or a splat map to determine which texture should be displayed on each face. The latter option would have better performance, but with just four triangles it’s probably not that big of a difference.


A post process would give you the screen as a texture, and you could slice it up however you see fit. That would only require shader knowledge. I’m quite certain I would head down that route. Multiple viewports sounds like a hammer-nut situation.


Yeah, a post process would probably be the more performant method and the easiest to set up, but it would depend on how much refraction for the prism we’re talking about. If you really want to show “four sides of a node” you’d need 360° of visibility, and I don’t think you could accomplish that with one viewport.


Even if you don’t want to do render to texture, you could render four overlapping viewports and just make sure they mask off the triangle area… either through z-buffer tricks (filling the near plane for the masked parts but using a transparent color) or stenciling (I’ve never used it before so I don’t really know).

You’d be rendering a lot of overdraw, but it would be discarded early because of the z-buffer trick.
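A minimal sketch of that masking idea in shader form (assuming a full-screen pass where `texCoord` runs 0..1 across the quad with 1.0 at the top edge — these names are illustrative, not from an actual jME material): fragments outside the viewport’s triangle are simply discarded.

```glsl
varying vec2 texCoord; // assumed to run 0..1 across a full-screen quad

void main() {
    // Keep only the top triangle of the screen: the region above both
    // diagonals y = x and y = 1 - x (with y = 1.0 at the top edge).
    if (texCoord.y < texCoord.x || texCoord.y < 1.0 - texCoord.x) {
        discard;
    }
    gl_FragColor = vec4(1.0); // placeholder; sample your scene texture here
}
```

The other three triangles just flip the comparisons; z-buffer or stencil masking would achieve the same per-pixel rejection without a shader branch.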


You could use FXAA more easily with pspeed’s method. With render to texture you’d have to either apply the FXAA post-process filter, or render to multi-sample textures, then copy those over to single-sample textures and apply those to your materials. With a post processor you’d have the same options for FXAA as render to texture, but you’d probably need to rewrite the post processor if you wanted to go the copy-from-one-texture-to-another route.

Copying from one texture to another in order to use anti-aliasing on a render texture will yield a better quality result, but takes up more memory. I can’t comment on the performance characteristics of the two methods.


I appreciate all of the answers. I am trying to make sense of it all with my 20+ years of general programming but zero years of 3D/jME3 experience. It doesn’t sound “english” to me so far. :slight_smile:

this is what i’m trying to accomplish: https://ibb.co/jTJ8xw

I would appreciate any sample code or links, along with comments, to get me started in the right direction.

Thank you.


I’d probably use a post processor for that. Looks like the bottom is a reflection of the top and the left is a reflection of the right?

I don’t have my computer in front of me, but basically the post processor material will accept the rendered scene as a texture and manipulate the uv coordinates of the quad it’s drawn with.

Something like:

uniform sampler2D sceneTexture;
varying vec2 texcoord;

void main() {
    vec2 uv = texcoord;
    if (texcoord.x < 0.5) {
        uv.x = 1.0 - texcoord.x;
    }
    if (texcoord.y < 0.5) {
        uv.y = 1.0 - texcoord.y;
    }

    gl_FragColor = texture2D(sceneTexture, uv);
}

there is no (mirror) reflection. imagine i put a model car on a table at eye level. i took four dead center pictures from four sides. i wish to put those pictures on triangles with the orientation shown in the example i provided.


Oh I see, yeah you’d want four viewports and either render them to textures and then apply those textures to four triangles, or use pspeed’s solution. I don’t have my computer in front of me, so maybe @pspeed can supply you with some example code.


thank you all for the really helpful comments. there is a lot for me to digest, and i have learned what to google for. thanks.


I had some time in front of the computer today and thought it might be nice for anyone who comes across this topic to see some code on the subject of render to texture. So I updated my jME-GMath library, http://1337atr.weebly.com/gmath.html, with some methods that create a ViewPort that renders to a Texture2D.

First the code in jME-GMath:

/**
 * <p>Creates a <code>ViewPort</code> that will render its scene to a
 * <code>Texture2D</code>.</p>
 * 
 * @param name The name of the <code>ViewPort</code>
 * @param width The width, in pixels, of the desired texture.
 * @param height The height, in pixels, of the desired texture.
 * @param fieldOfView The field of view, in degrees, of the camera.
 * @param near Objects closer to the camera than this value will be clipped.
 * @param far Objects farther from the camera than this value will be clipped.
 * @param samples The number of samples to use in a multi-sampled framebuffer.
 * @return A <code>ViewPort</code> that will render the contents of its scene
 * to a texture that can be obtained via {@code (Texture2D)ViewPort.getOutputFrameBuffer().getColorBuffer().getTexture();}.
 */
public static ViewPort renTex(String name, int width, int height, float fieldOfView, float near,
        float far, int samples) {
    Camera cam = new Camera(width, height);
    cam.setFrustumPerspective(fieldOfView, (float)width / height, near, far);
    return renTex(name, cam, Image.Format.ARGB8, Image.Format.Depth, samples);
}

/**
 * <p>Creates a <code>ViewPort</code> that will render its scene to a
 * <code>Texture2D</code>.</p>
 * 
 * @param name The name of the <code>ViewPort</code>.
 * @param cam The <code>Camera</code> to render the scene with.
 * @param colorFormat The <code>Image.Format</code> for the texture.
 * @param depthFormat The <code>Image.Format</code> to use for the depth buffer.
 * @param samples The number of samples to use in a multi-sampled framebuffer.
 * @return A <code>ViewPort</code> that will render the contents of its scene
 * to a texture that can be obtained via {@code (Texture2D)ViewPort.getOutputFrameBuffer().getColorBuffer().getTexture();}.
 */
public static ViewPort renTex(String name, Camera cam, Image.Format colorFormat,
        Image.Format depthFormat, int samples) {
    FrameBuffer buf = new FrameBuffer(cam.getWidth(), cam.getHeight(), samples);
    buf.setDepthBuffer(depthFormat);
    Texture2D tex = new Texture2D(cam.getWidth(), cam.getHeight(), samples, colorFormat);
    tex.setWrap(Texture.WrapMode.Repeat);
    buf.addColorTexture(tex);
    
    ViewPort vp = new ViewPort(name, cam);
    vp.setOutputFrameBuffer(buf);
    
    return vp;
}

There are additional methods in the GMath library with varying degrees of default values, but the above two show the meat of it.

An example on how this could be used in a jME application:

private Node renderToTexScene;
    
@Override
public void simpleInitApp() {
    getViewPort().setBackgroundColor(ColorRGBA.White);
    
    //All nodes attached to this node will render to our texture
    renderToTexScene = new Node("RenderToTextureScene");
    //Create the ViewPort and texture for our render to texture scene.
    ViewPort vpt = GMath.renTex("RenderToTexture", 640, 640, 45, 1, 1000, 1);
    //jME doesn't allow adding your custom created ViewPorts to one of the default
    //render queues so we can either render it ourselves or copy the data over to
    //a ViewPort created via the RenderManager as such
    ViewPort vp = renderManager.createPreView("RenderToTexture", vpt.getCamera());
    vp.setClearFlags(true, true, true);
    vp.setOutputFrameBuffer(vpt.getOutputFrameBuffer());
    vp.setBackgroundColor(ColorRGBA.Black);
    vp.attachScene(renderToTexScene);
    
    //Move and rotate the camera that will render to the texture
    vp.getCamera().setLocation(new Vector3f(0, 0, 10));
    vp.getCamera().lookAt(Vector3f.ZERO, Vector3f.UNIT_Y);
    
    //Get the texture we are rendering to
    Texture2D renderTexture = (Texture2D)vp.getOutputFrameBuffer().getColorBuffer().getTexture();
    
    //Create a box to display on a texture
    Box box = new Box(2, 2, 2);
    Geometry boxGeom = new Geometry("Box", box);
    boxGeom.rotate(0, 37 * FastMath.DEG_TO_RAD, 0);
    Material boxMat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
    boxMat.setColor("Color", ColorRGBA.Blue);
    boxGeom.setMaterial(boxMat);
    //Attach it to our render to texture scene
    renderToTexScene.attachChild(boxGeom);
    
    //Create a quad to display our texture
    Quad quad = new Quad(5, 5);
    Geometry quadGeom = new Geometry("Quad", quad);
    quadGeom.rotate(0, 23 * FastMath.DEG_TO_RAD, 0);
    quadGeom.setLocalTranslation(-5, -2.5f, 0);
    Material quadMat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
    //Assign the texture we render to
    quadMat.setTexture("ColorMap", renderTexture);
    quadGeom.setMaterial(quadMat);
    //Attach it to the default root node
    rootNode.attachChild(quadGeom);
}

@Override
public void simpleUpdate(float tpf) {
    //Do stuff with the render to texture scene here.
    
    renderToTexScene.updateLogicalState(tpf);
    renderToTexScene.updateGeometricState();
}

@Tryder hi, the texture is still rectangular. :slight_smile: how can I render this onto a triangular texture? Thanks.


You can’t, to the best of my knowledge, render to a non-rectangular texture, but you can render to rectangular textures then render those textures on triangular meshes.

I put together a few things to get you going here. First we have the Geometry/Mesh, which creates a plane comprised of 4 triangles with different vertex colors assigned to each triangle so they can be distinguished from one another in the accompanying shader.

import com.jme3.math.ColorRGBA;
import com.jme3.scene.Mesh;
import com.jme3.scene.VertexBuffer;
import com.jme3.scene.Geometry;
import com.jme3.util.BufferUtils;

import java.nio.ByteBuffer;
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;

/**
 *
 * @author Adam T. Ryder http://1337atr.weebly.com
 */
public class TriDisplay extends Geometry {
	public TriDisplay(String name, final float width, final float height) {
		super(name);
		setMesh(new TriMesh(width, height));
	}
	
	private class TriMesh extends Mesh {
		private TriMesh(final float width, final float height) {
			updateGeometry(width, height);
		}
	
		public void updateGeometry(final float width, final float height) {
		    FloatBuffer verts = BufferUtils.createVector3Buffer(12);
		    FloatBuffer tex = BufferUtils.createVector2Buffer(12);
		    ByteBuffer col = BufferUtils.createByteBuffer(48);
		    ShortBuffer indices = BufferUtils.createShortBuffer(12);
			
			int blue = new ColorRGBA(0, 0, 1, 0).asIntABGR();
			int green = new ColorRGBA(0, 1, 0, 0).asIntABGR();
			int red = new ColorRGBA(1, 0, 0, 0).asIntABGR();
			int alp = new ColorRGBA(0, 0, 0, 1).asIntABGR();
			
			//Top triangle
			verts.put(-width / 2f);
			verts.put(height / 2f);
			verts.put(0);
			tex.put(0);
			tex.put(0);
			col.putInt(blue);
			
			verts.put(width / 2f);
			verts.put(height / 2f);
			verts.put(0);
			tex.put(1);
			tex.put(0);
			col.putInt(blue);
			
			verts.put(0);
			verts.put(0);
			verts.put(0);
			tex.put(0.5f);
			tex.put(1);
			col.putInt(blue);
			
			indices.put((short)0);
			indices.put((short)2);
			indices.put((short)1);
			
			//Right triangle
			verts.put(width / 2f);
			verts.put(height / 2f);
			verts.put(0);
			tex.put(1);
			tex.put(0);
			col.putInt(green);
			
			verts.put(width / 2f);
			verts.put(-height / 2f);
			verts.put(0);
			tex.put(1);
			tex.put(1);
			col.putInt(green);
			
			verts.put(0);
			verts.put(0);
			verts.put(0);
			tex.put(0);
			tex.put(0.5f);
			col.putInt(green);
			
			indices.put((short)3);
			indices.put((short)5);
			indices.put((short)4);
			
			//Bottom triangle
			verts.put(-width / 2f);
			verts.put(-height / 2f);
			verts.put(0);
			tex.put(0);
			tex.put(1);
			col.putInt(red);
			
			verts.put(width / 2f);
			verts.put(-height / 2f);
			verts.put(0);
			tex.put(1);
			tex.put(1);
			col.putInt(red);
			
			verts.put(0);
			verts.put(0);
			verts.put(0);
			tex.put(0.5f);
			tex.put(0);
			col.putInt(red);
			
			indices.put((short)6);
			indices.put((short)7);
			indices.put((short)8);
			
			//Left triangle
			verts.put(-width / 2f);
			verts.put(height / 2f);
			verts.put(0);
			tex.put(0);
			tex.put(0);
			col.putInt(alp);
			
			verts.put(-width / 2f);
			verts.put(-height / 2f);
			verts.put(0);
			tex.put(0);
			tex.put(1);
			col.putInt(alp);
			
			verts.put(0);
			verts.put(0);
			verts.put(0);
			tex.put(1);
			tex.put(0.5f);
			col.putInt(alp);
			
			indices.put((short)9);
			indices.put((short)10);
			indices.put((short)11);
			
			verts.flip();
		    VertexBuffer vb = new VertexBuffer(VertexBuffer.Type.Position);
		    vb.setupData(VertexBuffer.Usage.Stream, 3, VertexBuffer.Format.Float, verts);
		    setBuffer(vb);
		    
		    tex.flip();
		    vb = new VertexBuffer(VertexBuffer.Type.TexCoord);
		    vb.setupData(VertexBuffer.Usage.Static, 2, VertexBuffer.Format.Float, tex);
		    setBuffer(vb);
		    
		    col.flip();
		    vb = new VertexBuffer(VertexBuffer.Type.Color);
		    vb.setupData(VertexBuffer.Usage.Stream, 4, VertexBuffer.Format.UnsignedByte, col);
		    vb.setNormalized(true);
		    setBuffer(vb);
		    
		    indices.flip();
		    vb = new VertexBuffer(VertexBuffer.Type.Index);
		    vb.setupData(VertexBuffer.Usage.Static, 3, VertexBuffer.Format.UnsignedShort, indices);
		    setBuffer(vb);
		    
		    updateBound();
		}
	}
}

In the constructor, the name can be anything you like; it doesn’t have to be unique. The width and height values should be set to the width and height of the view in world coordinates. For example, if you have an orthographic camera with a frustum ranging from -0.5 on the left to 0.5 on the right and from 0.25 on the top to -0.25 on the bottom, then you’d enter a width of 1 and a height of 0.5.
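As a tiny illustration of the rule above (`FrustumSize` is a hypothetical helper, not part of jME or GMath): the mesh dimensions are just the extents of the orthographic frustum.

```java
// Hypothetical helper illustrating the width/height rule described above.
public class FrustumSize {
    static float width(float left, float right) {
        return right - left;
    }

    static float height(float bottom, float top) {
        return top - bottom;
    }

    public static void main(String[] args) {
        // The example from the text: a frustum spanning -0.5..0.5 horizontally
        // and -0.25..0.25 vertically gives a 1 x 0.5 TriDisplay.
        System.out.println(FrustumSize.width(-0.5f, 0.5f));    // 1.0
        System.out.println(FrustumSize.height(-0.25f, 0.25f)); // 0.5
    }
}
```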

Now let’s take a look at the shader starting with the vertex shader:

uniform mat4 g_WorldViewProjectionMatrix;
attribute vec3 inPosition;
attribute vec2 inTexCoord;
attribute vec4 inColor;

uniform float m_yscale;

varying vec2 texCoord;
varying vec4 vertCol;

void main() {
	vertCol = inColor;
	if (vertCol.b > 0.01) {
		texCoord = vec2(inTexCoord.x, inTexCoord.y * m_yscale);
	} else if (vertCol.g > 0.01) {
		texCoord = vec2((inTexCoord.x * m_yscale) + m_yscale, inTexCoord.y);
	} else if (vertCol.r > 0.01) {
		texCoord = vec2(inTexCoord.x, (inTexCoord.y * m_yscale) + m_yscale);
	} else {
		texCoord = vec2(inTexCoord.x * m_yscale, inTexCoord.y);
	}
	
    gl_Position = g_WorldViewProjectionMatrix * vec4(inPosition, 1.0);
}

The fragment shader:

uniform sampler2D m_tex1;
uniform sampler2D m_tex2;
uniform sampler2D m_tex3;
uniform sampler2D m_tex4;

varying vec2 texCoord;
varying vec4 vertCol;

void main() {
	vec4 col = vertCol;
	if (vertCol.b > 0.01) {
		col = texture2D(m_tex1, texCoord);
	} else if (vertCol.g > 0.01) {
		col = texture2D(m_tex2, texCoord);
	} else if (vertCol.r > 0.01) {
		col = texture2D(m_tex3, texCoord);
	} else {
		col = texture2D(m_tex4, texCoord);
	}
	
	gl_FragColor = col;
}

And finally the material definition:

MaterialDef trishade {
    MaterialParameters {
		Texture2D tex1
		Texture2D tex2
		Texture2D tex3
		Texture2D tex4
		Float yscale : 1.0
    }
    Technique {
        VertexShader GLSL120: Shaders/trishade.vert
        FragmentShader GLSL120: Shaders/trishade.frag
        
        WorldParameters {
            WorldViewProjectionMatrix
        }
		
		RenderState {
            Blend Alpha
        }
    }
}

What this does is use the vertex colors assigned to each vertex to determine which texture to display, but it also allows you to stretch the textures to help match the aspect ratio of the textures you’re using. For instance, if your top and bottom textures are rendered with a height of screenHeight / 2 and your left/right textures are rendered with a width of screenWidth / 2, everything should match up fine. If you’re rendering each of your textures at full screen resolution, then you’ll probably want to set the material’s yscale value to 0.5.

Let’s set up the default camera so it can easily render our TriDisplay Geometry.

cam.setParallelProjection(true);
cam.setLocation(new Vector3f(0, 0, 4f));
float ratio = (float)settings.getHeight() / settings.getWidth();
cam.setFrustum(1f, 5f, -0.5f, 0.5f, ratio / 2, -ratio / 2);

The above sets up an orthographic camera that can see everything from -0.5 to 0.5 on the x axis, and everything from negative half the height-to-width ratio to positive half that ratio on the y axis.

Now all you need to do is instantiate a TriDisplay, add it to the rootNode and assign the custom material to it:

Geometry view = new TriDisplay("TriDisplay", 1, ratio);
rootNode.attachChild(view);

Material mat = new Material(assetManager, "MatDefs/trishade.j3md");
mat.setTexture("tex1", myTexture1); //Top
mat.setTexture("tex2", myTexture2); //Right
mat.setTexture("tex3", myTexture3); //Bottom
mat.setTexture("tex4", myTexture4); //Left
mat.setFloat("yscale", 0.5f); //Only if your textures are rendered at full screen resolution or otherwise need to be stretched or squished to fit the triangles properly.
view.setMaterial(mat);