Is BitmapText to BufferedImage possible?

I am developing a boardgame designer where users can define the widgets that make up their game simply in a set of json files like this:

{
	"Name": "Greenseer",
	"TemplateName": "Default",
	"TemplateTexts": {
		"Name": [
			"Greenseer"
		],
		"Attack": [
			"1"
		],
		"Health": [
			"2"
		],
		"Description": [
			"<a>Passive:</a> Play with the top of your deck revealed. You may play any non-instants as if they were in your hand."
		],
		"Tribe": [
			"NONE"
		]
	},
	"TemplateImages": {
		"Art": "Art/greenseer.png"
	}
}

They can run the program to play their game in a networked 3D tabletop environment, which looks like this:

[screenshot of the 3D tabletop environment]

I am working on letting the user export their widgets to manufacturer templates for physical printing. I couldn’t think of a way to simply write the whole Node, with all of its quad textures and BitmapTexts as rendered on screen, out to a PNG. So I wrote some code that separately scales all of the images to recreate a PNG that fits the manufacturer’s template. But the BitmapText is causing issues for a number of reasons. If possible, I’d rather not write a set of equivalents using the Graphics drawString() function on a BufferedImage to reproduce the text results that are being rendered on screen.

So I have 2 questions:

  1. Is there actually a way to write the whole Node with all of the quad textures and bitmaptexts as rendered on screen to a BufferedImage?
  2. Is there a way specifically for BitmapText to be rendered to a BufferedImage?

What are those reasons?

There is no point in rendering separate texts.
Have a look at TestRenderToTexture.java, that’s probably what you want.

Basically you have a separate camera, viewport etc and then you only add the card node there, align the camera and then save the viewport content.

The quality on screen will be noticeably worse than the print quality you desire. I would seriously suggest you use your card object as a data source and render a printable card using a better method than a “print screen”, otherwise it will in effect just look like a low-res photo.

Yes, just RenderToTexture as suggested, with a high enough resolution. Then just download the texture from the GPU; you can convert the raw data to a BufferedImage and save it.

I believe you may be looking for the ImagePainter lib.

ImagePainter works in the other direction

Is there actually a way to write the whole Node with all of the quad textures and bitmaptexts as rendered on screen to a BufferedImage?

ImagePainter works in the other direction

It… generates BitmapText nodes and textures from a BufferedImage? Hah! If it really did that, it would be an expensive piece of neural-network software.

Of the options on offer, the RenderToTexture example seems to be the best fit so far. Here is the code I am using to output a PNG for each of my aforementioned “Nodes”; I’m getting transparent images of size 512x512. I would guess it’s because the viewport never actually renders, although it could be for another reason. From the RenderToTexture example I don’t see much of a difference in terms of the viewport explicitly or implicitly being told to render. Any insights?

Also, as manufacturer templates could have quite a high resolution, am I going to run into issues creating and rendering to a large FrameBuffer, at 4K resolution for example?

public void exportManufacture(ClientModel cm) {
	try {
		Camera offCamera = new Camera(512, 512);
		ViewPort offView;

		offView = cm.app.getRenderManager().createPreView("Offscreen View", offCamera);
		offView.setClearFlags(true, true, true);
		offView.setBackgroundColor(ColorRGBA.DarkGray);

		// create offscreen framebuffer
		FrameBuffer offBuffer = new FrameBuffer(512, 512, 1);

		// setup framebuffer's cam
		offCamera.setFrustumPerspective(45f, 1f, 1f, 1000f);
		offCamera.setLocation(new Vector3f(0f, 0f, -5f));
		offCamera.lookAt(new Vector3f(0f, 0f, 0f), Vector3f.UNIT_Y);

		// setup framebuffer's texture
		Texture2D offTex = new Texture2D(512, 512, Format.RGBA8);
		offTex.setMinFilter(Texture.MinFilter.Trilinear);
		offTex.setMagFilter(Texture.MagFilter.Bilinear);

		// setup framebuffer to use texture
		offBuffer.setDepthBuffer(Format.Depth);
		offBuffer.setColorTexture(offTex);

		// set viewport to render to offscreen framebuffer
		offView.setOutputFrameBuffer(offBuffer);
		offView.setEnabled(true);
		
		// attach the scene to the viewport to be rendered
		List<Widget> widgets = cm.gameRulesModel.getWidgetLibrary().getWidgets();

		for (Widget widget : widgets) {
			WidgetSingleNode node = new WidgetSingleNode(widget, cm, false, true, true);
			offView.attachScene(node);

			ByteBuffer outBuf = BufferUtils.createByteBuffer(offBuffer.getWidth() * offBuffer.getHeight() * 4);
			cm.app.getRenderer().readFrameBuffer(offBuffer, outBuf);
			Image image = new Image(Format.RGBA8, offBuffer.getWidth(), offBuffer.getHeight(), outBuf);

			BufferedImage texImage = ImageToAwt.convert(image, false, true, 0);
			String filename = node.getModel().getName() + ".png";
			File outFile = new File(filename);

			ImageIO.write(texImage, "png", outFile);
			offView.detachScene(node);
		}
	} catch (IOException e) {
		// TODO Auto-generated catch block
		e.printStackTrace();
	}
}

ImagePainter generates jME3 Images from the standard Java painting tools used for Java bitmaps (including writing text).

That text is done using the standard Java font systems etc., though; it has nothing to do with BitmapText.

The OP is really looking to go the other way: take a jME3 Image and turn it into a Java one.
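
For reference, that “standard Java font systems” route is just plain java.awt drawing onto a BufferedImage, entirely independent of jME3. A minimal self-contained sketch (font, colours and layout are arbitrary placeholders):

import java.awt.Color;
import java.awt.Font;
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

public class AwtTextExample {
	public static void main(String[] args) throws Exception {
		BufferedImage img = new BufferedImage(512, 512, BufferedImage.TYPE_INT_ARGB);
		Graphics2D g = img.createGraphics();
		g.setRenderingHint(RenderingHints.KEY_TEXT_ANTIALIASING,
				RenderingHints.VALUE_TEXT_ANTIALIAS_ON);
		g.setColor(Color.WHITE);
		g.setFont(new Font("Serif", Font.BOLD, 32)); // arbitrary font choice
		g.drawString("Greenseer", 40, 60);           // sample text from the json above
		g.dispose();
		ImageIO.write(img, "png", new File("card-text.png"));
	}
}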

Just FYI: There is no “being told to render”… it’s done as part of normal rendering. So you may have to wait a frame.
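
Something along these lines might work (an untested sketch, assuming this runs on the render thread and reuses the offscreen viewport, framebuffer and node from the code above): attach the scene, then enqueue the read-back so it runs on the next update cycle, after the pre-view has actually been drawn once.

offView.attachScene(node);
node.updateGeometricState(); // scenes attached directly to a viewport must be updated manually

cm.app.enqueue(new Callable<Void>() {
	@Override
	public Void call() throws Exception {
		// by the time this runs the offscreen viewport has been rendered once,
		// so the framebuffer actually contains the card
		ByteBuffer outBuf = BufferUtils.createByteBuffer(offBuffer.getWidth() * offBuffer.getHeight() * 4);
		cm.app.getRenderer().readFrameBuffer(offBuffer, outBuf);
		Image image = new Image(Format.RGBA8, offBuffer.getWidth(), offBuffer.getHeight(), outBuf);
		BufferedImage texImage = ImageToAwt.convert(image, false, true, 0);
		ImageIO.write(texImage, "png", new File(node.getModel().getName() + ".png"));
		offView.detachScene(node);
		return null;
	}
});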

If your GPU supports 4K resolution then there is no problem. If it doesn’t, you can split the render into tiles, render each tile, download it from the GPU, and stitch the tiles together on the CPU.
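
A rough, untested sketch of that tiling idea, reusing the offscreen camera and framebuffer from the code above; renderTileToImage() is a hypothetical helper standing in for the render-then-readFrameBuffer-then-ImageToAwt steps already shown:

int tiles = 4;        // e.g. 4x4 tiles of 1024 px -> one 4096x4096 output
int tileSize = 1024;
BufferedImage result = new BufferedImage(tiles * tileSize, tiles * tileSize,
		BufferedImage.TYPE_INT_ARGB);
Graphics2D g = result.createGraphics();

// full frustum of the offscreen camera, to be carved into tiles
float left = offCamera.getFrustumLeft(),     right = offCamera.getFrustumRight();
float bottom = offCamera.getFrustumBottom(), top = offCamera.getFrustumTop();
float near = offCamera.getFrustumNear(),     far = offCamera.getFrustumFar();
float tw = (right - left) / tiles;
float th = (top - bottom) / tiles;

for (int ty = 0; ty < tiles; ty++) {
	for (int tx = 0; tx < tiles; tx++) {
		// restrict the frustum to one tile (asymmetric frustum)
		offCamera.setFrustum(near, far,
				left + tx * tw, left + (tx + 1) * tw,
				bottom + (ty + 1) * th, bottom + ty * th);
		// hypothetical helper: render a frame and read the framebuffer back,
		// exactly as in the earlier code (readFrameBuffer + ImageToAwt.convert)
		BufferedImage tile = renderTileToImage(offBuffer);
		// ty == 0 is the bottom of the scene; AWT images grow downwards, so flip the row
		// (each tile may also need its own vertical flip depending on how it is read back)
		g.drawImage(tile, tx * tileSize, (tiles - 1 - ty) * tileSize, null);
	}
}
g.dispose();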
