ScreenshotAppState for Android

I took a stab at making ScreenshotAppState work for Android. The current ScreenshotAppState is only included for Desktop, not Android. When I dug in, I noticed that it reads the framebuffer from the renderer, which is not implemented in the Android OGLESShaderRenderer. So, instead of figuring all that out (way beyond me), I implemented the necessary code inside the ScreenshotAppState itself. I put the file in the com.jme3.app.state package in the Android renderer folder so it only gets included for Android.



The main guts of the code were copied from the internet, so I can’t take credit for that. I’m not even sure this is the best way to do it, but without figuring out how to merge it into the renderer, it seems to work pretty well.



On the desktop, the image is stored in the app directory (as it always has been). On Android, the image is stored in the storage folder retrieved from JmeSystem, which is defined as /mnt/sdcard/Android/data/&lt;package name&gt;/files
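For illustration, the final file name is assembled the same way on both platforms; only the base folder differs. A small sketch of the naming scheme (the `ScreenshotName` class and the example values here are mine for illustration, not part of the patch):

```java
import java.io.File;

// Sketch of how the screenshot file name is assembled: storage folder +
// the application's simple class name + a running shot index + ".png".
class ScreenshotName {
    static String build(File storageFolder, String appName, int shotIndex) {
        return storageFolder + File.separator + appName + shotIndex + ".png";
    }
}
```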



First, you have to add permissions to the manifest file to get access to the storage folder:

[java]
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"></uses-permission>
[/java]



Then you have to initialize the input trigger and the app state. Below is from the initialize method of my game app state:

[java]
inputManager.addMapping(TOUCH_SEARCH,
        new TouchTrigger(TouchInput.KEYCODE_SEARCH),
        new KeyTrigger(KeyInput.KEY_SLASH));
inputManager.addListener(this, TOUCH_SEARCH);

screenShotTaker = new ScreenshotAppState();
app.getStateManager().attach(screenShotTaker);
[/java]



The onTouch and onAction methods of my game app state (faking out the default SysRq trigger used in the Desktop version with the Search button on the phone):

[java]
public void onTouch(String string, TouchEvent event, float tpf) {
    if (string.equalsIgnoreCase(TOUCH_SEARCH) && event.getType() == TouchEvent.Type.KEY_DOWN) {
        if (screenShotTaker.isEnabled() && screenShotTaker.isInitialized()) {
            screenShotTaker.onAction("ScreenShot", true, tpf);
        }
    }
}

public void onAction(String string, boolean pressed, float tpf) {
    if (string.equalsIgnoreCase(TOUCH_SEARCH) && pressed) {
        if (screenShotTaker.isEnabled() && screenShotTaker.isInitialized()) {
            screenShotTaker.onAction("ScreenShot", true, tpf);
        }
    }
}
[/java]



Android specific ScreenshotAppState file:

[java]
package com.jme3.app.state;

import android.graphics.Bitmap;
import android.opengl.GLES20;
import com.jme3.app.Application;
import com.jme3.input.InputManager;
import com.jme3.input.KeyInput;
import com.jme3.input.controls.ActionListener;
import com.jme3.input.controls.KeyTrigger;
import com.jme3.post.SceneProcessor;
import com.jme3.renderer.RenderManager;
import com.jme3.renderer.Renderer;
import com.jme3.renderer.ViewPort;
import com.jme3.renderer.queue.RenderQueue;
import com.jme3.system.JmeSystem;
import com.jme3.texture.FrameBuffer;
import java.io.File;
import java.io.FileOutputStream;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.ShortBuffer;
import java.util.List;
import java.util.logging.Level;
import java.util.logging.Logger;
import javax.microedition.khronos.opengles.GL10;

public class ScreenshotAppState extends AbstractAppState implements ActionListener, SceneProcessor {

    private static final Logger logger = Logger.getLogger(ScreenshotAppState.class.getName());
    private boolean capture = false;
    private Renderer renderer;
    private String appName;
    private int shotIndex = 0;
    private Bitmap bitmapImage;

    @Override
    public void initialize(AppStateManager stateManager, Application app) {
        if (!super.isInitialized()) {
            InputManager inputManager = app.getInputManager();
            inputManager.addMapping("ScreenShot", new KeyTrigger(KeyInput.KEY_SYSRQ));
            inputManager.addListener(this, "ScreenShot");

            List<ViewPort> vps = app.getRenderManager().getPostViews();
            ViewPort last = vps.get(vps.size() - 1);
            last.addProcessor(this);

            appName = app.getClass().getSimpleName();
        }

        super.initialize(stateManager, app);
    }

    public void onAction(String name, boolean value, float tpf) {
        if (value) {
            capture = true;
        }
    }

    public void initialize(RenderManager rm, ViewPort vp) {
        renderer = rm.getRenderer();
        reshape(vp, vp.getCamera().getWidth(), vp.getCamera().getHeight());
    }

    @Override
    public boolean isInitialized() {
        return super.isInitialized() && renderer != null;
    }

    public void reshape(ViewPort vp, int w, int h) {
        bitmapImage = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
    }

    public void preFrame(float tpf) {
    }

    public void postQueue(RenderQueue rq) {
    }

    public void postFrame(FrameBuffer out) {
        if (capture) {
            capture = false;
            shotIndex++;

            int width = bitmapImage.getWidth();
            int height = bitmapImage.getHeight();
            int size = width * height;
            ByteBuffer buf = ByteBuffer.allocateDirect(size * 4);
            buf.order(ByteOrder.nativeOrder());
            GLES20.glReadPixels(0, 0, width, height, GL10.GL_RGBA, GL10.GL_UNSIGNED_BYTE, buf);
            int data[] = new int[size];
            buf.asIntBuffer().get(data);
            buf = null;
            Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.RGB_565);
            bitmap.setPixels(data, size - width, -width, 0, 0, width, height);
            data = null;

            short sdata[] = new short[size];
            ShortBuffer sbuf = ShortBuffer.wrap(sdata);
            bitmap.copyPixelsToBuffer(sbuf);
            for (int i = 0; i < size; ++i) {
                // BGR-565 to RGB-565
                short v = sdata[i];
                sdata[i] = (short) (((v & 0x1f) << 11) | (v & 0x7e0) | ((v & 0xf800) >> 11));
            }
            sbuf.rewind();
            bitmap.copyPixelsFromBuffer(sbuf);

            String fileName = JmeSystem.getStorageFolder() + File.separator + appName + shotIndex + ".png";
            logger.log(Level.INFO, "Saving ScreenShot to: {0}", fileName);

            try {
                FileOutputStream fos = new FileOutputStream(fileName);
                bitmap.compress(Bitmap.CompressFormat.PNG, 100, fos);
                fos.flush();
                fos.close();
            } catch (Exception e) {
                logger.log(Level.WARNING, "Error saving screenshot", e);
            }
        }
    }
}
[/java]


nice!

If anyone wants it, I’m going to start working on the VideoRecorderAppState next.



@normen It looks like the VideoRecorderAppState uses a default framerate of 30. If the Android device is currently running at < 30fps (a good possibility), what do you think the video would look like? Would it look just like the game, or would it be choppy? Also, I’m expecting that the game will run slower while a video is being created, but we’ll see when I have it working. I’m a little worried that creating the video during runtime may make the running game too choppy to play.

read the manual, the video recorder bends time
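In other words, the recorder keeps the video smooth by driving game time from a fixed-timestep timer, so every captured frame advances time by exactly 1/framerate regardless of how long the frame actually took to render. A minimal sketch of that idea (the class and method names here are illustrative, not jME's actual API):

```java
// Fixed-timestep ("iso") timer sketch: every frame reports the same delta,
// so recorded video plays at normal speed even if the device renders slower
// than the target framerate. Names are illustrative only.
class FixedStepTimer {
    private final float framerate;
    private long ticks = 0;

    FixedStepTimer(float framerate) {
        this.framerate = framerate;
    }

    // Called once per rendered frame.
    void update() {
        ticks++;
    }

    // Always the same per-frame delta, regardless of wall-clock time.
    float getTimePerFrame() {
        return 1f / framerate;
    }

    float getTimeInSeconds() {
        return ticks / framerate;
    }
}
```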

@normen Sorry, got lazy :)

thanks! :slight_smile: good work as usual

Well, I’m going to give up on the VideoRecorderAppState for a bit. It might be beyond my understanding :). I have some research to do on how AVIs are formatted before I can finish it :(



I did, however, notice an issue that is present in both desktop and Android screenshots.



The way the code works on both desktop and Android is that the processor is attached to the postFrame() of the last postView ViewPort. In my app, I have 2 additional postView ViewPorts that render an up/down joystick and a right/left joystick. When I take a screenshot, the joysticks are not included in the image. However, if I place the code to take the screenshot in the onFrame routine of OGLESShaderRenderer, they do show up. Looking at the steps taken during a frame, it looks like the only step that comes after postFrame() is the call to renderTranslucentQueue(vp). I’m obviously missing something, but I don’t know why my joysticks are not displayed until renderTranslucentQueue is called.



Can anyone explain why the joysticks would be displayed during renderTranslucentQueue ?



Here are the 2 routines that I use to create the joystick viewports (spatialNavStick is loaded from a j3o file):

[java]
public void initNavStick() {
    String geoName = "Nav Stick";

    stickLoc = new Vector3f((float) (navStickWidth) / 2, (float) (navStickHeight) / 2, 0f);
    logger.log(Level.INFO, "stickLoc: {0}", stickLoc);

    myCam = new Camera(settings.getWidth(), settings.getHeight());
    myCam.setFrustumPerspective(45f, 1f, 1f, 1000f);

    Vector3f camLocation = new Vector3f();
    camLocation.x = (float) (navStickWidth) / 2f;
    camLocation.y = (float) (navStickHeight) / 2f;
    camLocation.z = 300f;
    myCam.setLocation(camLocation);

    Vector3f camLookAtPoint = new Vector3f();
    camLookAtPoint.x = (float) (navStickWidth) / 2f;
    camLookAtPoint.y = (float) (navStickHeight) / 2f;
    camLookAtPoint.z = 0f;
    myCam.lookAt(camLookAtPoint, new Vector3f(0f, 1f, 0f));

    logger.log(Level.INFO, "myCam Location: {0}", myCam.getLocation());
    logger.log(Level.INFO, "myCam Rotation: {0}", myCam.getRotation());
    logger.log(Level.INFO, "myCam ProjectionMatrix: {0}", myCam.getProjectionMatrix());
    logger.log(Level.INFO, "myCam ViewMatrix: {0}", myCam.getViewMatrix());
    logger.log(Level.INFO, "myCam ViewProjectionMatrix: {0}", myCam.getViewProjectionMatrix());

    if (useUnshadedMaterials) {
        GeometryUtils.switchToUnshaded(spatialNavStick);
    }

    logger.log(Level.INFO, "geoNavStick Local Translation: {0}", spatialNavStick.getLocalTranslation());
    logger.log(Level.INFO, "geoNavStick World Translation: {0}", spatialNavStick.getWorldTranslation());

    navStickNode.move(stickLoc);
    navStickNode.scale(.9f, .9f, .9f);

    logger.log(Level.INFO, "geoNavStick Name: {0}", spatialNavStick.getName());
    logger.log(Level.INFO, "geoNavStick Local Translation: {0}", spatialNavStick.getLocalTranslation());
    logger.log(Level.INFO, "geoNavStick World Translation: {0}", spatialNavStick.getWorldTranslation());

    spatialNavStick.setCullHint(CullHint.Never);

    navStickNode.attachChild(spatialNavStick);
    logger.log(Level.INFO, "geom Local Translation: {0}", spatialNavStick.getLocalTranslation());
    logger.log(Level.INFO, "geom World Translation: {0}", spatialNavStick.getWorldTranslation());

    DirectionalLight sun = new DirectionalLight();
    sun.setDirection(new Vector3f(-.1f, .3f, -1f));
    navStickNode.addLight(sun);
}

public void createViewPort() {
    navStickNode.updateGeometricState();

    float newRightEdge = (float) (navStickWidth) / (float) (settings.getWidth());
    float newTopEdge = (float) (navStickHeight) / (float) (settings.getHeight());
    logger.log(Level.INFO, "newRightEdge: {0}", newRightEdge);
    logger.log(Level.INFO, "newTopEdge: {0}", newTopEdge);

    if (curScreenHorizontalDirection == SCREEN_HORIZONTAL_DIRECTION_RIGHT) {
        viewPortLeft = (screenLocation.x / settings.getWidth());
        viewPortRight = (screenLocation.x / settings.getWidth()) + newRightEdge;
    } else {
        viewPortLeft = (screenLocation.x / settings.getWidth()) - newRightEdge;
        viewPortRight = (screenLocation.x / settings.getWidth());
    }
    if (curScreenVerticalDirection == SCREEN_VERTICAL_DIRECTION_UP) {
        viewPortBottom = (screenLocation.y / settings.getHeight());
        viewPortTop = (screenLocation.y / settings.getHeight()) + newTopEdge;
    } else {
        viewPortBottom = (screenLocation.y / settings.getHeight()) - newTopEdge;
        viewPortTop = (screenLocation.y / settings.getHeight());
    }

    myCam.setViewPort(viewPortLeft, viewPortRight, viewPortBottom, viewPortTop);
    myCam.setFrustumPerspective(45f, 1f, 1f, 1000f);

    if (viewNavStick != null && renderManager.getPostViews().contains(viewNavStick)) {
        renderManager.removePostView(viewNavStick);
    }
    viewNavStick = renderManager.createPostView("NavStick Viewport", myCam);
    viewNavStick.setClearFlags(false, true, true);

    viewNavStick.attachScene(navStickNode);
}
[/java]

I rearranged the screenshot app state a bit and added some documentation. I’m hoping that @nehon or @Momoko_Fan will approve this and commit it :).



[EDIT] I found an issue in both the Desktop and Android versions. If the last postView ViewPort is smaller than full screen, the image does not get saved correctly. I fixed the issue on Android, but I’m not sure how to fix it for Desktop. The patch file below includes a line to set the OpenGL viewport to the full screen size before taking the screenshot.

[java]
GLES20.glViewport(startX, startY, width, height);
GLES20.glReadPixels(startX, startY, width, height, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, out);
[/java]



Since Desktop uses framebuffers, I wasn’t sure how to resolve the issue for the Desktop version.



Android specific file to manipulate screenshot data:

[java]
This patch file was generated by NetBeans IDE
Following Index: paths are relative to: D:\Users\potterec\Documents\jMonkeyProjects\jME3\src\android\com\jme3\util
This patch can be applied using context Tools: Patch action on respective folder.
It uses platform neutral UTF-8 encoding and \n newlines.
Above lines and this line are ignored by the patching process.
Index: AndroidScreenshots.java
--- AndroidScreenshots.java	Locally New
+++ AndroidScreenshots.java	Locally New
@@ -0,0 +1,73 @@
+package com.jme3.util;
+
+import android.graphics.Bitmap;
+import android.opengl.GLES20;
+import java.nio.ByteBuffer;
+import java.nio.ByteOrder;
+import java.util.logging.Logger;
+
+public final class AndroidScreenshots {
+    private static final Logger logger = Logger.getLogger(AndroidScreenshots.class.getName());
+
+    /**
+     * Convert OpenGL GLES20.GL_RGBA to Bitmap.Config.ARGB_8888 and store result in a Bitmap
+     * @param buf ByteBuffer that has the pixel color data from OpenGL
+     * @param bitmapImage Bitmap to be used after converting the data
+     */
+    public static void convertScreenShot(ByteBuffer buf, Bitmap bitmapImage) {
+        int width = bitmapImage.getWidth();
+        int height = bitmapImage.getHeight();
+        int size = width * height;
+
+        // Grab data from ByteBuffer as Int Array to manipulate data and send to image
+        int data[] = new int[size];
+        buf.asIntBuffer().get(data);
+
+        // convert from GLES20.GL_RGBA to Bitmap.Config.ARGB_8888
+        // ** need to swap RED and BLUE **
+        for (int idx = 0; idx < data.length; idx++) {
+            int initial = data[idx];
+            int pb = (initial >> 16) & 0xff;
+            int pr = (initial << 16) & 0x00ff0000;
+            int pix1 = (initial & 0xff00ff00) | pr | pb;
+            data[idx] = pix1;
+        }
+
+        // OpenGL and Bitmap have opposite starting points for Y axis (top vs bottom)
+        // Need to write the data in the image from the bottom to the top
+        // Use size-width to indicate start with last row and increment by -width for each row
+        bitmapImage.setPixels(data, size - width, -width, 0, 0, width, height);
+        data = null;
+    }
+
+    /**
+     * Saves the Color Buffer of current display from OpenGL to a ByteBuffer.
+     * If null is passed in, a new ByteBuffer is created, else the passed in ByteBuffer will be used.
+     * @param startX Starting X value for the display
+     * @param startY Starting Y value for the display
+     * @param width Width of the display
+     * @param height Height of the display
+     * @param out ByteBuffer containing the Color Buffer from OpenGL
+     * @return
+     */
+    public static ByteBuffer getScreenShot(int startX, int startY, int width, int height, ByteBuffer out) {
+        int size = width * height;
+
+        // Create new buffer if not passed in
+        if (out == null) {
+            // Create ByteBuffer and store data from OpenGL
+            out = ByteBuffer.allocateDirect(size * 4);  // data stored as 4 bytes per pixel
+            out.order(ByteOrder.nativeOrder());
+        } else {
+            if (out.capacity() != size * 4) {
+                throw new IllegalArgumentException("Byte Buffer is not the correct size");
+            }
+        }
+        GLES20.glViewport(startX, startY, width, height);
+        GLES20.glReadPixels(startX, startY, width, height, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, out);
+
+        return out;
+    }
+}
[/java]



Android specific ScreenshotAppState that mimics the Desktop version:

[java]

This patch file was generated by NetBeans IDE
Following Index: paths are relative to: D:\Users\potterec\Documents\jMonkeyProjects\jME3\src\android\com\jme3\app\state
This patch can be applied using context Tools: Patch action on respective folder.
It uses platform neutral UTF-8 encoding and \n newlines.
Above lines and this line are ignored by the patching process.
Index: ScreenshotAppState.java
--- ScreenshotAppState.java	Locally New
+++ ScreenshotAppState.java	Locally New
@@ -0,0 +1,116 @@
+package com.jme3.app.state;
+
+import android.graphics.Bitmap;
+import com.jme3.app.Application;
+import com.jme3.input.InputManager;
+import com.jme3.input.KeyInput;
+import com.jme3.input.controls.ActionListener;
+import com.jme3.input.controls.KeyTrigger;
+import com.jme3.post.SceneProcessor;
+import com.jme3.renderer.RenderManager;
+import com.jme3.renderer.Renderer;
+import com.jme3.renderer.ViewPort;
+import com.jme3.renderer.queue.RenderQueue;
+import com.jme3.system.JmeSystem;
+import com.jme3.texture.FrameBuffer;
+import com.jme3.util.AndroidScreenshots;
+import java.io.File;
+import java.io.FileOutputStream;
+import java.nio.ByteBuffer;
+import java.nio.ByteOrder;
+import java.util.List;
+import java.util.logging.Level;
+import java.util.logging.Logger;
+
+public class ScreenshotAppState extends AbstractAppState implements ActionListener, SceneProcessor {
+
+    private static final Logger logger = Logger.getLogger(ScreenshotAppState.class.getName());
+    private boolean capture = false;
+    private Renderer renderer;
+    private String appName;
+    private int shotIndex = 0;
+    private Bitmap bitmapImage;
+    private int width = 0;
+    private int height = 0;
+
+    @Override
+    public void initialize(AppStateManager stateManager, Application app) {
+        if (!super.isInitialized()) {
+            InputManager inputManager = app.getInputManager();
+            inputManager.addMapping("ScreenShot", new KeyTrigger(KeyInput.KEY_SYSRQ));
+            inputManager.addListener(this, "ScreenShot");
+
+            List<ViewPort> vps = app.getRenderManager().getPostViews();
+            ViewPort last = vps.get(vps.size() - 1);
+            last.addProcessor(this);
+
+            appName = app.getClass().getSimpleName();
+        }
+
+        super.initialize(stateManager, app);
+    }
+
+    public void onAction(String name, boolean value, float tpf) {
+        if (value) {
+            capture = true;
+        }
+    }
+
+    public void initialize(RenderManager rm, ViewPort vp) {
+        renderer = rm.getRenderer();
+        reshape(vp, vp.getCamera().getWidth(), vp.getCamera().getHeight());
+    }
+
+    @Override
+    public boolean isInitialized() {
+        return super.isInitialized() && renderer != null;
+    }
+
+    public void reshape(ViewPort vp, int w, int h) {
+        width = w;
+        height = h;
+        logger.log(Level.INFO, "viewport {0} new size: w: {1}, h:{2}",
+                new Object[]{vp.getName(), w, h});
+    }
+
+    public void preFrame(float tpf) {
+    }
+
+    public void postQueue(RenderQueue rq) {
+    }
+
+    public void postFrame(FrameBuffer out) {
+        if (capture) {
+            shotIndex++;
+            capture = false;
+            int size = width * height;
+
+            // Create ByteBuffer and store data from OpenGL
+            ByteBuffer buf = ByteBuffer.allocateDirect(size * 4);  // data stored as 4 bytes per pixel
+            buf.order(ByteOrder.nativeOrder());
+
+            // Get the current Color Buffer from OpenGL
+            AndroidScreenshots.getScreenShot(0, 0, width, height, buf);
+
+            // Create Bitmap to store image
+            bitmapImage = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
+
+            // Convert OpenGL GLES20.GL_RGBA to Bitmap.Config.ARGB_8888
+            AndroidScreenshots.convertScreenShot(buf, bitmapImage);
+            buf = null;
+
+            // Save Bitmap to file
+            String fileName = JmeSystem.getStorageFolder() + File.separator + appName + shotIndex + ".png";
+            logger.log(Level.INFO, "Saving ScreenShot to: {0}", fileName);
+
+            try {
+                FileOutputStream fos = new FileOutputStream(fileName);
+                bitmapImage.compress(Bitmap.CompressFormat.PNG, 100, fos);
+                fos.flush();
+                fos.close();
+            } catch (Exception e) {
+                logger.log(Level.INFO, "Error Saving File: {0}", e);
+            }
+        }
+    }
+}
[/java]

Maybe you added those viewports after the ScreenshotAppState was attached? It would be ideal if we had some callback for when a frame finishes rendering for this sort of stuff but there’s nothing at the moment…


@Momoko_Fan Yep, that is exactly what was happening. Thanks for the idea. I didn’t even think about that.



If you want me to create a listener structure for getting a callback when the frame finishes rendering, I’d be happy to do it.
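In case it helps the discussion, here's a rough sketch of what such a listener structure could look like (all names here are hypothetical; nothing like this exists in the engine yet):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical callback interface for "frame finished rendering" events.
interface FrameListener {
    void frameFinished(float tpf);
}

// Stand-in for the piece of RenderManager that would own the listeners.
class FrameNotifier {
    private final List<FrameListener> listeners = new ArrayList<FrameListener>();

    public void addFrameListener(FrameListener listener) {
        listeners.add(listener);
    }

    // The render loop would call this after the last ViewPort is drawn,
    // so even late-attached viewports are guaranteed to be in the framebuffer
    // by the time a screenshot is taken.
    public void fireFrameFinished(float tpf) {
        for (FrameListener listener : listeners) {
            listener.frameFinished(tpf);
        }
    }
}
```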

For the Desktop version, it works with the change below. I think that since, in my case, the last postView ViewPort was smaller than full screen, the OpenGL viewport was smaller than the defined image and ByteBuffer, so the glReadPixels area wasn’t lined up correctly with the image and ByteBuffer sizes in the ScreenshotAppState. Anyway, I added a call to force the OpenGL viewport to full screen size, and the screenshots are now created correctly. If the last postView ViewPort was already full screen, the issue didn’t occur.



[java]
This patch file was generated by NetBeans IDE
Following Index: paths are relative to: D:\Users\potterec\Documents\jMonkeyProjects\jME3\src\desktop\com\jme3\app\state
This patch can be applied using context Tools: Patch action on respective folder.
It uses platform neutral UTF-8 encoding and \n newlines.
Above lines and this line are ignored by the patching process.
Index: ScreenshotAppState.java
--- ScreenshotAppState.java	Base (BASE)
+++ ScreenshotAppState.java	Locally Modified (Based On LOCAL)
@@ -81,6 +81,7 @@
             capture = false;
             shotIndex++;
 
+            renderer.setViewPort(0, 0, awtImage.getWidth(), awtImage.getHeight());
             renderer.readFrameBuffer(out, outBuf);
             Screenshots.convertScreenShot(outBuf, awtImage);
 
[/java]


Is it possible to use Renderer.readFrameBuffer() on Android instead of having GL calls in AndroidScreenshots.getScreenShot()? You don’t need to add framebuffer support, just support reading from the null (main) framebuffer.

Absolutely. I’ll make that change.

iwgeric <3

@Momoko_Fan Here are the new patch files for Android



OGLESShaderRenderer

[java]
This patch file was generated by NetBeans IDE
Following Index: paths are relative to: D:\Users\potterec\Documents\jMonkeyProjects\jME3\src\android\com\jme3\renderer\android
This patch can be applied using context Tools: Patch action on respective folder.
It uses platform neutral UTF-8 encoding and \n newlines.
Above lines and this line are ignored by the patching process.
Index: OGLESShaderRenderer.java
--- OGLESShaderRenderer.java	Base (BASE)
+++ OGLESShaderRenderer.java	Locally Modified (Based On LOCAL)
@@ -1271,9 +1271,22 @@
     }
     */
 
+    /**
+     * Reads the Color Buffer from OpenGL and stores into the ByteBuffer.
+     * Since jME for Android does not support Frame Buffers yet, make sure the FrameBuffer
+     * passed in is NULL (default) or an exception will be thrown.
+     * Also, make sure to call setViewPort with the appropriate viewport size before
+     * calling readFrameBuffer.
+     * @param fb FrameBuffer (must be NULL)
+     * @param byteBuf ByteBuffer to store the Color Buffer from OpenGL
+     */
     public void readFrameBuffer(FrameBuffer fb, ByteBuffer byteBuf) {
-        logger.warning("readFrameBuffer is not supported.");
+        if (fb != null) {
+            throw new IllegalArgumentException("FrameBuffer is not supported yet.");
         }
+
+        GLES20.glReadPixels(vpX, vpY, vpW, vpH, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, byteBuf);
+    }
     /*
     public void readFrameBuffer(FrameBuffer fb, ByteBuffer byteBuf){
     if (fb != null){
[/java]



AndroidScreenshots

[java]
This patch file was generated by NetBeans IDE
Following Index: paths are relative to: D:\Users\potterec\Documents\jMonkeyProjects\jME3\src\android\com\jme3\util
This patch can be applied using context Tools: Patch action on respective folder.
It uses platform neutral UTF-8 encoding and \n newlines.
Above lines and this line are ignored by the patching process.
Index: AndroidScreenshots.java
--- AndroidScreenshots.java	Locally New
+++ AndroidScreenshots.java	Locally New
@@ -0,0 +1,42 @@
+package com.jme3.util;
+
+import android.graphics.Bitmap;
+import java.nio.ByteBuffer;
+import java.util.logging.Logger;
+
+public final class AndroidScreenshots {
+    private static final Logger logger = Logger.getLogger(AndroidScreenshots.class.getName());
+
+    /**
+     * Convert OpenGL GLES20.GL_RGBA to Bitmap.Config.ARGB_8888 and store result in a Bitmap
+     * @param buf ByteBuffer that has the pixel color data from OpenGL
+     * @param bitmapImage Bitmap to be used after converting the data
+     */
+    public static void convertScreenShot(ByteBuffer buf, Bitmap bitmapImage) {
+        int width = bitmapImage.getWidth();
+        int height = bitmapImage.getHeight();
+        int size = width * height;
+
+        // Grab data from ByteBuffer as Int Array to manipulate data and send to image
+        int data[] = new int[size];
+        buf.asIntBuffer().get(data);
+
+        // convert from GLES20.GL_RGBA to Bitmap.Config.ARGB_8888
+        // ** need to swap RED and BLUE **
+        for (int idx = 0; idx < data.length; idx++) {
+            int initial = data[idx];
+            int pb = (initial >> 16) & 0xff;
+            int pr = (initial << 16) & 0x00ff0000;
+            int pix1 = (initial & 0xff00ff00) | pr | pb;
+            data[idx] = pix1;
+        }
+
+        // OpenGL and Bitmap have opposite starting points for Y axis (top vs bottom)
+        // Need to write the data in the image from the bottom to the top
+        // Use size-width to indicate start with last row and increment by -width for each row
+        bitmapImage.setPixels(data, size - width, -width, 0, 0, width, height);
+        data = null;
+    }
+
+}
[/java]



ScreenshotAppState

[java]
This patch file was generated by NetBeans IDE
Following Index: paths are relative to: D:\Users\potterec\Documents\jMonkeyProjects\jME3\src\android\com\jme3\app\state
This patch can be applied using context Tools: Patch action on respective folder.
It uses platform neutral UTF-8 encoding and \n newlines.
Above lines and this line are ignored by the patching process.
Index: ScreenshotAppState.java
--- ScreenshotAppState.java	Locally New
+++ ScreenshotAppState.java	Locally New
@@ -0,0 +1,120 @@
+package com.jme3.app.state;
+
+import android.graphics.Bitmap;
+import com.jme3.app.Application;
+import com.jme3.input.InputManager;
+import com.jme3.input.KeyInput;
+import com.jme3.input.controls.ActionListener;
+import com.jme3.input.controls.KeyTrigger;
+import com.jme3.post.SceneProcessor;
+import com.jme3.renderer.RenderManager;
+import com.jme3.renderer.Renderer;
+import com.jme3.renderer.ViewPort;
+import com.jme3.renderer.queue.RenderQueue;
+import com.jme3.system.JmeSystem;
+import com.jme3.texture.FrameBuffer;
+import com.jme3.util.AndroidScreenshots;
+import java.io.File;
+import java.io.FileOutputStream;
+import java.nio.ByteBuffer;
+import java.nio.ByteOrder;
+import java.util.List;
+import java.util.logging.Level;
+import java.util.logging.Logger;
+
+public class ScreenshotAppState extends AbstractAppState implements ActionListener, SceneProcessor {
+
+    private static final Logger logger = Logger.getLogger(ScreenshotAppState.class.getName());
+    private boolean capture = false;
+    private Renderer renderer;
+    private String appName;
+    private int shotIndex = 0;
+    private Bitmap bitmapImage;
+    private int width = 0;
+    private int height = 0;
+
+    @Override
+    public void initialize(AppStateManager stateManager, Application app) {
+        if (!super.isInitialized()) {
+            InputManager inputManager = app.getInputManager();
+            inputManager.addMapping("ScreenShot", new KeyTrigger(KeyInput.KEY_SYSRQ));
+            inputManager.addListener(this, "ScreenShot");
+
+            List<ViewPort> vps = app.getRenderManager().getPostViews();
+            ViewPort last = vps.get(vps.size() - 1);
+            last.addProcessor(this);
+
+            appName = app.getClass().getSimpleName();
+        }
+
+        super.initialize(stateManager, app);
+    }
+
+    public void onAction(String name, boolean value, float tpf) {
+        if (value) {
+            capture = true;
+        }
+    }
+
+    public void initialize(RenderManager rm, ViewPort vp) {
+        renderer = rm.getRenderer();
+        reshape(vp, vp.getCamera().getWidth(), vp.getCamera().getHeight());
+    }
+
+    @Override
+    public boolean isInitialized() {
+        return super.isInitialized() && renderer != null;
+    }
+
+    public void reshape(ViewPort vp, int w, int h) {
+        width = w;
+        height = h;
+    }
+
+    public void preFrame(float tpf) {
+    }
+
+    public void postQueue(RenderQueue rq) {
+    }
+
+    public void postFrame(FrameBuffer out) {
+        if (capture) {
+            shotIndex++;
+            capture = false;
+            int size = width * height;
+
+            // Create ByteBuffer to store data from OpenGL
+            ByteBuffer buf = ByteBuffer.allocateDirect(size * 4);  // data stored as 4 bytes per pixel
+            buf.order(ByteOrder.nativeOrder());
+
+            // Set the ViewPort size to full screen and get the current Color Buffer from OpenGL
+            renderer.setViewPort(0, 0, width, height);
+            renderer.readFrameBuffer(out, buf);
+
+            // Create Bitmap to store image
+            bitmapImage = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
+
+            // Convert OpenGL GLES20.GL_RGBA to Bitmap.Config.ARGB_8888
+            AndroidScreenshots.convertScreenShot(buf, bitmapImage);
+            buf = null;
+
+            // Save Bitmap to file
+            String fileName = JmeSystem.getStorageFolder() + File.separator + appName + shotIndex + ".png";
+            logger.log(Level.INFO, "Saving ScreenShot to: {0}", fileName);
+
+            try {
+                FileOutputStream fos = new FileOutputStream(fileName);
+                bitmapImage.compress(Bitmap.CompressFormat.PNG, 100, fos);
+                fos.flush();
+                fos.close();
+            } catch (Exception e) {
+                logger.log(Level.INFO, "Error Saving File: {0}", e);
+            }
+        }
+    }
+}
[/java]

I request the contributor tag for @iwgeric

Damnit @nehon , if we kill him now he’ll be made a martyr…



Don’t worry, I’m making a list and I’m checking it twice, gonna find out who’s been naughty and nice.

Thanks @nehon. Funny you should say that because I just sent @Momoko_Fan a PM to ask what was involved with that.

@iwgeric said:
Thanks @nehon. Funny you should say that because I just sent @Momoko_Fan a PM to ask what was involved with that.

Mostly, you may have to bathe with each member of the core team and anoint your body with perfumed oil....

Still wanna get in? :D

No, actually, in fact you are already a contributor, since you contributed a fair amount of patches. But this would make it official that you are a trusted committer, and most of all you would be able to push commits directly to the repo (without waiting for me or Kirill to show up :p).
You can still post your changes on the forum for approval at first if you are more comfortable with it (like @Madjack does, for example).

Also you would have a nice "contributor" tag next to your name in the forum so you could show off in front of JME noobs.. :p

@nehon I was all set to join until you mentioned the bathing part. Actually, I just submitted a request to join the contributor group. Not sure who the admin is for that, but it says the request was sent and is waiting for an admin.