Natives with Android

I have been giving this a lot of thought and agree with @nehon that there should be native calls for rendering and maybe for sound. Don’t get me wrong, we can still call all this functionality from Java. Native calls will give us more control over how the renderer performs, and should make @nehon happy with the framebuffers, post-processing and shadows, since we can use GLSL with the NDK.



For the audio there is OpenSL ES, which @prich has mentioned and which can be used with the NDK. It’s only available on Android 2.3 and up, but we can detect the version at runtime and fall back to the Java APIs if it’s not 2.3. It has low-level functionality and good features, and it’s much faster than Android’s standard audio path with its large delays.
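The runtime switch could be as simple as an API-level check (a minimal sketch; AudioBackend and the two implementations are hypothetical placeholder names, only android.os.Build is real Android API):

[java]
import android.os.Build;

// hypothetical common interface for the two audio paths
interface AudioBackend {
    void play(String assetName);
}

// would wrap the OpenSL ES engine via JNI on 2.3+
class OpenSlBackend implements AudioBackend {
    public void play(String assetName) { /* native OpenSL ES path */ }
}

// plain Java MediaPlayer/SoundPool fallback for 2.2 and older
class FallbackBackend implements AudioBackend {
    public void play(String assetName) { /* Java audio path */ }
}

public class AudioBackendSelector {
    public static AudioBackend create() {
        // OpenSL ES is only exposed by the NDK from API level 9 (Android 2.3) up
        return Build.VERSION.SDK_INT >= 9 ? new OpenSlBackend()
                                          : new FallbackBackend();
    }
}
[/java]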



Native code can also avoid a lot of Dalvik-related overhead, most notably garbage collection pauses.



Well guys, I have been reading a lot of this documentation and my eyes hurt bad. I am going to create my own implementations in my own apps. If I can get them to work, then I will see if I can integrate them into jMonkeyEngine. For now, I’m coding like crazy and reading a lot. Then I will upload some videos if I get some good results.



What do you guys think?

I think supporting Android with your Java lib is almost as hard as properly supporting iOS by now :confused: Thanks for looking into this though; it seems like the “Java OS” promise from Google is something we should forget pretty soon… I’m interested in how using GCJ for Android would work out; I managed to get jME3 compiled using GCJ… And I guess that or Avian will be our ticket to a few platforms anyway (OSX AppStore, PS3, iOS etc.)

Tell us all about your progress as you move forward :slight_smile:



Also consider getting involved with Android for GSoC if you’re eligible to enter as a student.

Hey @techsonic

We definitely need to look into what’s doable on the native side for Android.



Also, what @antiplod said here piqued my interest too; RenderScript might be a nice alternative to native code.

http://hub.jmonkeyengine.org/groups/site-project/forum/topic/gsoc-2012-idea-better-android-support/#post-163748

I don’t know if it’s applicable to audio though.



I don’t know how bad it is that OpenSL ES is only available from 2.3 on…



Going by this article (sorry, it’s in French, but the pie chart is self-explanatory):

http://www.phonandroid.com/android-4-0-ice-cream-sandwich-enfin-present-dans-le-rapport-de-repartition-mensuel.html

Android 2.3.3 was the most used version in January 2012, but there is still a fair amount of 2.2 out there (30.4%).

Anyway, I guess this chart will look completely different in 6 months, so it’s probably safe to look into OpenSL.

Thanks guys for the response.



@normen

Android doesn’t use GCJ with the NDK. The NDK uses a toolchain that compiles and links the C and C++ code.

Yes, whatever; it’s binary files either way. I was talking about using GCJ and/or Avian for this so it’s platform-independent. All of the mentioned platforms support vanilla OpenGL ES.

@nehon



I think RenderScript only works with Android 3.0+. I’ve read that somewhere, not sure where. Not only that, it’s another whole language to learn. You can probably confirm it yourself.

Yes, it seems to be 3.0-specific.

Well anyway, we have to explore both ways (natives and RenderScript).

A lot of 2.3 and 3.0 devices may switch to 4.0 this year, so we’ll see what happens.

@techsonic said:
@nehon

I think RenderScript only works with Android 3.0+. I've read that somewhere, not sure where. Not only that, it's another whole language to learn. You can probably confirm it yourself.


I have to bow to your experience as I'm a bit of a newbie, but having played a little with Renderscript it seems pretty straightforward to pick up as a language, especially if you are familiar with C/C++; it's just a cut-down version, all things considered. Memory arbitration across the Renderscript boundary is the trickiest bit to get to grips with, but after a quick play with the samples it all seems to click into place, especially as a compute engine. Documentation is a bit of a pain, or rather the lack of it is.

As to the versions of Android supported, it is a tricky one for the JME gang, but for me the homogenising effect of Android 4 combined with new hardware would, as a developer, make it a no-brainer. By the time we get something to market, phones will have moved on and those devices that don't support Renderscript will occupy an ever-diminishing fraction of the Android market.

BTW I have only tested it on a Motorola Xoom - so mileage may vary with other devices...
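To give a flavour of the memory arbitration I mentioned, the Java side of a HelloCompute-style sample goes roughly like this (a sketch from my quick play with the samples; ScriptC_invert and R.raw.invert stand in for a hypothetical invert.rs kernel and the class the build tools generate from it):

[java]
import android.content.Context;
import android.content.res.Resources;
import android.graphics.Bitmap;
import android.renderscript.Allocation;
import android.renderscript.RenderScript;

public class RsInvertExample {

    public static void invert(Context ctx, Resources res, Bitmap in, Bitmap out) {
        RenderScript rs = RenderScript.create(ctx);

        // Allocations are how memory crosses the Renderscript boundary
        Allocation inAlloc = Allocation.createFromBitmap(rs, in,
                Allocation.MipmapControl.MIPMAP_NONE, Allocation.USAGE_SCRIPT);
        Allocation outAlloc = Allocation.createTyped(rs, inAlloc.getType());

        // ScriptC_invert is generated from invert.rs at build time (hypothetical kernel)
        ScriptC_invert script = new ScriptC_invert(rs, res, R.raw.invert);
        script.forEach_root(inAlloc, outAlloc); // run root() once per element
        outAlloc.copyTo(out);                   // copy the result back to the bitmap
    }
}
[/java]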

@antiplod



Thanks for the compliment. It does get a little tricky with which versions of Android to support; not everyone has the latest phones. RenderScript is great and has hardware acceleration, which is awesome. Right now I’m testing everything natively because I think I can put my C/C++ skills to good use. It all comes down to what the market shows. In the end, native C has great portability. At some point I was even looking into OpenCL, which deals with parallel programming and is available on some Android tablets. If you do keep testing RenderScript, let me know on this thread how it turns out.



Thanks

Of course. Bit distracted with other stuff atm, but if I come across anything interesting I’ll let you know. There are a couple of good YouTube intros to Renderscript if anyone else is interested, such as this one:

http://www.youtube.com/watch?v=5jz0kSuR2j4

Hello everyone,



Just wanted to update my progress.



http://i.imgur.com/qrE8p.jpg



I am writing code so that when OpenGL is used, it is called natively. I am still deciding whether or not to target 2.3.3, since OpenSL ES would be cool. I am also looking at whether GLSurfaceView.Renderer would benefit from native methods and whether it would be too tedious to implement. I have even got an OpenGL ES book to help me with post-processing effects and other goodies :).
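The renderer skeleton I’m working from looks roughly like this (a sketch; the native* methods are hypothetical JNI entry points that would live in the same .so loaded via System.loadLibrary):

[java]
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import android.opengl.GLSurfaceView;

public class NativeRenderer implements GLSurfaceView.Renderer {

    // hypothetical JNI entry points implemented in C against OpenGL ES
    private static native void nativeInit();
    private static native void nativeResize(int width, int height);
    private static native void nativeDrawFrame();

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        nativeInit(); // compile shaders, create buffers, etc. on the native side
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        nativeResize(width, height); // update the native viewport
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        nativeDrawFrame(); // all GL calls happen natively; the GL10 handle is unused
    }
}
[/java]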



If anyone has any questions, just reply.

@nehon



I have been playing around with OpenSL ES and it’s a wonderful library. The Android NDK has some good examples. There are still some limitations, but it is much better than Android’s MediaPlayer; it responds really quickly. The AssetManager is handled on the native side, which is cool and eliminates a lot of the checking on the Java side of things. You can adjust the sample rates, and there are quite a few presets, such as simulating a stone corridor or a hall. Here’s an example.



PaperHeroActivity.java

[java]

package com.techsonic.game;



import jni.NativeAudio;

import android.app.Activity;

import android.content.pm.ActivityInfo;

import android.content.res.AssetManager;

import android.os.Bundle;

import android.util.Log;

import android.view.Window;

import android.view.WindowManager;



public class PaperHeroActivity extends Activity {



// load native libraries

static {

System.loadLibrary("techsonic");

}



public final String TAG = "PaperHeroActivity";



public PaperHeroGLSurfaceView pHSView;

static AssetManager assetManager;

public boolean isPlayingAsset = false;

public boolean created = false;



/** Called when the activity is first created. */

@Override

public void onCreate(Bundle savedInstanceState) {

super.onCreate(savedInstanceState);



// setup the window layout

requestWindowFeature(Window.FEATURE_NO_TITLE);

getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,

WindowManager.LayoutParams.FLAG_FULLSCREEN);

setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);

pHSView = new PaperHeroGLSurfaceView(getApplication());



assetManager = getAssets();



// initialize the native audio system

NativeAudio.createEngine();

NativeAudio.createBufferQueueAudioPlayer();



if(!created) {

created = NativeAudio.createAssetAudioPlayer(assetManager, "kick-the-rock.mp3");

}

if(created) {

isPlayingAsset = !isPlayingAsset;

NativeAudio.setPlayingAssetAudioPlayer(isPlayingAsset);

}



setContentView(pHSView);

}



@Override

protected void onPause() {

Log.d(TAG, "pause");

isPlayingAsset = false;

NativeAudio.setPlayingAssetAudioPlayer(false);

pHSView.onPause();



if(isFinishing()) {

Log.d(TAG, "isFinishing");

}



super.onPause();

}



@Override

protected void onResume() {

Log.d(TAG, "resume");

isPlayingAsset = true;

NativeAudio.setPlayingAssetAudioPlayer(true);

pHSView.onResume();



super.onResume();

}



@Override

protected void onDestroy() {

Log.d(TAG, "destroy");

NativeAudio.shutdown();

super.onDestroy();

}

}

[/java]



NativeAudio.java

[java]

package jni;



import android.content.res.AssetManager;



public class NativeAudio {

public static native void createEngine();

public static native void createBufferQueueAudioPlayer();

public static native boolean createAssetAudioPlayer(AssetManager assetManager, String filename);

// true == PLAYING, false == PAUSED

public static native void setPlayingAssetAudioPlayer(boolean isPlaying);

public static native boolean enableReverb(boolean enabled);

public static native void shutdown();

}

[/java]



native_audio.c

[java]

#include <assert.h>

#include <string.h>

#include <jni.h>



// for native audio

#include <SLES/OpenSLES.h>

#include <SLES/OpenSLES_Android.h>



// for native asset manager

#include <sys/types.h>

#include <android/asset_manager.h>

#include <android/asset_manager_jni.h>



// engine interfaces

static SLObjectItf engineObject = NULL;

static SLEngineItf engineEngine;



// output mix interfaces

static SLObjectItf outputMixObject = NULL;

static SLEnvironmentalReverbItf outputMixEnvironmentalReverb = NULL;



// buffer queue player interfaces

static SLObjectItf bqPlayerObject = NULL;

static SLPlayItf bqPlayerPlay;

static SLAndroidSimpleBufferQueueItf bqPlayerBufferQueue;

static SLEffectSendItf bqPlayerEffectSend;

static SLMuteSoloItf bqPlayerMuteSolo;

static SLVolumeItf bqPlayerVolume;



// aux effect on the output mix, used by the buffer queue player

static const SLEnvironmentalReverbSettings reverbSettings =

SL_I3DL2_ENVIRONMENT_PRESET_STONECORRIDOR;



// file descriptor player interfaces

static SLObjectItf fdPlayerObject = NULL;

static SLPlayItf fdPlayerPlay;

static SLSeekItf fdPlayerSeek;

static SLMuteSoloItf fdPlayerMuteSolo;

static SLVolumeItf fdPlayerVolume;



// pointer and size of the next player buffer to enqueue, and number of remaining buffers

static short *nextBuffer;

static unsigned nextSize;

static int nextCount;



// this callback handler is called every time a buffer finishes playing

void bqPlayerCallback(SLAndroidSimpleBufferQueueItf bq, void *context)

{

assert(bq == bqPlayerBufferQueue);

assert(NULL == context);

// for streaming playback, replace this test by logic to find and fill the next buffer

if (--nextCount > 0 && NULL != nextBuffer && 0 != nextSize) {

SLresult result;

// enqueue another buffer

result = (*bqPlayerBufferQueue)->Enqueue(bqPlayerBufferQueue, nextBuffer, nextSize);

// the most likely other result is SL_RESULT_BUFFER_INSUFFICIENT,

// which for this code example would indicate a programming error

assert(SL_RESULT_SUCCESS == result);

}

}



// create the engine and output mix objects

void Java_jni_NativeAudio_createEngine(JNIEnv* env, jclass clazz)

{

SLresult result;



// create engine

result = slCreateEngine(&engineObject, 0, NULL, 0, NULL, NULL);

assert(SL_RESULT_SUCCESS == result);



// realize the engine

result = (*engineObject)->Realize(engineObject, SL_BOOLEAN_FALSE);

assert(SL_RESULT_SUCCESS == result);



// get the engine interface, which is needed in order to create other objects

result = (*engineObject)->GetInterface(engineObject, SL_IID_ENGINE, &engineEngine);

assert(SL_RESULT_SUCCESS == result);



// create output mix, with environmental reverb specified as a non-required interface

const SLInterfaceID ids[1] = {SL_IID_ENVIRONMENTALREVERB};

const SLboolean req[1] = {SL_BOOLEAN_FALSE};

result = (*engineEngine)->CreateOutputMix(engineEngine, &outputMixObject, 1, ids, req);

assert(SL_RESULT_SUCCESS == result);



// realize the output mix

result = (*outputMixObject)->Realize(outputMixObject, SL_BOOLEAN_FALSE);

assert(SL_RESULT_SUCCESS == result);



// get the environmental reverb interface

// this could fail if the environmental reverb effect is not available,

// either because the feature is not present, excessive CPU load, or

// the required MODIFY_AUDIO_SETTINGS permission was not requested and granted

result = (*outputMixObject)->GetInterface(outputMixObject, SL_IID_ENVIRONMENTALREVERB,

&outputMixEnvironmentalReverb);

if (SL_RESULT_SUCCESS == result) {

result = (*outputMixEnvironmentalReverb)->SetEnvironmentalReverbProperties(

outputMixEnvironmentalReverb, &reverbSettings);

}

// ignore unsuccessful result codes for environmental reverb, as it is optional for this example



}



// create buffer queue audio player

void Java_jni_NativeAudio_createBufferQueueAudioPlayer(JNIEnv* env,

jclass clazz)

{

SLresult result;



// configure audio source

SLDataLocator_AndroidSimpleBufferQueue loc_bufq = {SL_DATALOCATOR_ANDROIDSIMPLEBUFFERQUEUE, 2};

SLDataFormat_PCM format_pcm = {SL_DATAFORMAT_PCM, 1, SL_SAMPLINGRATE_8,

SL_PCMSAMPLEFORMAT_FIXED_16, SL_PCMSAMPLEFORMAT_FIXED_16,

SL_SPEAKER_FRONT_CENTER, SL_BYTEORDER_LITTLEENDIAN};

SLDataSource audioSrc = {&loc_bufq, &format_pcm};



// configure audio sink

SLDataLocator_OutputMix loc_outmix = {SL_DATALOCATOR_OUTPUTMIX, outputMixObject};

SLDataSink audioSnk = {&loc_outmix, NULL};



// create audio player

const SLInterfaceID ids[3] = {SL_IID_BUFFERQUEUE, SL_IID_EFFECTSEND,

/*SL_IID_MUTESOLO,*/ SL_IID_VOLUME};

const SLboolean req[3] = {SL_BOOLEAN_TRUE, SL_BOOLEAN_TRUE,

/*SL_BOOLEAN_TRUE,*/ SL_BOOLEAN_TRUE};

result = (*engineEngine)->CreateAudioPlayer(engineEngine, &bqPlayerObject, &audioSrc, &audioSnk,

3, ids, req);

assert(SL_RESULT_SUCCESS == result);



// realize the player

result = (*bqPlayerObject)->Realize(bqPlayerObject, SL_BOOLEAN_FALSE);

assert(SL_RESULT_SUCCESS == result);



// get the play interface

result = (*bqPlayerObject)->GetInterface(bqPlayerObject, SL_IID_PLAY, &bqPlayerPlay);

assert(SL_RESULT_SUCCESS == result);



// get the buffer queue interface

result = (*bqPlayerObject)->GetInterface(bqPlayerObject, SL_IID_BUFFERQUEUE,

&bqPlayerBufferQueue);

assert(SL_RESULT_SUCCESS == result);



// register callback on the buffer queue

result = (*bqPlayerBufferQueue)->RegisterCallback(bqPlayerBufferQueue, bqPlayerCallback, NULL);

assert(SL_RESULT_SUCCESS == result);



// get the effect send interface

result = (*bqPlayerObject)->GetInterface(bqPlayerObject, SL_IID_EFFECTSEND,

&bqPlayerEffectSend);

assert(SL_RESULT_SUCCESS == result);



#if 0 // mute/solo is not supported for sources that are known to be mono, as this one is

// get the mute/solo interface

result = (*bqPlayerObject)->GetInterface(bqPlayerObject, SL_IID_MUTESOLO, &bqPlayerMuteSolo);

assert(SL_RESULT_SUCCESS == result);

#endif



// get the volume interface

result = (*bqPlayerObject)->GetInterface(bqPlayerObject, SL_IID_VOLUME, &bqPlayerVolume);

assert(SL_RESULT_SUCCESS == result);



// set the player’s state to playing

result = (*bqPlayerPlay)->SetPlayState(bqPlayerPlay, SL_PLAYSTATE_PLAYING);

assert(SL_RESULT_SUCCESS == result);

}



// expose the mute/solo APIs to Java for one of the 2 players

static SLMuteSoloItf getMuteSolo()

{

if (fdPlayerMuteSolo != NULL)

return fdPlayerMuteSolo;

else

return bqPlayerMuteSolo;

}



// expose the volume APIs to Java for one of the 2 players



static SLVolumeItf getVolume()

{

if (fdPlayerVolume != NULL)

return fdPlayerVolume;

else

return bqPlayerVolume;

}



// enable reverb on the buffer queue player

jboolean Java_jni_NativeAudio_enableReverb(JNIEnv* env, jclass clazz,

jboolean enabled)

{

SLresult result;



// we might not have been able to add environmental reverb to the output mix

if (NULL == outputMixEnvironmentalReverb) {

return JNI_FALSE;

}



result = (*bqPlayerEffectSend)->EnableEffectSend(bqPlayerEffectSend,

outputMixEnvironmentalReverb, (SLboolean) enabled, (SLmillibel) 0);

// and even if environmental reverb was present, it might no longer be available

if (SL_RESULT_SUCCESS != result) {

return JNI_FALSE;

}



return JNI_TRUE;

}



// create asset audio player

jboolean Java_jni_NativeAudio_createAssetAudioPlayer(JNIEnv* env, jclass clazz,

jobject assetManager, jstring filename)

{

SLresult result;



// convert Java string to UTF-8

const char *utf8 = (*env)->GetStringUTFChars(env, filename, NULL);

assert(NULL != utf8);



// use asset manager to open asset by filename

AAssetManager* mgr = AAssetManager_fromJava(env, assetManager);

assert(NULL != mgr);

AAsset* asset = AAssetManager_open(mgr, (const char *) utf8, AASSET_MODE_UNKNOWN);



// release the Java string and UTF-8

(*env)->ReleaseStringUTFChars(env, filename, utf8);



// the asset might not be found

if (NULL == asset) {

return JNI_FALSE;

}



// open asset as file descriptor

off_t start, length;

int fd = AAsset_openFileDescriptor(asset, &start, &length);

assert(0 <= fd);

AAsset_close(asset);



// configure audio source

SLDataLocator_AndroidFD loc_fd = {SL_DATALOCATOR_ANDROIDFD, fd, start, length};

SLDataFormat_MIME format_mime = {SL_DATAFORMAT_MIME, NULL, SL_CONTAINERTYPE_UNSPECIFIED};

SLDataSource audioSrc = {&loc_fd, &format_mime};



// configure audio sink

SLDataLocator_OutputMix loc_outmix = {SL_DATALOCATOR_OUTPUTMIX, outputMixObject};

SLDataSink audioSnk = {&loc_outmix, NULL};



// create audio player

const SLInterfaceID ids[3] = {SL_IID_SEEK, SL_IID_MUTESOLO, SL_IID_VOLUME};

const SLboolean req[3] = {SL_BOOLEAN_TRUE, SL_BOOLEAN_TRUE, SL_BOOLEAN_TRUE};

result = (*engineEngine)->CreateAudioPlayer(engineEngine, &fdPlayerObject, &audioSrc, &audioSnk,

3, ids, req);

assert(SL_RESULT_SUCCESS == result);



// realize the player

result = (*fdPlayerObject)->Realize(fdPlayerObject, SL_BOOLEAN_FALSE);

assert(SL_RESULT_SUCCESS == result);



// get the play interface

result = (*fdPlayerObject)->GetInterface(fdPlayerObject, SL_IID_PLAY, &fdPlayerPlay);

assert(SL_RESULT_SUCCESS == result);



// get the seek interface

result = (*fdPlayerObject)->GetInterface(fdPlayerObject, SL_IID_SEEK, &fdPlayerSeek);

assert(SL_RESULT_SUCCESS == result);



// get the mute/solo interface

result = (*fdPlayerObject)->GetInterface(fdPlayerObject, SL_IID_MUTESOLO, &fdPlayerMuteSolo);

assert(SL_RESULT_SUCCESS == result);



// get the volume interface

result = (*fdPlayerObject)->GetInterface(fdPlayerObject, SL_IID_VOLUME, &fdPlayerVolume);

assert(SL_RESULT_SUCCESS == result);



// enable whole file looping

result = (*fdPlayerSeek)->SetLoop(fdPlayerSeek, SL_BOOLEAN_TRUE, 0, SL_TIME_UNKNOWN);

assert(SL_RESULT_SUCCESS == result);



return JNI_TRUE;

}



// set the playing state for the asset audio player

void Java_jni_NativeAudio_setPlayingAssetAudioPlayer(JNIEnv* env,

jclass clazz, jboolean isPlaying)

{

SLresult result;



// make sure the asset audio player was created

if (NULL != fdPlayerPlay) {



// set the player’s state

result = (*fdPlayerPlay)->SetPlayState(fdPlayerPlay, isPlaying ?

SL_PLAYSTATE_PLAYING : SL_PLAYSTATE_PAUSED);

assert(SL_RESULT_SUCCESS == result);



}



}



// shut down the native audio system

void Java_jni_NativeAudio_shutdown(JNIEnv* env, jclass clazz)

{



// destroy buffer queue audio player object, and invalidate all associated interfaces

if (bqPlayerObject != NULL) {

(*bqPlayerObject)->Destroy(bqPlayerObject);

bqPlayerObject = NULL;

bqPlayerPlay = NULL;

bqPlayerBufferQueue = NULL;

bqPlayerEffectSend = NULL;

bqPlayerMuteSolo = NULL;

bqPlayerVolume = NULL;

}



// destroy file descriptor audio player object, and invalidate all associated interfaces

if (fdPlayerObject != NULL) {

(*fdPlayerObject)->Destroy(fdPlayerObject);

fdPlayerObject = NULL;

fdPlayerPlay = NULL;

fdPlayerSeek = NULL;

fdPlayerMuteSolo = NULL;

fdPlayerVolume = NULL;

}



// destroy output mix object, and invalidate all associated interfaces

if (outputMixObject != NULL) {

(*outputMixObject)->Destroy(outputMixObject);

outputMixObject = NULL;

outputMixEnvironmentalReverb = NULL;

}



// destroy engine object, and invalidate all associated interfaces

if (engineObject != NULL) {

(*engineObject)->Destroy(engineObject);

engineObject = NULL;

engineEngine = NULL;

}

}

[/java]



Sorry for the late response on this. I don’t have internet at my house at the moment, and being at the poverty level doesn’t help. I can’t test this against the SVN, so I can’t integrate it myself as I don’t have the latest version. Just wanted to let you know that OpenSL ES is worth it. Will update whenever I get the chance.


Oh nice!!

I didn’t see your post before; I don’t know why I didn’t get an email…



I was looking into audio on Android and this will really come in handy!! Thank you!

Would it be possible to have an OpenSL implementation of AudioRenderer? The interface you’re using here isn’t the one jME3 is using.

@Momoko_Fan



Actually it is possible; the above was just an example. I created another interface that has play(), pause(), resume(), release(), isPlaying(), setLooping(), createEngine(), createAsset(AssetManager assetManager, String filename) and some other methods that involve environmental effects. The AssetManager is handled natively, as createAsset passes it to the native side. A lot of things are done natively anyway; the MediaPlayer makes native calls itself, so it would be no different.
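Roughly, it looks like this (a sketch reconstructed from the method list above; the exact signatures are approximate):

[java]
import android.content.res.AssetManager;

public interface NativeAudioRenderer {
    void createEngine();
    boolean createAsset(AssetManager assetManager, String filename);
    void play();
    void pause();
    void resume();
    void release();
    boolean isPlaying();
    void setLooping(boolean loop);
    // ...plus methods for environmental effects (reverb presets etc.)
}
[/java]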



To go a little off-topic, I have downloaded Unity’s free version and debugged Unity’s sample. I will say JME is at a huge disadvantage. They seem to use NativeActivity, which interacts with their scripts; no wonder it’s fast. To compete with other engines, natives are a must, since a lot of the libraries are in C/C++.



Right now I’m creating my own game. I will try to make time for it if I can. If not, then I will post the code here.

We know that rewriting the engine in native code would make it significantly faster on Android, but currently we have no such ambitious plans. We are hoping to optimize some of the hot spots in the engine, like sorting, particles, skinning, and collision, with native code, but that is for the future. Note that our physics engine (Bullet) is already implemented in native code on Android.

I see, then I will not speak of this anymore. Thanks for letting me know.