Materials and exports from terrain editor

Hi community,

Let me first say a very big thanks to the developers of jME. I have been following the project for a couple of months and I'm impressed by how simple and powerful the API is.
About me: I am a real Android enthusiast, so I came to try some little things with jME on my Android devices.

But right now I'm facing some problems. I wanted to make a little tank-driving game. I imported a simple tank model from Blender and created a terrain with the terrain editor (a really awesome tool).
I applied some physics controls (the terrain comes from the Scene Composer, the tank uses a custom collision shape created programmatically).
It runs well on my N7000 with the Mali r3p1 driver (an Android GPU driver with OpenGL, GLES and X11 support), but on my other devices it ran at about 1 FPS.
My first thought was that Bullet must be the issue (after a lot of compiling, testing and other work… it wasn't :mrgreen: )
Without lights it runs just fine on all devices, so I guess the problem is shadow calculation, and the terrain must be using the terrain lighting material definition.

So here are my questions:

  • Is it possible to make the terrain editor use an unshaded material?
  • Do I still get the 16 (I guess) texture “slots” with one of those materials?
  • Can I export the generated images (alpha map, height map, texture map…) from the terrain node, so I can build the terrain programmatically with a different material definition?
  • Can I get a list of the shaders in use and disable them one by one to figure out which one causes the performance impact?
  • Is it possible to implement custom ES2-compatible vertex/fragment shaders to get features like shadows?
  • Can I use the created terrain in Blender somehow to bake a light map, or build a light map in jME?
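On the export question: once you have the height values as a flat float array (the terrain editor's TerrainQuad can provide them), writing them out as a grayscale image only needs standard Java. A minimal sketch, assuming a square row-major height map and normalizing the height range to 0–255:

```java
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

public class HeightmapExport {
    // Convert a square height map (row-major floats) into a grayscale image,
    // normalizing the min..max height range to the 0..255 gray range.
    public static BufferedImage toImage(float[] heights, int size) {
        float min = Float.MAX_VALUE, max = -Float.MAX_VALUE;
        for (float h : heights) { min = Math.min(min, h); max = Math.max(max, h); }
        float range = (max - min) == 0 ? 1 : (max - min);
        BufferedImage img = new BufferedImage(size, size, BufferedImage.TYPE_BYTE_GRAY);
        for (int y = 0; y < size; y++) {
            for (int x = 0; x < size; x++) {
                int gray = Math.round((heights[y * size + x] - min) / range * 255f);
                // write the raw sample directly; setRGB on TYPE_BYTE_GRAY would
                // go through a color-space conversion and distort the values
                img.getRaster().setSample(x, y, 0, gray);
            }
        }
        return img;
    }

    public static void main(String[] args) throws Exception {
        float[] heights = { 0f, 10f, 20f, 30f }; // 2x2 toy terrain
        ImageIO.write(toImage(heights, 2), "png", new File("heightmap.png"));
    }
}
```

The same approach works for an alpha map, except you would keep all four RGBA channels instead of a single gray sample.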

Hope that's not too much text and that I could make myself clear. Ah yes… I used the SVN trunk.

Regarding compiling native Bullet for Android, I ran into a bunch of problems.
First, this fix should find its way into the SVN repo.

Second, Google must have changed a lot about the directory layout of the NDK. I couldn't compile the trunk from scratch with the Android NDK, nor with the CrystaX NDK.
I had to update the makefile in the following cases:

  • The include folder lives outside the jni directory, so I had to add a variable for the Bullet include prefix and use the other one for the source folders
  • I had to update the list of .cpp files, because not all of them were referenced in the makefile.

I tried with Bullet 2.80 & 2.81. After the changes it compiles fine for all platforms (armeabi/-v7a, mips and x86).


You might need to write your own shader for unlit terrain (you could always bake the lighting into one of its textures). You could base it on the existing terrain and unshaded shaders, though…


Thank you for the quick reply.
How can I bake a light map from jMP? And how can I find out which shaders are running on the terrain? Is there something like getAllModules or getAllShaders available from the renderer? Or can I even remove some from the queue?
Every example I've seen so far with terrains on Android created the TerrainQuads programmatically. That's why I wanted to export the resources from the terrain editor.

Thanks in advance

The shader working on the terrain is whatever you have specified in your material…

Actually we have an unshaded material for terrain; see Common/MatDefs/Terrain/Terrain.j3md.
It was even made before the terrain lighting material.
Afaik it uses the same number of textures as the terrain lighting material.

Unfortunately you can't bake light maps from the SDK. That's a feature we'd like to add, but no one has taken a shot at it yet.
The shader used by a material is declared inside the j3md file.
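For reference, using that material definition from an asset only takes a .j3m file. A minimal sketch, where the texture paths and scale values are made-up placeholders:

```
Material MyUnshadedTerrain : Common/MatDefs/Terrain/Terrain.j3md {
    MaterialParameters {
        // the alpha map's RGB channels control blending of the three layers
        Alpha : Textures/terrain-alpha.png
        Tex1 : Repeat Textures/grass.jpg
        Tex2 : Repeat Textures/dirt.jpg
        Tex3 : Repeat Textures/rock.jpg
        Tex1Scale : 64.0
        Tex2Scale : 32.0
        Tex3Scale : 128.0
    }
}
```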


Ah, that makes some things clearer. However, I have already seen the common material definitions. As far as I can tell, Terrain.j3md supports just 3 textures and an alpha map, unlike the one used by the terrain editor.
I'm thinking about a different approach (without export, the terrain editor becomes pretty useless this way). Has anyone already tried to adapt the jME3 shader library for GLSL ES compatibility?
If not, where would you suggest I start with work like this? As I said, I am pretty enthusiastic about the engine and Android, and I learn pretty fast.
Can I acquire a list of running shader programs at runtime, maybe with additional data like the linked object and so on?

Thanks a lot

Eventually the engine should work seamlessly across platforms. The best way to start is to experiment with fixing/adapting things and then posting and discussing patches here :slight_smile: “A line of code says more than a thousand words”, so to speak :wink:

Our shaders work on Android. The only thing that could be improved is precision qualifiers, which only make sense on OpenGL ES. We set the default precision values in the OpenGL ES renderer.

All our shaders are in core-data/Common, core-effects/Common and terrain/Common.
Note that core-effects contains mostly the shaders involved in post-processing, which are so resource-consuming on Android that they are not very likely to be usable.


I see. So there's nothing to do about the shader scripts themselves. I hope this weekend turns out somewhat relaxed; I may dive a little into the renderer and the post processor. Maybe an epiphany will hit me hard.
Until then, a big thanks.

They can surely be optimized for Android. They were developed on and for desktop, and then we just made them work on Android.

So far, so good. It took me a while to look into the renderers used for Android. The only little fix I “found” comes from an old guideline document for Adreno GPUs: on Android you should call glClear right after binding a framebuffer (graphics memory is very limited, and this stops the GPU from preserving the framebuffer contents). I applied this to the OGLES renderer without any additional queries. That increased the FPS from 1 to 3 on a TF700T tablet, but ruined the on-screen stats (no framebuffers were listed).

Besides, I found the comment in TerrainLighting.j3md ("// NOTE: Doesn't support OpenGL1"), so I guess the problem comes down to the pre and post processors. I haven't looked into the material handler yet, so any insight would be appreciated. Do you already know what exactly causes the performance impact with pre/post processors?

@Android.Enthusiast said: Do you already know what exactly causes the performance impact with pre/post processors?
The fill rate of most Android devices is very low, so add full-screen complex shading on top of that and you ruin the performance. The best example is that multisampling is faster on Android than FXAA. I don't know if you are familiar with this, but multisampling is the "classic" way of doing anti-aliasing, while FXAA is a post-process effect that blurs the edges in the scene; it gives results roughly equivalent to 2x multisampling and is a lot faster on desktop. It's also used in deferred rendering pipelines. A way to alleviate the issue would be to reduce the resolution… but this has other cons…
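The fill-rate argument can be made concrete with a little arithmetic. A sketch where the resolution, overdraw factor and pass count are purely illustrative assumptions:

```java
public class FillRateEstimate {
    // Fragments shaded per frame: the scene's own pixels times overdraw,
    // plus one full-screen pass per post-process effect (e.g. FXAA).
    static long fragmentsPerFrame(int width, int height, double sceneOverdraw, int postPasses) {
        long pixels = (long) width * height;
        return (long) (pixels * sceneOverdraw) + pixels * postPasses;
    }

    public static void main(String[] args) {
        // 1280x800 tablet, ~2x overdraw, one FXAA pass
        System.out.println(fragmentsPerFrame(1280, 800, 2.0, 1)); // 3072000
        // the same scene rendered at 75% resolution
        System.out.println(fragmentsPerFrame(960, 600, 2.0, 1));  // 1728000
    }
}
```

Dropping to 75% resolution cuts the shaded fragments by almost half, which is exactly why resolution scaling helps on fill-rate-bound mobile GPUs.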

The glClear thing is pretty interesting. I wouldn't care too much about the stats as long as it's faster… there is probably a way to fix the stats.

After a lot of research, I agree about mobile GPUs. Right now I'm working on an export tool for the height map, and I'll try to create a small repository of MatDefs with shaders optimized for mobile usage.
I guess the best way will be to compare shaders from other fast OpenGL ES 2 engines.
Once something remarkable comes out I will contribute it somewhere XD
About the glClear function, I'm still playing around with the renderer (I found a few other issues… for example, it looks like the Tegra 3 T33 chipset doesn't support depth, which is kind of weird).
After a stable set of changes I'll contribute them as normen suggested.

I remain with a big thank you :wink:

@Android.Enthusiast said: for example, it looks like the Tegra 3 T33 chipset doesn't support depth, which is kind of weird.
Hmm, that could be a problem on our end; maybe we don't find a suitable pixel configuration and end up choosing one with no depth. I think it looks for at least a 16-bit depth pixel configuration; maybe this GPU only has an 8-bit depth buffer.

I read some articles about this. Someone already posted about it on a Unity forum. It seems that Tegra chipsets don't support depth textures at all. They were working on a way to use RGBA textures instead and sample them in a shader the way a depth texture would behave.
That's a big problem, because next to the Mali 400 & T604 and modern Adreno GPUs, Tegra is still the most used chipset for 3D gaming.
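The RGBA workaround mentioned above is usually done by packing the depth value into the four 8-bit channels of a regular color texture. In practice the packing happens in GLSL, but the idea can be sketched in plain Java (a sketch of the packing scheme only, not jME or Unity code):

```java
public class DepthPack {
    // Pack a depth value in [0, 1) into four 8-bit channels (base-256 digits),
    // mimicking the GLSL trick of storing depth in an RGBA8 texture.
    static int[] pack(double depth) {
        int[] rgba = new int[4];
        double v = depth;
        for (int i = 0; i < 4; i++) {
            v *= 256.0;
            int digit = (int) v; // take the next base-256 digit
            rgba[i] = digit;
            v -= digit;          // keep the fractional remainder
        }
        return rgba;
    }

    // Reconstruct the depth value from the four channels.
    static double unpack(int[] rgba) {
        double v = 0.0, scale = 1.0 / 256.0;
        for (int i = 0; i < 4; i++) {
            v += rgba[i] * scale;
            scale /= 256.0;
        }
        return v;
    }

    public static void main(String[] args) {
        double d = 0.73421;
        // round-trips losslessly down to ~2^-32, plenty for a depth buffer
        System.out.println(Math.abs(d - unpack(pack(d))) < 1e-9);
    }
}
```

With 4 channels you recover roughly 32 bits of precision, which is why this substitute works well enough on GPUs without real depth-texture support.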