[Solved] Texture with animated transparency

Hi guys,



I’m still working on my dynamic sky. I decided to fade the cloud layer in and out. On this layer there is a cloud texture with an alpha channel, and I want the clouds to gradually fade in or out by animating the transparency. So I created a shader based on Sky to get this fade effect. It works fine visually, but there’s a major memory leak: it eats up around 4 GB of memory in about two minutes! I can’t find any way to debug a shader, but I do get an INFO message in the console while my app runs. Here’s the shader code and the error message. If anyone well versed in shaders could help me, I would really appreciate it.



AnimatedAlphaSky.j3md

[java]
MaterialDef AnimatedAlphaSky Plane {

    MaterialParameters {
        TextureCubeMap Texture
        Boolean SphereMap
        Vector3 NormalScale
        Float AlphaLevel
    }

    Technique {
        VertexShader GLSL100:   MatDefs/AnimatedAlphaSky.vert
        FragmentShader GLSL100: MatDefs/AnimatedAlphaSky.frag

        RenderState {
            FaceCull Off
        }

        WorldParameters {
            ViewMatrix
            ProjectionMatrix
            WorldMatrix
        }

        Defines {
            SPHERE_MAP : SphereMap
            ALPHA_LEVEL: AlphaLevel
        }
    }

    Technique FixedFunc {
    }
}
[/java]



AnimatedAlphaSky.vert

[java]
uniform mat4 g_ViewMatrix;
uniform mat4 g_ProjectionMatrix;
uniform mat4 g_WorldMatrix;

uniform vec3 m_NormalScale;
uniform float AlphaLevel;

attribute vec3 inPosition;
attribute vec3 inNormal;

varying vec3 direction;

void main(){
    // set w coordinate to 0
    vec4 pos = vec4(inPosition, 0.0);

    // compute rotation only for view matrix
    pos = g_ViewMatrix * pos;

    // now find projection
    pos.w = 1.0;
    gl_Position = g_ProjectionMatrix * pos;

    vec4 normal = vec4(inNormal * m_NormalScale, 0.0);
    direction = normalize( (g_WorldMatrix * normal).xyz );
}
[/java]



AnimatedAlphaSky.frag

[java]
#import "Common/ShaderLib/Optics.glsllib"

uniform ENVMAP m_Texture;
uniform float AlphaLevel;

varying vec3 direction;

void main() {
    vec3 dir = normalize(direction);
    vec4 finalColor = Optics_GetEnvColor(m_Texture, direction);
    finalColor.a = finalColor.a * ALPHA_LEVEL;
    gl_FragColor = finalColor;
}
[/java]



Error message:

INFO: Uniform m_AlphaLevel is not declared in shader [ShaderSource[name=MatDefs/AnimatedAlphaSky.vert, defines, type=Vertex], ShaderSource[name=MatDefs/AnimatedAlphaSky.frag, defines, type=Fragment]].

Jan 24, 2012 1:14:47 PM com.jme3.renderer.lwjgl.LwjglRenderer updateUniformLocation



Of course, each time I want to change the alpha level of the texture, I set the material alpha level like this:

skyLayerMaterial.setFloat("AlphaLevel", 1.0f); // where 1.0f is 100% opaque and 0.0f is 100% transparent



I don’t really understand the error message, since I never defined a uniform “m_AlphaLevel”. And the value is obviously set somewhere since the clouds do fade in and out…



Thanks! :slight_smile:

In Java, you say setFloat("AlphaLevel"), but that has to be declared as m_AlphaLevel in the shader. That’s the way JME works, and you can see it in all of the other shaders.
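Roughly, the mapping for your AlphaLevel parameter looks like this (just an illustration of the naming convention, reusing your skyLayerMaterial variable):

[java]
// In the .j3md, the parameter is declared without any prefix:
//     Float AlphaLevel
// In Java you set it by that same name:
skyLayerMaterial.setFloat("AlphaLevel", 0.5f);
// In the .vert/.frag GLSL, material parameters pick up the m_ prefix:
//     uniform float m_AlphaLevel;
[/java]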



Shaders can’t have memory leaks so it would have to be in your code or JME, I guess. If you fix the above problem and the memory leak goes away then there is something wrong in JME.


Oh, so I just have to rename AlphaLevel to m_AlphaLevel? Arg!!! Yes, I see “Texture” in the .j3md and “m_Texture” in the .frag. I should have seen that…

There can be no memory leak in shaders? Okay, I’ll see what happens when the problem is solved and let you know if the problem seems to come from my code or JME’s.



Thanks a lot pspeed! :smiley:

Hmm, I didn’t change my Java code, only the shader code, and the memory leak is gone… Also, I used to get the error message above once per frame. Now it appears only once, when the app is launched, and it has changed a little:



INFO: Uniform AlphaLevel is not declared in shader [ShaderSource[name=MatDefs/AnimatedAlphaSky.vert, defines, type=Vertex], ShaderSource[name=MatDefs/AnimatedAlphaSky.frag, defines, type=Fragment]].

Jan 24, 2012 2:34:35 PM com.jme3.renderer.lwjgl.LwjglRenderer cleanup



Here is the corrected shader code.



AnimatedAlphaSky.j3md

[java]
MaterialDef AnimatedAlphaSky Plane {

    MaterialParameters {
        TextureCubeMap Texture
        Boolean SphereMap
        Vector3 NormalScale
        Float AlphaLevel
    }

    Technique {
        VertexShader GLSL100:   MatDefs/AnimatedAlphaSky.vert
        FragmentShader GLSL100: MatDefs/AnimatedAlphaSky.frag

        RenderState {
            FaceCull Off
        }

        WorldParameters {
            ViewMatrix
            ProjectionMatrix
            WorldMatrix
        }

        Defines {
            SPHERE_MAP : SphereMap
        }
    }

    Technique FixedFunc {
    }
}
[/java]



AnimatedAlphaSky.vert

[java]
uniform mat4 g_ViewMatrix;
uniform mat4 g_ProjectionMatrix;
uniform mat4 g_WorldMatrix;

uniform vec3 m_NormalScale;

attribute vec3 inPosition;
attribute vec3 inNormal;

varying vec3 direction;

void main(){
    // set w coordinate to 0
    vec4 pos = vec4(inPosition, 0.0);

    // compute rotation only for view matrix
    pos = g_ViewMatrix * pos;

    // now find projection
    pos.w = 1.0;
    gl_Position = g_ProjectionMatrix * pos;

    vec4 normal = vec4(inNormal * m_NormalScale, 0.0);
    direction = normalize( (g_WorldMatrix * normal).xyz );
}
[/java]



AnimatedAlphaSky.frag

[java]
#import "Common/ShaderLib/Optics.glsllib"

uniform ENVMAP m_Texture;
uniform float m_AlphaLevel;

varying vec3 direction;

void main() {
    vec3 dir = normalize(direction);
    vec4 finalColor = Optics_GetEnvColor(m_Texture, direction);
    finalColor.a = finalColor.a * m_AlphaLevel;
    gl_FragColor = finalColor;
}
[/java]



So, it would mean that the memory leak is in JME then?

@vincent said:
So, it would mean that the memory leak is in JME then?


Yes. I'll let @momoko_fan comment on that.

I mean, the good thing is that the leak only happens for what is really a coding error. But it still seems wrong to me.

Are you really sure it’s these lines that are causing the memory to fill up? Maybe some other part of your application doesn’t run if you remove this? I really can’t imagine how setting a few variables could eat memory like this… Do you actually get out-of-memory errors? If you have the Java memory limit set to 4 GB, it might use all of it before garbage collecting for the first time (only if things get really bad, but it can happen).
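If you want to rule that out, you could launch with an explicit heap cap and see whether you eventually hit an OutOfMemoryError (the jar name here is just a placeholder):

[java]
java -Xmx512m -jar MyGame.jar
[/java]

If memory still climbs until the app dies, it’s a real leak; if usage just levels off near the cap, it was only the garbage collector being lazy.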

Smart caching is not enabled for shaders because they are non-clonable.



See this issue:

http://code.google.com/p/jmonkeyengine/issues/detail?id=273



You shouldn’t bind defines to values that change often and can take many different values, like the AlphaLevel you have there. Each time you change it, your shader has to be recompiled by the driver, which significantly reduces performance. It is also cached in the asset manager, which causes the “memory leak” you mentioned.

@vincent said:
Hum, I didn't change my java code, I only changed the shader code and the memory leak is gone... Also I used to receive the error message above once each frame. Now, it appears only once, when the app is launched and it changed a little:


You changed more than just the shader code. You also removed the ALPHA_LEVEL: AlphaLevel line from your original j3md.

Some advice:
It's important, when trying to track down a problem, to change only one thing at a time. Be scientific about it.

As it was, you changed more than one thing and got a false indication of where the leak was... since, as momoko points out, it was the original define binding that caused the leak. You'd get the leak back if you added it back, even with m_AlphaLevel declared correctly.

@normen:

My app is pretty basic. I have no conditions based on the AlphaLevel I set in the shader. Basically, each frame I calculate what the AlphaLevel should be and then set it. You have a good point: I never get an out-of-memory error, but after a while the app begins to lag and my computer becomes unresponsive. I assumed a memory leak because of the 4 GB of memory gone in such a short time. I’m completely new to Java and wasn’t aware it was possible to set a memory limit. That may be the problem.
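To be concrete, my update loop is essentially this (simplified sketch; alphaLevel, fadingIn and fadeSpeed are just example fields in my SimpleApplication):

[java]
@Override
public void simpleUpdate(float tpf) {
    // move the alpha toward 1 when fading in, toward 0 when fading out
    alphaLevel += (fadingIn ? fadeSpeed : -fadeSpeed) * tpf;
    alphaLevel = FastMath.clamp(alphaLevel, 0f, 1f);
    skyLayerMaterial.setFloat("AlphaLevel", alphaLevel);
}
[/java]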



@Momoko_Fan:

I see… So if setting a uniform value in a shader is bad practice, do you have a suggestion for how to achieve this fade-in/fade-out of a texture? Or is it only because I change it so often? If, for example, I changed the value every second rather than every frame, would that be good enough? It would probably still look fine to the human eye and be easier for the app to handle.
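Something like this is what I had in mind for the once-per-second version (rough sketch; timeSinceUpdate is a made-up accumulator field and computeAlphaLevel() stands in for whatever calculation I do):

[java]
private float timeSinceUpdate = 0f;

@Override
public void simpleUpdate(float tpf) {
    timeSinceUpdate += tpf;
    if (timeSinceUpdate >= 1f) { // only touch the material once per second
        skyLayerMaterial.setFloat("AlphaLevel", computeAlphaLevel());
        timeSinceUpdate = 0f;
    }
}
[/java]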

@pspeed: But… isn’t the ALPHA_LEVEL defined in the shader? Or maybe I misunderstand shaders. I thought the shader was made of three parts: the .j3md, the .vert and the .frag.



I’m sorry if I confused anyone. I’m really starting from scratch here.

This defines the uniform:

Float AlphaLevel



And that’s fine.



In your original j3md, you also had:

ALPHA_LEVEL: AlphaLevel



Which is the same as having a #define in the shader code that has the AlphaLevel value. This means that the shaders need to be recompiled every time you set AlphaLevel. That is why you had a leak.
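In other words, with the define bound, the generated fragment shader source effectively starts with something like this (illustrative; the exact text the engine generates may differ):

[java]
#define ALPHA_LEVEL 1.0   // baked into the source by the ALPHA_LEVEL: AlphaLevel binding
[/java]

Every new AlphaLevel value produces a different source string and therefore a new compile, whereas a plain uniform (uniform float m_AlphaLevel) is simply uploaded to the already-compiled program.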


Great! Thanks a lot for clearing that up for me! :slight_smile: Don’t worry, as I gain experience I will stop asking stupid questions.