Sorry if this has been covered already. I still have a lot of catching up to do on OpenGL and JME in general as you may have understood by now.
I am looking for the “right” way to fade the screen to black. I am assuming I should be able to control the alpha value of some kind of overlay in the main viewPort and that there is a very standard way of doing this. I would really appreciate it if someone could point me in the right direction.
Actually, just to let you know, I was thinking of something along these lines (pseudo-codish; I’ve left out positioning and the update loop where I would grab the material and change the color/alpha). Am I on the right track?
[java]
Geometry blackOverlay = new Geometry("black overlay");
Quad blackOverlayQuad = new Quad(width, height);
[…]
blackOverlay.setMesh(blackOverlayQuad);
blackOverlay.setQueueBucket(Bucket.Gui);
Material material = new Material(assetManager, "Common/MatDefs/Misc/SolidColor.j3md");
material.setColor("m_Color", new ColorRGBA(0f, 0f, 0f, alpha));
// alpha blending so the overlay is actually see-through
material.getAdditionalRenderState().setBlendMode(BlendMode.Alpha);
blackOverlay.setMaterial(material);
guiNode.attachChild(blackOverlay);
[/java]
Hi,
I would do it as a 2D filter if I were you, but it involves shaders.
A filter is a full-screen quad onto which the rendered scene is applied as a texture, through a material.
The idea, of course, is to implement the desired effect in that material.
You would have to pass the time, or tpf, to the shader and decrease the output color value until it reaches 0.
And do the other way around for a fade in.
Cool, thanks a lot. I am going to have a look at your ColorOverlayFilter and do some reading about shaders (and fragment shaders in particular I guess)…
Here is a doc that could help with shaders:
https://wiki.jmonkeyengine.org/legacy/doku.php/jme3:advanced:jme3_shaders
Thanks again, I have it working now! I based it on the ColorOverlayFilter and the Gui material, subtracting the same (gradually increasing) value from the R, G, and B color components. I had to clamp the texVal color components to a maximum of 1 to get a nice uniform fade-out, but now it looks great.
Cool
would you share the code?
Sure, np - please let me know if there is a nicer way of doing any of this (I am especially unfamiliar with what can be done in the shaders). This should be it I guess (excluding imports and comments):
FadeFilter.java:
[java]
public class FadeFilter extends Filter {

    // current fade value, set from outside via setValue()
    public float value;

    public FadeFilter() {
        super("Fade In/Out Overlay");
    }

    @Override
    public Material getMaterial() {
        material.setFloat("m_Value", value);
        return material;
    }

    @Override
    public void preRender(RenderManager renderManager, ViewPort viewPort) {}

    @Override
    public void initMaterial(AssetManager manager) {
        material = new Material(manager, "Shaders/Fade.j3md");
    }

    public void setValue(float value) {
        this.value = value;
    }
}
[/java]
Shaders/Fade.j3md:
[java]
MaterialDef Default Fade {
    MaterialParameters {
        Texture2D m_Texture
        Float m_Value
    }
    Technique {
        VertexShader GLSL100: Shaders/Fade.vert
        FragmentShader GLSL100: Shaders/Fade.frag
        WorldParameters {
            WorldViewProjectionMatrix
        }
        Defines {
            TEXTURE : m_Texture
        }
    }
    Technique FixedFunc {
    }
}
[/java]
Shaders/Fade.vert:
[java]
uniform mat4 g_WorldViewProjectionMatrix;
uniform float m_Value;

attribute vec3 inPosition;

#ifdef TEXTURE
attribute vec2 inTexCoord;
varying vec2 texCoord;
#endif

void main() {
    gl_Position = g_WorldViewProjectionMatrix * vec4(inPosition, 1.0);
#ifdef TEXTURE
    texCoord = inTexCoord;
#endif
}
[/java]
Shaders/Fade.frag:
[java]
#ifdef TEXTURE
uniform sampler2D m_Texture;
varying vec2 texCoord;
#endif
uniform float m_Value;

void main() {
#ifdef TEXTURE
    vec4 texVal = texture2D(m_Texture, texCoord);
    gl_FragColor = vec4(
        (texVal.r > 1 ? 1 : texVal.r) - m_Value,
        (texVal.g > 1 ? 1 : texVal.g) - m_Value,
        (texVal.b > 1 ? 1 : texVal.b) - m_Value,
        texVal.a);
#else
    gl_FragColor = vec4(m_Value, m_Value, m_Value, 1);
#endif
}
[/java]
Usage:
[java]
if (renderer.getCaps().contains(Caps.GLSL100)) {
    FadeFilter fadeFilter = new FadeFilter();
    FilterPostProcessor filterPostProcessor = new FilterPostProcessor(assetManager);
    filterPostProcessor.addFilter(fadeFilter);
    viewPort.addProcessor(filterPostProcessor);
}
[/java]
I’ve also obviously left out the updating of the fadeFilter value - I keep a reference to the fadeFilter for easy access and then update the value in the main loop when I need to fade.
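In case it’s useful, that update code is basically just something like this in my SimpleApplication’s update loop (a rough sketch; the fadingOut and fadeSpeed fields are only my own bookkeeping, not part of the filter):
[java]
// value goes from 0 (scene fully visible) to 1 (black).
private boolean fadingOut = false;
private float fadeSpeed = 0.5f; // 1 / duration, so 0.5f means a 2 second fade

@Override
public void simpleUpdate(float tpf) {
    if (fadingOut && fadeFilter.value < 1f) {
        fadeFilter.setValue(Math.min(1f, fadeFilter.value + tpf * fadeSpeed));
    }
}
[/java]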
That looks good.
Just one or two remarks / bits of advice:
- I would have computed m_Value in the shader, or at least in the filter, so you don’t have to compute the value in the application’s main loop. To do this in the shader, there are two global uniforms that can be used: Time and tpf. Time is the time in milliseconds since the application started, and tpf is the render duration of the last frame.
- You don’t really need the conditional define for m_Texture (#ifdef, etc.) because it’s a filter, and it’s very likely that you’ll always pass the rendered scene as a texture to the shader.
- Be careful when you use this line:
texVal.r > 1
You are comparing a float value to an int value, so on Nvidia you’ll get a warning like “implicit cast from int to float” and it will still work, but on ATI cards this raises an error and the shader just does not compile (Nvidia on Mac OS raises an error too).
The valid GLSL syntax is:
texVal.r > 1.0
(See the sketch below for the last two points combined.)
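To make those two points concrete, the fragment shader could be simplified to something like this (just a sketch of the idea, not a committed version):
[java]
// Sketch only: the TEXTURE define is dropped since a filter always
// receives the rendered scene as a texture, and float literals are
// used everywhere so the shader also compiles on ATI drivers.
uniform sampler2D m_Texture;
uniform float m_Value;
varying vec2 texCoord;

void main() {
    vec4 texVal = texture2D(m_Texture, texCoord);
    // clamp to 1.0 first so the fade stays uniform, then darken
    vec3 color = min(texVal.rgb, vec3(1.0)) - vec3(m_Value);
    gl_FragColor = vec4(color, texVal.a);
}
[/java]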
Thank you for sharing your work - can I use it for a JME3 Fade IN/OUT filter?
Cool, thanks a lot for the info! I am learning something new every day.
Sure, go ahead and take whatever you can use.
OK, so I committed the FadeFilter to the latest svn, with some changes though.
While doing it I realized that you had no means to do the calculation in the Filter or in the shader, so I introduced the preFrame(float tpf) method that you can now override in any filter. It’s called prior to any calculation for the frame.
So the usage has changed a bit:
- There is a duration parameter that can be set in the constructor or via a setter; it’s the duration of the fade in seconds.
- You can now call the fadeIn() or fadeOut() methods on the filter to trigger the effect (see the sketch below).
You can look at TestPostFilters to see an example.
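In case it helps, the new usage should look roughly like this (a sketch based on the description above, so take the exact signatures with a grain of salt):
[java]
FilterPostProcessor fpp = new FilterPostProcessor(assetManager);
FadeFilter fade = new FadeFilter(2f); // duration of the fade in seconds
fpp.addFilter(fade);
viewPort.addProcessor(fpp);

// later, when a transition is needed:
fade.fadeOut();
// and once the new scene is ready:
fade.fadeIn();
[/java]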
I also changed the way the fading is done in the filter: I multiply the color by a value going from 0 to 1 (or the other way around), so the fade is more linear.
One or two things remain to be done.
I guess you want to know when the effect has finished, so you can switch scenes or do something like that.
I could set up an observer pattern so you could add listeners to the filter. But it needs to be more generic; maybe I’ll introduce an AnimatedFilter interface that you can implement if your filter holds some kind of animation.
The other thing that bothers me is that the filter is always active, even when the fade is not playing.
I must implement a way of enabling/disabling filters in the filter stack.
I would appreciate your feedback on that.
Thank you
That’s great!
I actually added an interface “FadeFilterListener” with one method (fadeDone), made the FadeFilter hold a single reference to such an object, and also moved the updating code into the FadeFilter class. It works, but I’m not too happy about it. I agree that a more generic interface would be a lot nicer, so the AnimatedFilter idea sounds good to me. I’ve noticed that jQuery (the JavaScript API) has an animate() function that takes a completion callback - animationDone() or whatever.
When I add the filter to the stack it does make the frame rate drop considerably (from 400 fps to 100 fps at the moment), but as it’s just for transitions I’m not too bothered about it. As you say, I would need to be able to drop it out of the stack when I don’t need it. At the moment I just do filterPostProcessor.removeFilter(fadeFilter) when the fading is done; that works because I only have that one filter in the stack, and it’s done in the main update loop so it should be safe from a threading perspective, I guess. Being able to enable/disable the actual filter sounds a lot better to me, especially if (or when) I need to use more filters.
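For what it’s worth, my current hack looks roughly like this (just a sketch; FadeFilterListener and fadeDone() are my own made-up names, not part of the engine):
[java]
// Made-up callback interface with the one method described above.
// My FadeFilter keeps a single reference to one of these and calls
// fadeDone() when the fade value reaches its end.
public interface FadeFilterListener {
    void fadeDone();
}

// In the application (filterPostProcessor and fadeFilter are fields),
// the listener removes the filter from the stack once the fade has
// finished; this runs from the main update loop.
FadeFilterListener listener = new FadeFilterListener() {
    public void fadeDone() {
        filterPostProcessor.removeFilter(fadeFilter);
        // switch scene, start the next state, etc.
    }
};
[/java]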
Hi!
I am trying to make an object/geometry/mesh fade out (it gets more and more transparent until it vanishes).
I saw this FadeFilter, but it is applied to the viewport as a post processor.
Can it be applied to a single geometry/texture?
I am still researching…
Particles:
I found that particles change the alpha of each color in the color buffer, so that would be one way; I just wonder if there is another way already implemented (if that isn’t the only way)?
Alpha blending:
Also, I couldn’t find any examples using Blend.AlphaAdditive… I am still learning this stuff anyway.
EDIT: hmm, simple: just use a material like Lighting.j3md, set the Diffuse color (and the two others) with alpha, and set the geometry to the Transparent bucket - all geometries, so they all render properly (I don’t know if it can cause performance issues…).
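Something along these lines seems to work (a sketch; alpha, geometry and assetManager are placeholders, and the parameter names are the Lighting.j3md ones as I understand them, so double-check):
[java]
// Fading a single geometry: give the material colors with an alpha
// component, enable alpha blending and put it in the Transparent bucket.
Material mat = new Material(assetManager, "Common/MatDefs/Light/Lighting.j3md");
mat.setBoolean("UseMaterialColors", true);
mat.setColor("Diffuse",  new ColorRGBA(1f, 1f, 1f, alpha));
mat.setColor("Ambient",  new ColorRGBA(1f, 1f, 1f, alpha));
mat.setColor("Specular", new ColorRGBA(1f, 1f, 1f, alpha));
mat.getAdditionalRenderState().setBlendMode(BlendMode.Alpha);
geometry.setMaterial(mat);
geometry.setQueueBucket(Bucket.Transparent);
[/java]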