On the path to something else, I created a Depth Blur Filter. It could probably be improved but it’s pretty nifty.
It looks like this in Mythruna (which ironically will probably not be using it. ;)):
That picture has the focus distance set to 64 and the range set to 20 or so, which is why both the near ground and the far terrain are out of focus.
Also, I think the code makes a reasonably good template for any depth-based filter, because reverse engineering what’s in the z-buffer can sometimes be tricky. I’ve tried to be as illuminating as possible, but I’m a bit of a shader noob, too.
Here is the DepthBlurFilter.java file:
[java]
package yourPackageHere;

import com.jme3.asset.AssetManager;
import com.jme3.material.Material;
import com.jme3.post.Filter;
import com.jme3.renderer.RenderManager;
import com.jme3.renderer.Renderer;
import com.jme3.renderer.ViewPort;

/**
 *  A post-processing filter that performs a depth range
 *  blur using a scaled convolution filter.
 *
 *  @version   $Revision: 779 $
 *  @author    Paul Speed
 */
public class DepthBlurFilter extends Filter
{
    private float focusDistance = 50f;
    private float focusRange = 10f;
    private float blurScale = 1f;

    // These values are set internally based on the
    // viewport size.
    private float xScale;
    private float yScale;

    public DepthBlurFilter()
    {
        super( "Depth Blur" );
    }

    /**
     *  Sets the distance at which objects are purely in focus.
     */
    public void setFocusDistance( float f )
    {
        this.focusDistance = f;
    }

    public float getFocusDistance()
    {
        return focusDistance;
    }

    /**
     *  Sets the range to either side of focusDistance where the
     *  objects go gradually out of focus.  Less than focusDistance - focusRange
     *  and greater than focusDistance + focusRange, objects are maximally
     *  "blurred".
     */
    public void setFocusRange( float f )
    {
        this.focusRange = f;
    }

    public float getFocusRange()
    {
        return focusRange;
    }

    /**
     *  Sets the blur amount by scaling the convolution filter up or
     *  down.  A value of 1 (the default) performs a sparse 5x5 evenly
     *  distributed convolution at pixel level accuracy.  Higher values skip
     *  more pixels, and so on until you are no longer blurring the image
     *  but simply hashing it.
     *
     *  The sparse convolution is as follows:
     *  1 0 1 0 1
     *  0 1 0 1 0
     *  1 0 x 0 1
     *  0 1 0 1 0
     *  1 0 1 0 1
     *  Where 'x' is the texel being modified.  Setting blur scale higher
     *  than 1 spaces the samples out.
     */
    public void setBlurScale( float f )
    {
        this.blurScale = f;
    }

    public float getBlurScale()
    {
        return blurScale;
    }

    @Override
    public boolean isRequiresDepthTexture()
    {
        return true;
    }

    @Override
    public Material getMaterial()
    {
        material.setFloat( "FocusDistance", focusDistance );
        material.setFloat( "FocusRange", focusRange );
        material.setFloat( "XScale", blurScale * xScale );
        material.setFloat( "YScale", blurScale * yScale );
        return material;
    }

    @Override
    public void preRender( RenderManager renderManager, ViewPort viewPort )
    {
    }

    @Override
    public void initFilter( AssetManager assets, RenderManager renderManager,
                            ViewPort vp, int w, int h )
    {
        material = new Material( assets, "MatDefs/DepthBlur.j3md" );
        xScale = 1.0f / w;
        yScale = 1.0f / h;
    }

    @Override
    public void cleanUpFilter( Renderer r )
    {
    }
}
[/java]
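For completeness, wiring the filter up looks something like this. This is just a sketch of the usual jME3 filter setup (the `assetManager` and `viewPort` fields are assumed to come from a standard `SimpleApplication`; the settings are the ones from the screenshot):

```java
import com.jme3.post.FilterPostProcessor;

// Inside a SimpleApplication's simpleInitApp():
FilterPostProcessor fpp = new FilterPostProcessor( assetManager );

DepthBlurFilter dbf = new DepthBlurFilter();
dbf.setFocusDistance( 64 );  // the settings used for the picture above
dbf.setFocusRange( 20 );

fpp.addFilter( dbf );
viewPort.addProcessor( fpp );
```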
And the material definition:
[java]
MaterialDef Depth Blur {

    MaterialParameters {
        Int NumSamples
        Int NumSamplesDepth
        Texture2D Texture
        Texture2D DepthTexture
        Float FocusRange;
        Float FocusDistance;
        Float XScale;
        Float YScale;
    }

    Technique {
        VertexShader GLSL100:   Common/MatDefs/Post/Post.vert
        FragmentShader GLSL100: MatDefs/DepthBlur.frag

        WorldParameters {
            WorldViewProjectionMatrix
        }
    }

    Technique FixedFunc {
    }
}
[/java]
And the .frag where the ‘magic’ happens:
[java]
uniform sampler2D m_Texture;
uniform sampler2D m_DepthTexture;
varying vec2 texCoord;

uniform float m_FocusRange;
uniform float m_FocusDistance;
uniform float m_XScale;
uniform float m_YScale;

vec2 m_NearFar = vec2( 0.1, 1000.0 );

void main() {
    vec4 texVal = texture2D( m_Texture, texCoord );

    float zBuffer = texture2D( m_DepthTexture, texCoord ).r;

    //
    // z_buffer_value = a + b / z;
    //
    // Where:
    //  a = zFar / ( zFar - zNear )
    //  b = zFar * zNear / ( zNear - zFar )
    //  z = distance from the eye to the object
    //
    // Which means:
    //  zb - a = b / z;
    //  z * (zb - a) = b
    //  z = b / (zb - a)
    //
    float a = m_NearFar.y / (m_NearFar.y - m_NearFar.x);
    float b = m_NearFar.y * m_NearFar.x / (m_NearFar.x - m_NearFar.y);
    float z = b / (zBuffer - a);

    // Above could be the same for any depth-based filter

    // We want to be purely focused right at
    // m_FocusDistance and be purely unfocused
    // at +/- m_FocusRange to either side of that.
    float unfocus = min( 1.0, abs( z - m_FocusDistance ) / m_FocusRange );

    if( unfocus < 0.2 ) {
        // If we are mostly in focus then don't bother with the
        // convolution filter
        gl_FragColor = texVal;
    } else {
        // Perform a wide convolution filter and we scatter it
        // a bit to avoid some texture look-ups.  Instead of
        // a full 5x5 (25 - 1 lookups) we'll skip every other one
        // to only perform 12.
        // 1 0 1 0 1
        // 0 1 0 1 0
        // 1 0 x 0 1
        // 0 1 0 1 0
        // 1 0 1 0 1
        //
        // You can get away with 8 just around the outside but
        // it looks more jittery to me.
        vec4 sum = vec4( 0.0 );

        float x = texCoord.x;
        float y = texCoord.y;
        float xScale = m_XScale;
        float yScale = m_YScale;

        // In order from lower left to right, depending on how you look at it
        sum += texture2D( m_Texture, vec2(x - 2.0 * xScale, y - 2.0 * yScale) );
        sum += texture2D( m_Texture, vec2(x - 0.0 * xScale, y - 2.0 * yScale) );
        sum += texture2D( m_Texture, vec2(x + 2.0 * xScale, y - 2.0 * yScale) );
        sum += texture2D( m_Texture, vec2(x - 1.0 * xScale, y - 1.0 * yScale) );
        sum += texture2D( m_Texture, vec2(x + 1.0 * xScale, y - 1.0 * yScale) );
        sum += texture2D( m_Texture, vec2(x - 2.0 * xScale, y - 0.0 * yScale) );
        sum += texture2D( m_Texture, vec2(x + 2.0 * xScale, y - 0.0 * yScale) );
        sum += texture2D( m_Texture, vec2(x - 1.0 * xScale, y + 1.0 * yScale) );
        sum += texture2D( m_Texture, vec2(x + 1.0 * xScale, y + 1.0 * yScale) );
        sum += texture2D( m_Texture, vec2(x - 2.0 * xScale, y + 2.0 * yScale) );
        sum += texture2D( m_Texture, vec2(x - 0.0 * xScale, y + 2.0 * yScale) );
        sum += texture2D( m_Texture, vec2(x + 2.0 * xScale, y + 2.0 * yScale) );
        sum = sum / 12.0;

        gl_FragColor = mix( texVal, sum, unfocus );

        // I used this for debugging the range
        //gl_FragColor.r = unfocus;
    }
}
[/java]
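By the way, the z-buffer reconstruction above is easy to sanity-check outside of a shader. Here’s a little self-contained Java snippet (class and method names are mine, just for illustration) that round-trips an eye-space distance through the `a + b / z` encoding and back, using the same 0.1/1000 near/far values:

```java
public class DepthCheck {

    static final double NEAR = 0.1;
    static final double FAR = 1000.0;

    // z_buffer_value = a + b / z, same constants as in the shader
    static double encode( double z ) {
        double a = FAR / (FAR - NEAR);
        double b = FAR * NEAR / (NEAR - FAR);
        return a + b / z;
    }

    // z = b / (zb - a), the reconstruction used in the .frag
    static double decode( double zb ) {
        double a = FAR / (FAR - NEAR);
        double b = FAR * NEAR / (NEAR - FAR);
        return b / (zb - a);
    }

    public static void main( String[] args ) {
        // The near plane encodes to 0, the far plane to 1, and
        // everything in between round-trips back to its distance.
        for( double z : new double[] { 1, 10, 64, 500 } ) {
            System.out.println( "z=" + z + "  zb=" + encode( z )
                                + "  back=" + decode( encode( z ) ) );
        }
    }
}
```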
Improvements: it currently hard-codes the near/far clip values to what I use for my camera. These could potentially be passed in or maybe there’s a better way to get them.
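One way to pass them in (just a sketch of what I’d try, not tested): expose `NearFar` as a material parameter and set it from the viewport’s camera each frame. This assumes adding `Vector2 NearFar` to the j3md’s `MaterialParameters` and changing the hard-coded `vec2 m_NearFar` in the .frag to `uniform vec2 m_NearFar;`:

```java
import com.jme3.math.Vector2f;
import com.jme3.renderer.Camera;
import com.jme3.renderer.RenderManager;
import com.jme3.renderer.ViewPort;

// In DepthBlurFilter, the currently empty preRender() could push the
// camera's actual clip planes into the material:
@Override
public void preRender( RenderManager renderManager, ViewPort viewPort ) {
    Camera cam = viewPort.getCamera();
    material.setVector2( "NearFar",
                         new Vector2f( cam.getFrustumNear(),
                                       cam.getFrustumFar() ) );
}
```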
Also, the blur might look better if it was clamped radially around the center of the screen. Anything within distance - range → distance + range is in focus, which means that the focused area is a big strip across the screen centered at ‘distance’.
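To make that strip concrete, here is the shader’s unfocus ramp evaluated in plain Java with the screenshot’s settings (focus distance 64, range 20); the names are just for illustration:

```java
public class FocusBand {

    // Same ramp as the shader:
    //   unfocus = min(1, |z - focusDistance| / focusRange)
    static double unfocus( double z, double focusDistance, double focusRange ) {
        return Math.min( 1.0, Math.abs( z - focusDistance ) / focusRange );
    }

    public static void main( String[] args ) {
        // Fully sharp at 64, half blurred at 54 or 74, and fully
        // blurred at 44 and below or 84 and beyond.
        for( double z : new double[] { 30, 44, 54, 64, 74, 84, 120 } ) {
            System.out.println( "z=" + z + "  unfocus=" + unfocus( z, 64, 20 ) );
        }
    }
}
```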
Let me know if I’ve done anything the hard way.