Question about depth

I’m trying to write a custom water shader. Currently, I’m trying to change the water’s “blueness” based on how deep it looks from the camera’s perspective (I don’t know much about water optics, but this is my guess). With this, shores will be almost transparent and deep water will be very blue. I basically use (sceneDepth - waterDepth) in my shader to do this, but I get inconsistent colors like this:

And when I back up a little bit:

Btw, I know my water mesh is weird.

Well that certainly looks nice for your first shader attempts! Gratz.
What’s the question, btw? :wink:

It may be that how you are interpreting depth is incorrect. One trick I used was to color everything red, linearly based on depth, to get my head around it.
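
Something like this throwaway fragment shader is enough for that (just a sketch; m_DepthTexture and m_NearFar are assumed uniform names, substitute whatever you actually pass in):
[java]
// Debug shader: paint eye-space depth as shades of red.
uniform sampler2D m_DepthTexture; // depth texture of the viewport
uniform vec2 m_NearFar;           // x = near plane, y = far plane
varying vec2 texCoord;

void main() {
    // Raw, non-linear depth buffer value in [0, 1]
    float zBuffer = texture2D(m_DepthTexture, texCoord).r;

    // Standard linearization of perspective depth to an eye-space distance
    float a = m_NearFar.y / (m_NearFar.y - m_NearFar.x);
    float b = m_NearFar.y * m_NearFar.x / (m_NearFar.x - m_NearFar.y);
    float z = b / (zBuffer - a);

    // Map the distance to 0..1 so near = black, far = full red
    float red = clamp(z / m_NearFar.y, 0.0, 1.0);
    gl_FragColor = vec4(red, 0.0, 0.0, 1.0);
}
[/java]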

Since you haven’t provided anything else about how you’ve wired this together, that’s all the advice I can offer.

Initially I guessed it had something to do with depth precision (which I don’t really know about). But it wasn’t (I think).

@seann999 said: Initially I guessed it had something to do with depth precision (which I don't really know about). But it wasn't (I think).
That sounds like you solved your problem? How? And what exactly is "it"? :)

@normen Thanks. The problem is that the colors are inconsistent. In the first picture, it looks like the water between the “islands” is all blue, but at the bottom, only a narrow strip is blue.
@pspeed Okay, I have two separate viewports. One is for the main scene and the other contains the water geometries. The two rendered textures from both are combined in a shader. I know there are some problems with rendering transparent objects this way (I guess it’s kind of like deferred rendering). In this shader, I blend the colors from the two. As part of this, I mix them depending on their depth differences. This is part of the code:
[java]
// Blend scene color toward water color; the blend factor grows with the
// depth difference between the two renders (scaled by the water alpha and 32)
// and is clamped to [0, 1].
color = vec4(mix(texture2D(m_SceneColor, texCoord).rgb,
                 texture2D(m_WaterColor, texCoord).rgb,
                 clamp(water.a * 32.0 * (sceneDepth - waterDepth), 0.0, 1.0)), 1.0);
[/java]

@normen said: That sounds like you solved your problem? How? And what exactly is "it"? :)
I changed the depth texture format from Format.Depth to Format.Depth16, but that didn't solve the issue.

But how are you calculating scene depth? That part was left out.

This .frag file has an example of pulling depth from a depth texture:
http://code.google.com/p/jmonkeyengine/source/browse/trunk/engine/src/core-effects/Common/MatDefs/Post/DepthOfField.frag

…along with comments.

@pspeed I get depth by getting the r value from the depth texture that is set for the framebuffer of the viewport (I do it twice for my shader; one for the water scene and one for the non-water scene). I think that’s what the DepthOfField.frag is doing too.

@seann999 said: @pspeed I get depth by getting the r value from the depth texture that is set for the framebuffer of the viewport (I do it twice for my shader; one for the water scene and one for the non-water scene). I think that's what the DepthOfField.frag is doing too.

Well, I will cut and paste so we are both sure… this seemed like something you would have pasted if you were doing it:

[java]
float zBuffer = texture2D( m_DepthTexture, texCoord ).r;

//
// z_buffer_value = a + b / z;
//
// Where:
// a = zFar / ( zFar - zNear )
// b = zFar * zNear / ( zNear - zFar )
// z = distance from the eye to the object
//
// Which means:
// zb - a = b / z;
// z * (zb - a) = b
// z = b / (zb - a)
//
float a = m_NearFar.y / (m_NearFar.y - m_NearFar.x);
float b = m_NearFar.y * m_NearFar.x / (m_NearFar.x - m_NearFar.y);
float z = b / (zBuffer - a);
[/java]
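
Applied to your two-viewport setup, taking the depth difference would look roughly like this (again just a sketch; m_SceneDepth, m_WaterDepth and m_NearFar are assumed names, not necessarily what you have):
[java]
// Sketch only: linearize both depth buffers the same way before taking the difference.
uniform sampler2D m_SceneDepth; // depth texture of the non-water viewport
uniform sampler2D m_WaterDepth; // depth texture of the water viewport
uniform vec2 m_NearFar;         // x = near plane, y = far plane
varying vec2 texCoord;

// Same linearization as above, wrapped in a function so it can be reused
float linearDepth(in float zBuffer) {
    float a = m_NearFar.y / (m_NearFar.y - m_NearFar.x);
    float b = m_NearFar.y * m_NearFar.x / (m_NearFar.x - m_NearFar.y);
    return b / (zBuffer - a);
}

void main() {
    float sceneZ = linearDepth(texture2D(m_SceneDepth, texCoord).r);
    float waterZ = linearDepth(texture2D(m_WaterDepth, texCoord).r);

    // Eye-space distance the view ray travels through the water
    float thickness = max(sceneZ - waterZ, 0.0);

    // Visualize it (arbitrary scale); in your shader this would drive the blend factor
    gl_FragColor = vec4(vec3(clamp(thickness / 32.0, 0.0, 1.0)), 1.0);
}
[/java]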


@pspeed Oh, I didn’t do the “depth correction” (?) part. Well, I still got the same effect.
I then tried it by getting the position from the depth for both and then finding the distance between them. Still pretty much the same.
I then realized that “getting the difference between the depths from both viewports at the same texture coordinates” might be the cause of the “color inconsistency”.
Example:

The cameras at positions A and B are looking at the same spot in the water, but they get fairly different hues of blue. This is because from A, the distance between the point where the view ray enters the water and the point where it hits the scene behind the water is larger than from B, so A gets a darker blue.
I then found a good spot to demonstrate this:

This is from a camera looking down at a shallow “lake”. So it’s kinda like B. On the other hand,

this is like A.
Um, sorry if I explained it poorly. I tried.
So in conclusion, it might not have been about depth precision or anything; it was just the way I was doing the water.
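
In other words (just a sketch; waterPos and scenePos are hypothetical names for the world-space positions of the water surface and of the ground behind it):
[java]
// Two ways to measure "depth" at a fragment.

// 1) View-dependent: the distance the eye ray travels through the water.
//    This is what makes camera A see a darker blue than camera B.
float rayDepth(in vec3 waterPos, in vec3 scenePos) {
    return length(scenePos - waterPos);
}

// 2) View-independent: the vertical water column at that point,
//    i.e. surface height minus ground height. Same spot, same color,
//    no matter where the camera stands.
float verticalDepth(in vec3 waterPos, in vec3 scenePos) {
    return waterPos.y - scenePos.y;
}
[/java]
Using something like the second one would make the color depend only on how deep the water column is, not on where the camera is.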

Yes, your modeling of water density is not altogether accurate, I guess. And in real life, the difference between A and B would probably not be that much for the depths involved.

If you are trying to model ocean water then maybe never make it very clear… and then only make it really dark for super-deep water. If you are trying to model clear seas then still the depth gradient should be very long… and never get as dark as you have it.


The water shader computes the water color based on the depth of the water and the depth of the scene. It has a nice “color extinction” algorithm.
Basically, you set at what water depth each color will vanish: first red, then green, then blue. It’s based on a real phenomenon.
Maybe you could pull that out of the water Filter; it’s not the most expensive part of it.
In this article http://www.gamedev.net/page/reference/index.html/_/technical/graphics-programming-and-theory/rendering-water-as-a-post-process-effect-r2642
look at the “Color extinction” part for more explanation.
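
The core of the idea in a few lines (a sketch only; the extinction depths here are made up, the real filter exposes them as parameters):
[java]
// Sketch of per-channel color extinction: each channel fades out at its own depth.
// Red disappears first, then green, then blue.
const vec3 extinction = vec3(5.0, 20.0, 30.0); // world units at which R, G, B vanish

vec3 applyExtinction(in vec3 refraction, in vec3 deepColor, in float waterDepth) {
    // 0 at the surface, 1 once a channel's extinction depth is reached
    vec3 fade = clamp(vec3(waterDepth) / extinction, 0.0, 1.0);
    return mix(refraction, deepColor, fade);
}
[/java]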

If you have the two renders as textures, you should not use alpha; just mix the images.
The water shader finds the position of the fragment in world space from the scene depth.
Then it computes the water depth at this point (it computes a surface point from the water height, then takes the length between the two).
Then the color code is this:
[java]
float depthN = depth * m_WaterTransparency;
float waterCol = saturate(length(m_LightColor.rgb) / m_SunScale);
refraction = mix(mix(refraction, m_WaterColor.rgb * waterCol, saturate(depthN / visibility)),
                 m_DeepWaterColor.rgb * waterCol, saturate(depth2 / m_ColorExtinction));
[/java]
‘depth’ is the water depth; it’s the length between the surface point and the fragment position.
‘refraction’ would be your scene texture without the water.
‘m_SunScale’ and ‘m_LightColor’ are for the specular effect; maybe you don’t need this.
‘visibility’ is a constant set to 3.0.
‘depth2’ is surfacePoint.y - position.y; not sure there is a difference with depth, because the surface point is supposed to be vertically above the position…
‘m_WaterColor’ and ‘m_DeepWaterColor’ are two passed-in colors, initialized by default to a greenish blue and a very dark blue.
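
For the “position in world space from the scene depth” step, the usual approach is to unproject the screen coordinate with the inverse view-projection matrix, something along these lines (a sketch; the uniform name is an assumption):
[java]
// Sketch: reconstruct the world-space position of a fragment from its depth value.
uniform mat4 m_ViewProjectionMatrixInverse; // assumed to be passed in from the camera

vec3 getPosition(in float depth, in vec2 uv) {
    // Back from [0,1] texture/depth range to [-1,1] NDC
    vec4 pos = vec4(uv, depth, 1.0) * 2.0 - 1.0;
    // Unproject and divide by w to get back to world space
    pos = m_ViewProjectionMatrixInverse * pos;
    return pos.xyz / pos.w;
}
[/java]
From that position, the surface point is just the same x/z at the water height, and ‘depth’ is the length between the two points, as described above.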

This algorithm is kinda cheap, and it’s a big part of the “realistic” look of the water.

ofc feel free to look into the Water shader code.
