Shader question

Hello,
This question is related to something I want to do for the ProjectedGrid.

Introduction:
I have water (for simplicity, you can think of it as a quad) with a custom material. There is also a box that intersects the water: part of the box is above water, and part of the box is below water.

The goal is to get the part of the box that is below water projected onto the water texture, while the part of the box that is above water should not be projected onto the water texture. (In the ProjectedGrid this is the refraction effect.)

Currently, to create this effect, there is a separate camera (refractionCam) with the same settings (direction, location, etc.) as the scene camera. The refractionCam supplies what it sees as a texture to the custom water material, and this texture is mixed into the water material in the fragment shader.
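For reference, a minimal sketch of such a setup in jME3 (the names refractionNode and waterMat and the 512×512 size are illustrative, and the refractionCam would still have to be kept in sync with the main camera each frame):

[java]
// Second camera mirroring the main camera's settings
Camera refractionCam = cam.clone();

// Render what the refractionCam sees into an off-screen texture
FrameBuffer refractionBuffer = new FrameBuffer(512, 512, 1);
Texture2D refractionTex = new Texture2D(512, 512, Image.Format.RGBA8);
refractionBuffer.setColorTexture(refractionTex);

ViewPort refractionView = renderManager.createPreView("refraction", refractionCam);
refractionView.setClearFlags(true, true, true);
refractionView.setOutputFrameBuffer(refractionBuffer);
refractionView.attachScene(refractionNode);

// The texture that gets mixed into the water material's fragment shader
waterMat.setTexture("refraction", refractionTex);
[/java]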

To avoid getting geometries above water projected onto the water, the ProjectedGrid simply clips the refraction view at the xz-plane (the average water height). This works very well for flat water (water at the xz-plane), but causes some weird effects for partially submerged geometries when waves are present.
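One way to implement that clipping in jME3 is a clip plane on the refraction camera; a minimal sketch, assuming flat water at y = 0:

[java]
// Keep only the half-space below the xz-plane in the refraction view
Plane waterPlane = new Plane(Vector3f.UNIT_Y, 0f);
refractionCam.setClipPlane(waterPlane, Plane.Side.Negative);
[/java]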

I want to make the refraction effect a bit more precise.

Question:
Can I somehow, in the shader for the water material, find the distance from the camera to the geometries seen by the refractionCam?

I am able to find the distances from the camera to each water vertex in the vertex shader, and this could be supplied to the fragment shader through a varying float.
My idea is that, in the fragment shader, I could compare the distance to the refractionCam geometries with the distance to the water, and only project the refractionCam texture onto the water if the distance to the geometry is greater than the distance to the water.

Any other suggestions for how to achieve what I want are also welcome :)

Just to be exact: I am asking if I can find the distance from the camera to each vertex of the geometries seen by the refractionCam. The distance to the geometry as a whole could be found outside the shader.

You need to pass the depthBuffer of the refractionCam to the shader as well.
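A sketch of that, assuming the refraction view renders into an off-screen FrameBuffer (refractionBuffer, as above): attaching a depth texture makes the refractionCam's per-pixel depth readable in the water shader.

[java]
// Depth attachment for the refraction framebuffer
Texture2D depthTex = new Texture2D(512, 512, Image.Format.Depth);
refractionBuffer.setDepthTexture(depthTex);

// Bound in the fragment shader as m_depthMap
waterMat.setTexture("depthMap", depthTex);
[/java]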

But from your description, it sounds like you are rendering the whole scene twice: once for the refraction cam and once for the main cam.
I am sure @Nehon can give a few more hints, but AFAIK you could also get the effect if you process the water material after the scene has been rendered, and use a copy of the already rendered scene buffers as input for your shader…
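A rough sketch of that route, assuming jME3's Filter post-processing framework (the class and .j3md names here are illustrative): a Filter can ask the FilterPostProcessor for the main cam's already-rendered depth buffer instead of rendering anything twice.

[java]
public class WaterRefractionFilter extends Filter {

    public WaterRefractionFilter() {
        super("WaterRefractionFilter");
    }

    @Override
    protected boolean isRequiresDepthTexture() {
        // The main cam's depth buffer is then bound to the
        // filter material as m_DepthTexture
        return true;
    }

    @Override
    protected void initFilter(AssetManager manager, RenderManager renderManager,
            ViewPort vp, int w, int h) {
        material = new Material(manager, "MatDefs/WaterRefraction.j3md");
    }

    @Override
    protected Material getMaterial() {
        return material;
    }
}
[/java]

The filter would then be added to a FilterPostProcessor on the main viewport; the scene color buffer arrives in its material automatically as m_Texture.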

Add 2: Since I have something on my todo list that would need the same technique, I am also interested in some info about this. Or, if you like, I can offer collaboration until the basic setup is done…


Thank you for your reply!

But from your description, it sounds like you are rendering the whole scene twice: once for the refraction cam and once for the main cam.
I forgot to mention that the refraction view does not attach the rootNode, but attaches the reflectionNode (used for the refraction/reflection effect), which is a subset of the rootNode scene. This limits the number of elements that have to be processed for the refraction.
I am sure @Nehon can give a few more hints, but AFAIK you could also get the effect if you process the water material after the scene has been rendered, and use a copy of the already rendered scene buffers as input for your shader…
If I understand you correctly, I could get the same effect as with the refractionCam if I only input the scene buffers for objects on the reflectionNode. I am not sure how to do this method, though, so unless someone shows me how, I would prefer to use the refractionCam for simplicity for now, at least until the original problem is solved.
Add 2: Since I have something on my todo list that would need the same technique, I am also interested in some info about this. Or, if you like, I can offer collaboration until the basic setup is done…
I would definitely like some collaboration. Do you specifically mean the second approach (post-processing), or the initial problem?

I will start reading up on the depthBuffer approach with the refractionCam and see if I can get that to work.

The benefit of using the main cam’s buffer is that not a single object would be drawn twice… I’ll investigate the basic stuff today and create a git repository.

Well, I also have to target your problem, but I really have performance issues, so I will definitely go for the post-processing route. I don't know if that fits your problem, but since I am also targeting water and heat refraction, I think we face the same problem.

I just need a cheap solution to the problem, with as few draw calls as possible.

I have made an example that solves my initial problem of determining which parts of an object are below sea level, using the depthBuffer from the refractionCam as suggested.

Shader files:
vertex:
[java]
uniform mat4 g_WorldViewProjectionMatrix;

attribute vec3 inPosition;

varying vec4 viewCoords;

void main() {
    vec4 vVertex = vec4(inPosition, 1.0);
    // Clip-space position, also passed to the fragment shader for projective texturing
    viewCoords = g_WorldViewProjectionMatrix * vVertex;
    gl_Position = viewCoords;
}
[/java]
fragment:
[java]
uniform sampler2D m_refraction;
uniform sampler2D m_depthMap;
// jME only binds material parameters whose GLSL names carry the m_ prefix;
// plain zNear/zFar are never set, which is why the values did not arrive
uniform float m_zNear;
uniform float m_zFar;

varying vec4 viewCoords;

void main() {

    // Perspective divide, then bias from NDC [-1, 1] into texture space [0, 1]
    vec2 projCoordDepth = viewCoords.xy / viewCoords.w;
    projCoordDepth = (projCoordDepth + 1.0) * 0.5;
    projCoordDepth = clamp(projCoordDepth, 0.0, 1.0);

    // Linearize the depth-buffer value into an eye-space distance
    float z_b = texture2D(m_depthMap, projCoordDepth).r;
    float z_n = 2.0 * z_b - 1.0;
    float depth = 2.0 * m_zNear * m_zFar / (m_zFar + m_zNear - z_n * (m_zFar - m_zNear));

    vec4 refractionColor = texture2D(m_refraction, projCoordDepth);
    refractionColor.g = 0.1; // debug tint so the refraction texture is easy to spot

    // viewCoords.w is the eye-space distance to the water fragment:
    // only show the refraction texel where its geometry lies behind the water
    if (depth > viewCoords.w) {
        gl_FragColor = refractionColor;
    } else {
        gl_FragColor = vec4(0.1, 0.1, 0.1, 0.3); // plain water color
    }
}
[/java]
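Regarding the zNear/zFar issue: jME exposes material parameters to GLSL with an m_ prefix, so declaring them as m_zNear/m_zFar (as above) and setting them from the camera's frustum should remove the need for hard-coded values. A sketch, assuming the material definition declares Float zNear and Float zFar and waterMat is the water material:

[java]
// Parameters declared as "Float zNear;" / "Float zFar;" in the .j3md
// show up in the shader as m_zNear / m_zFar
waterMat.setFloat("zNear", cam.getFrustumNear());
waterMat.setFloat("zFar", cam.getFrustumFar());
[/java]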

I am also interested in suggestions for how the refraction problem could be solved with post processing.

Have you looked at JME's built-in water post-processor? It seems to do all or most of what you are trying to do, so at best it does the job, and at worst it is something you can look at to see how it was done.
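For reference, a minimal usage sketch of that built-in filter (com.jme3.water.WaterFilter), which does refraction and reflection as a post-process:

[java]
FilterPostProcessor fpp = new FilterPostProcessor(assetManager);

// Reflect/refract the rootNode, lit from the given direction
WaterFilter water = new WaterFilter(rootNode,
        new Vector3f(-0.5f, -0.5f, -0.5f).normalizeLocal());
water.setWaterHeight(0f);

fpp.addFilter(water);
viewPort.addProcessor(fpp);
[/java]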


Good suggestion. I will have a look.