After being annoyed by a white “frame” around refractions in the ProjectedGrid, I started looking for the cause and have come across a possible bug.
I have reproduced the “bug” in a small example where I have a blue box behind a quad. The blue box uses an unshaded blue material, and the quad uses my own material (MyMaterial). The inputs to the MyMaterial shader are a depth texture and a color texture from a camera that displays the blue box. The viewport background is black.
In the shader I expect the pixels of the color texture to match the pixels of the depth texture, so that if the depth is relatively short, the color texture for that fragment will be blue (box), and if the depth is large, the color texture will be black (background).
In the shader I check for this: if the color is blue but the corresponding depth is large, I display a red pixel, and if the color is black but the corresponding depth is short, I display a white pixel.
This normally works, but if I introduce a “distortion” in the pixel coordinate that both the color and depth texture use (as is done for the ProjectedGrid), the bug appears; see pictures below.
vertex shader:
[java]
uniform mat4 g_WorldViewProjectionMatrix;
attribute vec3 inPosition;
varying vec4 viewCoords;
void main() {
    vec4 vVertex = vec4(inPosition, 1.0);
    // Pass the clip-space position on to the fragment shader
    viewCoords = g_WorldViewProjectionMatrix * vVertex;
    gl_Position = viewCoords;
}
[/java]
fragment shader:
[java]
uniform sampler2D m_refraction;
uniform sampler2D m_depthMap;
varying vec4 viewCoords;
void main() {
    // Perspective divide to NDC, then map from [-1, 1] to [0, 1]
    vec2 projCoordDepth = viewCoords.xy / viewCoords.q;
    projCoordDepth = (projCoordDepth + 1.0) * 0.5;

    // Include distortion to reveal the "bug"
    projCoordDepth += vec2(0.01, 0.0);
    projCoordDepth = clamp(projCoordDepth, 0.0, 1.0);

    // Convert the depth-buffer value to eye-space depth (near = 1.0, far = 1000.0)
    float z_b = texture2D(m_depthMap, projCoordDepth).r;
    float z_n = 2.0 * z_b - 1.0;
    float depth = 2.0 * 1.0 * 1000.0 / (1000.0 + 1.0 - z_n * (1000.0 - 1.0));

    vec4 fragColor = vec4(0.0, 0.1, 0.0, 1.0);
    vec4 textureColor = texture2D(m_refraction, projCoordDepth);
    if (textureColor.b > 0.1) {
        if (depth > 50.0) {
            // Blue color but large depth: indicates bug - display red pixel
            fragColor = vec4(1.0, 0.0, 0.0, 1.0);
        } else {
            // Blue box detected, as expected - modify blue component
            fragColor.b = 0.3;
        }
    }
    if (textureColor.b < 0.1) {
        if (depth < 50.0) {
            // Black color but short depth: indicates bug - display white pixel
            fragColor = vec4(1.0, 1.0, 1.0, 1.0);
        } else {
            // Black background detected, as expected - keep dark green
            fragColor = vec4(0.0, 0.1, 0.0, 1.0);
        }
    }
    gl_FragColor = fragColor;
}
[/java]
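For reference, the depth reconstruction above is the standard conversion from a depth-buffer value back to eye-space distance, with near = 1.0 and far = 1000.0 hardcoded. A CPU-side sketch of the same arithmetic (Java, purely illustrative):

```java
public class LinearDepth {
    // Same constants as hardcoded in the fragment shader
    static final float NEAR = 1.0f;
    static final float FAR = 1000.0f;

    // Mirrors: z_n = 2*z_b - 1;  depth = 2*n*f / (f + n - z_n*(f - n))
    static float linearDepth(float z_b) {
        float z_n = 2.0f * z_b - 1.0f;
        return 2.0f * NEAR * FAR / (FAR + NEAR - z_n * (FAR - NEAR));
    }

    public static void main(String[] args) {
        System.out.println(linearDepth(0.0f)); // depth-buffer 0 -> near plane (1.0)
        System.out.println(linearDepth(1.0f)); // depth-buffer 1 -> far plane (1000.0)
    }
}
```

So the `depth > 50.0` threshold sits well inside the frustum; the linearization itself is not the problem.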
Result without including the distortion in the fragment shader:
Result with distortion in the fragment shader:
Note that the bug is not visible for all “distortions”. With an x-offset of 0.2 instead of 0.01, the red line is not present.
This leads me to believe that the shader somehow ends up reading non-matching pixels from the two textures, even though the coordinate input is the same.
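Just a guess on my part, but one way two samplers can return non-matching texels for the identical UV is if the textures differ in resolution, so nearest-neighbour lookup snaps the same coordinate to differently-sized texel grids. A toy illustration of the index math, assuming (hypothetically) a 512-wide color map and a half-size 256-wide depth map:

```java
public class TexelMismatch {
    // Nearest-neighbour texel index for a UV coordinate (hypothetical texture widths)
    static int texelIndex(double u, int width) {
        return (int) Math.min(width - 1, Math.floor(u * width));
    }

    public static void main(String[] args) {
        double u = 0.5 + 0.01; // centre UV plus the 0.01 distortion
        int colorTexel = texelIndex(u, 512); // assumed color-map width
        int depthTexel = texelIndex(u, 256); // assumed half-size depth map
        System.out.println(colorTexel); // 261
        System.out.println(depthTexel); // 130 -> its footprint spans color texels 260-261
    }
}
```

In that scenario the edges of the two texel grids only line up for some offsets, which would fit the observation that 0.01 shows the artifact while 0.2 does not. I have not verified that my textures actually differ in size, though.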
Does anyone know what the cause could be?