Lately I have been playing with procedural solid textures, and things went well except for bump mapping. I am using the world position as the coordinates for sampling the procedural textures; in other words, I have two functions:

vec3 getColor(vec3 worldPos);

float getHeight(vec3 worldPos);
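For concreteness, here is a toy stand-in for such a height function. This is my own Python sketch, not the actual procedure; the sine pattern is just a placeholder for whatever solid noise is really used:

```python
import math

def get_height(p):
    """Toy solid-texture height: a smooth scalar evaluated purely
    from the 3-D world position, with no uv coordinates involved."""
    x, y, z = p
    return 0.5 + 0.5 * math.sin(4.0 * x) * math.sin(4.0 * y) * math.sin(4.0 * z)
```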

One thing that I have a hard time understanding is whether it is possible to construct normals from the getHeight(vec3 wPos) function for objects that use smooth shading.

Currently, my best attempt at constructing the normal is this code:

vec4 quat(vec3 d, vec3 z) {
    //shortest-arc rotation taking d onto z (when applied with qmult below)
    vec4 quat = vec4(cross(z, d), sqrt(dot(z, z) * dot(d, d)) + dot(z, d));
    quat *= inversesqrt(dot(quat, quat)); //normalize
    return quat;
}
vec3 qmult(vec4 q, vec3 v) {
    //apply the rotation encoded by q to v
    return v + 2.0*cross(cross(v, q.xyz) + q.w*v, q.xyz);
}
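As a sanity check on the two helpers above, here is a direct numpy transcription (my own sketch, outside the shader). It confirms that qmult(quat(d, z), v) rotates v along the shortest arc that takes direction d onto direction z:

```python
import numpy as np

def quat(d, z):
    """Shortest-arc quaternion, mirroring the GLSL quat() above.
    Stored as (x, y, z, w), like a GLSL vec4."""
    q = np.concatenate([np.cross(z, d),
                        [np.sqrt(np.dot(z, z) * np.dot(d, d)) + np.dot(z, d)]])
    return q / np.linalg.norm(q)

def qmult(q, v):
    """Rotate vector v by quaternion q, mirroring the GLSL qmult() above."""
    xyz, w = q[:3], q[3]
    return v + 2.0 * np.cross(np.cross(v, xyz) + w * v, xyz)
```

For example, quat((1,0,0), (0,1,0)) applied to (1,0,0) yields (0,1,0): the first argument is rotated onto the second.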
varying vec3 normal; //smoothly interpolated normal
varying vec3 worldPos;
...
vec3 uv = worldPos;
vec3 norm = normalize(normal); //smooth normal
vec3 uvdx = dFdx(uv);
vec3 uvdy = dFdy(uv);
float r = getHeight(uv);
float rx = getHeight(uv+uvdx);
float ry = getHeight(uv+uvdy);
vec3 normF = normalize(cross(uvdx, uvdy)); //Flat normal
//N is bump mapped normal on flat surface
vec3 N = normalize(cross(uvdx + (rx - r)*normF, uvdy + (ry - r)*normF));
//construct rotation from normF to N
vec4 q = quat(normF, N);
//apply rotation on norm
vec3 n = qmult(q, norm);

Result: the edges of the triangles are visible.

Later, I remembered that Blender has a Bump node, so I went ahead and tried a similar experiment in it, hoping the smooth shading would be kept. However, here is the result.

Smooth shading is not preserved after using bump mapping in Blender either.

The question remains: given the fragment's worldPos, the smooth normal, and a height function, is there a way to create smooth bump-mapped normals? Or is there another approach? As you can see from the example above, with my current method I have managed to bump map a flat-shaded surface without uv coordinates and without a per-vertex tangent.

I don't think the height of one point by itself is enough to calculate a normal. You would need to know the height of all of the surrounding "pixels" also.

If your "procedure" doesn't supply normals as well as heights, then I guess your job will be kind of tricky, or you will have to do a lot of multisampling.
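The multisampling idea can be sketched like this (my own illustration, with a hypothetical get_height and an arbitrarily chosen tangent basis): take central differences of the height on either side of the shading point and tilt the smooth normal by the resulting slopes:

```python
import numpy as np

def bump_normal(get_height, p, n, eps=1e-3):
    """Perturb smooth unit normal n at world position p using central
    differences of a scalar height function (hypothetical helper)."""
    # Build an arbitrary tangent basis around n.
    t = np.cross(n, [0.0, 1.0, 0.0])
    if np.linalg.norm(t) < 1e-6:          # n parallel to +Y: pick another axis
        t = np.cross(n, [1.0, 0.0, 0.0])
    t /= np.linalg.norm(t)
    b = np.cross(n, t)
    # Central differences of the height along the tangent plane.
    dhdt = (get_height(p + eps * t) - get_height(p - eps * t)) / (2.0 * eps)
    dhdb = (get_height(p + eps * b) - get_height(p - eps * b)) / (2.0 * eps)
    # Tilt the normal against the height gradient.
    n2 = n - dhdt * t - dhdb * b
    return n2 / np.linalg.norm(n2)
```

With a constant height this returns n unchanged; a height ramp tilts the normal away from the direction of increase.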

Edit: and whether or not smooth shading is maintained will depend on how consistently your values are calculated. I guess your vertices are not shared?

Thank you for the reply. Yes, the vertices are shared and the original vertex normal is smoothly interpolated. But I do not have uv coordinates, which could be used to calculate tangents and then the TBN matrix.

Actually, I do know the height of the surrounding pixels. Yet I still get visible seams along the triangle edges.

Edit: I do have texture coordinates, but not (u,v): rather (u,v,t), the world position of the fragment. If I could calculate the tangent, then I could create a TBN matrix and see whether I get different results.

It's just weird, because if the edge of two triangles shares the same vertices, then on some level the fragments along that edge should be calculated the same.

If the vertices are not shared, then they might get calculated differently.

Alternately, there is some issue with the math that changes as it approaches the edge… so it's not really the edge that isn't blending, but the fragments near the edge.

You should be able to create a greatly simplified example quad that illustrates (or not) the issue by giving it normals that give it a curve.

Edit: note that the tangents are a bit arbitrary in this case… I think probably the normals themselves are not even right. And as long as you aren't parallax sampling the texture, you don't need them. So even if you ultimately plan to parallax sample some texture (then why the procedural generation?), you do not need TBN for testing N… only N, really. At the very least, you could get N right before proceeding.

I had a similar issue when I was procedurally generating my geometry. The edges were brutally obvious.

The solution I came up with was, for the edges, to calculate two normals using positions from either side of the edge, add the two normals, and normalize that vector. It's not perfect, but it looks a heck of a lot better.
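That idea can be sketched as follows (my own illustration with a hypothetical vertex layout): the two triangles share the edge p0–p1, with one extra vertex on each side, and the edge normal is the normalized sum of the two unit face normals:

```python
import numpy as np

def edge_normal(p0, p1, p_left, p_right):
    """Blend the face normals of the two triangles sharing edge p0-p1,
    with p_left and p_right the remaining vertex of each triangle."""
    e = p1 - p0
    # Face normals, wound so both point to the same side of the surface.
    n_left = np.cross(e, p_left - p0)
    n_right = np.cross(p_right - p0, e)
    # Sum of the unit normals, renormalized.
    n = n_left / np.linalg.norm(n_left) + n_right / np.linalg.norm(n_right)
    return n / np.linalg.norm(n)
```

For two coplanar triangles, this reduces to the shared face normal; across a crease, it gives the halfway direction.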

Not sure if that'd help you in this case, but I thought I'd throw that in.

It occurs to me that the faceting may come from how you are calculating your procedural values.

The triangles actually ARE flat. So if you imagined slicing up a hunk of wood with triangles, you'd be sampling the 3D wood texture in slices and not in curves. So in theory it could be related to sampling. You may have to project out uvt based on the world normal.

The normal will be linearly interpolated along the face anyway… when you normalize it, then maybe you can get the difference between normalized and unnormalized and adjust uvt by that amount.

The uvdx, uvdy values are incorrect at edges due to how dFdx and dFdy are implemented, I think. If I understand correctly, dFdx and dFdy return the derivative within the currently rasterized triangle, and thus produce incorrect results at edges.

If this is the case, then I could try to render worldPos, normals, and height into a buffer, and in a lighting post-process rotate the normals with derivatives obtained by sampling the worldPos buffer.
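A minimal sketch of that deferred idea (my own illustration, assuming an H x W x 3 world-position G-buffer): replace the per-triangle dFdx/dFdy with central differences across neighboring texels, which do not restart at triangle boundaries:

```python
import numpy as np

def worldpos_derivatives(world_pos, x, y):
    """Screen-space derivatives of an H x W x 3 world-position buffer
    at interior texel (x, y), via central differences."""
    ddx = (world_pos[y, x + 1] - world_pos[y, x - 1]) * 0.5
    ddy = (world_pos[y + 1, x] - world_pos[y - 1, x]) * 0.5
    return ddx, ddy
```

Depth discontinuities in the buffer would still need special handling, since the difference would then straddle two unrelated surfaces.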

PS: I have tried not using dFdx and dFdy, as described, but it did not fix the issue.

Yes, that is definitely one of the issues. I have managed to find and solve another one. Instead of this:

vec4 q = quat(normF, N); //construct rotation from normF to N
vec3 n = qmult(q, norm); //apply rotation on norm

I have used this:

vec4 q = quat(normF, norm);
vec3 n = qmult(q, N);

The result is in the image. Several previously visible edges are no longer seen unless the cylinder is viewed at an angle. This time I believe the issue is with the mapping, as you mentioned.