[SOLVED] Spherical normal mapping problem

I’m trying to render a sphere (Earth) using normal maps to accentuate mountain ranges and so on. I’m encountering a weird lighting problem when normal maps are enabled. The problem is that the bright spot (“shine”, for this post) appears to move and change shape when the sphere rotates in place.



The camera and light source (directional) stay still. The sphere rotates in place, directly in front of the camera.



The result when normal maps are OFF: http://i.imgur.com/aaSZZ.png



The result when normal maps are ON (flat normal map):

http://i.imgur.com/OEkWn.png

http://i.imgur.com/rY79X.png



I expect the shine to stay in the center of the screen in both cases. Am I wrong to assume that? Am I making some stupid mistake in my code?



Code:

The code is almost entirely from the “Hello Material” tutorial. I used a very low-res “diffuse.png” and “normal.png” to demonstrate; they can be found here: http://i.imgur.com/mqGOn.png ; http://i.imgur.com/5Ystb.png



[java]import com.jme3.app.SimpleApplication;
import com.jme3.light.DirectionalLight;
import com.jme3.material.Material;
import com.jme3.math.ColorRGBA;
import com.jme3.math.Vector3f;
import com.jme3.scene.Geometry;
import com.jme3.scene.Node;
import com.jme3.scene.shape.Sphere;

public class LightingBug extends SimpleApplication {

    public static void main(String[] args) {
        new LightingBug().start();
    }

    private Node node;

    @Override
    public void simpleInitApp() {
        // Set up a sphere...
        Geometry geo = new Geometry("s", new Sphere(16, 32, 1f));

        Material mat = new Material(assetManager, "Common/MatDefs/Light/Lighting.j3md");
        mat.setTexture("DiffuseMap", assetManager.loadTexture("diffuse.png"));
        mat.setTexture("NormalMap", assetManager.loadTexture("normal.png")); // try commenting out this line...

        mat.setBoolean("UseMaterialColors", true);
        mat.setColor("Specular", ColorRGBA.White);
        mat.setColor("Diffuse", ColorRGBA.White);
        mat.setFloat("Shininess", 5f);

        geo.setMaterial(mat);

        node = new Node("n");
        node.attachChild(geo);
        rootNode.attachChild(node);

        // Set up a directional light shining along +Y, i.e. from behind the camera toward the sphere...
        DirectionalLight light = new DirectionalLight();
        light.setDirection(Vector3f.UNIT_Y);
        light.setColor(ColorRGBA.White);
        rootNode.addLight(light);

        // Camera below the sphere on the Y axis, looking at the origin with +Z as "up".
        cam.setLocation(new Vector3f(0, -3, 0));
        cam.lookAt(Vector3f.ZERO, Vector3f.UNIT_Z);
    }

    @Override
    public void simpleUpdate(float tpf) {
        // Slowly spin the sphere in place about its Z axis.
        node.rotate(0, 0, tpf / 10);
    }
}[/java]



Additional details:

  • The problem also appears if you use a Cylinder, though it is less obvious; I was not able to see the problem with a Quad.
  • The problem also appears if the Sphere is stationary and the camera moves around it, though again it is less obvious.
  • I have tried both an older version of JME3 (2012-06-11 nightly) and a newer version (2012-08-22 nightly); the behaviour is exactly the same in both.
  • Adding a specular map does not fix the problem.

Try:

http://hub.jmonkeyengine.org/javadoc/com/jme3/util/TangentBinormalGenerator.html
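
In the program posted above that amounts to one extra line right after the Geometry is created; a quick sketch based on the snippet in this thread, not tested here:

[java]import com.jme3.util.TangentBinormalGenerator;

// ...in simpleInitApp(), right after the Geometry is created:
Geometry geo = new Geometry("s", new Sphere(16, 32, 1f));

// Generate per-vertex tangents for the sphere mesh so the Lighting
// material has a tangent-space frame to interpret the NormalMap with.
TangentBinormalGenerator.generate(geo.getMesh());[/java]

Without that call the mesh has no tangent data, so the shader has nothing consistent to build the tangent space from.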


That fixes the problem (thanks!). Is there an explanation I can read? I’ve never heard of tangents or binormals being important to rendering (yes, I’m pretty new to rendering).

When you use things like normal maps and bump maps, there needs to be a “tangent space” so that the shader knows how to interpret the values. The regular normal isn’t enough, because if the object is rotated the normal-map normals won’t rotate with it… so there needs to be some reference frame. Tangent space is defined by the surface normal, a tangent vector, and a binormal… generally forming orthogonal axes.



Hopefully that at least gives you enough to google around. All normal mapping and bump mapping will require it because they need to know the full orientation of the drawn “pixel” and not just which direction it’s pointing.
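
If it helps to see it spelled out, here is a rough illustration using jME3’s math classes of what the shader effectively does with that frame. The vectors below are made up for the example, and the real transform happens inside the Lighting shader, not in Java:

[java]import com.jme3.math.Matrix3f;
import com.jme3.math.Vector3f;

public class TangentSpaceDemo {
    public static void main(String[] args) {
        // Per-vertex frame (normally produced by TangentBinormalGenerator
        // and interpolated by the shader). Example values only.
        Vector3f normal   = new Vector3f(0, 0, 1);  // surface normal
        Vector3f tangent  = new Vector3f(1, 0, 0);  // follows the U texture direction
        Vector3f binormal = normal.cross(tangent);  // completes the orthogonal frame

        // A normal-map texel: RGB in [0,1] remapped to [-1,1].
        Vector3f sample = new Vector3f(0.5f, 0.5f, 1.0f)
                .multLocal(2f)
                .subtractLocal(new Vector3f(1, 1, 1));

        // TBN matrix with tangent/binormal/normal as columns takes the sample
        // from tangent space into whatever space the frame is expressed in
        // (object or world space, depending on where T, B and N live).
        Matrix3f tbn = new Matrix3f();
        tbn.setColumn(0, tangent);
        tbn.setColumn(1, binormal);
        tbn.setColumn(2, normal);

        Vector3f shadingNormal = tbn.mult(sample).normalizeLocal();
        System.out.println("Normal used for lighting: " + shadingNormal);
    }
}[/java]

The point is that a normal-map sample means nothing on its own; it only becomes a usable shading normal once it is expressed in the frame built from the tangent, binormal, and normal, and that per-vertex tangent data is exactly what TangentBinormalGenerator adds to the mesh.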

I can’t believe I missed that note in “Hello Materials” – that’s embarrassing. Thanks for your help!