# Normals / Normal Maps and Custom Meshes

Good evening, lovely Monkey Community! I have a few questions regarding Normals / Normal Maps. Since I want to build my own meshes programmatically, I guess I need to understand what e.g. Blender does when I apply a Normal Map to a model.

• Normal Maps are often used when you want the look of a high-poly model, baked onto a low-poly model.
• As far as I know from the Custom Mesh article, you can assign one Normal Vector per Vertex.
• But here is the question: if this is a 1:1 allocation, how can you apply the Normal Map of the high-poly mesh (= many vertices) to the low-poly mesh?

My understanding is that you must have multiple Normal Vectors in between the low-poly vertices to achieve more detail while keeping the vertex count low.

If you have some sources to read, I would appreciate that. I did some research, but the articles I found only explained the rough concept of Normal Maps, or how to use tools to create and apply them. I am looking for the background. Is there a general source for such background knowledge?

Thanks a lot!

A normal map is a texture where each pixel represents a normal.

Googling “normal map” will probably give you a wealth of material to read.

Also if you wanna see a really quick example check out

http://cpetry.github.io/NormalMap-Online/

Turn off specular, AO and displacement on the right, and you can rotate the cube and see exactly what the normal map is doing: it's just changing the lighting to make it look like there is more detail there, but it's ultimately still flat.

X: -1 to +1 : Red: 0 to 255
Y: -1 to +1 : Green: 0 to 255
Z: 0 to -1 : Blue: 128 to 255

A normal pointing directly towards the viewer (0,0,-1) is mapped to
(128,128,255). Hence the parts of an object directly facing the viewer are
light-blue. This is the most common color in a normal map.

A normal pointing to top right corner of the texture (1,1,0) is
mapped to (255,255,128). Hence the top-right corner of an object is
usually light-yellow. The brightest part of a normal map.

A normal pointing to right of the texture (1,0,0) is mapped to
(255,128,128). Hence the right edge of an object is usually light-red.

A normal pointing to top of the texture (0,1,0) is mapped to
(128,255,128). Hence the top edge of an object is usually light-green.

A normal pointing to left of the texture (-1,0,0) is mapped to
(0,128,128). Hence the left edge of an object is usually light-cyan.

A normal pointing to bottom of the texture (0,-1,0) is mapped to
(128,0,128). Hence the bottom edge of an object is usually
light-magenta.

A normal pointing to bottom left corner of the texture (-1,-1,0) is
mapped to (0,0,128). Hence the bottom-left corner of an object is
usually blue. The darkest part of a normal map.
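The mapping above can be sketched in code. This is a minimal illustration of the stated ranges (the class and method names are made up for this example, not any engine API):

```java
public class NormalEncode {

    /**
     * Encode a tangent-space normal into RGB bytes using the ranges above:
     * X: -1..+1 -> Red 0..255, Y: -1..+1 -> Green 0..255, Z: 0..-1 -> Blue 128..255.
     */
    public static int[] normalToRGB(float x, float y, float z) {
        int r = Math.round((x + 1f) / 2f * 255f); // -1 -> 0, 0 -> 128, +1 -> 255
        int g = Math.round((y + 1f) / 2f * 255f);
        int b = Math.round(128f + (-z) * 127f);   //  0 -> 128, -1 -> 255
        return new int[]{r, g, b};
    }

    public static void main(String[] args) {
        // A normal facing the viewer (0,0,-1) becomes the typical light-blue.
        System.out.println(java.util.Arrays.toString(normalToRGB(0, 0, -1)));
    }
}
```

Running the examples from the list through this function reproduces the colors given above, e.g. (1,1,0) → (255,255,128) and (-1,-1,0) → (0,0,128).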

Thank you. That Wikipedia article was very interesting.
I already know the rough concept of Normal Maps: each pixel represents the direction of a Normal Vector, and the lighting is changed to make it appear so. But what exactly the colour values do is explained very well.

You remember, I want to do all this programmatically. My problem is that I can only assign as many Normals as there are Vertices on the mesh, according to the wiki article: Custom Meshes

```java
float[] normals = new float[]{0,0,1, 0,0,1, 0,0,1, 0,0,1};
mesh.setBuffer(Type.Normal, 3, BufferUtils.createFloatBuffer(normals));
```

You need to specify as many normals as the polygon has vertices. For a flat quad, the four normals point in the same direction.

The example is a 4-vertex plane and the Normals are {0,0,1, 0,0,1, 0,0,1, 0,0,1}. That means I could only assign 4 Normal Vectors to it, which is not much.
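To see why four normals give so little detail: the GPU interpolates the vertex normals from the Type.Normal buffer across the surface for every fragment in between. A rough sketch of that blending in plain Java (names made up, not an engine API):

```java
public class NormalLerp {

    /**
     * Linearly interpolate between two vertex normals and renormalize,
     * roughly what the rasterizer does between two vertices.
     */
    public static float[] lerpNormal(float[] a, float[] b, float t) {
        float x = a[0] + (b[0] - a[0]) * t;
        float y = a[1] + (b[1] - a[1]) * t;
        float z = a[2] + (b[2] - a[2]) * t;
        float len = (float) Math.sqrt(x * x + y * y + z * z);
        return new float[]{x / len, y / len, z / len};
    }

    public static void main(String[] args) {
        // With identical normals, as in the flat quad example, every
        // interpolated normal is still (0,0,1): smooth shading adds no detail.
        float[] mid = lerpNormal(new float[]{0, 0, 1}, new float[]{0, 0, 1}, 0.5f);
        System.out.println(java.util.Arrays.toString(mid));
    }
}
```

A normal map sidesteps this limit: instead of the interpolated vertex normal, the shader reads a normal per pixel from the texture.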

So the way to go would be to programmatically generate a Normal Texture and apply that to the mesh?
With the knowledge of the colour values this is possible. I wonder, though, why that limit applies to code-specified Normals but not to Normal Maps...
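Generating such a texture in code could start like this: a minimal sketch (class name made up) that builds raw RGB8 pixel data for a "flat" normal map, every texel encoding (0,0,-1) → (128,128,255) per the mapping quoted above. In jME3 you could then, as an untested follow-up, wrap the bytes in an `Image`/`Texture2D` and assign it with `mat.setTexture("NormalMap", tex)` on a Lighting material:

```java
import java.util.Arrays;

public class FlatNormalMap {

    /**
     * Build raw RGB8 pixel data for a width*height normal map where every
     * texel encodes the viewer-facing normal (0,0,-1) -> (128,128,255).
     */
    public static byte[] flatNormalPixels(int width, int height) {
        byte[] data = new byte[width * height * 3];
        for (int i = 0; i < data.length; i += 3) {
            data[i]     = (byte) 128; // R: x = 0
            data[i + 1] = (byte) 128; // G: y = 0
            data[i + 2] = (byte) 255; // B: z = -1, straight towards the viewer
        }
        return data;
    }

    public static void main(String[] args) {
        // A 2x2 flat normal map: 12 bytes of (128,128,255) texels.
        System.out.println(Arrays.toString(flatNormalPixels(2, 2)));
    }
}
```

From there, perturbing individual texels (e.g. tilting normals along a bump) is just writing different RGB values into the array, far more of them than the mesh has vertices.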

Because they are two different things: the normals you are talking about are vertex attributes, the others are textures…

Oh ok, so they are processed differently by the GPU. That explains much. I had always thought that Normal Maps are applied as Normal Vectors to the mesh (in Java) before they are processed by the GPU.

Thank you very much!  