Normals are needed for lighting. They are not used for face culling; only winding order is used for face culling.
But to know how much light is hitting a surface, you need to know which way it's facing, i.e. the normal.
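As a minimal sketch of that idea (plain Python, hypothetical helper names, not any particular engine's API): the classic Lambertian diffuse term is just the cosine between the normal and the light direction.

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def lambert(normal, light_dir):
    """Diffuse intensity in [0, 1]; clamped so light from behind contributes 0."""
    return max(0.0, dot(normalize(normal), normalize(light_dir)))

# Surface facing straight up, light directly overhead -> full intensity.
print(lambert((0, 1, 0), (0, 1, 0)))   # 1.0
# Light at 90 degrees to the normal -> no diffuse contribution.
print(lambert((0, 1, 0), (1, 0, 0)))   # 0.0
```

Without the normal there's simply no way to evaluate that dot product, which is why it has to be stored (or derived) per vertex.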
A common next question is why the normal isn't just calculated in the shader or whatever:
Since polygonal surfaces are only approximations of potentially curved surfaces, normals generally cannot be calculated from a triangle directly: there needs to be some understanding of the curvature of the underlying surface. (Even a dumb straight average of the shared triangles' face normals is not enough.)
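To make that concrete, here's a small sketch (plain Python, hypothetical helpers): you can always compute a triangle's flat geometric normal with a cross product, but for a mesh approximating a sphere the *true* normal at each vertex is the radial direction, which the triangle alone can't tell you.

```python
import math

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def face_normal(p0, p1, p2):
    """Flat geometric normal of a triangle (direction depends on winding)."""
    return normalize(cross(sub(p1, p0), sub(p2, p0)))

# Three points on a unit sphere near the 'north pole'. On a unit sphere the
# true surface normal at a point is the point itself (the radial direction).
p0 = normalize((0.0, 1.0, 0.1))
p1 = normalize((0.1, 1.0, -0.1))
p2 = normalize((-0.1, 1.0, -0.1))

flat = face_normal(p0, p1, p2)
true_n0 = p0  # what the curved surface actually wants at p0
# The flat normal is one direction for the whole triangle, while the true
# normals differ per vertex -- the mesh geometry alone can't recover them.
print(flat, true_n0)
```

The two vectors come out close but not equal; across a whole sphere those small per-vertex differences are exactly what makes it look smooth instead of faceted.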
A tangent helps orient the texture in 3D space. It should essentially point along the 'x axis' of the texture in 3D space. Again, this is necessary because surfaces curve and/or triangles may be oriented in any direction relative to the square texture. The binormal is orthogonal to both the normal and the tangent and can generally be calculated from them.
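A sketch of the standard per-triangle tangent derivation (plain Python, hypothetical function name; real tools like MikkTSpace do more work, e.g. averaging and handedness handling): solve for the 3D direction along which the texture's u coordinate increases, using the triangle's positions and UVs.

```python
import math

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def compute_tangent(p0, p1, p2, uv0, uv1, uv2):
    """Per-triangle tangent: the 3D direction in which u increases.
    Assumes the triangle's UVs are not degenerate (nonzero UV area)."""
    e1, e2 = sub(p1, p0), sub(p2, p0)
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
    r = 1.0 / (du1 * dv2 - du2 * dv1)
    return normalize(tuple((e1[i] * dv2 - e2[i] * dv1) * r for i in range(3)))

# Triangle in the xz-plane whose u axis runs along +x and v along +z:
t = compute_tangent((0, 0, 0), (1, 0, 0), (0, 0, 1),
                    (0, 0), (1, 0), (0, 1))
print(t)   # (1.0, 0.0, 0.0)
```

In practice these per-triangle tangents are averaged per vertex (like normals) and often stored with a sign bit for the binormal's handedness.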
The normal, tangent, and binormal represent the 3D axes of ‘texture space’… where the tangent is x (or u), the binormal is y (or v), and the normal is depth.
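Those three axes form the TBN basis, and using it amounts to a change of basis. A minimal sketch (plain Python, hypothetical names; sign conventions for the binormal vary between engines): take a tangent-space vector, such as a normal-map sample, and express it in the space the normal and tangent live in.

```python
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def tbn_transform(normal, tangent, ts_vec):
    """Move a tangent-space vector (e.g. a normal-map sample) into the space
    the normal/tangent are expressed in. Assumes normal and tangent are unit
    length and orthogonal; the binormal completes the basis."""
    binormal = cross(normal, tangent)
    # The TBN matrix has T, B, N as its columns:
    # x scales the tangent, y the binormal, z the normal.
    return tuple(
        tangent[i] * ts_vec[0] + binormal[i] * ts_vec[1] + normal[i] * ts_vec[2]
        for i in range(3)
    )

# A 'flat' normal-map texel points straight out of the texture, (0, 0, 1) in
# tangent space, and lands exactly on the surface normal:
n, t = (0.0, 1.0, 0.0), (1.0, 0.0, 0.0)
print(tbn_transform(n, t, (0.0, 0.0, 1.0)))   # (0.0, 1.0, 0.0)
```

This is exactly the mapping described above: u goes along the tangent, v along the binormal, and the texture's 'depth' direction along the normal.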
I think even in PBR you only really need them if you have normal maps or bump maps, because those need to know how to orient their bumps/local normals properly. Maybe roughness needs them? (I suspect not, since lighting should be computed in world space, not texture space. Texture space is only needed to interpret certain textures, i.e. normal-map and bump-map textures.)