Hi, I’m writing some lighting shaders purely as a learning exercise. I’m not posting the whole shader since I don’t expect anyone to debug it for me, but there is one thing that has me stuck:
material.setVector3("SpotDirection", new Vector3f(0f, -1f, 0f));
I’m passing in my own values, and this is the spot’s direction, so essentially straight down. I convert it from world space to view space (the tutorial I’m following does everything in view space) and then normalize it:
vec3 dir = (g_ViewMatrix * vec4(m_SpotDirection, 1.0)).xyz;
vec3 D = normalize(dir);
Alternatively, if I send in -100 instead of -1:
material.setVector3("SpotDirection", new Vector3f(0f, -100f, 0f));
With -100 the spot works perfectly; with -1 it fires off down the Z axis.
Assuming I’m not screwing something up elsewhere, should -1 versus -100 make any difference when the vector is normalized? Since the other components are 0, I figured that whatever g_ViewMatrix * SpotDirection produces would still be (0, x, 0), with the vector scaled to unit length by the normalize.
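Just to spell out what I expect in plain numbers, here’s a minimal, self-contained sketch (no jME dependency, plain arrays instead of Vector3f) of the normalization reasoning: the magnitude of the input should wash out entirely.

```java
public class NormalizeCheck {
    // Normalize a 3-component vector and return the unit vector.
    static double[] normalize(double x, double y, double z) {
        double len = Math.sqrt(x * x + y * y + z * z);
        return new double[] { x / len, y / len, z / len };
    }

    public static void main(String[] args) {
        double[] a = normalize(0, -1, 0);
        double[] b = normalize(0, -100, 0);
        // Both inputs point straight down, so both should normalize
        // to the same unit vector (0, -1, 0).
        System.out.printf("a = (%.1f, %.1f, %.1f)%n", a[0], a[1], a[2]);
        System.out.printf("b = (%.1f, %.1f, %.1f)%n", b[0], b[1], b[2]);
    }
}
```

So if normalization alone were in play, -1 and -100 would have to behave identically, which is why the difference I’m seeing confuses me.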