Normalize vector

Hi, I’m making some light shaders purely as a learning exercise. I’m not posting the whole shader, since I don’t expect someone to debug it for me, but there is one thing that has me stuck:

material.setVector3("SpotDirection", new Vector3f(0f, -1f, 0f));

I’m sending my own values in, and this is the spot’s direction, so essentially straight down. I convert it from world to view space (the tutorial I’m following does everything in view space), then normalize it:

vec3 dir = vec4(g_ViewMatrix * vec4(m_SpotDirection,1.0)).xyz;
vec3 D = normalize(dir);

Alternatively, if I send in -100 instead of -1:

material.setVector3("SpotDirection", new Vector3f(0f, -100f, 0f));

With -100 the spot works perfectly, but with -1 it fires off down the Z axis.

Assuming I’m not screwing it up elsewhere, should -1 vs. -100 make any difference once the vector is normalized? Since the other components are 0, I figured that whatever g_ViewMatrix * m_SpotDirection gives would still be (0, x, 0), with x rescaled to ±1 by the normalize.
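To spell out what I expected (just a sketch with made-up values, as it would sit inside main()):

vec3 a = normalize(vec3(0.0, -1.0, 0.0));   // gives (0.0, -1.0, 0.0)
vec3 b = normalize(vec3(0.0, -100.0, 0.0)); // also gives (0.0, -1.0, 0.0)
// normalize() only rescales to unit length; it never changes the direction.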

When the other values are 0, then no; when they aren’t, then obviously yes.

Why are you using the vec4 in your calculation there anyway?


Oh. Big Oh.

I am doing it to multiply by the mat4 g_ViewMatrix, and when I looked at it to check it, I saw that 1.0 you mentioned :P

Changed it to a zero and it’s much better. Thanks!
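For anyone finding this later, the fixed line is just the same transform with w = 0.0 (the outer vec4() cast was redundant, so I dropped it too):

vec3 dir = (g_ViewMatrix * vec4(m_SpotDirection, 0.0)).xyz;
vec3 D = normalize(dir);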

The 1.0 in the fourth component is the devil; use 0.0. With w = 1.0 the view matrix’s translation gets added to your vector. Next to a length of 100 that translation is negligible, so after normalizing you still get an OK direction, but with (0, -1, 0, 1) the normalized value is definitely off.
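To make that concrete, here’s a toy sketch with an identity rotation and a made-up camera translation of (5, 0, 3); the numbers are purely for illustration:

void demo() {
    mat4 view = mat4(1.0);              // identity rotation
    view[3] = vec4(5.0, 0.0, 3.0, 1.0); // fourth column = translation

    // w = 1.0 treats the input as a point, so the translation leaks in:
    vec3 asPoint = normalize((view * vec4(0.0, -1.0, 0.0, 1.0)).xyz);
    // ~(0.85, -0.17, 0.51): mostly sideways, not down

    // a long vector drowns the translation out, which is why -100 "worked":
    vec3 asLongVec = normalize((view * vec4(0.0, -100.0, 0.0, 1.0)).xyz);
    // ~(0.05, -1.0, 0.03): close enough to down

    // w = 0.0 applies only the rotation part, the right thing for directions:
    vec3 asDir = normalize((view * vec4(0.0, -1.0, 0.0, 0.0)).xyz);
    // exactly (0.0, -1.0, 0.0)
}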

Ahh, the ninja edit by me. Thanks though!

magic w strikes again