Camera Bindings for shader nodes


I'm trying to create a custom particle shader using the CameraLeft and CameraUp uniform bindings,
but when I do, the resulting meshes appear distorted, when they appear at all.

I replaced these two bindings in my shader code with vec3(1,0,0) and vec3(0,1,0) and it works (partially, of course).

So I was wondering whether camera uniforms are correctly injected in the shader nodes system, or supported at all.


void main(){
    vec4 pos = vec4(, 1.0);
    pos.xyz += texCoord2.x * vec3(1.0, 0.0, 0.0) /*cameraLeft*/
             + texCoord2.y * vec3(0.0, 1.0, 0.0) /*cameraUp*/;
    vertexPosition = pos;
}

[edit: I tried injecting camera.getLeft() and camera.getUp() as material params, and it appears to work, so there is obviously an issue with shader nodes]
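For reference, a minimal sketch of that material-param workaround (the parameter names `CamLeft` and `CamUp` are assumptions; they must match `Vector3` entries declared in the MaterialParameters block of your .j3md):

```java
// Sketch only: assumes the .j3md declares
//   Vector3 CamLeft
//   Vector3 CamUp
// in its MaterialParameters block.
Camera cam = app.getCamera();
material.setVector3("CamLeft", cam.getLeft());
material.setVector3("CamUp", cam.getUp());
// If the camera moves, these must be refreshed each frame,
// e.g. from simpleUpdate().
```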

To inject them you need to add CameraLeft and CameraUp to the list of WorldParams of the technique.
Then you need to use them in the shader as uniforms.
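A minimal sketch of what that looks like, assuming jME's usual convention that world parameters are exposed to GLSL with a g_ prefix:

```glsl
// In the .j3md technique:
//   WorldParameters {
//       CameraLeft
//       CameraUp
//   }

// In the vertex shader, the matching uniforms:
uniform vec3 g_CameraLeft;
uniform vec3 g_CameraUp;
```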


If this is what you did and you still have an issue, then there is indeed a problem.

Also, if this shader is used for a filter material (can't tell from your post), you may have hit one of the limitations of 3.0 that has been changed in 3.1:
global camera information for filters was that of the parallel-projection camera created to render the filter. This was problematic because you had to pass the real scene camera information yourself.

In 3.1, we use the same camera to render the filter, with some math tricks in the vertex shader, so you can use globals and get correct scene camera information.
So maybe you hit that issue.

ok I’ll try that

What do you call a filter? I am curious…

A post-process filter like SSAO, Bloom, etc., that works with the FilterPostProcessor.