Also a good test case to fix the issue in the ShaderNode editor.
EDIT: the problem in the editor comes from the variables you called m_something. There is a place where I strip that prefix from the variable names, and that messes everything up.
As a workaround, you can remove all m_ prefixes from your variable names.
I’m gonna make a more robust fix.
Thanks Momoko_Fan, I guess global parameters would be nice, but they aren’t really necessary here.
I do have most things working; I updated the git repository.
If you want to run the test app, you can switch between the different light types with “space”, switch between the new shadow renderers and the old ones with “1”, and stop the shadow light from moving with “2”.
A picture of the spot light:
I’m currently using new classes for the lights which extend the old light classes (they only implement an empty marker interface).
The other thing I did was extend the Material class, where I use updateLightListUniforms() to find the shadow-casting light and set the material parameter for the shader.
(Well and obviously new shadow renderers)
This isn’t optimal, but that way it works without having to modify jMonkeyEngine’s core.
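To make the marker-interface approach concrete, here is a minimal, self-contained sketch. The class names (Light, ShadowPointLight, ShadowCastingLight, findShadowCaster) are simplified stand-ins for illustration, not the actual jME classes; the real version would do this scan inside the extended Material’s updateLightListUniforms():

```java
import java.util.List;

public class ShadowLightDemo {
    // Simplified stand-in for a light class; not the real com.jme3.light.Light.
    static class Light {
        final String name;
        Light(String name) { this.name = name; }
    }

    // Empty marker interface: tags a light as a shadow caster without
    // touching the existing light API.
    interface ShadowCastingLight {}

    // "New" light class extending the old one, implementing only the marker.
    static class ShadowPointLight extends Light implements ShadowCastingLight {
        ShadowPointLight(String name) { super(name); }
    }

    // What the extended Material would do while updating light uniforms:
    // scan the light list and pick out the shadow-casting light.
    static Light findShadowCaster(List<Light> lights) {
        for (Light l : lights) {
            if (l instanceof ShadowCastingLight) {
                return l; // the material parameter would be set here
            }
        }
        return null;
    }

    public static void main(String[] args) {
        List<Light> lights = List.of(new Light("ambient"),
                                     new ShadowPointLight("spot"));
        Light caster = findShadowCaster(lights);
        System.out.println(caster == null ? "none" : caster.name); // prints "spot"
    }
}
```

The point of the empty interface is exactly what the post describes: the engine core never has to know about it, only the extended Material does an instanceof check.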
Now, one thing where I don’t know if there is already a better solution: my shader file looks rather messy.
My problem is that the vertex and fragment shader nodes have a lot of inputs and outputs, and most of them are just parameters that will be set by the different shadow renderers.
It would be really great if there were an option to predefine those parameters, so that the user would have only two inputs and one output on the shader node instead of twenty that always have to be configured the same way.
Another slightly annoying thing: the node editor sometimes removes my defines from the j3md file (I couldn’t reliably reproduce when it happens, but I think it was usually after the shader failed to compile).
For my own use case everything is working well enough now, so I’ll move on to creating a few shader nodes to support hardware skinning, instancing, normal mapping, and all those things (which I’ll also upload, since as far as I can tell most of those are missing from the provided shader nodes).
Definitely want to have this in core… The difference of quality between the current implementation of shadows and this is tremendous.
I still think the “correct” solution would be to use global material parameters, but in reality it doesn’t really give much gain in this case. Essentially your system works the same way as the current soft particle system, where it “injects” a texture into all materials inside a render bucket.
Have you thought about texture unit limitations, by the way? The maximum for OpenGL 2.0 is 16 textures; obviously you won’t be able to have multiple point lights with diffuse, normal, height, specular maps, etc. all enabled at once.
It might be possible to accomplish this with a cleverly constructed texture array; then you only need a single texture unit.
If the lights are sorted with the shadow-casting lights coming first, it should be fairly easy to use a texture array and apply them in a shader.
This assumes you want to use every shadow-casting light for every shadow-receiving object in the scene. If not, it might be a bit more complicated, but probably still possible.
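The sorting idea can be sketched as follows, assuming the same kind of empty marker interface described earlier in the thread (again, simplified stand-in classes, not the real jME API). With a stable sort that moves shadow casters to the front, a caster’s list index doubles as its layer index into the shadow map texture array:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class ShadowSortDemo {
    // Empty marker interface tagging a light as a shadow caster.
    interface ShadowCastingLight {}

    // Simplified stand-in for a light class.
    static class Light {
        final String name;
        Light(String name) { this.name = name; }
    }

    static class ShadowSpotLight extends Light implements ShadowCastingLight {
        ShadowSpotLight(String name) { super(name); }
    }

    // Stable sort: shadow casters first, original order preserved otherwise.
    // Caster index 0..numCasters-1 then maps directly onto the layer of the
    // depth texture array sampled in the shader.
    static List<Light> sortCastersFirst(List<Light> lights) {
        List<Light> sorted = new ArrayList<>(lights);
        sorted.sort(Comparator.comparingInt(
                l -> (l instanceof ShadowCastingLight) ? 0 : 1));
        return sorted;
    }

    public static void main(String[] args) {
        List<Light> lights = List.of(new Light("ambient"),
                new ShadowSpotLight("spot"), new Light("point"),
                new ShadowSpotLight("spot2"));
        List<Light> sorted = sortCastersFirst(lights);
        for (int i = 0; i < sorted.size(); i++) {
            System.out.println("layer " + i + ": " + sorted.get(i).name);
        }
    }
}
```

Since java.util.List.sort is guaranteed stable, the relative order of the casters (and of the remaining lights) is preserved, which keeps the layer assignment deterministic between frames as long as the light list itself doesn’t change.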
In a multipass system it might also be possible to render the new shadow maps between the different passes, which wouldn’t require any additional textures.
But to continue I’ll have to make a change in the engine, because I want to render the depth to a texture array and, unless I missed something, array textures and depth textures aren’t supported as render targets.