I played around a bit with the SceneProcessor like @nehon said. Here are some first visual results:
Edit: Updated the really slim shader to use vertex colors … still not what I really want. Why do I need a SceneProcessor at all? I could also make a copy of the Phong lighting material and add the projective stuff there. I guess the fixed-function solution needs modifications in LwjglGL1Renderer.
Edit2: Hmm, a post filter provides the scene as a texture, but is it really reasonable (performance-wise) to splash the projected texture onto a full-screen quad? I have no experience here and can’t estimate it. GL1 can do it with the texture matrix and texture combining.
As you can see, the result is not what one would expect. The box and sphere have a light gray unshaded material. My shader overrides the fragment color completely, because I don't know how to access the "previous" color. I'm also not sure I understood what's going on. My understanding is that what I do happens in a second render pass, so I somehow have to access the fragment color from the first render pass and use it when the texture coordinates are outside the [0, 1] range. But I have no idea how. Perhaps I got it completely wrong. Please correct me.
Anyway, here's the relevant code:
Edit: Code removed due to mistakes. I'll post a contribution, soon.
I’m making good progress now after digging into jme3 and understanding a bit more what’s going on.
One question: Is there a smart way to create an offscreen FrameBuffer with the same format as the main FrameBuffer (null)? Especially the number of samples (AA). Using AppSettings doesn’t feel right because the settings might have changed in the meantime.
At the moment, the best option seems to be “application.getContext().getSettings()”. These are the settings the Renderer actually uses, not a clone(), but if someone modifies them, that’s their own fault.
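A rough sketch of what I mean (untested, and the exact constructor/setter names may differ between jme3 versions):

```java
// Untested sketch: create an offscreen FrameBuffer that matches the
// main framebuffer's resolution and sample count, taken from the
// live settings the Renderer was created with.
AppSettings settings = application.getContext().getSettings();
int width   = settings.getWidth();
int height  = settings.getHeight();
int samples = settings.getSamples(); // 0 or 1 = no multisampling

FrameBuffer offBuffer = new FrameBuffer(width, height, samples);
offBuffer.setDepthBuffer(Image.Format.Depth);

Texture2D offTexture = new Texture2D(width, height, Image.Format.RGBA8);
offBuffer.setColorTexture(offTexture);
```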
@nehon said:
The first video looks fine to me, what's the issue?
The problem was that I didn't know how to get the color of the first render pass into the texturing pass. After examining other SceneProcessors / Filters, I've got a clue.
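Roughly, the idea is that the first pass is rendered into a texture, and the texturing pass samples it as the fallback color. A sketch of the fragment shader side (all uniform and varying names here are placeholders, not the actual code):

```glsl
// Sketch: fall back to the first-pass color outside the projector frustum.
uniform sampler2D m_SceneTexture;     // result of the first render pass
uniform sampler2D m_ProjectedTexture; // the texture to project
varying vec4 projCoord;               // projector-space coordinate
varying vec2 screenCoord;             // this fragment's screen-space UV

void main() {
    vec4 sceneColor = texture2D(m_SceneTexture, screenCoord);
    vec2 uv = projCoord.xy / projCoord.w;
    if (uv.x < 0.0 || uv.x > 1.0 || uv.y < 0.0 || uv.y > 1.0) {
        // Outside the projector frustum: keep the first-pass color.
        gl_FragColor = sceneColor;
    } else {
        vec4 projColor = texture2D(m_ProjectedTexture, uv);
        gl_FragColor = mix(sceneColor, projColor, projColor.a);
    }
}
```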
Now Phong lighting, alpha textures, and multisampling work:
http://www.youtube.com/watch?v=dneRsjRgb-4
I'll tidy it up a bit and make a contribution post.
Of course I will share the result with you. I did the whole thing with contributing to jme in mind.
At the moment I’m improving the shader. You might know that a naive implementation of these algorithms suffers from “back projection”: the texture also shows up on surfaces behind the projector. Also, areas that would normally not be lit by the projector beam because they lie in shadow are lit in a mirrored way. You can see this effect here.
Thank you! As you might have noticed in the earlier videos, the transition between light and shadow moves. This is not correct, since the location of the projector stays the same. Instead of [java]cosAngle = dot(inNormal, m_ProjectorDirection);[/java] I now use [java]cosAngle = dot(inNormal, m_ProjectorLocation - inPosition);[/java] Here is the result:
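For completeness, a sketch of how the cosine test and a back-projection guard can fit together in the fragment shader (variable names are placeholders; note the vector to the projector would need normalize() if the actual angle value mattered, but only its sign is needed to cull back faces):

```glsl
// Sketch of the two guards (placeholder names, not the actual code).
vec4 projCoord = m_ProjectorViewProjection * vec4(inPosition, 1.0);

// Points behind the projector get a negative w; the perspective
// divide would mirror them back into the valid [0, 1] range.
bool backProjected = projCoord.w <= 0.0;

// Surfaces facing away from the projector.
float cosAngle = dot(inNormal, normalize(m_ProjectorLocation - inPosition));
bool facingAway = cosAngle <= 0.0;

if (backProjected || facingAway) {
    gl_FragColor = sceneColor;  // keep the unprojected first-pass color
} else {
    gl_FragColor = texture2D(m_ProjectedTexture, projCoord.xy / projCoord.w);
}
```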
Edit: To make it completely correct, the sphere would have to cast a shadow onto the area of the plane in front of it. But that goes in the direction of real shadow mapping and might be out of scope for this shader. Or is it easily doable? Depth buffer?
Edit2: It might be a better effect to just repeat / stretch the last texel before mirroring occurs, to get a continuous projection between sphere and plane.
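If the wrap mode of the projected texture is the right place to do this, clamp-to-edge should give exactly that repeat/stretch behaviour for free (assuming a Texture2D named projectedTexture; untested):

```java
// Repeat the border texel instead of mirroring: clamp-to-edge wrapping.
projectedTexture.setWrap(Texture.WrapMode.EdgeClamp);
```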