Rendering order issue

I have a bit of a dilemma. Here is my scene:

  1. Aircraft - Opaque Bucket
  2. Clouds (2d quads with transparent bitmaps) - Transparent Bucket
  3. Cloud Shadows (also a 2d quad with transparent bitmap) - Transparent Bucket
  4. Terrain - Opaque Bucket
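
For reference, the buckets above are assigned in the usual way with setQueueBucket; roughly like this (just a sketch, the node variables are placeholders for my own nodes):

    import com.jme3.renderer.queue.RenderQueue.Bucket;
    import com.jme3.scene.Spatial;

    // Sketch of the bucket assignment described above; the node variables are placeholders.
    void assignBuckets(Spatial aircraft, Spatial clouds, Spatial cloudShadows, Spatial terrain) {
        aircraft.setQueueBucket(Bucket.Opaque);
        terrain.setQueueBucket(Bucket.Opaque);
        clouds.setQueueBucket(Bucket.Transparent);        // 2D quads with transparent bitmaps
        cloudShadows.setQueueBucket(Bucket.Transparent);  // fake shadow quads
    }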

I’m using the DirectionalLightShadowRenderer to cast shadows on everything except the clouds (since I’m faking their shadows with the second quad). Initially I was using the DirectionalLightShadowFilter for shadows, but I noticed I could not exclude the clouds from shadow receiving with setShadowMode(ShadowMode.Off); they kept receiving shadows rather than letting them pass through as I would expect (not sure if this is a bug).
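
The shadow setup itself is just the stock renderer with the clouds opted out via ShadowMode; a sketch (the map size, split count and variable names here are placeholders):

    import com.jme3.asset.AssetManager;
    import com.jme3.light.DirectionalLight;
    import com.jme3.renderer.ViewPort;
    import com.jme3.renderer.queue.RenderQueue.ShadowMode;
    import com.jme3.scene.Spatial;
    import com.jme3.shadow.DirectionalLightShadowRenderer;

    // Sketch of the shadow setup; the map size, split count and variable names are placeholders.
    void setupShadows(AssetManager assetManager, ViewPort viewPort, DirectionalLight sun,
                      Spatial aircraft, Spatial terrain, Spatial clouds, Spatial cloudShadows) {
        DirectionalLightShadowRenderer dlsr =
                new DirectionalLightShadowRenderer(assetManager, 2048, 3);
        dlsr.setLight(sun);
        viewPort.addProcessor(dlsr);

        // Aircraft and terrain take part in the real-time shadows...
        aircraft.setShadowMode(ShadowMode.CastAndReceive);
        terrain.setShadowMode(ShadowMode.Receive);

        // ...while the clouds are left out, since their shadows are faked with the second quad.
        clouds.setShadowMode(ShadowMode.Off);
        cloudShadows.setShadowMode(ShadowMode.Off);
    }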

But anyway, the real problem is that now I’m having trouble rendering the scene properly. If I put the Clouds layer (2) in the translucent bucket, things work as I want: the real-time and pre-rendered bitmap shadows blend seamlessly. But then my clouds are always rendered on top of the Aircraft objects (1), which is no good. If I leave the Clouds layer in the transparent bucket, things are rendered in the proper order, except there is a blending issue between the clouds’ transparency and the real-time shadows. Btw, this only happens when I have filters in use; if I turn them off, things render as expected. But I’d like to use the filters, so I’m looking for suggestions.

It’s like I need a second transparent bucket that renders after the real-time shadows have been drawn. Is it possible to do this?

It’s also possible I’m barking up the wrong tree and there is an easier solution, but quite a bit of digging on the forums hasn’t turned one up.

Many thanks for any thoughts you might have on the topic.

Getting things right in these cases is very tricky. I think the shadow processor runs after all of the buckets have been rendered. The translucent bucket only really matters when you have a translucent bucket filter in the post processor, and then you can somewhat control the ordering when you run into shadow and filtering issues like this.

Anyway, one approach would be to render your clouds in a separate scene processor. You’d have to take on some of the rendering yourself, but it’s one way to be sure they are rendered after the shadows. A separate viewport could do it too, I guess… but then you’d have to somehow get your post-processing to span the two viewports (I think this is possible now, but I’ve never done it).

Writing your own scene processor might be educational. Actually, if you are going to render the clouds like that, then maybe you could make your clouds a post-proc filter (one that renders geometry instead of doing actual post-processing), and then you could go back to using the shadow filter and micro-manage the ordering.
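
If you go the scene-processor route, the skeleton is pretty small. Something like this (just a sketch, the names are made up, and depending on the engine version the interface may have another method or two to stub out):

    import com.jme3.post.SceneProcessor;
    import com.jme3.renderer.RenderManager;
    import com.jme3.renderer.ViewPort;
    import com.jme3.renderer.queue.RenderQueue;
    import com.jme3.scene.Spatial;
    import com.jme3.texture.FrameBuffer;

    // Hypothetical sketch: draws a cloud subtree after the viewport's normal buckets
    // (and after any processors added before this one, e.g. the shadow renderer).
    public class CloudProcessor implements SceneProcessor {

        private RenderManager rm;
        private ViewPort vp;
        private final Spatial clouds; // cloud quads kept out of the main scene graph

        public CloudProcessor(Spatial clouds) {
            this.clouds = clouds;
        }

        @Override
        public void initialize(RenderManager rm, ViewPort vp) {
            this.rm = rm;
            this.vp = vp;
        }

        @Override
        public void reshape(ViewPort vp, int w, int h) {
        }

        @Override
        public boolean isInitialized() {
            return rm != null;
        }

        @Override
        public void preFrame(float tpf) {
            // The clouds aren't attached to the root node, so update them here.
            clouds.updateLogicalState(tpf);
            clouds.updateGeometricState();
        }

        @Override
        public void postQueue(RenderQueue rq) {
        }

        @Override
        public void postFrame(FrameBuffer out) {
            // Render the clouds on top of everything drawn so far this frame.
            rm.setCamera(vp.getCamera(), false);
            rm.renderScene(clouds, vp);
            rm.flushQueue(vp);
        }

        @Override
        public void cleanup() {
        }
    }

Then add it to the viewport after the shadow processor (processors run in the order they were added, as far as I remember).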

pspeed,

Thanks so much for your reply. It gave me a lot to chew on and I did a little more digging. It turns out that I was actually able to get things working correctly by adding a TranslucentBucketFilter to my FilterPostProcessor.
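
In case it helps anyone later, the fix boiled down to something like this (a sketch; the method and variable names are just placeholders for my setup):

    import com.jme3.asset.AssetManager;
    import com.jme3.post.FilterPostProcessor;
    import com.jme3.post.filters.TranslucentBucketFilter;
    import com.jme3.renderer.ViewPort;

    // Sketch of the fix: the TranslucentBucketFilter marks the point in the filter
    // chain at which the translucent bucket is rendered, so translucent geometry
    // ends up drawn after the filters added before it.
    void setupFilters(AssetManager assetManager, ViewPort viewPort) {
        FilterPostProcessor fpp = new FilterPostProcessor(assetManager);
        // ... the other filters I was already using go here ...
        fpp.addFilter(new TranslucentBucketFilter());
        viewPort.addProcessor(fpp);
    }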

Thanks again.