RenderState.FaceCullMode.Front Issue

The lighting shader doesn’t even get to see triangles that are culled through face culling. So it isn’t the shader’s fault.

More likely it's some weird render state issue that happens with multipass lighting and point lights, i.e. the cull mode gets lost somewhere along the way.
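
For what it's worth, one way to rule out the material side is to pin the cull mode explicitly on the material's additional render state. A minimal sketch (the class and method names are just placeholders, not from the original post):

```java
import com.jme3.material.Material;
import com.jme3.material.RenderState;
import com.jme3.scene.Geometry;

public final class CullModeCheck {

    // Force a known cull mode on the material so the lighting passes
    // can't end up with an unexpected one. FaceCullMode.Back is the
    // usual default; FaceCullMode.Off disables culling entirely.
    public static void disableCulling(Geometry geom) {
        Material mat = geom.getMaterial();
        mat.getAdditionalRenderState().setFaceCullMode(RenderState.FaceCullMode.Off);
    }
}
```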


I didn't understand the difference until now.

The problem I have with the translucency bucket is that it didn't seem to work with depth testing.

All my translucent geometries were rendered on top regardless of depth. Maybe a setting can alter this behaviour? If that's the case, then maybe I didn't even need to write my transparency handler lol

I have never used the "translucency bucket" of jME and I'm not sure what it is intended for.
Most systems can't differentiate between Transparency maps and Translucency maps.
The first one needs alpha blending and the second one needs "multiplication blending".
Since multiplication is order independent, multiplying all the translucent layers always gives the same product (same end result color).
But if you mix Translucency and Transparency, bad things will happen (if you don't have a good shader).
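
In jME terms, the two map types roughly correspond to two different blend modes on the material's additional render state. A minimal sketch (the class and method names are just placeholders for illustration):

```java
import com.jme3.material.Material;
import com.jme3.material.RenderState.BlendMode;

public final class BlendSetup {

    // Transparency map: classic alpha blending,
    // result = src.rgb * src.a + dst.rgb * (1 - src.a).
    public static void useTransparency(Material mat) {
        mat.getAdditionalRenderState().setBlendMode(BlendMode.Alpha);
    }

    // Translucency map: multiplicative blending, result = src.rgb * dst.rgb.
    // Multiplication is order independent, so stacking several translucent
    // layers always ends up with the same color.
    public static void useTranslucency(Material mat) {
        mat.getAdditionalRenderState().setBlendMode(BlendMode.Modulate);
    }
}
```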

Also, it seems that many artists are not aware of this distinction either.
They only have one alpha channel to work with.

I never ran any tests, but I made some UI elements that have two different alpha maps.
Maybe I will have time to make a test example for this community sometime in the future.

The translucent bucket is there to render translucent objects after post processing.
For example, if you have a fire particle emitter with transparent particles, you don't want SSAO or shadows to affect them.
Depth is taken into account in the translucent bucket, as long as your material has the depth test flag on (it defaults to true). Also, if you have a FilterPostProcessor, you need to add a TranslucentBucketFilter at the end of your filter stack.

In 99% of cases, your transparent objects must go in the Transparent bucket.


It would be nice to have a jME demo for mixed Transparency and Translucency.
I guess another material and shader, and maybe some work on the engine itself, might be needed.
But currently such things have very low priority for me.

The TestPostWater test case demonstrates it to some extent.

I mean something other than that - a demo with transparency maps and translucency maps.
At least one object should have both (e.g. a window curtain that is both transparent and translucent).
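
Just to sketch what such a demo might look like (not a working example): until there is a material/shader that mixes both maps in one object, a cheap approximation would be two coplanar quads for the curtain, one alpha-blended and one modulate-blended. All class names, sizes and the omitted texture setup below are placeholders:

```java
import com.jme3.asset.AssetManager;
import com.jme3.material.Material;
import com.jme3.material.RenderState.BlendMode;
import com.jme3.renderer.queue.RenderQueue.Bucket;
import com.jme3.scene.Geometry;
import com.jme3.scene.Node;
import com.jme3.scene.shape.Quad;

public final class CurtainDemoSketch {

    // Approximates a curtain that is both transparent and translucent by
    // layering two coplanar quads; a single object carrying both maps would
    // need a custom material definition / shader.
    public static Node buildCurtain(AssetManager assetManager) {
        Node curtain = new Node("curtain");

        // Layer 1: transparency map via alpha blending.
        // (Texture / alpha map setup omitted in this sketch.)
        Material alphaMat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
        alphaMat.getAdditionalRenderState().setBlendMode(BlendMode.Alpha);
        Geometry transparentLayer = new Geometry("transparentLayer", new Quad(2f, 3f));
        transparentLayer.setMaterial(alphaMat);
        transparentLayer.setQueueBucket(Bucket.Transparent);

        // Layer 2: translucency map via multiplicative blending.
        Material modulateMat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
        modulateMat.getAdditionalRenderState().setBlendMode(BlendMode.Modulate);
        Geometry translucentLayer = new Geometry("translucentLayer", new Quad(2f, 3f));
        translucentLayer.setMaterial(modulateMat);
        translucentLayer.setQueueBucket(Bucket.Transparent);
        // Tiny offset to avoid z-fighting between the two layers.
        translucentLayer.move(0f, 0f, 0.001f);

        curtain.attachChild(transparentLayer);
        curtain.attachChild(translucentLayer);
        return curtain;
    }
}
```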

Been there, done that… I could write a 10-page post on why it sucks, but that would be redundant.

I can assure you that if you install the SDK and move your project to it, you will never return to using Notepad.
You will have a blast.

NetBeans is very simple to use, so apart from picking up the couple of key shortcuts that are amazing (which I or others could give you), the process should be quick and painless.

OO tends to leave you with lots of classes… and that calls for an IDE.
Humans making and using tools is what has kept us in the gene pool :smiley:.


Yes, even though it's a little off topic, I fully agree with what loopies said.
I can write extremely long and complex code with my brain in "idle mode" most of the time.
I used Eclipse before, which was okay too. Then the SDK / NetBeans set my code development to warp speed. :chimpanzee_cool:
