Alpha blending in model PNG textures

I have a model that uses PNG textures with transparency, and I'm having trouble getting the transparent areas of my model to blend correctly.  Around the edges I'm seeing a lot of black rather than the expected alpha blending (see attachment).  I'm using the following BlendState code:



      Renderer theRenderer = DisplaySystem.getDisplaySystem().getRenderer();
      final BlendState theBlendState = theRenderer.createBlendState();
      theBlendState.setBlendEnabled( true );
      theBlendState.setSourceFunctionAlpha( BlendState.SourceFunction.SourceAlpha );
      theBlendState.setDestinationFunctionAlpha( BlendState.DestinationFunction.OneMinusSourceAlpha );
      theBlendState.setBlendEquation( BlendState.BlendEquation.Add );
      theBlendState.setTestEnabled( false );
      theBlendState.setEnabled( true );

      theModelNode.setRenderState( theBlendState );
      theModelNode.updateRenderState();
      
      rootNode.attachChild( theModelNode );
      rootNode.updateRenderState();



Am I missing something here?  I use similar code in my HUD and it seems to work fine on quads.

this mistake has been made a thousand times around here (by me as well). (you'll just have to search the forums for "transparency" or "transparency not working")


  1. have transparent textures - check
  2. have a blendstate attached - check
  3. updateRenderState() - check
  4. add the spatial to the transparent render queue - ?



    spatial.setRenderQueueMode(Renderer.QUEUE_TRANSPARENT);



    post back if this works,

    so long,

    Andy

Thanks for the reply dhdd.  I'd seen some posts, but tried just about everything in there without noticing any change.



I'd set the render queue mode before and didn't notice the subtle improvement because of the lack of background: the alpha in my model now blends correctly with other models, just not with itself (see attachment).  Am I stuck loading multiple model meshes and attaching them to the same node if I want JME to figure out this alpha sorting correctly?

this is just a guess, try adding a ZBufferState.

Hrm.  I added this and set it as the render state for both my model node and the root node, with no noticeable change.  Is there a way to turn up the precision of depth testing, etc.?  I'm not sure why some things are alpha blended in a Z order different from their actual Z ordering (which is correct, hence my seeing these gray boxes on top rather than behind the rest of the model).

      ZBufferState theZBuffState = theRenderer.createZBufferState();
      theZBuffState.setEnabled( true );
      theZBuffState.setWritable( true );
      theZBuffState.setFunction( ZBufferState.TestFunction.LessThanOrEqualTo );
      
      theModelNode.setRenderState( theZBuffState );
      theModelNode.updateRenderState();



In Milkshape 3D I notice a very similar ordering of my transparent meshes, regardless of their Z order, until I switch from the "Simple" transparency rendering mode to "depth buffered with alpha reference".  Is there a way to get similar behavior out of JME?

i think you just need to set the correct source / dest functions in the blendstate,

depending on whether you use black or an alpha channel in your texture to mark the transparent areas.

The blending mode seems correct to me, as these are PNGs with alpha data in them and the mesh blends fine with other meshes.  The only blending problem that remains is that the different alpha-textured pieces of the loaded model don't blend well with each other.  It looks like the arm is alpha blending with the box behind the model instead of with the torso like it should, etc.  I've seen posts like this that lead me to believe I may have to separate these into different model files entirely if I want it to sort correctly.



Is there a better solution?  It seems that depth testing should be taken into account when this blending call is made, but it sure doesn’t look like it is :(.

ah now i see.

I don't know what else you could try tho.

the default value of the ZBufferState's TestFunction is ZBufferState.TestFunction.LessThan and usually works.



try that.

  • Things you can see through should not write to the z-buffer.
  • Sorting is done per mesh, not per pixel.
  • Having meshes which are largely opaque in the transparent render queue is inefficient.



    Things you can try:
  • Disable z-buffer writing for transparent meshes (zs.setWritable( false );). Not ideal for meshes that are mostly opaque.
  • Split things that are mostly transparent into separate meshes from things that are completely opaque (pretty much always something you should try to do).
  • If your texture just has pixels either completely visible or completely invisible, as opposed to translucent, then this might work (forum search for alpha killing):

      + Add this blendstate


        // Alpha "kill": discard nearly transparent fragments instead of blending them
        BlendState as = DisplaySystem.getDisplaySystem().getRenderer().createBlendState();
        as.setBlendEnabled(false);        // no blending at all
        as.setTestEnabled(true);          // enable the alpha test
        as.setTestFunction(BlendState.TestFunction.GreaterThan);
        as.setReference(0.5f);            // keep only fragments with alpha > 0.5
        as.setEnabled(true);


  + Do not put it in the transparent render queue.

My textures have plenty of half-transparent pixels, so I can't use that blend state.  I tried it anyway, though, and it didn't make any difference to the Z ordering; it just made my edges rougher.



zs.setWritable( false ) did fix the alpha issues, though!  Now the only issue that remains is the Z ordering of the blended polygons.  If there's no way to get the sorting done more precisely than on a per-mesh basis, I can make each body part a separate mesh.  Getting this sorting working correctly seems pretty important for a good few other scenarios; model-based grass/fur effects come to mind.  I'm surprised it's not a more solvable issue.
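For reference, the writable-off setup that fixed blending against other meshes looks like this (a sketch against the jME 2 API, reusing the theRenderer and theModelNode names from my earlier snippets):

```java
// Transparent meshes read the depth buffer but don't write to it,
// so one transparent part no longer occludes another's blended fragments.
ZBufferState zs = theRenderer.createZBufferState();
zs.setEnabled( true );
zs.setWritable( false );   // the key change

theModelNode.setRenderState( zs );
theModelNode.setRenderQueueMode( Renderer.QUEUE_TRANSPARENT );
theModelNode.updateRenderState();
```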



I'm trying to construct models entirely out of layers of transparent textures, so having this not work is potentially a pretty big dead end for this art style.



Oh well.  If that's that I'll experiment with how many different meshes I can have per character before things get too dumb :).  I'm certainly still open to any suggestions for getting this going; even if they're relatively expensive.  The geometry for these textures isn't exactly complicated.

i wonder if different render passes would help, or would that only make things more complicated?

I'd be interested to see a performant solution. I know that type of approach is common in rendered stills and animations, but I don't know of it being used much in realtime; there seem to be a number of things that would slow it down a lot.



I'm sure there's some way of doing it, but I don't think it's a straightforward one  :expressionless:

For now I'm settling on using an alpha reference of 0.5 with blending enabled, and leaving the ZBufferState writable (zs.setWritable( true )).



I'm doing this because keeping the ZBufferState writable displays the polygons in their true Z order rather than their blended Z order, and setting the alpha reference to 0.5 gets rid of much of the texture that was blending with the mesh behind the model rather than with the model itself.  This does make the texture's edges more pixelated than they would otherwise be, but it isn't very noticeable, especially in motion.



I may add a per-model setting for whether it renders with smooth alpha blending or alpha-referenced blending, but this seems the closest I can come to a solution at the moment.  It looks quite decent!  Where I can, I'm also going to edit models to eliminate alpha-blended edges at joints, etc.
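Putting the compromise together, the render states I ended up with look roughly like this (again a sketch against the jME 2 API, reusing the theRenderer and theModelNode names from my earlier snippets):

```java
// Alpha-referenced blending: blend normally, but discard nearly
// transparent fragments so depth writes don't leave black fringes.
BlendState bs = theRenderer.createBlendState();
bs.setBlendEnabled( true );
bs.setSourceFunction( BlendState.SourceFunction.SourceAlpha );
bs.setDestinationFunction( BlendState.DestinationFunction.OneMinusSourceAlpha );
bs.setTestEnabled( true );
bs.setTestFunction( BlendState.TestFunction.GreaterThan );
bs.setReference( 0.5f );   // fragments with alpha <= 0.5 are discarded

// Depth writes stay on, so polygons keep their true Z order
// instead of their (per-mesh sorted) blended order.
ZBufferState zs = theRenderer.createZBufferState();
zs.setEnabled( true );
zs.setWritable( true );

theModelNode.setRenderState( bs );
theModelNode.setRenderState( zs );
theModelNode.updateRenderState();
```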





I experimented with using multiple meshes, but ran into the (rather obvious now that I think of it) problem that the meshes needed to either be entirely in front of or behind each other, or one of the two blending orders would be chosen arbitrarily.



Ah well.  Thanks for your help, guys.