Optimize terrain shaders for tri-planar mode per layer, remove fog defines to make room?

Not entirely sure which category to put this in, since the only Terrain related category is troubleshooting, and this isn’t troubleshooting related.

Lately I’ve been thinking about ways to optimize PBR terrains for rendering large open worlds, and one of the biggest resource sinks in all 3 of jme’s stock terrain shaders is tri-planar mapping, since it triples the number of texture reads.

Currently the terrain shaders define a single boolean to apply tri-planar mapping, so you either turn it on or off for all 12 texture layers at once.

So I’m thinking of refactoring jme’s terrain shaders to also include 12 new boolean defines that enable tri-planar mapping on a per-layer basis. That way you could, for example, enable it for a mountain texture that is primarily painted on vertical terrain, while disabling it for other layers that don’t need it, and this could be applied even more aggressively for terrains at a far distance. In some cases this could save 20+ texture reads per terrain when using AdvancedPbrTerrain.
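To put a rough number on that claim, here's a quick back-of-the-envelope sketch in Java. The layer count and reads-per-layer here are illustrative assumptions, not values taken from the actual shaders:

```java
public class TriPlanarSavings {

    // Texture reads for one terrain material: tri-planar layers sample
    // each texture 3 times (once per projection axis), other layers once.
    static int textureReads(int texturesPerLayer, boolean[] triPlanarPerLayer) {
        int reads = 0;
        for (boolean triPlanar : triPlanarPerLayer) {
            reads += texturesPerLayer * (triPlanar ? 3 : 1);
        }
        return reads;
    }

    public static void main(String[] args) {
        boolean[] allOn = new boolean[12];   // global toggle: all 12 layers tri-planar
        java.util.Arrays.fill(allOn, true);

        boolean[] oneOn = new boolean[12];   // per-layer toggle: only the "mountain" layer
        oneOn[0] = true;

        // With 1 texture per layer: 36 reads before vs. 14 after -> 22 saved
        System.out.println(textureReads(1, allOn) - textureReads(1, oneOn));
    }
}
```

With more textures sampled per layer (albedo, normal, roughness, …) the savings scale up accordingly.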

The only potential issue is that the AdvancedPbrTerrain shader (the one that would benefit the most from this optimization) has already reached the limit of 64 defines in the .j3md file, so to make this change on my local branch I had to remove the in-shader fog code that jayfella added to the PBR shader a long time ago. Personally I had no issue axing the fog code: it was broken for terrains, the original author is no longer around to support it, and the actual FogFilter works much better IME.

Do any other jme devs have any thoughts on this? I’m also curious whether anyone currently using the PBR terrain shaders with in-shader fog would be negatively affected by removing it (although that in-shader fog is so broken with terrains that I would be surprised if anyone is actually using it).


Is this a JME limitation or a GLSL limitation?

The problem with post-processing fog is that it applies to everything even if you don’t want it to. And if you want to render your terrain in multiple layers with different scales then you are totally out of luck.

Post processing fog is a kludgy hack, really.

Also, killing 12 parameter slots for a bunch of booleans seems unfortunate. If we can fix the limitation that makes that a scarce resource then that would be the better option.

Is the issue that JME requires material parameters in order to have defines? I’ve often felt that the inability to set defines directly is a big limitation.


I haven’t tried filter-based fog much yet, not as much as the in-shader fog. I actually disliked how the in-shader fog didn’t apply to the water and skybox I was using, but I can see how the fog filter would also be bad when it renders fog over objects you don’t want it to, like overhead health bars.

I think I experimented with using the fog filter for very light fog that was likely too light for me to notice any issues, and then tried using the in-shader fog at the same time for blending out the far terrain so that players can’t tell where the frustum distance culling happens. But intense fog was where I ran into issues that the original author of the in-shader fog unfortunately wasn’t able to help with, and to be entirely honest I think I got too frustrated to keep following up on it at the time lol.

Here’s an example video I made at the time showing the issue with in-shader fog at high intensities. Any TerrainPatches that were within range of a point light would suddenly cause the entire TerrainPatch to light up weird like this, but it only happened at high fog intensities:

I suppose it’s been a few jme versions since then, so maybe I should try it again and see if anything has changed. (edit: I tried the in-shader fog again and it is still bugged)

That is a good question. I tried looking this up in the past and it sounded like there is no official value set by GLSL, but I found documentation referring to a GLSL parameter that determines this limit, so I think it can be changed when configuring GLSL for the engine.

Although I’m noticing now that I’m able to go past the limit of 64 without getting an error. The last time I encountered that error was on an older version of jme, so maybe something changed since then… I am certain I did receive the error before though, otherwise I would’ve left the fog code in on my own fork.

But I guess if I’m not hitting that limit anymore, I should be able to go ahead and add 12 extra tri-planar defines without having to remove the fog code.

I suppose it’s not an issue now, since the define-limit error is no longer occurring and I’ve surpassed 64, but it would definitely be nice if you could set defines without having to declare a matching material parameter, especially in cases like this where it’s just a boolean value that never needs to be read as a uniform.

Yeah, I have a bunch of cases where I had to “waste” a material parameter for a boolean or a single float. And while I’m not entirely sure this contributes to the “material parameter limit” (since that may only apply to uniforms actually bound in the shader), it still bothered me a bit to have infrastructure set up for a material parameter I will only ever map to a define.


…but it’s bugged only for terrain, yes? JME terrain is not well written. I do not trust it very much. There is no engine reason that the in-shader fog would affect lighting at all… certainly I do not run into this issue with my own Lighting.j3md forked shaders.

In Mythruna, it’s the lights that I want to not be fogged. Post-processing fog would kill all of my far-away lights at night and that’s no fun. There are some other effects like that which I would prefer to either not fog at all or fog with a different set of parameters.


No, it’s also happening for my other models, actually. I wasn’t sure, so I tested again, and it seems like any intense fog makes the entire geometry flicker when I get close enough while holding a weapon that has a point light.

If I remember correctly, it didn’t even end up being anything related to fog specifically. I ended up reducing my shader to a single line in main() that hard-codes the output color to the same gray that intense fog would produce, and the issue still occurred. So it seemed to be a deeper JME issue that only got highlighted coincidentally by the gray color you get from intense fog.

Here was the thread I made about the issue from a while ago:

I think the last thing mentioned in that thread was you saying I should make a test case, but at the time I wasn’t certain how best to demonstrate the issue, and I eventually forgot about it since I stopped using the fog in my game. Maybe I should look into it more since the issue is still happening.


Ah, I don’t use point lights and so wouldn’t have seen this issue.

I hit the error telling me I have too many defines again, but it only happens with one specific map that must be using more textures and consequently more defines.

I think I made the incorrect assumption that the 64 limit referred to the number of defines declared in the .j3md file. But it seems like it only counts the ones that are actually being used, since the error happens on this map and not another despite both using the same shader. Although I’m still slightly confused about this, to be honest, since the other day I thought I was using more than 64 defines with values assigned to their matching uniforms… so it’s likely I was making some sort of mistake in my testing at that time.

Here’s the error I get when it fails to load a terrain that is using too many defines:

Caused by: java.lang.IllegalStateException: Cannot have more than 64 defines on a technique.
	at com.jme3.material.TechniqueDef.addShaderUnmappedDefine(TechniqueDef.java:449)
	at com.jme3.material.logic.SinglePassAndImageBasedLightingLogic.<init>(SinglePassAndImageBasedLightingLogic.java:74)
	... 57 more

So to make the changes I planned, I do think we need to find a way to increase the 64-define limit in the engine (without causing any other issues, of course), and I think raising this limit would likely benefit other jme users working with their own large shaders as well.

It looks like it should be as simple as changing this constant value here:

Any thoughts on this from jme devs/users?


I looked in the code.

…it’s 100% arbitrary and because we were lazy and fully allocated the array instead of letting it expand as we go.

Also, considering this class is just spitting out strings in the end… it seems needlessly complicated that it stores int[] in the first place rather than just using the Object values already provided. (In which case a SafeArrayList would have been a fine choice for keeping the values.)

As to your issue randomly cropping up, there is something called “unmapped defines” that are set on the technique. I did not look into it but maybe these are internally used defines that are not part of the j3md and so wouldn’t always factor into your count.

In any case, the DefineList definitely needs an overhaul.


I’ve opened an issue in regards to refactoring this class in the ways you mentioned.

I think it may be a slightly bigger task than I first thought, since (depending on the specifics of how you want to refactor it) it may require changes in a lot of other classes that use DefineList. But I’m by no means an expert on this part of the engine and have only briefly looked at the relevant source code in the past day, so the discussion can continue on the issue I just filed.

In the meantime, I’ve also submitted another PR increasing the define limit from 64 to 128 to at least bypass the issue until DefineList is refactored to let the array grow as needed. That way I can continue making the tri-planar optimizations to the terrain shaders and potentially have them included in the next 3.6.1 engine release as well.


I don’t know if this information will help you fix it, but according to the git blame, the DefineList class used to be backed by a TreeMap and was a Savable; it might be a good idea to look through the git blame in detail before settling on a solution.

There is a commit that seems to completely change the logic of how this works.

Not sure what you are referring to. DefineList doesn’t use TreeMap.

The issue in question starts here:

…and to some extent, the line above.

DefineList pre-allocates an int array, but because it cannot tell whether a value has been set or not, it uses the isSet variable (64 bits) to track which entries have been set and which haven’t. This is why it can only support a maximum of 64 defines.

And it’s the only reason.
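For anyone following along, here's a stripped-down model of that bookkeeping (my own simplification for illustration, not the engine's actual code) showing why a single long caps it at 64:

```java
public class BitfieldDefines {
    // One bit per define id; a long only has bits 0..63,
    // hence the hard cap of 64 defines per technique.
    private long isSet;

    void set(int id) {
        if (id < 0 || id >= 64) {
            throw new IllegalStateException(
                    "Cannot have more than 64 defines on a technique.");
        }
        isSet |= (1L << id);
    }

    boolean isSet(int id) {
        return (isSet & (1L << id)) != 0;
    }
}
```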

Fixes can be made without changing any other classes. All fixes would be internal. Even just switching to an object array and having isSet check for null would remove the 64 entry max requirement and cost essentially nothing more than the int array approach.

Your PR cannot work because it doesn’t fix the isSet issue. To up the limit to 128 would require a second isSet long… or a different approach (maybe a BitSet instead).
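A minimal sketch of the BitSet approach (class and method names here are hypothetical, just to show the idea): the "is this slot set?" bookkeeping and the value array both grow on demand, so there is no fixed cap on define ids.

```java
import java.util.Arrays;
import java.util.BitSet;

public class GrowableDefines {
    // BitSet grows as bits are set, unlike a single 64-bit long.
    private final BitSet isSet = new BitSet();
    private Object[] values = new Object[64];

    void set(int id, Object value) {
        if (id >= values.length) {
            // Grow the backing array instead of failing past 64 entries.
            values = Arrays.copyOf(values, Math.max(id + 1, values.length * 2));
        }
        values[id] = value;
        isSet.set(id);
    }

    boolean isSet(int id) {
        return isSet.get(id);
    }

    Object get(int id) {
        return isSet.get(id) ? values[id] : null;
    }
}
```

The Object-array-with-null-check variant mentioned above would work the same way, just with `values[id] != null` standing in for the BitSet.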

It’s a very strange implementation and has existed for a really long time… since the beginning of JME3 I think.


I am talking about this commit (in case it’s relevant to the fix):

It concerns it only in the sense that DefineList was added in that commit.

Again, we are not talking about changing the outward behavior of this class other than removing the max defines limit.

If someone can point to relevant code that would have a problem with a new max defines limit (preferably something more narrow than ‘the whole codebase’) then that would be useful.
