PBR, materials, TechniqueDefLogic and shader variables

So I continue to work on my shader node PBR, very very slowly. However, it is not clear to me what the correct way is to pass lighting information to the shader nodes: for example the number of lights, the list of lights and the properties of those lights. Note that for PBR these lights will carry extra info beyond the base lights.

Digging into the code I found the TechniqueDefLogic interface and its various implementations, including the image-based SinglePassAndImageBasedLightingLogic.java. I also note that how a particular implementation is chosen is fairly tightly coupled to the material loader.

Now the list of lights can be handled the way it is done now; I just need to add attributes such as size for point lights. It is not entirely clear where I should be adding this.

But even more problematic is the image-based lighting. This involves a possible set of “light probes” and an independent set of spherical harmonics (SH) light information.

What I don’t follow is how I should get this info into the material data and subsequently into the shader node uniform variables.

Best practices, quick solutions and any general help are always appreciated. It should be noted that just about everything in the scene will use the same material: a PBR material where only the textures differ, if that helps at all.

From time to time I work on extending the shader node system. I want to make a PBR shader node material as well :slight_smile:

The TechniqueDefLogic is what is responsible for sending light uniforms to the shader.
You might be interested in what’s going on here: jmonkeyengine/SinglePassAndImageBasedLightingLogic.java at master · jMonkeyEngine/jmonkeyengine · GitHub

But I guess you already got that.
Unfortunately it’s not easily extensible. If you want to make your own logic, you need to implement TechniqueDefLogic, but you’ll have to set it on your technique via code (technique.setLogic(myLogic)).
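
A minimal sketch of that wiring, assuming a jME 3.1/3.2-style API; MyLightingLogic and its install() helper are made up for the example, and the way the TechniqueDef is looked up may differ in your version:

    import com.jme3.material.Material;
    import com.jme3.material.TechniqueDef;
    import com.jme3.material.logic.SinglePassLightingLogic;

    // Hypothetical custom logic, reusing an existing implementation as a base
    // (override its methods to send the extra uniforms you need).
    public class MyLightingLogic extends SinglePassLightingLogic {

        public MyLightingLogic(TechniqueDef techniqueDef) {
            super(techniqueDef);
        }

        // Attach this logic to a material's technique definition by hand.
        public static void install(Material material) {
            // assumption: 3.2-style lookup of technique defs by name
            TechniqueDef def = material.getMaterialDef()
                    .getTechniqueDefs("Default").get(0);
            def.setLogic(new MyLightingLogic(def));
        }
    }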

The logic is normally defined in the j3md with the LightMode keyword.

The allowed values come from an enum though, so if you want to add your own it’s complicated.

Maybe this code should try to find a class whose name starts with the name of the LightMode, like
LightMode MyLightMode
and then you would implement a MyLightModeTechniqueDefLogic class. But this would kill the portability of your material (albeit you might not care about portability).
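
A rough illustration of that idea (this is not existing jME code): the class-name convention, the base package parameter and the constructor taking a TechniqueDef are all assumptions for the example.

    import com.jme3.material.TechniqueDef;
    import com.jme3.material.logic.TechniqueDefLogic;

    public class CustomLightModeResolver {

        // Resolves e.g. "MyLightMode" to a user-supplied MyLightModeTechniqueDefLogic
        // class and installs it on the technique definition.
        public static void applyCustomLogic(TechniqueDef techniqueDef, String lightMode,
                                            String basePackage) throws Exception {
            String className = basePackage + "." + lightMode + "TechniqueDefLogic";
            TechniqueDefLogic logic = (TechniqueDefLogic) Class.forName(className)
                    .getConstructor(TechniqueDef.class)
                    .newInstance(techniqueDef);
            techniqueDef.setLogic(logic);
        }
    }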

Anyway, for your particular issue:
If you do your own logic you can decide entirely how uniforms are sent to the shader.
In SinglePassAndImageBasedLightingLogic, direct lights are sent in an array with 3 vec4s per light (a sketch of this layout follows the list):

  • The first vec4 is the color of the light, with the 4th element being the light type (used in the shader to know how to compute attenuation).

  • The second one is the light direction or position depending on the type (direction for directional and spot lights, position for point lights).
    For directional lights I don't remember what the 4th element of the direction is used for; it may not be used.
    For point lights the 4th element is the inverse radius of the light (the radius is never used directly, so rather than computing the inverse in the shader for each pixel we compute it once here).
    For spot lights we pack the inverse range of the spot light into the 4th element.

  • The 3rd one is only for spot lights, to store the direction. In the 4th element we pack the spot angle cosines (see the code in SpotLight for this). Note that this 3rd vec4 is padded with 0s for directional and point lights. It may look counter-intuitive, but it is better to have a fixed number of iterations, predictable at compile time, in a shader so that the compiler can unroll them; hence the padding to always have 3 vec4s per light. The number of lights handled in one pass is sent as a define and is also constant. You can change it with renderManager.setSinglePassLightBatchSize(n).
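
To make that layout concrete, here is a small illustrative sketch (not the actual jME code) that packs a point light and a spot light into the 3-vec4-per-light layout described above. The class, method names and type codes are made up for the example, and the world/view space conversion is left out:

    import com.jme3.light.PointLight;
    import com.jme3.light.SpotLight;
    import com.jme3.math.ColorRGBA;
    import com.jme3.math.Vector3f;

    public class LightPackingSketch {

        // Packs one light into 3 vec4s (12 floats) starting at 'offset'.
        // Type codes here are illustrative; the real values live in Light.Type.
        static void packPointLight(PointLight pl, float[] data, int offset) {
            ColorRGBA c = pl.getColor();
            Vector3f p = pl.getPosition();
            // vec4 #1: color.rgb + light type
            data[offset]      = c.r;  data[offset + 1]  = c.g;
            data[offset + 2]  = c.b;  data[offset + 3]  = 2f; // e.g. "point" type code
            // vec4 #2: position.xyz + inverse radius
            data[offset + 4]  = p.x;  data[offset + 5]  = p.y;
            data[offset + 6]  = p.z;  data[offset + 7]  = pl.getInvRadius();
            // vec4 #3: unused for point lights, padded with zeros
            data[offset + 8]  = 0f;   data[offset + 9]  = 0f;
            data[offset + 10] = 0f;   data[offset + 11] = 0f;
        }

        static void packSpotLight(SpotLight sl, float[] data, int offset) {
            ColorRGBA c = sl.getColor();
            Vector3f p = sl.getPosition();
            Vector3f d = sl.getDirection();
            // vec4 #1: color.rgb + light type
            data[offset]      = c.r;  data[offset + 1]  = c.g;
            data[offset + 2]  = c.b;  data[offset + 3]  = 3f; // e.g. "spot" type code
            // vec4 #2: position.xyz + inverse range
            data[offset + 4]  = p.x;  data[offset + 5]  = p.y;
            data[offset + 6]  = p.z;  data[offset + 7]  = sl.getInvSpotRange();
            // vec4 #3: direction.xyz + packed spot angle cosines
            data[offset + 8]  = d.x;  data[offset + 9]  = d.y;
            data[offset + 10] = d.z;  data[offset + 11] = sl.getPackedAngleCos();
        }
    }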

So if you want to add data for point lights it's going to be tough. You can try to pack it into the 4th component alongside the inverse radius, in a similar way to how the cosines are packed for the spot light, or use the remaining unused vec4 (which may be simpler).

For probes, you'll have to send your data as a light source so that it ends up in the light list.
If you have separated the SH from the light probes you need another light type (maybe AmbientIBL or something like that) and add it to the root node as a light.
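
For reference, a minimal sketch of the stock probe path, assuming the 3.1/3.2-era EnvironmentCamera/LightProbeFactory API (a custom "AmbientIBL"-style light would instead need its own Light subclass and matching logic); ProbeSetup and addProbe are made up for the example:

    import com.jme3.app.SimpleApplication;
    import com.jme3.environment.EnvironmentCamera;
    import com.jme3.environment.LightProbeFactory;
    import com.jme3.light.LightProbe;

    public class ProbeSetup {
        // Call once the EnvironmentCamera app state has been attached and initialized.
        static void addProbe(SimpleApplication app) {
            EnvironmentCamera envCam = app.getStateManager().getState(EnvironmentCamera.class);
            LightProbe probe = LightProbeFactory.makeProbe(envCam, app.getRootNode());
            app.getRootNode().addLight(probe); // the probe travels through the light list like any light
        }
    }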

Awesome, detailed reply. Yeah, it's sort of how I thought things were. For now I may do my own TechniqueDefLogic and manually bind it. However, when it comes time to push this, maybe I just add it to the small existing set?

We'll see. Either I'll add a more flexible way of using a custom TechniqueDefLogic, or your technique will replace the default PBR one.
I mean, if it's better, why keep both?

At my current rate of progress that won’t be till some time after the sun burns out. :confused:

So rather than start another thread I thought I would just append to this one.

I am making progress, but I need to confirm a few things to ensure that I don't go a long way down the wrong track. I'm 99% sure I've got things sorted, but we know what assume stands for.

Lights
So light directions and positions are in camera space when passed to shader nodes via uniforms. More explicitly, directions are transformed by the normal matrix and positions by the WorldView matrix.

Under the current com.jme3.material.logic.SinglePassLightingLogic, many lights can in fact produce more than one pass, using additive alpha blending as an "accumulation buffer" (old school, I know).

Internally JME sorts out which groups of nodes/geometries are influenced by which lights and builds up the appropriate number of passes and uniforms.

World Parameters

To get the world parameter uniforms set correctly, I just ensure that in the material def my Technique { WorldParameters { } } block specifies the parameter name, i.e. if I want WorldParam.NormalMatrix I put NormalMatrix in the WorldParameters block. It just automagically gets populated correctly for me.

Vertex attributes

This is a little more vague to me. Where are vertex attributes defined? I currently only use position, texCoord and inTangent. But is there a list? Can I add my own? In the past I have packed quite a lot of custom info into vertex attributes, e.g. for seamless LOD (no pop!), morph/animation data, etc. So it would be nice to have the option.

Shader nodes Conditions

So this is where I am having small issues. I can turn whole nodes on and off without issue; that all seems to work. But I seem to have a problem with input conditionals.

For example

      InputMappings{
           normalMap = MatParam.normalMap : normalMap
      }

always defines a normalMap in the resultant shader, even when I have no normalMap set in the material. In fact, even the code texture(normalMap, uv); doesn't produce an error.

This doesn't quite seem to be what I want, since I may want to use an #ifdef normalMap explicitly in the shader. But this doesn't work as it's still defined, just not bound/set?

What am I missing?

Of course the final version will look more like

InputMappings{
     normal = NormalMapNode.outNormal : normalMap
     normal = VertexNodeDefault.outNormal : ! normalMap
}

Mhh, nope. It's all in world space.

Everything in this section is right.

Vertex attributes are defined by the buffers set on your mesh. You can find the complete list in the VertexBuffer.Type enum.

You can use any of these types as vertex attributes in the shader by prefixing them with "in", like "inPosition", "inNormal" and so on. If there is a corresponding buffer on the mesh, the attribute will be populated with the data from that buffer.
Note that since this system is not very flexible (for example, you cannot define custom vertex attributes) it may change in the future.
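
For illustration, a small sketch of the mesh-side setup under that scheme; the buffer types come from the stock VertexBuffer.Type enum, while the class name and the data values are placeholders:

    import com.jme3.scene.Mesh;
    import com.jme3.scene.VertexBuffer;
    import com.jme3.util.BufferUtils;

    public class MeshAttributeSketch {
        static Mesh buildMesh() {
            Mesh mesh = new Mesh();
            // Position buffer -> available in the shader as the "inPosition" attribute
            mesh.setBuffer(VertexBuffer.Type.Position, 3,
                    BufferUtils.createFloatBuffer(0f, 0f, 0f,  1f, 0f, 0f,  0f, 1f, 0f));
            // TexCoord buffer -> "inTexCoord"
            mesh.setBuffer(VertexBuffer.Type.TexCoord, 2,
                    BufferUtils.createFloatBuffer(0f, 0f,  1f, 0f,  0f, 1f));
            // Tangent buffer -> "inTangent" (here 4 components: xyz + handedness)
            mesh.setBuffer(VertexBuffer.Type.Tangent, 4,
                    BufferUtils.createFloatBuffer(1f, 0f, 0f, 1f,  1f, 0f, 0f, 1f,  1f, 0f, 0f, 1f));
            mesh.updateBound();
            return mesh;
        }
    }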

Yes… about that. It's a bit clumsy right now, to be honest. At first what I did was to "propagate" the conditions to node inputs and outputs: if an attribute was used only by one node that was behind a condition, the attribute was automatically put behind that condition. However it can get quite complex depending on all the combinations you can have, and I ended up having a lot of issues and bugs, so I decided to remove it. For now you have to set the conditions yourself on the node and on the inputs and outputs, depending on what you want to do. But it can get pretty messy.
I’ll probably work on it at some point to have something that works better.

What about my PR with the AST shader generator? :slight_smile:

It won’t change this at all

The AST generator extends the available options for defining conditions.

I’ll look into it. Long time overdue.

The AST generator provides two new features for conditions:

  1. You can define new definitions in a shader node; they will be moved to the top of the resulting shader with an additional prefix to avoid any collisions.
  2. You can use conditions based on the input/output variables of shader nodes.

Thanks. I will have to check my light position math. I sort of don't know what @javasabr is on about, however, and am not following the convo there.

As far as conditionals go, in my case it is simple fallbacks. That is, having no normal map doesn't result in a crash or no render; it just defaults to vertex normals. PBR has a very nice property that you don't need all that much "stuff" to define it: diffuse map, metal map and roughness. So I have just diffuse+metalMap in one texture and normalMap+roughness in the other.

It is something that even a few #ifdefs could do.

But I intend to use the provided nodes for things like hardware skinning etc.

So lights are in world space? I think that is wrong. I got it working pretty well with the assumption of camera space (aka worldview space). In SinglePassLightingLogic.java we have

     rm.getCurrentCamera().getViewMatrix().mult(tmpVec, tmpVec);

on line 145 (I'm fairly sure I have the latest git; I pull often), unless I am really getting the wrong end of the stick with transforms and everything.

This is how I have assumed it's defined: we start in model space, apply the model (world) matrix to get world space, then apply the view matrix to get view space, aka camera space.
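
As a sanity check of that chain, a tiny sketch using jME's math classes; SpaceCheck and toViewSpace are made up for the example:

    import com.jme3.math.Vector3f;
    import com.jme3.renderer.Camera;
    import com.jme3.scene.Geometry;

    public class SpaceCheck {
        // model space -> world space -> view (camera) space
        static Vector3f toViewSpace(Geometry geom, Camera cam, Vector3f modelPos) {
            Vector3f worldPos = geom.localToWorld(modelPos, null);   // apply the world (model) matrix
            return cam.getViewMatrix().mult(worldPos, null);         // apply the view matrix
        }
    }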

I clearly want to get this all correct. It is easy to get some of this wrong, get visible results, and only later realize it's all a bit funky with strange lighting.

The stock PBR material uses the SinglePassAndImageBasedLightingLogic which sends light data in world space.

StaticPassLightingLogic is only used for legacy Phong lighting and indeed sends light data in view space.
If you are using this logic you won't be able to handle probe data. If you made your own logic… well then… do whatever you want.
In my experience I found it easier to compute lighting in world space than in view space in the shader.

Thanks. I have been using SinglePassLightingLogic. In the past I have just always used camera space. But again, being consistent is the key, just so everything works properly.

But while I have you around (we are not in the same TZ…): I am having issues setting my own lighting logic. There is TechniqueDef.setLogic(), but where/when do I find this to set my own? For example, materials have an active technique that has a TechniqueDef that I could use. However, the active technique is clearly not defined early on, and I expect this to potentially change, perhaps with different passes?

As always, thanks for your replies and time.

Well tbh it’s not really flexible.

The TechniqueDefLogic is defined in the TechniqueDef via its LightMode

Then in the j3md you define your LightMode for your technique

Then the J3MLoader instantiates the proper TechniqueDefLogic here

We thought of using reflection so as not to have the LightMode enum, and instead set the name of the class in the j3md and instantiate whatever class you set. But that would break the portability of the j3o file… that's why it's like this.
Not really satisfying, though.

Hmm… so there is no way to just set my own custom technique logic while developing this? I mean, I can just add it to my own local version of JME, but I like to keep it separate till it's finished, or at least close to finished.

And well, I have sort of been doing my own version of JME up to now anyway.

@javasabr So are you doing a PBR? I don't want to do a bunch of work if someone else is already doing it. I would expect my version to be about a month away from showing off, and two months away from perhaps a pull request.

I’m afraid not.

afaik he is using the stock PBR.
