Shader Inj......Nah... Shader Nodes!

Get over it, you’ll never have shader injection.

After countless hours of discussion in the core chat over almost a year (or was it two?), we finally reached agreement on a concept for making the material system more flexible and less painful to maintain.

This concept is called Shader Nodes, and I recently took a shot at implementing it.
Today I’m fairly happy with what is done, and I just committed the whole system into the repo.

So…what on earth is it?
Shader nodes are self contained units of shader code that take inputs and give outputs.
With this new system, you don’t write a shader, you assemble shader nodes and the system generates a shader for you.

A shader node adds a feature to your shader; it can range from a simple texture fetch to hardware skinning, for example.

The nodes are connected by mappings between outputs and inputs.
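To make that concrete, here is a minimal sketch of what a node definition looks like (names, paths, and exact keywords here are illustrative; the real syntax is on the wiki page linked below). A definition declares the node’s inputs and outputs and points to the GLSL snippet that does the work:

```
ShaderNodeDefinitions {
    ShaderNodeDefinition ColorMult {
        Type: Fragment
        // the GLSL snippet holding the node's code
        Shader GLSL100: MatDefs/ShaderNodes/ColorMult.frag
        Input {
            vec4 color1
            vec4 color2
        }
        Output {
            vec4 outColor
        }
    }
}
```

```glsl
// ColorMult.frag: the node's whole job is multiplying its two inputs;
// the generator takes care of wiring the actual variable names
void main() {
    outColor = color1 * color2;
}
```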

So for example, let’s say you’d like to add support for fog into the unshaded material.
Just grab the material definition of the unshaded material, add the Fog shader node to it and make the relevant connections.
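Roughly, that amounts to adding a block like the following to the technique in the .j3md (a hypothetical sketch; the actual node name, definition path, and mapping keywords are documented on the wiki):

```
Technique {
    FragmentShaderNodes {
        ShaderNode Fog {
            Definition : Fog : Common/MatDefs/ShaderNodes/Fog/Fog.j3sn
            InputMapping {
                // feed the node from the color computed so far and a material param
                color = Global.color
                fogColor = MatParam.FogColor
            }
            OutputMapping {
                // write the fogged color back as the shader's result
                Global.color = outColor
            }
        }
    }
}
```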

So from now on, we are just going to provide shader nodes, and users will make their own material definitions with whatever nodes they like.
Of course, the unshaded and lighting materials will be translated to this system, but you should now consider them as examples of material definitions.

Note that if you are in love with monolithic shaders, you can still use them as before. The node system only kicks in if no shaders are defined in a technique (and, of course, shader nodes are declared).
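In other words, a technique that declares its shaders directly keeps working exactly as it always has, e.g.:

```
Technique {
    // classic monolithic shaders: the node system stays out of the way
    VertexShader GLSL100:   Common/MatDefs/Misc/Unshaded.vert
    FragmentShader GLSL100: Common/MatDefs/Misc/Unshaded.frag
}
```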

This has several advantages:

  • Very modular: build your shaders like Legos (you need at least some basic shader knowledge, though).
  • Makes the shader code a lot more concise and easier to maintain.
  • You can add tons of features to your material without having to care about the length of the code.
  • You can mix your own shader nodes with stock shader nodes, while still getting the benefit of bug fixes to our nodes.
  • It handles GLSL versions and generates the appropriate shader version for the GPU.
  • And of course… it’s friggin’ awesome…

I wrote documentation that explains how to use them here:
https://wiki.jmonkeyengine.org/legacy/doku.php/jme3:advanced:jme3_shadernodes
It may change over time, but at least you’ll have a complete overview of what the system can do.
If you want to experiment, there is an UnshadedNodes.j3md material definition in the repo; it is the unshaded material with additional fog support.
You will also find experimental nodes in Common/MatDefs/ShaderNodes.

I can’t cover everything in one post because we passed the TL;DR threshold a long time ago, so if you have questions, please ask; it will be my pleasure to answer.


Now that I’ve bored you to death with tech reading, here’s the awe-effect moment.
I couldn’t just stop at the system; I also made an editor for the SDK. (It actually took me longer than the system itself :stuck_out_tongue:)
So here is a quick overview video of the editor, hopefully it will help you understand the system better.
Enjoy


:smiley: :o :affe:

So in case anyone didn’t get it: We have modular shaders now and you can edit them visually :slight_smile:


Amazing work!

@nehon
Some questions =)

  1. I assume we can still just write and use full shaders?
  2. When something changes in the structure of the shader using shader nodes, does it force a recompile? I assume the answer is yes… so how would one use branching for more dynamic shaders? My guess on this one is the branching would happen internal to specific nodes?
  3. How does inter-dependency work between nodes? (i.e. the results of node 2 will affect whether node 4 is executed at all, etc. Guess #3: you have to pass info along through the input/output variables.)
  4. If guess 3 is correct, are there any performance issues that come along with this? Not that it really matters… just wondering.

I think that’s about the extent of my questions so far!

:-o :mrgreen: !
Excellent work Remy!

oh wow, this looks really sweet!! I didn’t even realise there was sound in that video, till the very end! :stuck_out_tongue: will have to watch it again. AWEZOME!!!

But what are we going to tease people with vague hints about now? :o


Very nice ! Lots of questions of course.

Let’s begin with a simple one: how do you handle expressions? Is there some kind of script module?

For example, if I want to make a lightMap effect, I multiply the lightMap grey level by the diffuse color. (to make it simple). How do you implement this with the editor ?

Edit: OK, I feel stupid. This is one of the examples on your wiki page that I hadn’t read yet.
But the question is still valid for more complex expressions… do we have to write a specific node for that?

Remy, the shader king.

@t0neg0d said: Amazing work!

@nehon
Some questions =)

  1. I assume we can still just write and use full shaders?
  2. When something changes in the structure of the shader using shader nodes, does it force a recompile? I assume the answer is yes… so how would one use branching for more dynamic shaders? My guess on this one is the branching would happen internal to specific nodes?
  3. How does inter-dependency work between nodes (i.e. The results of node 2 will effect if node 4 is executed at all… etc. Guess #3 You have to pass info along through the input/output variables.)
  4. If guess 3 is correct, are there any performance issues that come along with this? Not that it really matters… just wondering.

I think that’s about the extent of my questions so far!

  1. TL;DR? :slight_smile:
  2. Well, sure, but what use case do you imagine where a shader changes each frame? A j3md compiles the shader nodes to a normal material definition, like always…
  3. They have inputs and outputs; check the wiki manual that @nehon already wrote for everyone who wants to create shader node packs :wink:
  4. Nah, it’s all happening as before. If it grabs shader node files in addition to the glsllibs and the main shader file, that’s not much overhead.

@all: Make sure you check out the wiki page and video as well as the editor tomorrow before you kill @nehon by sucking the last bit of life out of him after this monumental effort :wink:


Yes! Loving this! :smiley:

Only question I have is: how hard, or how much work, will it be to take an already existing big shader and translate it to shader nodes?

Can’t wait to play with that after I’m done with the GUI stuff. :smiley:

@madjack said: Only question I have is, how hard or how much work will it be to take an already existing big shader and translate it to ShaderNode
You monkeys tell us :)

This is (notice the big A -->) A w e s o m e !! I will for sure take this for a spin as soon as possible =) Really great work nehon!

@normen said: You monkeys tell us :)

Fair enough. :stuck_out_tongue: Probably won’t be me though as I’ve got a crapload of GUI to translate. I’ll still have fun when I get there. :smiley:


I … just … wow :-o


Hahahahahahaha, nehon, nice work. But I nearly crapped my pants when I read the title and your first sentence :smiley:

Hehe, thanks for all the nice comments.
Answers to some questions:

@t0neg0d said: 1. I assume we can still just write and use full shaders? 2. When something changes in the structure of the shader using shader nodes, does it force a recompile? I assume the answer is yes... so how would one use branching for more dynamic shaders? My guess on this one is the branching would happen internal to specific nodes? 3. How does inter-dependency work between nodes (i.e. The results of node 2 will effect if node 4 is executed at all... etc. Guess #3 You have to pass info along through the input/output variables.) 4. If guess 3 is correct, are there any performance issues that come along with this? Not that it really matters... just wondering.
  1. Yes, you just have to keep the VertexShader and FragmentShader directives in the technique block of the j3md, and those shaders will be used.
  2. First, you can’t change the nodes at runtime (for now, though; I already have a brushed-up API). Although I’m not completely sure it’ll be useful… anyway. The system still heavily relies on defines. You can set conditions on a node or on a mapping, and the system will generate defines in the shader. Then you can activate or deactivate a node by just passing a material parameter (I wanted to keep the same system as before). And yes, the shader is recompiled when a define changes :p If later we have an API to modify the matDef structure at runtime, the shader will be regenerated and recompiled too.
  3. Not sure I get what you mean; there is no inter-dependency between nodes. No variables are shared between nodes; inputs are mapped to the previous node’s outputs.
  4. There might be some overhead because there are more variable assignments than in a normal shader (because of the mapping), but I’m pretty sure the GLSL compiler optimizes this anyway, so I doubt there will be a performance loss. The only grey area is GLSL compilers on Android. The way the shader is generated may also change; I still need to run some performance tests.
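The conditions-and-defines mechanism described above can be pictured like this in the generated shader (a conceptual sketch; USE_FOG is a made-up define name):

```glsl
// A condition on a node becomes a define guarding that node's inlined code.
// Setting or clearing the matching material parameter toggles the define,
// which triggers the usual shader recompile.
#ifdef USE_FOG
    // ...inlined Fog node code...
#endif
```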
@madjack said: Only question I have is, how hard or how much work will it be to take an already existing big shader and translate it to ShaderNode
Well, I guess how hard it is depends on your knowledge of shaders and of the system. I did the unshaded shader while building the system, so it doesn’t really count. The next step is to convert the Lighting shader… we’ll see ;)
@yang71 said: Let's begin by a simple one : How do you handle expressions ? Is there some kind of script module ? For example, if I want to make a lightMap effect, I multiply the lightMap grey level by the diffuse color. (to make it simple). How do you implement this with the editor ? Edit ; Ok I feel stupid. This is one example of your wiki page I didn't read yet. But this is still valid for more complex expressions... Do we have to write a specific node for that ?
I realize I didn't talk much about the shader node code itself in the presentation and the video. It's explained in the doc, though; in short, a shader node has a .vert or .frag shader under the hood, yes. It contains the core of the node, of course. For now the only available nodes are the ones I made, but I really hope users will contribute nodes instead of big fat shaders. I'll also make a wizard and an editor for shader nodes themselves in the SDK.
@kwando said: This is (notice the big A -->) A w e s o m e !! I will for sure take this for a spin as soon as possible =) Really great work nehon!
Thank you. I kept something special for you: the system handles MRT. You can add several out colors to the fragment shader, and the generator will handle gl_FragData, or several out colors for GLSL 1.5. This means we'll have a flexible way for users to add whatever buffers they want to their back buffer in our future deferred rendering process :D
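For context, these are the two output styles the generator has to emit for MRT, depending on the target GLSL version (a sketch; the variable names are made up):

```glsl
// GLSL 1.0/1.2: write each render target through the built-in gl_FragData array
gl_FragData[0] = diffuse;
gl_FragData[1] = normals;

// GLSL 1.5: declare one named 'out' variable per render target instead
out vec4 outDiffuse;  // target 0
out vec4 outNormals;  // target 1
```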
@Dodikles said: Hahahahahahaha, nehon, nice work. But I just nearly crapped my pants when I read the the title and your first sentence :D
hehe that was intended ;)

I think this is a much cleaner approach than injection anyway. :slight_smile:


Hehe, you finally published it - Great work, man! :slight_smile: