I took the leap and built the SDK from source to fix some issues. It’s not that difficult (but I remember thinking so a couple of years ago).
In any case, I’m working on some SDK issues now, but the Shader Editor is interesting and I can take a look at that next. Feel free to describe the issues so that I or someone else who feels inclined can start working on them with minimum effort.
And we are aware that the SDK can be a bit of a hassle to build; you need to follow the instructions to the letter. We are also in the process of making this easier with a bit of modern magic. Hopefully it will be a success.
After that, there is only the psychological fear of writing NetBeans platform code and having it all live in an IDE that isn't considered sexy right now. Minds are the most difficult thing to turn.
Now we need to write the PBRLighting.frag file. I need your help writing the fragment shader function, please. (The file colorMult.frag can be a useful reference.)
void main(){
    outColor = ....;
}
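Not a final answer, but here is a minimal sketch of what PBRLighting.frag could look like, following the colorMult.frag pattern where the inputs and the output are declared by the shader node generator rather than in the file itself. All the input names here (albedo, normal, lightDir, lightColor, metallic) are my assumptions, and this only does Lambert diffuse; a full PBR BRDF (GGX specular, Fresnel, roughness) would replace the marked line:

```glsl
void main(){
    // Hypothetical inputs, assumed declared by the generator from the
    // j3sn Input block: vec4 albedo, vec3 normal, vec3 lightDir,
    // vec4 lightColor, float metallic.
    vec3 n = normalize(normal);
    vec3 l = normalize(-lightDir);
    float ndotl = max(dot(n, l), 0.0);

    // Lambert diffuse only, dimmed by metalness (metals have no diffuse).
    // A real PBR node would add a specular BRDF term here.
    vec3 diffuse = albedo.rgb * (1.0 - metallic);

    outColor = vec4(diffuse * lightColor.rgb * ndotl, albedo.a);
}
```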
Good news: the newly created Shader Nodes are visible in the editor. There is no need to recompile the SDK. With your help, we can probably find a solution ready to use with the editor we already have.
I took a look at the Shader Node Editor last night and read some forum threads. I can’t believe it’s close to 10 years old!
In my view, the biggest “problem” is that what you’re editing is the material definition, not a material. So it’s another layer of abstraction compared to, for example, Blender, and it’s difficult to preview the results, since it doesn’t have default values (or am I wrong?).
Apart from that I found a couple of bugs, but nothing major.
What are the expectations on the editor? How can using it be simplified?
I think shader nodes do support default values. You can see it in ColorMult, where the input colors default to white → vec4(1.0):
ShaderNodeDefinitions {
    ShaderNodeDefinition ColorMult {
        Type: Fragment
        Shader GLSL100: Common/MatDefs/ShaderNodes/Basic/colorMult.frag
        Documentation {
            Multiplies two colors
            @input color1 the first color
            @input color2 the second color
            @output outColor the resulting color
        }
        Input {
            vec4 color1 vec4(1.0)
            vec4 color2 vec4(1.0)
        }
        Output {
            vec4 outColor
        }
    }
}
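For comparison, the colorMult.frag file referenced by that definition is (if I remember right) just the multiplication, because the variable declarations are generated from the Input/Output blocks of the definition:

```glsl
void main(){
    // color1, color2 and outColor are declared by the shader node
    // generator from the definition above, not in this file.
    outColor = color1 * color2;
}
```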
I stopped supporting the jMB ShaderNode tool after my PR to jME adding a GLSL AST processor was rejected; without it, I didn’t see any good way to implement a really flexible shader editor.
Well, if I’m not wrong (looking at the features), @RiccardoBlb wanted to add similar things to shaders (extending the shader language, more or less) to make things like const loops easier to write. But here I see it’s additionally about namespaces. I still don’t understand point 1, though.
Looking at the discussion, it was about making shader nodes simpler for artists, while Paul and Nehon argued that this would make shaders less flexible and modular, because the shader nodes would no longer be split into smaller nodes. Which is a fair point.
But then, why not just have both smaller and bigger nodes? I don’t see a problem (apart from having more to maintain, of course).
And still, isn’t it already possible to create custom nodes that would be easier for the artists you have in mind?
Personally, I just need a PBR node like the one in Blender (though it can be different, of course). For me this is a missing built-in feature; I don’t know how to build PBR with the current shader node elements.
I mean, if we provided PBR (and similar) nodes, I don’t see why most shaders would need ANY custom shader node. We could just wire up existing blocks without writing any shader code.
I understand that a developer might want to edit inside the PBR node, say, but most of the time you don’t need to; you just need to modify its parameters.
I have written many custom shaders based on the existing PBR shader, but I don’t use shader nodes, because I would need to create many custom “nodes” there, so it’s easier for me to just edit a single-file shader. That’s a fact for me; I don’t know how others feel.
So maybe we should just have a second layer of abstraction for shader nodes as a compromise between both (and an editor for it, of course).
I personally tried using shader nodes with Nehon’s SDK editor back when it worked and was a new feature in the SDK, but I quickly realized many of my ideas would require custom nodes that didn’t exist, so I still had to learn a lot of GLSL anyway. Then I realized it was easier for me to put an important GLSL function in a simple .glsllib file rather than turning it into a whole shader node.
However, using shader nodes is an excellent way to learn how to write basic shaders if you inspect the code they generate for you. Past that, I personally found that, as a coder, it’s more convenient and more flexible to write your shaders in code (especially if you’re forking an existing shader) and import any necessary functions from a .glsllib file.
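As a rough sketch of that workflow (all file and function names here are hypothetical): put a reusable function in a library file and pull it into a fragment shader with jME's #import directive:

```glsl
// MyLib.glsllib -- hypothetical reusable library file
vec3 desaturate(in vec3 color, in float amount){
    // Rec. 601 luma weights; blend toward grayscale by `amount`.
    float luma = dot(color, vec3(0.299, 0.587, 0.114));
    return mix(color, vec3(luma), amount);
}

// MyShader.frag -- hypothetical fragment shader using the library
#import "MatDefs/MyLib.glsllib"

uniform sampler2D m_ColorMap;
varying vec2 texCoord;

void main(){
    vec4 c = texture2D(m_ColorMap, texCoord);
    gl_FragColor = vec4(desaturate(c.rgb, 0.5), c.a);
}
```

The library function stays in one place and can be shared across every shader that imports it, which is exactly the reuse shader nodes aim for, just without the node plumbing.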
So I think Javasabr’s point about making the shader node editor easier for artists actually makes more sense, since non-coder artists are the ones who need a visual editor most; IMO, coders (who are skilled enough and have the time to spare) are better off learning to write shaders in code and ignoring shader nodes altogether. But shader nodes are definitely great for cases where someone cannot, or does not want to, invest the time to learn GLSL.
They’re also fantastic for porting shaders across backends (eyeing up tricky OpenGL version advancements or a Vulkan backend here). If all materials are defined in shader nodes and all rendering backends provide the framework for compiling shader nodes to shaders, then your shaders are portable across all backends, both existing and future ones. If you hand-write GLSL code, you potentially have to evolve your shader for every backend.