Multiple shaders on the same Mesh? (Multipass?)

Based on the searches I’ve done, it doesn’t seem like jME3 really supports multipass rendering; people typically use other methods to achieve similar effects. Is that true? I ask because I’m essentially porting a multipass/multishader implementation to jME, and it would probably be a whole lot easier if I could do this without the tricks/hacks I’m currently using (if that’s possible). I’ve been able to get close to the look I need so far (just a few more ‘shader-based effects’ to add), but I can’t help thinking there’s probably a better way, particularly if I could apply multiple shaders to a single Mesh.

Wait, we already have that - take a look at how the glow effect works.

I’m not sure that’s as flexible as what I need. That effect works as a second shader, but on an entire mesh, I believe? What I’m looking for is a way to apply shaders to specific parts of a mesh, e.g. a “mountain shader”, “water shader”, “grass shader”, etc.



http://i56.tinypic.com/291271f.jpg



See above how the mountain/grass textures blend, but the rivers/water don’t? I’m doing that by using two meshes, one for the land and one for the water. Ideally I’d be able to “paint” components of a single mesh landscape. What I’m doing now will probably work, but it’s more complicated, since I need to be able to move vertices, alter textures, etc. With two meshes instead of one, I have to make sure the two fit together perfectly to avoid ‘holes’ between them. Not impossible, but I expect some headaches once I start allowing “painting” of river/ocean terrain (like a map editor), since my pick rays will be hitting either of two meshes and I’ll need to edit the geometry and texture data for both as changes occur.



I’m in contact with someone else doing similar work who’s doing “multiple passes” and using different shaders for different aspects of the mesh. It sounds like a cleaner approach than my current technique (although, what I’m doing w/jME now seems incredibly fast in comparison, so far).

Ah ok, now I see what you mean. Well, basically you could write your own shader that contains the code for every material you use, plus a lot of if/else cases. Then you just need to determine which part of the shader to use for the fragment being rendered.



Most game engines will only allow you to use one Material per Mesh though, because otherwise you would get conflicts. (Imagine one tells it to be red and the other to be blue; what should happen then?)
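A minimal sketch of that single-shader idea, assuming a per-vertex weight painted into the mesh (all names here - m_GrassTex, m_RockTex, matWeight - are made up for illustration, not from any engine):

```glsl
// Fragment shader sketch: choose between two "sub-materials" per
// fragment, based on a weight interpolated from the vertices.
uniform sampler2D m_GrassTex; // hypothetical texture uniforms
uniform sampler2D m_RockTex;

varying vec2 texCoord;
varying float matWeight; // 0.0 = grass, 1.0 = rock, painted per vertex

void main(){
    if (matWeight < 0.5) {
        gl_FragColor = texture2D(m_GrassTex, texCoord);
    } else {
        gl_FragColor = texture2D(m_RockTex, texCoord);
    }
}
```

In practice a mix() on the weight usually looks better than a hard branch, since the interpolated weight gives you a soft transition between materials for free.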

We do have a water filter with shoreline blending

Thanks for the mention, Momoko - but my requirements are a little odd. If I were doing this for any other purpose, I wouldn’t even be messing with a mesh for the landscape; I’d do it the more standard jME way with heightmaps. But I’m actually trying to mimic the appearance of another game (including any poorly done graphics). I might be able to use that for a basic ‘wave effect’, though.



Empire - I’m relatively new to using shaders (I’ve used many 3D engines, but always at a higher level), but I think in the other case I described above, one shader feeds into another shader, which further manipulates or “adds” elements to the mesh’s overall appearance. I have my own shaders now and I’m using defines (and if statements where needed) to do what you suggest already. So I guess the way I’m doing it now is good enough? I still feel like there’s a more optimal way to go about this:



Vertex shader

[java]uniform mat4 g_WorldViewProjectionMatrix;
uniform mat4 g_WorldMatrix;

attribute vec3 inPosition;
attribute vec3 inNormal;
attribute vec4 inColor;
attribute vec2 inTexCoord;

//Below are texture alphas, NOT texCoords
attribute vec2 inTexCoord2;
attribute vec2 inTexCoord3;
attribute vec2 inTexCoord4;
attribute vec2 inTexCoord5;
attribute vec2 inTexCoord6;
attribute vec2 inTexCoord7;
attribute vec2 inTexCoord8;

//Below are texture alphas, NOT texCoords
varying vec2 texCoord2;
varying vec2 texCoord3;
varying vec2 texCoord4;
varying vec2 texCoord5;
varying vec2 texCoord6;
varying vec2 texCoord7;
varying vec2 texCoord8;

varying vec4 vertColor;
varying vec2 texCoord;
varying float texCoordY;

void main(){
    gl_Position = g_WorldViewProjectionMatrix * vec4(inPosition, 1.0);

    //Mountain texture mod
    vec4 vWorldPos = g_WorldMatrix * vec4(inPosition, 1.0);
    // GLSL 1.x: no 'f' suffix on float literals, and clamp() needs
    // float arguments (0.0/1.0, not 0/1)
    texCoordY = clamp(vWorldPos.y - 5.0, 0.0, 1.0);

    vertColor = inColor;
    texCoord.xy = inTexCoord.xy;

    //Below are texture alphas, NOT texCoords
    texCoord2 = inTexCoord2;
    texCoord3 = inTexCoord3;
    texCoord4 = inTexCoord4;
    texCoord5 = inTexCoord5;
    texCoord6 = inTexCoord6;
    texCoord7 = inTexCoord7;
    texCoord8 = inTexCoord8;
}[/java]



Fragment shader

[java]uniform sampler2D m_desertTex;
uniform sampler2D m_steppeTex;
uniform sampler2D m_mountainTex;
uniform sampler2D m_mountainSnowTex;
uniform sampler2D m_oceanTex;
uniform sampler2D m_plainTex;
uniform sampler2D m_riverTex;
uniform sampler2D m_snowTex;

uniform float m_desertScale;
uniform float m_steppeScale;
uniform float m_mountainScale;
uniform float m_oceanScale;
uniform float m_plainScale;
uniform float m_riverScale;
uniform float m_snowScale;

varying vec4 vertColor;
varying vec2 texCoord;
varying float texCoordY;

//Below values used for texture alphas, not tex coords
varying vec2 texCoord2;
varying vec2 texCoord3;
varying vec2 texCoord4;
varying vec2 texCoord5;
varying vec2 texCoord6;
varying vec2 texCoord7;
varying vec2 texCoord8;

void main(){
    vec4 outColor = vec4(0.5);

#ifdef USE_VERTEX_COLOR
    outColor = vertColor;
#else
    vec4 des = texture2D(m_desertTex, texCoord.xy * m_desertScale);
    vec4 ste = texture2D(m_steppeTex, texCoord.xy * m_steppeScale);
    vec4 mou = texture2D(m_mountainTex, texCoord.xy * m_mountainScale);
    vec4 oce = texture2D(m_oceanTex, texCoord.xy * m_oceanScale);
    vec4 pla = texture2D(m_plainTex, texCoord.xy * m_plainScale);
    vec4 riv = texture2D(m_riverTex, texCoord.xy * m_riverScale);
    riv = (riv + oce) / 1.7;
    vec4 sno = texture2D(m_snowTex, texCoord.xy * m_snowScale);

    outColor = mix(outColor, oce, texCoord2.x);
    outColor = mix(outColor, oce, texCoord2.y);
    outColor = mix(outColor, mou, texCoord3.x);
    outColor = mix(outColor, mou, texCoord3.y);
    outColor = mix(outColor, ste, texCoord4.x);
    outColor = mix(outColor, ste, texCoord4.y);
    outColor = mix(outColor, pla, texCoord5.x);
    outColor = mix(outColor, pla, texCoord5.y);
    outColor = mix(outColor, sno, texCoord6.x);
    outColor = mix(outColor, sno, texCoord6.y);
    outColor = mix(outColor, des, texCoord7.x);
    outColor = mix(outColor, des, texCoord7.y);
    outColor = mix(outColor, riv, texCoord8.x);
    // was 'des' here; every other pair uses the same texture for .x
    // and .y, so this looks like a copy-paste slip
    outColor = mix(outColor, riv, texCoord8.y);

    vec4 mountSnow = texture2D(m_mountainSnowTex, texCoord.xy * m_mountainScale);

    //clamp((mou.r + mou.g + mou.b) / 3,
    // clamp() needs float literals (0.0/1.0) in GLSL 1.x
    outColor = mix(outColor, mountSnow, clamp(texCoordY, 0.0, 1.0) * (texCoord3.x + texCoord3.y));
#endif

    gl_FragColor = outColor;
}[/java]
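One thing to note about this fragment shader: the chained mix() calls are order-dependent, since each later layer overwrites whatever the earlier layers produced, in proportion to its alpha. If that ever becomes a problem, an order-independent alternative is a normalized weighted sum. A sketch with just three of the layers (the same idea extends to all eight; the function name is made up):

```glsl
// Order-independent blend: each texture contributes in proportion to
// its weight, and the weights are normalized so they sum to 1.
vec4 blendLayers(vec4 oce, vec4 mou, vec4 ste,
                 float wOce, float wMou, float wSte) {
    float total = wOce + wMou + wSte + 0.0001; // guard against /0
    return (oce * wOce + mou * wMou + ste * wSte) / total;
}
```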

Funny… when I looked at that map it reminded me so much of that game, and when I read your next comment I was sure :stuck_out_tongue:

I always hated that water because there’s no difference between sea and river, but if you like it that way just keep going and good luck with your project!

I guess one example of a use for this would be doing something like this:



http://fabiensanglard.net/shadowmappingVSM/index.php



Separate “blur shaders” in addition to shadow shaders. I’m not sure this shadow technique is possible with jME currently - or is it? (Also, he says you need to cull front faces (and then back faces) during different stages of the rendering.)
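As I understand that VSM article, the pipeline is: render depth and depth² into a texture, blur that texture, then sample it from the main shadow pass. The blur stage is an ordinary fullscreen pass run once per axis; a sketch of one axis of the separable blur (names like m_ShadowMap and m_TexelOffset are illustrative, not jME conventions):

```glsl
// One axis of a separable box blur over the shadow map.
// Run once with m_TexelOffset = (1/width, 0), then again
// with (0, 1/height) for the other axis.
uniform sampler2D m_ShadowMap;
uniform vec2 m_TexelOffset;

varying vec2 texCoord;

void main(){
    vec4 sum = vec4(0.0);
    for (int i = -2; i <= 2; i++) { // 5 taps along one axis
        sum += texture2D(m_ShadowMap, texCoord + float(i) * m_TexelOffset);
    }
    gl_FragColor = sum / 5.0;
}
```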

So, is this something that:



  • Just isn't implemented yet?

  • Was rejected as a feature for some reason (why?)

  • Is already possible using some method not yet mentioned?

  • Is not possible, or I'm misinterpreting what "multiple passes" actually means?

Not implemented yet

Okay, thanks - not a bad answer. For now I’ll just do what I can with workarounds. I know jME3 is still in heavy development, but even at this point I’m pretty impressed.

Hey, can you tell me which game you are actually trying to copy?