Hmmmm... alternatives to Vector Arrays in Material Def files?

Yep… title says it all. Vector arrays are not currently supported in JME MatDefs… so… alternatives? gl_FragData? /shrug… not sure whether to proceed with my current idea or scrap it for a whole new approach.

m_vect1, m_vect2, etc.? I think it's the only way.

@normen Thank you! I think a SIMPLE shader-based particle system may be out of the question just yet, then. I was looking at this as an option for rain/snow/etc. as a post-process filter, but… on second thought… :wink:

Well… A simple way to get an “array” in is a texture :slight_smile: Normal maps encode vectors in the image data, also noise maps are used a lot.
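
(Side note: a normal map stores each vector's components remapped from [-1,1] into [0,1]; the shader decodes a texel back with something like `vec3 v = texture2D(m_NormalMap, uv).rgb * 2.0 - 1.0;`, so a texture can carry arbitrary per-texel vectors the same way.)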

Since you probably don’t want to edit vector arrays in the material editor, you don’t need to include them in the MatDef. Just define them in the shader like this …

[java]
uniform vec4 g_LightPosition[NUM_LIGHTS];
uniform vec4 g_LightDirection[NUM_LIGHTS];
uniform vec4 g_LightColor[NUM_LIGHTS];
[/java]
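
(One caveat: a GLSL uniform array declared with a size like this needs a compile-time constant, so `NUM_LIGHTS` has to come from a `#define` at the top of the shader, either hard-coded or injected through a Define in the MatDef's Technique block.)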

… and access them from Java like this:

[java]
Uniform lightColor = shader.getUniform("g_LightColor");
Uniform lightPos = shader.getUniform("g_LightPosition");
Uniform lightDir = shader.getUniform("g_LightDirection");

lightColor.setVector4Length(numLights);
lightPos.setVector4Length(numLights);
lightDir.setVector4Length(numLights);

for (int i = 0; i < numLights; i++) {
    Light l = lightList.get(i);
    ColorRGBA color = l.getColor();
    lightColor.setVector4InArray(color.getRed(), color.getGreen(), color.getBlue(), l.getType().getId(), i);
    // …
}
[/java]



I’m doing it this way in Project: RealSlimShader (see MaterialSP).

@normen I was looking at that as a possibility. Silly question, but since this is never written to the screen, do you pass the information directly into gl_FragColor? Or is it gl_FragData[x]? I'm also still a bit confused on the JME side… Is there a filter that currently does this that I can pick through to understand how to construct the texture being passed in?

@survivor Exactly what I was looking for as an example… most appreciated.

@t0neg0d said:
@normen I was looking at that as a possibility. Silly question, but since this is never written to the screen, do you pass the information directly into gl_FragColor? Or is it gl_FragData[x]? I'm also still a bit confused on the JME side... Is there a filter that currently does this that I can pick through to understand how to construct the texture being passed in?

On the Java side:
[java]
material.setTexture("YourTextureParamName", yourTexture);
[/java]

YourTextureParamName must be defined in your j3md file as a Texture2D in the MaterialParameters section.
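
For reference, a minimal sketch of that declaration in the j3md (`YourTextureParamName` is just a placeholder):

[java]
MaterialParameters {
    Texture2D YourTextureParamName
}
[/java]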

Then, in your frag shader, declare it as a sampler2D uniform with an "m_" prefix:

uniform sampler2D m_YourTextureParamName;

Then you can fetch from it, for example:

vec4 color = texture2D(m_YourTextureParamName, texCoord);

The yourTexture object on the Java side has to be a Texture, e.g. one loaded via the assetManager, a texture attached to a frameBuffer that was previously rendered to, or a texture generated from an AWT image.
There are some convenience methods in the ImageToAwt tool class for that.
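
Put together, the assetManager route is a one-liner on each side (a sketch; the texture path here is made up):

[java]
// Load any texture through the assetManager and hand it to the material.
Texture yourTexture = assetManager.loadTexture("Textures/noise.png"); // hypothetical path
material.setTexture("YourTextureParamName", yourTexture);
[/java]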

@nehon I’m familiar with this. I’m trying to figure out how to inject vec arrays as a texture. But, I think I’m going to end up doing what @survivor did.



Oddly enough, it looks like updateLightListUniforms is the only place you have access to the shader during a render pass.

Uniforms are meant for passing a small number of parameters. If you want to pass a larger amount of data, I'd use a texture like @normen proposed; that's why it's more generally called a sampler (a color is also a vector). But there's interpolation, LOD selection, and that kind of stuff going on in the background, I think. Maybe there's also a way to add custom attributes to the mesh. Depends on what you need.

@survivor I'm trying to avoid using geometry altogether. I'd like to just pass in vectors, convert those to pixel coords in the vertex shader (worldview * modelview * vector) and then match them in the fragment shader to "paint" the effect I want in surrounding pixels based on the camera angle. I have most of this complete; however, I need to find a way to inject the vectors.



I'm not sure how to create the texture on the JME side to contain this information. Create an image buffer the size of the vector array, or…?

@t0neg0d said:
@nehon I'm familiar with this. I'm trying to figure out how to inject vec arrays as a texture. But, I think I'm going to end up doing what @survivor did.

Oddly enough, it looks like updateLightListUniforms is the only place you have access to the shader during a render pass.

Use a texture, and set the mag filter to Nearest if you don't want interpolation issues.
A 32x32 RGB float texture can be considered a 32x32 vector array.
Fetching it is optimized in the shader; branching to iterate over an array is not.
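
To make that concrete, here is a minimal sketch (mine, not from the thread) of packing a `List<Vector3f> vectors` into a 32x32 RGB float texture on the Java side; `VectorData` is a hypothetical Texture2D param in the j3md:

[java]
import java.nio.ByteBuffer;
import com.jme3.math.Vector3f;
import com.jme3.texture.Image;
import com.jme3.texture.Texture;
import com.jme3.texture.Texture2D;
import com.jme3.util.BufferUtils;

// ...

int size = 32; // room for size * size vectors
ByteBuffer data = BufferUtils.createByteBuffer(size * size * 3 * 4); // 3 floats per texel
for (Vector3f v : vectors) {
    data.putFloat(v.x).putFloat(v.y).putFloat(v.z);
}
while (data.hasRemaining()) { // pad unused texels
    data.putFloat(0f);
}
data.flip();

Image img = new Image(Image.Format.RGB32F, size, size, data);
Texture2D tex = new Texture2D(img);
tex.setMagFilter(Texture.MagFilter.Nearest); // no blending between "array" entries
tex.setMinFilter(Texture.MinFilter.NearestNoMipMaps);
material.setTexture("VectorData", tex);

// Entry i is then fetched in the shader with something like:
// vec3 v = texture2D(m_VectorData, vec2((mod(float(i), 32.0) + 0.5) / 32.0,
//                                       (floor(float(i) / 32.0) + 0.5) / 32.0)).rgb;
[/java]
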
@t0neg0d said:
@survivor I'm trying to avoid using geometry all together. I'd like to just pass in vectors, convert those to pixel coords in the vertex shader (worldview*modelview*vector) and then match them in the fragment shader to "paint" the effect I want in surrounding pixels based on on the camera angle. I have most of this complete, however, I need to find a way to inject the vectors.

I'm not sure how to create the texture on the JME side to contain this information. Create an image buffer the size of the vector array? or?


Note: no matter what, you will always have to send _some_ geometry. Otherwise, you will have nothing for the .vert shader to run on.

In the end, I'm not sure what sending an array of stuff buys you over sending geometry if all you are trying to do is turn that array into geometry. After all, that is the SLOWEST way to get vertexes to a shader.

An array is useful if you need some list of things when the vert shader operates on your vertexes or when the frag shader operates on the pixels... but if you just need vertexes then send them and avoid the array.

Or maybe describe more specifically what you are trying to do and we might be able to help further.
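
For completeness, a sketch of the direct route: putting the vectors straight into a point mesh instead of a uniform array (the names here are illustrative, and `vectors` is the same `List<Vector3f>` as above):

[java]
import com.jme3.math.Vector3f;
import com.jme3.scene.Geometry;
import com.jme3.scene.Mesh;
import com.jme3.scene.VertexBuffer;
import com.jme3.util.BufferUtils;

// ...

Mesh mesh = new Mesh();
mesh.setMode(Mesh.Mode.Points); // one point per vector/particle
Vector3f[] positions = vectors.toArray(new Vector3f[0]);
mesh.setBuffer(VertexBuffer.Type.Position, 3, BufferUtils.createFloatBuffer(positions));
mesh.updateBound();

Geometry geom = new Geometry("particles", mesh);
geom.setMaterial(material); // your custom point/particle material
rootNode.attachChild(geom);
[/java]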