Point mesh issues with gl_PointCoord

Hi.

I have an issue displaying a texture correctly on a point mesh. The same issue is present in the TestPointSprite example.
Here is a test case:
[java]
mesh = new Mesh();
mesh.setMode(Mesh.Mode.Points);
mesh.setBuffer(VertexBuffer.Type.Position, 3, BufferUtils.createFloatBuffer(0f, 0f, 0f));
mesh.updateCounts();
mesh.updateBound();

material = new Material(env.getAssetManager(), "MatDefs/Particle.j3md");
material.setTexture("Texture", env.getAssetManager().loadTexture("Effects/Explosion/Debris.png"));

Geometry geom = new Geometry("ware", mesh);
geom.setMaterial(material);
geom.setLocalTranslation(0, 1, 0);
env.getRootNode().attachChild(geom);

[/java]

Using the following material definition:
[java]
MaterialDef Point Sprite {

    MaterialParameters {
        Texture2D Texture
    }

    Technique {

        VertexShader   GLSL100 : MatDefs/Particle.vert
        FragmentShader GLSL120 : MatDefs/Particle.frag

        WorldParameters {
            WorldViewProjectionMatrix
            WorldViewMatrix
            WorldMatrix
            CameraPosition
        }

        RenderState {
            Blend AlphaAdditive
            DepthWrite Off
            PointSprite On
        }
    }
}
[/java]
The fragment shader:
[java]
uniform sampler2D m_Texture;

void main() {
    gl_FragColor = texture2D(m_Texture, gl_PointCoord.xy);
}
[/java]
The vertex shader:
[java]
uniform mat4 g_WorldViewProjectionMatrix;

attribute vec3 inPosition;

void main() {
    vec4 pos = vec4(inPosition, 1.0);
    gl_Position = g_WorldViewProjectionMatrix * pos;
    gl_PointSize = 100.0;
}
[/java]

The expected result is the texture stretched over the point quad, but the texture coordinates seem to vary with the camera position.
See http://postimg.org/image/wug93k2wr/ccbe78db/ vs http://postimg.org/image/7ji79nyyp/bdf20b34/ vs http://postimg.org/image/vngf1596f/acb91f8e/, taken as I moved the camera closer.

In a demo example shipped with my video card driver, the following C code is used without any issue:
[java]
// Init

static const char *fragShaderText =
    "#version 120 \n"
    "uniform sampler2D tex0; \n"
    "void main() { \n"
    "   gl_FragColor = texture2D(tex0, gl_PointCoord.xy, 0.0); \n"
    "}\n";
static const char *vertShaderText =
    "void main() {\n"
    "   gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;\n"
    "   gl_PointSize = 40.0;\n"
    "}\n";

if (!ShadersSupported())
    exit(1);

vertShader = CompileShaderText(GL_VERTEX_SHADER, vertShaderText);
fragShader = CompileShaderText(GL_FRAGMENT_SHADER, fragShaderText);
program = LinkShaders(vertShader, fragShader);

glUseProgram(program);
// snip
assert(glIsProgram(program));
assert(glIsShader(fragShader));
assert(glIsShader(vertShader));

MakeTexture();

glEnable(GL_POINT_SPRITE);
glEnable(GL_VERTEX_PROGRAM_POINT_SIZE_ARB);
glColor3f(1, 0, 0);

// maketexture:
//snip
glActiveTexture(GL_TEXTURE0); /* unit 0 */
glBindTexture(GL_TEXTURE_2D, 42);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, SZ, SZ, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, image);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, Filter);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, Filter);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

// camera

glViewport(0, 0, width, height);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, width, 0, height, -100, 100);

glMatrixMode(GL_MODELVIEW);
glLoadIdentity();

// drawing
glPointParameteri(GL_POINT_SPRITE_COORD_ORIGIN, Origin);

glClear(GL_COLOR_BUFFER_BIT);

/* draw one point/sprite */
glPushMatrix();
glBegin(GL_POINTS);
glVertex3f(WinWidth / 2.0f, WinHeight / 2.0f, -50);
glEnd();
glPopMatrix();

glutSwapBuffers();

[/java]

The texture is stretched correctly regardless of the z parameter of the glVertex3f call.

I can’t be sure from your description, but you might be getting hit by the maximum sprite size. On some cards (looking at you, ATI, mostly) the maximum on-screen pixel size for a sprite is 64 pixels. (There is actually a way in OpenGL to query this limit, but JME doesn’t provide easy access to the value; see the sketch below.)

This means that if you are close to the sprite it will still be drawn as only 64 on-screen pixels. Some cards (all of my nvidia cards) have no such limit. It’s kind of frustrating as it severely limits the overall utility of point sprites for lots of things.
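
A minimal sketch of that query, assuming you can reach down to LWJGL (the GL binding jME uses) from the render thread while a GL context is active:
[java]
import java.nio.FloatBuffer;

import org.lwjgl.BufferUtils;
import org.lwjgl.opengl.GL11;
import org.lwjgl.opengl.GL12;

public class PointSizeQuery {

    /** Must run on the render thread, with an active GL context. */
    public static void printPointSizeRange() {
        // LWJGL requires the result buffer to hold at least 16 floats.
        FloatBuffer range = BufferUtils.createFloatBuffer(16);
        GL11.glGetFloat(GL12.GL_ALIASED_POINT_SIZE_RANGE, range);
        // Index 0 is the minimum supported point size, index 1 the maximum.
        System.out.println("Point size range: " + range.get(0) + " .. " + range.get(1));
    }
}
[/java]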

Actually… your issue is different. I think it’s because you have no texture coordinates.

Edit: why not just use the existing particle materials as a reference?
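
For example, a sketch from memory (untested, assuming an assetManager field as in SimpleApplication; parameter names may differ slightly between jME versions):
[java]
// The stock particle material shipped with jME; the texture path is the
// one from your snippet above.
Material mat = new Material(assetManager, "Common/MatDefs/Misc/Particle.j3md");
mat.setTexture("Texture", assetManager.loadTexture("Effects/Explosion/Debris.png"));
// The stock material exposes point sprite mode as a parameter (if memory serves).
mat.setBoolean("PointSprite", true);
geom.setMaterial(mat);
[/java]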

The material I posted is a stripped-down version of the particle material. As I said, the issue is also present in TestPointSprite, which uses the distributed Particle material.
I don’t use texture coordinates for my mesh because I don’t use them in the simplified shader.
I’m pretty sure the issue is not related to the maximum point sprite size, as I tweaked that parameter in the C example.

Also, I have noticed that the sprite texture is upside down, and that it shrinks to nothing and then quickly grows back when getting very close. I will look into the GL context used in jME and compare it with the one used in the C example, especially concerning the projection.

Maybe there is something wrong with your display drivers, then. TestPointSprite works just fine for me.

Your C code is not at all equivalent, though: it renders in ortho mode, it positions the geometry instead of positioning the camera, etc.

I’m not sure what texture coordinates you think the .frag will be using if you don’t provide them… and that’s the part you’ve stripped out of the regular particle shader. Ah, I see you are just using the point coord as the texture coordinate. Actually, if you are in ortho mode then the image shouldn’t change at all as you move it closer and farther away… so perhaps even the C code is not operating correctly.

Indeed, it’s probably a bug in the Intel DRI Mesa drivers.
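
For anyone hitting this: a possible workaround is to avoid point sprites entirely and use jME’s quad-based particles, which don’t rely on gl_PointCoord. A minimal sketch, assuming the stock ParticleEmitter API:
[java]
import com.jme3.effect.ParticleEmitter;
import com.jme3.effect.ParticleMesh;

// Quad-based particles (Type.Triangle) build real textured quads with their
// own texture coordinates, so point sprite driver bugs do not affect them.
ParticleEmitter debris = new ParticleEmitter("debris", ParticleMesh.Type.Triangle, 30);
debris.setMaterial(material); // the same Particle material as above
rootNode.attachChild(debris);
[/java]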