World coordinates in shader

I’m trying to get a shader working which relies on coloring parts of the world a certain color.
I can’t seem to get it to work however; no matter what I do, I’m not able to get actual world coordinates of what’s being rendered on screen.

The vertex shader (reduced to the simplest nonworking case):
[java]
uniform mat4 g_WorldViewProjectionMatrix;
uniform mat4 g_WorldMatrix;
uniform float m_tileSize;
uniform int m_maxTileX;
uniform int m_maxTileZ;
uniform float[ARRSIZE] m_fowData;

uniform float m_cameraX;
uniform float m_cameraZ;

in vec4 inPosition;
in vec2 inTexCoord;

out vec2 texCoord;
out float fogFactor;

void main() {
    vec4 worldCoord = g_WorldMatrix * inPosition;
    if (worldCoord.x < 1024) {
        fogFactor = 1.0;
    } else {
        fogFactor = 0.0;
    }

    texCoord = inTexCoord;
    vec2 pos = (g_WorldViewProjectionMatrix * inPosition).xy;
    gl_Position = vec4(pos, 0.0, 1.0);
}
[/java]

The line [java]vec4 worldCoord = (g_WorldMatrix * inPosition);[/java] should, as far as I understand, give the actual world coordinates of the vertex being rendered, but for some reason it seems to return a screen-pixel-related value instead. Hence the line [java]if (worldCoord.x < 1024) {[/java] - 1024 happens to be the width of the window, and acts as a pivot point: any value below it half-colors the screen, and any value above it fully colors the screen.

The fragment shader:
[java]
#import "Common/ShaderLib/MultiSample.glsllib"
uniform COLORTEXTURE m_Texture;
uniform vec4 m_FowColor;

in vec2 texCoord;

in float fogFactor;

void main() {
    vec4 texVal = getColor(m_Texture, texCoord);
    gl_FragColor = mix(texVal, m_FowColor, fogFactor);
}

[/java]

And the material definition:
[java]
MaterialDef FogOfWar {

MaterialParameters {
	Int NumSamples
	Int NumSamplesDepth
    Texture2D Texture
    
    Vector4 FowColor
    FloatArray fowData
    Float tileSize
    Int maxTileX
    Int maxTileZ
    Int arrSize
    Float cameraX
    Float cameraZ
}



Technique {
    VertexShader GLSL150:   MatDefs/Post/Fow15.vert
    FragmentShader GLSL150: MatDefs/Post/Fow15.frag

    WorldParameters {
        WorldMatrix
        WorldViewProjectionMatrix
        WorldViewMatrix
    }	
    Defines {
		ARRSIZE : arrSize
	}
}

Technique {
    VertexShader GLSL100:   MatDefs/Post/Fow.vert
    FragmentShader GLSL100: MatDefs/Post/Fow.frag

    WorldParameters {
        WorldMatrix
        WorldViewProjectionMatrix
        WorldViewMatrix
    }	
    Defines {
		ARRSIZE : arrSize
	}
}

}

[/java]

The output:

The main problem with this output is that the filter applied to the screen doesn't change at all as the camera moves around, even though it's supposed to be based on world coordinates. On top of that, I'm not sure how I ended up with a color gradient when the only possible colors should be fully green or normal.

Am I doing anything obviously wrong?
Thanks in advance.

Never mind - I think I understand my problem now, after taking a close look at the water filter.
Since I was running this on a FilterPostProcessor, the geometry actually being rendered was just a plane overlaid on the camera, so g_WorldMatrix couldn't return the positions I wanted. The water filter gets around this by reconstructing the position of each pixel from the depth buffer, using the inverse view-projection matrix and the camera position.
The relevant code is:
[java]
// Reconstructs the world-space position of a pixel from its depth.
vec3 getPosition(in float depth, in vec2 uv) {
    // Back from [0,1] texture/depth space to [-1,1] NDC, then unproject.
    vec4 pos = vec4(uv, depth, 1.0) * 2.0 - 1.0;
    pos = m_ViewProjectionMatrixInverse * pos;
    return pos.xyz / pos.w;
}

float sceneDepth = fetchTextureSample(m_DepthTexture, texCoord, sampleNum).r;

vec3 pos = getPosition(sceneDepth, texCoord);
[/java]
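For anyone else who ends up here: the reconstruction above is just an unprojection through the inverse view-projection matrix. Here's a quick CPU-side sanity check of that math in plain Python - the projection matrix, test point, and helper names are all made up for illustration, nothing here comes from jME itself. It projects a known point to uv/depth the way the GPU would, then recovers it exactly the way getPosition() does:

```python
import math

def perspective(fovy_deg, aspect, near, far):
    # Standard OpenGL-style perspective projection matrix.
    f = 1.0 / math.tan(math.radians(fovy_deg) / 2.0)
    return [
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), 2.0 * far * near / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ]

def mat_vec(m, v):
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def solve(m, v):
    # Gauss-Jordan solve of m @ x = v, i.e. inverse(m) @ v.
    a = [row[:] + [v[i]] for i, row in enumerate(m)]
    for col in range(4):
        piv = max(range(col, 4), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        p = a[col][col]
        a[col] = [x / p for x in a[col]]
        for r in range(4):
            if r != col:
                f = a[r][col]
                a[r] = [x - f * y for x, y in zip(a[r], a[col])]
    return [a[r][4] for r in range(4)]

# Project a known point (identity view, so view space == world space).
proj = perspective(45.0, 16.0 / 9.0, 1.0, 100.0)
world = [2.0, 3.0, -10.0, 1.0]
clip = mat_vec(proj, world)
ndc = [c / clip[3] for c in clip[:3]]
uv_depth = [(n + 1.0) / 2.0 for n in ndc]  # what the uv/depth buffers hold

# Same steps as getPosition(depth, uv): back to NDC, unproject, divide by w.
pos = [c * 2.0 - 1.0 for c in uv_depth] + [1.0]
back = solve(proj, pos)
recovered = [back[i] / back[3] for i in range(3)]
print(recovered)  # ≈ [2.0, 3.0, -10.0]
```

The key point is the final divide by w: the inverse matrix gives back a homogeneous coordinate, not a position, so skipping `pos.xyz / pos.w` yields garbage.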