<cite>@t0neg0d said:</cite>
Atmospheric Scattering will more than likely end up as a Filter... you can pre-render whatever you need in the postQueue method; just remember to grab a reference to the RenderManager in the initFilter method.
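For example, a minimal sketch of caching those references when the Filter initializes (the field names are illustrative):
[java]
// Cache the references that postQueue will need later
private RenderManager rm;
private ViewPort vp;

@Override
protected void initFilter(AssetManager manager, RenderManager renderManager,
        ViewPort viewPort, int w, int h) {
    rm = renderManager;
    vp = viewPort;
    // Off-screen buffers and materials can be created here as well
}
[/java]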
However, I’m a little lost on what you are pre-rendering. Are the images already created? If they are (which I am assuming is the case), you just need to pass these into the composite shader (or the shader last used in your Filter).
From the look of the GL calls above, they are using the rendered scene (from a frame buffer) and passing in the transmittanceTexture for use in the light-scattering process.
EDIT: Hmmm… actually, it looks like the transmittanceTexture is a pre-rendered portion of the scene… I am guessing? Unless you can be a bit more specific about what it is you need to render for the final output, it is going to be almost impossible to point you in the right direction. All I do know is that this will end up being a Filter, so you may want to take a look at a few of the stock JME Filters to see how they set up and use FrameBuffers for both Passes and forced rendering.
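For instance, the off-screen buffer such a Filter renders into could be created along these lines (a rough sketch; the names and formats are assumptions, not the stock Filters' exact code):
[java]
// An off-screen FrameBuffer that renders into a color texture,
// roughly what the stock Filters' internal Pass sets up
FrameBuffer ghostBuffer = new FrameBuffer(w, h, 1);
Texture2D ghostTex = new Texture2D(w, h, Image.Format.RGBA8);
ghostBuffer.setColorTexture(ghostTex);
ghostBuffer.setDepthBuffer(Image.Format.Depth);
[/java]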
EDIT 2: A quick and dirty example of forced rendering:
[java]
@Override
protected void postQueue(RenderQueue renderQueue) {
    // rm is the RenderManager reference grabbed in initFilter()
    Renderer r = rm.getRenderer();
    // Redirect rendering to something other than the scene output FB
    r.setFrameBuffer(ghostBuffer);
    // Clear the color, depth and stencil buffers
    r.clearBuffers(true, true, true);
    // Set a forced material (you probably don't need this)
    rm.setForcedMaterial(ghostMat);
    // In this case I am rendering a GeometryList
    rm.renderGeometryList(geoms);
    // Clear the forced material
    rm.setForcedMaterial(null);
    // Reset the frame buffer back to the original
    r.setFrameBuffer(vp.getOutputFrameBuffer());
}
[/java]
The texture backing ghostBuffer is later sampled in the composite frag shader.
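Feeding that pre-rendered result into the composite shader is just a texture parameter on the Filter's material; assuming a "GhostMap" uniform declared in the material definition (the name is illustrative):
[java]
// Bind the off-screen texture to the composite material;
// "GhostMap" is an assumed parameter name from the .j3md file
getMaterial().setTexture("GhostMap", ghostTex);
[/java]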
Hi,
The images are not already created. A full-screen quad is rendered, the colors of the quad are computed in the fragment shader using the atmospheric scattering calculations, and the result is rendered to a texture. The final texture is simply a table of values that are looked up when applying the scattering. For example, this is the transmittance texture after it has been created in the shader:
[image: rendered transmittance texture]
For further reference, here are the actual shaders that render it:
[java]
// Vertex shader: pass the full-screen quad through in clip space
void main() {
    gl_Position = gl_Vertex;
}

// Fragment shader: integrate optical depth along the ray using the
// trapezoidal rule over TRANSMITTANCE_INTEGRAL_SAMPLES steps
float opticalDepth(float H, float r, float mu) {
    float result = 0.0;
    float dx = limit(r, mu) / float(TRANSMITTANCE_INTEGRAL_SAMPLES);
    float xi = 0.0;
    float yi = exp(-(r - Rg) / H);
    for (int i = 1; i <= TRANSMITTANCE_INTEGRAL_SAMPLES; ++i) {
        float xj = float(i) * dx;
        float yj = exp(-(sqrt(r * r + xj * xj + 2.0 * xj * r * mu) - Rg) / H);
        result += (yi + yj) / 2.0 * dx; // trapezoid area for this step
        xi = xj;
        yi = yj;
    }
    // Rays that intersect the ground get an effectively infinite depth
    return mu < -sqrt(1.0 - (Rg / r) * (Rg / r)) ? 1e9 : result;
}

void main() {
    float r, muS;
    getTransmittanceRMu(r, muS);
    // Combine Rayleigh and Mie extinction
    vec3 depth = betaR * opticalDepth(HR, r, muS) + betaMEx * opticalDepth(HM, r, muS);
    gl_FragColor = vec4(exp(-depth), 0.0); // Eq (5)
}
[/java]
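Going by the forced-rendering example above, I am guessing the precompute step in the Filter would look roughly like this (transmittanceMat, lutBuffer, fullScreenQuad and lutComputed are placeholder names; lutBuffer is assumed to target a float texture such as RGBA16F so the lookup values keep their precision):
[java]
// Sketch: render the transmittance LUT once in postQueue
if (!lutComputed) {
    Renderer r = rm.getRenderer();
    r.setFrameBuffer(lutBuffer);            // render into the LUT's FrameBuffer
    r.clearBuffers(true, true, true);
    rm.setForcedMaterial(transmittanceMat); // material wrapping the shaders above
    rm.renderGeometry(fullScreenQuad);      // quad with clip-space coordinates
    rm.setForcedMaterial(null);
    r.setFrameBuffer(vp.getOutputFrameBuffer());
    lutComputed = true;
}
[/java]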
I will look into the source for the stock Filters; thanks for the help.