This is intended to start a dev discussion about which way we should go.
So summoning the brains @pspeed, @normen, @momoko_fan, @Sploreg, @kwando
Of course anyone is invited to the discussion.
So I looked into Gamma correction recently…I always thought it was just a way to tweak the rendered picture and that it was a matter of taste…
It’s not.
Our lighting material really suffers from not having it, because the lighting calculation is wrong. I think implementing it in the engine will enhance the shader's visuals nicely.
So for those who don't have a clue what's behind the term, here are some explanations (for the others, jump to Implementation):
We consider color values to be linear when computing lighting. What does that mean? It means that we consider the color (0.5, 0.5, 0.5) to be halfway between black and white.
The problem is that it’s not the case, or at least not when you look at the color through a monitor.
CRT monitors had physical limitations that prevented them from having a linear response to color. That means that (0.5, 0.5, 0.5) through a monitor is not halfway between black and white (it's darker). Note that black and white themselves remain the same, though.
If we do not take that into account, the rendered images are overly dark and feel dull.
LCD monitors still mimic this physical limitation (I guess for backward compatibility).
Outputting correct colors:
Gamma correction is the technique that corrects the issue. Gamma is a power factor applied to the color by the monitor when lighting a pixel on screen (or at least that's a simplification of the function applied). So when we output the color, we have to apply the inverse power of this factor to cancel the effect: finalColor = pow(computedColor, 1/gamma);
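To make the output step concrete, here is a minimal Java sketch of that formula on one channel (2.2 is the usual approximation of display gamma; in the engine this would happen in the shader or on the GPU):

```java
public class GammaEncode {
    // 2.2 is the common approximation of a monitor's gamma.
    public static final double GAMMA = 2.2;

    // Apply the inverse power so the monitor's response cancels out:
    // finalColor = pow(computedColor, 1/gamma), per channel.
    public static double encode(double linear) {
        return Math.pow(linear, 1.0 / GAMMA);
    }

    public static void main(String[] args) {
        // Linear 0.5 ends up around 0.73 in the framebuffer, which the
        // monitor then darkens back to a perceived middle gray.
        System.out.println(GammaEncode.encode(0.5));
    }
}
```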
Knowing what colors we have as input
The other aspect of gamma correction concerns the colors we get as input to the rendering process, stored in images. Almost all image editors store color data already gamma corrected (so that colors look correct when you display the picture in a viewer or in a browser).
Such images are said to be in sRGB space, meaning "standard RGB" (which is not the standard one would guess).
That means that the textures we use as input to our shaders are not in linear space. The issue is that we need them in linear space when we compute the lighting… so the lighting is wrong.
To avoid this we need to apply some gamma correction to the color read from the image (pow(color, gamma)).
This only applies to textures that contribute color on screen (basically diffuse maps, specular maps, light maps). Normal maps and height maps don't need the correction.
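The input step is the same power applied the other way. A small Java sketch (again using the 2.2 approximation rather than the exact sRGB curve) shows why it matters:

```java
public class GammaDecode {
    public static final double GAMMA = 2.2;

    // Linearize an sRGB-encoded value: pow(color, gamma), per channel.
    public static double decode(double srgb) {
        return Math.pow(srgb, GAMMA);
    }

    public static void main(String[] args) {
        // A texel stored as 0.5 is only about 0.22 in linear light;
        // lighting computed directly on the stored value therefore
        // works on a value more than twice too large.
        System.out.println(GammaDecode.decode(0.5));
    }
}
```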
This is the kind of difference you can have:
Left is the non-corrected output, right is the gamma-corrected output.
Implementation
So…now we have several choices.
First implementation:
Since OpenGL 3.0 there is an ARB extension that lets you output a color in linear space and have the GPU correct it automatically.
The extension is ARB_framebuffer_sRGB: https://www.opengl.org/registry/specs/ARB/framebuffer_sRGB.txt
So to make it work in JME (with LWJGL), we'd have to create a Display with a PixelFormat that supports sRGB (there is a convenient withSRGB method on PixelFormat).
Then we have to call glEnable(GL30.GL_FRAMEBUFFER_SRGB) in the renderer. This would handle the corrected output.
We'd also have to figure out how it can be done with JOGL, the Android renderer, and the iOS renderer (I'm not sure it's supported on OpenGL ES 2.0), but I guess/hope there are equivalents.
Now for the input: instead of the classic RGBA8 image format, you can use SRGB8_ALPHA8_EXT, which is basically RGBA8 in sRGB space. With this you tell the GPU that the texture is in sRGB space, and when fetching a color from it, the GPU will linearize the value for you (for free, apparently). (There are sRGB equivalents for all the RGB-ish formats.)
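For reference, the conversion the GPU performs when sampling an sRGB texture is the piecewise sRGB transfer function, which the plain pow(color, 2.2) used earlier only approximates. A Java sketch of both, for comparison:

```java
public class SrgbTransfer {
    // Exact sRGB-to-linear transfer function (piecewise, per the spec).
    public static double srgbToLinear(double c) {
        return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
    }

    // The cheap approximation usually used in shaders.
    public static double approxToLinear(double c) {
        return Math.pow(c, 2.2);
    }

    public static void main(String[] args) {
        // The two curves stay close across the whole range.
        System.out.println(SrgbTransfer.srgbToLinear(0.5));   // ~0.214
        System.out.println(SrgbTransfer.approxToLinear(0.5)); // ~0.218
    }
}
```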
But not all textures need this, so we need a way to specify, when loading a texture, whether we want it in sRGB.
I think the best way is to add a flag on TextureKey and a way to specify it in a j3m or j3md (like we have for flip, repeat…).
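To illustrate, here is a rough sketch of what such a flag could look like (the class and the setSrgb/isSrgb names are invented for illustration, not existing jME API):

```java
// Hypothetical sketch of the proposed flag; not actual jME code.
public class TextureKeySketch {
    private boolean srgb; // the proposed new flag

    public void setSrgb(boolean srgb) { this.srgb = srgb; }
    public boolean isSrgb() { return srgb; }

    // Conceptually, the loader would then pick the sRGB variant
    // of the image format when the flag is set.
    public String chooseFormat() {
        return srgb ? "SRGB8_ALPHA8" : "RGBA8";
    }
}
```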
Also, I'd add a flag in the AppSettings that toggles gamma correction on or off, because since we don't support it for now, this will change the visual aspect of many games. (We tend to compensate for the darkness of the scenes by boosting light intensities, mostly ambient light… that doesn't make them right, though.)
Second implementation:
I don't know if this would be a fallback for cards that aren't OpenGL 3.0 compatible, but it would be more painful.
The idea would be a semi-hardware gamma correction, doing the correction by hand in the shader(s): correcting when reading textures and correcting when outputting colors.
This is obviously tedious and more expensive, and it presents some issues with post filters, because only the last output must be corrected, or you have to apply the reverse correction in each filter.
That means the lighting shader needs a define to toggle correction of the output (depending on whether there are filters or not), and we need a filter at the end of the stack that does the gamma correction (which we already have, but it's not working properly).
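The constraint that the output must be encoded exactly once can be sketched in Java (the shader logic simulated on one channel; names are illustrative):

```java
public class ManualGamma {
    public static final double GAMMA = 2.2;

    public static double toLinear(double c) { return Math.pow(c, GAMMA); }
    public static double toGamma(double c)  { return Math.pow(c, 1.0 / GAMMA); }

    // One "lighting pass": linearize the texel, apply the light,
    // and encode only if this is the last pass before the screen.
    public static double shade(double texel, double light, boolean lastPass) {
        double lit = toLinear(texel) * light;
        return lastPass ? toGamma(lit) : lit;
    }

    public static void main(String[] args) {
        // No filters: the lighting shader encodes directly.
        double direct = shade(0.5, 0.8, true);
        // With filters: stay linear and let the final filter encode.
        double filtered = toGamma(shade(0.5, 0.8, false));
        // The two paths agree only because encoding happens exactly once.
        System.out.println(direct + " " + filtered);
    }
}
```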
Also… even Unshaded should have this, at least the output part, as even solid colors are wrong…