Gamma correction woes…

This is intended to start a dev discussion about which approach we should choose.
So summoning the brains @pspeed, @normen, @momoko_fan, @Sploreg, @kwando
Of course anyone is invited to the discussion.

So I looked into Gamma correction recently…I always thought it was just a way to tweak the rendered picture and that it was a matter of taste…
It’s not.

Our lighting material really suffers from not having it, because the lighting calculation is wrong. I think implementing it in the engine will enhance the visuals of the shader nicely.

So for those who don’t have a clue what’s behind the term, here are some explanations (the others can jump to Implementation):
We consider color values to be linear when computing lighting. What does that mean? It means that we consider the color 0.5,0.5,0.5 to be halfway between black and white.
The problem is that it’s not the case, or at least not when you look at the color through a monitor.
CRT monitors had physical limitations that prevented them from representing colors linearly. That means that 0.5,0.5,0.5 seen through a monitor is not halfway between black and white (it’s darker). Note that black and white remain the same though.

If we do not take that into account, the rendered images are overly darkened and feel dull.

LCD monitors still mimic this physical limitation (I guess for backward compatibility).

Output correct colors:
Gamma correction is the technique that corrects the issue. Gamma is a power factor applied to the color by the monitor when lighting a pixel on screen (or at least that’s a simplification of the function applied). So when we output a color, we have to apply the inverse power of this factor to nullify the effect: finalColor = pow(computedColor, 1/gamma);

Knowing what colors we have as input
The other aspect of gamma correction concerns the colors we get as input to the rendering process, stored in pictures. Almost all image editors store color data in images already gamma corrected (so that colors look correct when you display the picture in a viewer or in a browser).
Such images are said to be in sRGB space meaning “standard RGB” (which is not the standard one would guess).
That means that textures that we use as input in our shaders are not in a linear space. The issue is that we need them in linear space when we compute the lighting…so the lighting is wrong.
To avoid this we need to apply some gamma correction to the color read from the image: pow(color, gamma).
This only applies to textures that render colors on screen (basically diffuse maps, specular maps, light maps). Normal maps and height maps don’t need the correction.
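In Java terms the two corrections look something like this (a minimal sketch, with gamma hardcoded to the usual 2.2 and helper names of my own):

    public class GammaUtil {

        private static final float GAMMA = 2.2f; // common approximation of monitor gamma

        // Linearize a color component read from a gamma-corrected image.
        public static float toLinear(float stored) {
            return (float) Math.pow(stored, GAMMA);
        }

        // Encode a computed linear color component for display on the monitor.
        public static float toDisplay(float linear) {
            return (float) Math.pow(linear, 1.0 / GAMMA);
        }

        public static void main(String[] args) {
            // 0.5 as stored in an image is only ~0.22 in linear light:
            System.out.println(toLinear(0.5f));             // ~0.217
            // and the round trip restores the stored value:
            System.out.println(toDisplay(toLinear(0.5f)));  // ~0.5
        }
    }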

This is the kind of difference you can get:
left is non corrected output, right is gamma corrected output.

Implementation
So…now we have several choices.
1st implementation:
Since OpenGL 3.0 we have an ARB extension that allows you to output colors in linear space and have the GPU automatically correct them.
This ARB extension is this one https://www.opengl.org/registry/specs/ARB/framebuffer_sRGB.txt
So to make it work in JME (with LWJGL) we’d have to create a Display with a PixelFormat that supports sRGB (there is a convenient withSRGB method on PixelFormat).
Then we have to call glEnable(GL30.GL_FRAMEBUFFER_SRGB) in the renderer. This would handle the corrected output.
Also we’d have to figure out how it can be done with JOGL, the Android renderer and the iosRenderer (I’m not sure it’s supported on OpenGL ES 2.0).
But I guess/hope there are equivalents.
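Roughly, the LWJGL side could look like this (an untested sketch; window size and error handling are my own simplifications):

    import org.lwjgl.opengl.Display;
    import org.lwjgl.opengl.DisplayMode;
    import org.lwjgl.opengl.GL11;
    import org.lwjgl.opengl.GL30;
    import org.lwjgl.opengl.PixelFormat;

    public class SrgbDisplaySketch {
        public static void main(String[] args) throws Exception {
            Display.setDisplayMode(new DisplayMode(800, 600));
            // Request an sRGB-capable default framebuffer.
            Display.create(new PixelFormat().withSRGB(true));

            // Ask the GPU to encode linear fragment colors to sRGB on write.
            GL11.glEnable(GL30.GL_FRAMEBUFFER_SRGB);

            // ...render loop would go here...
            Display.destroy();
        }
    }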

Now for the input: instead of the classic RGBA8 image format, you can use SRGB8_ALPHA8_EXT, which is basically RGBA in sRGB. Using this you tell the GPU that the texture is in sRGB space, and when fetching a color from it the GPU will linearize the color value for you (for free, apparently…). (There are sRGB equivalents of all the RGB-ish formats.)
But not all textures need this, so we would need to be able to specify, when loading a texture, whether we want it in sRGB.
I think the best way is to add a flag in the TextureKey and a way to specify this in a j3m or j3md (like we have flip, repeat…); a sketch of what that could look like follows.
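Something like this (the setSRGB flag and the j3m syntax are hypothetical, they don’t exist yet):

    import com.jme3.asset.AssetManager;
    import com.jme3.asset.TextureKey;

    public class SrgbTextureKeySketch {
        void loadTextures(AssetManager assetManager) {
            // Hypothetical flag, mirroring the existing flip/repeat options:
            TextureKey diffuseKey = new TextureKey("Textures/wall_diffuse.png");
            diffuseKey.setSRGB(true);   // color data: upload as SRGB8_ALPHA8, decoded on fetch

            TextureKey normalKey = new TextureKey("Textures/wall_normal.png");
            normalKey.setSRGB(false);   // vector data: plain RGBA8, no decoding

            assetManager.loadTexture(diffuseKey);
            assetManager.loadTexture(normalKey);

            // In a j3m, the flag could read like the existing options (hypothetical syntax):
            //   DiffuseMap : sRGB Flip "Textures/wall_diffuse.png"
        }
    }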

Also, I’d have a flag in the AppSettings that toggles gamma correction on or off, because since we don’t support it for now, this will change the visual aspect of many games. (We tend to compensate for the darkness of the scenes by boosting light intensities, mostly ambient light…that doesn’t make them right though.)

2nd implementation:
Idk if this would be a fallback for OpenGL 3.0-incompatible cards, but it would be more painful.
The idea would be a semi-hardware gamma correction, doing the correction by hand in the shader(s): correcting when reading textures and when outputting colors.
This is obviously tedious, more expensive, and presents some issues with post filters, because only the last output must be corrected, or you have to apply the reverse correction in each filter.
Meaning that the lighting shader must have a define to toggle correction of the output (depending on whether there are filters or not) and that we need a filter at the end of the stack that does the gamma correction (which we already have, but it is not working properly).
Also…even Unshaded should have this, at least the output part, as even solid colors are wrong…
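For the output part, that last filter would be wired roughly like this (a sketch, assuming we fix the existing GammaCorrectionFilter):

    import com.jme3.app.SimpleApplication;
    import com.jme3.post.FilterPostProcessor;
    import com.jme3.post.filters.GammaCorrectionFilter;

    public class GammaFallbackApp extends SimpleApplication {
        @Override
        public void simpleInitApp() {
            FilterPostProcessor fpp = new FilterPostProcessor(assetManager);
            // ...add all other filters first...
            // Gamma correction must be the LAST filter, so only the final output is encoded.
            fpp.addFilter(new GammaCorrectionFilter());
            viewPort.addProcessor(fpp);
        }
    }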


For an in-depth read on gamma correction (it’s actually not a hard read) look here.
I think we need gamma correction, and a dev needs to be able to turn it on and off since they are in control of their image sources. The off toggle will help with backwards compatibility, as you mentioned @nehon.

With route #2, unshaded material will definitely need the correction as well.

Pre-correcting the images before image processing is important; the tutorial link I posted talks about this. Processes such as mip-map generation can exacerbate issues for non-linear images.

Generally when I apply textures to in-game objects, the diffuse texture is made in Photoshop/GIMP and then saved out in a gamma-corrected format. I think a lot of people follow this method in games. These gamma-corrected (non-linear) images then get played with by the lighting shader, compounding the effects on a source image that should be linear before the calculations take place. Which textures need correction and which don’t could be determined by the color space info of the image itself, or it could just be part of the asset development process, where the artists need to choose one format and stick with it. Not unreasonable imo.

As to what option to choose, I think whatever can be offloaded onto the video card for “free” is the way to go. But if that means OpenGL 3.0, then we still have to manually change the shaders, no?

I’m actually all for making jme 3.1 require OpenGL 3.0 for reasons like this and other features like texture arrays.

Well I’m for the first solution,
because adjusting every shader ever written seems a bit overkill.

Having a shader node that does gamma correction for fallback users sounds nice though.

@Sploreg said: For an in-depth read on gamma correction (it's actually not a hard read) look here.
Yup, already read it...I think I've read the whole internet about gamma correction... :p
@Sploreg said: Generally when I apply textures to in-game objects, the diffuse texture is made in Photoshop/GIMP and then saved out in a gamma-corrected format. I think a lot of people follow this method in games. These gamma-corrected (non-linear) images then get played with by the lighting shader, compounding the effects on a source image that should be linear before the calculations take place. Which textures need correction and which don't could be determined by the color space info of the image itself, or it could just be part of the asset development process, where the artists need to choose one format and stick with it. Not unreasonable imo.
From what I gathered, JPG, GIF and PNG are already corrected, so artists wouldn't have to change anything (if we use the sRGB formats for reading). Are you doing something special to the image to have it gamma corrected? I'm not sure I get what you mean.
@Sploreg said: As to what option to choose, I think whatever can be offloaded onto the video card for "free" is the way to go. But if that means OpenGL 3.0, then we still have to manually change the shaders, no?

I’m actually all for making jme 3.1 require OpenGL 3.0 for reasons like this and other features like texture arrays.


Well it’s tempting, but IMO that has a lot more implications… Idk if there are still a lot of devices running on OpenGL 2 only…but I guess there are.

I’m not doing anything to the gamma correction of the output images, just stating that the artists probably won’t have to do much. In some cases they might obtain images that are not corrected, and in that case the images will appear incorrect. For example if they save an image from the web it may not be corrected, but if they open it in Photoshop and then save it, it will be gamma corrected.

@Sploreg said: For example if they save an image from the web it may not be corrected, but if they open it in Photoshop and then save it, it will be gamma corrected.
I was under the impression that, on the contrary, web images were gamma corrected: JPG, GIF and PNG. PNG is quite special in this matter because it can hold the gamma information....but I'm not sure about this.

I don’t mean all images, just that they could get one that isn’t corrected. Probably not something that needs to be worried about. The artists just have to be somewhat conscious of the source of the image if they are getting them from who-knows-where.

I see two challenges with sRGB support:

  1. How do users specify which texture is sRGB and which is not? For example, normal maps must not be sRGB decoded, and therefore should not be loaded as sRGB textures. Inside your J3MD file, you must be able to mark the NormalMap texture parameter as “Non-sRGB” or “Linear” so that it is not going through sRGB processing.

  2. How to deal with devices with no sRGB support? In that case, lighting will not appear correct, but this is how things are now, so maybe it’s not a big issue. OpenGL 3.0 is NOT required for sRGB: the feature has been available since 2007 and has been supported since the GeForce 8000 series. The only GPUs that potentially don’t support it are Intel ones, but they have lots of other issues as well.

Once you have this then it should be as simple as AppSettings.setSRGB(true) and then the sRGB based pipeline is activated. No need for shader nodes or to modify any filters.

Mipmapping should not be an issue because they are generated by the GPU, which is sRGB aware once the pipeline is turned on. If you’re using DDS format with offline generated mipmaps, make sure your tool takes sRGB into account - most probably do at this point.

@Momoko_Fan said: 1) How do users specify which texture is sRGB and which is not? For example, normal maps must not be sRGB decoded, and therefore should not be loaded as sRGB textures. Inside your J3MD file, you must be able to mark the NormalMap texture parameter as "Non-sRGB" or "Linear" so that it is not going through sRGB processing.
@nehon said: But not all textures need this, so we would need to be able to specify, when loading a texture, whether we want it in sRGB. I think the best way is to add a flag in the TextureKey and a way to specify this in a j3m or j3md (like we have flip, repeat....).
Also, as a side note, we’d benefit from this kind of system in general; it’s still impossible right now, for example, to specify that a texture is a cube map directly in the j3m.
@Momoko_Fan said: 2) How to deal with devices with no sRGB support? In that case, lighting will not appear correct, but this is how things are now, so maybe it's not a big issue. OpenGL 3.0 is NOT required for sRGB: the feature has been available since 2007 and has been supported since the GeForce 8000 series. The only GPUs that potentially don't support it are Intel ones, but they have lots of other issues as well.
Oh OK, I guess we have to check for the extension directly then; I assumed this because it was in the GL30 class of LWJGL.
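Something like this after context creation (a sketch with LWJGL, assuming the usual capability fields):

    import org.lwjgl.opengl.ContextCapabilities;
    import org.lwjgl.opengl.GLContext;

    public class SrgbCapsCheck {
        // Must be called on the render thread, after the GL context exists.
        static boolean isSrgbFramebufferSupported() {
            ContextCapabilities caps = GLContext.getCapabilities();
            return caps.GL_ARB_framebuffer_sRGB || caps.GL_EXT_framebuffer_sRGB;
        }
    }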
@Momoko_Fan said: Once you have this then it should be as simple as AppSettings.setSRGB(true) and then the sRGB based pipeline is activated. No need for shader nodes or to modify any filters.

Mipmapping should not be an issue because they are generated by the GPU, which is sRGB aware once the pipeline is turned on. If you’re using DDS format with offline generated mipmaps, make sure your tool takes sRGB into account - most probably do at this point.


Exactly, IMO that’s the best approach.

As for a fallback we can keep the GammaCorrectionFilter we have, which would work for the output…and for the input we’d need a define in the shader that does the conversion…hence the shader node idea
or users would need to have linear textures…

But as you said…fallback would be just as it is now…wrong lighting.

I think we should use the hardware extensions to do the work for us.

Doing it in shaders means we have to correct all nonlinear inputs that represent color, but we also have to keep all intermediate buffers in linear space so that blending works correctly; this means we might need to render into a deeper pixel format in order not to lose precision in the darker areas.
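For instance, something like this for an intermediate render target (a rough sketch; the helper is mine):

    import com.jme3.texture.FrameBuffer;
    import com.jme3.texture.Image.Format;
    import com.jme3.texture.Texture2D;

    public class LinearTargetSketch {
        static FrameBuffer createLinearTarget(int width, int height) {
            // A 16-bit float color target keeps precision in the darks
            // when intermediate results are stored in linear space.
            Texture2D sceneTex = new Texture2D(width, height, Format.RGBA16F);
            FrameBuffer fb = new FrameBuffer(width, height, 1);
            fb.setColorTexture(sceneTex);
            fb.setDepthBuffer(Format.Depth);
            return fb;
        }
    }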

Even with hardware support I think we need to correct our shader input colors (ColorRGBA), but this correction can be done on the CPU; this is for diffuse, specular and light colors, vertex colors and…

I think we should add an intensity value to our lights, since user-specified colors are most likely given in screen gamma space, which means it is not as straightforward as just multiplying by a factor to change the intensity without shifting the hue. This can also be correctly premultiplied into the color on the CPU side before rendering (rough sketch below)… we would probably need a “global” screen gamma value we can use in our conversions.
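What I mean, roughly (the intensity parameter and the helper are hypothetical):

    import com.jme3.math.ColorRGBA;

    public class LightIntensitySketch {
        // Convert the artist-picked, gamma-space color to linear first and
        // then scale it, so changing intensity does not shift the hue.
        static ColorRGBA premultiply(ColorRGBA picked, float intensity, float screenGamma) {
            float r = (float) Math.pow(picked.r, screenGamma) * intensity;
            float g = (float) Math.pow(picked.g, screenGamma) * intensity;
            float b = (float) Math.pow(picked.b, screenGamma) * intensity;
            return new ColorRGBA(r, g, b, picked.a);
        }
    }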

I think @Momoko_Fan is right that we need to specify which inputs need linear values in the MatDef file.

All in all, I think there are many ways we can screw up the linearity if this is not done carefully, but I think it would be a worthwhile effort in the end.

@kwando said: Even with hardware support I think we need to correct our shader input colors (ColorRGBA) but this correction can be done on the CPU, this is for diffuse, specular and light colors, vertex colors and..
Nope, if you specify that the texture uses the sRGB format (SRGB8_ALPHA8_EXT instead of RGBA8) the GPU will linearize the result on each texture fetch. Vertex colors and light colors are linear; they don't need to be corrected.

Also, the drawback of this approach is that the gamma correction is approximated (to roughly gamma = 2.2, per the specs of the extension). I read comments here and there that the best way to have a correct gamma value is to apply gamma yourself (that is, in the shader) and offer a calibration UI to the user so he can choose a proper gamma value for his screen (I guess you’ve already seen this in games).
With hardware Gamma correction you can’t do this.
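For comparison, here is the exact piecewise sRGB encoding from the spec next to the plain power law a calibration screen would tune instead (quick sketch; the two curves are close but not identical):

    public class SrgbCurve {
        // Exact sRGB encoding (what the extension's spec defines), linear -> stored.
        static float linearToSrgb(float c) {
            return c <= 0.0031308f
                    ? 12.92f * c
                    : 1.055f * (float) Math.pow(c, 1.0 / 2.4) - 0.055f;
        }

        // The pure power-law approximation, with a user-adjustable gamma.
        static float linearToGamma(float c, float gamma) {
            return (float) Math.pow(c, 1.0 / gamma);
        }

        public static void main(String[] args) {
            System.out.println(linearToSrgb(0.5f));        // ~0.735
            System.out.println(linearToGamma(0.5f, 2.2f)); // ~0.730
        }
    }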

IMO that’d be a lot of hassle for not much more…If one wants to have precise gamma correction he can always disable hardware Gamma correction and implement a calibration screen and have all his shaders use a proper gamma value.

@nehon said: Nope, if you specify that the texture uses the sRGB format (SRGB8_ALPHA8_EXT instead of RGBA8) the GPU will linearize the result on each texture fetch. Vertex colors and light colors are linear; they don't need to be corrected.

This is true for nonlinear textures loaded into sRGB buffers. Material colors, vertex colors and light colors still need to be corrected I think, since they have most probably been chosen visually on a nonlinear display => they are in nonlinear space.

EDIT: Added link.

@kwando said: This is true for nonlinear textures loaded into sRGB buffers. Material colors, vertex colors and light colors still need to be corrected I think, since they have most probably been chosen visually on a nonlinear display => they are in nonlinear space.
Sigh....that's getting really complicated....but you're right. So....we allow marking colors as sRGB in the mat def (like: Color MyColor sRGB) and we convert them when they are passed to the shader?

Yeah… this turns out to be a little more complicated than it first appeared… I’ve found a few flaws in my own gamma-corrected pipeline in this process =P

@nehon said: So....we allow marking colors as sRGB in the mat def (like: Color MyColor sRGB) and we convert them when they are passed to the shader?

This will work if we assume that all user-specified colors are in sRGB space, which might not always be the case… the way ColorRGBA is written kind of assumes its values are in linear space even though they most certainly are not. ColorRGBA#add and the like are essentially “wrong” if the values are in sRGB space, since gamma(A) + gamma(B) != gamma(A + B).

One way to address this could be to add ColorSpace information to ColorRGBA so we know in which space it is defined; then we can (together with the information in the j3md file) use that information to “do the right thing”.
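A rough sketch of the idea (the ColorSpace enum and the tagged class are hypothetical, with screen gamma hardcoded to 2.2):

    public class TaggedColor {
        // Hypothetical color-space tag.
        public enum ColorSpace { LINEAR, SRGB }

        public float r, g, b, a;
        public ColorSpace space = ColorSpace.SRGB; // user-picked colors are usually sRGB

        // Return a linear-space copy so the engine can "do the right thing"
        // before uploading the uniform.
        public TaggedColor toLinear() {
            if (space == ColorSpace.LINEAR) {
                return this;
            }
            TaggedColor out = new TaggedColor();
            out.r = (float) Math.pow(r, 2.2);
            out.g = (float) Math.pow(g, 2.2);
            out.b = (float) Math.pow(b, 2.2);
            out.a = a; // alpha is not gamma encoded
            out.space = ColorSpace.LINEAR;
            return out;
        }
    }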

Colors on materials specifically would be in sRGB because that’s the color space in which the user specified them. In that case, when the sRGB pipeline is turned on, they should be converted sRGB → linear before being passed to the shader. Light colors don’t matter in this case because they should be linear anyway, they are more of a light intensity for each frequency (red / green / blue) than an absorption factor like it is on materials. Also sRGB * sRGB = sRGB^2 (incorrect), however sRGB * linear = sRGB (correct) so that somehow works out (?).

The ColorRGBA functions are wrong indeed, but they are not used for anything by the engine itself; maybe they should be marked as deprecated?

@Momoko_Fan said: Colors on materials specifically would be in sRGB because that's the color space in which the user specified them. In that case, when the sRGB pipeline is turned on, they should be converted sRGB -> linear before being passed to the shader. Light colors don't matter in this case because they should be linear anyway, they are more of a light intensity for each frequency (red / green / blue) than an absorption factor like it is on materials. Also sRGB * sRGB = sRGB^2 (incorrect), however sRGB * linear = sRGB (correct) so that somehow works out (?).

The ColorRGBA functions are wrong indeed, but they are not used for anything by the engine itself; maybe they should be marked as deprecated?

So what about inColor buffers? Since we don’t allow custom vertex attributes a lot of us are forced to co-opt the existing ones for other things. I would think conversion would need to be done in the shader as needed then.

Well, unless a color buffer is baked, I can’t see why it wouldn’t be linear. I guess it’s the result of a calculation, or something users adjusted through JME’s output, which will now be gamma corrected.

@nehon said: Well, unless a color buffer is baked, I can't see why it wouldn't be linear. I guess it's the result of a calculation, or something users adjusted through JME's output, which will now be gamma corrected.

My point is that because there is no way to have custom vertex attributes, I use things like inColor for “some random 4 floats that I need”, and if something on the JME side was auto-converting them as colors then it would break my code. It also kind of seems like a hack.

In general, it feels a little weird to have JME perform a conversion all the time on something that could have been precalculated. For uniforms, does this mean we convert the colors for every color uniform once per Material, once per frame? (Remember the uniforms are resent to the GPU for every object, every frame.)

@pspeed said: My point is that because there is no way to have custom vertex attributes, I use things like inColor for "some random 4 floats that I need", and if something on the JME side was auto-converting them as colors then it would break my code. It also kind of seems like a hack.
Yep, my point was that there is no need for conversion anyway (even if it's used for standard colors). The fact that you can't use custom buffers is another matter, but I agree we should add a way to do it.
@pspeed said: In general, it feels a little weird to have JME perform a conversion all the time on something that could have been precalculated. For uniforms, does this mean we convert the colors for every color uniform once per Material, once per frame? (Remember the uniforms are resent to the GPU for every object, every frame.)
I agree on that too...IMO if we go this way I'd rather store the linear values in additional class attributes on ColorRGBA, update them when the color is set in sRGB space, and pass those to the shader. I know it's additional memory, but computing it every time you pass the color to the shader seems like a waste.

EDIT: Actually the idea we had this morning, to consider all ColorRGBA values as linear and add getAsSRGB and setAsSRGB methods, was better IMO…why was that ruled out already?
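A sketch of what those could look like (hypothetical methods, assuming a fixed gamma of 2.2):

    import com.jme3.math.ColorRGBA;

    public class SrgbColorSketch {
        // setAsSRGB: interpret user input as sRGB, store linear values.
        static void setAsSRGB(ColorRGBA c, float r, float g, float b, float a) {
            c.set((float) Math.pow(r, 2.2), (float) Math.pow(g, 2.2),
                  (float) Math.pow(b, 2.2), a);
        }

        // getAsSRGB: encode the stored linear values back for display/editing.
        static ColorRGBA getAsSRGB(ColorRGBA c) {
            return new ColorRGBA((float) Math.pow(c.r, 1 / 2.2),
                                 (float) Math.pow(c.g, 1 / 2.2),
                                 (float) Math.pow(c.b, 1 / 2.2), c.a);
        }
    }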

@nehon said: Yep, my point was that there is no need for conversion anyway (even if it's used for standard colors). The fact that you can't use custom buffers is another matter, but I agree we should add a way to do it.

I agree on that too…IMO if we go this way I'd rather store the linear values in additional class attributes on ColorRGBA, update them when the color is set in sRGB space, and pass those to the shader.
I know it's additional memory, but computing it every time you pass the color to the shader seems like a waste.

EDIT: Actually the idea we had this morning, to consider all ColorRGBA values as linear and add getAsSRGB and setAsSRGB methods, was better IMO…why was that ruled out already?

I thought momoko ruled it out since everyone is already doing sRGB without knowing it… but I agree it was clean at least.