Gamma correction woes…

Also, IF there is a proper way to deactivate it… users may choose to set gamma correction to off if it breaks their visuals.
We agreed that 3.1 may break things…this is one thing :stuck_out_tongue:

Is this discussion somewhat redundant? A common approach is to use a 3D texture as a color lookup on the vertex stage.
With shader nodes, the user can add correction if they want it.

In my opinion sRGB isn’t a big deal, but if sRGB goes into core as a quick hack (EXT_SRGB is like a quick hack), the engine itself will gain +1 confusing point because it will have one more “special case”.

Things that I really miss are GL_TEXTURE_BUFFER (it can be used for contraptions like a 3D texture array), GL_RGBA8UI (it can deliver ints or custom vertex attributes without rounding errors), and GL_R32UI (delivery of raw data into a shader; I really miss it in my sandbox game).

@SQLek said: Is this discussion somewhat redundant? A common approach is to use a 3D texture as a color lookup on the vertex stage. With shader nodes, the user can add correction if they want it.
3D textures are most probably not gamma corrected, as they often don’t represent color at all. If the software you use gamma corrects them, change it… The point is not to have ALL textures gamma corrected, only those that represent colors.
@SQLek said: In my opinion sRGB isn’t a big deal, but if sRGB goes into core as a quick hack (EXT_SRGB is like a quick hack), the engine itself will gain +1 confusing point because it will have one more “special case”.
Explain. How can something that is an official ARB extension be a quick hack? https://www.opengl.org/registry/specs/ARB/framebuffer_sRGB.txt

My point is to make it as plain as possible for users, hence the discussion.

@SQLek said: Things that I really miss are GL_TEXTURE_BUFFER (it can be used for contraptions like a 3D texture array), GL_RGBA8UI (it can deliver ints or custom vertex attributes without rounding errors), and GL_R32UI (delivery of raw data into a shader; I really miss it in my sandbox game).
Can't see how this is related to the discussion, if you want to talk about it, please create another thread.

I didn’t supply a whitepaper. My bad.
3D LUTs in GPU Gems

This technique will run on pure GL 2.0, or GLES 2.0 with the 3D texture extension, and will utilize conditions in shader nodes. Not only will RGB -> sRGB conversion be possible, but also conversions like CMY -> sRGB or YUV -> sRGB.
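To make the LUT idea concrete, here is a CPU-side sketch in pure Python (not jME code; the LUT size and function names are my own). It bakes linear→sRGB into a small 3D table and samples it with trilinear filtering, which is what a single `texture3D()` fetch with LINEAR filtering would do in the shader:

```python
import math

def linear_to_srgb(c):
    # IEC 61966-2-1 encode, i.e. the "3 pow calls" path the LUT replaces
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1.0 / 2.4) - 0.055

N = 17  # LUT resolution per axis; real LUT textures are often 16-64

# Bake the LUT: lut[r][g][b] is the sRGB triple for linear input (r, g, b)/(N-1)
lut = [[[tuple(linear_to_srgb(i / (N - 1)) for i in (r, g, b))
         for b in range(N)] for g in range(N)] for r in range(N)]

def sample_lut(lut, r, g, b):
    """Trilinear interpolation, matching texture3D() with LINEAR filtering."""
    coords = [c * (N - 1) for c in (r, g, b)]
    i = [min(int(c), N - 2) for c in coords]          # lower corner of the cell
    f = [c - ii for c, ii in zip(coords, i)]          # fractional position in it
    out = []
    for ch in range(3):
        acc = 0.0
        for dr in (0, 1):
            for dg in (0, 1):
                for db in (0, 1):
                    w = ((f[0] if dr else 1 - f[0]) *
                         (f[1] if dg else 1 - f[1]) *
                         (f[2] if db else 1 - f[2]))
                    acc += w * lut[i[0] + dr][i[1] + dg][i[2] + db][ch]
        out.append(acc)
    return tuple(out)
```

Since any channel-wise or cross-channel transform can be baked into the same table, swapping linear→sRGB for a CMY or YUV conversion only changes the bake step, not the lookup.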

Ok… yet another way to do it… sigh.
I wonder how it performs compared to hardware gamma correction. The article compares it to brute-force post-processing gamma correction (like our GammaCorrectionFilter).
Idk how hardware gamma correction is implemented; I guess it can depend on the vendor’s drivers.

Anyway, I’ll try.


That 3D LUT thing seems quite nice… It can be utilized in the final step when we transform our linear colors back to sRGB space, instead of using 3 calls to pow. It will also enable some other nice color grading options which we will get “for free”.

It won’t change the need for using sRGB textures for inputs, though. I read somewhere that gamma correction for the common RGBA8 format is implemented as a lookup table in hardware, but I do not know how true this is for modern GPUs.
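The lookup-table claim is at least plausible for 8-bit formats: sRGB decode of an RGBA8 channel has only 256 possible inputs, so the whole transfer function fits in one tiny table baked once. A sketch of the idea in Python (the curve is the standard IEC 61966-2-1 decode; whether real GPUs actually do it this way is vendor-specific):

```python
def srgb_to_linear(c):
    # IEC 61966-2-1 decode (sRGB -> linear), the inverse of the pow-based encode
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

# One 256-entry table covers every possible 8-bit channel value, so a
# GL_SRGB8_ALPHA8-style fetch never needs a pow() at sample time.
decode_table = [srgb_to_linear(v / 255.0) for v in range(256)]

def fetch_srgb8(texel_byte):
    """What sampling an sRGB texture effectively does, per channel."""
    return decode_table[texel_byte]
```

Note the table only helps on the *read* side; floating-point render targets (or the final encode back to the display) still need the full curve, which is where the 3D LUT above comes in.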