Hello!
When using background textures in Lemur, there's a slight color difference between the original images and what shows up in game. The difference is subtle, but it's annoying because it would force me to make adjustments every time a new texture is used.
The setup for the background is the following (this is the blue background, but it's the same configuration for the orange buttons as well):
Hi and thanks! Great idea, but no: gamma correction was turned off by default, and even when explicitly set to false it produced the same result. (Edit: it was not turned off by default, but it only affected 3D objects, presumably because they use a different shader that is actually affected by gamma correction.)
What format is the mockup image?
Bits per channel? If that is different from the framebuffer precision, then that might cause issues. In your image it also looks like only the blue color is affected while the orange stays the same.
Hey zzuegg, thank you for taking the time.
The background image (/Interface/style/border-bg.png) is a PNG. Here are the export options from Krita, in case they help.
Gamma correction affects how textures are loaded and how the ColorRGBA values are interpreted. (srgb versus linear)
This is a common thing to ask about because at some point JME defaulted to having gamma correction on and it messed with folks’ UI colors… especially when they setColor() on materials but didn’t convert their values to srgb first.
So for example, your "force convert to srgb" is probably going to slightly mess up any image created in Photoshop or other graphics tools that are already working in srgb.
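To see why force-converting an already-sRGB image hurts, here is a minimal sketch (plain Java, using the standard sRGB encoding function; the class and method names are just for illustration). Applying the sRGB encoding to data that is already sRGB pushes every mid-tone brighter, which is exactly the "washed out" look:

```java
// Sketch: what happens when the sRGB encoding is applied a second time
// to image data that is *already* sRGB. Mid-tones drift brighter.
public class DoubleConvertDemo {

    // Standard linear -> sRGB transfer function (the "gamma correction" step).
    public static double linearToSrgb(double c) {
        return c <= 0.0031308 ? c * 12.92 : 1.055 * Math.pow(c, 1.0 / 2.4) - 0.055;
    }

    public static void main(String[] args) {
        double authored = 0.5;                   // mid-grey as painted in an sRGB image editor
        double doubled = linearToSrgb(authored); // ~0.735 -- noticeably brighter than intended
        System.out.printf("authored %.3f -> double-encoded %.3f%n", authored, doubled);
    }
}
```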
Everything you describe as your issue sounds like an srgb versus linear problem.
It’s definitely confusing. I feel like I have to relearn it every time it comes up.
Generally, I think if you are editing an image on a computer screen then it is in srgb. (Computer screens are srgb.)
Old 3D software (JME included) used to just load textures up and treat the color space as if it were linear… meaning if you multiply some color by 0.5 then you'd get "half that color". Since lighting used a lot of these sorts of fractional colors and interpolated colors, lighting in all 3D software was essentially "wrong". Spheres would get their highlights in the wrong places, shading would look unrealistic, etc… My understanding of the history is that some developer discovered this issue and then spent a couple of years pushing back on GPU vendors before things started to improve.
In modern times, for a shader to do color interpolation correctly, the colors need to be in "linear space". This means that when loading textures they need to be converted from srgb to linear for the shader. Then it can do all of the math it wants to do in linear space, and that gets converted back to srgb at display time… which is unhelpfully called "gamma correction". (Unhelpful because for a whole generation of gamers, "gamma correction" was how you made dark rooms brighter in Quake…)
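The round trip described above can be sketched in plain Java using the standard sRGB transfer functions (IEC 61966-2-1); this is roughly what an engine does when it decodes an sRGB texture on load and re-encodes at display time. The class and method names here are illustrative, not engine API:

```java
// Sketch of the standard sRGB <-> linear transfer functions.
// sRGB -> linear happens at texture load; linear -> sRGB at display
// time (the step commonly called "gamma correction").
public class ColorSpaceDemo {

    // Convert one sRGB channel value (0..1) to linear light.
    public static double srgbToLinear(double c) {
        return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
    }

    // Convert one linear channel value (0..1) back to sRGB.
    public static double linearToSrgb(double c) {
        return c <= 0.0031308 ? c * 12.92 : 1.055 * Math.pow(c, 1.0 / 2.4) - 0.055;
    }

    public static void main(String[] args) {
        double srgb = 0.5;                  // mid-grey as authored on an sRGB screen
        double linear = srgbToLinear(srgb); // ~0.214 -- what the shader should see
        System.out.printf("srgb 0.5 -> linear %.3f%n", linear);
        System.out.printf("round trip -> %.3f%n", linearToSrgb(linear)); // back to 0.500
    }
}
```

Note that "half that color" in linear space (0.214 here) is a much smaller number than the 0.5 an artist painted, which is why skipping either conversion visibly shifts colors.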
Bottom line: it’s very important that your textures agree with how the materials and frame buffer are setup. And any new ColorRGBA(r, g, b) style colors you send to your shaders need to also agree.
And the classic symptom is: "My colors are all bright and washed out." That's pretty classic when gamma correction is turned on but all of the colors were authored assuming the old linear treatment.
I encountered this issue in reverse: my texture assets were more saturated than they appeared in game (it didn't happen for the imported models). It turned out the texture had a different color space (linear, so it appeared desaturated). As others mentioned before, I think you can solve it like this:
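A minimal sketch of that kind of fix, assuming jME's Texture/Image API: explicitly tag the loaded image as sRGB so the engine decodes it to linear on load instead of treating the data as already linear. (The asset path is just the one from earlier in this thread; treat this as a configuration fragment, not a verified drop-in.)

```java
// Assumed jME API: mark the texture's image data as sRGB so it is
// decoded to linear when sampled, matching a gamma-corrected pipeline.
Texture tex = assetManager.loadTexture("Interface/style/border-bg.png");
tex.getImage().setColorSpace(ColorSpace.sRGB);
```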