It doesn’t, really.
A lumen is a unit of the absolute amount of light coming from a given source (luminous flux). It is not exactly brightness. (That would be lux: lumens per square meter.)
Light levels in a digital image or scene are expressed in arbitrary units from
0 meaning “Too little to Register” up to
MAX_VALUE of the datatype, meaning “Too high to Count.”
Note: this is true even of photos taken of a real-world scene. A single pixel might take a wide range of values depending on the aperture of the lens, the shutter speed, and the sensitivity (ISO) of the sensor chip. This is why we bracket several real exposures together to create HDR images of real scenes.
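To make that concrete, here's a toy sketch (not any real camera's model, and the scene values are made up) of why one exposure can't hold a whole high-contrast scene: the recorded value depends on aperture, shutter, and ISO, and a single exposure always clips at one end or the other.

```java
// Toy model: recorded pixel value as a function of scene light and
// exposure settings. Everything here is a hypothetical illustration.
public class ExposureSketch {
    static final int MAX_VALUE = 255; // 8-bit sensor output: "Too high to Count"

    // More light registers with a wider aperture (smaller f-number),
    // a longer shutter time, or a higher ISO.
    static int record(double sceneLight, double fNumber, double shutterSec, double iso) {
        double exposure = sceneLight * shutterSec * iso / (fNumber * fNumber);
        int value = (int) Math.round(exposure);
        return Math.min(Math.max(value, 0), MAX_VALUE); // clamp to the datatype
    }

    public static void main(String[] args) {
        double dimDetail = 40, brightDetail = 40000; // same scene, two areas
        // A short exposure keeps the bright area but the dim one reads 0
        // ("Too little to Register")...
        System.out.println(record(dimDetail, 8, 1.0 / 1000, 100));
        System.out.println(record(brightDetail, 8, 1.0 / 1000, 100));
        // ...a long exposure recovers the dim area but blows out the
        // bright one to MAX_VALUE. Bracketing combines both.
        System.out.println(record(dimDetail, 8, 1.0 / 4, 100));
        System.out.println(record(brightDetail, 8, 1.0 / 4, 100));
    }
}
```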
The actual brightness you get out of your screen will depend on the specific screen, its size, etc., but they're all still working on the basic
AS_MUCH_AS_I_CAN model. (Very oversimplified. Gamma correction, contrast ratios, etc. all tie into what that particular pattern of pixels looks like, as does whatever extra math is done in the video driver.)
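One piece of that extra math is easy to sketch: gamma correction. This assumes a plain power-law gamma of 2.2 (real sRGB uses a slightly more complicated piecewise curve, but the shape is close).

```java
// Minimal power-law gamma sketch: the value sent to the display is
// not linearly proportional to the light that comes out of it.
public class GammaSketch {
    static final double GAMMA = 2.2; // common approximation for sRGB displays

    // Linear light value in [0, 1] -> encoded value sent to the display.
    static double encode(double linear) {
        return Math.pow(linear, 1.0 / GAMMA);
    }

    // Encoded display value -> linear light actually emitted.
    static double decode(double encoded) {
        return Math.pow(encoded, GAMMA);
    }

    public static void main(String[] args) {
        // A stored value of 0.5 is NOT half the light output:
        System.out.println(encode(0.5)); // ~0.73 encoded
        System.out.println(decode(0.5)); // ~0.22 of full light output
    }
}
```

So a pixel stored at half range emits only about a fifth of the screen's full light output, which is part of why "what the numbers mean" is so slippery.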
The numbers that are stored in a
ColorRGBA are really only relative to one another, kind of the way that vertex positions are measured in generic units. They don't really mean anything on their own. (I know it's common to model real-world objects and set physics forces as if each unit = 1 meter, but that's just a convention. The unit could equally be a foot, an inch, or an angstrom. Whatever is most convenient for the detail that you are modeling!)
The hdr.glslib stuff is all about converting to and from the Radiance HDR data format, which trims a set of three floats down to three 8-bit mantissas sharing an 8-bit exponent. Much smaller to store, but harder to do math on.
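A rough Java sketch of that shared-exponent packing (the classic Radiance RGBE scheme, after Greg Ward's float2rgbe/rgbe2float; this is not the hdr.glslib code itself, just the same idea):

```java
// Pack three floats into 4 bytes: three 8-bit mantissas sharing one
// 8-bit exponent, chosen so the brightest channel fills its mantissa.
public class RgbeSketch {

    static int[] encode(float r, float g, float b) {
        float v = Math.max(r, Math.max(g, b));
        if (v < 1e-32f) return new int[] {0, 0, 0, 0};
        // frexp-style split: v = f * 2^e with f in [0.5, 1)
        int e = Math.getExponent(v) + 1;
        float scale = (float) (256.0 / Math.pow(2, e));
        return new int[] {
            (int) (r * scale), (int) (g * scale), (int) (b * scale),
            e + 128 // biased shared exponent byte
        };
    }

    static float[] decode(int[] rgbe) {
        if (rgbe[3] == 0) return new float[] {0, 0, 0};
        // Undo the exponent bias and the 8 mantissa bits in one step.
        float f = (float) Math.pow(2, rgbe[3] - (128 + 8));
        return new float[] {rgbe[0] * f, rgbe[1] * f, rgbe[2] * f};
    }

    public static void main(String[] args) {
        int[] packed = encode(1.0f, 0.25f, 40.0f);
        float[] back = decode(packed);
        // The brightest channel keeps ~8 bits of precision; the dim
        // channels lose precision because they must share its exponent.
        System.out.printf("%.3f %.3f %.3f%n", back[0], back[1], back[2]);
    }
}
```

That shared exponent is exactly why it's hard to do math on: you can't touch one channel without reconsidering the exponent all three share.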