So I am mucking around with the PBR pipeline. I really like the results so far, and to make life hard on myself I am generating my own HDR maps in Blender.
It all works pretty well up until I add some nebula billboards. Then I get the following error:

    java.lang.UnsupportedOperationException: NaN to half conversion not supported!
        at com.jme3.math.FastMath.convertFloatToHalf(FastMath.java:971)
        at com.jme3.texture.image.DefaultImageRaster.setPixel(DefaultImageRaster.java:134)
        at com.jme3.environment.util.CubeMapWrapper.setPixel(CubeMapWrapper.java:196)
        at com.jme3.environment.generation.IrradianceMapGenerator.generateIrradianceMap(IrradianceMapGenerator.java:167)
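For anyone hitting the same thing: since the exception happens deep inside probe generation, one way to narrow it down is to scan the raw float pixel data of the loaded image for non-finite values before the probe is ever built. This is just a minimal standalone sketch of that idea; the buffer here is fabricated for illustration (in jME you'd pull the real one from the image yourself), so the class and helper names are mine, not engine API.

```java
import java.nio.FloatBuffer;

public class NanScan {
    // Counts NaN/Infinity entries in a buffer of raw float pixel components.
    static int countNonFinite(FloatBuffer buf) {
        int bad = 0;
        buf.rewind();
        while (buf.hasRemaining()) {
            float v = buf.get();
            if (Float.isNaN(v) || Float.isInfinite(v)) {
                bad++;
            }
        }
        return bad;
    }

    public static void main(String[] args) {
        // Fabricated stand-in for an image's float data.
        FloatBuffer pixels = FloatBuffer.wrap(new float[] {
            0.1f, 0.5f, 0.9f,
            Float.NaN, 1.0f, Float.POSITIVE_INFINITY
        });
        System.out.println(countNonFinite(pixels)); // prints 2
    }
}
```

If this reports bad values straight after loading, the problem is in the loader/decode path rather than in the irradiance math.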
I am using the TestPBRLighting code as is, with just the HDR map replaced.
So I have been playing with the images and I can't really figure out what is wrong, other than it being related to some kind of gradient. I tried making my own nebula, and of course it could be some sort of problem in Blender. However, without the image billboards it works perfectly.
On this page, the “milkyway” zip contains a milkyway_small.hdr that has the same problems: sIBL Archive
Awesome, I managed to get something working. I used a full skymap in Blender rather than billboards. Obviously I would like to do my own both with and without billboards, so knowing what is wrong would be grand, but no rush. Also, I did go through the source a little; nothing jumped out at me.
While you're here: when I set the EnvironmentCamera size to anything other than 128, it seems to just not produce a light probe. In fact, 127 causes a hard crash in my video drivers, and other values just never seem to produce a probe at all. It doesn't really matter since 128 works for me, but I thought I'd mention it.
OK, I narrowed the issue down to FastMath.convertHalfToFloat producing NaN or Infinite values. That seems intended for certain bit patterns, and I can't question the math behind it, so I guess the issue comes from the data in the image fed to convertHalfToFloat, so basically… the HDRLoader.
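To convince myself that Inf/NaN results really can be intended for certain inputs, I reimplemented a textbook IEEE 754 half-to-float decode (my own sketch, not the FastMath source): when the five exponent bits are all ones, the pattern is supposed to decode to infinity (zero mantissa) or NaN (non-zero mantissa), so some 16-bit values legitimately become non-finite floats.

```java
public class HalfDecode {
    // Textbook IEEE 754 binary16 -> binary32 conversion.
    // My own sketch for checking; not jME's FastMath implementation.
    static float halfToFloat(int h) {
        int sign = (h & 0x8000) << 16;
        int exp  = (h >> 10) & 0x1F;
        int mant = h & 0x03FF;
        if (exp == 0x1F) {
            // Exponent all ones: infinity (mant == 0) or NaN (mant != 0).
            return Float.intBitsToFloat(sign | 0x7F800000 | (mant << 13));
        }
        if (exp == 0) {
            // Zero or subnormal: value is mant * 2^-24.
            float v = mant * 0x1p-24f;
            return sign != 0 ? -v : v;
        }
        // Normal number: re-bias the exponent from 15 to 127.
        return Float.intBitsToFloat(sign | ((exp + 112) << 23) | (mant << 13));
    }

    public static void main(String[] args) {
        System.out.println(halfToFloat(0x3C00));              // prints 1.0
        System.out.println(halfToFloat(0x7C00));              // prints Infinity
        System.out.println(Float.isNaN(halfToFloat(0x7E00))); // prints true
    }
}
```

So non-finite outputs by themselves don't prove the converter is broken; the question is how those bit patterns got into the pixel buffer in the first place.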
Makes sense, those spots are weird anyway.
I did read your implementation notes. Once I thought about it, it made sense: it's the size of a texture that is also mipmapped. However, it mostly just doesn't render the probe, which is why I was wondering why my scene was so dark and didn't appear properly lit. But hey, I found it.
Yeah, I don't know what the deal is with those few HDR files. The thing is, I use very bright square emitters without issue in Blender, yet some alpha-blended billboards break it. But for now a full skymap is working.
Man… I don't get it… some pixels are read with weird values. For example:
This is the raw RGBE data: r: -32, g: 119, b: 119, e: 114 (e is the shared exponent byte).
This converts to float rgb as :
r: 5.340576E-5, g: 2.8371811E-5, b: 2.8371811E-5
which to me looks like pretty much black…
Then it's converted to half float (16-bit components) and cast to short (I guess because it's stored in a short buffer):
r: 768, g: 32624, b: 32624
which makes little sense to me, but whatever.
The problem is that 32624 produces POSITIVE_INFINITY once converted back to float (yeah, because we need floats).
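Out of curiosity I ran those floats through a textbook IEEE 754 float-to-half conversion (again my own sketch, not FastMath's convertFloatToHalf). Both values sit below the smallest normal half (2^-14 ≈ 6.1e-5), so they should encode as half subnormals: 896 for red and 476 for green/blue, i.e. still tiny values. Getting 768 and 32624 instead points at the subnormal path of the conversion. And 32624 is 0x7F70, which has all exponent bits set, so under strict IEEE rules it's actually a NaN pattern rather than infinity; whichever way it gets decoded, it's non-finite garbage for a pixel that should be nearly black.

```java
public class HalfEncode {
    // Textbook IEEE 754 binary32 -> binary16 conversion, truncating rather
    // than rounding to nearest (enough here, since both inputs are exactly
    // representable). My own sketch for checking, not jME's FastMath code.
    static int floatToHalfBits(float f) {
        int bits = Float.floatToIntBits(f);
        int sign = (bits >>> 16) & 0x8000;
        int abs  = bits & 0x7FFFFFFF;
        if (abs >= 0x7F800000) {            // float Inf or NaN
            return sign | 0x7C00 | (abs > 0x7F800000 ? 0x0200 : 0);
        }
        if (abs >= 0x47800000) {            // >= 2^16: overflows to half Inf
            return sign | 0x7C00;
        }
        if (abs >= 0x38800000) {            // normal half range (>= 2^-14)
            return sign | ((abs - 0x38000000) >>> 13);
        }
        if (abs < 0x33800000) {             // < 2^-24: truncates to zero
            return sign;
        }
        // Subnormal half: value * 2^24, i.e. the 24-bit mantissa (with the
        // implicit leading 1) shifted right by (126 - biased float exponent).
        int exp  = abs >>> 23;
        int mant = (abs & 0x7FFFFF) | 0x800000;
        return sign | (mant >> (126 - exp));
    }

    public static void main(String[] args) {
        System.out.println(floatToHalfBits(224f * 0x1p-22f)); // prints 896
        System.out.println(floatToHalfBits(119f * 0x1p-22f)); // prints 476
    }
}
```

So if the shorts in the buffer really are 768 and 32624, the float-to-half step has mangled these subnormal values, which would neatly explain where the non-finite pixels come from.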
The weirdest thing is that those pixels aren't actually the spots that you can see in the picture…