Writing to 128-bit RGBA textures

Hi,

I presented a program here: http://www.jmonkeyengine.com/jmeforum/index.php?topic=9005.0. It works fine with my HD 2600 card. When zooming deeper, the program writes data from a shader to multiple 128-bit float textures. With an X1600 card and also with an NVIDIA 6200, the values were not stored properly (=> ugly pixels). Do you know why?

What I don't understand is that with my HD 2600 everything works fine.
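To make the setup a bit more concrete, here is roughly the kind of fragment shader I mean; it is only a simplified sketch with placeholder names, not the actual shader. Each output goes through gl_FragData into its own RGBA32F (128-bit) texture.

// Simplified sketch of an MRT fragment shader writing float values
// into two RGBA32F (128-bit) colour attachments.
// All names are placeholders, not the ones from the real program.
uniform sampler2D prevState;   // results of the previous zoom step

void main()
{
    vec4 state = texture2D(prevState, gl_TexCoord[0].st);

    // each half of the high-precision state goes to its own float texture
    gl_FragData[0] = vec4(state.xy, 0.0, 1.0);   // first render target
    gl_FragData[1] = vec4(state.zw, 0.0, 1.0);   // second render target
}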



Best,

Andreas

Hi,

The picture above was produced by an NVIDIA 6200 card. On the ATI X1600, I got a mixture of correctly calculated and light blue pixels. That looks like an addressing problem, but I still don't know how to solve it.



Best,

Andreas

Hi,

writing to only one float texture doesn't seem to be a problem, but with MRT and gl_FragData it doesn't work. I looked at http://www.gamedev.net/community/forums/topic.asp?topic_id=414665. Perhaps LWJGLTextureRenderer's render(ArrayList<? extends Spatial> toDraw, ArrayList<Texture> texs, boolean doClear) should be modified. Right now I have no access to an NVIDIA 6200 or ATI X1600 for experiments, and on my HD 2600 everything is fine.
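For now, the only MRT-free fallback I can think of would be to render once per output texture and select the result with a uniform, so that every pass writes through gl_FragColor into a single float texture. Just a rough sketch of that idea, again with placeholder names:

// Sketch of an MRT-free fallback: the same pass is rendered once per
// output texture, and a uniform selects which result gets written.
// Placeholder names only; this is not the actual shader.
uniform sampler2D prevState;   // results of the previous zoom step
uniform int outputIndex;       // 0 = first texture, 1 = second texture

void main()
{
    vec4 state = texture2D(prevState, gl_TexCoord[0].st);

    if (outputIndex == 0)
        gl_FragColor = vec4(state.xy, 0.0, 1.0);   // first pass / texture
    else
        gl_FragColor = vec4(state.zw, 0.0, 1.0);   // second pass / texture
}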



Best,

Andreas

Thanks for reporting this. We really need a special area for hardware compatibility issues in the forum.

Hi,

I just downgraded my computer to an X1650 and got com.jme.system.JmeException: Error in opengl: Invalid operation (1282). I will see if I can fix it.



Andreas

Hi,

the program was not working because my shader had too many ALU instructions. So I split the shader into two shaders and used textures to store the intermediate results between the passes.
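Schematically the split looks like this; both shaders are only simplified sketches with placeholder names, the real ones are much longer:

// First shader (pass 1): does the first half of the work and stores the
// intermediate values, at full precision, in a 128-bit float texture.
uniform sampler2D prevState;

void main()
{
    vec4 state = texture2D(prevState, gl_TexCoord[0].st);
    // ... first part of the ALU-heavy computation ...
    gl_FragColor = state;   // intermediate result written to the texture
}

// Second shader (pass 2): reads the intermediate texture written by pass 1
// and finishes the computation.
uniform sampler2D intermediate;

void main()
{
    vec4 state = texture2D(intermediate, gl_TexCoord[0].st);
    // ... second part of the computation ...
    gl_FragColor = state;   // final result
}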



Best,

Andreas