Shaders on Android devices

Hey guys, I’m currently playing with shaders on my Nexus One Android device.

But… using a desktop shader on an Android device can be a real hassle.



I already stumbled on some pitfalls, so I’m going to list them here as I try new things. I hope this helps any Android developers wanting to get more into shaders. Feel free to add any comments, or your own experience with shaders and Android.


  • The Nexus One only supports 32 varying floats (or, as the error states, 8 vec4 varyings)
  • Multiplying an int and a float like this



    float result = myInt * myFloat;



    just fails silently (null error with a partial output of the shader code in the log -_-)

    Use



    float result = float(myInt) * myFloat;



    It took me hours to figure this one out.
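To show the fix in context, here is a minimal fragment shader sketch (GLSL ES); the uniform names are made up for illustration:

```glsl
// Minimal GLSL ES fragment shader -- uniform names are hypothetical
precision mediump float;

uniform int m_Steps;      // integer uniform (illustrative)
uniform float m_StepSize; // float uniform (illustrative)

void main() {
    // float total = m_Steps * m_StepSize;     // implicit int*float: fails on GLES / strict drivers
    float total = float(m_Steps) * m_StepSize; // explicit conversion: portable
    gl_FragColor = vec4(vec3(total), 1.0);
}
```

The explicit `float()` constructor is the portable form; GLSL ES has no implicit int-to-float conversion, so strict compilers reject the commented-out line.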

@nehon



I found a tutorial on the net that produced good results. I was able to build the app and will post the link below so you can test it and see if you like it. I’m trying to create shaders that work with OpenGL ES, since it has more restrictions than desktop OpenGL. I’m also toying around with the framebuffer.



Link

Just a reminder:

http://hub.jmonkeyengine.org/groups/general/forum/topic/fast-mobile-shaders-talk-at-siggraph/

@techsonic, that’s nice; how is the framerate, though?

@nehon said:

    float result = myInt * myFloat;

    Just fails silently (null error with a partial output of the shader code in the log -_-)

    use

    float result = float(myInt) * myFloat;

    took me hours to figure this one


Actually, ANY implicit conversion is forbidden on desktop as well. Let me guess: you have an NVIDIA graphics card? Because on AMD that code would fail (at least on my current card and my last one).
@EmpirePhoenix said:
Actually, ANY implicit conversion is forbidden on desktop as well. Let me guess: you have an NVIDIA graphics card? Because on AMD that code would fail (at least on my current card and my last one).

He said that 5 months ago; we are well aware of that, thanks :) Actually, any shader with this issue fails to compile on Mac, so we use Apple’s OpenGL stack as a conformity check, really ^^

No, this

    float result = myInt * myFloat;

works fine on any decent desktop card, and would work on a Mac too.

For the record, I have an ATI card, or I would pull my hair out each time I release a shader.

The problem here is more that it just fails silently instead of throwing the appropriate error…
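One way to surface these errors yourself rather than relying on the driver is to query the compile status and info log after compiling. A sketch using Android’s `GLES20` API (this assumes a valid GL context is current on the calling thread; the method name is made up for illustration):

```java
import android.opengl.GLES20;

public class ShaderUtil {
    // Compiles a shader and throws with the driver's info log on failure,
    // instead of letting a bad shader fail silently later.
    public static int compileShader(int type, String source) {
        int shader = GLES20.glCreateShader(type);
        GLES20.glShaderSource(shader, source);
        GLES20.glCompileShader(shader);

        int[] status = new int[1];
        GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, status, 0);
        if (status[0] == 0) {
            String log = GLES20.glGetShaderInfoLog(shader);
            GLES20.glDeleteShader(shader);
            throw new RuntimeException("Shader compile failed: " + log);
        }
        return shader;
    }
}
```

Whether the info log actually contains something useful varies by driver, which is exactly the problem being described here, but checking `GL_COMPILE_STATUS` at least turns a silent failure into a loud one.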

@nehon



http://i.imgur.com/AJghm.jpg



Not bad! I have an HTC Evo 4G, so it’s not bad.

Good!