Looking at the LWJGL Java docs, I ran across something interesting:
org.lwjgl.opengl.EXTCgShader
You can pass GL_CG_VERTEX_SHADER_EXT to glCreateShaderARB instead of GL_VERTEX_SHADER_ARB to create a vertex shader object that will parse and compile its shader source with the Cg compiler front-end rather than the GLSL front-end.
Doesn't that mean all you have to change is 2 lines of code and GLSLShaderObjectState becomes CgShaderObjectState?
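Something like this is what I have in mind. Just a rough, untested sketch against LWJGL's ARB shader objects API; the capability field name (GL_EXT_Cg_shader) and the error handling are my own assumptions:

import org.lwjgl.opengl.ARBShaderObjects;
import org.lwjgl.opengl.EXTCgShader;
import org.lwjgl.opengl.GLContext;

// Rough sketch: the same path GLSLShaderObjectState takes, except for the
// shader type constant, which routes the source through the Cg front-end.
// Assumes a current LWJGL OpenGL context whose driver exposes GL_EXT_Cg_shader.
public final class CgShaderSketch {
    public static int createCgVertexShader(String cgSource) {
        if (!GLContext.getCapabilities().GL_EXT_Cg_shader) {
            throw new IllegalStateException("GL_EXT_Cg_shader not supported by this driver");
        }
        // The one-line change: GL_CG_VERTEX_SHADER_EXT instead of GL_VERTEX_SHADER_ARB.
        int shader = ARBShaderObjects.glCreateShaderObjectARB(EXTCgShader.GL_CG_VERTEX_SHADER_EXT);
        ARBShaderObjects.glShaderSourceARB(shader, cgSource);
        ARBShaderObjects.glCompileShaderARB(shader);
        return shader;
    }
}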
Did you test it? I guess it's nVidia only… and I have an Ati card.
Why would it be nVidia only? I'll probably try to test it; it seems too easy. Every Cg demo I look at has a bunch of stuff like CGcontext, CGprofile and CGprogram, and here we'd only be filling in the program and the profile (Cg).
::edit::
If you're saying it's nVidia only because you tested it, you have an Ati card, and you figure that's the problem, then I'm probably not going to have better luck: I've got Ati too.
Cg is an nVidia technology. However, with nVidia's Cg developer tools you can compile Cg to the ARB vertex/fragment program assembly targets (the ones behind GL_VERTEX_SHADER_ARB and GL_FRAGMENT_SHADER_ARB style shading), and now to GLSL too. Of course, most likely this will still work better on nVidia cards.
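If I remember right, the standalone cgc compiler lets you pick the output profile on the command line; something like this (profile and flag names from memory, so double-check against the Cg docs):

cgc -profile arbvp1 -entry main shader.cg -o shader.vp
cgc -profile glslv -entry main shader.cg -o shader.vert

The first should give you ARB vertex program assembly, the second GLSL source, which you could then feed to any card's driver.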
Likewise, Ati has RenderMonkey, which doesn't always work on nVidia. (I don't know if there are any OpenGL extensions specific to this.)
RenderMonkey is just a shader IDE; there would be no reason for it not to work on nVidia cards.
It's made by Ati. For example, when it came out it used PS1.4, while nVidia only had PS1.1 cards. So whether you think there should be a reason or not, it didn't always work. You can expect it to always work perfectly with Ati cards, but don't be surprised if it works less well with nVidia sometimes.
It's also not just an IDE; it has some technology you can use in your game engine, but that's DirectX-based afaik. Like I said, I don't know if they have something similar for OpenGL (they certainly keep quieter about it if they do).