Invalid Enum: Intel 965GM

I know this has been talked about before but this issue is still unresolved for me.

I am running Ubuntu 8.10 (Intrepid) on a MacBook 3,1 with an Intel 965GM graphics processor and Mesa DRI support.



When I try to run the test app TestBoxColor (or any of the other test apps), I get the following error.


     [java] Jan 12, 2009 3:42:17 PM class jmetest.renderer.TestBoxColor start()
     [java] SEVERE: Exception in game loop
     [java] org.lwjgl.opengl.OpenGLException: Invalid enum (1280)
     [java]    at org.lwjgl.opengl.Util.checkGLError(Util.java:54)
     [java]    at org.lwjgl.opengl.Display.swapBuffers(Display.java:626)
     [java]    at org.lwjgl.opengl.Display.update(Display.java:645)
     [java]    at com.jme.renderer.lwjgl.LWJGLRenderer.displayBackBuffer(LWJGLRenderer.java:516)
     [java]    at com.jme.app.BaseGame.start(BaseGame.java:90)
     [java]    at jmetest.renderer.TestBoxColor.main(TestBoxColor.java:66)
     [java]    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     [java]    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
     [java]    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
     [java]    at java.lang.reflect.Method.invoke(Method.java:597)
     [java]    at jmetest.TestChooser.start(TestChooser.java:465)
     [java]    at jmetest.TestChooser.main(TestChooser.java:444)



Anyone know how to fix this?

Thanks

Are you sure you have the newest (jME 2.0) code?



This is the 'new' location (as of the release of 2.0)…

http://code.google.com/p/jmonkeyengine/

Yep.  Just to make sure, I re-downloaded (via Subversion), rebuilt, and re-ran.  I get the same error.  I don't think it's a jME-specific problem; other OpenGL apps like Tremulous and Regnum Online don't work properly either.  Both of those games do boot up, but they have texture problems.

I got the same thing when I ran my game on an Eee PC.

I know what it is; I fixed it in the JOGL renderer. Your graphics card doesn't recognize the values used with glGetInteger, glGetFloat, or another such call (I don't remember exactly which), and glGetError then reports it as an invalid enum. I had the same kind of error with JOGL 1.1.1 on my ATI Radeon 9250 Pro. As I don't use LWJGL, I'd rather not risk breaking everything. Maybe someone else can fix it.

I looked at the source code, and it does seem to happen in the LWJGL code :frowning: but the symptom looks like what I got before modifying JOGLContextCapabilities.java:


try {
   gl.glGetIntegerv(GL.GL_MAX_VERTEX_ATTRIBS_ARB, intBuf);
   GL_MAX_VERTEX_ATTRIBS_ARB = intBuf.get(0);
} catch (GLException gle) {
   // the driver rejects this query, so fall back to 0 instead of letting the exception escape
   GL_MAX_VERTEX_ATTRIBS_ARB = 0;
}


Do you see what I mean?
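
For the LWJGL side, a rough equivalent might look something like this sketch (my assumption, not the actual jME fix; the class and method names are made up). The point is to read glGetError() right after the query, so a rejected enum doesn't leave GL_INVALID_ENUM set for the next swapBuffers():


import java.nio.IntBuffer;

import org.lwjgl.BufferUtils;
import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.ARBVertexShader;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.opengl.GL11;

public final class SafeCapsQuery {

   /** Returns the queried integer, or the fallback if the driver rejects the enum. */
   public static int getIntegerOrDefault(int pname, int fallback) {
      IntBuffer buf = BufferUtils.createIntBuffer(16); // LWJGL requires a buffer of at least 16 ints
      GL11.glGetInteger(pname, buf);
      // A driver that doesn't know pname records GL_INVALID_ENUM instead of throwing,
      // so read (and thereby clear) the error flag here.
      if (GL11.glGetError() != GL11.GL_NO_ERROR) {
         return fallback;
      }
      return buf.get(0);
   }

   public static void main(String[] args) throws LWJGLException {
      Display.setDisplayMode(new DisplayMode(640, 480));
      Display.create();
      try {
         int maxAttribs = getIntegerOrDefault(ARBVertexShader.GL_MAX_VERTEX_ATTRIBS_ARB, 0);
         System.out.println("GL_MAX_VERTEX_ATTRIBS_ARB = " + maxAttribs);
      } finally {
         Display.destroy();
      }
   }
}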

I don't understand why you get that exception. I'm looking at the LWJGL source code:


public static void checkGLError() throws OpenGLException {
      int err = GL11.glGetError();
      if ( err != GL11.GL_NO_ERROR ) {
         throw new OpenGLException(err);
      }
   }



I haven't found the call that uses an invalid enum; maybe it is in BaseSimpleGame or in LWJGLRenderer.
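
If someone wants to narrow it down, one approach (just a debugging sketch; the helper class is made up, only Util.checkGLError() is real LWJGL) is to re-check the error flag right after each suspect GL call in LWJGLRenderer, so the exception points at the call that actually set the flag instead of at the next swapBuffers():


import org.lwjgl.opengl.OpenGLException;
import org.lwjgl.opengl.Util;

public final class GLErrorCheckpoint {

   /** Rethrows any pending GL error, naming the call that was just executed. */
   public static void after(String lastCall) {
      try {
         Util.checkGLError();
      } catch (OpenGLException e) {
         // the call named in lastCall is the one the driver rejected
         System.err.println("GL error right after " + lastCall);
         throw e;
      }
   }
}

// usage, sprinkled through the render path:
//    GL11.glEnable(GL11.GL_TEXTURE_2D);
//    GLErrorCheckpoint.after("glEnable(GL_TEXTURE_2D)");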

Please be kind to us, as the Intel 965GM is a particularly bad chip  :x

Looks like it's in the LWJGL code.



So does the JOGL renderer have this problem too?

That error is coming from LWJGL itself; I'm afraid you may have to go to their forum to get better help resolving this issue.



If you do, would you mind coming back here and giving us a quick update?



Here's the checkGLError method:


   /**
    * Throws OpenGLException if GL11.glGetError() returns anything else than GL11.GL_NO_ERROR
    *
    */
   public static void checkGLError() throws OpenGLException {
      int err = GL11.glGetError();
      if ( err != GL11.GL_NO_ERROR ) {
         throw new OpenGLException(err);
      }
   }



the swapBuffers method:


   /**
    * Swap the display buffers. This method is called from update(), and should normally not be called by
    * the application.
    *
    * @throws OpenGLException if an OpenGL error has occured since the last call to GL11.glGetError()
    */
   public static void swapBuffers() throws LWJGLException {
      synchronized ( GlobalLock.lock ) {
         if ( !isCreated() )
            throw new IllegalStateException("Display not created");

         Util.checkGLError();
         Context.swapBuffers();
      }
   }





and the update method:


   /**
    * Update the window. This calls processMessages(), and if the window is visible
    * clears the dirty flag and calls swapBuffers() and finally polls the input devices.
    *
    * @throws OpenGLException if an OpenGL error has occured since the last call to GL11.glGetError()
    */
   public static void update() {
      synchronized ( GlobalLock.lock ) {
         if ( !isCreated() )
            throw new IllegalStateException("Display not created");

         processMessages();
         // We paint only when the window is visible or dirty
         if ( display_impl.isVisible() || display_impl.isDirty() ) {
            try {
               swapBuffers();
            } catch (LWJGLException e) {
               throw new RuntimeException(e);
            }
         }

         pollDevices();
         if ( parent_resized ) {
            reshape();
            parent_resized = false;
         }
      }
   }


Conzar said:

Looks like it's in the LWJGL code.

So does the JOGL renderer have this problem too?

No. I use JOGL with a worse graphics card than yours and it works fine. I'm not sure, but LWJGL doesn't seem to handle errors the way JOGL does. In the JOGL renderer, OpenGL errors immediately cause an exception, so it is trivial to see which call is raising the invalid enum; that let me fix them quickly.
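
For reference, here is roughly what that looks like in plain JOGL 1.1.1 (an assumed minimal setup, not jME's actual pipeline): wrapping the GL in DebugGL makes every GL call check glGetError() and throw a GLException at the exact call that used the bad enum.


import javax.media.opengl.DebugGL;
import javax.media.opengl.GL;
import javax.media.opengl.GLAutoDrawable;
import javax.media.opengl.GLEventListener;

public class DebugGLSetup implements GLEventListener {

   public void init(GLAutoDrawable drawable) {
      // from now on, an invalid enum fails at the offending call,
      // not later in a buffer swap
      drawable.setGL(new DebugGL(drawable.getGL()));
   }

   public void display(GLAutoDrawable drawable) {
      GL gl = drawable.getGL();
      gl.glClear(GL.GL_COLOR_BUFFER_BIT);
   }

   public void reshape(GLAutoDrawable drawable, int x, int y, int width, int height) {
   }

   public void displayChanged(GLAutoDrawable drawable, boolean modeChanged, boolean deviceChanged) {
   }
}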

However, I agree with basixs: it would be good to know how to fix it and to be sure it comes from LWJGL.

Please use lwjgl-debug.jar instead of lwjgl.jar. It will help us find out where your bug comes from.

It looks like you have to build lwjgl-debug.jar yourself.



The LWJGL libraries were just updated (less than a week ago), and this issue may have been resolved with that. Update and give it another shot :).

Same problems here under Linux. DDS loading doesn't work either. I got around this Intel bug, more or less, with a workaround like this:



Override render() (before displayBackBuffer) in your BaseSimpleGame extension… the checkCardError call is what prevents the crash.



   protected final void render(float interpolation) {

      super.render(interpolation);
      
      Renderer r = display.getRenderer();

      TrimeshGeometryBatch.passedTimeCalculated = false;

      /** Draw the rootNode and all its children. */
      r.draw(rootNode);

      /** Call simpleRender() in any derived classes. */
      simpleRender();

      /** Draw the fps node to show the fancy information at the bottom. */
      //r.draw(fpsNode);

      doDebug(r);
      try {
         DisplaySystem.getDisplaySystem().getRenderer().checkCardError();
      } catch (Exception ex) {
         // swallow the spurious GL_INVALID_ENUM the Intel driver reports here
      }

   }



This way at least it runs well, though the exception handling might cause a bit of a slowdown... :( It seems like a bug in LWJGL or the Intel driver; cards from other brands work fine.
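
If the exception handling worries you performance-wise, a lighter variant of the same workaround (just a sketch, I haven't tried it on a 965GM) is to read and discard the error flag yourself before the buffer swap, since glGetError() clears the flag it returns:


import org.lwjgl.opengl.GL11;

public final class GLErrorFlush {

   /** Reads and discards any pending OpenGL error so the next swapBuffers() won't throw. */
   public static void flush() {
      // glGetError() returns and clears the current error flag; loop in case the
      // driver has recorded more than one error since the last check
      while (GL11.glGetError() != GL11.GL_NO_ERROR) {
         // intentionally ignored: the Intel driver reports a spurious GL_INVALID_ENUM here
      }
   }
}

// call GLErrorFlush.flush() at the end of render(), instead of the try/catch above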