Pbuffer crashes

Has anyone else experienced crashing when using the Pbuffer stuff? I can’t run anything that uses it at all. Here’s the output from the TestCameraMan demo, though any demo that uses pbuffers crashes in exactly the same place:

May 11, 2005 1:42:08 PM com.jme.renderer.lwjgl.LWJGLTextureRenderer <init>
INFO: Copy Texture Pbuffer supported!
#
# An unexpected error has been detected by HotSpot Virtual Machine:
#
#  SIGSEGV (0xb) at pc=0xb137f0ee, pid=31232, tid=3085492352
#
# Java VM: Java HotSpot(TM) Client VM (1.5.0_02-b09 mixed mode, sharing)
# Problematic frame:
# C  [liblwjgl.so+0x1e0ee]  Java_org_lwjgl_opengl_LinuxPbufferPeerInfo_nInitHandle+0x12e
#
# An error report file with more information is saved as hs_err_pid31232.log
#
# If you would like to submit a bug report, please visit:
#   http://java.sun.com/webapps/bugreport/crash.jsp
#


I'm using the latest LWJGL from CVS. But the code in question hasn't changed since February, so I don't think it's newly introduced.

As you can see I'm using Linux. I assume it's specific to that platform, but it is making it somewhat challenging for me to experiment with using render to texture to implement UI viewports and scrollable areas. Maybe it's just something with my setup, though I'm using the latest nvidia drivers (7174). I'll try it on some of the other Linux machines around here. But perhaps someone else has seen this problem and there's an easy fix.

Sorry, I haven’t seen that before.



It appears to be dying in the actual JNI OpenGL call, so it’s a low-level error.



http://www.lwjgl.org/demos.php



There is a pbuffer test there. Does that run for you?



Unfortunately, it’s launched via Web Start, so we can’t guarantee it’s using the same libraries, etc. But it’s a good place to start.

Hmm… those seem to work. I’ll have to experiment with different LWJGL versions and see if I can sort out at what point things went pear-shaped.

FYI: I eventually tracked down the source of this error.



At the lowest level, LWJGL wasn’t checking for a NULL return value, which caused the segfault, but it really should have caught the problem earlier, when it requested the creation of a GL visual for which there was no matching X visual.



This led me to discover that LWJGLTextureRenderer is hard-coded to request a 32-bit Pbuffer with an 8-bit depth buffer. My X server happened to be configured such that no 32-bit visuals were available, which (once the LWJGL bugs were fixed) was causing the Pbuffer constructor to fail. Then jME was silently ignoring that error and throwing a NullPointerException a bit later.



So my proposal is at the very least this diff:


@@ -295,7 +295,8 @@ public class LWJGLTextureRenderer implem
 
         try {
             pbuffer = new Pbuffer(PBUFFER_WIDTH, PBUFFER_HEIGHT,
-                    new PixelFormat(32, 0, 8, 0, 0), texture, null);
+                    new PixelFormat(display.getBitDepth(), 0, 8, 0, 0),
+                    texture, null);
         } catch (Exception e) {
             LoggingSystem.getLogger().throwing(this.getClass().toString(),
                     "initPbuffer()", e);



but in the long term it seems like one would want to be able to specify the pixel format oneself (I don't need a depth buffer in this case, for example, since I'm just doing UI rendering), but the TextureRenderer interface does not make this possible.
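As a sketch of what that might look like (the names here are invented for illustration, not actual jME API), the interface could accept a small pixel-format description instead of hard-coded values:

```java
// Hypothetical pixel-format description that a future TextureRenderer API
// could accept; the field order mirrors LWJGL's
// PixelFormat(bpp, alpha, depth, stencil, samples) constructor used in the
// diff above. Invented for illustration, not actual jME API.
public class PixelFormatSpec {
    public final int bitsPerPixel;
    public final int alphaBits;
    public final int depthBits;   // 0 = no depth buffer, enough for pure UI rendering
    public final int stencilBits;

    public PixelFormatSpec(int bitsPerPixel, int alphaBits,
                           int depthBits, int stencilBits) {
        this.bitsPerPixel = bitsPerPixel;
        this.alphaBits = alphaBits;
        this.depthBits = depthBits;
        this.stencilBits = stencilBits;
    }
}
```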

I would also suggest this fix as silently ignoring errors makes for difficult debugging:

@@ -310,9 +311,11 @@ public class LWJGLTextureRenderer implem
                 texture = null;
                 useDirectRender = false;
                 initPbuffer();
-                return;
-            } else
-                return;
+            } else {
+                LoggingSystem.getLogger().log(
+                    Level.WARNING, "Failed to create Pbuffer.", e);
+            }
+            return;
         }
         try {
             activate();



Anyhow, at least I can now get back to implementing a ScrollPane in BUI.

Thanks, sam. I’m working on getting these fixes in (including allowing for the specification of the levels).

Ok, sam. That’s improved. You can now call createTextureRenderer from the display system and provide it values. If no values are provided it will use the display system and set the texture renderer to the same as the regular renderer.
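The defaulting rule could be sketched like this (invented names, not the actual jME code): any value the caller leaves at zero falls back to the corresponding display-system setting.

```java
// Sketch of the defaulting rule described above, with invented names
// (not actual jME API): a requested value of zero (or less) means
// "use the display system's setting".
public class TextureRendererDefaults {
    public static int resolve(int requested, int displayValue) {
        return requested > 0 ? requested : displayValue;
    }
}
```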



That’s checked in.

Great, thanks!



So I’m having trouble getting the TextureRenderer to work with things in the ORTHO queue. Is there anything I should be watching out for? I get things rendering (albeit strangely) when I take my UI elements out of the ortho queue, but when they’re in there, they all seem to get rendered in a teeny tiny corner of the texture (it’s hard to tell, but something is showing up down there).



If there’s anything obvious I should be worrying about, I’d be glad to hear about it. Otherwise I’ll keep poking around and eventually put together some standalone test case.

It might be related to how the camera you are using for the tex renderer is set up. After all, the screen coordinates of an ortho rendered object are related to the camera setup. Give that a good look over when you put things together in a test.
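One possible mismatch, sketched with an invented helper (this is a guess, not a diagnosis): if the ortho projection still spans the full screen while the render target is a smaller pbuffer, an element positioned and sized in screen pixels ends up covering only a proportional patch of texels in one corner of the texture:

```java
// Rough back-of-envelope for the "tiny corner" symptom: an element laid out
// in screen pixels, rendered through a screen-sized ortho projection into a
// smaller texture, covers only a proportional number of texels.
// Invented helper, not jME API.
public class OrthoTexels {
    public static int sizeInTexels(int elementPixels, int screenPixels,
                                   int texturePixels) {
        return elementPixels * texturePixels / screenPixels;
    }
}
```

For example, a 100-pixel-wide widget on a 640-pixel-wide screen would occupy only 40 texels of a 256-pixel-wide texture under those assumptions.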