LWJGLException: Insufficient depth buffer precision

Greetings, Oh Wise Ones!

I am shiny and new, and of course I would like to start off by saying jMonkey has me very excited! Thanks to all involved, and I hope to help add to this wonderful thing in future (in some small way).

I am learning the ropes, and am going through the Packt Publishing book, jMonkeyEngine 3.0 Beginner’s Guide. Shortly into things I have encountered the error specified in the topic. I realize there is another post that mentions it here, but I am unclear on this for two reasons:

First though, here is the main code, directly from the book’s code supplement material:
[java]public class AppSettingsDemo extends SimpleApplication {

    /** Start the jMonkeyEngine application */
    public static void main(String[] args) {
        // read GraphicsDevice attributes
        GraphicsDevice device = GraphicsEnvironment.
                getLocalGraphicsEnvironment().getDefaultScreenDevice();
        DisplayMode[] modes = device.getDisplayModes();
        // specify display settings based on DisplayMode
        AppSettings settings = new AppSettings(true);
        settings.setResolution(modes[0].getWidth(), modes[0].getHeight());
        settings.setDepthBits(modes[0].getBitDepth());
        settings.setTitle("My Cool Game"); // only visible if not fullscreen
        settings.setSamples(2);            // anti-aliasing
        // activate display settings and start app
        AppSettingsDemo app = new AppSettingsDemo();
        app.setSettings(settings);         // apply settings to app
        app.setShowSettings(false);        // don't ask user for settings
        app.start();                       // use settings and run
    }

    /** Initialize the scene here:
     *  Create Geometries and attach them to the rootNode. */
    public void simpleInitApp() {
        setDisplayFps(false);      // hide frames-per-sec display
        setDisplayStatView(false); // hide debug statistics display

        Box b = new Box(1, 1, 1);
        Geometry geom = new Geometry("Box", b);
        Material mat = new Material(assetManager,
                "Common/MatDefs/Misc/Unshaded.j3md");
        mat.setColor("Color", ColorRGBA.Blue);
        geom.setMaterial(mat);
        rootNode.attachChild(geom); // make geometry appear in scene
    }
}[/java]

I will assume this code is good form. The error occurs on this line:

[java]settings.setDepthBits(modes[0].getBitDepth());[/java]
Two things confuse me:

  1. I am confused as to why setting the depth causes an error at all. If I remove the .setDepthBits() call, the problem is no more (the default of 24 bits is used, and indeed, I can leave the call in if I pass 24 explicitly). And yet the modes array clearly reports a resolution and bit depth that are supposedly correct for the card: all of the values are pulled from the same entry, modes[0], which would seem to indicate a properly supported combination, yes?

  2. Is this going to cause issues when running on different devices? Android, for instance, and perhaps iOS? And if so, what is the safest way to handle bit depth and resolution?

Thanks in advance to anyone who undertakes the herculean task of enlightening this muddled noob. Oh, have I mentioned how awesome you are? :slight_smile:

WIN 7 64-bit
Nvidia GTX 460 - driver up to date
jMonkey 3.0 64-bit version

Don’t do this:

[java]settings.setDepthBits(modes[0].getBitDepth());[/java]
If the book tells you to do that then it is confused. Bit depth is not the same as depth bits.
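To make the distinction concrete, here is a tiny stand-alone sketch (plain AWT only, no jME classes needed; the list of "common" z-buffer sizes is my assumption for illustration, not an exhaustive spec). DisplayMode.getBitDepth() describes color precision per pixel, not z-buffer precision:

```java
import java.awt.DisplayMode;

public class DepthVsBitDepth {

    // Z-buffer precisions that GPUs commonly expose. An assumption
    // for illustration, not an exhaustive specification.
    static final int[] COMMON_DEPTH_BITS = {16, 24};

    static boolean isCommonDepthBufferSize(int bits) {
        for (int d : COMMON_DEPTH_BITS) {
            if (d == bits) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        // A typical desktop mode: 1920x1080, 32-bit color, 60 Hz.
        DisplayMode mode = new DisplayMode(1920, 1080, 32, 60);

        int colorBits = mode.getBitDepth(); // 32: COLOR bits per pixel
        System.out.println("color bits: " + colorBits);
        System.out.println("common z-buffer size? "
                + isCommonDepthBufferSize(colorBits));
        // Passing that 32 to setDepthBits() requests a 32-bit z-buffer,
        // which many cards refuse, hence the exception above.
    }
}
```

So the color bit depth goes to setBitsPerPixel(), and setDepthBits() is best left at its default of 24 unless you have a reason to change it.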

Thanks pspeed.

I have triple checked the code download and the book, and indeed the bad code was precisely what I stated. I downloaded my PDF of the book a little while ago, but the code was downloaded less than 24 hours ago and there were no corrections. I have therefore crafted an errata submission on Packt’s web site.

Based on your response, and reviewing the code completion/API, it seems the line should have been:

[java]settings.setBitsPerPixel(modes[0].getBitDepth());[/java]
Things work swimmingly (which is to say, as expected) with that line as replacement, but if I’m off on this, I would appreciate a kick in the head.
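As for my second question, the part I can act on now is not trusting modes[0]: as far as I can tell, getDisplayModes() makes no promise about ordering, so I pick a mode explicitly and leave the depth bits at their default. A rough sketch with plain AWT types and a made-up mode list (the helper name pickLargest is mine, not from the book):

```java
import java.awt.DisplayMode;
import java.util.Arrays;
import java.util.Comparator;

public class SafeModePicker {

    /** Pick the mode with the most pixels; ties broken by color depth.
     *  On a real device, pass device.getDisplayModes() here instead. */
    static DisplayMode pickLargest(DisplayMode[] modes) {
        return Arrays.stream(modes)
                .max(Comparator
                        .comparingInt((DisplayMode m) -> m.getWidth() * m.getHeight())
                        .thenComparingInt(DisplayMode::getBitDepth))
                .orElseThrow(() -> new IllegalArgumentException("no display modes"));
    }

    public static void main(String[] args) {
        // Simulated mode list; deliberately NOT best-first, since
        // getDisplayModes() makes no ordering promise.
        DisplayMode[] modes = {
                new DisplayMode(640, 480, 32, 60),
                new DisplayMode(1920, 1080, 32, 60),
                new DisplayMode(1280, 720, 16, 60),
        };
        DisplayMode best = pickLargest(modes);
        System.out.println(best.getWidth() + "x" + best.getHeight()); // 1920x1080
    }
}
```

The chosen mode's width/height then go to settings.setResolution(), its getBitDepth() goes to settings.setBitsPerPixel(), and setDepthBits() stays untouched. If that approach is flawed, feel free to apply the aforementioned kick.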


Looks like you got it right. Good job and thanks for the errata submission.