Is there any source for this?

In the class LWJGLPropertiesDialog there is this method:



private static String[] getDepths(String resolution, DisplayMode[] modes) {
    ArrayList<String> depths = new ArrayList<String>(4);
    for (int i = 0; i < modes.length; i++) {
        // Filter out all bit depths lower than 16 - Java incorrectly
        // reports
        // them as valid depths though the monitor does not support them
        if (modes[i].getBitsPerPixel() < 16)
            continue;

        String res = modes[i].getWidth() + " x " + modes[i].getHeight();
        String depth = String.valueOf(modes[i].getBitsPerPixel()) + " bpp";
        if (res.equals(resolution) && !depths.contains(depth))
            depths.add(depth);
    }

    String[] res = new String[depths.size()];
    depths.toArray(res);
    return res;
}



Is there any source for the comment:


 // Filter out all bit depths lower than 16 - Java incorrectly
            // reports
            // them as valid depths though the monitor does not support them



I'm interested in further information about this topic, but I couldn't find anything about it using Google.

thx

I am unclear as to what your question is…



(maybe you are looking for the DisplayMode source?)

I'm interested in why Java recognizes display modes with bit depths lower than 16 as valid even though they aren't. And I want to know if there is a link or something similar where I can find more about this topic.

I don't think it's Java that is meant in the comment; it could be LWJGL. Search their forums.

Bit depths below 16 are valid, just not OpenGL (or even newer monitor) valid.



    * 1-bit color (2^1 = 2 colors): monochrome, often black and white.

    * 2-bit color (2^2 = 4 colors): CGA, gray-scale early NeXTstation, color Macintoshes.

    * 3-bit color (2^3 = 8 colors): many early home computers with TV-out displays.

    * 4-bit color (2^4 = 16 colors): as used by EGA and by the least-common-denominator VGA standard at higher resolution, color Macintoshes.

    * 5-bit color (2^5 = 32 colors): original Amiga chipset.

    * 6-bit color (2^6 = 64 colors): original Amiga chipset.

    * 8-bit color (2^8 = 256 colors): most early color Unix workstations, VGA at low resolution, Super VGA, AGA, color Macintoshes.

    * 12-bit color (2^12 = 4096 colors): some Silicon Graphics systems, color NeXTstation systems, and Amiga systems in HAM mode.



http://en.wikipedia.org/wiki/Color_depth
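To see which of the depths in the list above the filter in getDepths would actually keep, here is a minimal sketch in plain Java with no LWJGL dependency. The class and method names are mine, not from the actual source; only the `< 16` rule is taken from the posted code:

```java
import java.util.ArrayList;
import java.util.List;

public class DepthFilterDemo {
    // Minimum bit depth accepted by the getDepths filter quoted above
    static final int MIN_BPP = 16;

    // Keep only the unique bit depths at or above MIN_BPP,
    // mirroring the skip-and-deduplicate logic of getDepths
    static List<Integer> filterDepths(int[] reportedDepths) {
        List<Integer> accepted = new ArrayList<>();
        for (int bpp : reportedDepths) {
            if (bpp < MIN_BPP)
                continue;
            if (!accepted.contains(bpp))
                accepted.add(bpp);
        }
        return accepted;
    }

    public static void main(String[] args) {
        // Depths from the list above plus the common modern ones
        int[] reported = {1, 2, 4, 8, 12, 16, 24, 32};
        System.out.println(filterDepths(reported)); // prints [16, 24, 32]
    }
}
```

So everything from that historical list gets dropped, and only 16/24/32 bpp survive.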

Thanks a lot, now I understand!



Then the comment in the code is a little bit misleading.

Hmm, nevertheless I can't find a quote anywhere stating that below 16 bit isn't available for OpenGL…

Perhaps if you explained what your motivations are you would get an answer that was more suited to your question…

I just want to know why you have to remove all display modes with a depth lower than 16 bit.

To clarify my previous post:



I think there has to be a reason why the developer of the code fragment I posted in post 1 made his comment and removed all display modes with a depth below 16 bit.



As you already mentioned, this could be because OpenGL doesn't accept depths below 16 bits. I think that could be true, but I would be really happy if there's a link, paper, book, or whatever that proves that statement.

I don't know if I can help you there; the best I can offer is to download the source and modify that statement to allow the depths you want to test…



(that comment may be years old…)
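If you go that route, the change is small. Here is a sketch of what the loosened filter could look like, using a simplified stand-in for LWJGL's DisplayMode (the `Mode` record and the `MIN_DEPTH` constant are names I made up for illustration; the loop body otherwise follows the method from post 1):

```java
import java.util.ArrayList;

public class DepthFilterPatch {
    // Hypothetical constant replacing the hard-coded 16 in getDepths;
    // set to 0 to let every reported depth through for testing
    static final int MIN_DEPTH = 0;

    // Simplified stand-in for LWJGL's DisplayMode, holding only the
    // fields the original getDepths method reads
    record Mode(int width, int height, int bitsPerPixel) {}

    static String[] getDepths(String resolution, Mode[] modes) {
        ArrayList<String> depths = new ArrayList<>();
        for (Mode m : modes) {
            if (m.bitsPerPixel() < MIN_DEPTH)   // was: < 16
                continue;
            String res = m.width() + " x " + m.height();
            String depth = m.bitsPerPixel() + " bpp";
            if (res.equals(resolution) && !depths.contains(depth))
                depths.add(depth);
        }
        return depths.toArray(new String[0]);
    }

    public static void main(String[] args) {
        Mode[] modes = {
            new Mode(640, 480, 8),
            new Mode(640, 480, 16),
            new Mode(640, 480, 32)
        };
        // With MIN_DEPTH = 0, the 8 bpp mode is no longer filtered out
        for (String d : getDepths("640 x 480", modes))
            System.out.println(d);
    }
}
```

That would at least let you try the lower depths yourself and see whether the driver actually accepts them.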