2 Replies Latest reply: Apr 8, 2011 2:06 PM by 851329

    How to get and set the GraphicsDevice bit depth?

    851329
      Hi all.

      I am trying to get and then set the bit-depth in Full-Screen Exclusive Mode. However, when I call getDisplayMode(), it always returns BIT_DEPTH_MULTI (that is, -1) for the current bit-depth. When I create a new DisplayMode object and pass it to setDisplayMode(), it always fails for any bit-depth value other than BIT_DEPTH_MULTI.

      How can I find out the actual bit depth? How can I set it?

      I am running Linux (2.6.31.5-127.fc12.i686.PAE), using javac 1.6.0_23-ea, with java version "1.6.0_0", OpenJDK Runtime Environment (IcedTea6 1.6) (fedora-31.b16.fc12-i386), OpenJDK Client VM (build 14.0-b16, mixed mode).

      Here's my code:
      import javax.swing.*;
      import java.awt.event.*;
      import java.awt.*;
      
      public class T extends JFrame implements ActionListener
      {
              JButton exit;   // to exit
              JButton fsem;   // to enter full screen exclusive mode
              int bd;         // remembers the requested bit-depth
      
              public T(String a)
              {
                      super("T");
                      bd = Integer.decode(a); // save the requested bit-depth
                      setSize(320, 240);      // make the JFrame a modest size
                      setLayout(new FlowLayout());
      
                      // create and add the two buttons
      
                      exit = new JButton("exit");
                      add(exit);
                      exit.addActionListener(this);
                      fsem = new JButton("fsem");
                      add(fsem);
                      fsem.addActionListener(this);
      
                      // prepare a plain JFrame
      
                      setUndecorated(true);
                      setResizable(false);
                      setBackground(Color.ORANGE);
                      setVisible(true);
              }
      
              public void actionPerformed(ActionEvent evt)
              {
                      // on exit request, just exit
                      if (evt.getSource() == exit)
                              System.exit(0);
      
                      // on full-screen request, save mode, switch to FSEM for four seconds,
                      // then switch back.
      
                      if (evt.getSource() == fsem)
                      {
      
                              // get the display device and its current display mode
      
                              GraphicsEnvironment ge = GraphicsEnvironment.getLocalGraphicsEnvironment();
                              GraphicsDevice dev = ge.getDefaultScreenDevice();
                              DisplayMode origDM = dev.getDisplayMode();
      
                              // if full-screen is available, switch to it now
      
                              if (dev.isFullScreenSupported())
                              {
                                      dev.setFullScreenWindow(this);
      
                                        // try setting the requested bit-depth (this always fails
                                        // when the bit-depth is NOT -1, i.e. BIT_DEPTH_MULTI)
      
                                      try
                                      {
                                              dev.setDisplayMode(new DisplayMode(origDM.getWidth(),
                                                                                 origDM.getHeight(),
                                                                                 bd,
                                                                                 origDM.getRefreshRate()));
                                      }
                                      catch (Exception e)
                                      {
                                              e.printStackTrace();
                                      }
      
                                      // wait four seconds, then switch out of full-screen
      
                                      try
                                      {
                                              Thread.sleep(4000);
                                      }
                                      catch(Exception e)
                                      {
                                              e.printStackTrace();
                                      }
                                      dev.setDisplayMode(origDM);
                                      dev.setFullScreenWindow(null);
                              }
                      }
              }
      
              // create the one object we've got and pass the string with the requested bit-depth
      
              public static void main(String[] args)
              {
                      T t = new T(args[0]);
              }
      }
      Running with java T -1 produces no errors. Running with anything else ( java T 16, java T 24, and so on), produces this:

      java.lang.IllegalArgumentException: Invalid display mode
           at sun.awt.X11GraphicsDevice.setDisplayMode(X11GraphicsDevice.java:413)
           at T.actionPerformed(T.java:65)

      (At run time, just click the "fsem" button to switch to FSEM, wait four seconds, and it will switch you back. You'll see the error message then in your terminal window.)

      I've searched the Web and this forum (but probably neither enough) for info on this. All I've found so far is a bug report that regards BIT_DEPTH_MULTI as a bad idea, since it creates pretty much the problem I've got (that is, it doesn't tell you what you wanted to know, which was the actual bit-depth).

      I'd be very grateful for some guidance on this. Is there another way to get/set the bit-depth? Am I misunderstanding the purpose of BIT_DEPTH_MULTI?
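      (For what it's worth, the actual depth can apparently also be read from the default GraphicsConfiguration's ColorModel, even when the DisplayMode reports BIT_DEPTH_MULTI. A minimal sketch; the class name DepthProbe is just for illustration:)

      ```java
      import java.awt.GraphicsDevice;
      import java.awt.GraphicsEnvironment;
      import java.awt.image.ColorModel;

      public class DepthProbe
      {
              // Reports the pixel size of a ColorModel. Unlike
              // DisplayMode.getBitDepth(), this never returns -1.
              static int depthOf(ColorModel cm)
              {
                      return cm.getPixelSize();
              }

              public static void main(String[] args)
              {
                      GraphicsDevice dev = GraphicsEnvironment
                              .getLocalGraphicsEnvironment()
                              .getDefaultScreenDevice();

                      // The default configuration's ColorModel describes the
                      // pixels actually in use on this screen.
                      ColorModel cm = dev.getDefaultConfiguration().getColorModel();
                      System.out.println("Actual depth: " + depthOf(cm) + " bpp");
              }
      }
      ```

      (This only reads the depth; it does not help with setting it.)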

      Thanks,
      Stevens
        • 1. Re: How to get and set the GraphicsDevice bit depth?
          799019
          Stevens Miller wrote:
          Hi all.

          I am trying to get and then set the bit-depth in Full-Screen Exclusive Mode. However, when I call getDisplayMode(), it always returns BIT_DEPTH_MULTI (that is, -1) for the current bit-depth. When I create a new DisplayMode object and pass it to setDisplayMode(), it always fails for any bit-depth value other than BIT_DEPTH_MULTI.

          How can I find out the actual bit depth? How can I set it?
          First, you need to use the getDisplayModes() method to determine what modes are actually supported.
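          Something along these lines (a sketch; the class name ListModes and the output format are just for illustration):

          ```java
          import java.awt.DisplayMode;
          import java.awt.GraphicsDevice;
          import java.awt.GraphicsEnvironment;

          public class ListModes
          {
                  // Formats one mode as "WxH(refresh) depth bpp".
                  static String describe(DisplayMode m)
                  {
                          return m.getWidth() + "x" + m.getHeight()
                                  + "(" + m.getRefreshRate() + ") "
                                  + m.getBitDepth() + " bpp";
                  }

                  public static void main(String[] args)
                  {
                          GraphicsDevice dev = GraphicsEnvironment
                                  .getLocalGraphicsEnvironment()
                                  .getDefaultScreenDevice();

                          // Print every mode the device claims to support.
                          for (DisplayMode m : dev.getDisplayModes())
                                  System.out.println(describe(m));
                  }
          }
          ```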


          On my Windows Vista machine, printing out the available modes yields:

          1920x1200 32 bpp
          1920x1200 32 bpp
          640x480 32 bpp
          640x480 32 bpp
          640x480 32 bpp
          720x480 32 bpp
          720x480 32 bpp
          720x480 32 bpp
          720x576 32 bpp
          720x576 32 bpp
          720x576 32 bpp
          720x576 32 bpp
          800x600 32 bpp
          800x600 32 bpp
          800x600 32 bpp
          848x480 32 bpp
          848x480 32 bpp
          848x480 32 bpp
          1024x768 32 bpp
          1024x768 32 bpp
          1024x768 32 bpp
          1152x864 32 bpp
          1152x864 32 bpp
          1280x720 32 bpp
          1280x720 32 bpp
          1280x720 32 bpp
          1280x768 32 bpp
          1280x768 32 bpp
          1280x800 32 bpp
          1280x800 32 bpp
          1280x1024 32 bpp
          1280x1024 32 bpp
          1280x1024 32 bpp
          1360x768 32 bpp
          1360x768 32 bpp
          1600x1200 32 bpp
          1600x1200 32 bpp
          1680x1050 32 bpp
          1680x1050 32 bpp
          1920x1080 32 bpp
          1920x1080 32 bpp
          1280x960 32 bpp
          1280x960 32 bpp
          1440x900 32 bpp
          1440x900 32 bpp
          1920x1200 16 bpp
          1920x1200 16 bpp
          640x480 16 bpp
          640x480 16 bpp
          640x480 16 bpp
          720x480 16 bpp
          720x480 16 bpp
          720x480 16 bpp
          720x576 16 bpp
          720x576 16 bpp
          720x576 16 bpp
          720x576 16 bpp
          800x600 16 bpp
          800x600 16 bpp
          800x600 16 bpp
          848x480 16 bpp
          848x480 16 bpp
          848x480 16 bpp
          1024x768 16 bpp
          1024x768 16 bpp
          1024x768 16 bpp
          1152x864 16 bpp
          1152x864 16 bpp
          1280x720 16 bpp
          1280x720 16 bpp
          1280x720 16 bpp
          1280x768 16 bpp
          1280x768 16 bpp
          1280x800 16 bpp
          1280x800 16 bpp
          1280x1024 16 bpp
          1280x1024 16 bpp
          1280x1024 16 bpp
          1360x768 16 bpp
          1360x768 16 bpp
          1600x1200 16 bpp
          1600x1200 16 bpp
          1680x1050 16 bpp
          1680x1050 16 bpp
          1920x1080 16 bpp
          1920x1080 16 bpp
          1280x960 16 bpp
          1280x960 16 bpp
          1440x900 16 bpp
          1440x900 16 bpp


          For my Mandriva Linux (nvidia-kernel 2.6.33.7) machine I get:

          1920x1200(60) -1 bpp
          1920x1080(60) -1 bpp
          1600x1200(60) -1 bpp
          1680x1050(60) -1 bpp
          1400x1050(60) -1 bpp
          1360x768(60) -1 bpp
          1280x1024(60) -1 bpp
          1280x960(60) -1 bpp
          1280x800(60) -1 bpp
          1280x720(60) -1 bpp
          1024x768(60) -1 bpp
          800x600(60) -1 bpp
          640x480(60) -1 bpp


          Jim S.
          • 2. Re: How to get and set the GraphicsDevice bit depth?
            851329
            Jim,

            Thanks for the reply.

            Yes, my code does get the current display mode (not all of them, but I only want to find out the current bpp and then try to set that same value). On Windows, my program does as yours does, reporting 32 bpp. On Linux, I also get your result, -1 (aka "BIT_DEPTH_MULTI"). The API docs say this means that multiple bit-depths are supported, but I think that's misleading. If you try to set any common bit-depth (8, 15, 16, 32), you get a run-time error. Setting to -1 leaves things unchanged, with no error.

            More Googling revealed a bug report about this that would seem to attribute it to a limitation of the X server.

            I can detect and change the bit-depth under Windows, just as the API docs say I can. The API docs say nothing either way about whether this can be done under Linux. It would appear, however, that a bit-depth of -1 really means you can't change it, and that this is not a problem in the API but a characteristic of the X Window System.
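            Given that, the safest pattern I can think of is to only ever pass setDisplayMode() a mode that getDisplayModes() actually returned, treating -1 as a wildcard depth. A sketch (the class name ModeMatch and the wildcard treatment are my own choices, not anything the API promises):

            ```java
            import java.awt.DisplayMode;

            public class ModeMatch
            {
                    // Returns the first supported mode matching the requested
                    // size and depth, or null if none does. A reported depth of
                    // BIT_DEPTH_MULTI (-1) is treated as matching any requested
                    // depth, since on X that is all we ever get.
                    static DisplayMode findMatch(DisplayMode[] modes,
                                                 int w, int h, int depth)
                    {
                            for (DisplayMode m : modes)
                            {
                                    boolean depthOk =
                                            m.getBitDepth() == depth
                                            || m.getBitDepth() == DisplayMode.BIT_DEPTH_MULTI;
                                    if (m.getWidth() == w && m.getHeight() == h && depthOk)
                                            return m;
                            }
                            return null;
                    }
            }
            ```

            Then, before switching, check dev.isDisplayChangeSupported() and call dev.setDisplayMode() only with a non-null result of findMatch(dev.getDisplayModes(), ...), rather than with a DisplayMode constructed from scratch.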

            If anyone knows more, I'd be grateful to learn.

            Stevens