3 Replies Latest reply on Apr 21, 2010 3:48 AM by 740862

    Too many open files exception

      Hi All,

      I want to understand more about the "Too many open files" exception.

      - What are the main reasons for the "Too many open files" exception?

      - How do I get rid of these errors?

      - What precautions should be taken to avoid such errors?

      Following is the stack trace :

      <Error> <HTTP> <BEA-101019> <[ServletContext@5348313] Servlet failed with IOException
      java.io.FileNotFoundException:(Too many open files)
           at java.io.FileInputStream.open(Native Method)
           at java.io.FileInputStream.<init>(FileInputStream.java:106)
           at weblogic.utils.classloaders.FileSource.getInputStream(FileSource.java:31)
           at weblogic.servlet.internal.WarSource.getInputStream(WarSource.java:65)
           at weblogic.servlet.FileServlet.sendFile(FileServlet.java:400)
           Truncated. see log file for complete stacktrace

      Thanks in advance.

        • 1. Re: Too many open files exception

          Take a look at:

          sysctl -a |grep fs.file-max

          This setting is the system-wide limit on the number of open files, including sockets.

          Also use 'ulimit -n' to see the limits for the current process.

          Use either:

          edit fs.file-max in /etc/sysctl.conf and run sysctl -p (requires root)

          or increase the session setting with: ulimit -n <new setting>

          To make this change permanent, change the profile for the oracle user as described in the installation documentation.

          You might need to change /etc/security/limits.conf as well if you want to go beyond the setting there.


          Bernhard Jongejan

          • 2. Re: Too many open files exception

            Each directory, file, and socket a process uses has a handle to it called a file descriptor. When the process reaches its per-process limit, you get the "too many open files" exception.
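            Since this is a Java process, you can also watch the descriptor count from inside the JVM. A minimal sketch, assuming a Sun/Oracle HotSpot JVM on a Unix-like system (the UnixOperatingSystemMXBean subinterface is a com.sun.management extension and is not available on every JVM, hence the guarded cast):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;

public class FdWatcher {
    /** Returns {openDescriptors, maxDescriptors}, or null when the
     *  Sun/Oracle-specific MXBean is not available on this JVM. */
    static long[] fdUsage() {
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
        if (os instanceof com.sun.management.UnixOperatingSystemMXBean) {
            com.sun.management.UnixOperatingSystemMXBean unixOs =
                    (com.sun.management.UnixOperatingSystemMXBean) os;
            return new long[] { unixOs.getOpenFileDescriptorCount(),
                                unixOs.getMaxFileDescriptorCount() };
        }
        return null; // not a Unix HotSpot JVM
    }

    public static void main(String[] args) {
        long[] usage = fdUsage();
        if (usage != null) {
            System.out.println("open fds: " + usage[0] + " / max: " + usage[1]);
        }
    }
}
```

            Logging this periodically (or exposing it via JMX) makes it easy to see whether the open-descriptor count climbs steadily toward the max, which is the usual signature of a leak rather than a genuinely low limit.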

            As Bernhard said, you can increase the descriptor limit using ulimit.

            Sometimes increasing the descriptor limit does not really help: if the limit is already reasonable (say 1024) and your process is not particularly I/O intensive, raising it only buys time.
            I worked for a great boss once who told me that when Unix was fairly new, the per-process limit was something like 8 or 16 file descriptors.
            Today you may not be able to run a process with just 8 or 16, but usually people run into this issue because of program errors.
            Sometimes people are able to mask the program error by increasing the limit.

            But that will not always work. I would suggest you take a look at all the open files when the process throws this exception.
            You can use the lsof command for this. If you don't have lsof, you can look at the /proc/<PID>/fd directory.
            If you find that a particular file or socket is listed many times, check whether you really need it; if not, try to close it.
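            The most common program error behind this is a stream that is opened but never closed on every path. A minimal sketch of the leaky pattern and the standard try/finally fix (on Java 7+ the try-with-resources statement does the same thing more concisely):

```java
import java.io.FileInputStream;
import java.io.IOException;

public class SafeRead {
    // Leak: the descriptor stays open until the GC happens to finalize
    // the stream, which may be long after the process hits its limit.
    static int leakyFirstByte(String path) throws IOException {
        FileInputStream in = new FileInputStream(path);
        return in.read(); // 'in' is never closed
    }

    // Fix: release the descriptor in a finally block so it is closed
    // even when read() throws.
    static int safeFirstByte(String path) throws IOException {
        FileInputStream in = new FileInputStream(path);
        try {
            return in.read();
        } finally {
            in.close();
        }
    }
}
```

            A single leak like this is harmless in isolation; in a servlet that runs on every request, it exhausts the limit quickly, which matches the FileServlet stack trace above.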
            • 3. Re: Too many open files exception

              Thanks Bernhard and Maverick for the wonderful explanation.

              I will surely try the things and let you know.