1 Reply Latest reply: Feb 19, 2013 9:50 AM by Marco Milo-Oracle

    Best Way to Import users into LDAP


      As part of setting up a performance testing environment, I need to see how far my application can scale, so I have to populate a Sun DSEE 7 Directory Server instance with around 100 million user records. I have written a script that generates LDIF files containing those 100M records.
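
      For reference, each entry in the generated LDIF looks roughly like this (the attribute set and the dc=example,dc=com base DN are just placeholders for illustration, not my real data):

        dn: uid=user0000001,ou=People,dc=example,dc=com
        objectClass: top
        objectClass: person
        objectClass: organizationalPerson
        objectClass: inetOrgPerson
        uid: user0000001
        cn: Test User 0000001
        sn: User0000001
        userPassword: secret123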

      My Machine setup is

      Operating System: Windows 2008 R2 on AMD 64-bit
      Directory Server: Sun Directory Server 7
      RAM: 8 GB
      Hard disk: 500 GB

      As the ldapmodify command is very slow, I opted for an offline import with dsadm import. I generated 20 LDIF files, each containing 5 million records, and passed them to the dsadm import command, but after it started importing the second file (on the way to the 10 millionth record) the server failed with the virtual memory error reported in "Directory 5.2 P4 crashes with virtual memory error".
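
      The commands were along these lines (paths and suffix are placeholders, and I'm quoting the syntax from memory, so check it against the dsadm man page):

        dsadm import C:\dsInst C:\ldif\users-01.ldif dc=example,dc=com
        dsadm import C:\dsInst C:\ldif\users-02.ldif dc=example,dc=com

      and so on for the remaining files.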

      I then imported 6 million records with the dsadm import command in a first pass, and to import the subsequent records I tried the --incremental approach, which imported 2 million more records, but while creating the indexes the task threw the following error:

      libdb: Lock table is out of available locks

      I am able to start the server, but I am not able to perform any searches because some attributes haven't been indexed. I even tried re-indexing, but with no success :(
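
      For the re-indexing I ran something like the following (again from memory, so the -t option for selecting the attribute may need double-checking against the dsadm man page):

        dsadm reindex -t uid C:\dsInst dc=example,dc=com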

      My questions are:

      1) Can the directory server hold 100 million records on this hardware?
      2) If yes, what is the best way to import those records?
      3) What tuning techniques should I take care of?

      And yes, I really do have to populate 100 million records. You may well think I'm crazy ("100M with Sun DSEE? No way, who has 100 million users?") and I'd think the same, but I need it, even though it is not for production purposes.


        • 1. Re: Best Way to Import users into LDAP
          Marco Milo-Oracle
          I've seen deployments with 100M records and even more but, to be honest, never on MS Windows and never with such a small amount of RAM... (my 5+ year old laptop has 8GB of RAM and a 500GB hard disk).

          However, the reported error may happen when you don't have enough DB locks available, as described in:


          This is a parameter that needs to be tuned depending on the size/scalability of your deployment. Have you tried increasing it, or are you still running with the default configuration?
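
          I don't have the DSEE 7 docs at hand, but on earlier Sun (and Red Hat) directory servers the underlying setting was the nsslapd-db-locks attribute on the ldbm database configuration entry, so raising it would look something like this LDIF change (applied with ldapmodify as Directory Manager and followed by a server restart; please double-check the exact attribute/property name for your release):

            dn: cn=config,cn=ldbm database,cn=plugins,cn=config
            changetype: modify
            replace: nsslapd-db-locks
            nsslapd-db-locks: 500000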