As part of setting up a performance-testing environment, to see how far my application can scale, I need to populate the Sun DSEE 7 directory server with around 100 million user records. For this I have written a script that generates LDIF files containing the 100M records.
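My actual script is not shown here, but a minimal sketch of the kind of generator I am using looks like this (the suffix `dc=example,dc=com` and the attribute set are placeholders, not my real schema):

```python
# Minimal LDIF generator sketch (placeholder suffix/attributes, not my real data).
def generate_ldif(path, start, count, suffix="dc=example,dc=com"):
    """Write `count` inetOrgPerson entries with uids user<start>..user<start+count-1>."""
    with open(path, "w") as f:
        for i in range(start, start + count):
            uid = f"user{i}"
            f.write(f"dn: uid={uid},ou=People,{suffix}\n")
            f.write("objectClass: top\n")
            f.write("objectClass: person\n")
            f.write("objectClass: organizationalPerson\n")
            f.write("objectClass: inetOrgPerson\n")
            f.write(f"uid: {uid}\n")
            f.write(f"cn: Test User {i}\n")
            f.write(f"sn: User{i}\n")
            f.write(f"mail: {uid}@example.com\n")
            f.write("\n")  # blank line separates LDIF entries

if __name__ == "__main__":
    # 20 files of 5M entries each cover 100M records; tiny demo file here.
    generate_ldif("demo.ldif", 1, 3)
```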
My machine setup is:
Operating System: Windows 2008 R2 on AMD 64-bit
Directory Server: Sun Directory Server 7
RAM: 8 GB
Hard disk: 500 GB
Since the ldapmodify command is very slow, I opted for an offline import with dsadm. I generated 20 LDIF files of 5 million records each and fed them to the dsadm import command, but after it started importing the second file (around the 10-millionth record) the server gave the error reported in "Directory 5.2 P4 crashes with virtual memory error".
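For reference, this is roughly how I am running the import (the instance path and suffix are placeholders, and the exact flags should be confirmed against `dsadm import --help` on DSEE 7):

```shell
# Sketch of the offline bulk import loop (placeholder paths/suffix).
INSTANCE=/local/ds7/instance
SUFFIX="dc=example,dc=com"

dsadm stop "$INSTANCE"                    # import must run with the server offline
for f in users-*.ldif; do
    # -i / --incremental appends to existing data instead of replacing it
    dsadm import -i "$INSTANCE" "$f" "$SUFFIX"
done
dsadm start "$INSTANCE"
```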
I then imported 6 million records with dsadm import in a first pass, and tried to import the subsequent records using the --incremental approach, which imported 2 million more records, but while creating the indexes the task threw the following error:
libdb: Lock table is out of available locks
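From what I have read, this error means the Berkeley DB lock table backing the database was exhausted. I was planning to try raising the lock count along these lines, though the property name is my assumption and should be verified against the dsconf property list on DSEE 7:

```shell
# Raise the Berkeley DB lock table size (property name assumed -- confirm with
# `dsconf help-properties | grep -i lock`; port and value are placeholders).
dsconf set-server-prop -p 389 db-lock-count:500000
dsadm restart /local/ds7/instance         # change takes effect after a restart
```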
The server starts, but I cannot perform any searches because some attributes have not been indexed. I tried re-indexing as well, but without success :(
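This is what I tried for the re-index (attribute names are just examples; the exact syntax should be checked against `dsadm reindex --help`):

```shell
# Offline re-index attempt (placeholder instance path, suffix, and attributes).
dsadm stop /local/ds7/instance
dsadm reindex -t uid -t cn /local/ds7/instance "dc=example,dc=com"
dsadm start /local/ds7/instance
```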
My questions are:
1) Can this directory server hold 100 million records?
2) If yes, what is the best way to import those records?
3) What tuning techniques should I apply?
And yes, I really do have to populate 100 million records. I know what you may be thinking: "Are you crazy? 100M with Sun DSEE? No way. Who has 100 million users?" I think the same myself, even though this is not for production.