Working with large data
I have to create a database with a specific distribution of key and data
sizes. Keys are a few bytes each, while data items vary over a wide range,
from a few bytes up to several megabytes, with an average size near 64K. The
overall size of the database filled with key/data pairs is several gigabytes.
You can think of our key/data pairs as forming a context index (inverted
file) for a large set of Russian/English texts.
Could you recommend the best way to configure such data storage? By "best" I
mean the best random read speed for key/data pairs in the database, together
with good enough write speed (for updating the "context index").
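For reference, here is a minimal sketch of the kind of setup I am asking about, assuming the Berkeley DB C API with a Btree database. The environment path, file name, cache size, and page size below are placeholders I picked for illustration, not settings I know to be right; they are just the knobs that seem relevant for small keys, ~64K average data items, and a multi-gigabyte total.

/* Sketch of a Berkeley DB environment/database configuration.
 * All sizes and paths are assumptions for illustration only. */
#include <stdio.h>
#include <db.h>

int main(void)
{
    DB_ENV *env;
    DB *db;
    int ret;

    /* Environment with a shared memory pool (cache). */
    if ((ret = db_env_create(&env, 0)) != 0) {
        fprintf(stderr, "db_env_create: %s\n", db_strerror(ret));
        return 1;
    }
    /* Hypothetical 512 MB cache in a single region. */
    env->set_cachesize(env, 0, 512 * 1024 * 1024, 1);
    if ((ret = env->open(env, "/path/to/env",
                         DB_CREATE | DB_INIT_MPOOL, 0)) != 0) {
        fprintf(stderr, "env->open: %s\n", db_strerror(ret));
        return 1;
    }

    if ((ret = db_create(&db, env, 0)) != 0) {
        fprintf(stderr, "db_create: %s\n", db_strerror(ret));
        return 1;
    }
    /* 64K is the largest page size Berkeley DB allows; with an average
     * data item near 64K, many items would still go to overflow pages. */
    db->set_pagesize(db, 64 * 1024);
    if ((ret = db->open(db, NULL, "index.db", NULL,
                        DB_BTREE, DB_CREATE, 0664)) != 0) {
        fprintf(stderr, "db->open: %s\n", db_strerror(ret));
        return 1;
    }

    /* ... put/get of key/data pairs would go here ... */

    db->close(db, 0);
    env->close(env, 0);
    return 0;
}

In particular I am unsure whether the page size and cache size above are sensible choices for this distribution, or whether a different access method would serve the random-read requirement better.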