Concurrent Data Storage Example

535773
Member Posts: 9
Is there a handy example of how to use Concurrent Data Storage?
I am writing a sample program in which a process spawns a thread and writes to a HashMap every 10 secs, and on the other side multiple processes with multiple threads constantly read the HashMap every 10 secs.
I was successful in writing sample programs for writing and reading those HashMaps. But on reading the HashMap, I don't get updated data. I have to restart the reader process to read the most current data.
On the writer, I am setting these flags:
envConfig.setTransactional(false);
envConfig.setLocking(false);
envConfig.setAllowCreate(true);
dbConfig.setTransactional(false);
dbConfig.setAllowCreate(true);
Also, I am calling env.sync() after each map.put(index, value) on the HashMap.
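Roughly, the writer looks like this (a trimmed-down sketch; the "./data" path, the "sampleDb" database name, and the Integer/String bindings are just placeholders for what we actually use):

import java.io.File;

import com.sleepycat.bind.tuple.IntegerBinding;
import com.sleepycat.bind.tuple.StringBinding;
import com.sleepycat.collections.StoredMap;
import com.sleepycat.je.Database;
import com.sleepycat.je.DatabaseConfig;
import com.sleepycat.je.Environment;
import com.sleepycat.je.EnvironmentConfig;

public class WriterSketch {
    public static void main(String[] args) throws Exception {
        // Non-transactional environment, single writer process.
        EnvironmentConfig envConfig = new EnvironmentConfig();
        envConfig.setTransactional(false);
        envConfig.setLocking(false);
        envConfig.setAllowCreate(true);
        Environment env = new Environment(new File("./data"), envConfig);

        DatabaseConfig dbConfig = new DatabaseConfig();
        dbConfig.setTransactional(false);
        dbConfig.setAllowCreate(true);
        Database db = env.openDatabase(null, "sampleDb", dbConfig);

        // StoredMap backed by the database; last argument true = writable.
        StoredMap map = new StoredMap(
            db, new IntegerBinding(), new StringBinding(), true);

        int index = 0;
        while (true) {
            map.put(Integer.valueOf(index), "value-" + index);
            env.sync();               // flush so readers can see the data on disk
            index++;
            Thread.sleep(10 * 1000);  // write every 10 seconds
        }
    }
}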
On the reader, I am setting these flags:
envConfig.setTransactional(false);
envConfig.setLocking(false);
envConfig.setReadOnly(true);
dbConfig.setTransactional(false);
dbConfig.setReadOnly(true);
And I am calling map.get(index) to retrieve the value.
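And the reader side looks roughly like this (again just a sketch, with the same placeholder path and database name):

import java.io.File;

import com.sleepycat.bind.tuple.IntegerBinding;
import com.sleepycat.bind.tuple.StringBinding;
import com.sleepycat.collections.StoredMap;
import com.sleepycat.je.Database;
import com.sleepycat.je.DatabaseConfig;
import com.sleepycat.je.Environment;
import com.sleepycat.je.EnvironmentConfig;

public class ReaderSketch {
    public static void main(String[] args) throws Exception {
        // Read-only, non-transactional environment.
        EnvironmentConfig envConfig = new EnvironmentConfig();
        envConfig.setTransactional(false);
        envConfig.setLocking(false);
        envConfig.setReadOnly(true);
        Environment env = new Environment(new File("./data"), envConfig);

        DatabaseConfig dbConfig = new DatabaseConfig();
        dbConfig.setTransactional(false);
        dbConfig.setReadOnly(true);
        Database db = env.openDatabase(null, "sampleDb", dbConfig);

        // Last argument false = read-only view of the database.
        StoredMap map = new StoredMap(
            db, new IntegerBinding(), new StringBinding(), false);

        Object value = map.get(Integer.valueOf(0));
        System.out.println("key 0 -> " + value);

        db.close();
        env.close();
    }
}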
Do I have to set any other flags? Or am I totally off course?
Thanks
Comments
-
Hello,

>> I am writing a sample program in which a process spawns a thread and writes to a HashMap every 10 secs, and on the other side multiple processes with multiple threads constantly read the HashMap every 10 secs.

When you say HashMap I assume you mean StoredMap (in com.sleepycat.collections), is that correct? If you are using a HashMap directly, you'll need to store the HashMap in the database.

>> I was successful in writing sample programs for writing and reading those HashMaps. But on reading the HashMap, I don't get updated data. I have to restart the reader process to read the most current data.

Berkeley DB JE may only be used in limited ways when multiple processes access the database directly. The reader processes see a snapshot of the data as of the time that the Environment is opened. In order for the reader process to see additional data, it must close the Environment and open it again. It is not necessary to restart the process, but it is necessary to close and re-open the Environment.
This is described in the documentation here:
http://www.sleepycat.com/jedocs/GettingStartedGuide/introduction.html#features
"Moreover, JE allows multiple processes to access the same databases. However, in this configuration JE requires that there be no more than one process allowed to write to the database. Read-only processes are guaranteed a consistent, although potentially out of date, view of the stored data as of the time that the environment is opened."
If it is practical for you to close and re-open the Environment in order to see changed data, then using a reader process that opens the Environment will work for you. If this is not practical, the solution is for you to send the read requests to the writer process, perhaps using a web service interface that you implement in your application. In other words, your writer process would perform all reads and writes to the Environment, and other processes would interact with the environment only by sending messages to the writer process.
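For example, a polling reader could look something like this (just a sketch; the environment path "./data" and the database name "sampleDb" are placeholders for whatever your application uses):

import java.io.File;

import com.sleepycat.bind.tuple.IntegerBinding;
import com.sleepycat.bind.tuple.StringBinding;
import com.sleepycat.collections.StoredMap;
import com.sleepycat.je.Database;
import com.sleepycat.je.DatabaseConfig;
import com.sleepycat.je.Environment;
import com.sleepycat.je.EnvironmentConfig;

public class PollingReaderSketch {
    public static void main(String[] args) throws Exception {
        EnvironmentConfig envConfig = new EnvironmentConfig();
        envConfig.setTransactional(false);
        envConfig.setLocking(false);
        envConfig.setReadOnly(true);

        DatabaseConfig dbConfig = new DatabaseConfig();
        dbConfig.setTransactional(false);
        dbConfig.setReadOnly(true);

        while (true) {
            // Each open sees a snapshot of the data as of this moment.
            Environment env = new Environment(new File("./data"), envConfig);
            Database db = env.openDatabase(null, "sampleDb", dbConfig);
            StoredMap map = new StoredMap(
                db, new IntegerBinding(), new StringBinding(), false);

            System.out.println("entries visible: " + map.size());

            // Close and re-open on the next pass to pick up new writes.
            db.close();
            env.close();

            Thread.sleep(10 * 1000);
        }
    }
}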
Mark -
Hi Mark,

I am sorry that I did not make it clearer.

>> When you say HashMap I assume you mean StoredMap (in com.sleepycat.collections), is that correct? If you are using a HashMap directly, you'll need to store the HashMap in the database.

Yes, it is correct that I am using StoredMap to store and retrieve values.

>> If it is practical for you to close and re-open the Environment in order to see changed data, then using a reader process that opens the Environment will work for you. If this is not practical, the solution is for you to send the read requests to the writer process, perhaps using a web service interface that you implement in your application. In other words, your writer process would perform all reads and writes to the Environment, and other processes would interact with the environment only by sending messages to the writer process.

Yes, it is practical for our application to re-open the Environment to see the changes.

Thank you, Mark, for pointing me in the right direction.
Thanks again -
>> Yes, it is practical for our application to re-open the Environment to see the changes. Thank you, Mark, for pointing me in the right direction.

You're welcome, and I'm glad that re-opening the Environment will work for you.
Mark -
Hi Mark,
Unfortunately, my joy was short-lived.
Since there are multiple processes that read the data, the solution worked for only one reader process. But when I started my second process, it threw the following exception:
com.sleepycat.je.DatabaseException: (JE 3.0.12) Environment.setAllowCreate is false so environment creation is not permitted, but there is no pre-existing environment in ./data
Please note that the writer process is continuously writing the data without closing the database or environment. It does call env.sync() after every call to StoredMap.put(index, value).
Any suggestions ?
Thanks -
>> Hi Mark,
>> Unfortunately, my joy was short-lived. Since there are multiple processes that read the data, the solution worked for only one reader process. But when I started my second process, it threw the following exception:
>> com.sleepycat.je.DatabaseException: (JE 3.0.12) Environment.setAllowCreate is false so environment creation is not permitted, but there is no pre-existing environment in ./data

:-( This error should not occur when opening multiple read-only processes.

Are you using NFS or another network file system? JE does file locking to enforce reader and writer process restrictions, and file locking does not always work properly on NFS.

In any case I will try to reproduce this problem and post back here with the results.

>> Please note that the writer process is continuously writing the data without closing the database or environment. It does call env.sync() after every call to StoredMap.put(index, value).

That approach in the writer process should be fine.
Mark -
Mark,
Yes, we are using NFS for the reader process. Let me first consult my network admin to make sure that I am not screwing up on our end.
I will update you shortly.
Thanks for looking into it. -
>> Yes, we are using NFS for the reader process.

Please see this FAQ on NFS support in JE:
http://www.oracle.com/technology/products/berkeley-db/faq/je_faq.html#1
Mark -
Mark,
Yes, I scanned through the FAQ before implementing the sample programs.
According to the FAQ:

>> If you cannot rely on flock() across NFS on your systems, you could handle (1) by taking responsibility in your application to ensure that there is a single writer process attached. Having two writer processes in a single environment could result in database corruption. (Note that the issue is with processes, and not threads.)

We do take responsibility for maintaining a single writer in the environment.
And about the exception: one of the NFS servers was not configured properly, so it was a fault on our end.
So the sample writer and reader programs worked just fine. I was able to read and validate the data from multiple processes across NFS.
And once again, thanks for responding so quickly. -
>> We do take responsibility for maintaining a single writer in the environment.

OK.

>> And about the exception: one of the NFS servers was not configured properly, so it was a fault on our end. So the sample writer and reader programs worked just fine. I was able to read and validate the data from multiple processes across NFS.

That's great news!
Mark
This discussion has been closed.