I am currently working on a requirement in which:
1. Users should be able to update individual records in the index through a Studio portlet.
2. Users should also be able to do a bulk update: refine down to a set of records and update an attribute value with a new value.
I know this can be achieved using DIWS (the Data Ingest Web Service).
Please share your suggestions if you have implemented any similar requirements.
I want to know how the Endeca Server will behave if more than one user updates the index at the same time.
Also, please let me know what the performance of such update operations from the front end will be.
Thanks in advance.
From within Studio, you can't edit the actual data values. You can only configure the attribute metadata (create and manage views, create and manage attribute groups, configure display names and formats, configure behavior).
To change the data values, you still need to either use Integrator/DIWS, or, if the data was uploaded using the Provisioning Service, update the values in the original source and then reload the data.
>I want to know how the Endeca Server will behave if more than one user updates the index at the same time.
>Also, please let me know what the performance of such update operations from the front end will be.
1. Regarding "how the Endeca Server will behave if more than one user updates the index at the same time": what do you mean by more than one user? As stated in JaniceM's earlier reply, to change the data values you still need to either use Integrator/DIWS or go through a Provisioning Service reload.
In the case of Integrator, when data values are updated, it sends the updates to the Endeca Server, which updates its index; Studio users then access the results of the updated index. Updates to the index are applied within the context of an outer transaction to guarantee atomicity, consistency, isolation, and durability: the updating graphs in Integrator run within an outer transaction. Additionally, if the Endeca Server is deployed as part of a cluster, Integrator sends data to the single Endeca Server process that can update the index. The index is then distributed to the other Endeca Server processes, and clients (such as Studio) access the updated index. At any given point, the results that the Endeca Server returns for queries are guaranteed to reflect the updated state of the index.
For information on outer transactions, please see:
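To make the outer-transaction behavior described above concrete, here is a minimal Python sketch that models it: updates staged inside the transaction are invisible to queries until commit, and are all applied together or discarded together. The class and method names (`OuterTransaction`, `update`) are illustrative stand-ins for the semantics, not the actual Endeca Server Transaction Web Service operations.

```python
# Model of outer-transaction semantics: updates inside the transaction
# are applied as a whole on commit, or rolled back together on failure.
# NOTE: names here are illustrative; this is NOT the real Endeca API.

class OuterTransaction:
    def __init__(self, index):
        self.index = index      # the "committed" index (a dict here)
        self.pending = {}       # staged updates, invisible until commit

    def __enter__(self):
        return self

    def update(self, record_id, attribute, value):
        # Stage an update; queries against self.index do not see it yet.
        self.pending.setdefault(record_id, {})[attribute] = value

    def __exit__(self, exc_type, exc, tb):
        if exc_type is None:
            # Commit: apply all staged updates together.
            for record_id, attrs in self.pending.items():
                self.index.setdefault(record_id, {}).update(attrs)
        # On error, staged updates are simply discarded (rollback).
        self.pending = {}
        return False  # re-raise any exception

index = {"rec1": {"Price": 10}}

with OuterTransaction(index) as txn:
    txn.update("rec1", "Price", 12)
    txn.update("rec2", "Price", 7)

print(index["rec1"]["Price"])  # 12 after commit
```

A failed transaction leaves the index untouched: if an exception is raised inside the `with` block, none of the staged updates reach `index`.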
2. As for your second question, "how will the performance be for such update operations from the front end":
Here is a section from the Studio User's Guide about performance. It does not mention update operations specifically, because an update operation does not affect the performance of Studio. (Updates to the Endeca Server index can have a performance impact in some instances, but that is a different question.)
Thanks for responding.
In my use case, there will be a number of Studio users who can edit data through our custom portlet. We are going to use DIWS to reflect the changes in the index in real time. So my question is about how the Endeca Server will handle simultaneous update operations, since these Studio users can log in at the same time from different locations and perform edit operations.
Please provide your suggestions.
I am not familiar with implementations where custom portlets in Studio issue updating requests (and any request made via the Data Ingest Web Service is an updating request). Studio is designed to issue only query (non-updating) requests to the index. Any updating requests should be sent to the Endeca Server either directly, through a web service that can update the index, or, in most use cases, through Integrator. It is also possible to reload updated data through the Provisioning Service.
If you do issue updating requests to the Endeca Server, then to ensure they are applied "as a whole" or rolled back, the mechanism offered is known as "outer transactions". See the link I provided earlier in this thread.
All requests within an outer transaction are applied serially, and all requests outside of an outer transaction are applied serially as well. If any outer transaction is in progress, other requests wait until it is completed. Internally, all such read-write (updating) requests are automatically routed by the Endeca Server to the single server that can process updates to the index. That is to say, if any such requests arrive concurrently, the Endeca Server uses its own load-balancing and routing mechanism to serialize them.
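As a rough illustration of that serialization, here is a hedged Python sketch: several "client" threads submit updating requests concurrently, but a single writer thread applies them one at a time, analogous to how read-write requests are routed to the one node that can update the index. This is a model of the behavior described above under those assumptions, not Endeca code.

```python
import queue
import threading

# Model: concurrent clients submit updating requests, but a single
# writer applies them one at a time -- analogous to the Endeca Server
# routing all read-write requests to the one node that updates the index.

updates = queue.Queue()   # requests wait here, like at the write node
index = {}                # the shared index; only the writer touches it

def writer():
    # The single updating process: applies requests serially.
    while True:
        request = updates.get()
        if request is None:
            break
        record_id, attribute, value = request
        index.setdefault(record_id, {})[attribute] = value
        updates.task_done()

def client(n):
    # Each "Studio user" submits an update; none touch the index directly.
    updates.put(("rec%d" % n, "Version", 1))

w = threading.Thread(target=writer)
w.start()
clients = [threading.Thread(target=client, args=(n,)) for n in range(10)]
for c in clients:
    c.start()
for c in clients:
    c.join()
updates.join()     # wait until every queued update has been applied
updates.put(None)  # shut the writer down
w.join()

print(len(index))  # 10 records, each updated exactly once, serially
```

Because only one thread mutates `index`, no client-side locking of the index itself is needed; the queue provides the same "requests wait until the current one completes" behavior described in the reply above.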
We've been hoping this would make it into the product; we've had this capability for a while (since 3.0), and our customers really like it.
There's a lot to consider when you design your solution: concurrency (of both queries and updates), data integrity, and security, to name just a few. It's an interesting problem to solve from both a customer and a software-engineering standpoint.
To your question: what have you tried from a concurrency perspective? I think Julia has answered your question to a certain extent in terms of how transactions work and how users might be affected. However, conversation/ingest web service behavior (speaking from past experience) tends to change slightly from release to release, so (imo) it's incumbent on developers like you and me to make sure our bases are covered and the solution is well architected. Having done this before, I would argue that, while it's the most fun to code, your portlet extension is the least important piece of the solution you'll build (assuming it's a production app with multiple users, etc.).
Thanks for the link and for sharing your thoughts.
Yes, it is going to be a production app with multiple users in different time zones.
As you mentioned, I want to cover data integrity and concurrency, because per the requirement the server will be busy serving queries and updates around the clock.
I would really appreciate any input on design considerations.
If you have specific questions, we would be happy to answer them. If you are looking for solutioning and architecture advice, we offer that to our customers. It's "what we do".
However, I don't think this kind of thing works effectively on a forum, and this type of work is something we charge for. Feel free to reach out to us offline if that is something you might be interested in. If you have specific questions, though, feel free to post them.