1 Reply. Latest reply: Jan 14, 2011 12:16 PM by Bob Hanckel

Overstamping messaging queue in push replication to prevent OOM.

701681 Explorer
Scenario:

- Site A is publishing data to Site B using PRP (Push Replication Pattern).
- Site B becomes unavailable for an indefinite period.

Is there a way to configure Push Replication/Coherence messaging so that it retains only the most recent message for a given key/cache, so that the messaging queue (coherence.messagingpattern.messages) does not grow indefinitely and cause the storage node in Site A to run out of memory?

i.e. for some caches, only the latest value is of real significance, so there is no point queuing thousands of messages for the same key/cache (much like how Coherence idempotent cachestores work).

NB: I am aware of the CoalescingBatchPublisher, but that only helps when Site B "is" available. I am also aware of the JMX drain function, but that needs user intervention and is not really what I want to achieve.
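
To make it concrete, the behaviour I am after is roughly the sketch below. This is purely illustrative and uses my own names; nothing here is part of PRP or the messaging pattern. While Site B is down, each new update for a key simply overwrites ("overstamps") the previously queued one, so at most one pending message per key is ever held, and draining the buffer when Site B returns yields one message per key instead of thousands.

import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative sketch only -- not part of Push Replication or Coherence messaging.
// While the target site is unavailable, keep just the latest pending value per key
// ("overstamping") instead of queueing every intermediate update.
public class OverstampingBuffer<K, V>
{
    private final ConcurrentHashMap<K, V> latestByKey = new ConcurrentHashMap<K, V>();

    // Each new update for a key replaces whatever was previously queued for it.
    public void offer(K key, V value)
    {
        latestByKey.put(key, value);
    }

    // Once the target site is reachable again, publish one message per key.
    public Map<K, V> drain()
    {
        Map<K, V> snapshot = new HashMap<K, V>(latestByKey);
        for (Map.Entry<K, V> entry : snapshot.entrySet())
        {
            // Conditional remove: only discard the entry if it has not been
            // overwritten again since the snapshot was taken.
            latestByKey.remove(entry.getKey(), entry.getValue());
        }
        return snapshot;
    }
}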

Cheers,
Neville.
  • 1. Re: Overstamping messaging queue in push replication to prevent OOM.
    Bob Hanckel Explorer
    Hi Neville,

    Sorry for not getting back sooner on this. Right now there is no configuration knob
    for this, although procedurally it would not be difficult to implement. It would
    probably be best done in the PublishingService, which knows when a target cluster is
    currently unavailable and could continually trim the message queue whenever it grows
    too large. If this is the policy for all published caches, the change should be
    straightforward.

    If you look at the code in CoherencePublishingService, you will see an exception block
    at the bottom of onPublish() which handles the situation where a publisher has failed.
    There, probably in the code that would otherwise suspend the publisher, you can add
    trim logic that figures out how much is backlogged in the message queue and optionally
    throws messages away (much as would be done if they had been sent successfully).
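
    To make that a little more concrete, below is a rough, self-contained sketch of the
    idea. The class, method and queue names are illustrative only and do not match the
    actual CoherencePublishingService code: cap the backlog at a fixed size and, when a
    publish attempt fails because the target is unreachable, drop the oldest messages
    before suspending as usual.

    import java.util.ArrayDeque;
    import java.util.Deque;
    import java.util.List;
    import java.util.logging.Logger;

    // Illustrative sketch only; the real CoherencePublishingService differs.
    public class TrimmingPublisherSketch
    {
        private static final Logger LOGGER =
            Logger.getLogger(TrimmingPublisherSketch.class.getName());

        // Maximum number of messages retained while the target site is unavailable.
        private final int maxBacklog;

        // Stand-in for the pending message queue held on the source site.
        private final Deque<Object> pendingMessages = new ArrayDeque<Object>();

        public TrimmingPublisherSketch(int maxBacklog)
        {
            this.maxBacklog = maxBacklog;
        }

        public void onPublish(List<Object> batch)
        {
            pendingMessages.addAll(batch);
            try
            {
                publishToTarget(pendingMessages);
                pendingMessages.clear();          // delivered: discard as usual
            }
            catch (RuntimeException targetUnavailable)
            {
                // This is where the failure handling would otherwise just suspend
                // the publisher.  Before suspending, trim the backlog so it cannot
                // grow without bound while the target cluster is down.
                int dropped = 0;
                while (pendingMessages.size() > maxBacklog)
                {
                    pendingMessages.pollFirst();  // throw away the oldest message
                    dropped++;
                }
                if (dropped > 0)
                {
                    LOGGER.warning("Target unavailable; dropped " + dropped
                        + " oldest messages to bound the backlog");
                }
            }
        }

        // Placeholder for the real remote publish call.
        private void publishToTarget(Deque<Object> messages)
        {
            throw new RuntimeException("target cluster unavailable");
        }
    }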

    This model has some serious shortcomings, in that source and target can permanently
    diverge in state.


    Regards,

    Bob
