Multiple splitters/aggregators

  • Multiple splitters/aggregators

    I would like to split a message, then split the resulting messages again. The first splitter's correlation id, sequence number, and sequence size header values get overwritten by the second splitter, so I can't aggregate twice using the default rules: first aggregate the second split, then aggregate the first split.

    I was wondering if there are any examples, or if someone could post one or explain how this could be done.

    i.e.:
    inbound channel --> splitter 1 --> splitter 2 --> aggregator 2 --> aggregator 1 --> outbound channel

  • #2
    Hi!

    Hmm, strange: this is an out-of-the-box feature. There is the ability to 'push' and 'pop' sequence details.
    Here is a test case: https://github.com/artembilan/spring...ontext.xml#L27
    and its test class:
    https://github.com/artembilan/spring...tionTests.java

    Check your config with that, please.

    Take care,
    Artem
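
    For reference, a minimal sketch (not part of Artem's test case) of what 'push' and 'pop' means at the header level, assuming a recent Spring Integration version where MessageBuilder exposes pushSequenceDetails/popSequenceDetails; the splitter and aggregator components do this automatically, so you would not normally call these yourself:
    Code:
        import org.springframework.integration.support.MessageBuilder;
        import org.springframework.messaging.Message;

        public class SequenceDetailsSketch {
            public static void main(String[] args) {
                // Headers as the first splitter would leave them.
                Message<String> afterFirstSplit = MessageBuilder.withPayload("part")
                        .setCorrelationId("outer-group")
                        .setSequenceNumber(1)
                        .setSequenceSize(3)
                        .build();

                // A second split does not overwrite those values; they are
                // pushed into the 'sequenceDetails' header before the new
                // correlation/sequence headers are set.
                Message<String> afterSecondSplit = MessageBuilder.fromMessage(afterFirstSplit)
                        .pushSequenceDetails("inner-group", 1, 2)
                        .build();

                // The inner aggregator pops them back, restoring the first
                // splitter's headers so the outer aggregator can correlate.
                Message<String> restored = MessageBuilder.fromMessage(afterSecondSplit)
                        .popSequenceDetails()
                        .build();

                System.out.println(afterSecondSplit.getHeaders());
                System.out.println(restored.getHeaders());
            }
        }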

  • #3
    Excellent! Thank you.

    Excellent info, just what I was looking for.
    I had actually tried this and it did not work for me, but maybe I had something out of whack. At least I know it is doable.
    I spent the last 3 hours putting together a custom aggregation strategy, correlation strategy, etc., to get the second one to work. The benefit of that, at least, is that I now know how to do it. LOL.
    I will try this and let you know if I have any difficulties.

    It's great there are examples. Searches on Google/Bing led to no clear links for this. Thanks again.

  • #4
    In general, you shouldn't need to use these APIs yourself; we recommend writing your custom code as POJOs, and the framework will take care of all the plumbing.
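
    To illustrate the POJO approach, here is a hypothetical splitter along the lines of the com.brian.Splitter1 referenced in the config below (the class body and its comma-splitting logic are illustrative assumptions, not from the thread):
    Code:
        package com.brian;

        import java.util.Arrays;
        import java.util.List;

        // A plain POJO with no Spring Integration imports: returning a
        // collection is enough. When wired into an <int:splitter>, each
        // element becomes a message, and the framework sets the correlation
        // and sequence headers (pushing any existing sequence details).
        public class Splitter1 {
            public List<String> split(String payload) {
                return Arrays.asList(payload.split(","));
            }
        }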

  • #5
    Like this

    So, something like this should work...
    Code:
        <!-- receive the inbound message -->
        <int:gateway service-interface="com.brian.inboundGateway"
            default-request-channel="inbound-request-channel" default-reply-channel="outbound-request-channel"/>
        <int:channel id="inbound-request-channel"/>
        <int:channel id="outbound-request-channel"/>

        <!-- splitter chain -->
        <int:chain input-channel="inbound-request-channel" output-channel="client-call-channel">
            <int:splitter>
                <bean class="com.brian.Splitter1"/>
            </int:splitter>
            <int:splitter>
                <bean class="com.brian.Splitter2"/>
            </int:splitter>
        </int:chain>

        <!-- multi-threaded call to the client -->
        <bean id="channelTaskExecutor" class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
            <property name="corePoolSize" value="20"/>
            <property name="daemon" value="false"/>
            <property name="queueCapacity" value="20"/>
        </bean>
        <int:channel id="client-call-channel">
            <int:dispatcher task-executor="channelTaskExecutor"/>
        </int:channel>
        <int:service-activator input-channel="client-call-channel" output-channel="client-return-channel">
            <bean id="clientExecutor" class="com.brian.ClientExecutor"/>
        </int:service-activator>
        <int:channel id="client-return-channel"/>

        <!-- aggregate the two splits back together and finish -->
        <int:chain input-channel="client-return-channel" output-channel="outbound-request-channel">
            <int:aggregator send-partial-result-on-expiry="true" expire-groups-upon-completion="true"/>
            <int:aggregator send-partial-result-on-expiry="true" expire-groups-upon-completion="true"/>
        </int:chain>
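
    The remaining com.brian classes above are placeholders from the thread; hypothetical minimal versions (names, signatures, and logic are assumptions for illustration) might look like this:
    Code:
        package com.brian;

        import java.util.Arrays;
        import java.util.List;

        // Gateway interface matching <int:gateway> above: the framework
        // proxies it, sends the argument to the request channel, and returns
        // the reply from the reply channel. The method name is arbitrary;
        // the reply payload type depends on what the aggregators produce.
        interface inboundGateway {
            Object process(String payload);
        }

        // Second-level splitter; the framework pushes the first splitter's
        // sequence details before setting new ones on each piece.
        class Splitter2 {
            public List<String> split(String payload) {
                return Arrays.asList(payload.split(" "));
            }
        }

        // Service invoked once per piece on the task-executor channel.
        class ClientExecutor {
            public String call(String piece) {
                return piece.toUpperCase();
            }
        }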
