Process multiple jobs in parallel

  • Process multiple jobs in parallel

    Hi,

    I want to run multiple jobs in parallel and asynchronously using Spring Batch, so that each of these jobs performs a different operation. I came across the link below:

    http://static.springsource.org/sprin....html#parallel

    But I haven't seen any examples of using StagingItemReader and StagingItemWriter to run multiple jobs in parallel. It is also mentioned that a staging table is created on the fly: before the data is written in the output format, it goes through the staging table, which is then processed by a multi-threaded step execution. How is this configured in the Spring applicationContext.xml? Is this the best way to process multiple jobs in parallel, or is there another way to do it? Please share some links on this; they would be really useful and would clear up my doubts.

    Thanks.
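For reference, a multi-threaded ("multi-thread step execution") step along the lines of that chapter can be configured roughly as below. This is only a sketch, not the shipped parallelJob sample: the bean ids, pool size, and the stagingReader/stagingWriter bean names are placeholders.

```xml
<!-- Sketch only: bean ids, pool size, and reader/writer names are assumptions -->
<batch:job id="stagingJob">
    <batch:step id="process">
        <!-- task-executor turns this into a multi-threaded step;
             throttle-limit caps the number of concurrent chunk executions -->
        <batch:tasklet task-executor="taskExecutor" throttle-limit="10">
            <batch:chunk reader="stagingReader" writer="stagingWriter"
                         commit-interval="100"/>
        </batch:tasklet>
    </batch:step>
</batch:job>

<task:executor id="taskExecutor" pool-size="10"/>
```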

  • #2
    Guys, any update on this?



    • #3
      Assuming the jobs are operating on separate data, there is no reason why you couldn't run them in parallel.

      With regards to the staging table, that is what that sample job does. You may or may not want to create a table on the fly for this functionality based on what your needs are.
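One common way to run independent jobs in parallel is to make the JobLauncher itself asynchronous, so that each run() call returns immediately and the jobs execute on separate threads. A sketch, assuming a jobRepository bean is defined elsewhere:

```xml
<bean id="jobLauncher"
      class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
    <property name="jobRepository" ref="jobRepository"/>
    <!-- an async executor makes run(job, params) return immediately,
         so several independent jobs can be launched in parallel -->
    <property name="taskExecutor">
        <bean class="org.springframework.core.task.SimpleAsyncTaskExecutor"/>
    </property>
</bean>
```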



      • #4
        Thanks. I have two scenarios here. In the first one, I want to run multiple jobs or steps in parallel, each performing a separate operation. Here I was thinking of using multi-threaded step execution, but I haven't seen a good example of running multiple jobs in parallel (I have only seen examples that invoke jobs one after another). Please send a link on this.
        In the second one, I want to run multiple jobs or steps in parallel that all work on the same file, so I need to ensure that the data read by a particular thread in one job is not read by another thread, since all the jobs will run in parallel reading from the same file. Which approach should I use here? Would the partitioning approach work? If so, please provide some links on it.
        Can't we do the same with the multi-threaded step execution approach? Please clarify.
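For the first scenario, a split runs independent flows of one job concurrently; for the second, partitioning replaces the plain tasklet with a partition element plus a Partitioner that hands each thread a disjoint slice of the data. A sketch with placeholder step, tasklet, and partitioner names (the rangePartitioner implementation is assumed, not provided by Spring Batch):

```xml
<!-- Scenario 1 (sketch): independent steps run in parallel inside one job -->
<batch:job id="parallelJob">
    <batch:split id="split1" task-executor="taskExecutor">
        <batch:flow>
            <batch:step id="stepA">
                <batch:tasklet ref="taskletA"/>
            </batch:step>
        </batch:flow>
        <batch:flow>
            <batch:step id="stepB">
                <batch:tasklet ref="taskletB"/>
            </batch:step>
        </batch:flow>
    </batch:split>
</batch:job>

<!-- Scenario 2 (sketch): a partitioned master step; the partitioner bean,
     which must give each worker a disjoint slice, is a placeholder -->
<batch:step id="masterStep">
    <batch:partition step="workerStep" partitioner="rangePartitioner">
        <batch:handler grid-size="10" task-executor="taskExecutor"/>
    </batch:partition>
</batch:step>
```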



        • #5
          Thanks. The first scenario is fine: I am using a TaskExecutor to create multiple threads that operate on different data in parallel. I want to know how to handle the second scenario, where we need multiple threads to operate on the same file. How should I go about that?
          Also, please clarify whether using JdbcBatchItemWriter for batch updates from multiple threads (with a throttle-limit=1000) will cause any problems when updating records in the same table, since each thread (10 in this case) will handle 1000 records.
          Please clarify.
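For context, a typical JdbcBatchItemWriter definition looks like the following sketch (the SQL, table, and parameter names are placeholders). Note the distinction: throttle-limit is a tasklet attribute that caps concurrent threads, while commit-interval on the chunk controls how many records go into each batch update.

```xml
<bean id="itemWriter"
      class="org.springframework.batch.item.database.JdbcBatchItemWriter">
    <property name="dataSource" ref="dataSource"/>
    <!-- named parameters are bound from the item's properties -->
    <property name="sql"
              value="UPDATE accounts SET balance = :balance WHERE id = :id"/>
    <property name="itemSqlParameterSourceProvider">
        <bean class="org.springframework.batch.item.database.BeanPropertyItemSqlParameterSourceProvider"/>
    </property>
</bean>
```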



          • #6
            With regards to processing a file via multiple threads, there is typically no performance improvement. You usually see the I/O buffers become saturated and, depending on the configuration, the processing actually slowing down. Because of this, I would recommend importing the file into a staging table of some kind and processing the data from there.

            JdbcBatchItemWriter is thread safe per the documentation.
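An import step along those lines (flat file into a staging table, processed later by a separate, possibly multi-threaded, step) might look like this sketch; the file path, column names, staging table, and the com.example.StagingRecord class are all placeholders:

```xml
<!-- Sketch: load the flat file into a staging table single-threaded -->
<bean id="fileReader"
      class="org.springframework.batch.item.file.FlatFileItemReader">
    <property name="resource" value="file:input/data.csv"/>
    <property name="lineMapper">
        <bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
            <property name="lineTokenizer">
                <bean class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
                    <property name="names" value="id,payload"/>
                </bean>
            </property>
            <property name="fieldSetMapper">
                <bean class="org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper">
                    <!-- hypothetical item class for illustration -->
                    <property name="targetType" value="com.example.StagingRecord"/>
                </bean>
            </property>
        </bean>
    </property>
</bean>

<bean id="stagingWriter"
      class="org.springframework.batch.item.database.JdbcBatchItemWriter">
    <property name="dataSource" ref="dataSource"/>
    <property name="sql"
              value="INSERT INTO staging (id, payload) VALUES (:id, :payload)"/>
    <property name="itemSqlParameterSourceProvider">
        <bean class="org.springframework.batch.item.database.BeanPropertyItemSqlParameterSourceProvider"/>
    </property>
</bean>
```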

