Executing multiple steps within a partition

Hi,

I need help configuring a Spring Batch job. Here's what I want.

I have three inputs: InputTable1, InputTable2, and InputTable3 (three tables from the DB). The data in each input is around 600K records. I want to process the data concurrently, so I will partition the data into 6 partitions, with each partition processing 100K records from each input table.

I have a common object, and I want each step to write data to the appropriate inputTable*Attr... fields.
i.e. step1's writer writes to the common object and fills inputTable1Attr1 and inputTable1Attr2,
and accordingly step2 and step3 write to their own attributes, i.e.:

    class CommonAttr {
        private String inputTable1Attr1;
        private String inputTable1Attr2;
        private String inputTable2Attr1;
        private String inputTable2Attr2;
        private String inputTable3Attr1;
        private String inputTable3Attr2;
    }
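For illustration only, a minimal sketch of what step1's writer could look like, assuming a hypothetical item type InputTable1Row, a shared map of CommonAttr keyed by record id, and the usual setters on CommonAttr (none of these names come from the actual job):

    import java.util.List;
    import java.util.concurrent.ConcurrentMap;

    import org.springframework.batch.item.ItemWriter;

    // Sketch only: step1's writer fills its two attributes on the shared
    // CommonAttr instance for each record in the chunk. InputTable1Row is a
    // hypothetical POJO with getId()/getAttr1()/getAttr2(). The map must be
    // thread-safe (e.g. a ConcurrentHashMap) because several partitions
    // write to it concurrently.
    public class InputTable1Writer implements ItemWriter<InputTable1Row> {

        private ConcurrentMap<String, CommonAttr> commonAttrMap; // assumed shared bean

        public void setCommonAttrMap(ConcurrentMap<String, CommonAttr> commonAttrMap) {
            this.commonAttrMap = commonAttrMap;
        }

        @Override
        public void write(List<? extends InputTable1Row> items) throws Exception {
            for (InputTable1Row row : items) {
                commonAttrMap.putIfAbsent(row.getId(), new CommonAttr());
                CommonAttr attr = commonAttrMap.get(row.getId());
                attr.setInputTable1Attr1(row.getAttr1());
                attr.setInputTable1Attr2(row.getAttr2());
            }
        }
    }

Step2 and step3 writers would look the same but fill their own two attributes.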


In the Partitioner I define the strategy to query from the DB, i.e. to read from InputTable1..3.
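As a sketch of one common way to express that strategy, a Partitioner could hand each of the 6 partitions an id range of roughly 100K rows via the step ExecutionContext; the key names and the assumption of a contiguous numeric id column are illustrative only:

    import java.util.HashMap;
    import java.util.Map;

    import org.springframework.batch.core.partition.support.Partitioner;
    import org.springframework.batch.item.ExecutionContext;

    // Sketch only: splits ~600K ids into gridSize ranges (e.g. 6 x 100K).
    // Each partition's reader would then query the input tables using
    // #{stepExecutionContext['minId']} and #{stepExecutionContext['maxId']}.
    public class RangePartitioner implements Partitioner {

        private static final long TOTAL_RECORDS = 600000L; // assumed row count

        @Override
        public Map<String, ExecutionContext> partition(int gridSize) {
            Map<String, ExecutionContext> partitions = new HashMap<String, ExecutionContext>();
            long rangeSize = TOTAL_RECORDS / gridSize;
            for (int i = 0; i < gridSize; i++) {
                ExecutionContext context = new ExecutionContext();
                context.putLong("minId", i * rangeSize + 1);
                context.putLong("maxId", (i + 1) * rangeSize);
                partitions.put("partition" + i, context);
            }
            return partitions;
        }
    }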

I define my step as shown below:

    <batch:step id="step1">
        <batch:tasklet>
            <batch:chunk reader="someReader" writer="someWriter" processor="someprocessor"
                         commit-interval="1000">
                ..............
            </batch:chunk>
        </batch:tasklet>
    </batch:step>

i.e. chunk-based processing, where the reader, processor, and writer are defined within the chunk.
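For reference, a sketch of how a partitioned master step could be wired around such a step, here using the Java-based builders available since Spring Batch 2.2; the bean names (rangePartitioner, taskExecutor) and the grid size of 6 are assumptions, and the XML equivalent would be a <batch:partition> element containing a <batch:handler grid-size="6" task-executor="..."/>:

    import org.springframework.batch.core.Step;
    import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
    import org.springframework.batch.core.partition.support.Partitioner;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.core.task.TaskExecutor;

    // Sketch only: a master step that runs "step1" once per partition,
    // with 6 partitions executed concurrently on the given task executor.
    @Configuration
    public class PartitionConfig {

        @Bean
        public Step masterStep(StepBuilderFactory steps, Step step1,
                               Partitioner rangePartitioner, TaskExecutor taskExecutor) {
            return steps.get("masterStep")
                    .partitioner("step1", rangePartitioner)
                    .step(step1)
                    .gridSize(6)
                    .taskExecutor(taskExecutor)
                    .build();
        }
    }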

If I create 3 steps to accomplish the job by reading the 3 input tables from the database, the way it will work is that
"step1", "step2", and "step3" will execute sequentially, one after the other, each step doing read/process/write for all partitions and thereby processing all 600K input records before the next step starts.
i.e. step1 finishes first, then step2, then step3.

What I want is to define the job such that "step1" reads/processes/writes one chunk (commit interval of 1000 records),
then "step2" reads/processes/writes one chunk of 1000 records,
and then "step3" reads/processes/writes one chunk of 1000 records,
and this continues until all the records for that partition are processed.

And when the 1st partition finishes, all three steps have completed reading/processing/writing the data for that partition; all the partitions are processed concurrently, thereby completing the entire job.

i.e. I want a job definition something like this:

Within a partition, for the defined commit interval (chunk):
    Read/Process/Write STEP1
    Read/Process/Write STEP2
    Read/Process/Write STEP3
Last edited by cmkumar56; Jul 25th, 2013, 08:21 AM.