  • Need help to understand how to assign custom Item readers, streams to dynamic flows

    Hi,

    My current configuration is shown below. I have two similar kinds of jobs that I currently run in parallel, but I am not happy with this approach because it is static: today there are two steps, and in the future this may grow to 'n' steps.

    I have already made the readers and writers step-scoped, and now I am exploring how to make this flow dynamic as well (Java code instead of XML config).

    Where I am stuck is that I don't know how to add the step-scoped multi-resource item reader and the customized item readers (configuration names: itemReader, outputTestReader) to dynamically created flows.

    Could you please help me understand how to inject these customized readers and writers whenever I create multiple flows? Each flow has a different job folder and needs to read different files.


    Code:
    <batch:job id="fileReadingJob">
        <batch:split id="multi-split-job" task-executor="parallelJoBtaskExecutor">
            <batch:flow>
                <batch:step id="taskletStep1">
                    <batch:tasklet>
                        <batch:chunk reader="itemReader" writer="writer" commit-interval="500">
                            <batch:streams>
                                <batch:stream ref="outputTestReader" />
                            </batch:streams>
                        </batch:chunk>
                        <batch:listeners>
                            <batch:listener ref="stepListener" />
                        </batch:listeners>
                    </batch:tasklet>
                </batch:step>
            </batch:flow>
            <batch:flow>
                <batch:step id="taskletStep2">
                    <batch:tasklet>
                        <batch:chunk reader="itemReader" writer="writer" commit-interval="500">
                            <batch:streams>
                                <batch:stream ref="outputTestReader" />
                            </batch:streams>
                        </batch:chunk>
                    </batch:tasklet>
                    <batch:listeners>
                        <batch:listener ref="stepListener" />
                    </batch:listeners>
                </batch:step>
            </batch:flow>
        </batch:split>
    </batch:job>

    <bean id="stepListener" class="com.owl.explorer.batchio.StepListenerLocal"
            scope="step" />

    <bean id="itemReader" class="com.owl.explorer.batchio.LogItemReader"
            scope="step">
        <property name="itemReader" ref="outputTestReader" />
    </bean>

    <bean id="outputTestReader"
            class="org.springframework.batch.item.file.MultiResourceItemReader"
            scope="step">
        <property name="resources" value="…. " />
        <property name="delegate" ref="reader" />
    </bean>

    <bean id="reader" class="org.springframework.batch.item.file.FlatFileItemReader"
            scope="step">
        <property name="lineMapper">
            <!-- lineMapper bean omitted in the original post -->
        </property>
    </bean>

    <bean id="writer" class="com.owl.explorer.batchio.LogItemWriter" scope="step"/>

    Java code I found for adding flows in parallel:

    Code:
    SimpleFlow flow = new SimpleFlow("job");
    SimpleFlow flow1 = new SimpleFlow("flow1");
    SimpleFlow flow2 = new SimpleFlow("flow2");

    List<StateTransition> transitions = new ArrayList<StateTransition>();
    // StubStep and checkRepository(...) are local helpers of the test class this
    // snippet was taken from; assertEquals/fail come from JUnit.
    transitions.add(StateTransition.createStateTransition(new StepState(new StubStep("step1") {
        @Override
        public void execute(StepExecution stepExecution) throws JobInterruptedException {
            if (!stepExecution.getJobExecution().getExecutionContext().containsKey("STOPPED")) {
                stepExecution.getJobExecution().getExecutionContext().put("STOPPED", true);
                stepExecution.setStatus(BatchStatus.STOPPED);
                jobRepository.update(stepExecution);
            }
            else {
                fail("The Job should have stopped by now");
            }
        }
    }), "end0"));

    transitions.add(StateTransition.createEndStateTransition(new EndState(FlowExecutionStatus.COMPLETED, "end0")));
    flow1.setStateTransitions(new ArrayList<StateTransition>(transitions));
    flow1.afterPropertiesSet();
    flow2.setStateTransitions(new ArrayList<StateTransition>(transitions));
    flow2.afterPropertiesSet();

    // The two flows are wrapped in a SplitState so they run in parallel.
    transitions = new ArrayList<StateTransition>();
    transitions.add(StateTransition.createStateTransition(new SplitState(Arrays.<Flow> asList(flow1, flow2), "split"), "end0"));
    transitions.add(StateTransition.createEndStateTransition(new EndState(FlowExecutionStatus.COMPLETED, "end0")));
    flow.setStateTransitions(transitions);
    flow.afterPropertiesSet();

    job.setFlow(flow);
    job.afterPropertiesSet();
    job.execute(jobExecution);

    assertEquals(BatchStatus.STOPPED, jobExecution.getStatus());
    checkRepository(BatchStatus.STOPPED, ExitStatus.STOPPED);
    assertEquals(1, jobExecution.getAllFailureExceptions().size());
    assertEquals(JobInterruptedException.class, jobExecution.getFailureExceptions().get(0).getClass());
    assertEquals(1, jobExecution.getStepExecutions().size());
    for (StepExecution stepExecution : jobExecution.getStepExecutions()) {
        assertEquals(BatchStatus.STOPPED, stepExecution.getStatus());
    }
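
    For the dynamic part itself, a higher-level alternative to wiring SimpleFlow and StateTransition by hand is Spring Batch's FlowBuilder together with the job/step builder factories: build one flow per job folder and run them all under a single split. Below is a minimal sketch under those assumptions; the class and method names (DynamicFlowFactory, buildStepFor) and the folder list are illustrative, not from this thread.

    Code:
    import java.util.ArrayList;
    import java.util.List;

    import org.springframework.batch.core.Job;
    import org.springframework.batch.core.Step;
    import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
    import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
    import org.springframework.batch.core.job.builder.FlowBuilder;
    import org.springframework.batch.core.job.flow.Flow;
    import org.springframework.batch.item.ItemReader;
    import org.springframework.batch.item.ItemWriter;
    import org.springframework.core.task.SimpleAsyncTaskExecutor;

    public class DynamicFlowFactory {

        private final JobBuilderFactory jobs;
        private final StepBuilderFactory steps;

        public DynamicFlowFactory(JobBuilderFactory jobs, StepBuilderFactory steps) {
            this.jobs = jobs;
            this.steps = steps;
        }

        // One flow per job folder, all executed in parallel under a single split.
        public Job fileReadingJob(List<String> folders,
                                  ItemReader<String> reader,
                                  ItemWriter<String> writer) {
            List<Flow> flows = new ArrayList<Flow>();
            for (String folder : folders) {
                flows.add(new FlowBuilder<Flow>("flow-" + folder)
                        .start(buildStepFor(folder, reader, writer))
                        .build());
            }

            Flow split = new FlowBuilder<Flow>("split")
                    .split(new SimpleAsyncTaskExecutor())
                    .add(flows.toArray(new Flow[0]))
                    .build();

            return jobs.get("fileReadingJob")
                    .start(split)
                    .end()
                    .build();
        }

        // One chunk-oriented step per folder; reader and writer are the step-scoped
        // beans, so each step execution gets its own instances. Streams and listeners
        // can be registered with .stream(...) and .listener(...) much like the
        // <batch:streams> / <batch:listeners> elements in the XML above.
        private Step buildStepFor(String folder, ItemReader<String> reader,
                                  ItemWriter<String> writer) {
            return steps.get("step-" + folder)
                    .<String, String>chunk(500)
                    .reader(reader)
                    .writer(writer)
                    .build();
        }
    }

    Each step gets its own name, so its restart metadata stays separate, and the step-scoped readers shown earlier are resolved once per step execution.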
    Last edited by ashokgudise; Feb 27th, 2014, 11:42 AM.

  • mminella replied:
    Sorry for not getting to this sooner. We are in the process of moving to StackOverflow for our forums.
    This question is probably a better candidate for StackOverflow, perhaps against the #spring-batch tag. If you do post it there, please reply here with the link. Thanks!
