
  • multiprocessing chunks of data from a StAX input file

    Hi,

    I have to read a large XML input file, make some modifications to the data read from the file, and finally write the result to an XML output file.

    Reading a chunk of data from the XML file, using multithreading to process the modifications to that data, and finally writing the chunk to an output file seems like the right strategy.

    Here is the configuration of my job:
    ##############################
    <job id="ioSampleJob">
        <step id="step1">
            <tasklet>
                <chunk reader="itemReader" processor="itemProcessor" writer="itemWriter" task-executor="executor" commit-interval="10" />
            </tasklet>
        </step>
    </job>
    <beans:bean id="itemProcessor" class="org.springframework.batch.sample.domain.trade.internal.CustomerCreditIncreaseProcessor" />
    <beans:bean id="executor" class="org.springframework.core.task.SimpleAsyncTaskExecutor">
        <beans:property name="concurrencyLimit" value="5" />
    </beans:bean>
    </beans:beans>
    ###########################

    But this configuration runs the reading and the writing in multiple threads as well. Is there anything wrong with this configuration?
    I just want to multithread/parallelize only the processing (the modifications) of a chunk of data read from the XML file.

    I am using the M4 version.


    Thanks for your advice.
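
    [Editor's note] A multi-threaded step (a `task-executor` on the `<chunk>` element, as configured above) parallelizes the whole read-process-write cycle, and `StaxEventItemReader` is not thread-safe. In later Spring Batch releases (this likely does not apply to the M4 build mentioned above), the usual way to parallelize only the processing step is the `AsyncItemProcessor`/`AsyncItemWriter` pair from the spring-batch-integration module. A hedged sketch, reusing the bean names from the configuration above:

    ```xml
    <!-- Sketch only: AsyncItemProcessor / AsyncItemWriter from
         spring-batch-integration (later Spring Batch versions; likely
         not available in the M4 build). The reader stays on a single
         thread; only the delegate processor runs in the executor pool. -->
    <beans:bean id="asyncItemProcessor"
            class="org.springframework.batch.integration.async.AsyncItemProcessor">
        <beans:property name="delegate" ref="itemProcessor" />
        <beans:property name="taskExecutor" ref="executor" />
    </beans:bean>

    <beans:bean id="asyncItemWriter"
            class="org.springframework.batch.integration.async.AsyncItemWriter">
        <beans:property name="delegate" ref="itemWriter" />
    </beans:bean>

    <job id="ioSampleJob">
        <step id="step1">
            <tasklet>
                <!-- No task-executor here: the chunk itself stays on one
                     thread; the async processor submits work to the pool
                     and the async writer unwraps the resulting Futures. -->
                <chunk reader="itemReader" processor="asyncItemProcessor"
                       writer="asyncItemWriter" commit-interval="10" />
            </tasklet>
        </step>
    </job>
    ```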

  • #2
    Is there any sample which matches my issue?

    Thanks for the response.
