Job that has itemWriter when restarted does not update output file

  • Job that has itemWriter when restarted does not update output file

    Scenario:
    A restartable job takes a comma-delimited file (CSV) with 200 rows; an ItemReader reads them with a commit interval of 10, and an ItemWriter writes them to a new pipe-delimited file.
    <job id="CSVFileLoadJob" restartable="true">
    <step id="processCSV">
    <tasklet>
    <chunk reader="itemReaderForTest" processor="itemProcessor" writer="itemWriter" commit-interval="10"/>
    </tasklet>
    </step>
    </job>
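
    For reference, a minimal sketch of what the itemReaderForTest bean might look like; the Spring Batch class names are standard, but the field names are assumed to mirror the writer's fieldExtractor below, and the input path and target type are hypothetical:

    <beans:bean id="itemReaderForTest" class="org.springframework.batch.item.file.FlatFileItemReader">
        <!-- saveState defaults to true, so the reader resumes after the last committed row on restart -->
        <beans:property name="resource" value="file:input/accounts.csv"/>
        <beans:property name="lineMapper">
            <beans:bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
                <!-- DelimitedLineTokenizer splits on commas by default -->
                <beans:property name="lineTokenizer">
                    <beans:bean class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
                        <beans:property name="names" value="accountId,dataSourceTypeId,originalAccountNumber,accountNumber,primarySSN,updatedBy,updatedDate"/>
                    </beans:bean>
                </beans:property>
                <beans:property name="fieldSetMapper">
                    <beans:bean class="org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper">
                        <!-- com.example.Account is a hypothetical domain class with matching properties -->
                        <beans:property name="targetType" value="com.example.Account"/>
                    </beans:bean>
                </beans:property>
            </beans:bean>
        </beans:property>
    </beans:bean>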

    The CSV file has an error on line 195, so the job fails there and the pipe-delimited output file has 190 rows. (Perfect.)
    After manually fixing line 195 of the input file and restarting the job, the ItemReader works perfectly and picks up rows 191-200, but the output file still has only 190 rows, and the job completes successfully..

    <beans:bean id="itemWriter" class="org.springframework.batch.item.file.FlatFil eItemWriter">
    <beansroperty name="saveState" value="true"/>
    <beansroperty name="shouldDeleteIfExists" value="true"/>
    <beansroperty name="resource" ref="outputResource" />
    <beansroperty name="lineAggregator">
    <beans:bean class="org.springframework.batch.item.file.transfo rm.DelimitedLineAggregator">
    <beansroperty name="delimiter" value="||"/>
    <beansroperty name="fieldExtractor">
    <beans:bean class="org.springframework.batch.item.file.transfo rm.BeanWrapperFieldExtractor">
    <beansroperty name="names" value="accountId,dataSourceTypeId,originalAccountN umber,accountNumber,primarySSN,updatedBy,updatedDa te"/>
    </beans:bean>
    </beansroperty>
    </beans:bean>
    </beansroperty>
    </beans:bean>
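
    The outputResource bean referenced above is not shown in the post; a minimal sketch, assuming a plain file on disk (the path is hypothetical):

    <beans:bean id="outputResource" class="org.springframework.core.io.FileSystemResource">
        <!-- FileSystemResource takes the target file path as a constructor argument -->
        <beans:constructor-arg value="output/accounts.psv"/>
    </beans:bean>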

    Are there any special settings needed to allow the file to be updated on restart, or is something wrong in the configuration that prevents the file from being updated...

    Thanks in advance

  • #2
    I think that what you are describing is related to http://jira.springframework.org/browse/BATCH-1225. It will be fixed for the 2.0.1 release.

    • #3
      Not sure.. in my scenario I don't see any overwrites; the new records that are part of the re-run are simply not appended to the file. I changed the test, and this is what I saw to confirm it...

      I changed the commit interval to 300, so when the job failed at record 195 there were no entries in the output file (perfect). Then I fixed the input file so it had no errors, restarted the job, and it created all 200 rows.
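
      For clarity, this test used the same job definition with only the commit interval on the chunk element changed:

      <chunk reader="itemReaderForTest" processor="itemProcessor" writer="itemWriter" commit-interval="300"/>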

      Hence it looks like the issue is that when the uncommitted rows are reprocessed on restart, they do not get written to the output file...

      Thanks

      • #4
        Since you have 200 records, try it with a commit interval of 10 and make record #105 fail. When you fix the record and restart, you will probably see that some 10 rows are missing from the middle. (This is the bug I mentioned.)

        • #5
          Thanks for the detail, got it.. the chunk of records in the transaction block that failed is lost when the job is restarted..

          • #6
            http://jira.springframework.org/browse/BATCH-1225 should address this issue...
