I have a restartable job that takes a comma-delimited (CSV) file with 200 rows; the ItemReader reads them with a commit interval of 10, and the ItemWriter writes them to a new pipe-delimited file.
<job id="CSVFileLoadJob" restartable="true">
<step id="processCSV">
<tasklet>
<chunk reader="itemReaderForTest" processor="itemProcessor" writer="itemWriter" commit-interval="10"/>
</tasklet>
</step>
</job>
There is an error on line 195 of the CSV file, so the pipe-delimited output file ends up with 190 rows, as expected.
I manually fix line 195 of the input file and restart the job. The ItemReader works perfectly and picks up rows 191-200, but the output file still has only 190 rows, even though the job completes successfully.
<beans:bean id="itemWriter" class="org.springframework.batch.item.file.FlatFileItemWriter">
    <beans:property name="resource" value="..."/>
    <beans:property name="lineAggregator">
        <beans:bean class="org.springframework.batch.item.file.transform.DelimitedLineAggregator">
            <beans:property name="delimiter" value="|"/>
            <beans:property name="fieldExtractor">
                <beans:bean class="org.springframework.batch.item.file.transform.BeanWrapperFieldExtractor">
                    <beans:property name="names" value="..."/>
                </beans:bean>
            </beans:property>
        </beans:bean>
    </beans:property>
</beans:bean>
Are there any special settings needed to allow the output file to be updated on restart, or is something wrong in the configuration that prevents the file from being updated?
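For reference, FlatFileItemWriter's restart behavior is governed by its saveState property (true by default): when state is saved, the writer records its position in the step's ExecutionContext at each commit and, on restart of the same job instance, reopens the file and continues from the last committed position. A minimal sketch of the restart-relevant settings is below; the resource path is a placeholder, not taken from the original post:

```xml
<!-- Sketch only: restart-relevant FlatFileItemWriter properties -->
<beans:bean id="itemWriter" class="org.springframework.batch.item.file.FlatFileItemWriter">
    <!-- placeholder output path -->
    <beans:property name="resource" value="file:output/out.txt"/>
    <!-- saveState must stay true (the default) so the writer can
         resume at the last committed position after a restart -->
    <beans:property name="saveState" value="true"/>
</beans:bean>
```

Note also that Spring Batch only treats a run as a restart when the job is launched with the same job parameters as the failed execution; launching with new parameters creates a fresh job instance, and the writer then starts the file from scratch instead of resuming.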
Thanks in advance