Writing data to multiple files using CompositeItemWriter

  • Writing data to multiple files using CompositeItemWriter


    I have a requirement to retrieve data from a database and write it to files, based on the file name given in the database.

    This is how data is defined in DB,

    Columns --> FILE_NAME, REC_ID, NAME
    Data --> file_1.csv, 1, ABC
    Data --> file_1.csv, 2, BCD
    Data --> file_1.csv, 3, DEF
    Data --> file_2.csv, 4, FGH
    Data --> file_2.csv, 5, DEF
    Data --> file_3.csv, 6, FGH
    Data --> file_3.csv, 7, DEF
    Data --> file_4.csv, 8, FGH

    As you can see, the file names are stored alongside the data in the database, so Spring Batch should read the data and write each record to the corresponding file specified in the database (i.e., file_1.csv should contain only records 1, 2, and 3; file_2.csv should contain only records 4 and 5; and so on).

    Is it possible to use MultiResourceItemWriter for this requirement (please note that the entire file name is dynamic and must be retrieved from the database)? I would appreciate a sample configuration. Thanks.
    Last edited by forumuser1; Apr 11th, 2013, 05:46 PM.

  • #2
    Actually, the subject line of this thread has a typo; the corrected subject should be
    "Writing data to multiple files using MultiResourceItemWriter". I know CompositeItemWriter does not suffice for my requirement.

    Appreciate your help on this. Thanks.


    • #3
      Unfortunately there is nothing out of the box within Spring Batch for that use case. The best you'll be able to do is write a custom ItemWriter that dynamically creates the resources and FlatFileItemWriters for each file needed. Keep in mind that restart will be tricky at best in this scenario.
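
      A minimal sketch of that idea in plain Java (this is not the Spring Batch API itself; the `Row` type, the `RoutingFileWriter` name, and the CSV layout are assumptions for illustration — a real implementation would put this logic inside a class implementing `org.springframework.batch.item.ItemWriter`):

      ```java
      import java.io.IOException;
      import java.io.Writer;
      import java.nio.file.Files;
      import java.nio.file.Path;
      import java.util.HashMap;
      import java.util.List;
      import java.util.Map;

      // Hypothetical record type mirroring the DB row (FILE_NAME, REC_ID, NAME).
      class Row {
          final String fileName;
          final int recId;
          final String name;

          Row(String fileName, int recId, String name) {
              this.fileName = fileName;
              this.recId = recId;
              this.name = name;
          }
      }

      // Core routing idea behind a custom ItemWriter: keep one open writer per
      // target file and dispatch each item by its FILE_NAME column, creating
      // the resource lazily the first time a new file name is seen.
      class RoutingFileWriter implements AutoCloseable {
          private final Path outputDir;
          private final Map<String, Writer> writers = new HashMap<>();

          RoutingFileWriter(Path outputDir) {
              this.outputDir = outputDir;
          }

          // Analogous to ItemWriter#write(List<? extends Row>).
          void write(List<Row> items) throws IOException {
              for (Row row : items) {
                  Writer w = writers.get(row.fileName);
                  if (w == null) {
                      // Lazily create the file the first time its name appears.
                      w = Files.newBufferedWriter(outputDir.resolve(row.fileName));
                      writers.put(row.fileName, w);
                  }
                  w.write(row.recId + "," + row.name + System.lineSeparator());
              }
          }

          @Override
          public void close() throws IOException {
              for (Writer w : writers.values()) {
                  w.close();
              }
          }
      }
      ```

      In a real job you would also need to decide how to flush and reopen files across chunks, which is exactly where restartability gets tricky: Spring Batch's stock FlatFileItemWriter tracks its own restart state, and this dynamic approach does not.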


      • #4
        Thanks. Yes, we ended up writing a Tasklet that writes multiple files dynamically. We understand the risks in failure scenarios, such as restarting jobs, but in our case the functionality outweighs those risks.