
  • how to get the entire records in the spring batch writer

    Hi,

I am reading data from the database using ScoreCursorReader and writing it to a flat file using FlatFileItemWriter. I set the commit interval to 10000 in the step. Up to this point everything works fine. However, my requirement is to apply some business rules to the entire result set (i.e. the full list of records that get written to the flat file) in one shot. Since the number of records is unknown (approximately 100,000 or more), I am unable to proceed with the remaining logic, which is blocking me.

    Is there any way to get all of the records in one shot?

    Please let me know if there is any way of achieving the above requirement.

    Thanks in advance,
    SN
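
    For reference, the setup described above would look roughly like this in Spring Batch's XML configuration (a sketch only; the bean ids `dbReader` and `fileWriter` are illustrative, not from the original post):

    ```xml
    <!-- Chunk-oriented step: reads from the database, writes to a flat
         file, committing every 10000 items. Bean ids are illustrative. -->
    <job id="exportJob" xmlns="http://www.springframework.org/schema/batch">
        <step id="exportStep">
            <tasklet>
                <chunk reader="dbReader" writer="fileWriter"
                       commit-interval="10000"/>
            </tasklet>
        </step>
    </job>
    ```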

  • #2
First, get the StepExecution; then call stepExecution.getReadCount() or stepExecution.getWriteCount() (or one of the other stepExecution.getXxxCount() methods).
    Good luck!
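
    To illustrate what those counters report, here is a minimal stand-in (not the Spring Batch API): the counts are running totals that grow as each chunk completes. In real Spring Batch you would obtain the actual StepExecution, for example via an @BeforeStep-annotated method on the writer.

    ```java
    // Minimal stand-in for StepExecution's counters. Illustrative only:
    // the real object is provided by the framework, not constructed by you.
    class FakeStepExecution {
        private long readCount = 0;
        private long writeCount = 0;

        // Simulate one commit interval completing with the given item count.
        void chunkCompleted(int items) {
            readCount += items;
            writeCount += items;
        }

        long getReadCount() { return readCount; }
        long getWriteCount() { return writeCount; }
    }

    public class CountDemo {
        public static void main(String[] args) {
            FakeStepExecution exec = new FakeStepExecution();
            // Ten chunks of 10000 items, as in the scenario in this thread.
            for (int chunk = 0; chunk < 10; chunk++) {
                exec.chunkCompleted(10000);
            }
            System.out.println(exec.getReadCount());  // 100000
            System.out.println(exec.getWriteCount()); // 100000
        }
    }
    ```

    Note that the counters only tell you how many items went through; they do not give you the items themselves.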



    • #3
      Hi Oukousei,

      Thanks for the reply.

getReadCount() or getWriteCount() on the StepExecution will give you the number of records the step has read so far, but I need all the records in the ItemWriter before they are written to the output file.

      For example, if the query returns 100,000 records from the database and the commit interval is 10,000, then the step will read all 100,000 records in 10 chunks. Each chunk is delegated to the ItemWriter, and its records are committed to the output file. But in my case, all of the records should be written to the output file only at the last (i.e. tenth) chunk, after applying some business rules.

      Please help me if any approach is available.

      Thanks in advance,
      SN
      Last edited by sivananda; Dec 21st, 2011, 08:36 AM.



      • #4
Well, one obvious way to rectify the situation is to increase the commit-interval to be greater than the number of records read. This normally isn't advisable, since it could lead to memory issues, but in your case there isn't much else that can be done if the records can only be written after the entire input has been processed.

A second option would be to break the work into two steps: a read-process step and a read-write step.
        The read-process step reads the data, processes it, and stores the values needed for the write in the JobExecution (its ExecutionContext).
        The read-write step reads the data again and uses the stored values to apply the business rules and write the data out.

        A third possible approach (if only a particular positional field in the file is to be updated) is to use a StepExecutionListener/JobExecutionListener and manually write to the file in update mode.
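
        The buffering idea behind the first option can be sketched in plain Java. This is a minimal stand-in, not framework code: in real Spring Batch the class would implement ItemWriter<T> and StepExecutionListener, the flush would happen in afterStep(), and the output list here stands in for the flat file. Sorting stands in for the business rules.

        ```java
        import java.util.ArrayList;
        import java.util.Collections;
        import java.util.List;

        // Buffers every chunk in memory and only writes once the step
        // signals completion. Names are illustrative, not Spring Batch API.
        class BufferingWriter {
            private final List<String> buffer = new ArrayList<>();
            private final List<String> output = new ArrayList<>(); // stands in for the flat file

            // Called once per chunk (per commit interval) instead of
            // writing straight to the file.
            void write(List<String> chunk) {
                buffer.addAll(chunk);
            }

            // Called when the step completes: apply the business rules to
            // the full result set, then write everything in one shot.
            void afterStep() {
                Collections.sort(buffer); // sorting stands in for the rules
                output.addAll(buffer);
                buffer.clear();
            }

            List<String> getOutput() { return output; }
        }

        public class BufferDemo {
            public static void main(String[] args) {
                BufferingWriter writer = new BufferingWriter();
                writer.write(List.of("b", "c")); // chunk 1
                writer.write(List.of("a"));      // chunk 2
                writer.afterStep();              // all records handled at once
                System.out.println(writer.getOutput()); // [a, b, c]
            }
        }
        ```

        The same caveat as the first option applies: the whole result set is held in memory until the step ends.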
