How to know the total number of records in a file
  • How to know the total number of records in a file

    Hi,

    I want to know the total number of records in a file just before the batch starts to process the first record, so that I can show the status of the job: total records, processed records, failed records, and so on.
    Please guide me.

    Thanks in advance,
    Shahul

  • #2
    You can count the records in a beforeJob or beforeStep listener, or perhaps use a separate TaskletStep.
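
    For example, a StepExecutionListener along these lines could do the count up front (a minimal sketch assuming a plain flat file where one line equals one record; the class name and the context key are only illustrative):

      import java.nio.file.Files;
      import java.nio.file.Paths;
      import java.util.stream.Stream;

      import org.springframework.batch.core.ExitStatus;
      import org.springframework.batch.core.StepExecution;
      import org.springframework.batch.core.StepExecutionListener;

      public class RecordCountListener implements StepExecutionListener {

          private final String inputFile; // path to the input flat file, supplied via configuration

          public RecordCountListener(String inputFile) {
              this.inputFile = inputFile;
          }

          @Override
          public void beforeStep(StepExecution stepExecution) {
              // Count the lines once, before the first record is processed, and keep
              // the total in the step's ExecutionContext so it can be reported later.
              try (Stream<String> lines = Files.lines(Paths.get(inputFile))) {
                  stepExecution.getExecutionContext().putLong("totalRecords", lines.count());
              } catch (Exception e) {
                  throw new IllegalStateException("Could not count records in " + inputFile, e);
              }
          }

          @Override
          public ExitStatus afterStep(StepExecution stepExecution) {
              return stepExecution.getExitStatus();
          }
      }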

  • #3
    It's really hard to know the number of records *before* the job is run. If it's fixed-width input, you could probably determine it mathematically by taking the total size of the file and dividing by the size of each record, which works because every record has to take up the same amount of space. If it's delimited, however, the only way I can think of is to spin through the file quickly and count.

    It's fairly easy to count the records as you process them, though, which is fairly common.
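
    For the fixed-width case, the arithmetic is just total file size divided by bytes per record. A small sketch (the record length here is a placeholder and assumes every record, including its line terminator, occupies exactly that many bytes):

      import java.io.File;

      public class FixedWidthRecordCount {

          // e.g. 80 data bytes plus a 2-byte CR/LF terminator
          private static final int RECORD_LENGTH = 82;

          public static long countRecords(String path) {
              long fileSize = new File(path).length();
              // total records = total file size / bytes per record
              return fileSize / RECORD_LENGTH;
          }
      }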

  • #4
    I have to process fixed-width, delimited, and XML files. Incoming files contain millions of records, so reading through a file just to get the total number of records would impact system performance. I am therefore planning to show only the processed records. But I have one more question.
    I need to save the number of processed records, the number of failed records, and the execution status in a table, so that the batch status can be shown on a screen. The system should update the table after every 1000 processed records.
    How can we achieve this?

  • #5
    If you set the commit interval to 1000, the BATCH_STEP_EXECUTION table will be updated after every 1000 items are processed.

    Alternatively, you can launch the job asynchronously so that you have a reference to the JobExecution object, which is updated as the job executes.
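
    A rough sketch of both ideas, assuming Java-based configuration (available in later Spring Batch versions); the bean, step, and item type here are illustrative:

      import org.springframework.batch.core.Step;
      import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
      import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
      import org.springframework.batch.core.launch.support.SimpleJobLauncher;
      import org.springframework.batch.core.repository.JobRepository;
      import org.springframework.batch.item.ItemReader;
      import org.springframework.batch.item.ItemWriter;
      import org.springframework.context.annotation.Bean;
      import org.springframework.context.annotation.Configuration;
      import org.springframework.core.task.SimpleAsyncTaskExecutor;

      @Configuration
      @EnableBatchProcessing
      public class StatusReportingConfig {

          @Bean
          public Step importStep(StepBuilderFactory steps,
                                 ItemReader<String> reader,
                                 ItemWriter<String> writer) {
              // Chunk size 1000 acts as the commit interval: the step execution
              // meta-data is persisted once per 1000 items, at each chunk boundary.
              return steps.get("importStep")
                      .<String, String>chunk(1000)
                      .reader(reader)
                      .writer(writer)
                      .build();
          }

          @Bean
          public SimpleJobLauncher asyncJobLauncher(JobRepository jobRepository) throws Exception {
              // With a task executor, run(...) returns immediately and hands back a
              // JobExecution whose counts can be polled while the job is still running.
              SimpleJobLauncher launcher = new SimpleJobLauncher();
              launcher.setJobRepository(jobRepository);
              launcher.setTaskExecutor(new SimpleAsyncTaskExecutor());
              launcher.afterPropertiesSet();
              return launcher;
          }
      }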

  • #6
    As Robert said, if you set the commit interval to 1,000, the meta-data will be stored every 1,000 records, just before committing. The ExecutionContext is where you can store this kind of state, so that in case of a failure you don't have to start over again.
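
    One way to keep such counters in the step's ExecutionContext is sketched below, assuming the pre-5.x List-based ItemWriteListener signatures; the key names are arbitrary. Because the context is persisted with the step meta-data at each commit, a restarted job can pick the counters up rather than starting from zero:

      import java.util.List;

      import org.springframework.batch.core.ItemWriteListener;
      import org.springframework.batch.core.StepExecution;
      import org.springframework.batch.core.annotation.BeforeStep;
      import org.springframework.batch.item.ExecutionContext;

      public class ProcessedCountListener implements ItemWriteListener<Object> {

          private ExecutionContext executionContext;

          @BeforeStep
          public void saveStepExecution(StepExecution stepExecution) {
              // Grab the step's ExecutionContext; it is saved with the meta-data at each commit.
              this.executionContext = stepExecution.getExecutionContext();
              if (!executionContext.containsKey("processedCount")) {
                  executionContext.putLong("processedCount", 0L);
              }
          }

          @Override
          public void beforeWrite(List<? extends Object> items) {
              // nothing to do before the write
          }

          @Override
          public void afterWrite(List<? extends Object> items) {
              // Bump the processed counter after each successful write.
              long count = executionContext.getLong("processedCount");
              executionContext.putLong("processedCount", count + items.size());
          }

          @Override
          public void onWriteError(Exception exception, List<? extends Object> items) {
              // Record failures separately so the screen can show both figures.
              long failed = executionContext.containsKey("failedCount")
                      ? executionContext.getLong("failedCount") : 0L;
              executionContext.putLong("failedCount", failed + items.size());
          }
      }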

  • #7
    I implemented ItemReadListener and ItemWriteListener and extended CompositeExecutionJobListener in one class (BatchStatusLoggerListener), and that class is used to track the total number of processed records, failed records, etc. I tried to simulate a write error in one record by inserting a string into a numeric column in the table. It generated a NumberFormatException, which was thrown to our BatchStatusLoggerListener implementation class, and the failed-record count was increased. But after that the onError method of CompositeExecutionJobListener was called and the job stopped. Please guide me on how to overcome this issue.

    Thanks,
    Shahul

  • #8
    All errors are fatal by default. Do you need to add a skip limit?

    Out of interest, why do you need to extend CompositeExecutionJobListener? That should really be an internal class - it's not really designed for extension, and I can't immediately see why you need to.
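
    For reference, a sketch of declaring the exception skippable (again assuming Java-based configuration; the names and the limit of 100 are illustrative), so a bad record is skipped and counted instead of failing the whole step:

      import org.springframework.batch.core.Step;
      import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
      import org.springframework.batch.item.ItemReader;
      import org.springframework.batch.item.ItemWriter;
      import org.springframework.context.annotation.Bean;
      import org.springframework.context.annotation.Configuration;

      @Configuration
      public class FaultTolerantStepConfig {

          @Bean
          public Step faultTolerantStep(StepBuilderFactory steps,
                                        ItemReader<String> reader,
                                        ItemWriter<String> writer) {
              return steps.get("faultTolerantStep")
                      .<String, String>chunk(1000)
                      .reader(reader)
                      .writer(writer)
                      .faultTolerant()
                      .skip(NumberFormatException.class) // treat this error as skippable
                      .skipLimit(100)                    // fail the step only after 100 skips
                      .build();
          }
      }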

  • #9
    Sorry for the late reply...

    I implemented JobExecutionListener, not CompositeExecutionJobListener.
