  • Setting the reading pointer with FlatFileItemReader

    Hi All!

    I am implementing a batch framework. Our batch strategy is that if a job fails, then upon restart it should start processing the flat file right after the last line it successfully read and committed to the database. However, I am having trouble implementing this.

    With my current configuration, if a job fails, on restart it starts from the line it last read (not the line last committed). For example: say my flat file has 6 lines and my chunk size is 2, and an exception occurs during the first chunk. When I restart the job, it starts processing from line 3 and skips the first 2, even though the first two lines were never committed to the database because the exception caused a rollback.

    The above behavior is expected, since ItemReaders are forward-only. However, I was hoping that if I set reader-transactional-queue="true", then on restarting the job the ItemReader would read again from line 1 rather than line 3.

    That is not what happens. When I set the attribute to true, my job skips the first 2 lines after the exception and continues to process the remaining lines. This is not what I expected: I was hoping the job would fail on the exception as before, and that on restart it would read again from line 1.
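
    Roughly, the step configuration I am describing has this shape (the bean names and the commit interval here are simplified placeholders rather than my exact configuration):

    <step id="load">
        <tasklet>
            <!-- chunk size ("commit interval") of 2, as in the example above -->
            <chunk reader="flatFileItemReader" writer="itemWriter"
                   commit-interval="2"
                   reader-transactional-queue="true"/> <!-- the attribute I tried setting -->
        </tasklet>
    </step>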

    Could anyone help me with this? Any pointers are greatly appreciated.

    Thanks

  • #2
    Forget about reader-transactional-queue (the FlatFileItemReader is not a queue).

    It should work as you expect if you aren't doing anything unusual.

    How did the job fail exactly? How did you restart it? Can you share some more details of your configuration?


    • #3
      Dave:

      Thanks for your reply. I am running a customized version of the sample football job on my machine; I use it to test things before putting anything into my framework. I am using Hibernate and Spring's HibernateTemplate for data access.

      My step configuration is:

      <step id="playerload" next="gameLoad">

      <step id="gameLoad" next="playerSummarization">

      I induce an error in the gameLoad step by making one field value (the opponent code in the first line below) longer than my database column can accommodate:

      AbduKa01,1996,mia,10,nweeeeeeeeee,0,0,0,0,0,29,104,,16,2
      AbduKa02,1996,mia,11,clt,0,0,0,0,0,18,70,,11,2
      AbduKa03,1996,mia,12,oti,0,0,0,0,0,18,59,,0,0
      AbduKa04,1996,mia,13,pit,0,0,0,0,0,16,57,,0,0
      AbduKa05,1996,mia,14,rai,0,0,0,0,0,18,39,,7,0

      I run my job from the command line:

      /jobs/football/football-job-context.xml footballJob scheduledOn=04/25/2011
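
      (Assuming the standard Spring Batch launcher, the full command would be along these lines; the launcher class and classpath are assumptions, only the three arguments are taken from the post above:)

      java -cp <classpath> org.springframework.batch.core.launch.support.CommandLineJobRunner \
          /jobs/football/football-job-context.xml footballJob scheduledOn=04/25/2011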

      When I run the job:

      1. step playerLoad finishes successfully.
      2. step gameLoad fails with DataException (see below for the exception)

      The entries in the batch_step_execution table at this point are (showing only part of each row):

      playerLoad Completed

      gameLoad Failed (commit count 1, read count 2, write count 2)

      3. When I run the job again with the same parameters:

      /jobs/football/football-job-context.xml footballJob scheduledOn=04/25/2011

      another execution id for the same job instance is created, and now I see the remaining steps get completed in the batch_step_execution table:

      gameLoad Completed

      playerSummarization Completed

      This is expected, but I also see that the gameLoad step has skipped the first two lines of the CSV file and processed only the remaining three.

      How can I get the gameLoad step to start processing again from the first line (assuming the underlying problem has been rectified)?


      Also, since I have your attention, could you please give me some pointers on my other post here: http://forum.springsource.org/showthread.php?t=108409


      I appreciate your time and consideration. Thanks

      ----------------------------------------------
      Exception:

      10:29:29,093 WARN main org.hibernate.util.JDBCExceptionReporter:77 - SQL Error: 0, SQLState: 22001
      10:29:29,093 ERROR main org.hibernate.util.JDBCExceptionReporter:78 - Data truncation: Data too long for column 'OPPONENT' at row 1
      10:29:29,093 ERROR main org.hibernate.event.def.AbstractFlushingEventListener:301 - Could not synchronize database state with session
      org.hibernate.exception.DataException: Could not execute JDBC batch update
      at .........
      ........................... org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:261)
      at org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:76)
      at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:367)
      at ........
      ..........................................
      10:29:29,093 DEBUG main org.springframework.orm.hibernate3.HibernateTransactionManager:893 - Initiating transaction rollback after commit exception
      org.springframework.dao.DataIntegrityViolationException: Could not execute JDBC batch update; SQL [insert into games (YEAR_NO, TEAM, WEEK, OPPONENT, COMPLETES, ATTEMPTS, PASSING_YARDS, PASSING_TD, INTERCEPTIONS, RUSHES, RUSH_YARDS, RECEPTIONS, RECEPTIONS_YARDS, TOTAL_TD, PLAYER_ID) values (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)]; nested exception is org.hibernate.exception.DataException: Could not execute JDBC batch update
      at org.springframework.orm.hibernate3.SessionFactoryUtils.convertHibernateAccessException(SessionFactoryUtils.java:642)
      at org.springframework.orm.hibernate3.HibernateTransactionManager.convertHibernateAccessException(HibernateTransactionManager.java:793)
      at org.springframework.orm.hibernate3.HibernateTransactionManager.doCommit(HibernateTransactionManager.java:664)
      at org.springframework.transaction.support.AbstractPlatformTransactionManager.processCommit(AbstractPlatformTransactionManager.java:754)
      at org.springframework.transaction.support.AbstractPlatformTransactionManager.commit(AbstractPlatformTransactionManager.java:723)
      at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:147)
      at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:261)
      at org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:76)
      at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:367)
      at org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:214)
      at org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:143)
      at org.springframework.batch.core.step.tasklet.TaskletStep.doExecute(TaskletStep.java:247)
      at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:196)
      at org.springframework.batch.core.job.SimpleStepHandler.handleStep(SimpleStepHandler.java:135)
      at org.springframework.batch.core.job.flow.JobFlowExecutor.executeStep(JobFlowExecutor.java:61)
      at org.springframework.batch.core.job.flow.support.state.StepState.handle(StepState.java:60)
      at org.springframework.batch.core.job.flow.support.SimpleFlow.resume(SimpleFlow.java:144)
      at org.springframework.batch.core.job.flow.support.SimpleFlow.start(SimpleFlow.java:124)




      • #4
        Did you flush the Hibernate session in your writer? If you don't, the failure will only happen during commit, when it's too late to get back the execution context state.
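
        A minimal sketch of the kind of writer being suggested here (class, bean, and entity names are illustrative, not taken from the thread):

        import java.util.List;

        import org.springframework.batch.item.ItemWriter;
        import org.springframework.orm.hibernate3.HibernateTemplate;

        public class GameHibernateItemWriter implements ItemWriter<Game> {

            private HibernateTemplate hibernateTemplate;

            public void setHibernateTemplate(HibernateTemplate hibernateTemplate) {
                this.hibernateTemplate = hibernateTemplate;
            }

            public void write(List<? extends Game> items) throws Exception {
                for (Game game : items) {
                    hibernateTemplate.saveOrUpdate(game);
                }
                // Flush inside write() so that a DataException surfaces while the chunk
                // transaction is still active, rather than at commit time when it is too
                // late to recover the execution context state.
                hibernateTemplate.flush();
            }
        }

        (Spring Batch's own HibernateItemWriter flushes the session at the end of write() in the same way.)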


        • #5
          That did it. Thanks Dave.

          Do you then recommend flushing every time when using Hibernate with Spring Batch?

          Could I get some help on this topic here as well: http://forum.springsource.org/showthread.php?t=108409
