
  • problem with duplicates inserted when using getSimpleJdbcTemplate().batchUpdate()

    Hi all,

    I'm running into a problem where my records get inserted twice when I use getSimpleJdbcTemplate().batchUpdate() inside my ItemWriter and one of the records in the commit fails to insert.

    For example, if I have 10 records in my batch update and 1 of them is going to fail, the successful 9 will be inserted twice.

    It seems that even though one item in the batch insert fails, the other 9 are still committed; batchUpdate() nevertheless returns with an exception. Spring Batch (the BatchRetryTemplate, I think) catches that exception and retries by inserting each record with a chunk size of 1, in an effort to find the one that caused the error, hence producing the duplicates. Is there a way to stop that from happening?
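
    For context, the duplicates can only survive if the chunk rollback never reaches the work batchUpdate() did. Two common causes are the writer's template sitting on a different DataSource than the step's transaction manager, and a non-transactional table (e.g. MyISAM in MySQL). A minimal wiring sketch, with made-up names and an in-memory HSQLDB standing in for the real database:

    Code:
    	import javax.sql.DataSource;
    	import org.springframework.jdbc.core.simple.SimpleJdbcTemplate;
    	import org.springframework.jdbc.datasource.DataSourceTransactionManager;
    	import org.springframework.jdbc.datasource.DriverManagerDataSource;
    	
    	public class WiringSketch {
    	
    		public static void main(String[] args) {
    			// One DataSource for everything; driver and URL are placeholders.
    			DriverManagerDataSource dataSource = new DriverManagerDataSource();
    			dataSource.setDriverClassName("org.hsqldb.jdbcDriver");
    			dataSource.setUrl("jdbc:hsqldb:mem:batchdb");
    	
    			// The step's transaction manager must wrap the SAME DataSource
    			// the writer's template uses; otherwise the chunk rollback
    			// cannot undo a partially applied batchUpdate().
    			DataSourceTransactionManager txManager =
    					new DataSourceTransactionManager(dataSource);
    			SimpleJdbcTemplate template = new SimpleJdbcTemplate(dataSource);
    	
    			// txManager is then set on the step; template (or its
    			// DataSource) is injected into the ItemWriter.
    		}
    	}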

    Anyway, here is my item writer...

    Code:
    	public void write(List<? extends DAARecord> daaRecords) {

    		if (daaRecords.isEmpty()) { return; }

    		SqlParameterSource[] sqlParamsList = new SqlParameterSource[daaRecords.size()];

    		// Assign a freshly generated key to each record and map its bean
    		// properties to named SQL parameters.
    		for (int i = 0; i < daaRecords.size(); i++) {
    			DAARecord daaRecord = daaRecords.get(i);
    			long id = getSimpleJdbcTemplate().queryForLong(SQL_KEY_INCREMENTER);
    			daaRecord.setId(id);
    			sqlParamsList[i] = new BeanPropertySqlParameterSource(daaRecord);
    		}

    		// One round trip for the whole chunk.
    		int[] updateCounts = getSimpleJdbcTemplate().batchUpdate(sql, sqlParamsList);
    	}
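
    As an aside, the queryForLong(SQL_KEY_INCREMENTER) call inside the loop costs one extra round trip per record. Spring's DataFieldMaxValueIncrementer abstraction covers exactly this; a sketch, assuming an Oracle sequence named DAA_SEQ (pick the implementation that matches your database):

    Code:
    	import javax.sql.DataSource;
    	import org.springframework.jdbc.support.incrementer.DataFieldMaxValueIncrementer;
    	import org.springframework.jdbc.support.incrementer.OracleSequenceMaxValueIncrementer;
    	
    	public class KeySketch {
    	
    		private final DataFieldMaxValueIncrementer incrementer;
    	
    		public KeySketch(DataSource dataSource) {
    			// "DAA_SEQ" is a made-up sequence name; other implementations
    			// (e.g. MySQLMaxValueIncrementer) exist for other databases.
    			this.incrementer = new OracleSequenceMaxValueIncrementer(dataSource, "DAA_SEQ");
    		}
    	
    		public long nextId() {
    			// Replaces the per-row queryForLong(SQL_KEY_INCREMENTER) call.
    			return incrementer.nextLongValue();
    		}
    	}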

  • #2
    You are probably facing the same issue as ours:
    http://forum.springsource.org/showth...t=85781&page=2

    When an error occurs in the writer, it seems that Spring Batch calls the write method with the wrong argument when it tries to reprocess the chunk one item at a time.
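
    One way to verify that (a sketch only): wrap the real writer in a delegate that logs exactly what each write() call receives, then compare the scanned chunks against the original one:

    Code:
    	import java.util.List;
    	import org.springframework.batch.item.ItemWriter;
    	
    	// Hypothetical helper: logs every chunk handed to write(), so you can
    	// see what Spring Batch passes in during the one-by-one scan.
    	public class LoggingItemWriter<T> implements ItemWriter<T> {
    	
    		private final ItemWriter<T> delegate;
    	
    		public LoggingItemWriter(ItemWriter<T> delegate) {
    			this.delegate = delegate;
    		}
    	
    		public void write(List<? extends T> items) throws Exception {
    			System.out.println("write() got " + items.size() + " item(s): " + items);
    			delegate.write(items);
    		}
    	}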



    • #3
      Thanks for the heads-up. I'll check my data in detail to see whether it is the wrong arguments being passed, and I'll keep tabs on that thread.

      Based on one of your messages, though... the expected behaviour is...

      We're OK with the reprocessing. What we're not OK with is the writer not being called anymore. Here is what we expect:

      Chunk with 5 items:
      * 5 read, 5 processed, exception in the writer -> rollback
      * 1 read, 1 processed, 1 written -> commit
      * 1 read, 1 processed, 1 written -> commit
      * 1 read, 1 processed, 1 written -> commit
      * 1 read, 1 processed, exception -> rollback
      * 1 read, 1 processed, 1 written -> commit
      * Job FAILED
      So I think where it is causing an issue for me is that the first rollback is not happening. My records are still committed, so when it goes in 1 by 1, I get duplicates on all of them except the one that causes the exception.
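
      For what it's worth, that 1-by-1 scan is what a fault-tolerant step does when an exception is marked skippable. A sketch using the Java builder API from later Spring Batch releases (the XML config of this era expresses the same thing); names here are made up:

      Code:
      	import org.springframework.batch.core.Step;
      	import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
      	import org.springframework.batch.item.ItemReader;
      	import org.springframework.batch.item.ItemWriter;
      	import org.springframework.dao.DataIntegrityViolationException;
      	
      	public class StepSketch {
      	
      		// commit-interval 5, with the failing insert skippable: this is
      		// the kind of configuration that produces the read/write pattern
      		// in the list above.
      		public Step daaStep(StepBuilderFactory steps,
      				ItemReader<DAARecord> reader,
      				ItemWriter<DAARecord> writer) {
      			return steps.get("daaStep")
      					.<DAARecord, DAARecord>chunk(5)
      					.reader(reader)
      					.writer(writer)
      					.faultTolerant()
      					.skip(DataIntegrityViolationException.class)
      					.skipLimit(10)
      					.build();
      		}
      	}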

      So, 1) are you guys using batchUpdate() as well and getting the rollback? I don't think it is an issue if I use insert(), but that is much slower.

      And 2) is there a way to stop the 1-by-1 reprocessing while still throwing an exception?

      If I catch the exception instead of letting it propagate, the metadata in the Spring Batch tables gets out of sync because it thinks I have written everything.
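
      On question 2, a sketch under the same made-up names as above: if the step is simply not fault tolerant, a writer exception rolls the chunk back and fails the step right away, with no 1-by-1 scan, and the metadata stays honest because the exception still propagates. Of course this only avoids the duplicates once the rollback itself works.

      Code:
      	// Same builder as above, minus faultTolerant(): fail fast, no scan.
      	public Step daaStepFailFast(StepBuilderFactory steps,
      			ItemReader<DAARecord> reader,
      			ItemWriter<DAARecord> writer) {
      		return steps.get("daaStep")
      				.<DAARecord, DAARecord>chunk(5)
      				.reader(reader)
      				.writer(writer)
      				.build();
      	}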
