
  • Duplicate entries in table from JdbcBatchItemWriter

    Hi,

    I am using JdbcBatchItemWriter to write my records from a CSV file into a DB, and also into a file, after validating. I have a
    CompositeItemWriter to write to the two destinations (a sketch of that kind of configuration is below).

    In the XML configuration file, the JdbcBatchItemWriter is configured with an INSERT query.
    It worked fine until I realized it was inserting the same records multiple times. It is a plain INSERT SQL, not an UPDATE on any criteria.
    For 50,000 records, the table was almost swamped by the duplicate entries.
    My CSV file has 50,000 unique records.

    To fix the duplicate entries, I added a unique key constraint, and I started getting a 'not skippable' UNIQUE_KEY_CONSTRAINT error.

    Before that I was using a JdbcItemWriter with update-then-insert SQL (if the update fails, the insert happens), but for better performance
    I tried JdbcBatchItemWriter.

    I am not sure why this is happening or what the fix for the duplicate entries is.
    I want unique insertions and don't understand how it is inserting the same records many times.
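
    For reference, here is a minimal sketch of the kind of configuration described above, assuming the Spring Batch 2.x XML namespace. The bean names, table name, and column names are illustrative guesses, not taken from the actual configuration:

    <!-- Minimal sketch; bean, table, and column names are illustrative only. -->
    <bean id="compositeWriter"
          class="org.springframework.batch.item.support.CompositeItemWriter">
        <property name="delegates">
            <list>
                <ref bean="dbWriter"/>
                <ref bean="fileWriter"/>
            </list>
        </property>
    </bean>

    <bean id="dbWriter"
          class="org.springframework.batch.item.database.JdbcBatchItemWriter">
        <property name="dataSource" ref="dataSource"/>
        <!-- Plain INSERT, as described in the post: no update criteria. -->
        <property name="sql"
                  value="INSERT INTO records (id, name) VALUES (:id, :name)"/>
        <property name="itemSqlParameterSourceProvider">
            <bean class="org.springframework.batch.item.database.BeanPropertyItemSqlParameterSourceProvider"/>
        </property>
    </bean>

    <!-- Note: a FlatFileItemWriter wrapped in a CompositeItemWriter usually
         also needs to be registered as a stream on the step's tasklet. -->
    <bean id="fileWriter"
          class="org.springframework.batch.item.file.FlatFileItemWriter">
        <property name="resource" value="file:output/records.out"/>
        <property name="lineAggregator">
            <bean class="org.springframework.batch.item.file.transform.PassThroughLineAggregator"/>
        </property>
    </bean>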

  • #2
    My exception:
    org.springframework.batch.retry.ExhaustedRetryException: Retry exhausted after last attempt in recovery path, but exception is not skippable.; nested exception is org.springframework.dao.DuplicateKeyException: PreparedStatementCallback; SQL [
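
    The "not skippable" part of that message means the step's skip policy does not cover DuplicateKeyException. As an illustration only (this is not from the actual job definition), a skippable exception is declared in the Spring Batch 2.x XML namespace roughly like this; the step, reader, and writer names and the limits are assumptions:

    <!-- Minimal sketch; ids, bean names, and limits are illustrative only. -->
    <batch:step id="loadStep">
        <batch:tasklet>
            <batch:chunk reader="csvReader" writer="compositeWriter"
                         commit-interval="100" skip-limit="10">
                <batch:skippable-exception-classes>
                    <batch:include class="org.springframework.dao.DuplicateKeyException"/>
                </batch:skippable-exception-classes>
            </batch:chunk>
        </batch:tasklet>
    </batch:step>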





    • #3
      Do you have a configuration or unit test we can take a look at?
