JdbcTemplate.batchUpdate problem with inserting >50 records.

    We are using JdbcTemplate's batchUpdate method to insert records into a database table. When the number of records we try to insert in a batch gets large (>30 in our case), nothing gets inserted into the table. But the weird thing is that we caught no Spring or JDBC exceptions in our code. We turned the log up to debug level and saw only this:

    2009-02-13 08:26:36,224 DEBUG [org.springframework.jdbc.core.JdbcTemplate] SQLWarning ignored: SQL state 'null', error code '16198', message [Invalid item number(1025)]
    2009-02-13 08:26:36,224 DEBUG [org.springframework.jdbc.core.JdbcTemplate] SQLWarning ignored: SQL state 'null', error code '16198', message [Invalid item number(1025)]
    2009-02-13 08:26:36,224 DEBUG [org.springframework.transaction.support.TransactionSynchronizationManager] Retrieved value [org.springframework.jdbc.datasource.ConnectionHolder@1a55f23] for key [org.springframework.jdbc.datasource.DriverManagerDataSource@8d241b] bound to thread [WorkerThread#0[172.18.104.142:38011]]


    I think the first two lines are related to the problem we experienced, but I have no idea what they mean. Has anybody else experienced a similar problem? What's the cause? We modified our code to break the large insert into smaller batches, still using JdbcTemplate.batchUpdate, and that works (see the sketch after the code example below).
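
    As for those first two warning lines: by default JdbcTemplate only logs SQLWarnings at DEBUG, which is where the "SQLWarning ignored" entries come from. A minimal sketch of how they could be surfaced as exceptions instead, assuming the template is built from an existing DataSource; dataSource and batchSetter are placeholders, not names from our real code:

    // Sketch only: surface driver warnings as exceptions instead of
    // DEBUG-level "SQLWarning ignored" entries.
    JdbcTemplate template = new JdbcTemplate(dataSource); // dataSource assumed to exist
    template.setIgnoreWarnings(false); // warnings now raise SQLWarningException

    try {
        // batchSetter: a BatchPreparedStatementSetter like the one in the example below
        template.batchUpdate(getQuery("INSERT_ITEMS"), batchSetter);
    } catch (SQLWarningException e) {
        // The driver-reported warning (e.g. "Invalid item number(1025)",
        // error code 16198) should now show up here.
        log.error("Batch insert reported a SQL warning", e);
    }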

    More info about our environment:
    JBoss AS 4.3
    Sybase ASE 12 using the Sybase jConnect (JConn2) JDBC driver
    Our insert SQL has about 30 columns.

    Here is our code example:

    final int listSize = lineItemList.size();
    final int oid = order.getOrderId();

    int[] actualRowsAffected = getJdbcTemplate().batchUpdate(
            getQuery("INSERT_ITEMS"),
            new BatchPreparedStatementSetter() {
                public int getBatchSize() {
                    return listSize;
                }

                public void setValues(PreparedStatement ps, int i)
                        throws SQLException {
                    LineItem lineItem = lineItemList.get(i);
                    ps.setInt(1, oid);
                    ps.setString(5, lineItem.getManufacturerName());
                    ps.setString(7, lineItem.getProductDescription());
                    // more ps.set... calls; about 30 parameters are set in total
                }
            });

    int rowCount = 0;
    for (int rows : actualRowsAffected) {
        rowCount += rows;
    }
    // When the insert fails, this line logs "Inserted 0 rows into item table."
    log.info("Inserted " + rowCount + " rows into item table.");
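
    For reference, a minimal sketch of the smaller-batch workaround mentioned above, assuming the same DAO context as the example (getJdbcTemplate(), getQuery(), LineItem); the chunk size of 25 and the method name are arbitrary:

    // Sketch of the workaround: split the full list into smaller chunks
    // and run one batchUpdate per chunk. The chunk size is arbitrary.
    private static final int CHUNK_SIZE = 25;

    public int insertLineItemsInChunks(final List<LineItem> lineItemList, final int oid) {
        int totalRows = 0;
        for (int start = 0; start < lineItemList.size(); start += CHUNK_SIZE) {
            final List<LineItem> chunk = lineItemList.subList(
                    start, Math.min(start + CHUNK_SIZE, lineItemList.size()));
            int[] rowsAffected = getJdbcTemplate().batchUpdate(
                    getQuery("INSERT_ITEMS"),
                    new BatchPreparedStatementSetter() {
                        public int getBatchSize() {
                            return chunk.size();
                        }

                        public void setValues(PreparedStatement ps, int i) throws SQLException {
                            LineItem lineItem = chunk.get(i);
                            ps.setInt(1, oid);
                            ps.setString(5, lineItem.getManufacturerName());
                            ps.setString(7, lineItem.getProductDescription());
                            // remaining parameters as in the original setter
                        }
                    });
            for (int rows : rowsAffected) {
                totalRows += rows;
            }
        }
        return totalRows;
    }
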
    Last edited by binli; Feb 13th, 2009, 08:23 AM.

#2
    Originally posted by binli:
    "We are using JdbcTemplate's batchUpdate method to insert records into a database table. When the number of records we try to insert in a batch gets large (>30 in our case), nothing gets inserted into the table. But the weird thing is that we caught no Spring or JDBC exceptions in our code."
    Did you find a solution to this problem?

    I too have run into a similar issue, although I don't think the size of the data to be inserted should be a hurdle. In my case, too, the application just keeps running (hung, waiting for the batch write operation to complete) without any kind of exception, and I have no idea how to approach troubleshooting this issue (one idea is sketched below). Please share if you have any ideas. Thanks!
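
    One way to at least make the hang visible instead of silent, sketched under the assumption that you configure the JdbcTemplate yourself (dataSource and batchSetter are placeholders, and the 60-second value is arbitrary): set a query timeout so a stuck batch write aborts with an exception after a bounded wait. Whether the timeout is honored depends on the JDBC driver.

    // Sketch only: bound the wait for a stuck batch write.
    JdbcTemplate template = new JdbcTemplate(dataSource); // dataSource assumed to exist
    template.setQueryTimeout(60); // seconds; applied to statements created by this template

    try {
        template.batchUpdate(sql, batchSetter); // sql and batchSetter as in your own code
    } catch (DataAccessException e) {
        // A hung batch should now surface here as a timeout-related
        // DataAccessException instead of blocking forever.
        log.error("Batch insert did not complete within the timeout", e);
    }
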
    Last edited by shiralkarprashant; Jun 8th, 2011, 02:43 PM.
