iBatis Performance Issue using Oracle Sequence

  • iBatis Performance Issue using Oracle Sequence.

    Hi,

    Environment:
    1. Application instance location: US
    2. Database instance location: London
    3. Database: ORACLE 10g
    4. JRE 6.
    5. ORM Tool: iBatis.
Scenario 1: We inserted 1000 records in a few hundred milliseconds. These records did not have any sequence ID column in the table.
We used batch mode.

    Scenario 2: We introduced a not null column ID and in the insert query (specified in sqlMap) we retrieved the nextVal using:
    <selectKey keyProperty="id" resultClass="long">
    select ID_SEQ.NEXTVAL AS ID FROM DUAL
    </selectKey>
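For reference, a selectKey of this kind sits inside the insert mapping itself, so iBatis runs the sequence query as a separate statement immediately before each insert. A minimal sketch of what the full mapping presumably looks like (table, id, and property names here are illustrative, not the poster's actual mapping):

```xml
<insert id="insertRow" parameterClass="row">
  <!-- runs as its own SELECT before every single INSERT -->
  <selectKey keyProperty="id" resultClass="long">
    SELECT ID_SEQ.NEXTVAL AS ID FROM DUAL
  </selectKey>
  INSERT INTO SOME_TABLE (ID, NAME)
  VALUES (#id#, #name#)
</insert>
```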

Now even a 100-row insert takes more than 10 seconds. Update/Delete performance is unchanged.


In both scenarios, we are using batch mode:
--
getSqlMapClientTemplate().execute(new SqlMapClientCallback() {
    public Object doInSqlMapClient(SqlMapExecutor executor) throws SQLException {
        executor.startBatch();
        for (E row : rows) {
            executor.insert(getInsertSQLId(), row);
        }
        executor.executeBatch();
        return null;
    }
});

Suspicion: I am assuming the batch operation is being bypassed because iBatis internally fires a separate SQL statement to fetch ID_SEQ.NEXTVAL on each iteration.

Can someone verify whether this suspicion is correct, and what could be done to restore the original performance?

Has anyone used iBatis + Oracle 10g with a sequence and faced a similar problem? If not, then network latency between the two locations may be one of the culprits.

    Thanks in Advance,

  • #2
If you don't care about the id that was assigned, you can use the following insert statement:

    INSERT INTO table (id, ...) VALUES (table_seq.nextval, ...);



    • #3
      Originally posted by rience View Post
If you don't care about the id that was assigned, you can use the following insert statement:

      INSERT INTO table (id, ...) VALUES (table_seq.nextval, ...);
      Thanks for the suggestion.

However, when I try using nextval directly, like:

      <insert id="insertTestUsingSequence" parameterMap="testResult-map">
      INSERT INTO PB_PV.PV_TEST (ID, STRING, INTEGER_VAL, DOUBLE_VAL, LONG_VAL)
VALUES (PV_TEST_PERFORMANCE.NEXTVAL, ?, ?, ?, ?)
      </insert>


iBatis is not able to detect the total number of parameters; it complains about the last column.

StackTrace:
--- Cause: java.sql.SQLException: Invalid column index
at org.springframework.jdbc.support.SQLErrorCodeSQLExceptionTranslator.doTranslate(SQLErrorCodeSQLExceptionTranslator.java:224)
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:72)
at org.springframework.orm.ibatis.SqlMapClientTemplate.execute(SqlMapClientTemplate.java:212)
at org.springframework.orm.ibatis.SqlMapClientTemplate.insert(SqlMapClientTemplate.java:397)
at com.ubs.pb.primevault.dataacess.ibatis.test.TestBatchPerformanceUsingSequenceScenario$1.doInSqlMapClient(TestBatchPerformanceUsingSequenceScenario.java:23)
at org.springframework.orm.ibatis.SqlMapClientTemplate.execute(SqlMapClientTemplate.java:209)
at com.ubs.pb.primevault.dataacess.ibatis.test.TestBatchPerformanceUsingSequenceScenario.insertEntities(TestBatchPerformanceUsingSequenceScenario.java:17)
at com.ubs.pb.primevault.dataacess.ibatis.test.AbstractTestPerformanceScenario.run(AbstractTestPerformanceScenario.java:45)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
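For what it's worth, "Invalid column index" in this situation usually means the parameterMap still lists a mapping for the ID column even though the SQL now has only four '?' placeholders (the sequence supplies the fifth value). A parameterMap matching the four placeholders would look roughly like this (the property names are guesses, not the poster's actual map):

```xml
<parameterMap id="testResult-map" class="testResult">
  <!-- one <parameter> per '?' placeholder, in order; no entry for ID -->
  <parameter property="string"/>
  <parameter property="integerVal"/>
  <parameter property="doubleVal"/>
  <parameter property="longVal"/>
</parameterMap>
```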



      • #4
What does your 'testResult-map' XML look like?



        • #5
          Got past the error. My mistake.

However, there is no difference in performance after implementing the suggestion.

          Amit



          • #6
Here's another one. If the number of records to be inserted is constant, then maybe setting the INCREMENT option on the sequence to the batch size will solve the problem. Then:

1) Query the sequence once to get the first value
2) The sequence will be incremented by, say, 1000
3) You now have 1000 numbers that you can assign to entities.
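The block-allocation idea above can be sketched in plain Java. Assuming an ascending sequence with INCREMENT BY equal to the batch size, one NEXTVAL call yields a value v, and the ids v, v+1, ..., v+batchSize-1 can be handed out locally. This is a sketch of the arithmetic only; fetching v still costs one real sequence query:

```java
import java.util.ArrayList;
import java.util.List;

public class SequenceBlock {
    /** Returns batchSize consecutive ids starting at the fetched sequence value. */
    static List<Long> allocateIds(long firstVal, int batchSize) {
        List<Long> ids = new ArrayList<Long>();
        for (int i = 0; i < batchSize; i++) {
            // safe if INCREMENT BY == batchSize: the next NEXTVAL call
            // returns firstVal + batchSize, so this block cannot collide
            ids.add(firstVal + i);
        }
        return ids;
    }

    public static void main(String[] args) {
        // e.g. one SELECT ID_SEQ.NEXTVAL returned 5000 for a 1000-row batch
        List<Long> ids = allocateIds(5000L, 1000);
        System.out.println(ids.get(0) + " " + ids.get(999)); // 5000 5999
    }
}
```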



            • #7
              Originally posted by rience View Post
Here's another one. If the number of records to be inserted is constant, then maybe setting the INCREMENT option on the sequence to the batch size will solve the problem. Then:

1) Query the sequence once to get the first value
2) The sequence will be incremented by, say, 1000
3) You now have 1000 numbers that you can assign to entities.
              Thanks rience. I tried this suggestion and here are the results.

              100 Rows insert:

1) Using the sequence as selectKey (or directly in the insert statement): 34610 ms.

2) As suggested, using the sequence for the first call and then manually incrementing for each insert: 34923 ms.

3) Using direct JDBC to fetch the IDs separately and then inserting: 17891 ms.

              All the scenarios are in the batch mode.
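Those numbers look latency-dominated. A rough sanity check, assuming each row pays two synchronous US-to-London round trips (one for NEXTVAL, one for the immediately flushed insert) at an assumed ~170 ms per round trip:

```java
public class LatencyEstimate {
    /** Estimated wall time if every row pays its own network round trips. */
    static long estimateMs(int rows, int roundTripsPerRow, long latencyMs) {
        return (long) rows * roundTripsPerRow * latencyMs;
    }

    public static void main(String[] args) {
        // 100 rows x 2 statements x ~170 ms = 34000 ms,
        // close to the measured 34610 ms above
        System.out.println(estimateMs(100, 2, 170)); // 34000
    }
}
```

The 170 ms figure is an assumption; the point is only that per-row round trips, not database work, would dominate at transatlantic distances.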

              Amit



              • #8
                Originally posted by rience View Post
Here's another one. If the number of records to be inserted is constant, then maybe setting the INCREMENT option on the sequence to the batch size will solve the problem. Then:

1) Query the sequence once to get the first value
2) The sequence will be incremented by, say, 1000
3) You now have 1000 numbers that you can assign to entities.
However, if I do the same thing using plain JDBC, performance improves by about a factor of six.

Time taken to insert 100 records: 2656 ms.



                • #9
I haven't used iBatis recently, so I may be wrong. But looking at the Javadoc for iBatis and SqlMapExecutor.startBatch():


                  public void startBatch()
                  throws java.sql.SQLException

Starts a batch in which update statements will be cached before being sent to the database all at once. This can improve overall performance of updates when dealing with numerous updates (e.g. inserting 1:M related data).
I don't know why it mentions only update statements (or maybe by "update" they mean insert as well), but it seems like only updates will be executed in batch mode.



                  • #10
                    Originally posted by rience View Post
I haven't used iBatis recently, so I may be wrong. But looking at the Javadoc for iBatis and SqlMapExecutor.startBatch():



I don't know why it mentions only update statements (or maybe by "update" they mean insert as well), but it seems like only updates will be executed in batch mode.
I found this in the iBatis Javadoc for insert().

                    Executes a mapped SQL INSERT statement. Insert is a bit different from other update methods, as it provides facilities for returning the primary key of the newly inserted row (rather than the effected rows). This functionality is of course optional.

                    How do we make it optional so that batching is not broken?



                    • #11
Here is the comment from Spring's SqlMapClientTemplate.execute() callback method:

                      // We always needs to use a SqlMapSession, as we need to pass a Spring-managed
                      // Connection (potentially transactional) in. This shouldn't be necessary if
                      // we run against a TransactionAwareDataSourceProxy underneath, but unfortunately
                      // we still need it to make iBATIS batch execution work properly: If iBATIS
                      // doesn't recognize an existing transaction, it automatically executes the
                      // batch for every single statement...

Now the issue is that

SqlMapSession session = this.sqlMapClient.openSession();

opens a new session even if this call is within a transaction started by

getSqlMapClientTemplate().getSqlMapClient().startTransaction();

As a result of this new session,

ibatisCon = session.getCurrentConnection();

returns null.

Also, when I debug further, inside (iBatis) SqlMapSessionImpl.getCurrentConnection():

Transaction trans = delegate.getTransaction(sessionScope);

is not able to find the transaction, which effectively voids the batching.
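One thing worth trying (a sketch only; whether it helps depends on how Spring and iBatis share the connection): demarcate the transaction and the batch through the same SqlMapClient, so that the delegate can find the transaction when getCurrentConnection() is called. Roughly:

```java
// Sketch: start the transaction on the SqlMapClient itself so that
// startBatch()/executeBatch() run inside a transaction iBatis recognizes.
SqlMapClient client = getSqlMapClientTemplate().getSqlMapClient();
try {
    client.startTransaction();
    client.startBatch();
    for (E row : rows) {
        client.insert(getInsertSQLId(), row);
    }
    client.executeBatch();
    client.commitTransaction();
} finally {
    client.endTransaction();
}
```

Note this bypasses Spring's transaction management, so it is only a diagnostic step, not necessarily the final fix.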


Could someone help me with this? I'm not sure I explained it clearly, but I can elaborate if needed.

                      Amit.



                      • #12
Could you post the latest sqlmap.xml contents?

