Problem using SimpleAsyncTaskExecutor

  • Problem using SimpleAsyncTaskExecutor


    I was using Spring Batch 1.1.4, with a SimpleAsyncTaskExecutor configured as the taskExecutor of a SimpleStepFactoryBean.

    The throttleLimit was configured for 10 threads.

    My definition was as below:

    <bean id="processItemInvoiceStages" parent="simpleStep">
        <property name="taskExecutor">
            <bean class="org.springframework.core.task.SimpleAsyncTaskExecutor" />
        </property>
        <property name="itemReader" ref="processItemStageReader" />
        <property name="itemWriter" ref="rulesWriter" />
        <property name="commitInterval" value="1" />
    </bean>

    I set the commitInterval to 1, but I guess it should be 10 as well.

    The problem is that when I run jstack -l <pid> against my Java process,
    nine threads are WAITING and only one is RUNNABLE. My process was probably slow because of that behavior.
    My item writer inserts records into an Oracle database table.

    Does anyone know what the problem could be? Thank you.

    The output from jstack is (for a WAITING thread):

    "SimpleAsyncTaskExecutor-1819" prio=10 tid=0x5db7f000 nid=0x6e8e waiting on condition [0x5d5fe000..0x5d5ff030]
    java.lang.Thread.State: WAITING (parking)
    at sun.misc.Unsafe.park(Native Method)
    - parking to wait for <0x6463d148> (a java.util.concurrent.Semaphore$NonfairSync)
    at java.util.concurrent.locks.LockSupport.park(LockSupport.java)
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java)
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java)
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java)
    at java.util.concurrent.Semaphore.acquire(Semaphore.java:286)
    at org.springframework.batch.core.step.JdkConcurrentStepExecutionSynchronizer.lock(JdkConcurrentStepExecutionSynchronizer.java)
    at org.springframework.batch.core.step.item.ItemOrientedStep$2.doInIteration(ItemOrientedStep.java)
    at org.springframework.batch.repeat.support.TaskExecutorRepeatTemplate$…

    Locked ownable synchronizers:
    - None


    And this is for the RUNNABLE thread:

    "SimpleAsyncTaskExecutor-1810" prio=10 tid=0x5c40c400 nid=0x63bd runnable [0x5cfad000..0x5cfadfb0]
    java.lang.Thread.State: RUNNABLE
    at … (Native Method)
    at … (seven more frames; class names truncated in the copy)
    at oracle.jdbc.driver.T4CMAREngine.unmarshalUB1(T4CMAREngine.java)
    at oracle.jdbc.driver.T4CMAREngine.unmarshalSB1(T4CMAREngine.java)
    at oracle.jdbc.driver.T4C8Oall.receive(T4C8Oall.java:478)
    at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java)
    at oracle.jdbc.driver.T4CPreparedStatement.executeForRows(T4CPreparedStatement.java)
    at oracle.jdbc.driver.OraclePreparedStatement.executeBatch(OraclePreparedStatement.java)
    - locked <0x6ffd72b0> (a oracle.jdbc.driver.T4CPreparedStatement)
    - locked <0x6fa7a2d8> (a oracle.jdbc.driver.T4CConnection)
    at org.apache.commons.dbcp.DelegatingStatement.executeBatch(DelegatingStatement.java)
    at org.apache.commons.dbcp.DelegatingStatement.executeBatch(DelegatingStatement.java)
    at org.hibernate.jdbc.BatchingBatcher.doExecuteBatch(BatchingBatcher.java)
    at org.hibernate.jdbc.BatchingBatcher.addToBatch(BatchingBatcher.java)
    at org.hibernate.persister.entity.AbstractEntityPersister.insert(AbstractEntityPersister.java)
    at org.hibernate.persister.entity.AbstractEntityPersister.insert(AbstractEntityPersister.java)
    at org.hibernate.action.EntityInsertAction.execute(EntityInsertAction.java)
    at org.hibernate.engine.ActionQueue.execute(ActionQueue.java)
    at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java)
    at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java)
    at org.hibernate.event.def.AbstractFlushingEventListener.performExecutions(AbstractFlushingEventListener.java)
    at org.hibernate.event.def.DefaultFlushEventListener.onFlush(DefaultFlushEventListener.java)
    at org.hibernate.impl.SessionImpl.flush(SessionImpl.java:1000)
    at org.hibernate.impl.SessionImpl.managedFlush(SessionImpl.java)
    at org.hibernate.transaction.JDBCTransaction.commit(JDBCTransaction.java)
    at org.springframework.orm.hibernate3.HibernateTransactionManager.doCommit(HibernateTransactionManager.java:655)
    at org.springframework.transaction.support.AbstractPlatformTransactionManager.processCommit(AbstractPlatformTransactionManager.java)
    at org.springframework.transaction.support.AbstractPlatformTransactionManager.commit(AbstractPlatformTransactionManager.java)
    at org.springframework.batch.core.step.item.ItemOrientedStep$2.doInIteration(ItemOrientedStep.java)
    at org.springframework.batch.repeat.support.TaskExecutorRepeatTemplate$…
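The WAITING frames above show nine threads parked on a Semaphore. As a rough illustration of why a single-permit guard leaves only one thread making progress at a time, here is a minimal sketch; the class and method names are hypothetical, not Spring Batch's actual code:

```java
import java.util.concurrent.Semaphore;
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of the pattern visible in the WAITING stack trace: a single-permit
// Semaphore serializes a section, so with 10 worker threads at most one is
// ever inside it -- the rest park, exactly as jstack showed
// (Semaphore$NonfairSync, LockSupport.park).
public class ThrottleDemo {

    public static int run(int threads) throws InterruptedException {
        Semaphore stepLock = new Semaphore(1);         // one permit, mutex-like
        AtomicInteger inside = new AtomicInteger();    // threads currently in the section
        AtomicInteger maxInside = new AtomicInteger(); // high-water mark

        Thread[] workers = new Thread[threads];
        for (int i = 0; i < threads; i++) {
            workers[i] = new Thread(() -> {
                for (int j = 0; j < 200; j++) {
                    try {
                        stepLock.acquire();            // other threads WAIT here
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                        return;
                    }
                    try {
                        int now = inside.incrementAndGet();
                        maxInside.accumulateAndGet(now, Math::max);
                        inside.decrementAndGet();
                    } finally {
                        stepLock.release();
                    }
                }
            });
            workers[i].start();
        }
        for (Thread w : workers) {
            w.join();
        }
        return maxInside.get();
    }

    public static void main(String[] args) throws InterruptedException {
        // prints 1: the guarded section never runs concurrently
        System.out.println("max concurrency inside guarded section: " + run(10));
    }
}
```

With one permit, the measured maximum concurrency inside the guarded section is 1 no matter how many threads are started, which matches the jstack picture of nine WAITING threads and one RUNNABLE.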

  • #2
    With Batch 1.1 (up to 2.1) you have to be careful to make sure your DataSource connection pool has enough connections to service all the concurrent processing. If your Step is the only client of the DataSource at that time it would need 10 or more connections.
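For example, with Commons DBCP (which the poster's stack trace shows in use), a pool sized for ten concurrent step threads might look like the sketch below; the property names are DBCP's BasicDataSource, and the sizing values are illustrative, not a recommendation:

```xml
<!-- Sketch: pool sized so 10 concurrent step threads never starve -->
<bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
    <property name="driverClassName" value="${batch.jdbc.driver}" />
    <property name="url" value="${batch.jdbc.url}" />
    <property name="username" value="${batch.jdbc.user}" />
    <property name="password" value="${batch.jdbc.password}" />
    <!-- at least the throttle limit (10), with headroom for other clients -->
    <property name="maxActive" value="30" />
</bean>
```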


    • #3
      Hi Dave,

      My datasource configuration is:

      <bean id="dataSource" class="com.cpqd.billing.ebppbatch.consolidation.datasource.DataSourceWrapper">
          <property name="driverClassName" value="${batch.jdbc.driver}" />
          <property name="url" value="${batch.jdbc.url}" />
          <property name="username" value="${batch.jdbc.user}" />
          <property name="password" value="${batch.jdbc.password}" />
          <property name="initialSize" value="30" />
          <property name="poolPreparedStatements" value="true"/>
          <property name="maxActive" value="30" />
          <property name="clientName" value="${}" />
          <property name="databaseProperties" value="etc/batch-${environment}.properties" />
      </bean>

      I have 30 active connections configured.

      I guess the problem could be the commitInterval, because it was configured as 1, but I guess it should be 10 as well.
      The other threads that are WAITING
      are waiting for the commit of the other thread.

      If someone has a suggestion for this problem I would be happy.


      • #4
        There is no connection between the commit interval and the concurrency settings (throttle limit, thread pool size, connection pool size).

        Your RUNNABLE thread is busy communicating with the database server. Do you think it is hung? (Nothing in the stack you posted suggests a problem.) Maybe the server has a session limit or something?


        • #5
          Hi Dave,

          I was using Hibernate to insert a large number of records into a database table (about 18,000,000 records or more).
          I was using hibernate-3.2.3, and my hibernate.cfg.xml was configured with:

          <property name="hibernate.jdbc.batch_size">1000</property>

          Do you think this could be a Hibernate performance problem on session flush (I was not using a stateless session)?
          Should I use plain JDBC to insert the records instead?

          thank you.
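As a rough worked example of why the commit interval matters at this scale (assuming, as in chunk-oriented processing, one transaction commit per chunk):

```java
// Rough worked example: with chunk-oriented processing, each chunk ends in a
// transaction commit, so commitInterval directly sets how many commits a job
// of N items needs. Numbers below use the 18,000,000 records from the post.
public class CommitMath {

    // Number of chunk transactions needed to process `records` items when
    // each transaction commits `commitInterval` items (ceiling division).
    static long chunks(long records, long commitInterval) {
        return (records + commitInterval - 1) / commitInterval;
    }

    public static void main(String[] args) {
        long records = 18_000_000L;
        System.out.println(chunks(records, 1));    // 18000000 commits
        System.out.println(chunks(records, 10));   // 1800000 commits
        System.out.println(chunks(records, 1000)); // 18000 commits
    }
}
```

At commitInterval=1 the job pays the per-transaction overhead 18 million times, and in this thread's setup every one of those commits also passes through the serialized step-execution update that the other nine threads are queued behind.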


          • #6
            I have the same problem. I don't know why it appears; there are many unknowns.
            Last edited by ltm0807; Sep 14th, 2010, 09:48 PM.


            • #7
              If your job works with the default task executor, then there can be no issue with Hibernate or batch sizes there. Was the process merely slow?

              Just re-read the original post. If commitInterval=1 then you get no benefit from the async task executor because you spend all your time waiting for a synchronized update to the step execution. What about using a larger commit interval?
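In Spring Batch 1.x terms, that suggestion might look like the sketch below, based on the bean definition from the original post; the commitInterval value of 100 is illustrative, and throttleLimit is assumed to be exposed by the simpleStep parent (SimpleStepFactoryBean):

```xml
<bean id="processItemInvoiceStages" parent="simpleStep">
    <property name="taskExecutor">
        <bean class="org.springframework.core.task.SimpleAsyncTaskExecutor" />
    </property>
    <property name="itemReader" ref="processItemStageReader" />
    <property name="itemWriter" ref="rulesWriter" />
    <!-- larger chunks: commit every 100 items instead of every item -->
    <property name="commitInterval" value="100" />
    <property name="throttleLimit" value="10" />
</bean>
```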


              • #8
                Hi Dave,

                Actually, the threads that were WAITING are waiting for the commit of all the inserted data (I guess the transaction is opened by Spring Batch).
                I will try opening a new transaction (REQUIRES_NEW) in my itemWriter (calling it in the write method) to force a commit for each thread.

                I will post a new message if it works.

                Thank you


                • #9
                  "What about using a larger commit interval?"
                  That is the right solution if you want parallel processing to do anything useful.
                  Last edited by Dave Syer; Sep 17th, 2010, 02:05 AM. Reason: copy-paste error