  • Problem with large org.hibernate.impl.SessionFactoryImpl

    I have a problem with an OutOfMemoryError in a Struts-Spring-Hibernate application. I added some regular background tasks to an existing web application. Not knowing about Spring's built-in support for scheduled tasks, I provided an implementation myself. Here is what I did:

    1) Added a listener to web.xml (ApplicationStartUpListener)
    2) Implemented ApplicationStartUpListener as followed
    Code:
    public void contextInitialized(ServletContextEvent servletContextEvent) {
      // setup some quartz stuff
      ServletContext servletContext = servletContextEvent.getServletContext();
      ApplicationContext appContext = WebApplicationContextUtils.getWebApplicationContext(servletContext);

      // pass appContext to quartz job
    }

    3) The job is using appContext to retrieve a service and work with it (load some data, update some data)
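
    For completeness, step 1 corresponds to a listener entry in web.xml, roughly like the sketch below (the listener's package and exact class name are assumed here; they are not shown in the post):

```xml
<!-- Hypothetical web.xml entry for step 1; the actual package/class
     name of the listener is not given in the post. -->
<listener>
  <listener-class>com.foo.bar.ApplicationStartUpListener</listener-class>
</listener>
```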


    The service uses a DAO to interact with the database. The DAO extends HibernateDaoSupport. To me that looks just like all the service calls made by the existing Struts actions, so I did not expect any trouble here. At first sight everything was working fine, but after a while we encountered OOM exceptions from time to time.

    Analysing the heap dump showed that org.hibernate.impl.SessionFactoryImpl was holding 95% of all allocated memory, most of it in org.hibernate.engine.query.QueryPlanCache. At the time of the OOM the SessionFactoryImpl itself was held by a Quartz WorkerThread.

    Since I am making no calls to any DAO directly, I am a little confused why there is a SessionFactoryImpl in my job. But as I am new to Spring and Hibernate this might be OK? I am also confused about the very large QueryPlanCache.

    Any help would be appreciated.

  • #2
    My guess: wrong transaction management when using Quartz. Post your configuration.



    • #3
      Here is my configuration (concerning transaction management):
      Code:
          <aop:config>
              <aop:pointcut id="defaultServiceOperation"
                            expression="execution(* com.foo.bar.services.*Service.*(..))"/>
              <aop:advisor pointcut-ref="defaultServiceOperation" advice-ref="defaultTxAdvice"/>
          </aop:config>

          <tx:advice id="defaultTxAdvice">
              <tx:attributes>
                  <tx:method name="get*" read-only="true"/>
                  <tx:method name="find*" read-only="true"/>
                  <tx:method name="*"/>
              </tx:attributes>
          </tx:advice>

          <bean id="transactionManager" class="org.springframework.orm.hibernate3.HibernateTransactionManager">
              <property name="sessionFactory"><ref local="sessionFactory"/></property>
          </bean>

          <bean id="UserManagement" class="com.foo.bar.services.UserManagementService">
              <property name="userDAO"><ref local="UserDao"/></property>
          </bean>
      Thanks for your help.



      • #4
        Well I was actually more interested in the quartz stuff and what those quartz jobs do.

        It should be a small effort to convert your configuration into a Spring-based one.
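
        As a rough illustration of what such a conversion could look like with Spring 2.x's Quartz support (bean ids and the targetMethod name are made up for this example; MethodInvokingJobDetailFactoryBean, SimpleTriggerBean and SchedulerFactoryBean are the real Spring helper classes):

```xml
<!-- Sketch of a Spring-managed Quartz setup; bean ids and the
     "processUsers" method name are illustrative, not from the post. -->
<bean id="userJobDetail" class="org.springframework.scheduling.quartz.MethodInvokingJobDetailFactoryBean">
    <property name="targetObject" ref="UserManagement"/>
    <property name="targetMethod" value="processUsers"/>
</bean>

<bean id="userTrigger" class="org.springframework.scheduling.quartz.SimpleTriggerBean">
    <property name="jobDetail" ref="userJobDetail"/>
    <property name="repeatInterval" value="60000"/> <!-- every minute -->
</bean>

<bean class="org.springframework.scheduling.quartz.SchedulerFactoryBean">
    <property name="triggers">
        <list><ref bean="userTrigger"/></list>
    </property>
</bean>
```

        With this in place the scheduler lifecycle is tied to the application context, so the manual contextInitialized/contextDestroyed wiring goes away.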



        • #5
          Ah, ok. The quartz stuff is really simple:

          Code:
              SchedulerFactory schedFact = new org.quartz.impl.StdSchedulerFactory();
              sched = schedFact.getScheduler();
              sched.start();
          
              Trigger trigger = TriggerUtils.makeMinutelyTrigger(period);
              trigger.setStartTime(new Date());
              trigger.setName("myTrigger");
          
              JobDetail jobDetail = new JobDetail("myJob", Scheduler.DEFAULT_GROUP, regularService);
              if (params != null) {
                for (Object key : params.keySet()) {
                  jobDetail.getJobDataMap().put(key, params.get(key));
                }
              }
              
              sched.scheduleJob(jobDetail, trigger);
          On contextDestroyed Event of the web app:
          Code:
             sched.shutdown(false);
          The job itself is quite simple too:
          Code:
              ApplicationContext appContext = (ApplicationContext) context.getJobDetail().getJobDataMap().get(APP_CONTEXT_KEY);
              UserService userService = (UserService) appContext.getBean(UserService.BEAN_NAME);
              User user = userService.loadUserById(userId);
              // update some user data...
              userService.update(user);
          When I discovered that the Quartz support is already available in Spring, I did think about refactoring to use it. But since I have about 10 regular background processes, I first wanted to check whether that would really solve the problem.



          • #6
            Could you also post your DAO implementation? Is this some kind of batch job, iterating over a list and doing things? Also, is the UserService the UserManagementService mentioned earlier, and is it advised by the transaction pointcut?



            • #7
              Yes, UserService = UserManagementService. I was trying to simplify the example and made a mistake there. The tx:advice applies to this service.

              Yes, the job is a batch job loading lots of entities and updating them if required.

              Here is the DAO Implementation:
              Code:
              public class UserDAO extends HibernateDaoSupport {
              
                public User findUserByUserId(Long userId) {
                  Session session = null;
                  try {
                    session = this.getSession();
                    return (User)session.get(User.class, userId);
                  } finally {
                    this.releaseSession(session);
                  }
                }
              
                public User updateUser(User user) {
                  return (User)getHibernateTemplate().merge(user);
                }
              }



              • #8
                Yes, the job is a batch job loading lots of entities and updating them if required.
                I think that is part of the issue and also the code you use to retrieve the user.

                Code:
                public User findUserByUserId(Long userId) {
                    Session session = null;
                    try {
                      session = this.getSession();
                      return (User)session.get(User.class, userId);
                    } finally {
                      this.releaseSession(session);
                    }
                  }
                This tends to open a new session outside the scope of Spring's resource management. I suggest using the SessionFactory directly instead of HibernateTemplate/HibernateDaoSupport (it isn't recommended any more).

                Code:
                public class UserDAO {

                  private SessionFactory sf;

                  public User findUserByUserId(Long userId) {
                    return (User) sf.getCurrentSession().get(User.class, userId);
                  }

                  public void update(User user) {
                    sf.getCurrentSession().merge(user);
                  }

                  public void batchProcessingMethod() {
                    // iterate over users {
                    //   get user
                    //   update user
                    // }
                  }
                }
                Also, in your sample two sessions are used, one for the retrieve and one for the update; for performance, usability etc. I suggest using one session instead. If it is batch processing you might want to consider using a single session for the whole run.

                You might want to consider moving the actual code to a service method.

                Code:
                User user = userService.loadUserById(userId);
                // update some user data...
                userService.update(user);
                This way everything uses a single session (if programmed correctly) and a single transaction instead of 2 transactions per user. As stated, you might want to do some manual transaction management and commit every x rows, then clear the session (to prevent OOM exceptions).

                Code:
                  public void batchProcessingMethod() {
                    //iterate over users {
                      //if no transaction start tx
                      //get user
                      //update user
                      // 10 users processed, commit transaction, clear session to prevent oom
                    // }
                  }
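
                The commit-every-N idea above can be sketched self-contained. Tx and Session here are hypothetical stand-ins for Spring's PlatformTransactionManager and Hibernate's Session; only the chunking logic itself is the point:

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkedBatch {

    // Hypothetical stand-ins for PlatformTransactionManager and Hibernate's Session.
    interface Tx { void begin(); void commit(); }
    interface Session { void clearFirstLevelCache(); }

    // Processes userIds in chunks, committing and clearing the session every
    // chunkSize users so managed entities never accumulate. Returns the number
    // of commits performed.
    static int process(List<Long> userIds, int chunkSize, Tx tx, Session session) {
        int commits = 0;
        int inChunk = 0;
        tx.begin();
        for (Long id : userIds) {
            // load user, update user if required ... (omitted)
            inChunk++;
            if (inChunk == chunkSize) {
                tx.commit();                     // flush pending updates to the database
                session.clearFirstLevelCache();  // evict managed entities -> prevents OOM
                commits++;
                inChunk = 0;
                tx.begin();
            }
        }
        tx.commit();  // commit the final (possibly partial) chunk
        commits++;
        return commits;
    }

    public static void main(String[] args) {
        List<Long> ids = new ArrayList<>();
        for (long i = 0; i < 25; i++) ids.add(i);
        Tx tx = new Tx() { public void begin() {} public void commit() {} };
        Session s = new Session() { public void clearFirstLevelCache() {} };
        System.out.println(process(ids, 10, tx, s)); // prints 3
    }
}
```

                With 25 users and a chunk size of 10, the loop commits after user 10 and user 20 and once more at the end, so the first-level cache is bounded by the chunk size rather than by the total number of users.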



                • #9
                  Thank you for your answer. However I am having some questions:

                  Originally posted by Marten Deinum:
                  I think that is part of the issue and also the code you use to retrieve the user. [...] I suggest using the SessionFactory directly instead of HibernateTemplate/HibernateDaoSupport (it isn't recommended any more).
                  We were encountering some problems (which I cannot remember, to be honest) when we were not releasing the session in our DAO. If we use the SessionFactory directly, is this not needed any more?

                  Originally posted by Marten Deinum:
                  Also in your sample 2 sessions are used, one for retrieve and one for update [...] This way everything uses a single session (if programmed correctly) and a single transaction instead of 2 transactions per user.
                  That's what I was planning to do next. It should be sufficient to turn the job into a service, add transaction handling to it and inject the dependent services, right?

                  Originally posted by Marten Deinum:
                  As stated you might want to do some manual transaction management and commit x rows and then clear the session (to prevent OOM exceptions).
                  Maybe my first post was not precise enough about the OOM. We don't have so much data that we face this problem during one run of a job. We have several jobs (about 8) which run very often (some every minute). It seems the QueryPlanCache retained by the Hibernate SessionFactory just keeps growing with each run. I am wondering if there is any chance of clearing that cache manually. Actually, most of the time the jobs just read data and find out that no update is needed.

                  When you talk about "commit transaction, clear session", do you mean I can do this manually? Or are you talking about calling a transactional method for every 10 users, letting Spring commit the transaction and clear the cache after the transaction is finished?



                  • #10
                    We were encountering some problems (which I cannot remember, to be honest) when we were not releasing the session in our DAO. If we use the SessionFactory directly, is this not needed any more?
                    If you use the correct approach you don't need it. The correct approach is to use getCurrentSession() on the SessionFactory and have proper transactions in place. That combination saves you from manual session management.

                    That's what I was planning to do next. It should be sufficient to turn the job into a service, add transaction handling to it and inject the dependent services, right?
                    More or less; that would make the retrieval and update at least one transaction and one session, already cutting the problem in half, so to say.

                    When you talk about "commit transaction, clear session", do you mean I can do this manually? Or are you talking about calling a transactional method for every 10 users, letting Spring commit the transaction and clear the cache after the transaction is finished?
                    In batch processing I tend to use manual transaction management instead of declarative, as this gives you more control, especially over when to commit (process x users, then commit).

                    Or are you talking about calling a transactional method for every 10 users, letting Spring commit the transaction and clear the cache after the transaction is finished?
                    It is all manual, because you want to control it; commit and clear are done by you.



                    • #11
                      Problem solved

                      Just to let you know: it was a Hibernate bug.

                      See http://opensource.atlassian.com/proj...rowse/HHH-2470

                      Applied the patch and the system is now running for over two weeks without any OOM problems.
