Configuring a job for online system

  • #16
    Hi Chudak,

    I have configured my application according to your posting. My job processes about 600k records per run, loading a fixed-length flat file of about 300 MB.

    After the 4th run, I got this out-of-memory error:

    Code:
    java.lang.OutOfMemoryError: Java heap space
        at org.apache.catalina.loader.WebappClassLoader.findResourceInternal(WebappClassLoader.java:2053)
        at org.apache.catalina.loader.WebappClassLoader.findResource(WebappClassLoader.java:934)
        at org.apache.catalina.loader.WebappClassLoader.getResource(WebappClassLoader.java:1069)
        at org.springframework.core.io.ClassPathResource.getURL(ClassPathResource.java:159)
        at org.springframework.core.io.ClassPathResource.getFile(ClassPathResource.java:174)
        at org.springframework.core.io.AbstractResource.exists(AbstractResource.java:51)
        at org.springframework.batch.core.resource.StepExecutionResourceProxy.exists(StepExecutionResourceProxy.java:112)
        at org.springframework.batch.item.file.FlatFileItemReader.doOpen(FlatFileItemReader.java:226)
        at org.springframework.batch.item.support.AbstractBufferedItemReaderItemStream.open(AbstractBufferedItemReaderItemStream.java:154)
        at org.springframework.batch.item.support.CompositeItemStream.open(CompositeItemStream.java:103)
        at org.springframework.batch.core.step.item.ItemOrientedStep.open(ItemOrientedStep.java:462)
        at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:167)
        at org.springframework.batch.core.job.SimpleJob.execute(SimpleJob.java:100)
        at org.springframework.batch.core.configuration.support.ClassPathXmlApplicationContextJobFactory$ContextClosingJob.execute(ClassPathXmlApplicationContextJobFactory.java:107)
        at org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:86)
        at java.lang.Thread.run(Thread.java:619)
    I have started my Tomcat server with -Xmx1024m
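    One quick sanity check (my suggestion, not from the thread) is to log the heap ceiling the JVM actually picked up, since a misapplied `-Xmx` would explain an early blow-up; this plain-Java sketch could run from a startup listener or servlet:

```java
public class HeapCheck {
    public static void main(String[] args) {
        // Runtime.maxMemory() reports the -Xmx ceiling the JVM is actually running with.
        long maxBytes = Runtime.getRuntime().maxMemory();
        long totalBytes = Runtime.getRuntime().totalMemory();
        long freeBytes = Runtime.getRuntime().freeMemory();
        System.out.println("max=" + (maxBytes >> 20) + "MB"
                + " total=" + (totalBytes >> 20) + "MB"
                + " used=" + ((totalBytes - freeBytes) >> 20) + "MB");
    }
}
```

    With `-Xmx1024m` the first figure should report roughly 1024 MB (some collectors subtract a survivor space, so slightly less is normal).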

    If I run the job from the command line (in a separate JVM), the process is fine for any number of runs.

    Do you have any advice on where the issue is?

    Thank You

    Vito



    • #17
      But when you run from the command line, it only processes ONE run each time you run it, correct (versus multiple runs inside the Tomcat container)?



      • #18
        Hi Chudak,

        I run 1 job each time in the Tomcat container too. I am using some beans from the parent context for data access within a step. Is there a way to check whether the child context is really destroyed?
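        One generic way to answer that question (a sketch of mine, not Spring-specific) is to hold the suspect object in a `WeakReference` after dropping your own strong reference: if anything else still pins the context, the reference never clears.

```java
import java.lang.ref.WeakReference;

public class ContextLeakProbe {
    public static void main(String[] args) {
        // Stand-in for the child ApplicationContext; any object works for the probe.
        Object childContext = new Object();
        WeakReference<Object> probe = new WeakReference<>(childContext);

        childContext = null; // drop the last strong reference, as closing the context should

        // If something (a registry, a listener, a ThreadLocal) still holds the
        // context, the weak reference will never clear -- that is the leak.
        for (int i = 0; i < 50 && probe.get() != null; i++) {
            System.gc();
        }
        System.out.println(probe.get() == null
                ? "context was garbage-collected"
                : "context is still strongly reachable -> possible leak");
    }
}
```

        A heap dump pointing at whoever holds the reference is the follow-up step if the probe never clears.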

        Thanks

        Vito



        • #19
          Originally posted by Vito Limandibhrata View Post
          Hi Chudak,

          I run 1 job each time in the Tomcat container too. I am using some beans from the parent context for data access within a step. Is there a way to check whether the child context is really destroyed?

          Thanks

          Vito
          Right, but the JVM is destroyed after you run one job from the command line, and it isn't when you run the jobs through Tomcat.

          Here's the thing: if you set your context up like I suggested, there should only be a handful of beans in the child context. Running your job a half dozen times shouldn't create enough extra objects to blow the heap. Sounds like you may have a memory leak. I'd suggest that you run a profiler against your application and see where the objects are coming from...
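          Before a full profiler session, a cheap first step (my suggestion, not from the thread) is to log heap occupancy after each job run and a forced GC; a figure that climbs monotonically across runs points at a leak, while a flat one points elsewhere:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class HeapTrend {
    // Call this after each job run (e.g. from the launching servlet) and
    // compare the numbers across runs; a monotonic climb suggests a leak.
    static long usedHeapAfterGc() {
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
        memory.gc(); // best-effort full collection, equivalent to System.gc()
        MemoryUsage heap = memory.getHeapMemoryUsage();
        return heap.getUsed();
    }

    public static void main(String[] args) {
        System.out.println("used heap after GC: " + (usedHeapAfterGc() >> 20) + " MB");
    }
}
```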



          • #20
            BTW, what version of Spring Batch are you using?



            • #21
              Hi Chudak,

              I am using Spring Batch 1.1.2.
              My biggest question is how to clean up the 'prototype' data beans used by org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper.

              Here is my config

              Code:
              <bean id="dataSupplierFieldSetMapper"
                    class="org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper">
                  <property name="prototypeBeanName" value="dataSupplier" />
              </bean>

              <bean id="dataSupplier"
                    class="com.company.DataSupplier"
                    scope="prototype" />

              <bean id="dataSupplierFixedLengthLineTokenizer"
                    class="org.springframework.batch.item.file.transform.FixedLengthTokenizer">
                  <property name="names" value="cobolSupplierCode,name,oracleSupplierSiteCode,title,firstName,middleName,lastName,department,areaCode,contactPhone,email,oracleOperatingUnitName,eol" />
                  <property name="columns" value="1-6,7-31,32-51,52-71,72-91,92-111,112-131,132-151,152-171,172-191,192-241,242-291,292-1500" />
              </bean>
              I will run the profiler and hope I get lucky and solve this.

              Cheers

              Vito



              • #22
                Are you suggesting there is a memory leak? If so, we would like to have a JIRA issue so we can track it.

                By the way, your dataSupplier bean doesn't need to be a bean definition at all, because BeanWrapperFieldSetMapper can create beans with a default constructor by itself.
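                If the release you're on exposes BeanWrapperFieldSetMapper's targetType property (later versions do; worth verifying against 1.1.2), the prototype bean disappears from the config entirely. A sketch of what that could look like:

```xml
<bean id="dataSupplierFieldSetMapper"
      class="org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper">
    <!-- The mapper instantiates com.company.DataSupplier itself via its
         default constructor; no separate prototype bean definition needed. -->
    <property name="targetType" value="com.company.DataSupplier" />
</bean>
```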



                • #23
                  I have created a JIRA entry for this. http://jira.springframework.org/browse/BATCH-940
