Heap size OutOfMemoryError when processing a flat file in Spring Batch
  • #1

    I have a file with more than 1 million records, approximately 400 MB in size. I am reading the file through a FlatFileItemReader, and I extend ItemWriter to write the items into a queue, with a commit-interval of 100. I had set the maximum heap size to 1 GB.

    But I am still getting the heap OutOfMemoryError. I also changed the commit-interval to 10, but I still get the same problem.

    I have a few doubts about the above:
    1) While reading the file through FlatFileItemReader, it reads records up to the commit-interval. Does it flush the data once it has been handed to the writer? If so, why does the process take so much memory?

    Please can someone help me with this. Thanks in advance.

  • #2
    The FlatFileItemReader streams the data, so it shouldn't be the cause of the OutOfMemoryError. We need to know more to diagnose the error. Which ItemWriter are you using? You mentioned a queue; is it JMS? If so, is the JMS broker remote?
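
    To illustrate why the reader side stays memory-safe: in chunk-oriented processing, only one commit-interval's worth of records is ever held before being handed to the writer and released. The sketch below is not Spring Batch internals, just a minimal plain-Java illustration of the idea, with hypothetical names (ChunkSketch, process, write):

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.stream.IntStream;

public class ChunkSketch {

    // Simulates the chunk-oriented loop: read up to commitInterval items,
    // hand the chunk to the writer, then drop the chunk before reading more.
    static int process(Iterator<String> reader, int commitInterval) {
        int chunksWritten = 0;
        List<String> chunk = new ArrayList<>(commitInterval);
        while (reader.hasNext()) {
            chunk.add(reader.next());
            if (chunk.size() == commitInterval || !reader.hasNext()) {
                write(chunk);   // a real writer would push to a queue or cache here
                chunk.clear();  // only one chunk is ever held in memory
                chunksWritten++;
            }
        }
        return chunksWritten;
    }

    static void write(List<String> items) {
        // stand-in for the ItemWriter
    }

    public static void main(String[] args) {
        // A streaming "reader" over 1 million records; heap use stays bounded
        // because records are never accumulated beyond one chunk.
        Iterator<String> reader =
                IntStream.range(0, 1_000_000).mapToObj(i -> "rec-" + i).iterator();
        System.out.println(process(reader, 100)); // prints 10000
    }
}
```

    So with a streaming reader the heap footprint is governed by the writer's side of the chunk, which is why the ItemWriter is the place to look.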



    • #3
      Originally posted by arno:
      The FlatFileItemReader streams the data, so it shouldn't be the cause of the OutOfMemoryError. We need to know more to diagnose the error. Which ItemWriter are you using? You mentioned a queue; is it JMS? If so, is the JMS broker remote?
      Thanks for your reply. I am extending ItemWriter and writing the data into a GemFire cache region. I have attached my code below for your reference. Please suggest whether I need to make any changes in the writer class to avoid this problem.

      Code:
      import java.util.List;
      
      import org.springframework.batch.item.ItemWriter;
      
      import com.gemstone.gemfire.cache.Cache;
      import com.gemstone.gemfire.cache.Region;
      
      public class FeedToCache implements ItemWriter<FeedData> {
      
      	Cache cache;
      	Region cacheRegion;
      
      	public Region getCacheRegion() {
      		return cacheRegion;
      	}
      
      	public void setCacheRegion(Region cacheRegion) {
      		this.cacheRegion = cacheRegion;
      		try {
      			Thread.sleep(1200);
      		} catch (InterruptedException e) {
      			e.printStackTrace();
      		}
      	}
      
      	public void setCache(Cache cache) {
      		this.cache = cache;
      		setCacheRegion(cache.getRegion("cacheRegion"));
      	}
      
      	// Each chunk of items from the step is put into the GemFire region,
      	// keyed by account number plus a random suffix.
      	public void write(List<? extends FeedData> items) throws Exception {
      		for (FeedData item : items) {
      			if (item != null && item.getAccountNumber() != null) {
      				cacheRegion.put(item.getAccountNumber() + Math.random(), item);
      			}
      		}
      	}
      }
      Last edited by harilance; May 2nd, 2011, 04:58 AM.



      • #4
        If you configured GemFire to store objects in memory, it's normal that you run out of memory: it's pretty much like putting items in a HashMap. It looks like it all depends on your GemFire configuration.
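
        To make the HashMap analogy concrete: with a purely in-memory region and no eviction, every put keeps a strong reference to the value, so nothing from the 400 MB input ever becomes eligible for garbage collection. A minimal plain-Java sketch, with no GemFire dependency and a hypothetical Feed record standing in for FeedData:

```java
import java.util.HashMap;
import java.util.Map;

public class InMemoryRegionSketch {

    // Stand-in for FeedData: roughly 400 bytes of payload per record.
    record Feed(String accountNumber, byte[] payload) {}

    // Fills an in-memory "region"; every put pins its value on the heap,
    // and nothing is ever evicted.
    static Map<String, Feed> fill(int records) {
        Map<String, Feed> region = new HashMap<>();
        for (int i = 0; i < records; i++) {
            region.put("acct-" + i, new Feed("acct-" + i, new byte[400]));
        }
        return region;
    }

    public static void main(String[] args) {
        // Scaled down from 1 million for the demo.
        Map<String, Feed> region = fill(100_000);
        // All 100,000 values are still strongly referenced:
        System.out.println(region.size()); // prints 100000
        // At ~400 bytes each plus per-entry overhead, 1 million such records
        // easily exceed a 1 GB heap, hence the OOM.
    }
}
```

        Configuring the region with eviction and overflow-to-disk (or otherwise draining entries as they arrive) keeps the resident set bounded; the exact attributes belong in the GemFire configuration.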



        • #5
          Originally posted by arno:
          If you configured GemFire to store objects in memory, it's normal that you run out of memory: it's pretty much like putting items in a HashMap. It looks like it all depends on your GemFire configuration.
          Thanks Arno, you are spot on. The problem was in the GemFire configuration. I have raised a new thread in the GemFire forum.
