  • Batch Restart is not accessing all the values stored in Job/StepExecutionContext

    Hi Spring Batch team,

    In our job, we have three steps configured:

    Step 1: stores 10 key/value pairs in the ExecutionContext.
    Step 2: processes the business logic using the key/value pairs stored in Step 1.
    Step 3: writes the records to the output file.

    The problem we are facing is that whatever values we store in the ExecutionContext are removed whenever Step 2 is restarted.
    Looking at the Job_Execution_Context table, a new entry is created for every 2 key/value pairs.
    When Step 2 is restarted, the execution context holds only the first entry with the first 2 key/value pairs; the remaining key/value pairs are removed from the Job/StepExecutionContext.
    I tried the same thing using ExecutionContextPromotionListener as described in the Spring documentation, and it behaves the same way. I even tried storing the values in the StepExecutionContext of Step 2, but the values are still lost. Is there any workaround to get all the values stored in the Job/StepExecutionContext back when restarting the same step?
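    For reference, a simplified sketch of the kind of configuration involved (bean ids, tasklet refs, and key names here are placeholders for illustration, not our actual job), with the ExecutionContextPromotionListener attached to Step 1 as the Spring Batch documentation describes:

    ```xml
    <!-- Sketch only: bean names and keys are placeholders. -->
    <batch:job id="sampleJob">
        <batch:step id="step1" next="step2">
            <batch:tasklet ref="contextLoadingTasklet"/>
            <batch:listeners>
                <!-- Promotes the listed keys from the StepExecutionContext
                     of step1 into the JobExecutionContext after the step ends. -->
                <bean class="org.springframework.batch.core.listener.ExecutionContextPromotionListener">
                    <property name="keys" value="key1,key2,key3"/>
                </bean>
            </batch:listeners>
        </batch:step>
        <batch:step id="step2" next="step3">
            <batch:tasklet ref="businessLogicTasklet"/>
        </batch:step>
        <batch:step id="step3">
            <batch:tasklet ref="fileWritingTasklet"/>
        </batch:step>
    </batch:job>
    ```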

    Any help would be appreciated.

    Thanks,
    Lavanya

  • #2
    I'm a colleague of Lavanya, the author of the original post, and we have been testing this issue again this afternoon. It seems to be isolated to the serialization of the ExecutionContext when the Job fails. I've seen a number of posts about conflicts between Thoughtworks XStream and Jettison that caused similar issues. Could this be it?

    When the ExecutionContext is stored in the Spring Batch execution context table, it stores the first two map values and then has another "entry" for additional values in the ExecutionContext. Every two entries in the ExecutionContext cause another "entry" in the serialized version of the ExecutionContext. When the job restarts, it reads only the first two items in the ExecutionContext; the rest is gone and is not available on restart.

    We've also tried storing maps within maps. During the first run of the job, the ExecutionContext seems intact. Upon restart, though, the deserialization of the map within the ExecutionContext throws exceptions from the parsing of the XStream components.
    This is causing us some concern in our production environment as we're afraid that restarts may not work correctly. Any help would be greatly appreciated.
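    One way to see the split entries directly is to inspect the serialized context in the Spring Batch metadata tables. This query is a sketch, assuming the default BATCH_ table prefix and the Spring Batch 2.x schema (column names may differ in other versions; the execution id is hypothetical):

    ```sql
    -- Inspect how the job execution context was serialized for one run.
    SELECT JOB_EXECUTION_ID, SHORT_CONTEXT, SERIALIZED_CONTEXT
    FROM BATCH_JOB_EXECUTION_CONTEXT
    WHERE JOB_EXECUTION_ID = 42;
    ```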



    • #3
      Can you please post your job configuration?



      • #4
        We've just figured it out. It was almost certainly tied to the version of the jettison.jar that is bundled with JBoss. This was in our classpath and was throwing the serialization off. I'm not sure of the specific version, but when we added com.springsource.org.codehaus.jettison-1.0.0.jar or com.springsource.org.codehaus.jettison-1.0.1.jar to our classpath, the serialization worked correctly. We had been using com.springsource.com.thoughtworks.xstream-1.3.0.jar with com.springsource.org.codehaus.jettison-1.0.0.jar. We see there is a Jettison 1.0.1, but 1.0.0 seemed to serialize fine too. Does anyone know if 1.0.1 is the better choice? It has a fix, but I can't find what the specific fix was.
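        If you manage the classpath with Maven rather than dropping jars in by hand, pinning the working versions explicitly might look like the fragment below. This is a sketch assuming the SpringSource bundle repository's naming convention (groupId is the original package, artifactId is the bundle name); verify the coordinates against your own repository before use.

        ```xml
        <!-- Sketch: pin XStream and Jettison so the JBoss-bundled
             jettison.jar cannot shadow them on the classpath. -->
        <dependency>
            <groupId>com.thoughtworks.xstream</groupId>
            <artifactId>com.springsource.com.thoughtworks.xstream</artifactId>
            <version>1.3.0</version>
        </dependency>
        <dependency>
            <groupId>org.codehaus.jettison</groupId>
            <artifactId>com.springsource.org.codehaus.jettison</artifactId>
            <version>1.0.1</version>
        </dependency>
        ```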
