calling a job within a tasklet of another job

  • calling a job within a tasklet of another job

    I need to call a job from a tasklet of another job.

    Job A, Step A:
    Tasklet A: gets all the filenames within a folder and, in a loop, instantiates a job (Job B) with a filepath parameter.
    Job B: processes each file.

    But while launching Job B for the first time, I get this exception: "Existing transaction detected in JobRepository. Please fix this and try again (e.g. remove @Transactional annotations from client)."
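For context on the error in the post above: the JobRepository refuses to be called from within an existing transaction, and a tasklet body runs inside its step's transaction. One commonly suggested alternative, if your Spring Batch version provides it, is to run Job B as a step of Job A via JobStep, letting the framework manage the repository interaction itself. A minimal wiring sketch, assuming beans named jobB, jobLauncher and jobRepository already exist elsewhere (illustrative only, not a drop-in fix):

```java
import org.springframework.batch.core.step.job.JobStep;

// Sketch: run Job B as a step of Job A instead of launching it from
// inside Tasklet A's transaction. All bean names here are assumptions.
JobStep runJobB = new JobStep();
runJobB.setName("runJobB");
runJobB.setJob(jobB);                    // the file-processing job
runJobB.setJobLauncher(jobLauncher);     // e.g. a SimpleJobLauncher
runJobB.setJobRepository(jobRepository); // the shared job repository
// a JobParametersExtractor could be set here to pass the filepath parameter
runJobB.afterPropertiesSet();
```

This moves the "launch Job B" responsibility out of the tasklet's transactional scope, which is what the exception message is complaining about.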

  • #2
    Why not have Job A --> Step A --> reader (reads the file names in) --> processor (an itemProcessor processes each file) --> writer (if necessary)?

    Why do you need two jobs?

    Just curious.

    Jeff
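The reader side of this suggestion can be illustrated outside Spring Batch entirely. A plain-Java sketch of treating each file in a directory as one item (the class and method names here are purely illustrative, not from any post above):

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the "each file is an item" idea: a plain directory
// listing produces the item stream that a reader would hand out one by one.
public class FileLister {
    public static List<File> listFiles(File dir) {
        List<File> items = new ArrayList<File>();
        File[] entries = dir.listFiles();
        if (entries != null) {
            for (File f : entries) {
                if (f.isFile()) {
                    items.add(f); // each regular file becomes one item
                }
            }
        }
        return items;
    }
}
```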


    • #3
      Call another batch job in a shell

      Originally posted by visualjeff
      Why not have Job A --> Step A --> reader (reads the file names in) --> processor (an itemProcessor processes each file) --> writer (if necessary)?

      Why do you need two jobs?

      Just curious.

      Jeff
      The answer to Jeff is that processing a file is itself a batch job, because the processing can be quite complicated: reading a file, processing and transforming each line, then creating an entry in a db, for instance.

      I had a similar need, and the only way I could find to do it was to have the writer create a new shell and execute another batch job in it:

      Runtime rt = Runtime.getRuntime();
      String command = "";  // assumes osName, jarFile and jobName are set elsewhere
      if (osName.toLowerCase().contains("windows")) {
          command += "cmd /c ";
      }
      command += "java -jar " + jarFile
          + " launch-context-export.xml " + jobName; // .... etc
      Process pr = rt.exec(command);
      int exitVal = pr.waitFor();

      Is there a better way to do it?

      I tried using the SimpleJobLauncher instead of a shell but I also got the error IllegalStateException - existing transaction in repository.

      I would have thought that this is quite a common scenario and had hoped for better support from Spring Batch.
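On the "is there a better way" question for the shell launch itself: java.lang.ProcessBuilder takes the command as a list of arguments, which avoids hand-splicing a single command string (and the "cmd /c" prefix and quoting issues). A sketch with placeholder jar, context and job names:

```java
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

// Sketch: launch a child batch JVM with ProcessBuilder instead of
// Runtime.exec on a concatenated string. All argument values are
// placeholders, not real artifact names.
public class ChildJobRunner {
    public static int run(String jarFile, String contextXml, String jobName)
            throws IOException, InterruptedException {
        List<String> cmd = new ArrayList<String>();
        cmd.add("java");
        cmd.add("-jar");
        cmd.add(jarFile);
        cmd.add(contextXml);
        cmd.add(jobName);
        ProcessBuilder pb = new ProcessBuilder(cmd);
        pb.inheritIO();      // forward the child's console output (Java 7+)
        Process p = pb.start();
        return p.waitFor();  // exit code of the child batch JVM
    }
}
```

Arguments with spaces need no manual quoting this way, and the exit code can still be checked exactly as in the Runtime.exec version.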



      • #4
        Forking a JVM may not be necessary for complex batch processing. But if you have to, check out SystemCommandTasklet.

        Here is a suggestion: think of each file as an item and break the job out over a number of steps.

        Step 1:
        itemReader reads the files within a directory
        itemWriter writes a flat file listing every file that needs to be processed (a snapshot of the directory's current state)
        Step 2:
        itemReader reads the flat file listing the files to be processed
        itemProcessor processes the files (could even be a compositeItemProcessor)
        itemWriter writes records to the datasource

        Jeff
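The two-step layout above can be sketched in plain Java, leaving out the Spring Batch reader/writer interfaces: Step 1 snapshots the directory listing into a flat file, and Step 2 reads that stable listing back. Class and file names here are illustrative only:

```java
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.ArrayList;
import java.util.List;

// Sketch of the flat-file "snapshot" idea: Step 1 writes the current
// directory contents to a listing file; Step 2 iterates that listing,
// so files added later do not disturb the run in progress.
public class Snapshot {
    // Step 1: one absolute path per line
    public static void write(File dir, File listing) throws IOException {
        PrintWriter out = new PrintWriter(new FileWriter(listing));
        try {
            File[] entries = dir.listFiles();
            if (entries != null) {
                for (File f : entries) {
                    if (f.isFile()) {
                        out.println(f.getAbsolutePath());
                    }
                }
            }
        } finally {
            out.close();
        }
    }

    // Step 2: read the listing back as the work queue
    public static List<String> read(File listing) throws IOException {
        List<String> names = new ArrayList<String>();
        BufferedReader in = new BufferedReader(new FileReader(listing));
        try {
            String line;
            while ((line = in.readLine()) != null) {
                names.add(line);
            }
        } finally {
            in.close();
        }
        return names;
    }
}
```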


        • #5
          A file is not an item

          Originally posted by visualjeff
          Forking a JVM may not be necessary for complex batch processing. But if you have to, check out SystemCommandTasklet.

          Here is a suggestion: think of each file as an item and break the job out over a number of steps.

          Step 1:
          itemReader reads the files within a directory
          itemWriter writes a flat file listing every file that needs to be processed (a snapshot of the directory's current state)
          Step 2:
          itemReader reads the flat file listing the files to be processed
          itemProcessor processes the files (could even be a compositeItemProcessor)
          itemWriter writes records to the datasource

          Jeff
          Hi Jeff,
          I think it is a bit of a stretch to use a compositeItemProcessor, which, if I understand correctly, is intended to apply separate transformations to the same item, to convert the input item (a file containing many lines = items) into n output items.

          I suppose that it could be used, but it seems to me that it throws away all the advantages of processing a single file as a batch job, with each item handled as a repeated step.

          In my specific case I already had a working batch job (actually one processing a group of db records to produce a flat file, rather than the other way around), so when I needed to process n groups, it seemed logical to create a job that performed the query to get the list of groups of db items and then called the existing batch job to process each group and produce a file for each.

          A more logical solution would be the possibility of specifying a batch job as an item processor.

          Another practical, but less than optimal, solution is for the parent job to use an itemWriter that creates a shell script in which each line is a call to a child batch executable with the appropriate parameters, and to execute this file on termination of the first batch.

          regards,
          Philip
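The shell-script idea above amounts to a writer that emits one launch command per input file. A minimal plain-Java sketch (the jar name, job name and parameter key are placeholders; "launch-context-export.xml" follows the earlier post):

```java
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.List;

// Sketch: a "writer" that emits a shell script with one child-batch
// launch command per input file. Jar/job/parameter names are assumptions.
public class ScriptWriter {
    public static void write(File script, String jarFile, List<String> files)
            throws IOException {
        PrintWriter out = new PrintWriter(new FileWriter(script));
        try {
            for (String f : files) {
                out.println("java -jar " + jarFile
                        + " launch-context-export.xml jobB inputFile=" + f);
            }
        } finally {
            out.close();
        }
    }
}
```

The parent job then executes the generated script once, so each child launch happens in its own JVM, outside any transaction of the parent.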
