
  • Job, JobParameters are not serializable?

    Hi,
    I am using Quartz's StatefulJob to prevent jobs from running concurrently. This requires the contents of the JobDataMap to implement the Serializable interface. I keep instances of JobExecution and JobParameters in the JobDataMap so that they can be retrieved easily in a Quartz listener. I found that the Job and JobParameters classes do not implement Serializable. Why not? It would help in my case.

    regards,
    Ramkris

  • #2
    I'm not sure which version of the framework you're using, but I know JobParameters is serializable. You're right that Job isn't serializable, though. I'm curious why you need the Job to be serializable. I understand the parameters, but can't you just store the name of the XML file for the job in the JobDataMap? It's really best practice to reload the context every time you run a job.
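The suggestion above can be sketched in a few lines. This is a minimal illustration only: it uses a plain java.util.Map as a stand-in for Quartz's JobDataMap, and the key names and the describeLookup helper are hypothetical, not framework API. The point is that only serializable strings go into the map; the Job itself is looked up by name after the context is reloaded.

```java
import java.util.HashMap;
import java.util.Map;

public class JobDataMapSketch {

    // Only serializable strings go into the data map, never the Job object.
    static Map<String, Object> buildDataMap(String configFile, String jobBeanName) {
        Map<String, Object> dataMap = new HashMap<String, Object>();
        dataMap.put("configLocation", configFile); // e.g. "jobs/my-job.xml" (hypothetical key)
        dataMap.put("jobName", jobBeanName);
        return dataMap;
    }

    // At execution time the context is reloaded and the Job looked up by name,
    // so the Job itself never has to survive serialization.
    static String describeLookup(Map<String, Object> dataMap) {
        return "load context from " + dataMap.get("configLocation")
                + " and fetch bean '" + dataMap.get("jobName") + "'";
    }

    public static void main(String[] args) {
        Map<String, Object> dataMap = buildDataMap("jobs/my-job.xml", "myJob");
        System.out.println(describeLookup(dataMap));
    }
}
```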

    • #3
      Originally posted by lucasward View Post
      I am using 1.0.1, and JobParameters does not implement Serializable. I am actually trying to store the JobExecution object in the JobDataMap once the job finishes, so that I can pick it up in my Quartz listener. JobExecution has a JobInstance, which has a Job in it; that causes the problem. If you made the Job reference in JobInstance transient, it would solve my problem.

      • #4
        Job is not serializable and never will be (it is a configuration object; you can always re-constitute it from its name via a JobLocator). The reference to Job in JobExecution and the Serializable marker on JobParameters may have been fixed since 1.0.1. If you need something now for JobParameters, you can always use Properties and a JobParametersConverter. I'm not sure why you would need to serialize and store a JobExecution anyway, so maybe there is a better way to store the persistent data that Quartz needs?
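The Properties suggestion can be illustrated with a small round trip. Spring Batch ships a JobParametersConverter for exactly this purpose; the sketch below is a self-contained stand-in using plain java.util.Properties, with hypothetical toProperties/fromProperties helpers rather than the framework API, to show why a Properties object is a safe, Serializable thing to put in a Quartz JobDataMap.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

public class ParametersAsProperties {

    // Flatten string-valued parameters into a Properties object,
    // which implements Serializable and survives Quartz's persistence.
    static Properties toProperties(Map<String, String> params) {
        Properties props = new Properties();
        for (Map.Entry<String, String> e : params.entrySet()) {
            props.setProperty(e.getKey(), e.getValue());
        }
        return props;
    }

    // Rebuild the parameter map on the other side of the round trip.
    static Map<String, String> fromProperties(Properties props) {
        Map<String, String> params = new HashMap<String, String>();
        for (String name : props.stringPropertyNames()) {
            params.put(name, props.getProperty(name));
        }
        return params;
    }

    public static void main(String[] args) {
        Map<String, String> params = new HashMap<String, String>();
        params.put("run.date", "2008-06-01");
        params.put("input.file", "in.csv");
        Properties props = toProperties(params);
        System.out.println("round trip ok: " + fromProperties(props).equals(params));
    }
}
```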

        • #5
          Originally posted by Dave Syer View Post
          My use case is that I send a mail from a Quartz listener once the job completes. I include in the mail how many items each step processed and the status of all the steps.
          That is the reason I want to keep the JobExecution in the Quartz JobDataMap, so that I can pick it up in the listener. Otherwise I need to query the database, which is expensive, and getLastJobExecution() in JdbcJobExecutionDao won't return all the StepExecutions of the JobExecution.

          • #6
            I'm still not clear why you need to serialize the JobExecution. A JobLauncher always returns a JobExecution, so if you want to do something with it after a job finishes, you have a reference already - there should be no need to put it in an input map for a Quartz job, unless I am missing something. How about posting your configuration?

            • #7
              Originally posted by Dave Syer View Post
              I'm still not clear why you need to serialize the JobExecution.
              OK, let me explain from the beginning. My first requirement is that I must not invoke the same job concurrently. I am implementing the StatefulJob interface provided by Quartz, which makes sure new triggers are delayed until the execute method completes. It also persists the contents of the job's JobDataMap to the database.

              My second requirement is that I need to send a mail once the job completes. I do this in the listener provided by Quartz, and I include all the details provided by the JobExecution (including the StepExecutions) in the mail I send.

              Originally posted by Dave Syer View Post
              I am actually making use of the JobExecution returned by the JobLauncher. Here is my code:

              Code:
               public class ProxyJobBean extends QuartzJobBean implements StatefulJob, ApplicationContextAware {

                   private static final Log logger = LogFactory.getLog(ProxyJobBean.class);

                   private ApplicationContext ctx;
                   private JobRegistry jobRegistry;

                   public void setApplicationContext(ApplicationContext applicationContext) {
                       this.ctx = applicationContext;
                   }

                   /**
                    * Invoked by the scheduler.
                    * @param context {@link JobExecutionContext}
                    * @throws JobExecutionException
                    */
                   public final void executeInternal(JobExecutionContext context) throws JobExecutionException {
                       String jobName = context.getJobDetail().getName();
                       this.jobRegistry = (JobRegistry) this.ctx.getBean(BatchConstants.Job_REGISTRY);
                       try {
                           // Check if the current day is a holiday; don't run jobs on holidays.
                           if (!isCurrentDayHoliday(jobName)) {
                               if (context.isRecovering()) {
                                   recoverJobs(context);
                                   return;
                               }
                               JobExecution jobExecution = runJob(context);
                               // Store the JobExecution so the Quartz listener can retrieve it later.
                               context.getJobDetail().getJobDataMap().put(BatchConstants.JOB_EXECUTION, jobExecution);
                           } else {
                               // TODO: is logging enough?
                               logger.info("Job " + jobName + " will not run on business holiday");
                           }
                       } catch (JobRestartException e) {
                           logger.error(e);
                           throw new JobExecutionException(e);
                       } catch (JobExecutionAlreadyRunningException e) {
                           logger.error(e);
                           throw new JobExecutionException(e);
                       } catch (JobInstanceAlreadyCompleteException e) {
                           logger.error(e);
                           throw new JobExecutionException(e);
                       } catch (NoSuchJobException e) {
                           logger.error(e);
                           throw new JobExecutionException(e);
                       }
                   }

                   /**
                    * Invokes the job using {@link SimpleJobLauncher}.
                    * @param context {@link JobExecutionContext}
                    * @return {@link JobExecution}
                    */
                   private JobExecution runJob(JobExecutionContext context) throws JobExecutionAlreadyRunningException,
                           JobRestartException, JobInstanceAlreadyCompleteException, NoSuchJobException {
                       SimpleJobLauncher launcher = (SimpleJobLauncher) this.ctx.getBean(JOB_LAUNCHER);
                       String jobName = context.getJobDetail().getName();
                       Job job = getJob(jobName);
                       logger.info("Launching job: [" + jobName + "]");
                       return launcher.run(job, getJobParameters(job, context));
                   }

                   // isCurrentDayHoliday, recoverJobs, getJob and getJobParameters omitted.
               }
              I populate the JobExecution into the JobDataMap once it is returned by the JobLauncher, and I retrieve it from the JobDataMap in my listener class. Here is my Quartz listener code.

              Please note that the code is not complete.

              Code:
               public class QuartzJobListener implements JobListener {

                   private Log logger = LogFactory.getLog(getClass());
                   private static Map<JobInstance, Integer> counter = new HashMap<JobInstance, Integer>(50);
                   private JobRegistry jobRegistry;
                   private JdbcJobExecutionDao jobExecutionDao;
                   private JdbcJobInstanceDao jobInstanceDao;
                   private JdbcStepExecutionDao stepExecutionDao;
                   private BatchNotificationService notificationService;
                   private Scheduler scheduler;

                   public String getName() {
                       return getClass().getSimpleName();
                   }

                   public void jobExecutionVetoed(JobExecutionContext context) {
                       String jobName = context.getJobDetail().getName();
                       logger.debug(jobName + " was vetoed and not executed");
                   }

                   public void jobToBeExecuted(JobExecutionContext context) {
                       String jobName = context.getJobDetail().getName();
                       logger.debug(jobName + " is about to be executed");
                   }

                   public void jobWasExecuted(JobExecutionContext context, JobExecutionException exception) {
                       String jobName = context.getJobDetail().getName();
                       logger.debug(jobName + " was executed");
                       // Retrieve the JobExecution stored in the JobDataMap by ProxyJobBean.
                       JobExecution jobExecution = (JobExecution) context.getJobDetail().getJobDataMap()
                               .get(BatchConstants.JOB_EXECUTION);
                       sendNotifications(jobExecution);
                   }

                   // sendNotifications and the rest of the class omitted.
               }
              Please see the usage of the JobExecution above. It works fine if I don't implement the StatefulJob interface. The problem comes only when I implement StatefulJob, because Quartz will then try to persist the contents of the JobDataMap to the database.

              I hope I have been able to clarify your doubts.

              Regards,
              ramkris

              • #8
                Why not send the e-mail from a Spring Batch listener, then (or even a Step)? Then the JobExecution won't leak out of the JobDetail.
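For the mail itself, whichever listener ends up sending it, the body can be assembled from the step names and item counts. Below is a self-contained sketch; the plain map of step name to items processed is a hypothetical stand-in for iterating the StepExecutions of a JobExecution, and the message format is invented.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class JobSummaryBuilder {

    // Build the mail body from per-step item counts and the overall exit status.
    static String buildSummary(String jobName, String exitStatus, Map<String, Integer> stepCounts) {
        StringBuilder body = new StringBuilder();
        body.append("Job ").append(jobName)
            .append(" finished with status ").append(exitStatus).append("\n");
        for (Map.Entry<String, Integer> e : stepCounts.entrySet()) {
            body.append("  step ").append(e.getKey())
                .append(": ").append(e.getValue()).append(" items\n");
        }
        return body.toString();
    }

    public static void main(String[] args) {
        Map<String, Integer> counts = new LinkedHashMap<String, Integer>();
        counts.put("loadStep", 120);
        counts.put("reportStep", 3);
        System.out.print(buildSummary("nightlyJob", "COMPLETED", counts));
    }
}
```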

                • #9
                  Originally posted by Dave Syer View Post
                  Why not send the e-mail from a Spring Batch listener, then (or even a Step)? Then the JobExecution won't leak out of the JobDetail.
                  Until m4, I was sending the mails from the afterJob() method in the JobListener provided by the batch framework. However, starting from the 1.0.0 release, the endTime is null and the exit status shows as UNKNOWN in the JobExecution. Both are very much needed when we send the mail. That is the reason I am planning to move to a Quartz listener.

                  • #10
                    I raised a similar query before, together with ramkris. Please refer to:
                    http://forum.springframework.org/sho...t=serializable

                    I'm not sure why you would need to serialize and store a JobExecution anyway
                    Job is not serializable and never will be
                    In that use case, I want Job, JobParameters and JobExecution to be serializable, so that we can write a job launcher that launches a job running in another JVM (e.g. inside a container).

                    The information I previously received from Robert was that 'JobExecution is serializable'. Is that still true? I am a little bit confused.

                    • #11
                      In the 1.1 trunk, JobParameters is serializable and JobInstance no longer has a reference to Job, only the jobName. So there shouldn't be any more serialization headaches in the 1.1 release.

                      • #12
                        I'm still not sure that you need the Job to be serializable for your use case. JobExecution and JobParameters make sense to me, since they're generated at runtime. (To answer your other question: yes, both of them are Serializable, at least in the current trunk.) However, if you want to launch in another JVM, why not just send the job name over, along with the JobParameters, and let the other JVM create the application context? It has to anyway, since it already needs to get or create a DataSource, etc.
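The "send the name and parameters over" idea can be sketched as a small Serializable request object. The class name and fields here are hypothetical, not part of Spring Batch; the round trip shows that only strings cross the JVM boundary, never the Job itself.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.Properties;

// Hypothetical request object for launching a job in another JVM.
public class JobLaunchRequest implements Serializable {

    private static final long serialVersionUID = 1L;

    private final String jobName;
    private final Properties parameters;

    public JobLaunchRequest(String jobName, Properties parameters) {
        this.jobName = jobName;
        this.parameters = parameters;
    }

    public String getJobName() { return jobName; }
    public Properties getParameters() { return parameters; }

    // Round-trip through Java serialization, as a remoting layer would.
    static JobLaunchRequest roundTrip(JobLaunchRequest request) {
        try {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            new ObjectOutputStream(bytes).writeObject(request);
            ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray()));
            return (JobLaunchRequest) in.readObject();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        Properties params = new Properties();
        params.setProperty("run.date", "2008-06-01");
        JobLaunchRequest received = roundTrip(new JobLaunchRequest("nightlyJob", params));
        System.out.println(received.getJobName() + " " + received.getParameters());
    }
}
```

The receiving JVM would look the job up by name in its own application context before running it.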

                        • #13
                          lucasward, I got your meaning, and I agree it is easy to send the jobName and JobParameters and get the job done.

                          The reason we want that is that we aim to keep the contract of the launcher unchanged:
                          1. JobExecution jobExecution = launcher.run(job, jobParameters);

                          We could of course write something like:
                          2. JobExecution jobExecution = launcherByJobName.run(jobName, jobParameters);

                          But that changes the contract. Anyway, I don't think it is a big deal, but I still want to consult the Spring Batch authors on our design thoughts. If you prefer approach 2 to approach 1, we can surely proceed like that.
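If keeping the shape of contract 1 matters, one option is a client-side adapter that accepts a Job but forwards only its name. The interfaces below are deliberately minimal hypothetical stand-ins for the real Spring Batch types, just to show the delegation shape:

```java
// Minimal hypothetical stand-ins for the framework types.
interface Job { String getName(); }

interface JobLauncherByName {
    // Returns some handle to the remote execution (a string here for simplicity).
    String run(String jobName, String jobParameters);
}

// Keeps the launcher.run(job, parameters) shape of contract 1,
// but only the job's name actually travels to the remote side.
class DelegatingJobLauncher {
    private final JobLauncherByName delegate;

    DelegatingJobLauncher(JobLauncherByName delegate) {
        this.delegate = delegate;
    }

    String run(Job job, String jobParameters) {
        return delegate.run(job.getName(), jobParameters);
    }
}

public class LauncherContractSketch {
    public static void main(String[] args) {
        // A stub standing in for the remote, by-name launcher.
        JobLauncherByName remote = new JobLauncherByName() {
            public String run(String jobName, String jobParameters) {
                return "launched " + jobName + " with [" + jobParameters + "]";
            }
        };
        DelegatingJobLauncher launcher = new DelegatingJobLauncher(remote);
        Job job = new Job() { public String getName() { return "nightlyJob"; } };
        System.out.println(launcher.run(job, "run.date=2008-06-01"));
    }
}
```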

                          • #14
                            I can understand not wanting to change the contract, but in this case I think I would. It is much preferable to leave the creation of the Job in the JVM where it will actually be run; I'm not sure how you could do it otherwise. Since a Job has Steps, which have readers and writers, it would be pretty much impossible to stamp the entire thing as serializable. For example, if your writer were a DAO that took a DataSource, your DataSource would have to be serialized as well (which is of course a bad thing).

                            • #15
                              Originally posted by lucasward View Post
                              Would Spring Batch consider adding another interface for this, like JobRunner.run(String jobName, String[] parameters)?
