Hadoop job properties from jobparameters in spring batch
  • Hadoop job properties from jobparameters in spring batch

    Hi, I am using Spring for Apache Hadoop within Spring Batch and would like to set the Hadoop job's properties from jobParameters. When I do this I get the following error.

    Truncating long message before update of StepExecution, original message is: org.springframework.beans.factory.BeanNotOfRequiredTypeException: Bean named 'wordcount-job' must be of type [org.apache.hadoop.mapreduce.Job], but was actually of type [$Proxy6]
    at BeanFactory.doGetBean( )
    at BeanFactory.getBean(
    at et.execute(
    at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(Tas
    at nTemplate.execute(
    at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInC

    Can someone point out what the error is? I am using the following configuration in my context file.

    <hdp:job id="wordcount-job" validate-paths="false" scope="step"
        input-path="#{jobParameters['inputFile']}" output-path="#{jobParameters['outputFile']}"
        mapper="org.apache.hadoop.examples.WordCount.TokenizerMapper"
        reducer="org.apache.hadoop.examples.WordCount.IntSumReducer" />

  • #2
    You need to enable class proxying (by default Batch uses JDK dynamic proxies, and since Hadoop doesn't expose many interfaces...).

    Adding something like:
        <!-- required since Job is a class not an interface -->
        <bean class="org.springframework.batch.core.scope.StepScope" p:proxyTargetClass="true"/>
    should fix this.
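
    For reference, a minimal sketch of how the two pieces fit together in one context file. The namespace declarations and the schemaLocation placeholder are illustrative assumptions, not taken from the thread; the key parts are the `StepScope` bean with `proxyTargetClass="true"` and the step-scoped `<hdp:job>`:

    ```xml
    <!-- Sketch only: namespace URIs and schemaLocation are assumed, not from the original post -->
    <beans xmlns="http://www.springframework.org/schema/beans"
           xmlns:p="http://www.springframework.org/schema/p"
           xmlns:hdp="http://www.springframework.org/schema/hadoop"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
           xsi:schemaLocation="...">

        <!-- Use CGLIB (class-based) proxies for step scope, because
             org.apache.hadoop.mapreduce.Job is a class, not an interface -->
        <bean class="org.springframework.batch.core.scope.StepScope"
              p:proxyTargetClass="true"/>

        <!-- Step-scoped job so #{jobParameters[...]} is resolved at step execution time -->
        <hdp:job id="wordcount-job" validate-paths="false" scope="step"
                 input-path="#{jobParameters['inputFile']}"
                 output-path="#{jobParameters['outputFile']}"
                 mapper="org.apache.hadoop.examples.WordCount.TokenizerMapper"
                 reducer="org.apache.hadoop.examples.WordCount.IntSumReducer"/>
    </beans>
    ```
    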


    • #3
      Thanks a lot, Costin, for the pointer. I added some more dependent libraries and it worked, sweet.


      • #4
        No worries - yet another example of why using interfaces for public APIs is, in general, a good idea.