  • javax.security.auth.login.LoginException: unable to find LoginModule class

    Hi,

    I have been unable to invoke Mahout Jobs from Spring Hadoop.

    Code:
    	<hdp:tool-runner id="mahout-tool"
    		tool-class="org.apache.mahout.cf.taste.hadoop.item.RecommenderJob" configuration-ref="hadoopConfiguration">
    		<hdp:arg value="-s SIMILARITY_LOGLIKELIHOOD" />
    		<hdp:arg value="-i user_item_pref.csv" />
    		<hdp:arg value="-o recommendations" />
    	</hdp:tool-runner>
    When invoking a Mahout Job through the tool-runner, I get the following exception:

    Code:
    Caused by: java.lang.RuntimeException: java.io.IOException: failure to login
    	at org.apache.hadoop.mapreduce.JobContext.<init>(JobContext.java:82)
    	at org.apache.hadoop.mapreduce.Job.<init>(Job.java:50)
    	at org.apache.mahout.common.HadoopUtil.prepareJob(HadoopUtil.java:125)
    	at org.apache.mahout.common.AbstractJob.prepareJob(AbstractJob.java:470)
    	at org.apache.mahout.cf.taste.hadoop.preparation.PreparePreferenceMatrixJob.run(PreparePreferenceMatrixJob.java:74)
    	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    	at org.apache.mahout.cf.taste.hadoop.item.RecommenderJob.run(RecommenderJob.java:153)
    	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    	at org.springframework.data.hadoop.mapreduce.ToolExecutor.runTool(ToolExecutor.java:89)
    	at org.springframework.data.hadoop.mapreduce.ToolRunner.getObject(ToolRunner.java:39)
    	at org.springframework.data.hadoop.mapreduce.ToolRunner.getObject(ToolRunner.java:31)
    	at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.doGetObjectFromFactoryBean(FactoryBeanRegistrySupport.java:142)
    	... 56 more
    Caused by: java.io.IOException: failure to login
    	at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:433)
    	at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:395)
    	at org.apache.hadoop.mapreduce.JobContext.<init>(JobContext.java:80)
    	... 67 more
    Caused by: javax.security.auth.login.LoginException: unable to find LoginModule class: org/apache/hadoop/security/UserGroupInformation$HadoopLoginModule
    	at javax.security.auth.login.LoginContext.invoke(LoginContext.java:808)
    	at javax.security.auth.login.LoginContext.access$000(LoginContext.java:186)
    	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:683)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
    	at javax.security.auth.login.LoginContext.login(LoginContext.java:579)
    	at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:414)
    	... 69 more
    I wonder whether this issue is caused by the way Spring Hadoop uses class loaders to load the LoginModule. Hive had a similar issue:

    https://issues.apache.org/jira/browse/HADOOP-7982

    Thanks,
    Thomas
    Last edited by thomasvdv; Sep 18th, 2012, 12:11 PM.

  • #2
    Unless you're using the jar attribute (available in TRUNK), SHDP doesn't do anything fancy around class-loaders or threads.
    Can you confirm which version of SHDP you are using?

    As for the Hive report, I've looked at it but it seemed to have something to do with libhdfs and JNI code.

    I'll try to run the example myself and report back.

    • #3
      I was wondering if this snippet of code could be the culprit of the class loading issue I am facing.

      From ToolExecutor (1.0.0.M2):

      Code:
      		Thread th = Thread.currentThread();
      		ClassLoader oldTccl = th.getContextClassLoader();
      
      		final Tool ft = t;
      
      		try {
      			th.setContextClassLoader(cl);
      
      			if (StringUtils.hasText(user)) {
      				UserGroupInformation ugi = UserGroupInformation.createProxyUser(user, UserGroupInformation.getLoginUser());
      				return ugi.doAs(new PrivilegedExceptionAction<Integer>() {
      
      					@Override
      					public Integer run() throws Exception {
      						return org.apache.hadoop.util.ToolRunner.run(cfg, ft, arguments);
      					}
      				});
      			}

      • #4
        I don't think so - we manage the TCCL since otherwise the Hadoop configuration doesn't get passed properly.
        In master we've done several improvements to this mechanism and introduced the jar functionality (for both tool and as a stand-alone element) - can you please try it out and report back?
        See http://www.springsource.org/spring-data/hadoop#maven
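
        For readers following along: the swap-and-restore pattern being discussed — set the thread context class loader (TCCL) before handing control to Hadoop, then restore it — can be sketched as below. This is a minimal illustration only, not SHDP's actual code; the class and helper names (`TcclDemo`, `runWithClassLoader`) are made up for the example.

```java
import java.util.concurrent.Callable;

public class TcclDemo {

    // Run a task with a specific context class loader, always restoring
    // the previous one afterwards (even if the task throws).
    static <T> T runWithClassLoader(ClassLoader cl, Callable<T> task) throws Exception {
        Thread th = Thread.currentThread();
        ClassLoader oldTccl = th.getContextClassLoader();
        try {
            th.setContextClassLoader(cl); // Hadoop resolves classes and config via the TCCL
            return task.call();
        } finally {
            th.setContextClassLoader(oldTccl); // restore, so later code sees the original TCCL
        }
    }

    public static void main(String[] args) throws Exception {
        String result = runWithClassLoader(
                ClassLoader.getSystemClassLoader(),
                () -> Thread.currentThread().getContextClassLoader().getClass().getName());
        System.out.println("ran under: " + result);
    }
}
```

        The point of the `finally` block is that a failed login (as in the trace above) must not leave a foreign class loader installed on the thread.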

        • #5
          I tried SNAPSHOT and still get the same error.

          Thomas

          • #6
            Thanks for the feedback - I'll try to replicate your problem tomorrow (got swamped with some pig/hive updates).
            Can you add your config (in case there's more to what you posted) and give some info about your environment? What hadoop distro are you using, in what config (pseudo or not - are you using Kerberos?), what OS and JVM?

            Thanks

            • #7
              I am using DataStax, no Kerberos, Ubuntu 12.04. I haven't had a chance to run this on a standard Hadoop distro, but it would be great if you could give it a try so I can narrow it down to a potential DataStax issue. Thanks!

              • #8
                I've tried reproducing the error, but I couldn't get hold of the user_item_prefs.csv. I ran the following config against intro.csv (from the Mahout src dist), which returned an error since its format doesn't match the expected one; however, it proves the job runs inside the cluster.

                The config is available at https://gist.github.com/3776128

                P.S. Your config contained a small, annoying bug: when specifying the input path (-i), don't add an extra space, as it gets read as part of the path ("-ipath" vs "-i path").
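
                To illustrate the whitespace pitfall, here is one defensive rewrite of the declaration quoted in the first post, passing each flag and its value as separate `hdp:arg` entries so that no stray whitespace can end up inside a single argument. This is a sketch based only on the config shown above, not the gist, and has not been run here:

```xml
<hdp:tool-runner id="mahout-tool"
        tool-class="org.apache.mahout.cf.taste.hadoop.item.RecommenderJob"
        configuration-ref="hadoopConfiguration">
    <!-- flag and value as separate args: avoids "-ipath" vs "-i path" ambiguity -->
    <hdp:arg value="-s" />
    <hdp:arg value="SIMILARITY_LOGLIKELIHOOD" />
    <hdp:arg value="-i" />
    <hdp:arg value="user_item_pref.csv" />
    <hdp:arg value="-o" />
    <hdp:arg value="recommendations" />
</hdp:tool-runner>
```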
