Re: ClassNotFoundException while using RecommenderJob
Janina 2012-03-15, 10:01
Thanks for your fast answer.
I haven't added the jar manually, but by adding the dependency to the
pom.xml. I tried it with and without the dependency, and with different
versions of the dependency, but the error message stayed the same.
But isn't the RecommenderJob meant to run a pseudo-distributed
recommender on a Hadoop cluster? Am I assuming something wrong? Or is
there another way to run recommendations on a Hadoop cluster? I have
read that only the clustering and classification parts of Mahout can
really be distributed on a Hadoop cluster.
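[For reference, a minimal sketch of how the distributed RecommenderJob is usually launched against Mahout's self-contained 'job' jar. The jar name, input/output paths, and similarity class here are illustrative, not taken from the thread:]

```shell
# Sketch: launching the distributed RecommenderJob via the Mahout job jar.
# Jar name and HDFS paths are illustrative; adjust to your Mahout version.
hadoop jar mahout-core-0.6-job.jar \
  org.apache.mahout.cf.taste.hadoop.item.RecommenderJob \
  --input input.txt \
  --output output \
  --usersFile users.txt \
  --similarityClassname SIMILARITY_COOCCURRENCE
```

Because the job jar already bundles all transitive dependencies, launching it this way avoids the classpath problems that come from adding jars to your own artifact.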
2012/3/15 Sean Owen <[EMAIL PROTECTED]>
> You shouldn't have to add anything to your jar if you use the
> supplied 'job' file, which contains all transitive dependencies.
> If you do add your own jars, I think you need to unpack and repack
> them, not put them into the overall jar as a jar file, even with a
> MANIFEST.MF entry. I am not sure that works on Hadoop.
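[Since the project uses a pom.xml, the "unpack and repack" approach Sean describes can be sketched with the maven-shade-plugin, which merges dependency classes into one flat jar rather than nesting jar files. The version shown is illustrative:]

```xml
<!-- Sketch: maven-shade-plugin unpacks dependency classes into a single
     flat jar at package time (plugin version is illustrative). -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>1.6</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

The resulting shaded jar contains the dependency's .class files directly, which is what Hadoop's task classloader expects.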
> On Thu, Mar 15, 2012 at 9:42 AM, Janina <[EMAIL PROTECTED]> wrote:
> > Hi all,
> > I am trying to run a RecommenderJob from a Java program. I have added the
> > files users.txt and input.txt to a Hadoop VM and use the run method of
> > RecommenderJob to start the calculation. But the following error message
> > occurs while the MapReduce job is running:
> > Error: java.lang.ClassNotFoundException:
> > at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> > at java.security.AccessController.doPrivileged(Native Method)
> > at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> > at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
> > at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> > at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
> > at
> > at
> > at
> > at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:621)
> > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
> > at org.apache.hadoop.mapred.Child.main(Child.java:170)
> > I have added the required guava-r09.jar explicitly to my jar, which also
> > lies on the Hadoop cluster.
> > This may be a stupid question, but does anyone know where this error comes
> > from? It would help me a lot.
> > Thanks and greetings,
> > Janina