HBase MapReduce jobs need ZooKeeper JAR files

If you don’t have the ZooKeeper JAR on the Hadoop Java classpath, you will get an error like the following:

[mike@hadoop-01 ~]$ cd /usr/lib/hbase
[mike@hadoop-01 hbase]$ HADOOP_CLASSPATH='bin/hbase classpath' hadoop jar hbase-0.90.4-cdh3u2.jar importtsv
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/zookeeper/KeeperException
        at java.lang.Class.getDeclaredMethods0(Native Method)
        at java.lang.Class.privateGetDeclaredMethods(Class.java:2427)
        at java.lang.Class.getMethod0(Class.java:2670)
        at java.lang.Class.getMethod(Class.java:1603)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.<init>(ProgramDriver.java:56)
        at org.apache.hadoop.util.ProgramDriver.addClass(ProgramDriver.java:99)
        at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:45)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:186)
Caused by: java.lang.ClassNotFoundException: org.apache.zookeeper.KeeperException
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
        ... 12 more
[mike@hadoop-01 hbase]$

Older HBase MapReduce documentation doesn’t list the ZooKeeper JAR as a dependency, but it is one.

The problem can be fixed by modifying your hadoop-env.sh file to include the paths to both the HBase and ZooKeeper JAR files (the JAR names will vary depending on your HBase and ZooKeeper versions) under "# Extra Java CLASSPATH elements. Optional":

export HADOOP_CLASSPATH=/usr/local/hbase/hbase-0.90.4-cdh3u2.jar:/usr/local/zookeeper/zookeeper-3.3.3.jar

Note that Hadoop will need to be restarted for the change to take effect. Alternatively, you can add the following statement to each job:

 TableMapReduceUtil.addDependencyJars(job);
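
For context, here is a minimal sketch of where that call fits in a job driver. The class name, job name, and the mapper/reducer wiring are placeholders, not from the original post; the only part confirmed above is the addDependencyJars(job) call, which ships the HBase and ZooKeeper JARs with the job instead of relying on the cluster-wide classpath.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.mapreduce.Job;

public class MyHBaseJob {                                   // hypothetical driver class
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();   // picks up hbase-site.xml
        Job job = new Job(conf, "my-hbase-job");             // placeholder job name
        job.setJarByClass(MyHBaseJob.class);

        // ... set up mapper, reducer, input and output formats here ...

        // Adds the HBase and ZooKeeper JARs (among others) to the job's
        // distributed cache, so the task nodes don't need them on their classpath.
        TableMapReduceUtil.addDependencyJars(job);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}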

Check out "Using MapReduce with HBase" in Cloudera's HBase installation documentation for more information:

https://ccp.cloudera.com/display/CDHDOC/HBase+Installation#HBaseInstallation-UsingMapReducewithHBase

by Michael Alatortsev

