Common issues on Shark with CDH5-beta2

Below are some issues I ran into while running Shark 0.9.1 against CDH5-beta2, along with the fixes that worked for me.

1. IncompatibleClassChangeError: Implementing class

Exception in thread "main" java.lang.IncompatibleClassChangeError: Implementing class
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:190)
    at org.apache.hadoop.hive.shims.ShimLoader.createShim(ShimLoader.java:128)
    at org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:123)
    at org.apache.hadoop.hive.shims.ShimLoader.getHadoopShims(ShimLoader.java:82)
    at org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator.setConf(HadoopDefaultAuthenticator.java:51)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
    at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthenticator(HiveUtils.java:365)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:270)
    at shark.SharkCliDriver$.main(SharkCliDriver.scala:128)
    at shark.SharkCliDriver.main(SharkCliDriver.scala)
Solution: One reason for this error is conflicting JARs. In shark-0.9.1 it is caused by hadoop-core-1.0.4.jar bundled with the distribution. Find every copy and remove it, as shown below:
find /opt/shark/shark/ -name hadoop-core-1.0.4.jar
# remove all copies of hadoop-core-1.0.4.jar listed above
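
A minimal sketch of the removal step, assuming /opt/shark/shark/ is your Shark home (adjust the path to your install):

find /opt/shark/shark/ -name hadoop-core-1.0.4.jar -exec rm -v {} \;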

2. java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf

Solution: Make sure the Shark installation is present on every worker node, not just the master; one way to distribute it is sketched below.
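
A sketch of one way to push the installation to the workers, assuming passwordless SSH; the hostnames and the /opt/shark/shark path are placeholders for your own setup:

# copy the Shark directory to each worker node
for host in worker1 worker2 worker3; do
  rsync -a /opt/shark/shark/ "$host":/opt/shark/shark/
done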

3. org/apache/hadoop/hive/cli/CliDriver : Unsupported major.minor version 51.0

Solution: Shark is compiled with JDK 1.7 (class file version 51.0), so point JAVA_HOME at a JDK 1.7 installation:
export JAVA_HOME=/usr/lib/jvm/java-7-oracle-cloudera
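
To confirm the right JVM is picked up (the path above is Cloudera's bundled Oracle JDK; yours may differ):

$JAVA_HOME/bin/java -version   # should report java version "1.7.0_xx"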

4. VerifyError: overrides final method getUnknownFields

Exception in thread "main" java.lang.VerifyError: class org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$AppendRequestProto overrides final method getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
    at java.lang.ClassLoader.defineClass1(Native Method)

Solution: This is caused by a conflict between protobuf-java-2.4.1.jar and protobuf-java-2.5.0.jar (CDH5 ships protobuf 2.5.0):
1. Replace every protobuf-java-2.4.1.jar with protobuf-java-2.5.0.jar.
2. Remove the bundled protobuf classes from hive-exec*.jar, as shown below.

cd /opt/shark/shark/dep/hive-0.11.0-bin/lib/
mkdir tmp && cd tmp                       # unpack in a scratch dir, not in lib/ itself
jar -xf ../hive-exec-0.11.0-shark.jar
rm -rf com/google/protobuf/               # drop only the bundled protobuf classes
jar -cf ../hive-exec-0.11.0-shark.jar .   # repack over the original jar
cd .. && rm -rf tmp
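
A quick sanity check that the protobuf classes are really gone from the rebuilt jar:

# run from the lib/ directory; grep should print nothing
jar -tf hive-exec-0.11.0-shark.jar | grep protobuf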

Comments

  1. I have followed your steps, but I am still getting the last error (4). Is there any other protobuf JAR that I am missing?

    Replies
    1. Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient....
      Caused by: java.lang.VerifyError: class org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$SetOwnerRequestProto overrides final method getUnknownFields

    2. Run this from your Shark home folder:

      find . -name "protobuf-java-*"

      Replace every protobuf-java-2.4.1.jar with protobuf-java-2.5.0.jar. Also verify which version of protobuf hive-exec-0.11.0-shark.jar contains.


