Resolving a Spark Configuration Error in CM 5.x

Posted by ChavinKing


When adding the Spark service through Cloudera Manager 5.x, service creation failed. The console error output shows the following log:

+ perl -pi -e 's#{{CMF_CONF_DIR}}#/etc/spark/conf.cloudera.spark_on_yarn/yarn-conf#g' /opt/cm-5.9.2/run/cloudera-scm-agent/process/ccdeploy_spark-conf_etcsparkconf.cloudera.spark_on_yarn_1615663591259519890/spark-conf/yarn-conf/yarn-site.xml

++ get_default_fs /opt/cm-5.9.2/run/cloudera-scm-agent/process/ccdeploy_spark-conf_etcsparkconf.cloudera.spark_on_yarn_1615663591259519890/spark-conf/yarn-conf

++ get_hadoop_conf /opt/cm-5.9.2/run/cloudera-scm-agent/process/ccdeploy_spark-conf_etcsparkconf.cloudera.spark_on_yarn_1615663591259519890/spark-conf/yarn-conf fs.defaultFS

++ local conf=/opt/cm-5.9.2/run/cloudera-scm-agent/process/ccdeploy_spark-conf_etcsparkconf.cloudera.spark_on_yarn_1615663591259519890/spark-conf/yarn-conf

++ local key=fs.defaultFS

++ '[' 1 == 1 ']'

++ /opt/cloudera/parcels/CDH-5.9.2-1.cdh5.9.2.p0.3/lib/hadoop/../../bin/hdfs --config /opt/cm-5.9.2/run/cloudera-scm-agent/process/ccdeploy_spark-conf_etcsparkconf.cloudera.spark_on_yarn_1615663591259519890/spark-conf/yarn-conf getconf -confKey fs.defaultFS

Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/hdfs/tools/GetConf : Unsupported major.minor version 51.0

at java.lang.ClassLoader.defineClass1(Native Method)

at java.lang.ClassLoader.defineClass(ClassLoader.java:643)

at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)

at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)

at java.net.URLClassLoader.access$000(URLClassLoader.java:73)

at java.net.URLClassLoader$1.run(URLClassLoader.java:212)

at java.security.AccessController.doPrivileged(Native Method)

at java.net.URLClassLoader.findClass(URLClassLoader.java:205)

at java.lang.ClassLoader.loadClass(ClassLoader.java:323)

at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)

at java.lang.ClassLoader.loadClass(ClassLoader.java:268)

Could not find the main class: org.apache.hadoop.hdfs.tools.GetConf. Program will exit.

+ DEFAULT_FS=

From the log output, the failure to add the Spark service appears to be caused by the JDK version. Since I installed this environment myself, I recalled that the JDK version met the CM5 installation requirements; the version in use here is jdk1.7.0_67:
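The version number in the error maps directly to a JDK release: major 51 is Java 7, 50 is Java 6, 49 is Java 5, so "Unsupported major.minor version 51.0" means a Java-7-compiled class was loaded by a 1.6-or-older JVM. A minimal sketch of how to read the major version from bytes 6-7 of a .class file, using a fabricated 8-byte header purely for illustration:

```shell
# A Java .class file begins with magic CA FE BA BE, then a 2-byte minor and
# a 2-byte major version. Major 51 = Java 7, 50 = Java 6, 49 = Java 5.
# Fabricated 8-byte header for illustration (not a real compiled class):
printf '\312\376\272\276\000\000\000\063' > /tmp/demo_class_header
bytes=$(od -An -tx1 -j6 -N2 /tmp/demo_class_header | tr -d ' \n')
printf 'class file major version: %d\n' "0x$bytes"   # 0x0033 -> 51 (Java 7)
```

Running the same `od` command against a real class from the CDH parcel would show which release its jars were compiled for.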

# java -version

java version "1.7.0_67"

Java(TM) SE Runtime Environment (build 1.7.0_67-b01)

Java HotSpot(TM) 64-Bit Server VM (build 24.65-b04, mixed mode)


The Java version itself is fine and meets the requirements of the installed CM5 release. So the likely cause is that Java was installed from a tar archive; a normal RPM install should not hit this problem. Let's verify that guess:
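One quick way to confirm the suspicion is to check which java binary actually resolves on PATH and what it ultimately points to; a short diagnostic sketch (the guard is added so it degrades gracefully on machines without java):

```shell
# Show which java binary the shell resolves and what it finally points to.
# A tar-installed JDK under /opt is invisible here unless it was registered
# with alternatives or placed first on PATH.
if command -v java >/dev/null 2>&1; then
  command -v java
  readlink -f "$(command -v java)"
else
  echo "no java on PATH"
fi
```

On the failing host, the symlink chain would resolve to the system OpenJDK 1.6 rather than /opt/java/jdk1.7.0_67, which matches the error.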

Here we use the alternatives command, which is commonly used to manage multiple installed versions of the same software on a server.

--Check the server's Java versions; jdk1.7.0_67 is not under alternatives management:

# alternatives --config java

There are 2 programs which provide 'java'.

  Selection    Command

-----------------------------------------------

   1           /usr/lib/jvm/jre-1.5.0-gcj/bin/java

*+ 2           /usr/lib/jvm/jre-1.6.0-openjdk.x86_64/bin/java

Enter to keep the current selection[+], or type selection number:

--Register jdk1.7.0_67 with alternatives:

# alternatives --install /usr/bin/java java /opt/java/jdk1.7.0_67/bin/java 3
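The --install arguments are, in order: the generic symlink (/usr/bin/java), the alternative's name (java), the real binary path, and a priority (higher wins in automatic mode). The same fix can be scripted non-interactively with --set; a guarded sketch, reusing the JDK path from this article:

```shell
# alternatives --install <symlink> <name> <real path> <priority>
# alternatives --set pins the named alternative to a specific path (manual
# mode), avoiding the interactive selection prompt. Guarded because the
# alternatives command exists only on RHEL-family systems.
if command -v alternatives >/dev/null 2>&1; then
  alternatives --install /usr/bin/java java /opt/java/jdk1.7.0_67/bin/java 3
  alternatives --set java /opt/java/jdk1.7.0_67/bin/java
else
  echo "alternatives not found (Debian/Ubuntu use update-alternatives)"
fi
```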

--Check the Java versions again and select jdk1.7.0_67 as the highest-priority option:

# alternatives --config java

There are 3 programs which provide 'java'.

  Selection    Command

-----------------------------------------------

   1           /usr/lib/jvm/jre-1.5.0-gcj/bin/java

*+ 2           /usr/lib/jvm/jre-1.6.0-openjdk.x86_64/bin/java

   3           /opt/java/jdk1.7.0_67/bin/java

Enter to keep the current selection[+], or type selection number: 3

# alternatives --config java

There are 3 programs which provide 'java'.

  Selection    Command

-----------------------------------------------

   1           /usr/lib/jvm/jre-1.5.0-gcj/bin/java

*  2           /usr/lib/jvm/jre-1.6.0-openjdk.x86_64/bin/java

+ 3           /opt/java/jdk1.7.0_67/bin/java

Enter to keep the current selection[+], or type selection number:

After adjusting the Java version, adding the Spark service again succeeded.
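It is worth verifying that the switch actually took effect before re-running the Add Service wizard; a sketch (expected paths are the ones used above, and both checks are guarded so they degrade gracefully elsewhere):

```shell
# The alternatives symlink and `java -version` should both now report the
# tar-installed 1.7.0_67 JDK.
if [ -L /etc/alternatives/java ]; then
  readlink /etc/alternatives/java    # expected: /opt/java/jdk1.7.0_67/bin/java
fi
if command -v java >/dev/null 2>&1; then
  java -version 2>&1 | head -n 1     # expected to mention 1.7.0_67
fi
```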

Alternatively, remove the distribution's bundled Java packages, e.g.:

# rpm -e java-1.5.0-gcj-1.5.0.0-29.1.el6.x86_64 java-1.6.0-openjdk-1.6.0.0-1.66.1.13.0.el6.x86_64 tzdata-java-2013g-1.el6.noarch java_cup-0.10k-5.el6.x86_64 java-1.6.0-openjdk-devel-1.6.0.0-1.66.1.13.0.el6.x86_64 gcc-java-4.4.7-4.el6.x86_64 --nodeps
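Before running rpm -e, it is safer to enumerate exactly which Java-related packages are installed, so the removal list matches what is actually on the machine; a sketch:

```shell
# List every installed package whose name mentions java, jdk, or gcj.
# Guarded because rpm only exists on RPM-based systems.
if command -v rpm >/dev/null 2>&1; then
  rpm -qa | grep -iE 'java|jdk|gcj' || echo "no matching packages"
else
  echo "rpm not available on this system"
fi
```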
