A Bug Encountered with Spark 2.X
Posted by zhaojinyan
After running start-all.sh from the Spark installation directory:
[zdwy@master spark]$ sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.master.Master-1-master.out
failed to launch org.apache.spark.deploy.master.Master:
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
full log in /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.master.Master-1-master.out
slave1: starting org.apache.spark.deploy.worker.Worker, logging to /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.worker.Worker-1-slave1.out
slave2: starting org.apache.spark.deploy.worker.Worker, logging to /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.worker.Worker-1-slave2.out
slave1: failed to launch org.apache.spark.deploy.worker.Worker:
slave1: at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
slave1: ... 6 more
slave1: full log in /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.worker.Worker-1-slave1.out
slave2: failed to launch org.apache.spark.deploy.worker.Worker:
slave2: at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
slave2: ... 6 more
slave2: full log in /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.worker.Worker-1-slave2.out
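To see the underlying exception rather than the truncated traces above, it helps to read the full log file the script points to, for example (the line count here is arbitrary):

tail -n 100 /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.master.Master-1-master.out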
Solution:
The truncated stack traces point to a class-loading failure, which typically means this Spark build cannot find the Hadoop classes on its classpath. Add the following to spark-env.sh:
export SPARK_DIST_CLASSPATH=$(/home/zdwy/cdh5.9/hadoop/bin/hadoop classpath)
Here /home/zdwy/cdh5.9/hadoop is my Hadoop installation directory, so /home/zdwy/cdh5.9/hadoop/bin/hadoop is the hadoop launcher; adjust the path to match your own setup.
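For reference, a minimal spark-env.sh sketch along these lines; the HADOOP_HOME variable below is something I introduce for readability, and its value is just the path used in this post, so substitute your own installation directory:

#!/usr/bin/env bash
# spark-env.sh (sketch): point Spark at an external Hadoop installation.
# HADOOP_HOME is assumed from the paths in this post; adjust it to your setup.
export HADOOP_HOME=/home/zdwy/cdh5.9/hadoop
# Ask the hadoop launcher for its runtime classpath and hand it to Spark.
export SPARK_DIST_CLASSPATH=$("${HADOOP_HOME}/bin/hadoop" classpath)

After editing spark-env.sh, copy it to every worker node (slave1 and slave2 here) before rerunning sbin/start-all.sh, since each standalone daemon reads its own local conf directory.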