Spark 2.1.0 Installation
1. Extract the Spark tarball
tar zxf spark-2.1.0-bin-2.6.0-CDH5.10.0.tgz
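The tarball extracts to a directory named after the build (assumed here to be spark-2.1.0-bin-2.6.0-CDH5.10.0, matching the archive name), which does not match the SPARK_HOME path used in the next step. One way to line the two up, assuming the /opt/spark layout used throughout this post:
mkdir -p /opt/spark
mv spark-2.1.0-bin-2.6.0-CDH5.10.0 /opt/spark/spark-2.1.0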
2. Edit the environment file
vim /etc/profile
export SPARK_HOME=/opt/spark/spark-2.1.0
export PATH=$PATH:$SPARK_HOME/bin
source /etc/profile
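To confirm the change took effect, a quick sanity check using only the variables set above:
echo $SPARK_HOME          # should print /opt/spark/spark-2.1.0
spark-submit --version    # should report version 2.1.0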
3. Configure Spark
cd /opt/spark/spark-2.1.0/conf
mv spark-env.sh.template spark-env.sh
vim spark-env.sh
export JAVA_HOME=/usr/java/jdk1.7.0_79
export PATH=$PATH:$JAVA_HOME/bin
export SCALA_HOME=/usr/scala/scala-2.11.8
export HADOOP_CONF_DIR=/opt/cloudera/parcels/CDH-5.10.0-1.cdh5.10.0.p0.41/lib/hadoop/etc/hadoop
export SPARK_MASTER_IP=192.168.1.7    # deprecated in Spark 2.x but still honored; SPARK_MASTER_HOST is the preferred name
export SPARK_MASTER_PORT=7077
export SPARK_DIST_CLASSPATH=$(/opt/cloudera/parcels/CDH-5.10.0-1.cdh5.10.0.p0.41/bin/hadoop classpath)
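SPARK_DIST_CLASSPATH makes this Spark build pick up Hadoop jars from the CDH parcel instead of bundling its own. To sanity-check the value before starting anything, run the same command by hand; it should print a long colon-separated list of CDH jar paths:
/opt/cloudera/parcels/CDH-5.10.0-1.cdh5.10.0.p0.41/bin/hadoop classpath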
4. Edit the slaves file
cd /opt/spark/spark-2.1.0/conf
mv slaves.template slaves
vim slaves
192.168.1.7
192.168.1.8
192.168.1.9
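start-slaves.sh logs in to every host in this file over SSH, so passwordless SSH from the master to each worker (including the master itself, since 192.168.1.7 is listed) must be set up first. A minimal sketch, assuming the root account used in step 5:
ssh-keygen -t rsa                 # run once on the master, accept the defaults
ssh-copy-id [email protected]
ssh-copy-id [email protected]
ssh-copy-id [email protected]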
5. Copy the installation to the other nodes
scp -r /opt/spark/spark-2.1.0 [email protected]:/opt/spark/
scp -r /opt/spark/spark-2.1.0 [email protected]:/opt/spark/
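The workers also need the SPARK_HOME and PATH entries from step 2, and Java/Scala at the same paths referenced in spark-env.sh. If /etc/profile can be copied wholesale, a sketch:
scp /etc/profile [email protected]:/etc/profile
scp /etc/profile [email protected]:/etc/profile
Otherwise, append the same export lines on each node by hand and run source /etc/profile there.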
6. Start Spark (run from $SPARK_HOME)
sbin/start-master.sh
sbin/start-slaves.sh
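Once the scripts return, the JDK's jps tool shows whether the daemons came up (the expected processes below follow from the slaves file above):
jps
# on 192.168.1.7: Master plus a Worker (the master is also listed in slaves)
# on 192.168.1.8 and 192.168.1.9: Worker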
7. Check the web UI
http://192.168.1.7:8080
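Beyond the web UI, a minimal smoke test is to attach a shell to the standalone master (the URL follows from SPARK_MASTER_IP and SPARK_MASTER_PORT above) and run a trivial job:
spark-shell --master spark://192.168.1.7:7077
scala> sc.parallelize(1 to 100).sum()    # should return 5050.0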