Spark Configuration: Running Spark on YARN
Posted by iAthena
To run Spark on YARN, first edit Spark's environment file:

vim /usr/local/spark/conf/spark-env.sh
# Splice Hadoop's jars into Spark's classpath
export SPARK_DIST_CLASSPATH=$(/usr/local/hadoop/bin/hadoop classpath)
export SCALA_HOME=/usr/local/scala
export JAVA_HOME=/opt/jdk1.8.0_65
export SPARK_MASTER=localhost
export SPARK_LOCAL_IP=localhost
export HADOOP_HOME=/usr/local/hadoop
export SPARK_HOME=/usr/local/spark
export SPARK_LIBRARY_PATH=.:$JAVA_HOME/lib:$JAVA_HOME/jre/lib:$HADOOP_HOME/lib/native
# Required for --master yarn: where Spark finds yarn-site.xml and the other Hadoop configs
export YARN_CONF_DIR=$HADOOP_HOME/etc/hadoop
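The first export above captures the output of `hadoop classpath` into `SPARK_DIST_CLASSPATH`. A minimal sketch of what that composition looks like, using a stand-in function in place of the real `hadoop classpath` command (the echoed paths are illustrative assumptions, not your cluster's actual output):

```shell
# Stand-in for `/usr/local/hadoop/bin/hadoop classpath`, which prints a
# colon-separated list of Hadoop config dirs and jar globs (paths assumed)
hadoop_classpath_stub() {
  echo "/usr/local/hadoop/etc/hadoop:/usr/local/hadoop/share/hadoop/common/*:/usr/local/hadoop/share/hadoop/yarn/*"
}

# Same shape as the spark-env.sh line: capture the command's output
SPARK_DIST_CLASSPATH=$(hadoop_classpath_stub)
export SPARK_DIST_CLASSPATH

# Print one entry per line to inspect what Spark will actually see
echo "$SPARK_DIST_CLASSPATH" | tr ':' '\n'
```

Running the real command the same way (`echo "$SPARK_DIST_CLASSPATH" | tr ':' '\n'`) is a quick sanity check that Spark will see Hadoop's configuration directory and jars.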
Then submit the application to YARN:

/usr/local/spark/bin/spark-submit --master yarn --num-executors 2 --executor-cores 1 --class "SimpleApp" ~/sparkapp/target/scala-2.10/simple-project_2.10-1.0.jar
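The two flags size the job's footprint on the cluster: YARN allocates one container per executor, and each executor runs as many concurrent tasks as it has cores. A quick sketch of the arithmetic for the values used above:

```shell
# Resource request implied by the spark-submit flags above
NUM_EXECUTORS=2      # --num-executors 2: two YARN containers
EXECUTOR_CORES=1     # --executor-cores 1: one task at a time per executor

# Maximum number of tasks the job can run concurrently
TOTAL_TASK_SLOTS=$((NUM_EXECUTORS * EXECUTOR_CORES))
echo "executors (YARN containers): $NUM_EXECUTORS"
echo "concurrent task slots: $TOTAL_TASK_SLOTS"
```

With only two single-core executors this is a small test submission; raising either flag increases parallelism, subject to what the YARN queue will grant.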