Spark configuration: running Spark on YARN

Posted by iAthena


Preface: this article, compiled by the editors of cha138.com, introduces how to configure Spark to run on YARN; hopefully it serves as a useful reference.

First, open Spark's environment file for editing:

    vim /usr/local/spark/conf/spark-env.sh


Add the following exports (YARN_CONF_DIR is what lets spark-submit find the YARN ResourceManager):

    export SPARK_DIST_CLASSPATH=$(/usr/local/hadoop/bin/hadoop classpath)
    export SCALA_HOME=/usr/local/scala
    export JAVA_HOME=/opt/jdk1.8.0_65
    export SPARK_MASTER=localhost
    export SPARK_LOCAL_IP=localhost
    export HADOOP_HOME=/usr/local/hadoop
    export SPARK_HOME=/usr/local/spark
    export SPARK_LIBRARY_PATH=.:$JAVA_HOME/lib:$JAVA_HOME/jre/lib:$HADOOP_HOME/lib/native
    export YARN_CONF_DIR=$HADOOP_HOME/etc/hadoop
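Because spark-env.sh is just a shell fragment that Spark's launch scripts source, you can catch typos in it with `sh -n` before restarting anything. A minimal sketch (the temp-file path is an assumption, and the fragment below repeats the exports from above for self-containment):

```shell
# Sketch: write a spark-env.sh-style fragment to a temp file, then parse it
# with sh -n (parse only, no execution) to catch syntax/quoting mistakes.
cat > /tmp/spark-env-check.sh <<'EOF'
export SPARK_DIST_CLASSPATH=$(/usr/local/hadoop/bin/hadoop classpath)
export SCALA_HOME=/usr/local/scala
export JAVA_HOME=/opt/jdk1.8.0_65
export HADOOP_HOME=/usr/local/hadoop
export SPARK_HOME=/usr/local/spark
export YARN_CONF_DIR=$HADOOP_HOME/etc/hadoop
EOF

# -n parses without executing, so the $(hadoop classpath) substitution is
# checked for syntax but never actually run here.
if sh -n /tmp/spark-env-check.sh; then
  echo "spark-env.sh syntax OK"
fi
```

In a real deployment you would run `sh -n /usr/local/spark/conf/spark-env.sh` directly on the file you just edited.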

Then submit the application to YARN (2 executors, 1 core each):

    /usr/local/spark/bin/spark-submit \
      --master yarn \
      --num-executors 2 \
      --executor-cores 1 \
      --class "SimpleApp" \
      ~/sparkapp/target/scala-2.10/simple-project_2.10-1.0.jar
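The post does not show the contents of the jar it submits. For context, a minimal SimpleApp consistent with the command above (Spark 1.x / Scala 2.10 era API) might look like the sketch below; the input path is an assumption:

```scala
// Sketch only: the actual SimpleApp in the jar is not shown in the post.
import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp {
  def main(args: Array[String]): Unit = {
    // No master is hard-coded here; spark-submit's --master yarn supplies it.
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)

    // Assumed input file; any text file visible to the executors works.
    val logData = sc.textFile("file:///usr/local/spark/README.md").cache()
    val numAs = logData.filter(_.contains("a")).count()
    val numBs = logData.filter(_.contains("b")).count()
    println(s"Lines with a: $numAs, Lines with b: $numBs")
    sc.stop()
  }
}
```

Note that the class hard-codes no master URL, which is why the same jar can be pointed at YARN, standalone, or local mode purely through the `--master` flag.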






