spark-submit fails with "Error initializing SparkContext"


16/03/04 00:21:09 WARN SparkContext: Using SPARK_MEM to set amount of memory to use per executor process is deprecated, please use spark.executor.memory instead.
16/03/04 00:21:09 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Could not parse Master URL: 'at'
    at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2554)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:489)
    at com.bigdata.deal.scala.DomainLib$.main(DomainLib.scala:22)
    at com.bigdata.deal.scala.DomainLib.main(DomainLib.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
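The exception is thrown when the master string matches none of the URL forms Spark accepts; here Spark received the literal token "at", which usually means the master value got mangled, for example by bad quoting or argument order on the spark-submit command line. As a hedged sketch (the host mini-cp1 and port 7077 are assumptions), the value can be sanity-checked before submitting:

```shell
#!/usr/bin/env bash
# Hedged sketch: check that the master URL has a shape Spark can parse
# before handing it to spark-submit. Host and port are assumptions.
MASTER_URL="spark://mini-cp1:7077"

case "$MASTER_URL" in
  spark://*|yarn|yarn-client|yarn-cluster|local|local\[*\]|mesos://*)
    echo "master URL looks parseable: $MASTER_URL"
    ;;
  *)
    echo "Could not parse Master URL: '$MASTER_URL'" >&2
    exit 1
    ;;
esac
```

With a valid value, the submit line would read something like `spark-submit --master "$MASTER_URL" --class com.bigdata.deal.scala.DomainLib app.jar` (class and jar names here are from the stack trace and illustrative): the `--master` flag and its value must stay together as one quoted pair.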


When configuring the conf, be sure to provide sparkHome and the master address.

Setting these in spark-env.sh is enough to resolve the error:

[[email protected] spark-1.4.0-bin-hadoop2.6]# cat conf/spark-env.sh
#!/usr/bin/env bash

SPARK_MASTER_IP=mini-cp1

# The JAVA_HOME (JDK root directory) must be exported
export JAVA_HOME=/usr/local/jdk1.7.0_65
export HADOOP_HOME=/usr/local/hadoop-2.6.0

#export SCALA_HOME=/opt/scala
export SPARK_WORKER_MEMORY=3g
export HADOOP_CONF_DIR=/usr/local/hadoop-2.6.0/etc/hadoop

#SPARK_MEM=${SPARK_MEM:-1g}
# Note: SPARK_MEM is deprecated (see the WARN in the log above); spark.executor.memory is preferred
export SPARK_MEM=3g

export HADOOP_COMMON_LIB_NATIVE_DIR=/usr/local/hadoop-2.6.0/lib/native
export HADOOP_OPTS="-Djava.library.path=/usr/local/hadoop-2.6.0/lib"
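Once spark-env.sh is in place, a quick check that the required variables actually resolve can catch typos before the next submit. A minimal sketch, assuming the paths from the listing above (the fallback values make it self-contained if the file is absent):

```shell
#!/usr/bin/env bash
# Hedged sketch: verify that the settings SparkContext needs resolve
# after spark-env.sh is sourced. Values mirror the listing above and
# serve as fallbacks so the check runs even without the file.
SPARK_MASTER_IP=mini-cp1
export JAVA_HOME=/usr/local/jdk1.7.0_65

ENV_FILE="${SPARK_HOME:-.}/conf/spark-env.sh"
[ -f "$ENV_FILE" ] && . "$ENV_FILE"

for var in SPARK_MASTER_IP JAVA_HOME; do
  if [ -z "${!var}" ]; then
    echo "missing required setting: $var" >&2
    exit 1
  fi
  echo "$var=${!var}"
done
```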




This post originally appeared on the "7274992" blog; please contact the author before republishing.
