Why is spark-submit using the wrong Java version?
I set up a Spark EC2 cluster with the bin/spark-ec2 script. When I SSH into the master node and run spark-submit on an example program, I see the following error from all executors, and every executor is marked FAILED:
(java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/4"): error=2, No such file or directory)
The strange part is why Spark is looking for java-1.7.0-openjdk-1.7.0.85.x86_64 at all. I have JAVA_HOME set to /usr/lib/jvm/jre-1.8.0-openjdk, and I even searched recursively for openjdk-1.7.0.85 and found nothing. So why is spark-submit trying to use a seemingly random Java version that isn't even installed on the master or the slaves?
The full output is below:
[ec2-user@ip-172-31-35-149 spark]$ sudo ./bin/spark-submit --class org.apache.spark.examples.mllib.LinearRegression lib/spark-examples-1.4.1-hadoop1.0.4.jar data/mllib/sample_linear_regression_data.txt
15/08/18 18:26:46 INFO spark.SparkContext: Running Spark version 1.4.1
15/08/18 18:26:46 INFO spark.SecurityManager: Changing view acls to: root
15/08/18 18:26:46 INFO spark.SecurityManager: Changing modify acls to: root
15/08/18 18:26:46 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/08/18 18:26:47 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/08/18 18:26:47 INFO Remoting: Starting remoting
15/08/18 18:26:47 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@172.31.35.149:35948]
15/08/18 18:26:47 INFO util.Utils: Successfully started service 'sparkDriver' on port 35948.
15/08/18 18:26:47 INFO spark.SparkEnv: Registering MapOutputTracker
15/08/18 18:26:47 INFO spark.SparkEnv: Registering BlockManagerMaster
15/08/18 18:26:47 INFO storage.DiskBlockManager: Created local directory at /mnt/spark/spark-7d250ff1-595c-4217-98e1-e8be34868789/blockmgr-24b5de9d-8496-44e8-8806-b091ece651f0
15/08/18 18:26:47 INFO storage.DiskBlockManager: Created local directory at /mnt2/spark/spark-350c9f91-21f0-49f6-a1c1-bc32befdb3e8/blockmgr-b9fc6955-b7c4-46aa-8309-23d51082ef51
15/08/18 18:26:47 INFO storage.MemoryStore: MemoryStore started with capacity 265.1 MB
15/08/18 18:26:47 INFO spark.HttpFileServer: HTTP File server directory is /mnt/spark/spark-7d250ff1-595c-4217-98e1-e8be34868789/httpd-664cbb8f-3599-49b3-8f83-fceb00b5ca7e
15/08/18 18:26:47 INFO spark.HttpServer: Starting HTTP Server
15/08/18 18:26:47 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/08/18 18:26:47 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:43864
15/08/18 18:26:47 INFO util.Utils: Successfully started service 'HTTP file server' on port 43864.
15/08/18 18:26:47 INFO spark.SparkEnv: Registering OutputCommitCoordinator
15/08/18 18:26:48 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/08/18 18:26:48 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
15/08/18 18:26:48 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
15/08/18 18:26:48 INFO ui.SparkUI: Started SparkUI at http://ec2-54-187-197-56.us-west-2.compute.amazonaws.com:4040
15/08/18 18:26:48 INFO spark.SparkContext: Added JAR file:/root/spark/lib/spark-examples-1.4.1-hadoop1.0.4.jar at http://172.31.35.149:43864/jars/spark-examples-1.4.1-hadoop1.0.4.jar with timestamp 1439922408595
15/08/18 18:26:48 INFO client.AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@ec2-54-187-197-56.us-west-2.compute.amazonaws.com:7077/user/Master...
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20150818182649-0015
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/0 on worker-20150812175308-172.31.33.91-56792 (172.31.33.91:56792) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/0 on hostPort 172.31.33.91:56792 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/1 on worker-20150812175308-172.31.33.90-48856 (172.31.33.90:48856) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/1 on hostPort 172.31.33.90:48856 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/0 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/1 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/0 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/0"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/0 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/0"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 0
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/2 on worker-20150812175308-172.31.33.91-56792 (172.31.33.91:56792) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/2 on hostPort 172.31.33.91:56792 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/1 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/1"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/1 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/1"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 1
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/3 on worker-20150812175308-172.31.33.90-48856 (172.31.33.90:48856) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/3 on hostPort 172.31.33.90:48856 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/2 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/3 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/2 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/2"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/2 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/2"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 2
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/4 on worker-20150812175308-172.31.33.91-56792 (172.31.33.91:56792) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/4 on hostPort 172.31.33.91:56792 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/3 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/3"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/3 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/3"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 3
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/5 on worker-20150812175308-172.31.33.90-48856 (172.31.33.90:48856) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/5 on hostPort 172.31.33.90:48856 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/4 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/4 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/4"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/4 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/4"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 4
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/6 on worker-20150812175308-172.31.33.91-56792 (172.31.33.91:56792) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/6 on hostPort 172.31.33.91:56792 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/5 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/5 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/5"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/5 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/5"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 5
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/7 on worker-20150812175308-172.31.33.90-48856 (172.31.33.90:48856) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/7 on hostPort 172.31.33.90:48856 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/6 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/6 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/6"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/6 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/6"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 6
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/8 on worker-20150812175308-172.31.33.91-56792 (172.31.33.91:56792) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/8 on hostPort 172.31.33.91:56792 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/7 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/8 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/7 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/7"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/7 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/7"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 7
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/9 on worker-20150812175308-172.31.33.90-48856 (172.31.33.90:48856) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/9 on hostPort 172.31.33.90:48856 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/8 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/8"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/8 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/8"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 8
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/10 on worker-20150812175308-172.31.33.91-56792 (172.31.33.91:56792) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/10 on hostPort 172.31.33.91:56792 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/9 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/9 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/9"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/9 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/9"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 9
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Application has been killed. Reason: Master removed our application: FAILED
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
15/08/18 18:26:49 INFO ui.SparkUI: Stopped Spark web UI at http://ec2-54-187-197-56.us-west-2.compute.amazonaws.com:4040
15/08/18 18:26:49 INFO scheduler.DAGScheduler: Stopping DAGScheduler
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Shutting down all executors
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Asking each executor to shut down
15/08/18 18:26:49 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 34521.
15/08/18 18:26:49 INFO netty.NettyBlockTransferService: Server created on 34521
15/08/18 18:26:49 INFO storage.BlockManagerMaster: Trying to register BlockManager
15/08/18 18:26:49 INFO storage.BlockManagerMasterEndpoint: Registering block manager 172.31.35.149:34521 with 265.1 MB RAM, BlockManagerId(driver, 172.31.35.149, 34521)
15/08/18 18:26:49 INFO storage.BlockManagerMaster: Registered BlockManager
15/08/18 18:26:49 ERROR spark.SparkContext: Error initializing SparkContext.
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext
at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:103)
at org.apache.spark.SparkContext.getSchedulingMode(SparkContext.scala:1503)
at org.apache.spark.SparkContext.postEnvironmentUpdate(SparkContext.scala:2007)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:543)
at org.apache.spark.examples.mllib.LinearRegression$.run(LinearRegression.scala:92)
at org.apache.spark.examples.mllib.LinearRegression$$anonfun$main$1.apply(LinearRegression.scala:84)
at org.apache.spark.examples.mllib.LinearRegression$$anonfun$main$1.apply(LinearRegression.scala:83)
at scala.Option.map(Option.scala:145)
at org.apache.spark.examples.mllib.LinearRegression$.main(LinearRegression.scala:83)
at org.apache.spark.examples.mllib.LinearRegression.main(LinearRegression.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/08/18 18:26:49 INFO spark.SparkContext: SparkContext already stopped.
Exception in thread "main" java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext
at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:103)
at org.apache.spark.SparkContext.getSchedulingMode(SparkContext.scala:1503)
at org.apache.spark.SparkContext.postEnvironmentUpdate(SparkContext.scala:2007)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:543)
at org.apache.spark.examples.mllib.LinearRegression$.run(LinearRegression.scala:92)
at org.apache.spark.examples.mllib.LinearRegression$$anonfun$main$1.apply(LinearRegression.scala:84)
at org.apache.spark.examples.mllib.LinearRegression$$anonfun$main$1.apply(LinearRegression.scala:83)
at scala.Option.map(Option.scala:145)
at org.apache.spark.examples.mllib.LinearRegression$.main(LinearRegression.scala:83)
at org.apache.spark.examples.mllib.LinearRegression.main(LinearRegression.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/08/18 18:26:49 INFO storage.DiskBlockManager: Shutdown hook called
15/08/18 18:26:49 INFO util.Utils: path = /mnt/spark/spark-7d250ff1-595c-4217-98e1-e8be34868789/blockmgr-24b5de9d-8496-44e8-8806-b091ece651f0, already present as root for deletion.
15/08/18 18:26:49 INFO util.Utils: path = /mnt2/spark/spark-350c9f91-21f0-49f6-a1c1-bc32befdb3e8/blockmgr-b9fc6955-b7c4-46aa-8309-23d51082ef51, already present as root for deletion.
15/08/18 18:26:49 INFO util.Utils: Shutdown hook called
15/08/18 18:26:49 INFO util.Utils: Deleting directory /mnt/spark/spark-7d250ff1-595c-4217-98e1-e8be34868789/httpd-664cbb8f-3599-49b3-8f83-fceb00b5ca7e
15/08/18 18:26:49 INFO util.Utils: Deleting directory /mnt2/spark/spark-350c9f91-21f0-49f6-a1c1-bc32befdb3e8
15/08/18 18:26:49 INFO util.Utils: Deleting directory /mnt/spark/spark-7d250ff1-595c-4217-98e1-e8be34868789
Answer
I had upgraded Java from java-1.7.0-openjdk-1.7.0.85.x86_64 to 1.8, but forgot to restart the Spark workers. The workers had been started before the upgrade, so they still carried the old Java path from that time.
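The failure mode described above can be illustrated in isolation: a long-running process keeps the environment it was started with, even if the shell that launched it changes the variable afterwards. A minimal sketch (both JVM paths below are placeholders, not the cluster's real ones):

```shell
# Start a stand-in "worker" while the old JAVA_HOME is exported.
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk   # value at "worker" start time
sh -c 'sleep 1; echo "worker still sees: $JAVA_HOME"' &
pid=$!

# "Upgrade" the variable after the worker is already running.
export JAVA_HOME=/usr/lib/jvm/jre-1.8.0-openjdk

# The worker reports the environment it inherited at launch, not the new one.
wait "$pid"
# prints: worker still sees: /usr/lib/jvm/java-1.7.0-openjdk
```

This is why changing JAVA_HOME on disk is not enough: the worker daemons must be restarted to pick it up.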
Another answer
I'm facing the same problem. Even after making all the changes on the slaves, I still run into the same issue.
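For anyone hitting this, a hedged sketch of the usual remedy on a standalone cluster: pin JAVA_HOME explicitly in conf/spark-env.sh on the master and every slave, then restart the whole cluster so the worker daemons re-read their environment. The 1.8 path and the /root/spark layout below are assumptions based on a typical spark-ec2 install; adjust them to yours.

```shell
# conf/spark-env.sh — set on the master AND on every slave
# (path is an assumption; point it at your actual 1.8 install).
export JAVA_HOME=/usr/lib/jvm/jre-1.8.0-openjdk
export PATH="$JAVA_HOME/bin:$PATH"

# Then restart the standalone cluster from the master so workers
# pick up the new environment (paths assume a spark-ec2 layout):
#   /root/spark/sbin/stop-all.sh
#   /root/spark/sbin/start-all.sh
```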