Spark 2.1.0 standalone mode: master fails to start

Posted by 豪放婉约派程序员


After running start-master.sh, the log reports the following error:

starting org.apache.spark.deploy.master.Master, logging to /home/hadoop/spark-2.1.0-bin-hadoop2.7/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-hadoop1.out
[root@hadoop1 sbin]# cat /home/hadoop/spark-2.1.0-bin-hadoop2.7/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-hadoop1.out
Spark Command: /home/hadoop/hadoop/jdk1.8.0_101/bin/java -cp /home/hadoop/spark-2.1.0-bin-hadoop2.7/conf/:/home/hadoop/spark-2.1.0-bin-hadoop2.7/jars/* -Xmx1g org.apache.spark.deploy.master.Master --host hadoop1 --port 7077 --webui-port 8080
========================================
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/03/04 21:09:01 INFO Master: Started daemon with process name: 14373@hadoop1
17/03/04 21:09:01 INFO SignalUtils: Registered signal handler for TERM
17/03/04 21:09:01 INFO SignalUtils: Registered signal handler for HUP
17/03/04 21:09:01 INFO SignalUtils: Registered signal handler for INT
17/03/04 21:09:01 WARN MasterArguments: SPARK_MASTER_IP is deprecated, please use SPARK_MASTER_HOST
17/03/04 21:09:02 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/03/04 21:09:02 INFO SecurityManager: Changing view acls to: root
17/03/04 21:09:02 INFO SecurityManager: Changing modify acls to: root
17/03/04 21:09:02 INFO SecurityManager: Changing view acls groups to: 
17/03/04 21:09:02 INFO SecurityManager: Changing modify acls groups to: 
17/03/04 21:09:02 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
17/03/04 21:09:02 WARN Utils: Service sparkMaster could not bind on port 7077. Attempting port 7078.
17/03/04 21:09:02 WARN Utils: Service sparkMaster could not bind on port 7078. Attempting port 7079.
17/03/04 21:09:02 WARN Utils: Service sparkMaster could not bind on port 7079. Attempting port 7080.
17/03/04 21:09:02 WARN Utils: Service sparkMaster could not bind on port 7080. Attempting port 7081.
17/03/04 21:09:02 WARN Utils: Service sparkMaster could not bind on port 7081. Attempting port 7082.
17/03/04 21:09:02 WARN Utils: Service sparkMaster could not bind on port 7082. Attempting port 7083.
17/03/04 21:09:02 WARN Utils: Service sparkMaster could not bind on port 7083. Attempting port 7084.
17/03/04 21:09:02 WARN Utils: Service sparkMaster could not bind on port 7084. Attempting port 7085.
17/03/04 21:09:02 WARN Utils: Service sparkMaster could not bind on port 7085. Attempting port 7086.
17/03/04 21:09:02 WARN Utils: Service sparkMaster could not bind on port 7086. Attempting port 7087.
17/03/04 21:09:02 WARN Utils: Service sparkMaster could not bind on port 7087. Attempting port 7088.
17/03/04 21:09:02 WARN Utils: Service sparkMaster could not bind on port 7088. Attempting port 7089.
17/03/04 21:09:02 WARN Utils: Service sparkMaster could not bind on port 7089. Attempting port 7090.
17/03/04 21:09:02 WARN Utils: Service sparkMaster could not bind on port 7090. Attempting port 7091.
17/03/04 21:09:02 WARN Utils: Service sparkMaster could not bind on port 7091. Attempting port 7092.
17/03/04 21:09:02 WARN Utils: Service sparkMaster could not bind on port 7092. Attempting port 7093.
Exception in thread "main" java.net.BindException: Cannot assign requested address: Service sparkMaster failed after 16 retries (starting from 7077)! Consider explicitly setting the appropriate port for the service sparkMaster (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:433)
    at sun.nio.ch.Net.bind(Net.java:425)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:506)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:491)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:408)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:455)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    at java.lang.Thread.run(Thread.java:745)
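Despite the port-retry warnings, this is not a port conflict: "Cannot assign requested address" (EADDRNOTAVAIL) means the master is trying to bind to an IP address that no local network interface owns, typically because the configured host (here, hadoop1) resolves to a stale or non-local address. A minimal Python sketch of the same failure mode (203.0.113.1 is a TEST-NET address guaranteed not to belong to this machine):

```python
import socket

def can_bind(host, port=0):
    """Try to bind a TCP socket to `host`, mirroring what the Spark
    master does when it starts its RPC endpoint on SPARK_MASTER_HOST."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.bind((host, port))
        return True
    except OSError:
        # EADDRNOTAVAIL: "Cannot assign requested address" -- the same
        # error the Spark master hits for every port it retries.
        return False
    finally:
        s.close()

print(can_bind("127.0.0.1"))    # loopback is always bindable -> True
print(can_bind("203.0.113.1"))  # not a local address -> False
```

Because the address itself is unusable, retrying other ports (7078, 7079, ...) can never succeed, which is why all 16 retries fail identically.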


Solution:

Configure the following in spark-env.sh:

export SPARK_MASTER_HOST=127.0.0.1
export SPARK_LOCAL_IP=127.0.0.1

Then run the start script again and the master should come up. Note that binding to 127.0.0.1 is only appropriate for single-machine mode; the underlying cause is that the master's hostname did not resolve to an address owned by a local interface.
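The fix end to end might look like the sketch below. SPARK_HOME defaults to the install path from the log above; adjust it to your layout. Stopping first cleans up any half-started daemon and its PID file.

```shell
# Sketch: append the host overrides to spark-env.sh, then restart the master.
SPARK_HOME=${SPARK_HOME:-/home/hadoop/spark-2.1.0-bin-hadoop2.7}

cat >> "$SPARK_HOME/conf/spark-env.sh" <<'EOF'
export SPARK_MASTER_HOST=127.0.0.1
export SPARK_LOCAL_IP=127.0.0.1
EOF

"$SPARK_HOME/sbin/stop-master.sh"    # stop any half-started daemon first
"$SPARK_HOME/sbin/start-master.sh"
```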
